The Ethics Behind Data Use - A Data Panel Discussion
In this article, we take a deep dive into the takeaways from the panel of experts at this Data Driven Guernsey event.
Data analytics is all around us, quietly gathering information in the background of our daily lives. It helps us make decisions and makes life more convenient, shaping our exercise, our journeys, and our choices of food and cinema, even suggesting possible new friends for us. But where do we draw the line between blind faith in numbers and the people behind them? In the fifth event of Data Driven Guernsey Week, we were joined by panellists Nichole Culverwell CHART. PR, MCIPR, Director of Black Vanilla; Matt Thornton, Co-Founder of Cortex Technologies; Sarah Snelson, Director of Public Policy at Frontier Economics; and Rachel Masterton, Deputy Data Protection Commissioner at the ODPA. The panel discussion was compèred by Justin Bellinger, Carrier, Wholesale & Business Development Director at Sure Guernsey.
Is data the new oil? Is it tradeable? An asset that is traded fluidly or is it more like land, harder to trade and needing to be nurtured?
Data is everywhere and is being collected by everyone. It has become a crucial currency in the fight for customers' attention, enabling businesses of all kinds to better understand their customers' needs and how they respond to and use their goods and services. Through an economic lens, we can look at the value of data as it exists in many different areas: data in and of itself may not be valuable; it is the way you process and use that data that determines its value.
Rachel shared her concept of how data is more like sunshine. 'You don’t want to use it all up or get rid of it as there’s a lot you can get from the data you hold, you want it to be sustainable. If the sun stopped shining, all hell would break loose in about 8 minutes. This analogy can probably be likened to data too.'
Smart devices can now collect enough data to spot the onset of health issues, so how valuable should we consider our data to be?
Our environments are full of smart technology. Whether your smart fridge is telling you you’ve run out of milk or your smart pet flap is keeping the neighbour's cat out of your house, these devices are constantly capturing data and personalising it for your daily use. Our decisions are heavily affected by the data that is collected and then presented back to us. For example, as you leave the house for work and step into your car, your phone tells you how long the journey will take, where to avoid traffic, and which coupon deals are on at the café where you buy coffee. You haven't entered any of this data into your phone; your technology is collecting hidden data and smartly guessing where you’re likely to be going at 9 AM on a Monday morning, and how it can make that journey most convenient for you.
Have you used a social media platform or free app lately? Most people may not understand what data is being collected on them by companies providing 'free' services, such as Facebook, let alone what those companies may do with it. As Nichole says, 'If something is free to use, then YOU are probably the product.’
The value is in the trust data can create. As business owners we trust in numbers: being able to talk to the public about statistics has brought people together, and customers trust us enough to share their data with us. But if that trust is broken, you’re looking at far more than a hefty fine; you're looking at reputational damage.
'Behave like a custodian of the data you are being tasked to protect! Think about the people, the people behind your data, humanising this data will help you decide how you want to use it and what steps to take if you have a data breach.'
Nichole Culverwell CHART. PR, MCIPR, Director of Black Vanilla
What if you lose data, or have it harvested?
The loss of the data is one thing: trust broken, share prices hit. The second blow for a company is how it behaves in the aftermath. As these breaches become more common, do we become more used to the risk? Customers don’t become more forgiving of organisations that mess up and respond badly. An earnest apology will show your customers that, yes, your business made a mistake handling their data, but you are doing your best to put it right.
Attitude can often be more damaging to reputation than anything else. It’s important not to think of a fine under the data protection law as merely an impact on your business's expenses; you have to remember the impact a breach will have on your customers.
Matt mentioned that from a technology point of view, it’s inevitable that you’ll suffer a data loss at some point. If you are breached, be proactive: lock accounts, change passwords, and freeze bank cards to prevent money being stolen. What can you do to show you’re trying to protect your customers?
Apart from data protection, how can we be confident that ethics are being written into algorithms?
Justin started with, 'I think the danger is that actually, the reasons for designing the algorithm will inherently be biased in some way, though we would like to believe that that wouldn’t be so. I think that we can’t be certain that these sorts of algorithms will be clear of bias.'
There are lots of rules for using data from an ethical perspective. We have the data protection law locally, and as a business you want to layer your own rules over these. Think about whether, if it were your own data, it would make you nervous to have it shared. Apply this empathy to your rules! Do you need to gather this data? Is it ethically viable? Are you collecting data to try to achieve better social outcomes? These questions need to be asked from the boardroom down, so that ethical rules can be decided and applied to new and existing algorithms.
Matt explained, ‘This highlights the difference between when the computer goes bad and when the people go bad. Tech cannot be blamed as such; faults in the system can be forgiven, versus the decisions people make about how they use people’s data. Data requires us to think about it with an ethical lens and to criticise it all the way through the building and usage process.’
'When using these data for policy decisions, there’s always a general level in an algorithm where we can look to see if there are discriminatory biases and attend to them,' Sarah added.
Takeaways from the panel
- Sarah Snelson says, 'I think we (as practitioners) have got to find a balance, and we have not yet got on top of that balancing act when it comes to data and ethics. We have to be very mindful of the individuals whose data we are using, shining a light on the things that can go wrong and then deciding our next steps.'
- Matt Thornton says, 'Technology-wise, there’s work to be done. More people need to be asking, "Should I trust this data?"'
- Nichole Culverwell says, 'When it comes down to trust, every data touchpoint needs that ethical lens overlaying it. I may not be a data scientist, but I collect and use data in my job, so I need to take steps to educate myself and take personal responsibility if things do go wrong.'
Rachel Masterton ended the event with the quote, “Whenever I get gloomy with the state of the world, I think about the plethora of data in servers. General opinion’s starting to make out that we live in a world controlled by computers, but I don’t see that. It seems to me that people are everywhere. Often they are not particularly tech-savvy or blog-worthy, but they are always there – programmers and designers, engineers and project managers, customers and clients, users, surfers, tweeters. If you look for them, I’ve got a sneaky feeling you’ll find people actually are all around.”