MARK MCGEOGHEGAN Senior Research Executive
A large part of our economic model is now completely dependent on the willingness of users to hand over their personal data.
So it is understandable that commentators and practitioners look at the low public trust in any organisation to collect and use data responsibly and declare it a ‘crisis’, particularly given declining trust in information online. But, as ever, the reality is more complicated.
It’s not controversial to say that trust in how business handles data is low. An Ipsos/World Economic Forum global study found that fewer than two in five people trust any public organisation or private company to use the personal data they hold ‘in the right way’.64 Only healthcare providers are trusted by a majority to use personal data correctly.
However, there is no evidence to suggest this is anything new. In 1991, two-thirds of EU citizens on average were concerned about data privacy; by 2008 that figure for the EU27 had barely changed (although there were country-by-country shifts).65 Between 2010 and 2015 the average proportion of EU citizens concerned about misuse of their data fell by only one percentage point.66
Europe is one of the most privacy-sceptical regions in the world, so it should come as no surprise that people tend to be uncomfortable about handing control of personal data over to others, and that this has been the case since before the world wide web existed.
Yet this sceptical and distrusting environment is the one in which the new digital economy has emerged and thrives.
The privacy paradox
A 2010 study found that perceived privacy – a user’s subjective expectation of how much personal data they are disclosing – and trust in an organisation to not misuse data, or collect it unnecessarily, could predict whether a user would disclose personal data.67 In cases of low trust, high perceived privacy led to disclosure; in cases of low perceived privacy, high trust in an organisation did likewise. A customer may not trust big businesses such as Tesco, Arcadia or Amazon, but will still disclose a range of personal details to make a purchase and have it delivered to them, because they see this disclosure as necessary and not excessive.
However, per the study, where users don’t see an immediate need to disclose data, or discover they are disclosing more than they thought, they will refuse disclosure unless they trust the business they are making the disclosure to. From this view, the narrative that data distrust is a crisis makes sense – concern about privacy, in the absence of trust, should cripple the new digital economy.
Yet concern and distrust have not stopped us from ‘confessing’ to Google, revealing our inner selves on Facebook, and allowing sites to track us across the web using cookies. These ‘revealed preferences’ pose a significant problem for the ‘crisis’ narrative, because they show that we are, generally, OK with disclosing personal data regardless of our ‘expressed’ concern and distrust.
A 2018 evidence review by Ipsos MORI suggested several factors that might explain this.68 We may be more apathetic than we like to admit – saying that we worry about data collection and misuse, but not caring ‘in the moment’. We may also overestimate our knowledge of both the extent and methods of data collection, and our understanding of how to protect our privacy. We should also consider how easy it is to disclose personal data. Data collection is passive: it doesn’t require us to do anything, making disclosure the path of least resistance in most cases. Disclosure is made all the easier by a devaluing of data – users don’t see their personal data the way businesses do, and are generally unaware of how valuable it is.
A new model for your data?
The dominant ‘crisis’ narrative has led many to propose new models of data collection which compensate users for collecting and using their data and give them more personal control. They advocate new incentives to overcome the barriers of concern and distrust.
But concern and distrust are not the barriers to consumers’ use of online services we might think. The role of data trust in the new digital economy is smaller than often assumed, and is part of a more complex psychology in which disclosure of personal data isn’t just driven by trust, but by apathy, perception, and frictionless user experiences making ‘disclosure’ the easy thing to do.
Interrupting that seamless process of disclosure and putting a price on data will undermine the flow of data in two ways. It will stop disclosure from being the path of least resistance, prompting users to take more active decisions in which their expressed preferences will play a greater role. It will also create clearer indicators as to how much data is being disclosed and when, filling gaps in users’ knowledge about data collection and use.
The risk is that users’ concerns are confirmed, increasing distrust, while also removing some of the additional factors which make disclosure easy. Why would we expect an informed, distrusting user to disclose data just because they are being paid, and what would the equilibrium price of any given data be? Many online businesses are completely dependent on users’ data, and their services are not a necessity – the price point at which users would disclose data could render their business models unsustainable.
This would not only impact businesses, but users too. A backlash against data disclosure could lead to services shifting from being free to subscription-based, without targeted advertising, making online interaction costlier for users, and online consumer choice more complex.
Data trust is not the be-all and end-all. In reality a range of other factors matter, and failing to appreciate the complexity of the situation risks precipitating the very crisis we are seeking to avoid.