Ipsos MORI Thinks

Automation, AI and the changing nature of trust

SARAH CASTELL Head of Futures

REBECCA WRITER-DAVIES Research Manager

We are in the middle of the fourth industrial revolution. Computing has reached the point at which AI can learn directly from examples, data, and experience – enabling it to perform specific tasks intelligently and more rapidly and accurately than any human. Combine this with blockchain technology and the internet of things, and some writers have said that ‘trust’ will be revolutionised as it will become so much easier to authenticate data about provenance, technical details and more.

The public know that change is coming – but they do not know how to respond to it or how it might affect them. Around the world, people agree that technology improves lives (76%), but many also fear that technical progress is destroying our lives (50%).69 Equally, the British public are uncertain about the societal benefits of automation (32% agree they are positive) and are unconvinced about its impact on quality of life (30% agree and 26% disagree).70 It is not just the public who are unsure – governments also express concern. Two-thirds (66%) of UK parliamentarians surveyed by Ipsos MORI agree that automation will have a positive impact on the economy, but they also worry about its impact on jobs – almost half (45%) think more jobs will be lost than gained as a result of automation in the next 15 years.

So, for business and government, it is becoming more important to consider the ethics of using these technologies, and the kinds of worlds that consumers and citizens want to see as a result of innovation. In the UK, public engagement on this is developing; bodies such as the Academy of Medical Sciences, the Royal Society and the Ada Lovelace Institute are engaging the public on the role of data science, machine learning and AI in the future. Responses to these public dialogues show both interest in and uneasiness about the role of AI and other automated technologies in society.

The first challenge is explaining purpose. Globally, only one in three people say they have a clear idea of what data companies hold about them and what they do with it.71 The general public want to know how an organisation intends to use any data it collects from consumers or the public, and crucially whether this will lead to broader social benefits (like increased choice) without causing harm (like leaving vulnerable or poorer people without choice). We see high trust in public institutions, like the NHS, compared with big business – the UK public are significantly more likely to trust NHS hospitals/clinics (76%) than business generally (47%).72 But even then, they are uncertain whether they should trust public sector healthcare providers with their data – just half (51%) of the British public say they trust the NHS with their data, while a third (35%) say they distrust it.73 Our UK public dialogue work suggests that when a positive broad social purpose becomes clear, organisations are more trusted with our data.


The second challenge is how to understand the longer-term effects on society of introducing automated systems. For consumers, data-driven services are already shaping ever more personalised experiences – but people in our UK dialogues worry that a switch to more automated services will change the consumer experience; being unable to talk to a ‘real human’ rather than a chatbot, for example. This ‘self-service checkout world’ tends to be described with constant mild annoyance, while its potential benefits go under the radar.

Aware of the need to keep the public on side – especially where huge public data sets are required to develop complex machine learning algorithms – industries which rely heavily on data-driven technologies are starting to come forward with guidelines for how society should proceed. In 2018, Microsoft published its six ethical principles to guide the development and use of artificial intelligence. In the same year, the UK Government set up the Centre for Data Ethics and Innovation to advise Ministers on how to unlock the potential of AI while ensuring it is developed safely. The ethics of using digital technology in healthcare have already been much discussed, given bioscience’s long tradition of ethical debate – in February 2019, the UK NHS published its Code of Conduct for data-driven health and care technology to inform the use of AI within the institution.

We think all industries need to consider how their implied social contract will change as a result of AI and automation. The public have expressed concern about the impact of AI on jobs – half (54%) believe more jobs will be lost than gained as a result of automation within the next 15 years.74 Companies need to consider what they are bringing to people as both citizens and consumers. Though individuals may want faster, cheaper, automated services at the point of purchase, how do businesses, as employers, offer a narrative that benefits society? Amazon’s success has shown that worry about high street shops closing is not enough to stop consumers using faster, cheaper services – but at what point does its impact start to provoke a backlash, and where are the public’s ‘red lines’?

Amara’s Law75 suggests that we tend to overestimate the effect of technology in the short term, but underestimate its effect in the long term. This makes it difficult for the public to imagine what the future will look like and, combined with the pace of technological development, very difficult to future-proof legislation and governance in these areas. Knowing what the public see as ‘trustworthy’ and what they see as ‘creepy’ – both now and in the future – will be important. We will be following it closely.

“In Mexico, just one in ten people think the police are trustworthy. This rises to eight in ten in China”

Trust: The Truth?

We decided to write this report because we wanted to test whether the prevailing narrative matched the data. The ‘truth about trust’ is that trust is complex and takes many forms (many of which are not in crisis or decline). Without some degree of trust, society simply would not function…