
One year after the Cambridge Analytica scandal, we need to beef up responsibility for algorithms

We have choices about the digital future that we want.


A year ago this month, a previously unknown company called Cambridge Analytica shot to the front pages of the newspapers. Journalists revealed that Facebook had allowed the company to access users’ personal data and build detailed, intimate profiles of millions of people, with the aim of influencing how they voted. Facebook had been allowing app developers to access the personal data not only of their own users, but also of those users’ friends, including their friends’ private messages. Cambridge Analytica had used this data to build its profiles without people’s consent.

For those who had been paying attention, this was nothing new. Many of the revelations had been published 16 months earlier, in relation to Cambridge Analytica’s work for US presidential hopeful Ted Cruz, and many of the concerns had been discussed in tech circles since 2011. But something changed in 2018. Public awareness and condemnation reached a tipping point. It became obvious that Facebook’s practices weren’t the exception: they were part of a larger system of mass data collection operating without any oversight.

One year on, Facebook has altered some of its most damaging data policies, such as access to friends’ data and private messages, yet its fundamental model has barely changed. Facebook’s motivation is to gather and collate as much data about people as possible, including in ‘shadow profiles’ that users can never access. This data is collated into the digital profiles that algorithms use to target campaigns at us, whether for advertising or elections. Facebook’s profits are generated by the current system of mass data gathering and sharing; it was naive of us to assume that this corporate giant was going to change its model.

The scandal was a perfect storm of many of the issues that we’ve been researching at the New Economics Foundation: data sharing, digital profiles and algorithmic accountability. Facebook’s lax data-sharing policies show why far stricter restrictions should be placed on the onward sale and sharing of personal data.

What we see when we log in to Facebook is mediated by an algorithm: an automated decision system that determines everything from which news stories we see to which friends we are recommended. But beyond filtering information on the internet, we are moving towards a society in which access to both public and private services is increasingly mediated by algorithms drawing on digital profiles.

Each of us currently has thousands of digital profiles that we cannot see, collated from the data trail we’ve left online. Cambridge Analytica built some to refine their electioneering. Acxiom, one of the largest data brokers on the planet, has admitted that about 30% of the data held in each profile is incorrect. Given the increasing use of digital profiles in the public and private sectors, decisions based on incorrect data have dangerous implications.

Today, we have published a report called Digital Self-Control, arguing that we need a beefed-up system of algorithmic accountability. This means setting legal standards for transparency over who is responsible for an algorithm and how accurate it is, and guaranteeing a right to an explanation of the decisions that algorithm makes.

But more fundamentally, we need new systems to protect our privacy and to prove our identity. We want the government to set up a Digital Passport and a National Data Store, both run by an independent body. The Digital Passport would be a piece of technology that would allow us to prove our identities online without giving away personal information. The National Data Store would securely host all the data associated with us online; with a simple app or website, we would be able to access and correct our personal data.
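To give a flavour of how a Digital Passport might work, here is a minimal sketch in Python. It is purely illustrative and not drawn from our report: the issuer key, the ‘over_18’ attribute and the use of an HMAC (standing in for a real digital-signature or zero-knowledge scheme) are all assumptions. The point is that a verifier can check a single signed claim without ever seeing the name or date of birth behind it.

```python
import hashlib
import hmac
import json

# Toy issuer key. Illustrative only: a real Digital Passport would use
# asymmetric signatures, not a secret shared with verifiers.
ISSUER_KEY = b"issuer-secret-key"

def issue_attestation(attribute: str, value: bool) -> dict:
    """The issuer (e.g. an identity authority) signs one minimal claim,
    such as 'over_18: True', derived from data it already holds."""
    claim = json.dumps({attribute: value}, sort_keys=True)
    tag = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_attestation(attestation: dict) -> bool:
    """The verifier checks the signature on the claim alone; it never
    sees the underlying name, address or date of birth."""
    expected = hmac.new(ISSUER_KEY, attestation["claim"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["tag"])

# The user presents only the claim they choose to disclose.
token = issue_attestation("over_18", True)
print(verify_attestation(token))  # True
```

In a real deployment the issuer would sign with a private key and verifiers would check with the matching public key, so no secret would need to be shared; this toy version only shows the shape of the interaction: minimal claims, cryptographic verification, no raw personal data changing hands.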

We have choices about the digital future that we want. If we don’t begin designing new ways of running the digital economy, we will find our data held by largely unaccountable corporations that maintain huge, potentially inaccurate profiles of us. The tech giants are the ones profiting from the current system, and they have no incentive to change it. We have to change the rules of the digital economy to make it work for everyone.
