Blocking the data stalkers

Going beyond GDPR to tackle power in the data economy


Ninety per cent of the world’s data was created in the last two years, and over 2.5 quintillion bytes of data are produced every day.

Whole companies are built around the principle of relentlessly collecting as much data about internet users as possible and monetising it. Our digital selves are now marketable products, and that data is then used to market products back to us. In 2018, almost half of all advertising spend will be online, rising to over 50% by 2020. Two digital giants – Facebook and Google – now control 84% of the market. Both are hugely reliant on ad revenue: advertising accounts for 97% of Facebook’s overall revenue and 88% of Google’s.

When someone clicks a link to a webpage, between the click and the page loading, information about them is compiled and sent out so that advertisers can assess the value of showing them an advert. These are called ‘bid requests’, and they fail to protect personal data against unauthorised access. They can even include sensitive information such as a person’s sexuality or political beliefs. Bid requests on UK users are being sent out at a rate of almost 10 billion per day, or 164 per person per day, and are seen by hundreds if not thousands of advertisers.
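To make the mechanism concrete, the sketch below shows, in simplified Python, the kind of information an OpenRTB-style bid request can carry. The field names and values are hypothetical illustrations, not any exchange’s actual payload.

    # Illustrative sketch only: a simplified, OpenRTB-style bid request,
    # showing the kinds of personal data that can be broadcast to advertisers
    # while a page loads. All field names and values are hypothetical.
    bid_request = {
        "id": "example-request-1234",
        "site": {"page": "https://example-news-site.co.uk/article"},
        "device": {
            "ip": "203.0.113.42",                    # the visitor's IP address
            "ua": "Mozilla/5.0 (Windows NT 10.0)",   # browser details usable for fingerprinting
            "geo": {"lat": 51.5, "lon": -0.1, "city": "London"},
        },
        "user": {
            "id": "cookie-or-advertising-id",        # persistent identifier that follows the user
            "data": [
                # audience 'segments' attached to the user can reveal sensitive traits
                {"segment": [{"name": "interest:political-news"}]}
            ],
        },
    }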

25% of all ad spend is lost to fraud. The ad tech industry potentially exposes every internet user to the non-consensual sharing of their data with thousands of companies, all of which can copy, share and sell the data on again. The now infamous Cambridge Analytica was just one of many companies with access to this stream of personal data.

While the General Data Protection Regulation (GDPR) addresses some privacy issues, it does not address the issue of power in the data economy generally, or in the ad tech sector specifically. GDPR is limited because it focuses too heavily on individual actions, like giving consent or lodging a complaint with the Information Commissioner’s Office. Accountability for tech giants is undermined by allowing data collection companies to rely on justifications such as ‘legitimate interest’ or ‘necessity’. GDPR also fails to protect metadata or inferred data, despite the ability of both to identify individuals, and does not adequately control the on-selling of data between firms.

Recommendations

We recommend going further than GDPR in a number of ways.

We recommend a ban on sending personally identifiable data out to advertising networks. Instead of relying on the sale and re-sale of personal data, when users click on weblinks, bid requests should give advertisers demographic information about the audience of the website. This would allow them to show demographically appropriate advertising, without compromising the privacy of users. Where websites do sell ad space that uses personal data, they should be required to gain explicit consent from individuals in order to do so.
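As a rough illustration of the difference, the sketch below contrasts with the request shown earlier: under this recommendation, only aggregate information about the website’s readership is shared, and nothing that identifies or profiles the individual visitor. The fields and figures are hypothetical and simplified.

    # Illustrative sketch of a bid request under the proposed rules: advertisers
    # receive aggregate demographics about the site's audience, but no device,
    # cookie or user-level fields. All names and figures are hypothetical.
    contextual_bid_request = {
        "id": "example-request-5678",
        "site": {
            "page": "https://example-news-site.co.uk/article",
            "audience": {
                # demographics describe the site's readership as a whole, not the visitor
                "age_ranges": {"18-34": 0.40, "35-54": 0.35, "55+": 0.25},
                "interests": ["current affairs", "economics"],
            },
        },
        # no "device" or "user" objects: nothing tied to the individual
    }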

This proposal would be transformational.

  • It would tackle data leaks, by preventing any personal data from being sent (and therefore potentially compromised) during bid requests.
  • It would reduce the commodification of personal data, by reducing the market for personal data and diminishing the ability of companies to monetise it.
  • It would force tech giants to diversify their business model away from services based on constant surveillance and advertising.
  • It would give power back to websites which spend time producing content and have a dedicated user base.
  • It would fight back against ad fraud, by halting the revenue that can come from fraudulent sites.

We also recommend:

Devices, software, and online interactions should be subject to privacy by default and design. This means they would be automatically set to not collect, share or sell on our personal data. We would then have a series of options and tools which we could use to change this default setting to specify which third parties could gather data on us securely and for what purpose.

In the terms and conditions of any website or service, providers should make it clear exactly what data is being collected and who it may be shared with or sold to before users consent to data collection and sharing. This information should be standardised and consistent. To help with this, reviews of terms and conditions could be crowdsourced, or consent could be given by proxy through trusted individuals or groups, perhaps for a small fee.

Protecting people should be prioritised over corporations’ business models by restricting the use of loopholes, like the GDPR ‘legitimate interest’ justification.

Data sharing and selling between companies without the consent of the data subject should be banned, whether the companies are in the same corporate family (like Google and YouTube) or totally separate. We would bring an end to this by restricting the sale of third-party access to our data to cases where we have given our explicit consent to grant that specific third party access.
