
I‑spy

The billion-dollar business of surveillance advertising to kids


Today, one in three internet users is a child, but they are using a digital environment that was not designed with them in mind. Unless we take active measures to limit it, our everyday activity on the Web, as well as that of our children, is recorded and tracked. Large multinational companies buy and sell this data to build detailed profiles that are used to target advertising.

Over the course of just 25 years, online advertising has evolved from a niche activity into a preeminent business model of the digital economy. Alphabet, the parent company of Google and YouTube, generated almost 84% of its 2019 revenue, around $135bn, from online adverts, while Facebook generated over 98.5% of its 2019 revenue that way, almost $70bn. Despite being little more than a decade old, so-called surveillance advertising – targeted advertising that uses personal data collected by websites and platforms – has become the primary way many of these major digital companies monetise adverts.

This report explores the legitimate concerns around surveillance advertising and its reliance on large-scale data collection, profiling, and the sharing of children’s personal information. Children have long been recognised as particularly vulnerable to the power of advertising, and the advent of a new way to target individuals with specific adverts based on their interests or personality increases this vulnerability. Children are more susceptible to the pressures of marketing, less likely to recognise paid-for content, and less likely than adults to understand how their data is used for these purposes.

The online advertising industry, platforms, and tech giants claim that surveillance advertising enables free internet browsing, rewards publishers for creating content, and enables advertisers to promote their products or services. This sounds like a win-win situation for all involved. But in truth, individuals, publishers, and even advertisers themselves are all, to a greater or lesser extent, losing out in terms of their privacy, revenue, or autonomy (or some combination thereof).

Surveillance advertising is demonstrably damaging social cohesion for children and adults alike, helping disinformation, clickbait, discrimination, and bias to survive and thrive. It makes disinformation websites far more economically viable than they would be under other modes of advertising. Jake Dubbins, co-chair of the Conscious Advertising Network, noted that advertisers have “helped fund the misinformation that stoked fires in the US Capitol”, while NewsGuard found that over 4,000 brands – including, in some cases, major pharmaceutical companies – bought ads on misinformation websites “publishing COVID-19 myths”.

Surveillance advertising also raises questions of legality. This includes how data is collected, acquired, or bought – especially where children have existing legal protection – as well as the aggregation of this data into profiles that are broadcast over advertisement auction networks. This is not surprising. The model pre-dates modern privacy legislation, like the General Data Protection Regulation (GDPR), and was therefore designed to work in a much looser regulatory environment.

Surveillance advertising is also failing to add proportionate value for advertisers themselves. A recent report by PricewaterhouseCoopers concluded that, when considering the complexity of the online advertising ecosystem and the amount of money it pockets, “these challenges and complexities do not serve the principal interests of advertisers or publishers”. They found that publishers forgo 49p of every £1 spent on online adverts in favour of online advertising intermediaries, who each take a thin slice of the pie. In addition, major platforms, such as Facebook and LinkedIn, have been caught defrauding advertisers by providing misleading metrics that boosted the perceived impact of the adverts placed.

The dubious legality of surveillance advertising, along with the harm it causes, especially to children, as well as its failure to even support advertisers’ and publishers’ revenue, means that the current system is not fit for purpose. In this report, we propose three recommendations to address these issues, each set at differing levels of ambition and effectiveness.

As a first step, policymakers could require platforms to use the behavioural data they have already collected to identify likely child users and ensure those users are not served surveillance adverts. Any children identified on platforms with age restrictions would have had their data collected illegally; they would need to be compensated and have their data deleted. However, this mechanism is likely to prove inadequate on its own, because it would legitimise ongoing data collection by tech companies and ultimately undermine hard-won data rights, where they exist.

A second proposal, which would go further in addressing the issues of surveillance advertising, is to introduce a legal responsibility for companies to act as information fiduciaries. This obligation would require companies to owe an active duty of care to data subjects. The basic concept is that when we give our personal information to an online company in order to get a service, that company should have a duty to do us no harm and to exercise care when using our information. Although it offers only partial protection from surveillance advertising, if implemented well it could rebalance the power dynamic between a platform and its users and play a large part in nurturing new dynamics in the digital economy.

Finally, the most ambitious and effective solution would be to outlaw the practice of surveillance advertising entirely. This would not ban advertising itself, or the ability of websites to monetise their visitors’ attention by showing them adverts. All that would change is that adverts would no longer be targeted using a user’s personal data; instead they would be targeted contextually, based on the webpage or platform itself and the characteristics of its likely users.

Drafting new legislation would be the best way to achieve the ban. The new legislation could specify limits on what information website owners seeking to have adverts placed on their sites are permitted to send out. It would prevent any personal information being sent to the real-time bidding (RTB) network (or other system) that provides an advert. We propose that nothing personally identifiable should be sent, with permitted data limited to a new ‘green list’. In addition, to cover surveillance advertising that does not take place through online auctions, the legislation would also need to ban website owners (eg Facebook) from selling advertisement space on their sites using the user profiles they have built up from personal data.
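To illustrate how a ‘green list’ might work in practice, the sketch below shows a hypothetical filter applied to an advertising bid request before it is sent to an ad exchange. The field names and the list itself are illustrative assumptions for this report, not drawn from any existing ad-tech standard: only contextual information about the page and device survives, while identifiers and profile data are stripped out.

# Illustrative sketch only: hypothetical field names, not an existing ad-tech API.
# A 'green list' of contextual, non-personal fields that may be sent to an ad exchange.
GREEN_LIST = {
    "page_url",       # the page the advert will appear on
    "page_topic",     # e.g. "recipes" - describes the content, not the reader
    "page_language",
    "device_type",    # phone / tablet / desktop, needed to size the advert
}

def to_contextual_request(raw_request: dict) -> dict:
    """Keep only green-listed contextual fields; drop anything that could
    identify or profile the individual user (IDs, location, interest segments)."""
    return {k: v for k, v in raw_request.items() if k in GREEN_LIST}

# Example: a surveillance-style bid request reduced to its contextual fields.
raw = {
    "page_url": "https://example.com/recipes/pancakes",
    "page_topic": "recipes",
    "page_language": "en",
    "device_type": "phone",
    "user_id": "abc-123",                        # persistent identifier - not allowed
    "precise_location": "51.5,-0.1",             # not allowed
    "interest_segments": ["parenting", "toys"],  # profile data - not allowed
}
print(to_contextual_request(raw))
# {'page_url': 'https://example.com/recipes/pancakes', 'page_topic': 'recipes',
#  'page_language': 'en', 'device_type': 'phone'}

Under this kind of rule, an advertiser could still choose to appear on recipe pages viewed on phones, but could no longer follow a particular child around the web based on a behavioural profile.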

