What’s your score?

How discriminatory algorithms control access and opportunity


In China, a new system using data from public and private sources aims to score every citizen according to their ‘trustworthiness’ by 2020. Cheating on a video game could lower your score. Buying a lot of nappies could give you extra points. This number determines whether you can buy plane tickets, how long you wait to see a doctor, the cost of your electricity bills and your visibility on online dating sites.

Across Europe and the US, people are shocked by this dystopian IT-backed authoritarianism. But citizens of these countries are already being scored by systems based on the same logic – they just haven’t noticed.

Public debate in the UK on ‘datafication’ has been overwhelmingly concerned with individual privacy and the protection of personal data. Understandably, a lot of people don’t really care, feeling they have ‘nothing to hide’. We should care. But for this to happen, public debate needs to shift focus to the ways our digital footprint is used to produce scoring systems that shape our lives.

Similar to the Chinese ‘social score’, these algorithms rank and rate every member of society to determine what we get access to and the conditions of that access: from sorting job applications and allocating social services to deciding who sees advertisements for housing and products. They decide whether you are a desirable employee, reliable tenant, valuable customer – or a “deadbeat, shirker, menace, and waste of time”.

The current trend in the corporate sharing economy has been to promote the democratising potential of access to goods and services over ownership. The actual impact of this shift has been to make conditional access to private property the norm, amplifying wealth inequalities. At the same time, in the public sector, an obsession with ‘innovative’ automated decision-making has reframed access to rights and resources (like housing or healthcare) as an issue of efficient allocation rather than fairness or justice.

Welcome to the scored society

Credit scoring has always ranked and rated citizens based on their consumer behaviour. But ‘fintech’ (financial technology) is now pioneering the use of ‘alternative data’ to reflect individuals’ ‘true’ personality. With tools like Tala, whether you organise your phone contacts by first and last name, or call your mother regularly, will generate a score that could dictate your eligibility for a loan. Other startups, like the insurance company Kin, are looking to use Internet of Things devices in people’s homes to price their home insurance, for example using water sensors to detect leaks.
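
To make this concrete, here is a minimal, purely illustrative sketch of how a scorer built on ‘alternative data’ might work. The feature names, weights and threshold below are invented for this post; they are not Tala’s actual model.

```python
# Hypothetical sketch of an 'alternative data' credit scorer.
# Feature names, weights and the threshold are invented for illustration.

def alternative_data_score(applicant: dict) -> float:
    """Combine behavioural phone signals into a single number."""
    weights = {
        "contacts_use_full_names": 15.0,    # tidy address book read as 'conscientious'
        "calls_family_regularly": 10.0,     # regular calls home read as 'stable'
        "battery_often_below_10pct": -5.0,  # low battery read as 'disorganised'
    }
    base = 50.0  # arbitrary starting score
    return base + sum(
        weight for feature, weight in weights.items()
        if applicant.get(feature, False)
    )

applicant = {
    "contacts_use_full_names": True,
    "calls_family_regularly": False,
    "battery_often_below_10pct": True,
}
score = alternative_data_score(applicant)
print(score, "eligible" if score >= 55 else "rejected")  # 60.0 eligible
```

The arithmetic is trivial; the problem is that none of these inputs has any demonstrated causal link to repaying a loan.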

The same systems are creeping into the rental sector, with the advent of ‘proptech’. Desiree Fields has shown how the financialisation of the rental sector has resulted in new software platforms for private equity firms like Blackstone to manage massive portfolios of geographically dispersed homes. These technologies have gamified the tenant’s experience of renting by automating everything from maintenance requests and rent payments to evictions.

INCENTCO offers a platform through which tenants who consistently act in ways aligned with landlords’ interests (such as paying rent on time) can build up enough points to get rewards, like new appliances, smart home technologies and general home upgrades. This automated system has worrying ramifications: if you are a single mother working on a zero-hours contract whose shift is cancelled and who unexpectedly has to pay for her kid to go to the dentist – tough. Now your score is too low to get a new fridge.
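
A minimal sketch of how such a points ledger might behave (the point values and penalty are invented, not INCENTCO’s published rules) shows how a single slip can wipe out months of compliance:

```python
# Hypothetical tenant rewards ledger; all point values are invented.

class TenantLedger:
    def __init__(self) -> None:
        self.points = 0

    def rent_paid_on_time(self) -> None:
        self.points += 10  # small, steady reward for compliance

    def rent_late(self) -> None:
        self.points = max(0, self.points - 50)  # one slip erases months of points

    def can_redeem(self, reward_cost: int) -> bool:
        return self.points >= reward_cost

ledger = TenantLedger()
for _ in range(11):        # nearly a year of on-time payments
    ledger.rent_paid_on_time()
ledger.rent_late()         # one cancelled shift, one late payment
print(ledger.points, ledger.can_redeem(100))  # 60 False: no new fridge
```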

A number of new apps such as Canopy have developed similar ‘RentPassport’ features, allowing users to demonstrate their ‘financial prudence’, building up a ‘reliability score’ over years of renting that informs their credit scores.

The power of algorithms to make decisions about our lives is also growing in the public sector. In Automating Inequality, Virginia Eubanks exposes how governments and local authorities are increasingly using digital tools to determine which families most deserve support. This comes as more and more people are living in poverty while fewer resources are allocated to help them under austerity.

In the US, algorithms have replaced nurses in determining how many hours of home care a patient is entitled to. In some places, funding dropped by as much as 42% as a result, and when service users tried to understand why their hours had been cut, the state refused to share the algorithm’s decision-making process. Similar calculations sift through survey data to create a ranking of ‘deservingness’ for housing waiting lists in places like Los Angeles.

And these developments are underway in the UK in a number of sectors. They range from automated screening of housing benefit applications – at a time when 78,000 families are homeless or in temporary accommodation – to algorithmic prediction tools such as Xantura’s ‘Early Help Profiling System’, which aims to determine which children are at risk of abuse. While some councils develop systems in-house using their own data sets, there is also a trend towards partnering with private companies to acquire tools that incorporate other kinds of data.

Experian, who are leading the way in the use of alternative data for credit scores, now also offer to help the public sector “make better decisions”. An investigation by Big Brother Watch in Durham revealed that the police were using Experian’s services to make custody decisions. The company’s ‘Mosaic’ system ranks individuals and households according to crude and offensive stereotypes, from ‘disconnected youth’ (‘avid texters’ with ‘low wages’ and names like ‘Liam’ or ‘Chelsea’) to ‘crowded kaleidoscope’ (‘multicultural’ families in ‘cramped’ flats with names like ‘Abdi’ and ‘Asha’) and ‘penthouse chic’ (young professionals on ‘astronomical salaries’ who drink a lot of champagne).

On top of deciding who gets access to basic public services, algorithms are being developed to decide who gets citizenship. Trump’s ‘extreme vetting initiative’ in the US would use available data to predict a visa applicant’s likelihood of becoming a terrorist versus a contributing member of society.

Opening the black box

Algorithms with no accountability are dividing society up into ‘haves’ and ‘have-nots’. Seemingly innocuous data on location or patterns of behaviour and consumption used to assess an individual’s ‘reliability’ also function as proxies for gender, race and class. The result is that these scores end up amplifying existing social inequalities.
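
A toy simulation (with entirely fabricated data, groups and postcodes) illustrates the proxy effect: a scorer that never sees a protected attribute still penalises one group whenever an ‘innocuous’ input correlates with it.

```python
# Fabricated illustration of proxy discrimination: the scorer never
# sees 'group', but postcode is correlated with it.
import random

random.seed(0)

def make_person() -> tuple[str, str]:
    group = random.choice(["A", "B"])  # protected attribute (e.g. race)
    # Residential segregation: each group clusters in one postcode.
    if group == "B":
        postcode = "N17" if random.random() < 0.9 else "SW3"
    else:
        postcode = "SW3" if random.random() < 0.9 else "N17"
    return group, postcode

def score(postcode: str) -> int:
    # A 'neutral' rule that happens to penalise where group B lives.
    return 40 if postcode == "N17" else 80

people = [make_person() for _ in range(10_000)]
for g in ("A", "B"):
    scores = [score(pc) for grp, pc in people if grp == g]
    print(g, round(sum(scores) / len(scores), 1))
# Group B averages ~44 against ~76 for group A, despite 'group'
# never entering the model.
```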

These scoring systems are often described as ‘black boxes’. It’s almost impossible to find out how they work because they are run by private companies (often delivering services for the state), and therefore their inner workings qualify as ‘trade secrets’. Even when algorithms are made public, their overwhelming complexity and scale often make them almost impossible to understand.

If these algorithms can’t be seen or understood, how can we assign responsibility for harm when they produce discriminatory outcomes? 

This emphasis on ‘smart’ and ‘efficient’ technological solutions to social problems obscures the political choices that produce them. Virginia Eubanks puts it best when she says that “we outsource these inhuman choices to machines because they are too difficult for us… we know there is no ethical way to prioritize one life over the next”.

For more on how algorithms and data are reinforcing inequality, read our report, Controlled by Calculations.
