Digital self-control
Algorithms, accountability, and our digital selves
25 March 2019
Algorithms are not new, but thanks to the digital revolution, they’re becoming part of almost every aspect of our lives.
They’re indispensable online, where huge volumes of information need sorting to make the Internet the valuable resource it is today. As the digital economy has grown, the reach of algorithms has extended. Today they’re responsible for almost 40% of stock trades in the UK. They fly planes for over 95% of the time those planes are in the air. And they may soon be driving our cars. Algorithms are also expanding into new areas, helping to decide whether to offer an applicant a job interview, whether an offender is likely to reoffend, and what social care provision a service user needs. Yet despite the technological veneer of objectivity around their decisions, algorithms, and the data collection that powers them, are designed by people and shaped by human decisions.
We’re moving towards a society where access to both public and private services is mediated through algorithms. Algorithms are now entering increasingly controversial areas and making decisions with real implications for people’s lives.
These algorithms analyse vast amounts of data about us to generate a score, which then decides whether we can access a good or service. In the public sector, tightening budgets under austerity have driven the adoption of decision-making algorithms to save on staff costs and to help decide how funds and services are allocated.
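To make the scoring model concrete, here is a minimal, hypothetical sketch in Python: a weighted score over profile attributes is compared against a threshold to produce an allow/deny decision. The field names, weights, and threshold are illustrative assumptions, not taken from any real system.

```python
# Hypothetical illustration of a score-based gatekeeping decision.
# Field names, weights, and the threshold are invented for this sketch.

PROFILE_WEIGHTS = {
    "payment_history": 0.5,   # fraction of bills paid on time, 0..1
    "income_stability": 0.3,  # normalised 0..1
    "address_tenure": 0.2,    # normalised 0..1
}
APPROVAL_THRESHOLD = 0.6

def score(profile: dict) -> float:
    """Weighted sum of profile attributes, each assumed to be in [0, 1]."""
    return sum(weight * profile.get(field, 0.0)
               for field, weight in PROFILE_WEIGHTS.items())

def decide(profile: dict) -> bool:
    """Grant or deny access to the good or service based on the score."""
    return score(profile) >= APPROVAL_THRESHOLD

# A single wrong attribute in the profile can flip the decision:
applicant = {"payment_history": 0.9, "income_stability": 0.8, "address_tenure": 0.7}
assert decide(applicant)                # score 0.83 -> access granted
applicant["payment_history"] = 0.2     # e.g. an incorrect record from a data broker
assert not decide(applicant)            # score 0.48 -> access denied
```

As the final two lines show, one erroneous field in a profile is enough to change the outcome, which is why the accuracy of the underlying data matters so much.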
There are currently thousands of digital profiles of each of us, collated from the data trails we’ve left online. Acxiom, one of the largest data brokers on the planet, concedes that about 30% of the data held in each profile is incorrect. Given the poor quality of the profiles being built about us and the increasing use of digital profiles in the public and private sectors, incorrect decision-making could have serious ramifications for our lives.
This report finds that as algorithms enter increasingly sensitive areas of our lives, we need meaningful accountability for those who create and deploy algorithmic decision systems, especially where decisions have a significant impact on individuals. We must also ensure that we, as individuals, are not held accountable for things we didn’t do, or for being someone we are not.
Recommendations
1. Establish an accountability standard for algorithms governing access to goods, services, and law enforcement.
This standard would ensure that individuals know when they are interacting with an algorithm.
Algorithm developers should:
- Ensure clear responsibility for algorithmic decision systems, with rules about who is formally and legally responsible for the system.
- Provide details of the accuracy of the system, together with a description of its function and intention and a list of the data inputs used in deploying it.
- Provide a statement highlighting any biases in the system and confirming that it is not discriminatory.
- Ensure the system has a secure and verifiable audit trail (one possible approach is sketched after this list).
- Extend the right to an explanation to any decision involving an algorithm.
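One common way to make an audit trail verifiable is a hash chain, where each log entry commits to the hash of the entry before it, so any tampering breaks every subsequent hash. The sketch below is a minimal illustration of that idea, not a prescribed design; the record fields are invented.

```python
import hashlib
import json

def entry_hash(prev_hash: str, record: dict) -> str:
    """Hash a log record together with the previous entry's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(log: list, record: dict) -> None:
    """Append a record, chaining it to the last entry's hash."""
    prev = log[-1]["hash"] if log else "0" * 64
    log.append({"record": record, "hash": entry_hash(prev, record)})

def verify(log: list) -> bool:
    """Recompute the chain; any altered entry breaks every later hash."""
    prev = "0" * 64
    for entry in log:
        if entry["hash"] != entry_hash(prev, entry["record"]):
            return False
        prev = entry["hash"]
    return True

log: list = []
append(log, {"decision": "deny", "model": "v1.2", "subject": "user-42"})
append(log, {"decision": "allow", "model": "v1.2", "subject": "user-43"})
assert verify(log)
log[0]["record"]["decision"] = "allow"   # retroactive tampering...
assert not verify(log)                   # ...is detected on verification
```

Publishing the latest hash to an independent party would let auditors confirm that no past decision has been silently rewritten.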
2. Create a Digital Passport system and an independently run National Data Store.
To ensure that digital profiles are accurate, and that individuals are not scored incorrectly, we recommend developing a new alternative to privately held profiles, one that gives us ownership of our digital profile while prioritising our privacy.
Government should create a Digital Passport system. This would be an independently governed piece of decentralised infrastructure that allows us to prove our digital identity online. In addition, it should create an independently run National Data Store: a decentralised digital data store for our profiles that individuals can access and control through an easy-to-use app or website. While the state would initiate the system, its structure and architecture would ensure that the state does not have easy access to it, and that its role is restricted to establishing and enforcing the rules and rights needed for the system to work.
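As a rough illustration of how a decentralised identity proof could work, the sketch below has the holder sign a random challenge with a private key kept on their own device, so a service can verify who they are without querying any central profile. This is only one plausible mechanism, assumed for illustration; it uses the Ed25519 primitives from the widely available `cryptography` package.

```python
# A minimal sketch, assuming a challenge-response identity proof.
# Requires the third-party 'cryptography' package.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The holder's keypair: the public key acts as their decentralised
# identifier, and the private key never leaves their device.
holder_key = Ed25519PrivateKey.generate()
holder_id = holder_key.public_key()

# A relying service issues a random challenge; the holder signs it to
# prove control of the identity, with no central database lookup.
challenge = os.urandom(32)
signature = holder_key.sign(challenge)

try:
    holder_id.verify(signature, challenge)
    print("identity proven")
except InvalidSignature:
    print("proof failed")
```

Because verification needs only the holder’s public key and the signed challenge, the state’s role can indeed be limited to setting the rules rather than operating a lookup service.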
We would have direct control over the data, verified attributes, and inferences in our profiles. The definition of personal data would also be extended to include inferences produced in other profiles. We would be able to differentiate the data that we want to share with different types of systems and algorithms. The independent organisation running the National Data Store would also stipulate conditions of access, so that companies, government agencies, and municipalities can tap into this identity system in lieu of privately maintained digital profiles and reputation scores.
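To show what differentiated sharing could look like in practice, here is a hypothetical sketch of a per-purpose consent filter: the data store releases only the profile fields the individual has approved for a given type of system, and shares nothing by default. All field names, purposes, and grants are invented for this example.

```python
# Hypothetical per-purpose consent filter for a personal data store.
# All field names, purposes, and grants below are illustrative.

PROFILE = {
    "name": "A. Example",
    "address": "1 Example Street",
    "income_band": "C",
    "age_over_18": True,
}

# Which fields the individual has agreed to share, per type of system.
CONSENT = {
    "credit_scoring": {"income_band", "age_over_18"},
    "age_verification": {"age_over_18"},
}

def release(purpose: str) -> dict:
    """Return only the fields consented to for this purpose; deny by default."""
    allowed = CONSENT.get(purpose, set())
    return {k: v for k, v in PROFILE.items() if k in allowed}

print(release("age_verification"))  # {'age_over_18': True}
print(release("advertising"))       # {} -- no consent recorded, nothing shared
```

The default-deny design matters: a purpose that has never been granted consent receives an empty result rather than the full profile.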
For companies that choose to continue using their own systems and profiles, algorithmic decisions based on incorrect information or on unverifiable and unreasonable inferences should result in fines for the company deploying the algorithmic decision system and damages for the person affected.