
Interview: Don’t Fear the Machines, Fear the People Running Them

What Happens When You Take The ‘Human’ Out Of Human Rights?

The five-week wait for his first Universal Credit payment forced dad of three “Zach L.”, a sales assistant, into debt. © 2020 Samer Muscati/Human Rights Watch

Governments worldwide increasingly use automation to deliver social welfare programs. They claim automation helps speed up the delivery of welfare support, but losing human input can spell danger for people’s livelihoods. As new research shows, badly designed algorithms in the UK are miscalculating the financial support people are entitled to, plunging people – even those with jobs – into debt and poverty. Amos Toh spoke to Stephanie Hancock about the threat that automation poses to human rights.

Your research found that people are being hurt by a system designed to help them. How come?

In 2013 the UK government began rolling out a revamped benefits system called Universal Credit, which gives a single monthly payment to support unemployed people or people with low incomes. But the way the government has automated the system is hurting people.

The government uses an algorithm – a sequence of rules that tells a computer how to perform certain tasks – to calculate a person’s Universal Credit. A key rule is that people with higher incomes should receive less benefit. But the data fed to the algorithm reflects only the wages people receive in a given calendar month, not how frequently they are paid.

This creates absurd situations where the algorithm assumes that if a person receives two separate paychecks in the same calendar month, their earnings have suddenly doubled. The amount of benefits people receive fluctuates wildly as a result. Some months people get less than half of what they are entitled to, leaving them hundreds of pounds short and struggling to pay rent or buy food.
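To make the failure mode concrete, here is a minimal sketch in Python of how a rigid calendar-month assessment can misread someone’s income. The allowance and taper figures below are illustrative placeholders, not the actual DWP rates, and the function is a simplification for illustration, not the government’s code:

```python
from datetime import date

# Illustrative placeholders, NOT the real DWP rates.
STANDARD_ALLOWANCE = 1000.0  # hypothetical maximum monthly award, in pounds
TAPER_RATE = 0.63            # benefit withdrawn per pound earned

def monthly_award(paydays, wage_per_paycheck, year, month):
    """Award for one assessment period: only pay *received* within the
    calendar month counts, regardless of the period the wages cover."""
    earnings = sum(wage_per_paycheck
                   for d in paydays if (d.year, d.month) == (year, month))
    return max(0.0, STANDARD_ALLOWANCE - TAPER_RATE * earnings)

# Someone paid 600 pounds every four weeks: usually one payday per month,
# but every few months two paydays land in the same calendar month.
paydays = [date(2019, 9, 6), date(2019, 10, 4),
           date(2019, 11, 1), date(2019, 11, 29)]

print(monthly_award(paydays, 600, 2019, 10))  # one paycheck  -> 622.0
print(monthly_award(paydays, 600, 2019, 11))  # two paychecks -> 244.0
```

The wage is perfectly steady, yet the award collapses in any month where two paydays happen to fall, because the assessment window is a calendar month rather than a pay cycle.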

This reflects a bigger problem: the algorithm is not sensitive to people’s financial reality because it uses too rigid a framework to assess their needs. This can really hurt people in irregular or low-paid jobs.  

The five-week wait for his first Universal Credit payment forced dad of three “Zach L.”, a sales assistant, into debt. The government’s flawed calculation model also caused his payment to drop from £1,024 in September 2019 to £471 in October. Struggling to support his partner and three children, he has been forced to rely on a food bank. But the food parcels are ill-suited to their six-month-old’s needs. “They can’t give us nappies, which are essential for the baby. They can’t give us wipes, which are essential for the baby. The baby just started weaning, which means he needs baby food, which they can’t provide.” © 2020 Samer Muscati/Human Rights Watch

So the most vulnerable people are being hurt?

Yes. Unforeseen financial shocks triggered by the system are more likely to affect people on low incomes who have little in savings, people supporting large families, and single-parent households, which are overwhelmingly women-led.

Which stories stick in your mind?

The first person I interviewed was Zach, a father of three. He didn’t have money for the bus fare, so he had walked for 40 minutes in the freezing cold to a food bank to collect groceries for his family. He worked as a salesperson at a computer store, but his wages weren’t enough to sustain his family. After he applied for Universal Credit, he endured a five-week wait for his first payment. As if that wasn’t enough, his payments then started fluctuating wildly. That pushed him into debt, and bailiffs started showing up at the family home threatening to take away his things.

Is the government going to fix the algorithm?

For years, the government resisted changing the algorithm, claiming this would require expensive human intervention and detract from its automation objective. But things got so bad that a UK appeal court stepped in, ordering the government to fix the problems the algorithm is causing for people receiving regular monthly salaries.

This is encouraging, because it shows that courts are willing to hold governments accountable when their algorithms hurt people. But the ruling doesn’t address the plight of people paid more frequently or with irregular earnings – that would require more human involvement in the decision-making process. And while the government has accepted the ruling, it has yet to fix the problem.   

Would more human intervention improve things?

Yes, but it needs to be the right kind. One thing the government did well during the pandemic was to roll out a phone helpline for people moving onto Universal Credit, staffed by real people who helped them troubleshoot their claims.

The government also temporarily lifted some draconian rules, such as sanctions that cut people’s benefits for not looking hard enough for a job, and the clawing back of government debts from people’s benefit payments. But it has now restarted both – a worrying sign that the government is trying to get back to business as usual.

Is it the algorithm that is bad, or the data?

The most obvious flaw here is the data. The government is trying to do something very nuanced with incomplete data: interpret how people’s needs change. The data is not fit for purpose, causing the algorithm to misinterpret people’s financial reality.

But the real problem here is the government’s misguided thinking that the benefits system should mirror a world where people have steady, monthly-paid jobs. The technology it has built is failing the many people who don’t conform to this narrow ideal. On top of that, the government wants to promote efficiency and cut costs, so when the system makes mistakes, the burden inevitably falls on those least able to bear it.

In 2018, dad of three “George S.” lost his job at a tin company and started claiming Universal Credit. The Department for Work and Pensions, the government’s social security ministry, sanctioned him for failing to submit proof online that he is looking for a job – a requirement he has struggled with because he cannot read or write. During the months he was sanctioned, he says, his Universal Credit payments were cut in half. “The sanctions affected the children,” said George. “I can’t buy clothes for them or food. That’s why I come to the food bank.” © 2020 Samer Muscati/Human Rights Watch

Where else has automated decision-making gone wrong?

This summer there was another algorithm scandal, when A-level (high school) students had their exams canceled due to Covid. Thousands of students had their grades unfairly changed by an algorithm. The grading algorithm was deliberately designed to consider the historical performance of the school the pupil attended, even though this had nothing to do with the ability of the person actually being graded. That’s the problem – not that the algorithm went off and did its own thing.
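As a toy illustration of that design choice, consider a moderation step that pulls each pupil’s grade toward their school’s historical average. This is not Ofqual’s actual model – the real one was more complex, and the blending weight here is invented – but it shares the flaw being criticized:

```python
def moderated_grade(teacher_grade, school_historic_avg, school_weight=0.5):
    """Blend a pupil's assessed grade with their school's past results.
    Toy example only: the weight is invented, and the real 2020 model
    differed, but both let school history shape an individual's grade."""
    return (1 - school_weight) * teacher_grade + school_weight * school_historic_avg

# The same strong pupil fares very differently depending on where they study:
print(moderated_grade(90, 55))  # historically weaker school   -> 72.5
print(moderated_grade(90, 85))  # historically stronger school -> 87.5
```

The pupil’s own work is identical in both cases; only the school’s past results differ, which is exactly why grades assigned this way had nothing to do with individual ability.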

Another problem is facial recognition surveillance, which is spreading in the UK and many other countries. Facial recognition algorithms are typically trained on datasets dominated by white faces, so it is no accident that they are more likely to falsely identify people of color. But even if we eliminate this bias, facial recognition is still a powerful surveillance tool that can fuel racially biased policing.

People worry that automation will ‘go rogue’ and start harming people. Will it?

Artificial intelligence ‘goes rogue’ only if we set it down that path. But algorithms that work exactly how they are supposed to aren’t benign either; they can still be used to abuse human rights. That’s why protecting rights, such as the right to social security, is a really important starting point for developing automated systems. People whose rights are most affected by these technologies should be involved from the beginning. And the people designing these systems should change course when things go wrong.

In May 2019, “John S.” was eager to work again after a period of illness, and found a job at a garden center in Newcastle. But he ran into financial problems almost as soon as he started work. The government’s flawed method for calculating Universal Credit caused unpredictable fluctuations in his payments. “I am behind on my rent, my Council Tax, my water [bill] is in arrears,” John said. “I am £2,500 in debt because of Universal Credit.” © 2020 Samer Muscati/Human Rights Watch

Has the UK government been doing this?

Not nearly enough. The government has consistently underplayed people’s concerns about Universal Credit. I was told that when staff from the Department for Work and Pensions, which runs Universal Credit, met directly with people floundering on the benefit, they were visibly shocked. It suggests the government hasn’t done enough to understand people’s struggles and needs.

Some people are scared of large systems being run by machines. Should they be?

They should not be scared of the machines themselves, but of the people running the machines. People talk about creating robots that can think on their own, but the reality is far more mundane. And scarier.

Take the Universal Credit algorithm, which is creating terrible hardship even though it is simply doing what it was programmed to do. When done badly, automation can scale up flawed policy decisions and cause incredible harm to thousands of people. Worse still are the people in government who don’t want to fix problems they know exist.

Do governments know that algorithms can be dangerous if not designed properly? 

It’s their job to know. If they don’t, they need to find people who do. I don’t mean just computer science experts: you also need people directly affected by benefits decisions and people familiar with human rights. If governments have the resources to develop this technology, they also have the resources to design the system to protect people’s rights. Knowing that their system is hurting people, and refusing to fix it, is a political choice.

