The ‘digital poorhouse’: coders need a Hippocratic oath to protect disadvantaged people

In America, there is the “Match.com” of homeless services and a tool that predicts if a child may be at risk of abuse.

Governments around the world are increasingly turning to automation to control how people interact with government services.

In Australia, for example, the Department of Human Services uses a data matching computer program that compares income reported to Centrelink with information held by other agencies, such as the Australian Tax Office. If a discrepancy is detected, a notice is sent automatically to the recipient.
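The basic mechanics of that kind of data matching can be sketched in a few lines. This is an illustrative toy only: the record names, the tolerance and the matching rule are invented for the example and are not Centrelink's actual logic.

```python
# Toy sketch of income data matching: compare the income a person reported
# to one agency against a second agency's records, and flag discrepancies
# large enough to trigger an automatic notice. All values are invented.
TOLERANCE = 100  # ignore differences under $100 (illustrative threshold)

def find_discrepancies(reported, tax_office):
    """Both arguments map a person ID to annual income in dollars."""
    notices = []
    for person_id, declared in reported.items():
        actual = tax_office.get(person_id)
        if actual is not None and abs(actual - declared) > TOLERANCE:
            notices.append((person_id, declared, actual))
    return notices

reported = {"A100": 18000, "A101": 25000}
tax_office = {"A100": 18050, "A101": 31000}
print(find_discrepancies(reported, tax_office))
# Only A101's discrepancy exceeds the tolerance, so only A101 is flagged.
```

The point of the sketch is that the notice is generated with no human judgment in the loop: whatever falls outside the tolerance is flagged, regardless of why.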

Virginia Eubanks, an associate professor of political science at the University at Albany, State University of New York, wanted to know how the impact of automated systems looked on the ground in the United States.

For her new book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, she travelled across America to talk to people whose social support is determined by the results of an equation.

Based on her research, she said coders needed to have their own version of the Hippocratic oath to ensure their tools do not harm disadvantaged people.

Who gets to eat?

While automated tools are often introduced in the name of efficiency, they can make life more complicated for those they affect.

In 2006, the state of Indiana contracted IBM to automate eligibility for welfare benefits, she explains in the book, taking much of the discretion away from caseworkers.

As a result, Dr Eubanks said, “one million people” in need were denied assistance.

“The experiment went so poorly that three years into a 10-year contract, the state of Indiana actually broke their contract with a coalition of high-tech companies,” she said.

In Los Angeles, the Coordinated Entry System aims to rank homeless people according to their vulnerability and match them with housing.

While this is a good idea in principle, Dr Eubanks said, the lived experience of those it affects can be very different.

She found this leaves people feeling watched and vulnerable — that they must trade their most intimate information about their health and habits for housing, for food or to keep their family intact.

Whose data matters?

This conflict was particularly evident for her in the case of Angel and Patrick.

They are two parents in Allegheny County, Pennsylvania, which is experimenting with a predictive-risk model called the Allegheny Family Screening Tool (AFST).

This tool aims to identify, using county data, which children might be victims of abuse or neglect in the future. But it doesn’t include everyone’s data.

According to Dr Eubanks, while the data may include government information like who is on probation or who accessed welfare services, it does not always account for details that might affect richer families.

For example, who relies on a nanny, she said, or who is undertaking addiction counselling funded by private health insurance.

Erin Dalton, deputy director of Allegheny County’s Office of Data Analysis, Research and Evaluation, told Dr Eubanks in a piece published in Wired, “We definitely oversample the poor”.

“All of the data systems we have are biased. We still think this data can be helpful in protecting kids.”
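That sampling bias can be made concrete with a toy example. Nothing below reflects the AFST's actual variables or weights; the events and the scoring rule are invented purely to illustrate how a score built only from public-agency records behaves.

```python
# Toy illustration of the bias described above: a score computed only from
# county data "sees" a family that uses public services, but is blind to
# identical behaviour paid for privately. Events and weights are invented.
PUBLIC_SYSTEMS = {"welfare", "probation", "public_addiction_counselling"}

def toy_risk_score(family_events):
    """Count only events that leave a trace in public-agency records."""
    return sum(1 for event in family_events if event in PUBLIC_SYSTEMS)

poorer_family = ["welfare", "public_addiction_counselling"]
richer_family = ["private_addiction_counselling", "nanny"]  # no county record

print(toy_risk_score(poorer_family))  # 2
print(toy_risk_score(richer_family))  # 0 — same behaviour, invisible to the model
```

Both families in the sketch are seeking help with the same problem; only the one that sought it through public systems accumulates a score.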

But for Angel and Patrick, who have had interactions with local child protection authorities, it can now feel like every interaction they have with government services (even one that is positive, such as volunteering) counts towards an AFST score that could see their daughter removed.

“They’re doing this algebra, this math in their head, which says if I interact with this service, will this raise my score?” Dr Eubanks explained.

“And one of my fears about this system is that the danger of being scored high risk to your children will keep people from accessing the resources they deserve and need to keep families safe.”


Can we build a better algorithm?

One commonly cited solution to algorithms that discriminate based on race or class is to “audit” them and ensure they are operating ethically.

This may not be the only answer, according to Dr Eubanks.

Take Allegheny County, which by some accounts did everything right: “Their design processes were participatory, they communicated with the community all the way through the process, they’ve been incredibly transparent about the model itself, they’ve released all the variables,” she explained.

And yet, for Angel and Patrick, the lived experience was no easier. Not when the fate of their family could be affected by “algebra” they may not fully understand.

According to Dr Lachlan McCalman, a principal research engineer at Data61 who has written about “ethical algorithms”, building an automated system that is accurate overall is one thing. But what is accurate is not necessarily empathetic or fair.

This is particularly important when it comes to the type of data fed into these systems.

In his view, the “naïve implementation” of some of these tools demonstrates what can go wrong when you consider predictive performance averaged across the population.

Without careful adjustment, the results may not properly address the needs of sub-populations (all too often the disadvantaged) who are not well represented in the data.

“There’s more to doing the right thing than doing the right thing for the average person,” Dr McCalman said.
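The arithmetic behind that point is simple. The numbers below are invented to show how an average over the whole population can hide very poor performance on a small subgroup:

```python
# Toy numbers showing how population-averaged accuracy can mask a model
# that performs no better than chance on a small subgroup.
majority = {"size": 950, "accuracy": 0.95}
minority = {"size": 50, "accuracy": 0.50}

total = majority["size"] + minority["size"]
overall = (majority["size"] * majority["accuracy"]
           + minority["size"] * minority["accuracy"]) / total

print(overall)                # about 0.93 — looks fine on average
print(minority["accuracy"])   # 0.5 — a coin flip for the subgroup
```

An audit that only checks the headline figure would pass this model, even though it fails precisely the people the tool is most likely to affect.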

Do no harm with the data

Dr Eubanks proposed a version of the Hippocratic oath — doctors’ pledge to “do no harm” — for data scientists.

“Fundamentally, it boils down to two questions that designers should ask themselves about their systems,” she explained.

Dr McCalman said these principles apply to policy makers as well as data scientists.

“I think they apply to anyone building a system that makes decisions about (poor and working) people’s lives, whether it uses hard-coded business rules, policies for humans to enact, or some kind of machine learning algorithm,” he said.

But ultimately, can we build solutions for a problem we do not fully understand?

For Dr Eubanks, these tools are shaped by our cultural misunderstandings about poverty — that it is an individual failing, that it only happens to a minority or to potentially pathological people.

“I feel like people are asking me for a 10-point plan about how to build more just technologies, and I think the question is more frustrating and deeper than that,” Dr Eubanks said.

“Central to changing the ways these tools work is for us to get our souls right around poverty.”


Posted on February 11, 2018, in ConspiracyOz Posts.
