
The Injustice of Algorithms

Prejudice is often coded into software, including tools used by the government.


The difficulty with talking about the technology industry is that it’s increasingly hard to define. A tech company can be a giant data-mining operation turned advertising platform, like Facebook or Google. But it can also be a design-heavy producer of phones, computers, and software. Or perhaps it’s a transportation company pretending it’s just a marketplace, nothing to see here. Maybe it’s Amazon?

What binds all these companies, plenty of other large companies, and a host of startups is murky. Perhaps it’s the fact that they offer services via their websites and that they create software, though the software is rarely the actual product they are selling. These businesses tend to have a headquarters, or at least an outpost, in the Bay Area. Very often, they make sweeping claims to be the capital-F Future. Think of Facebook’s attempt to “make affordable access to basic internet services available to every person in the world” by walling users into products of its choosing, or of communities touting Amazon fulfillment centers as integral to their futures. Yet as David Yanofsky has pointed out, Groupon, Skype, Facebook, and Amazon.com all compete in different markets. This has led him, and several other writers, to declare over the past several years that there’s no such thing as a tech company or the tech industry.

This slipperiness is particularly frustrating because there’s value in holding tech companies to account as a group. For all their differences, the companies mentioned above have each encountered serious problems with inequality and discrimination, both within their organizations and among their users. Sexual harassment and racism have persistently troubled companies from Google to Uber, while Twitter has struggled to deal with intimidating and often hateful speech on its platform. A pair of recent books survey these issues as they play out on social networks and in the wider world, in systems many Americans are not even aware of.

TECHNICALLY WRONG: SEXIST APPS, BIASED ALGORITHMS, AND OTHER THREATS OF TOXIC TECH by Sara Wachter-Boettcher
W. W. Norton & Company, 240 pp., $24.95

The first of these, Sara Wachter-Boettcher’s Technically Wrong, is exactly what its subtitle, “Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech,” might lead you to expect: a primer on several years’ worth of disastrous design failures and cultural problems at tech companies of various stripes, large and small. She focuses heavily, though not exclusively, on consumer-facing companies: Facebook, Twitter, Uber, and the like. In a brisk couple of hundred pages she discusses the failure of Silicon Valley’s giants to diversify their workforces, which remain overwhelmingly white and male, and how this creates products whose full range of users aren’t accounted for. In photo tagging, for instance, the failure to train an algorithm on a sufficiently broad data set meant Google Photos failed in some cases to recognize the faces of black users. Meanwhile, Facebook’s real-name policy—the rule that requires users to go by their legal name rather than a chosen one—has helped, and continues to help, abet abuse, allowing trolls to hound their targets off the platform altogether, which means losing touch with the communities and contacts they’ve built on it.

AUTOMATING INEQUALITY: HOW HIGH-TECH TOOLS PROFILE, POLICE, AND PUNISH THE POOR by Virginia Eubanks
St. Martin’s Press, 272 pp., $26.99

The book is at its best when it shows that the problems that emerge from tech companies aren’t difficult to grasp, or even unique to technology companies or platforms. However, Wachter-Boettcher does sometimes seem to take at face value companies’ efforts to solve their problems, even when they should be questioned further. Most notably, she praises the attempts of Nextdoor—a social networking site for neighbors—to curb racism on its platform. Some of Nextdoor’s users were making posts suggesting that people they’d seen around, including their own neighbors or their neighbors’ friends, were “sketchy” or dangerous based on the color of their skin or what they were wearing. While it’s true that Nextdoor has taken steps to deal with racism, particularly by requiring more information and specificity in reports about crime and safety, the problem has persisted. It would be facile to expect Nextdoor to solve the problem of its users’ racism simply by implementing a user interface change, but it’s perhaps worse to pretend that the problem has gone away when it has not.

Where Technically Wrong works by homing in on some of the companies most often associated with bias and abuse in tech, Virginia Eubanks’s forthcoming Automating Inequality succeeds by almost entirely ignoring them. Eubanks, a writer and professor at SUNY Albany, spent part of the past several years investigating different semi-automated systems that have been used to study the habits of poor Americans in three different states. Indiana’s Family and Social Services Administration, for instance, booted more than a million people off the welfare rolls over three years by interpreting small application mistakes, often beyond applicants’ control, as failures to cooperate. The city of Los Angeles uses a Coordinated Entry System (CES) to manage homelessness. The CES uses an algorithm to compare how vulnerable different homeless people are, and it requires that homeless people allow their information to be used for seven years by more than 100 organizations, including law enforcement.

To call the stories and data Eubanks has collected infuriating feels like an understatement. In and around Pittsburgh, the county Office of Children, Youth and Families uses the Allegheny Family Screening Tool (AFST) to assess the risk of child abuse and neglect through statistical modeling. This leads to the disproportionate targeting of poor families, because the data fed into the tool is simply what’s available, and that often comes from the public services and agencies that lower-income families rely upon or have to deal with—public schools, the local housing authority, unemployment services, juvenile probation services, and the county police, to name just a few. The data from the private services used by middle- and upper-class families—private schools, nannies, private mental health and drug treatment services, luxury rehab—simply isn’t available. The AFST also tends to equate signs of poverty—such as being unable to afford a child’s medication, or neighbors complaining about a child playing unsupervised—with signs of risk of abuse, often ultimately creating more work for already beleaguered parents.

Eubanks has been covering this topic for several years, and she and a slew of others have pointed out that marginalized people are often the first to face experiments in assessment and punishment through technological tools. Sometimes these experiments are spontaneous and vigilante, as when neo-Nazi trolls zero in on minorities on Twitter. Sometimes they have government sanction, as when single mothers are stripped of the benefits that are supposed to be a core part of the social safety net. What’s incisive about Automating Inequality is how it underscores the subtle ways technology is used to this end. If you start to talk about algorithms and their dangers with many people in the United States at the moment, you’ll probably end up talking about Facebook, Russia, and the 2016 election. But the grim reality is that quitting Facebook or divesting yourself of some other part of your web presence would remove only a small portion of the sway technological systems have over your life. Law enforcement might still use your friends’ social media accounts to surveil you, running photos through facial recognition. Or a giant system for credit assessment—a system you can’t opt out of—could leak your information in a preventable breach.

Technology is increasingly built into every part of our lives, whether it’s the social media platforms and apps that Wachter-Boettcher discusses, the social services Eubanks outlines, or the vast information systems hosted on Amazon Web Services or Google’s cloud. Technically Wrong and Automating Inequality, as well as other books like them, are helpful not because they bring us any closer to pinning down the technology industry, but because they testify to just how ubiquitous it has become. It’s not sufficient to think of technology as an industry. It needs to be approached as a kind of infrastructure flowing through many industries, and through the public sector, with all that entails. The challenge now, these books propose, is figuring out how to make the tools and systems around us more equitable and democratic.