If you’ve been denied job after job, even some of those entry-level, low-wage positions, Cathy O’Neil can explain. If the rates you’re finding for auto insurance are sky high, Cathy O’Neil knows why. If your amazing teacher was just fired or your employer has started charging higher health insurance premiums for not using a Fitbit or the number of cops in your neighborhood is growing at a rate disproportionate to actual crime, allow Cathy O’Neil to introduce you to the opaque, self-reinforcing world of weapons of math destruction (WMDs). If you’re confused about why people on the other side of the political divide hold their views as fiercely as they do, there’s at least one concrete reason, and Cathy O’Neil knows it. If you think using data analyses and mathematics removes pesky human biases, Cathy O’Neil’s conversational “Weapons of Math Destruction” will disabuse you of that notion.
O’Neil has a doctorate in math and, after a stint teaching at Barnard College in New York City and working as a data analyst for a hedge fund, she’s awakened from what she herself admits was a techno-utopian slumber. She hopes her awakening is early enough to stop WMDs, which are already targeting us and completely codifying injustice.
What she means by the term “weapons of math destruction” in her eponymous book is the algorithms and machine-interpreted data sets that are trawling and analyzing every aspect of our lives, from how we make our way through cities and cyber-highways, to our sleep patterns and consumer behaviors. When they can’t accurately measure something, often because using it directly would be illegal, as with race, or because human beings are finicky and there will always be individuals who confound any so-called norms, they’re programmed to use proxies. Thus, we are measured and mysteriously scored based on our Facebook friends’ activities, the ZIP code we live in, and the routes and times we choose to commute.
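To see how little of the actual person such a score can contain, here is a minimal sketch of proxy scoring; it is not taken from the book or from any real lender, and every feature, weight and number in it is invented for illustration.

```python
# Hypothetical proxy scoring: none of these features, weights or rates
# come from a real system; they stand in for the kinds of stand-in
# variables O'Neil describes (where you live, whom you know, how you commute).

# Invented per-ZIP-code default rates, used as a proxy for the applicant.
ZIP_DEFAULT_RATE = {"97201": 0.04, "97203": 0.11, "97266": 0.18}

def risk_score(zip_code: str, friend_default_rate: float,
               commute_start_hour: int) -> float:
    """Score an applicant using only proxies, never their own record."""
    zip_risk = ZIP_DEFAULT_RATE.get(zip_code, 0.10)     # where you live
    social_risk = friend_default_rate                   # who your friends are
    odd_hours = 1.0 if commute_start_hour < 5 else 0.0  # when you commute
    # Arbitrary weights the applicant never sees and cannot contest.
    return 0.5 * zip_risk + 0.4 * social_risk + 0.1 * odd_hours

# Two applicants with identical finances get very different scores,
# purely because of where they live and whom they know.
print(round(risk_score("97201", friend_default_rate=0.02, commute_start_hour=8), 3))  # 0.028
print(round(risk_score("97266", friend_default_rate=0.15, commute_start_hour=4), 3))  # 0.25
```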
Most of us want a fairer and more equitable society; the promise of impartial, data-driven decisions is likely why entities from Facebook to the government to employers have come to rely so heavily on these algorithms. O’Neil encourages us to imagine the 1950s-era banker deciding whom to give a loan to based on his personal knowledge of the applicant’s church attendance, the company they kept and, likely, their race, instead of their assets, spending habits and income. We all want an equal shot at the basic necessities of modern life, such as jobs, insurance and an education. But, as “Weapons of Math Destruction” infuriatingly details, all we’ve done is translate personal biases and blind spots into algorithms that determine more and more of our lives.
Not all data or data systems are bad; O’Neil does discriminate between data analysis and weaponized data sets. But when we treat the results of a computer, which has the capacity neither to be moral nor to tell us why it is sorting people and information the way it is, like an unquestionable god, we punish the poor and insulate the wealthy.
Sometimes the problem is that the algorithm that gave an excellent teacher a low rating, for example, never gets the feedback that this teacher went on to win a Teacher of the Year award at their next position. So its inscrutable and clearly erroneous method of scoring teachers remains unfixed.
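The structural flaw is easy to sketch. The toy model below is my own invention, not O’Neil’s, but it shows the shape of the problem: the code has a place where a correction could arrive, and nothing ever arrives there.

```python
# A hypothetical teacher-scoring model with no feedback channel.
# The formula is invented; the point is structural: the model's mistakes
# are never reported back to it, so they are never corrected.
from dataclasses import dataclass

@dataclass
class TeacherScorer:
    weight_on_test_gains: float = 1.0
    corrections_received: int = 0

    def score(self, student_test_gain: float) -> float:
        # One noisy proxy (a single year of test-score gains) decides the rating.
        return self.weight_on_test_gains * student_test_gain

    def receive_outcome(self, true_quality: float, predicted: float) -> None:
        # In a working system, this is where the model would learn it was wrong.
        self.corrections_received += 1
        self.weight_on_test_gains *= 0.9 if predicted > true_quality else 1.1

scorer = TeacherScorer()
rating = scorer.score(student_test_gain=-0.3)  # excellent teacher, one bad cohort
print(rating)                                  # low score; the teacher is let go
# The teacher wins an award elsewhere, but receive_outcome() is never called:
print(scorer.corrections_received)             # 0 -- the error is invisible to the model
```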
Of course, tweaking models can be just as damaging. Crime-tracking systems with predictive elements, for example, can set up self-fulfilling feedback loops, generating the very reality the models then use to validate themselves. And it’s not as straightforward as just banning or disabling them. WMDs are often in place because they make the companies that use them a lot of money, or because they get desired results. Political campaigns, for instance, employ WMDs to sort people into groups based on how likely they are to vote and, if so, for which candidate, and they can then show entirely different ads to the various segments of people their algorithms produce.
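A toy simulation, again invented for illustration rather than taken from the book, shows how quickly such a loop runs away with itself: two neighborhoods with identical underlying crime rates, a skewed historical record, and patrols allocated wherever past incidents were logged.

```python
# Toy model of a predictive-policing feedback loop. All numbers are made up.
import random

random.seed(0)
TRUE_CRIME_RATE = {"A": 0.10, "B": 0.10}  # both neighborhoods are identical
recorded = {"A": 5, "B": 1}               # but the historical record is skewed
PATROLS = 100

for year in range(5):
    total = sum(recorded.values())
    for hood in recorded:
        # Patrols go where past recorded incidents were highest.
        patrols = round(PATROLS * recorded[hood] / total)
        # Recorded crime is only what those patrols happen to observe.
        recorded[hood] += sum(
            1 for _ in range(patrols) if random.random() < TRUE_CRIME_RATE[hood]
        )
    print(year, recorded)
# Neighborhood A keeps pulling further ahead of B in recorded crime, even
# though the underlying rates are equal: the model validates itself.
```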
Beyond passing out this book to the employers who may have rejected you based on a lack of keywords in your résumé or the results of a personality test (the legality of which is questionable but the use of which is wide and widening), I’m not left with a lot of options. O’Neil proposes some abstract solutions in the form of platitude-esque value statements (“we should rein these WMDs in,” “we must master them so they don’t master us”), but she saves them all for the conclusion. And most of them lack actual, concrete steps that citizens, especially those who are not data scientists, can take to stop the damage she so eloquently elaborates chapter by maddening chapter.
On one hand, it’s validating for the job seeker who’s been searching for months, or the young driver being gouged on their car insurance, especially if they’ve been told some version of “just keep looking” or “you just have to find a better rate.” On the other hand, I’m still waiting for the book that can describe solutions for regular people, as concretely, thoroughly and specifically as it can describe the problems.
The way to disarm a WMD, O’Neil suggests, is to use the collected data to “target” individuals for services they need, rather than to exclude or gauge them. I’m unnerved that she apparently left it up to me, a mathematically challenged and mildly technophobic reader, to figure out how.
Reprinted from Street Roots’ sister paper, Real Change News in Seattle.