Math professor Cathy O’Neil explores how algorithms affect much of your daily life and reveals corrupt practices that algorithms sustain.
Beware the Black Box
Math professor and quant Cathy O’Neil details how algorithms influence every area of life. In this overview – long-listed for the 2016 National Book Award for Nonfiction – O’Neil explains the pernicious impact of these models in simple language, revealing the inner workings of the black-box mathematics that increasingly define your future. O’Neil is a moralist who writes with a strong sense of right, wrong and injustice. Her anger at how algorithms undermine humanity in pursuit of profit fuels her reportage, but her wit and light touch will draw you into the crucial issues she describes.
Black Box
Models based on mathematical algorithms depend on large amounts of data. They use available information to predict outcomes. Bad models, or what O’Neil dubs “Weapons of Math Destruction” (WMDs), derive from a lack of data and a reliance on “proxies.” For example, someone’s zip code might become a proxy for his or her ability to repay a loan.
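To make the proxy idea concrete, here is a minimal, hypothetical sketch in Python – the zip codes, default rates and scoring rule are invented for illustration, not drawn from any real lender:

```python
# Hypothetical proxy-based loan model (all numbers invented).
# Instead of measuring an applicant's actual ability to repay,
# the model substitutes a statistic about where they live.

DEFAULT_RATE_BY_ZIP = {"10001": 0.04, "60615": 0.12, "90210": 0.02}

def loan_score(applicant: dict) -> float:
    """Return a score from 0.0 (reject) to 1.0 (approve)."""
    # The proxy: every applicant in the same zip code receives the
    # same penalty, regardless of individual finances.
    neighborhood_risk = DEFAULT_RATE_BY_ZIP.get(applicant["zip"], 0.10)
    return max(0.0, 1.0 - 10 * neighborhood_risk)

print(loan_score({"name": "A", "zip": "90210"}))  # 0.8, favored by address
print(loan_score({"name": "B", "zip": "60615"}))  # 0.0, rejected by address alone
```

Two applicants with identical finances get different scores purely because of where they live – exactly the substitution O’Neil warns about.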
Models aren’t impartial – they reflect their designers’ beliefs. O’Neil gets to the heart of the matter when she writes that a model succeeds when it produces the answer its creators want, a self-serving loop hidden inside a “black box.” Models often ignore social, systemic inequality.
“These mathematical models were opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists.” – Cathy O’Neil
O’Neil posits the central problem: Few understand what criteria alter or influence algorithms. She states her theme plainly: Algorithms are not inherently bad, but when they exist inside a black box, they can do harm.
Manipulation of Information
The US News & World Report magazine started a college evaluation system in 1983. The magazine based everything on a survey it sent to university presidents.
The rankings were popular with readers, but not with colleges. The editors’ nonscientific assessments derived from hunches and entrenched social attitudes. To gain reader confidence, the editors knew their report should reflect established beliefs, such as identifying Harvard, Yale and Stanford as the best universities.
If the algorithm lowered a college’s standing one year, the school sustained real-world damage: Fewer students would attend, the best professors would leave, and the college’s ranking would drop even further the next year. The editors discarded cost as a factor because its inclusion raised obscure schools above famous colleges, and they feared readers might doubt the rankings.
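The dynamic at work is a feedback loop. A toy simulation – with invented numbers, not O’Neil’s or the magazine’s actual formula – shows how a single bad year can compound:

```python
# Toy ranking feedback loop (numbers invented for illustration).
# The score rewards selectivity; a lower score draws fewer applicants,
# which forces a higher acceptance rate, which lowers the score again.

rank_score = 0.8  # the algorithm dinged this college in year 0

for year in range(5):
    applicants = 1000 * rank_score                 # fewer apply to a lower-ranked school
    acceptance_rate = min(1.0, 400 / applicants)   # 400 seats must still be filled
    rank_score = 0.5 * rank_score + 0.5 * (1 - acceptance_rate)
    print(f"year {year}: {applicants:.0f} applicants, score {rank_score:.2f}")
```

Run it and the score falls every year even though nothing about the college’s underlying quality changed – the model punishes its own previous output.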
“When you create a model from proxies, it is far simpler for people to game it. This is because proxies are easier to manipulate than the complicated reality they represent.” – Cathy O’Neil
As the report grew in importance, more colleges falsified data. Schools inflated SAT scores, graduation rates and alumni giving, and understated acceptance rates. New buildings and facilities made schools more attractive: More students applied, and colleges could report a lower acceptance rate, which boosted their standing. You can almost hear O’Neil grinding her teeth in frustration as she describes this corrupt process.
Personality Tests
Employment models rely on proxies such as personality tests. O’Neil reveals that most tests’ main function is to eliminate as many applicants as possible, as cheaply as possible. Though companies use personality tests to gauge candidates’ fitness for a job, O’Neil explains that such tests are terrible predictors of job performance. References work far better.
Today, algorithms eliminate 72% of job applications before a human ever reads them. Applications from people with typically white-sounding names received 50% more callbacks than applications from people with typically African-American-sounding names.
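Screens of this kind are typically crude keyword filters. A hypothetical sketch – the required terms and red flags here are invented, not taken from any real vendor:

```python
# Hypothetical résumé screen (rules invented for illustration).
REQUIRED_KEYWORDS = {"python", "sql"}
RED_FLAGS = {"gap in employment"}  # a crude proxy, not a measure of ability

def passes_screen(resume_text: str) -> bool:
    text = resume_text.lower()
    if any(flag in text for flag in RED_FLAGS):
        return False  # rejected before any human reads it
    return all(keyword in text for keyword in REQUIRED_KEYWORDS)

resumes = [
    "Python and SQL developer, five years' experience",
    "Data analyst (SQL, Python); gap in employment, 2019-2020",
    "Senior Java engineer",
]
survivors = [r for r in resumes if passes_screen(r)]
print(f"{len(survivors)} of {len(resumes)} résumés reach a human reviewer")
```

A qualified analyst is culled for a single phrase, and no human ever learns why – the opacity O’Neil’s title warns about.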
Analytics
O’Neil laments that algorithms have no concern for human needs.
She tells how companies use analytics to add or cut work shifts, which means employees never work regular schedules. This makes attending school, caring for children or taking on a second gig difficult. Companies calculate how to stretch employees to their maximum limits.
Technology firm Neustar enables companies to rank customers who call their helplines. Based on phone numbers and other data, Neustar compiles personal information, ranks people as more or less valuable customers, and determines who receives quicker service.
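In outline, such a system is just a priority queue over scored callers. A hypothetical sketch – the scores and routing rule are invented, not Neustar’s actual product:

```python
import heapq

# Hypothetical e-score call routing (scores invented for illustration).
def route_calls(callers: list) -> None:
    # heapq pops the smallest item first, so negate the score
    # to serve the highest-"value" customer soonest.
    queue = [(-c["score"], c["phone"]) for c in callers]
    heapq.heapify(queue)
    while queue:
        neg_score, phone = heapq.heappop(queue)
        print(f"answering {phone} (estimated customer value {-neg_score})")

route_calls([
    {"phone": "555-0102", "score": 120},  # low score: waits longest
    {"phone": "555-0101", "score": 900},  # high score: answered first
])
```

The caller never knows a score exists, let alone that it decided how long they waited on hold.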
Many jobs require applicants to consent to a credit check. Employers rarely inform candidates that a poor credit rating adversely affects their hiring chances, though by law they must. Credit scores – which correlate significantly with race – act as a proxy for wealth in the United States. Running credit checks in nonfinancial contexts has racist overtones.
Many states have banned credit checks as part of hiring. Facebook has a new credit model based on your social network: Who you know indicates your reliability. O’Neil is outraged at the harm this could cause.
Skewed Results
O’Neil cites researchers who intentionally skewed search results about an upcoming election. One group of users saw information favoring one party, while the other group saw more positive information about the opposing party. The experiment shifted voter preferences by 20%.
“The burden of proof rests on companies, which should be required to audit their algorithms regularly for legality, fairness and accuracy.” – Cathy O’Neil
Many companies offer employee health incentives. To qualify, CVS employees, for example, must report their body fat. But, O’Neil divulges, the Body Mass Index (BMI) is a discredited statistic: weight in kilograms divided by the square of height in meters. Athletes often register as overweight because muscle is denser than fat. NBA star LeBron James, for example, counts as overweight, according to BMI.
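A quick calculation with James’s commonly listed measurements – roughly 2.06 m and 113 kg – shows the misclassification:

BMI = 113 / 2.06² ≈ 26.6

which lands in the standard “overweight” band of 25 to 30, despite his low body fat.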
A Strong Voice
O’Neil was a math professor who became a quant on Wall Street, then a data scientist. There’s no doubting her expertise or her unique career-long overview of this field. Throughout, she underscores that the greater your wealth and privilege, the less impact algorithms have on your life. O’Neil articulately proves the converse: Those with less power and privilege are unknowingly at the mercy of algorithms. Scientific American’s Evelyn Lamb accurately calls O’Neil’s writing “direct and easy to read.” O’Neil passionately argues that people must stop regarding algorithms and technology as the solutions to all problems. Her willingness to share her emotions makes O’Neil’s insights all the more relatable, memorable and resonant.
Cathy O’Neil also wrote The Shame Machine and co-wrote Doing Data Science with Rachel Schutt.