
Weapons of math destruction : how big data increases inequality and threatens democracy / Cathy O'Neil

By: O'Neil, Cathy
Material type: Text
Language: English
Publication details: London : Penguin Random House, 2017
Description: x, 259 pages ; 19 cm
ISBN: 9780141985411
Contents:
Bomb parts : what is a model? -- Shell shocked : my journey of disillusionment -- Arms race : going to college -- Propaganda machine : online advertising -- Civilian casualties : justice in the age of big data -- Ineligible to serve : getting a job -- Sweating bullets : on the job -- Collateral damage : landing credit -- No safe zone : getting insurance -- The targeted citizen : civic life.
Summary: We live in the age of the algorithm. Increasingly, the decisions that affect our lives (where we go to school, whether we get a car loan, how much we pay for health insurance) are being made not by humans, but by mathematical models. In theory, this should lead to greater fairness: everyone is judged according to the same rules, and bias is eliminated. But as Cathy O'Neil reveals in this book, the opposite is true. The models being used today are opaque, unregulated, and uncontestable, even when they are wrong. Most troubling, they reinforce discrimination: if a poor student can't get a loan because a lending model deems him too risky (by virtue of his zip code), he is then cut off from the kind of education that could pull him out of poverty, and a vicious spiral ensues. Models are propping up the lucky and punishing the downtrodden, creating a 'toxic cocktail for democracy.' Welcome to the dark side of big data. Tracing the arc of a person's life, O'Neil exposes the black box models that shape our future, both as individuals and as a society. These 'weapons of math destruction' score teachers and students, sort résumés, grant (or deny) loans, evaluate workers, target voters, set parole, and monitor our health. O'Neil calls on modelers to take more responsibility for their algorithms and on policymakers to regulate their use. But in the end, it is up to us to become more savvy about the models that govern our lives.
Holdings
Item type: Book
Current library: TBS Barcelona, Libre acceso
Call number: QA76.9.B45 ONE
Copy number: 1
Status: Available
Barcode: B02001

Includes bibliographical references (pages 219-252) and index.