In episode 50, the Season 1 (2018) finale of DataFramed, the DataCamp podcast, Hugo speaks with Cathy O'Neil: data scientist, investigative journalist, consultant, algorithmic auditor, and author of the critically acclaimed book Weapons of Math Destruction. Cathy and Hugo discuss the ingredients that make up weapons of math destruction — algorithms and models that are important in society, secret, and harmful — from models that decide whether you keep your job, a credit card, or insurance, to algorithms that decide how we're policed, sentenced to prison, or granted parole. They also discuss the current lack of fairness in artificial intelligence, how societal biases are perpetuated by algorithms, and why both transparency and auditability of algorithms will be necessary for a fairer future. What does this mean in practice? Tune in to find out. As Cathy says, "Fairness is a statistical concept. It's a notion that we need to understand at an aggregate level." And, moreover, "data science doesn't just predict the future. It causes the future."
LINKS FROM THE SHOW
DATAFRAMED SURVEY
DATAFRAMED GUEST SUGGESTIONS
FROM THE INTERVIEW
FROM THE SEGMENTS
Data Science Best Practices (with Heather Nolis ~20:30)
Data Science Best Practices (with Ben Skrainka ~39:35)
Original music and sounds by The Sticks.
By DataCamp · 4.9 (265 ratings)