CaseCast

ML bias: Algorithms in the Courtroom!



This podcast is for educational purposes. Recorded in collaboration with Professor Lauren Cipriano.


Some references used: 

  • Mehrabi, N., et al. (2019). A Survey on Bias and Fairness in Machine Learning. arXiv. 
  • Suresh, H., & Guttag, J. V. (2021). A Framework for Understanding Sources of Harm throughout the Machine Learning Life Cycle. Equity and Access in Algorithms, Mechanisms, and Optimization, 1–9. 
  • Northpointe, Inc. (2016). COMPAS risk scales: Demonstrating accuracy equity and predictive parity. https://www.documentcloud.org/documents/2998391-ProPublica-Commentary-Final-070616
  • Angwin, J., et al. (2016). Machine bias. ProPublica. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
  • Larson, J., et al. (2016). How we analyzed the COMPAS recidivism algorithm. ProPublica. https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm
  • Casadei, D. (2020). Predicting prison terms and parole. Retrieved from Downtown Publications: https://www.downtownpublications.com/single-post/2020/03/24/predicting-prison-terms-and-parole
  • Dressel, J., & Farid, H. (2018). The accuracy, fairness, and limits of predicting recidivism. Science Advances, 4(1), eaao5580. 
  • Corbett-Davies, S., et al. (2016). A computer program used for bail and sentencing decisions was labeled biased against blacks. It’s actually not that clear. The Washington Post. https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/
  • Jackson, E., & Mendoza, C. (2020). Setting the record straight: What the COMPAS Core Risk and Need Assessment is and is not. Harvard Data Science Review. 
  • Thomas, S. (2023). The fairness fallacy: Northpointe and the COMPAS recidivism prediction algorithm (Unpublished undergraduate thesis). Institute for the Study of Human Rights, Columbia University.
  • Angwin, J. ProPublica responds to company’s critique of machine bias story. ProPublica. https://www.propublica.org/article/propublica-responds-to-companys-critique-of-machine-bias-story
  • Flores, A. W., et al. (2016). False positives, false negatives, and false analyses: A rejoinder to “Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against blacks.” Federal Probation, 80(2), 38-42. https://www.uscourts.gov/sites/default/files/fed_probation_dec2016.pdf
  • Barry-Jester, A. M., et al. (2015). The new science of sentencing. The Marshall Project. https://www.themarshallproject.org/2015/08/04/the-new-science-of-sentencing

Music sources

  • https://www.youtube.com/watch?v=OwEU8dPYCvY
  • https://www.youtube.com/watch?v=aUaTCOpbjcg

P.S.: Generative AI was used to edit and proofread the script, helping to refine the content for clarity and coherence.


By Yasser Rahrovani