Jacqueline Wernimont joins Heather Ross and Andrew Maynard to talk about algorithmic bias -- what is it, and how should we think about it now and in the future.
•https://jwernimont.com/ - Jacque Wernimont's website
•https://webapp4.asu.edu/directory/person/2468640 - Jacque's ASU directory listing
•First, let’s define what an algorithm is. Check out a definition with examples here: http://computer.howstuffworks.com/question717.htm
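To make the definition concrete, here is one classic algorithm, binary search, sketched in Python. The example is ours, not from the episode; it just illustrates what an algorithm is: a finite, well-defined sequence of steps that turns an input into an output.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Repeatedly halves the search range, so it is a precise,
    step-by-step recipe: exactly what "algorithm" means.
    """
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

print(binary_search([2, 5, 8, 12, 16], 12))  # prints 3
```

The point the episode builds on: even a procedure this mechanical reflects human choices (what to search, how to rank, what counts as a match), and those choices are where bias can enter.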
•Jacque talks about the phenomenon of redlining and reverse redlining in Detroit in the 20th century. Read more here: http://www.citylab.com/housing/2015/03/mapping-the-lasting-effects-of-redlining/388333/
•Read more about Frank Pasquale’s The Black Box Society here: http://www.slate.com/articles/technology/bitwise/2015/01/black_box_society_by_frank_pasquale_a_chilling_vision_of_how_big_data_has.html
•Jacque discusses the idea of incorporating diverse perspectives into creating algorithms. Flickr’s automated photo-tagging feature is an example of what goes wrong when you don’t. Read more here: http://money.cnn.com/2015/05/21/technology/flickr-racist-tags/
•https://weaponsofmathdestructionbook.com/ - Weapons of Math Destruction by Cathy O’Neil, a book we mention in the podcast
•ProPublica found an example of algorithmic bias in our justice system. Read more: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
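ProPublica’s core finding was that the risk-assessment tool’s false positive rate (people flagged high-risk who did not reoffend) differed sharply across racial groups. A minimal sketch of how such a disparity can be measured, using entirely invented toy data (the numbers below are hypothetical, not ProPublica’s):

```python
def false_positive_rate(predictions, outcomes):
    """Among people who did NOT reoffend (outcome 0), the share
    the tool nonetheless flagged as high risk (prediction 1)."""
    false_positives = sum(1 for p, o in zip(predictions, outcomes) if p and not o)
    negatives = sum(1 for o in outcomes if not o)
    return false_positives / negatives

# Hypothetical toy data for two groups: 1 = high risk, 0 = low risk.
group_a_predicted = [1, 1, 1, 0, 0, 0]
group_a_actual    = [1, 0, 0, 0, 0, 0]
group_b_predicted = [1, 0, 0, 0, 0, 0]
group_b_actual    = [1, 0, 0, 0, 0, 0]

print(false_positive_rate(group_a_predicted, group_a_actual))  # prints 0.4
print(false_positive_rate(group_b_predicted, group_b_actual))  # prints 0.0
```

In this made-up example the tool wrongly flags 40% of non-reoffenders in group A versus 0% in group B; a gap of that kind, even when the tool never sees group membership directly, is the sort of algorithmic bias the episode discusses.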