Big data increases inequality and threatens democracy – because the privileged are processed by people and the masses by machines. The promise was that algorithms and Artificial Intelligence would reduce discrimination in our society by making decisions objective, free from human emotions, biases and prejudices. But what if the historical bias is in the data? Or the model itself is prejudiced? Algorithms end up targeting the vulnerable and harming the powerless in our society – people who don't understand or question the systems. And when they do, they get 'math-shamed'.
My guest is Cathy O'Neil. In her highly praised book 'Weapons of Math Destruction', she gave a voice to the victims of algorithms. O'Neil is an American mathematician, data scientist and algorithmic auditor.
https://studiojuliajanssen.com/nondiscriminationbydesign.html