With the national dialogue focused on systemic and racial inequality in law enforcement and criminal justice, it is easy to overlook some of the covert discrimination happening on the technical front. From facial recognition and credit reporting to recruitment and predictive policing software, algorithmic bias hurts consumers and end users by weighting racial or gender features. In this pilot episode of Cut the Code, I talk about how algorithmic bias occurs, the challenges it presents, some key examples, and potential solutions.
🎵 Music: "Algorithms" by Chad Crouch
From the Free Music Archive
CC BY NC
👇🏽 Episode outline and sources found here!
👇🏽 Feel free to reach out: @nikitarajput_