
Today we’re joined by Sandra Wachter, an associate professor and senior research fellow at the University of Oxford.
Sandra’s work lies at the intersection of law and AI, focusing on what she likes to call “algorithmic accountability.” In our conversation, we explore algorithmic accountability in three segments: explainability/transparency, data protection, and bias, fairness, and discrimination. We discuss how thinking about black boxes changes when regulation and law come into play, and break down counterfactual explanations and how they’re created. We also explore why factors like a lack of oversight lead to poor self-regulation, and the conditional demographic disparity test she helped develop for testing bias in models, which was recently adopted by Amazon.
The complete show notes for this episode can be found at twimlai.com/go/521.
By Sam Charrington · 4.7 (419 ratings)