Compliance Perspectives

Alessia Falsarone on AI Explainability [Podcast]

By Adam Turteltaub
Why did the AI do that?
It’s a simple and common question, but the answer is often opaque, with people referring to black boxes, algorithms, and other terms that only those in the know tend to understand.
Alessia Falsarone, a non-executive director of Innovate UK, says that’s a problem. In cases where AI has run amok, the fallout is often worse because the company is unable to explain why the AI reached the decision it did and what data it was relying on.
AI, she argues, needs to be explainable to regulators and the public. That way, all sides can understand what the AI is doing (or has done) and why.
To create more explainable AI, she recommends creating a dashboard that shows the factors influencing each decision. In addition, teams need to track changes made to the model over time.
By doing so, when a regulator or the public asks why something happened, the organization can respond quickly and clearly.
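For readers curious what that kind of record-keeping might look like in practice, here is a minimal sketch. It is not drawn from the episode; the field names, model versions, and factor weights are purely illustrative assumptions. Each automated decision is logged with the model version and the factor contributions a dashboard could display, and model changes are logged in parallel so any outcome can be traced back to the version that produced it.

```python
# Illustrative sketch only: log each automated decision with the factors
# behind it, and keep a parallel history of model changes. All names,
# versions, and weights below are made-up examples, not the podcast's.
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone


@dataclass
class ModelChange:
    """One entry in the model-change history."""
    model_version: str
    description: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


@dataclass
class DecisionRecord:
    """One automated decision, with the factors a dashboard could show."""
    decision_id: str
    model_version: str
    outcome: str
    factor_contributions: dict  # factor name -> contribution weight
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


# Track a model update so later questions can be tied to a specific version.
change_log = [ModelChange("v1.1", "Retrained with Q3 lending data")]

# Record one decision along with the factors that drove it.
decision_log = [
    DecisionRecord(
        decision_id="app-0042",
        model_version="v1.1",
        outcome="declined",
        factor_contributions={"debt_to_income": 0.52, "credit_history_length": 0.31},
    )
]

# A regulator's "why did the AI do that?" becomes a simple lookup.
print(json.dumps([asdict(r) for r in decision_log], indent=2))
```

With records like these, the answer to "why did the AI do that?" is a lookup rather than a scramble: the organization can point to the decision, the factors behind it, and the exact model version in force at the time.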
In addition, by embracing a more transparent process and involving compliance early, organizations can head off potential AI issues before they escalate.
Listen in to hear her explain the virtues of explainability.