
The award-winning Compliance into the Weeds is the only weekly podcast that takes a deep dive into a compliance-related topic, literally going into the weeds to explore a subject more fully. Looking for some hard-hitting insights on compliance? Look no further than Compliance into the Weeds! In this episode, Tom Fox and Matt Kelly delve into a recent speech by Michael Hsu, head of the Office of the Comptroller of the Currency, on the accountability challenges posed by artificial intelligence in the banking sector.
The discussion highlights Hsu's emphasis on the lack of a robust accountability framework for AI, illustrated by the Air Canada chatbot incident. The conversation also touches on the potential systemic risks AI could pose to the financial sector, the need for explainable AI, and the shared responsibility model used in cloud computing as a possible template for addressing these challenges. The episode underscores the need for compliance officers to ensure contracts and IT controls are in place, and stresses the importance of developing trust and accountability mechanisms before widespread AI adoption.
Key Highlights
· AI Accountability: A Regulator's Perspective
· Case Study: Air Canada's AI Mishap
· Legal and Technological Challenges
· Exploring Solutions and Shared Responsibility
Resources
Matt on Radical Compliance
Tom
YouTube
Learn more about your ad choices. Visit megaphone.fm/adchoices
By Tom Fox