The AI North Brief

Eight Months Before Tumbler Ridge



OpenAI's safety team met with federal officials in Ottawa yesterday to explain why it didn't alert Canadian police about a ChatGPT user who described gun violence scenarios eight months before the Tumbler Ridge mass shooting. The company banned the account but determined the activity didn't meet its threshold for reporting to law enforcement. Now Canada is asking whether AI companies should face mandatory reporting requirements, and how to write a law that protects both public safety and civil liberties.

Sources:

https://www.cbc.ca/news/politics/open-ai-summoned-ottawa-tumbler-ridge-9.7103281

https://www.theglobeandmail.com/politics/article-ai-minister-summons-openai-safety-chiefs-tumbler-ridge-shooting/

https://www.cbc.ca/news/canada/british-columbia/eby-openai-tumbler-ridge-9.7102942

https://www.nationalobserver.com/2026/02/24/news/why-forcing-ai-firms-report-online-threats-not-simple

https://www.theglobeandmail.com/politics/article-experts-say-online-harms-bill-must-consider-ai-protocols-for-reporting/

https://www.cbc.ca/news/canada/british-columbia/openai-tumbler-ridge-shooter-ban-9.7100497

Tags

AI North Brief, Canadian AI, AI Policy, OpenAI, ChatGPT, Evan Solomon, Tumbler Ridge, AI Safety, Online Harms, AI Regulation, David Eby

Chapter Markers

00:00 Intro
00:20 What Happened
01:30 The Meeting
02:45 The Regulatory Gap
04:00 The Broader Problem
05:30 What Comes Next
06:45 Outro


The AI North Brief, by Paul Karwatsky