

Summary
“Test, test, test - and don’t accept the outcome unless you’re fairly confident in the level of uncertainty that remains.”
Companion diagnostics (CDx) are inherently high-consequence because they can directly shape treatment decisions. That reality drives a higher bar for clarity: intended use boundaries, evidence expectations, and tight specificity in what the test is claiming and for whom.
In this Let’s Talk Risk! conversation, host Naveen Agarwal sits down with Chris Daly to discuss how AI is transforming the MedTech landscape, including CDx, and how FDA’s expectations are evolving in this area. Chris emphasizes anchoring on the clinical question, using AI as a tool (not a vague strategy), and making uncertainty explicit: not “can we explain everything,” but “how much uncertainty is acceptable for this intended use?”
Listen to the full 30-minute podcast or jump to a section of interest listed below.
Chapters
00:00 2026 context: uncertainty is rising; fundamentals matter
05:00 CDx basics: why CDx is different (and higher-stakes)
07:05 AI/ML + diagnostics: define the question, bound the tool
09:40 Explainability vs uncertainty: “how much confidence is enough?”
14:10 The human factor: vigilance and better questions
18:10 FDA and CDx: what “evolving approach” may signal
21:30 Dataset boundaries, bias, and representativeness as safety issues
25:40 Closing: adapting to rapid AI rise through discipline + alignment
If you enjoyed this podcast, consider subscribing to the Let’s Talk Risk! newsletter.
Suggested links:
LTR: Three Pillars for Defining Your IVD Risk Management Strategy.
LTR: Responsible AI and Future of MedTech Safety.
FDA: Companion Diagnostics.
Key Takeaways
* CDx is high-stakes by design. If the test can steer therapy, FDA (and clinicians) will demand tighter specificity on intended use, population, and claims.
* The real compliance challenge is uncertainty, not buzzwords. Move the discussion from “can you explain it?” to “how confident are we, and what uncertainty are we accepting for this use?”
* Adjusting to AI’s rise means upgrading the team’s habits. Better questions, tougher validation, and active skepticism are the guardrails, especially when outputs can be wrong or misleading.
* Your training data defines your safety boundary. If the dataset doesn’t represent the real population/use context, we should not be surprised by bias and performance gaps in the real world.
* FDA’s CDx posture is evolving, but rigor isn’t going away. Reclassification signals pathway experimentation, not a lower evidentiary bar.
* AI readiness is cross-functional risk governance. The winners align science, quality, regulatory, and commercial goals around shared definitions, shared uncertainty, and shared decision logic.
Keywords
Companion diagnostics (CDx), AI/ML diagnostics, FDA CDx policy, reclassification, intended use, uncertainty, explainability, bias, lifecycle control, vigilance.
About Chris Daly
Chris Daly is a healthcare and life science executive and Principal at IronLine Consulting, where he helps emerging medical device manufacturers build regulatory and commercialization strategies for diagnostic products, especially companion diagnostics (CDx) and AI/ML-enabled solutions.
He has supported FDA submissions, including successful clearances in infectious disease and software-as-a-medical-device (SaMD), and works with teams on quality system development across device companies, diagnostic labs, and independent diagnostic testing facilities.
Before IronLine, Chris served as Chief Operating Officer of Total Child Health (CHADIS), a web-based screening and clinical management platform that uses pre-visit questionnaires to help clinicians streamline care and improve diagnosis and management of pediatric health, emotional, and behavioral concerns.
Let’s Talk Risk! with Dr. Naveen Agarwal is a bi-weekly live audio event on LinkedIn, where we talk about risk management-related topics in a casual, informal way. Join us at 11:00 am EST every other Friday on LinkedIn.
Disclaimer
Information and insights presented in this podcast are for educational purposes only and do not constitute legal advice. Views expressed by all speakers are their own and do not reflect those of their respective organizations.
Parts of this article were created using AI-generated content, which was subsequently reviewed, edited, and fact-checked by the author to ensure accuracy and alignment with our standards.
