Super Data Science: ML & AI Podcast with Jon Krohn

689: Observing LLMs in Production to Automatically Catch Issues

06.20.2023 - By Jon Krohn

Arize's Amber Roberts and Xander Song join Jon Krohn this week, sharing invaluable insights into ML Observability, drift detection, retraining strategies, and the crucial task of addressing fairness and ethics in AI development.

This episode is brought to you by Posit, the open-source data science company (https://posit.co), by AWS Inferentia (go.aws/3zWS0au), and by Anaconda, the world's most popular Python distribution (https://superdatascience.com/anaconda). Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information.

In this episode you will learn:

• What is ML Observability [05:07]

• What is Drift [08:18] (see the drift-check sketch after this list)

• The different kinds of model drift [15:31]

• How frequently production models should be retrained [25:15]

• Arize's open-source product, Phoenix [30:49]

• How ML Observability relates to discovering model biases [50:30]

• Arize case studies [57:13]

• What is a developer advocate [1:04:51]
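For listeners who want a concrete feel for the drift detection discussed in the episode, below is a minimal sketch of one common approach: the Population Stability Index (PSI), which compares a feature's distribution in production against its distribution in the training data. This is an illustrative example only, not Arize's implementation; the function and variable names are hypothetical, and the 0.2 threshold is a widely used rule of thumb rather than a universal standard.

import numpy as np

def population_stability_index(reference, production, bins=10):
    """Compute PSI between a reference (training) and production sample of one feature."""
    # Bin edges come from the reference distribution's quantiles.
    edges = np.quantile(reference, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf  # catch out-of-range production values

    ref_counts = np.histogram(reference, bins=edges)[0]
    prod_counts = np.histogram(production, bins=edges)[0]

    # Convert counts to proportions; a small epsilon avoids log(0) and division by zero.
    eps = 1e-6
    ref_pct = np.clip(ref_counts / ref_counts.sum(), eps, None)
    prod_pct = np.clip(prod_counts / prod_counts.sum(), eps, None)

    return float(np.sum((prod_pct - ref_pct) * np.log(prod_pct / ref_pct)))

# Example: production feature values have shifted relative to training.
rng = np.random.default_rng(0)
train_feature = rng.normal(loc=0.0, scale=1.0, size=10_000)
prod_feature = rng.normal(loc=0.5, scale=1.2, size=10_000)

psi = population_stability_index(train_feature, prod_feature)
print(f"PSI = {psi:.3f}")  # rule of thumb: PSI above roughly 0.2 often flags significant drift

In practice, a monitoring tool would compute a statistic like this per feature (and per prediction or embedding) on a schedule, then alert when it crosses a threshold, which is the kind of workflow the guests describe.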

Additional materials: www.superdatascience.com/689
