Learning Bayesian Statistics

#147 Fast Approximate Inference without Convergence Worries, with Martin Ingram



Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

  • Intro to Bayes Course (first 2 lessons free)
  • Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:

  • DADVI (Deterministic ADVI) is a new approach to variational inference that improves both speed and reliability.
  • DADVI allows for faster Bayesian inference without sacrificing model flexibility.
  • Linear response methods can recover accurate posterior covariance estimates from a mean-field fit, which would otherwise tend to understate uncertainty.
  • DADVI performs well in mixed models and hierarchical structures.
  • Normalizing flows present an interesting avenue for enhancing variational inference.
  • DADVI can handle large datasets effectively, improving predictive performance.
  • Future enhancements for DADVI may include GPU support and linear response integration.
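The core idea behind DADVI discussed in the episode can be sketched in a few lines: instead of re-sampling fresh Monte Carlo draws at every gradient step (as stochastic ADVI does), fix a set of base draws once and hand the resulting deterministic objective to an off-the-shelf optimizer, so standard convergence criteria apply. Below is a minimal from-scratch illustration on a toy conjugate model; it is not the pymc-extras API, and the model, draw count `M`, and optimizer choice are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy conjugate model: y_i ~ Normal(theta, 1), prior theta ~ Normal(0, sqrt(10)).
y = rng.normal(2.0, 1.0, size=50)

def log_joint(theta):
    # Unnormalized log posterior for a scalar theta.
    log_prior = -0.5 * theta**2 / 10.0
    log_lik = -0.5 * np.sum((y - theta) ** 2)
    return log_prior + log_lik

# The DADVI trick: sample z_1..z_M from N(0, 1) ONCE and never resample,
# making the Monte Carlo ELBO a fixed, deterministic function of the
# variational parameters.
M = 30
z = rng.normal(size=M)  # fixed base draws

def neg_elbo(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    theta = mu + sigma * z  # reparameterized draws from q(theta)
    expected_log_joint = np.mean([log_joint(t) for t in theta])
    entropy = log_sigma  # Gaussian entropy, up to an additive constant
    return -(expected_log_joint + entropy)

# A standard deterministic optimizer replaces stochastic gradient descent,
# so there is no step-size tuning and convergence is easy to assess.
res = minimize(neg_elbo, x0=np.array([0.0, 0.0]), method="L-BFGS-B")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])

# Exact conjugate posterior, for comparison.
post_prec = len(y) + 1.0 / 10.0
exact_mean = y.sum() / post_prec
exact_sd = post_prec ** -0.5
print(mu_hat, sigma_hat, exact_mean, exact_sd)
```

With even a modest number of fixed draws, the fitted mean and scale land close to the exact posterior here; the PyMC integration linked below wraps the same idea behind a convenient interface.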

Chapters:

13:17 Understanding DADVI: A New Approach

21:54 Mean Field Variational Inference Explained

26:38 Linear Response and Covariance Estimation

31:21 Deterministic vs Stochastic Optimization in DADVI

35:00 Understanding DADVI and Its Optimization Landscape

37:59 Theoretical Insights and Practical Applications of DADVI

42:12 Comparative Performance of DADVI in Real Applications

45:03 Challenges and Effectiveness of DADVI in Various Models

48:51 Exploring Future Directions for Variational Inference

53:04 Final Thoughts and Advice for Practitioners

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Joshua Meehl, Javier Sabio, Kristian Higgins, Matt Rosinski, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık, Suyog Chandramouli and Guillaume Berthon.

Links from the show:

  • Martin's website: https://martiningram.github.io/
  • Martin on Linkedin: https://www.linkedin.com/in/martin-ingram-48302782/
  • Martin on GitHub: https://github.com/martiningram 
  • Martin on Google Scholar: https://scholar.google.com/citations?user=AZ-A7AEAAAAJ&hl=en
  • Fast approximate inference without convergence worries in PyMC: https://martiningram.github.io/deterministic-advi-in-pymc/
  • DADVI linear regression example: https://github.com/pymc-devs/pymc-extras/blob/main/notebooks/deterministic_advi_example.ipynb
  • LBS #142 Bayesian Trees & Deep Learning for Optimization & Big Data, with Gabriel Stechschulte: https://learnbayesstats.com/episode/142-bayesian-trees-deep-learning-optimization-big-data-gabriel-stechschulte
  • Alex Andorra & Chris Fonnesbeck – A Beginner's Guide to Variational Inference | PyData Virginia 2025: https://www.youtube.com/watch?v=XECLmgnS6Ng
  • NUTS Adaptation with Normalizing Flows: https://pymc-devs.github.io/nutpie/nf-adapt.html

Transcript

This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.


Learning Bayesian Statistics, by Alexandre Andorra

4.7 (66 ratings)

