Learning Bayesian Statistics

#74 Optimizing NUTS and Developing the ZeroSumNormal Distribution, with Adrian Seyboldt



Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

We need to talk. I had trouble writing this introduction. Not because I didn’t know what to say (that’s hardly ever an issue for me), but because a conversation with Adrian Seyboldt always takes deliciously unexpected turns.

Adrian is one of the most brilliant, interesting and open-minded people I know. It turns out he’s courageous too: although he’s not a fan of public speaking, he accepted my invitation on this show — and I’m really glad he did!

Adrian studied math and bioinformatics in Germany and now lives in the US, where he enjoys doing maths, baking bread and hiking.

We talked about the why and how of his new project, Nutpie, a more efficient implementation of the NUTS sampler in Rust. We also dived deep into the new ZeroSumNormal distribution he created and that’s available from PyMC 4.2 onwards — what is it? Why would you use it? And when?
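If you’re curious what Nutpie looks like in practice, here is a minimal sketch of compiling and sampling a PyMC model with it. The toy model and the draws/tune/chains settings are my own illustration, not code from the episode:

```python
import numpy as np
import pymc as pm
import nutpie

# A made-up toy model, just to have something to sample
rng = np.random.default_rng(42)
y = rng.normal(loc=1.0, scale=2.0, size=100)

with pm.Model() as model:
    mu = pm.Normal("mu", 0.0, 5.0)
    sigma = pm.HalfNormal("sigma", 5.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=y)

# Compile the model's log-density once, then run Nutpie's Rust implementation of NUTS
compiled = nutpie.compile_pymc_model(model)
trace = nutpie.sample(compiled, draws=1000, tune=1000, chains=4)
```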

Adrian will also tell us about his favorite types of models, as well as what he currently sees as the biggest hurdles in the Bayesian workflow.

Each time I talk with Adrian, I learn a lot and am filled with enthusiasm — and now I hope you will too!

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey and Andreas Kröpelin.

Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

Links from the show:

  • LBS on Twitter: https://twitter.com/LearnBayesStats
  • LBS on Linkedin: https://www.linkedin.com/company/learn-bayes-stats/
  • Adrian on GitHub: https://github.com/aseyboldt
  • Nutpie repository: https://github.com/pymc-devs/nutpie
  • ZeroSumNormal distribution: https://www.pymc.io/projects/docs/en/stable/api/distributions/generated/pymc.ZeroSumNormal.html
  • Pathfinder – A parallel quasi-Newton algorithm for reaching regions of high probability mass: https://statmodeling.stat.columbia.edu/2021/08/10/pathfinder-a-parallel-quasi-newton-algorithm-for-reaching-regions-of-high-probability-mass/

Abstract

by Christoph Bamberg

Adrian Seyboldt, the guest of this week’s episode, is an active developer of the PyMC library in Python and of his new tool Nutpie, written in Rust. He is also a colleague at PyMC Labs and a friend. So naturally, this episode gets technical and nerdy.

We talk about parametrisation, a topic important for anyone trying to implement a Bayesian model, and what to do or avoid (don’t use the mean of the data!).
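As one illustration of that advice (my reading of it, not code from the episode): plugging the empirical mean of the data into a prior makes the prior depend on the very data it is supposed to be independent of. The data and hyperparameters below are made up.

```python
import numpy as np
import pymc as pm

y = np.array([2.1, 1.8, 2.5, 2.0, 2.3])  # made-up data

# Discouraged: the prior mean is taken from the data itself
with pm.Model() as model_bad:
    mu = pm.Normal("mu", mu=y.mean(), sigma=1.0)
    pm.Normal("obs", mu=mu, sigma=1.0, observed=y)

# Preferred: state the prior on the scale you believe before seeing the data
with pm.Model() as model_better:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    pm.Normal("obs", mu=mu, sigma=1.0, observed=y)
```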

Adrian explains a new approach to handling categorical parameters, using the ZeroSumNormal distribution he developed. The approach is explained in an accessible way, with examples, so everyone can understand and implement it themselves.
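Here is a minimal sketch of what that can look like in PyMC (the data and hyperparameters are made up for illustration): one effect per category, constrained to sum to zero, so the group effects are identified relative to a global intercept.

```python
import numpy as np
import pymc as pm

# Made-up data: 8 observations from 4 groups
group_idx = np.array([0, 1, 2, 3, 0, 1, 2, 3])
y = np.array([1.2, 0.8, 1.5, 0.9, 1.1, 0.7, 1.6, 1.0])
n_groups = 4

with pm.Model() as model:
    intercept = pm.Normal("intercept", 0.0, 2.0)
    # One effect per group, constrained so that the effects sum to zero
    group_effect = pm.ZeroSumNormal("group_effect", sigma=1.0, shape=n_groups)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("obs", mu=intercept + group_effect[group_idx], sigma=sigma, observed=y)

    idata = pm.sample()
```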

We also talk about further technical topics like initialising a sampler, the use of warm-up samples, mass matrix adaptation and much more. The difference between probability theory and statistics, as well as Adrian’s view on the challenges in Bayesian statistics, rounds off the episode.
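For context on those knobs, this is how warm-up and mass matrix adaptation are typically exposed in a standard PyMC sampling call (the settings are illustrative, not recommendations from the episode):

```python
import pymc as pm

# A trivial made-up model, just to have something to sample
with pm.Model() as model:
    mu = pm.Normal("mu", 0.0, 1.0)

    idata = pm.sample(
        draws=1000,
        tune=1000,                 # warm-up draws used to adapt step size and mass matrix
        init="jitter+adapt_diag",  # jittered start values plus diagonal mass matrix adaptation
        chains=4,
    )
```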
