Linear Digressions

Multi-Armed Bandits

Multi-armed bandits: how to take your randomized experiment and make it harder, better, faster, stronger. Basically, a multi-armed bandit experiment lets you optimize for learning (exploring your options) and making use of what you already know (exploiting the best option so far) at the same time. It's what the pros (like Google Analytics) use, and it's got a great name, so... winner!
Relevant link: https://support.google.com/analytics/answer/2844870?hl=en
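The explore/exploit tradeoff described above can be sketched with the simplest bandit strategy, epsilon-greedy: most of the time pull the arm that looks best so far, but occasionally pull a random arm to keep learning. This is a toy illustration, not the specific algorithm Google Analytics uses, and the per-arm payout rates below are invented:

```python
import random

def epsilon_greedy_bandit(true_rates, n_rounds=10_000, epsilon=0.1, seed=0):
    """Simulate an epsilon-greedy bandit over arms with Bernoulli payoffs.

    true_rates: hypothetical per-arm success probabilities (unknown to the
    algorithm; it only sees the 0/1 rewards it samples).
    """
    rng = random.Random(seed)
    n_arms = len(true_rates)
    pulls = [0] * n_arms  # how many times each arm was tried
    wins = [0] * n_arms   # successes observed per arm
    for _ in range(n_rounds):
        if rng.random() < epsilon:
            # Explore: try a uniformly random arm.
            arm = rng.randrange(n_arms)
        else:
            # Exploit: pick the arm with the best observed success rate.
            arm = max(range(n_arms),
                      key=lambda a: wins[a] / pulls[a] if pulls[a] else 0.0)
        reward = 1 if rng.random() < true_rates[arm] else 0
        pulls[arm] += 1
        wins[arm] += reward
    return pulls, wins

# Three "arms" (e.g. page variants) with made-up conversion rates.
pulls, wins = epsilon_greedy_bandit([0.05, 0.04, 0.10])
```

Unlike a classic A/B test, which splits traffic evenly until the experiment ends, the bandit shifts most of its pulls toward the best-performing arm while the experiment is still running.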
Linear Digressions, by Ben Jaffe and Katie Malone

Rating: 4.8 (353 ratings)


More shows like Linear Digressions

Stuff You Should Know, by iHeartPodcasts (78,608 listeners)

Practical AI, by Practical AI LLC (200 listeners)