Linear Digressions

KL Divergence



Kullback-Leibler divergence, or KL divergence, is a measure of the information lost when you try to approximate one distribution with another distribution. It comes to us originally from information theory, but today it underpins other, more machine-learning-focused algorithms like t-SNE. And boy oh boy can it be tough to explain. But we're trying our hardest in this episode!
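For listeners who want the formula behind the description: for discrete distributions P and Q, the KL divergence is D_KL(P || Q) = sum over x of P(x) * log(P(x) / Q(x)). Below is a minimal Python sketch of that sum; the toy distributions p and q are made-up examples for illustration, not anything discussed in the episode.

import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over outcomes where P is nonzero (0 * log 0 = 0 by convention).
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Toy example: approximating a skewed distribution p with a uniform q.
p = np.array([0.7, 0.2, 0.1])
q = np.array([1/3, 1/3, 1/3])
print(kl_divergence(p, q))  # information lost (in nats) when q stands in for p
print(kl_divergence(q, p))  # note: KL divergence is not symmetric

Note that swapping the arguments gives a different number, which is one reason KL divergence is a "divergence" rather than a true distance metric.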

Linear Digressions, by Ben Jaffe and Katie Malone

4.8 (353 ratings)


More shows like Linear Digressions

  • 99% Invisible by Roman Mars (26,134 listeners)
  • You Are Not So Smart by You Are Not So Smart (1,712 listeners)
  • Super Data Science: ML & AI Podcast with Jon Krohn by Jon Krohn (298 listeners)
  • The Daily by The New York Times (111,382 listeners)
  • The Ezra Klein Show by New York Times Opinion (15,220 listeners)
  • WSJ's Take On the Week by The Wall Street Journal (132 listeners)
  • The Severance Podcast with Ben Stiller & Adam Scott by Audacy, Red Hour, Great Scott (2,161 listeners)