Linear Digressions

Neural Net Dropout


Neural networks are complex models with many parameters, which makes them prone to overfitting. There's a surprisingly simple way to guard against this: during training, randomly drop hidden units (and the connections attached to them), a technique known as dropout. It seems counterintuitive that undermining the structural integrity of the neural net makes it robust against overfitting, but in the world of neural nets, weirdness is just how things go sometimes.
Relevant links: https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf
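
As a minimal sketch of the idea (the function name, defaults, and choice of NumPy are illustrative assumptions, not anything from the episode), "inverted" dropout zeroes each hidden activation with probability p during training and scales the survivors so the expected activation is unchanged:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p.

    Survivors are scaled by 1 / (1 - p) during training so the
    expected activation stays the same, which means no rescaling
    is needed at test time.
    """
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# Apply dropout to a toy batch of hidden activations
h = np.ones((2, 4))
print(dropout(h, p=0.5))  # roughly half the entries zeroed, the rest scaled to 2.0
```

At test time the full network runs unmodified; the paper linked above gets the same expected behavior the other way around, by scaling the weights after training rather than scaling activations during it.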

Linear Digressions by Ben Jaffe and Katie Malone

4.8 (353 ratings)


More shows like Linear Digressions

• 99% Invisible by Roman Mars (26,134 listeners)

• You Are Not So Smart by You Are Not So Smart (1,712 listeners)

• Super Data Science: ML & AI Podcast with Jon Krohn by Jon Krohn (298 listeners)

• The Daily by The New York Times (111,382 listeners)

• The Ezra Klein Show by New York Times Opinion (15,220 listeners)

• WSJ's Take On the Week by The Wall Street Journal (132 listeners)

• The Severance Podcast with Ben Stiller & Adam Scott by Audacy, Red Hour, Great Scott (2,161 listeners)