

Deep learning models are prone to overfitting, which is especially frustrating given how much time and computation training often requires. One technique for fighting overfitting is dropout: during each training iteration, a random subset of neurons in the network has its activations set to zero. The core idea is that no particular input to a given layer is always available, so no single signal can be relied on too heavily.
By Kyle Polich · 4.4 (475 ratings)
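The idea described above can be sketched in a few lines of NumPy. This is a minimal illustration of "inverted" dropout (the common variant, where surviving activations are rescaled at training time so no adjustment is needed at test time); the function name and parameters are illustrative, not from the episode:

```python
import numpy as np

def dropout(activations, rate=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability `rate`,
    then rescale survivors so the expected activation is unchanged."""
    if not training or rate == 0.0:
        # At inference time the layer is a no-op.
        return activations
    rng = np.random.default_rng() if rng is None else rng
    # Each entry of `mask` is False with probability `rate`, else True.
    mask = rng.random(activations.shape) >= rate
    # Dividing by (1 - rate) keeps E[output] equal to the input.
    return activations * mask / (1.0 - rate)
```

In use, the function would be applied to a layer's output during training and skipped (or called with `training=False`) at evaluation time, which is exactly the "this input is not always available" behavior the episode describes.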