

Deep learning models are prone to overfitting, which is especially frustrating given how much time and computational power training to convergence often requires. One technique for fighting overfitting is dropout: randomly selecting some neurons in the network and setting their outputs to zero on each training iteration. The core idea is that no particular input to a given layer is always available, so the network cannot rely too heavily on any single signal.
By Kyle Polich · 4.4 (475 ratings)
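The idea described above can be sketched in a few lines of plain Python. This is a minimal illustration of "inverted" dropout (the common variant that scales surviving activations by 1/(1-p) at training time so no rescaling is needed at inference); the episode doesn't specify a particular implementation, so the function name and parameters here are illustrative.

```python
import random

def dropout(activations, p=0.5, training=True):
    """Inverted dropout: during training, zero each unit with
    probability p and scale survivors by 1/(1-p) so the expected
    value of every activation is unchanged at inference time."""
    if not training or p == 0.0:
        # At inference, the layer passes through untouched.
        return list(activations)
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0
            for a in activations]

# During training, roughly a fraction p of the units are silenced
# on every pass, so the network can't depend on any one of them:
layer = [0.5, 1.2, -0.3, 0.8]
print(dropout(layer, p=0.5))

# At inference time nothing is dropped:
print(dropout(layer, p=0.5, training=False))
```

Because each forward pass sees a different random subset of units, dropout acts like training an ensemble of thinned networks that share weights.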