Learning Machines 101

LM101-030: How to Improve Deep Learning Performance with Artificial Brain Damage (Dropout and Model Averaging)

06.08.2015 - By Richard M. Golden, Ph.D., M.S.E.E., B.S.E.E.


Deep learning technology has developed rapidly over the past five years due in part to a variety of factors, such as improved computing hardware, convolutional network algorithms, rectified linear units, and a relatively new learning strategy called "dropout," in which hidden unit feature detectors are temporarily deleted during the learning process. This episode introduces and discusses the concept of "dropout" for improving deep learning performance and connects "dropout" to the concepts of regularization and model averaging. For more details and background references, check out: www.learningmachines101.com !
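
As a rough illustration of the idea (not taken from the episode itself), the sketch below applies "inverted" dropout to a layer of hidden-unit activations during training using NumPy; the drop probability, array names, and function name are illustrative assumptions. Scaling the surviving activations keeps their expected value unchanged, which is what lets the full network at test time approximate an average over the many "thinned" networks sampled during training.

    import numpy as np

    rng = np.random.default_rng(0)

    def dropout(activations, drop_prob=0.5, training=True):
        # Inverted dropout: randomly zero hidden-unit activations during training.
        # Each unit is kept with probability (1 - drop_prob); surviving activations
        # are scaled up so their expected value matches the full network.
        if not training or drop_prob == 0.0:
            return activations  # at test time the full network is used unchanged
        keep_prob = 1.0 - drop_prob
        mask = rng.random(activations.shape) < keep_prob
        return activations * mask / keep_prob

    # Example: dropout applied to a small batch of hidden-layer activations
    hidden = np.array([[0.2, 1.5, 0.7, 0.0],
                       [1.1, 0.3, 0.9, 2.0]])
    print(dropout(hidden, drop_prob=0.5, training=True))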
