


Backpropagation is a common algorithm for training a neural network. It works by computing the gradient of the overall error with respect to each weight, and using stochastic gradient descent to iteratively fine-tune the weights of the network. In this episode, we compare this concept to finding a location on a map, marble maze games, and golf.
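The gradient-and-update loop described above can be sketched for the simplest possible case, a single linear unit trained with stochastic gradient descent (the function and parameter names here are illustrative, not from the episode):

```python
import random

def train_sgd(data, lr=0.1, epochs=200, seed=0):
    """Fit y = w*x + b to (x, y) pairs with stochastic gradient descent.

    For each sample we compute the gradient of the squared error with
    respect to each weight (the core idea behind backpropagation) and
    step downhill by a small learning rate.
    """
    rng = random.Random(seed)
    w, b = rng.random(), rng.random()
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            pred = w * x + b
            err = pred - y       # dE/dpred for E = 0.5 * (pred - y)**2
            w -= lr * err * x    # dE/dw = err * x
            b -= lr * err        # dE/db = err
    return w, b

# Recover the line y = 2x + 1 from a few noiseless samples.
data = [(x, 2 * x + 1) for x in [-1.0, -0.5, 0.0, 0.5, 1.0]]
w, b = train_sgd(data)
```

A full network applies the same idea layer by layer, using the chain rule to propagate the error gradient backwards from the output to every weight.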
By Kyle Polich · 4.4 (475 ratings)
