

Backpropagation is a common algorithm for training a neural network. It works by computing the gradient of the overall error with respect to each weight, and using stochastic gradient descent to iteratively fine-tune the weights of the network. In this episode, we compare this concept to finding a location on a map, marble maze games, and golf.
By Kyle Polich · 4.4 (475 ratings)
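The idea described above can be sketched in pure Python: a tiny network where the chain rule carries the error gradient back through each weight, and stochastic gradient descent nudges the weights one example at a time. The network size (2-2-1), the AND training data, the learning rate, and all variable names are assumptions chosen for illustration, not anything specific to the episode.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: learn the AND function (illustrative assumption).
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

# Randomly initialise a 2-input, 2-hidden-unit, 1-output network.
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # hidden weights
b1 = [0.0, 0.0]
w2 = [random.uniform(-1, 1) for _ in range(2)]                      # output weights
b2 = 0.0
lr = 0.5  # learning rate (assumed)

def forward(x):
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)
    return h, y

def mean_squared_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

initial = mean_squared_error()
for _ in range(2000):
    for x, t in data:  # stochastic gradient descent: update per example
        h, y = forward(x)
        # Backpropagation: chain rule from the error back to each weight.
        dy = 2 * (y - t) * y * (1 - y)  # gradient at the output unit
        for j in range(2):
            dh = dy * w2[j] * h[j] * (1 - h[j])  # gradient at hidden unit j
            w2[j] -= lr * dy * h[j]
            b1[j] -= lr * dh
            for i in range(2):
                w1[j][i] -= lr * dh * x[i]
        b2 -= lr * dy

final = mean_squared_error()
print(f"loss: {initial:.4f} -> {final:.4f}")
```

Like a marble rolling downhill in the maze analogy, each update moves every weight a small step in the direction that lowers the error, so the printed loss falls over training.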
