
There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out: this approach does not generalize well.
Join our Discord channel and chat with us.
By Francesco Gadaleta · 4.2 (71 ratings)