

There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out! This approach does not generalize well.
Join our Discord channel and chat with us.
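The description does not name the method in text, so purely as background, here is a minimal sketch of the baseline it is compared against: a single Adam update step. This is a plain NumPy illustration using the standard hyperparameter defaults from the Adam paper; the function name and arguments are illustrative, not taken from the episode.

```python
import numpy as np

def adam_step(params, grads, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update step on a parameter array.

    params, grads, m, v are NumPy arrays of the same shape;
    t is the 1-based step counter. Returns updated (params, m, v).
    """
    # Exponential moving averages of the gradient and its square.
    m = beta1 * m + (1.0 - beta1) * grads
    v = beta2 * v + (1.0 - beta2) * grads ** 2
    # Bias correction, which matters most in the first few steps.
    m_hat = m / (1.0 - beta1 ** t)
    v_hat = v / (1.0 - beta2 ** t)
    # Per-parameter adaptive update.
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v
```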
By Francesco Gadaleta · 4.2 (72 ratings)
