
There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out: this approach does not generalize well.
Join our Discord channel and chat with us.
By Francesco Gadaleta
4.2 (72 ratings)
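The blurb doesn't name the method or show the baseline it improves on, so as context only, here is a minimal NumPy sketch of one step of the standard Adam update (Kingma & Ba, 2015). The function name and hyperparameter defaults are illustrative, not taken from the episode.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update step (illustrative sketch, not from the episode).

    theta: current parameters; grad: gradient at theta;
    m, v: running first/second moment estimates; t: 1-based step count.
    """
    m = beta1 * m + (1 - beta1) * grad       # exponential average of gradients
    v = beta2 * v + (1 - beta2) * grad**2    # exponential average of squared gradients
    m_hat = m / (1 - beta1**t)               # bias-correct the moment estimates
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter adaptive step
    return theta, m, v
```

The per-parameter scaling by the square root of the second moment is what the episode's "relatively simple method" builds on, and adaptive steps of this kind are also a common suspect when an optimizer fails to generalize.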