
There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out! This approach does not generalize well.
Join our Discord channel and chat with us.
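For context, the baseline the episode builds on is the Adam optimizer. Below is a minimal sketch of a single Adam update step in NumPy; it shows the standard algorithm only, not the improved method discussed in the episode, and the hyperparameter defaults are the commonly cited ones, assumed here purely for illustration.

```python
import numpy as np

def adam_step(params, grads, m, v, t,
              lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One vanilla Adam update step; all arrays share the same shape.

    Illustrative baseline only -- the episode's method modifies this scheme.
    """
    # Update biased first- and second-moment estimates of the gradient.
    m = beta1 * m + (1 - beta1) * grads
    v = beta2 * v + (1 - beta2) * grads ** 2
    # Bias-correct the moment estimates (matters most in early steps, t = 1, 2, ...).
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Apply the adaptive, per-coordinate update.
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v
```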