
There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out! This approach does not generalize well.
Join our Discord channel and chat with us.
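For context, the baseline the episode refers to is the standard Adam optimizer, which keeps exponential moving averages of the gradient and of its square and uses them as an adaptive per-coordinate step size. The sketch below is a minimal, illustrative NumPy version of vanilla Adam only; it is not the improved method discussed in the episode, and the function name and hyperparameter defaults are assumptions (the defaults follow the original Adam paper).

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of plain Adam (illustrative sketch, not the episode's method).

    theta: parameters, grad: gradient at theta,
    m, v: first/second moment estimates, t: step count starting at 1.
    """
    # Exponential moving averages of the gradient and the squared gradient.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction for the zero-initialized moment estimates.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Update scaled by the adaptive per-coordinate learning rate.
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

In practice m and v are initialized to zeros with the same shape as theta, and the step counter t is incremented before each call so the bias correction stays well defined.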