
This paper explores critical learning periods in deep linear network models and shows that these periods depend on the depth of the model and the structure of the data distribution. The study also examines how pre-training affects transfer performance in multi-task learning.
https://arxiv.org/abs/2308.12221
YouTube: https://www.youtube.com/@ArxivPapers
PODCASTS:
Apple Podcasts: https://podcasts.apple.com/us/podcast/arxiv-papers/id1692476016
Spotify: https://podcasters.spotify.com/pod/show/arxiv-papers
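The critical-period question can be probed with a simple deficit-removal experiment on a deep linear network: apply a temporary input corruption during different windows of training and compare final performance on clean data. The sketch below only illustrates that paradigm and is not the paper's actual setup; the synthetic data, hyperparameters, and the added-noise "deficit" are all assumptions chosen for demonstration.

```python
import numpy as np

def train(depth=4, dim=20, n=500, epochs=300, lr=0.02, deficit_window=None):
    """Train a deep linear net y = x @ W_1 @ ... @ W_L with full-batch gradient descent.
    deficit_window: (start, end) epochs during which inputs are corrupted by noise."""
    rng = np.random.default_rng(0)           # same data and init for every condition
    X = rng.normal(size=(n, dim))
    W_true = rng.normal(size=(dim, dim)) / np.sqrt(dim)
    Y = X @ W_true
    # Near-identity initialization keeps the deep product well-conditioned.
    Ws = [np.eye(dim) + 0.01 * rng.normal(size=(dim, dim)) for _ in range(depth)]

    for epoch in range(epochs):
        Xe = X
        if deficit_window and deficit_window[0] <= epoch < deficit_window[1]:
            Xe = X + rng.normal(scale=2.0, size=X.shape)   # temporary input deficit
        # Forward pass, keeping intermediate activations for backprop.
        acts = [Xe]
        for W in Ws:
            acts.append(acts[-1] @ W)
        err = acts[-1] - Y                    # loss = sum(err**2) / n
        grad = 2.0 * err / n                  # d(loss)/d(output)
        # Backward pass through the linear layers.
        for i in reversed(range(depth)):
            gW = acts[i].T @ grad
            grad = grad @ Ws[i].T
            Ws[i] -= lr * gW

    # Evaluate on clean inputs after training.
    out = X
    for W in Ws:
        out = out @ W
    return np.sum((out - Y) ** 2) / n

print("no deficit:             ", train())
print("deficit, epochs 0-100:  ", train(deficit_window=(0, 100)))
print("deficit, epochs 150-250:", train(deficit_window=(150, 250)))
```

Comparing the early-deficit and late-deficit runs against the clean baseline shows whether the timing of the deficit, and not just its duration, affects the final loss; depth and the data distribution can be varied through the function arguments.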
By Igor Melnyk
