
This study examines the training dynamics of deep networks and introduces a novel statistic, local complexity (LC), which measures the concentration of linear regions around data points. The study finds that LC passes through distinct phases during training, with linear regions migrating toward the decision boundary in the final phase. These phases are closely tied to the network's memorization and generalization performance (a toy LC estimator is sketched after the links below).
https://arxiv.org/abs/2310.12977
YouTube: https://www.youtube.com/@ArxivPapers
TikTok: https://www.tiktok.com/@arxiv_papers
Apple Podcasts: https://podcasts.apple.com/us/podcast/arxiv-papers/id1692476016
Spotify: https://podcasters.spotify.com/pod/show/arxiv-papers
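For intuition, here is a minimal sketch of how an LC-style quantity could be probed, assuming LC can be proxied by counting distinct ReLU activation patterns (i.e., linear regions) hit by small random perturbations of an input. This is an illustrative stand-in under that assumption, not the paper's exact estimator; the network architecture, radius, and sample count are hypothetical.

```python
import torch
import torch.nn as nn

def activation_pattern(model: nn.Sequential, x: torch.Tensor) -> tuple:
    """Binary on/off pattern of every ReLU unit for input x.
    Inputs with the same pattern lie in the same linear region."""
    bits = []
    h = x
    for layer in model:
        h = layer(h)
        if isinstance(layer, nn.ReLU):
            bits.append(h > 0)
    return tuple(torch.cat([b.flatten() for b in bits]).tolist())

def local_complexity_proxy(model, x, radius=0.05, n_samples=512):
    """Count distinct activation patterns (~ linear regions) reached by
    random perturbations on a sphere of the given radius around x."""
    patterns = set()
    for _ in range(n_samples):
        d = torch.randn_like(x)
        d = radius * d / d.norm()
        patterns.add(activation_pattern(model, x + d))
    return len(patterns)

# Example: a small ReLU MLP on 2-D inputs (hypothetical architecture).
torch.manual_seed(0)
net = nn.Sequential(nn.Linear(2, 32), nn.ReLU(),
                    nn.Linear(32, 32), nn.ReLU(),
                    nn.Linear(32, 1))
x0 = torch.randn(2)
print("distinct regions near x0:", local_complexity_proxy(net, x0))
```

Tracking this count at training points across epochs would, under the paper's narrative, reveal the phases described above, with region density concentrating near the decision boundary late in training.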
By Igor Melnyk
