
The paper introduces time vectors, a simple method for customizing language models to specific time periods. A time vector is created by finetuning a language model on data from a single time period and then subtracting the weights of the original pretrained model; it specifies a direction in weight space that improves performance on text from that period. Interpolating between time vectors yields models that perform better on intervening and future time periods. These findings are consistent across tasks, domains, model sizes, and time scales, suggesting that time is encoded in the weight space of finetuned models.
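To make the arithmetic concrete, here is a minimal sketch of the time-vector idea described above, assuming PyTorch state_dicts for a pretrained model and two models finetuned on different time periods. Function names, file names, and the alpha value are illustrative assumptions, not taken from the paper's code.

```python
# Minimal sketch of time-vector arithmetic (illustrative, not the authors' code).
import torch

def time_vector(finetuned_sd, pretrained_sd):
    """tau_t = (weights finetuned on period t) - (pretrained weights), per parameter."""
    return {k: finetuned_sd[k] - pretrained_sd[k] for k in pretrained_sd}

def interpolate(pretrained_sd, tau_a, tau_b, alpha):
    """theta = pretrained + alpha * tau_a + (1 - alpha) * tau_b,
    targeting a time period between (or beyond) the two finetuning periods."""
    return {k: pretrained_sd[k] + alpha * tau_a[k] + (1 - alpha) * tau_b[k]
            for k in pretrained_sd}

# Hypothetical usage with two yearly checkpoints:
# pre      = torch.load("model_pretrained.pt")
# sd_2012  = torch.load("model_finetuned_2012.pt")
# sd_2016  = torch.load("model_finetuned_2016.pt")
# tau_2012 = time_vector(sd_2012, pre)
# tau_2016 = time_vector(sd_2016, pre)
# model.load_state_dict(interpolate(pre, tau_2012, tau_2016, alpha=0.5))
```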
https://arxiv.org/abs//2312.13401
YouTube: https://www.youtube.com/@ArxivPapers
TikTok: https://www.tiktok.com/@arxiv_papers
Apple Podcasts: https://podcasts.apple.com/us/podcast/arxiv-papers/id1692476016
Spotify: https://podcasters.spotify.com/pod/show/arxiv-papers
By Igor Melnyk
