Papers Read on AI

LinkBERT: Pretraining Language Models with Document Links



Language model (LM) pretraining can learn various kinds of knowledge from text corpora, helping downstream tasks. However, existing methods such as BERT model a single document and do not capture dependencies or knowledge that span across documents. In this work, we propose LinkBERT, an LM pretraining method that leverages links between documents, e.g., hyperlinks.
2022: Michihiro Yasunaga, Jure Leskovec, Percy Liang
Ranked #1 on Text Classification on BLURB
https://arxiv.org/pdf/2203.15827v1.pdf
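The paper's core idea is to place a pair of text segments in the same pretraining context and train the model, alongside masked language modeling, to classify the relation between them: contiguous (same document), random (unrelated document), or linked (connected by a hyperlink). Below is a minimal sketch of how such input pairs might be constructed; the toy corpus, function names, and data layout are illustrative assumptions, not the authors' released code.

```python
import random

# Toy corpus: each document has text segments and a set of hyperlink
# targets. This layout is a hypothetical stand-in for a real linked
# corpus such as Wikipedia.
corpus = {
    "doc_a": {"segments": ["Segment A1.", "Segment A2."], "links": ["doc_b"]},
    "doc_b": {"segments": ["Segment B1.", "Segment B2."], "links": []},
    "doc_c": {"segments": ["Segment C1.", "Segment C2."], "links": []},
}

def make_pretraining_pair(doc_id, seg_idx):
    """Pair an anchor segment with a second segment that is either
    contiguous (next segment of the same doc), random (from an
    unrelated doc), or linked (from a doc the anchor hyperlinks to).
    The relation serves as the 3-way label for the paper's Document
    Relation Prediction (DRP) objective."""
    doc = corpus[doc_id]
    anchor = doc["segments"][seg_idx]
    choice = random.choice(["contiguous", "random", "linked"])
    if choice == "contiguous" and seg_idx + 1 < len(doc["segments"]):
        return anchor, doc["segments"][seg_idx + 1], "contiguous"
    if choice == "linked" and doc["links"]:
        target = corpus[random.choice(doc["links"])]
        return anchor, random.choice(target["segments"]), "linked"
    # Fall back to a random pair if the chosen relation is unavailable.
    other = random.choice([d for d in corpus if d != doc_id])
    return anchor, random.choice(corpus[other]["segments"]), "random"

seg_a, seg_b, relation = make_pretraining_pair("doc_a", 0)
# Each pair would then be packed as [CLS] seg_a [SEP] seg_b [SEP] and
# trained with masked language modeling plus DRP classification.
print(relation, "|", seg_a, "|", seg_b)
```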

Papers Read on AI, by Rob

Rating: 3.7 (3 ratings)