Microsoft Research Podcast

Abstracts: October 9, 2023



Members of the research community at Microsoft work continuously to advance their respective fields. Abstracts brings its audience to the cutting edge with them through short, compelling conversations about new and noteworthy achievements. 

In this episode, Dr. Sheng Zhang, a Senior Researcher at Microsoft Research, joins host Dr. Gretchen Huizinga to discuss “UniversalNER: Targeted Distillation from Large Language Models for Open Named Entity Recognition.” In this paper, Zhang and his coauthors present mission-focused instruction tuning, a method for distilling large language models into smaller, more efficient ones for a broad application class. Their UniversalNER models achieved state-of-the-art performance in named entity recognition, an important natural language processing (NLP) task. Model distillation has the potential to make NLP and other capabilities more accessible, particularly in specialized domains such as biomedicine, which could benefit from more resource-efficient and transparent options. 
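To make the task concrete: named entity recognition takes free text and returns typed spans such as people or organizations. The sketch below is only a toy dictionary-based tagger showing the task's input/output shape, not the UniversalNER method (which distills an instruction-tuned model able to recognize arbitrary entity types); the function name and example data are illustrative assumptions.

```python
# Toy illustration of named entity recognition (NER): map text to typed spans.
# NOT the UniversalNER approach; just a minimal sketch of what NER produces.

def toy_ner(text: str, gazetteer: dict[str, str]) -> list[tuple[str, str]]:
    """Return (span, entity_type) pairs found in `text` by exact lookup."""
    return [(span, etype) for span, etype in gazetteer.items() if span in text]

# Hypothetical entity dictionary (illustrative only).
entities = {"Microsoft": "ORGANIZATION", "Sheng Zhang": "PERSON"}
print(toy_ner("Sheng Zhang is a researcher at Microsoft.", entities))
# → [('Microsoft', 'ORGANIZATION'), ('Sheng Zhang', 'PERSON')]
```

A real open-NER system must handle entity types it was never given a dictionary for, which is exactly the capability the paper distills from a large language model into a smaller one.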


Learn more:

  • View the paper
  • UniversalNER project website with demo
  • Code on GitHub
  • Dataset and models on Hugging Face

Microsoft Research Podcast, by researchers across the Microsoft research community

4.8 (80 ratings)

