The Gradient: Perspectives on AI

Christopher Manning: Linguistics and the Development of NLP


Have suggestions for future podcast guests (or other feedback)? Let us know here!

In episode 41 of The Gradient Podcast, Daniel Bashir speaks to Christopher Manning.

Chris is the Director of the Stanford AI Lab and an Associate Director of the Stanford Institute for Human-Centered Artificial Intelligence (HAI). He is an ACM Fellow, an AAAI Fellow, and a past President of the ACL. His work focuses on applying deep learning to natural language processing and has included tree-recursive neural networks, GloVe, neural machine translation, and computational linguistic approaches to parsing, among other topics.

Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS

Follow The Gradient on Twitter

Outline:

* (00:00) Intro

* (02:40) Chris’s path to AI through computational linguistics

* (06:10) Human language acquisition vs. ML systems

* (09:20) Grounding language in the physical world, multimodality and DALL-E 2 vs. Imagen

* (26:15) Chris’s Linguistics PhD, splitting time between Stanford and Xerox PARC, corpus-based empirical NLP

* (34:45) Rationalist and Empiricist schools in linguistics, Chris’s work in the 1990s

* (45:30) GloVe and Attention-based Neural Machine Translation, global and local context in language

* (50:30) Different Neural Architectures for Language, Chris’s work in the 2010s

* (58:00) Large-scale pretraining: how learning to predict the next word helps you learn about the world

* (1:00:00) mBERT’s Internal Representations vs. Universal Dependencies Taxonomy

* (1:01:30) The Need for Inductive Priors for Language Systems

* (1:05:55) Courage in Chris’s Research Career

* (1:10:50) Outro (yes Daniel does have a new outro with ~ music ~)

Links:

* Chris’s webpage

* Papers (1990s-2000s)

* Distributional Phrase Structure Induction

* Fast exact inference with a factored model for Natural Language Parsing

* Accurate Unlexicalized Parsing

* Corpus-based induction of syntactic structure

* Foundations of Statistical Natural Language Processing

* Papers (2010s)

* Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank

* GloVe

* Effective Approaches to Attention-based Neural Machine Translation

* Stanford’s graph-based neural dependency parser

* Papers (2020s)

* ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators

* Finding Universal Grammatical Relations in Multilingual BERT

* Emergent linguistic structure in artificial neural networks trained by self-supervision



Get full access to The Gradient at thegradientpub.substack.com/subscribe