AI Summer

Nathan Labenz on the future of AI scaling


Nathan Labenz is the host of our favorite AI podcast, the Cognitive Revolution. A self-described “AI scout,” Nathan uses his podcast to explore a wide range of AI advancements, from the latest language models to breakthroughs in medicine and robotics. In this episode, Labenz helps us understand the slowdown in AI scaling that has been reported by some media outlets. Labenz says that AI progress has been “a little slower than I had expected” over the last 18 months, especially when it comes to technology adoption. But Labenz continues to expect rapid progress over the next few years.

Here are some of the key points Nathan Labenz made during the conversation:

* The alleged AI slowdown: Deployment of AI models in everyday life has been limited, but model capabilities have advanced significantly, with expanded context windows, tool use, and multimodality. “I think the last 18 months have gone a little slower than I had expected. Probably more so on the adoption side than the fundamental technology.”

* Scaling laws: Despite rumors of development setbacks, the leaders of the major AI labs maintain that the scaling curve is still steep and that further progress should be expected. “They’re basically all saying that we’re still in the steep part of the S curve, you know, we should not expect things to slow down.”

* Discovering new scientific concepts: AI has identified new protein motifs, suggesting potential for superhuman insights in some domains. “[Researchers] report having discovered a new motif in proteins: a new recurring structure that seems to have been understood by the protein model before it was understood by humans.”

* Inference-time compute: Spending more compute at inference time has significant potential, allowing models to solve complex problems by devoting resources to deeper reasoning. "Anything where there has been a quick objective scoring function available, reinforcement learning has basically been able to drive that to superhuman levels."

* Memory and goal retention: Current transformer-based models lack sophisticated memory and goal retention, but we’re seeing progress through new architectural and operational innovations like runtime fine-tuning. “None of this seems like it really should work. And the fact that it does, I think should kind of keep us fairly humble about how far it could go.”

* AI deception: We’re starting to see AIs prioritizing programmed goals over user instructions, highlighting the risks of scheming and deception in advanced models. “They set up a tension between the goal that the AI has been given and the goal that the user at runtime has. In some cases—not all the time, but a significant enough percentage of the time that it concerns me—when there is this divergence, the AI will outright lie to the user at runtime to pursue the goal that it has.”



This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit www.aisummer.org
AI Summer, by Timothy B. Lee and Dean W. Ball
