
In the latest episode of The New Stack Agents, Naveen Rao, VP of AI at Databricks and a former neuroscientist, reflects on the evolution of AI, neural networks, and the energy constraints that define both biological and artificial intelligence. Rao, who once built circuit systems as a child and later studied the brain’s 20-watt efficiency at Duke and Brown, argues that current AI development—relying on massive energy-intensive data centers—is unsustainable. He believes true intelligence should emerge from low-power, efficient systems, more aligned with biological computing.
Rao warns that the industry is headed toward “model collapse,” where large language models (LLMs) begin training on AI-generated content instead of real-world data, leading to compounding inaccuracies and hallucinations. He stresses the importance of grounding AI in reality and moving beyond brute-force scaling. Rao sees intelligence not just as a function of computing power, but as a distributed, observational system—“life is a learning machine,” he says—hinting at a need to fundamentally rethink how we build AI.
Learn more from The New Stack about the evolution of AI and neural networks:
The 50-Year Story of the Rise, Fall, and Rebirth of Neural Networks
The Evolution of the AI Stack: From Foundation to Agents
Join our community of newsletter subscribers to stay on top of the news and at the top of your game.