This is AGI

Measuring AI Intelligence with Information Theory | Entropy, Generalization & AGI Metrics



What if AI intelligence could be measured in bytes instead of test scores? In this episode of This Is AGI, Alex Chadyuk introduces a principled, information-theoretic approach to evaluating artificial intelligence, moving beyond benchmarks like ARC or FrontierMath toward entropy reduction as a true measure of generalization. Learn how Shannon information, uncertainty reduction, and model size combine into a scientific framework for “intelligence density,” and why this matters for the future of AGI and frontier AI model evaluation.
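The episode doesn't give a formula, but the idea of combining Shannon entropy reduction with model size can be sketched numerically. The toy task, the probabilities, and the name `intelligence_density` below are all hypothetical illustrations, not the show's actual metric: we measure how many bits of uncertainty a model's prediction removes, then normalize by model size in bytes.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical task: 8 equally likely answers before the model predicts.
prior = [1 / 8] * 8                 # H(prior) = 3.0 bits of uncertainty

# After the model's prediction, probability mass concentrates on one answer.
posterior = [0.9] + [0.1 / 7] * 7   # entropy drops well below 3 bits

# Information gained = bits of uncertainty the model removed.
reduction = entropy_bits(prior) - entropy_bits(posterior)

# Hypothetical "intelligence density": bits reduced per byte of model.
model_size_bytes = 1_000_000
intelligence_density = reduction / model_size_bytes

print(f"prior entropy:  {entropy_bits(prior):.3f} bits")
print(f"reduction:      {reduction:.3f} bits")
print(f"density:        {intelligence_density:.3e} bits/byte")
```

Under this framing, two models scoring identically on a benchmark can differ sharply in density: the smaller model that achieves the same entropy reduction is, per byte, the more "intelligent" one.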


This Is AGI, by Alex Chadyuk