Right now, the tech world is caught in an endless loop of throwing massive compute power at Large Language Models, hoping brute force will magically spark Artificial General Intelligence (AGI). But what if the foundational computing architecture is entirely wrong?
In this episode, we sit down with Ian Hamilton, CEO of Synthetic Cognition Labs, who is walking away from standard models in his pursuit of true AGI.
Ian dismantles complex ideas, detailing why current AI is essentially faking memory and why the path forward lies in hyperdimensional computing. By exploring the friction between biology and technology, we examine how mapping the neural networks of a fruit fly provides a better roadmap for continuous learning than a billion-dollar GPU cluster. You'll learn the critical difference between LLM tokenization and human "analogy-making," and why breaking the AI scale monopoly might require us to nuke everything we know about computing and start over.
If you are tired of the AI buzzword salad and want to decode the future, this is your blueprint.
Follow Ian: https://www.linkedin.com/in/ianchamilton1/
Watch us on YouTube: https://www.youtube.com/watch?v=Rd0SpOb5gMo
https://3reate.com
Listen:
https://podcasts.apple.com/us/podcast/3reate/id1723426314
https://open.spotify.com/show/48Y2M7Ppja43Uq2wlyUtPF
https://youtu.be/2wEMD8EvB9I?si=G3iUBE-z4Mx0Ng-Y
By 3reate