


The future of AI isn't digital; it's analog. We investigate the resurrection of Analog Computing, a technology abandoned in the 1970s that is now being rebuilt to solve the "von Neumann Bottleneck." We break down why digital chips are too slow and power-hungry for AI, and how new "Neuromorphic" chips work like the human brain.
1. The "Energy Gap": We analyze the physics. A digital computer needs thousands of transistors to multiply two numbers; an analog circuit needs just one wire. We explain how this six-orders-of-magnitude efficiency gap is driving companies like Mythic and Intel to build chips that use light and voltage instead of 1s and 0s to process data at the speed of electricity (a toy sketch of the single-wire multiply follows this list).
2. The "Memory Wall": Why is AI so slow? We explore the architectural flaw of modern computers. The CPU spends most of its time and energy moving data back and forth from memory (the "von Neumann Bottleneck"). We discuss how "In-Memory Computing" solves this by doing the math inside the storage, eliminating the traffic jam (see the crossbar sketch below the list).
3. The "Collateral Learning" Tool: It's not just for big tech. We explore the educational revival. Startups are selling modern, portable analog computers to students, forcing them to physically wire patch cables to model complex systems (like pandemics or markets). We argue that this "hands-on" math teaches a deeper understanding of dynamic systems than typing code ever could (see the epidemic-model sketch below).
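
As a rough illustration of that energy contrast (our own toy example, not taken from the episode or any vendor's design), the Python sketch below compares a digital multiply built from shift-and-add gate operations with an analog cell where Ohm's law, I = G x V, does the multiply in a single physical step. All names and values are hypothetical.

```python
# Toy contrast, not a benchmark: digital multiplication is built from many
# logic-gate operations, while an analog memory cell multiplies with one
# physical relation, I = G * V (Ohm's law).

def digital_multiply_8bit(a, b):
    """Shift-and-add multiplier, roughly how binary logic actually does it."""
    product = 0
    for bit in range(8):                 # one partial product per input bit
        if (b >> bit) & 1:
            product += a << bit
    return product & 0xFFFF

def analog_multiply(voltage, conductance):
    """One resistive cell: the output current *is* the product."""
    return voltage * conductance

print(digital_multiply_8bit(12, 10))     # 120, via repeated shifts and adds
print(analog_multiply(0.5, 0.8))         # 0.4 "amps", from a single wire
```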
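
The same idea is what "doing the math inside the storage" means at layer scale. In the hypothetical crossbar simulated below, the weight matrix is stored as conductances, input voltages drive the rows, and the column currents are already the matrix-vector product, so nothing is shuttled to a separate processor. This NumPy sketch only mimics that arithmetic; it is not Mythic's or Intel's actual programming interface.

```python
import numpy as np

# Simulated analog crossbar: weights live in the array as conductances (G),
# inputs arrive as row voltages (V), and each column current is
# sum(G[row, col] * V[row]) by Ohm's and Kirchhoff's laws.
# Values are hypothetical, chosen only for illustration.

weights_as_conductances = np.array([[0.2, 0.7],
                                    [0.5, 0.1],
                                    [0.9, 0.4]])   # 3 inputs x 2 outputs
input_voltages = np.array([1.0, 0.5, 0.25])        # one voltage per row

# The "computation" is just reading out the column currents:
column_currents = input_voltages @ weights_as_conductances
print(column_currents)   # the matrix-vector product, produced in the array
```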
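
And a guess at the kind of exercise those educational machines are sold for: an SIR epidemic model, the sort of dynamic system a student would patch together with cables and integrators. Below, simple Euler steps stand in for the machine's physical integrators; the parameter values are invented for illustration.

```python
# SIR epidemic model, emulated digitally. On an analog computer each of the
# three rates below would be wired with patch cables, and each running total
# would be a capacitor integrating continuously instead of an Euler step.

beta, gamma = 0.3, 0.1            # infection and recovery rates (hypothetical)
S, I, R = 0.99, 0.01, 0.0         # initial population fractions
dt = 0.1

for _ in range(1000):             # 100 simulated time units
    dS = -beta * S * I
    dI = beta * S * I - gamma * I
    dR = gamma * I
    S, I, R = S + dS * dt, I + dI * dt, R + dR * dt

print(round(S, 3), round(I, 3), round(R, 3))   # final susceptible/infected/recovered
```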
The full list of sources used to create this episode can be found on our Patreon under https://www.patreon.com/c/Morgrain

By Morgrain