
Check out my free video series about what's missing in AI and Neuroscience
Support the show to get full episodes, the full archive, and access to the Discord community.
Marc Howard runs his Theoretical Cognitive Neuroscience Lab at Boston University, where he develops mathematical models of cognition constrained by psychological and neural data. In this episode, we discuss the idea that a Laplace transform and its inverse may serve as a unified framework for memory. In short, our memories are compressed on a continuous log-scale: as memories get older, their representations "spread out" in time. This kind of representation appears to be ubiquitous in the brain, suggesting it is likely a canonical computation our brains use across a wide variety of cognitive functions. We also discuss some of the ways Marc is incorporating this mathematical operation into deep learning networks to improve their ability to handle information at different time scales.
0:00 - Intro
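For listeners who want to see the math in action, here is a minimal sketch in Python of the idea discussed in the episode: a bank of leaky integrators with log-spaced decay rates maintains a real-valued Laplace transform of the input's past, and a Post-style approximate inverse recovers a fuzzy, log-compressed timeline. The parameter choices (number of integrators, decay-rate range, inverse order k) and the finite-difference derivative are illustrative assumptions, not the lab's exact implementation.

```python
import numpy as np
from math import factorial

# Illustrative sketch, not the lab's exact implementation:
# 64 log-spaced decay rates, Euler integration, and a Post-style
# inverse using k-th numerical derivatives along the s axis.

k = 4                                  # order of the approximate inverse
s = np.logspace(-2, 1, 64)             # log-spaced decay rates
dt = 0.01

def step(F, f_t):
    """One Euler step of dF/dt = -s*F + f(t): a bank of leaky
    integrators holding the Laplace transform of the input's past."""
    return F + dt * (-s * F + f_t)

def inverse(F):
    """Post approximation of the inverse Laplace transform:
    f~(tau*) = (-1)^k / k! * s^(k+1) * d^k F / ds^k, with tau* = k/s.
    Yields a fuzzy estimate of the past on a log-compressed axis."""
    dkF = F
    for _ in range(k):
        dkF = np.gradient(dkF, s)      # numerical derivative w.r.t. s
    f_tilde = ((-1) ** k / factorial(k)) * s ** (k + 1) * dkF
    return k / s, f_tilde              # internal past times tau*, estimate

# Drive the memory with a brief pulse, then let ~5 seconds elapse.
F = np.zeros_like(s)
for t in np.arange(0.0, 5.0, dt):
    F = step(F, 1.0 if t < 0.1 else 0.0)

tau_star, f_tilde = inverse(F)
# The pulse, now ~5 s in the past, appears as a broad bump near
# tau* ~= 5: older events are represented more coarsely.
```

Running this shows the pulse reconstructed as a wide bump near tau* ≈ 5 rather than a sharp spike: the older the event, the coarser its representation on the compressed timeline, which is exactly the "spreading out" described above.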