This is AGI (S1E4): Hallucinations

Hallucinating LLMs are a critical step towards artificial general intelligence (AGI). Rather than trying to fix them, we should build more complex agents that channel the LLMs' runaway creativity into self-perpetuating cycles of knowledge discovery.

'This Is AGI' is a podcast about the path to artificial general intelligence. Listen every Monday morning on your favourite podcast platform.


This is AGI, by Alex Chadyuk