


We dive into Talkie, a 13‑billion‑parameter AI raised in a sealed pre‑1931 library. Trained on 260 billion words published before 1931 and guided by etiquette manuals, Victorian prose, and historical letters, Talkie challenges our ideas of AI reasoning, generalization, and how a mind built from the past perceives the future. We explore how it learns to converse without modern data, its surprising ability to encode modern concepts like programming languages, and the engineering battles against temporal leakage and OCR quirks. A thought-provoking look at how training data shape intelligence—and what a mind forged in the past can reveal about the future of AI.
Note: This podcast was AI-generated, and sometimes AI can make mistakes. Please double-check any critical information.
Sponsored by Embersilk LLC
By Mike Breault