Token Intelligence

AI's chat interface problem and Lobe's imaginary seed round

Eric and John riff on Lobe's seed round, then dive deep on why chat is the wrong UI for most AI. They unpack the blank page problem, why context matters, and how embedded AI will replace chat.

Summary

In Episode 2, Lobe gets a theoretical $3 million seed round, and Eric and John discuss how they would deploy the capital, including potential acquisitions.

Next, they dive into a detailed discussion of why chat has become the ubiquitous UI for AI. Eric feels strongly about its shortcomings, which include poor literacy rates, the blank page problem, and the narrow set of use cases chat is actually good for. The "why" is even more interesting: their hypothesis is that cost is a primary driver, given how expensive it is to run models at scale.

They wrap up by imagining a future where AI disappears from interfaces altogether and is embedded natively in intuitive, multimodal user experiences.

Key takeaways
Lobe.ai

Lobe’s path forward: acquire and partner for distribution (apps/sleep brands), integrate biometrics for REM triggers, and monetize interpretation and creative outputs.

The AI chat interface

Chat is the wrong default interface for AI: it shines for search and inside high-context environments with clear task frames, but in most other cases it obscures the power of the underlying tools.

Fundamental barriers limit the utility of chat: Americans have low literacy rates, and combined with the blank page problem, this will limit the value people can get from AI through a text box.

Context is king: multimodal, embedded AI will replace generic chat for many jobs. Think IDEs, docs, and app-native flows that deliver value in place.

Hard costs influence the interface: cost and infra realities favor user-initiated interactions now; as economics improve, proactive, background “agentic” features will grow.

Notable mentions with links

Poe (by Quora) is shown as a chat aggregator illustrating how many tools converge on chat as the primary interface.

Notion AI is used to demonstrate higher-context chat inside documents. It's helpful, but with UX pitfalls (e.g., overwriting content and unclear "terms of the transaction").

Cursor (AI IDE) is highlighted as a high-context environment where chat + multimodal controls (browser, on‑page edits) make AI assistance more precise and useful.

v0 is referenced as a multimodal design/build flow that lets users edit generated UI directly, going beyond pure chat to reduce the blank-page burden.

Rabbit R1 is discussed as an alternative, voice‑forward hardware form factor pushing beyond chat, with lessons about timing, expectations, and risk.

Naveen Rao (Databricks) is quoted arguing that generic chat is “the worst interface for most apps,” calling for insight delivered “at the right time in the right context.”

Benedict Evans is cited for the idea that most people will experience LLMs embedded inside apps rather than as standalone chatbots, similar to how SQL is invisible in products.

Jakob Nielsen is noted for the view that prompt engineering’s rise signals a UX gap, and that AI needs a Google‑level leap in usability to cross the chasm.

Low literacy rates are discussed as a key limiter: good writers tend to extract more value from chat tools.


Token Intelligence, by Eric Dodds & John Wessel