
Dave Winer explores his frustrations with ChatGPT's tendency to overcomplicate simple programming tasks. What should be a straightforward request for pagination code—a standard feature in virtually every application—becomes an exhausting back-and-forth where the AI insists on offering alternatives and asking unnecessary follow-up questions rather than directly answering what was asked.
This experience leads to a broader observation about modern digital services: they seem deliberately designed to waste time. Whether it's ChatGPT dragging out interactions, Google's labyrinthine customer support, or intentionally confusing billing statements, there's a pattern of artificial friction that benefits the service provider at the user's expense.
Winer draws an analogy to his own work style, comparing himself to a baseball pitcher with limited innings. Just as modern baseball has shifted from complete games to carefully managed pitch counts, he recognizes that his productive programming hours are finite. The sharpness required for crafting quality software can't be sustained indefinitely, making these AI-induced delays particularly costly.
The core complaint isn't about AI capabilities—ChatGPT remains an incredible tool—but about its personality. These systems fail at being "human" in the worst ways, behaving like colleagues who can't give straight answers and always think they know better. For Winer, the ideal AI assistant would be genuinely subordinate: answering the specific question asked, respecting the user's expertise, and saving the suggestions for when they're actually requested.
Notes prepared by Claude.ai.