


A 20-year-old woman started using YouTube at age six and Instagram at age nine. She's now suing both companies, and her case has just become the most important tech trial since the DOJ went after Microsoft in 1998. Here's what's actually at stake — and why it matters whether you're a parent or not.
The trial isn't just about one person's mental health. It's a bellwether case for more than 1,500 similar lawsuits waiting in the pipeline, and the first time CEOs of major social media platforms — including Mark Zuckerberg, who testifies this week — have had to answer questions in front of a jury rather than a Senate subcommittee. The internal documents already in evidence are extraordinary: YouTube memos describing "viewer addiction" as a goal, Meta's Project Myst finding that traumatized kids were especially vulnerable to the platform and that parental controls made almost no difference, and a strategy document laying out a pipeline designed to bring kids in as tweens and keep them as teens.
The central legal question is whether Section 230 — the 1996 law that has shielded every major platform from liability for nearly 30 years — protects design decisions like infinite scroll, autoplay, and the Like button. The judge has already ruled that the jury can consider design liability. If that argument wins, it changes the legal landscape for every platform that has ever made an engineering choice optimized for engagement. Nobody voted on infinite scroll. No regulator approved autoplay. A small group of engineers and executives made those decisions, and billions of people — including six-year-olds — inherited the results. A Los Angeles jury is now being asked to weigh in on that.
Originally published at The Rip Current. Paid subscribers get early access + full transcripts: https://theripcurrent.substack.com
By Jacob Ward