


New Mexico Attorney General Raúl Torres's case against Meta was the first to go to trial — and the price this case sets for Meta could cascade across more than 1,500 lawsuits against social media companies. On this episode, we talk about what his case actually alleges: not that Meta failed to police bad content, but that Meta's own design choices — the recommendation algorithm, the "people you may know" feature, the engagement-above-safety tradeoffs documented in internal company documents — actively connected sexual predators to children. And that Meta knew it.

Torres ran an undercover investigation posing as a preteen girl on Meta's platforms. What followed was a flood of sexual solicitations. When Meta challenged the results, Torres ran it again as a criminal sting. Three men showed up at a hotel expecting to meet children, and wound up in handcuffs. Torres says the internal documents he's obtained in discovery show roughly half a million children in English-speaking markets are exposed to inappropriate sexual content on Meta's platforms every single day — and that safety concerns about these features were repeatedly overruled by executives focused on engagement and revenue.

This case isn't just about New Mexico. It's about what happens when the per-user damages number is established in a small state and then applied to California, Texas, New York, and Florida. It's about whether Section 230 — the 30-year-old legal shield that has protected social media companies from liability for content — can be circumvented by focusing on design rather than content. And it's about whether the legal strategy that revealed the tobacco playbook — knowing your product is dangerous, hiding it, marketing it as safe — will reveal something similar was going on inside social media companies.

Originally published at The Rip Current. Paid subscribers get early access + full written analysis: https://theripcurrent.com
By Jacob Ward
