

This is a massive week for anyone who’s been watching Big Tech’s impact on kids. Internal documents from Meta, Google, YouTube, Snapchat, and TikTok are being made public as part of major lawsuits, and what they reveal is damning.
Two themes emerge. First: the business value of kids. A 2020 Google presentation literally says “solving kids is a massive opportunity.” An internal Facebook email from 2016 identifies the company’s top priority as “total teen time spent.” These companies clearly saw children as a pipeline of new users to be captured.
Second: they knew about the harm. An Instagram internal study from 2018 documented that “teens weaponize Instagram features to torment each other” and that “most participants regret engaging in conflicts.” TikTok’s own strategy documents admit the platform “is particularly popular with younger users who are particularly sensitive to reinforcement and have minimal ability to self-regulate.” YouTube identified “late night use, heavy habitual use, and problematic content” as root causes of harm.
They knew.
As I discuss here, I want this moment to establish a new legal framework in America — one that recognizes behavioral harm the same way we recognize physical and financial harm. We’ve done it before with tobacco. We can do it again with social media. And this might be the beginning.
By Jacob Ward
