
This is a massive week for anyone who’s been watching Big Tech’s impact on kids. Internal documents from Meta, Google, YouTube, Snapchat, and TikTok are being made public as part of major lawsuits, and what they reveal is damning.
Two themes emerge. First: the business value of kids. A 2020 Google presentation literally says “solving kids is a massive opportunity.” An internal Facebook email from 2016 identifies the company’s top priority as “total teen time spent.” These companies clearly saw children as a pipeline of new users to be captured.
Second: they knew about the harm. An Instagram internal study from 2018 documented that “teens weaponize Instagram features to torment each other” and that “most participants regret engaging in conflicts.” TikTok’s own strategy documents admit the platform “is particularly popular with younger users who are particularly sensitive to reinforcement and have minimal ability to self-regulate.” YouTube identified “late night use, heavy habitual use, and problematic content” as root causes of harm.
They knew.
As I discuss here, I want this moment to establish a new legal framework in America — one that recognizes behavioral harm the same way we recognize physical and financial harm. We’ve done it before with tobacco. We can do it again with social media. And this might be the beginning.
By Jacob Ward
