
Members of Congress and state legislatures are taking aim at online platforms’ ability to set and enforce content-moderation guidelines as private entities. Several proposals in Congress would scrap or amend Section 230 of the Communications Decency Act, which shields online platforms from liability for user-generated content. At the state level, “anti-censorship” laws seek to prevent online platforms from taking down certain content — potentially violating the First Amendment.
Online platforms are also struggling to produce content-moderation strategies that satisfy increasingly polarized users. How can Congress, state officials, and social media firms address users’ content-moderation concerns while preserving the free and open internet?
Join AEI’s Shane Tews for a fireside chat with former Rep. Chris Cox (R-CA), who coauthored Section 230.
You can watch the event here.