


"We need the industry to develop guidelines, and values, and professional ethics," says Gilad Edelman, a journalist at WIRED who covers the politics of technology.
Free speech, misinformation, and content moderation on social media platforms like Facebook have played a significant role in political discourse since the Trump presidency and the COVID-19 pandemic. To what extent do companies have a right, or a responsibility, to patrol their platforms when content might be harmful? Gilad shifts this conversation away from all-or-nothing principles toward a case-by-case, comprehensive decision-making process.
By Justin Ahn