The old maxim holds that a lie spreads much faster than a truth, but it has taken the global reach and lightning speed of social media to lay it bare before the world.
One problem of the age of misinformation, says sociologist and former journalist Mutale Nkonde, a fellow at the Stanford Center on Philanthropy and Civil Society (PACS), is that the artificial intelligence algorithms used to profile users and deliver information to them, truthful or not, are inherently biased against minority groups, who are underrepresented in the historical data on which the algorithms are trained.
Now, Nkonde and others like her are holding social media companies’ feet to the fire to get them to root out bias from their algorithms. One approach she promotes is the Algorithmic Accountability Act, which would authorize the Federal Trade Commission (FTC) to create regulations requiring companies under its jurisdiction to assess the impact of new and existing automated decision systems. Another approach she favors, called “Strategic Silence,” seeks to deny untruthful users and groups the media exposure that amplifies their false claims and helps them attract new adherents.
Nkonde explores the hidden biases of the age of misinformation in this episode of Stanford Engineering’s The Future of Everything podcast, hosted by bioengineer Russ Altman. Listen and subscribe here.
Connect With Us:
Episode Transcripts >>> The Future of Everything Website
Connect with Russ >>> Threads / Bluesky / Mastodon
Connect with School of Engineering >>> Twitter/X / Instagram / LinkedIn / Facebook