


A little more than a year ago, in the first article announcing the release of the Facebook Files, the documents brought out of the company by whistleblower Frances Haugen, the Wall Street Journal's Jeff Horwitz reported on Cross Check, a Facebook system that "exempted high-profile users from some or all" of the platform's rules. The program shields millions of elite users from normal content moderation enforcement. While the existence of such a program was known, its scale was, and perhaps still is, shocking.
Following the Journal's reporting and the ensuing public concern, Facebook (now Meta) President of Global Affairs Nick Clegg announced that the company would request a policy advisory opinion from its independent Oversight Board. Fourteen months later, the Oversight Board has completed its review and published its opinion.
To talk more about the opinion, the Cross Check system, and the problem of content moderation more generally, I'm joined by one member of the Oversight Board, Nighat Dad, a lawyer from Pakistan and founder of the Digital Rights Foundation, and one outside observer who answered the board's call for comments on the Cross Check system: Chris Riley, senior fellow at the R Street Institute and distinguished research fellow at the University of Pennsylvania's Annenberg Public Policy Center.
By Tech Policy Press