

Theresa May has been forced to ditch whole chunks of her party's manifesto in the wake of the election, but one of the key non-Brexit policies to survive is the plan to crack down on tech companies that allow extremist and abusive material to be published on their networks. The recent terrorist attacks have strengthened the arguments of campaigners who've long said that it's far too easy to access this kind of content and have accused internet companies of wilfully ignoring the problem. The promised "Digital Charter" will aim to force those companies to do more to protect users and improve online safety. With the growing power of tech giants like Google, Facebook and Twitter, connecting billions of people around the globe, is it pie in the sky to promise that Britain will be the safest place to be online? On one level this is a moral argument which has been going on for centuries, about what we should, and should not, be allowed to read and see, and who should make those decisions. But is this a bigger problem than freedom of speech? Have we reached a tipping point where the moral, legal, political and social principles that have guided us in this field have been made redundant by the technology? Do we need to find a new kind of moral philosophy that can survive in a digital age and tame the power of the tech corporations? Or is the problem uncomfortably closer to home - a question that each and every one of us has to face up to? Tim Cook, the chief executive of Apple, recently said that he was concerned about new technologies making us think like computers "without values or compassion, without concern for consequence." Witnesses are Nikita Malik, Tom Chatfield, Mike Harris and Mariarosaria Taddeo.
By BBC Radio 4