


Rethink examines emerging issues in politics, society, economics, technology and the UK's place in the world, and how we might approach them differently. We look at the latest thinking and research and discuss new ideas that might make the world a better place.
In this episode, we consider the changing relationship between the public and big tech companies.
Big technology companies have given us incredible social media and online services, but they came with a price: our data. They used it to target advertising and to learn about our likes and dislikes, and the vast majority of us couldn't have cared less about giving up this information.
But Artificial Intelligence products have changed the game, from chatbots that can hold human-like conversations to Generative AI that can write prose or create a picture from a simple text prompt.
And these unthinking machines require endless amounts of data to train them.
Some companies have been quietly changing their terms and conditions to access our social media and messages for AI training. Privacy regulators in the UK have called a halt to this so far, but US consumers don't have that protection.
Developers have also been scraping the internet, gathering both free and copyrighted material, leading to legal actions in the USA, the EU and the UK.
Copyright holders are concerned about the lack of payment or licensing deals, and also that AI imitates their content, putting them out of work. The Government has now launched a consultation to try to balance the needs of AI and the creative industries.
But with some companies refusing to pay for content, creators have a new tool at their disposal - a program that makes stolen pictures poisonous to AI.
Presenter: Ben Ansell
Contributors:
By BBC Sounds
