
Rethink examines emerging issues in politics, society, economics, technology and the UK's place in the world, and how we might approach them differently. We look at the latest thinking and research and discuss new ideas that might make the world a better place.
In this episode, we consider the changing relationship between the public and big tech companies.
Big technology companies have given us incredible social media and online services, but they came at a price: our data. They used it to target advertising and to learn about our likes and dislikes, and the vast majority of us couldn't have cared less about giving up this information.
But Artificial Intelligence products have changed the game, from chatbots that can hold human-like conversations to Generative AI that can write prose or create a picture from a simple text prompt.
And these unthinking machines require endless amounts of data to train them.
Some companies have been quietly changing their terms and conditions to access our social media posts and messages for AI training. Privacy regulators in the UK have called a halt to this so far, but US consumers don't have that protection.
Developers have also been scraping the internet, gathering both free and copyrighted material, leading to legal actions in the USA, the EU and the UK.
Copyright holders are concerned about the lack of payment or licensing deals, and also that AI imitates their content, putting them out of work. The Government has now launched a consultation to try to balance the needs of AI and the creative industries.
But with some companies refusing to pay for content, creators have a new tool at their disposal - a program that makes stolen pictures poisonous to AI.
Presenter: Ben Ansell
Contributors:
By BBC Sounds