If you’ve spent any time playing with modern AI image generators, the experience can seem almost magical. But the truth is that these programs are more like a magic trick than magic: without the human-generated art of hundreds of thousands of people, they wouldn’t work. Yet those artists are not being compensated; in fact, many of them are being put out of business by the very programs their work helped create.
Now, two computer scientists from the University of Chicago, Ben Zhao and Heather Zheng, are fighting back. They’ve developed two programs, Glaze and Nightshade, which add a kind of “poison pill” to images, helping artists protect their copyrighted, original work against generative AI tools like Midjourney and DALL-E. Their work may also reshape how all of us relate to these systems.