


When bad actors use AI tools to clone a musician’s voice and upload synthetic versions of their songs, they can then file copyright claims against the original artist’s content — and win, at least initially. That’s because the systems platforms use to validate copyright claims are automated and configured to treat whoever files first as the rightful holder. The result: musicians like Murphy Campbell, a folk artist from North Carolina, lose both revenue and control of their own creative identity.
The same mechanism works just as well against any organization that publishes audio or video content online. In this midweek episode, Shel Holtz and Neville Hobson break down how the scam works, why it matters to communicators, and what you should be doing right now — before an incident forces your hand.
Links from this episode:
The next monthly, long-form episode of FIR will drop on Monday, April 27.
We host a Communicators Zoom Chat most Thursdays at 1 p.m. ET. To obtain the credentials needed to participate, contact Shel or Neville directly, request them in our Facebook group, or email [email protected].
Special thanks to Jay Moonah for the opening and closing music.
You can find the stories from which Shel’s FIR content is selected at Shel’s Link Blog. You can catch up with both co-hosts on Neville’s blog and Shel’s blog.
Disclaimer: The opinions expressed in this podcast are Shel’s and Neville’s and do not reflect the views of their employers and/or clients.
Raw Transcript
Neville Hobson: Hi everyone and welcome to For Immediate Release, this is episode 509. I’m Neville Hobson.
Shel Holtz: And I’m Shel Holtz. And today we’re going to talk about something else that communicators need to worry about. I think we need to develop a worry list for communicators. This one starts with a tale about a folk singer from the mountains of Western North Carolina. She’s named Murphy Campbell. She plays banjo and dulcimer and records old Appalachian ballads, some of them written by her own distant relatives. And she posts videos of herself performing in the woods. She has about 7,800 monthly listeners on Spotify. And she is, as Shelly Palmer put it in a recent column, exactly the kind of artist the copyright system was designed to protect.
In January, some of her fans started messaging her about songs on her Spotify profile that she had never uploaded. Someone had taken her YouTube performances, run them through AI voice cloning tools, and posted synthetic versions of her songs under her name on streaming platforms. These fake tracks, not to put too fine a point on it, were really bad. Her dulcimer sounded like — and these were her words — a warbled metallic mess. Her voice had been deepened and auto-tuned into what she called a bro country singer. But here’s where it gets interesting for those of us in communications, because that’s not the end of the story. It didn’t stop at impersonation.
Whoever uploaded the fakes through a legitimate music distributor called Vydia (V-Y-D-I-A) then filed copyright claims against Campbell’s original YouTube videos — the very videos the AI had been trained on. Because YouTube doesn’t use humans to review initial copyright claims, Campbell stopped earning revenue on her own content. That revenue started going to the person who had filed the copyright claims.
She described herself as being in a weird limbo where “I’m telling robots to take down music that robots made.” Shelly Palmer called this a reverse copyright scam, and he confirmed, speaking to other content creators off the record, that this is more common than he would have believed.
Now, I know what you’re thinking — music streaming platforms, artists, what does this have to do with me? And the answer is everything. Because the mechanism that elbowed Murphy Campbell out of earning royalties for her own music will work just as well against any organization that publishes content on platforms with automated enforcement systems. That is virtually every organization that has a YouTube channel, a podcast feed, or any kind of public video or audio presence.
So here’s the structural problem as Palmer frames it. The copyright system we have was built on a foundational assumption that the first entity to register a claim is the rightful owner. That assumption held when human creativity was the bottleneck. It breaks completely when AI can generate a synthetic version of any content in seconds using any voice. Think about what your organization puts out there publicly — executive speeches, earnings calls, thought leadership videos, branded audio, training content, podcasts, content marketing pieces. Every one of these is a potential training data set for someone who wants to clone your voice, your leaders’ voices, and then upload a synthetic version through a low-cost distributor. We’re talking about something that costs $25 to $90 a year. Then they file a claim against your legitimate content before a human ever reviews it.
Shel Holtz: That means the system is going to see them as the first one to file that claim and assume they are the legitimate copyright holder. Now, Rolling Stone confirmed that this isn’t an isolated case. Paul Bender, Veronica Swift, Grace Mitchell — these are just a few of the artists who have faced the same attack. One musician even ran an experiment he called Operation Clown Dump, uploading fake content under his colleagues’ names across platforms. His success rate was 100%.
So what do communicators need to do? First, audit your public content footprint. Do it now, before an incident forces you to. Know what you’ve published, where it lives, and what revenue or visibility is attached to it. Second — and here’s something that’s new for a lot of communicators — register your copyrights. Formal registration is the prerequisite for meaningful legal recourse in the United States. Third, build a rapid response protocol for platform disputes. The organizations that survived these attacks quickest were the ones who knew who to call and knew what to say. And fourth, have this conversation with your legal team today, not after something goes wrong.
Murphy Campbell eventually got Vydia to withdraw its claims, but only after her story went viral. Most organizations won’t have that option. Your story won’t go viral. The bad actor doesn’t need to win permanently — they just need the automated system to act before you do. And that is the lesson, and it’s one we’d better learn from musicians before we have to learn it the hard way.
Neville Hobson: Extraordinary, isn’t it, Shel? I guess you could call it a new phenomenon, only in the sense of the speed with which this can be done. I must admit, I’m astonished that the system is such that the first person to file the copyright claim is assigned ownership. Maybe that’s similar here in the UK — every jurisdiction is different, of course — but that’s rather unsettling. It obviously goes back to a time when people weren’t exploiting the system the way they are now. There are similar examples here in the UK of this kind of activity where people unwittingly find that their content is being misused and misrepresented. And although no major artists have been caught up in this, as far as I know — though I may be wrong about that — I did see an article noting that YouTube allows some users to clone the voices of stars like Charli XCX and Sia, with their permission. But unauthorized AI covers of artists like Harry Styles — hundreds of thousands of copies — are a widespread phenomenon, and one that barely registers in mainstream news.
A number of artists, a bit like your example of Murphy Campbell — there’s one I’ve heard about, Greg Rutkowski, a Polish-born artist known for his work on Dungeons & Dragons, who found his style being used in over 400,000 AI prompts, raising serious concerns about the obsolescence of human artists. And to your point about what communicators should watch out for: your corporate communication messaging that’s in audio, your CEO on an earnings call that’s been recorded and distributed. So never mind video — audio alone, at that scale of 400,000 AI prompts, is not a good situation. If you project the thinking out, this is utterly relevant to anyone publishing audio or audiovisual content online.
I find it astonishing that some platforms, notably Spotify — which features prominently in a lot of reporting on this — are being used to literally steal someone’s intellectual property by replicating it. And I think it reinforces the point that registering copyright isn’t an idle exercise. It’s something that should be front of mind, and it does other things for you as well as the owner of the property.
Something as simple as displaying a current copyright notice on your website — it’s remarkable how many sites I come across that still show “Copyright 2016,” never updated. Displaying a current notice signals that the business is active and its information is up to date. There are also tools to protect against AI scraping, though how effective they are is still unclear. Creative Commons licensing is another option, setting out the terms under which people can use your content — though that requires everyone to play by the rules, which frankly isn’t always the case these days.
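Keeping that copyright year current is trivial to automate. As a minimal sketch — the function name, owner string, and element id below are illustrative, not taken from any particular site — a notice that never goes stale might be generated like this:

```javascript
// Build an always-current copyright notice so a site never shows "Copyright 2016".
// startYear is the year publication began; "now" is injectable to make the
// function easy to test.
function copyrightNotice(startYear, owner, now = new Date()) {
  const year = now.getFullYear();
  const range = year > startYear ? `${startYear}-${year}` : `${startYear}`;
  return `© ${range} ${owner}. All rights reserved.`;
}

// In a browser, you might then render it into a footer element (the id is
// hypothetical):
// document.getElementById("copyright").textContent =
//   copyrightNotice(2016, "Example Ltd");
```

Generating the notice from the clock, rather than hard-coding the year, is exactly the kind of small hygiene step Neville describes: it costs nothing and keeps the “this business is active” signal accurate.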
Nevertheless, you’ve got some protection — or at least the peace of mind that you’ve taken steps. But it really is quite extraordinary, isn’t it, Shel? When I looked into what’s happening in the UK, I came across a recent movement — over a thousand UK musicians, including Paul McCartney, Annie Lennox, and Damon Albarn — who released a silent album to protest proposed legislation that would allow AI companies to train on copyrighted material without consent. It struck me as a real head-scratcher: why would a government enable that to happen?
Shel Holtz: Probably very effective lobbying from the AI companies, I’m sure, is behind that.
Neville Hobson: No doubt, no doubt. But there are other things going on — organizations like the Musicians’ Union and Equity campaigning for better copyright protection, consent, and fair compensation for creators. It’s not getting much mainstream coverage, but activity is happening behind the scenes. Nevertheless, the example of Murphy Campbell and others represents a genuine threat that you need to be aware of if you’ve got content online that matters to you. Never mind the “they shouldn’t be doing this” argument — the point is, if it’s important to you, have you thought about this?
Shel Holtz: If you think about the days before the web, copyright wasn’t something most people had to worry about that much. Professional artists with record deals had people to handle it. Same with authors — someone like Stephen King never had to worry that somebody would be the first to file a copyright claim under his name and siphon off his revenue. But now you have artists who don’t get record deals — like Murphy Campbell — publishing on YouTube and Spotify, building small followings, and making a reasonable living. This is the working-class musician concept we talked about, oh, it’s got to be 15 years ago now.
The fact is, you can use Spotify and YouTube to build a following, play some small clubs a few times a year, and make enough to pay the mortgage and put your kids through school. You’re not going to get the penthouse suite from playing to 100,000 people, but you can make a living. But this has also opened up the ability for bad actors to take advantage of that. And now with AI able to reproduce your voice and create new music at scale, all the pieces are in place for this kind of theft. Unless you’re able to get your story to go viral — as Murphy Campbell did — it’s not clear what you can do, because YouTube and Spotify have set up systems that automate this process with no human review. When you used to register with the copyright office yourself, a human was checking. So it’s not likely most organizations have revenue-generating content online — though I’m sure some do, and I’ve actually argued there are ways to use content to generate revenue.
For example, I’ve always loved the idea of a Webcor YouTube video series called “Building for Girls,” where our employee resource group, Women of Webcor, does a five-minute lesson every two weeks on construction to get young girls interested in STEM and engineering careers. Get enough views and YouTube starts paying you. If you don’t copyright-protect that content, someone can come along, produce similar videos, claim the rights, and suddenly your revenue is going to someone else. But even if you’re not producing revenue-generating content, there are other reasons to ensure nobody else can claim ownership of what you create — especially as content marketing demands more and more output. So yes, register that copyright.
Neville Hobson: Yeah, it made me think about watermarking for written content — though I’m not sure there’s something truly effective offering the same protection for audio and video yet. And even if there were, you’ve got situations like Murphy Campbell’s, where it’s her style and tone — the whole persona that defines her music — that’s being copied. And you don’t know about it until strange things start happening: your revenue drops, someone says “I love that new song you just published,” and you discover it wasn’t you. Or you read a review and think, wait — I didn’t write that.
Shel Holtz: Or “I hate that new song you published” — in Murphy Campbell’s case.
Neville Hobson: Exactly. I’m sure people are working on the technology. You’ve got digital rights management, which isn’t new, but I’m not sure it helps here because the issue isn’t copying your content outright — it’s imitating or repurposing it at scale. Hundreds of thousands, or millions of instances. I think the platforms need to do far more than they currently are. It’s a similar argument to what we’re hearing here in the UK about Meta and X doing nothing effective to protect children. This is in the same territory, and it needs a lot more from those platforms — who are making serious money throughout all of this. As to what exactly “more” looks like, I’m not entirely sure, but they need to do more.
Shel Holtz: Yeah, and they probably won’t until there are some high-profile, visible court cases that create real reputation issues for them — then they’ll take action. The easy thing to do right now is simply register the copyright. That’s your protection. When someone imitates you, or claims the content you produced is theirs, you have legal standing to act. That’s why you need to have this conversation with your legal team.
But I wouldn’t wait for either the platforms or the government to do anything. They’re both reticent to act. You have the ability to do something about this right now, and it’s just a matter of working with your legal team and filing those copyrights.
Neville Hobson: Yeah, exactly. And even using Creative Commons licensing — if you’re an individual without all the formal resources, but you have a niche following, even that’s a start. Keep a record of every iteration of everything you’ve created — “I did this in 2017, here’s proof, backed up here.” That gives you something to stand on, a way to demonstrate that you can act if someone uses your content. And if you don’t do this, there’s another consequence worth considering: your original content gets buried in search results because the AI-generated imitations have somehow accrued better signals to rank higher. That kind of pollution from AI slop is its own problem.
Shel Holtz: Yeah — and then people stop paying attention to your content altogether because they’re so fatigued by the AI slop that they tune everything out. But at least this one has a solution communicators can follow: something new to add to the copyright to-do list. And that will be a 30 for this episode of For Immediate Release.
The post FIR #509: Does Corporate Content Need Copyright Protection? appeared first on FIR Podcast Network.
By Neville Hobson and Shel Holtz