
Videos from virtual influencers are on the rise, according to a report from YouTube. And AI will play a significant role in the service’s offerings, with every video uploaded to the platform potentially dubbed into every spoken language, with the speaker’s lips reanimated to sync with the words they are speaking. Meanwhile, the growing flood of AI-generated content presents YouTube with a challenge: protecting copyright while maintaining a steady stream of new content. In this short midweek FIR episode, Neville and Shel examine the trends and discuss their implications.
Links from this episode:
The next monthly, long-form episode of FIR will drop on Monday, February 24.
We host a Communicators Zoom Chat most Thursdays at 1 p.m. ET. To obtain the credentials needed to participate, contact Shel or Neville directly, request them in our Facebook group, or email [email protected].
Special thanks to Jay Moonah for the opening and closing music.
You can find the stories from which Shel’s FIR content is selected at Shel’s Link Blog. Shel has started a metaverse-focused Flipboard magazine. You can catch up with both co-hosts on Neville’s blog and Shel’s blog.
Disclaimer: The opinions expressed in this podcast are Shel’s and Neville’s and do not reflect the views of their employers and/or clients.
Raw Transcript:
Shel Holtz: [00:00:00] Hi everybody, and welcome to episode number 461 of For Immediate Release. I'm Shel Holtz.
Neville Hobson: And I'm Neville Hobson. This month marks 20 years since the first video was uploaded to YouTube, a 19-second clip that launched a global platform now at the center of digital media. As the platform reflects on its past, it's also looking sharply ahead.
And what lies on the horizon is a bold, AI-powered future, highlighted in two reports published in the past week. According to YouTube's leadership, we're five years away from a world where every video uploaded to the platform could be automatically dubbed into every spoken language.
More than that, the dubbed voice will sound like the original speaker, with AI-generated lip movements tailored to match the target language. It's a vision of seamless global accessibility where creators can invest once and reach audiences everywhere. [00:01:00] This isn't speculative. YouTube is already piloting dubbing tech with hundreds of thousands of creators and experimenting with voice cloning and lip reanimation.
But with that ambition comes a fair amount of controversy. Underpinning these features is Google's Gemini AI model, trained on an ocean of YouTube videos, many from creators who weren't aware their content was being used this way. Some have pushed back, arguing that a license granted under YouTube's terms of service doesn't equate to informed consent for AI training.
At the same time, YouTube's 2025 trends report highlights the rise of virtual influencers: synthetic personas who are building large audiences and changing what authentic content looks like. For a growing number of viewers, it doesn't seem to matter whether the face on screen is real, generated, or somewhere in between.
What emerges is a picture of a platform trying to empower creators with powerful tools while quietly shifting the [00:02:00] ground beneath their feet, culturally and ethically. On one hand, a report by Bloomberg paints a picture of YouTube as a tech powerhouse using AI to expand creative reach, drive viewership, and reshape media, but not without controversy over how training data is sourced, especially from creators unaware that their content fuels these advancements.
On the other hand, Social Media Today's take focuses more on the cultural shift. AI-generated influencers, fan-created content, and multi-format storytelling are changing the rules of what audiences find compelling and raising questions about the very definition of authentic content. Both views converge on the same point: AI is here to stay, and whether you are excited or concerned, it's reshaping the creator economy from top to bottom.
So is this YouTube fulfilling its mission to democratize creativity through technology? Or is it becoming a platform where the line between creator and content becomes so blurred [00:03:00] that the original human touch gets lost? We should unpack this. There's quite a bit here to talk about, isn't there?
Shel Holtz: There is, and it seems to me a relatively natural evolution for YouTube.
Uh, as long as creators are able to upload what they want, I think you will find plenty of authentic content. There's going to be no shortage of people who want to talk into a camera and share that, uh, people who cover themes, uh, that they think people would be interested in. Uh, I, I love hearkening back to a story I read about a, a physics grad student, uh, who started a YouTube series, uh, called Physics for Girls.
Uh, and it was aimed at the K through 12 cohort of students, trying to get them interested in the STEM sciences, and it became very popular, and she was [00:04:00] making, I think I read, a million dollars a year in advertising revenue. I don't think that'll stop. I think people will be able to continue to do that.
What you see is a platform where there's no limits, there's no constraints on how many gigabytes of video data can be uploaded. They just keep expanding their data center capacity, uh, so there's room for all of this other stuff, including the AI-generated content. And as long as it's entertaining or informative, if it serves a purpose, people will watch it.
And that's the thing, if it's crap, people aren't gonna watch it. It's not gonna get recommended, uh, it won't find its way into the algorithm. And people will stop spending time creating it if it doesn't produce the kind of results that they're looking for. But we've already seen that influencers work, uh, on both sides of the equation: you [00:05:00] can tailor them to be exactly what you know your audience is looking for.
So it's great for the consumer. Uh, and in terms of the brand or the advertiser, uh, you don't have these loose cannon celebrities that you're, uh, using, or, or somebody who's just a professional influencer who goes off the rails. You're in complete control. So, uh, you know, it's not my favorite concept, but I don't see any way to slow it down.
And I think the people behind them are gonna continue to, uh, find ways to make them resonate with, with the people that they're, uh, aiming them at. And in terms of the training of AI models on all of this, you know, right now you have a, an administration in Washington, DC that is agreeable to the approach that the, uh, the AI companies, uh, OpenAI [00:06:00] and the like,
want the government to take, which is to, uh, just put an end to this whole intellectual property thing and say AI can train on anything it wants to. Uh, so I, I think that's probably coming. Uh, God knows Elon Musk is, is training Grok on all of the content that is shared on X. And if you have an account there, that's, that's your
implicit permission to let him do that. One of the reasons that he went ahead and bought X in the first place was knowing that he had access to that treasure trove of data. So I don't see it. I don't see that slowing down either, and I don't see the fact that people are unhappy that their content is being used for training being an impediment to having that content used as training.
It’s gonna continue to happen.
Neville Hobson: That's part of what worries me a lot about this, I must admit. Taking the Bloomberg report, um, which [00:07:00] is, uh, this, this idea of auto-dubbing videos into every spoken language, we've talked about this before, not what YouTube's doing, but the notion of it. The example you often give: the CEO of a company giving an all-employee address, and he's an American or a native English speaker.
Uh, and yet there's a version in 23 other languages, like Urdu or Hindi or, or Spanish even. You know, you then talk about Mongolian, perhaps, if they have offices in Ulaanbaatar or something. Um, that, uh, shows him fluently talking in all of those languages, which, I've always believed and I still do believe,
is misleading. Uh, unless you are very transparent, which in fact adds to your burden of, of engaging with employees, if you've gotta explain every time that he's not fluent and this is not really him speaking Hindi, it's, uh, an AI that has done it, or however you might frame it. So that's not gonna stop, though, either.
Uh, your point I agree with as well, [00:08:00] that most people won't really care about, about this, probably. Um, I mean, I count myself as a creator, uh, in terms of the very tiny bits of content I put up on my YouTube channel, um, which, uh, isn't a lot, uh, it's not a regular cadence, uh, it's now and again. Uh, and if I found versions in, uh, you know, in, uh, a native language of Bolivia, for instance, would I care?
Well, only in the sense of, is it reflecting exactly what I said in English? And you have to assume that it's gonna be doing that. But that's not, to me, the point, really: they've gone ahead and done it without permission. There will be people who don't want this to happen to their content, despite the Ts and Cs saying they can do this.
If you don't like it, you're gonna have to stop using YouTube. And that's the reality of life, I think. But there are a couple of things, though. Uh, I, I think, you know, Google wants creators to use its AI, i.e. Gemini, uh, to, uh, create, edit, market, and [00:09:00] analyze the content that they create, and, and, uh, that's, you may not want to use Gemini.
Um, you've got, uh, the training element, where Google is assuming they're okay to use your content to do things like that. Uh, it aligns with their terms of service, they say, but trust isn't in that equation as far as critics are concerned. The voice cloning and lip animation, the technology is amazing, I have to say.
Uh, and according to Bloomberg, YouTube's already testing multilingual dubbing in eight languages with plans to expand that. Well, yeah, voice cloning and lip reanimation to mimic native-language speech are in pilot phases. So all this is coming, without doubt. So I think it is interesting. There's some downsides on all of that.
According to Bloomberg, dubbing will reduce, uh, CPMs when moving from English to other languages. You've got that to take into account too. But expanding reach to new-language audiences may ultimately increase total revenue, if it's a monetization thing you're looking at. Um, so [00:10:00] YouTube says they think quality content, to your point, will still rise above the growing flood of AI-generated deepfake material.
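To make that CPM-versus-reach arithmetic concrete, here is a minimal, purely illustrative sketch in Python. Every view count and CPM figure below is an invented assumption, not a Bloomberg or YouTube number; the point is only that lower per-language CPMs can still add up to more total revenue once dubbed versions reach audiences the original would not.

```python
# Hypothetical back-of-envelope sketch of the CPM-vs-reach tradeoff.
# All figures are invented for illustration only.

def ad_revenue(views: int, cpm_usd: float) -> float:
    """Ad revenue for a video, where CPM is dollars per 1,000 monetized views."""
    return views / 1000 * cpm_usd

# English original (assumed figures).
english_only = ad_revenue(views=500_000, cpm_usd=6.00)

# The same video auto-dubbed into other languages: each market is assumed
# to carry a lower CPM, but adds views the English version would not reach.
dubbed_markets = [
    ("Spanish",    200_000, 2.50),
    ("Hindi",      300_000, 1.20),
    ("Portuguese", 120_000, 2.00),
]
dubbed_revenue = sum(ad_revenue(views, cpm) for _, views, cpm in dubbed_markets)

print(f"English only:       ${english_only:,.2f}")
print(f"Dubbed markets add: ${dubbed_revenue:,.2f}")
print(f"Total with dubbing: ${english_only + dubbed_revenue:,.2f}")
```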
I guess that deepfake material is part of what we call AI slop these days, right? So there's that, which of course leads you straight into the other bits about virtual influencers, uh, and, uh,
just a casual look, and I was doing this, uh, this morning, my time, before we were recording this, uh, coming across examples of what people are writing about and publishing, uh, with photos and videos of people that, you get it in the highest resolution you want, I swear you cannot tell if it's real or not, if it's a real human being or an AI-generated video.
Will that matter? At the end of the day, I, I think it probably comes down to: do you feel hoodwinked when you find out it's an AI when you thought it was a person? And there's a few surveys out recently, and, and this has a kind of tangential connection to this topic, but of people who are [00:11:00] building relationships with AIs; they're, they're getting intimate with them.
And, and I don't, I don't mean, obviously, what we might think intimate means, but developing emotional bonds with an AI-generated persona. And so, uh, there's great, uh, risk there, I think, of, uh, misuse of this technology. So, you know, going down the rabbit hole, or, or even the, the idea that it's all a conspiracy and they're out to steal our data and confuse us,
no, it's not that. But there's great risk, I think, of opacity; forget about transparency, this is, this is completely the opposite of that. Uh, and it's, it's got, uh, issues, in my view, that, uh, we ought to try and be clearer about, rather than just give the likes of Google and others, uh, literally carte blanche to do what the hell they want without, uh, without any, uh, any regulation, which, uh, unfortunately seems to be aligned with, uh, Mr.
Trump and his gang in Washington; they [00:12:00] don't care about any of this stuff at all. In which case, um, tech companies, if you listen to some of the strong critics, are rubbing their hands with glee at what they're gonna be able to do now without any oversight. And therein is the issue. But I'm not saying that's something we should therefore, you know, get out our pitchforks and shovels and march on Washington over.
But it's a concern, right? I mean, this is a major development. Um, the virtual influencers, I think, is, uh, is an exciting idea. Um, but the risks of misuse are huge in my view. So I'm just having the yes-but moment here, basically. And I don't normally do this, Shel, I'm normally embracing all this stuff straight away, but there's big alarm bells ringing in my mind about some of the stuff that's happening.
Shel Holtz: Well, I think a lot of it is going to be contingent upon what we become accustomed to. Uh, yeah. As, as you become accustomed to things, they just become normalized and you don't give them a second thought. There was a TV commercial. [00:13:00] I'm gonna have to see if I can find it; they must have it on YouTube. Uh, even though this had to be 20, maybe 25 years ago, I believe it was an IBM commercial.
It was a great commercial, by the way; this is why I remember it so many years later. Yeah. Uh, it was either black and white or, or sort of sepia-toned. Uh, it was in a dusty old diner out in the middle of nowhere, and there's a waitress behind the counter, uh, and there's nobody there. One guy wanders in and sits down.
And he, I don't remember what he asks for, but they don't have it. They don't have this, they don't have that. And then he sees the TV. He says, uh, what do you have on TV? And she says, every movie and television show ever made, in any language you want to hear it in, uh, and that was talking about the future of technology, right?
If you get to a point where anything you wanna see is available in your language, [00:14:00] then does it continue to be an ethical question when you see your CEO, who doesn't speak your language, speaking to you in your language? Or is this just something that we all accept, that the technology does this for everything now, and it doesn't matter whether he speaks your language or not, he can because of the technology?
Now, I'm not saying that I'm promoting that as an approach to take today. From an ethics standpoint, I think you do need to let people know, uh, we think it's gonna be a lot easier and more meaningful for you to hear, uh, the CEO speak in your native language. Mm-hmm. But he doesn't speak it; this was AI assisting with this. But in five years, when everything
is handled that way, will it even matter? I, you know, I, I suspect that it won't. I suspect it won't matter whether somebody speaks that language when you know that any media you consume can be consumed in your native language, thanks to the technology that we all [00:15:00] take for granted at that point.
Neville Hobson: Hmm. Uh, that’s a sound assessment.
Uh, and you may well be right, and I, I suspect that much of what you said will likely come to pass. I just think that there's concerns we ought to be paying more attention to than we seem to be. So for instance, uh, one big thing to me is, um, I guess it's kind of related to the ethical debate, but what does real mean anymore?
In this, what does authenticity mean now? It doesn't mean what it meant yesterday. If you've got virtual influencers, uh, creating videos, you don't know that that's not a real person. Things like that.
Shel Holtz: That's, that's, keep in mind that I was, I was sold, uh, Sugar Frosted Flakes by Tony the Tiger, uh, who was not a real person, uh, or even a real tiger.
But they,
Neville Hobson: they weren't pretending it was, or, or making you assume that it probably was. That's the only difference, but this is the thing.
Shel Holtz: This is, uh, uh, the, the modern equivalent. Uh, and well, Tony the
Neville Hobson: tiger. [00:16:00]
Shel Holtz: Yeah. And yeah, the, the virtual influencers I've seen so far, uh, are obvious. Uh, I have not seen one where they have worked really, really hard to convince you that this is anything but a virtual influencer.
And on Instagram, at least, most of them, I see the disclosure, uh, that, that they are. Uh, I just don't think people care. Uh, no. If, if they're getting good information, if they're being entertained, you know, are you not entertained? If you are, you'll continue to watch. And, uh, if somebody says, you know, that's AI, your answer's gonna be, okay.
So
Neville Hobson: I get that, but I think we have a responsibility to, uh, to point out certain things, whether people care or not. That's part of our, oh, that question, our responsibility as communicators. Yeah. So yes. So, so hence my point about, uh, what does real mean? How do we define real now? Uh, and I think the, um, the kind of, uh,
bigger worry waiting in the wings is [00:17:00] the fakery that we see everywhere. It's getting even easier to, uh, to do this kind of thing. Um, deepfakes, whatever they're now called. Um, that's been off the radar for a bit now, but suddenly you'll see something, and, for what I mean, I haven't seen anything myself, but I did read this morning that already there's videos around of Pope Francis, who died, uh, on Monday, uh, that he is not, actually;
uh, according to these videos, he's out there speaking and, and doing all these events and so forth, um, and that will confuse some people. And this is the, this is, I think, the grave risk, uh, not of the technology, um, because it's what people will do with it. And I'm not suggesting for a second that because of that, therefore we shouldn't do X and Y and so forth.
Not at all. But we need to, uh, address these concerns, and indeed the, uh, the unspoken concerns, uh, before they become a problem, uh, or at least make people aware, [00:18:00] and that has a lot to do with the awareness that we're already seeing from governments everywhere. Like here in the UK, for instance, I see government ads across every social network now and again about, uh, checking the, checking the authenticity of things and people, uh, and products that people are pitching and so forth.
Uh, and that will ramp up, no doubt, in which case there's an opportunity for communicators then for that kind of education. So, um, it, it perhaps will come down to, uh, to that: the, uh, the ethical debate on training, on consent, uh, on people's rights, intellectual property, whatever. The government in Washington, DC, I mean, that,
uh, the situation with Trump and his, uh, um, his sycophants, as I call them, really, uh, is, um, well, it's more than a blip. It's, it has made a huge change around the world that no one could have predicted. Whatever you think about Trump, you gotta give it, give it to him in one sense, that he [00:19:00] has forced huge change on almost every country around the world.
So, uh, I see here things that people are discussing now that we would never have dreamt these politicians would be suggesting if Trump was not on the scene. So that is a big impact in all of this, and it's hard to predict what effect that's gonna have on something like this. But, um, I think the, uh, the concerns of people about training, for example, using their content without permission, uh, human beings, again, this is a, a related thing to other conversations, worried about being replaced by the AI.
That's not, not a separate or a suddenly new thing, but it just reinforces, in my view, certainly, that we need to address all of these things. We need to show that we have people's backs in their concerns about this, and we're gonna help them kind of understand it if we can. That's our job as communicators, it seems to me.
Shel Holtz: Yes. In addition to creating some of this content.
Neville Hobson: Oh, indeed. [00:20:00]
Shel Holtz: That'll be a 30 for this episode of For Immediate Release.
The post FIR #461: YouTube Trends Toward Virtual Influencers and AI-Generated Videos appeared first on FIR Podcast Network.