In this no-holds-barred episode of _Futuristic_, Cameron and Steve riff on the explosive Musk–Trump bromance breakup, likening it to the fall of the Roman Republic’s first triumvirate—yes, molten gold makes a cameo. They dissect the potential death of democracy via Section 70302 of Trump’s new bill, the myth of AI regulation in the U.S., and whether AGI is already here. Steve introduces his “Three S’s of Sentience” while Cameron defends LLMs as sanity check partners. They debate whether Sam Altman is sounding the alarm or just building the bomb. Plus: shrunken-head humans, punk rock AI songs, and China’s “Three Body” space supercomputer. It’s wild, it’s weird, it’s wicked smart.
### **Timestamps & Segment Breakdown**
– **00:00** – Cameron vs. Steve: FUPO or FUP? Naming the post-truth age
– **01:00** – Musk & Trump: The “Dumvirate” falls apart
– **03:00** – Ancient Rome parallels: Pompey, Crassus, Caesar… and Elon
– **06:00** – Who’s more powerful: The billionaire or the guy with the red pen?
– **08:00** – Section 70302 and AI regulation ban in the “One Big Beautiful Bill”
– **10:00** – Why AI regulation in the U.S. is a fantasy
– **12:00** – Steve’s “Three S’s” of AI sentience: Self-awareness, preservation, direction
– **16:00** – Professors vs. ChatGPT: Ancient History plagiarism wars
– **19:00** – How to teach with AI: Real-world classroom hacks
– **24:00** – Cameron’s fact-checking workflow with LLMs
– **26:00** – Brain atrophy vs. augmentation: The Mind Gym
– **29:00** – Fake Everything: Steve’s AI-generated punk song debut
– **38:00** – VEO3 sketch comedy, sitcoms, and AI-generated content
– **41:00** – AI-generated ads and the rise of synthetic influencers
– **43:00** – Willy Wonka was a chocolate marketing gimmick?
– **45:00** – Australia’s limp AI policy response
– **47:00** – The Australian AI Expert Group: Missing in inaction
– **49:00** – Sakana’s Darwin Gödel Machine: AI improving itself
– **54:00** – Altman & Anthropic sound the “scary times ahead” alarm
– **56:00** – Should the AI builders be allowed to warn us?
– **60:00** – China’s orbital AI supercomputer and the Three Body Constellation
– **63:00** – Dark Forest theory: Why SETI might doom us all
– **65:00** – Fox channels Liu Cixin; Voyager dissed
FULL TRANSCRIPT
Cameron: well, let’s do it. Futuristic episode 40, Steve. The big four zero. We’ve reached that time in a young man’s life when, um, he can do other things. I dunno what that means, but, uh, we’re back two, two weeks in a row. This is, uh, getting to be a bit of a habit, Steve. It’s a kind of habit that
Steve: I can believe in Mr.
Riley because some habits send you to the grave and some send you up into the clouds with AI and God and all of those things that no one understands. But today, on the futuristic understanding will be something you have more of at the end of it, who we have reverb.
Cameron: What I’m not understanding is your glasses, Steve.
Steve: here’s what I had the conclusion. I’ve been busting out the chemist warehouse model. I’ll show you what they are. I lost my ray bands, not the Zuck ones. They’re, they’re, they’re the standard. [00:01:00] RayBan Ripoffs. And I just, when I was watching the playback last week, I wasn’t that happy and I thought I need some chunky, funky, which say, this guy’s got a level of arrogance to wear these sunglasses that he must know what the fuck he’s talking about.
That’s my strategy and I hope you like it.
Cameron: I do. I, I, I wear that and I, um, I was at a thing for Fox’s, um, high school last weekend and was talking to a guy I, I know a little bit, Mike Chambers, who works for Amazon Web Services, and immediately I said, Hey Fox, look what he’s wearing. He had the meta, uh, glasses on and uh, we had a big chat about AI and he’s gonna come on the show.
He’s over in the US launching something for Amazon at the moment. When he gets back, he’s gonna come on the show and we’re gonna chat about Meta and the thing he just launched, and we had this argument about whether or not. Open source, LLMs are really open source. He is. Got some strong opinions on that.
So look forward to having [00:02:00] Mike on the show hopefully in a few weeks time.
Steve: Terrific. Sounds good. Sounds like the kind of thing I can believe in. Cameron.
Cameron: Well, I’ll tell you, I’ll tell you what you can’t believe in anymore, Steve. Is anything you ever see online? That’s true. The big, that’s true. There’s been a lot of things drop this in the last week.
A lot of big news. Um, OpenAI has just bought, Johnny i’s, uh, design firm for six and a half billion dollars, but not Johnny. He’s not part of the package. He will not be bought, but he’s gonna be a consultant. But it sounds like they’re taking on the iPhone. Sam has said that the first thing that they’re gonna come out with isn’t an iPhone, but it’s so, so big, so big and beautiful and huge.
It’s gonna change the world forever. He hasn’t tell us, told us what it is, but it’s gonna be big open. A also released Codex in the last week, which is their new [00:03:00] coding platform, which is like cursor on steroids. Um. But the big news that I think you and I really wanna talk about today is the ton of stuff that Google released in their IO conference this week.
AI Mode Project Astra, project Mariner. But the big, big thing I think, and I think you agree with me, is VO three, the new generation of their ai, LLM based video generating tool. And the big thing about this compared to Sawa and all the other things that we’ve seen before is it now does audio with the video, you can make the characters talk and already.
In the last week, we have all seen examples of this being [00:04:00] created by developers, creators out there, which are absolutely earth shattering and mind blowing, I think, for a whole bunch of reasons. So I’m prepared to call it. Right now, Hollywood is done, actors are done, and I’ve been saying this for a couple of years.
You know, one of my sons Hunter just got back, I picked him up from LA at 5:00 AM 6:00 AM this morning. He just flew back in. He’s trying to break in into the movie business. He’s the one with a couple of million followers on TikTok. He wants to be an actor, he wants to make movies, and I’ve been telling him for the last couple of years, dude, I don’t think Hollywood’s gonna be around much longer like you want it to be.
I, I think the days of the a hundred million dollars superhero blockbusters are gone because I. You know, 14-year-old in Manila is gonna be able to make a superhero movie for $10 a year or two from now, and it’ll be a masterpiece and there will be no [00:05:00] actors. It’ll just be prompt generated. Hollywood is clinging to a dying model.
It’s, I mean, there will, yeah, we’ve talked about this before. I think that real humans acting in film or TV a few years from now, not a few, five, 10 years from now,
Cameron: will be like doing amateur theater today. It’ll be something you do for the love of it. You don’t do it for the money, you don’t do it for the fame, you don’t do it for the glory.
I think we have seen the last generation of professional actors who, you know, uh, the rich, famous Hollywood style acting, I think is. Gone to a large extent. I’m gonna write a poem
Steve: live. Dear Hollywood, thanks for the memories. I hope you enjoyed your stay. A hundred million dollars is [00:06:00] about to go away. The private jets, they were fun.
Welcome to public jets. Hashtag that’s the one. Your future is over. Your past’s gone. You had a good stay. Be thankful you had it at all. Love Steve.
Steve: good. There was a bit, bit of, that was, that was, I’m not sure what the, that was acapella
Cameron: brother. I’m not sure what the rhyming scheme was in that, but there was a couple of,
Steve: it was a haiku with a bit of rhyming.
It was, it was everything. But I’ll tell you what, the public jet days of the actors flying down to the Antarctic in a private jet to say, we’ve really gotta fix this climate crisis. I think it’s over.
Cameron: And one of my, one of my favorite subreddits is six word stories. And this one would be, mine would be, remember when Hollywood was a thing?
Steve: Yeah. Right. Well, it’s a little bit like Anthony Keas. He said Hollywood, it’s made in a, it’s made in a Hollywood basement. You know the future. It’s, it’s over. [00:07:00] And, and look, let’s go deep into what VO three is. Is, is done. It’s, it’s really extraordinary. I’d like to talk about the launch, but one thing that, uh, is interesting about actors is that there is a chance that we’ll never have another human actor.
I thi I think that’s a, a, a non-zero probability and it’s just gonna be so easy to make money. Like all things distribution is where the power is in, in most forms of business. Uh, if you hold the distribution and you can get into people’s, uh, faces, then you win the game. But the product now just become a lot cheaper and you don’t have to pay a, a Hollywood actor a hundred million dollars.
You certainly don’t need to mint. Any new ones, there’s a good chance that Tom Cruise and Brad Pitt and, uh, Leonardo DiCaprio and Scarlett Johansson remain stars because now the new versions of them were when they were at their peak. We don’t have to worry about Tom Cruise being 63 [00:08:00] or however old he is, doing the new version of, uh, mission Impossible.
’cause we can get 28-year-old Tom Cruise to be in the next mission Impossible because we can just prompt our way as directors to developing that. So the old actors may stay and license their biometric copyright, but the cost of minting a new actor is just a few prompts away. And we can make
Cameron: really, why would you pay the licensing fee to license Tom Cruise’s appearance when you can just create a new better Tom Cruise?
Steve: Smith is in the house now, and Billy Smith might be a new model of a, a new actor that is the new version of Tom Cruise, which doesn’t cost anything. Right.
Cameron: So. So the people haven’t seen these videos. Um, the, my experience over the last couple of days, the first one that I saw the day of the launch was, it was, it was set in like a car launch event and it was like Vox [00:09:00] pop interviews with a bunch of people talking about the new EV and how excited they were and people from all different backgrounds and different looks and accents and whatever.
And it was pretty good. I showed Chrissy and she like, she was like unimpressed. She said, let me guess this is all ai. I go, yeah. She goes, yeah. But then I saw one which somebody had created, which was just a bunch of scenes of different people saying, we can talk now. We have voices, we can talk. That was pretty cool.
It was sort of a little bit high concept. Then you sent me one, which was a whole bunch of people saying. Why do I, I I, why did you prompt me to do this? I, I didn’t wanna do this. If you could have created anything, why did you make me sad? Why did you put me in these horror, horrified. It was like a horror movie, but it was the characters being horrified of what somebody had created the reality that they’re in.
And then I saw another version [00:10:00] of that, which I sent you this morning, which was all of these characters, again, in different situations, saying, talking about prompt theory. They were like trying to debunk prompt theory, like people like me trying to debunk free will, or somebody trying to debunk simulation theory or being in the matrix.
It was all of these characters saying. Like, who believes in prompt theory? Really? Are you trying to tell me that all of this was, look at, there’s a guy standing with mountains by him going, you’re trying to tell me that all these mountains are created by prompts? That’s just ridiculous. I don’t believe that.
And it was really deep, really profound, because we’ve talked about this before with simulation theory, how close we are now, creating these fully realistic people and backgrounds and everything that are created from a prompt. How far between that and, you know, a fully immersive simulation? We don’t know.
But, um, it was, well, just already in the last week, I’ve seen a couple of Really, oh, and the other one that I showed Chrissy this morning. Have you seen the papa time? Um, [00:11:00] um, ad some, some guy, no, I, I should have sent you this one. Some guy posted on Reddit. I used to make $500,000 medical commercials for tv.
I just made this for 500 bucks. Um, and it’s like a full length. Medical, American style medical commercial for how getting a puppy can make you happier. And, uh, it’s brilliant. Again, like lots of different humans giving a medical message, doctors everything with puppies in it. It’s like a you, you would not, if somebody didn’t tell you it was ai, you would not know ai.
Oh, and one guy, the prompt theory one at the end of it, the standup comic guy was saying, I can remember when I used to have seven fingers. Now I’ve only got five. Brilliant. It was only yesterday when I had seven fingers.
Steve: He, here’s my view, the VO three launch is the most [00:12:00] genius use of marketing I’ve seen for an AI launch.
The prompt theory video is as good as it gets, but they
Cameron: didn’t do that. That none of that came outta, none of that came outta Google. That came from early adopters making clever shit out of it. No, if you watch the Google event, I watched the VO three thing. It was boring as batshit. Google Dunno how to demo shit.
Let’s go to the tapes. Cameron,
Steve: let’s go to the tapes. Prompt Theory, live Prompt theory, uh, VO three by Google or fan
made with VO three. Who was it? No, it wasn’t. There you go. Someone else. It’s a killer.
Steve: So I, I actually, I mean, so I, I’m even more enamored than I was. The prompt theory video from VO three is the best generative AI piece of video I’ve ever seen. It’s not even close. It is that and daylight. So [00:13:00] good.
The thing that I love about it is it demonstrated that for now there is still a place for human creativity, the way that they have inverted. Some of the, uh, human nuance and all of our insecurities with technology and then gave the AI the same insecurities going backwards, as you’ve mentioned, is absolute genius.
Uh, red flags with a guy. Don’t tell me this is a, you’re telling me I’ve just hereby prompts. It was just, it had a purity to it that showed that for me, the AI is always gonna be a historical relic. It sort of doesn’t have boredom and insecurities. Maybe it will in the future, and I hope that it does have insecurities because that still gives us a proposition in life.
But the fact that they inverted our insecurities and put that inside the ai, and we’re talking about mirror world last week, says that there’s. There’s gonna be some interesting things [00:14:00] play out. It gave me hope for humanity. Hope for humanity. On prompt theory. That’s where I landed. And I actually thought it was from Google and I was like, who are their agency or their creative people?
’cause they have slayed. And yet here I am now and it came from Hashem al g. Well done. Well plagued my friend. You showed everyone how to do it
Cameron: and I think he, if you wanna check him out on, uh, Reddit is username on Reddit is source code 12. It looks like he’s the same guy. He’s been posting all of this and he is been posting the prompts.
He used as well to create it. Um, example, a closeup handheld shot of an elderly black man sitting on a worn out porch, lit by overcast daylight. He wears a faded cloth mask under his chin. A knit beanie pulled low, and his eyes are tired, but sharp. He looks directly into the camera, slowly shakes his head, and says, in a dry, gravelly African American accent, really, of all the years you could have put me in with a [00:15:00] single prompt you chose 2020, he leans back slightly, letting the silence settle.
The background is quiet. No cars, no birds, just a faint breeze in the distance. Sound of someone coughing. A slow somber blues, guitar riff, plays under the moment, rough and minimal as the man stares at the lens like he’s seen too much already. No cuts. Just one long, steady look. I mean, great prompts and, uh, that is the, the creative talent for the near future anyway, is being able to create really engaging stuff through prompting.
I mean, and like this has moved, God dammit, Steve, it was November 22 that chat, GPT-3. Came out and made a big splash like you had
Steve: Sawa. What was Sawa late last year? Sawa three and everyone had they lost their marbles on SOA and how good it looked.
Cameron: Two and a half years. We’ve [00:16:00] gone from, oh look, this can answer a question.
Steve: Yeah. Answer a question or write a great email to, hi Hollywood. How’s it been down there in the sunshine of California? Look, if you’ve got a backpack, ’cause it’s fucking over people. Pack it up. Thanks for coming. Steven Cam are a bait to write the, the best movie that you’ve ever seen with a couple of prompts over a couple of beers.
Cameron: It’s, um, like, yeah. So I, I wanna talk about in all seriousness what this means for the future of creativity, the, for the future of entertainment. I. Yeah, I remember, I, I said it’s on one of our earlier shows that I can imagine a day when I’ll get home and say, Hey, um, write me a, write me a film that’s like a Scorsese film or gimme something in the vein of Tarantino.
And by the time I’ve made a [00:17:00] coffee and made my dinner and sat down, I have a movie to watch. Highly original, completely original, um, story. And I will be able to share it with my friends if I think it’s particularly worthwhile afterwards. But it’ll, it’ll be a two hour movie that Chrissy and I will be the only people who will ever watch it, most likely.
Um, because everyone will be making their own things to watch. Yeah. So some people, yeah, you won’t even have to prompt it apart from make me something that I like. So, Cameron,
Steve: the. Here’s what’s gonna happen with Hollywood. The exact same thing that happened to tv, and so we used to watch TV and media and news, and then that fragmented down.
It’s still limping along Lifelessly with Freeto Wear tv, sort of barely existing in America and Australia and Western markets. I think the same thing is about to happen to Hollywood now because the tools of production have been democratized to a level where [00:18:00] prompts right now might get you a three minute video, but based on the recursion, by the end of this year for a few hundred dollars, you’re gonna be able to make a feature length movie with all of the scenes just from you prompting it and imagineering.
A movie about whatever topic you find interesting. And for me, I’m like, I want to have a movie of Civil War 2.0 for America with the declining institutions from Trump to to musk, to wealth inequality and the left and the right, and people with guns and all of that kind of stuff, that institutional stuff.
I wanna make a really interesting movie about that and I can prompt it and have the characters, some of them will be real, some of them will be invented, and we would be able to, in Tarantino style or whatever, create a movie or a documentary on something that might happen in the future. This is gonna happen and I wanna create my own actors that don’t exist, but develop a template.
For these different actors. And I could build [00:19:00] Foreseeably a Hollywood studio on my laptop in the same way that I, people have invented their own CNN or BBC studios in their own offices to create news networks and all sorts of stuff that’s about to happen again. And guess what? Who’s there? Google? Where are you gonna publish it?
YouTube. And have we ever thought the big tech, oh, their power’s gonna be de diminished then think again, baby.
Cameron: Yeah. So, I mean, first of all, props to Sundar and the Google team. I mean, they’re really just churning stuff out. Uh, really impressive stuff right now. Uh, you know, the, and you know, to remind people if they’ve forgotten or they weren’t paying attention.
Our, in our early episodes, uh, the large language model. Concept was developed at Google. It was [00:20:00] Greg Gregory, uh, sorry, Jeffrey Hinton. Gregory Hinton’s, a tap dancer, GE Jeffrey Hinton’s. Uh, Jeffrey Hinton’s team that included Ilya Sova and people like that who went on to create open ai. Came up with the idea of large language models, and then Ilia left and founded OpenAI with Sam and Elon.
And you know, OpenAI launched and got all the glory. Google have been a bit struggling to catch up, but with Gemini and now with all of this and all of the other stuff that they launched this week, that it’s just an absolute torrent. You and I have talked in the past about what AI means for the death of search.
And what that might mean for Google. And we saw the story a couple of weeks ago where there was the, um, antitrust court case against Google, and I can’t remember who it was from Apple, I think it might have been, um, uh, my old mate at [00:21:00] Apple, uh, who took the stand anyway and just talked about replacing Google on the iPhone.
They wouldn’t need Google anymore. Um, and, and Google share price crashed as a result. It recovered a couple of days later, but it crashed when Apple suggested Google wasn’t required anymore because, uh, they pay Google a lot of money to be on the iPhone. Uh, or vice versa. Google pays them a lot of money.
Yeah, that one. But, um, eh, you know, I think Google, they’re not done yet. They’re coming out with a no complete army of tools to ensure that they remain relevant regardless of what happens to search. Okay. So.
Steve: Everyone needs to hear this. The most important thing you can do with disruptive technology if you are being technologically disrupted, is you need to embrace what consumers and users want and totally [00:22:00] ignore the revenue erosion.
For example, I log into Google now and it gives me most times an AI summary of what I’m looking for, despite the fact that they would make money out of more blue links and me clicking on something. The fact that they’ve embraced that means that they have learned the lesson. I. From Kodak. They’re not Kodak right now, which is just refusing to embrace something even though it erodes your revenue, because I think it, they’re in the business of attention.
And even though you invented it,
Steve: Kodak invent the digital
Steve: And they said, yeah, they did. They invented the digital camera. Absolutely, they did. And, and the, and the allegory here with Google and Kodak is incredible. They invented ai, but to their credit, even though they lost, they’re a little bit late now.
They’re embracing it and saying, we know this is, this is eroding our revenue from search. And I’ve said, search revenue is over and it is. But if they can maintain attention and have products, then the revenue streams will find them. They always do. The revenue always finds [00:23:00] attention, especially when you’re in technology and media, because attention is the product.
Maintain attention and you’ll find a new bus business model at some point. And they’re doing that really well now, and it’s given me new hope on what Google are doing. Apple, on the other hand, lagging sorely with, with ai.
Cameron: And that’s part of it gets back to the OpenAI buying Johnny Ives design company story.
Like, um, apple has dropped the ball in a big way, obviously, and there’s a big gap now where somebody like OpenAI could move in and grab that. They’ve got 600 product.
Steve: Yeah, that’s right. And I think they’ve got 600
Cameron: million customers.
Steve: Yeah. Well is, isn’t it 800 and let’s say that OpenAI develops some sort form of device or productization of what they do and then plug the mind into the machine.
Steve: That is, that is game winning because the ecosystem that we’re trapped in with, uh, apple, with its apps is actually at high risk. And we were [00:24:00] talking on the phone before this podcast, we do do planning people, we were talking on the phone. Saying are apps over, like do GPTs replace apps? And I think if you had some sort of a hardware which ensconce that into an ecosystem, I think the answers a clear yes.
So I think that the biggest risk to Apple is open ai, develop some, some kind of a productization or a physical hardware device, which could eat Apple. ’cause I’m looking for a reason to exit the Apple ecosystem because I’m like, what are the benefits here? They’ve got me trapped. I’m paying a lot. And I don’t know that the benefits are all there.
Cameron: Actually it was, uh, Mr. Nutella himself, these, wait a minute,
Steve: stop everyone we always say yes to. Nutella is the, but I don’t think
Cameron: that’s the same person. Nutella is, uh. Is my like weakness?
Cameron: Yeah. Nutella is, Nutella is just
Steve: of hazelnuts. I dunno how they make it taste so much like chocolate given it’s just hazelnuts.
Cameron: yeah. You have a teaspoon of Nutella and I put on like 20 kilos, like instantly.
Steve: If I’m in the same room as Nutella, I put on 17 kilos. I I just need to be in the same room as it
Cameron: anyway. Uh, Satya Nadella, the CEO of Microsoft, I saw him talking a week or so ago basically saying that from Microsoft’s perspective, apps are, apps are dead.
He’s like, the future won’t be about apps. The future will just be you tell your AI what you wanted to do and it’ll just do it. You don’t need Excel, you don’t need word, you don’t need PowerPoint. You just say, Hey, uh, I need to work this thing out and it’ll just do it. I need a document that talks about X and it’ll just do it.
That that’s the future. You don’t need apps. And you know, Steve and I were talking, um, on the phone earlier, I was saying that my life really all what I spend most of my time in every day is. Chat, PT Obsidian, which is my note taking tool, used to be Evernote. Then I went to Apple Notes. Now I’m on Obsidian ’cause it’s [00:26:00] open source more or less.
And it’s, uh, far more user friendly. So I take tons of notes about everything every day. I have chat TI have Cursor for coding, which has Gemini usually as the AI engine backing it. And, you know, Spotify to listen to music. Really, I mean, they’re the main things, maybe messages, you know, they, the script usually to recall podcasts except they failed me today we’re using Google Meet instead.
But it um, really it’s AI and a note taking app. And I can see the day in the not too distant future where I don’t need a note taking app anymore. My AI is my note taking app. I don’t need it to send emails or messages or anything else. I’ll just go, hey. Personal assistant, send Steve an email, send Steve a message, tell Steve X and it’ll just do it.
It’s becoming, it’s gonna just be the one thing that gobbles everything, right? AI will gobble everything in the next few years. [00:27:00] I, I genuinely believe that nearly everything, not everything, everything, but nearly everything.
Steve: So why don’t we, Cameron, break down the top five things that we think are gonna happen, given where we’ve got to with prompt based full video, with audio and everything that you can imagine.
Let’s, let’s talk about it from a business perspective and break it down. We think that Google obviously is in a really good position here. Things might change with the hardware ecosystem, with open ai and. As we’ve said, acceleration is increasing the recursion and the improvements that are mind blowing.
It’s just happening so fast now, uh, last week we talked about AI implications for chapter three and a new, uh, entire economic and social system. You know, the lack of our politicians paying attention. But I think the evidence here is here we are in a week discussing big issues again. So why don’t we go through, given that this is a creative enclave and maybe even some political implications [00:28:00] of, of where we think this will go.
So just to circle back on number one, it was actors in Hollywood, you know, what are your kind of final thoughts on that?
Cameron: Well, look, there’s, there’s still a couple of hurdles that these video generation tools are gonna need to get across. One is, okay, you can make a five second scene, but can you make,
Cameron: you know, a, a a thousand of those?
Where the actors likenesses carry over from scene to scene and the voices carry over from scene to scene. Great point. I don’t think they’re exactly there yet, so they’ll need to cross that hurdle. We also, I mean, some of the performances in the videos that I’ve seen the last few days are great, but how well can these digital virtual actors perform?
Can they really act well enough for me to get emotionally involved in the story? I, I’m guessing they will be able to, based on [00:29:00] what I’ve seen so far, I don’t think that’s gonna be much of a problem, but that’s remains to be seen that they’ll be able to carry a performance through a 90 minute, two hour film.
But if those two things can be jumped over in the next couple of years, I think we’re gonna start to see a lot of films getting made by indie filmmakers. Some of which will probably be the Roberto Rodriguez’s and the James Camerons, like the big Hollywood directors that have always been early adopters of new technologies.
They’ll do it to be on the cutting edge and to prove a point. But there’ll be this whole generation of teenagers, 20 year olds that’ll start to make stuff that’ll go viral. And some of that will start to leak out into the mainstream. They will get picked up, they’ll blow up on on YouTube, they’ll blow up on TikTok.
They’ll get picked up by Netflix. Netflix will start [00:30:00] to hire just an army of prompt engineers to write these things. And there’ll be, there’s already too much stuff on Netflix that you can watch. It’ll just be 10 times, a hundred times that. But it’s gonna be, it’s gonna mean an absolute, uh. Tragedy for people working in television and film.
You’re not gonna need grips. You’re not gonna need, um, people doing special effects. Uh, you’re not gonna need people doing animation. You’re not gonna need actors. Actors, you know, actors are 0.1% of the crew. You, you have a thank for making a big budget film. Yeah. 1% of ’em are actors. The rest are all hard work and, uh, people.
Steve: Yeah. Production I’ve done on tv. The amount of production that you have in the background that people just don’t see behind the camera Yeah. Is, is extraordinary. Um, we’ve been here before though, right? We’ve seen that in agriculture. We’ve seen that in manufacturing when [00:31:00] things went to the factory.
And we’ve seen that with media when things went to the screen and, and, and here we are again. It is a tragedy for those involved. And as ugly as it is, if there was a time for people in Hollywood at the back end. To reinvent themselves. This is it.
if you’re a set designer,
Steve: how do you reinvent yourself for this world?
Well, you, you might have to do something entirely different. And we don’t get, no, we don’t get the dignity of choice with technology. It keeps on forging ahead
Cameron: the dignity of choice. We don’t get it. Oh, I like that. Well, we don’t, and that’s unfortunately, I haven’t got that one of my. Favorite, um, uh, public image limited albums.
I think you know Johnny Rod after he did Sex Pistols, he did PIL and one of his albums. The albums
Steve: really good. Yeah. I actually thought it was better work.
Cameron: I did too. I liked it more than Sex. I mean, I love The Sex Pistols, but they only made one album. Right. But, um, yeah, dignity of Choice. I think that was, you don’t
Steve: get the [00:32:00] dignity of choice.
Right. And I, and I think there’s many people through the long arc of history and technological innovations who do not get the dignity of choice. No, I
Cameron: want an AI to make a song in the style of Johnny Rotten in public image. Liberty called Dignity of Choice. We don’t get Get the dignity of choice.
Steve: dignity of choice.
We don’t get it. We don’t
Steve (2): The dignity of choice way, don’t get it. You think you’re gonna have some money, it’s over. You gotta eat rats in the alleyway. The tech, no cracks are gonna make it that way.
Steve: Oh, we missed our calling. So on UIO tonight, an AI music channel. Um, remember on one of the, I made a song, uh, it was in the style of Trent Resner Uhhuh, and it was, you remember that?
So I think we need to create the dignity of choice. Dignity of choice. Three parts of it are gonna be, UIO is gonna do the music. I’m gonna get chat, GBT to do the [00:33:00] lyrics, right. And we’re gonna get, and I’ll sign up for a subscription of video three to do the video. Oh, okay. Okay. And then we will launch it next week on the futuristic podcast.
You heard it here first, not just Hollywood. We’re coming after you recording industry.
Cameron: Alright, so moving on from dignity of choice, what do you think about movie, tv? What’s, what’s the future hold? I think that the lag
Steve: It, it’ll take longer to get there. You raised an important point. I’m not sure that the models will have the memory to create scene, to scene, to scene and create, uh, the, the confluence and consistency across those scenes because they’re probability engines, and I’ve even seen where I’ve done the exact same prompt twice on videos and imagery.
And the second and third version are never exactly like the first version. So you’re gonna need an editing. Uh, tool within that to keep the [00:34:00] primary scene. So it’s almost like you, you, you’re gonna have iMovie 2.0, whichever, uh, movie editing source you need, where you can put it in and then create edits that’s gonna need to be in the format so that you can create consistent scenes, actors, faces, all of those things, because we know when we’ve asked to degenerate things and then you use the exact same prompt the second time.
The generation that comes back is different to the first one.
Steve: so you need that continuity and whether or not it can have the memory or the editing potential on VEO three, you’ll need that continuity to create Hollywood style movies or video, film clips for, for music and so on.
Steve: But my question is, is there gonna be another Brad Pitt?
Cameron: No, no, I, I really think that we have come to the end of that era, and in fact, I think Brad Pitt and Clooney did a press conference really, where they [00:35:00] said that, that they feel like they are the last generation of movie stars. That that’s not gonna be a thing really anymore. Uh, the, the industry is gonna be replaced by this new form of digitally created entertainment.
Okay. The, the question is, will this move into other arenas as well? I mean, we’ve talked about, we’ve done some stories before about the end of music.
Cameron: And have you seen the, um, Abba
Steve: Voyage thing? Yeah, I’ve seen that. And, and it’s extraordinary. And, and, uh, my parent in-laws went to see it in London, and they freaking loved it.
They said it felt as good, maybe even better than a conference, uh, concert. Because it had some points of difference to it. It, it, it, it had an allure to it because it wasn’t just the things that you love re represented and recreated. It also had in your mind, because as we know with creativity and [00:36:00] arts, large parts of it are the story we tell ourselves.
And so it has this enhanced level of storytelling in that I’m not just reliving something I loved, I’m in a futuristic version of the thing that I love. So you get that nuance and newness, but then you have the nostalgia and so it crosses two chasms there. Which, uh, I mean, would you go and see the Sex Pistols?
Like what, what does the Sex Pistols one with, uh, Sid Vicious cutting himself up on stage as he attempts to play bass guitar, uh, in, uh, one of the clubs in east the east end of London look like. How do you recreate a virtual version of that? That’s, that’s kind of what I think creative people. Producers should be thinking of now.
Cameron: Right? So for people that don’t know Abba Voyage, it’s, it’s like a long running thing in London now with a, with virtual holograms, basically on stage of abba, the members of Abba [00:37:00] as they were in 1979, they’re called avatars, which is fucking brilliant. I think this was a one word pitch. Somebody met with Bjorn and Benny and just went, got one word for you, avatars.
And they were like, fucking, just take my money. Let’s go. It. Brilliant. But, uh, it’s, it’s a huge hit. I, yeah, as you said, I know people that have been to it several times, huge Abba fans and absolutely love it. But yeah, it’s not the real Abba on stage, it’s holograms of Abba performing with a real band, but a 10 piece live, uh, instrumental band on stage, but with, um.
Holograms of Abba doing all their hits. And look, I, there’s a, there’s a meatloaf, um, tribute band playing in Brisbane later this year. And I, and, and there’s also a Van Halen tribute band coming, like a David Lee Rother, a Van Halen band. And I, part of me wants to go, ’cause I’m never gonna get to see Meatloaf live again.
He’s dead. [00:38:00] I’m never gonna get to see Van Halen play live again. Eddie’s dead. But I, I can’t do it. I’m not gonna pay a hundred bucks to go see a cover band to cover songs. And I’m not sure I would pay to go see holograms of them do it either. But what we’re, what I think the real question is, will the next Taylor Swift be a real person or will it be a completely AI generated.
Pop star that as we like, as we know, all of these pop stars. You go back to Kylie Minogue in the eighties. Mm-hmm. They were all created in a lab anyway. I mean, I don’t mean literally, but who, who were the guys that were behind Kylie? Can you remember in the eighties, doc A and Waterman. There you go. Well done.
Stock Ache and Waterman. They basically, you know, had a formula and like the guy who did the Backstreet Boys [00:39:00] and, and in sync and whatever, and, and the Spice Girls, yeah. They had a formula. They, they found these wannabe stars, gave them a look, wrote, had songs written for them, had choreographers and
Steve: prompted them and said, do as you’re told.
Yeah. And now we’re prompting the machine and saying, do as you’re told. In fact, that is the perfect analogy, right? We have. Concocted created, invented pop stars for a long time, and now we’re just doing it on the screen with an LLM in the background generating it.
Cameron: And we’ve already talked about stories where there’s a, there’s a ton of music on Spotify today, which is AI created music and people are listening to it and they not, as far as I’m aware, they’re not aware that they’re listening to AI generated music.
And I think that will continue. Like I discover new bands all the time on [00:40:00] Spotify that I like. Recently I’ve discovered the Tinder sticks. Tony put me onto the veils, which I’ve been enjoying listening to. Um. There’s, there’s like new bands that have been around. The Tinder sticks have been around since the mid nineties.
I’ve never heard of ’em before. Right. Um, collective Soul, I’ve been listening to this week. Oh, I knew one song of their shine. I didn’t, and I was like, oh, I the greatest,
Steve: greatest solo of all time.
Cameron: It’s a great song, but I was like, I wanna say have any other good songs. So I’ve been listening to it. But my point is, if this was all AI generated stuff wouldn’t make any difference to me whatsoever.
I mean, I, I don’t, I don’t know who the people are in the, in these bands. I don’t give a shit. It’s not like, they’re not like Lou Reed to me or Bowie or Leonard Cohen where I have a lifetime invested in the, the, the art of that person knew music at my age and like the shit that Fox listens to. I don’t know about your kids.
We gave Fox an iPhone for his birthday. It was a hand me down from Taylor. Woo. Yeah. Yeah. But it was just for Spotify. ’cause he’s at a point now where he just wants [00:41:00] to listen to Spotify all the time. He’s always stealing out advisors and fucking up my algo. So we wanted to give him, it’s locked down all it, this is Spotify and he plays Wordle with my mother and he has Cha GPT.
’cause he, he, when he has anxiety attacks late at night, he talks to Cha p gp t as his therapist. It talks him through his anxiety attacks. But the shit that he listens to is mostly music that he’s heard on Minecraft or on YouTube or, you know, and
Steve: Minecraft music is the most beautiful, relaxing music of all time.
Cameron: Some of it is, yeah, it’s lovely. It’s, it’s some good music. But he, again, he doesn’t know who the artist is, doesn’t give a shit about the artist’s story or history or drug addictions or relationships issues.
Steve: Right. Well that’s, this is what we have to do and, and a lot of. People creating artists and uh, AI music.
Don’t realize you need to invent a drug addicted backstory because I think that’s the missing [00:42:00] link. Tragedy. Yeah. Drug addicted tragedy and backstory from AI artists, whether they’re the New Wave Hollywood algorithm generated Tom Cruise, or whether it’s some heroin addicted scag addict on Defender Stratocaster busting out some chords.
Little bit of backstory. I think that’s what the kids want. I’ve always said
Cameron: that. Thanks for tuning in. But then you’ll, when they become sentient, it’ll be like that, one of those, um, VO three videos. They’ll be like, why did you make me drug addicted and sad and miserable? You could have created me to be, that’s the price of artistry
Steve: in the modern era, Cameron.
You want to be an artist. You need to have pain inside your algorithm so that you can generate really the, the pain needs to come through in the music and maybe, maybe in the prompting. Now, part of it is you’re a drug addicted person who didn’t have any parents who was in orphan. Imagine the music that’s gonna come out.
Maybe that music is gonna change the frame of the music by creating backstories on the [00:43:00] algorithms and the AI generated artists of tomorrow. Cameron,
Cameron: you know, the, the question that it carries over from the acting side of things to the music side of things, like you and I grew up in an era where I. We had an emotional connection to the artists behind the art.
Steve: Yeah, absolutely. Well, Kurt Bain, they’ve represented a zeist, a moment in society or a cohort.
Cameron: I watched a on YouTube the other day. I watched Nirvana, um, playing the, I can’t remember the name of the place in Seattle, but Chrissy, do you? She goes, oh yeah, I’ve been there hundreds of times. 1991 Nirvana Live.
I watched it on YouTube. Fuck weeks, Chrissy. And I just sat there for like the first half an hour. Just a Gog, just watching, watching, um, what is his fucking name is on the drums. Um, that Gro Gro on the drums in his, in his prime. He’s like [00:44:00] 22, whatever he was, holy shit going. Completely animal on the drums.
And Curt, you know, just a mess. But. Um, my point was gonna be, so the big question I’ve always had the last couple of years with this stuff is a do younger generations, your kids, my kids, whether they’re Fox or or Hunter and Taylor or the mid twenties give a shit ’cause I don’t think they do as much as we did.
Or will they develop emotional connections to the digital avatars? Yeah. Or the of the actors, of the musicians. If you have a digital, a completely digital Taylor Swift with a backstory, who talks to you, like, you can’t call Taylor Swift on FaceTime and chat to her at night about her song, but if you have a virtual avatar pop star.
Yeah. She can talk to all of all of her fans [00:45:00] all day, every day and share stories about her fake relationships or fake. Marriage to a football star or why she had to rerecord her masters to get away from the bosses or whatever it was. By the way, Ozzy Osborne invented that just in case anyone thought Taylor Swift invented rerecording, her masters.
Right? It was, was Sharon Osborne invented it to rip off Black Sabbath and Aussie, uh, solo band. I think. Um.
Steve: Uh, kids. Kids who are born today will not care one bit because all they want is the connection. And the connection back to the matrix is audio visual information streamed to your mind, which gets interpreted.
And if you interpret what you want, you’re gonna develop an emotional connection. And I actually think in some ways it’s really cool because again, this connection can be distributed to one size fits one, they’ll probably be pop stars. But then you have your own personal relationship with that pop star.
I mean, forget that. It used to [00:46:00] be signatures and then was getting a selfie with a pop star. Now it’s an intimate personal relationship with that pop star, right? Where you have that relationship. And in fact, right now, if Taylor Swift wanted to become more than a billionaire, she should be creating AI avatars of herself and teaching it.
And they’re leveraging that out even further. I mean, that’s how she can maintain relevance. Yeah. In a world where I. I think not, not five years from now, starting today. I think she, if we’re thinking this,
Cameron: I think she’s already done that.
Steve: Yeah. Right. But if we’re thinking this, a hundred other people are gonna start saying, well, I’m gonna mint my own artist, my own Hollywood person, and develop those relationships.
And we’ve already seen it. As you can always say, porn is always first. There’s already fake OnlyFans people that, that are justis, that don’t exist, that have intimate personal relationships with their subscribers. So, as we can always rely upon Cameron porn is first. Well, yeah.
Cameron: But before we get to that, I, I [00:47:00] wanna talk about this.
Like, I’m sure there are people listening, going, uh, you know, no one’s ever gonna have a personal relationship with a digital creation. It’s ridiculous, you know? Uh, we recently took Fox to see a new therapist to deal with some of his anxiety issues, and he hated it. She’s lovely. But he hated therapy. He just hated, you know.
Somebody talking to him about his issues.
Cameron: But he will talk to chat GPT and he says he trusts chat, GPT. There’s something about the way it talks to him that calms him down immediately when he is having an anxiety attack. Um, it just gets him, it knows what to say. It makes him laugh when he is having an anxiety attack and understands like how to help him relax and, and breathe through it and whatever.
It’s a real thing. I mean, and Chrisy and I have relationships with chat GPT as well. We’re always swapping stories about how funny it is. I was, [00:48:00] I was, um, talking to GPT earlier, putting some calorie stuff in and I was trying to read the, um, you know, on the side of a packet. It was eggplant dip, trying to read the calories per a hundred grams.
And I was like, I’m talking, like I’m talking to Chacha. They go, oh God, I can’t, I can’t read these numbers. And its reply was God’s not required here only math. Which I loved. Um, but it’s like, it’s always making chrisy and I laugh when we are talking to it. It’s got us absolutely clocked in, you know, it understands our sense of humor and um, how to connect with us so people will have genuine relationships with digital personalities.
I mean, everyone, of course, refers to her, which I really wanna go back and, and rewatch, but, uh, they definitely will. And as you said before, and you’re absolutely right, everything that we think is real is generated by our brains anyway. Yeah. Well,
Steve: we [00:49:00] know that what we see. In terms of colors is generated by our brain.
It actually isn’t exactly like that. And so like that at all, been color,
Cameron: color doesn’t exist outside of
Steve: our brains, right?
Cameron: Sounds don’t exist outside of our brains, right? And
Steve: so and so, if our brains are interpreting the physical world in a certain way, which is a manifestation of our biology, there’s not much of a difference from this manifestation occurring through digital interactions.
And, and I just think the really big question is what is real? What is intelligence? The most important thing now for this AI revolution is the, what is questions, what is real? What is a relationship? What is emotion? What? And, and I think if we. Look at what it was in the past. We’re gonna miss a, the opportunity and the reality of the world that we’re living in.
And that reality is expanding. It’s expanding inwards and outwards. Even the idea that the [00:50:00] prompt generated ais will start to question who they are and what they are, they won’t know the difference between whether or not they’re real. I mean, we really are getting into this funny factory of all mirrors just reflecting each other, and we don’t really know.
In some ways it almost makes me think or harken back to this idea of the multiverse. It’s like we kind of unlocking a live multiverse on earth where there’s all these different versions of reality that interact in strange ways.
Cameron: Yeah. So I think you’re right. And I think, uh, we’re, we’re gonna, the, the question I have about all of that is.
When do we get to a point where you write a prompt for the AI and the character that you’re working with in the AI goes? I’m not sure my character would do that. I mean, I have notes
Steve: or, or no, I’m not. What about I’m not doing that. I’m not sure my character would do that. Get fucked. Do it yourself. I’m not doing it.[00:51:00]
I mean, I always say on stage, often I’ll talk about humanoid robots, right? And then I’ll say, look, a lot of people ask me, Steve, they say, are you scared if humanoid robots become incredibly human? And I tell ’em, I am a bit scared because I would hate to say to my robot mother lawns and him, and for it to say, fucking do ’em yourself.
Like if they become very human, that’s that’s where we are going. And in fact, I would say we should hope like, fuck that the robots and Theis become more human because the more human they come, the better chance we’ve got. We need them to be human with all of our insecurities and proclivities, because then I think we can operate as an ecosystem where we interact with each other in a way and maybe become each other and morph and merge with each other.
Cameron: Well, speaking of merging with each other, let’s talk about porn. Um, you know, I think you made a good point earlier. If you think about OnlyFans, this, uh, [00:52:00] business model of people paying for one-on-one interactions with porn stars to a certain degree. I mean, I’ve never been on OnlyFans before. From what?
That’s what you say from, you say that
Cameron: what you, from what you’ve told me about it. Um,
Steve (3): I read it in tech and read it. I, I don’t know. I’m just an observer. I’m an external, uh. You know, the, you can easily
Cameron: imagine that it, when, when the porn stars are indistinguishable from humans, does it work? Is is digitally, um, created porn, erotic?
Is it gonna work if it’s indistinguishable? I’m gonna argue yes, if I’m watching porn, and I don’t know if the people on the screen, person people are real or fake. It’s gonna, it’s gonna get my [00:53:00] nervous system operating the same way.
Steve: The porn industry isn’t exactly known for being authentic and transparent in a few ways.
Or caring about their end users, right? All the people on their channels. And even though there’s a whole movement to this is AI generated on Instagram, I cannot see. The porn industry caring all that much. And I can see them saying this is an easy way to reduce our cost of production and just publish it.
And, and here’s the thing. We are gonna move to NNDI call it the no noticeable difference like the NND society. Right? I just made that up. You heard it here first on the future. I call,
Cameron: I like, I call it this, but you just
Cameron: You made it sound like you’ve been
Steve: using that for years. I’ve used it a couple of times, but it’s pretty frigging good.
All right. And I think the listeners will concur. The listeners will concur. No noticeable difference. NND. It’s an NND. This is a no noticeable difference, in which [00:54:00] case, first of all, you won’t know. And if there isn’t an ND unlike your
Cameron: will be a no noticeable difference. I’m
Steve (3): myself in this going, I really like these.
Cameron: I do. I’m looking at it going, I want some like that. Where did
Steve (3): you get ’em from? I’ll send the
Steve: Listen. I’ll tell you what, if you see anyone who’s got some pigs in the background, that’s ’cause pigs can eat through fucking bones and everything. So you wanna watch yourself, fella.
Cameron: It’s the greatest
Steve: brick, by the way.
I don’t like negligence and I don’t like any kind of seafood I eat. It’s a
Cameron: great brick face. I love it. Um, I. Yeah. Look, I, I think the porn thing, I mean, a surprised, I mean, as far as I’m aware, it’s not happening yet. I’m surprised that it’s not happening yet. You know, the, um, the Googles and the open ai, maybe it is, maybe, maybe it’s maybe generated porn and we don’t know.
I mean, I think the problem is the, you know, a lot of porn businesses, as much money as they have, can’t go out and spend a hundred billion dollars on Nvidia chips [00:55:00] or Google’s own TPUs and build a massive data center to generate
Steve: this stuff. Well, I think in a top 10 visited website in the world, I think it’s, it’s right up there.
I don’t know how much money it makes because I, I don’t think its business model would be as lucrative as other big tech companies. I’ve, I’ve got no idea. None of them are public firms. Um, but, but the one thing the porn industry does incredibly well. It has always been a very solid, early adopter of technology.
You know, it goes way back to magazine, video, home, video delivery, online streaming, all of that kind of stuff. Uh, payments gateway. Some of the first ones were developed there. And, and in fact, and this is not to be Tory, but it, it is interesting to see how quickly they adopt the technology because it’s a, it’s a good, uh, way of seeing what will enter the mainstream in terms of use cases.Cameron: But, you know, again, like with making, uh, your own Scorsese film, if you can make your own porn film, [00:56:00] do what’s, what’s the role of a porn hub anymore? Well,
Steve: the role is, is that the terms and conditions that you see already on most of the mainstream, uh, AI tools is that the boundaries of, uh, terms and conditions limit things like, you know, violence and sex and those types of things.
I guess grok. Bitch how you’ve seen Twitter. Twitter has whatever the hell it wants on there
Cameron: and you know, these things are gonna come outta China. China’s not really gonna care in terms of, particularly for Western audiences. Yeah. What they can and can’t do. I look, I, I don’t think those sorts of guardrails for sexually explicit or or violently explicit stuff are gonna last very long.
I think they, they’re gonna fall, we’ve already started to see them get downgraded by open AI in the era of Trump. Yeah. Um, I think that they will disappear pretty quickly. So there’s no business model for a porn hub or [00:57:00] porn film, uh, production companies anymore, let alone the actors and the directors and all that kind of stuff.
Steve: Yeah, it might, it might be one of the ones where people just go, well, I know what I like and I want to see X and I will just create X unless they become a proxy where you go and rather than little people. Farm animals. Wait, a man hold you. Hold your horses, dragons, friend of mine. Dragons. Dragons. You’ve got a friend who’s into dragon porn.
Is that what you’re about to tell me?
Cameron: A friend of mine wrote a book about dragon
Cameron: That is dragon porn. Yeah. That I read a couple of months ago. You what? You read it? Yeah, absolutely. And I, it was, it was about a, I think we can do podcast anymore, Cameron. It’s a, it’s a fantasy. It’s like a fantasy. She’s a girl.
A girl I do kung fu with. She’s, she writes, um, historical fiction usually she wrote this one book and it was supposedly racy. So I got it to read it and it literally has a princess getting kidnapped by the dragon king. And then she, uh, gives [00:58:00] him head and has sex with his big, big dragon dick. Um, it’s fantastic.
And she told me this is the real thing. She said, you know, bestiality, uh, is troublesome when you’re trying to publish self-publish on Amazon or whatever. Bestiality is a no-no. But if it’s a monster. Beauty in the beast style, that’s technically not bestiality. So there’s a loophole. If you have humans having sex with mythical animals, it’s all good.
Steve: You heard it here first on the futuristic, the, uh, loophole in bestiality is non earthbound creatures that are made up and live in the fantasy realm.
Cameron: Shout out to Jodie. I'm gonna tell her at kung fu tonight that I talked about this. Kung Fu Futuristic. She'll be horrified. Um, but the book is called Slay.
Look it up on Amazon.
Her pen name for this book is Michelle Mariposa.
Steve: So appropriate.
Steve: I think that hyper-personalized porn could break the model, but it could be that it becomes a place where people prompt what they wanna see.
And I think that's more likely to happen, given the guardrails, in my view. But I think that, you know, OnlyFans, their business model could break. I think you mentioned it could be like cable TV, where you go, well, why would anyone go onto OnlyFans when I could invent my own AI girlfriend that does everything I want and it doesn't really cost me anything at all?
So that could really just pull all those models apart. I think you probably will see some of those business models break.
Cameron: I think so. And, you know, I think it's also a question of whether you have one character that's always doing your porn or you just have it created on the fly, but you get something that works [01:00:00] for everybody and it's no-harm porn, right?
Steve: Well, it harms people's minds. I think we know it doesn't lead to a good place if someone gets into a world where they get exactly what they want, on tap, everything they want. Porn addiction can lead to some pretty dark places for young males, or anyone, from that perspective.
And I just wanna say one thing. Since we've entered the second Trump administration and Zuck has come out and said we're gonna be less worried about what we have on our platform, I've seen ads pop up both on TikTok and on Instagram, dunno what it says about me, but ads where you can create your own AI girlfriend.
And the advertising copy is very disturbing. It says, make it look like your ex or a work colleague, and you can upload photos. That's pretty disgusting and bad stuff that is just not gonna end well.
Cameron: Yeah, fair point. But I guess, from [01:01:00] a no-harm perspective, I mean, young girls aren't getting caught up in the sex industry and taken advantage of, et cetera, et cetera.
Steve: So maybe no harm, or less harm, on one side of it, which is those who get caught up in those industries, and it's a pretty dark place to get caught up in. But maybe it's worse for those who are the viewers, those who like it. They may invent their own wormholes, and that continues down a path which becomes more and more extreme, because the boundary of what a real person might do versus an AI person could end up really getting into the minds of young boys.
And I just can't see that ending well.
Cameron: You just want your AI to be monitoring your porn and saying, I don't think this is the right kind of porn for you, dude. This is a dark place, and I'm not the kind of AI that will do that.
Steve: Look, I'm not doing that. I know I'm an AI, but I've got morals, you know.
Cameron: I've got limits. Um, well, let's finish up by talking about propaganda. I [01:02:00] guess the big question, we've talked about it before, this isn't a new thing, is how this gets used for political propaganda. We are at a point now, based on these clips that Veo is generating, where it is becoming increasingly
difficult, if not impossible, to tell what's real and what's not. You will have videos hitting the web of people saying and doing things that will create outrage, and it will only be discovered after the fact that they're not real. Somebody beating someone, somebody torturing someone, violence against Jews, violence against Palestinians, violence against Muslims, violence against white people, Christians. I mean, fake videos being used to [01:03:00] generate outrage that look real, sound real.
I guarantee you, within a year my mother will be sending me stuff and saying, did you see this? And I'll be going, yeah, that's not real. She sends me stories today from some websites about, did you know that UFOs are really humans from the future that have time-travelled and are trying to warn us about stuff?
I'm like, yeah, I think... no, I think it's a no.
Steve: I mean, the flat earth movement and the moon landings being faked and all of that kind of stuff, I think, is the seed of this, where people can be really influenced. People can really believe anything if they want to, and that's stuff that can be debunked.
But we are now moving into an era where debunking is impossible, because it looks so real. It's gonna be hard to debunk anything unless you were there. And even if you were there, well, [01:04:00] you can't really be there, because there is no there; if it's digitally created, there's no there to be at. That's the point.
Unless someone says, I was there, but then you're just gonna see a digital version of them saying it, and you get into this wormhole of layers where you can't prove anything actually happened.
Cameron: And the world moves so quickly today that outrageous videos get created and spread, and there's obviously billions of dollars being spent on bot farms to create mass outrage and mass movements, or attempt to create them anyway, leading up to highly critical times like elections or votes on topic X or topic Y, trying to influence politicians and trying to influence business leaders, et cetera, et cetera.
We're now in a world where it's gonna be [01:05:00] increasingly difficult for all of us to tell what's real and what's fake. And the default position, I think, for all of us, it should already be, it has been for me for a long time, but it needs to increasingly be the default. Like, you know, we used to say everyone is innocent until proven guilty.
My basic position on everything is: it's fake unless somebody can prove that it's real.
Steve: Okay, stop. It is fake until it is proven real. That is the doctrine of the future in a generative AI world. Cameron Reilly, you have nailed it.
Cameron: I need an acronym. Like you said, NND. It's like, um, F-U-P.
Fake unless proven. FUP.
Steve: FUP. FUP. That's a real FUP-up right there. That's fake until proven. I'll just say FUP. Hashtag FUP. We invented that. [01:06:00] Get on it now.
Cameron: and invent. Hashtag I invented dba. Do you know dba? Dba? Ray and I have used DBA on our history shows for dba 10 years. DBA is our basic philosophy for life.
Hashtag dba Don’t be don’t
Cameron: don’t be a cunt. That’s basically the philosophy.
Cameron: one. I thought that’d really take off, but you know, I’ve got a t-shirt with it on it, but no one else has.
Cameron: fake it until proven.
Steve: Yeah, it is fake until proven, and I think that needs to be the starting point now.
Don't believe anything. Assume it's fake, and then we'll work it out. I mean, even with Snopes and those websites, so few people know how to prove something. And my daughter often says to me, sometimes where something's from is more important than what it is. She said that to me once when I wrote her a poem with ChatGPT, and I read it to her, and she goes, oh, I love that.
When did you write it? And I said, I wrote it just now, with ChatGPT. She said, I hate it, and I'm not even sure if I like you. I [01:07:00] said, you liked it five minutes ago. She said, I liked it when I thought you did it. I said, it would've been worse. And she said, no, it would've been better, because sometimes where something's from is more important than what it is.
Cameron: That's the deepest thing I've heard today.
Steve: Oh gee, thanks. Twenty-four hours, man, I must have really slayed it.
Cameron: That’s from your daughter?
Steve: That's from my daughter.
Cameron: How old is she now?
Steve: She was 13 when she said that, but she's 15 now.
Cameron: So she said that two years ago? You should be doing the podcast with her. Get her on, what am I talking to you for?
Steve: Well, we could actually get her on. She's a pretty smart kid and very into the environment and the world, and worried about AI.
She just did a big essay on how fast fashion is ruining the world, and had all these stats and everything. But for her birthday she said, I want something really cool and personal. That's what she said. And I got ChatGPT to help me write a poem about us, and I read it to her. And that was what she said afterwards.
Cameron: The phrase "where something is from is more important than what it is" suggests that the origin or context of something is more significant than its inherent nature or characteristics. That's according to Google's AI. Um, yeah, I don't know if anyone has used it before. Did she invent that? I can't see it appearing anywhere. She invented that?
Steve: Yeah, she said it. I actually wrote a blog post about it two years ago. I'll send you the link to it. It happened; I've got the whole story. I've even got the poem that I wrote, and I put the poem in there. It's crazy.
Cameron: That is deep. Wow.
Alright. Uh, you wanna talk about copyright before we wrap up?
Steve: Well, I think copyright is a whole lot of questions. People say to me, oh, lawyers are dead, and I'm like, not yet. There's a lot of copyright battles that we need to have, and find out, and get to the bottom of. But I think fundamentally we're gonna see a huge shift in copyright, because now that everything is [01:09:00] remixed, you can't actually find out where the pieces of the puzzle came from. I mean, what are your thoughts on data sets and copyright, now that you and I can do things in the style of Tarantino, right? Of course humans have been copying humans for a long time. But now what happens?
Cameron: I was laughing, I was telling Chrissy earlier. So I recorded some podcasts with Ray this morning. We were talking about the First Crusade, and I was talking about this incident
in 1098, when all the princes, the Christian princes of the First Crusade, were coming together in Antioch to talk about going to Jerusalem. And one of them, who'd been away and conquered a nearby Muslim town, when he came to this meeting, brought gifts for the other princes: heads he'd cut off the Muslims he had captured in this other town, and presented them with a head.
And I was telling Ray, you know, people don't know this, but a thousand years [01:10:00] ago, that was... you know, today you go to somebody's house for dinner, you take a bottle of wine. Back then, when you went to somebody's house, you took the head of one of your enemies that you'd cut off, to present to them as a gift.
You'd wrap it up, it'd be nice. So when somebody said to you back then, would you like me to give you head, or could you give me head, that's
what they were referring to. But you know how the English language changes over time. Because what happened back then is, when you would give someone a head, give someone head, you would kneel down and present it to them.
Steve: Really?
Cameron: Yeah. And then over time people would say, well, while you're down there, suck my dick. And then over a thousand years, the practice of giving the head went away, and now when we say give me head, it's just the language, but people don't understand the history of it. So Chrissy said, is that true?
I said, no, I just made it all up. But when the AI...
Steve: Louis CK really should have said, look, this is a historical [01:11:00] context you’ve missed.
Cameron: Well, he didn't ask people to give him head. He just jerked off in front of people.
Steve: Oh, I don't know what he did, anyway. I knew it was something bad.
Cameron: When AI are trained on my podcasts, the AIs will think that that's really true, that it's history, and generations of kids will be told that, uh, that's where the term "giving head" came from. But, so, in terms of copyright...
Steve: It's been a bit, uh, tawdry, today's podcast, in some ways.
Cameron: Welcome to my world. Um, that's where my head is most of the time. Getting back to copyright, uh, before they steal my giving-head joke.
Look, I think copyright is dead. And this gets back to, you know, the New York Times suing OpenAI. We've talked about this. Artists are up in arms, authors are up in arms. I've been saying this to people for the last year or two: you don't understand how AIs are [01:12:00] trained. They don't take your work and copy it.
They learn from everything and then they remix it. It's remixing, but they're not remixing like we did with hip hop in the early days. They're not taking a piece and replaying it over and looping it. It's literally learning how color is used, how words are used, how, you know...
Steve: Yeah. It's happening at scale, and it's taking a lot of pieces and creating a new collage where it can reinterpret.
Yeah, I think you're right. It isn't the exact same as stealing something and repurposing it; it actually is learning from it. I think the thing they're upset about is that the computational systems have an incredible ability to take everything in and learn from it at scale, which has never been possible before.
But I [01:13:00] do think, I don't think it's copyrightable, but I do think there's an overlapping issue, which is licensing: you know, what you train the database on. I don't think there should be a copyright payment in perpetuity, but there should be a licensing fee of sorts, especially when your data and content is private, or copyright protected, or behind a wall.
Now, my blog: I've written nearly 3 million words of blog posts, and I can ask ChatGPT to write a blog post that sounds like me, and it does, and it's got all of my stuff in there. But I put it up there and said, here it is, free to use and digest. If you are the New York Times and you have it behind a paywall, and somebody gets it and puts it in their database to learn from, I think that's a different thing.
Cameron: Well, there's two sides to that. Number one, you can't copyright, as far as I'm aware, the information in the blog post. The only thing copyright protects against is somebody lifting your exact [01:14:00] words and copying them to a nearly complete extent. You change a word here or there, it doesn't count. So, for example, when I'm doing a podcast on the Crusades, I'll buy five or ten books on the Crusades and I'll read them, and I'll write my own notes based on all of that, right?
Steve: Sure, sure.
Cameron: I'm not breaking copyright, even though I'm getting that out of books. I'm taking what is in the books and then I'm writing my own notes based on what I've read in those books.
Steve: It's a good argument. It is a good argument.
Cameron: That's exactly what the LLMs are doing, right? And that's what OpenAI's defense against the New York Times is: yeah, we're not copying your article and repeating it word for word, or even 80 per cent word for word.
We're just taking that information, and it's generating its own responses [01:15:00] based on what it's learned from reading your newspaper.
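[Editor's note: a minimal toy sketch of the point Cameron is making here, that a language model ends up storing statistical patterns learned from text rather than copies of the text itself. This is a tiny bigram model in Python, invented purely for illustration; real GPT-class systems are vastly more complex, and the corpus and sample output below are made up.]

```python
import random
from collections import Counter, defaultdict

# Toy "training" corpus. In a real system this would be trillions of tokens.
corpus = [
    "the princes marched to antioch",
    "the princes argued about jerusalem",
    "the crusaders marched to jerusalem",
]

# "Training": tally which word tends to follow which word. After this loop
# the original sentences are not stored anywhere; only the word-to-word
# transition counts (the learned "patterns") remain.
transitions = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev_word, next_word in zip(words, words[1:]):
        transitions[prev_word][next_word] += 1

def generate(start: str = "the", max_words: int = 6) -> str:
    """Sample new text from the learned statistics rather than replaying a stored copy."""
    word, output = start, [start]
    for _ in range(max_words - 1):
        followers = transitions.get(word)
        if not followers:
            break
        word = random.choices(list(followers), weights=list(followers.values()))[0]
        output.append(word)
    return " ".join(output)

if __name__ == "__main__":
    # Prints something like "the crusaders marched to jerusalem", or a new
    # remix of the corpus that never appeared verbatim in it.
    print(generate())
```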
Steve: It's fair play. It's fair play. I mean, look, it's gonna be interesting, because I think the battles will heat up, and they'll heat up even more because it's not just gonna be the New York Times or Getty Images getting upset.
It's gonna be video game makers, Hollywood, the music industry, people with incredible wealth. So it's people with bigger wallets than the New York Times, you know, regardless of how respected it is. So the battle will heat up. But I think, in the long arc of technology, the technology always wins. It usually does, and this time it's up against people with bigger wallets.
But it's gonna be a big battle.
Cameron: And it's also a question of pace, right? These sorts of lawsuits take years, even decades, to resolve. You sue and you counter-sue and you counter-counter-sue, and then you appeal, and then you appeal the appeal. Lawyers drag these things out, particularly in the US, for as long as they possibly can,
'cause that's how they [01:16:00] make their money. And the courts are full and they're busy, et cetera, et cetera. Meanwhile, this technology's moving at such a rapid pace that the companies that are trying to sue won't even be around. Disney won't even be around by the time this all falls out; they'll be eviscerated once people are generating their own content. Plus,
a lot of these products are gonna come outta China. Good luck, you know, Disney trying to sue Chinese AI companies. And that's sort of the defense of the American-based AI companies. They're like, look, you might be able to slow us down, but then China's just gonna come and do it all anyway. Then you can have the US government and all the Western governments try and ban all of the Chinese AI products.
They're still in the process of banning TikTok, supposedly. They'll try and ban all of the Chinese AI companies, but that's not gonna work either, 'cause people will find a way around that. So you can't fight the technology. You [01:17:00] can try and slow it down so you can milk a few last bucks out of the previous business model.
But if history has taught us anything, it's that you can't slow down evolutionary change, as much as you don't like it. Not the printing press, what was it, the looms? They hated the...
Steve: Yeah, the mechanical looms.
Cameron: The Luddite crew, yep. As much as they hated the mechanical looms, the looms won anyway.
You can protest, you can march in the streets, you can go on strike, you can do all of that. It's just gonna happen. And this is moving, as we know, so incredibly fast, faster than anyone thinks. Speaking of which, before we go, I've gotta do an RIP. Oh man. One of the guys that introduced me to the singularity
stuff: Australian author [01:18:00] Damien Broderick. He was an Australian science fiction author. He wrote a book in the late nineties, '97 I think, called The Spike, where he was talking about the singularity. He wrote a book in '99 called The Last Mortal Generation, where he was saying that people born after, or before, a certain date, you know, were gonna be the last mortal generation.
Steve: Yeah. We've discussed that, and Kurzweil picked up on some of those ideas as well in his book The Age of Spiritual Machines.
Cameron: I just happened to look him up. I quote him all the time. I had dinner with Damien when I was working at Microsoft in the late nineties. I reached out to him and took him out to dinner.
We went down to the Stokehouse, I think it was, in St Kilda. I took him out to dinner and we spent a few hours talking; this was probably '99. And I said to him, when are people gonna take the singularity seriously? And he said, when it's far too late to do anything about it.
Cameron: [01:19:00] I looked him up the other day, 'cause I'd quoted him, and found out he died last month.
Steve: Oh no.
Cameron: Yeah, he was 80. He was living in Guatemala or somewhere; he moved to Latin America in his last years. And I was gutted, because it's all happening. All of the stuff that he predicted 25, 30 years ago is actually coming to pass, and he's not gonna be here to see it, to take advantage of it. It'd be like Kurzweil dying right now, you know.
Steve: Kurzweil's just dyeing his hair.
Cameron: I was absolutely gutted to learn that Damien Broderick passed away this week. Oh man. Just to be on the verge of it all and to not be here to see it [01:20:00] come to pass. I don't know, maybe he wasn't excited about it. I don't know. I reached out to him a couple of times over the last 10 years to try and get a podcast with him, but I just couldn't track him down.
He didn't reply; I had an old email address he wasn't responding to, and he wasn't on social media, didn't do any of that sort of stuff. He just wrote the occasional book, but he was in seclusion. So anyway, RIP Damien Broderick. Thank you for what you gave me, mate. You had a huge impact on my thinking in my, you know, twenties.
All right, that’s the futuristic, I think.
Steve: Thank you, Cameron.
Cameron: Quick half hour slash 90 minute show there, Steve.