
Alan: Welcome to the XR for
Michael, welcome to the show, my
Michael: Thank you so much for
Alan: All right. Tell us what
Michael: It’s been quite a ride the last couple of years. Insta360 actually started off as a very tiny company in the dorm room of our founder, and his name is JK Liu. And what he wanted to do was create a product that was simple and easy to use and had 360-degree capabilities. And he didn’t really see an all-in-one product in market like that at the time, four years ago. And so he created the hardware and wrote the software to make 360 truly a consumer product. And in those four years, Insta360 has grown to become the global leader in 360-degree cameras, whether it’s on the consumer, prosumer, or professional side. We now have 11 products in market today that, again, range from tiny little portable cameras that are fun for social media, all the way through to cinematic cameras that now shoot 11K. We run the gamut in terms of what cameras are in market and who we cater these cameras to. And at the end of the day, it’s really all about the user experience. So how do you create a powerful, strong 360 camera tool, but also give it the ease of use of a consumer product, and not have to spend too much time in post and all those things?
Alan: Insta360 in my mind really stands out above the crowd. For you guys to be the number one 360 camera company is saying a lot, because there have been a lot of entrants into this market. Samsung; Nokia with their OZO; Jaunt — which recently just got sold to Verizon — they had their Jaunt One camera. There have been a ton of companies trying to come to market with a 360 camera. There was even the Bubble Cam out of Toronto. But where you guys, in my opinion, have really made a big impact — and I love this about it — is two things. One, ease of use. I can take a photo, it stitches on my phone. I can take a video, it stitches on my phone. But then, the user experience on my phone is absolutely spectacular. I can create a tiny planet, I can create an animation, I can post it directly to all my social media platforms, instantly. And that’s where I think some of the other larger companies have failed. They’ve created amazing hardware, but they failed on the delivery of the actual experience, from the hardware to the software, out to how people actually want to use it. Where did you guys come up with the idea of the stabilization? Because this is key to VR. One of the key things about the Insta360 platform is that I can run down the street with a 360 camera, and the software automatically stabilizes it. That’s pretty badass.
Michael: I completely agree. I think stabilization is what kind of brought the technology full circle, and what started enabling it to be widely used by not just VR content creators, but also by traditional filmmakers who just want special angles or to capture certain moments in a much easier way. And just kind of backing up for a second, what most people don’t see us as, and what Insta360 truly is, is a software company. And that’s what you were talking about as being the difference: creating a product that from a hardware standpoint is powerful, but also from a software standpoint is easy enough to use that it’s not going to hold you back, or create these lengthy post-production delivery timelines. And that’s something that really tripped up the whole industry several years ago: the hardware was there with some of these companies, but when it came to actually delivering this content or editing it or sharing it across any platform, that’s kind of where all the roadblocks were. And so what we decided was we need to blur the lines. We need to make 360 content creation just as simple and easy as traditional 2D flat capture. And a big thing in doing that was adding cinema-grade stabilization. And with a 360 camera, it’s a little bit different than stabilizing for a traditional flat camera, because you’re essentially capturing everything. And since you’re capturing everything, the way the stabilization works and the algorithm that you need to program is a little bit different. And not to get into too many technical details — which my engineers, I’m sure, would love to chat about — but with our One X camera, which is the world’s most popular consumer-grade 360 camera, it’s actually doing two things at the stabilization level. The first is we’re packing insanely powerful gyroscopes directly into the cameras.
Alan: That makes sense. I
Michael: There are two types of stabilization happening in this small little camera. There’s the stabilization at the capture level, which is essentially just using the gyro and the data to stabilize during capture. And the second way is through a post-processing stabilization algorithm that we call FlowState. And FlowState is something that we came out with in the last generation of our camera, but we’re constantly refining it, perfecting it, and making it an even better, more dynamic stabilization algorithm. And so with FlowState now, you’re stabilizing again during post. And that’s something that’s available in our app, whether you’re using an Android or an iOS device, or if you want to stabilize through our desktop software, and you’re essentially getting cinema-grade stabilization at the touch of a couple of buttons right from your phone. And that’s something that’s created a really smooth and dynamic workflow with all of our users. You don’t need to be a super well-trained videographer, and you don’t need to buy or rent tons of stabilization gear to stabilize your camera. All you need is the actual device, our invisible selfie stick, or whatever you’re mounting it on, and the willpower to actually do it. So now we’re seeing all of these folks in the action sports industry, in the filmmaking space. They’re now incorporating the One X into their workflows, whether it’s specifically around 360 capture for VR, or just adding special angles and pulling off really unique shots.
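To make that two-stage idea concrete, here is a minimal sketch of gyro-driven 360 stabilization, assuming an equirectangular frame and a camera-to-world rotation derived from the gyro. It is not FlowState itself; the function names and the nearest-neighbour resampling are illustrative assumptions only.

```python
# Minimal sketch of gyro-based 360 stabilization (not FlowState itself):
# counter-rotate each equirectangular frame by the camera's recorded
# orientation so the horizon stays level.
import numpy as np

def equirect_dirs(h, w):
    """Unit view directions for every pixel of an h x w equirectangular frame."""
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi     # -pi .. pi
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi     # pi/2 .. -pi/2
    lon, lat = np.meshgrid(lon, lat)
    return np.stack([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)], axis=-1)   # shape (h, w, 3)

def stabilize_frame(frame, R_cam):
    """Resample `frame` so it appears shot from a level, world-aligned camera.

    R_cam is the 3x3 camera-to-world rotation recorded by the gyro for this frame.
    """
    h, w, _ = frame.shape
    # For each stabilized output ray (world frame), find where it points in
    # the original camera frame: d_cam = R_cam^T @ d_world.
    dirs = equirect_dirs(h, w) @ R_cam
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))
    # Map those directions back to source pixel coordinates (nearest neighbour).
    x = ((lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    y = ((np.pi / 2 - lat) / np.pi * h).astype(int).clip(0, h - 1)
    return frame[y, x]
```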
Alan: Like the flying one where you can attach a camera to little plane wings and fly it through a scene.
Michael: Exactly.
Alan: That’s so badass.
Michael: And that shows that at
Alan: For sure. It’s just
Michael: It’s incredible how
Alan: It really is. Your team–
Michael: Sure. So the Evo is one of our newer products. It came out a little less than a year ago; I believe it was last April. And what the Evo is, it’s a consumer-sized camera. It fits in the palm of your hand and it’s a convertible. So it converts from 180-degree 3D to full 360 degrees. And the great part about this is how simply it goes from mode to mode. One challenge that we started with was, well, when you’re flipping it from 180 degrees to 360, how do you account for the calibration? How does the camera know that it’s going from one mode to the other? And how simple is it going to be for the end-user to actually go through that process? And so what we did was we–
Alan: How easy it is: as a user,
Michael: And that’s the exact
Alan: Yeah. And the 180’s interesting because I have to also call out your packaging of this product. When I got the box, first of all, it’s a beautiful small box, size of a consumer electronics box. You flip it over and there’s a lenticular image on the back of the box of some people at a birthday party. But it’s fully 3D and it just has this depth and feeling to it that’s absolutely incredible. And that was a photo taken using the 180 mode. Like, this is really incredible marketing. When you open the box, camera’s inside, but there’s also a VR viewer, like a little Google Cardboard kind of thing, a little plastic thing that you slap on your phone. So I took a picture in 180 and I was like, okay, what is this 3D all about? Put it into VR. And I couldn’t believe– It just brought the photo fully to life in three dimensions. It was just really, really cool. And so I don’t know how people– how are people using this?
Michael: So it’s actually
Alan: Lenticular
Michael: Yeah. And–
Alan: We had lenticular business
Michael: And so it’s so
Alan: [explosion sound] That was
Michael: [laughs]
Alan: So basically what you’re saying is I can take a 3D 180-degree video, and then play it back on my phone with this case. It’ll– I guess I assume it’s like a lenticular case that allows me to see it in 3D.
Michael: That’s exactly it.
Alan: That’s badass. I
Michael: So we have one on the way for you, and we’re excited to get your feedback on it. That’s one really cool niche, unique way of using a 180 3D camera. But overall, I would say there’s this mega push happening right now from some of the biggest tech companies in the world. You see Facebook with Oculus. You see Adobe. You see Google. So I’m talking like all the tech giants right now are making this huge push for 180. And as a result of that, we decided to work with them to create the Evo. And we’ve done countless workshops now, we’ve done some really great introductions with the YouTube VR Creator Lab, where we actually saw creators get a choice between an Evo and some of these more high-end, expensive 180 3D solutions. And once we had a chance to intro the Evo and talk about what it can do and how it can make their content capture easier, we saw mouths drop and we saw people actually taking the $400 camera over the $2,500 camera. And it was incredible to see, because one challenge — especially with YouTube content creators — is that when they first started dabbling in 360 or just 180 3D, there were a lot of roadblocks along the way. And those roadblocks ranged from simple things like not even getting a live view of what you’re recording, to connectivity issues, having to be too far away from the camera and then not seeing the image that you’re recording. And then when it comes time for actual post-production and delivery, they were having to go through this whole long workflow where you had to stitch your footage. You had to–
Alan: I remember those days.
Michael: And so after they’ve
Alan: Stabilizing.
Michael: Yeah. And there’s like
Alan: Yeah, 360’s too hard.
Michael: Yeah. “I’ve spent a week just exporting this video. And when I finally put it into my Oculus or whatever I’m viewing it in, it doesn’t look good.” And they don’t want to spend the time to go back and do it again. And so what we thought of was, “Well, how do we make every step of the production and post-production process easier for these folks?” And once we showed them that, like, “hey, you can actually connect your Evo through Wi-Fi to an Oculus Go headset or to an HTC Vive, and you can adjust your exposure and your camera settings live in-headset and see what you’re actually recording,” it was a huge game-changer. And then we showed them that after you record, you can stream whatever you just shot directly into your headset, so that you don’t need to go home and put it in your computer and go through the whole process. Otherwise, if you need to reshoot something, you’ve already lost your location, you’ve lost your talent, lost everything that you had on production day, and you simply just can’t go back and do it without more budget. This has revolutionized the industry for them, because now the pipeline for delivery is much shorter and you’re not getting as many headaches throughout the entire production process. So we’re really listening to our end-users and learning about what their problems are. And we’re trying to solve each problem every step of the way.
Alan: And it shows, it really
Michael: Yeah. Yeah. It’s completely non-dependent on which phone you have. We support most models of newer phones; the workflow and the user experience are a little bit different with Android and iOS, just in terms of how both operating systems interact with the products, but the experience in the app is the exact same. And where we started was with the Nano and the Nano S, which are obviously our older product lines. But what we proved with those is that you can have this small phone attachment and you can simply and easily capture, edit, and share photos and videos in 360 to social media without ever having to pull out the microSD card from your camera. So showing that you can actually upload 360 videos and photos faster in some cases than you would from a traditional camera, or even editing from your cell phone, is a huge milestone for not just our company but for the whole industry. And cameras like the Nano S had some really unique features, in that you can live chat with somebody like FaceTime in 360, and you can give the person on the other end of the camera full control of your 360 camera, so they can actually pan and scan around your 360 without having to have one themselves. So we’ve had these really unique features that we’ve come out with that just aren’t achievable with a normal camera.
Alan: It’s incredible. Absolutely. And I mean, if I wasn’t– if I wasn’t a user of this, I wouldn’t know what you’re talking about. But I’ve been using these cameras. The first time I met Max from your team was at the UploadVR party, maybe four years ago now, when they launched their studio in LA. And you guys had the Insta Pro there. The very first Insta360 Pro. So let’s kind of move away from the consumer side. I would recommend to anybody: if you’re in marketing or sales and you want to create, capture, and develop great content (kind of the lower-level content), you can use the Go, the Evo, the One X, the Nano, all of those. But when you want to go the next level up and you want to put something in VR and you want something to be future-proofed, you guys have the 360 Pro, the Pro 2 and the Titan. So walk me through that, because the first time I saw it, it was in a little low-light showing within Upload’s studio. It was dark and I put the headset on, and in real time I was seeing three-dimensional, stereoscopic 360. It was just incredible. And now you guys have made it even better. It’s– the Pro is what, 8K?
Michael: Yes, the Pro shoots up
Alan: And the Titan is 11K. To
Michael: And that’s the goal. The challenge in our industry is definitely on the headset and the viewing side. The technology just isn’t quite keeping up with the camera tech. But we’ve gone above and beyond and undercut the system a little bit. So– and I know I’m jumping forward, but I think this is kind of important to make that distinction. There are three inherent challenges from a professional VR content capture perspective. The first is obviously production, and we can get into that. The second is post-production. And then the third is delivery and viewing. So basically there are problems at every step of the way, right? On the viewing side, what we did about a year ago was we came out with a technology called Crystal View. And Crystal View allows us to play 8K videos — or higher, now — on a non-8K device, and that’s available on iOS and Android devices, your smartphones, your tablets, and headsets like Oculus Go and HTC Vive. What this does is use a technology that Google made popular some years ago, something called dynamic rendering. And so what it’s doing is packing as many pixels as possible into your immediate field of view and diluting everything else that you’re not looking at. And if you whip your head side to side and you’re looking at other perspectives in your 360 headset, it’s doing this live. So you’re not seeing any lag, you’re not seeing any latency. And so what we call it is basically our version of playing higher-res 8K and 10K and 11K content on non-8K devices. So you’re actually getting higher resolution, when the device itself is maxed out at 4K.
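As a rough illustration of that viewport-dependent idea (an assumption-level sketch, not Insta360’s actual Crystal View implementation), the sphere can be split into yaw tiles, with the tiles inside the current field of view served at full resolution and everything else at a heavily reduced one:

```python
# Viewport-dependent streaming sketch: only the tiles the viewer is looking
# at get the high-resolution stream; the rest get a low-resolution fallback.
from dataclasses import dataclass

@dataclass
class Tile:
    yaw_start: float   # degrees, left edge of the tile on the sphere
    yaw_end: float     # degrees, right edge of the tile

def select_tile_quality(tiles, head_yaw_deg, fov_deg=90,
                        hi_res="8K", lo_res="1080p"):
    """Return {tile_index: quality} for the current head orientation."""
    half_fov = fov_deg / 2
    plan = {}
    for i, t in enumerate(tiles):
        center = (t.yaw_start + t.yaw_end) / 2
        # Angular distance between tile center and gaze direction, wrapped to [-180, 180).
        delta = (center - head_yaw_deg + 180) % 360 - 180
        plan[i] = hi_res if abs(delta) <= half_fov else lo_res
    return plan

# Example: 12 tiles of 30 degrees each, viewer looking at yaw = 45 degrees.
tiles = [Tile(i * 30, (i + 1) * 30) for i in range(12)]
print(select_tile_quality(tiles, head_yaw_deg=45))
```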
Alan: So you’re basically using
Michael: It’s– that’s exactly
Alan: Wow. So for people to understand what that means: in order to render the full scene of 360, think of how much data has to be put into a headset to make everything super crystal clear behind you, that you’re not looking at ever until you turn around. So you’re basically saying, most people’s eyes — well, everybody’s eyes — only see in the middle 5 percent of what you’re looking at any given time. So I would assume that as we progressed with these headsets to have eye-tracking, your Crystal View will actually get more refined into that five degrees rather than whatever it is now, maybe 50 degrees, I would think. But that would be even more so as we get eye-tracking involved. Is that correct?
Michael: That’s it. And it’s
Alan: Absolutely. So you’re basically– you’re able to shoot these cameras in super high res. Future-proofing is acknowledged. So something you shoot today on the Titan is going to be as relevant today as it is five years from now, by the time the headsets actually catch up to this. If they ever catch up, and by then you’ll be shooting in 600K or something.
Michael: [laughs] Yeah. That’s
Alan: [chuckles] It was a pipe
Michael: Yeah. So the idea is
Alan: And let me remind
Michael: And you had to have the skill to be able to do that. It’s not like anybody can pick up these unstitched files and just seamlessly put them together. It actually required a lot of knowledge and skill, and so it was limited to only a few people. And that’s one of the reasons, in addition to just being time-consuming, it was a drain on your budget. Nobody wanted to do it. So what we came out with, and a lot of other companies have this now, is auto-stitching. So you’re looking at the camera, you’re looking at the live view, and it’s already being stitched in front of you. So that was kind of the revolution on that side. But then we took it one step further. So let’s say you’re shooting an hour of 8K content and you’re on a MacBook Pro like I am. That can take you overnight. At times it could even take you a couple of days to stitch that out in full resolution. And so even though you’re not sitting there in front of your computer and stitching it yourself, it’s still time-consuming. It’s still taking a lot of time to get that done. So what we decided was, OK, well, why don’t we partner with Adobe, and why don’t we give people the chance, before they even export, to trim out everything that they don’t care about. So if you only want five minutes of your one hour, or if you want three minutes or 10 minutes or however much you want, you can actually now export only the parts that you care about in full resolution. And we did this in two ways. So we actually enabled the Pro 2 to record proxies simultaneously, as it’s recording the full resolution videos. And in doing so, you can now import the proxies directly into Premiere and then you can trim and edit based off that.
Alan: Ah, so you don’t have to
Michael: But it will also bring in the whole giant file as well. So if you’re doing something like motion graphics and you need to be frame-specific, you can toggle between the full-res and the proxy. So everything’s there for you. It’s just all about the user’s individual preference and how you want to edit and how you want to export and do all of your work. We’re very flexible and not forcing people into using our software, per se.
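As a rough stand-in for that proxy workflow, the sketch below shows the same two steps with a generic command-line tool: generate a lightweight proxy to edit against, then render only the trimmed range from the full-resolution master. The file names and settings are hypothetical, and ffmpeg is used here purely as an example, not as the Insta360 or Adobe pipeline.

```python
# Proxy-style workflow sketch: edit against a small proxy, then export only
# the range you actually care about from the full-resolution master.
import subprocess

def make_proxy(master="titan_master_8k.mp4", proxy="titan_proxy_1080.mp4"):
    """Create a low-res proxy so the edit itself stays lightweight."""
    subprocess.run([
        "ffmpeg", "-i", master,
        "-vf", "scale=1920:-2",            # downscale, keep aspect ratio
        "-c:v", "libx264", "-crf", "28",
        proxy,
    ], check=True)

def export_trim(master="titan_master_8k.mp4", out="final_cut_8k.mp4",
                start="00:12:30", end="00:17:30"):
    """Render only the trimmed range from the full-res master."""
    subprocess.run([
        "ffmpeg", "-i", master,
        "-ss", start, "-to", end,
        "-c", "copy",                       # no re-encode, so the trim is fast
        out,
    ], check=True)
```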
Alan: Wow, it’s really incredible. I got to ask you a question. So I’m looking at the Insta360 Titan website for a second. It says “8× Micro Four Thirds Sensors.” What the hell does that mean?
Michael: So with the Titan, it’s an incredible piece of technology, because we took what we learned from the Pro and the Pro 2, and we applied it to something that’s even higher resolution. So one thing that the Pro and the Pro 2 are somewhat limited in is low-light shooting.
Alan: Ah, OK.
Michael: We took that learning and we just built in higher resolution sensors, and Micro Four Thirds will enable you to shoot in low light. And you won’t see a lot of pixelation, you won’t see a lot of blues or purples in the blacks. That makes it the ultimate low-light 360 camera. And we’ve taken this thing everywhere and we’ve tested it, and we’ve had our partners test it in low light. And the feedback that we’ve gotten from it has been absolutely remarkable.
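A quick back-of-the-envelope calculation shows why the larger sensors matter here. Assuming the typical published dimensions of a Micro Four Thirds sensor (roughly 17.3 x 13.0 mm) versus the 1/2.3-inch class of sensor common in small action and 360 cameras (roughly 6.17 x 4.55 mm), the light-gathering area differs by about 8x:

```python
# Rough light-gathering comparison between sensor formats.
# Dimensions are typical published values, used only to illustrate why a
# larger sensor helps in low light.
mft_area = 17.3 * 13.0      # Micro Four Thirds, mm^2  (~225 mm^2)
small_area = 6.17 * 4.55    # typical 1/2.3" sensor, mm^2 (~28 mm^2)

ratio = mft_area / small_area
print(f"Micro Four Thirds gathers ~{ratio:.1f}x more light per exposure")
# ~8x more sensor area means far less noise (the blues and purples in the
# blacks) at the same exposure settings.
```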
Alan: Yeah, I can believe it. I can’t wait to see stuff shot on this. And the great thing– it’s interesting because today, my first interview this morning was with Michael Mansouri from Radiant Images. And I mean, they’ve been pioneering 360 cameras and that sort of thing since the very beginning. And I believe they’re one of your resellers as well. The cameras that they put together two years ago were the– I wouldn’t say the equivalent to the Titan, because they’re not, but they were basically hacking this together, using full digital SLR cameras in a custom-mounted rig, whereas you guys have it all in a unibody construction that allows you to just pull it out, film with it, put it away, put your SD cards in. I even think there’s hot-swappable batteries in these, aren’t there?
Michael: So you can have the Titan on house power, and you don’t even need to have a battery in it. So, yeah. And Michael’s a very good friend. We’ve worked together quite a bit and we do some great projects together. We actually just did the world’s first 8K livestream in 360 into a dome. That was a– it’s a project that we’ve been working on to promote our new 8K live stitching software. That was about a month and a half ago. And so if you think about that, we’re barely just now getting 8K TV sets, and 8K headsets are still a year or two off.
Alan: Minimum.
Michael: The fact that we’re
Alan: That’s amazing. So let’s talk about some use cases, because we’ve talked about the technology quite a bit. What are people using this for in business, in enterprise? What are the best use cases that you’ve seen, other than entertainment, which is the obvious one, man?
Michael: I can talk about this
Alan: [laughs] We’ve got 15
Michael: Oh my goodness. Well, I
Alan: Is your camera able–
Michael: It’s mostly our partner
Alan: Very cool.
Michael: But beyond that, we’re
We worked with a psychologist in New York City who is using exposure therapy to help cure people of PTSD, phobias, and the effects of other traumatic events that happen in their lives. And the really great part is how you see the treatment evolving. They used to actually just build virtual experiences using gaming engines like Unity and Unreal. And it wasn’t realistic enough. It was basically like a cartoon, or almost like gamified people that you were interacting with, and it wasn’t working. And so the psychologist started taking real 360 footage. If he had a person that was scared of flying, or driving over bridges, or even walking downstairs, he would put himself in these environments. He would shoot the whole thing in 360, and then he would put his patients in those same experiences over and over and over again until they finally felt at peace with whatever phobia or traumatic disorder they had. And he’s seeing a huge success rate in this type of treatment over the past two years.
Alan: That’s incredible. To
Michael: The Titan is $15,000,
Alan: So $15,000 to– and how
Michael: Well, in the past it may have cost quite a bit, but today you can do it with a $400 camera and $10 a month with Matterport.
Alan: No, I know. But what I’m
Michael: It’s a no-brainer. It
Alan: Amazing. What is one
Michael: One big issue that we’re having right now is in the journalism space, in the newsgathering space. There’s so many different ways of telling your story. And right now, with everything that’s happening politically around the world, we just need real news and we need to see what’s actually going on. With 360, you can’t get any more real. And we’re seeing the journalism space being a huge beneficiary of 360 technology, whether it’s live broadcasting from disaster events like CNN has been doing, or just telling immersive stories that really hit home with whatever you’re trying to get across. I think that’s the future of all news and stories and coverage that’s going to be shared across the world.