Today we're talking about a technology that gets us a few steps closer to the holodeck from Star Trek, or the Construct from The Matrix, which, if you remember, was the virtual environment where Morpheus and Neo learned kung fu and could control and alter simulations ("Guns. Lots of guns."). These were virtual environments that could have a real impact on the people within the physical world.
Before we get to futuristic virtual worlds that mirror and simulate the real world, we're going to need to work out the technology that expands how we visually represent and interact with a digital twin to effect change in reality. Tens of thousands of individual products today already have bi-directional relationships between physical devices and a digital representation, including factories, airplanes, and individual condition monitoring solutions. The challenge we face is finding an easier way to integrate these disparate ecosystems of devices that don't natively talk to one another. Initially, businesses are seeking to improve efficiency, monitor safety, or even simulate or recreate events that occurred within these ecosystems of independent systems by interacting with their digital twin.
Science fiction has been predicting this future state since well before the technology existed. It's told in stories where somebody gets trapped in a virtual simulation while the rest of the crew, outside the simulation, is in grave peril.
It requires the hero to find the back door in the virtual simulation in order to recreate a version of the starship's controls that can affect the physical world. We may not have neural interfaces or fully immersive holographic environments, but we do have examples of very intricate digital twins and augmented reality.
I've seen a number of command and control centers at larger manufacturing facilities that feel a bit like mission control, filled with monitors displaying digital representations of plant status. Usually that's a 2D image with color-coded status icons, which is helpful if you're a bird or very familiar with the facility. But we're now starting to see high-fidelity 3D images that allow digital twins to be viewed and manipulated from multiple angles, with multiple data points visually superimposed on a separate augmented reality layer. The data streams in from the various sensors that don't talk to one another but still report in centrally: the PLCs, IoT sensors, equipment status alarms, and granular condition monitoring subsystems. And that is really exciting. As we discussed in Episode One about the digitization of workforce knowledge, sometimes we need to see all of the different contextual data points to really understand why something is happening.
Generating these large visual 3D representations of a physical space is a rapidly evolving field. For years we've been able to look at photos and have 3D designers create a digital representation; trade show booths, home builders, and architectural firms have been selling their ideas this way for decades now. That's not what we're talking about. The technology we're going to discuss is how we can digitally recreate a physical space that already exists, in 3D, in a matter of minutes to hours, depending on the size of the facility.
So in this interview, we've got Brittany Shramm from Matterport, a company that specializes in 3D image capture software and technology, and one that has created some of the highest-fidelity scans of an industrial facility I had ever seen.
They showed some demos and I was, uh, blown away. Digital twins and augmented reality have really gone from novel curiosity to something a bit more mainstream. We're finding that it is becoming increasingly necessary to combine data from physical assets into applications and IT infrastructure in order to imp