
Why 2023 will be “the year of mixed reality”

2023 will see an “arms race” in mixed reality hardware and software. This truly will revolutionize our society.
Credit: Gorodenkoff / Adobe Stock
Key Takeaways
  • The metaverse received a lot of hype in 2022, but the cartoonish virtual world depicted by the media is not what will transform society.
  • What will transform society is “mixed reality” (MR), in which immersive virtual content is seamlessly combined with our physical world.
  • Many new hardware and software products are coming out in 2023 — a veritable “arms race” in mixed reality.

Last month, I wrote a piece for Big Think that praised generative AI as the most impactful technology of 2022. Over the weeks since, many people have asked me why the metaverse wasn’t chosen, as it was certainly the most hyped technology of the year. My answer is that 2022 was a rollercoaster for the metaverse, not a rocket ship. The public was promised a society-changing technology, but what many people saw were either cartoonish virtual worlds filled with creepy avatars or over-hyped startups selling “virtual real estate” through pump-and-dump NFT schemes.

No, the metaverse was not the most impactful technology of 2022. Fortunately, it really does have the potential to be a society-changing technology. But to get there, the industry needs to move past today’s cartoonish worlds and push for experiences that are more realistic, more artistic, and far more focused on creativity and productivity than on minting NFT landlords. In addition, the industry needs to overcome the common misconception that the metaverse will force everyone to live in a virtual world that will replace our physical surroundings. This is not how the metaverse will unfold. 

Yes, there will be popular virtual worlds that are fully simulated, but these will be temporary “escapes” that we sink into for a few hours at a time, similar to how we watch movies or play video games today. But the real metaverse, the one that will impact our lives from morning until night, will not separate us from our physical surroundings. Instead, the metaverse will be a mixed reality, in which immersive virtual content is seamlessly combined with our physical world, expanding and embellishing our lives with the power and flexibility of digital content. 

I have been making this assertion for quite some time, but 2023 will be the year that mixed reality (MR) finally starts to take shape. That’s because a wave of new products is headed our way that will bring the magic of MR to mainstream markets. 

Mixed reality comes to the market

The first step in this direction was the recent release of the Meta Quest Pro, which uses color passthrough cameras to capture the real world and combine it with spatially registered virtual content. It’s an impressive device, but so far there is little software available that showcases its MR capabilities. That said, we can expect the potential of this hardware to be unleashed during 2023 as software rolls out.
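
To make the idea concrete, here is a minimal sketch of the compositing step at the heart of camera passthrough: a rendered virtual layer is alpha-blended over each camera frame. This is an illustrative approximation in Python with NumPy, not Meta's actual pipeline; the function name and image formats are assumptions.

```python
import numpy as np

def composite_passthrough(camera_frame, virtual_rgba):
    """Alpha-blend a rendered virtual layer over a passthrough camera frame.

    camera_frame: HxWx3 uint8 image from the passthrough cameras.
    virtual_rgba: HxWx4 uint8 render of the virtual scene (straight alpha).
    """
    rgb = virtual_rgba[..., :3].astype(np.float32)
    alpha = virtual_rgba[..., 3:4].astype(np.float32) / 255.0
    # Where the virtual layer is opaque, it covers the camera feed;
    # where it is transparent, the real world shows through.
    blended = rgb * alpha + camera_frame.astype(np.float32) * (1.0 - alpha)
    return blended.astype(np.uint8)
```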

Also in 2023, HTC will release a headset that looks to be even more powerful than the Meta Quest Pro for MR experiences. To be unveiled at CES in January, it reportedly has color passthrough cameras of such high fidelity that you can look at a real-world phone in your hand and read your text messages in mixed reality. Whether consumers prefer HTC’s new hardware or Meta’s, one thing is clear — an MR arms race is underway, and it’s about to get far more crowded.

That’s because Apple is expected to launch its own MR headset in 2023. Rumored to be a premium device that ships midyear, it likely will be the most powerful MR product the world has seen. There are claims it will feature high-quality passthrough cameras along with LiDAR sensors for measuring distances in the real world. If the LiDAR rumors pan out, it could mean the Apple device is the first MR eyewear product to enable high-precision registration of virtual content to the real world in 3D. Accurate registration is critical for suspension of disbelief, especially when enabling users to manually interact with real and virtual objects.
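
For a sense of what LiDAR-based registration involves, here is a minimal sketch of how a depth sample can pin virtual content to a real surface: a pixel plus its measured depth is lifted into a 3D point, which is then transformed into world coordinates by the headset’s pose. The pinhole model is standard, but the function names, intrinsics, and placeholder pose are illustrative assumptions, not Apple’s API.

```python
import numpy as np

def unproject_pixel(u, v, depth_m, fx, fy, cx, cy):
    """Lift a pixel (u, v) with a LiDAR depth sample (meters) into a
    3D point in the camera frame, using a standard pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

def anchor_in_world(point_cam, camera_to_world):
    """Convert a camera-frame point into world coordinates with the
    headset's 4x4 camera-to-world pose, yielding a stable anchor."""
    homogeneous = np.append(point_cam, 1.0)
    return (camera_to_world @ homogeneous)[:3]

# Example: anchor a virtual label on the real surface the user taps.
pose = np.eye(4)  # placeholder; a real pose comes from the tracking stack
anchor = anchor_in_world(
    unproject_pixel(640, 360, 1.2, 900.0, 900.0, 640.0, 360.0), pose)
```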

Why virtual reality is not the future

We humans do not like being cut off from our physical surroundings. Sure, you can give someone a short demo in virtual reality (VR), and they’ll love it. But if you have that same person spend an hour in fully immersive VR, they may start to feel uneasy. Approach two hours and for many people, myself included, it’s too much. 

This phenomenon first struck me back in 1991 when I was working as a VR researcher at Stanford and NASA, studying how to optimize depth perception in early vision systems. Back then, the technology was crude and uncomfortable, with low-fidelity graphics and lag so severe it could make you feel sick. Because of this, many researchers believed that the barrier to extended use was the clunky design and poor fidelity. We just needed better hardware, and people wouldn’t feel uneasy.

I didn’t quite agree. Certainly better hardware would help, but I was pretty sure that something else was going on, at least for me personally — a tension in my brain between the virtual world I could see and the real world I could sense around me. It was this conflict between two opposing mental models that made me feel uneasy and made the virtual world seem less real. What I really wanted was to take the power of VR and combine it with my physical surroundings, creating a single immersive experience in which my visual, spatial, and physical senses were all perfectly aligned. I referred to this sensory-focused approach as “design for perception” and suspected that the mental tension and unease caused by VR would go away if we could allow users to reach out and interact with the real and the virtual as if they inhabited the same conceptual reality.    

By a stroke of luck, I had the opportunity to pitch the U.S. Air Force and was funded to build a prototype MR system at Wright-Patterson Air Force Base. It was called the Virtual Fixtures platform, and it didn’t just support sight and sound but also touch and feel (3D haptics), adding virtual objects to the physical world that felt so authentic, they could help users perform manual tasks with greater speed and dexterity. The hope was that one day this new technology could support a wide range of useful activities, from assisting surgeons during delicate procedures to helping technicians repair satellites in orbit through telerobotic control.

A unified mental model

Of course, that early Air Force system didn’t support surgery or satellite repair. It was developed to test whether virtual objects could be added to real-world tasks and enhance human performance. To measure this, I used a simple task that involved moving metal pegs between holes on a large pegboard. I then wrote software to create a variety of virtual fixtures that could help users perform the task. The fixtures ranged from virtual surfaces and cones to virtual tracks you could slide the peg along, all registered to the real pegboard through early passthrough cameras. And it worked, enabling users to perform with much greater speed and precision. But more importantly, none of the human subjects complained about feeling uneasy. This was very different from the VR experiments I ran at NASA.
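
As a rough illustration of how one of those fixtures might work, the sketch below implements a “virtual track” as a spring force that pulls the peg toward the nearest point on a line segment. This is a hypothetical reconstruction in Python, not the original Virtual Fixtures code; the function name and stiffness value are assumptions.

```python
import numpy as np

def track_fixture_force(peg_pos, start, end, stiffness=800.0):
    """A 'virtual track' fixture: find the closest point on the segment
    start->end and return a spring force pulling the peg onto the track.
    Rendered through a haptic device, this feels like a physical guide."""
    seg = end - start
    # Parameter of the closest point, clamped to the segment's endpoints.
    t = np.clip(np.dot(peg_pos - start, seg) / np.dot(seg, seg), 0.0, 1.0)
    closest = start + t * seg               # nearest point on the track
    return stiffness * (closest - peg_pos)  # Hooke's-law restoring force

# A peg slightly off a track running along the x-axis is nudged back on.
force = track_fixture_force(np.array([0.05, 0.01, 0.0]),
                            np.array([0.0, 0.0, 0.0]),
                            np.array([0.1, 0.0, 0.0]))
```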

I give this background because of the impact it had on me. I can still remember the first time I moved a real peg toward a real hole and a virtual surface automatically turned on. Although simulated, it felt genuine, allowing me to slide along its contour. At that moment, the real world and the virtual world became one reality, a unified mixed reality in which the physical and digital were combined into a single perceptual experience that satisfied all my spatial senses: visual, auditory, proprioceptive, and haptic. Of course, both worlds had to be accurately aligned in 3D, but once that was achieved, you immediately stopped thinking about which elements were physically real and which were simulated. Instead, it was all just one world — one mental model.
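
That “automatically turned on” behavior can be sketched as a proximity gate: the fixture contributes no force until the peg comes within range of the hole, at which point a guiding surface engages and pulls the peg toward the hole’s axis. Again, this is a hypothetical illustration of the idea, not the original system’s logic; every name and constant here is assumed.

```python
import numpy as np

def gated_funnel_force(peg_pos, hole_center, radius=0.03, stiffness=500.0):
    """A proximity-gated 'funnel' fixture: when the peg comes within
    `radius` of the hole, a virtual surface turns on and gently pulls
    the peg toward the hole's axis, like sliding along a real cone."""
    offset = hole_center - peg_pos
    if np.linalg.norm(offset) > radius:
        return np.zeros(3)                # fixture stays off when far away
    lateral = np.array([offset[0], offset[1], 0.0])  # guide in x-y only
    return stiffness * lateral
```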

The future is mixed reality

Mixed reality is just getting started. The technology is poised to take off, and it’s not just the impressive new headsets from Meta, HTC, and potentially Apple that will propel this vision forward, but hardware and software tools from Magic Leap, Snap, Microsoft, Google, Lenovo, Unreal, Unity, and many other major players. At the same time, more developers will push the limits of creativity and artistry in MR. Already, they are unlocking what is possible when you combine the real and the virtual, from new types of board games (Tilt Five) and powerful medical applications (MediView XR) to remarkable entertainment experiences (Niantic Labs).

This is why I am confident that the metaverse, the true metaverse, will be an amalgamation of the real and the virtual, so skillfully combined that users will cease to think about which elements are physical and which are digital. Instead, users will simply go about their daily lives and engage with a single reality that is enhanced and embellished by the power of immersive media. This technology has been a long time in the making, but 2023 will be the year its true potential finally starts to take shape.

