Imagine a parallel digital universe - a place where anything is possible and everything can be anything. You can see what others can't, or share the same dimension with them. You could switch your vision from work to navigation to gaming to socializing with the flick of a switch.
This will be the future, as many companies and organizations are developing such an environment. It is called the AR Cloud. What will this parallel reality be like?
AR Cloud is a persistent 3D digital content overlay on top of the real world. It is a continuously updated and connected collection of machine-readable datasets, point clouds, and descriptors aligned with geolocation, visual markers, and sensors to display 3D and other virtual content in real-time. The AR Cloud is a shared multi-user dimension around us, in which persistent augmented reality experiences reside.
It is the world's digital twin, a real-time geospatial map, a parallel digital universe, the Mirrorworld, the Real World Web or the Spatial Web - the Metaverse.
AR Cloud is an augmented reality universe
AR Cloud - the Metaverse - will become the next critical infrastructure in the history of computing: the operating system for the spatial computing era. Spatial computing, in turn, is going to be the next big thing, along with AI, artificial meat, quantum computing and gene editing. It's a future we are already starting to glimpse.
Multiple parallel universes are being created simultaneously right now. Apple, Google, Microsoft, 6D.ai, Immersal, Open AR Cloud and Facebook, among others, are taking part in the creation process. At this early stage, the Metaverse is mostly theoretical - but very much plausible sci-fi.
Who will build it, who will invest in it, and how?
Most likely, no single entity will build or own the AR Cloud. Instead, there will be many clouds, just as many platforms and networks comprise the web. Most companies will build AR assets and local clouds, possibly with only one kind of content at a time. But what if things turn out differently?
A few technical hurdles still keep the actual Metaverse waiting. The Metaverse will require persistence - a kind of 3D save function for virtual content. Virtual objects should stay in the real world where the user has left them. Depending on the situation, they might be personal or collective objects, but this is a massively multiplayer world of content attached to a simulation of the real world - a lot of data.
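As a rough illustration of what such a "3D save function" might involve, here is a minimal Python sketch. The `PersistentAnchor` structure and its field names are hypothetical - real systems like cloud anchors store far richer visual feature data - but the round trip of saving a geolocated object and restoring it on another device is the core idea:

```python
import json
from dataclasses import dataclass, asdict
from typing import Tuple

# Hypothetical sketch of a persistent AR object: a virtual asset pinned to a
# real-world location, serializable so it can live in a shared cloud.
@dataclass
class PersistentAnchor:
    anchor_id: str
    latitude: float          # WGS84 position the object is pinned to
    longitude: float
    altitude_m: float
    orientation: Tuple[float, float, float, float]  # quaternion (w, x, y, z)
    content_id: str          # which 3D asset to render at this anchor
    shared: bool             # personal object, or visible to every user?

def save_anchor(anchor: PersistentAnchor) -> str:
    """Serialize the anchor so it can be stored in the cloud."""
    return json.dumps(asdict(anchor))

def load_anchor(blob: str) -> PersistentAnchor:
    """Restore the anchor on any device that later visits this spot."""
    data = json.loads(blob)
    data["orientation"] = tuple(data["orientation"])  # JSON lists -> tuple
    return PersistentAnchor(**data)

# Leave a shared virtual note on a wall, then restore it elsewhere.
note = PersistentAnchor("a1", 60.1699, 24.9384, 12.0,
                        (1.0, 0.0, 0.0, 0.0), "note-mesh-001", shared=True)
restored = load_anchor(save_anchor(note))
assert restored == note  # the object survives the round trip
```

The "a lot of data" point shows up immediately: this toy record is a few hundred bytes, while a real anchor also carries the point clouds and descriptors needed to relocalize against the physical scene.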
Another technical obstacle to overcome is occlusion, which refers to a situation where a physical object blocks the view of virtual objects. This kind of smooth blending between extended reality and the real world is required for an immersive effect.
There are many demo applications, but real-time depth mapping of arbitrary real-world objects, at any speed or opacity, is hard. It takes a great deal of AI to make our devices see as clearly as eyes honed by millions of years of evolution.
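To make the occlusion problem concrete, here is a minimal sketch that assumes per-pixel depth maps are already available - which is exactly the hard part described above. Given the depth maps, the compositing rule itself is simple: draw a virtual pixel only where the virtual object is closer to the camera than the real surface behind it.

```python
# Minimal occlusion sketch: per-pixel depth test between the real scene and a
# virtual object. Depth maps here are tiny hand-written grids; a real device
# would produce them every frame from its depth sensor and its renderer.

def composite(real_depth, virtual_depth, virtual_color, background):
    """Keep the virtual pixel only where the virtual object is nearer."""
    h, w = len(real_depth), len(real_depth[0])
    out = [row[:] for row in background]
    for y in range(h):
        for x in range(w):
            v = virtual_depth[y][x]          # None = no virtual content here
            if v is not None and v < real_depth[y][x]:
                out[y][x] = virtual_color
    return out

# A 1x3 scene: a real wall 2.0 m away; a virtual cube at 1.5 m (in front of
# the wall) in the middle pixel and at 3.0 m (behind the wall) on the right.
real = [[2.0, 2.0, 2.0]]
virt = [[None, 1.5, 3.0]]
frame = composite(real, virt, "V", [[".", ".", "."]])
# -> [[".", "V", "."]]  the occluded right-hand pixel keeps the real view
```

The immersion-breaking failure mode is equally easy to see: if the estimated real depth is wrong, the virtual cube pops through the wall.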
Other things, such as physics interaction and a continuously changing lighting environment relative to the real world, are comparatively trivial - but add all these calculations and computer vision to real-time data transfer of everything, everywhere, and even a powerful smartphone will quickly buckle under the load.
We also need next-level display and battery technology to shrink the current generation of bulky AR glasses from awkward Star Trek-style sci-fi accessories into street-credible fashion you'd want to wear daily.
But we are getting there. As always, technology is advancing faster than we can imagine.
So what is possible in the Metaverse? The most apparent business applications are 3D navigation, feeding autonomous vehicles real environmental data, IoT applications, and construction and urban planning. And, of course, advertising, social apps and games.
But the most imaginative future lies in planet-scale, multi-user XR experiences. We will literally be one and the same in a shared experience.
Moving the internet from screens onto the real world
With the AR Cloud, navigation shifts from clicks and links to spatial interactions. All information will be in its proper place. Add user manuals, history and live data points for everything in every place, and access that data with your eyes, without the need to search.
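One way to picture "information in its proper place" is as a spatial query instead of a text search: the device resolves what is around you by distance, not by keywords. The content records and the lookup below are purely illustrative - a real AR Cloud would match visual descriptors, not just coordinates - but the shift from search box to location is the point:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def lookup(user_lat, user_lon, content, radius_m=50):
    """Return spatially anchored content within radius, nearest first."""
    hits = [(haversine_m(user_lat, user_lon, lat, lon), name)
            for name, lat, lon in content]
    return [name for d, name in sorted(hits) if d <= radius_m]

# Illustrative anchored content around a street corner in Helsinki.
content = [("tram stop timetable", 60.1700, 24.9385),   # ~12 m away
           ("statue history card", 60.1705, 24.9390),   # ~75 m away
           ("café menu", 60.2000, 24.9000)]             # kilometres away
nearby = lookup(60.1699, 24.9384, content)
# -> ['tram stop timetable']: only content at this spot is surfaced
```

No query was typed; standing somewhere *is* the query.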
Google's mission statement 20 years ago was: "To organize the world's information and make it universally accessible and useful". Today we want and need immediate information - information about what's in front of us right now. The Metaverse will answer that call.
The dystopian prophecy of Keiichi Matsuda's short film Hyper-Reality will, fortunately, likely never materialize. Consider it a warning for the designers of the future to focus on user experience and actual value. Even in today's world, people block their ads. If ads were 100% intrusive, the whole augmented-reality world would collapse under its own impossibility. Our monkey brains are already overloaded with the current amount of data, and completing any task would be impossible under a continuous bombardment of new visual stimuli.
What we need is a way to switch our realities according to the task at hand. With the Metaverse, this kind of extension of our physical being to the virtual dimension will be possible.
The real value of the Metaverse will reside in data
As everything you do in the real world would be mirrored and saved into the cloud, you become a data point to follow. All of your interactions will become valuable: where you are, what you do and what you are looking at. As with all data, privacy will be the single most crucial consideration. You really should read the fine print in your AR Cloud EULA.
Dystopian vision is not the way to create a new universe
China is already collecting enough information to score its citizens - and goes as far as displaying your socioeconomic status to the people around you. Scary, but real.
Not unlike the episode "Nosedive" in Netflix's Black Mirror series.
A shared experience
The Metaverse can be a shared recreation of the physical world that enables us to share experiences, collaborate and achieve a collective intelligence beyond our wildest imagination. Today the internet houses more data than one could ever consume, but you have to look for it. It is the most extensive library in the world, yet finding what you are looking for is troublesome, intimidating and, most of all, unreliable - even dangerous.
Imagine that all the data you could ever need is right in front of you, always, without the need for searching.
We are moving from looking down at objects to looking through and inside them.
By the time the cloud and the Metaverse become more complete - with things like depth sensors and direct links to our brains - we might even reach the point where the blind can see and the deaf can hear.
A new paradise
With great power comes great responsibility. Let's not reduce this magnificent platform to a plethora of intrusive advertising that overlooks privacy. Like the real world, the Metaverse will be a delicate ecosystem of different species - one at least as easy to destroy through overpopulation and pollution as the real one.
I genuinely hope we can find the balance between an open and a closed universe - a world where everything is social but privacy is a choice - so that the result is a parallel paradise that enlightens humanity, not a suburb you are too afraid to visit.
But whatever the result, I'm all in for the ride. We have exciting times ahead of us!
As we slowly approach the status of virtual gods creating a new parallel universe, the idea that we already live in a simulation might not be so far-fetched after all.
Arilyn is one of the founding members of the global Open AR Cloud
Magic Leap's version of their AR cloud, the Magicverse
6D.ai master plan for the creation of AR Cloud