Metaverse Technology – An intersection of trending concepts and technologies
People often look for the specific technologies that come along with the Metaverse. They assume the Metaverse is a single, special technology and try to understand what the technology behind it is. Others link the Metaverse only to Virtual Reality, and yet others start talking about crypto and NFTs, or about virtual worlds and games, when it comes to the Metaverse. It seems that, even though many people talk about the Metaverse, it still needs to be clarified what the related technologies are and what their part in the overall concept is. So, how do technologies shape the Metaverse? Are those technologies ready for industrial business cases? If not, what are the alternatives? How will that apply to enterprises? This article aims to provide some insights and to dispel some myths around this. Let's dive in…
Authors: Arta ALAVI, Timo Gramlich, Andreas Spahn
For those of you who want an introduction about the topic: Metaverse Intro
Metaverse Technology: trendy and confusing at the same time…
It's funny to see a concept that is so widely used, so trendy, and so confusing at the same time. In a recent survey conducted by DEPT, only 16% of people agreed that they understood what the Metaverse is, and we can surely expect that not all of them really have a clear understanding of the concept. The Metaverse can be seen as the emerging immersive dimension of the internet that we may experience in the future. It will run in continuity across all our devices, be it glasses, smartphones, laptops, or headsets, blending the physical and virtual worlds in a seemingly real way. The goal is to form a creator economy that puts creators in control of their content and their revenue streams, cutting out the middlemen. This is far beyond the current claims of companies selling their "Metaverse solutions". Several technologies must be combined to make these experiences possible: immersive technologies, Blockchain, Digital Identity, Digital Twins, IoT, Artificial Intelligence, Spatial Analytics, Edge Computing, 5G/6G, …
Technologies behind the Metaverse
As stated above, the Metaverse is not a single technology, nor a single platform, but rather a technical concept combining various technologies. The diagram shows the intersections of some of the key subject areas for the Metaverse, the most prominent ones being immersive computing with virtual reality capabilities, Web3 and its decentralization, as well as artificial intelligence. If you expect this article to give you a full list of Metaverse technologies as a cooking recipe, then we must disappoint you. As of today, nobody can provide such a list for a concept that will only reach its full-fledged vision in the next 5-15 years. This is a realistic timespan we must consider for the Metaverse and its technologies to mature. So, let's look at some of the most important "Metaverse technologies" in a bit more detail.
Immersive Computing and the Metaverse
While multiple technologies related to the Metaverse shape its mechanics, abilities, and behavior, two main technologies (Virtual Reality and Augmented Reality) are intended for experiencing or immersing oneself in the Metaverse.
Immersing oneself in the Metaverse using Virtual Reality
Virtual Reality (VR) and Augmented Reality (AR) create immersive, 3D virtual environments that a user can interact with. Of the two, Virtual Reality provides the user with the highest degree of immersion into artificial environments like those in the Metaverse context.
The idea of virtual reality is to immerse the user (at least as far as the sensory perceptions are concerned) into a computer-generated environment using VR headsets. These head-mounted devices (HMDs) are equipped with two high-resolution displays positioned right in front of the user's eyes. Examples of VR headsets are the Meta Quest 2, Meta Quest Pro, Varjo XR-3, HP Reverb G2, HTC Vive Pro 2 and PICO 4. VR headsets block the user visually (completely) and acoustically (partly) from reality in order to replace it with an artificial one. That means a user can sit comfortably on the living room couch wearing a VR headset while getting an adrenaline rush from a roller coaster ride in virtual reality.
Further information for you:
- Virtual Reality for business and enterprise applications
- Eye control software and its role for VR glasses
- Metaverse and Virtual Reality: a little misconception
- How Virtual Reality shapes the Future of Work
Augmented Reality to create real world overlays
While VR devices are designed to immerse the user completely (at least visually) in a virtual reality, AR glasses focus on enhancing the perceived reality with additional information without blocking the user from the real world, which always remains visible when wearing AR glasses. Thus, augmented reality is mainly about adding visual information and content to the reality we perceive with our senses. The game Pokémon Go is often used as an example of a very well-known "augmented reality app" that creates an overlay over the real world and pulls users into a virtual experience. Technology-wise, the most famous corresponding devices are AR glasses like the Microsoft HoloLens.
Away from games, AR has become more widespread in the last decade and can also be found in everyday use – think of the head-up displays in modern cars that bring additional information like current speed or navigation arrows onto the windshield, so we can keep our head (and eyes) up on what’s happening on the road instead of looking down on a display.
Game Engines to create virtual world experiences in the Metaverse
The simplest way to create an immersive experience is to use the VR headset as a playback device for 3D videos like “Maldives VR 360” (YouTube). Of course, that’s far from Metaverse with its computer-generated virtual realities and immersive environments that usually are created using game engines like Unity or Epic’s Unreal Engine – to only name the two most famous ones.
As tools originally intended for game development, working with Unreal or Unity is very different from the usual development experience in IDEs like Eclipse or browser-based tools for cloud development. While Epic's Unreal Engine, for example, is written in C++ and of course also allows the use of C++ in a development context, most development and creation tasks are executed in the editor following a so-called Blueprints Visual Scripting approach, which by no means should be understood as trivial. In addition, introducing game engines requires a new skillset in most companies.
Artificial Intelligence (AI) and its role within the Metaverse
When a user enters the immersive experience, he or she could have a virtual assistant to help throughout the whole experience. The virtual assistant can take the form of a wrist terminal, a voice-over, or a full-blown avatar. The first use of AI technologies is voice recognition, which turns your voice into text, while NLP (Natural Language Processing) helps to understand the intent; this is covered by Large Language Models (LLMs) like OpenAI's ChatGPT, Google Bard based on LaMDA, or Meta's LLaMA. We can add tone and emotion analysis to extract more information from your voice, or apply sarcasm detection to the text. Combined, these technologies enable the system to fully understand your intent. Once the request is processed, the system needs to generate the appropriate answer (or action) and respond in an appropriate tone using speech synthesis. If you opt for an avatar, the AI might also manage the facial gestures, lip syncing, and arm and body gestures while the avatar is speaking, to make it look more natural. AI can also predict an event (a request for help), display insights (contextual information when talking with someone), or estimate the likelihood of someone acting on a proposal.
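The assistant pipeline described above (speech-to-text, intent detection, response generation) can be sketched as follows. This is a toy illustration: all function names and intents are hypothetical, and a real system would call a speech-recognition service and an LLM instead of these placeholders.

```python
# Minimal sketch of a virtual-assistant pipeline (hypothetical names).

def speech_to_text(audio: bytes) -> str:
    """Placeholder for a real speech-recognition service."""
    return audio.decode("utf-8")  # pretend the 'audio' is already text

def detect_intent(text: str) -> str:
    """Toy NLP step: map an utterance to an intent via keywords;
    a real system would use an LLM or a trained intent classifier."""
    text = text.lower()
    if "help" in text:
        return "request_help"
    if "buy" in text or "purchase" in text:
        return "purchase_item"
    return "small_talk"

def respond(intent: str) -> str:
    """Generate a reply; speech synthesis would voice this answer."""
    replies = {
        "request_help": "Sure, let me guide you through it.",
        "purchase_item": "Opening the marketplace for you.",
        "small_talk": "Nice to meet you in here!",
    }
    return replies[intent]

utterance = speech_to_text(b"Can you help me find the exit?")
print(respond(detect_intent(utterance)))  # → Sure, let me guide you through it.
```

In production, each step would be a separate service (speech recognition, NLP/LLM, speech synthesis), but the data flow stays the same.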
Generative models can be used to populate the Metaverse with terrains, trees, lakes, people and animals, and even to create fairy worlds or creepy horror places for Halloween. Text-to-x solutions can generate, in a few minutes, images and videos that match your description, to be used in the design of your environment. Models like Nvidia's GET3D can generate 3D assets to populate your 3D experience.
Analytics, especially spatial analytics, leverage AI to better understand users' behavior and help build better and more appealing experiences, because an unhappy user will not come back.
The Metaverse, Web3 and Blockchain: How decentralization might shape the future of virtual Interaction
Blockchain and Distributed Ledger Technology for the Metaverse
Blockchain is a Distributed Ledger Technology (DLT) running on a peer-to-peer computer network. It consists of a growing list of records (called blocks), each securely linked to the previous block using cryptography, forming the chain. Each block contains the cryptographic hash of the previous block, a timestamp and the transaction data. Blockchain technology is integrated into multiple areas:
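The chaining mechanism just described (each block storing the hash of its predecessor) can be sketched in a few lines of Python. This is a toy illustration of the data structure only, not a real blockchain implementation (no consensus, no network):

```python
import hashlib
import json
import time

def make_block(data, previous_hash):
    """A block stores the previous block's hash, a timestamp and data;
    its own hash is computed over all of these fields."""
    block = {
        "previous_hash": previous_hash,
        "timestamp": time.time(),
        "data": data,
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

# Build a tiny three-block chain.
genesis = make_block("genesis", previous_hash="0" * 64)
b1 = make_block({"from": "alice", "to": "bob", "amount": 3}, genesis["hash"])
b2 = make_block({"from": "bob", "to": "carol", "amount": 1}, b1["hash"])

# The link is the security property: tampering with b1 would change its
# hash, and b2 (which stores b1's original hash) would no longer match.
assert b2["previous_hash"] == b1["hash"]
```

Because every block commits to the hash of the one before it, changing any historical record invalidates all subsequent blocks, which is what makes the ledger tamper-evident.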
The best-known examples are cryptocurrencies like Bitcoin or Ethereum. A cryptocurrency is a digital currency designed to work over a computer network without relying on any central authority, such as a government or bank, to verify that the parties to a transaction have the money they claim to have and to operate the transfer of funds between them.
Non-Fungible Tokens (NFTs):
An NFT is a unique digital identifier that is recorded on a blockchain and managed using a smart contract. Smart contracts are programs stored on a blockchain that run when predetermined conditions are met. Their objective is to reduce the need for trusted intermediaries and to prevent fraud as well as malicious or accidental exceptions. Smart contracts were introduced by Ethereum and are considered the building block for decentralized applications such as Decentralized Finance (DeFi) using cryptocurrencies or NFTs. NFTs are trendy because they represent ownership of a piece of digital content, mostly (but not only) an art piece that is also stored in decentralized storage like IPFS, Filecoin, Storj, etc. NFT creators have two sources of income: the primary sale of NFTs and royalty payments (5%-15%) from secondary transactions, managed by the smart contract.
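The creator revenue model described above can be made concrete with a small calculation. This is a hypothetical illustration of the economics only (a 10% royalty is assumed, within the typical 5%-15% range); on-chain, a smart contract would enforce the royalty on every secondary sale:

```python
# Hypothetical sketch of NFT creator income: full primary sale price
# plus a royalty on every secondary (resale) transaction.

ROYALTY_RATE = 0.10  # assumed rate, within the typical 5%-15% range

def creator_income(primary_price: float, secondary_prices: list[float]) -> float:
    """Creator earns the primary sale plus royalties on each resale."""
    royalties = sum(price * ROYALTY_RATE for price in secondary_prices)
    return primary_price + royalties

# Primary sale at 100, later resold for 250 and then 400:
print(creator_income(100.0, [250.0, 400.0]))  # → 165.0
```

Note that the creator keeps earning even as the asset changes hands, which is exactly the property that makes smart-contract royalties attractive compared with traditional one-off art sales.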
Soulbound Tokens (SBTs):
SBTs are digital identity tokens, representing features or achievements, “soul-attached” to specific real persons or properties, and are not transferable.
Decentralized Autonomous Organization (DAO):
A DAO is a structure with no central authority, whose members are token holders participating in the management and decision-making of an entity. DAOs are used for a bottom-up management approach. All votes and activity are posted on a blockchain for transparency. Voting power is roughly proportional to the number of tokens someone owns.
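Token-weighted governance as described above amounts to a simple weighted tally. The sketch below is a toy illustration with made-up members and holdings; a real DAO would record votes on-chain via a governance contract:

```python
from collections import Counter

# Toy token-weighted DAO vote: each member's voting power is
# proportional to the governance tokens they hold (names are made up).

holdings = {"alice": 50, "bob": 30, "carol": 20}
votes = {"alice": "yes", "bob": "no", "carol": "yes"}

tally = Counter()
for member, choice in votes.items():
    tally[choice] += holdings[member]

print(dict(tally))  # → {'yes': 70, 'no': 30}
```

The example also shows the commonly criticized side effect of this scheme: a single large holder (here, alice) can dominate the outcome, which is why some DAOs experiment with alternatives such as quadratic voting.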
How Web3 is impacting the Metaverse
Web3 is a decentralized, permissionless, trustless infrastructure that transfers control from today's centralized entities to the participants of a decentralized peer-to-peer network. Web3 aims to give content creators back control over their creations, as opposed to the Big Tech walled-garden platforms of today's Web2 world.
Interested in learning more? Web3 explained
Decentralization is at the heart of the Metaverse, and we will cover several aspects that rely on decentralized blockchain technologies:
Authentication and data privacy:
Digital passports are not new; we routinely use Google, Facebook or Microsoft accounts to log in to different websites. Tomorrow, with Self-Sovereign Identity (SSI), you will authenticate using your SSI wallet and grant or revoke access to your private information without intermediaries. Tracking possibilities are much wider in the Metaverse, as all spatial information can be used to track what you see or say and whom you are talking to. Data privacy must be at the heart of each experience.
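The grant/revoke model of an SSI wallet can be sketched as selective disclosure: the owner decides, per verifier, which attributes are shared. This is a hypothetical, simplified illustration (real SSI wallets use verifiable credentials and cryptographic proofs, not plain dictionaries):

```python
# Toy sketch of SSI-style selective disclosure (hypothetical API).

class SSIWallet:
    def __init__(self, attributes: dict):
        self._attributes = attributes          # private data stays local
        self._grants: dict[str, set] = {}      # verifier -> allowed keys

    def grant(self, verifier: str, keys: set) -> None:
        """Owner allows a verifier to see specific attributes."""
        self._grants.setdefault(verifier, set()).update(keys)

    def revoke(self, verifier: str) -> None:
        """Owner withdraws all access for a verifier."""
        self._grants.pop(verifier, None)

    def present(self, verifier: str) -> dict:
        """Share only the attributes this verifier was granted."""
        allowed = self._grants.get(verifier, set())
        return {k: v for k, v in self._attributes.items() if k in allowed}

wallet = SSIWallet({"name": "Alice", "age": 30, "email": "a@example.org"})
wallet.grant("shop.example", {"age"})
print(wallet.present("shop.example"))  # → {'age': 30}
wallet.revoke("shop.example")
print(wallet.present("shop.example"))  # → {}
```

The key difference from Web2 login is visible in `present`: the verifier never sees the full identity, only the attributes the owner explicitly released, and that release can be withdrawn at any time.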
A popular way of monetizing content and access to certain experiences is to mint special NFTs that you give to your best customers to make an event exclusive. Those NFTs then act as an access key to your event. These NFTs are tradable, which is an opportunity for you to further promote your event. If you don't want your guests to trade the tickets, just mint SBTs (non-transferable NFTs) instead.
All immersive platforms have marketplaces to buy and sell items based on a currency defined for in-experience trading. Almost all platforms have a virtual currency (Robux for Roblox, V-Bucks for Fortnite, the Linden dollar for Second Life, SAND for The Sandbox, MANA for Decentraland, …) that is either based on a fiat currency or is a cryptocurrency. This provides financial flexibility, as you can design your currency according to your business requirements. It also lets you run crowdfunding campaigns to support a for-profit or non-profit project, or fund your startup via an ICO (Initial Coin Offering).
As your assets are managed by your wallet, you can connect other experiences and reuse your existing assets. Buy a pair of shoes in an experience for your avatar and wear them in another experience later. The assets you buy belong to you, not to the experience.
NFTs are de facto ownership certificates of digital assets. Even though there have been a lot of scams in recent years, NFTs are the most secure way to prove that a digital asset belongs to you so that you can trade it, for example an NFT representing ownership of a virtual land parcel or a virtual piece of clothing in an experience. In contrast, SBTs are non-transferable tokens that stick with their owner, which makes them suitable as certificates of achievement (diplomas, trainings, …).
Decentralized Services and Infrastructure:
As Ethereum is Turing-complete, we can create anything with it, so any existing Web2 service can be replaced with a Web3 service, with certain limitations in terms of scalability due to the decentralization. Apart from your SSI wallet, an experience does not necessarily have to use Web3 technologies, as interoperability will ensure a smooth transition between experiences.
With Web2 technologies, Big Tech companies act as intermediaries hosting creators' content and monetizing it through ad services. Independent content creators include social media influencers, bloggers, videographers, 3D designers, digital painters, and more. Currently, around 50 million people worldwide consider themselves creators. The aim of Web3 is to enable the creator economy and reward creators for their content, which requires a transparent system to certify ownership, the number of views, and the associated rewards. In the Metaverse, this content can be a complete experience, a 3D asset (house, shoes, clothes, paintings, …), animations, special effects, audio tracks, and everything else that can be bought or sold in a marketplace.
Low-Code / No-Code and the Metaverse
In the Metaverse context, Low-Code / No-Code (LCNC) is and will be of interest as well, as it allows the creation and customization of virtual or augmented environments without the need to write any code. Already used in immersive gaming, LCNC lets users create and customize their avatars, to name a basic example. This will translate into the Metaverse in general as well: participants will use tools that let them build and tweak their virtual look, the environment itself, and of course the processes and behavior represented in the Metaverse. Think, for example, of retailers that will be able to build their virtual presence and manage the inventory of 3D online stores they created and deployed themselves using an LCNC solution.
New to LCNC? Then you might want to read this: What is Low-Code / No-Code?
Low-Code / No-Code – necessary paradigms to build the Metaverse
Even for building the Metaverse (or parts of it) itself, today's tools already follow low-code/no-code paradigms. One example is Epic's Unreal Engine, which is used to develop games but also VR or AR content in general. Based on the C++ programming language under the hood, the tool also provides an LCNC graphical interface to implement logic, behavior, and processes. Only for rare use cases or optimization purposes might it be necessary to drop down to the C++ code level.
While LCNC sounds relatively easy to use, it should by no means be taken as downright simple; realizing the intended logic can be a complex task, as one of our VR prototypes demonstrated.
Integration of the various technical concepts to build the Metaverse
Especially when it comes to the Metaverse with its various individual technologies, some of which are still maturing or will be combined as the Metaverse is created, integration is key. This also affects the ease of developing those new worlds, applications, and environments, especially given the vast number of tools out there for tasks like image and video editing, 3D rendering, creation of VR or AR content, game and scene design, and so on. Oftentimes, these tools are great in their field but don't talk to or integrate directly and seamlessly with others.
Integrating various Metaverse technologies needs a common ground
To effectively create the Metaverse, all these fields and different bits and pieces need to come together, which can be quite cumbersome and development-intensive, for instance when writing dedicated code to make two tools work together. Many businesses then start building their own solutions for their specific needs, which results in a lack of common ground. Take the internet as an example: it works great thanks to the HTML standard. All web applications speak a common language, so the browser you are using doesn't need to deal with thousands of ways of interpreting the web pages you visit. Undoubtedly, a common ground like HTML would also be an advantage, or even a necessity, for pursuing a concept like the Metaverse.
Universal Scene Description as the future integration standard?
A standard that could become the HTML of the Metaverse is Universal Scene Description (USD) from Disney's animation studio Pixar. Originally developed to support the creation of 3D movies like Finding Dory (2016), USD enables a wide variety of applications to communicate with each other and exchange data.
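To give a feel for the standard: alongside its binary format, USD has a human-readable text format (.usda) in which a scene is a hierarchy of "prims" with typed attributes. The fragment below is a minimal sketch, not taken from any real production asset:

```usda
#usda 1.0

def Xform "Scene"
{
    def Sphere "Ball"
    {
        double radius = 0.5
    }
}
```

Because any USD-aware tool can read and write this same scene description, applications from different vendors can collaborate on one scene, which is exactly the HTML-like role described above.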
Using Pixar's USD as a foundation, GPU manufacturer Nvidia is expanding and evolving the standard to take it from the animation studio to the Metaverse with a more industry-centric focus. The overarching idea is to enable more and more traditional industries and businesses to step into the Metaverse and profit from its benefits, from collaboration to digital twins. Existing examples are the German railway company Deutsche Bahn, which created a digital twin of the German rail network (e.g., for simulation purposes), and car manufacturer BMW with a Metaverse version of one of its factories.
So, together with industry partners like Siemens, Pixar, Adobe, Autodesk and more, Nvidia is driving the creation of the industrial Metaverse, boosted by its RTX GPUs and a supporting and orchestrating platform called Nvidia Omniverse.
Conclusion: The Metaverse Technology Landscape is complex and still work in progress (as of 2023)
We must admit, the title of this article, Metaverse Technology, is somewhat misleading. It implies that someone already knows what the Metaverse will look like, so that we can work according to this definition and do research and development to make it happen. The truth is that this definition is not yet in place and only some enablers are known. The Metaverse and its technology landscape have yet to mature, evolve, improve, and overcome existing limitations. It's a bit as if we are still waiting for the "iPhone of the Metaverse". Until then, we keep experimenting and proposing what current technology can provide, as do the various Metaverse platforms. To bring the overall concept to life, integration is key; it will help create a more immersive and seamless Metaverse experience for users. The Metaverse still has to survive the period after the initial hype and prove that it benefits society. However, on the way to this vision we already see and realize a lot of business value through the individual Metaverse technologies!
Find more topics about digital innovation here:
What do you think about the topic? Leave us a comment and share your thoughts.