UPES professors give insights on Metaverse at Pune Design Festival
UPES was an associate sponsor of the Pune Design Festival, 2022. One of the workshops at the festival, ‘Into the Metaverse – AR/VR’, was presented by UPES faculty Dr. Anmol Srivastava and Pankaj Badoni, who gave engaging insights into the early stages of the Metaverse.
Last year, Mark Zuckerberg, Chief Executive Officer of Meta (formerly Facebook), gave a video presentation of the Metaverse, touted as the next phase of the Internet. He emphasised the notion of a ‘shared sense of space’, where users can communicate and navigate ‘across different layers of reality’.
Zuckerberg’s tour of this fancy world created a buzz around the concept. In January 2022, Microsoft acquired game developer Activision Blizzard for 68.7 billion USD, citing the deal as the ‘building blocks for the Metaverse’.
Today, the Metaverse is everywhere. However, along with the hype, there is also confusion about this hypothetical world, with people trying to figure out what exactly happens in the Metaverse. In this light, UPES faculty Dr. Anmol Srivastava, School of Design, and Pankaj Badoni, School of Computer Science, teamed up for a workshop, ‘Into the Metaverse – AR/VR’, to provide clarity about its early stages.
The workshop was part of the Pune Design Festival 2022, of which UPES was an associate sponsor. The discussion that followed covered topics such as Augmented Reality (AR), Virtual Reality (VR), Social VR and Metaverse. Excerpts:
Dr. Anmol Srivastava, Assistant Professor, School of Design:
The Metaverse became famous especially after the talk by Zuckerberg, though the concept originated long ago: American writer Neal Stephenson introduced the term in his 1992 novel ‘Snow Crash’.
‘Meta’ implies transcending boundaries. Stephenson described users gaining access to the Metaverse through personal terminals that project a high-quality virtual-reality display onto goggles worn by the user. There are also low-quality public terminals in booths, with the penalty of rendering the user in a grainy black-and-white appearance.
In VR, we wear headsets and are teleported into an altogether new world. People say that though it is a machine, when you wear it, it feels real. Basically, VR fools our brain into believing something is real. Earphones hijack your ears; ‘eyephones’ hijack your eyes and ears; VR hijacks your eyes, ears, and other senses as well.
The evolution of modern consumer VR headsets gathered pace in the mid-2010s. There is Oculus, Google Cardboard (which took basic VR capabilities to a larger audience), Samsung Gear VR, and Google Daydream.
Then there are CAVE displays: fully immersive rooms whose walls are projection-mapped, so that stepping inside teleports you to a virtual world. Markers and optical trackers follow your movements, although only the person wearing the tracked goggles experiences the effect.
Then there is AR, which overlays objects onto a physical space. Mixed Reality (MR) is a stronger version of AR, as it also recognises the depth of the space around the user and combines the two. It can be seen as the starting point for the Metaverse.
Then there is something called Augmented Virtuality (AV), which overlays real content onto a virtual environment: a real-time representation of real objects within virtual worlds.
There is a famous reality-virtuality continuum defined by Paul Milgram. At one end of the continuum is the real environment; at the other is virtual reality, which cuts us off from the real world altogether; everything in between falls under MR. Then there is XR, an umbrella term that encompasses AR, VR and MR.
The goal of Metaverse
Convergence between the real and the virtual is the goal of the Metaverse. The Metaverse is a cyberspace combining the physical world and its digital twin. The key technologies that will drive it are Artificial Intelligence (AI), Blockchain, Computer Vision, Distributed Networks, User Interactivity, Pervasive Computing, Scene Understanding, Ubiquitous Interfaces, Extended Reality, Internet of Things (IoT) and Robotics. The ecosystems include avatars, content creation, virtual economy, social acceptability, security and privacy, trust, and accountability.
What is a digital twin?
A Digital Twin is a virtual model designed to replicate the physical world. Different digital worlds can be replicated that act just like the real world. Within the Metaverse, there will be digital natives: individual users who appear as avatars of any form. There is a duality in which individuals have lives in the real as well as the virtual world; they are living two lives, which might influence each other.
As proposed by Lee, Lik-Hang et al., there are digital twins of the digitalised real world, digital natives in the many virtual worlds, and, finally, the co-existence of physical and virtual reality.
The evolution of Metaverse
The evolution of the Metaverse began with literature (such as Lord of the Rings and Dungeons and Dragons) and text-based interactive games (AberMUD, DikuMUD, Snow Crash). Then came virtual worlds and massively multiplayer online games (Second Life and Minecraft), followed by immersive virtual environments on smartphones and wearables (Pokémon Go, VRChat, Super Mario AR), and now the new era of the Metaverse, where technologies such as blockchain are being introduced (crypto-assets such as CryptoKitties and Alien Worlds, which encourage users to earn non-fungible tokens, or NFTs, that can be converted into real-world currency).
Features of Metaverse include:
Immersive Realism: How seamlessly a person can experience, psychologically and emotionally, the environment they are in when inside the Metaverse. Currently, AR is limited to visual aspects, but research is under way into olfactory and human-food interaction (virtual tastes).
Ubiquity: To fully realise the Metaverse, there must be an environment in which people can move around different virtual spaces. The content they create in one virtual universe should hold true in another.
Interoperability: Different platforms should be able to interact with each other. This is something of a concern right now.
Scalability: The goal is to bring the whole world into a virtual environment, so scalability becomes a huge issue, as with any large software system. It primarily concerns how many users can log in at the same time, how avatars can interact, and how complex the scene is.
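To see why concurrent users are the hard part, consider a back-of-envelope sketch of a naive synchronisation scheme, where every avatar broadcasts its state to every other avatar. All numbers below (update rate, payload size) are illustrative assumptions, not figures from the workshop:

```python
# Back-of-envelope cost of keeping avatars in sync in a shared virtual
# space. Naive full-mesh sync: each user sends a state update to every
# other user, so total traffic grows with users * (users - 1).
# Update rate and payload size are assumed, illustrative values.

def sync_bandwidth_bytes_per_sec(users, updates_per_sec=20, payload_bytes=100):
    """Total bytes/second exchanged if every user updates every other user."""
    return users * (users - 1) * updates_per_sec * payload_bytes

for n in (10, 100, 1000):
    mb_per_s = sync_bandwidth_bytes_per_sec(n) / 1e6
    print(f"{n:>5} users -> {mb_per_s:,.1f} MB/s total")
```

The quadratic growth (roughly 0.2 MB/s at 10 users, but nearly 2 GB/s at 1,000) is why real systems shard the world into regions and use interest management rather than broadcasting everything to everyone.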
Pankaj Badoni, Assistant Professor, School of Computer Science:
As per research, every seventh person is using AR. The main reason could be the ease of use: for AR, we do not need cumbersome headsets, only smart devices. In Milgram’s reality-virtuality continuum, the extreme left is the real environment and the extreme right is the virtual environment; somewhere in between, you have AR and AV.
When we talk about the Metaverse, the main component is going to be VR, which immerses you in a virtually created environment. You have no connection with the real world at that moment; the sensors track where you are spatially in the room, and whatever you touch, interact with, or do happens in the virtual world.
There is a famous triangle which says that for a good VR experience, you need the three Is: Immersion, Interaction, and Imagination. A fourth component has now been added to it: AI. The idea is that the environment should not be static; it should be interactive. The imagery being created should seem real, and frames should render without lag.
One of the main components of VR is the headset. Oculus Quest 2, a product of Meta (previously Facebook), is a standalone system with no wires attached. To run an application you have created on the device, you need to build an APK, the same package format used for Android applications. That APK is submitted to the Oculus store, through which you can access the app on the headset.
Then you have the HTC Vive, which is a bit cumbersome as it has a dedicated wire. However, it gives a more realistic visual experience. Both headsets have exhaustive controllers and are available in the UPES labs.
Coming to MR headsets, the typical example is Microsoft’s HoloLens. When you wear the HoloLens, you see artificial objects augmented onto the real environment. A HoloLens is also available at the School of Design lab.
Once you wear the VR headset, you cannot see the real world; you can only see the virtual environment. With MR headsets, you can see virtual objects in the real world.
How AR can change the game for Metaverse
In AR, when you take your mobile phone or tablet and point it at a certain place in your room, an artificial object gets augmented there. In reality, there is no object, but on your tablet you can see it augmented in that spot.
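Under the hood, placing a virtual object in the camera view comes down to projecting its 3D position into pixel coordinates, classically via the pinhole camera model. A minimal sketch follows; the intrinsics (fx, fy, cx, cy) are made-up illustrative values, not parameters of any particular device:

```python
# Project a 3D point (in camera coordinates, metres) to pixel coordinates
# with the pinhole camera model: u = fx * X/Z + cx, v = fy * Y/Z + cy.
# fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
# All intrinsic values here are assumed for illustration.

def project(point, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    x, y, z = point
    if z <= 0:
        return None  # point is behind the camera; nothing to draw
    return (fx * x / z + cx, fy * y / z + cy)

# A virtual anchor 2 m in front of the camera and 0.5 m to the right:
print(project((0.5, 0.0, 2.0)))  # -> (840.0, 360.0)
```

An AR runtime repeats this projection every frame using the device’s tracked pose, which is why the object appears fixed in space as you move the camera.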
IKEA is a good example of using AR. Through your smart device, you can see how a particular piece of furniture is going to look in your home. When you are buying, say, a sofa, you may wonder whether the colour will look good or whether it will fit. Through the IKEA app, you just point the device where you want the sofa and see how it would look there.
AR has wide applications because the cost involved is low. An HTC Vive headset costs about INR 1.5 lakh; a HoloLens costs about INR 2.5 lakh. But a tablet costs about INR 30,000. That is why, across the world, the penetration of AR is quite high.
AR overlays digital content and information onto the physical world – as if they are actually there with you, in your own space. AR opens new ways for your devices to be helpful throughout your day by letting you experience digital content in the same way you experience the world. It lets you search things visually, simply by pointing the camera at them. It can put answers right where your questions are by overlaying visual, immersive content on top of your real world.
Broadly, two types of AR experiences can be created: marker-based, where digital content is anchored to a recognised image or pattern, and marker-less, where the device detects surfaces in the environment and anchors content to them directly.
For instance, many people cannot physically visit the Kedarnath temple. We can create an AR experience in which the site is brought to their living room, and anyone can take a 360-degree tour of the temple. This was done using marker-less AR.
We have also come up with a museum Metaverse. The whole environment is in 3D, bringing a social and psychological experience into it. Say two people who are geographically apart wish to visit the museum at the same time: they can enter the Metaverse and interact with the objects as well as with each other.
Applications of AR range from education to medicine, retail, gaming, and tourism: virtually every scenario imaginable. The capabilities of AR are endless. In my opinion, if any technology can change the game in the Metaverse, it is going to be AR.
The discussion that ensued covered the possibilities and opportunities in the Metaverse. UPES offers two programs – B.Tech. CSE Graphics and Gaming and B.Des. Interaction Design – which, along with the infrastructure at the School of Design’s XR lab, can act as major contributors to the development of Metaverse. As technologies continue to evolve, the university will continue to provide courses and infrastructure that keep its students ahead of the game.