
If you ask Meta, or its peers, whether the metaverse is possible, the answer is assured: yes, it’s only a matter of time. The challenges are huge, but technology will overcome them.
Technology may indeed correct many of the issues the metaverse faces: better displays, more sensitive sensors, and faster consumer hardware will be key. But not every problem can be solved by improving existing technology or piling dollars against it. The metaverse may find itself bound by more fundamental technical barriers.
Meta’s vision of the metaverse is a fully simulated “embodied internet” experienced through an avatar. It implies a realistic experience that lets users move through space at will and pick up objects with ease. But the metaverse as it exists today falls far short of that mark. Movement is restricted, and objects rarely, if ever, react as expected.
Louis Rosenberg, CEO of Unanimous AI and a veteran of augmented reality research, says the reason is simple: you aren’t really there, and you aren’t really moving.
“We humans have bodies,” Rosenberg said in an email. “If we were just a pair of eyes on an adjustable neck, VR headsets would work great. But we have bodies, and that causes a problem that I describe as ‘perceptual inconsistency.’”
Meta often demos an instance of this problem: friends gathered around a virtual table. The company’s press materials show avatars fluidly moving around the table, standing up and sitting down at a moment’s notice and interacting with the table and chairs as if they were real physical surfaces.
“That cannot happen. The table isn’t there,” Rosenberg said. “In fact, if you tried to lean on the table to make your avatar look like it was leaning, your hand would go right through it.”
Developers can try to fix the problem with collision detection that stops your avatar’s hand at the virtual tabletop. But remember: the table isn’t there. If your hand stops in the metaverse while it keeps moving in reality, the result is disorienting, a bit like a prankster pulling a chair out from under you moments before you sit down.
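To see why collision detection can’t close the gap, here’s a minimal sketch in Python of the kind of clamp a developer might apply. All of the numbers are invented for illustration; the point is that the virtual hand is constrained while the tracked real hand is not.

```python
# Minimal sketch of "perceptual inconsistency": a virtual collision clamp
# can stop the avatar's hand, but nothing stops the user's real hand.
# All values are illustrative, not from any real engine.

TABLE_HEIGHT = 0.75  # assumed virtual tabletop height, in meters

def clamp_hand_to_table(real_hand_y: float) -> float:
    """Simple collision response: don't let the avatar's hand pass below the tabletop."""
    return max(real_hand_y, TABLE_HEIGHT)

# The user lowers their real hand toward where the virtual table appears to be.
for real_y in [0.90, 0.80, 0.75, 0.70, 0.60]:
    virtual_y = clamp_hand_to_table(real_y)
    gap = real_y - virtual_y  # a nonzero gap is what the user feels as disorientation
    print(f"real hand: {real_y:.2f} m | avatar hand: {virtual_y:.2f} m | mismatch: {gap:+.2f} m")
```

Once the real hand drops below the virtual tabletop, the mismatch grows with every centimeter: the avatar says one thing, your proprioception says another.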
Meta is working on EEG and EMG biosensors that could let you move through the metaverse with a thought. That could smooth out many movements and prevent unwanted contact with real-world objects while you roam a virtual space. But even this can’t deliver full immersion. The table still doesn’t exist, and you still can’t feel its surface.
Rosenberg believes this will confine the potential of a VR metaverse to “short-duration activities” such as gaming or shopping. He sees augmented reality as the more comfortable long-term solution. Unlike VR, AR augments the real world rather than simulating one, which sidesteps the problem of perceptual inconsistency. With AR, you interact with an actual table.
Translating our physical forms into virtual avatars is one hurdle, but even when that’s solved, the metaverse will likely face another: moving data between users thousands of miles apart at very low latency.
“To be truly immersive, the round trip between user action and simulation response needs to be imperceptible to the user,” Jerry Heinz, a member of the Ball Metaverse Index expert council, said in an email. “In some cases, ‘imperceptible’ means less than 15 milliseconds.”
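A back-of-the-envelope calculation shows how punishing that budget is. Light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, about 200 kilometers per millisecond, and some of the budget must be spent on simulation and rendering rather than transit. The 5-millisecond processing allowance below is an assumption for illustration, not a figure from Heinz.

```python
# Rough latency budget: how far away can the server be if the full
# round trip must stay under 15 ms? All figures are approximate.

SPEED_IN_FIBER_KM_PER_MS = 200  # ~2/3 of c; a standard rule of thumb
ROUND_TRIP_BUDGET_MS = 15
PROCESSING_MS = 5  # assumed time for simulation + rendering + encoding

transit_ms = ROUND_TRIP_BUDGET_MS - PROCESSING_MS  # time left for the wire
one_way_ms = transit_ms / 2
max_distance_km = one_way_ms * SPEED_IN_FIBER_KM_PER_MS

print(f"Max one-way fiber distance: ~{max_distance_km:.0f} km")
# ~1000 km, and that's a best case: real routes are longer than
# straight lines, and every router hop adds queueing delay.
```

Physics alone caps the server at roughly a thousand kilometers away, before accounting for indirect routes or congestion, which is why proximity to the user matters so much.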
Heinz, the former head of enterprise cloud services at Nvidia, knows this problem firsthand. Nvidia’s GeForce Now service lets customers play games in real time on hardware located in a data center, which requires high bandwidth and low latency. According to Heinz, GeForce Now averages around 30 megabits per second down and 80 milliseconds round trip, with only the occasional dropped frame.
Modern cloud services such as GeForce Now handle user load via content delivery networks that host content in data centers close to users. When you connect to a game through GeForce Now, the data isn’t delivered from a central data center used by all players, but rather from the closest available data center.
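The routing logic behind that choice is conceptually simple. Here’s a sketch of picking the data center with the lowest measured round-trip time; the region names are invented, and the probe is a stand-in for a real network measurement.

```python
import random

# Hypothetical candidate regions; in a real CDN these would come from a
# service directory, and latency would be measured with real network probes.
REGIONS = ["us-east", "us-west", "eu-central", "ap-southeast"]

def probe_rtt_ms(region: str) -> float:
    """Stand-in for a network probe; returns a simulated round-trip time."""
    return random.uniform(10, 180)

def pick_closest_region(regions: list[str]) -> str:
    """Measure each region once and route the user to the lowest-latency one."""
    rtts = {region: probe_rtt_ms(region) for region in regions}
    best = min(rtts, key=rtts.get)
    print({r: f"{ms:.0f} ms" for r, ms in rtts.items()}, "->", best)
    return best

pick_closest_region(REGIONS)
```

This works well when the conversation is between one user and one nearby server. It breaks down when the conversation is between users scattered across the globe.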
The metaverse throws a wrench into the works. Users can be anywhere in the world, and the path data travels between them may not be under the platform’s control. To solve this, metaverse platforms need more than scale. They need network infrastructure that spans many clusters of servers working together across multiple data centers.
“The interconnects between clusters and servers would need to change versus today’s loose affinity,” Heinz said. “To further reduce latency, service providers may need to offer rendering and compute at the edge while backhauling state data to central servers.”
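In other words, the latency-critical work moves to servers near the user, while only compact, authoritative state travels back to the core. A rough sketch of that split follows; every class, field, and timing here is invented to illustrate the architecture Heinz describes, not drawn from any real platform.

```python
# Sketch of an edge/core split: per-frame input handling and rendering
# happen at an edge node near the user, while batched state updates are
# backhauled to a central server on a slower cadence.

class EdgeNode:
    def __init__(self):
        self.world_state = {"player_pos": (0.0, 0.0, 0.0)}
        self.pending_updates = []

    def handle_input(self, dx: float, dy: float, dz: float) -> None:
        """Runs every frame at the edge: apply input immediately."""
        x, y, z = self.world_state["player_pos"]
        self.world_state["player_pos"] = (x + dx, y + dy, z + dz)
        self.pending_updates.append(self.world_state["player_pos"])
        # rendering would also happen here, close to the user

    def backhaul(self, central_state: dict) -> None:
        """Runs far less often: ship the latest state to the core."""
        central_state["player_pos"] = self.pending_updates[-1]
        self.pending_updates.clear()

central_server_state = {}
edge = EdgeNode()
for frame in range(120):        # simulate ~2 seconds at 60 fps
    edge.handle_input(0.01, 0.0, 0.0)
    if frame % 30 == 0:         # backhaul only twice per second
        edge.backhaul(central_server_state)

print("edge sees:", edge.world_state["player_pos"])
print("core sees:", central_server_state["player_pos"])
```

The edge node is always slightly ahead of the central server, and reconciling those views across many edges is exactly the kind of infrastructure problem that doesn’t yet exist at metaverse scale.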
The problems of perceptual inconsistency and network infrastructure can be solved, but doing so will take many years of work and large sums of money. Meta’s Reality Labs has lost more than $20 billion over the past three years, and that may be just the tip of the iceberg.