After thinking about the many technical obstacles facing the creation of an ideal Metaverse, longtime Metaverse developer Lex4art argues that cloud streaming is the best solution we have at our disposal now. Here's his case:
Accessibility & Pricing Flexibility
Anyone can join; hardware requirements are super low (a fast enough Internet connection + some kind of display + some kind of input to move around and interact with the metaverse). There's also flexibility in the monthly fee: the lower the requested rendering quality and the lower the video stream bitrate (less traffic), the less it can cost the [end user]. Maybe even free connections will be possible at the lowest quality. So, if you need quality, pay more. If you don't, pick a less fancy tier that still gets the job done.
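The tiered-pricing idea above can be sketched in a few lines. This is purely illustrative: the tier names, resolutions, bitrates, and prices below are invented for the example, not from any real service.

```python
# Hypothetical quality tiers for a cloud-streamed metaverse session.
# All resolutions, bitrates, and prices here are invented for illustration.
TIERS = {
    "free":     {"resolution": "720p",  "bitrate_mbps": 5,  "usd_month": 0.00},
    "standard": {"resolution": "1080p", "bitrate_mbps": 15, "usd_month": 9.99},
    "premium":  {"resolution": "4K",    "bitrate_mbps": 40, "usd_month": 24.99},
}

def monthly_fee(tier: str) -> float:
    """Lower rendering quality and bitrate -> lower (possibly zero) fee."""
    return TIERS[tier]["usd_month"]

print(monthly_fee("free"))     # lowest quality could be free
print(monthly_fee("premium"))  # pay more if you need quality
```

The point is simply that price scales with the bitrate and rendering load each client asks the cloud to carry.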
Three more reasons from Mr. Art:
Better Latency Management/Optimization
Metaverse cloud architecture allows another plot twist: it lets us split player activities between two different types of servers inside the cloud. One type processes fast-paced (low-latency-demanding) activity near the client, while everything else goes to slow (high-latency-tolerant) servers that can sit far from the client's location. So, when the client tries to move or interact with an in-world UI (recall Google's <150ms rule for non-annoying interfaces), the client sends movement/click data to the nearest cloud, which verifies it, performs the player movement or UI interaction, and returns updated images in under 50ms. But when the client performs a money transaction, it's handled on different servers inside the cloud, probably located in the operator's home country, with secure laws and a stable political system that respects people and the law.
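The fast/slow split described above can be sketched as a simple routing rule. Everything here is an assumption for illustration: the action categories, latency budgets, and pool names are hypothetical, not any real deployment's protocol.

```python
# Illustrative sketch: route each player action to a server pool based on
# its latency tolerance. Thresholds and pool names are assumptions.
FAST_BUDGET_MS = 50    # movement / UI round-trip target near the client
SLOW_BUDGET_MS = 2000  # transactions can tolerate seconds of latency

FAST_ACTIONS = {"move", "ui_click", "look"}            # latency-critical
SLOW_ACTIONS = {"payment", "inventory_sync", "asset_upload"}  # tolerant

def route_action(action: str) -> str:
    """Pick a server pool for an action by its latency tolerance."""
    if action in FAST_ACTIONS:
        # Handled by the edge cluster nearest the client (<50 ms RTT).
        return "edge-cluster"
    if action in SLOW_ACTIONS:
        # Handled by distant, secure servers in the operator's home region.
        return "secure-home-region"
    raise ValueError(f"unknown action: {action}")

print(route_action("move"))     # edge-cluster
print(route_action("payment"))  # secure-home-region
```

The design choice is that only the interactions a human will *feel* (movement, clicks) need edge proximity; everything else can trade latency for security and jurisdiction.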
Flexibility With Computing Costs
Rendering architecture in the cloud opens up interesting and unique possibilities to seize! For example, why not create one huge set of clusters that computes only the lighting for the whole metaverse, then updates it every second or so? (There's flexibility here too: if changes are too drastic, the lighting update may take an additional second or two, but that's still good enough. A smart compromise: if one server cluster crashes, others can take over with a few seconds' delay.) This giant "photon cache" representation of the whole metaverse can simply be buffered and requested in tiny pieces (matching each client's current location in the metaverse) by client clusters spread around the world (placed close to clients for low latency, with very wide bandwidth connections back to the rendering clusters). There are a lot of problems to solve here, but maybe something like that will be an adequate solution to try.
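A minimal sketch of that "photon cache" idea, assuming a hypothetical tile-based layout: one central cluster pushes fresh lighting roughly once per second, and edge clusters pull only the tile covering each client. Class and method names are invented for the example.

```python
# Sketch of a shared lighting cache: the central cluster computes global
# lighting once per interval; edge clusters fetch only per-region tiles.
# All names here are hypothetical, not a real renderer's API.
import time

class PhotonCache:
    def __init__(self, refresh_interval_s: float = 1.0):
        self.refresh_interval_s = refresh_interval_s
        self.tiles = {}          # (region_x, region_y) -> lighting data
        self.last_update = 0.0

    def update_lighting(self, computed_tiles: dict) -> None:
        """Called by the central lighting cluster every ~1 second."""
        self.tiles = computed_tiles
        self.last_update = time.time()

    def tile_for_client(self, region: tuple) -> bytes:
        """Edge clusters request only the tiny tile covering the client."""
        # Missing or stale tiles fall back to cheap ambient lighting.
        return self.tiles.get(region, b"fallback-ambient")

cache = PhotonCache()
cache.update_lighting({(0, 0): b"lightmap-00", (0, 1): b"lightmap-01"})
print(cache.tile_for_client((0, 0)))  # fresh tile for this region
print(cache.tile_for_client((9, 9)))  # uncomputed region -> fallback
```

The fallback path is what makes the "smart compromise" work: a crashed lighting cluster degrades quality for a second or two instead of taking the world down.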
Management of User-Generated Content
Custom content created by end users is uploaded to the cloud, split between fast and slow servers depending on content type, and used on demand like any other part of the virtual world. Moving the needed data between clusters in the cloud is super fast thanks to backbone-grade connections between servers (terabytes per second and more), so there will be almost no wait for locations to load: all the data is transferred between servers, cached, and available within a fraction of a second.
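The content-type split above could look something like this sketch. The categories and tier names are assumptions made up for illustration.

```python
# Illustrative sketch: place user-generated uploads on a fast or slow
# server tier by content type. Categories and tier names are invented.
FAST_TIER_TYPES = {"avatar_mesh", "animation"}   # needed at interaction speed
SLOW_TIER_TYPES = {"texture_pack", "audio", "scripted_object"}

def place_upload(content_type: str) -> str:
    """Return the storage tier for an uploaded content type."""
    if content_type in FAST_TIER_TYPES:
        return "fast-tier"
    # Default: latency-tolerant storage; backbone links make later
    # transfers between clusters nearly instant anyway.
    return "slow-tier"

print(place_upload("avatar_mesh"))   # fast-tier
print(place_upload("texture_pack"))  # slow-tier
```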
"So, in a nutshell," Lex concludes, "a 'metaverse 2.0' may be a purely cloud-based thing on current gen tech."
This all sounds roughly right to me, though I'd add that there should always be an option to access the Metaverse via videogame consoles and other devices with high-end graphics power. I'll also add that cloud rendering may always be "good enough" but never a perfect experience. Streaming to end users over wireless or 4G/5G will always run up against some inevitable latency -- hence the slight "stickiness" you usually feel when you type, move, or otherwise send input back into a cloud-streamed Metaverse. (Matthew Ball goes into more detail in part 3 of his primer.)
Thanks for covering this. While cloud gaming services have been available for a decade now, there are some key differences between "let's push a standard PC game to the cloud" and a purely cloud architecture for a virtual world built from scratch for that purpose. If it's possible to perform compute-intensive tasks (like lighting) only once on one cluster and then share the results with client units, that means minimal energy expense, and thus a very cost-effective solution compared to modern cloud-based gaming, which burns 300W+ of GPU & CPU power per playing client (and 80W+ per idle client).
P.S. If possible, please fix my mistake in this text (it's about how to solve that "sluggish appearance" of a cloud-based metaverse 2.0):
one that will process fast-paced (low latency demanding) activity and everything else is put on slow (high-latency tolerant) servers far from the client location
One interesting thing on top of all that: for VR headsets we need very low latency when tracking head direction and movement, but maybe it's possible to send a 360° video stream to the client and thus have at least head rotation performed fully on the client device. Creating a 360° rendering is ~6 times more expensive than a standard viewport, and the bandwidth requirements are also about 6 times higher, but in the end it comes down to cost and a fast connection, so VR users would just need to pay more for that feature.
Posted by: Lex4art | Tuesday, August 10, 2021 at 05:59 AM
Thanks, Lex! Updated the post.
Posted by: Wagner James Au | Tuesday, August 10, 2021 at 12:40 PM
I can't help seeing a cloud rendering solution for Second Life as a cop-out to avoid doing the actual work of making it run natively on mobile or in a browser, which is what people *actually* want.
Posted by: Adeon Writer | Tuesday, August 10, 2021 at 01:17 PM
A cloud-rendered Second Life option is nice for a very small subset of existing users, but it's not going to attract anyone new, not when a subscription plan is needed to use it.
Posted by: Adeon Writer | Wednesday, August 11, 2021 at 09:13 AM