
Tuesday, July 27, 2021

Comments


Lex4art

Thanks for highlighting that series of metaverse articles - I listened through the first three chapters and it feels quite solid, plus it gives some fresh thoughts on the latency limitation problem.

Epredator

This speed of light latency problem has me pondering quantum entanglement and instant changes at infinite distances. Also, just as we now have AI upscaling images or choosing what it thinks it needs to render, some element of forward prediction helps with latency mitigation. Kind of like games already do: if packets drop, they predict your next position (not always accurately, but it's a start).
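The forward prediction Epredator describes - games extrapolating a player's next position when packets drop - is often called dead reckoning. A minimal sketch in Python (the names here are illustrative, not from any particular engine):

```python
from dataclasses import dataclass

@dataclass
class PlayerState:
    x: float          # last known position along one axis (metres)
    vx: float         # last known velocity (metres/second)
    timestamp: float  # when this state was received (seconds)

def predict_position(last: PlayerState, now: float) -> float:
    """Extrapolate forward from the last authoritative update."""
    dt = now - last.timestamp
    return last.x + last.vx * dt

# Last packet said the player was at x=10.0 moving at 5.0 m/s;
# 0.1 s later, with no new packet, we guess where they are now.
last_known = PlayerState(x=10.0, vx=5.0, timestamp=0.0)
print(predict_position(last_known, now=0.1))  # 10.5
```

As Epredator notes, the guess gets corrected (sometimes visibly, as "rubber-banding") when the next real update arrives.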

Kyz

Spot on with latency. On human expression and lip sync, though, I have to wonder if anyone has done an experiment on what people actually view in a virtual world - a percentage breakdown of where that "camera" time is spent.

Is it scenery, close-ups on faces, whatever they happen to be interacting with? That would be an important first step in prioritizing what's visually important to the end user. If people do indeed focus on an avatar's face while it's speaking, then sure, expressions can be important. That may be especially true for meetings, or romance.

But as Wagner says, it doesn't seem to matter in existing successful worlds.

Wagner James Au

"This speed of light latency problem has me pondering quantum entanglement"

Holy cow, does this mean we can't get an ideal metaverse until we figure out APPLIED QUANTUM ENTANGLEMENT technology?!

Lex4art

>>Holy cow, does this mean we can't get an ideal metaverse until we figure out APPLIED QUANTUM ENTANGLEMENT technology?!

Yep.

Lex4art

But anyway, latency is only the tip of the iceberg of problems to solve. I've finished listening through the whole "Metaverse Primer" series (the last three chapters were quite long and watery for my taste, with no clear picture). IMHO, some significant tech improvements are needed even for a far-from-ideal, compromise-laden, single-country-scale virtual world rather than a true metaverse. For example, we'd need to start mass-producing "vacuum fibers" to get a full-light-speed medium that allows 40-60ms latency coverage at least across the US (with servers located near its geographic center). Math is also a problem: even very coarse, real-world-like physics for clothing on every character, fleshy soft bodies, and decent destruction is beyond what math-based processors can do. So we'd need something I'd call "context-based processors": just as current CPUs run an x86-64/ARM/etc. instruction set to process a mathematical context, these would run a special, non-math context instruction set. But there aren't many breakthroughs on that horizon - dedicated neural-network chips are still math-based monstrosities, only pared down to the core range of NN math operations...

Meh.
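Lex4art's 40-60ms budget can be sanity-checked with a back-of-envelope propagation calculation. Light in ordinary glass fiber travels at roughly two-thirds of c, which is the motivation for hollow-core "vacuum" fiber. The distance and the fiber factor below are rough assumptions for illustration, and real round trips add routing and queueing delay on top of pure propagation:

```python
C_VACUUM_KM_S = 299_792      # speed of light in vacuum, km/s
FIBER_FACTOR = 0.68          # light in glass fiber travels at roughly 68% of c
US_HALF_WIDTH_KM = 2_300     # rough distance, central-US server to a coast

def round_trip_ms(distance_km: float, medium_factor: float) -> float:
    """Round-trip propagation time through a medium at medium_factor * c."""
    one_way_s = distance_km / (C_VACUUM_KM_S * medium_factor)
    return 2 * one_way_s * 1000

print(f"glass fiber:    {round_trip_ms(US_HALF_WIDTH_KM, FIBER_FACTOR):.1f} ms")
print(f"'vacuum' fiber: {round_trip_ms(US_HALF_WIDTH_KM, 1.0):.1f} ms")
```

Under these assumptions the raw propagation round trip is roughly 23 ms in glass fiber and 15 ms at full c - comfortably inside a 40-60ms budget, which shows why the real obstacle is everything the network adds on top of the physics.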

Lex4art

But something interesting can appear even with current-gen tech. Some types of virtual fun (like slowly building stuff in Minecraft) don't need low latency; just hide the delays from the user in a smart enough way when he interacts with the world, and this already works well. A world-scale virtual world also may not be worth it, simply because language and culture barriers make person-to-person interaction less interesting and quite clumsy. And if some super-cool virtual art gets created in one distant country, maybe it will be enough to simply copy it to data centers in all the other countries, so at least the art can be shared with good latency and content download speed... we'll see ).

Lex4art

Oh, and how could I forget the "cherry on top of the metaverse cake": the networking model for this kind of project is server-side-does-most-of-the-stuff, so we can have secure payments and content distribution, no cheaters, and no trespassing in VIP/personal zones. This is how Second Life, Sinespace, and World of Tanks are built - but it also means 2x latency: you hit a movement key -> this goes to the server, which calculates the movement amount and checks permissions -> the result returns to you, and the character is animated locally using the received data. So only a very limited set of active metaverse activities work over that kind of connection, but it's the only way to do things securely.
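The server-authoritative round trip Lex4art describes can be sketched in a few lines. This is a toy illustration of the pattern, not how Second Life or World of Tanks actually implement it; the point is that the client only *requests* a move, and the server validates and applies it before the client animates anything:

```python
def server_handle_move(position: float, requested_step: float,
                       max_step: float = 1.0) -> float:
    """Server clamps the requested move to what the rules allow and
    returns the authoritative position; the client never writes it."""
    step = max(-max_step, min(max_step, requested_step))
    return position + step

# Client presses "forward", asking for a 5.0 step; the server allows
# at most 1.0 per tick, so a speed-hacked request is silently clamped.
authoritative = server_handle_move(position=0.0, requested_step=5.0)
print(authoritative)  # 1.0 - the client animates to this after the round trip
```

Every keystroke paying the full client-to-server-and-back delay is exactly the "2x latency" cost the comment describes, and it's the price of keeping cheaters out.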

