I recently wrote a smartass post about how High Fidelity supports motion capture of real life finger movements to an avatar's fingers -- now watch how impressive this really is:
Video (and hands!) via High Fidelity's Chris Collins. So yes, this means if you flip the bird at your computer in real life, your avatar can instantly flip the bird in the metaverse. But the use cases go well beyond that: "The biggest application is enhancing the immersion," Chris tells me. "When you talk, you move your arms and your hands." And yes, this means you and your avatar could, say, communicate in sign language with another avatar and her user on the other side of the world, he says.
High Fidelity founder Philip Rosedale tells me they've done tests of this tech, and the latency is impressively low:
"About 150 milliseconds from hand motion to the other person seeing it," says Philip. If you remember his demo from 2014, Philip believes 100 milliseconds is the "magic place" for avatar-to-avatar chat, so an additional 50 milliseconds isn't that much more to wait.
Even an old cynic like me has to admit that's pretty cool.
Posted by: Issa Heckroth | Wednesday, March 02, 2016 at 03:04 PM
This is just great, but I hope it translates into inworld object handling accordingly, because otherwise it's too much of an RL gear setup just to be able to speak and make gestures.
Posted by: Carlos Loff | Thursday, March 03, 2016 at 04:44 AM
This is exciting. It's too bad the juiced-up computer / graphics card I bought 6 years ago to run Philip's last venture probably isn't good enough, and is the reason I can't get HiFi to run. I'm not buying another computer to run HiFi, sorry.
Posted by: metacam oh | Thursday, March 03, 2016 at 05:19 PM