High Fidelity Achieves Fully-Articulated, Real-Time, Motion-Captured Avatar Finger Movement, Leads to the Inevitable
Ladies and gentlemen, High Fidelity has achieved the ultimate in customized avatar expression:
"This was a demonstration of using Leap Motion's new 'Orion' update with High Fidelity," company founder Philip Rosedale tells me. So now Alpha users with Leap's motion capture camera and a High Fidelity script can translate their real world finger movements to the fingers of their avatar.
Juvenile bird-flashing aside, this is an extremely impressive, well, leap for Leap:
"Leap is now tracking much better, so it may be a good choice for hand controllers," Philip adds, "particularly for those wanting finger movements - for which only Leap and Perception Neuron offer a solution."
I'm really curious to see just how precise this finger articulation actually is in practical terms -- for instance, could someone communicate in real time in American Sign Language through their avatar?* In any case, one small birdflip for an avatar, one giant Leap for avatar-kind. (See what I did there?)
*UPDATE, 2:15pm: High Fidelity's Chris Collins says yes, indeed, your avatar could do American Sign Language through this system.
Please share this post: