2/2: UPDATE on this post here with info on doing this for yourself.
It was inevitable, and it's happened. This is Second Life connected to Kinect, with basic controls such as avatar gestures and camera movement controlled by physical gestures:
This hack is from USC's Institute for Creative Technologies, which created a similar innovation linking Kinect to World of Warcraft. With this new software, you can pantomime a wave, bow, hug, or flying motion, which your avatar then copies in Second Life. I also admire the camera control, which is triggered with Minority Report-like sizing gestures. As you might imagine, I'll be writing more about this as soon as I can.
Once we had FAAST, allowing mapping of Kinect gestures to keyboard keystrokes, this was always going to happen quickly. Great to see it though :-) Next step is the continuous fluid copying of your body movement to the SL avatar, which would be very cool, although I can't yet imagine how you would then control other features like the camera and the rest of the UI!
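The FAAST approach NeilC describes boils down to a lookup from recognized body poses to simulated keystrokes. Here's a minimal sketch of that idea; the gesture names and key bindings are illustrative assumptions, not FAAST's actual configuration syntax, and a real tool would inject OS-level key events rather than just return them:

```python
# Illustrative gesture-to-keystroke mapping, FAAST-style.
# Gesture names and bindings below are made up for the example.

GESTURE_BINDINGS = {
    "lean_forward": "w",    # walk forward
    "lean_back": "s",       # walk backward
    "left_arm_out": "a",    # turn left
    "right_arm_out": "d",   # turn right
    "arms_raised": "e",     # fly up
}

def keystrokes_for(gestures):
    """Translate recognized gestures into keys to press.

    Unrecognized gestures are silently ignored, as a binding
    table would do for poses it has no mapping for.
    """
    return [GESTURE_BINDINGS[g] for g in gestures if g in GESTURE_BINDINGS]

if __name__ == "__main__":
    # A frame where the user leans forward and raises both arms:
    print(keystrokes_for(["lean_forward", "wave", "arms_raised"]))
```

Because the output is ordinary keystrokes, the viewer needs no modification at all, which is why this kind of hack appeared so quickly.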
Posted by: NeilC | Thursday, January 20, 2011 at 01:04 AM
On reflection, as far as I understand it, the movements of other avatars, when viewed in my AV's client, are all animations that have to be downloaded like sounds and textures... so doesn't this mean that the existing architecture of SL would never allow the kind of fluid, complete motion copying that Kinect could provide? You'd be able to see the copying in your own viewer, but no one else would see it? Or is there a way for the viewer to send actual bone positions through to the server and have those sent to all clients? I suspect not :-(
Posted by: NeilC | Thursday, January 20, 2011 at 01:13 AM
Neat. But I wouldn't want to build a tulip using the gestures and movements of a real-world carpenter hehehe. *looks at his small thin arms* Hmmm, maybe I should.
-RODION
Posted by: Rodion Resistance | Thursday, January 20, 2011 at 02:08 AM
I definitely like to see Kinect being put to good use :) Good find, Hamlet, this is really a breakthrough!
Posted by: Gwyneth Llewelyn | Thursday, January 20, 2011 at 04:28 AM
I decided I didn't know enough about avatars and puppeteering so I read up and wrote up :-)
Posted by: NeilC | Thursday, January 20, 2011 at 05:12 AM
This is splendid news! Congrats USC ICT.
Wish it didn't require pre-animated gestures but it's a huge step for virtual worlds.
Posted by: Bettina Tizzy | Thursday, January 20, 2011 at 06:39 AM
Second Life has the capability to support realtime avatar animation (as opposed to animation triggers) via motion controller.
The open question is whether Linden Lab still has the talent to dust off and complete the puppeteering system and is willing to do so.
Most of the killer app potential I see in motion control depends on a robust physics system, extensive precaching, and optimized script handling.
I'm convinced that it can be done, but I'm not convinced that LL can do it.
Posted by: Arcadia Codesmith | Thursday, January 20, 2011 at 07:23 AM
@Arcadia, right on target. LL, get with it.
Posted by: Bettina Tizzy | Thursday, January 20, 2011 at 09:02 AM
Can someone clarify this for me please. Are the basic gestures like waving, bowing and so on baked into the client such that they do not need to be downloaded and therefore will be seen reasonably quickly by remote avatars?
Posted by: James Corbett | Thursday, January 20, 2011 at 09:54 AM
@James Here's what we could have in Second Life if LL completed avatar puppeteering: Realtime avatar animation, which everyone could see. We were so close to it once upon a time. Sigh. http://npirl.blogspot.com/2008/03/avatar-puppeteering-for-second-life-in.html
Posted by: Bettina Tizzy | Thursday, January 20, 2011 at 10:30 AM
Well for one thing, it should burn more calories for many Second Life compulsives whose only exercise comes from opening the front door for the pizza delivery.
My overtaxed desk chair just said "please dear God get this for him now".
Posted by: Eddi Haskell | Thursday, January 20, 2011 at 10:47 AM
Looks like my human will really need to learn how to dance. Looks like a good way to get my butt off the couch, maybe lose more weight.
Posted by: Cisop Sixpence | Thursday, January 20, 2011 at 11:44 AM
this is a fun thing! i love the hugging at 1:30 (me be a sap)
i think i'd get tired arms pretty fast! but i would love to see how building would be affected by this - it would be fun to build using gesturing =)
Posted by: Ener Hax | Saturday, January 22, 2011 at 09:41 AM