Kinect Integrated Into Snowstorm Second Life Viewer by Merov Linden (Who Also Worked on a Kinect Predecessor)!
As it happens, this is actually Kinect returning to its roots, so to speak. Back in 2008, Bossut (Merov's real-world surname) was working with Linden board member Mitch Kapor to create a hands-free 3D camera interface for Second Life. "It compares pretty well since it's the same code really," as Bossut explains to me. "The Kinect hardware is provided by Primesense, and Mitch and me did work with a Primesense camera at the time." As Kapor told me then, this was part of his plan to help make Second Life mass market, but the project was discontinued due to the difficulty of mass producing the 3D camera hardware. With Kinect very much mass market, however, it may be that Microsoft has solved that problem for them. And us.
More from Bossut after the break, on making this code open source, and the fascinating challenges of integrating it into everyday Second Life use:
"We don't have a clear plan right now as to when we'll put that code out in an open source repo on bitbucket but we will eventually," Merov tells me. He's not sure if this Kinect code will ever be part of the official Second Life viewer: "For the moment, it's at the level of experimental hacks as in, 'can we do something fun and usable with that thing in SL?'" As for the actual code, "Most of my work for the demo you see today was to get the libfreenect (openkinect) library, derive my camera class to support that new camera, implement a simple tracking system on the depth field, and port my viewer 1.18 code to the 2.6 Snowstorm viewer code base."
In Merov's demo, simple avatar actions are triggered by specific physical pantomimes, so the avatar isn't really replicating the user's movements in real time. I asked him how difficult it would be to directly control a Second Life avatar through Kinect's motion capture camera.
"That's a whole different level of complexity as you can imagine. As you know, we had code for puppeteering client side a few years ago, but we never implemented the protocol to broadcast live joint positions from clients to other clients through the server.
"Beyond that, I am not personally convinced it'll provide the best animation experience. One can fall into the Uncanny Valley in no time with something like this. I'd rather first enrich the number of available animations and use a gesture-triggering system as we have today, except it'll be triggered by camera capture, and we'll be transparent to a user who so wishes to animate his/her avatar. You could dance and see you avatar dance without it dancing awkwardly. In my opinion, that will give a much more natural and pleasing experience."
And now with Microsoft releasing its own SDK for Kinect, we're likely to see many more SL viewer developers experimenting in Merov's motion-tracked footsteps.
Hat tip: Opensource Obscure.