A Superman-style pantomime commands your avatar to fly via Kinect
The software to connect your Kinect to Second Life, which we first saw demonstrated last month, is available to all who want to try doing it themselves: Click here to download the .zip file.
Developed by Thai Phan (Second Life name: Ty Gould), an engineer at the MxR Lab of USC's Institute for Creative Technologies, the software, he tells me, "contains the executable I made for my demonstration. It contains instructions on how to set up Kinect with your PC. Someone on YouTube got it working, so I'm glad it works." This is how it works: "The software I wrote constantly checks where your hands, elbows, and shoulders are at all times," Thai tells me. "When it detects you are doing a certain gesture, it emulates keyboard shortcuts that are sent to the Second Life viewer. The keyboard commands are picked up by an animation override that must be attached to your avatar. Based on what keystrokes are pressed, the animation override will trigger the corresponding animation."
As he demonstrates in the video (viewable below), "avatar control using gesture is not one-to-one." Before your avatar can perform the action you're making, "my software has to first detect the gesture. Afterwards, keyboard commands are sent to Second Life over the Internet. Finally, your avatar will perform the appropriate animation."
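The pipeline Thai describes — read joint positions, classify a pose, emit a keyboard shortcut for the animation override to pick up — can be sketched in a few lines. Everything below is illustrative rather than taken from his code: the joint names, the poses, and the key bindings are invented, and "sending" a key is just returned as a string so the logic runs without a Kinect attached.

```python
# Sketch of the gesture -> keystroke pipeline the article describes.
# Joint positions would come from the Kinect via OpenNI; here they
# are plain (x, y) tuples with y growing upward.

def classify_pose(left_hand, right_hand, left_shoulder, right_shoulder):
    """Classify a static pose from 2D joint positions."""
    both_raised = (left_hand[1] > left_shoulder[1] and
                   right_hand[1] > right_shoulder[1])
    if both_raised:
        return "arms_up"          # the Superman-style flying pose
    if right_hand[1] > right_shoulder[1]:
        return "right_arm_up"
    return "neutral"

# Hypothetical pose -> viewer shortcut mapping; the real bindings
# depend on the animation override attached to the avatar.
KEY_FOR_POSE = {"arms_up": "F9", "right_arm_up": "F10"}

def keystroke_for_frame(joints):
    """Return the shortcut to emulate for one skeleton frame, or None."""
    pose = classify_pose(joints["l_hand"], joints["r_hand"],
                         joints["l_shoulder"], joints["r_shoulder"])
    return KEY_FOR_POSE.get(pose)   # None means: send nothing

# One frame with both hands raised above the shoulders.
frame = {"l_hand": (-0.4, 1.8), "r_hand": (0.4, 1.8),
         "l_shoulder": (-0.2, 1.4), "r_shoulder": (0.2, 1.4)}
print(keystroke_for_frame(frame))   # -> F9
```

The indirection Thai mentions is visible here: the sketch never moves the avatar itself, it only decides which keystroke to emulate, and the animation override inside Second Life does the rest.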
However, Thai's software is just the first step toward more intuitive person-to-avatar interaction, and other developers can improve on what he has built. The development suggests a future of virtual world interaction beyond the emoticons and rudimentary avatar gestures we're accustomed to now.
"[F]or now," says Thai, "developers can use OpenNI to create their own gesture recognition algorithms. It will take more ingenuity to recognize more complex movements associated with dance, sports, and other forms of performance art. In addition, the corresponding animation that is triggered must be recorded and uploaded to Second Life beforehand." And here's my open call to see different variations of Kinect-to-SL interaction, experimenting with the most natural body-to-avatar UI.
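Recognizing the "more complex movements" Thai mentions means looking at joints over time rather than in a single frame. Here's a minimal sketch of that idea — a hand-wave detector that counts how often the hand crosses the elbow within a sliding window. The window size, crossing count, and coordinates are all invented for illustration; a real recognizer built on OpenNI would work from its skeleton callbacks.

```python
from collections import deque

class WaveDetector:
    """Detect a wave: the hand's x position crossing the elbow's x
    position back and forth several times within a short window of
    frames. All thresholds here are illustrative."""

    def __init__(self, window=30, min_crossings=4):
        # +1 when the hand is right of the elbow, -1 when left of it.
        self.signs = deque(maxlen=window)
        self.min_crossings = min_crossings

    def feed(self, hand_x, elbow_x):
        """Feed one frame; return True once a wave is recognized."""
        self.signs.append(1 if hand_x > elbow_x else -1)
        history = list(self.signs)
        crossings = sum(1 for a, b in zip(history, history[1:]) if a != b)
        return crossings >= self.min_crossings

detector = WaveDetector()
fired = False
# Simulated frames: the hand swings left and right of the elbow.
for i in range(30):
    hand_x = 0.3 if (i // 3) % 2 == 0 else -0.3
    fired = detector.feed(hand_x, elbow_x=0.0) or fired
print(fired)   # -> True
```

Dance or sports moves would need far richer models than sign-crossings, which is Thai's point about ingenuity — and, as he notes, even a perfect recognizer only triggers an animation that was recorded and uploaded to Second Life beforehand.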
Thai Phan does foresee a way to make movement truly flow from Kinect to SL: "For control to be truly one-to-one, the positions and orientations of your joints must be mapped directly to the joints of your avatar, which requires a constant flow of data between your computer and Second Life. Many people have suggested that the Second Life puppeteering project be revived for this purpose." Created by Jeffrey Ventrella when he was at Linden Lab, puppeteering was discontinued a few years ago.
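The "constant flow of data" a one-to-one mapping needs amounts to streaming every tracked joint's transform each frame, which is roughly what puppeteering would have to consume. A sketch of what one such frame message might look like — the joint names and the JSON format are invented for illustration, not any actual Second Life or puppeteering protocol:

```python
import json
import math

# Pack each tracked joint's position and orientation into one
# message per frame, the kind of stream a revived puppeteering
# feature would have to ingest continuously.

def pack_frame(frame_id, transforms):
    """transforms: joint name -> (x, y, z, yaw_in_radians)."""
    return json.dumps({
        "frame": frame_id,
        "joints": {name: {"pos": list(t[:3]), "yaw": round(t[3], 4)}
                   for name, t in transforms.items()},
    })

msg = pack_frame(1, {
    "shoulder_l": (-0.2, 1.4, 0.0, 0.0),
    "elbow_l":    (-0.3, 1.1, 0.1, math.pi / 2),
    "hand_l":     (-0.4, 0.9, 0.2, math.pi / 2),
})
print(len(json.loads(msg)["joints"]))   # -> 3
```

Even this toy message, sent at the Kinect's 30 frames per second for a full skeleton, makes clear why one-to-one control is a bandwidth and protocol problem rather than just a gesture-recognition one.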
At the moment, there are no plans to open source the code, since USC is still using it. However, Thai adds, "I do hope to make it available to the public some time in the future."
Sending a hug into Second Life with Kinect
A Second Life Resident since 2005, Thai Phan believes gesture-based interfaces like his open up virtual world interaction to people who aren't already accustomed to the methods that have been in use for over two decades now:
"People who chat online on a regular basis have adapted to using emoticons to express their emotions," as Thai puts it. "I believe for those people, they have become accustomed to translating their emotions into words and symbols."
Those people (i.e., us), however, are in a minority.
"For the large majority of people on the planet who do not normally play computer games, let alone interact in 3D social environments, they probably won't get the same feeling from simulating a social gesture with a keypress as they would performing the actual gesture. With today's technology thus far, we force ourselves to perform actions that only the computer can understand." The goal of researchers like him at ICT, he says, is "to force the computer to understand what we do. Keyboards and mice are still useful and efficient human interface devices, but in the foreseeable future, computers will adapt to what we do, rather than the other way around."
Want to help create that future? Get Thai Phan's software here, and share your results with us.
Sorry for the OT but I had to share this... for all of us with fond memories of the "Starry Night" SL installation... run, don't walk, to check the painting on the new Google Art Project. Incredibly high resolution, with every brushstroke and paint crack in full glory.
http://www.googleartproject.com/museums/moma/the-starry-night
Also a lot of other masterpieces on the main site.
Posted by: Nahasa Singh | Wednesday, February 02, 2011 at 04:30 AM
or you could just try this:
http://blog.esimplestudios.com/2011/01/unity3d-and-microsoft-kinect-hell-yeah/
Posted by: Komuso Tokugawa | Wednesday, February 02, 2011 at 04:46 AM
This is welcome news that could lead to greater adoption of virtual world environments. The learning curve in using these environments remains high for those new to this field. In addition, there is a growing demand from corporate and government sectors for this type of interface.
Posted by: Dave Levinson | Wednesday, February 02, 2011 at 06:42 AM
thanks for releasing this code, Thai! Nothing has gotten me more excited about getting a gaming system than the Kinect.
Of course this could lead to our SL dancing skills being greatly diminished if they map to our actual bodies. LOL.
Posted by: rikomatic | Wednesday, February 02, 2011 at 08:34 AM
Puppeteering.
Motion sensing.
3D monitor goggles.
Full-body tactile feedback.
Virtual smell/taste.
Omni-directional treadmill/flight harness.
Sheer nirvana.
It'd be nice if all of the above were mass-market before I'm too old to stand up. Add life extension and nano-rejuvenation to the wish list.
Posted by: Arcadia Codesmith | Wednesday, February 02, 2011 at 09:45 AM
Yeah. Like I see people doing stupid arm waving moves and silly walks, while trying to get email from SL while they run down the street holding an ipad and waving to the bus they just missed to stop..
Of course they cant own a car, since gas is 5.00 a gallon and they can only get a job for min wage, or linden values....
Like the Eyetoy, all of this stuff will be a fad and forgotten in a year.
How's that 3D movie biz and TV set sales doing?
Posted by: hank | Wednesday, February 02, 2011 at 10:02 AM
@Arcadia, I'll take the "life extension and nano-rejuvenation" and you can merge with the machines.
One lifetime is not enough to see all the wonders and terrors of Meatspace.
Posted by: Ignatius Onomatopoeia | Wednesday, February 02, 2011 at 05:29 PM
Iggy, I'm quite fond of meatspace, and especially the parts of it with little or no technology at all (though I confess a soft-spot for modern plumbing).
Still, there are times I just want to climb to the top of the highest mountain available, hurl myself from the peaks, and soar unfettered with the wind rushing over my skin.
Until we crack the unified field theory, that may be hard to accomplish IRL. Base jumping isn't quite the same thing.
Posted by: Arcadia Codesmith | Thursday, February 03, 2011 at 08:03 AM