A Superman-style pantomime commands your avatar to fly via Kinect
Developed by Thai Phan (Second Life name: Ty Gould), an engineer at the MxR Lab of USC's Institute for Creative Technologies, the software, he tells me, "contains the executable I made for my demonstration. It contains instructions on how to set up Kinect with your PC. Someone on YouTube got it working, so I'm glad it works." This is how: "The software I wrote constantly checks where your hands, elbows, and shoulders are at all times," Thai tells me. "When it detects you are doing a certain gesture, it emulates keyboard shortcuts that are sent to the Second Life viewer. The keyboard commands are picked up by an animation override that must be attached to your avatar. Based on what keystrokes are pressed, the animation override will trigger the corresponding animation."
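To make the idea concrete, here's a rough sketch of the kind of per-frame check Thai describes. This is not his actual code: the joint names, the "Superman" pose test, and the key binding are all my own assumptions for illustration.

```python
# Illustrative sketch of the gesture-to-keystroke idea: check joint
# positions each frame, and map a recognized pose to a keyboard
# shortcut for the animation override to pick up. Joint positions are
# (x, y, z) tuples as a Kinect skeleton tracker might report them.

def detect_fly_pose(joints):
    """Return True when both hands are raised above the shoulders
    with arms roughly extended -- a Superman-style takeoff pose."""
    for side in ("left", "right"):
        hand = joints[f"{side}_hand"]
        elbow = joints[f"{side}_elbow"]
        shoulder = joints[f"{side}_shoulder"]
        # Hand must be above the shoulder (y axis points up here).
        if hand[1] <= shoulder[1]:
            return False
        # Elbow between shoulder and hand => arm extended, not bent.
        if not (shoulder[1] < elbow[1] < hand[1]):
            return False
    return True

def frame_to_keystroke(joints):
    """Map a recognized pose to a keyboard shortcut the Second Life
    animation override could listen for (binding is hypothetical)."""
    if detect_fly_pose(joints):
        return "PgUp"  # e.g., the viewer's fly/jump-up key
    return None
```

In a real setup, a loop would run this against every skeleton frame and use an OS-level input library to emit the keystroke into the active viewer window.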
As he demonstrates in the video (viewable below), "avatar control using gesture is not one-to-one." Before your avatar can perform the action you're making, "my software has to first detect the gesture. Afterwards, keyboard commands are sent to Second Life over the Internet. Finally, your avatar will perform the appropriate animation."
However, Thai's software is just the first step toward more intuitive person-to-avatar interaction, and other developers can build on what he has done. The development suggests a future of virtual world interaction beyond the emoticons and rudimentary avatar gestures we're accustomed to now.
"[F]or now," says Thai, "developers can use OpenNI to create their own gesture recognition algorithms. It will take more ingenuity to recognize more complex movements associated with dance, sports, and other forms of performance art. In addition, the corresponding animation that is triggered must be recorded and uploaded to Second Life beforehand." So here's my open call: try your own variations of Kinect-to-SL interaction, and experiment toward the most natural body-to-avatar UI.
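One of the practical problems a homegrown gesture recognizer has to solve is jitter: raw skeleton data flickers, so a pose should only count as a gesture once it has been held for a little while. Here's a minimal dwell-style recognizer as one possible approach; the frame format and hold threshold are my assumptions, not OpenNI calls.

```python
# Sketch of a dwell-style recognizer: a pose only counts as a gesture
# once it has persisted for several consecutive frames, which filters
# out jitter in raw skeleton data and avoids re-firing every frame.

class DwellRecognizer:
    def __init__(self, pose_test, hold_frames=10):
        self.pose_test = pose_test      # function: joints -> bool
        self.hold_frames = hold_frames  # frames the pose must persist
        self._count = 0
        self._fired = False

    def update(self, joints):
        """Feed one frame of joint data; return True exactly once,
        when the pose has been held long enough."""
        if self.pose_test(joints):
            self._count += 1
            if self._count >= self.hold_frames and not self._fired:
                self._fired = True
                return True
        else:
            self._count = 0
            self._fired = False
        return False
```

Recognizing dance or sports moves, as Thai notes, needs far more than this: those are trajectories over time, not held poses, which is where the extra ingenuity comes in.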
Thai Phan does foresee a future way of making movement truly flow from Kinect to SL: "For control to be truly one-to-one, the positions and orientations of your joints must be mapped directly to the joints of your avatar, which requires a constant flow of data between your computer and Second Life. Many people have suggested that the Second Life puppeteering project be revived for this purpose." Created by Jeffrey Ventrella when he was at Linden Lab, the puppeteering project was discontinued a few years ago.
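The "constant flow of data" Thai mentions is easy to put a number on. As a back-of-the-envelope sketch (the joint list and packet layout are my assumptions, not the puppeteering project's protocol):

```python
# Rough sketch of what one-to-one puppeteering implies: every frame,
# every tracked joint's orientation is serialized and streamed to the
# simulator, rather than a one-off keystroke per gesture.

import struct

JOINTS = [
    "head", "neck", "torso",
    "left_shoulder", "left_elbow", "left_hand",
    "right_shoulder", "right_elbow", "right_hand",
    "left_hip", "left_knee", "left_foot",
    "right_hip", "right_knee", "right_foot",
]

def pack_frame(orientations):
    """Pack one frame of joint rotations (quaternions, 4 floats each)
    into bytes, as a puppeteering stream might send them upstream."""
    payload = b""
    for name in JOINTS:
        x, y, z, w = orientations[name]
        payload += struct.pack("<4f", x, y, z, w)
    return payload

def bandwidth_bytes_per_sec(fps=30):
    """Rough upstream cost of streaming all joints at a given rate."""
    frame_size = len(JOINTS) * struct.calcsize("<4f")  # 15 * 16 bytes
    return frame_size * fps
```

At 30 frames per second this comes to about 7 KB/s upstream before compression: modest in size, but it has to flow continuously, which is why it's a different engineering problem from emulating the occasional keystroke.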
At the moment, there are no plans to open source the code, since USC is still using it. However, Thai adds, "I do hope to make it available to the public some time in the future."
Sending a hug into Second Life with Kinect
A Second Life Resident since 2005, Thai Phan believes gesture-based interfaces like his open up virtual world interaction to people who aren't already accustomed to the methods that have been used for over two decades now:
"People who chat online on a regular basis have adapted to using emoticons to express their emotions," as Thai puts it. "I believe for those people, they have become accustomed to translating their emotions into words and symbols."
Those people (i.e., us), however, are in a minority.
"For the large majority of people on the planet who do not normally play computer games, let alone interact in 3D social environments, they probably won't get the same feeling from simulating a social gesture with a keypress as they would performing the actual gesture. With today's technology thus far, we force ourselves to perform actions that only the computer can understand." The goal of researchers like him at ICT, he says, is "to force the computer to understand what we do. Keyboards and mice are still useful and efficient human interface devices, but in the foreseeable future, computers will adapt to what we do, rather than the other way around."
Want to help create that future? Get Thai Phan's software here, and share your results with us.