The video below shows World of Warcraft played with physical gestures via Microsoft Kinect. It also shows, in my opinion, an innovation that might finally make 3D virtual worlds and MMOs a true mass market phenomenon:
Developed by USC's Institute for Creative Technologies, the WoW demo runs on FAAST (Flexible Action and Articulated Skeleton Toolkit), free software you can download and read more about here. (It'll soon be open sourced too.) "The toolkit incorporates a custom VRPN server to stream the user's skeleton over a network," the documentation explains, "allowing VR applications to read the skeletal joints as trackers using any VRPN client. FAAST can also emulate keyboard input triggered by body posture and specific gestures." In other words, World of Warcraft is just the beginning. (For the technically curious, a rough sketch of what that looks like on the client side is at the end of this post.) And it reminds me of something Mitch Kapor told me back in 2009, when I asked him if he still thought worlds like Second Life could ever be mass market:
"The clock has a long way to run before you can say 'Well, it didn't happen,'" Kapor told me. "I think it's all a question of what's the appropriate time frame." By his lights, real progress building the audience for virtual worlds required new technology. "The interaction metaphor is not going to be a mouse and keyboard." (Unsurprisingly, Kapor's venture firm is involved in the development of 3D web cameras that enable people to control their avatars with physical movement and facial expressions.)
An earlier version of that project eventually evolved into Kinect, and what you see above is the latest iteration. After just a few months on the market, 4 million Kinect units have been sold. So how soon do you suppose we'll see commercial versions of something like this played in WoW and other worlds by several hundred thousand people -- or still more?
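As promised above, here's roughly what "reading the skeletal joints as trackers using any VRPN client" looks like in practice. This is only a minimal sketch built on the standard VRPN client API, printing joint positions as they stream in; the device name "Tracker0@localhost" and the one-joint-per-sensor layout are my assumptions rather than anything from the FAAST documentation, so check the toolkit's own docs for the real details.

```cpp
// Minimal sketch of a VRPN client reading a streamed skeleton.
// Assumption: the FAAST server advertises a tracker named "Tracker0" on
// localhost, with each skeletal joint reported as a separate sensor index.
// Adjust the device name to whatever your FAAST installation actually exposes.
#include <cstdio>
#include <vrpn_Tracker.h>

// Called by VRPN whenever a new tracker report (one joint update) arrives.
void VRPN_CALLBACK handle_tracker(void* /*userData*/, const vrpn_TRACKERCB t)
{
    // t.sensor is the joint's sensor index; t.pos holds its position,
    // t.quat its orientation as a quaternion.
    std::printf("joint %d: pos = (%.3f, %.3f, %.3f)\n",
                static_cast<int>(t.sensor), t.pos[0], t.pos[1], t.pos[2]);
}

int main()
{
    // Hypothetical device name: "Tracker0@localhost" is an assumption.
    vrpn_Tracker_Remote tracker("Tracker0@localhost");
    tracker.register_change_handler(NULL, handle_tracker);

    // Pump the connection so the callback keeps firing.
    while (true) {
        tracker.mainloop();
        vrpn_SleepMsecs(1);
    }
    return 0;
}
```

Note that for an off-the-shelf game like WoW you don't even need this much: FAAST's keyboard emulation maps body postures and gestures to ordinary keypresses, so the game itself never knows a Kinect is involved.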
Hat tip: Grace McDunnough.
Building in SL (and similar grids) with Kinect will be awesome! Provided it has the extremely fine resolution required. Which I doubt. But I can hope.
Posted by: Ann Otoole InSL | Tuesday, December 28, 2010 at 08:31 PM
Yeah, Ann. I was thinking the same thing: how cool it would be to create using this, or to work with ZBrush, with two contact points implemented on a sculpt, actually working the mesh with your hands.
I've been geeking out over all these Kinect hacks over at engadget.com
Posted by: Seymore | Tuesday, December 28, 2010 at 08:48 PM
I got a PS3 Move controller for Christmas and haven't logged into Second Life since.
If SL supported motion controllers and had slicker physics, I might be tempted to scrape up some capital and build game fields.
Nothing will replace the keyboard for me - text has a quality of its own that cannot be replaced by voice. But that doesn't preclude stepping a few feet away from the desk now and then and getting a little exercise.
Posted by: Arcadia Codesmith | Wednesday, December 29, 2010 at 07:57 AM
OK, "a little exercise", but what about several hours of playing? Isn't it tiring? If I'm in the mood of playing, I play half a day, even a whole day. Now I imagine as I play with exercising my arms the whole time... :(((
Posted by: Flo2 | Wednesday, December 29, 2010 at 09:12 AM
I've been playing 4-6 hours a day. I'm a little stiff and sore, because I'm an old out-of-shape gamer with no business being a gladiatrix. But it's so much fun!
I'd LOVE to use something like this with PC MMOs -- not as the only interface, but as an option.
But for SL, I think we need a much more robust gaming platform to leverage the technology. There are compelling non-game applications (yes, I'd love to use this to sculpt and paint too!), but games are the killer app.
Posted by: Arcadia Codesmith | Wednesday, December 29, 2010 at 12:25 PM
It might be a while before SL gets to be like WorldBuilder. The Kinect interface might be a good start, however.
http://www.youtube.com/watch?v=VzFpg271sm8
Posted by: Little Lost Linden | Wednesday, December 29, 2010 at 06:03 PM