The future of virtual interaction is being shaped by Kinect and Sesame Street. Not sure that's the case? Watch this and decide:
What kind of virtual worlds and interactions will the kids who grow up on this expect when they become teens and adults? Something quite different than what we think of as virtual now, I'd say.
This is an awesome hack that links Kinect to Second Life in an incredibly innovative way: Kinect reads the head movement and facial expressions of someone in real life, and those movements are translated into data that dynamically alters a face sculpture in SL composed of thousands of cubes. It was created by an artist known in SL as Glyph Graves, who narrates the video above, featuring a friend who volunteered to demonstrate. This mixed-reality sculpture has already been avidly blogged by a number of SLers, starting with Chestnut Rau, but I begged Glyph to shoot this video so we could see what was happening on the other side.
“It seemed like an obvious thing to do,” Glyph tells me, when I ask what inspired this project. “[Kinect] does face depth. I thought I could do it in SL. So I did.” The reaction in-world when he’s shown this to avatars has been pretty stellar: “Shock, amazement, some disbelief. Part of it is the potential it suggests for SL.”
To make it possible, he had to create a fairly complex interaction between Kinect and SL. Read on:
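To give a rough sense of the shape of such a Kinect-to-SL link (this is my own minimal sketch, not Glyph's actual code), here's what one end of the pipeline could look like in Python. It assumes the libfreenect Python bindings for grabbing depth frames, plus an in-world LSL script that has published an HTTP-in URL via llRequestURL and repositions its cubes from the values it receives; the capability URL, the crop region, and the grid size are all placeholders.

```python
# Rough sketch of a Kinect-to-Second Life depth pipeline (not Glyph's actual code).
# Assumes the libfreenect Python bindings (the `freenect` module) and an in-world
# LSL script that has published an HTTP-in URL via llRequestURL and moves its
# cubes based on the depth values it receives.

import time
import freenect   # Python wrapper for libfreenect
import requests

SL_HTTP_IN_URL = "http://example-sim.lindenlab.com:12046/cap/placeholder"  # hypothetical URL

GRID = 16  # downsample the face region to a 16x16 grid of depth samples

def grab_face_depth():
    """Grab one Kinect depth frame and crop a rough face-sized region."""
    depth, _ = freenect.sync_get_depth()   # 480x640 array of raw depth values
    face = depth[120:360, 200:440]         # naive fixed crop around the head
    # Average blocks so the payload stays small enough for an LSL HTTP request.
    h, w = face.shape
    blocks = face.reshape(GRID, h // GRID, GRID, w // GRID).mean(axis=(1, 3))
    return blocks.astype(int)

while True:
    samples = grab_face_depth()
    # Flatten to a compact comma-separated string; the LSL script parses this
    # and offsets each cube by the matching depth value.
    payload = ",".join(str(v) for v in samples.ravel())
    requests.post(SL_HTTP_IN_URL, data=payload, timeout=2)
    time.sleep(0.2)  # LSL HTTP-in is throttled, so keep the update rate modest
```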
Pretty clever, because if Avatar Kinect is robust enough to convey humor through an avatar, it's probably as powerful in other communication contexts. Sucks for me: I don't own a 360. Anyone reading NWN care to give it a try, and share your reactions here? Now's a very good time, because in September, Avatar Kinect will be restricted to Xbox Live Gold subscribers.
Here's a look at Child of Eden, a music game for Xbox 360 with Kinect integration released this month, a brain-melting experience that puts your body into a virtual landscape of 3D graphics and sound:
The gameplay aims to produce synesthesia, where the senses of sight, sound, and touch overlap. My dear friend and game writer/developer Jane Pinckard has an essay in praise of Child of Eden on Kotaku, and describes how it felt to play for the very first time: "I was suddenly falling upward through a liquid field of stars. I don't really know how else to describe it. It was exhilarating, because for the first time in a very long time I felt again that excitement of experiencing something utterly new and strange and beautiful."
A port of Minecraft is being developed for Xbox 360, creator Markus "Notch" Persson recently told Reddit fans. "It won't be a straight port," he allows, and will be developed by a console team (with Notch as designer). Best of all, it'll run with Kinect. Specifically, in Persson's exact words, "WITH THE POWER OF KINECT YOU WILL FEEL CLOSER TO YOUR GAME THAN EVAR BEFORE AND ALSO IT SAVES KITTENS." So it's got that going for it too.
Today at E3, Microsoft announced the launch of Kinect Fun Labs, a kind of app store for Kinect programs and experiments created by developers and the Kinect hacking community, and despite my long-standing policy not to trust anything an unshaven guy wearing sunglasses indoors says, it looks pretty cool. Have a look:
I especially like the avatar conversion app. With the right support and promotion from Microsoft, this could become a great platform to grow the audience for Kinect innovations around avatar and 3D interaction beyond the hackers who enjoy them now.
Turn the volume up before watching this pretty impressive demonstration of flying in Second Life via Kinect:
I like how hands-up and palms-forward gestures control avatar flight and left/right panning -- to me, it metaphorically looks like pushing against the glass surface of virtual reality. (So to speak.) The project is by students in Ryerson University's Interactive Computer Applications and Design (ICAD) Group in Canada, which also developed an earlier Second Life-Kinect project. Remember, USC released code for connecting Kinect to Second Life, so you can develop your own variations. With Kinect already boasting a 10 million install base, making it the fastest-selling consumer device ever, I think it's time to see commercial, consumer-friendly applications of this technology.
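For a sense of the kind of logic a hack like this needs, here's a hedged Python sketch of a gesture-to-control mapping (my own guess at the approach, not the ICAD group's code). It assumes skeleton joints have already been extracted from the Kinect stream, say via OpenNI/NITE, as (x, y, z) positions in meters, with x to the user's right, y up, and z away from the sensor; the thresholds and command names are placeholders.

```python
# A guess at the gesture-to-control mapping such a hack needs (not the ICAD
# group's code). Joint positions are (x, y, z) tuples in meters, assumed to
# come from an external skeleton tracker.

def flight_controls(joints):
    """Map raised-hand gestures to avatar flight commands."""
    head = joints["head"]
    left, right = joints["left_hand"], joints["right_hand"]

    commands = []

    # Both hands above the head and pushed toward the sensor: climb.
    hands_up = left[1] > head[1] and right[1] > head[1]
    hands_forward = left[2] < head[2] - 0.3 and right[2] < head[2] - 0.3
    if hands_up and hands_forward:
        commands.append("FLY_UP")

    # One hand extended well to the side pans/turns the avatar.
    if right[0] - head[0] > 0.5:
        commands.append("TURN_RIGHT")
    elif head[0] - left[0] > 0.5:
        commands.append("TURN_LEFT")

    return commands

# Example frame: both hands raised and pushed forward.
frame = {
    "head":       (0.0, 1.6, 2.0),
    "left_hand":  (-0.2, 1.9, 1.5),
    "right_hand": (0.2, 1.9, 1.5),
}
print(flight_controls(frame))  # ['FLY_UP']
```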
[E]ventually we will have Kinect-like devices installed everywhere – in our homes, our business offices, even our cars. Public environments will be installed with the equivalent of the Vicon motion capture studio. Natural body language will be continually sucked into multiple ubiquitous computer input devices. They will watch our every move.
Using Unity3D, a studio called Esimple recently figured out how to create a Kinect hack that captures human movement and translates it in real time to an avatar. Watch, and be sure to stay around for the :45 mark, when we also see some (rudimentary) human-avatar interaction with dynamic 3D objects:
Pretty impressive. There have been a number of hacks connecting Second Life to Kinect, but none work like this -- instead, as with this USC version, a human gesture merely triggers a pre-existing avatar animation. The developer of the Linden Lab version that I blogged about yesterday argues the trigger approach is the better way to go. With dynamic one-to-one motion capture, Philippe Bossut argued, "[o]ne can fall into the Uncanny Valley in no time". More than that, however, there's the technical difficulty of even making this possible in Second Life; for starters, it would require reviving the avatar puppeteering project the company abandoned a couple of years ago. Meantime, this Unity3D version is already (relatively) operational.
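To make the distinction concrete, here's a toy Python sketch (not code from either project): the trigger approach only ever sends the name of a canned animation, while one-to-one capture has to stream every joint on every frame -- which is exactly the data the abandoned puppeteering feature would have had to accept. The gesture and animation names below are placeholders.

```python
# Toy contrast between the two approaches (a sketch, not code from either project).

GESTURE_TO_ANIMATION = {
    "wave":     "anim_wave_hello",   # placeholder animation names
    "clap":     "anim_applaud",
    "hands_up": "anim_cheer",
}

def trigger_approach(detected_gesture):
    """USC-style: a recognized gesture just picks a pre-existing animation."""
    return GESTURE_TO_ANIMATION.get(detected_gesture)  # one short message per gesture

def mocap_approach(skeleton_frame):
    """One-to-one capture: every joint, every frame, roughly 30 times a second."""
    # skeleton_frame: {joint_name: (x, y, z)} -- all of it has to reach the avatar
    return [(name, pos) for name, pos in skeleton_frame.items()]

print(trigger_approach("wave"))                  # 'anim_wave_hello'
print(len(mocap_approach({"head": (0, 1.6, 2),   # payload grows with every tracked joint
                          "left_hand": (-0.3, 1.0, 1.8)})))
```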
Another metaverse milestone: Kinect interactivity has been integrated into Snowstorm, Linden Lab's open source Second Life viewer, by engineer Philippe Bossut, known in SL as Merov Linden. Watch:
As it happens, this is actually Kinect returning to its roots, so to speak. Back in 2008, Bossut was working with Linden board member Mitch Kapor to create a hands-free 3D camera interface for Second Life. "It compares pretty well since it's the same code really," as Bossut explains to me. "The Kinect hardware is provided by Primesense, and Mitch and me did work with a Primesense camera at the time." As Kapor told me then, this was part of his plan to help make Second Life mass market, but the project was discontinued due to the difficulty of mass producing the 3D camera hardware. With Kinect very much mass market, however, it may be that Microsoft has solved that problem for them. And us.
More from Bossut after the break, on making this code open source, and the fascinating challenges of integrating it into everyday Second Life use: