Wednesday, January 25, 2012


Tokyo University Developing Real Time, Open Source Kinect-to-Second Life Software

SLKinect 2 is a pretty cool open source project being developed by folks at Network System Laboratory of Tokyo University to connect real world motion captured by a Kinect to avatar movement in Second Life and OpenSim. Watch:

Fumi Iseki, a developer with NSL, recently told me more about the project: "These softwares are open source, so anyone can download, read [the] code and use [the] software freely." The code has almost no licensing restrictions, and they've also created code for third-party viewers to use. "We hope that third party viewers include our patch."

This is part of a larger SL/OpenSim-oriented project by Network Systems Laboratory:

SL Kinect Open Source

"We are developing various tools for Second Life/OpenSim. Proxy Server for SL/OS, Money Server for OS and Web Interface for OS, etc. SLKinect is one of them." Click here for an English language site that explains much more.

Here's another demo, showing Kinect-based control of several avatars:

There's more tweaking to do, but ultimately, Fumi tells me, they want this software to enhance communication between avatars, to serve as a machinima creation tool, and finally, to power a dance game like this one:

Which seems so goddamn cool. So SL developers, click here to find out more.


Comments


bd

Wow I can't believe nobody else has commented. I for one am like holy cow ZOMG!

Patty

I agree with the comment above, that's really amazing! Looking forward to more such initiatives.

Alan

I love how pathetic he was at the dance game, and exhausted between rounds. It was funny and human. It showed how this technology would humanise avatars, and add a level of intimacy.

Fumi.Iseki

Thank you for your interest!!

Please use SLKinect.
We are developing the next version of SLKinect now. :-)

Thanks.

Maddy Gynoid

As I understand it, SLKinect works only locally or with a connection to a special animation relay server. If someone uses it in SL, the animations are only visible inside that user's viewer (or to a group that connects to the same animation relay server).

But altogether it's an exciting technology that is still in its early stages. It shows a possible way to expand SL, if Linden Lab is willing to have a look at this. Kinect for Windows goes on sale February 1st, and LL should not fail to jump on this train.
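To picture the relay setup Maddy describes, here's a rough, purely illustrative Python sketch of an animation relay: one client sends pose data, and the relay rebroadcasts it to every other connected viewer. This is not SLKinect's actual code or protocol; the port number and the line-based framing are invented for the example.

    # Illustrative only: a toy "animation relay" that rebroadcasts whatever
    # pose data one client sends to every other connected client.
    import asyncio

    clients = set()

    async def handle(reader, writer):
        clients.add(writer)
        try:
            while True:
                # Read one line of pose data (e.g. joint rotations) from a sender.
                data = await reader.readline()
                if not data:
                    break
                # Rebroadcast it to every other connected viewer.
                for other in list(clients):
                    if other is not writer:
                        other.write(data)
                        await other.drain()
        finally:
            clients.discard(writer)
            writer.close()

    async def main():
        server = await asyncio.start_server(handle, "0.0.0.0", 9000)
        async with server:
            await server.serve_forever()

    if __name__ == "__main__":
        asyncio.run(main())

In a setup like this, anyone not connected to the relay never receives the pose data at all, which is why the animation stays invisible to them.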

Moni Duettmann

Of course I ask myself: why isn't Linden working on such a thing? That's what they should be working on, instead of the crap they thought was important for the development of Second Life over the last 3 years (like viewers nobody wanted).

NeilC

This is exciting stuff. I saw the first video a few months ago, and it's certainly something people have always dreamed of for virtual worlds. But there are big hurdles in turning a proof of concept into something really useful and easily usable by the majority of users. If all your body motions become motions of your AV, how will you walk around in the world, change camera angles, or drive the UI of the viewer? If you move to your keyboard, will your AV move too (and with your legs and arms out of shot of the Kinect, what happens to them)? There are myriad fascinating UI challenges, but it is great work - I look forward to trying it.

Adeon Writer

I may very well buy a kinect just for this.

I believe Niran recently added this to his test viewer, for anyone who wants to try it but can't compile their own viewer.

Pathfinder

This is awesome.

The biggest innovations in virtual worlds will come from universities banging away at things like this.

Which is why companies developing virtual world technology need to carefully cultivate relationships with academics and educators.


Ignatius Onomatopoeia

Excellent direction for both SL and OpenSim.

I sure hope LL is paying attention, will partner with these and other clever developers, and use their staff to fix existing issues such as sim crossings and lag. I just hope they don't try to roll out some clunky Kinect-enabled UI on their own.

But they will. Especially if breedable Gorean vampire bunnies can be linked in the marketing.

Graham Mills

I might be mis-remembering, but I thought there was a networked version of this now. Of course, Merov Linden (as he is now known) was working on this for Mitch Kapor before Microsoft bought the tech. I think the powers-that-be decided to forsake this angle for something webcam-oriented, presumably on the basis of the space requirements for the Kinect. Anyway, I shall be sorely tempted when the Windows Kinect emerges in a few days' time.

shockwave yareach

If LL can take the local animation data and relay it to everyone else who can see it, then this will absolutely rock, and rock hard. Being able to do my own dance while the music plays would be so much fun. Sitting and talking to someone with my gestures matching in VR would be incredible. You could even get that holy grail so many people want: the ability to create the BVH files right within the SL viewer itself.
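For anyone curious what creating a BVH file involves: it's just a text format with a skeleton section (HIERARCHY) and per-frame channel data (MOTION). Here's a toy Python sketch that writes a single-joint BVH; a real Second Life animation would need the full SL avatar skeleton, and the sample frames below are made up rather than captured from a Kinect.

    # Toy illustration of the BVH format: one "Hips" joint, a few frames of
    # position/rotation data. Not SL-upload-ready, just the file structure.
    def write_minimal_bvh(path, frames, frame_time=1.0 / 30.0):
        """frames: list of (xpos, ypos, zpos, zrot, xrot, yrot), one tuple per frame."""
        with open(path, "w") as f:
            f.write("HIERARCHY\n")
            f.write("ROOT Hips\n{\n")
            f.write("  OFFSET 0.0 0.0 0.0\n")
            f.write("  CHANNELS 6 Xposition Yposition Zposition "
                    "Zrotation Xrotation Yrotation\n")
            f.write("  End Site\n  {\n    OFFSET 0.0 5.0 0.0\n  }\n")
            f.write("}\n")
            f.write("MOTION\n")
            f.write("Frames: %d\n" % len(frames))
            f.write("Frame Time: %f\n" % frame_time)
            for frame in frames:
                f.write(" ".join("%.4f" % v for v in frame) + "\n")

    # Example: two invented frames of a small hip sway.
    write_minimal_bvh("sway.bvh", [(0, 43, 0, 0, 0, 10), (0, 43, 0, 0, 0, -10)])

A viewer that captures joint rotations from the Kinect each frame would only need to accumulate them and write them out in roughly this shape to produce an uploadable animation.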

elizabeth (16)

this is way way cool

Edmund Edgar

This is awesome stuff.

One minor factual thingamy: Fumi's university is Tokyo University of Information Sciences, which is a university in Tokyo, but a different university to Tokyo University.

Connie Arida

Fantastic. I remember "avatar Puppeteering" a while back and always thought there must be some way to use Kinect to drive it. At last.

John Branch

I notice no one has yet mentioned Snow Crash. Neal Stephenson cleverly sidestepped, for the most part, the question of how the interface for his Metaverse worked (unless I've forgotten). I always figured it would've required something like what the Kinect has turned out to offer.

Lynne Hand

I'm not sure what hubby will think about me leaping around the living room instead of clicking buttons, but bring it on. :)

Online university

Tokyo University is really doing amazing things in software development. I am very impressed by the video above, where the man only moves his hand up and the software converts him into an animation.
