
Monday, July 26, 2010

Comments


Stroker Serpentine

While I applaud any interface extensions of the SL UI, I can tell you from first-hand experience and considerable research and application that a mocap suit is limited to ONLY avatar movement controls and/or AO/gesture manipulation. BVH animations, or any real-time streaming format, are NOT progressively uploaded to SL (yet).

Basically, you could get more "bang for your buck" by using an Xbox controller rather than a 30-60K mocap suit.

Until such time as VWs incorporate real-time streaming of animation files (a la webcam, mocap, puppeteering), these appliances are simply expensive toys funded by education grants. I'm not complaining, mind you, since we sell and support all sectors =D

There are inexpensive systems coming to the marketplace that will significantly lower prosumer entry price points. The Gypsy 7, for instance:

http://www.animazoo.com/index.php/gypsy-7

However, representing them as an "interface" is specious at best. I am unaware of any SL/OpenSim client capability that allows for direct joint manipulation of an avatar in real time. One can only speculate about the resource/lag hit the server would take with such an open port.

A "tool" would be a more accurate description. Quite an expensive tool at that!

Ann Otoole InSL

The company that makes the logic controller for microsoft kinect is making a PC compatible version soon and it uses an open API. Therefore it will be a matter of someone wanting to develop the client/plugin for this controller when it comes out. No mocap suits needed. Affordable to consumers.

I look forward to being able to construct assemblies with my hands. The education and training use cases for this are staggering.

Adric Antfarm

I concede I am lacking bumping ugly animation and compurgator knowledge, but I think it's pretty damn cool.

shaqq

Well... just wait, I guess, and you'll see real-time motion streaming happening pretty soon. It is certainly possible, and we know how to do it.

I agree that our suit is quite an expensive alternative right now. Its price will certainly drop in the future, though. We're just lucky to have one of these in our lab, and why would we use something else when this one is top-of-the-line equipment and very precise as well? :) I was looking into Gyro before we ordered our MVN suit, and the animation designer who consulted for us said that Gyro is not a good option, as a lot of post-processing is required and the data is a little messy. With the MVN suit, by contrast (if properly calibrated), we never had to do any post-processing at all.

Moreover, the technology we're developing is not limited to this suit alone. Any motion capture equipment can be connected to it. There are cheap alternatives already available on the market using 3D cameras, and even a number of simple cameras (like http://www.youtube.com/watch?v=dTisU4dibSc).

Stroker Serpentine

@shaqq

"At the moment it's just able to control the avatar walking direction," Bogdanovych allows, "but in the future we will enable full control of the avatar with the suit (e.g. every move of every part of the body of the person wearing the suit will be translated onto the avatar.)"

I believe this statement to be somewhat misleading. While one can certainly interface with movement controls and UI manipulation, there is currently no real-time functionality within the SL client-server architecture that allows for direct skeletal manipulation of the SL avatar.

Correct me if I am wrong.

Stroker Serpentine

@shaqq

Please do not misconstrue my comments as contrarian. I was one of the most ardent supporters of "puppeteering" and the physical avatar when the initial development was spearheaded by Cube and Ventrella Linden. The source code was posted to SVN and subsequently shelved.

To my knowledge, there are no current plans by LL to incorporate the physical avatar source code into any public release client, now or in the future. There may, however, be OpenSim applications or interest.

See:

http://wiki.secondlife.com/w/index.php?title=Puppeteering&oldid=639823

http://avatarpuppeteering.com/

If your research and/or progress indeed motivates LL to re-visit the technology I would be most impressed and supportive.

However, given recent budget/staff cuts, I am not very optimistic.

Adeon Writer

1:1 motion, live puppetry of an avatar with a full body suit, would be one step closer to Snow Crash. :)

shaqq

Second Life is completely open source now, so how can anything development-related be a problem?

Rodion Resistance

I don't know if it's just me, but I find the idea of actually moving your own body to control your avatar a bit "primitive" compared to a method in which something akin to "neuroimpulses" would be used to control an avatar.

Back in the '80s, there was a B-movie called Robot Jox. It's basically like the mecha you see in Battletech or Mechwarrior, but the main difference was that the writers approached controlling the gigantic robot by "puppeteering" it from inside the cockpit: the pilot stands on a shiny, slippery metal plate and uses his own foot movements to move the robot/mech forward (kinda like Michael Jackson moonwalking), while his own hand/arm movements control the robot's upper limbs. The result was a somewhat awkward, silly, and possibly even fatiguing set of actions for the mech pilot.

In contrast, the Battletech universe simply employed a "neurohelmet" system to control the robot/mech: impulses from the brain control the lower and upper limb movements of the robot. A more elegant way, IMHO, to control a robot or an avatar (a method I believe has already been documented on this very blog several months/years before).

-RODION

shaqq

To RODION:

I know a person who is working on the interface you describe. At this stage, what you suggest is impossible, and there are also potential health risks involved. Measuring brain activity and trying to decipher the signal to understand whether the user wants to move the avatar left or right is possible, with some minor errors and quite a bit of training required on the user's side. Something simple, like playing Pac-Man this way, can certainly be done... but something as complex as dancing and all other types of body movement... I don't think we are there yet.

I don't think there is any rock-solid evidence regarding the health risks of wearing the neurohelmet, but I would certainly refuse to wear one on a regular basis. As far as I know, those helmets use a technique similar to MRI: a process that involves passing a strong magnetic field through the head. This can't be good for you.

shaqq

Actually, I'm wrong... the technology is EEG rather than MRI, so the health risk might not be there.

Little Lost Linden

The Lawnmower Man is in your head now Jake. There's no escape...Ever.

http://www.virtualworldlets.net/Resources/Hosted/Resource.php?Name=LawnmowerMan

Ann Otoole InSL

OK, so what will win to replace a mouse and keyboard? "Rigging up" in an expensive mocap system, or a set-top box (a la Kinect) that costs a couple hundred bucks or less?

I'll go with the Kinect-style system because, in a couple of years, it will be ubiquitous and the de facto standard for advanced UI development.

People who want to develop the viewer for these things should be discussing them with PrimeSense so they can be ready when the units ship.

I.e., we don't need full body control (not possible in SL anytime soon anyway). We do need hand motion control for the UI, and that alone would be groundbreaking enough to make a vast difference in the edu and training simulation market.
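Ann's hand-motion UI idea boils down to mapping a tracked hand position onto screen coordinates. A minimal sketch of that mapping follows; the interaction-box bounds are illustrative, and the hand position would come from whatever PrimeSense-style skeleton-tracking API a viewer plugin eventually uses (no such API call is assumed here).

```python
# Sketch: map a tracked hand position (metres, sensor space) to screen
# pixel coordinates for UI control. The interaction-box bounds below are
# illustrative values, not from any real tracker's documentation.

SCREEN_W, SCREEN_H = 1920, 1080
# Hand positions inside this box (metres, relative to the shoulder)
# map onto the full screen; positions outside are clamped to its edges.
X_MIN, X_MAX = -0.3, 0.3
Y_MIN, Y_MAX = -0.2, 0.2

def hand_to_cursor(x, y):
    """Clamp a hand position to the interaction box and scale to pixels."""
    nx = (min(max(x, X_MIN), X_MAX) - X_MIN) / (X_MAX - X_MIN)
    ny = (min(max(y, Y_MIN), Y_MAX) - Y_MIN) / (Y_MAX - Y_MIN)
    # Screen y grows downward while sensor y grows upward, so flip it.
    return int(nx * (SCREEN_W - 1)), int((1 - ny) * (SCREEN_H - 1))
```

A viewer plugin would call this every tracking frame and feed the result to the UI cursor, with some smoothing to hide sensor jitter.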

Ajax Manatiso

The quality of motion is based on the actual number of joints interpreted. At this point in time, SL avis have no finger or toe joints, and the remaining joints have limited motion. Mocaps CAN be uploaded to SL, and I have successfully loaded a number of them; however, the extra joints have to be removed from the BVH to work properly in SL. The end result is better than most SL animations, but a lot of the fluidity is lost. I'm surprised at Stroker's statements: AOs use BVH files just like any other animation in SL. Didn't he know that?
Bottom line: all the additional joints required for high-quality mocap movement in SL would probably double the lag and the strain on the servers, so it just ain't gonna happen.
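The joint-stripping step Ajax describes can be sketched in code. The core of it is deciding which motion-data columns belong to unsupported joints (fingers, toes) and dropping them. This is a simplified sketch, not a full BVH rewriter: the joint names are illustrative (real rigs vary by vendor), and a complete tool would also have to rewrite the HIERARCHY section.

```python
# Sketch: drop motion-capture channels for joints Second Life's avatar
# skeleton doesn't support (fingers, toes), keeping the rest intact.
# Joint name patterns are illustrative; real BVH rigs vary by vendor.

UNSUPPORTED = ("Finger", "Thumb", "Toe")

def filter_channels(joints):
    """joints: list of (name, channel_count) in BVH hierarchy order.
    Returns (kept_joint_names, kept_column_indices) for the MOTION block."""
    kept_names, kept_cols = [], []
    col = 0
    for name, nchan in joints:
        cols = list(range(col, col + nchan))
        col += nchan
        if any(tag in name for tag in UNSUPPORTED):
            continue  # drop this joint's rotation/position columns
        kept_names.append(name)
        kept_cols.extend(cols)
    return kept_names, kept_cols

def filter_frame(frame, kept_cols):
    """Keep only the channel values of supported joints in one MOTION frame."""
    return [frame[i] for i in kept_cols]
```

Run over every frame of the MOTION section, this produces a BVH-shaped data stream containing only joints SL can animate, which matches Ajax's observation that the result works but loses some fluidity.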

Arcadia Codesmith

Who said anything about high quality mocap? Controlling the limited number of joints available in SL would be just peachy.

But the best we could do with the state of the art would be to trigger animations with the mocap interface. Primitive.

On the other hand, if a PC version of Kinect can drive mocap software, the barriers to entry in the SL animation market would fall through the floor.

And to be honest, full-body control is a LOT like voice, in terms of breaking immersion. If you're an average graceless human being, your avatar's posture and gestures will reflect that.

I expect it would be a fantastic advance for specific applications (SL's ballet companies rejoice), but not much employed in day-to-day use. Too bad; we could all use the exercise.

Adric Antfarm

If there is any piece of software (I know the hazard of calling Second Life a "game") with more user "experts" in just about every area than Second Life, I've sure as hell never seen it.

John "Pathfinder" Lester

Very cool research. Ultimately, I think the ability to instantly map your physical-world body language and facial expressions onto your avatar will be a game-changer.

I spoke about this idea at length at a recent keynote I gave in Denmark: http://worlds.ruc.dk/archives/2758

"To condense fact from the vapor of nuance" is the key.

Ann Otoole InSL

I'm sure people that use wheelchairs or other equipment will be thrilled to see the metaverse discard them.

shaqq

Ann, you shouldn't treat science and technology as something that aims to leave someone behind. Nobody is going to prohibit using a keyboard and mouse. Moreover, the technology mentioned above by RODION is being developed with the goal of facilitating the use of computer systems by disabled people (even the completely paralysed). Nobody will be left behind! What I'm doing is just an attempt to force those who can walk and move to finally stand up from the couch and do some exercise.

Arcadia Codesmith

To my mind, the biggest challenge in full-body avatar control is the lack of kinetic feedback. We've seen systems to "touch" things, even "hug" people, but it's hard to envision with the current state of technology a consumer-grade system that would allow the user to be lifted or dipped in a ballroom dance, or permit a player with a virtual sword to have a blow blocked by the opponent's shield and have the momentum of the controller be checked in mid-swing.

You can do fun and amazing things with gyroscopes in this regard, but not quite enough.

I'm very sorry to say that by the time we have full-body control with feedback sufficient for an immersive experience, many of us are going to be moving very slowly and cautiously. It'd be embarrassing if grandma or grandpa broke a hip playing Street Fighter XXXVII.

But I dearly hope I'm wrong. I have plans for my holosuite when it finally hits the market.

bodzette Coignet

So let's say that this can't/won't be done in second life any time soon.

Have the naysayers forgotten the existence of OpenSim and open source viewers such as Imprudence?

This technology doesn't have to be implemented on Linden Lab's version of Second Life to become part of the experience; OpenSim will do just as well.


New World Notes, a VR/MMO blog by Wagner James Au