Thursday, August 11, 2011


Brain-Computer Interface for Disabled People to Control Second Life With Thought Available Commercially Next Year

This is an awesome use of a brain-computer interface developed for disabled people to navigate in the 3D virtual world of Second Life, using a simple interface controlled by the user's thought:

Developed by an Austrian medical engineering firm called G.Tec, the prototype in the video above was released last year. But New Scientist wrote about the project recently, and since it's one of the few real-world applications of Second Life that's already showing tangible, scalable, incredibly important social results, I checked with the company for an update:

"The technology is already on the market for spelling," G.Tec's Christoph Guger tells me, pointing to a company called Intendix. "The SL control will be on the market in about one year." I imagine there are many disabled people in SL right now who would benefit from this, and many more not in SL who could, once it's on the market. (A Japanese academic created a similar brain-to-SL interface in 2007, but to my knowledge, there are no commercial plans for it as yet.)

Guger shared some insights on how the technology works, and the disabled volunteers who helped them develop it:



G. Tec test volunteers and interface, courtesy Christoph Guger

Above is a pic of the main G.Tec interface with all the basic SL commands. There are other UIs for chatting (with 55 commands) and searching (with 40 commands).

Not surprisingly, Guger tells me their disabled volunteers enjoyed flying in Second Life most. "It is of course slower than with the keyboard/mouse," Guger allows, "but the big advantage is that you appear as a normal user in SL, even if you are paralyzed."

This brain-to-SL interface literally gives housebound disabled people a world to explore, and a means to meet and interact with as many people there as live in San Francisco; that in itself is an absolute good. But beyond that, Guger sees other medical applications: "First of all you can use it for monitoring, if the patient is still engaged, and as a tool to measure his performance. Besides that, it gives access to many other people, which would not be possible otherwise. New games are also being developed, for ADHD children for example."

See more G.Tec videos here. And hopefully, we'll see more about this technology soon.

Much thanks to Extropia DaSilva for the link!




Ahh, this is good. We experimented with the Emotiv and SL several years ago, but with very limited results and lack of control. I hope this system will be fairly cheap and available, and also not tire the user out after 15 mins!

Myf McMahon

@Pyewacket. There's nothing I'm seeing that suggests this device is going to be any better than the Emotiv EPOC or the OCZ NIA. There are major stumbling blocks keeping EEG-based control systems from consumer viability. Even in the disabilities market, where people, out of need, will jump through more hoops than the average person, sticking contacts to your head with rock-hard glue is a bit too much of a problem.

Any disabled individual can already access SL; it just takes a judicious mix of software and hardware, and the time and patience to set it all up to address that person's needs. And Second Life is hardly the end of it. I know of quadriplegics capable of PvPing in WoW and of playing TF2. I'm not sure this device would be able to help them much, tbh.

foneco zuzu

It would be amazing news, but the way Linden Lab is acting, I think they really don't give a ship about the disabled!
Just look at the terrible changes they made when moving to the V2 code, which mean users with eyesight problems can't use it without straining their eyes, and end up having to quit using it altogether (and it seems neither LL nor the TPV V2 developers are really worried about that!)
So for God's sake, think about people like me, who can only enjoy SL on a V1 viewer, not because we are against progress, but because our eyes can't stand V2.

TheBlack Box

This one is different from Emotiv and OCZ (now BCINET) NIA.

G-Tec uses the P300 component instead.

All current systems are mostly interesting for patients who can't use a keyboard, for now.

For a BCI that works so well that you might want it instead of conventional input devices, we will need better hardware.

A nice tour through the history of EEG-based BCI can be found here:

Laura Bondi

The device and software are amazing, and the idea behind it is really commendable. Of course Foneco Zuzu's comment is false and off-topic for this post. SLV2 has the same GUI visibility as SLv1 and a better, more modern interface. The contrast is better, and the dark colour scheme sends less light to the eyes and is therefore more relaxing; in fact all modern professional photo and graphics software is grey (see for example Zoner software or Nikon software), and I know this well since I work at a photo agency.

This is again just the old argument from fans of the old-style viewer who, as usual, are stuck in the past, unable to accept anything new. But of course those people have nothing to do with the great creative minds behind this device for the disabled. Moreover, SLv2 lets disabled people do things without filling the screen with useless floaters that are very hard to manage for anyone unable to use their hands. The sidebar is very fast to use, and in a few clicks it gives people access to all the features of the viewer without filling the screen with floaters. The new "Easy" mode lets newbies explore the world without facing the complexity of the advanced interface.
And for anyone who wants a touch of colour in the GUI, take a look at the great Kirstens Viewer with its glassy green skin, or install the multicoloured Starlight skins (in nostalgia blue, orange, pink, white, green, teal and so on). Lots of choices, as you can see, and really no reason to stay on limited V1 viewers. Of course those skins are less homogeneous than LL's default skins, because not all the floaters and buttons are perfectly coloured, but in time the developers will fix those issues.

So please, stop bothering with this old argument about the V1 viewer. That is the past; LL made a great new viewer, and it is without any doubt better than the previous one. Old-minded people should just open their minds, or go back to watching their black-and-white TVs on their old, dusty sofas, instead of inventing new excuses every day to denigrate the work that a lot of developers are doing to give us a better SL.

Going back to the main topic of this post: the device is really a great idea. Finally a good way to give disabled people a chance to live in a totally new, amazing world and to do there what life has denied them in RL. Second Life: a second chance for everyone!

foneco zuzu

Well, it's my personal drama, my eyes, not anyone else's.
But I believe you are not in-world for at least 4 hours straight, as I am most days, because then you would notice this:

75 percent of residents use TPV V1 viewers.
20 percent or less use TPV V2 viewers.
The rest use LL viewers.

That's not a made-up number; it's what you can see every day on the grid, in all places.

foneco zuzu

And to be sure, I agree this is off-topic for this post.
But it still shows how short-sighted it is to resist doing what Microsoft did.
Every Windows release offers a choice of classic mode.
Is it so hard to do the same with LL viewers?
Or is it true that they just had V2 made by an outsourced company, and can't even figure out a simple thing like a customizable interface?


foneco zuzu,

use Firestorm then. It can be set up to look just like a V1 viewer, but it is a V2, and it will have some features that SLv2 does not have.

That should solve your problem.



Arcadia Codesmith

Pointless and stupid viewer wars aside, the interface looks intriguing. There are multiple EEG-based children's toys that have reached the market; the hardware to do basic navigation, communication and control could be mass-produced at a consumer-friendly price point.

That'd be great because it'll cut greedy, gouging medical suppliers out of the loop entirely, hopefully leading them to leap off buildings and die.

Have a great day!

Hamlet Au

I don't think debating quality of viewers is stupid, though it's definitely better done in another forum. I've invited Guger to answer reader comments, so if anyone has questions specifically about this BCI project, please post!

Myf McMahon

Getting back on track, I've got a couple of questions I'd like answered. =)

@Guger: How does your recording of the P300 wave set your project apart from earlier EEG controllers, such as the Emotiv Epoc, or OCZ (BCINET) NIA?

Also, one of the major stumbling blocks for EEG controllers in the past has been positioning the electrodes, and then keeping them in position. Generally speaking, the more electrodes the device has, the more problems the user has. How are you proposing to address this? How are you going to make the device easy to use while maximising its ability to function?

Komuso Tokugawa

@Myf These look promising

Myf McMahon

That's awesome. It's stuff like that that makes me realise I am living in the future.


No one will probably see this now, but I came across this video the other day and it blew my socks off. I really recommend you watch it, especially just before the halfway point.

Christoph Guger

The g.tec BCI system for SL control uses the P300 component. All the control icons flash in a random order, and the user selects one by looking at it. When this target icon flashes, the P300 wave is produced in the EEG. For such a BCI system it is very important to have reliable and precise timing of the flashes and of the EEG signal analysis. We also ran group studies with 100 subjects, which showed that it works with very high accuracy after only a few minutes of training. We use 8 EEG electrodes over the most important regions for the P300. With fewer electrodes the accuracy drops rapidly, and we can no longer speak of a brain-computer interface.
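For readers curious how P300 selection works in principle: because the P300 response is small relative to background EEG, a speller-style system averages the EEG epochs following each icon's flashes, and the icon whose averaged epoch shows the strongest response in the P300 time window is taken as the user's choice. Below is a minimal, purely illustrative simulation of that idea in Python; all the numbers (icon count, flash count, epoch length, the synthetic "P300" bump) are made up for the sketch and are not g.tec's actual pipeline.

```python
import math
import random

random.seed(0)

N_ICONS = 9      # icons in the control grid (illustrative)
N_FLASHES = 15   # flashes per icon in one selection round
N_SAMPLES = 64   # EEG samples per post-flash epoch
TARGET = 4       # the icon the simulated user is attending to

def simulate_epoch(is_target):
    """One post-flash EEG epoch: Gaussian noise, plus a P300-like
    bump around sample 40 (roughly ~300 ms post-flash) for the target."""
    epoch = [random.gauss(0.0, 1.0) for _ in range(N_SAMPLES)]
    if is_target:
        for t in range(N_SAMPLES):
            epoch[t] += 2.0 * math.exp(-0.5 * ((t - 40) / 6.0) ** 2)
    return epoch

def classify(epochs_by_icon):
    """Average the epochs for each icon, then score each icon by the
    mean amplitude in the P300 window; the attended icon wins."""
    scores = []
    for eps in epochs_by_icon:
        avg = [sum(e[t] for e in eps) / len(eps) for t in range(N_SAMPLES)]
        scores.append(sum(avg[32:48]) / 16)  # P300 window
    return scores.index(max(scores))

epochs = [[simulate_epoch(i == TARGET) for _ in range(N_FLASHES)]
          for i in range(N_ICONS)]
print("selected icon:", classify(epochs))  # recovers TARGET
```

This also illustrates why the flash timing Guger mentions matters so much: the averaging only works if every epoch is aligned precisely to its flash, and why fewer flashes (or noisier electrodes) make the averaged bump harder to separate from noise.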
