By now, you've probably heard about the Japanese university that developed an electrode-based interface enabling control of a Second Life avatar through conscious brain activity. Yesterday I contacted Junichi Ushiba, the scientist leading the Keio University project, to learn more. It was inspired, he tells me, by a person close to him, cut off from daily life by tragic circumstance.
"One of my friends is a patient who is suffering with spinal cord injury," Ushiba tells me. (An ill-fated dive into a swimming pool broke his neck.) "He cannot move his hands, arms, and legs... Of course he can move anywhere if the motor-driven wheelchair can be used, but the possible places where he can go are limited, because the real world still has a barrier to persons with disabilities. Limitation of movement limits his opportunities to talk with others in person, so he frequently uses e-mail to communicate. But e-mail is like a letter, and can feel impersonal. Instead of that... Second Life provides us 'reality'. So, I imagined that SL would be a good platform for assisting him for communication."
To help his friend, Ushiba and his staff created "Ousshy Jun", an avatar they used to develop the mechanics for this idea. Of course, there are other computer interfaces for the disabled, but these came with their own challenges.
"Eye blinks, bending neck position, or breathing is a possible control signal," Ushiba e-mails me. "Actually, switching external devices using these signals have been widely used. But it is not the best way, because what the person wants to control and what he has to do to control it is mismatched. So, I realized that the way to control something as one's thinking is the best way."
And with that insight, Ushiba and his team developed a unique solution to the mind-body problem. "A brain wave, or electroencephalogram, is recorded from the surface of the head. The brain wave reflects the cortical activity from the areas which innervate upper and lower limbs (named the sensorimotor cortex), so online signal processing distills what types of movements the subject is thinking of. Recording and decoding brain waves are achieved by Computer A. Through a USB port, Computer A sends key commands such as '->' (pressing right arrow), '<-' (pressing left arrow), and '^' (pressing up arrow) to Computer B, which is running SL."
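The two-computer pipeline Ushiba describes can be sketched in a few lines: Computer A classifies brain activity into an imagined movement, then forwards the matching arrow-key command to Computer B running Second Life. The sketch below is illustrative only; the class names, the mapping of imagined movements to keys, and the transport callback are all assumptions, not the lab's actual code.

```python
# Hedged sketch of Ushiba's decode-and-forward step. Which imagined
# movement maps to which arrow key is an assumption for illustration.
KEY_COMMANDS = {
    "imagine_right_hand": "->",  # assumed: turn the avatar right
    "imagine_left_hand": "<-",   # assumed: turn the avatar left
    "imagine_feet": "^",         # assumed: walk forward
}

def decode_and_send(decoded_class, send):
    """Map a decoded motor-imagery class to a key command and forward
    it via the supplied transport (in the real setup, a USB link to
    Computer B). Returns the command sent, or None if unrecognized."""
    key = KEY_COMMANDS.get(decoded_class)
    if key is not None:
        send(key)
    return key

# Usage: collect the commands in a list instead of a real USB write.
sent = []
decode_and_send("imagine_feet", sent.append)
decode_and_send("imagine_right_hand", sent.append)
# sent is now ["^", "->"]
```

Keeping the transport as a callback is just one way to separate the decoding logic from the USB plumbing, so the mapping can be exercised without any hardware attached.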
The result is featured here in this video, courtesy of Professor Ushiba's site. They're now integrating new functions into the project. "We are trying to add a word processing (typing) feature," he tells me. "It may help subjects to do [Japanese] character-based chatting, and buy/sell something in SL."
And though his first inspiration was to give his friend a new portal into community, Ushiba's virtual world technology still faces real-world hurdles. "At this moment, we have not applied this technology to persons who are suffering with motor dysfunction," says Junichi Ushiba. "I am now submitting a research proposal to the local ethics committee."
I was overwhelmed seeing the video and reading this. Words escape me so all I can say is well done guys and keep up the good work.
Pauline
Posted by: Pauline Aquilina | Friday, October 19, 2007 at 12:28 AM
This is truly amazing given that even after months in SL, walking and turning are not accurate. Integrating with an electrode that senses the thinking and controls the movement is just pure magic.
Posted by: Labsji Link | Monday, October 29, 2007 at 05:25 AM
This sort of a brain-computer interface is not so new. I know of two companies that are on the market with quite far developed devices:
www.emotiv.com
and
www.neurosky.com
Posted by: JJ | Wednesday, November 28, 2007 at 01:50 PM
Two years on and it's good to know that some people are investigating the development of an Emotiv EPOC solution for Second Life.
Posted by: James Corbett | Monday, December 21, 2009 at 01:46 AM