What you’re looking at above (on the surface, at least) is a VRChat furry wiggling its ears.
But here’s the thing: the avatar's real-life owner, Rantis, is doing so… with his mind.
Literally:
“To control my ears I imagine them as a floating number along a slider and I adjust the intensity by thinking about it,” Rantis explains to me. “Imagine having two faders on a DJ deck and you control them by raising and lowering their intensity, the output being focus.”
This is just the latest amazing user-created VRChat innovation I’ve learned about recently -- but the creator, known as Chillout Charles in-world, has been building this brain-to-virtual-world sensing headband software for two years:
“The use case that I hear the most is that this is helpful with self expression for those who sit on the autistic spectrum, which is interesting,” Charles tells me. “I only ever thought of self expression as a communication thing I want to improve for fun. I never realized this would have practical benefits, even for the original use case I set out for it.”
As to how this works -- and you can try this out yourself:
“Your neurons emit small voltage differences whenever they fire and you have a lot of neurons,” Charles explains. “The electrodes on the headband can detect the sum of those voltage differences and get your neuron firing speeds. Those firing speeds correspond to brain states, of which I can get focus and relaxation scores from, which are then transmitted to VRChat via the Open Sound Control protocol. It’s something usually used to send data to lights and effects in music concerts, but VRChat uses it as a way to send data between third party applications.”
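The OSC mechanism Charles describes can be sketched with nothing but Python's standard library. VRChat listens for incoming OSC on UDP port 9000 by default and maps `/avatar/parameters/...` addresses onto avatar parameters; the parameter name "Focus" below is a hypothetical stand-in for whatever a given avatar actually exposes, and the scoring details are Charles' own, not shown here:

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: the address and the ",f" type tag
    are null-terminated and padded to 4-byte boundaries, followed by a
    big-endian 32-bit float."""
    def pad(b: bytes) -> bytes:
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

def send_focus(score: float, host: str = "127.0.0.1", port: int = 9000) -> bytes:
    """Clamp a focus score to [0, 1] and fire it at VRChat's default OSC-in
    port (UDP 9000). Returns the raw packet for inspection."""
    score = max(0.0, min(1.0, score))
    packet = osc_message("/avatar/parameters/Focus", score)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(packet, (host, port))
    return packet

if __name__ == "__main__":
    send_focus(0.8)  # one strongly "focused" frame
```

From there it's exactly what Charles says: animations on the avatar read that float parameter and turn it into ear wiggles.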
To run this yourself, Charles recommends getting the Muse MU-01, a headband that runs for about $200 or less. The software itself is available on Charles’ GitHub.
“My code sits on top of the brainflow library which supports a wide range of consumer EEG headbands. Comfortable to use and fits under most VR headsets. Just make sure to have a bluetooth LE adapter and you're good to go. Then it’s up to the users to manifest those [EEG] numbers through creative use of animations.”
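Charles' actual scoring code lives in his GitHub repo; purely as an illustration, here is one common heuristic for turning averaged EEG band powers (the five numbers brainflow's `DataFilter.get_avg_band_powers` reports: delta, theta, alpha, beta, gamma) into rough focus and relaxation scores. The ratios are my assumption, not his formula:

```python
def focus_relax_scores(band_powers):
    """Turn averaged EEG band powers -- ordered (delta, theta, alpha, beta,
    gamma), as brainflow's DataFilter.get_avg_band_powers reports them --
    into rough focus and relaxation scores in [0, 1].

    The ratio heuristics (beta dominance ~ concentration, alpha dominance
    ~ relaxation) are illustrative assumptions, not Charles' method."""
    delta, theta, alpha, beta, gamma = band_powers
    active = theta + alpha + beta  # the bands these two states trade between
    focus = beta / active
    relax = alpha / active
    return focus, relax
```

In Charles' pipeline, scores like these are what travel to VRChat over OSC, where avatar animations "manifest those numbers" as ear movement.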
He does have a safety warning for vulnerable users:
“With this approach, you'll be bending your brain to move limbs that do not exist. Things like fatigue, the worsening of existing mental disabilities. The good news is that these risks are infrequent, but it's something to be aware of, especially when it comes to remaking neural pathways outside of a medical setting.”
In my favorite scene in Snow Crash, a profoundly disabled man is able to control his avatar in the Metaverse through an interface just like this. And VRChat’s user community continues to realize the Metaverse’s full potential with projects like this.
Good grief! Somewhere in the distant and dusty past there was a DARPA project which used this kind of technology to fly a jet fighter, and it also aimed and fired the guns and anti-aircraft missiles. DARPA was solving the problem that there was too much going on for the pilot to react with enough speed. It was a successful program, but it was scrapped because it was found the pilot's brainwaves included shoot commands triggered involuntarily by subconscious hostility. Later I believe Hollywood picked up the idea and used it in a movie with Clint Eastwood as the pilot. It's still a pretty neat idea. If you play in the garden you might find something unexpected.
Posted by: Argo Nurmi | Sunday, January 21, 2024 at 09:30 PM