I mentioned how Philip Rosedale aims to achieve extremely low latency to improve avatar-to-avatar interactions in his new Oculus Rift-compatible High Fidelity virtual world; but it's not just a matter of shortening ping time -- he's also working with a neuroscientist and 3D brain scans to improve that experience too:
"Basically," Philip tells me, "you can see things like 'I feel a certain way toward you' in the scanner, and we can look for that data and then test breaking it with various transformations of person into avatar." Philip demonstrated this at South by Southwest last March with Dr. Adam Gazzaley of UCSF, but media coverage at the time didn't quite explain Philip's purpose, which is to improve the avatar-to-avatar sense of presence in High Fidelity:
"Adam and I have known each other for a while, and have been exploring ways to work together to use his expertise and lab to help us understand the experience of 'presence' between avatars/people." Here's how:
"We're interested in whether there are neurological signals, visible to EEG and/or fMRI, that can help design High Fidelity by showing us how to bridge the gap between the face-to-face experience and the avatar-to-avatar one."
At SXSW, they showed off the "Glass Brain" (video above), which they demonstrated live with Philip's wife Yvette hooked up to the scanner.
"The 'GlassBrain' itself is entirely Adam's lab's work," adds Philip. "The demo we did at SXSW had both a demo of the latest High Fidelity avatar stuff and then the Glass Brain with Yvette. We are looking at ways to combine both technologies in future work."
In other words, we're getting to a point where virtual world/VR interactions are modeling our awareness of each other on the neurological level. If we're lucky, we'll hear more about High Fidelity's progress at Philip's SVVR keynote next week.