Looks like the next version of the Oculus Rift will allow motion control with head movement - a big addition indeed, as Wired reports:
For the first time, Rift is capable of positional tracking, which allows users to lean and move within the game environment by simply moving their head... it monitors the user’s head movements in real space, and it’s able to translate those movements into not just orientation changes — looking up, down, or behind you — but also as actual motion, which previously was possible only by using a game controller in conjunction with the Rift. It utilizes an “outside-in” system: an externally mounted camera tracks small LED lights on the prototype’s faceplate, adding three “degrees of freedom” (forward/backward, left/right, and up/down) to the Rift’s tracking ability. Up until now, developers and early Oculus adopters have only been able to accomplish this by taping a Razer Hydra motion controller to the side of their Rift headsets. Now, though, leaning down while playing the demo brings you closer to the tower-defense game, and lets you watch the armies you control firing turrets and launching minions. It’s the first look at an untethered VR experience.
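To make the addition concrete, here's a rough sketch (my own illustration, not Oculus SDK code, and all the names are hypothetical) of the difference positional tracking makes to the in-world camera: with orientation-only tracking the eye stays pinned to the avatar, while a tracked head position lets a real-world lean actually move the viewpoint.

```typescript
// Minimal sketch: orientation-only head tracking can rotate the view, but
// leaning does nothing until the externally tracked head position (the
// "outside-in" LED camera data) is folded into the camera pose.

type Vec3 = { x: number; y: number; z: number };
type Quat = { w: number; x: number; y: number; z: number };

// Rotate a vector by a unit quaternion (standard q * v * q^-1 expansion).
function rotate(q: Quat, v: Vec3): Vec3 {
  const { w, x, y, z } = q;
  // t = 2 * cross(q.xyz, v)
  const tx = 2 * (y * v.z - z * v.y);
  const ty = 2 * (z * v.x - x * v.z);
  const tz = 2 * (x * v.y - y * v.x);
  // v' = v + w*t + cross(q.xyz, t)
  return {
    x: v.x + w * tx + (y * tz - z * ty),
    y: v.y + w * ty + (z * tx - x * tz),
    z: v.z + w * tz + (x * ty - y * tx),
  };
}

interface HeadPose {
  orientation: Quat;     // from the headset's sensors (3 rotational DOF)
  position: Vec3 | null; // from the external camera (3 positional DOF), if available
}

// Camera pose in the game world: where the eye sits and which way it looks.
function cameraFromHead(pose: HeadPose, playerOrigin: Vec3): { eye: Vec3; forward: Vec3 } {
  const forward = rotate(pose.orientation, { x: 0, y: 0, z: -1 });
  // Orientation-only rigs pin the eye to the player origin; with positional
  // tracking, leaning in real space offsets the eye by the tracked head position.
  const offset = pose.position ?? { x: 0, y: 0, z: 0 };
  const eye = {
    x: playerOrigin.x + offset.x,
    y: playerOrigin.y + offset.y,
    z: playerOrigin.z + offset.z,
  };
  return { eye, forward };
}
```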
More here. For virtual world fans, it should be obvious why this is so key:
Now there's a natural way to move within the 3D space without necessarily needing a keyboard or another peripheral. And we're closer to a seamless, intuitive virtual world/virtual reality experience that a mass market can use.
Excellent, virtual hands are a must.
Posted by: Metacam Oh | Wednesday, January 08, 2014 at 11:15 AM
Does this mean more realistic blow jobs?
Rudi
Posted by: Rudi | Wednesday, January 08, 2014 at 01:49 PM
Oh wow, mixed with Leap Motion this could enable proper gaze-directed user interfaces, e.g. Iron Man.
(By that I mean: Imagine looking at the moon in the sky, stretching your hand out, picking it up between your fingers, and then being able to move it around. You can't physically feel it, but you feel as if you were moving it. For that you need to be able to know the location of the user's eyes and their finger positions.)
Posted by: Nexii Malthus | Wednesday, January 08, 2014 at 04:02 PM