Update, 10/7: More background from Brian in Comments.
A developer named Brian Peiris has figured out how to write graphics code in virtual reality while inside virtual reality itself, so he sees his changes as he implements them. It's mind-bogglingly cool -- watch:
Specifically, as he explains, "I built a live-coding web app for the Oculus Rift where you code in JavaScript using Three.js and watch the world change around you in real-time."
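The core trick behind a live-coding editor like this can be sketched in a few lines of plain JavaScript (a minimal illustration only, not Brian's actual code -- the function names here are assumed): re-evaluate the user's source on every change, and fall back to the last working version when the new code throws, so the world around you never breaks mid-keystroke.

```javascript
// Minimal live-coding loop sketch (assumed names, not the real app's code).
// Each time the source text changes, recompile and rerun it; if it fails,
// rerun the last version that worked so the scene stays intact.
function makeLiveEvaluator(context) {
  let lastGood = null;
  return function update(source) {
    try {
      // Compile the user's code into a function that receives the scene context.
      const fn = new Function('ctx', source);
      fn(context);     // run it once to confirm it executes cleanly
      lastGood = fn;   // remember this as the last known-good version
    } catch (err) {
      // Broken keystroke: fall back to the last version that ran cleanly.
      if (lastGood) lastGood(context);
    }
    return context;
  };
}
```

In a Three.js app, `context` would hold the scene and objects being edited; here a plain object stands in. For example, `update('ctx.boxes = 5;')` followed by an invalid source string leaves `ctx.boxes` at 5 rather than crashing the scene.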
"Very cool," said Philip Rosedale, when I showed him this video. "I want to hire the kid!" Rosedale of course, founded Second Life and is now creating High Fidelity, a new fully VR-integrated virtual world. As it happens, he and his team are already developing a comparable feature for High Fidelity:
"We are doing very similar stuff already with live Javascript code editing in Hifi," he tells me. "[T]he existing editor is live, and we are already displaying parts of the UI in the Oculus. So we'll definitely try this sort of approach."
However, he adds, there's one challenge they need to solve:
"The problem is how to type! Possibly passing the image from a front-mounted camera on the Oculus through, but in some experiments this is quite sickening." I speculated to Philip that this guy Brian Peiris is just really good at touch-typing, but I'm checking on that with him now.
In any case, as I wrote in my book, something like this is one of the kernels for the idea of Second Life, years before SL was born -- Philip imagined himself in the darkness, creating code while within code itself. Now, nearly two decades later, that vision is finally being realized.
Hat tip for video: Linden Lab vet Adrian Herbez.
Thanks for the post!
I am a decent touch-typist, so using the keyboard while inside VR is not too much of a problem, as long as I'm familiar with the particular keyboard I'm using. Switching from my desktop keyboard to my laptop keyboard (which has a different physical layout), for example, would definitely throw me off and make the VR live-coding experience worse.
A video pass-through could work with the right camera. I'm sure many have experimented with attaching a camera to their DK2s with varying success, but I haven't tried it yet. I was thinking about hacking something together with the getUserMedia API. Of course, you could always just make the camera stationary and have it point down at your keyboard at all times. You'd then have an equivalent, stationary "window" in VR. Use a Kinect-like sensor for bonus depth information. Finally, there are solutions like ControlVR -- capture your hand and finger motion and represent them with equivalent virtual hands. Take a single picture of your keyboard and use that as a reference for your hands.
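The getUserMedia piece of that idea might look something like the sketch below (this uses the modern promise-based `navigator.mediaDevices.getUserMedia`, which differs from the vendor-prefixed API of the DK2 era; the element wiring is an assumption). The camera stream lands in a `<video>` element, which could then be drawn onto a textured plane in the Three.js scene as the stationary keyboard "window":

```javascript
// Sketch: pipe the webcam into a <video> element, which could then be
// mapped onto a plane in the VR scene as a pass-through "window".
function startPassthrough(videoElement) {
  // Guard for environments without camera access.
  if (typeof navigator === 'undefined' || !navigator.mediaDevices) {
    throw new Error('getUserMedia not available in this environment');
  }
  return navigator.mediaDevices.getUserMedia({ video: true })
    .then(function (stream) {
      videoElement.srcObject = stream;  // attach the live camera feed
      return videoElement.play();
    });
}
```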
Personally though, I think learning to touch-type is a small price to pay especially when live-coding. The experience is significantly better when there are as few impediments as possible between your creativity and your creation. Looking down while typing would kinda defeat the purpose of live-coding. You want to see the *effect* of almost every keystroke, not the keystroke itself :)
I really ought to try High Fidelity out. I love the concept.
Posted by: Brian Peiris | Tuesday, October 07, 2014 at 04:53 AM