
I've written a lot in recent years about how the hype over VR has failed in the face of modest consumer interest, but one question has been flickering in the back of my mind since then: Why did the hype start up again in the first place, after waning in the '90s? The Oculus Kickstarter got the momentum going, but that crowdfunder didn't even raise $2.5 million, hardly an indication of mass-market interest. And to be sure, there are many powerful applications of VR in training, therapy, and prototyping, but those alone don't justify the billions of dollars that companies like Google and Facebook have poured into the technology.
Cory Doctorow recently suggested a likely explanation that surprised me at first, namely that the hype was impelled in part by the end of Moore's Law:
The period in which Moore's Law had declined also overlapped with the period in which computing came to be dominated by a handful of applications that are famously parallel -- applications that have seemed overhyped even by the standards of the tech industry: VR, cryptocurrency mining, and machine learning...
It's possible that this is all a coincidence, but it really does feel like we're living in a world spawned by a Sand Hill Road VC in 2005 who wrote "What should we invest in to take advantage of improvements in parallel computing?" on top of a whiteboard.
Expanding on that point to me, Cory put it this way: "GPUs! Any 3D rendering." That extends to offloading most VR rendering to a connected PC or to the cloud. "And the cloud," as Cory notes, "is mostly low-cost parallel processors!"
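To make the "famously parallel" point concrete: a 3D renderer can compute every pixel of a frame independently, which is exactly the shape of work that cheap parallel processors excel at. Here's a minimal, illustrative CUDA sketch (my own example, not anything from Cory's piece; the gradient "shader" is just a stand-in for real rendering work) that assigns one GPU thread per pixel:

```cuda
// Why rendering is "embarrassingly parallel": every pixel can be
// computed independently, so thousands of GPU threads each handle
// one pixel with no coordination between them.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void shade(unsigned char *img, int width, int height) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;
    // Each thread writes exactly one pixel: a simple gradient here,
    // but a real renderer would trace a ray or sample a texture.
    img[y * width + x] = (unsigned char)((x + y) * 255 / (width + height));
}

int main() {
    const int W = 1024, H = 1024;
    unsigned char *img;
    cudaMallocManaged(&img, W * H);  // buffer visible to CPU and GPU

    dim3 block(16, 16);
    dim3 grid((W + block.x - 1) / block.x, (H + block.y - 1) / block.y);
    shade<<<grid, block>>>(img, W, H);  // ~1M pixels shaded concurrently
    cudaDeviceSynchronize();

    printf("center pixel: %d\n", img[(H / 2) * W + W / 2]);
    cudaFree(img);
    return 0;
}
```

Cryptocurrency mining (millions of independent hash attempts) and machine learning (large matrix multiplies) decompose the same way, which is why all three workloads ended up on the same class of hardware.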
My working theory has always been that VR hype was powered by the desire to create a successor to smartphones, along with the Valley's attachment to Snow Crash and other cyberpunk classics that depict virtual reality as The Future. But now I think Cory's explanation is a big piece of the puzzle. After all, some VCs who've invested in virtual reality, not to mention executives involved in VR, have made a similar point about the end of Moore's Law. For instance: