I've written a lot over recent years about how the hype over VR has failed in the face of modest consumer interest, but one question has been flickering in the back of my mind since then: Why did the hype start up again in the first place, after waning in the 90s? The Oculus Kickstarter got the momentum going, but that crowdfunder didn't even raise $2.5 million -- hardly an indication of mass market interest. And to be sure, there are many powerful applications of VR for training, therapy, and prototyping, but that alone doesn't justify the billions of dollars companies like Google and Facebook have poured into the technology.
Cory Doctorow recently suggested a likely explanation that surprised me at first -- that the hype was impelled in part by the end of Moore's Law:
The period in which Moore's Law had declined also overlapped with the period in which computing came to be dominated by a handful of applications that are famously parallel -- applications that have seemed overhyped even by the standards of the tech industry: VR, cryptocurrency mining, and machine learning...
It's possible that this is all a coincidence, but it really does feel like we're living in a world spawned by a Sand Hill Road VC in 2005 who wrote "What should we invest in to take advantage of improvements in parallel computing?" on top of a white-board.
Specifically, Cory expanded on this to me: "GPUs! Any 3D rendering." And, for that matter, VR that offloads most of its rendering to a connected PC or to the cloud. "And the cloud," as Cory notes, "is mostly low-cost parallel processors!"
My working theory has always been that VR hype was powered by the desire to create a successor to smartphones, plus the Valley's attachment to Snow Crash and other cyberpunk classics depicting virtual reality as The Future. But now I think Cory's explanation is a big piece of the puzzle. After all, some VCs who've invested in virtual reality, not to mention executives involved in VR, have made a similar point about the end of Moore's Law. For instance:
[Venture Capitalists] Horowitz and Andreessen... are adamant that Moore’s Law has “flipped” over the life of the fund. For many years, chip manufacturers rolled out new chips every year and a half or so that were twice as fast as the last one for the same price. This trend resulted in mainframes, mini-computers, PCs and finally smartphones, but now it is time to move on, according to Andreessen. “A lot of people have said that progress in the tech industry has stalled because chips aren’t getting any faster, but the dynamic is not about increased performance, it is about reduced cost”.
Andreessen and Horowitz's firm was an early investor in Oculus, and last year, despite continued slow VR headset sales, it led a $68 million investment in another VR startup.
The CEO of Nvidia, a leading chip manufacturer for VR devices, is a vocal proponent of the view that Moore's Law has ended:
[As] the scale of chip components gets closer and closer to that of individual atoms, it's gotten harder to keep up the pace of Moore's Law. It's now more expensive and more technically difficult to double the number of transistors -- and thus the processing power -- for a given chip every two years. "Moore's Law used to grow at 10x every five years [and] 100x every 10 years," Huang said during a Q&A panel with a small group of reporters and analysts at CES 2019. "Right now Moore's Law is growing a few percent every year. Every 10 years maybe only 2x. ... So Moore's Law has finished."
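The rates Huang cites are just the classic Moore's Law cadence compounded: a doubling roughly every 18 months works out to about 10x over five years and about 100x over ten, while "a few percent every year" barely adds up over a decade. A quick sanity check (the 3% figure is my own illustrative stand-in for "a few percent", not a number from the quote):

```python
# Classic Moore's Law cadence: transistor counts double about every
# 18 months. Compounded, that matches Huang's "10x every five years,
# 100x every 10 years" figures.
growth_5y = 2 ** (5 * 12 / 18)    # doublings in 5 years -> ~10.1x
growth_10y = 2 ** (10 * 12 / 18)  # doublings in 10 years -> ~101.6x
print(round(growth_5y, 1))   # 10.1
print(round(growth_10y, 1))  # 101.6

# By contrast, "a few percent every year" -- say 3% -- compounds to
# only about 1.3x over a whole decade, nowhere near a single doubling.
slow_10y = 1.03 ** 10
print(round(slow_10y, 2))    # 1.34
```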
Meanwhile the head of AMD, another major manufacturer for VR devices, promotes an updated version of Moore's Law to justify supporting VR:
[I]t's not on Moore's Law [anymore]. Moore's Law Plus means that, in some instances, it costs more. But even more so, Moore's Law Plus says that creativity will never stop. And so, it will be ingenuity at the system level to put solutions together. It might be combinations of CPU and GPU, other accelerators, different memory configurations, how they're pieced together – there's room for lots of innovation at the next level.
None of this necessarily means that heavily investing in VR was a serious mistake. But if this interpretation is right, it suggests that much of the VR hype was driven by a search for ways to invest in parallel computing -- not, first and foremost, by actual consumer demand.
Interesting point. Way back in the Jurassic period I was taught (we had a semiconductor design class) that 'Moore's Law' was no such thing, just an extrapolation based on observing trends -- i.e. nice to aim for, but eventually it would fall over.
There might possibly be a case for something similar regarding data transfer rates/cloud voodoo but, as the old caveat goes -- no matter the size of your new disc, you will fill it with junk -- we see the same with petabytes of basically rubbish. Guilty as charged.
( Bit of an aside, used to be the tech world was vulnerable to well placed EMPs, now we just let punters add huge security holes for toys - IoT anyone? )
Posted by: sirhc desantis | Friday, January 17, 2020 at 05:22 AM