While Apple Vision Pro is impressive technology, I've grown skeptical that it'll ever gain mass market adoption, for various reasons. For one thing, sales remain slow, and for another, even most early adopters who have bought one aren't using it on a daily basis.
But my longtime friend and colleague Amber Case, a design consultant and former fellow at MIT and Harvard, delves deeper than that, raising some fascinating points that had never occurred to me, citing a neurophysiologist:
I asked my colleague Dr. David Sisson, a neurophysiologist, why XR adoption hasn’t matched media hype. He reminded me that audio and visual inputs—the primary senses XR taps into—aren’t the full picture.
Then there’s the matter of touch in XR. “Without touch, there’s no ‘intimacy.’ You’re not really interacting with what’s going on,” David told me. “You can hit a ball—and hear the ‘crack’ in VR—but you’re not feeling anything other than a little jerk in the controller that makes you feel like there’s some inertia happening.”
In short, headsets deprive us of the tactile and physical context that supports memory formation.
In other words, XR devices like the Vision Pro may impede how we learn and remember, even on a neurological level. As Case puts it:
While XR devices like VisionPro do re-create home and office setups and allow for vast screen real estate, they lack a true sense of location. Evolution shaped our brains to operate differently depending on whether we’re traveling or at home. Researchers call this the encoding specificity principle—our memories link closely with the environment where they were first formed.
With a headset on, our minds don’t fully orient to a place, and so we never quite settle in. Apple offers a vast virtual workspace you can take on the go, but that benefit comes at the cost of the sensory richness and physical grounding of a real-world setup. Neuroscience, not just practicality, suggests that working in a physical space—with monitors, windows, textures, smells, and distance—offers deeper engagement and memory retention.
This is crucial, because it speaks to challenges that are inherent in the XR experience itself. Even if future updates make the Vision Pro more affordable and lighter, and much more content is added, it may still fundamentally clash with how humans evolved to learn and remember things.
Read Amber's whole post here on FastCompany.
Pictured: Me at AWE 2024 in a Vision Pro and, well, bunny ears.
> In short, headsets deprive us of the tactile and physical context that supports memory formation.
So do books.
> In other words, XR devices like the Vision Pro may impede how we learn and remember, even on a neurological level.
So do books.
> Even if future updates make the Vision Pro more affordable and lighter, and much more content is added, it may still fundamentally clash with how humans evolved to learn and remember things.
And that also applies to books: even though lots of the best books are freely available online, written content still fundamentally clashes with how humans evolved to learn and remember things (which is by hands-on imitation of what other people show them).
So what? Should we stop using books for teaching? I guess almost everyone would agree that there are still situations where books are useful as part of a learning experience. And similarly, there are situations where XR devices are useful; for example, consuming media and playing games, in particular social games.
Do today's apps justify the price of an Apple Vision Pro for the average consumer? Certainly not. Will that change in the future? Maybe. The latest XR device that I bought (a used Meta Quest 2 on eBay with two controllers and an Elite head strap) cost me 120 bucks in total, including shipping. At that price the device doesn't have to be very useful to justify the cost. (And I'm using it almost every day.)
That said, the linked article is actually not as bad as I expected.
Posted by: Martin K. | Tuesday, June 10, 2025 at 03:36 PM
> Should we stop using books for teaching
Maybe the real question is, should we stop teaching kids on tablets and other screens (including XR screens), and bring back books:
"A 2024 meta-analysis of 49 studies found that students who read on paper consistently scored higher on comprehension tests than those who read the same material on screens. Researchers call this the 'screen inferiority effect'—meaning that digital reading leads to lower information retention and understanding."
https://oxfordlearning.com/screen-vs-paper-which-one-boosts-reading-comprehension/
Posted by: Wagner James Au | Wednesday, June 11, 2025 at 09:17 AM