The problems with XR headsets that even Apple can’t fix
Tech companies have been trying to make VR/AR/XR headsets go mainstream for over three decades. Here’s what I’ve discovered in my own research into the technology.
In 2016, I ran an experiment to see how long shifts in VR might affect my perception of time and attention. Over the next few months, I spent 4–8 hours a day in VR, playing as many games as I could. I was given an HTC Vive for the research, and although I found the headset comfortable and fun, I had fewer memories of the experience than I expected. I wasn’t playing around in the real world, but in this virtual space. Because I was interacting with only a few buttons and a two-dimensional screen, there were not a lot of sensory and memory markers for my mind to hang onto.
I later tried the Apple Vision Pro with a full setup of multiple virtual computer monitors and displays. It seemed much more like a practical tool than a gaming system, and I was genuinely curious how valuable it could be for focus and research. Many people’s only experience of VR is a 15-minute demo in a museum or shopping mall, but using it throughout a whole work day felt like a very different experience.
So as Apple, Meta, and other major players continue struggling to make high-end XR head-mounted displays (HMDs) go mass market, it’s important to point out the virtual elephant in the room I couldn’t help noticing during the research:
There’s good reason to suspect that the barrier to mainstream success for XR HMDs is not solely cost. The weight of the headset, lack of content, eye strain, and other addressable factors are usually blamed. But I later discovered findings from neuroscience comparing the sensory spectrum of XR devices with that of the analog world. They suggest that the sensory spectrum of XR can never make up for the benefits of a physically shared environment containing physical computer displays, especially when it comes to knowledge and memory formation.
Full-Time Computing in XR vs. Physical Monitors
My first experiments with XR required standing all day and moving my arms around. I’d be pretty tired by the end of the day. And although it made for a great workout (SUPERHOT was one of my favorite games), it was very different from using the system for everyday tasks. Dramatically moving data around in 3D space, Minority Report-style, makes for compelling cinema, but not for long-term task work.
When the Apple Vision Pro came out, I was excited to use it while sitting down, especially once my friend set me up with a huge working environment. I enjoyed it because it gave me a strong sense of focus when I was learning something new, or really digesting a Wikipedia article.
Still, when I set up my home office a year ago, I chose two large physical monitors at eye height and a sit/stand desk. I enjoyed VR, but I found the lag of setting up and getting situated even more involved than docking a laptop to a physical screen. And while I felt excited to be in a virtual space, I felt a bit more disconnected from the analog one. I wanted to be outside more often than usual, and at the end of the day I didn’t want to stare at a screen anymore.
I also wondered about the differences in eye strain and focus that come from screens right in front of the eyes, instead of at a distance. I prefer to work in a room that has a window with a view, so that I can take a break from my monitors by looking into the distance and then looking back at the screen, especially when I’m thinking about something I just experienced.
The Trouble With Losing Touch
I asked my colleague David Sisson, who holds a Ph.D. in neurophysiology, why XR wasn’t taking off as quickly as the media promised. He pointed out that audio and visual information, the kind conveyed in HMDs, is only part of how we make sense of what we’re experiencing.
Then there’s the limited quality of touch conveyed through force feedback in most XR rigs.
“Without touch, there’s no ‘intimacy’. You’re not really interacting with what’s going on,” David told me. “You can hit a ball — and hear the ‘crack’ in VR — but you’re not feeling anything other than a little jerk in the controller that makes you feel like there’s some inertia happening.”
Headsets and their up-close screens, in other words, deprive our minds of the tactile and situational context that feeds into how we build memories. By contrast, when playing a console game, you’re situated in a physical environment with a screen at a distance. The console’s controls are colorful, physical objects we can see and hold. Playing in a fixed location also helps give our minds muscle memory. If you’re like me, you have vivid childhood memories not only of playing your favorite console game, but also of the couch or carpet you sat on to play it, and the friends and family members you played it with.
The experience of “placeness” seems to extend beyond our sense of touch:
The Neuroscience of Virtual Experience
“[C]hemical senses are not a part of [the VR experience],” said David. “There’s a well-considered idea, a linkage between olfaction, smell, and memory, and you’re leaving that out of the [VR] picture entirely.” (Yes, there are attempts to add virtual smells to VR, but it’s unclear whether they’re an adequate replacement, or how many consumers even want to fill their living room with odd aromas.) The famous moment in French literature when the aroma of a cookie sends the author flying backward into his childhood memories has been validated by recent peer-reviewed studies, which strongly suggest that digital memories formed apart from smell will be less vivid and indelible. Indeed, a recent study suggests that our sense of smell “may facilitate transfer of learning to other sensory domains”.
While XR headsets like the Vision Pro do recreate our home/office environment, and let us see much more screen real estate while coding, they’re less able to offer us “placeness”. Due to our evolution as a species, our brains operate differently when we’re nomadic than when we’ve been physically in a place long enough to consider it home. (Scientists call this the encoding specificity principle, in which our knowledge is associated with the place where we first acquired that information.)
With a headset on, since our minds don’t get oriented to being in a place, we never get settled. In exchange for that sacrifice, Apple offers the ability to interact with a huge computing surface, even while traveling. But the sensory spectrum of XR can never make up for the benefits of having a physical environment with multiple monitors or a projector in an analog world — as much for the neuroscientific benefits of place and full sense experience, as for the practical aspects (the inconvenience of removing / re-wearing a headset, head/eye strain, and so on).
Putting the “Pro” in Vision Pro
To be clear, XR headsets are great for use cases like physical rehabilitation and therapy, which are usually done in shorter bursts, and for times when you truly need to be more embodied, with your hand motion and other parts of your body tracked. A select niche of heavy content creators may also value the Vision Pro’s multiple screens and a headset that eliminates all surrounding distractions. But that is very much not a mass-market audience.
While many XR evangelists may keep touting the Vision Pro as the device to make the technology mainstream, we, and for that matter Apple, should probably keep the “Pro” part of the name in mind, and assume that only a narrow set of use cases, for a select number of professionals, will prove valuable enough to justify becoming fully immersed.
The rest of us probably gain too much from computing within a physical environment, one that engages all five of our senses, to ever go completely virtual.
—
FURTHER READING
Krekhov, A., et al. (2022). Do we still need physical monitors? An evaluation of VR for productivity. In Proceedings of IEEE VR 2022. (Working in VR led to higher task load, eye strain, and lower well-being than a physical environment.)
Biener, V., Kalamkar, S., Nouri, N., Ofek, E., Pahud, M., Dudley, J. J., Hu, J., Kristensson, P. O., Weerasinghe, M., Čopič Pucihar, K., Kljun, M., Streuber, S., & Grubert, J. (2022). Quantifying the Effects of Working in VR for One Week. arXiv preprint arXiv:2206.03189. https://arxiv.org/abs/2206.03189
Pavanatto, L., Davari, S., Badea, C., Stoakley, R., & Bowman, D. A. (2023). Virtual monitors vs. physical monitors: an empirical comparison for productivity work. Frontiers in Virtual Reality, 4. https://doi.org/10.3389/frvir.2023.121582
Mann, S. (1996, November 18–22). ‘Smart clothing’: Wearable multimedia and ‘personal imaging’ to restore the technological balance between people and their intelligent environments. In Proceedings of the ACM Multimedia ’96 (pp. 163–174). ACM. http://wearcam.org/acm-mm96/index.html Accessed 25 Jun 2025.
—
A shorter version of this article was originally published in Fast Company.
—
Amber Case is the founder of the Calm Tech Institute, an effort advocating for design that creates more harmony in people’s lives. Case is a former research fellow at the MIT Center for Civic Media and the Berkman Klein Center for Internet & Society at Harvard University.