At Apple’s WWDC23, I think I saw the future. [Pausing to ponder.] Yes, I’m pretty sure I’ve seen the future – or at least Apple’s vision of the future of computing. Tuesday morning I got to try out the Apple Vision Pro, the new $3,499 mixed reality headset announced this week and shipping next year.
I’m here to tell you the main details of my experience, but my overall impression is that the Vision Pro is the most impressive first-gen product I’ve seen from Apple – more impressive than the 1998 iMac or the 2007 iPhone. And I’m fully aware that other companies have made VR headsets, but Apple did what Apple does: it applied its understanding of what makes for a satisfying user experience to a new product in an existing market, and in doing so set a higher bar of excellence.
Yes, it’s expensive, and yes, this market hasn’t proven it can go beyond a niche. Those are important considerations to be discussed in other articles. For now, I’ll share my experiences and impressions here, based on an hour-long demo at Apple Park. (I was not allowed to take any photos or record videos; the photos posted here were provided by Apple.) The device I used is an early beta, so it’s possible, even likely, that the hardware or software will change before it ships next year.
An emotional high, thanks to the screens
My Apple Vision Pro demo covered a lot of ground, but the demonstrations of spatial photos and videos and of immersive experiences are what impressed me most about what the headset can do. With the spatial media that Apple demonstrated, I was placed right in the middle of a recorded memory, and it triggered my own memories and emotions from similar moments. How will it feel to watch my own spatial media moments? Incredible, I imagine.
The immersive video gave me chills. My body reacted to situations and my mind reacted to sights and sounds. In one demo, I got to pet a fucking dinosaur. Not a beast that looked like a 3D model against an illustrated background, but a realistic-looking dinosaur that sniffed my hand and let me stroke it. And it sent shivers down my spine.
Now, that feeling of immersion isn’t new to the world of VR headsets – it’s the core of the product category in general. But the difference with the Apple Vision Pro lies in the two screens placed in front of each eye. The resolution and color they display are fantastic and make things look realistic. They aren’t perfect: I sometimes noticed pixelation, and I often saw stuttery playback – not in the demo videos, but in the real-time passthrough view of the people in the room with me.
Apple’s demo videos during the keynote and on its website give the impression that wearers never see the edge of the headset surrounding the view. You do see it, although it doesn’t interfere with the feeling of immersion.
Usability and wearability
When I first saw the home screen, my instinct was to use my finger to tap an icon, like you would on an iPhone or iPad. But instead, you look at what you want to use, then use hand gestures to perform an action. At first I had a hard time adapting to this method, but after 20 minutes it felt natural.
It helps that you don’t have to reach out and gesture like an orchestra conductor. I was able to keep my arms comfortably by my side while I sat on a couch and navigated the UI, and the eye tracking was accurate and didn’t strain my eyes. My experience with the operating system was limited; I didn’t get a chance to use the on-screen keyboard or any Bluetooth input devices. I also didn’t use the Apple Vision Pro as a Mac display, a major part of the keynote presentation.
Before my demo, I was worried that the headset wouldn’t fit my head, which is a little big (my hat size is between 7½ and 7¾). But when I put on the Apple Vision Pro for the first time, I had to tighten – not loosen – the straps to get a good fit. In my demo, the headset had a Velcro strap that goes across the top of your head – this strap isn’t shown in any of Apple’s product photos or videos. I suspect that after taking my head measurements, Apple determined I would benefit from this top strap.
Apple uses what it calls a Light Seal to close the gap between your face and the headset. It blocks out the light in the room, and for me it fit nicely at the top, where my eyebrows are. But underneath, there was a visible gap between my nose and the headset. When asked how Light Seal sizing works, Apple said it doesn’t use a simple small, medium, and large sizing scheme; the Light Seal comes in a multitude of shapes and sizes – after all, faces come in a multitude of shapes and sizes. Had I been in a retail situation, I could have asked Apple to adjust the fit.
After about an hour my demo was complete and I took the headset off. I didn’t feel any fatigue in my neck or any tenderness where the headset and straps squeezed my head. I felt like I could have gone on longer with the session, and I would have if Apple had let me. I left the demo feeling like I had a chance to see something truly innovative that could really change the way we use computers in the future, if Apple plays its cards right and the market embraces the Apple Vision Pro. And I think it has a chance to do so.
Our Apple Vision Pro guide has everything you need to know about the mixed reality headset. And don’t forget to check out the other WWDC23 announcements.