I spent a few hours yesterday with the Apple Vision Pro at Apple’s Executive Briefing Center in NYC. The demo was, of course, Apple-directed and crafted with the company’s usual attention to detail, but that doesn’t detract from the value of getting hands-on with this new platform. Interestingly, we didn’t log into our own personal Apple IDs, so what was NOT demonstrated was the tight integration with the rest of Apple’s ecosystem. We also didn’t do any of the “profile setup” to display one’s image on the front glass, nor did we build custom avatars to interact collaboratively in a shared space.
Despite all of this, the experience was mind-blowing. The immersive content is so advanced and of such high fidelity that it’s hard to describe in mere words. I believe this kind of content will be the first “killer app” for the platform. If the NFL (or NBA or MLS, etc.) can place high-fidelity cameras around the field or court, capture the game action in 3D along with the spatial audio of the gameplay itself, and then pipe that back to a viewer at home wearing an AVP, the viewer will feel like they are right there, on the sidelines or courtside. I’m quite sure the NFL is already working on this. Can anyone say Sunday Ticket 2028?
Similarly, the ability to capture high-fidelity video and audio on the iPhones in our pockets and then use the AVP to experience (or re-experience) those moments is… just astonishing. Birthday parties, family events, concerts, conventions, religious worship, sporting events. The list is endless, and none of us can predict all the ways this content will be generated, shared, and relived.
The AVP is not an AR device or a VR device; it’s both and more. There’s a reason Apple avoids both labels, calling the operating system “visionOS” and describing the experience as “spatial computing.” You can keep your physical surroundings fully visible, or dial the immersion up and let the outside world fade in and out and blend with digital content. We saw some of the apps companies are designing: airline flight simulators, 3D models that can be “explored” in VR, “digital twin” models you can interact with for training purposes, and more.
It’s an incredible but imperfect piece of first-generation technology, as all first-gen technology tends to be. It’s too heavy. The external battery pack is unusual but understandably necessary. I’m near-sighted, so I had to use the prescription Zeiss lens inserts I ordered, and even then my brain had a hard time with the AVP’s peripheral vision. I imagine that with more hours of use my brain will adjust, but the constant focusing and refocusing was disorienting, jarring, and annoying.
But we must view this in the historical context of technological advancement. You can’t have a 2nd or 3rd generation of something without a 1st. I’m eager to see this platform evolve. It’s just possible that we’re at the edge of a new paradigm shift in how humans interact with technology.