Working in the Apple Vision Pro

Apple Vision Pro launched in the U.S. today, kicking off what Apple calls the beginning of the “spatial computing” era. I have been using the headset, which starts at $3,499, since Tuesday night, and if you follow me on X (formerly Twitter) and Threads, you already know that I’ve been testing the hell out of it.

I have tapped my thumb and index finger together within visionOS to do many things. I have worked in Vision Pro, throwing up a dozen app windows in my living room; I wrote part of my Vision Pro unboxing article in Safari and edited a few of my colleagues’ stories. I have messaged them many times on Slack using voice dictation, the virtual keyboard, and a wireless keyboard and trackpad. I’ve read short blog posts and long features. I’ve used Vision Pro as a giant virtual display for my MacBook Pro.

I have spent hours posting on X, responding to everyone’s Vision Pro questions and replies. I have watched one movie and two TV episodes on the biggest display that I’ve ever had in my own apartment. I have played several “spatial games” and viewed my massive collection of spatial videos, as well as 2D photos, panoramas, and regular videos. I have FaceTimed close friends and family to see their reaction to my “Persona.”

Hell, I’ve even eaten meals while using Vision Pro.