Augmented World Expo in Long Beach is the first conference I’ve attended fully equipped with Meta Ray-Ban glasses and Apple Vision Pro as part of my reporting toolkit.
I made plans to attend Augmented World Expo this year in Long Beach without a schedule in mind. There were people here I wanted to see and talk to in the physical world. I took many of those meetings off the record or on background, with the discussions informing my reporting long-term. Some of the ones on the record, though, flowed directly from the Meta glasses into an iPhone and then back onto Apple Vision Pro for composition in the Safari Web browser. I’m typing these words on an iClever pocket keyboard, looking at a gigantic screen that seems to stretch across my hotel room’s ceiling.
I’ve attended various technology conferences as a reporter for more than 15 years and, now with Meta glasses and an Apple headset, this is precisely the way I would want to cover any of them going forward. When I started down this career path, I scribbled notes by hand into a paper notebook only really decipherable by me, and that capture process meant coming back to a desktop to turn those notes into a coherent article in a Content Management System (CMS), where, typically, a photographer separately turned in a set of photos on the same subject.
Here in 2024, I’m recording a conversation face-to-face on my sunglasses, then taking it back to the media room, where I turn a dial on my VR headset for complete solitude, block out the noise with AirPods, set the keyboard on the table, and focus everything on composing words, images, and video into a Safari window until the task is done. You can view my first attempts at this workflow in action in my articles on Sony’s headset, Ultraleap, and Doublepoint.
Conferences like these are opportunities for ideas to mix and partnerships to form. I went face-to-face with Palmer Luckey, Amanda Watson, Darshan Shankar, Nima Zeighami, Blair Renaud, Ashley Huffman, Bernie Yee, Tipatat Chennavasin, Aidan Wolf, Sonya Haskins, and many more.
As I typed this, T-Mobile texted to tell me I’d used 48 of my 50 GB of data for the month. I’m attempting to upload roughly 30 minutes of spatial video to iCloud showcasing the entire keynote discussion between Shankar and Luckey, recorded on my Vision Pro as I watched from the second row. I AirDropped the videos to my iPhone because T-Mobile had already throttled my tethering data earlier in the day.
And as I type this, with the water lapping the beach in Bora Bora, I’ve been warned that I’ve reached 20% battery on the Apple Vision Pro. I traveled as light as I could on this trip, with three high-output Anker batteries and several MagSafe ones to snap onto my iPhone.
There’s much more to explore in the workflows from glasses to headsets. For now, I’m using the last few percentage points of battery on the Vision Pro tonight to note that photos and videos illustrating my text will be added to this article after all my hardware, and my head, have had a chance to rest and charge up again.
I returned to AWE on Thursday and called Don Hopper on WhatsApp from my iPhone. I switched the call to the view from my Meta glasses with a double tap of the capture button and gave him a tour of the venue on whatever bandwidth T-Mobile would let through. I walked down to the expo show floor with the Apple Vision Pro in one hand and my suitcase rolling behind me in the other.
The glasses disconnected from my phone in the middle of it all, I assume from Bluetooth interference, and it was time anyway to try phase two of this little experiment. On the expo show floor, I put on the Vision Pro and continued the same WhatsApp call via iPhone Mirroring, seeing Don on his computer in Missouri from a VR headset in the middle of Long Beach, though this experience, too, seemed to strain under the overwhelming wireless interference.
I’m typing this on a laptop connected to hotel Wi-Fi in another part of Los Angeles, and it’s doing me no more favors uploading videos and photos from this event than T-Mobile did in Long Beach.