On Vision Pro, only Apple’s software can directly capture the real world.

Apple touted capturing spatial photos and videos as a selling point for Vision Pro, but in a WWDC23 developer talk, an Apple engineer confirmed that third-party developers don’t get access to the camera feeds, citing privacy concerns.

So what happens if you try to run an iPad app that leverages device cameras?

For the selfie camera, visionOS will return a virtual webcam view of you as your Persona, Apple’s name for its realistic face-tracked avatars. This means you’ll be able to use iPad video calling apps like Zoom with no specific developer integration needed.

Your Persona will be shown to iPhone/iPad apps requesting the selfie cam.

For the rear camera though, visionOS will return a black feed with a “no camera” icon in the center. This ensures the app works without crashing, but obviously renders any in-app photography experiences useless. It also prevents developers from building their own custom computer vision solutions.

To be clear, developers can build apps that use real-world passthrough as the background, and the headset’s APIs tell them about surfaces and furniture in the user’s space. But they won’t get access to the actual camera imagery as they do on iPhone and iPad.
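To illustrate the distinction, here is a minimal sketch of the kind of scene understanding visionOS does expose, using ARKit’s `ARKitSession` and `PlaneDetectionProvider`. The function name is hypothetical, and the code assumes it runs inside an immersive space with world-sensing permission granted; the system streams detected surface geometry and classifications, never camera pixels:

```swift
import ARKit  // the visionOS ARKit framework

/// Sketch: receive detected surfaces (tables, walls, floors)
/// without ever seeing the camera images themselves.
func trackSurfaces() async {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    do {
        // Requires an immersive space and user permission for world sensing.
        try await session.run([planes])
    } catch {
        print("Plane detection unavailable: \(error)")
        return
    }

    // Each update carries anchor geometry and a semantic classification
    // (e.g. table, wall) — position and extent, but no imagery.
    for await update in planes.anchorUpdates {
        let anchor = update.anchor
        print("Plane \(anchor.id): \(anchor.classification)")
    }
}
```

This is the trade Apple is making: apps can reason about the shape and layout of the room, while the raw video stays inside the OS.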

Meta likewise disallows raw camera access on its Quest headsets, also citing privacy concerns, and HTC blocks it on the Vive XR Elite. HTC’s headset does offer built-in ArUco fiducial marker tracking, though.

ByteDance doesn’t allow raw camera access on the consumer Pico 4, but the Pico 4 Enterprise sold to registered businesses does allow it.

AR/VR headsets are a new category in wider culture, and companies like Apple and Meta are likely trying to avoid the negative publicity that would follow if an app exploited raw camera access for malicious purposes. As headsets become more widely adopted and accepted in society, though, this may change, just as iPhone developers didn’t get raw camera access until iOS 4 in 2010.