A tutorial video reveals how you set up EyeSight and your Persona on Apple Vision Pro.
Vision Pro is set to be the first headset to ship with a display on the front. It shows a rendered view of your upper face to other people in the room when they’re nearby, a feature called EyeSight. When an app is blocking your view of the physical environment, EyeSight shows a translucent colored pattern in front of your virtual eyes, and when you’re in a VR app the pattern becomes opaque.
The EyeSight display goes beyond 2D – it’s lenticular. It shows different views for different viewing angles to achieve a sense of depth, so that onlookers seem to be looking through glass at your eyes rather than at a flat image of them.
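For a rough sense of how that kind of display is typically driven, here’s a minimal Swift sketch of generic lenticular interleaving: each subpixel column is assigned to one of several pre-rendered views based on the lens pitch and slant. The `LenticularLayout` type and all of its numbers are illustrative assumptions – Apple hasn’t published how EyeSight actually renders its views.

```swift
// Illustrative only: the standard lenticular-interleaving idea, not Apple's
// actual EyeSight pipeline (which is unpublished). Assumes N pre-rendered
// views of the eyes and a lens pitch measured in display pixels.
struct LenticularLayout {
    let viewCount: Int    // number of distinct viewing angles rendered
    let lensPitch: Double // width of one lenslet, in pixels
    let slant: Double     // horizontal offset per row for slanted lenslets

    /// Which pre-rendered view the subpixel at (x, y) should sample,
    /// so each eye position in the room sees a consistent image.
    func viewIndex(x: Int, y: Int) -> Int {
        let phase = (Double(x) + slant * Double(y))
            .truncatingRemainder(dividingBy: lensPitch) / lensPitch
        return Int(phase * Double(viewCount)) % viewCount
    }
}

let layout = LenticularLayout(viewCount: 8, lensPitch: 4.0, slant: 0.3)
print(layout.viewIndex(x: 10, y: 2)) // which view feeds this subpixel
```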
To generate a believable view, though, the headset needs to know what you look like: your skin tone, eye shape, eye color, nose shape, and the other details of your upper face. So during setup, you hold the headset in front of your face and its front display guides you through a face scan.
The exact process is revealed in a tutorial video included in visionOS beta 6, spotted by X user M1Astra.
This face scan also generates your Persona, Apple’s realistic virtual avatar used in FaceTime and in third-party apps with social features.
Your Persona is driven in real time by the headset’s eye and face tracking sensors. It currently includes only your upper body, but Apple is reportedly working on full-body tracking for a release sometime after launch.
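As an illustration of the general technique – driving an avatar from per-frame face-tracking coefficients – here’s a minimal Swift sketch using the iPhone-style ARKit face-tracking API (ARFaceAnchor blend shapes) as a stand-in. Apple hasn’t published the Vision Pro’s internal Persona pipeline, and `applyToAvatar` is a hypothetical hook, not a real API.

```swift
// Illustrative sketch: driving an avatar rig from face-tracking blend-shape
// coefficients. Uses iPhone-style ARKit face tracking as a stand-in for the
// (unpublished) Vision Pro Persona pipeline.
import ARKit

final class FaceDrivenAvatar: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // Per-frame coefficients in 0...1, e.g. jaw open, eye blinks, smiles.
        let jawOpen   = face.blendShapes[.jawOpen]?.floatValue ?? 0
        let blinkLeft = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        applyToAvatar(jawOpen: jawOpen, blinkLeft: blinkLeft) // hypothetical hook
    }

    func applyToAvatar(jawOpen: Float, blinkLeft: Float) {
        // Map the coefficients onto the avatar's morph targets / bones here.
    }
}
```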
Meta has been showing off research towards realistic avatars for over four years now, but it looks like Apple will be the first to ship – albeit not at the same quality as Meta’s research. Apple’s front display also enables a polished and intuitive setup experience, whereas headsets without one would have to rely on audio feedback or smartphone scans.