Ray-Ban Meta glasses now have an early access program for multimodal AI, letting you ask it questions about what you’re looking at.

The Ray-Ban Meta smart glasses launched at Connect 2023 as the successor to Ray-Ban Stories, the camera glasses from 2021 that let you capture hands-free first-person photos and videos, take phone calls, and listen to music. Compared to the original model, the new glasses have improved camera quality, a superior microphone array, water resistance, and the ability to livestream to Instagram.

But their most significant new feature is Meta AI. Currently available only in the US, it's a conversational assistant you can talk to by saying "Hey Meta." It's far more capable than the current Alexa, Siri, or Google Assistant because it's powered by Meta's Llama 2 large language model, the same kind of technology that powers ChatGPT.

Examples from Meta of questions you can ask with the multimodal capability.

In the new early access program, Meta AI will no longer be limited to speech input. It can now also answer queries about what you’re looking at or the last photo you took.

This multimodal capability, called Look and ask, has many potential use cases, such as answering questions about what you’re cooking, suggesting a caption for a photo, or even translating a poster or sign in another language.

The Look and ask early access program is available to a "limited number" of glasses owners in the US and should roll out more widely sometime next year.

Example of Meta AI translating a poster.

Another update arriving in Meta AI is real-time search. The system will automatically decide when to search the web via Bing to answer questions about current events, sports scores, and more. Real-time search will be "rolling out in phases" to all US-based customers.

Meanwhile, there’s still no official timeline for bringing even the audio-only Meta AI to any of the other 14 countries where the glasses are sold.
