Meta’s AI for Ray-Ban smart glasses can identify objects and translate languages
Meta is finally going to let people try its splashiest AI features for the Meta Ray-Ban smart glasses, though only in an early access test to start. Meta announced that it's beginning to roll out multimodal AI features that let its AI assistant answer questions about what it can see and hear through the glasses' camera and microphones.
Mark Zuckerberg demonstrated the update in an Instagram reel where he asked the glasses to suggest pants that would match a shirt he was holding. It responded by describing the shirt and offering a couple of suggestions for pants that might complement it. He also had the glasses’ AI assistant translate text and show off a couple of image captions.
Zuckerberg said people would "talk to the Meta AI assistant throughout the day about different questions you have." He further suggested that it could answer questions about what wearers are looking at or where they are.
The AI assistant also accurately described a lit-up, California-shaped wall sculpture in a video from CTO Andrew Bosworth. He explained some of the other features, such as asking the assistant to help caption photos you've taken, or to translate and summarize text, which are fairly common AI features already seen in products from Microsoft and Google.
The test period will be limited in the US to "a small number of people who opt in," Bosworth said.