- AI capabilities include translation and identification of objects, animals, and landmarks.
- Users can activate the smart assistant by saying “Hey Meta”.
- The AI features are currently available through an early access waitlist for users in the US.
- Meta plans to refine the features over time, addressing occasional struggles with tasks such as identifying distant zoo animals or recognizing exotic fruits.
Meta will bring AI capabilities to its Ray-Ban smart glasses, enabling features such as object identification and translation. The smart assistant, activated by saying “Hey Meta,” struggled with some tasks during testing, but the company plans to improve the features over time.
AI capabilities and language support
Meta plans to integrate AI into its Ray-Ban smart glasses starting next month, as reported by The New York Times. These AI capabilities, including translation and identification of objects, animals, and landmarks, have been in testing since last December.
Users can activate the glasses’ smart assistant by saying “Hey Meta” followed by a prompt or question; the assistant responds through speakers integrated into the frames. For translation, the glasses support English, Spanish, Italian, French, and German. The NYT tested the glasses’ performance during activities such as grocery shopping, driving, visiting museums, and exploring a zoo.
Early access and future improvements
While Meta’s AI successfully identified pets and artwork, it wasn’t flawless. The glasses struggled to identify distant zoo animals behind cages and failed to recognize an exotic fruit called a cherimoya after multiple attempts.
Meta is expected to refine these features over time. Currently, the AI features are accessible via an early access waitlist for users in the US.