Meta Unveils Three New Features for Ray-Ban Smart Glasses: Live AI, Live Translations, and Shazam
New Features for Early Access Program Members
Meta has announced the rollout of three new features for its Ray-Ban smart glasses: live AI, live translations, and Shazam. Live AI and live translation are currently limited to members of Meta’s Early Access Program, while Shazam support is available to all users in the US and Canada.
Live AI and Live Translation
Live AI and live translation were first teased at Meta Connect 2024 earlier this year. Live AI allows users to naturally converse with Meta’s AI assistant while it continuously views their surroundings. For example, if a user is perusing the produce section at a grocery store, they can ask Meta’s AI to suggest some recipes based on the ingredients they’re looking at. Meta says users will be able to use the live AI feature for roughly 30 minutes at a time on a full charge.
Live translation, on the other hand, allows the glasses to translate speech in real time between English and Spanish, French, or Italian. Users can choose to either hear translations through the glasses themselves or view transcripts on their phone. However, users must download language pairs beforehand and specify which language they speak and which their conversation partner speaks.
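For readers who want a concrete mental model of that workflow, here is a minimal, purely hypothetical sketch in Python. None of these names (TranslationSession, LanguagePair, download_pair, start) come from Meta’s software; they only illustrate the ordering the feature imposes: download a language pair ahead of time, declare which language each person speaks, then choose audio or transcript output.

```python
from dataclasses import dataclass

# Hypothetical illustration only -- these names do NOT reflect Meta's
# actual software. They model the workflow described above.

SUPPORTED = {"en", "es", "fr", "it"}  # English, Spanish, French, Italian


@dataclass
class LanguagePair:
    my_language: str       # what the wearer speaks
    partner_language: str  # what the conversation partner speaks


class TranslationSession:
    def __init__(self) -> None:
        self.downloaded: set[tuple[str, str]] = set()

    def download_pair(self, pair: LanguagePair) -> None:
        """Fetch the offline model for one direction of translation."""
        if pair.my_language not in SUPPORTED or pair.partner_language not in SUPPORTED:
            raise ValueError("unsupported language")
        self.downloaded.add((pair.my_language, pair.partner_language))

    def start(self, pair: LanguagePair, output: str = "audio") -> str:
        """Begin translating; fails if the pair wasn't downloaded first."""
        if (pair.my_language, pair.partner_language) not in self.downloaded:
            raise RuntimeError("download the language pair before starting")
        # Output is either spoken through the glasses ("audio") or
        # shown as a transcript on the paired phone ("transcript").
        return f"translating {pair.partner_language} -> {pair.my_language} via {output}"


session = TranslationSession()
pair = LanguagePair(my_language="en", partner_language="es")
session.download_pair(pair)
print(session.start(pair, output="transcript"))
```

In the real product, pairing and downloads happen through the Meta View app; the sketch only mirrors the constraint that a language pair must be downloaded before a translation session can begin.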
Shazam Support
Shazam support is a more straightforward feature. Users simply prompt Meta AI when they hear a song, and it should be able to identify it. Meta CEO Mark Zuckerberg demonstrated the feature in an Instagram reel.
Software Requirements
If users don’t see the features yet, they should check that their glasses are running the v11 software and that the Meta View app is on v196. Users who aren’t already in the Early Access Program can apply via Meta’s website.
The AI-Powered Future of Smart Glasses
The updates come as Big Tech is pushing AI assistants as the raison d’être for smart glasses. Just last week, Google announced Android XR, a new OS for smart glasses, and specifically positioned its Gemini AI assistant as the killer app. Meanwhile, Meta CTO Andrew Bosworth recently blogged that "2024 was the year AI glasses hit their stride." In the blog, Bosworth also asserted that smart glasses may be the best possible form factor for a "truly AI-native device" and the first hardware category to be "completely defined by AI from the beginning."
Conclusion
Meta’s new features for its Ray-Ban smart glasses aim to make the devices more convenient and connected. With live AI, live translation, and Shazam support, users can expect a more seamless experience when interacting with their surroundings.
FAQs
Q: What are the new features announced by Meta for its Ray-Ban smart glasses?
A: Live AI, live translation, and Shazam support.
Q: Which users can access the new features?
A: Live AI and live translation are limited to members of Meta’s Early Access Program; Shazam support is available to all users in the US and Canada.
Q: What is live AI, and how does it work?
A: Live AI allows users to naturally converse with Meta’s AI assistant while it continuously views their surroundings.
Q: What is live translation, and how does it work?
A: Live translation allows the glasses to translate speech in real time between English and Spanish, French, or Italian.
Q: How do I get the new features?
A: Check that your glasses are running the v11 software and that the Meta View app is on v196. If you’re not already in the Early Access Program, you can apply via Meta’s website.