With ‘Hey Meta,’ Ray-Ban Wearers Will Unlock All-New AI Abilities — and Privacy Concerns

April 29, 2025

As Google starts to revive its Google Glass concept, Meta is already a step ahead, with new artificial intelligence functions coming to its glasses this summer. Ray-Ban's smart glasses, made in partnership with Meta, are getting several powerful AI updates for US and Canadian users. 

Using the Meta View app on a connected smartphone, Ray-Ban smart glasses wearers will be able to say "Hey Meta, start live AI" to give Meta AI a live view of whatever they're seeing through their glasses. 

As in Google's Gemini demo, users will be able to ask Meta AI conversational questions about what it sees and how it might solve problems. Meta's example: Meta AI suggesting possible substitutes for butter based on what it sees when you look in your pantry. 

Even without live AI, you’ll be able to ask specific questions about objects that you’re looking at.

In addition to new seasonal looks, Ray-Ban's smart glasses will also respond to the "Hey Meta, start live translation" command to automatically translate speech in English, French, Italian and Spanish. The glasses' speakers will translate as other people talk, and you can hold up your phone so the other party can see a translated transcript too. 

Meta AI and concerns about being filmed

Meta AI glasses at CES 2025. James Martin/CNET

When I reached out to Inna Tokarev Sela, CEO and founder of AI data company illumex, about privacy issues with smart glasses like these, she said that in her own experience with Ray-Ban smart glasses, people usually reacted when they noticed the recording indicator light, which signaled the glasses were watching. That can make some people uneasy, whether they're concerned about being filmed by a stranger or about what Meta may be doing with all the visual data it's collecting.

“In the new models you can control the notification light, which could pose a privacy risk,” Sela said. “But everyone films everyone all the time anyway at touristy landmarks, public events, etc. What I expect is that Meta will not divulge any information on anyone, unless they register and explicitly give their consent.”

This could lead to other consent headaches too, depending on whether users are recording for other purposes. "For example, users should be able to opt in and choose the type of information to expose when they're in someone's frame — similar to LinkedIn, for example," Sela said. "Of course, any recording resulting from the glasses should not be admissible to use in a court of law, as with any other kind of recording, without explicit permission."

Additional updates and rollout schedules

Along with the AI upgrades, Ray-Ban's smart glasses will be able to post automatically to Instagram or send a message on Messenger with the right voice commands. New compatibility with music streaming services will also let you play songs through Amazon Music, Apple Music and Spotify on your glasses in lieu of earbuds.

Meta says these new features will roll out this spring and summer, with object recognition updates for EU users arriving in late April and early May. 

Meta and Ray-Ban didn’t immediately respond to a request for further comment. 
