‘Hey Meta’: New AI Features Come to Meta’s Ray-Bans

April 24, 2025

As Google starts to revive its Google Glass concept, Meta is already a step ahead, with new artificial intelligence features coming to its glasses this summer. The Ray-Ban smart glasses, made in partnership with Meta, are getting several powerful AI updates for users in the US and Canada. 

Using the Meta View app on a connected smartphone, Ray-Ban smart glasses owners will be able to say “Hey Meta, start live AI” to give Meta AI a live view of whatever they’re seeing through their glasses. 

Similar to Google’s Gemini demo, users will be able to ask Meta AI conversational questions about what it sees and get help solving problems. Meta provided the example of Meta AI suggesting substitutes for butter based on what it sees when you look in your pantry. 

Even without live AI, you’ll be able to ask specific questions about objects that you’re looking at.

In addition to new seasonal looks, Ray-Ban’s smart glasses will also be able to use the “Hey Meta, start live translation” command to automatically translate speech in supported languages, including English, French, Italian and Spanish. The glasses’ speakers will translate as other people talk, and you can hold up your phone so the other party can see a translated transcript too. 

Along with these AI upgrades, the smart glasses will be able to post automatically on Instagram or send a message on Messenger with the right voice commands. New compatibility with music streaming services will also allow you to play songs through Apple Music, Amazon Music and Spotify on your glasses in lieu of earbuds.

Meta reports that the rollout of these new features will happen this spring and summer, along with object recognition updates for EU users coming next week. 

Meta and Ray-Ban didn’t immediately respond to a request for further comment. 
