Meta’s Revamped Ray-Ban Gen 2 Glasses Are Worth It Just for the Battery Boost
September 29, 2025
I stared at a flower outside my hotel near Meta’s campus and asked my new Ray-Ban Meta Gen 2 glasses to identify the species. I got multiple answers. Each time I asked if Meta was sure about that flower, its response changed. Eventually, the AI embedded in the glasses admitted that, yes, it was being unreliable. On the plus side, at least I don’t have to worry as much about battery life now.
Smart glasses are better than they’ve ever been, thanks to Meta. They’re not perfect, not by a long shot, but I don’t expect perfection. I just want a pair of smart glasses that’ll last most of the day before needing a recharge. At $379, the second-gen Ray-Ban Meta glasses are my go-to choice, and a clear upgrade over the still-available $299 first-gen model. Double the battery life is more than worth it, and it has a massive impact on how functional these glasses feel.
Note that Meta’s Ray-Bans and Oakley HSTN glasses — which share the same camera and battery features and differ only in design — officially support prescriptions only within the +6 to -6 range. I’m a -8/-8.5, but have custom lenses in them. You can get custom lenses via third-party suppliers, but they’re not authorized by Meta.
Same design, much better battery
My adventure testing the new Ray-Bans began right when the new model was announced at Meta’s campus last week. I chose an identical Wayfarer frame to make sure my lenses would be compatible. In size and shape, the new versions look identical. The leather-like charging case hasn’t changed either.
What’s changed at the higher $379 price is battery life and camera quality, with battery life being the biggest improvement. My 2-year-old Ray-Bans barely lasted a few hours on a charge, but the new models run anywhere from 4 to 12 hours, depending on use.
One day at Meta, the battery lasted from 8 a.m. to nearly 9 p.m. with occasional AI prompts, photos, videos, some music and phone calls. Another day, on a nonstop run to the airport with music and podcasts playing, it lasted from 9 a.m. breakfast to my 1 p.m. flight. Results varied day to day, but I’m no longer in the same battery-life panic with my glasses that I used to be.
If you’re doing some of Meta’s more intensive tasks, like Live AI, the battery drains a lot quicker. Live AI is a mode that keeps the camera on continuously so the AI can analyze or translate things on the fly. The previous glasses lasted only 30 minutes in this mode, but the second-gen version lasted one hour and 20 minutes in my at-home test.
Even in casual use, I find the battery running out by late in the day, leaving me with dead glasses or a forced recharge. The new Ray-Bans fast-charge to 50% battery in 20 minutes. I tried meditating one evening while they charged back up; you could just as easily take a nap or rest your eyes — or bring a spare pair of glasses.
None of this is ideal. Smart glasses should last a full day, like a smartwatch or a phone. Recharging means taking the glasses off, and Meta still doesn’t offer swappable batteries or a magnetic cable for charging while you wear them. Instead, that case remains the charging option — though it does have passthrough USB-C and its own battery, adding up to 48 extra hours of use.
Audio is great unless you’re in noisy areas
I’m still impressed by the Ray-Bans when it comes to listening to music and making phone calls. The tiny speakers embedded in the frame sound open, natural, and surprisingly loud. The built-in array of five microphones — the same as before — is fantastic for phone calls; no one ever realizes I’m speaking from glasses. Voices and podcasts, in particular, come through sharp and clear.
And yet, even with an automatic volume-adjusting mode for noisier environments, there’s only so much open-air speakers and mics can handle. Noise-canceling earbuds easily outperform these glasses in public or on a plane, but there’s serious convenience in not having to fish out earbuds.
The physical controls remain the same: You can use voice or the touchpad on the right arm to play music or podcasts or take calls, but I find I trigger that touchpad too easily sometimes. Still, it feels both magical and strange to wander around with my own personal ambient soundtrack and no visible earbuds, even if my wife and kids can hear the music a bit, too — there’s some audio bleed since the design is open-ear.
Camera: 3K video and stabilization, with slo-mo mode to come
As usual, I’ve been taking a lot of photos on the new Ray-Bans. I often use them as tiny snapshots for my memory. What did that menu say? What did those jars of jam have in them? Where did I park?
Slow-motion and “hyperlapse” timelapse modes will come to the second-gen Ray-Ban Meta and Oakley Meta glasses sometime this year. For now, the advantage is 3K video recording (2,203×2,938 pixels in portrait mode) at 30 frames per second, along with video stabilization.
(Video can also be shot at 1080p, though you’ll need to switch to the higher-res option in the Meta AI app’s glasses settings.) Photos appear unchanged, as far as I can tell, using the same ultrawide 12-megapixel camera.
Better video is a welcome upgrade, but a few key features I’ve wanted aren’t here yet. The glasses can only capture wide-angle and portrait mode (vertical) photos and video.
Unlike the new iPhones, these glasses don’t use square sensors on the front-facing cameras, which would allow both landscape and portrait shots. I’d love that feature for sharing on YouTube, CNET or other places where vertical video feels awkward.
There’s no zoom for photos either. And because I can’t see what I’m capturing, every shot feels like a leap of faith. With the camera only on the left side of the glasses, getting the aim right is tricky.
Meta’s sports-focused wraparound Oakley Vanguard visors coming in October center the camera, but these Ray-Bans (and the existing Oakley HSTN ones) still don’t.
And Meta still hasn’t dabbled in dual-camera 3D photo and video recording, which is surprising because the Quest would be a perfect place to view that content.
In fact, the Ray-Bans still don’t connect with Quest headsets at all, apart from sharing the common Facebook, Instagram and WhatsApp apps.
Meta glasses AI is a work in progress
The Trojan horse of these glasses, and most smart glasses now, is their promise to be wearable AI vessels. The idea is to let AI access your eyes and ears via the camera and microphones to try to help you interact with the world.
Meta calls that long-term vision “contextual AI,” and right now, it still needs a lot of work.
While these glasses can describe your surroundings or offer supposedly helpful commentary by snapping a photo and analyzing it, the range of responses is unpredictable. Sometimes Meta is accurate; other times it just makes things up. Most days, I find myself having existential arguments with the on-glasses AI voice of Judi Dench (one of several voices you can choose from) about things like the stuffed animals my son is holding up on the sofa.
Meta’s glasses also have some wonderfully interesting and even helpful assistive elements. They can describe what’s in front of you by snapping a photo. There’s also a Live AI mode that continuously uses the glasses’ video feed, but it drains the battery more quickly.
They can read a page of a book right in front of you or translate text into another supported language — currently French, Italian, German, Spanish and Portuguese. Plus, they can do live translation, much like Apple’s AirPods Pro and Google’s Translate app.
I know people who use the glasses’ AI vision features to help with vision impairment, and Meta also partners with Be My Eyes, a volunteer service that can access your glasses’ camera feed and audio to assist you remotely. There’s also a more detailed AI mode for vision impairment that provides richer descriptions to aid with navigation. But the glasses sometimes fail at their task, overgeneralize or misunderstand — and Meta itself warns about inaccuracies in the fine print.
Later this year, Meta is rolling out a fascinating “conversation focus” feature for the glasses, designed to tune out other voices in a room and zero in on whoever you’re looking at using the beam-forming microphones. For now, though, I still find the glasses mostly unaware of what I’m doing. I can ask for a photo to be snapped and analyzed, or restart Live AI, but that’s about it.
Meta needs more AI hooks to other apps
Another issue is that the glasses don’t work with many other apps. Meta AI can hook into Apple Music, Amazon Music, Spotify and iHeartRadio to play music, or use Shazam. You can also receive phone calls and texts, manage Google Calendar appointments, and handle video calls and messages with WhatsApp, Facebook, Instagram and Facebook Messenger — but that’s it for now. All the other functions and apps on your phone are inaccessible. I can’t search for a file, send an email or check an iMessage, for example.
It still reminds me of the early days of smartwatches, before Google and Apple developed wrist wearables that (mostly) mirror what’s on your phone. The Ray-Bans are semi-firewalled off from your phone, and can only access the limited connections available through the app.
Compounding that, Meta AI is the only AI service on the glasses — no OpenAI, no Siri, no Gemini. Meta AI is far from perfect, and a year on I still find it a mixed bag when it comes to accuracy and usefulness.
Getting notifications is also an awkward process. The glasses announce messages via audio, which can be extremely distracting during a regular day. There’s no subtler way to indicate messages, as far as I’ve seen.
The ones to get if you’re interested in smart glasses now
Meta’s glasses, for all their unfinished pieces, are still the best on the market by far. The improved battery life this time around is a big step up, and I’ll definitely be wearing these more often. I’m not the sporty type, but if you are, it’s worth noting that Meta’s Oakley HSTN glasses offer similar battery life to these second-gen Ray-Bans.
I’d get these over the Ray-Ban Displays, which I haven’t even reviewed yet, just because they’re more affordable and simply functional. The Displays have a new interface and emerging tech that could take a year or more to really develop. But the second-gen Ray-Bans are excellent now.
Excellent, but not perfect. Google is coming out with its own AI camera and audio glasses soon, maybe as early as 2026, through partnerships with Warby Parker and other eyewear makers. Google’s glasses should connect to a wider range of Google apps and services, although the details are still unclear. Other companies are entering this space, too.
At least these Ray-Bans still don’t cost an arm and a leg, and they’re going to improve over time. Do you want Meta on your face? That’s the other big question, especially when it comes to AI and data privacy and Meta’s own policies on AI and content moderation. You’re in Meta’s world with these Ray-Bans, but it’s not intruding too hard on yours. For now, at least.