Meta Ray-Ban Display review: Chunky frames with impressive abilities

Oct 17, 2025 - 21:00

I've been wearing the $800 Meta Ray-Ban Display glasses daily for ten days and I'm still a bit conflicted. On one hand, I'm still not entirely comfortable with how they look. I've worn them on the bus, at the office, on walks around my neighborhood and during hangouts with friends. Each time, I'm very aware that I probably look a bit strange.

On the other hand, there's a lot I really like about using these glasses. The built-in display has helped me look at my phone less throughout the day. The neural band feels more innovative than any wrist-based device I've tried. Together, it feels like a significant milestone for smart glasses overall. But it's also very much a first-generation device with some issues that still need to be worked out.

Chunky statement glasses or hideously nerdy?

To once again state the obvious: The frames are extremely chunky and too wide for my face. The dark black frames I tried for this review unfortunately accentuate the extra thickness. I won't pretend it's my best look and I did feel a bit self-conscious at times wearing these in public. Meta also makes a light brown "sand" color that I tried at the Connect event, and I think that color is a bit more flattering, even if the frames are just as oversized. (Sidenote: Smart glasses companies, please, please make your frames available in something other than black!) 

But everyone has a different face shape, skin tone and general ability to "pull off" what one of my friends charitably described as "chunky statement glasses." What looks not-great on my face may look good on someone else. I really wish Meta could have squeezed this tech into slightly smaller frames, but I did get more used to the look the more I wore them. Overall, I do think the size is a reasonable tradeoff for a first-generation product that's pretty clearly aimed at early adopters.

Here's how they look in the lighter "sand" color.
Karissa Bell for Engadget

The reason the glasses are so thick compared with Meta's other frames is that there are a lot of extra components needed to power the display, including a mini projector and waveguide. And, at 69 grams, the display glasses are noticeably heavier than Meta's other smart glasses. I didn't find them particularly uncomfortable at first, but there is a noticeable pressure after six or seven hours of wear. The extra weight and width also made them consistently slide down my nose. I'm not sure I'd feel comfortable wearing these on a bike ride or a jog, as I'd worry about them falling off.

While I tested these, I was very interested to get reactions from friends and family. I didn't get many positive comments about how they looked on my face, though a few particularly generous colleagues assured me I was "pulling them off." But seeing people's reactions as soon as the display activated was another matter. Almost everyone had the same initial reaction: "whoa."

Quality display with some limitations

As I discussed in my initial impressions, these glasses have a monocular display on the right side, so they don't offer the kind of immersive AR I experienced with the Orion prototype last year. You have to look slightly up and to the right to focus on the full-color display. It's impressively bright and clear, but it doesn't overtake your vision.

At 20 degrees, the field of view is small, but it never felt like a limitation. Because the content you see isn't meant to be immersive, it never feels like what's on the display is being cut off or like you have to adjust where you're looking to properly see it. The display itself has three main menus: an app launcher; a kind of home screen where you can access Meta AI and view notifications; and a settings page for adjusting brightness, volume and other preferences.

Like Meta's other glasses, there's an LED that lights up when the camera is in use.
Karissa Bell for Engadget

For now, there are only a handful of Meta-created "apps" available. You can check your Instagram, WhatsApp and Messenger inboxes and chat with Meta AI. There's also a simple maps app for walking navigation, a music/audio player, a camera and live translation and captioning features, plus a mini puzzle game called "Hypertrail."

One of my favorite integrations was the ability to check Instagram DMs. Not only can you quickly read and respond to messages, you can also watch Reels sent by your friends. While the video quality isn't as high as what you'd see on your phone, there's something very cool about quickly watching a clip without having to pull out your phone. Meta is also working on a standalone Reels experience that I'm very much looking forward to.

I also enjoyed being able to view media sent in my family group chats on WhatsApp. I would often end up revisiting the photos or videos once I pulled out my phone, but being able to instantly see these messages as they came in tickled whatever part of my brain responds to instant gratification.

There's some impressive tech inside those thick frames.
Karissa Bell for Engadget

The display also solves one of my biggest complaints with Meta's other smart glasses: that it's really difficult to frame photos. When you open the camera app on the display model, you can see a preview of the photo and even use a gesture to zoom in to properly frame your shot. Similarly, if you're on a WhatsApp video call, you can see the other person's video as well as a small preview of your own, like you would on your phone's screen. It's a cool trick, but the small display felt too cramped for a proper video call. People I used this with also told me that my video feed had some quality issues despite being on Wi-Fi.

The glasses' live captioning and translation features are probably the best examples of Meta bringing its existing AI features to the display. I've written before about how Meta AI's translation abilities are one of my favorite features of the Ray-Ban smart glasses. Live translation on the display is even better, because it delivers a real-time text feed of what the person in front of you is saying. I tried it out with my husband, a native Spanish speaker, and it felt even more natural than the non-display glasses because I didn't have to pause and wait for the audio to relay what he was saying. The translation still wasn't perfect, and there were a few occasions when it didn't catch everything he said, but it made the process much simpler overall.

Likewise, the live captions feature transcribes conversations in real time into a similar text feed. I've found that it's a cool way to demo these glasses' capabilities, but I haven't yet found an occasion to use it outside of a demo. Still, I think it could be useful as an accessibility aid for anyone who has trouble hearing or processing audio.

Another feature that's useful for travel is walking navigation. Dictate an address or location (you can say something like "take me to the closest Starbucks") and the glasses' display will guide you on your route. The first time I tried this was the roughly 10-minute walk from my bus stop to Yahoo's San Francisco office. The route only required two turns, but it didn't quite work. My glasses confidently navigated me to an alleyway behind the office building rather than the entrance. These kinds of mishaps happen with lots of mapping tools — Meta's maps rely on data from OpenStreetMap and Overture — but it was a good reminder that it's still early days for this product. 

I don't use Meta AI a ton on any of my smart glasses, but having a bit of visual feedback for these interactions was a nice change. I retain information much better from reading than listening, so seeing text-based responses to my queries felt a lot more helpful. It's also nice that, for longer responses from the assistant, you can stop the audio playback and swipe through informational cards instead.

Meta AI on the glasses' display delivers information in a card-like interface.
Meta

While cooking dinner one night, I asked for a quick recipe for teriyaki salmon and Meta AI put what seemed like a passable recipe on the display. The only drawback is that the display goes to sleep pretty quickly unless you continue to interact with the content you're seeing, so the recipe I liked disappeared before I could actually attempt it. (You can view your Meta AI history in the Meta AI app if you really want to revisit something.)

My main complaint is that I want to be able to do much more with the display. Messaging app integrations are nice, but I wish the display worked with more of the apps on my phone. When it worked best, I was happy to be able to view and dismiss messaging notifications without having to touch my phone; I just wish it worked with all my phone's notifications. 

There are also some frustrating limitations on sending and receiving texts. For example, there's no simple way to take a photo on your glasses and then text it to a friend directly from the glasses. You have to wait for the glasses to send a "preview" of your message to your phone and then manually send the text. Or, you can opt in to Meta's cloud services and send the photo immediately as a link, but I'm not sure many of my friends would readily open a "media.meta.com" URL.

The glasses also don't really support non-WhatsApp group chats. You can receive messages sent in group chats, but there's no indication the message originated in a group thread. And it's impossible to reply in the same thread; instead, replies are sent directly to the person who texted, which can get confusing if you're not checking your phone. It was also a little annoying that reading and even replying to texts from my glasses wouldn't mark them as read in my phone's inbox. Meta blames all this on Apple's iOS restrictions, and says it's hoping to work with the company to improve the experience.

The band + battery life

The glasses are controlled using Meta's Neural Band, which can translate subtle gestures like finger taps into actions on the display. Because the band relies on electromyography (EMG), you do need a fairly snug fit for it to work properly. I didn't find it uncomfortable, but, like the glasses, I don't love how it looks as a daily accessory. It also requires daily charging if you wear the glasses all day.

But the band does work surprisingly well. In more than a week, it almost never missed a gesture, and it never falsely registered one, despite my efforts to confuse it by fidgeting or rubbing my fingers together. The gestures themselves are also pretty intuitive and don't take long to get used to: double-tapping your thumb and middle finger wakes the display or puts it to sleep, single taps of your index or middle finger select an item or go back, and swiping your thumb along the side of your index finger lets you navigate around the display. There are a few others, but those are the ones I used most often.

The Meta Neural Band requires a snug fit to work properly.
Karissa Bell for Engadget

Each time you make a gesture, the band emits a small vibration so you get a bit of haptic feedback letting you know it registered. I've used hand tracking-based navigation in various VR, AR and mixed reality devices and I've always felt a bit goofy waving my hands around. But the neural band gestures work when your hand is by your side or in your pocket. 

The other major drawback of these glasses is that heavy use of the display drains the battery pretty quickly. Meta says the Ray-Ban Display's battery can go about six hours on a single charge, but it really depends on how much you're using the display. With very limited use, I was able to stretch the battery to about seven hours, but if you're doing display-intensive tasks like video calling or live translation, it will die much, much more quickly.

The Meta Ray-Ban Display glasses, charging case and neural band.
Karissa Bell for Engadget

The glasses do come with a charging case that can deliver a few extra charges on the go, but I was a bit surprised at how often I had to recharge the case itself. With my normal Ray-Ban Meta glasses, I can go several days without topping up the charging case, but with the Meta Ray-Ban Display case, I found myself charging it at least every other day.

Privacy and safety

Whenever I write or post on social media about a pair of Meta-branded glasses, I inevitably hear from people concerned about the privacy implications of these devices. As I wrote in my recent review of Meta's second-gen Ray-Ban glasses, I share a lot of these concerns. Meta has made subtle but meaningful changes to its glasses' privacy policy over the last year, and its track record suggests these devices will inevitably scoop up more of our data over time.

In terms of privacy, the display-enabled glasses aren't meaningfully different from their display-free counterparts; Meta's policies are the same for all its wearables. I suppose you could use live translation to surreptitiously eavesdrop on a conversation you wouldn't typically understand, though that's technically possible with Meta's other glasses too. And the addition of a wrist-based controller means taking photos is a bit less obvious, but there's still an LED indicator that lights up when the camera is on.

The neural band allows you to snap photos without touching the capture button or using a voice command.
Karissa Bell for Engadget

I have been surprised at how many people have asked me if these glasses have some kind of facial recognition abilities. I'm not sure if that's a sign of people's general distrust of Meta, or an assumption based on seeing similar glasses in sci-fi flicks, but I do think it's telling. (They don't, to be clear. Meta currently only uses facial recognition for two safety-related features on Facebook and Instagram.) Meta hasn't done much to earn people's trust when it comes to privacy, and I wish the company would use its growing wearables business to try to prove otherwise.

On a more practical level, I have some safety concerns. The display didn't hinder my situational awareness while walking, but I could see how it might for others. And I'm definitely not comfortable using the display while driving. Meta does have an audio-only "driving detection" setting that can automatically kick in when you're traveling in a car, but the feature is optional, which seems potentially problematic. 

Should you buy these?

In short: probably not. As much as I've been genuinely impressed with Meta's display tech, I don't think these glasses make sense for most people right now. And, at $800, the Meta Ray-Ban Display glasses cost more than twice as much as the company's very good second-generation Ray-Ban glasses, which come in a wide range of much more normal-looking frame styles and colors.

The Meta Ray-Ban Display glasses, on the other hand, still look very much like a first-gen product. There are some really compelling use cases for the display, but its functionality is limited. The glasses are also too thick and bulky for what's meant to be an everyday accessory, and at the end of the day, most people want glasses that make them look good. There's also the fact that, right now, these glasses are somewhat difficult to actually buy. They are only available at a handful of physical retailers, which currently have very limited supply, and Meta is also requiring would-be buyers to schedule demo appointments in order to purchase, though some stores (like the LensCrafters where I bought my pair) aren't enforcing this.

Still, there's a lot to be excited about. Watching people's reactions to trying these has been almost as much fun as using them myself. Meta also has a solid lineup of new features already in the works, including a standalone Reels app, a teleprompter and gesture-based handwriting for message replies. If you're already all-in on smart glasses or, like me, you've been patiently waiting for glasses with a high-quality, usable display, then the Meta Ray-Ban Display glasses are worth the investment now, as long as you can accept the thick frames.
