I’m currently wearing a pair of smart glasses called the Even Realities G2. Another two pairs, from Rokid, sit on my desk. A few feet away, I’ve got the Meta Ray-Ban Display charging alongside their Neural Wristband. In my closet are six pairs of $50 smart sunnies that an overzealous Walmart rep sent me. Those sit next to some Xreal, RayNeo, and Lucyd glasses, plus an old pair of Razer Anzu. Later, I’m calling my optician because I’m hoping to test a pair of the new Ray-Ban Meta Optics, which can supposedly handle my challenging prescription. I’m drowning in smart eyewear — and even more is on the horizon.
Right now, it’s difficult to tell these devices apart. Not only do they look alike, but most are similarly unsubtle in their attempts to stick AI on your face. They’re loaded with promises about how wearable AI can change your life: It’ll make you healthier by tracking what you eat, make you smarter by capturing notes on every word you utter, and make you more creative by transforming your surroundings into playlists and date ideas.
But after a year of testing, I’ve yet to see anything that lives up to those promises. And if the smart glasses category is going to succeed, it’s going to need a better story for why these devices should stay on your face all day.
Regardless of what model I put on in the morning, modern smart glasses make me feel like James Bond. I can walk around the neighborhood wearing a pair of chunky Ray-Bans, listen to my audiobook, and see my texts without pulling out my phone. If I feel like getting a coffee, I can put in the name of a local cafe and get directions. No one looking at me would know.
That’s doubly true when the glasses come with cameras or gesture-based accessories (see: the Even Realities G2 and Meta Ray-Ban Display). Secretly controlling an invisible display that only I can see? Incredibly cool. Capturing my cat’s antics without him knowing? Move over, David Attenborough, I’m the wildlife documentarian now.
I have never felt more hip than when I was walking down a Williamsburg street last summer, wearing a pair of Oakley Meta HSTN. The most stylish man in Brooklyn stopped me to ask about the glasses and my experience. I’ve also never felt less like a good citizen than when I unintentionally recorded a florist while testing the Meta Ray-Ban Display.

“Good” modern smart glasses are defined by how much you can get away with. It’s good if no one clocks them. That makes them stylish and versatile enough for everyday wear. It’s good if you have a fancier model that doesn’t require you to speak AI voice commands aloud. You’re less conspicuous, but still get the benefits.
Even Realities’ G2 glasses can be controlled by tapping on the side of an accompanying smart ring. I could be looking at a teleprompter on the G2’s display, and someone standing in front of me would be none the wiser. When I was at my local LensCrafters getting fitted for the Nuance Audio — a pair of glasses that double as over-the-counter hearing aids — the optician asked if I was ready to “be a superspy” because I’d be “able to hear all the good gossip from across the room.” (The reality? Good gossip comes straight to your DMs, and I mostly just hear tinny garbling.)
There’s a reason spies operate incognito. Recognizability is a threat when you’re wearing one of these devices — for you and the people around you. In public bathrooms, I now worry about making others uncomfortable. I’m not a creep, but strangers don’t know that. When I occasionally wear camera glasses to a concert or show, I wonder how long I’ll be able to do so before venues start banning them. (Cruises and courtrooms already have.) On the one hand, I got okay-ish Stray Kids concert footage last year. On the other hand, will Patti LuPone stop her next Broadway show to berate me if the glasses accidentally turn on, their LED indicator light flashing in a dark crowd?
The angrier people get about this tech’s privacy invasions, the more nervous and deceitful I feel wearing them. People might know these devices exist, but most still don’t expect to see them in their day-to-day lives. I’ve yet to have a negative in-person interaction, but would the internet be calling these “pervert glasses” if glassholes weren’t making a comeback?
The optimist in me says this is the most affordable, stylish, comfortable, and capable smart glasses have ever been. The skeptic in me asks whether that’s a good thing.
It’s a big step forward that I don’t feel ugly wearing these glasses. The harder thing is convincing myself to keep them on. Big Tech wants smart glasses to be AI wearables, but right now, the AI stinks for most people. Meta AI isn’t great, and the glasses that come with proprietary models layered over ChatGPT aren’t much better. These AI integrations are fine for basic tasks like controlling music playback or asking about the weather. But the advanced AI features are often a battery drain, stupendously basic, or unusable in daily life. Sometimes all of the above.
My spouse, who exclusively uses their Meta glasses to identify obscure car models, sometimes drags me along to local car shows. One time, I had to listen to Meta AI fail six times to identify a Ferrari. At the Vatican Museum, it correctly identified the Belvedere Torso, but the lack of a holy Wi-Fi signal rendered the AI otherwise useless. Rokid’s AI constantly tells me I’ve not adequately set up permissions for certain features, or that my Bluetooth connection is spotty. I quickly gave up on the Lucyd glasses because using ChatGPT through them was more trouble than it was worth. Even Realities recently built in a Conversate feature, which uses AI to define phrases or present useful factoids related to your conversation. I tried using it in a product briefing. The feature peppered my vision with the definitions of “artificial intelligence” and “wearable technology.”
When I go to tech companies’ shiny smart glasses demos, I always ask what scenarios I should try. I’m usually given examples like identifying a book to read from a shelf of travel books, or getting recipe suggestions from a well-curated shelf of pasta, red wine, and sun-dried tomatoes. Ooh, maybe ask the AI to generate a playlist based on a piece of artwork hanging on the wall? These scenarios feel utterly inorganic. My to-be-read pile of books is a mishmash of genres. When I snapped a photo and asked for a recommendation, Meta AI told me it didn’t have preferences or opinions — I should just pick what interests me. My fridge is a hodgepodge of veggies about to wilt, separate from my pantry. I tend to play music based on my mood, not a painting’s.
The features that have felt purposeful are occasional. I like turn-by-turn navigation, except New York City has a handy grid system and every smart glasses maker recommends you don’t use the devices for driving. AI translation requires quiet environments without cross talk, which don’t materialize often. Same goes for live captioning. Teleprompters can be useful, if you’re the sort of person who often gives lectures. I’m just not that person.

I found smart glasses to be most useful when I’m traveling. Outside of some accessibility communities, these glasses are best for business people or content creators always on the go — which is maybe why Silicon Valley is so gung-ho on them. For everyone else, they’re a cool pair of open-ear headphones.
Wearing these glasses, it’s never been clearer that companies are inventing scenarios because they so badly want this to work. And the better the tech gets, the more I’m left asking: Why are you insisting I need this on my face?
Sometimes I feel tech companies have forgotten that, first and foremost, people wear glasses to see. Only in the past few weeks has Meta, the front-runner, come out with a version of its glasses that supports all prescriptions. And of all the brands I’ve tested, only Even Realities confidently said, “We can absolutely handle your prescription with zero problem.” I was told they can accommodate up to ±12 diopters. Impressive, though you’re still out of luck if you need bifocals.
Most of these devices don’t support my vision needs, which means every morning, one of the first choices I make is: Do I wear contacts, contacts and smart glasses, or my normal, “dumb” glasses? Sometimes that’s an easy choice. Most days, it’s not. As the tech improves, it’ll be easier to make these devices lighter and incorporate displays for more complex prescriptions. But because of the countless permutations of face size and vision, this is an infrastructural and supply chain problem too. (One that smart rings also share.) Fixing this will take time.

Even if I intend to wear smart glasses all day, my eyes sometimes get so dry I have to swap them out for my regular glasses. Also, what happens if your glasses break? This has happened to me a few times in my life. The last time, I was lucky that a pair of pliers and a heat gun did the trick, but these kinds of DIY repairs are impossible with smart glasses, where the tech lives in the frames. New glasses in the US can be an exorbitant expense, and the whole idea that I wouldn’t be able to replace nose pads or screws on my own? I never thought I’d have to ponder right to repair for my vision.
A smartphone’s main benefit is that it’s useful for everyone, regardless of their body or needs. There are multiple sizes, plentiful accessibility features, and accessories like cases, straps, and mounts for any situation. Until glasses can claim the same, they’re doomed to be niche devices.
Oddly, in a way, I’m more optimistic about smart glasses than ever. The current crop of smart glasses still ain’t fully it — but for the first time it’s not because the devices just plain suck. It’s more that I don’t think anyone’s presented a clear idea of why you’d want these on your face all day, every day. But finally, I can at least see glimmers of why I might want to use these glasses sometimes.
Regardless of what Big Tech thinks, AI isn’t it. In real life, you look unhinged nattering away at your glasses. But companies have also seemingly forgotten gadgets are meant to be put away. A phone can go into your pocket. A laptop gets stashed in your bag. The only time I take my glasses off is to sleep. In an ideal world, I’d like the “smart” part of glasses to be something I can easily remove, depending on the situation. I find certain features potentially useful for my job, but like with my phone, I’d love a mode that turns them off when I’m off the clock. Big Tech doesn’t seem to agree. It wants the next big thing regardless of whether it makes sense for the device. To me, that’s where all this cultural friction comes from.

I’d wager most people would be okay with temporarily sacrificing some privacy in specific scenarios where the benefit outweighs the cost. Smart glasses in museum tours? Awesome. As tools in factories to help multilingual employees better communicate? Makes sense! Camera on your face 24/7 that can surreptitiously capture images and then feed a faceless corporation’s AI your data to ultimately fuel its targeted ad revenue? Instantly creepy and no thank you.
I’ve tested about a dozen of these things. Several more are on the way, and I’m sure I’ll hear companies tell me how the next generation will fix my issues with the current one. Or come up with several more half-baked reasons why these should be 24/7 devices. But so far, none of these fancy AI use cases are what I’m enjoying.
The smart glasses I enjoy most are the jabroni-chic Oakley Meta Vanguard. I use them exclusively for training and recording race moments; everybody else can clearly see that I look like a mall cop, and no one is likely to punch my brains out, because who wants to go near a sweaty cyberpunk doofus while they’re running as fast as they can? It’s okay that these glasses aren’t all-purpose. They were never meant to be.
