As someone who owns a pair of Meta Ray-Ban smart glasses, I think the AI smarts are what really make them special. In fact, I'd go as far as saying they're the best feature. Up until now, though, they've only officially been available in the US.

The availability of these features has been a little confusing: I've been using them for a while in the UK, but not everybody had access. Now, however, the full feature suite is officially out for those in the UK to get their hands (well, ears) on.

They now pack in Meta AI, which is powered by the Llama model. It's essentially an AI chatbot in voice format: you can ask it questions, and it'll give you answers – and pretty much nothing is too much for it to handle. It's also great at helping me brainstorm or find a fact I'd usually quickly Google. If I want help writing a headline or a line of code, it'll tell me what to write.

What really stands out is the vision feature. Meta and Ray-Ban’s glasses can take a photo with the camera and work out what you’re looking at. It’s a bit gimmicky at first – I asked it to identify the iPhone I was holding (and it could). But the more I’ve used it, the more useful it’s become. It can tell you facts about something you’re looking at, summarise text you’re reading, or translate signs into a language you understand.

Basically, anything you'd want the Humane AI Pin to do, these glasses can do. And faster. Meta's specs are a bit confusing when it comes to knowing your location, though. They know the town I'm in, but can't tell me what's nearby (even though other users have successfully asked for nearby recommendations). The AI model gets confused because it isn't supposed to know your location, even though the hardware does. Weird.

In the near future, Meta says the Ray-Ban specs will support real-time translation from languages such as Spanish, French, and Italian. With these AI features on board, I love using these smart glasses, and I think they're the perfect AI hardware device.
