If you’ve got one of the best iPhones, you may be holding out for Apple’s rumoured smart glasses – a pair of AI-focused glasses designed to rival products like the Ray-Ban Meta.

And while they’re likely a while off yet, new rumours have emerged to help fuel the hype.

According to a MacRumors source, you may be able to control the Apple smart glasses without having to touch them at all.

Instead, the glasses are said to rely on hand gesture controls, detected via a secondary camera built into the frame. Along with a main high-resolution camera for photos and video, a lower-resolution wide-angle lens apparently tracks your hand movements and provides visual input for Siri, offering a way to interact without a screen.

That last bit is important, because, at least in this first version, there reportedly isn’t one. Unlike the Vision Pro, Apple’s glasses are expected to skip a display entirely. Instead, like Meta’s glasses, the focus appears to be more on capturing content, asking questions, and getting quick AI-driven responses about the world around you.

Apple’s version, though, is expected to tie in with a rumoured smarter, next-gen Siri – the one expected to arrive with iOS 27 – suggesting a stronger AI focus from day one.

We’ve also had hints about the design: Apple is said to be testing multiple styles, including frames made from acetate, a plant-based material often used in premium eyewear that’s lighter and more flexible than standard plastics.

As for timing, nothing’s locked in. The latest rumours suggest a possible preview later this year, with a launch in 2027 – though, as ever with Apple’s more experimental hardware, that timeline could easily slip. Stay tuned.
