Smart glasses with microphones, cameras, embedded computing and even AR (augmented reality) have been the subject of spy thrillers for decades. But the reality of face-worn wearables with truly meaningful utility has become something of a white whale for the consumer tech industry.
It’s not for lack of trying, of course. I tested Google Glass at the company's offices in 2013, I was among the first in the UK to buy the original Snapchat Spectacles in 2017, and big names like Sony and Oppo have tried their hand at the form factor for years.
However, despite their best attempts, no company has managed to strike the balance between form and function that would lead to mainstream acceptance or adoption.
The most recent entrant into the field is the Ray-Ban Meta smart glasses (pictured), which, despite initially launching in late 2023, made headlines again in October this year thanks to the expansion of Meta AI functionality to more international markets, including the UK.
This is the biggest distinction between these glasses and the first smart glasses foray from Meta and Luxottica (owner of the Ray-Ban brand). 2021's Ray-Ban Stories contained an equally interesting assortment of technologies, but ultimately proved less than the sum of their parts; AI may well have been the missing ingredient.
Different Shades of AI
One of the main benefits of the inclusion of Meta AI in Ray-Ban Meta glasses is its new multimodality. Through a feature called “look and ask,” the glasses can take a snapshot of whatever is in front of you and – using machine vision associated with Meta’s Llama AI model – explain what you’re looking at.
Depending on your request, you can even use “look and ask” to quickly summarize signage or literature, extract dietary information from food packaging, or discover new recipes inspired solely by the ingredients in front of you. However, for the majority of users in the UK, this important facet of the Ray-Ban Meta experience remains inaccessible, with no set date for when it will be added.
Aside from those who have updated to the latest Ray-Ban Meta software via a VPN set to the US, or have been whitelisted for beta updates to the Meta View app, most UK users still don't have this device-defining upgrade.
The reason? A combination of European AI legislation and GDPR, which have collectively hampered Meta's AI efforts in the region and left local users with a more limited experience.
Meta AI on your face
If, like me, you're a UK-based Ray-Ban Meta user, you're probably well aware of the limitations the built-in Meta AI experience currently suffers from.
Having Meta's assistant always present and completely hands-free is a genuine boon in daily use; it's more accessible than turning to Gemini or Siri on my phone and less distracting, because there's no interface to stare at. I can still run conventional queries on the assistant, like checking the weather or getting step-by-step instructions for baking the perfect brownies, but beyond that, the experience still feels decidedly sparse and incomplete.
On paper, the combination of the Ray-Ban Meta's form factor and hardware configuration is a recipe for success, in terms of making AI interaction meaningful in everyday use.
However, without the machine vision-based multimodal component, the most useful features of these smart glasses are instead photo and video capture, built-in Bluetooth audio, and the ability to answer calls with solid voice clarity (thanks to a quintet of microphones arranged around the frame). A much less “intelligent” skill set than the one Meta wants to focus on.
Great for you, less so for everyone else
When it comes to face-worn wearable technology, as mentioned above, no manufacturer has yet cracked the code to widespread adoption or acceptance, but the Ray-Ban Metas are arguably the industry's best attempt to date.
Although the company's augmented reality efforts remain reserved for its Meta Quest XR headsets and the Meta Orion concept, the addition of AI makes the Ray-Ban Meta specs the most accessible smart glasses yet. That said, the same privacy concerns raised against the previous Ray-Ban Stories, not to mention at Meta directly, don't really detract from this latest generation of smart glasses.
Rather, those concerns are the driving force behind the delayed rollout of the glasses' revolutionary Meta AI integration and multimodality outside of the US, Canada, and Australia. My frustrations come squarely from the perspective of a user who knows they're unable to fully utilize the cutting-edge technology they own.
That said, I already know that if I wait, my experience will improve. One could argue that the opposite is true for everyone else on the other side of the Ray-Ban Meta's camera lens. The lack of multimodality actually grants passers-by in the UK a greater degree of privacy than those in markets where the glasses' full functionality is already available.
Unless you're already able to spot a pair of Ray-Ban Meta smart glasses, and understand that they have a built-in camera that can be used to take hands-free photos and videos, and even live-stream, chances are you won't have a say as to whether your image is captured, shared online, transmitted via Meta's servers, or any combination of these.
On the one hand, products like the Ray-Ban Meta glasses are more likely to be accepted in society, if only because people don't necessarily realize they're smart glasses at first glance. As for their Meta AI integration, even if the wearer benefits, those around them will likely be less thrilled by these specs' growing AI-powered repertoire, if they're even aware of its presence.