Google Glass has found yet another lease of life — but is it too little too late for smart glasses?

It’s been more than a decade since Google Glass smart glasses were announced in 2013, followed by their rapid withdrawal, in part due to low adoption. A second, later (and lesser-known) iteration, aimed at the workplace, was released in 2017 and withdrawn in 2023.
In December 2025, Google made a new promise for smart glasses, with two new products due out in 2026. But why have Google’s smart glasses struggled where others have succeeded? And will Google get it right the third time around?
Glasses, like watches and jewellery, are accessories that have emerged over the centuries and are now accepted as a normal part of society.
Some of the most recent academic research takes this approach, building sensors into jewellery that people would actually want to wear. That research developed a scale to measure the social acceptability of wearable technology (the WEAR Scale, or Wearable Acceptability Range), which includes statements such as: “I think my peers would find this device acceptable to wear.”
Norene Kelly of Iowa State University and colleagues showed that, at its core, this scale measures two things: whether the device helps people achieve a goal (making it worth wearing), and whether it avoids creating social anxiety about privacy or about being seen as rude.
The latter problem was highlighted by the term that emerged for Google Glass users: “Glassholes”. Although many studies have examined the potential benefits of smart glasses, from mental health to use in surgery, privacy and other concerns persist for newer smart glasses too.
All that said, “look and feel” remains the most common concern of potential buyers. The most successful products have been designed first as desirable accessories, with smart technology added second, and usually, in fact, by designer brands.
Making a spectacle
After Google Glass, Snapchat launched smart glasses called Spectacles, with built-in cameras, which were fashion-focused and more easily accepted socially. The best-known smart glasses today were launched by Meta (Facebook’s parent company), in collaboration with designer brands such as Ray-Ban and Oakley. Most of these products include front-facing cameras and voice interaction with Meta AI.
So, what can we expect from Google’s smart glasses in 2026? Google has promised two products: one that is audio-only, and one with “screens” displayed on the lenses (like Google Glass).

The biggest change (judging by promotional videos) appears to be a significant shift in form factor, from the futuristic, even jarring and unfamiliar design of Google Glass, to something more readily recognisable as a normal pair of glasses.
Google’s announcement also focused on adding AI (indeed, it announced them as “AI glasses” rather than smart glasses). Neither product type (audio-only AI glasses, or AI glasses with projections in the field of view) is particularly new, however, even when combined with AI.
Meta’s Ray-Ban products are available in both modes and include voice interaction with Meta’s own AI. These have been more successful than the recent Humane AI Pin, for example, which included front-facing cameras, other sensors, and voice assistance from an AI agent, and which is the closest thing we’ve had so far to Star Trek’s lapel communicators.
Direction of travel
There is a good chance that the first major direction of innovation in this area will be reducing the bulk of smart glasses, which have necessarily been chunky in order to house their electronics while still appearing normally proportioned.
“Building glasses you’ll want to wear” is how Google puts it, so we could see innovation from the company that simply improves the aesthetics of smart glasses, aided by its work with popular partner brands. Google has also announced wired XR (extended reality) glasses, which have a significantly reduced form factor compared with the virtual reality headsets currently on the market.
Second, we can expect greater integration with other Google products and services; Google offers far more widely used products than Meta, including Google Search, Google Maps and Gmail. Its promotional materials show examples of Google Maps information being viewed in the AI glasses while walking down the street.
Finally, and perhaps the biggest area of opportunity, is innovating by adding further sensors, perhaps integrated with Google’s other wearable health products, where many of its current projects sit, including the introduction of its own smart rings.
Much research has focused on what can be sensed from common contact points on the head, including heart rate, body temperature and galvanic skin response (skin moisture, which changes with stress, for example), and even brain activity via EEG. With current advances in mainstream neurotechnology, we could well see smart glasses using EEG to track brain data in the coming years.
This edited article is republished from The Conversation under a Creative Commons licence. Read the original article.