What it’s like to wear Google’s Gemini-powered AI glasses

Google wants to give people access to its Gemini AI assistant in the blink of an eye: The company has struck a partnership with eyeglass makers Warby Parker and Gentle Monster to make AI smart glasses, it announced at its Google I/O developer conference in Mountain View Tuesday. These glasses will be powered by Google’s new Android XR platform, and are expected to be released in 2026 at the earliest.

To show what Gemini-powered smart glasses can do, Google has also built a limited number of prototype devices in partnership with Samsung. These glasses use a small display in the right lens to show live translations, directions, and similar lightweight assistance. They also feature an integrated camera that gives Gemini a real-time view of your surroundings and can be used to capture photos and videos.

“Unlike Clark Kent, you can get superpowers when you put your glasses on,” joked Android XR GM and VP Shahram Izadi during Tuesday’s keynote presentation.

[Photo: Janko Roettgers]

Going hands- (and eyes-) on

Google demonstrated its prototype device to reporters Tuesday afternoon. Compared to a regular pair of glasses, Google’s AI device still features notably thicker temples. These house microphones, a touch interface for input, and a capture button to take photos. Despite all of that, the glasses do feel light and comfortable, similar to Meta’s Ray-Ban smart glasses.

The Google glasses’ big difference from Meta’s reveals itself almost immediately after putting them on: At the center of the right lens is a small, rectangular see-through display. It doesn’t obstruct your view of the world when not actively in use. However, during the demo, I at times noticed a purple reflection in the upper right corner of my field of view, caused by the waveguide at the core of the display.

Google’s AI assistant can be summoned with a simple touch gesture. Once active, Gemini automatically accesses the outward-facing camera of the glasses, which makes it possible to ask about anything you see. During my short demo, the assistant correctly described the content of a painting, identified its painter, and offered some information about books hand-selected by Google for the demo.

In addition to AI assistance, the glasses can also be used for live translation and navigation. Google only showed the latter to members of the media. When in Google Maps mode, the glasses automatically display turn-by-turn walking directions when you look up. Look down, and the display shows a small, circular street map floating in front of you.

The display itself looked bright and legible, even when showing multiple lines of text at a time. However, Google conducted these demos indoors; it’s unclear how bright sunlight will impact legibility. 

Also unknown at this point is how long the batteries of such a device will last. Android XR glasses are designed for all-day wear, according to Izadi, but that doesn’t really tell us how many hours they can be used at a time.

Lots of open questions

Third-party apps were also notably absent from the demo. Izadi said Tuesday that glasses running Android XR will work with your phone, “giving you access to your apps while keeping your hands free.” How exactly that will work is unclear, as the display integrated into the prototype was too small to show the full UI of most apps. Most likely, Android XR will render apps in a simplified, device-optimized fashion, similar to the way apps appear on smartwatches such as the Apple Watch and Google’s Wear OS devices.

The emergence of these kinds of devices also raises more fundamental questions about privacy. The prototype device shown at Google’s event this week has an LED that’s supposed to signal to bystanders when it takes photos or records video, and an internal LED that signals to the wearer when footage is being captured.

However, the LED doesn’t turn on while Google’s Gemini assistant observes the world through the camera. According to a Google spokesperson, that’s because any video ingested this way is not being stored, but only temporarily used to make sense of the world. Bystanders, however, may not appreciate that distinction. They may assume that a device that can “see” the world at all times also continuously captures video.

Lastly, it’s still unclear what Google’s vision for other form factors looks like. The company also announced plans to release a pair of tethered AR glasses in partnership with Chinese AR startup Xreal Tuesday. With displays in both eyes, that device will be able to render much more immersive experiences, and presumably emphasize entertainment and work applications over more basic assistance.

In addition, Google’s roadmap for Android XR-powered devices includes glasses without any display at all. These are likely to be similar to Meta’s Ray-Ban smart glasses, albeit with access to Google’s Gemini assistant instead of Meta’s AI. Omitting a display brings down the manufacturing costs of smart glasses, while also helping with an important goal: to make devices that look and feel familiar to anyone who has ever worn a pair of glasses.

“We know that these need to be stylish glasses that you’ll want to wear all day,” Izadi said.

https://www.fastcompany.com/91338811/android-xr-glasses-warby-parker-xreal

May 21, 2025

