What it’s like to wear Google’s Gemini-powered AI glasses

Google wants to give people access to its Gemini AI assistant in the blink of an eye: The company has struck a partnership with eyeglasses makers Warby Parker and Gentle Monster to make AI smart glasses, it announced at its Google I/O developer conference in Mountain View Tuesday. These glasses will be powered by Google’s new Android XR platform and are expected to be released in 2026 at the earliest.

To show what Gemini-powered smart glasses can do, Google has also built a limited number of prototype devices in partnership with Samsung. These glasses use a small display in the right lens to show live translations, directions, and similar lightweight assistance. They also feature an integrated camera that gives Gemini a real-time view of your surroundings and can capture photos and videos.

“Unlike Clark Kent, you can get superpowers when you put your glasses on,” joked Android XR GM and VP Shahram Izadi during Tuesday’s keynote presentation.

[Photo: Janko Roettgers]

Going hands- (and eyes-) on

Google demonstrated its prototype device to reporters Tuesday afternoon. Compared to a regular pair of glasses, Google’s AI device still features notably thicker temples. These house microphones, a touch interface for input, and a capture button to take photos. Despite all of that, the glasses do feel light and comfortable, similar to Meta’s Ray-Ban smart glasses.

The Google glasses’ big difference compared to Meta’s reveals itself almost immediately after putting them on: At the center of the right lens is a small, rectangular see-through display. It doesn’t obstruct your view of the world when not actively in use. However, during the demo, I at times noticed a purple reflection in the upper right corner of my field of view, caused by the waveguide at the core of the display.

Google’s AI assistant can be summoned with a simple touch gesture. Once active, Gemini automatically accesses the outward-facing camera of the glasses, which makes it possible to ask about anything you see. During my short demo, the assistant correctly described the content of a painting, identified its painter, and offered some information about books hand-selected by Google for the demo.

In addition to AI assistance, the glasses can also be used for live translation and navigation. Google only showed the latter to members of the media. When in Google Maps mode, the glasses automatically display turn-by-turn walking directions when you look up. Look down, and the display shows a small, circular street map floating in front of you.

The display itself looked bright and legible, even when showing multiple lines of text at a time. However, Google conducted these demos indoors; it’s unclear how bright sunlight will impact legibility. 

Also unknown at this point is how long the batteries of such a device will last. Android XR glasses are designed for all-day wear, according to Izadi, but that doesn’t really tell us how many hours they can be used at a time.

Lots of open questions

Third-party apps were also notably absent from the demo. Izadi said Tuesday that glasses running Android XR will work with your phone, “giving you access to your apps while keeping your hands free.” How exactly that will work is unclear, as the display integrated into the prototype was too small to show the full UI of most apps. Most likely, Android XR will render apps in a simplified, device-optimized fashion, similar to the way apps show up on smartwatches such as the Apple Watch and Google’s Wear OS devices.

The emergence of these kinds of devices also raises more fundamental questions about privacy. The prototype device shown at Google’s event this week has an LED that’s supposed to signal to bystanders when it takes photos or records video, and an internal LED that signals to the wearer when footage is being captured.

However, the LED doesn’t turn on while Google’s Gemini assistant observes the world through the camera. According to a Google spokesperson, that’s because any video ingested this way is not being stored, but only temporarily used to make sense of the world. Bystanders, however, may not be receptive to that distinction. They may assume that a device that can “see” the world at all times also continuously captures video.

Lastly, it’s still unclear what Google’s vision for other form factors looks like. The company also announced Tuesday that it plans to release a pair of tethered AR glasses in partnership with Chinese AR startup Xreal. With displays in both eyes, that device will be able to render much more immersive experiences, and will presumably emphasize entertainment and work applications over more basic assistance.

In addition, Google’s roadmap for Android XR-powered devices includes glasses without any display at all. These are likely going to be similar to Meta’s Ray-Ban smart glasses, albeit with access to Google’s Gemini assistant instead of Meta’s AI. Omitting a display brings down the manufacturing costs of smart glasses, while also helping with an important goal: to make devices that look and feel familiar to anyone who has ever worn a pair of glasses.

“We know that these need to be stylish glasses that you’ll want to wear all day,” Izadi said.

https://www.fastcompany.com/91338811/android-xr-glasses-warby-parker-xreal?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 3mo | May 21, 2025, 22:20:03

