Project Astra, Google's vision for a universal AI assistant, is pulling into focus

Last year at Google I/O, one of the most interesting demos was Project Astra, an early version of a multimodal AI that could recognize your surroundings in real time and answer questions about them conversationally. While the demo offered a glimpse into Google's plans for more powerful AI assistants, the company was careful to note that what we saw was a "research preview."

One year later, though, Google is laying out its vision for Project Astra to one day power a version of Gemini that can act as a "universal AI assistant," and Project Astra has received some important upgrades to help the company get there. Google has been working on Astra's memory (the version we saw last year could only "remember" for 30 seconds at a time) and has added computer control so Astra can now take on more complex tasks.

In its latest video showcasing Astra, Google shows the assistant browsing the web and pulling out the specific pieces of information needed to complete a task (in this example, fixing a mountain bike). Astra is also able to look through past emails to find the specs of the bike in question and call a local bike shop to inquire about a replacement part.

Eventually, according to DeepMind's Demis Hassabis, Astra's advancements will show up in Gemini. "Our ultimate vision is to transform the Gemini app into a universal AI assistant that will perform everyday tasks for us, take care of our mundane admin, surface delightful new recommendations, making us more productive and enriching our lives," Hassabis writes in a blog post. "This starts with the capabilities we first explored last year in our research prototype Project Astra, such as video understanding, screen sharing and memory."

Some of that work is already evident in Gemini Live, which recently got some multimodal capabilities of its own. But, as I noted last year, Project Astra gets even more interesting in the context of smart glasses, an idea Google briefly teased in its I/O video at the time. That vision appears to be inching closer to reality, with Hassabis noting that Google is working on bringing Project Astra abilities to "new form factors, like glasses." There's no clear timeframe on when any of this will be available, but given Google's updates on Android XR elsewhere at I/O, we know the company has big plans for AI-powered smart glasses later this year.

This article originally appeared on Engadget at https://www.engadget.com/ai/project-astra-googles-vision-for-a-universal-ai-assistant-is-pulling-into-focus-174539875.html?src=rss