Since its debut at the end of last year, Gemini 2.0 has gone on to power a handful of Google products, including a new AI Mode chatbot. Now Google DeepMind is using that same technology for something altogether more interesting. On Wednesday, the AI lab announced two new Gemini-based models it says will "lay the foundation for a new generation of helpful robots."
The first, Gemini Robotics, was designed by DeepMind to facilitate direct control of robots. According to the company, AI systems for robots need to excel at three qualities: generality, interactivity and dexterity.
The first involves a robot's flexibility to adapt to novel situations, including ones not covered by its training. Interactivity, meanwhile, encapsulates a robot's ability to respond to people and the environment. Finally, there's dexterity, which is mostly self-explanatory: a lot of tasks humans can complete without a second thought involve fine motor skills that are difficult for robots to master.
"While our previous work demonstrated progress in these areas, Gemini Robotics represents a substantial step in performance on all three axes, getting us closer to truly general purpose robots," says DeepMind.
For instance, with Gemini Robotics powering it, DeepMind's ALOHA 2 robot is able to fold origami and close a Ziploc bag. The two-armed robot also understands all the instructions given to it in natural, everyday language. As you can see from the video Google shared, it can even complete tasks despite encountering roadblocks, such as when the researcher moves around the Tupperware he just asked the robot to place the fruit inside of.
Google is partnering with Apptronik, the company behind the Apollo bipedal robot, to build the next generation of humanoid robots. At the same time, DeepMind is releasing Gemini Robotics-ER (short for embodied reasoning), a second model the company says will enable roboticists to run their own programs using Gemini's advanced reasoning abilities. DeepMind is giving "trusted testers," including one-time Google subsidiary Boston Dynamics, access to the system.
This article originally appeared on Engadget at https://www.engadget.com/ai/deepminds-latest-ai-model-can-help-robots-fold-origami-and-close-ziploc-bags-151455249.html?src=rss