Forget chatbots. Physical and embodied AI are now coming for your job

Amazon recently announced that it had deployed its one-millionth robot across its workforce since rolling out its first bot in 2012. The figure is astounding for a single company, and the pace makes it all the more striking: Amazon reached one million robots in roughly a dozen years, while it took the company nearly 30 years to build its current workforce of 1.5 million humans.

At this rate, Amazon could soon “employ” more bots than people. Other companies are likely to follow suit, and not just in factories. Robots will be increasingly deployed in a wide range of traditional blue-collar roles, including delivery, construction, and agriculture, as well as in service-sector spaces like retail and food service.

This occupational versatility will stem not only from their physical designs—joints, gyroscopes, and motors—but also from the two burgeoning fields of artificial intelligence that power their “brains”: Physical AI and Embodied AI. Here’s what you need to understand about each, and how they differ from the generative AI that powers chatbots like ChatGPT.

[Photo: Amazon]

What is Physical AI?

Physical AI refers to artificial intelligence that understands the physical properties of the real world and how these properties interact. As artificial intelligence leader Nvidia explains it, Physical AI is also known as “generative physical AI” because it can analyze data about physical processes and generate insights or recommendations for actions that a person, government, or machine should take.

In other words, Physical AI can reason about the physical world. This real-world reasoning ability has numerous applications. A Physical AI system receiving data from a rain sensor may be able to predict if a certain location will flood. It can make these predictions by reasoning about real-time weather data using its understanding of the physical properties of fluid dynamics, such as how water is absorbed or repelled by specific landscape features.
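To make the flood example concrete, here is a deliberately simplified sketch of that kind of physical reasoning: combining a rain-sensor reading with basic terrain properties to flag flood risk. Every function name, number, and threshold below is an illustrative assumption, not part of any real Physical AI system.

```python
# Toy illustration of reasoning about fluid dynamics from sensor data:
# flood risk depends on how fast water arrives versus how fast the
# landscape absorbs or drains it. All values here are made up.

def flood_risk(rain_mm_per_hr: float, absorption_mm_per_hr: float,
               drainage_mm_per_hr: float) -> str:
    """Classify flood risk from net water accumulation per hour."""
    # Net accumulation: rainfall minus what the ground soaks up
    # and what storm drains carry away.
    net = rain_mm_per_hr - (absorption_mm_per_hr + drainage_mm_per_hr)
    if net <= 0:
        return "low"       # ground and drains keep pace with the rain
    elif net < 10:
        return "moderate"  # water accumulates, but slowly
    return "high"          # accumulation outpaces runoff capacity

# Same storm, two landscapes: a paved lot absorbs almost nothing,
# while a grassy field soaks up most of the rainfall.
print(flood_risk(rain_mm_per_hr=30, absorption_mm_per_hr=2, drainage_mm_per_hr=10))   # high
print(flood_risk(rain_mm_per_hr=30, absorption_mm_per_hr=25, drainage_mm_per_hr=10))  # low
```

A real system would learn these relationships from physics simulations and historical data rather than hand-coded thresholds, but the shape of the inference is the same: sensor input in, physically grounded prediction out.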

Physical AI can also be used to build digital twins of environments and spaces, from an individual factory to an entire city. It can help determine the optimal floor placement for heavy manufacturing equipment, for example, by understanding the building’s physical characteristics, such as the weight capacity of each floor based on its material composition. Or it can improve urban planning by analyzing things like traffic flows, how trees impact heat retention on streets, and how building heights affect sunlight distribution in neighborhoods.

[Photo: Amazon]

What is Embodied AI?

Embodied AI refers to artificial intelligence that “lives” inside (“embodies”) a physical vessel that can move around and physically interact with the real world. Embodied AI can inhabit various objects, including smart vacuum cleaners, humanoid robots, and self-driving cars.

Like Physical AI, Embodied AI can reason about physics, as well as how one object affects another. However, since Embodied AI literally “embodies” a physical entity, such as a robot, it can also alter the real world around it, whether that be a robotic arm performing surgery, a humanoid bot working construction, or a self-driving truck transporting supplies from one location to another.

Embodied AI has advanced capabilities due to the mobility of its physical body and, as Nvidia explains, additional sensors, which can include cameras or LiDAR, that enable it to perceive its surroundings.

A real-time distinction

It is worth noting that the terms “Physical AI” and “Embodied AI” are increasingly being used interchangeably to describe any AI that understands the physics and spatial relationships of the real world and uses that understanding to power the brains behind bots. 

However, most experts agree that Physical AI and Embodied AI are interrelated but distinct varieties of artificial intelligence.

Henrik I. Christensen, an expert on robotics and AI and a professor of computer science at the University of California, San Diego, says that one distinguishing factor between the two is their real-time operational capabilities. “Physical AI denotes systems that [infer things] related to the physical world, such as friction, elasticity,” Christensen told me via email. This kind of system “may not operate in real time but has a detailed model of interaction in the physical world.”

Embodied AI, on the other hand, “denotes systems that operate in the physical world [and also] interact with objects in the real world, [so] they must operate in real-time,” Christensen says.

This real-time requirement is essential for robots working in the real world: if a robot doesn’t grab something as fast as it should, disaster can strike on the factory floor. Christensen notes that Embodied AI systems often need to use simplified models to ensure they can “provide an answer fast enough.”
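The tradeoff Christensen describes can be sketched in a few lines: a controller that uses a detailed physics model only when it fits the control-loop deadline, and otherwise falls back to a faster approximation. The function names, timings, and deadline below are invented for illustration, not drawn from any real robotics stack.

```python
# Illustrative sketch of the real-time tradeoff: a detailed physical
# model gives better answers but may miss the control-loop deadline,
# so the system falls back to a simplified model when needed.
import time

def detailed_grasp_plan(obj: str) -> str:
    time.sleep(0.05)  # stand-in for an expensive physics simulation (~50 ms)
    return f"detailed plan for {obj}"

def simplified_grasp_plan(obj: str) -> str:
    return f"approximate plan for {obj}"  # coarse heuristic, effectively instant

def plan_within_deadline(obj: str, deadline_s: float = 0.01) -> str:
    """Use the detailed model only if it fits the control-loop budget."""
    # In a real controller this cost estimate would come from profiling;
    # here we simply assume the detailed model takes ~50 ms.
    estimated_detailed_cost = 0.05
    if estimated_detailed_cost <= deadline_s:
        return detailed_grasp_plan(obj)
    return simplified_grasp_plan(obj)  # fast enough to meet the deadline

# A 10 ms control loop forces the approximation; a relaxed 100 ms
# budget leaves room for the detailed model.
print(plan_within_deadline("box"))                  # approximate plan for box
print(plan_within_deadline("box", deadline_s=0.1))  # detailed plan for box
```

The point is not the specific numbers but the structure: an embodied system must always return *some* answer before the deadline, even at the cost of fidelity, whereas a pure Physical AI system can afford to run its detailed model offline.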

Will robots take all the jobs?

The large language models (LLMs) that power ChatGPT, Claude, Llama, Grok, and others have long been seen as a threat to white-collar jobs, since they can reason about information and generate answers much as a human can. Because LLMs lack both a physical presence and an understanding of how physics affects objects in the real world, however, they have generally been seen as less of a threat to blue-collar jobs, which typically involve physical labor and an intuitive grasp of how objects interact.

But Physical AI and Embodied AI systems change the blue-collar risk assessment. Physical AI systems now possess reasoning capabilities regarding physical interactions, and Embodied AI enables robots to apply that understanding in the real world.

Yet, for now, at least, LLMs still pose a greater threat to white-collar jobs than Physical AI and Embodied AI do to blue-collar ones. This is because LLM technology is readily available and easily deployable across organizations at scale. While Physical AI systems could see nearly as speedy a rollout in the years ahead, Embodied AI systems face more hurdles due to the need to manufacture legions of robots capable of operating in real-world environments.

However, as Amazon’s one-millionth robot demonstrates, companies are increasingly interested in integrating more bots into the workforce, whether that’s in the factory or in the kitchen flipping burgers. As for why? Well, to take a line from my own novel, Beautiful Shining People, “bots never accidentally drop or damage things—not to mention they never get sick, or need days off, or give away free burgers to their friends.”

In other words, Physical AI and Embodied AI-powered robots have the potential to save companies a significant amount on their biggest expense: labor. And they are sure to take advantage of it. The only question for me, then, is: When AI takes all our jobs, who will be left to buy the things these companies sell?

https://www.fastcompany.com/91363903/forget-chatbots-physical-embodied-ai-job-robots-robotics-digital-twins-manufacturing-jobs

Created 5h ago | 19.07.2025, 10:50:03

