AI reality check: New NPUs don’t matter as much as you’d think

(…interviewing Dan Rogers of Intel, but I was able to get some tests in.) Procyon runs the LLMs on top of the processor and calculates a score, based upon performance, latency, and so on.

Without further ado, here are the numbers:

  • Procyon (OpenVINO) NPU: 356
  • Procyon (OpenVINO) GPU: 552
  • Procyon (OpenVINO) CPU: 196

The Procyon tests proved several points. First, the NPU does make a difference: compared to the performance and efficiency cores elsewhere in the chip, the NPU, all by itself, outperforms the CPU by 82 percent. But the GPU’s AI performance is 182 percent higher than the CPU’s, and it outperforms the NPU by 55 percent.
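
If you want to check the math, those percentages fall straight out of the three scores above. Here’s a minimal sketch in Python, using only the numbers quoted in the list:

    # Procyon AI Inference (OpenVINO) scores quoted above
    scores = {"NPU": 356, "GPU": 552, "CPU": 196}

    def percent_faster(a, b):
        """How much faster block a scored than block b, as a percentage."""
        return (scores[a] / scores[b] - 1) * 100

    print(f"NPU vs. CPU: {percent_faster('NPU', 'CPU'):.0f}% faster")  # ~82%
    print(f"GPU vs. CPU: {percent_faster('GPU', 'CPU'):.0f}% faster")  # ~182%
    print(f"GPU vs. NPU: {percent_faster('GPU', 'NPU'):.0f}% faster")  # ~55%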

The detailed results for the Procyon NPU inferencing test for the Core Ultra 7 165H.

Mark Hachman / IDG

Put another way: If you value AI, buy a large, beefy graphics card or GPU first.

But the second point is less obvious: Yes, you can run AI applications on a CPU or GPU, without any need for a dedicated AI logic block. All the Procyon tests demonstrate is that some blocks are more effective than others.
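
That flexibility is visible in OpenVINO itself, the Intel runtime behind the “OpenVINO” scores above: the same model can be compiled for whichever block the machine exposes. A minimal sketch, assuming a hypothetical OpenVINO IR model file:

    import openvino as ov

    core = ov.Core()
    print(core.available_devices)         # e.g. ['CPU', 'GPU', 'NPU'] on Meteor Lake

    model = core.read_model("model.xml")  # hypothetical model file

    # Compile (and later benchmark) the same model on each block that's present.
    for device in ("CPU", "GPU", "NPU"):
        if device in core.available_devices:
            compiled = core.compile_model(model, device)
            # ...run inference with `compiled` and time it per device

That, in essence, is what the three Procyon scores above are measuring: the same workload pointed at three different blocks.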

What Intel is saying (and, to be fair, has been saying) is that the NPU is more efficient. In the real world, “efficiency” is chipmaker code for “long battery life.” At the same time, Intel has tried to emphasize that the CPU, GPU, and NPU can work together.

Intel made this point early on about how various parts of Meteor Lake could work together, such as on Stable Diffusion.

Intel

In this case, the NPU’s efficiency pays off in AI applications that run for long stretches, and probably on battery. The best example is a lengthy Microsoft Teams call from the depths of a hotel room or conference center (just like CES!), where AI is being used to filter out noise and background activity.

AI art applications like Stable Diffusion typically launched first as a way to generate AI art locally using the power of your GPU, alongside a ton of available VRAM. But over time, AI applications have evolved to run on less powerful configurations, including mostly on the CPU. The analogy is a familiar one: you’re not going to run a graphics-intensive game like Crysis well on integrated hardware, but it should run, just very, very slowly. AI LLMs and chatbots will do the same, “thinking” for a long time about their responses and then “typing” them out very slowly. LLMs that can run on a GPU will perform better, and cloud-based solutions will be much faster.
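
To make that concrete: with a local runtime such as llama-cpp-python, the difference between “runs, just slowly” and “runs well” often comes down to how many layers you can offload to the GPU. A minimal sketch, with a hypothetical model filename and a GPU-enabled build assumed for offloading:

    from llama_cpp import Llama

    # n_gpu_layers=0 keeps everything on the CPU: it works, just slowly.
    # Raising it offloads transformer layers to the GPU, if one is available.
    llm = Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf", n_gpu_layers=0)

    out = llm("Q: Why buy a big GPU for local AI? A:", max_tokens=64)
    print(out["choices"][0]["text"])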

However, AI will evolve

It’s interesting, too, that (as of this writing) UL’s Procyon app recognizes the CPU and the GPU in the AMD Ryzen AI-powered Ryzen 7040, but not the NPU. We’re in the very early days of AI, when not even the basic capabilities of the chips themselves are recognized by the applications that are designed to use them. This just complicates testing even further.

The point is, however, that you don’t need an NPU to run AI on your PC, especially if you already have a gaming laptop or desktop. NPUs from AMD, Intel, and Qualcomm will be nice to have, but they’re not must-haves, either.

Intel’s Michelle Johnston Holthaus, an executive in its Client Computing Group, holds up a sample of Lunar Lake, which is currently sampling to Intel’s PC partners.

Mark Hachman / IDG

It won’t always be this way, though. Intel is promising that the upcoming Lunar Lake chip, due at the end of this year, will have three times the NPU performance; it’s saying nothing about CPU or GPU gains. It’s very possible that, over time, the NPU’s performance in various PC chips will grow until its AI performance massively outstrips the other parts of the chip. And if not, a slew of AI accelerator chip startups have plans to become the 3Dfx of the AI world.

For now, though, we’re here to take a deep breath as 2024 begins. New AI PCs matter, and so do the new NPUs. But consumers probably won’t care as much as chipmakers do about whether AI runs on their PC or in the cloud, no matter how loud the hype gets. And for those who do care, the NPU is just one piece of the overall solution.

https://www.pcworld.com/article/2199565/ai-reality-check-new-npus-dont-matter-as-much-as-youd-think.html
