We’ve spent several years now obsessing over models and assistants, but here’s an interesting new truth: the next competitive edge in AI won’t be another benchmark; it will be electrons. And not just any electrons, but cheap ones. As the “AI wars” heat up, the winners won’t simply be those with the best UX or the most compute. They’ll be the firms that can secure abundant low-cost power at scale, hour after hour, year after year.
That’s where AI is colliding with the physical world, and where the story stops being about software and starts being about grids, turbines, and price curves. Most recent analyses show that AI-driven data centers are now a visible driver of U.S. electricity demand and are starting to send retail prices higher, a clear signal that the constraint is shifting from graphics processing units (GPUs) to kilowatt-hours.
There’s also a lot of noise about water, and it deserves a closer look. The problem is that “water use” is often conflated with “water consumption.” In data centers, much of the water involved in most cooling systems is withdrawn, used to absorb heat, and then returned. It comes back warmer, yes, but it reenters the water cycle once the discharge is brought back within permitted temperature ranges. Only some designs (notably evaporative cooling) consume water through vapor losses; others trade water for electricity by leaning on air-cooled chillers or direct-to-chip liquid loops that dramatically cut onsite withdrawals.
Think local
The right way to think about the problem is local: Water stress is a catchment-level issue, not a global one, and the risk depends on where you site the load and which cooling technology you choose. In short, the headlines often overstate a universal “thirst” that the engineering and the definitions don’t support.
None of this, of course, minimizes communities that are water-stressed, where a single facility can matter. Investigations have shown clusters of data centers in arid regions, prompting scrutiny and new local rules. That’s the right debate: match technology choices to basin realities, and stop treating “water for AI” as the same problem everywhere. In places with abundant non-potable or reclaimed water, or with dry/thermosyphon cooling, the footprint can be managed; in stressed watersheds, it becomes a siting decision, not an engineering afterthought.
Electricity is different. There is no local workaround if the price is structurally high. And on cost, the market is brutally clear. The latest Lazard Levelized Cost of Energy+ (LCOE+) report again shows utility-scale wind and solar at the bottom of the price stack, with new gas combined-cycle plants rising in cost and nuclear still the most expensive new build in rich-country conditions. If you’re trying to run large training runs or always-on inference, the delta between clean, cheap power and legacy generation is not a rounding error — it is the margin that decides where you build and whether the unit economics make sense.
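To see why that delta is not a rounding error, consider a back-of-the-envelope calculation. The numbers below are illustrative assumptions, not figures from any report cited here: a hypothetical 100 MW campus running around the clock, priced once at a contracted clean-power rate and once at a gas-heavy retail rate.

```python
# Back-of-the-envelope: how the power price moves the cost of always-on compute.
# All figures are illustrative assumptions, not data from this article.

HOURS_PER_YEAR = 8760  # 24 hours x 365 days

def annual_energy_cost(load_mw: float, price_per_mwh: float) -> float:
    """Annual electricity cost in USD for a constant load at a flat $/MWh price."""
    return load_mw * HOURS_PER_YEAR * price_per_mwh

load_mw = 100  # hypothetical always-on AI campus

cheap = annual_energy_cost(load_mw, 35)    # assumed contracted wind/solar rate
legacy = annual_energy_cost(load_mw, 90)   # assumed gas-heavy retail rate

print(f"cheap power:  ${cheap:,.0f}/yr")
print(f"legacy power: ${legacy:,.0f}/yr")
print(f"delta:        ${legacy - cheap:,.0f}/yr")
```

Under these assumed rates, the gap runs to tens of millions of dollars per year for a single site, which is exactly the kind of margin that decides siting and unit economics.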
Consider nuclear: Georgia’s Vogtle expansion finally went online, but only after historic cost and schedule overruns that translated into material rate hikes for customers. If AI’s advantage is speed and scale, it’s hard to square that with technologies that arrive late, over budget, and with levelized costs that sit at the wrong end of the curve. The physics is fine. The economics, today, are not.
This is why the new moat isn’t “access to energy” in the abstract: It’s access to cheap energy, reliably delivered. The firms that can lock in 24/7 low-cost supply, time-shift non-urgent workloads into off-peak windows, and colocate compute with stranded or overbuilt renewables will win. Everyone else will pay retail, and pass those costs on to users or investors. We are already seeing utilities, grid operators, and tech companies negotiate curtailment and flexibility, and the International Energy Agency’s (IEA’s) modeling makes the near-term picture obvious: AI-related demand is rising, and it will test systems that were not designed for this kind of always-on compute.
The China factor
This brings us to the comparison nobody in Silicon Valley likes to make out loud: China. Look past the coal headlines for a moment and follow the build rates. China hit its 2030 wind-and-solar target in 2024, six years early, and added roughly 429 GW of net new capacity to the grid in 2024 alone, the vast majority wind and solar, backed by massive investment in transmission. Pace matters, because marginal megawatt-hours from ultra-low-cost renewables set the floor for training and inference costs. China’s grid still has big challenges (curtailment among them), but if you’re simply asking “Who is manufacturing cheap electrons at scale the fastest?” the answer today is not the United States.
That doesn’t mean resignation; it means focus. If the U.S. wants to stay competitive in AI economics, the priority is not another model announcement: It’s a buildout of cheap generation and the wires to move it. Anything that delays that, be it doubling down on gas price volatility, pretending coal is cheap once you factor in capacity payments and externalities, or dreaming of next-gen nuclear that won’t arrive on time, will keep AI sited where the power is inexpensive and predictable. In a world of location-aware workloads, electrons decide geography.
The takeaway
The practical takeaway for companies is straightforward: If you are spending real money on AI, your CFO should now know your blended cost of electricity as intimately as your cloud bill, and should be negotiating for both. Favor regions with abundant wind and solar and strong transmission, insist on time-of-use pricing and demand-response programs, push your vendors on 24/7 carbon-free energy rather than annual offsets that do nothing for peak prices or local loads. None of this is environmental, social, and governance (ESG) posturing. It’s cost control for a compute-intensive product line whose unit economics are married to energy markets.
On water, keep the conversation precise. Ask for cooling designs, not slogans. Is the system evaporative or closed-loop? What’s the water-use effectiveness and the discharge temperature profile? Where does the site sit on the World Resources Institute’s (WRI’s) Aqueduct map today and under climate-adjusted scenarios? If your supplier can’t answer those basics, they’re not ready to build where you’re planning to grow. But don’t let the “AI is drinking the planet” meme obscure a simpler reality: With the right technology and siting, the binding constraint is cheap electricity, not moisture in a recirculating loop.
The narrative arc is changing. The first phase of the AI boom rewarded companies that could raise capital and buy a lot of GPUs. The next phase will reward those that can buy electrons cheaply, cleanly, and continuously. If you want a preview of who wins the assistant wars, don’t look at the demos. Look at the interconnection queues, the power-purchase agreements, and most of all, the maps of wind and solar buildouts, the cheapest energy available. Software is glamorous, but power is destiny.