AI Daily
[Photo: Electrical transmission tower at dusk, with power lines crossing an empty landscape]
Energy • April 25, 2026

There Are Fixes for AI's Power Crisis. Here Is Why They Are Not Happening.

By AI Daily Editorial • April 25, 2026

OpenAI shut down Sora this spring, at least temporarily, in part because the video-generation tool was consuming more computational resources than the company could sustain. One of the most talked-about AI products of the year, throttled not by a technical failure but by a power constraint.

That single detail captures a wider problem the AI industry is running into at speed. Data centres need enormous amounts of electricity, both to run the servers and to prevent them from overheating. As AI models grow more capable and more widely deployed, the demand keeps climbing. And the United States power grid, which was never built for this moment, is struggling to keep pace.

"Basically, we have run out of headroom, largely speaking, in the US," Ben Hertz-Shargel, an expert on electrification and data centres at energy research firm Wood Mackenzie, told CNN. "There is a land grab happening, where companies believe that access to more capacity for compute will be necessary to win the future battle over AI services."

The shortage is not invisible to the companies at the centre of it. OpenAI warned the White House last year of an "electron gap" that it said puts US leadership in AI at risk, framing it with the phrase "electrons are the new oil." Elon Musk has said he expects more chips to be manufactured than can be switched on, purely because the power is not there. Google has stated publicly that the current pace of energy development is not meeting the potential demand from AI.

Solutions exist and are well understood: expanding renewable energy, scaling battery storage, building more nuclear capacity, and modernising the three loosely connected regional grids that make up the US electrical network. The problem is not the existence of options. It is that none of them are fast. Permitting new transmission lines routinely takes a decade or more, caught between regulatory complexity, local opposition, and the need for coordination across jurisdictions that rarely agree. Nuclear capacity takes longer still. And the political environment for the large-scale federal coordination that would accelerate any of these has not been favourable.

So the pressure is building from both directions: rising demand from AI, stalled supply from the grid. The industry's response has largely been to lobby for faster permitting and maintain incentives for private-sector energy investment. Whether that lobbying translates into meaningful policy change, and on what timeline, remains an open question.

Against that backdrop, a research team at the University of Cambridge published findings this week in the journal Science Advances that point toward a different kind of answer. They have engineered a new nanoelectronic device using a modified form of hafnium oxide that mimics how the brain processes and stores information in the same place, rather than constantly shuttling data between separate memory and processing units as conventional chips do. That back-and-forth transfer is a significant source of energy waste. The researchers say this new device, functioning as a stable, low-power memristor, could cut AI energy consumption by up to 70 percent.
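To make that energy argument concrete, here is a minimal Python sketch of an idealised memristor crossbar, the textbook version of computing in memory. It is not a model of the Cambridge device: the conductance values and the per-operation energy costs below are illustrative assumptions, there only to show why a multiply-accumulate that happens where the weights live avoids the cost of moving them.

```python
import numpy as np

# Minimal sketch of an idealised memristor crossbar, NOT the Cambridge
# team's device. Weights live on the chip as conductances; applying input
# voltages to the rows produces output currents that are already the
# matrix-vector product (Ohm's law per cell, Kirchhoff's law per column).

rng = np.random.default_rng(0)

G = rng.uniform(0.1, 1.0, size=(4, 3))  # stored weights as conductances (hypothetical values)
V = rng.uniform(0.0, 0.5, size=4)       # input voltages on the four rows

# Column j's current is sum_i V[i] * G[i, j]: the multiply-accumulate
# happens where the weights are stored, so the weights never move.
I = G.T @ V
print("column currents:", I)

# On a conventional chip, every weight must first be fetched from memory.
# Illustrative per-operation energy costs (arbitrary units, assumed):
E_MAC = 1.0      # one multiply-accumulate
E_FETCH = 100.0  # one off-chip weight fetch, typically far costlier than the arithmetic

conventional = G.size * (E_MAC + E_FETCH)  # fetch plus compute for each weight
in_memory = G.size * E_MAC                 # compute only; weights stay in place
print(f"energy ratio, conventional vs in-memory: {conventional / in_memory:.0f}x")
```

The 100-to-1 fetch-to-compute ratio is a stand-in, but the structure of the saving is the real point: once the weights stop travelling, the largest energy term drops out of the sum.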

Neuromorphic computing, the broader approach to which this device belongs, has been a promising direction for years. The gap between a published research result and a commercially viable chip is wide and not always crossed. But the logic is sound: if expanding supply is slow and politically difficult, reducing demand is the other lever. A 70 percent reduction in AI's energy appetite would reframe the grid problem entirely.
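How much would it reframe things? A back-of-envelope calculation helps, with loudly hypothetical inputs: the base load and growth rate below are assumptions for illustration, not sourced figures, and the 70 percent cut is applied as a one-off across all AI load.

```python
import math

# Back-of-envelope only: base load and growth rate are assumptions,
# not sourced figures. The 0.70 cut is the paper's claimed reduction.
base_gw = 50.0  # hypothetical AI data-centre draw today
growth = 0.30   # hypothetical 30% annual demand growth
cut = 0.70      # claimed efficiency gain, applied as a one-off

# Years of growth a one-off cut absorbs: solve (1 + g)^t = 1 / (1 - cut).
t = math.log(1 / (1 - cut)) / math.log(1 + growth)
print(f"a {cut:.0%} cut offsets about {t:.1f} years of {growth:.0%} annual growth")

for year in range(6):
    demand = base_gw * (1 + growth) ** year
    print(f"year {year}: {demand:6.1f} GW uncut, {demand * (1 - cut):5.1f} GW with the cut")
```

Under those assumed numbers, the efficiency gain absorbs roughly four and a half years of demand growth, which is comparable to the lead times discussed below. A large one-off cut buys time; it does not end the race.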

The hard reality is that both paths, grid expansion and hardware efficiency, face lead times measured in years, while the AI industry's appetite for power is growing now.
