The race to dominate AI has mostly focused on models, chips, and data. But another constraint is quietly emerging: the grid.
As demand for compute explodes, companies like OpenAI and Nvidia are now running up against physical limits – the availability of electricity and the capacity of the grid.
The AI boom now depends on energy infrastructure that was never built for this scale, and electricity is becoming one of the most strategic resources in the entire industry.
Why is power suddenly part of the AI conversation?
Running AI at scale isn’t just a software problem. It’s a hardware and infrastructure challenge.
Each time you prompt a language model, it triggers a chain of processes across racks of GPUs in data centers. These machines draw a tremendous amount of power to compute results, and a significant share of that power also goes toward cooling, memory management, and pushing data across the network fast enough to feel instant.
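To get a feel for the numbers, here's a rough back-of-envelope sketch in Python. The per-GPU draw, the server overhead multiplier, and the PUE (power usage effectiveness, the ratio of total facility power to IT power) are all assumed illustrative figures, not measured values:

```python
# Back-of-envelope estimate of total data center facility power.
# Every figure here is an illustrative assumption, not a vendor spec.

GPU_POWER_W = 700      # assumed draw per high-end accelerator, in watts
SERVER_OVERHEAD = 1.5  # assumed multiplier for CPUs, memory, and networking
PUE = 1.3              # assumed power usage effectiveness (cooling, facility losses)

def facility_power_mw(num_gpus: int) -> float:
    """Total facility draw in megawatts for a given GPU count."""
    it_load_w = num_gpus * GPU_POWER_W * SERVER_OVERHEAD
    return it_load_w * PUE / 1e6

for n in (10_000, 100_000, 1_000_000):
    print(f"{n:>9,} GPUs -> ~{facility_power_mw(n):,.0f} MW")
```

Run the loop and the problem becomes obvious: under these assumptions, a campus on the order of a million accelerators lands well past the 1 GW mark before a single model is trained.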
What’s changed is how much electricity is needed to keep up. AI adoption has grown faster than the systems supporting it. Companies are now asking a different kind of question: not “do we have the chips?” but “do we have enough power to turn them on?”
That’s how electricity entered the AI conversation – as a central concern.
What OpenAI and others are planning right now
In a blog post on September 23, OpenAI and its partners (Oracle, SoftBank, and Nvidia) announced plans for a major infrastructure expansion: five new data center campuses across the United States, built under a project code-named Stargate. Together, these sites are expected to draw around 10 gigawatts of electricity once fully operational.
That number is hard to grasp. For context, 10 gigawatts is enough to power several million homes, or a sizable portion of a state’s peak electricity demand. It puts OpenAI and its collaborators in the realm of industrial-scale energy consumers, alongside steel plants, chemical refineries, and major manufacturing hubs.
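That framing is easy to sanity-check. A minimal sketch, assuming an average US household uses roughly 10,500 kWh per year:

```python
# Sanity-checking the "several million homes" comparison.
# The household figure (~10,500 kWh/year) is an assumed round number.

CAMPUS_DRAW_GW = 10
HOME_KWH_PER_YEAR = 10_500
HOURS_PER_YEAR = 8_760

avg_home_draw_kw = HOME_KWH_PER_YEAR / HOURS_PER_YEAR  # ~1.2 kW continuous
homes = CAMPUS_DRAW_GW * 1e6 / avg_home_draw_kw        # 1 GW = 1e6 kW
print(f"~{homes / 1e6:.1f} million homes")             # -> ~8.3 million
```

Roughly eight million homes under these assumptions, which squares with the "several million homes" comparison.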
The scale of this project has even led to some viral comparisons. Some claimed OpenAI would soon use more electricity than entire countries. But the original quote, from ARM’s CEO, was actually referring to global data centers as a whole – not one company.
Even so, this kind of investment signals a shift in priorities. These are no longer just tech projects. They’re energy projects, and they’re being planned like it.
Why electricity (and not chips!) is now the bottleneck
For the last two years, AI companies were heavily focused on GPUs; the big question was whether you could get your hands on enough high-end chips to train competitive models. In 2025, they're fighting for power.
Now it's not just about how many servers you have. It's about whether those servers can run at full capacity, and that depends entirely on access to electricity.
Across the U.S., grid infrastructure isn’t keeping up. Even if a company buys land and builds a new data center, it might have to wait years before it can draw enough power to run it. Transmission lines are overcommitted, local permitting is slow, and utilities are flooded with connection requests. Many projects are stuck in what’s known as the interconnection queue, a growing backlog of sites waiting to be approved and plugged into the grid.
This isn’t just a supply chain issue. It’s a coordination problem. And it’s changing how infrastructure gets planned, where facilities are built, and how fast new AI capabilities can reach the public.
[Image: U.S. map showing interconnection queues and high-demand AI zones.]
So does China have the advantage?
This is where things get interesting. Some observers believe China has a major advantage here, and for good reason:

- Its electricity prices are often lower, especially in provinces with large hydro or coal capacity.
- It has centralized planning, which allows faster coordination between land, power, and fiber.
- It has already launched large AI infrastructure zones, bundling compute with grid access.
In contrast, the U.S. system is more fragmented. Energy is deregulated in many states. Even after land is secured, projects may be delayed by years due to permitting or grid congestion. And in high-demand areas, like Northern Virginia or Central Texas, available power is already stretched thin.
But having easier access to electricity doesn’t guarantee leadership.
Power alone won’t win the AI race
Even with abundant power, you need the right chips, software, and people to turn electricity into intelligence.
Right now, China faces export restrictions that limit its access to Nvidia’s most advanced chips. Local alternatives exist, but they often require more energy per unit of compute.
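That efficiency gap compounds at training scale. Here's an illustrative sketch comparing the energy bill for the same training run on two hypothetical accelerators; the FLOP budget, the efficiency figures, and the utilization rate are all assumptions chosen for illustration, not specs for any real chip:

```python
# Illustrative energy cost of one training run on two hypothetical chips.
# The FLOP budget, efficiency, and utilization figures are all assumptions.

def training_energy_gwh(total_flop, flops_per_watt, utilization):
    """Energy in GWh to execute total_flop at a given efficiency and utilization."""
    joules = total_flop / (flops_per_watt * utilization)
    return joules / 3.6e12  # 1 GWh = 3.6e12 joules

RUN_FLOP = 1e25  # assumed frontier-scale training budget

print(f"efficient chip:      {training_energy_gwh(RUN_FLOP, 2.0e12, 0.4):.1f} GWh")
print(f"less efficient chip: {training_energy_gwh(RUN_FLOP, 0.8e12, 0.4):.1f} GWh")
```

In this sketch, less than half the efficiency means more than double the energy for the same run, which is exactly why chip access and power access are intertwined.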
The software layer adds another challenge. Much of the core tooling (like CUDA, PyTorch, and model optimization libraries) originated in the U.S. and continues to be supported by its developer ecosystem. These tools shape how models are built, fine-tuned, and deployed at scale.
Beyond that, AI depends on talent, capital, research environments, and access to frontier models—each of which carries its own geopolitical weight.
Electricity is necessary to run AI. But on its own, it doesn’t move the needle far enough.
What’s happening next?
The next phase of AI infrastructure isn’t just about faster chips or larger models. It’s about securing long-term access to electricity.
OpenAI and its partners are choosing data center locations based on energy availability, not just land or fiber. They’re setting up near hydropower and nuclear sources, signing long-term purchase agreements, and exploring emerging options like small modular reactors.
Power planning is no longer a background task; it's now part of how AI gets built.
Some analysts believe this shift could shape where the next generation of frontier models will emerge. Not because of compute shortages. Not because of rising costs. Simply because there may not be enough electricity in the right places, at the right time.
The bottom line
The idea that “cheap power will decide who wins AI” makes for a great headline. But it oversimplifies what’s actually happening.
Electricity has become a critical part of the AI stack. And yes, availability and reliability matter more than ever. But power alone isn’t a winning strategy.
What really determines progress is the ability to bring multiple ingredients together—compute, infrastructure, software, skilled teams, and capital. No one input dominates the rest.
Power unlocks potential. But turning that potential into working AI still takes a lot more.