Building Power Supply Isn’t the Only Way to Win the AI Race

September 09, 2025
AI

The “AI Race” is seen as this decade’s Space Race: a signal of American leadership and global competitiveness, and a key priority of the Trump administration. There are significant economic ramifications as well, with the U.S. AI market having the potential to measurably impact GDP beginning in 2027.

However, the future of energy-intensive industries like AI depends on our power grid. The country therefore risks missing out on an AI economic wave, more than $100 billion in market growth over the next five years, simply for lack of energy.

This potential lack of energy is, in part, an elasticity issue. Think of the grid as a rubber band: in its current form, it can stretch to accommodate more demand, but only if pulled in the right way, at the right times. By meeting the need for electricity during the roughly 40 hours a year when the grid is stretched to its full capacity, we could unlock new capacity to expand critical industries like AI.

However, instead of strategically stretching the grid periodically as needed, regulatory conversations and political action to address our supply and demand imbalance tend to emphasize putting steel in the ground: building more of the supply-side and grid infrastructure solutions that the power industry has relied on for decades.

Even with President Trump’s efforts to accelerate new energy generation, building nuclear and natural gas power plants remains a long-lead, capital-intensive undertaking: a 15-year solution to a five-year problem. U.S. electricity demand is projected to grow 25% by 2030 compared to 2023 levels, outpacing supply and pushing power prices up by as much as 40%.

We need immediate solutions to compete in the global AI race, and there are untapped gigawatts of power already available in our current electricity system. Instead of only adding supply, we can also reduce peak demand by aggregating large, flexible loads and other customer-sited energy assets into virtual power plants (VPPs).

Estimates suggest this approach could unlock nearly 100 gigawatts of growth potential on our grid. Adding that same amount of capacity by building new natural gas plants would cost close to $100 billion and take years, if not more than a decade, to achieve. VPPs, by contrast, can be deployed in months without costly infrastructure investments; they provide flexibility to our existing grid by using customer energy assets such as flexible load reduction, battery storage systems, solar panels and smart thermostats to level supply and demand. VPPs currently provide 30 GW to 60 GW of capacity in the U.S., roughly 6% to 12% of the country’s total natural gas capacity, but they have not reached their full potential. The Department of Energy has reported that accelerating VPP deployment to reach 80 GW to 160 GW can support the rapid rise in energy demand and help reduce price spikes.

Turning our AI ambitions into reality requires regulators and policymakers to prioritize VPPs as a competitive edge in the AI race and to enable our existing grid to support more economic growth.

First, we need to incentivize energy flexibility among large energy customers. If new demand drivers, like AI data centers or crypto-mining facilities, join VPPs and participate in demand-side flexibility programs, they can access the energy they need faster, support the grid, and drive economic growth in the U.S. Those same VPPs also give all types of large energy users an opportunity to enroll and offset their rising energy costs. Regulators should embrace this VPP participation, which meets demand by stretching the grid. In Texas, for example, legislators approved Senate Bill 6, which will require new large loads like data centers to participate in load management, a model other parts of the country may adopt.

Second, we need to lift key restrictions on VPP participation. Despite FERC Order 2222, which was designed to promote VPP participation, implementation in some power markets has fallen well short of FERC’s vision. Regulators can encourage increased VPP enrollment by ensuring customer energy assets can participate in multiple grid programs simultaneously and by allowing local utilities to dispatch VPPs.

And finally, we need to push for improved access to meter data. In the U.S., many electricity customers have a smart meter that produces real-time data about their power consumption, but most utilities restrict access to this data. If state and energy regulators helped give VPP operators and their customers better access to non-sensitive meter data, it could spur faster payments for participation, more customer sign-ups and an acceleration of available distributed resources across the country.

In short, it’s high time we end the bias and treat distributed energy as equivalent to supply-side solutions. A demand reduction dispatched through a VPP is functionally identical to increased generation dispatched from a peaker power plant.

By rapidly accelerating VPPs, which are already proving their worth across the country, we can help capture the near-term value of an American AI renaissance. If AI companies embrace energy flexibility, driven by regulatory and policy support, this power-hungry industry will go from a grid problem to an economic solution that helps the U.S. compete on the global stage.

Michael D. Smith is CEO of CPower Energy, which operates a virtual power plant platform across more than 23,000 commercial and industrial customer sites in the U.S.
