Data Center Growth: What is Reality, and Who Should be Paying for It?

Aug 19, 2025 | Data Centers, Fusion Energy

[Image: Nova laser bay, LLNL]

The Fusion Report has run several articles on the impact of data centers on both electricity consumption and electricity management. One of the clear messages is that the growth of data centers, particularly AI data centers, is straining the power grid in a number of locations. Northern Virginia, home to the world's largest concentration of data centers, is expected to quadruple its power demand by 2030, while Texas grid authorities expect that half of the state's new power demand by 2030 will come from data centers. Similar pressures are emerging in Ohio, Illinois, North Dakota, Georgia, Arizona, and Utah.

More concerning is that this growth has already raised electricity rates in these areas for residential and small-business customers. In response, Texas recently passed legislation that allows the state's independent system operator (ISO) to disconnect data centers during power emergencies, while requiring customers drawing more than 75 MW to pay for their grid connection costs (California, Georgia, Oregon, and New Jersey are considering similar laws).

What is Reality on the Rate of AI Data Center Growth?

One of the key questions around data center electricity consumption is how fast data centers are really going to grow. Over the last few months, there has been considerable controversy over whether the forecasted electricity growth is real.

This controversy was fueled by a study from London Economics International (funded by the Southern Environmental Law Center). The study examined what would happen if utilities expanded electricity generation capacity based on “conventional wisdom” projections of data center energy consumption, and the impact that would have on electricity costs. The study identified several factors that call these projections into question:

  1. Much of the utility data on data center electricity consumption growth is based on requests for electrical service. The study found that many data center developers submit service requests to several local utilities at once, essentially double-counting future demand. Moreover, many of these requests are early in the interconnection process, before any financial commitments have been made.
  2. Whether AI chipmakers can keep up with demand for their chips. In particular, there were questions about whether NVIDIA, by far the leading chip vendor to AI data centers, has the capacity to build all of the GPUs required to support this growth.

If the points above are valid (less demand than anticipated, and an inability to supply it at the chip production level), demand growth from AI data centers should be far less than “conventional wisdom” forecasts, and the unneeded buildout of generation capacity that would follow would simply raise rates for customers. On the other hand, because power plant construction lags demand, some “over-building” of generation capacity would make it more likely that the power is there when needed. Moreover, the increased electricity demand will also affect the utility grid in a number of ways, a topic we have recently covered.

In Any Case, Who Should be Paying for the Increased Demand?

Unsurprisingly, this question has its roots in an older one: why do large electricity consumers like industry get significant breaks on their electricity rates, effectively subsidized by ordinary ratepayers? The conventional explanation is that large industrial demand tends to be more consistent and forecastable than residential consumption. When there is enough supply to meet demand, this is a reasonable argument.

However, in areas where data centers are causing electricity shortages, this is clearly not the case. Moreover, the companies that own these data centers are often highly profitable tech giants that can afford the costs of the required infrastructure improvements. Finally, unlike industries that create jobs, data centers themselves create very few long-term jobs; they are mostly “empty buildings.” Data center construction does create a significant number of jobs, but these are temporary. And the indirect jobs created by data centers, whether for software developers or for businesses using AI, do not have to be co-located with the data centers themselves. This lets developers “arbitrage” data center siting to capitalize on favorable electricity rates, since electricity (along with the water used to cool data center equipment) is one of the largest data center operational costs.

In the end, there need to be incentives and regulations that shield local ratepayers from the costs of data center energy consumption while not discouraging the development of data centers. Similarly, improving the power efficiency of AI chips needs to be incentivized as well: an NVIDIA H200 GPU can draw up to 700 watts, and a large AI data center can contain tens of thousands of them (or more). Both approaches can help mitigate both the electricity demand and its cost.
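To put those numbers in perspective, here is a rough back-of-envelope sketch of facility power and energy cost. Only the 700-watt H200 figure comes from this article; the GPU count, utilization, power usage effectiveness (PUE), and electricity rate are illustrative assumptions:

# Back-of-envelope estimate of AI data center power demand and energy cost.
# Only the 700 W per-GPU figure comes from the article; every other input
# is an illustrative assumption, not a reported number.

GPU_POWER_W = 700          # NVIDIA H200 maximum draw (from the article)
NUM_GPUS = 50_000          # assumed GPU count for a large AI data center
UTILIZATION = 0.8          # assumed average fraction of maximum draw
PUE = 1.3                  # assumed power usage effectiveness (cooling, overhead)
RATE_USD_PER_KWH = 0.08    # assumed industrial electricity rate

# IT load from the GPUs alone, then total facility load including overhead.
gpu_load_mw = GPU_POWER_W * NUM_GPUS * UTILIZATION / 1e6
facility_load_mw = gpu_load_mw * PUE

# Energy consumed and paid for over a year of continuous operation.
annual_mwh = facility_load_mw * 24 * 365
annual_cost_usd = annual_mwh * 1_000 * RATE_USD_PER_KWH

print(f"GPU load: {gpu_load_mw:.1f} MW")                       # ~28.0 MW
print(f"Facility load (with PUE): {facility_load_mw:.1f} MW")  # ~36.4 MW
print(f"Annual energy: {annual_mwh:,.0f} MWh")                 # ~318,864 MWh
print(f"Annual electricity cost: ${annual_cost_usd:,.0f}")     # ~$25.5 million

Even under these assumed inputs, such a facility draws tens of megawatts around the clock; at the 100,000-plus GPU scale now being discussed, the load approaches or exceeds the 75 MW threshold in the Texas legislation described above.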