A friend texted me last week: "Is AI really making my electricity bill go up? I keep seeing this everywhere, and I'm deciding whether to sign a petition to prevent a data center from being built in my Atlanta neighborhood."
The short answer is: kind of. Yes, AI data centers are driving up costs, but it's not as extreme as the headlines make it sound.
AI data centers are not the only reason your bill is skyrocketing. Electricity prices were already rising before the AI boom, driven by aging infrastructure, wildfire costs, tariffs on construction materials, and the push to electrify everything from cars to stoves. However, data centers are accelerating a trend that was already in motion at a scale that's hard to ignore.
I went deep on the data so you don't have to. Here's what's actually happening, who's getting hit hardest, and what might actually fix it.
Key takeaways: Electricity prices are up ~30% since 2020, driven by aging infrastructure, wildfire costs, and electrification. AI data centers are accelerating this trend, accounting for 40% of new demand growth. The mid-Atlantic, Texas, and California are most affected. Flexible data center management could ease grid strain — but policy needs to catch up.
Your Bill Was Already Going Up
Let's be clear: electricity prices were rising long before ChatGPT existed.
Since 2020, residential electricity prices in the US have gone up about 30%. In 2025 alone, prices rose nearly 7%, more than double the rate of inflation.
About 70% of transmission lines and transformers in the US are over 25 years old. The American Society of Civil Engineers recently gave our energy infrastructure a D+. This isn't just age; it's decades of deferred maintenance and underinvestment by utilities that had every incentive to build new things rather than maintain what they already had. For comparison, China's electricity costs about half what ours does, largely because China invested in grid infrastructure ahead of demand rather than scrambling to catch up after the fact. We're now paying the bill for decades of kicking the can down the road, and that cost alone would be pushing your bill up even without a single data center.
In California, billions have gone toward wildfire prevention and grid hardening. Tariffs on steel and aluminum (the materials you need to build power lines, substations, and transformers) have pushed equipment costs up 6–9% across the board. Climate change is increasing demand through hotter summers and more extreme weather events. Finally, the push to electrify everything from EVs to heat pumps to induction stoves is adding load to a grid that was already strained.
AI data centers didn't start the fire, but they're adding a lot of fuel.
How Data Centers Actually Show Up on Your Bill
Data centers are massive buildings full of servers that power everything from Google searches to ChatGPT to your Netflix recommendations. They run 24/7 and consume enormous amounts of electricity, both for computing and for keeping the machines cool.
The scale is hard to wrap your head around. Data centers are expected to consume more electricity than all US homes for the first time by the end of 2026. AI workloads are projected to drive a 165% increase in data center power demand by 2030.
What this means for you:
Grid Upgrades
Utilities have to build new transmission lines, substations, and generation capacity to serve data centers because of the sheer amount of power they draw. As regulated monopolies, utilities are obligated to serve every load, no matter how big or small. State regulators approve rate increases to fund those investments, and the costs get spread across all customers, including you. Utilities requested more than $29 billion in rate increases in just the first half of 2025, double the amount from the year before.
Wholesale Price Pressure
Data centers compete with homes and businesses for the same electricity. When demand outpaces supply, wholesale prices rise, and those increases flow through to retail rates. Goldman Sachs found that electricity prices jumped 6.9% in 2025, more than double headline inflation, with data centers accounting for 40% of demand growth.
The Discount Problem
Data centers negotiate bulk rates and special pricing with utilities. A Yale Climate Connections analysis found that residential prices rose 25% since 2020, while industrial users are actually paying less than they were two years ago. The infrastructure costs are being shared by everyone, but the discounts are not.
To put that in perspective: if you're a household paying $250/month for electricity, a meaningful portion of your rate increases over the past few years went toward infrastructure that primarily serves data centers. Meanwhile, those same data centers negotiated rates that insulate them from the very price increases their demand helped create.
Which Regions Are Getting Hit Hardest
The impact isn't evenly distributed geographically.
The Mid-Atlantic and Midwest: Ground Zero
The PJM grid region, covering 13 states from Virginia to Illinois, is where the largest concentration of data centers is being built. The cost of securing future power in PJM has exploded, with $23 billion in costs attributable to data centers according to the grid's own watchdog. Those costs get passed directly to consumers. Virginia and New Jersey are the most acutely affected, with the governors of both states declaring states of emergency over residential electricity prices.
Texas
Texas projects that total electricity load will grow from 87 GW in 2025 to 145 GW by 2031, with data centers accounting for nearly half of that growth. The state's deregulated market means wholesale price swings reach consumers faster. Customers on variable-rate plans are the most immediately exposed to spikes driven by surging data center demand. Texas passed Senate Bill 6 in 2025 specifically to force large data centers to bear more of the infrastructure costs rather than passing them to residential ratepayers. It's a step in the right direction. But the rules around how transmission costs are actually allocated to data centers won't be finalized until late 2026. In the meantime, analysts still forecast a 45% price surge in parts of Texas this summer.
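The Texas projection is easier to feel with the arithmetic spelled out. A quick back-of-envelope calculation using only the figures above (the "nearly half" data center share is treated here as exactly 50% for illustration):

```python
# Back-of-envelope math on the Texas grid projections cited above.
load_2025_gw = 87    # projected total load, 2025
load_2031_gw = 145   # projected total load, 2031

growth_gw = load_2031_gw - load_2025_gw       # 58 GW of new demand
growth_pct = growth_gw / load_2025_gw * 100   # ~67% growth in six years

# "Nearly half" of that growth attributed to data centers
data_center_share_gw = growth_gw * 0.5        # ~29 GW

print(f"Total growth: {growth_gw} GW ({growth_pct:.0f}%)")
print(f"Data center share: ~{data_center_share_gw:.0f} GW")
```

In other words, Texas expects to bolt on roughly two-thirds of its current grid in six years, and data centers alone account for a load comparable to several large cities.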
California
California has some insulation from its diversified generation mix (solar, wind, hydro alongside natural gas), but rates are already the highest in the continental US at about 33 cents per kWh, 67% above the national average. Data center demand adds pressure on top of wildfire costs, grid hardening, and the renewable transition. Even a modest percentage increase translates to real dollars when you're starting from the most expensive rates in the country.
The Most Insulated Regions
The Pacific Northwest (Washington, Oregon) relies heavily on hydroelectric power, making it the least exposed to natural gas and data center-driven price spikes. States with heavy nuclear or wind generation (Illinois, Iowa, Kansas, South Dakota) also have a natural hedge. If you live in these areas, you're more likely to feel the AI boom at the gas pump than on your utility bill.
Data Centers Could Actually Help the Grid (If Done Right)
Data centers don't have to be grid liabilities. If managed responsibly, they could become some of the most valuable flexible assets on the grid.
Not all computing tasks are equally urgent. Training an AI model can pause for a few minutes without consequence, but processing a real-time transaction can't wait. If data centers can shift their energy-intensive work to times when the grid has excess capacity and pull back during peak demand, they stop being a constant drain and start acting more like a giant battery. That flexibility is especially valuable for integrating renewables, which tend to flood the grid with excess energy at off-peak hours.
Emerald AI, an Nvidia-backed startup, has been running live demonstrations showing exactly this. In a trial with National Grid in London, their software reduced a data center's electricity consumption by up to 40% during grid stress events while keeping critical workloads running. Their modeling suggests that flexible demand management could unlock nearly 100 GW of grid capacity using infrastructure we already have, roughly one-fifth of total US capacity, without building a single new power plant.
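To make the "giant battery" idea concrete, here is a toy sketch of the scheduling logic, purely illustrative and not any vendor's actual system. It assumes a short horizon with known grid demand, splits workloads into "urgent" (fixed in place) and "deferrable" (movable), and pours the deferrable energy into the hours with the most grid headroom:

```python
# Toy sketch of data center load shifting -- illustrative only, not
# Emerald AI's or any utility's actual scheduler. All numbers invented.

def schedule_load(grid_demand, urgent_load, deferrable_total, capacity):
    """Place deferrable energy into the hours with the lowest grid
    demand, never exceeding the data center's per-hour capacity."""
    # Visit hours from least-stressed to most-stressed
    hours = sorted(range(len(grid_demand)), key=lambda h: grid_demand[h])
    plan = list(urgent_load)  # urgent work runs when requested
    remaining = deferrable_total
    for h in hours:
        room = capacity - plan[h]
        take = min(room, remaining)
        plan[h] += take
        remaining -= take
        if remaining == 0:
            break
    return plan

# Example: grid stress peaks mid-list (hours 3-4 here)
grid = [50, 55, 70, 90, 95, 80]    # relative grid demand per hour
urgent = [10, 10, 10, 10, 10, 10]  # real-time workloads, can't move
plan = schedule_load(grid, urgent, deferrable_total=30, capacity=25)
print(plan)  # -> [25, 25, 10, 10, 10, 10]
```

The deferrable 30 units land entirely in the two calmest hours, while the peak hours see only the fixed urgent load, which is the whole trick: same total energy consumed, far less strain at the moments the grid can least afford it.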
If data centers become flexible grid participants instead of rigid energy hogs, utilities won't need to build as much new infrastructure to serve them. Less infrastructure buildout means less cost passed to you.
However, this can't get implemented overnight. It requires regulators to mandate flexibility, utilities to adopt new standards, and tech companies to actually follow through rather than just signing voluntary pledges. The technology exists today, but the question is whether policy catches up before consumers absorb years of avoidable cost increases.
Why We're Building Nura
We know how frustrating it is to hear from your boss that AI is the inevitable future, then hear from your politicians that AI data centers are making your electricity unaffordable. It feels like the world is pulling you in a million directions without ever breaking down how any of these changes actually hit your day-to-day life.
Your bill certainly doesn't help. It doesn't show you how much of your cost is driven by infrastructure buildout versus your actual usage. It doesn't tell you whether a different rate plan would save you money based on your specific usage pattern.
That's what we're building with Nura. Our goal is to bring transparency to every dollar spent on energy in America, starting with empowering consumers with the tools to understand their bill and take control of it.
If you're a PG&E customer in the Bay Area, you can join the Nura beta waitlist here.
