Powering AI Without Breaking the Grid

December 17, 2025
3 min read

Copyright © 2025 by Energy Intelligence. All rights reserved. This article first appeared in December 2025 on Energy Intelligence.

The electricity grid is approaching a breaking point, both operationally and financially. Energy demand from data centers is soaring faster than the grid can keep up with, and ratepayers could end up shouldering the costs. Over the next five years, utilities face a choice: stick strictly to their role as electricity suppliers, or step up as partners in the artificial intelligence revolution.

The Alarm Bells Are Ringing

This year alone, data center-fueled demand for power jumped 22% across the US, and that figure is forecast to triple by 2030. These data centers have 24/7 power requirements, reaching eye-watering levels among hyperscale facilities. But the power system, as it stands, is not equipped to keep pace. Regions around the world are grappling with the same challenge of an outdated grid. The US government has recognized that an aging grid poses serious threats to the country’s economic well-being and its status at the top of the AI race. Across the pond, in Europe, almost one-third of grid infrastructure is more than 40 years old, a factor widely cited as a culprit in the mass blackout across the Iberian Peninsula in April 2025.

Even if utilities are ready to dive in and renew the grid, permitting hurdles are another barrier to contend with. Worryingly, permitting delays are racking up hidden costs for ratepayers to the tune of tens of billions of dollars; slashing permitting timelines by just one year could unlock an estimated $22 billion in project returns on infrastructure development. All of this comes against a backdrop of spiking utility costs for US consumers.

Hence, hyperscalers (the likes of Microsoft, OpenAI, Meta, Amazon, xAI, Oracle, CoreWeave and Google) are adopting a “bring your own power” approach. They are moving off-grid, deploying independent power sources such as gas turbines, fuel cells or even small modular nuclear reactors to meet their energy needs. But this model carries serious downsides: higher cost per kilowatt, ballooning carbon footprints, unproven reliability, poor resiliency and lower efficiency, to name a few.

Enabling hyperscalers to operate off the grid for roughly 50 to 60 peak hours each year could significantly reduce the additional costs their data centers currently shift onto other ratepayers: households and small businesses. Every customer a utility serves helps it spread its fixed grid and network costs, most of which are tied up in towers, poles and wires, across a larger base, which brings down the typical per-kilowatt-hour charges paid by households and small businesses. When hyperscalers move completely off the grid, those fixed costs are spread across fewer customers, and the fallout is felt by everyone, including the average consumer and small business.
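To make the cost-spreading arithmetic concrete, the sketch below works through a hypothetical service territory. The fixed-cost figure and load volumes are illustrative assumptions, not data from this article or any utility; the point is simply that the same fixed costs divided over a smaller delivered volume means a higher per-kilowatt-hour charge for everyone who remains.

```python
# Illustrative sketch only: the figures below are made up to show the
# cost-allocation mechanics described above, not actual utility data.

def fixed_cost_charge(fixed_cost_usd: float, total_kwh_delivered: float) -> float:
    """Per-kWh charge needed to recover a utility's fixed grid costs."""
    return fixed_cost_usd / total_kwh_delivered

# Hypothetical territory: $500M/yr in poles-and-wires costs,
# 10 TWh of household/small-business load plus 2 TWh of hyperscaler load.
fixed_cost = 500_000_000
small_customer_kwh = 10_000_000_000
hyperscaler_kwh = 2_000_000_000

with_hyperscalers = fixed_cost_charge(fixed_cost, small_customer_kwh + hyperscaler_kwh)
without_hyperscalers = fixed_cost_charge(fixed_cost, small_customer_kwh)

print(f"Fixed-cost recovery with hyperscalers on the grid: {with_hyperscalers * 100:.2f} cents/kWh")
print(f"Fixed-cost recovery after hyperscalers leave:      {without_hyperscalers * 100:.2f} cents/kWh")
```

Under these assumed numbers, the charge rises from about 4.2 to 5.0 cents per kilowatt-hour once the hyperscaler load departs, which is the cost shift the article describes.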

The Way Forward Lies in Partnership

This is a wake-up call for the utility sector. To avoid being left in the dust by aggressively accelerating hyperscalers, an outcome that would force even more costs onto ratepayers, utility providers need to evolve beyond being mere service providers.

Rather than operating in isolation, utilities and hyperscalers must come together in a co-development approach rooted in adaptive, dynamic load-management agreements and flexible interconnection models. Under load-management agreements, utilities can reroute power to where it is most needed during peak demand hours. This approach also lets hyperscalers rely on their own power during the utilities’ peak demand hours and revert to grid power at all other times, minimizing the risk of outages and keeping energy prices in check.
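As a rough illustration of how such an agreement might be operationalized, the sketch below encodes the dispatch rule just described: on-site power covers the data center during hours the utility flags as system peaks, and grid power serves it the rest of the time. The data structures, field names and numbers are hypothetical, not drawn from any actual tariff or contract.

```python
# Minimal sketch of the peak-hour dispatch rule described above,
# using assumed inputs: the utility flags which hours count as system
# peaks, and the data center splits its load accordingly.
from dataclasses import dataclass

@dataclass
class HourlyConditions:
    hour: int                  # hour of day, 0-23
    is_utility_peak: bool      # flagged by the utility under the agreement
    onsite_capacity_mw: float  # data center's own generation available
    load_mw: float             # data center demand this hour

def dispatch(c: HourlyConditions) -> dict:
    """Decide how much load is served on-site vs. from the grid for one hour."""
    if c.is_utility_peak:
        # During system peaks, lean on on-site power first; the grid only
        # covers any shortfall the on-site fleet cannot meet.
        onsite = min(c.load_mw, c.onsite_capacity_mw)
        grid = c.load_mw - onsite
    else:
        # Off-peak, the data center reverts to grid supply.
        onsite = 0.0
        grid = c.load_mw
    return {"hour": c.hour, "onsite_mw": onsite, "grid_mw": grid}

# Example: a 300 MW campus with 250 MW of on-site generation during a 6 p.m. peak.
print(dispatch(HourlyConditions(hour=18, is_utility_peak=True,
                                onsite_capacity_mw=250.0, load_mw=300.0)))
```

In this toy example, the campus serves 250 MW from its own generation and draws only 50 MW from the grid during the peak hour, which is the kind of relief a dynamic load-management agreement is meant to deliver.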

Under this arrangement, utility providers can keep the lights on for hyperscalers despite the massive energy loads involved, improving reliability and resiliency for those facilities. They can also keep runaway utility rates in check for the average consumer.

The costs of meeting data center energy needs should not sit with utilities (and ratepayers) but with the hyperscalers themselves. Shifting the capital risk to the parties generating the demand frees up much-needed space for utilities to focus on facilitating grid interconnection and reliable oversight for homes and hyperscalers alike.

In practice, utilities would collaborate with hyperscalers to design and connect on-site (or near-site) power generation. Each party has a clearly delegated role: hyperscalers carry the capital risk for the new generation their data centers require, while utilities remain grid stewards, providing the operational and technical know-how to keep power running smoothly.

Minimizing Financial Risk for the Utility Ratepayer

The reason this co-development model calls for the capital risk to sit with hyperscalers is simple: to protect ratepayers from potential overbuilding. In many ways, the data center boom echoes the dot-com bubble, and it is crucial that organizations do not get ahead of themselves, particularly since forecasting data center growth is not an exact science. When a partnership requires hyperscalers to finance their own dedicated power capacity and grid interconnection to serve this surging demand, the capital risk borne by other consumers drops significantly.

There is also a contractual aspect that further shields everyone from these “bubble” risks. The co-development model outlined here instills greater load-forecasting and cost-recovery discipline and moves away from speculative grid capacity expansion. Instead, service and operational upgrades are tied to long-term contracts between utility providers and hyperscalers. In this setup, contracts stipulate cost-sharing and demand-management provisions, data center operators gain access to utilities’ expertise, regulatory certainty and faster grid interconnection, and consumers across the board enjoy greater cost and infrastructure stability.

There is also a regulatory factor to consider in making this scenario work. Reforms in federal permitting could accelerate these partnerships and make them a feasible, affordable reality. For example, the US Department of Energy’s Coordinated Interagency Transmission Authorizations and Permits (CITAP) Program now offers a consolidated two-year process to streamline environmental reviews for transmission and interconnection projects. Pairing these accelerated reforms with capital contributed by the hyperscalers means utilities can improve affordability for ratepayers and other customers.

The utility providers that step up as co-authors of the grid’s next AI-driven chapter will ultimately rewrite what comes next for the sector as a whole. The next few years could be a crucible in which scalable, affordable growth is forged for the energy sector, sparing ratepayers yet another hit. But that requires careful and balanced collaboration among hyperscalers, utilities and government bodies alike.

Hari Vasudevan
Founder & CEO of KYRO AI

Hari Vasudevan, PE, is a serial entrepreneur and engineer focused on AI-driven solutions for utilities, construction, and storm response. As Founder and CEO of KYRO AI, he leads the development of AI-powered software that helps utility, vegetation, and field service teams digitize operations, improve storm response and restoration, and reduce operational risk. He also serves as Vice Chair and Strategic Adviser for the Edison Electric Institute’s Transmission Subject Area Committee and holds bachelor’s and master’s degrees in civil engineering with professional engineering licensure in multiple states.
