As data centers surge to the forefront of U.S. infrastructure, their massive energy appetite is sparking real concern about grid strain — especially as artificial intelligence continues to fuel exponential compute demand. With projections suggesting data centers could account for 12% of total U.S. electricity consumption by 2030, the question is no longer just how to support their growth, but how to do so sustainably.
According to David Chernis, director of flexible compute platforms at CPower, the future of the data center industry hinges on one key shift: flexibility. “Data centers will need to be flexible if we're going to keep all of this innovation in the U.S.,” he said. In other words, facilities that once demanded 100% uptime may need to adapt to new roles as active partners in grid stability.
A new study from Duke University’s Nicholas Institute suggests the potential is massive. If large-load users like data centers curtail just 0.25% of their annual energy use, the grid could absorb 76 gigawatts of new load without the immediate need for new power plants. At a 1% curtailment rate, the grid could integrate up to 126 gigawatts. That level of flexibility could unlock an era of expansion without triggering an infrastructure crisis.
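To put those percentages in perspective, the back-of-the-envelope arithmetic below (our own illustration, not the Duke study’s methodology, using a hypothetical 100-megawatt facility running at constant power) shows that a 0.25% curtailment rate works out to roughly 22 hours of full shutdown per year, or about 44 hours at half power.

```python
# Back-of-the-envelope arithmetic on the curtailment figures quoted above.
# Illustrative only: the 100 MW facility size and constant-power assumption
# are ours, not the Duke study's methodology.

HOURS_PER_YEAR = 8_760

def curtailment_hours(rate: float, depth: float = 1.0) -> float:
    """Hours per year a constant-power load must curtail at the given depth
    (1.0 = full shutdown, 0.5 = halve consumption) to forgo `rate` of its
    annual energy."""
    return rate * HOURS_PER_YEAR / depth

facility_mw = 100
annual_mwh = facility_mw * HOURS_PER_YEAR  # 876,000 MWh if never curtailed

for rate in (0.0025, 0.01):  # the 0.25% and 1% scenarios cited above
    print(
        f"{rate:.2%} curtailment ≈ {rate * annual_mwh:,.0f} MWh forgone, "
        f"i.e. ~{curtailment_hours(rate):.0f} h/yr of full shutdown "
        f"or ~{curtailment_hours(rate, depth=0.5):.0f} h/yr at half power"
    )
```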
But this shift would require a new mindset for data centers, Chernis said. “They’ve traditionally thought of themselves as real estate companies, not energy companies.” Their mission-critical services, like hosting live websites or financial platforms, allow little room for downtime. But not all servers are mission-critical. Crypto mining, for example, can pause without consumer impact. These operations account for as much as 2.3% of U.S. electricity consumption and can shed their entire load within seconds, functioning like a “digital battery” during emergencies.
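As a loose sketch of that “digital battery” idea, the snippet below pauses deferrable, non-mission-critical jobs the moment an emergency signal arrives. The job names, signal source, and controller are all hypothetical; this is not CPower’s system or any grid operator’s actual API.

```python
# Toy sketch of the "digital battery" concept: pause deferrable jobs
# (e.g., crypto mining or batch compute) on a grid emergency signal and
# resume them later. All names and values are hypothetical.

from dataclasses import dataclass, field

@dataclass
class DeferrableJob:
    name: str
    load_mw: float
    running: bool = True

@dataclass
class LoadShedController:
    jobs: list[DeferrableJob] = field(default_factory=list)

    def shed_all(self) -> float:
        """Pause every running deferrable job; return the MW of load shed."""
        shed = 0.0
        for job in self.jobs:
            if job.running:
                job.running = False
                shed += job.load_mw
        return shed

    def restore_all(self) -> None:
        for job in self.jobs:
            job.running = True

controller = LoadShedController([
    DeferrableJob("mining-cluster-a", load_mw=12.0),
    DeferrableJob("batch-training-b", load_mw=8.5),
])

grid_emergency = True  # in practice, an operator or market signal
if grid_emergency:
    print(f"Shed {controller.shed_all():.1f} MW of deferrable load")
```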
Chernis sees these non-live workloads as key players in a new energy paradigm, where data centers contribute to the grid during periods of peak demand rather than simply consuming. This isn’t just theoretical — some states are already leading the way.
In Texas, the newly passed Senate Bill 6 requires large energy users to provide backup power, participate in load reduction during emergencies, and contribute to grid upgrades. The law marks a pivotal moment in redefining the relationship between tech infrastructure and public utilities — a model other states are now eyeing.
Chernis believes this evolving regulatory environment makes participation in demand response programs — where large users shift or reduce energy use in exchange for compensation — more vital than ever. “It’s about being a partner with the grid instead of being a burden,” he said.
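For a rough sense of the economics, demand response compensation is often structured as a payment per kilowatt of curtailable capacity a site enrolls, sometimes plus an energy payment for what it actually curtails during events. The rates and site size below are placeholders for illustration, not terms of any actual program.

```python
# Rough illustration of how demand response compensation is often structured:
# a capacity payment for enrolled curtailable load plus an energy payment for
# load actually curtailed. All rates and figures below are placeholders.

enrolled_kw = 20_000            # 20 MW of curtailable load (hypothetical site)
capacity_rate_kw_yr = 40.0      # $/kW-year capacity payment (placeholder)
curtailed_mwh = 150             # energy actually curtailed during events
energy_rate_mwh = 200.0         # $/MWh energy payment (placeholder)

capacity_revenue = enrolled_kw * capacity_rate_kw_yr
energy_revenue = curtailed_mwh * energy_rate_mwh

print(f"Capacity revenue: ${capacity_revenue:,.0f}/yr")
print(f"Energy revenue:   ${energy_revenue:,.0f}/yr")
print(f"Total:            ${capacity_revenue + energy_revenue:,.0f}/yr")
```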
There are financial and social benefits to this approach. Reducing strain on the grid can help local economies thrive by preventing blackouts and stabilizing electricity prices for homes and businesses. “You're supporting productivity gains in the local area,” Chernis explained. “That inherently provides a cost benefit.”
Still, adoption faces challenges. Risk aversion is one of the biggest barriers, especially in a mission-critical environment where outages are costly. Chernis noted that the industry often resists change, even when the tools for flexible consumption are readily available.
Another issue is the slow interconnection timeline, which currently averages 5 to 8 years. That lag makes it harder for data centers to balance their own energy supply and demand in real time.
CPower, which operates as a virtual power plant platform, helps companies tap into demand flexibility by coordinating energy use across nearly 23,000 sites. Its services range from optimizing energy allocation and navigating regulatory compliance to unlocking revenue through grid participation. “We assess your capabilities and tailor monetization opportunities,” Chernis said.
Ironically, the same AI systems contributing to increased demand are also key to solving the problem. CPower is using AI to optimize energy consumption, predict grid failures, and intelligently manage load across networks.
“The vertical demand growth caused by AI is real,” Chernis said. “But AI is also central to the solution.”
As electricity demand rises and new grid regulations take shape, the message is clear: the most successful data centers of tomorrow will be those that operate not just as infrastructure, but as collaborators in America’s energy future.
