AI Energy Crunch May Be Short-Lived


[Graphic: an AI data center power supply]
Here is one AI concern that might just solve itself sooner than most people thought.



Recent anxieties over artificial intelligence's (AI's) soaring energy consumption have gripped both the tech industry and environmental advocates, with data centers facing scrutiny for seemingly unsustainable power demands.

But 2026 is shaping up to be a year of dramatic change as breakthrough efficiency gains suggest these concerns may be short-lived.

The story begins with hardware innovation. The leap from Nvidia's Blackwell chips to the newly unveiled Rubin generation is claimed to deliver up to a tenfold improvement in energy efficiency, meaning roughly four times fewer chips would be needed to train AI models. At the same time, the adoption of liquid cooling has cut Power Usage Effectiveness (PUE), the ratio of a data center's total power draw to the power consumed by its computing equipment, from 1.5 to below 1.1, a major milestone for data-center sustainability.
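To see what the cited PUE improvement means in practice, here is a minimal back-of-the-envelope sketch (our illustration, not from the article; the 1 MW IT load is a hypothetical figure). Because PUE is total facility power divided by IT power, total draw for a fixed computing load scales linearly with PUE:

```python
def total_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility power implied by an IT load and a PUE ratio.

    PUE = total facility power / IT equipment power, so
    total = IT load x PUE.
    """
    return it_load_kw * pue

it_load = 1_000.0                          # hypothetical 1 MW of IT equipment
old_total = total_power_kw(it_load, 1.5)   # 1,500 kW at the old PUE of 1.5
new_total = total_power_kw(it_load, 1.1)   # 1,100 kW at a PUE of 1.1
savings = (old_total - new_total) / old_total
print(f"Facility-level savings: {savings:.0%}")  # about 27%
```

In other words, dropping PUE from 1.5 to 1.1 trims cooling and power-delivery overhead from 50% of the IT load to 10%, cutting a facility's total draw by roughly a quarter without touching the chips themselves.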

But the transformation doesn’t stop at the server rack. Analysts predict that by the end of 2026, a significant share of AI workloads will shift to edge devices—think smartphones and laptops—where smaller, specialized models process data locally. This architectural shift would not only boost computing efficiency but also relieve the burden on centralized data centers.

Grid optimization is also entering the spotlight. New liquid cooling systems are being used to redirect waste heat for building and industrial heating, while AI-powered utilities are optimizing electricity transmission, creating extra capacity without the need for new infrastructure.

Underlying these advances is Koomey's Law, the observation that the energy required per computation has historically halved roughly every 1.57 years. Even as demand for AI grows, the energy needed per unit of work continues to fall, a trend that gives the industry powerful tools to address environmental concerns through innovation rather than restriction.
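Taking the halving period as stated, the trend compounds quickly. A short sketch (our illustration, with the function name and horizon choices hypothetical) projects the fraction of today's energy a computation would require:

```python
def energy_per_computation(years_from_now: float,
                           halving_period: float = 1.57) -> float:
    """Fraction of today's energy per computation, assuming it halves
    every `halving_period` years (Koomey's Law as stated in the article)."""
    return 0.5 ** (years_from_now / halving_period)

for years in (1.57, 5.0, 10.0):
    frac = energy_per_computation(years)
    print(f"{years:>5} years out: {frac:.1%} of today's energy per computation")
```

If the historical rate held, energy per computation would fall roughly ninefold over five years, before counting architectural shifts like edge inference. Whether the trend continues at that pace is, of course, the open question.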

The implications are clear: AI’s energy challenges are driving a wave of efficiency innovation, and the current phase likely represents peak energy intensity. As operations migrate to better chips and local devices, the sector is poised for progressively greater efficiency. The shift toward edge computing and smaller models marks the largest architectural change in AI deployment to date, promising to further reduce centralized energy demands.

In short, while the energy demands of AI have posed real challenges, the industry’s rapid response suggests that indefinite growth in consumption is unlikely. Instead, technology’s ability to solve its own problems may soon offer relief to both the sector and its critics.

HOW AMG CAN HELP

Not a client? To find out more about AMG's Personal Financial Management (PFM) or to book a free consultation, call 303-486-1475 or email us the best day and time to reach you.

This information is for general information use only. It is not tailored to any specific situation, is not intended to be investment, tax, financial, legal, or other advice and should not be relied on as such. AMG’s opinions are subject to change without notice, and this report may not be updated to reflect changes in opinion. Forecasts, estimates, and certain other information contained herein are based on proprietary research and should not be considered investment advice or a recommendation to buy, sell or hold any particular security, strategy, or investment product.
