Rising demand for cloud computing and artificial intelligence has pushed U.S. data centers to their limits.
Operators now face growing pressure to control energy costs without compromising performance.
A research team at Penn State believes artificial intelligence can help by rethinking how these facilities manage one of their biggest expenses: cooling.
The team developed software that uses a physics-based AI model to analyze real-time climate and economic data.
It then recommends how and when data centers should adjust cooling to improve efficiency and lower costs.
The system trains inside a virtual replica of a facility, allowing it to test scenarios before applying them in the real world.
Smarter cooling with AI
Cooling accounts for a major share of data center energy use, making it a key target for savings.
The researchers focused on replacing static cooling strategies with adaptive ones.
“Cooling currently accounts for about 40% of a data center’s total electricity use — it just goes to keeping the data center operational,” said Wangda Zuo, professor of architectural engineering at Penn State.
Operators also face fluctuating weather and volatile electricity prices.
These factors can quickly raise operating costs and reduce profits.
Traditional systems struggle to respond because they rely on fixed temperature targets.
“Traditionally, data centers are cooled to static thermal targets, which can lead to substantial financial losses when electricity prices are high,” Zuo said.
The new AI system addresses this limitation by dynamically adjusting cooling rates based on external conditions.
It can increase cooling when electricity is cheap and scale back when costs rise, while staying within safe operating limits.
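The published article does not include code, but the price-responsive behavior it describes can be sketched as a simple mapping from electricity price to a cooling setpoint that never leaves a safe band. All thresholds, names, and the safe range below are illustrative assumptions, not details of the Penn State system:

```python
# Hypothetical sketch: cool harder when electricity is cheap, coast
# near the upper limit when it is expensive, always staying inside
# a safe operating band. Numbers are invented for illustration.

SAFE_MIN_C = 18.0   # coldest allowed supply-air temperature (assumed)
SAFE_MAX_C = 27.0   # warmest allowed supply-air temperature (assumed)

def choose_setpoint(price_usd_per_kwh: float,
                    cheap: float = 0.05,
                    expensive: float = 0.15) -> float:
    """Map the current electricity price to a cooling setpoint in degC."""
    if price_usd_per_kwh <= cheap:
        target = SAFE_MIN_C          # power is cheap: pre-cool aggressively
    elif price_usd_per_kwh >= expensive:
        target = SAFE_MAX_C          # power is costly: run near the limit
    else:
        # Linearly interpolate between the two extremes.
        frac = (price_usd_per_kwh - cheap) / (expensive - cheap)
        target = SAFE_MIN_C + frac * (SAFE_MAX_C - SAFE_MIN_C)
    # Never leave the safe operating band.
    return max(SAFE_MIN_C, min(SAFE_MAX_C, target))
```

In practice the real controller learns this mapping rather than hard-coding it, but the core idea is the same: the setpoint moves with the price, and the safety clamp is non-negotiable.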
Digital twin training model
The research team trained its AI using a “digital twin,” a simulated version of a data center.
This virtual model mirrors real-world conditions, including temperature, humidity, and equipment constraints.
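A digital twin in its simplest form is just a physics model a controller can be exercised against before touching real hardware. The toy first-order energy balance below is purely illustrative and far simpler than the team's model; every constant in it is a made-up assumption:

```python
# Toy thermal "twin" of a server room (illustrative only): roll a
# first-order energy balance forward one hour at a time, so a cooling
# policy can be tested in simulation before deployment.

def simulate_room(outside_c: float, cooling_kw: float,
                  room_c: float = 25.0, it_load_kw: float = 100.0,
                  hours: int = 24) -> list:
    """Return the room temperature (degC) after each simulated hour."""
    THERMAL_MASS = 50.0     # kWh per degC of room air and equipment (assumed)
    LEAK = 2.0              # kW of heat gain per degC the outside is warmer (assumed)
    history = []
    for _ in range(hours):
        heat_in = it_load_kw + LEAK * (outside_c - room_c)
        net_kw = heat_in - cooling_kw        # positive means the room warms
        room_c += net_kw / THERMAL_MASS      # degC change over one hour
        history.append(round(room_c, 2))
    return history
```

With undersized cooling on a hot day the simulated room drifts upward; with ample cooling it settles downward. A reinforcement-learning agent can safely explore thousands of such trajectories in simulation, which is the point of training inside a twin.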
The system relies on a physics-informed reinforcement learning approach.
It combines engineering rules with machine learning, allowing the AI to make decisions that remain safe and practical.
The team tested the model using a simulated data center in Houston, Texas.
The city’s heat and humidity provided a challenging environment in which the AI learned to optimize cooling while maintaining reliability.
“Each hardware component used to cool a data center has its own operational ranges that cannot be violated, so we integrated them into our modeling,” said Viswanathan Ganesh, a doctoral candidate and first author on the study.
This approach allows operators to improve efficiency without risking hardware damage. It also reduces the need for massive training datasets, which many AI systems require.
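One common way to integrate hard operational ranges like those Ganesh describes is to project every proposed control action onto each component's allowed interval before it is applied. The component names and limits below are invented for the sketch and do not come from the study:

```python
# Illustrative only: clamp a proposed control vector so that no cooling
# component is driven outside its allowed operating range. The AI may
# propose anything; the projection guarantees the limits "cannot be
# violated". Names and ranges are hypothetical.

LIMITS = {
    "chiller_flow_frac": (0.2, 1.0),    # fraction of max chiller flow
    "fan_speed_frac":    (0.3, 1.0),    # fraction of max fan speed
    "supply_air_temp_c": (18.0, 27.0),  # supply-air temperature, degC
}

def enforce_limits(proposed: dict) -> dict:
    """Project each proposed setting onto its component's safe range."""
    safe = {}
    for name, value in proposed.items():
        lo, hi = LIMITS[name]
        safe[name] = max(lo, min(hi, value))
    return safe
```

Because the constraints are enforced structurally rather than learned from data, the agent never needs to see (or cause) a limit violation during training, which is one reason physics-informed approaches can get by with far less data.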
Crypto mining efficiency gains
The software could also improve cryptocurrency mining, which demands significant computing power.
Mining systems solve complex mathematical problems to validate blockchain transactions and earn rewards.
These operations often run continuously, making cooling costs a major expense.
By aligning cooling with favorable weather and lower electricity prices, the AI system could improve profitability.
The researchers say their approach offers a lower-cost alternative to hardware upgrades like liquid cooling.
Instead of replacing infrastructure, operators can use software to optimize existing systems.
The team will present the work at the IEEE ITherm Conference in May.
Originally written by: Aamir Khollam
Source: Interesting Engineering
Published on: 6 April 2026
Link to original article: New AI uses real-time weather insights to cut data center cooling costs by 25%