This post covers methods that engineers who design and operate hyperscale facilities can use to optimize data center cooling and reduce carbon emissions.
Beyond driving up electricity demand, the growth of data centers is likely to cause their carbon emissions to more than double by the end of the decade. This article examines how data centers can optimize energy efficiency to meet these new challenges.
New Cooling System Designs
According to Anay Arun, a mechanical engineer with over seven years of experience in product design, tribology, finite element analysis, thermal engineering, and subject matter expertise in data center engineering, using low-power, high-efficiency fan motors and chillers can significantly reduce data center power consumption.
“New cooling system designs are using the same approach people use in smart home energy systems,” says Arun. “Only use what you need, when you need it.”
For example, Arun’s design and testing in a data center environment show that variable-speed, VFD-driven fan motors reduce power usage by up to 20 percent. These fans use temperature readings to slow down when high-density CPUs and servers are idle and speed up as demand increases.
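To make the idea concrete, here is a minimal sketch of how a temperature-driven fan setpoint could work. This is not Arun’s actual controller: the speed limits and temperature thresholds are hypothetical, and the affinity-law estimate is included only to show why even modest speed reductions cut fan power noticeably.

```python
# Minimal sketch (hypothetical values): map measured inlet temperature to a
# VFD fan speed setpoint, then estimate relative power using the fan affinity
# law (fan power scales roughly with the cube of speed).

MIN_SPEED = 0.4                # hypothetical floor, fraction of full speed
MAX_SPEED = 1.0
T_LOW, T_HIGH = 22.0, 32.0     # hypothetical inlet temps (deg C) for min/max speed

def fan_speed_setpoint(inlet_temp_c: float) -> float:
    """Linearly ramp fan speed between MIN_SPEED and MAX_SPEED."""
    if inlet_temp_c <= T_LOW:
        return MIN_SPEED
    if inlet_temp_c >= T_HIGH:
        return MAX_SPEED
    frac = (inlet_temp_c - T_LOW) / (T_HIGH - T_LOW)
    return MIN_SPEED + frac * (MAX_SPEED - MIN_SPEED)

def relative_fan_power(speed_fraction: float) -> float:
    """Fan affinity law: power is roughly proportional to speed cubed."""
    return speed_fraction ** 3

if __name__ == "__main__":
    for temp in (20.0, 26.0, 34.0):
        speed = fan_speed_setpoint(temp)
        print(f"{temp:.0f} C -> {speed:.0%} speed, ~{relative_fan_power(speed):.0%} of full fan power")
```

Because fan power falls with the cube of speed, running at 64 percent speed draws only about a quarter of full fan power, which is why demand-based fan control pays off so quickly.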
Arun stresses that cooling methods are also essential to data center energy efficiency. “Just as a car’s radiator helps cool down an automobile engine, liquid cooling dissipates data center heat with the use of a liquid,” he explains. “Liquid cooling tends to be more effective than air-based cooling and typically is a quieter method.”
Steps for Optimizing Data Center Power Efficiency
In addition to updating cooling systems, Arun has designed and validated several key steps that data center engineers can take to operate more efficiently.
Conduct regular power assessments. “You can’t reduce power usage and prepare for an upsurge if you don’t understand the data center’s current performance,” Arun points out. He recommends routine power usage effectiveness (PUE) evaluations to identify and monitor energy consumption, cooling efficiency, and server utilization. This data helps engineers make informed decisions, optimize the system accordingly, and minimize embodied carbon.
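PUE is simply the ratio of total facility energy to the energy consumed by IT equipment. A minimal sketch of such a check, with hypothetical meter readings, might look like this:

```python
# Minimal sketch of a routine PUE check. The meter readings are hypothetical;
# in practice they would come from facility and IT power meters.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness = total facility energy / IT equipment energy."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

if __name__ == "__main__":
    readings = {"total_facility_kwh": 1_450.0, "it_equipment_kwh": 1_000.0}  # example values
    value = pue(**readings)
    print(f"PUE = {value:.2f}")  # 1.45: 0.45 kWh of overhead (cooling, UPS losses, lighting) per IT kWh
```

Tracking this ratio over time is what lets engineers see whether cooling and power-distribution changes are actually paying off.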
Schedule routine maintenance of critical infrastructure. Although problems can occur when we least expect them, regular maintenance helps prevent downtime and data loss. Based on his research, Arun outlines several strategies that can be incorporated into a design to run a data center at high efficiency and reduced power consumption:
- Advanced monitoring of cooling units, including air filter static pressure, water quality, and variable-speed fans.
- Use of outside air economization based on ambient conditions.
- Increase the cold aisle temperature after studying hardware limits and compatibility. This reduces the use of chillers, direct-expansion (DX) units, or evaporative coolers.
- Robust design of the containment system to prevent air leakage, with proper sealing of the ducting system.
- Continuous monitoring of airflow patterns in the data center to ensure discharge air does not interfere with incoming supply air.
- Use of high-efficiency (greater than 98 percent) uninterruptible power supply (UPS) units.
- Clean data center environment to protect and improve hardware stability.
- Increased system telemetry for more granular data, enabling real-time alerts on any issues (see the sketch after this list).
- Implement security measures, including data encryption, to secure communication channels.
- Perform system backups to prepare for disaster recovery scenarios.
- Ensure the center’s compliance with industry standards on air pollution.
- Use machine learning (ML) and data analytics to predict potential critical equipment malfunctions.
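As a rough illustration of the telemetry and predictive-monitoring items above, and not Arun’s actual method, the sketch below flags readings that drift well outside their recent rolling statistics; the sensor name, window size, and threshold are all hypothetical.

```python
# Minimal sketch of telemetry-based alerting: flag readings that deviate far
# from their recent history, a simple statistical stand-in for the ML-based
# prediction described above. Sensor names and thresholds are hypothetical.

from collections import deque
from statistics import mean, stdev

class TelemetryMonitor:
    def __init__(self, window: int = 60, z_threshold: float = 3.0):
        self.window = window
        self.z_threshold = z_threshold
        self.history: deque[float] = deque(maxlen=window)

    def check(self, reading: float) -> bool:
        """Return True if the reading looks anomalous relative to recent history."""
        anomalous = False
        if len(self.history) >= 10:  # need some history before judging
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) / sigma > self.z_threshold:
                anomalous = True
        self.history.append(reading)
        return anomalous

if __name__ == "__main__":
    crah_supply_temp = TelemetryMonitor(window=60)
    for t in [18.0] * 30 + [18.2, 18.1, 24.5]:  # sudden jump simulates a failing cooling unit
        if crah_supply_temp.check(t):
            print(f"ALERT: supply air temperature anomaly at {t} C")
```

A production system would feed alerts like this into the facility’s monitoring stack and use richer models, but the principle is the same: granular telemetry plus automated detection shortens the time between a developing fault and a response.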
What Is a Green Data Center?
With the expanding use of AI and its power needs, is the goal of maintaining a green data center realistic? Arun, who has over seven years of mechanical and data center engineering experience, says yes.
“A green data center is one that has location, infrastructure, and equipment designed for maximum energy efficiency and the lowest environmental impact,” he says, defining the concept.
Global data centers currently consume 1 to 2 percent of the world’s electricity, and experts predict that share will reach 3 to 4 percent by 2030. Data centers will need to rely on operational efficiency metrics to handle that increase while maintaining green energy goals.
Arun points to several advancements in green data center technology:
- Waste recycling. In addition to high energy usage, electronic waste (e-waste) is a significant problem for data centers. E-waste recycling breaks down electronic devices and equipment into raw materials, which are then used to manufacture new products, reducing overall environmental impact by conserving components and materials. Another emerging measure involves routing excess data center heat to nearby energy systems for reuse.
- Using AI as a forecast tool. Although AI is the primary reason energy use is increasing at data centers, it also can be a tool to help them become more energy-efficient. Arun explains that AI and ML can help predict energy use and thus enhance PUE metrics at data centers.
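As a rough illustration of that idea, and not the models Arun actually uses, the sketch below fits a simple linear trend to hypothetical hourly energy readings and projects the next few hours; this is the kind of short-term forecast that lets operators stage cooling capacity ahead of demand.

```python
# Minimal sketch of energy-use forecasting: fit a linear trend to recent hourly
# facility energy readings and project the next few hours. A lightweight
# stand-in for the ML models mentioned above; the readings are hypothetical.

def fit_linear_trend(values: list[float]) -> tuple[float, float]:
    """Ordinary least-squares fit of y = slope * t + intercept, with t = 0..n-1."""
    n = len(values)
    t_mean = (n - 1) / 2
    y_mean = sum(values) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(values))
    den = sum((t - t_mean) ** 2 for t in range(n))
    slope = num / den
    return slope, y_mean - slope * t_mean

if __name__ == "__main__":
    hourly_kwh = [950, 960, 975, 990, 1010, 1030, 1045, 1060]  # hypothetical load ramp
    slope, intercept = fit_linear_trend(hourly_kwh)
    forecasts = [slope * (len(hourly_kwh) + h) + intercept for h in range(3)]
    print("Forecast for the next 3 hours (kWh):", [round(f, 1) for f in forecasts])
    # Operators could pre-stage additional cooling capacity before the predicted peak.
```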
“Running a data center based on demand vs response is the principle for energy efficiency,” says Arun. “When data center administrators have the tools they need to optimize the cooling based on server demand, overall power use can be dramatically reduced.”
