
AIRFLOW FOR DUMMIES – BREAKING DOWN BEST PRACTICES

Good airflow management is essential in today’s data centers. Increasing rack heat loads are driving the need to deliver cooling to where it is really needed. Good airflow management not only allows for greater IT load capacity and increased cooling efficiency, it can also significantly reduce energy and operating costs. But where do you start?

IT ALL BEGINS WITH COOLING 

Did you know that cooling can account for 40% or more of the energy consumed in a data center? In most cases, a large portion of that expense is wasted due to ineffective airflow.

Delivering the correct quantity of airflow at the right temperature to the inlets of the IT equipment is the most crucial element of cooling. Proper quantity of airflow doesn’t mean as much airflow as possible, but rather the specific quantity needed to meet the cooling requirements of your equipment.
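
As a rough illustration of what “the specific quantity needed” means, the airflow required by a given IT load can be estimated from the sensible-heat equation. The short Python sketch below is illustrative only; it assumes standard air properties at sea level, and the 5 kW load and 12°C rise are example values, not recommendations.

    # Rough airflow estimate from the sensible-heat equation (illustrative only).
    # Assumes standard air at sea level: density ~1.2 kg/m^3, cp ~1.005 kJ/(kg*K).
    AIR_DENSITY = 1.2   # kg/m^3
    AIR_CP = 1.005      # kJ/(kg*K)

    def required_airflow_m3h(it_load_kw, delta_t_c):
        """Airflow (m^3/h) needed to remove it_load_kw with a delta_t_c temperature rise."""
        mass_flow_kg_s = it_load_kw / (AIR_CP * delta_t_c)
        return mass_flow_kg_s / AIR_DENSITY * 3600

    # Example: a 5 kW rack with a 12 C rise across the IT equipment (assumed values)
    print(round(required_airflow_m3h(5, 12)))   # ~1240 m^3/h (roughly 730 CFM)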

The most common method of cooling IT equipment is to provide air to its inlets at a volume and temperature adequate to cool the internal components. Guidelines for these metrics are specified in the ASHRAE thermal guidelines, which define the recommended air inlet temperature: the temperature of the air entering the inlet side of the IT equipment.

The current ASHRAE Data Center Thermal Guidelines recommend a temperature range of 18°C to 27°C (64°F to 81°F) at the IT equipment inlets. This significant broadening of the recommended inlet temperatures, compared with previous years, was developed in conjunction with major IT equipment providers to reduce cooling energy costs.

UNDERSTANDING SET POINTS  

Cooling system temperature set points are the key to an efficient airflow management strategy. Set points relate to the return air temperature and humidity arriving back at the cooling unit, not the supply air temperature leaving it. This is an important distinction that has created industry-wide confusion. Today, many operators believe they are within the ASHRAE recommendations, not realizing they are more than likely overcooling their facility unnecessarily. This simple mistake can cost a business thousands of dollars a year.

If the return air reaches the cooling unit below the set point, the unit may respond by going into a no-cooling state. When this happens, the supply air temperature rises, causing the return air temperature to rise, and the cooling unit then goes back into cooling mode. If too much cooling is being provided or this cycle occurs frequently, the cooling units can short cycle. In other words, an incorrect set point will cause cooling units to switch on and off often, risking failure of the cooling system.
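
To make that on/off behaviour concrete, here is a minimal, purely illustrative sketch of a unit that switches on its return-air temperature with a narrow deadband. Every number in it (set point, deadband, temperature rise, drift rates) is an assumed example value; this is not a real controls model.

    # Illustrative sketch of return-air-based control with a narrow deadband.
    SET_POINT = 22.0    # return-air set point, deg C (assumed)
    DEADBAND = 1.0      # hysteresis around the set point, deg C (assumed)
    IT_DELTA_T = 8.0    # temperature rise across the IT load, deg C (assumed)

    supply_temp = 16.0
    cooling_on = True
    for minute in range(30):
        return_temp = supply_temp + IT_DELTA_T      # air arriving back at the unit
        if cooling_on and return_temp < SET_POINT - DEADBAND:
            cooling_on = False                      # return air too cold: stop cooling
        elif not cooling_on and return_temp > SET_POINT + DEADBAND:
            cooling_on = True                       # return air warmed up: cool again
        supply_temp += -0.5 if cooling_on else 0.5  # crude supply-air drift
        print(f"min {minute:02d}  supply {supply_temp:4.1f} C  "
              f"return {return_temp:4.1f} C  cooling {'ON' if cooling_on else 'OFF'}")

Running the sketch shows the unit flipping between cooling and idle every few minutes, which is exactly the short-cycling pattern described above.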

Supply air temperature arriving at the IT equipment inlets is impacted by:

1. Air temperature exiting the cooling unit

2. The path the air takes to get to the air inlets

3. The level of air mixing within the room

These factors impact how much the temperature rises between the air exiting the cooling unit and the air entering the IT equipment inlets. 

ISN’T COLDER BETTER?  

In facilities lacking efficient airflow management, the temperature rise along the path to the IT equipment can be 10-15°C. In a well-designed data center with good airflow, the temperature rise may only be 1-2°C. This is crucial because the temperature of the air delivered to the IT equipment inlets is what matters most; if it is too high, the IT equipment will not be properly cooled, which can lead to equipment failure. At the same time, delivering air that is too cold achieves no benefit and forces the cooling systems to consume far more energy than required.

The quick fix, one would think, would be to increase the cooling unit set points to balance out inefficiencies in airflow. However, this has to be done very carefully as an excessive increase in set points could very well push the IT equipment inlet air temperatures beyond the 27°C recommended level. 
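
A simple worked example, using the temperature-rise figures above and an assumed 16°C supply temperature, shows why the airflow path matters more than the set point itself:

    # Inlet temperature = supply temperature + rise along the path (worked example).
    # The 27 C limit is the ASHRAE recommended maximum; the other figures are assumed.
    ASHRAE_MAX_INLET = 27.0   # deg C

    def inlet_temp(supply_c, path_rise_c):
        return supply_c + path_rise_c

    poor = inlet_temp(16.0, 12.0)   # poor airflow: 12 C rise (within the 10-15 C range above)
    good = inlet_temp(16.0, 1.5)    # good airflow: 1.5 C rise
    print(poor, poor <= ASHRAE_MAX_INLET)   # 28.0 False -> already over the recommendation
    print(good, good <= ASHRAE_MAX_INLET)   # 17.5 True  -> headroom to raise set points

In the first case, raising the set point at all pushes the inlets even further past 27°C; in the second, the supply temperature could be raised by several degrees and still stay comfortably within the recommended range.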

Instead, the issues at the root of the problem should be addressed prior to making any changes in the operation of the cooling system or adding more cooling capacity. 

WHERE DO AIRFLOW ISSUES ARISE? 

There are many factors that can cause inefficiencies in your critical facility cooling. These are the most common in today’s data centers: 

  • Poor Separation of Supply and Return Airflow  
  • Lack of Blanking Panels  
  • Openings Between Racks 
  • Air Bypass 
  • Air Recirculation  
  • Airflow Leakage  
  • Incorrect Use of Perforated Tiles  
  • Excess Air Supply 
  • ITE Placement 
  • Reverse or Side Flow Equipment 

Assessing your facility, starting with these common issues, is the best way to formulate your airflow management strategy. It’s important to establish a baseline measurement before making any changes; this will allow you to truly understand the evolution of your facility as you push towards optimization.

Airflow issues are multi-faceted and require a holistic approach to achieve peak efficiency and maximize cooling capacity. With over 15 years’ experience in cooling optimization, SCTi understands the intricacies of airflow management and, through its solution-based approach, has delivered measurable energy and operational savings to maximize data center efficiency.

If you’d like to learn more about optimizing data center cooling, read our blog post, “Debunking Data Center Cooling Myths – You Can Cool Better with Less Equipment.” 

You can also download our white paper: Airflow for Dummies
