Data Center Containment Best Practices That Won't Bust Your Budget

Understanding Airflow Management
Provided by Mission Critical

Learning Objectives:

  1. Discuss the benefits of an airflow containment strategy.
  2. Identify the four R's of containment best practices.
  3. Review ASHRAE recommended operating temperatures.
  4. Describe the importance of matching cooling capacity with IT load.
  5. Review the typical steps that need to be taken to properly match the cooling capacity of the data center with the IT load.

Credits:

  • AIA: 1 AIA LU/Elective
  • IACET: 0.1 IACET CEU*
  • PDH: 1 PDH*
As an IACET Accredited Provider, BNP Media offers IACET CEUs for its learning events that comply with the ANSI/IACET Continuing Education and Training Standard.

Rack Densities On the Rise: Now Is the Prime Time for Airflow Containment

Data center containment best practices that won’t bust your budget — Part 1

By Tom Cabral and Rob Huttemann

When prepping for, or implementing containment measures, keep the four R’s in mind: rack, row, room, and raised floor.

In a reality where cabinet densities are rising, with recent research putting the average at upwards of 10 kW per rack, increasing energy efficiency while keeping operating costs down in data centers has never been more relevant or more challenging.

Ensuring data centers remain operable while minimizing cooling expenses is no small task. After all, cooling is frequently one of the largest energy consumers in data centers, usually because many facilities still oversupply cool air to keep server inlet temperatures below 80°F.

The good news is that raising operating temperatures in data centers has proven energy-saving benefits.
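
To make that claim concrete, here is a minimal back-of-the-envelope sketch in Python. The 3% savings per degree Fahrenheit used here is an assumed, illustrative rule-of-thumb coefficient, not a figure from this article; substitute your own chiller plant's measured performance data before relying on the estimate.

def estimated_cooling_savings(current_setpoint_f: float,
                              proposed_setpoint_f: float,
                              annual_cooling_kwh: float,
                              savings_per_deg_f: float = 0.03) -> float:
    """Rough estimate of annual kWh saved by raising the supply-air set point.

    savings_per_deg_f is an assumed placeholder coefficient, not measured data.
    """
    degrees_raised = max(0.0, proposed_setpoint_f - current_setpoint_f)
    fraction_saved = min(1.0, degrees_raised * savings_per_deg_f)
    return annual_cooling_kwh * fraction_saved

# Example: raising supply air from 68 F to 75 F on a 1,000,000 kWh/yr cooling load
print(estimated_cooling_savings(68, 75, 1_000_000))  # 210000.0 kWh under these assumptions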

Airflow containment (the ability to isolate, redirect, and recycle hot exhaust air) has withstood many market trends over the years and continues to stand as the core answer to this balancing act between uptime and cooling costs. In fact, the energy savings are so convincing that federal and state governments in the U.S. now require airflow containment in new and retrofit data center designs.

Optimization

In a completely contained system, it is possible to circulate only the necessary amount of cooled air through data centers while isolating the heat created by IT equipment. This eliminates the need to oversupply equipment with cool air, resulting in tremendous energy savings.
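
To see what "only the necessary amount of cooled air" looks like in numbers, the short Python sketch below applies the standard sensible-heat relationship (BTU/hr = 1.08 x CFM x delta-T in °F); the function name and example values are illustrative, not taken from this article.

def required_cfm(it_load_kw: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to carry away an IT heat load at a given server delta-T."""
    btu_per_hr = it_load_kw * 1000 * 3.412      # convert IT watts to BTU/hr
    return btu_per_hr / (1.08 * delta_t_f)      # sensible heat: BTU/hr = 1.08 x CFM x dT

# A fully contained 10 kW rack with a 20°F server delta-T needs roughly 1,580 CFM;
# an uncontained room often supplies far more than that to compensate for air mixing.
print(round(required_cfm(10, 20)))  # 1580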

Furthermore, by developing and implementing a good airflow containment strategy, it’s also possible to eliminate hot spots and achieve a lower power usage effectiveness (PUE), long considered a leading metric of energy efficiency in data centers (a brief PUE calculation follows the list). For a variety of reasons, listed below, airflow containment is still the practice of choice for air-cooled facilities with power densities above 1 kW:

  1. Eliminates hot spots within the rack
  2. Supports four times higher heat and power densities (6 kW to 30-plus kW)
  3. Enacts “passive cooling,” which uses 100% of supplied air and reduces chilled air waste
  4. Transfers server equipment delta T directly back to the HVAC unit
  5. Promotes cost savings through higher set points
  6. Enables supply air delivery from the ceiling, wall, or floor
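
Since PUE is cited above as a leading efficiency metric, the following minimal sketch shows how it is calculated (total facility energy divided by IT equipment energy) and how trimming cooling energy through containment and higher set points lowers it. All energy figures are illustrative assumptions, not measurements from this article.

def pue(it_kwh: float, cooling_kwh: float, other_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT energy."""
    return (it_kwh + cooling_kwh + other_kwh) / it_kwh

# Illustrative annual energy figures before and after containment improvements:
print(pue(1_000_000, 700_000, 100_000))  # 1.80 (oversupplied air, low set points)
print(pue(1_000_000, 450_000, 100_000))  # 1.55 (contained, higher set points)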

In simple terms, airflow containment best practices require blocking all openings between the supply air and return air areas so that room temperatures can be raised to ASHRAE's recommended range of 64°F to 81°F. These openings include cable access ports in floor tiles, unused or open rack-mount units, and any open space between hot and cold aisles.
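
As a quick way to check whether a room is operating inside the range cited above, the hypothetical Python sketch below flags rack inlet readings that fall outside 64°F to 81°F. The rack names and temperature readings are made up for illustration.

# ASHRAE recommended inlet range cited in the article (degrees Fahrenheit).
RECOMMENDED_RANGE_F = (64.0, 81.0)

# Hypothetical rack inlet temperature readings, e.g., from cabinet-mounted sensors.
inlet_readings_f = {"rack-A01": 71.5, "rack-A02": 83.2, "rack-B07": 62.9}

low, high = RECOMMENDED_RANGE_F
for rack, temp in inlet_readings_f.items():
    if not low <= temp <= high:
        print(f"{rack}: {temp}°F is outside the recommended {low}-{high}°F range")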

So, how do you get started with containment improvements? Just as there are industry-recognized best practices for other aspects within data centers, there are also best practices to follow for successful containment projects. When prepping for, or implementing, containment measures, keep the four R’s in mind: rack, row, room, and raised floor. Each of these areas must be addressed in order to reap all the potential efficiency benefits of containment solutions.

When we meet with data center operators, cost is their No. 1 reason for not following containment best practices. They assume that to realize efficiencies, they’ve got to “do it all.” While it’s true that all four R’s need to be addressed to achieve improvements, there are actions you can take in each area that won’t break the bank.

Containment Best Practices for Racks and Cabinets

Unsealed cabinet openings are typically the most overlooked step in the containment process, yet they are among the most economical to address. These openings allow the equipment’s hot exhaust air to recirculate back to the intake ports, which contributes to equipment failure. They should be sealed with air dams, filler panels, bottom panels, brush grommets, and other sealing accessories for the best possible isolation.

Cabinet-level containment can be further maximized by deploying a vertical exhaust duct, or “chimney,” at the top of the cabinet to guide hot air into the ceiling plenum, where it can be returned to the cooling system. This is a smart approach because it both contains hot exhaust air within the cabinet and expels it to the plenum, leaving the room cool.

Additionally, deployment is fairly simple because the strategy is centered on the cabinet. Life cycle costs are minimal since there is no need for raised floors or major room remodels. This is a perfect strategy for new or retrofit installations with mid- to high-density heat loads.

Cabinet Design Basics

As the main structure supporting equipment, data center cabinets must be sturdy and specifically designed to facilitate airflow management. Consider the following features when looking for optimal data center cabinets.

  • Variety of heights and depths — 48U cabinet options and four depths support larger and wider equipment.
  • Door perforation — Highly perforated front and rear doors provide front-to-back airflow.
  • Equipment support options — Adjustable mounting rails and sturdy shelves support a wide variety of equipment.
  • Split side panels — Two-piece side panels are easy to remove for quick access to equipment and cables.
  • Provisions for cabling and power distribution units (PDUs) — Brush-covered cable openings on the top provide quick and easy access to cabling. Interior space allows for stable PDU mounting.
  • Color options — Bright white color options reduce energy costs because they require less lighting.

Future-Ready Designs

There’s no denying that the rise of the IoT, edge computing, smart manufacturing, and more are further pushing the thermal envelope for data centers.

But efficiency does not have to be compromised to ensure uptime. The simplest and quickest return on investment comes from a reliable airflow containment strategy. The type of containment strategy should be based on each organization’s business requirements and architectural limitations. In addition, fear of expense shouldn’t keep you from implementing containment measures: as long as hot and cold air are isolated from each other, any method utilized is valid. Finally, a reputable manufacturer that provides design tools and consultancy services to help customers develop a plan tailored to their unique needs and budget is an added bonus to any successful data center design that will be primed for years of growth.

Tom Cabral has worked in the telecom industry for more than 20 years and has been with Chatsworth Products (CPI) for nearly 15 years, serving as a regional sales manager, a field applications engineer, and, currently, a product application specialist. Cabral provides technical advice and design specifications on complex product applications and acts as a technical liaison with a high level of knowledge of product operation and performance. He graduated summa cum laude with a Bachelor of Science degree in marketing and business communications from the University of Maryland and is RCDD-certified.

Rob Huttemann is the senior vice president of operations for Critical Environments Group (CEG). He has more than 30 years of industry experience and familiarity with data center and supporting infrastructure management, with a specific focus on power, space and storage, cooling, and overall best practices.

 

Originally published in Mission Critical, October 2020
