Data centre cooling the energy-efficient way
11 October 2012
Data centres can account for half the power consumption of a company and, on average, 37 percent of this is used for cooling; implementing energy reduction measures has therefore become a priority for data centre managers. Les Hunt reviews some recent offerings from three leading industry suppliers.
Data Centre World, which took place in London back in March, was a good place to discover the latest innovations in data centre cooling. Stulz’s ‘CyberRow’ air conditioning unit was one product that caught my eye.
This is installed directly in the server room and between the racks, where its horizontal air flow provides targeted, energy-efficient cooling. A useful benefit of CyberRow is that it can also be retrofitted, regardless of the make of rack installed, offering configuration flexibility, particularly in smaller rooms.
Rather than delivering a single cool airflow through a raised floor, CyberRow streams two airflows horizontally, to the left and right, directly in front of the server racks. This ensures that cooling air is distributed evenly and in a targeted manner around the entire data centre.
Unlike traditional air conditioning systems, CyberRow is not reliant on building structures such as raised floors to distribute the air. The system is easily installed in any server room, because it is compatible with every make of rack. Moreover, when that extra bit of cooling is required, it is easily installed, whereas conventional systems might run up against problems due to the building design.
And because CyberRow cools right where heat is generated, air is transported over shorter distances. Cooling can be targeted precisely where it is needed, so over-dimensioning is avoided and energy is saved.
CyberRow is available in four cooling variants including the standard chilled water configuration. It also offers two models with compressor cooling and ambient air or water-cooled heat exchange. In addition, a hybrid ‘GE’ (direct expansion with indirect free cooling) system is available whereby a water-cooled coil is added to the CyberRow DX model.
This secondary free-cooling coil is connected to an external heat exchanger (‘drycooler’) outside of the building. As soon as the ambient temperature permits, the unit switches from cost-intensive compressor mode to the more resource-conserving ‘indirect free cooling’ mode via the drycooler.
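The changeover described above can be sketched as a simple threshold decision. This is an illustrative sketch only, not Stulz's actual control logic: the function name, the fixed temperature approach of the drycooler and the set-point values are all assumptions.

```python
# Illustrative sketch (not Stulz's actual controller) of a free-cooling
# changeover decision: switch to indirect free cooling whenever the
# ambient temperature lets the drycooler meet the chilled-water set
# point, otherwise fall back to compressor (DX) mode.

def select_cooling_mode(ambient_c, water_setpoint_c, approach_c=4.0):
    """Return 'free_cooling' when the drycooler can reach the set point.

    approach_c is the assumed temperature approach of the drycooler,
    i.e. how close the cooled water can get to ambient temperature.
    """
    if ambient_c + approach_c <= water_setpoint_c:
        return "free_cooling"  # resource-conserving mode via drycooler
    return "compressor"        # cost-intensive DX mode

print(select_cooling_mode(ambient_c=6.0, water_setpoint_c=12.0))   # free_cooling
print(select_cooling_mode(ambient_c=15.0, water_setpoint_c=12.0))  # compressor
```

In practice such controllers also apply hysteresis around the threshold so the unit does not oscillate between modes; that detail is omitted here for brevity.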
Meanwhile, the data centre serving Cambridge University’s Department of Engineering is achieving recurrent energy savings and a power usage effectiveness (PUE) of less than 1.06, thanks to the use of computer room evaporative coolers from another Data Centre World exhibitor, EcoCooling, which reported the completion of a second project at this site.
Four EcoCooling CRECs (computer room evaporative coolers) provide 90kW of cooling with N+1 redundancy. Air is fed directly into the cold aisle of this first floor data centre which is populated with high density racks.
Supply and extraction are provided by electronically commutated (EC) axial fans which, at full design load, consume less than 5kW, adding a maximum of 0.05 to the PUE.
And since the data centre is currently only partly populated, the variable-speed EC fans run slower still, improving energy use further; current cooling energy use is less than 2% of the IT load.
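The arithmetic behind these figures can be laid out in a few lines. Note the IT load used below is an assumption, inferred from the quoted numbers (a 5kW fan draw adding at most 0.05 to the PUE implies an IT load of roughly 100kW); the article does not state it directly.

```python
# Rough PUE arithmetic for the figures quoted above.
# PUE = total facility power / IT equipment power, so a cooling
# overhead of 5 kW on an assumed ~100 kW IT load adds 0.05 to PUE.

def pue(it_kw, overhead_kw):
    """Power usage effectiveness for a given IT load and overhead."""
    return (it_kw + overhead_kw) / it_kw

it_load = 100.0       # assumed IT load (kW) -- not stated in the article
fan_full_load = 5.0   # EC fan draw at full design load (kW)

contribution = pue(it_load, fan_full_load) - 1.0
print(round(contribution, 3))  # 0.05, matching the quoted maximum
```

At partial fan speed the overhead falls well below 5kW, which is how the site sustains an overall PUE under 1.06.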
Air is supplied to a ceiling void and then filtered through G4-grade cartridges, allowing a large number of filter elements to be installed with minimal cost and space requirements. This is believed to be the first installation of its kind to use a fresh air system to provide a dust-free environment at a fraction of the cost of conventional filters.
The control system for this cooling arrangement was also designed by EcoCooling and uses a PLC to control airflow, humidity and temperature within the data centre, as well as linking to the fire alarm system. These controls interface directly with a Honeywell Trend BMS for data and fault reporting.
The data centre is run at the lower end of the ASHRAE temperature standards, at 18.5°C. This costs no more to achieve, but gives a more comfortable environment for operators and also reduces temperature-related failures.
Temperatures within the data centre may rise to 24°C on the very hottest days, but this remains within ASHRAE guidelines. Colder temperatures, which can result in server shutdowns, are prevented by a patented attempering system that mixes hot air from the servers back into the supply to the server room, maintaining the room at the set temperature. The control system automatically and dynamically adjusts the air supply set point to limit the maximum relative humidity, so that ASHRAE-compliant conditions (20-80% RH) are maintained at all times.
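The two behaviours described, recirculating warm exhaust to hold the supply temperature on cold days and nudging the set point to cap relative humidity, can be sketched as below. This is a generic illustration assuming simple mixing of two air streams, not EcoCooling's patented controller; all names and the 1°C adjustment step are hypothetical.

```python
# Illustrative sketch (not EcoCooling's patented attempering system) of
# the control behaviours described: blend hot server exhaust with fresh
# air to hit the supply set point, and raise the set point when relative
# humidity nears the ASHRAE upper limit (warmer air holds RH down).

ROOM_SETPOINT_C = 18.5
RH_UPPER_LIMIT = 80.0  # ASHRAE allowable range: 20-80% RH

def mix_fraction(outside_c, exhaust_c, target_c=ROOM_SETPOINT_C):
    """Fraction of hot exhaust air to blend with fresh air so the
    mixed supply reaches target_c (clamped to the range 0..1)."""
    if exhaust_c <= outside_c:
        return 0.0
    f = (target_c - outside_c) / (exhaust_c - outside_c)
    return min(max(f, 0.0), 1.0)

def adjusted_setpoint(rh_percent, base_c=ROOM_SETPOINT_C):
    """Raise the supply set point when RH exceeds the upper limit;
    the 1 degree step is purely illustrative."""
    if rh_percent > RH_UPPER_LIMIT:
        return base_c + 1.0
    return base_c

# A 5 degree C day with 30 degree C server exhaust needs ~54% recirculation:
print(round(mix_fraction(outside_c=5.0, exhaust_c=30.0), 2))  # 0.54
```

A real controller would combine both loops with the fire alarm interlocks and BMS reporting mentioned above; this sketch shows only the temperature and humidity logic.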
Another exhibitor, Rittal, reported a fascinating joint project with IBM, which is claimed to reduce the energy consumption of a data centre by as much as 10 percent. By integrating Rittal's RiZone IT infrastructure monitoring and management software with IBM's MMT (Management and Measurement Tool), comprehensive energy management can be implemented for all parts of a data centre, providing monitoring and active regulation of cooling power to achieve this level of energy usage reduction.
IBM tested the behaviour of the combined systems in part of its own data centre and, using optimisation algorithms, found it was possible to raise the cooling system's ideal ambient temperature set point from 20°C to 24.63°C. Higher flow temperatures offer greater potential for climate control savings.
Climate control is steered actively and automatically on the basis of comprehensive rules and control loops. The dynamic IT infrastructure is arranged for energy efficiency while the system runs a systematic diagnosis and analysis of all infrastructure parameters, such as temperature, humidity and CPU usage.
Rittal's RiZone recognises IBM MMT as an SNMP-enabled terminal device. While RiZone monitors data from the data centre, MMT handles the optimisation. The interaction of the two applications provides a complete view of the entire infrastructure and enables MMT to regulate and control active processes such as server standby or the cooling units.
Apart from climate control, RiZone also monitors, regulates, and manages access, power supply and security of data centres in a modular and scalable way. It is suitable for applications ranging from an individual rack in a data centre to a company's complete IT infrastructure. And following this joint project, it is now possible to combine it with a management system (like IBM MMT), which is based on real-time sensor networks.
Intel is now embedding thermal sensors in its servers, eliminating the need for additional external sensors such as those mounted on the front of the server cabinet. These embedded sensors provide real-time power, airflow and CPU-utilisation data.
Now, Intel and data centre design software specialist Future Facilities have announced a project to integrate the latter's Virtual Facility 3D data centre CFD model with this sensor data, enabling effective CFD analysis along with IT load capacity planning and optimisation. The combination of sensors with the CFD model also facilitates the design and operation of automated cooling control systems.
Historically, one of the greatest data centre cooling challenges has been accommodating new IT power densities and airflow requirements that were outside the specification of the original cooling design. By combining Virtual Facility's CFD modelling capabilities with data gathered from the embedded Intel sensors, data centre managers will be able to predict and mitigate cooling problems before they commit to any IT upgrade.