
Reducing the cost of enclosure cooling

13 November 2015

Facility teams and data centre managers know that to survive in a world where low-cost cloud infrastructure is dominant, they need to cut costs to the bone. The hardest thing to cut has always been the cost of cooling. Mark Hirst reports.

Aisle containment is an air management system that can be retrofitted to data centres. It can extend the life of older data centres and enable higher densities without the need for expensive refits of existing cooling systems.

Higher input temperatures are supported by the industry body ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers), and over the last decade input air temperatures have risen from around 18°C to more than 26°C, a change made possible by new generations of silicon electronics and motherboard design.

Free air cooling uses ambient air to remove heat from the data centre, and the stated goal of most of these systems is to have no mechanical cooling at all. However, there are few places on earth that are suitable locations for data centres and where the outside air temperature is low enough to cool them all year round. Nor is ambient temperature the only difficulty: the various technologies grouped under the term free air cooling bring a number of additional challenges, from data centre design to airborne particulate matter.

The challenges of ambient air
It is not possible to retrofit pure ambient air systems into existing facilities. Take the Hewlett Packard data centre in Wynyard, UK, as an example. It requires a 5m space beneath the data hall to create a volume of air large enough to produce sufficient pressure to push air through the data hall above. A chimney vents the hot air from the data hall, reducing the air pressure enough to draw ambient air in. This air must be filtered to remove any particulates that might otherwise impair the performance of the equipment in the data hall.

City air tends to carry high levels of lead and other particulates – especially those emitted by diesel vehicles – as well as general dust. It is also warmer than air in the countryside, potentially limiting the number of days when ambient air can be used without resorting to secondary cooling.

Surprisingly, air in the countryside is even ‘dirtier’ in terms of its effects on data centre hardware. Pollen, dust and insects, even swarms of bees and wasps, have been captured by the filters that protect large air halls sited in rural environments. And although the ambient temperature is lower than in cities, high winds are a problem, as they can force small particles of dust through filter screens.

Humidity levels can pose a serious risk to the data centre. Too low, and the risk of static electricity discharge increases, with its inevitable effects on electronic equipment; too high, and condensation becomes a problem, leading to short-circuits and corrosion, particularly in power supply systems.

Taking advantage of the natural temperature differences between land and sea, data centres are increasingly being located in coastal regions. The downside is salt-laden air, which is highly corrosive; any implementation of free air cooling in this type of environment requires significant investment in technologies to remove salt from the air before it can be used as a cooling medium.

To prevent dew point problems, air from the data centre is mixed with the air being drawn in from outside. Some heating may be necessary to reduce the risk of condensation and its detrimental effects on data centre hardware systems. Air that is particularly humid will require dehumidifiers to be brought online, increasing overall energy costs.
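A minimal sketch of that mixing step, assuming simple linear (sensible-heat) blending of the two air streams and purely illustrative temperatures, none of which come from the article itself:

```python
# Illustrative sketch: how much warm return air to blend with cold outside air
# so the mixed supply stays above a minimum (condensation-safe) temperature,
# assuming simple linear mixing of the two streams.

def return_air_fraction(t_outside_c: float, t_return_c: float, t_supply_min_c: float) -> float:
    """Fraction of return air needed so the mixed stream reaches t_supply_min_c.

    Mixed temperature is modelled as:
        t_mix = f * t_return + (1 - f) * t_outside
    Solving for f gives the minimum recirculation fraction.
    """
    if t_outside_c >= t_supply_min_c:
        return 0.0  # outside air is already warm enough; no recirculation needed
    if t_return_c <= t_supply_min_c:
        return 1.0  # return air alone cannot reach the target; full recirculation
    return (t_supply_min_c - t_outside_c) / (t_return_c - t_outside_c)

# Example with illustrative figures: 2°C outside air, 30°C return air,
# and a 16°C minimum supply temperature to stay clear of dew point problems.
print(f"Return-air fraction: {return_air_fraction(2.0, 30.0, 16.0):.0%}")  # -> 50%
```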

Airside economisers
Airside economisers tackle both particulate and dew point issues and can be retrofitted to an existing facility. After filtration of airborne particulates, the air either passes through an air-to-air heat exchanger (indirect economisation) or is supplied straight to the data hall, backed up by a chilled water or DX coil for periods when outside conditions are unsuitable (direct economisation).

The Green Grid, a collaborative organisation committed to improving the resource efficiency of data centres and business computing ecosystems, estimates that free air cooling may be possible in a variety of climates. For example, in hot climates such as those of Florida, Mexico, Texas, Portugal, southern Spain and the Middle East, some 2,500-4,000 hours per year of free air cooling are possible at night and during the winter months.

In temperate climates such as those of the UK, northern France, the Netherlands, New York and parts of California, this can rise to 6,500 hours. Further north, data centre owners can expect up to 8,000 hours, although there will be additional costs associated with removing excess humidity and heating the incoming air.

One of the most common failure points of airside economisers is poor pressure management. Insufficient pressure creates pockets of stagnant air, which simply rise in temperature because they are poorly circulated.

It is important to locate temperature sensors at strategic points around the data centre and to integrate this sensor network with the data centre infrastructure management (DCIM) systems, allowing operators to identify temperature hotspots. Two issues arise with poorly located sensors: firstly, too much return air can be added to the airflow, causing input temperatures to rise unexpectedly; secondly, air may be over-cooled, creating a large difference between hot and cold air and heightening the risk of condensation.
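A minimal sketch of how such a sensor network might flag hotspots; the sensor names, readings and the 27°C limit (the upper end of the widely cited ASHRAE recommended inlet range) are assumptions for illustration, not from the article:

```python
# Illustrative sketch: flag rack-inlet sensors whose temperature exceeds a limit,
# and report the average and spread so operators can spot over-cooling as well.

from statistics import mean

RECOMMENDED_MAX_INLET_C = 27.0  # assumed limit for this example

def find_hotspots(readings: dict[str, float], limit_c: float = RECOMMENDED_MAX_INLET_C):
    """Return sensors above the limit, plus the average inlet temperature and spread."""
    avg = mean(readings.values())
    hotspots = {name: t for name, t in readings.items() if t > limit_c}
    spread = max(readings.values()) - min(readings.values())
    return hotspots, avg, spread

# Hypothetical readings from inlet sensors placed around the cold aisle.
readings = {"rack-A1": 24.5, "rack-A2": 25.1, "rack-B1": 28.3, "rack-B2": 23.9}
hotspots, avg, spread = find_hotspots(readings)
print(f"Average inlet: {avg:.1f}°C, spread: {spread:.1f}°C, hotspots: {hotspots}")
```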

Money is there to be saved
An Intel survey looked at the use of airside economisers operating at external ambient air temperatures of 32°C. One of the conclusions was that a 10MW facility could save almost $3 million per year, while the risk of increased equipment failure was so low as to be insignificant.

In the temperate climates of the UK and mid-US, free air cooling is able to deliver a Power Usage Effectiveness (PUE) as low as 1.05 against an industry average of 2.0. This means that energy consumed by non-IT equipment is just 5 percent of the IT load, or a little under 5 percent of the total energy bill.
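A quick arithmetic check of those PUE figures, using the standard definition PUE = total facility energy ÷ IT equipment energy; the 10MW IT load is borrowed from the Intel example above purely for illustration:

```python
# Compare non-IT overhead at the industry-average PUE of 2.0 and at a
# free-air-cooled PUE of 1.05, for an assumed 10 MW IT load.

it_load_mw = 10.0

for pue in (2.0, 1.05):
    total_mw = it_load_mw * pue          # total facility power
    overhead_mw = total_mw - it_load_mw  # power going to cooling, losses, etc.
    share_of_total = overhead_mw / total_mw
    print(f"PUE {pue}: overhead {overhead_mw:.2f} MW, "
          f"{share_of_total:.1%} of total facility power")
# PUE 2.0  -> 10.00 MW overhead, 50.0% of total
# PUE 1.05 ->  0.50 MW overhead,  4.8% of total
```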

Mark Hirst is head of T4 Data Centre Solutions with Cannon Technologies

