Record and track power use regularly. Benchmark your existing energy usage against a comparable timeframe, and continue to measure consistently and regularly. Regular meter readings will help you understand trends and identify seasonal spikes or anomalies. At the end of a 12-month period you should have enough data to benchmark against and a much clearer picture of your overall efficiency.
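The benchmarking step above can be sketched as a simple calculation: average a year of meter readings into a baseline, then flag any month that deviates from it by more than a chosen tolerance. The readings and the 10% threshold below are hypothetical examples, not figures from this guide.

```python
# Hypothetical monthly meter readings (kWh) for a 12-month baseline period.
baseline_kwh = [41200, 39800, 40100, 38500, 37900, 39200,
                42800, 43100, 40600, 39400, 40800, 41900]

baseline_avg = sum(baseline_kwh) / len(baseline_kwh)

def compare_to_baseline(month_kwh, avg=baseline_avg, tolerance=0.10):
    """Return (deviation, flag): how far a month's usage sits from the
    baseline average, and whether it exceeds the tolerance and should
    be investigated."""
    deviation = (month_kwh - avg) / avg
    return deviation, abs(deviation) > tolerance

dev, flagged = compare_to_baseline(46000)
print(f"Deviation: {dev:+.1%}, investigate: {flagged}")
```

A spreadsheet does the same job; the point is simply to compare each new reading against a like-for-like baseline rather than the previous month alone.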
Take a look at our Telstra Case Study for more information on quantifiable energy efficiency monitoring.
Have a monitoring system in place. Unexpected incidents of downtime in server rooms and remote wiring closets lead to sleepless nights for many IT managers. Most have horror stories about how bad luck, human error, or simple incompetence brought their server rooms down.
Read more in Schneider Electric White Paper No 103 'How Monitoring Systems Reduce Human Error in Distributed Server Rooms and Remote Wiring Closets'. This paper analyses several of these incidents and makes recommendations for how a basic monitoring system can help reduce the occurrence of these unanticipated events.
3. REGULATE AIR FLOW
A data centre is essentially a structure that draws cold air in and extracts hot air, keeping the two streams apart. If airflow is not managed correctly, hot and cold air will mix. Controlling airflow and limiting mixing must always be a priority.
Also look to reduce bypass airflow to eliminate hotspots. Uptime Institute defines bypass airflow as conditioned air that does not reach computer equipment. The air is escaping through cable cut-outs, holes under cabinets, misplaced perforated tiles, or even through holes in the computer room perimeter walls underneath the raised floor.
Perforated tiles on a raised floor often deliver substantially more or less airflow than expected, resulting in inefficiencies and even equipment failure due to inadequate cooling. Read more in Schneider Electric White Paper No 121 "Airflow Uniformity through Perforated Tiles in a Raised-Floor Data Centre". In this paper, the impact of data centre design parameters on perforated tile airflow is quantified and methods of improving airflow uniformity are discussed.
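Bypass airflow, as defined above, can be expressed as the fraction of conditioned air that never reaches equipment intakes. The CFM figures below are hypothetical illustrations of how the calculation works, not measurements from the white paper.

```python
def bypass_fraction(supplied_cfm, delivered_cfm):
    """Fraction of conditioned air that bypasses the IT equipment:
    (supplied - delivered) / supplied."""
    return (supplied_cfm - delivered_cfm) / supplied_cfm

# Hypothetical figures: cooling units supply 20,000 CFM, but tile and
# rack-intake measurements show only 13,000 CFM reaching the equipment.
frac = bypass_fraction(20000, 13000)
print(f"Bypass airflow: {frac:.0%}")
```

A high fraction points to the leakage paths listed above: cable cut-outs, holes under cabinets, and misplaced perforated tiles.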
4. REDUCE AIRFLOW LEAKS
Install blanking plates in all unused rack U-space to stop hot air recirculating from the server exhaust to the server inlet. Unused vertical space in open frame racks and rack enclosures allows unrestricted recirculation of hot air, causing equipment to heat up unnecessarily. Airflow management blanking panels reduce this problem and also stop hot and cold air mixing, which would otherwise reduce air conditioning efficiency.
Read more in Schneider Electric White Paper No 44 'Improving Rack Cooling Performance Using Airflow Management Blanking Panels'. This paper explains and quantifies the effects of airflow management blanking panels on cooling system performance.
5. CHECK FLOORING FOR GAPS
Gaps allow air to escape. Block up all holes in the tiles within the raised floor, or fit brush grommets around cable entry and exit points through the floor tiles. This prevents cool air entering the room and mixing with warm air without having performed any cooling function, and will greatly increase air conditioning efficiency. Also block up any holes in the ceiling, walls, or other apertures such as old air conditioning ducts that allow conditioned air to escape or unconditioned air to enter the data centre.
6. SEPARATE HOT & COLD AIR
Ensure all servers within the racks and rows face the same direction, then separate the rows into hot and cold aisles, with the fronts of the servers facing one way and the backs the other, so that colder air is directed to the front of the servers and hotter air flows from the back.
To further separate hot and cold air and improve the direction of airflow, introduce aisle containment: lay the data centre out in a distinct hot aisle and cold aisle format, and avoid arrangements where the hot exhaust of one row discharges into the cold intake of the next. Introducing roofs, and in particular doors at the ends of aisles, can result in a major improvement.
Containment solutions can eliminate hot spots and provide energy savings. The best containment solution for an existing facility will depend on the constraints of the facility. While ducted hot aisle containment is preferred for highest efficiency, cold aisle containment tends to be easier and more cost effective for facilities with existing raised floor air distribution.
Read more in Schneider Electric White Paper No 153 'Implementing Hot and Cold Air Containment in Existing Data Centres'. This paper investigates the constraints, reviews the available containment methods, and provides recommendations for determining the best containment approach.
7. CONTROL AIR TEMPERATURE
The average data centre runs at 21°C, but under the extended ASHRAE guidelines server manufacturers accept data centre temperatures from 18°C to 27°C. Steps to regulate air temperature, such as turning off cooling units where excess redundancy exists or increasing the supply air temperature to the room, can lead to further savings.
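The headroom implied by the extended range above can be made explicit: a room supplied at 21°C has several degrees of setpoint increase available before reaching the upper bound. This is a minimal sketch assuming the 18–27°C range quoted above; actual setpoint changes should be made gradually while monitoring rack inlet temperatures.

```python
# Extended ASHRAE recommended envelope quoted in the text (degrees Celsius).
ASHRAE_MIN_C, ASHRAE_MAX_C = 18.0, 27.0

def setpoint_headroom(current_supply_c):
    """Degrees by which the supply air setpoint could be raised while
    staying inside the recommended envelope (0 if already at or above it)."""
    return max(0.0, ASHRAE_MAX_C - current_supply_c)

# A typical room running at 21 C has 6 degrees of headroom.
print(setpoint_headroom(21.0))
```

Each degree of setpoint increase typically reduces chiller energy, which is why this is listed as a savings step.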
8. CHECK VOLTAGE & REVIEW RACK POWER OPTIONS
The voltage of your electricity transformers should be checked regularly to ensure it exactly matches the supply voltage requirement of your equipment. A higher voltage than required means unnecessary use of power, increasing costs.
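The voltage check above amounts to comparing a measured value against the equipment's nominal rating and tolerance. The 230 V nominal and ±6% tolerance below are assumed illustrative values; use the figures from your own equipment's specifications.

```python
def voltage_within_tolerance(measured_v, nominal_v=230.0, tolerance=0.06):
    """True if the measured supply voltage is within the equipment's
    tolerance band around its nominal rating (assumed 230 V +/-6% here)."""
    return abs(measured_v - nominal_v) / nominal_v <= tolerance

print(voltage_within_tolerance(243))  # within band
print(voltage_within_tolerance(250))  # outside band, investigate
```

A reading persistently at the top of the band suggests the transformer tap may be set higher than the equipment needs, wasting power as described above.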
Your rack power system needs to adapt to changing requirements. Read about rack powering options for high density in Schneider Electric White Paper No 29 where alternatives for providing electrical power to high density racks in data centres and network rooms are explained and compared. Issues addressed include quantity of feeds, single-phase vs. three-phase, number and location of circuit breakers, overload, selection of plug types, selection of voltage, redundancy and loss of redundancy. Guidelines are defined for rack power systems that can reliably deliver power to high density loads while adapting to changing needs.
9. SWITCH OFF REDUNDANT EQUIPMENT
A single new server can now do the job of multiple older servers, saving energy and lowering your cooling requirements. Take the long-term view when deciding whether to upgrade sooner rather than later.
Turn off the lights, or better still, put them on PIR motion sensors. Where the data centre is large, zone the lighting so that only areas where people are working are lit, and only for the duration they are in the data centre. Not only will you save electricity, but the heat generated by the light fittings will not add to the IT cooling load.
Electricity usage costs have become an increasing fraction of the total cost of ownership (TCO) for data centres. It is possible to dramatically reduce the electrical consumption of typical data centres through appropriate design of the data centre physical infrastructure and through the design of the IT architecture. Read more in Schneider Electric White Paper No 114 'Implementing Energy Efficient Data Centres'. This paper explains how to quantify the electricity savings and provides examples of methods that can greatly reduce electrical power consumption.
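One common way to quantify the electrical efficiency discussed above is Power Usage Effectiveness (PUE): total facility power divided by the power delivered to the IT load. The kW readings below are hypothetical; the metric itself is an industry standard, not something specific to the white paper cited.

```python
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility power / IT load power.
    1.0 is the theoretical ideal; lower is better."""
    return total_facility_kw / it_load_kw

# Hypothetical readings: 800 kW at the utility meter, 400 kW at the IT load.
print(f"PUE = {pue(800, 400):.2f}")
```

Tracking PUE alongside the regular meter readings from tip 1 shows whether infrastructure changes (containment, setpoints, lighting) are actually reducing the non-IT overhead.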
The Schneider Electric White Paper "Essential Elements of Data Centre Facility Operations" argues that human error and inattention can compromise the performance, and therefore the efficiency, of any data centre design. According to the Uptime Institute's analysis of its "abnormal incident" reporting (AIR) database, 70% of data centre outages are directly attributable to human error, much of it the result of poor operations and maintenance practices. Mitigating these threats and their effects requires an effective and efficient operations and maintenance (O&M) programme. The paper describes the management principles unique to mission critical facilities and provides a comprehensive, high-level overview, with practical tips and advice throughout, of the programme needed to operate such a facility efficiently and reliably across its life cycle.