Tidy Data Centres are Green Data Centres

MI Summary

Full article: Tidy Data Centres are Green Data Centres

Surprisingly low-tech innovations are the secret to success for some of the world's greenest data centres.

Cabling is often the last thing on a data centre manager's mind, and as a result the under-floor cabling in many server farms is so untidy that it blocks the channels used to distribute cool air, forcing the air conditioning units to work even harder and driving up both energy use and electricity bills. IBM's Integrated Rack Solution integrates the cabling into the server racks and neatly bundles cables together so that they pose minimal disruption to the all-important air flow.

Data centre managers have long known that keeping the front of the server racks cool is critical to their reliability and availability, so they have typically alternated cold corridors -- where cold air is pumped into a corridor with sets of server racks facing inwards -- with hot corridors -- where the hot air is exhausted from the back of the racks and extracted from the data centre. However, hot air rises, and the warm air from the "exhaust" corridor typically "leaks" back over the top of the racks into the cold corridor. As a result, the servers housed at the top of the racks are considerably less reliable than those at the bottom, and the air-conditioning units once again have to work harder to keep the temperature down.

IBM's response to the problem? Stick a glass roof and door on the cold corridor. Consequently, the hot air is kept away from the front of the servers, and the air conditioning units not only have to cool a far smaller area but are able to do so without warm air seeping into the cold corridor.

IBM reckons that combining these two relatively simple approaches can slash the energy used to cool a data centre by up to 50%.

Text of Article

Surprisingly low-tech innovations are the secret to success for some of the world's greenest data centers, as IBM showed at a demo of its latest, most energy-efficient data center in London.

With the arms race to deliver the greenest data center intensifying by the week, it is sometimes easy to forget that while the leading hardware vendors are spending billions on developing new energy-efficient processors or high-tech environmental monitoring systems, some of the greenest data centers currently in operation rely on several surprisingly low-tech innovations.

This was hammered home to me last week at a demo of IBM's new energy-efficient data center facility at its London headquarters, where many of the key energy-saving techniques appeared to sit on the "why on Earth didn't I think of that" side of simple.

Of course, that is not to say that IBM didn't also have plenty of high-tech developments to show off, such as its new "cold battery" that promises to slash the power used in cooling technologies, its enhanced software for managing data centers' power footprints, and improved water cooling technologies.

It is just that, alongside these new systems, several of the other innovations demonstrated were surprising in both their simplicity and effectiveness. It is easy to imagine countless data center managers having to break off from a tour of the green data center in the bowels of IBM's South Bank HQ to physically kick themselves over their failure to implement configurations that, once witnessed, appear blindingly obvious.

One such example is IBM's approach to under-floor cabling. According to IBM's execs, cabling is often the last thing on a data center manager's mind, and as a result the cabling in many server farms is so untidy it makes a teenager's bedroom look like a spotless operating theater.

Before concerns about power consumption made their way to center stage, this untidiness was neither here nor there. But the under-floor space that typically houses a data center's network and power cables is also used to distribute cool air around the servers and, according to IBM, blocking up these channels with unbundled cables limits the air flow and forces the air conditioning units to work even harder, driving up both energy use and electricity bills.

Chris Scott, site and facilities service product line leader at IBM, said that the company had addressed this problem through its Integrated Rack Solution (IRS), which integrates the cabling into the server racks, neatly bundling cables together to ensure they pose minimal disruption to the all-important air flow. By bundling the power and network cables separately the racks' reliability is also increased, he added, while the under-floor space is left free "for what it was originally designed for -- moving air to cool the servers."

Scott said that IBM's integrated racks had been optimized to enhance reliability, reduce maintenance work, and maximize air flow, but any data center chief could feasibly enjoy some of these energy efficiency savings by simply bundling their cables together.
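
To put rough numbers on that airflow argument, here is a minimal sketch of the underlying arithmetic in Python. At a fixed airflow rate, fan power is proportional to the pressure the fans must overcome (power = flow x pressure drop / fan efficiency), so clutter that raises the under-floor plenum's resistance raises fan energy in direct proportion. Every figure below is an illustrative assumption, not an IBM number.

 # Illustrative estimate of how under-floor cable clutter drives up fan energy.
 # At a fixed airflow Q, fan shaft power P = Q * dP / efficiency, so a higher
 # pressure drop dP across a cluttered plenum means proportionally more power.
 # All input values are assumptions chosen purely for illustration.

 AIRFLOW_M3_PER_S = 10.0         # assumed airflow the cooling units must deliver
 CLEAN_PRESSURE_DROP_PA = 150.0  # assumed plenum pressure drop with tidy cabling
 CLUTTER_MULTIPLIER = 1.4        # assumed 40% extra resistance from loose cables
 FAN_EFFICIENCY = 0.6            # assumed overall fan efficiency

 def fan_power_watts(flow_m3_s, pressure_pa, efficiency):
     """Power needed to push `flow_m3_s` of air against `pressure_pa`."""
     return flow_m3_s * pressure_pa / efficiency

 tidy = fan_power_watts(AIRFLOW_M3_PER_S, CLEAN_PRESSURE_DROP_PA, FAN_EFFICIENCY)
 cluttered = fan_power_watts(AIRFLOW_M3_PER_S,
                             CLEAN_PRESSURE_DROP_PA * CLUTTER_MULTIPLIER,
                             FAN_EFFICIENCY)

 print(f"Tidy plenum:      {tidy:,.0f} W")               # 2,500 W
 print(f"Cluttered plenum: {cluttered:,.0f} W")          # 3,500 W
 print(f"Extra fan energy: {cluttered / tidy - 1:.0%}")  # 40%

On these assumptions the cluttered plenum costs 40 percent more fan energy to move exactly the same air, which is the effect the IRS's bundled cabling is designed to avoid.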

Similarly straightforward is IBM's answer to the age-old data center problem of having to cool an entire room just to keep a few servers cool.

Data center managers have long known that keeping the front of the server racks cool is critical to their reliability and availability, so they have typically alternated cold corridors -- where cold air is pumped into a corridor with sets of server racks facing inwards -- with hot corridors -- where the hot air is exhausted from the back of the racks and extracted from the data center.

However, whilst this enhances cooling efficiency it is a less-than-perfect set-up because, as every schoolboy or girl knows, hot air rises, and the warm air from the "exhaust" corridor typically "leaks" back over the top of the racks into the cold corridor. As a result, the servers housed at the top of the racks are considerably less reliable than those at the bottom, and the air conditioning units once again have to work harder to keep the temperature down.
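
A back-of-envelope mixing calculation illustrates why that leakage matters: the air arriving at the top of a rack is a blend of supplied cold air and recirculated hot exhaust. The temperatures and recirculation fractions below are assumptions chosen for illustration, not measured figures.

 # Toy energy-balance model of hot-aisle air leaking back into the cold aisle.
 # The inlet temperature at the top of a rack is a weighted mix of supplied
 # cold air and recirculated hot exhaust. All input values are assumptions.

 COLD_SUPPLY_C = 18.0   # assumed cold-corridor supply temperature
 HOT_EXHAUST_C = 35.0   # assumed hot-corridor exhaust temperature

 def top_of_rack_inlet_c(recirculated_fraction):
     """Inlet temperature when this fraction of the intake is hot exhaust."""
     return ((1.0 - recirculated_fraction) * COLD_SUPPLY_C
             + recirculated_fraction * HOT_EXHAUST_C)

 for fraction in (0.0, 0.1, 0.3):
     print(f"{fraction:.0%} recirculation -> "
           f"{top_of_rack_inlet_c(fraction):.1f} C at the top of the rack")

 # Output: 18.0 C, 19.7 C and 23.1 C respectively. Sealing the cold corridor
 # with a roof and door drives the recirculated fraction towards zero.

Even 30 percent recirculation lifts the top-of-rack inlet temperature by around 5 degrees C on these assumptions, which is exactly the reliability gap the glass roof is meant to close.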

IBM's response to the problem? Stick a glass roof and door on the cold corridor.

Consequently, the hot air is kept away from the front of the servers, and the air conditioning units not only have to cool a far smaller area but are able to do so without warm air seeping into the cold corridor. It's hardly high-tech, but it works.

IBM reckons that combining these two relatively simple approaches can slash the energy used to cool a data center by up to 50 percent -- which represents a considerable environmental and cost saving given that cooling systems account for over a third of the energy used in a typical data center.
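
That headline figure is easy to sanity-check with one line of arithmetic: if cooling takes roughly a third of a facility's energy and these measures halve it, total facility energy falls by roughly a sixth. A minimal sketch, using assumed shares:

 # Sanity check on the headline saving. Cutting the cooling load by half
 # saves cooling_share * reduction of total facility energy. Both inputs
 # are assumptions based on the figures quoted in the article.

 COOLING_SHARE = 0.35      # "over a third" of a typical facility's energy
 COOLING_REDUCTION = 0.50  # "up to 50 percent" from the two simple measures

 print(f"Total facility energy saved: {COOLING_SHARE * COOLING_REDUCTION:.1%}")
 # -> Total facility energy saved: 17.5%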

Of course, IBM is not alone in advocating this more holistic approach to enhancing data center energy efficiency, but its demo data center certainly serves to illustrate that, alongside the billions of dollars being invested in energy efficiency, there is still room for simple design innovation.

What's more, data center managers should take heart from the fact that while they may ultimately need to undertake expensive upgrades to bring down their electricity bills, there are still simple, commonsense steps they can take to enhance energy efficiency.

For an overview on the topic(s), see also

* Data Centre
* IBM
