Data Centre

In the News


Summaries

Full article: LinuxWorld 2007 goes green with Green Grid consortium (10-Aug-07)

  • Rackable offers 40U racks which use energy-efficient DC power distribution within the rack.
    • A large amount of power is wasted before it even reaches the server when it has to pass through a separate UPS and power-distribution system that outputs AC, which must then be converted back to DC by the server's power supply.
  • Sun’s energy-efficient BlackBox project is a data centre in a standard shipping container. What makes Sun’s BlackBox efficient is that it uses water as a heat-exchange mechanism. Since water is about 7 times more efficient at heat exchange than air, it reduces the amount of power consumed for cooling: you pump cold water into the BlackBox and warm water comes out (see the sketch after this list).
  • SMEs with server rooms may need a little motivation to conserve power. These IT shops often use a mixture of pedestal-based and rack-mount servers, and they often never see the electricity bill, since that may be handled by departmental budgets. Businesses should seriously consider moving the power budget under the IT department; that would give IT some motivation to contain energy costs through more efficient server rooms and desktop computers. Until then, all the cries of green computing seem to fall upon deaf ears.
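
A rough sketch of the physics behind the water-cooling claim, using standard textbook property values rather than figures from the article: the heat a coolant stream carries away is proportional to its mass flow and specific heat,

\[
\dot{Q} = \dot{m}\, c_p\, \Delta T
\]

\[
\frac{\rho_{w}\, c_{p,w}}{\rho_{a}\, c_{p,a}} \approx \frac{1000 \times 4.18}{1.2 \times 1.005} \approx 3500
\quad \text{(same volume flow and } \Delta T \text{)}
\]

Per unit volume moved, water's advantage is far larger than a factor of seven; the article's figure of 7 presumably reflects whole-system cooling efficiency rather than raw heat-carrying capacity.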

Full article: Silicon Valley's Green Detente (Aug-07)

It is almost 30 years since Jimmy Carter called the United States the "most wasteful nation on Earth." Now Silicon Valley has finally found common ground in the fight against climate change. The paper cites some of the efforts by HP, IBM and Sun. Economics no longer trumps the environment: some of the most innovative minds are working from their R&D centres in Silicon Valley to save the planet. They developed the digital world -- now they're helping to take care of our world.

Full article: Tidy Data Centres are Green Data Centres

Surprisingly low-tech innovations are the secret to success for some of the world's greenest data centres.

Cabling is often the last thing on a data centre manager's mind. As a result, the under-floor cabling in many server farms is so untidy that it blocks the channels used to distribute cool air, forcing the air-conditioning units to work even harder and driving up both energy use and electricity bills. IBM's Integrated Rack Solution integrates the cabling into the server racks and neatly bundles cables together to ensure they pose minimal disruption to the all-important air flow.

Data centre managers have long known that keeping the front of the server racks cool is critical to their reliability and availability. As a result, they have typically alternated cold corridors -- where cold air is pumped into a corridor flanked by server racks facing inwards -- with hot corridors -- where the hot air is exhausted from the back of the racks and extracted from the data centre. However, hot air rises, and the warm air from the "exhaust" corridor typically "leaks" back over the top of the racks into the cold corridor. As a result, the servers housed at the top of the racks are considerably less reliable than those at the bottom, and the air-conditioning units once again have to work harder to keep the temperature down.

IBM's response to the problem? Stick a glass roof and door on the cold corridor. The hot air is thus kept away from the front of the servers, and the air-conditioning units not only have to cool a far smaller area but are able to do so without warm air seeping into the cold corridor.

IBM reckons that combining these two relatively simple approaches can slash the energy used to cool a data centre by up to 50%.

Full article: Green Tech Shops have a way to go (8-Aug-07)

Data centres are the gas-guzzling jalopies of the technology world. Some require 40 or 50 times more power than comparable office space.
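
For a sense of scale behind that multiplier (using illustrative power densities of my own, not figures from the article): a typical office floor draws on the order of 1-2 W per square foot for lighting and plug loads, while a dense server room can draw 50-100 W per square foot, so

\[
\frac{P_{\text{data centre}}}{P_{\text{office}}} \approx \frac{80\ \text{W/ft}^2}{2\ \text{W/ft}^2} = 40
\]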

"It's somewhat analogous to someone who decides to purchase an energy-efficient car and says, `Gee, I'm using 30% less petrol with this, that means I can drive 30% more miles than I used to, and still do something for the environment,'" said an analyst with Pund-IT Research.

A new report from the EPA estimates that the easiest, least expensive changes to data centre operations - involving tweaks to software, layout and air conditioning - could boost efficiency by 20%. The EPA says a 45% improvement - enough to lower electricity usage by 2011 - can be achieved with existing technologies.

Unlike in other office space, air conditioning cranks year-round, to overcome the 100-degree-plus air that the computers themselves throw off. That challenge has increased in recent years with the rise of compact "blade" servers that are crammed into server racks.

A 1 megawatt data centre will ring up $17 million in electric bills over its 10-year life span. Even so, few data centres have taken obvious steps to reduce that load.
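
The $17m figure is easy to sanity-check. Assuming (my assumptions, not the article's) a fully utilised 1 MW IT load, a facility overhead of roughly 2x for cooling and power distribution (a PUE of about 2, typical for the period), and an electricity price of about $0.10/kWh:

\[
1\,\text{MW} \times 8760\,\tfrac{\text{h}}{\text{yr}} \times 10\,\text{yr} = 87.6\,\text{GWh (IT load alone)}
\]

\[
87.6\,\text{GWh} \times 2 \times \$0.10/\text{kWh} \approx \$17.5\text{m}
\]

which lands close to the article's figure.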

Almost all the energy that goes into the air-conditioning systems is used to run giant chillers that cool the air pumped through the rooms' raised floors to a brisk 55 degrees Fahrenheit or so, sometimes as low as the 40s. Such extremely cold air is blasted in to guarantee that no single server's temperature gets much above the optimum level, which is around 70 degrees.

But the air conditioning doesn't have to be so cold if the layout of server rooms is better designed to improve air flow, smoothing out all the various microclimates that can develop.

And in many places, the outside air is plenty cold enough much of the year, for free. Yet only recently have data centres adopted systems that can take filtered outside air for cooling the computer rooms.

To be fair, some data centres are buried too deep within buildings to gulp fresh air. But the main reason for the A/C over-reliance is that data centres were built for one thing - to maximise the performance of the Web sites, computer programs and networking equipment that they run. If the air conditioning is colder than necessary, so be it.

"There are probably two key metrics for the IT guy: no downtime and `no security breaches on my watch,'" said the VP of one cooling efficiency firm. "They normally do not know, don't care and aren't measured by their electric bill."

In fact, in many companies, any given department's responsibility for the overall utility bill is determined by such factors as employee headcount or square feet of office space. By that measure, the IT department comes out way ahead.

An IBM VP, Sams, estimates this is still the state of affairs 70-80% of the time. The tech shops "aren't actually paying their real energy bill," he says.

Full story: Self-sustaining data centers almost a reality (6-Aug-07)

As part of an ongoing effort to make IT more green, IBM, Hewlett-Packard and Sun Microsystems are said to be in the midst of researching and testing self-sustained data centres that need no cooling equipment.

Full story: EPA Urges Data Centres to Cut Power (6-Aug-07)

At current rates, US data centres will double their power consumption over the next four years, according to the Environmental Protection Agency (EPA). But if more efficient use were made of servers, this upward trend could be slowed or even reversed. In 2006, America's servers consumed about 61 billion kilowatt-hours, representing a cost of about $4.5bn and 1.5% of total electricity consumption in the USA. In a “best practice” scenario described by the EPA, in which data centres are aggressively consolidated and direct liquid cooling is applied to servers and storage, energy consumption could decline to 45 billion kilowatt-hours. The EPA wants to investigate an Energy Star rating system for servers, but IT vendors are likely to be ambivalent about this.
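
Two consistency checks fall straight out of the EPA's figures. Doubling in four years implies an annual growth rate of about 19%, and the quoted cost implies an average electricity price of roughly 7.4 cents per kWh:

\[
2^{1/4} - 1 \approx 0.19, \qquad 61\,\text{bn kWh} \times 1.19^{4} \approx 122\,\text{bn kWh by 2011}
\]

\[
\$4.5\,\text{bn} \div 61\,\text{bn kWh} \approx \$0.074/\text{kWh}
\]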

Full article: Business disruption from power failures up 350% (30-Apr-07)

According to SunGard, power failures accounted for just 7% of IT disruptions in the UK in 2005, leaping to 26% in 2006. The UK boss of SunGard said: “With IT kit drawing more power than ever before, it is imperative that businesses plan for possible interruptions to their power supply.”
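
Note that the arithmetic on the quoted shares gives a somewhat smaller jump than the 350% headline:

\[
\frac{26\% - 7\%}{7\%} \approx 271\%
\]

The 350% presumably refers to the absolute number of power-related disruptions, which would also depend on how the total number of disruptions reported to SunGard changed between 2005 and 2006.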
