Data Centre

From Lauraibm

==Data Centre In the news==

* [[How to Run an Efficient Data Centre (1-Nov-07)]]
* [[New Project Says "Green IT Begins with Green Data" (19-Oct-07)]]
* [[Outsourcing Addresses "Green" IT Issues (8-Oct-07)]]
* [[Data centre carbon emissions soar (10-Oct-07)]]

==Storage in the news==

==Summaries==

Full article: BLADE Network Technologies Brings Full 10 Gigabit Ethernet "Green Blade" Switch to HP BladeSystem (8-Aug-07)

BLADE Network Technologies, which claims to be the leading provider of network switching infrastructure to the blade server market, announced the release of the first full 10Gb Ethernet "Green Blade" switch for HP BladeSystem. For the first time, HP BladeSystem c-Class can be equipped to provide 400Gb of Ethernet bandwidth for the high throughput and low latency required by blade server virtualization workloads and converged "single fabric" Ethernet networks.

The BLADE 10G Ethernet switch reduces power consumption by 95% per 10G Ethernet port compared with the 10G Ethernet modules used in rack-mounted, chassis-based network switches.

Full article: CIOs Plugging Into Utilities' Green Leadership Ideas (14-Aug-07)

What can CIOs do to make data centres more energy efficient?

  • Application software virtualization and server consolidation
  • Massive Array of Idle Disks (MAID), which stores rarely-used data on hard disks that are normally turned off, helping customers realize 75 percent or more in energy savings compared to typical always-on systems (a rough savings sketch follows below).
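
As a rough illustration of how a MAID policy could deliver savings of that order, here is a back-of-the-envelope sketch in Python. The wattages and the assumption that 80% of the drives hold rarely-used data are hypothetical figures chosen for the example, not numbers from the article.

 # Back-of-the-envelope estimate of MAID-style savings (all figures are illustrative assumptions).
 ACTIVE_WATTS = 10.0   # assumed draw of a spinning drive
 STANDBY_WATTS = 1.0   # assumed draw of a spun-down drive
 IDLE_FRACTION = 0.8   # assumed share of drives holding rarely-used data
 def maid_saving(active_w=ACTIVE_WATTS, standby_w=STANDBY_WATTS, idle_frac=IDLE_FRACTION):
     """Return the fractional energy saving versus an always-on array."""
     always_on = active_w                                       # every drive spinning all the time
     maid = (1 - idle_frac) * active_w + idle_frac * standby_w  # most drives spun down
     return 1 - maid / always_on
 print(f"Estimated saving: {maid_saving():.0%}")                # ~72% with these assumptions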

Full article: Holistically (including economics and environment), Moore’s Law could be failing (14-Aug-07)

Data centre layouts? Taylor says that the average data centre uses three times as much cooling as it actually needs because it’s laid out with aesthetics in mind, rather than energy efficiency.

IDC’s storage systems VP asks whether part of the problem isn’t the salesperson for EMC (or HP) who is looking to close the next sale, driving more gear into data centres and simply perpetuating the problem.

Are data centres too cold? An HP rep points out that most data centres are kept at around 60 degrees Fahrenheit (about 15°C), which is much colder than they need to be given the specs to which servers are built (he claims 68 to 72 degrees, roughly 20 to 22°C, is more appropriate). On the flip side, he acknowledges that many data centre managers see the 60-degree level as buying them some time in the event that a failure knocks out the cooling facilities (rather than living on the edge where, if the cooling fails, system failure or shutdown closely follows).

Full article: How Green is Silicon Valley (7-Sep-07)

PC makers Hewlett Packard and Dell, chipmakers Intel and Advanced Micro Devices, as well as Microsoft and Google, have joined consortia to improve energy efficiency of hardware and within the firms' giant data centres.

A recent EPA report estimated that US data centres consumed around 61bn kilowatt hours in 2006 at a cost of about $4.5bn. That's about 1.5% of total US electricity consumption, more than the electricity used by all American TV sets.

Gartner is sceptical about the influence the consortium will have. "Competition among the majority of vendor contributors is likely to slow the development of clear standards that can be used by data centre managers," it wrote in a note to clients.

A report from research company Ipsos says that a majority of respondents would buy consumer products from companies that demonstrated their environmentally sensitive credentials.

Full article: Tech Companies are Greener, but are they Green Enough? (12-Sep-07)

But it's a lot easier to put out a news release than to build a data centre with a significantly smaller environmental footprint. Depending on the configuration and the equipment involved, as little as 30 to 40 percent of the juice flowing into a data center is used to run computers. Most of the rest goes to keeping the hardware cool, since heat saps performance. Unlike in other office space, that A/C cranks year-round, to overcome the 100-degree (38 Celsius)-plus air that the computers themselves throw off. That challenge has increased in recent years with the rise of compact "blade" servers that are crammed into server racks. This is why big data centres can devour several megawatts of power, enough for a small city.
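
That 30-to-40-percent figure maps onto the industry's power usage effectiveness (PUE) metric, which is simply total facility power divided by the power that reaches the IT equipment. A minimal sketch in Python (the 1 MW feed is a made-up figure for illustration):

 # PUE = total facility power / IT equipment power.
 # If only 30-40% of incoming power reaches the computers, PUE works out at roughly 2.5-3.3.
 def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
     return total_facility_kw / it_equipment_kw
 for it_share in (0.30, 0.40):
     total_kw = 1000.0             # hypothetical 1 MW feed into the facility
     it_kw = total_kw * it_share   # portion that actually runs the computers
     print(f"IT share {it_share:.0%} -> PUE {pue(total_kw, it_kw):.2f}")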

Full article: Today's “Green T” may not be as Healthy as we Think (27-Aug-07)

Marketing departments at many of the leading manufacturers have been spending more time on using mathematics to support the supposed green initiatives being touted than on making tangible and material engineering changes to the design of their solutions to realize true environmental benefits.

One of the most blatant examples of this practice is where a hardware manufacturer exploits increased disk drive densities and advertises that the new generation of its products is suddenly dramatically more efficient per unit of storage. While this is an indisputable fact, it does not address the root cause, which is the unmanaged explosion of data being stored.

The problem with using denser IT as an energy efficiency strategy is that companies are being asked to store more data, and for longer periods of time to comply with regulatory demands - so no matter what the storage technology, more data is being created and stored.

Many companies tackle this problem through software, by applying de-duplication, single-instance storage, or compression methods that result in an overall reduction in the amount of data stored. While these software solutions certainly achieve a more measurable and arguably more tangible result, these approaches also do not address the same root cause identified above – the unmanaged explosion of data being stored.
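
For readers unfamiliar with the software approaches mentioned above, here is a minimal, illustrative sketch of single-instance storage in Python: identical content is detected by hash and physically stored only once. Real de-duplication products add chunking, metadata and collision handling that this toy version ignores.

 import hashlib
 class SingleInstanceStore:
     """Toy single-instance store: identical blobs are physically kept only once."""
     def __init__(self):
         self._blobs = {}          # content hash -> bytes
         self.logical_bytes = 0    # what applications think they stored
         self.physical_bytes = 0   # what actually hits the (toy) disk
     def put(self, data: bytes) -> str:
         digest = hashlib.sha256(data).hexdigest()
         self.logical_bytes += len(data)
         if digest not in self._blobs:   # new content is stored once; duplicates are just referenced
             self._blobs[digest] = data
             self.physical_bytes += len(data)
         return digest
 store = SingleInstanceStore()
 attachment = b"quarterly report" * 1000
 for _ in range(50):                     # the same attachment mailed to 50 people
     store.put(attachment)
 print(store.logical_bytes, store.physical_bytes)   # 800000 logical bytes vs 16000 physical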

In order to really make a dent in reducing energy consumption, we must work with the creators of content and data. The effort towards data reduction and therefore less reliance on ever increasing storage capacities must be collaborative and universal.

This IDC analyst seems obsessed with data volumes as the cause of all IT-related environmental damage. For instance, 'The wide accessibility of free internet storage and content depots (such as social networks), which IDC forecasts will be a major driver of storage capacities, encourages individuals to generate more content and to leverage the availability of these free resources without much thought towards the collective environmental impact.'

Full article: Teradata Ships Green Data Warehouse Server (22-Aug-07)

NCR Teradata has rolled out an energy-efficient version of its 5500 Server data warehousing platform which it claims uses 75% less power. Teradata's chief development officer said the electricity savings from a single server are enough to power 40 homes for a year. He added that further energy savings accrue from a 30% drop in cooling costs made possible by the server's cabinet design.
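
As a rough scale check on the "40 homes" comparison (the per-home consumption figure below is our assumption, not Teradata's):

 # Rough scale check on the "power 40 homes for a year" claim.
 HOME_KWH_PER_YEAR = 11_000                 # assumed annual consumption of one US home
 HOURS_PER_YEAR = 8760
 saved_kwh = 40 * HOME_KWH_PER_YEAR         # 440,000 kWh per year
 avg_saved_kw = saved_kwh / HOURS_PER_YEAR  # ~50 kW of continuous draw avoided
 print(f"{saved_kwh:,} kWh/year, about {avg_saved_kw:.0f} kW of continuous load")

At the claimed 75% reduction, that would put the original configuration somewhere around 67 kW of continuous draw, which is at least plausible for a large data-warehouse installation.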

Full article: Sun Goes for the Green with new Data Centre (21-Aug-07)

Sun will open the doors on 21st August to a new, 76,000-square-foot data center. Company executives believe the new data centre will demonstrate its ability to deliver eco-friendly technology to its customers. The new data center will use about 500KW of electricity compared to the 2MW consumed by older data centres. This will also allow the company to reduce its carbon dioxide emissions by 4,100 tons every year. The new Santa Clara data centre has been online since June and Sun is also launching similar initiatives in Blackwater, England, and Bangalore, India.
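
The quoted figures are roughly self-consistent; a hedged check, assuming a grid emission factor of around 0.3 kg of CO2 per kWh (our assumption, not Sun's):

 # Rough consistency check of Sun's power and CO2 figures (emission factor is an assumption).
 OLD_KW, NEW_KW = 2000, 500            # 2 MW older facilities vs the new 500 kW data centre
 HOURS_PER_YEAR = 8760
 KG_CO2_PER_KWH = 0.3                  # assumed grid emission factor
 saved_kwh = (OLD_KW - NEW_KW) * HOURS_PER_YEAR       # ~13.1 million kWh per year
 saved_tons = saved_kwh * KG_CO2_PER_KWH / 1000       # metric tons of CO2
 print(f"{saved_kwh:,.0f} kWh/year saved, roughly {saved_tons:,.0f} t CO2/year")

With these assumptions the result lands in the same ballpark as the 4,100 tons Sun cites.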

Besides the new data centre, Sun will begin offering a new set of services, the Sun Eco Services suite, designed to address issues of power and cooling for customers' data centres.

Within the service offering, Sun will offer both basic and advanced assessment services for customers' data centers, as well as cooling efficiency and optimization services. Sun is also working with another company, Worldwide Environmental Services, to deliver additional services and plans for rack placement and cooling suggestions for data centers.

"Worldwide Environmental Services specializes in this type of services and whether our customers are looking to build out a data center or build a new data center, we want to offer them optimal amount of eco services that is the best for our customers," Nowack said.

The price of these services starts at about $10,000 for the basic assessment package and can increase up to $30,000 to $40,000 for more advanced services with additional on-site visits by Sun engineers and the company's partners.

Full article: LinuxWorld 2007 goes green with Green Grid consortium (10-Aug-07)

  • Rackable offers 40U racks which use energy-efficient DC power distribution within the rack.
    • A large amount of power is wasted before it even touches the server when it has to go through a separate UPS and power distribution system that outputs AC, which then has to be converted back to DC by the server’s power supply (a toy comparison of the two approaches follows after this list).
  • Sun’s energy-efficient BlackBox project is a data centre in a standard shipping container. What makes Sun’s BlackBox efficient is that it uses water as a heat-exchange mechanism. Since water is about 7 times more efficient at heat exchange than air, it reduces the amount of power used for cooling. You basically pump cold water into the BlackBox and warm water comes out.
  • SMEs with server rooms may need a little motivation to conserve power. These IT shops often use a mixture of pedestal-based and rack-mount servers, and they often never see the electricity bill, since that may be handled by departmental budgets. Businesses should seriously consider moving the power budget under the IT department; then IT might have some motivation to contain energy costs through more efficient server rooms and desktop computers. Until then, the calls for green computing seem to fall on deaf ears.
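
As a toy illustration of the double-conversion point in the first bullet, the sketch below compares a conventional AC distribution chain with in-rack DC distribution. The stage efficiencies are assumptions chosen for the example, not Rackable's figures.

 # Toy comparison of conventional AC distribution vs in-rack DC distribution.
 # Stage efficiencies below are illustrative assumptions only.
 def chain_efficiency(*stages: float) -> float:
     eff = 1.0
     for s in stages:
         eff *= s
     return eff
 # Conventional path: double-conversion UPS, PDU/transformer, then each server's AC-to-DC power supply.
 ac_path = chain_efficiency(0.90, 0.98, 0.85)
 # In-rack DC path: one shared rectifier feeding DC to the servers directly.
 dc_path = chain_efficiency(0.92, 0.98)
 print(f"AC path delivers {ac_path:.0%} of input power, DC path delivers {dc_path:.0%}")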

Full article: Tech's Own Data Centres are their Green Showrooms (21-Aug-07)

Another article on the Sun data centre consolidation and the Fujitsu hydrogen fuel cell announcements.

Full article: Silicon Valley's Green Detente (Aug-07)

It is almost 30 years since Jimmy Carter called the United States the "most wasteful nation on Earth." Now Silicon Valley finally has found common ground in the fight against climate change. The paper cites some of the efforts by HP, IBM and Sun. Economics no longer trumps the environment. Some of the most innovative minds are working from their R&D centres in Silicon Valley to save the planet. They developed the digital world -- now they're helping to take care of our world.

Full article: Tidy Data Centres are Green Data Centres

Surprisingly low-tech innovations are the secret to success for some of the world's greenest data centres.

Cabling is often the last thing on a data centre manager's mind, and as a result the under-floor cabling in many server farms is so untidy that it blocks the channels used to distribute cool air, forcing the air-conditioning units to work even harder and driving up both energy use and electricity bills. IBM's Integrated Rack Solution integrates the cabling into the server racks and neatly bundles cables together to ensure they pose minimal disruption to the all-important airflow.

Data centre managers have long known that keeping the front of the server racks cool is critical to their reliability and availability, so they have typically alternated cold corridors -- where cold air is pumped into a corridor with server racks facing inwards on either side -- with hot corridors -- where the hot air is exhausted from the back of the racks and extracted from the data centre. However, hot air rises, and as a result the warm air from the "exhaust" corridor typically "leaks" back over the top of the racks into the cold corridor. Consequently, the servers housed at the top of the racks are considerably less reliable than those at the bottom, and the air-conditioning units once again have to work harder to keep the temperature down.

IBM's response to the problem? Stick a glass roof and door on the cold corridor. The hot air is kept away from the front of the servers, and the air-conditioning units not only have to cool a far smaller area but are able to do so without warm air seeping into the cold corridor.

IBM reckons that combining these two relatively simple approaches can slash the energy used to cool a data centre by up to 50%.

Full article: Green Tech Shops have a way to go (8-Aug-07)

Data centres are the gas-guzzling jalopies of the technology world. Some require 40 or 50 times more power than comparable office space.

"It's somewhat analogous to someone who decides to purchase an energy-efficient car and says, `Gee, I'm using 30% less petrol with this, that means I can drive 30% more miles than I used to, and still do something for the environment,'" said an analyst with Pund-IT Research.

A new report from the EPA estimates that the easiest, least expensive changes to data centre operations - involving tweaks to software, layout and air conditioning - could boost efficiency by 20%. The EPA says a 45% improvement - enough to lower electricity usage by 2011 - can be achieved with existing technologies.

Unlike in other office space, air conditioning cranks year-round, to overcome the 100-degree-plus air that the computers themselves throw off. That challenge has increased in recent years with the rise of compact "blade" servers that are crammed into server racks.

A 1 megawatt data center will ring up $17 million in electric bills over its 10-year life span. Even so, few data centres have taken obvious steps to reduce that load.
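
That headline figure is roughly consistent with a simple cost model; the electricity price and overhead factor below are assumptions for illustration, not numbers from the article:

 # Rough reconstruction of the $17M-over-10-years figure (price and overhead factor are assumptions).
 IT_LOAD_KW = 1000          # "1 megawatt data centre" taken to mean 1 MW of IT load
 OVERHEAD_FACTOR = 2.0      # assumed ratio of total facility power to IT power
 PRICE_PER_KWH = 0.10       # assumed electricity price in $/kWh
 YEARS = 10
 HOURS_PER_YEAR = 8760
 total_cost = IT_LOAD_KW * OVERHEAD_FACTOR * HOURS_PER_YEAR * YEARS * PRICE_PER_KWH
 print(f"${total_cost:,.0f} over {YEARS} years")   # roughly $17.5 million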

Almost all the energy that goes into the air conditioning systems is used to run giant chillers that make the air pumped through the rooms' raised floors a brisk 55 degrees Fahrenheit (about 13°C) or so, sometimes as low as the 40s. Such extremely cold air is blasted in to guarantee that no single server's temperature gets much above the optimum level, which is around 70 degrees (21°C).

But the air conditioning doesn't have to be so cold if the layout of server rooms is better designed to improve air flow, smoothing out all the various microclimates that can develop.

And in many places, the outside air is plenty cold enough much of the year, for free. Yet only recently have data centres adopted systems that can take filtered outside air for cooling the computer rooms.
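
The free-cooling approach alluded to here is usually called an air-side economiser: when the outside air is cool and dry enough, dampers open and filtered outside air replaces (or supplements) mechanically chilled air. A minimal decision sketch, with thresholds chosen purely for illustration:

 # Minimal air-side economiser decision logic (thresholds are illustrative assumptions).
 SUPPLY_SETPOINT_C = 18.0      # desired supply-air temperature
 MAX_OUTSIDE_RH = 0.80         # relative humidity above which outside air is rejected
 def use_outside_air(outside_temp_c: float, outside_rh: float) -> bool:
     """Return True when filtered outside air alone can meet the supply setpoint."""
     return outside_temp_c <= SUPPLY_SETPOINT_C and outside_rh <= MAX_OUTSIDE_RH
 for temp_c, rh in [(10.0, 0.5), (18.0, 0.9), (25.0, 0.4)]:
     mode = "economiser" if use_outside_air(temp_c, rh) else "chillers"
     print(f"{temp_c}°C at {rh:.0%} RH -> {mode}")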

To be fair, some data centres are buried too deep within buildings to gulp fresh air. But the main reason for the A/C over-reliance is that data centers were built for one thing - to maximize the performance of the Web sites, computer programs and networking equipment that they run. If the air conditioning is colder than necessary, so be it.

"There are probably two key metrics for the IT guy: no downtime and `no security breaches on my watch,'" said the VP of one cooling efficiency firm. "They normally do not know, don't care and aren't measured by their electric bill."

In fact, in many companies, any given department's responsibility for the overall utility bill is determined by such factors as employee headcount or square feet of office space. By that measure, the IT department comes out way ahead.

An IBM VP estimates this is still the state of affairs 70-80% of the time. The tech shops "aren't actually paying their real energy bill," Sams says.

Full story: Self-sustaining data centers almost a reality (6-Aug-07)

As part of an ongoing effort to make IT more green, IBM, Hewlett-Packard and Sun Microsystems are said to be in the midst of researching and testing self-sustained data centres that need no cooling equipment.

Full story: EPA Urges Data Centres to Cut Power (6-Aug-07)

At current rates, US data centres will double power consumption over the next four years, according to the Environmental Protection Agency (EPA). But if more efficient use were made of servers, this upward trend could be slowed or even reversed. In 2006, America's servers consumed about 61 billion kilowatt-hours, representing a cost of about $4.5bn and 1.5% of total electricity consumption in the USA. In a “best practice” scenario described by the EPA, in which data centres are aggressively consolidated and direct liquid cooling is applied to servers and storage, consumption could decline to around 45 billion kilowatt-hours. The EPA wants to investigate an Energy Star rating system for servers, but IT vendors are likely to be ambivalent about this.
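
Two quick derived figures from the EPA numbers above: the annual growth rate implied by a doubling over four years, and the average electricity price implied by the cost estimate.

 # Figures implied by the EPA estimates quoted above.
 growth_rate = 2 ** (1 / 4) - 1        # doubling in four years => ~19% growth per year
 implied_price = 4.5e9 / 61e9          # $4.5bn over 61bn kWh => ~$0.074 per kWh
 print(f"~{growth_rate:.0%} annual growth, ~${implied_price:.3f}/kWh implied average price")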

Full article: Business disruption from power failures up 350% (30-Apr-07)

According to SunGard, power failures accounted for just 7% of the IT disruptions reported to it in the UK in 2005, leaping to 26% in 2006. The UK boss of SunGard said: “With IT kit drawing more power than ever before, it is imperative that businesses plan for possible interruptions to their power supply.”
