Green Building Forum - Data Centre Cooling

Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91419#Comment_91419 Fri, 08 Oct 2010 16:16:19 +0100 MarkBennett
As a result, I'm interested in seeing if we can come up with a solution that serves the needs of the room and its users but minimises the installation and ongoing running costs. The company is a large multi-national, but with a reputation for being more environmentally conscious than our peers, so I think there would be some buy-in from the people who will have to sign it off.

Can anyone point me towards some useful papers/websites that will help me understand the possible approaches?
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91421#Comment_91421 Fri, 08 Oct 2010 16:21:52 +0100 DamonHD
E.g. here: http://www.theregister.co.uk/2010/09/23/yahoo_compute_coop/

Rgds

Damon
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91440#Comment_91440 Sat, 09 Oct 2010 08:52:17 +0100 SteamyTea
Or try these; I've not read them myself.

ftp://download.intel.com/technology/EEP/data-center-efficiency/state-of-date-center-cooling.pdf

http://kn.theiet.org/magazine/issues/0919/data-centre-0919.cfm

http://www.intel.com/it/pdf/reducing_data_center_cost_with_an_air_economizer.pdf
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91445#Comment_91445 Sat, 09 Oct 2010 11:28:09 +0100 Peter_in_Hungary Most of the savings in the above links seem to come from being able to dump the heat generated by the DC to the outside without using almost as much power to do so as the IT equipment itself. The Yahoo site appears to use naturally assisted convection to do this. The ability to do this depends on the construction of the building and is not easy to retrofit. (Note that Yahoo produced a purpose-designed building to do this, even if the chimney-effect design concept was stolen from agriculture.)
The problem I faced in my previous life, where we were just beginning to look at green issues (albeit driven by cost reduction), was trying to find a use for the low-grade heat produced by either the IT kit or the HVAC. It seems that Yahoo have not cracked this one either, as they are dumping it to air - a waste, but they have no use for it. We went through several loops trying to find a use for this heat, including piping it to an adjacent hotel to heat their swimming pool (the pipe runs were too long to be viable). In the end we found no use for it and it remained a loss.
Today life has moved on and heat pumps are much improved, so depending on the location and the surrounding demand it may be viable to upgrade the low-grade heat from the DC to a level where it can be used to heat the DC support offices (if it isn't already) or exported at a price (i.e. sold) to adjacent organisations.
It is very difficult to future-proof DCs, as business demands and IT advances produce too many variables; all you can do is project the trends.
For ongoing maintenance, the power supplies, distribution boards and connection boxes were regularly photographed with thermal imaging, which readily showed up hot spots that were then scheduled for maintenance. This process saved considerable downtime and maintenance cost compared with the previous physical inspections, which did not always pick up problems as reliably as the thermal imaging.
Peter
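
A rough sketch of the heat-pump arithmetic behind Peter's suggestion, assuming a hypothetical 50 kW IT load (nearly all of which ends up as low-grade heat) and an assumed COP of 4 for the lift to a useful export temperature; all figures are illustrative, not from any real DC:

    #include <stdio.h>

    int main(void)
    {
        /* Assumed, illustrative figures - not measurements from a real DC */
        double it_load_kw = 50.0; /* IT load; almost all of it ends up as low-grade heat */
        double cop        = 4.0;  /* assumed heat-pump COP for the temperature lift */

        /* Heat pump energy balance: Q_out = Q_in + W_elec and COP = Q_out / W_elec,
           so W_elec = Q_in / (COP - 1) and Q_out = Q_in * COP / (COP - 1). */
        double w_elec_kw = it_load_kw / (cop - 1.0);
        double q_out_kw  = it_load_kw * cop / (cop - 1.0);

        printf("Recoverable low-grade heat : %.1f kW\n", it_load_kw);
        printf("Heat-pump electrical input : %.1f kW\n", w_elec_kw);
        printf("Exportable upgraded heat   : %.1f kW\n", q_out_kw);
        return 0;
    }

With those assumptions, roughly 17 kW of heat-pump electricity turns 50 kW of low-grade heat into about 67 kW of exportable heat; whether that pencils out depends entirely on local demand and prices.
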
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91450#Comment_91450 Sat, 09 Oct 2010 12:38:43 +0100 DamonHD
They have been virtualising servers to get utilisation up, and thus the average useful cycles per Watt and per m^2.

The building I was in has no heating, only cooling, using its machine-room heat for the rest of the building when required, I think.

And I've demonstrated to myself that if energy efficiency is a sufficiently high priority then reducing W per unit of useful work done by an order of magnitude is by no means impossible. (My server suite at home, which runs our main Web-facing services such as Web sites and mail, is down from ~600W to ~4W excluding comms, with comms down from ~50W to ~8W. That reduction also meant we could get rid of the air-con in summer, which probably added another 30% to annual consumption. It has also gone from several m^2 of racking to the corner of my desk, and come off-grid entirely!)

Rgds

Damon
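
For scale, a minimal sketch of the annual-energy arithmetic behind Damon's figures (servers from ~600W down to ~4W, comms from ~50W down to ~8W), treating the loads as constant and ignoring the air-con saving; the only inputs are the numbers quoted above:

    #include <stdio.h>

    int main(void)
    {
        const double hours_per_year = 8760.0;

        /* Figures quoted in the post above */
        double servers_before_w = 600.0, servers_after_w = 4.0;
        double comms_before_w   = 50.0,  comms_after_w   = 8.0;

        double before_kwh = (servers_before_w + comms_before_w) * hours_per_year / 1000.0;
        double after_kwh  = (servers_after_w  + comms_after_w)  * hours_per_year / 1000.0;

        printf("Before: %.0f kWh/year\n", before_kwh);          /* ~5700 kWh */
        printf("After : %.0f kWh/year\n", after_kwh);           /* ~105 kWh  */
        printf("Reduction: roughly %.0fx\n", before_kwh / after_kwh);
        return 0;
    }
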
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91455#Comment_91455 Sat, 09 Oct 2010 13:32:41 +0100 Peter_in_Hungary Posted By: DamonHD: The building I was in has no heating, only cooling, using its machine-room heat for the rest of the building when required I think.

That is usual in my experience. Data centres usually produce more heat than can be used in the support offices that accompany them. The efficiency challenge is to find something useful to do with the surplus instead of dumping it outside. Perhaps when technology reduces its power demands sufficiently, many of these buildings with no heating other than their data centre will find themselves with a problem in the winter.

For your own situation, that sounds like a generation change in the hardware to me. If that is the case then yes, a new generation of kit will allow such changes, but the accountants will rarely allow such expense on energy consumption alone; usually energy usage is just one line item on a justification document. Well done for getting such a reduction and coming off-grid. For the next jump you will probably be waiting for the research labs to put the whole lot on one IC, pluggable into the processor, that may not be bigger than the kit that sits in the corner of your desk today.
Peter
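
A back-of-envelope illustration of the mismatch Peter describes, under assumed figures (a hypothetical 50 kW IT load against support offices with an assumed 400 m^2 of floor area and a peak heating demand of 50 W/m^2); the numbers are made up purely to show why the offices alone rarely soak up the surplus:

    #include <stdio.h>

    int main(void)
    {
        /* Assumed, illustrative figures only */
        double dc_heat_kw         = 50.0;  /* waste heat roughly equals the IT load     */
        double office_area_m2     = 400.0; /* assumed support-office floor area         */
        double office_demand_w_m2 = 50.0;  /* assumed peak space-heating demand per m^2 */

        double office_demand_kw = office_area_m2 * office_demand_w_m2 / 1000.0;

        printf("DC waste heat        : %.0f kW, around the clock\n", dc_heat_kw);
        printf("Office peak heating  : %.0f kW, winter only\n", office_demand_kw);
        printf("Surplus to dump/sell : %.0f kW even at the winter peak\n",
               dc_heat_kw - office_demand_kw);
        return 0;
    }
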
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91460#Comment_91460 Sat, 09 Oct 2010 15:40:53 +0100 DamonHD
Rgds

Damon

PS. It is already fully solid-state with very few chips: http://www.earth.org.uk/note-on-SheevaPlug-setup.html
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91502#Comment_91502 Sun, 10 Oct 2010 06:55:51 +0100 SteamyTea
As this is a refurbishment, and probably next to or in the main building, can the heat be used anywhere? Otherwise it is the Damon Route (that will be DR from now on) of installing new IT equipment and reducing the load by a factor of 5. 10 kW instead of 50 kW is a lot more manageable.
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91696#Comment_91696 Tue, 12 Oct 2010 20:22:18 +0100 DamonHD
http://www.theregister.co.uk/2010/10/12/capgemini_merlin_data_center/

Rgds

Damon
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=91789#Comment_91789 Wed, 13 Oct 2010 23:26:31 +0100 SimonH http://www.thegreengrid.org/

Seems like quite a useful resource with lots of reference material - if you've paid to be a member :-(
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=92021#Comment_92021 Mon, 18 Oct 2010 08:08:24 +0100 DamonHD
http://hardware.slashdot.org/story/10/10/17/196220/Small-Startup-Prevails-In-Server-Cooling-Chill-Off

Rgds

Damon
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=92024#Comment_92024 Mon, 18 Oct 2010 08:48:27 +0100 SteamyTea
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=92157#Comment_92157 Tue, 19 Oct 2010 21:05:49 +0100 Peter_in_Hungary this is a lot less than you get in some of the posts on this forum!!!

Peter
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107259#Comment_107259 Fri, 08 Apr 2011 13:32:23 +0100 djh
http://www.technologyreview.com/computing/37317/?a=f

http://opencompute.org/
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107280#Comment_107280 Fri, 08 Apr 2011 18:06:05 +0100 wookey
Calxeda have some interesting kit (ARM-based servers) which will do things like reduce power consumption per compute node by a factor of 10 and space consumption per compute node by a factor of 100. Now actually, that'll put your power consumption per square metre _up_ if you exploit the opportunity fully, but it clearly has the potential to make a big difference in datacentres. Current hardware is not suitable for all workloads, but more capable stuff will be along soon enough.

So, my advice if you want to green-up your datacentre is start looking to see if you can use any of this kit.

There is a big Jevons effect here though - people will just do more computing on all that kit now they suddenly have some spare power budget.

(disclaimer: I have recently become employed by ARM, but that doesn't change my opinions, it just means I can't tell you the _really_ cool stuff yet)

We have a chunky datacentre here, but can't really use the waste heat from it because the building needs cooling nearly all the time, and very little heating. CHP to local housing (there is some right next door) would be an interesting idea for using that heat. ARM are looking at all this stuff for their recently-acquired building refurb. I find it interesting the way that the corporate situation is not at all like the domestic one, and quite different solutions are appropriate.
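
A minimal sketch of the density trade-off wookey describes, with assumed per-node and per-square-metre figures (these are illustrative assumptions, not Calxeda or ARM specifications):

    #include <stdio.h>

    int main(void)
    {
        /* Assumed, illustrative figures - not vendor specifications */
        double conv_node_w   = 250.0;  /* conventional compute node            */
        double arm_node_w    = 25.0;   /* ~10x lower power per node            */
        double conv_nodes_m2 = 10.0;   /* conventional nodes per square metre  */
        double arm_nodes_m2  = 1000.0; /* ~100x less floor space per node      */

        double conv_w_m2 = conv_node_w * conv_nodes_m2;
        double arm_w_m2  = arm_node_w  * arm_nodes_m2;

        printf("Conventional : %.0f W/m^2\n", conv_w_m2);
        printf("ARM, packed  : %.0f W/m^2\n", arm_w_m2);
        printf("Power per node falls 10x, but power per m^2 rises %.0fx\n",
               arm_w_m2 / conv_w_m2);
        return 0;
    }
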
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107283#Comment_107283 Fri, 08 Apr 2011 19:59:16 +0100 DamonHD
Lots and lots of scope for improvement.

Rgds

Damon
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107294#Comment_107294 Sat, 09 Apr 2011 00:36:49 +0100 Jeff Norton (NZ)
The company I currently work for use Mitsubishi VRV systems which have BC controllers to manage the refrigerant; the waste heat can be used to heat other areas or hot water cylinders.
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107324#Comment_107324 Sat, 09 Apr 2011 11:26:50 +0100 pmcc
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107327#Comment_107327 Sat, 09 Apr 2011 12:01:04 +0100 DamonHD
Rgds

Damon
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107411#Comment_107411 Sun, 10 Apr 2011 22:39:32 +0100 wookey
If your stuff is running on Linux servers then there are minimal porting issues. Debian has been available on ARM since 2000, for example, and most code will either 'just work' or need a rebuild for ARM. Most of the things that made moving code to ARM difficult have been consigned to history (i.e. the unsigned char default, different behaviour of unaligned loads, lack of FP).

Obviously any proprietary code you use can be a problem if the suppliers don't make a suitable ARM build. ('Don't use proprietary code' would be my advice :-)

Microsoft announced ARM support for Windows in November, which does mean people will be able to use Windows on their ARM kit soon. (Do people still use Windows in datacentres?)
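
As a minimal illustration of the 'unsigned char default' pitfall wookey mentions: on the ARM ABI a plain char is unsigned by default, whereas on x86 it is usually signed, so code that stores -1 in a plain char behaves differently on the two architectures. This is a generic example, not anyone's production code:

    #include <stdio.h>

    int main(void)
    {
        char c = -1;  /* plain char: typically signed on x86, unsigned on ARM */

        if (c == -1)
            printf("plain char holds -1 here (signed-char platform, e.g. x86)\n");
        else
            printf("plain char became %d here (unsigned-char platform, e.g. ARM)\n", (int)c);

        /* The portable fix is simply to say what you mean: */
        signed char sc = -1;
        printf("signed char is %d on every architecture\n", (int)sc);
        return 0;
    }
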
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107422#Comment_107422 Mon, 11 Apr 2011 08:28:46 +0100 DamonHD
Note also on the Linux/ARM issue: I agree with most of what you say, but if you want to run (say) .Net/mono or Java (as I do) then you have to find a non-stock runtime, and indeed for any other 3rd-party stuff you have to find a distro that builds what you need, unless you want to spend your life rebuilding and debugging. For example, Ubuntu abandoned the ARMv5 architecture used in the SheevaPlug, meaning that I'd now have to change distros to do anything beyond a kernel upgrade.

So ARM remains slightly more painful/expensive from that point of view even for code like mine which is completely architecture neutral. (Shell scripts and Java.)

Rgds

Damon
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107823#Comment_107823 Sat, 16 Apr 2011 17:17:52 +0100 wookey
I haven't actually tried cross-grading a SheevaPlug, but it should work, no doubt with some tiresome fiddling about. Or you can just install from scratch painlessly: http://www.cyrius.com/debian/kirkwood/sheevaplug/plugs.html
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107960#Comment_107960 Tue, 19 Apr 2011 11:00:37 +0100 mybarnconversion
Simple solution, dunk them:
http://feedproxy.google.com/~r/treehuggersite/~3/GKmuMwQmbSc/9-percent-data-center-cooling-energy-reduction-fluid-submerged-servers-mineral-oil.php

;) ... must be tricky to fish them out when only a hard-reboot will do ...
Data Centre Cooling http://www.greenbuildingforum.co.uk/newforum/comments.php?DiscussionID=6386&Focus=107974#Comment_107974 Tue, 19 Apr 2011 12:53:34 +0100 djh Posted By: mybarnconversion: Simple solution, dunk them ... must be tricky to fish them out when only a hard-reboot will do
That's a great system. :peace:

No need to fish them out. Is it Google or Sun that puts servers in shipping containers? They never open them in service, just swap the whole container out when too many of the servers have failed. So it should be ideal to fill the container with oil.