At a recent data centre cooling seminar the discussion was all about where the industry would go next once current air-cooling technologies and techniques had exhausted the efficiency gains available over today’s standard DX units. The consensus seemed to be that liquid cooling, particularly liquid immersion cooling, would inevitably reign supreme, as it has two major advantages over other cooling techniques:
- Certain liquids can be up to 4,000 times more effective at removing high heat loads than air
- Liquid delivered to a submerged server can be as hot as the maximum operating temperature allows, reducing the cost of cooling (sometimes eliminating it altogether) and providing a good heat source that can be used with thermocouples or other engineering solutions to generate electricity (see: http://phys.org/news/2013-05-green-conversion-electricity.html)
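That "up to 4,000 times" figure can be roughly sanity-checked by comparing how much heat a unit volume of each fluid carries per degree of warming. A minimal back-of-the-envelope sketch, using typical textbook property values for water and air (illustrative numbers, not from the seminar):

```python
# Rough sanity check of the "up to 4,000x" claim via volumetric heat
# capacity (density * specific heat): the heat a cubic metre of fluid
# carries per kelvin of temperature rise. Values are standard textbook
# figures at roughly room temperature, chosen for illustration only.

rho_air = 1.2          # kg/m^3, air at ~20 C
cp_air = 1005.0        # J/(kg*K), specific heat of air

rho_water = 998.0      # kg/m^3, water at ~20 C
cp_water = 4182.0      # J/(kg*K), specific heat of water

vol_cap_air = rho_air * cp_air        # ~1.2 kJ/(m^3*K)
vol_cap_water = rho_water * cp_water  # ~4.2 MJ/(m^3*K)

ratio = vol_cap_water / vol_cap_air
print(f"Water carries ~{ratio:.0f}x more heat per unit volume than air")
```

This yields a ratio of roughly 3,500, the same order of magnitude as the quoted figure; engineered immersion fluids and real-world airflow limitations shift the exact number, but the scale of the advantage is clear.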
So who’s doing this? Well, it appears that this market is beginning to expand quite rapidly as high-performance computing (HPC) becomes more ubiquitous in industry. Data Center Knowledge report (http://www.datacenterknowledge.com/archives/2013/07/01/the-immersion-data-center/) that CGG have just installed a futuristic-looking data centre in Houston, Texas, that wouldn’t look out of place in a sci-fi movie. Computer circuits are immersed in mineral oil ‘baths’, making the data centre eerily quiet. A small British start-up, Iceotope (http://www.iceotope.com/), have been winning awards for their innovative ‘data centre in a rack’ design, which is now beginning to gain traction in HPC environments.
If you think you’ve seen all this before, well, you’re right: Cray were putting their supercomputers into liquid in the 1980s, and IBM have also been doing it for decades. It largely fell out of favour as data centre managers became averse to the risk of water in the data centre. Today’s technologies are well proven at preventing leaks, and I’ve yet to hear of anyone who has experienced one, though I’m sure there are some examples out there.
Something to think about, anyway, when you design your next data centre: where will you put the pipes, and how will you re-use the heat?