How Lenovo is Pursuing Sustainable Supercomputers

The Leibniz Supercomputing Centre (LRZ) in Munich, Germany, contains no ordinary supercomputer. Sure, it has thousands of servers, or nodes, stacked in rows in a windowless vault, with technicians working diligently on huge data-crunching conundrums for research organisations, running simulations to better predict natural disasters such as tsunamis and earthquakes. But it is eerily quiet. Almost too quiet.

The familiar whir of hot air being whooshed away by power-hungry computers is almost entirely absent. Where are all the fans? Almost all gone, as it turns out.

The LRZ SuperMUC NG, which uses massive arrays of Lenovo’s ThinkSystem SD650 servers, requires almost no fans at all – just those for cooling the power supply units and the in-row chillers on every eighth row.

As a result, “the ambient noise in the datacentre is now lower than in a typical office space,” notes Rick Koopman, EMEA Technical Leader for High-Performance Computing at Lenovo.


Even with the fans gone, Lenovo has kept LRZ running all this time while delivering energy reductions of 40 per cent, greatly lowering the centre’s electricity bill and environmental impact at the same time. “We wanted to optimise what we put into a supercomputer and what comes out of it from an efficiency perspective,” continues Koopman.

The secret? A focus on sustainability and using warm water to cool the datacentre.

A Green Giant

“By having this emphasis on sustainability and a reduction in the carbon footprint for their large general-purpose supercomputer, they now have a very efficient system and the SuperMUC NG is just one example,” explains Koopman.

When the company first began working on SuperMUC at LRZ in 2012, typical HPC compute nodes used processors requiring 100 to 120W (watts) of power per processor. That figure is now typically over 200W and will increase to over 300W by 2021. With more wattage comes greater heat, which ultimately needs to be removed: if these processors are not kept within their optimal operational temperature range of under 80 degrees Celsius, the silicon in the chips begins to break down.

“If you have one server with two 300W processors, four accelerators using up to 500W each with additional memory, drives and network adapters, you’d be looking at more than 3000W per server. There are 36 of these servers in one standard 19-inch 42-unit rack,” says Koopman.

Comparatively, a typical washing machine requires 500W. One computer rack in this example therefore uses as much power as roughly 216 washing machines all running at the same time.
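As a rough sanity check on those figures, the sums below follow the component counts Koopman quotes; the 400W allowance for memory, drives and network adapters is an assumption made here to land on the “more than 3000W” total, and the 500W washing machine is the article’s own yardstick.

    # Back-of-the-envelope rack power estimate using the figures quoted above.
    cpu_w = 2 * 300                       # two 300W processors per server
    accel_w = 4 * 500                     # four accelerators at up to 500W each
    other_w = 400                         # assumed allowance for memory, drives, network adapters
    server_w = cpu_w + accel_w + other_w  # roughly 3,000W per server

    servers_per_rack = 36                 # one standard 19-inch 42-unit rack
    rack_w = servers_per_rack * server_w  # roughly 108,000W per rack

    washing_machine_w = 500               # the article's comparison point
    print(f"Per server: {server_w} W")
    print(f"Per rack: {rack_w} W, about {rack_w / washing_machine_w:.0f} washing machines")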

Enter Warm Water Cooling

“The old way was chilling the datacentre room by using fans to blow the hot air away. Hence all the noise. But air cooling is far from efficient for current and future HPC solutions, and because they use increasingly dense arrays of hardware, it is even less workable,” adds Koopman.

This is where the concept of warm water cooling comes in – the idea of circulating water that feels warm to us but, at 45 to 50 degrees Celsius, is still cooler than processors running at peak performance. In this way, LRZ is able to remove approximately 90 per cent of heat energy from the SD650 nodes, cleanly and quietly.

Kilogram for kilogram, water stores roughly four times more heat energy than air for the same temperature rise, and the water supply can be put in direct contact with all the elements that need to be cooled, making the process much more targeted. “The heat transfer to water is just much more efficient,” concludes Koopman.
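That “four times” figure comes from the specific heat capacities of the two coolants. The short sketch below uses standard textbook values (about 4.2 kJ/kg·K for water versus roughly 1.0 kJ/kg·K for air); it illustrates the physics rather than modelling the SD650’s actual cooling loop.

    # Heat absorbed by a coolant for a given temperature rise: Q = m * c * delta_T
    C_WATER = 4186.0   # specific heat capacity of water, J/(kg*K)
    C_AIR = 1005.0     # specific heat capacity of air, J/(kg*K)

    def heat_absorbed_kj(mass_kg: float, c_j_per_kg_k: float, delta_t_k: float) -> float:
        """Energy in kilojoules absorbed when mass_kg of coolant warms by delta_t_k kelvin."""
        return mass_kg * c_j_per_kg_k * delta_t_k / 1000.0

    q_water = heat_absorbed_kj(1.0, C_WATER, 10.0)  # 1 kg of water warming by 10 K
    q_air = heat_absorbed_kj(1.0, C_AIR, 10.0)      # 1 kg of air warming by 10 K

    print(f"Water: {q_water:.1f} kJ, air: {q_air:.1f} kJ, ratio: {q_water / q_air:.1f}x")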

Since the water is also contained in a pipe system, it can be re-used. Depending on the location of the datacentre and the outdoor temperature, simply running it through heat exchanger equipment on the roof of the datacentre allows the excess heat from the hardware to radiate away.

On top of the energy savings, reduced environmental impact and lower electricity bill this solution delivers, the warm water can provide heating for a nearby agricultural greenhouse and can even be used as part of the campus heating for facilities such as LRZ.

Cooling Tech

This, however, is just one element of Lenovo’s Neptune liquid cooling technology, which approaches datacentre energy efficiency in three ways: warm water cooling, software optimisation (which has delivered over 10 per cent additional energy savings by throttling hardware when needed) and infrastructure advances.
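The article does not say how the software optimisation is implemented. Purely as an illustration of the idea, a power-aware control loop might lower a node’s frequency cap whenever measured power exceeds a budget and relax it again when there is headroom; the telemetry and control functions below are hypothetical stand-ins, not a real Lenovo or operating-system API.

    import random
    import time

    POWER_BUDGET_W = 2800          # assumed per-node power budget for this illustration
    MIN_MHZ, MAX_MHZ = 1200, 2600  # assumed frequency range
    STEP_MHZ = 100

    def read_node_power_w() -> float:
        """Hypothetical stand-in for real power telemetry; here it just simulates a reading."""
        return random.uniform(2000, 3200)

    def set_freq_cap_mhz(mhz: int) -> None:
        """Hypothetical stand-in for a real frequency-capping interface."""
        print(f"frequency cap set to {mhz} MHz")

    def throttle_loop(iterations: int = 10, interval_s: float = 1.0) -> None:
        cap = MAX_MHZ
        for _ in range(iterations):
            power = read_node_power_w()
            if power > POWER_BUDGET_W:
                cap = max(MIN_MHZ, cap - STEP_MHZ)  # over budget: throttle down
            elif power < 0.9 * POWER_BUDGET_W:
                cap = min(MAX_MHZ, cap + STEP_MHZ)  # comfortable headroom: relax the cap
            set_freq_cap_mhz(cap)
            time.sleep(interval_s)

    if __name__ == "__main__":
        throttle_loop()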

This technology can be applied as a standard solution in many large datacentres, reducing the number of chillers needed to keep them cool. Producing that chilled water takes energy, so needing fewer chillers adds up to real savings at supercomputer scale. And while the solutions are currently being used in datacentres, the technology has the potential for applications elsewhere in IT.

Edited by Jenna Delport

