Data Centre Cooling is More Important than You’d Think

Hot weather can send data centres into meltdown. And when temperatures soar during heatwaves, data centre cooling is a major issue.


At long last, after all the storms, all the floods, all the hailstones – summer is coming.

Here in the UK, the faintest little whiff of sunshine tends to make us all a bit… funny. Scores of us end up flocking to the beach or park – and picking up a bag of charcoal becomes an impossible task. Summer doesn’t last long around here, so we’ll take all the hot days we can get, thank you very much!

But while we do love the sun, we also love to moan about how hot it is. Specifically, us here at deeserve! Why? Because hot weather can send data centres into meltdown. And when temperatures soar during heatwaves, data centre cooling is a major issue.

We all know that computers get hot when they’re working hard – which is why you’ll hear the fans spinning up in your PC during intense work, or why your phone can get hot when you’re pushing it with games or camerawork.

In this post, we explain why consumer-grade cooling systems simply aren’t enough for a data centre, and why data centre cooling is so important in maintaining service levels for Cloud-connected businesses.

Why good data centre cooling is essential

So, we all know our computers and devices get hot under heavy workloads. But data centres are on a whole different level.

The always-on demand for instant data transfer and access means that the machines in data centres never switch off, and are frequently maxing out their performance. This uses huge amounts of power. And a byproduct of all this power used to crunch data is heat. Lots of heat:

“A typical large data centre can generate 20 to 50 MW of heat, Hudson said, while a data centre campus can generate up to 300 MW — enough to power a mid-sized city.” – ACHR News

That’s more than a couple of fans and heatsinks can handle. For heat on this level, you need an industrial-scale cooling system.

That’s because the hotter components like CPUs and GPUs get, the more they have to throttle their performance to avoid failure – and even fire. This means the businesses with hardware installed can’t get the performance they need, often at critical times.
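The throttling behaviour described above can be sketched roughly like this (a simplified illustration only, not any vendor’s actual firmware – the temperature thresholds and clock speeds are made-up values):

```python
# Simplified illustration of CPU thermal throttling: as the chip
# approaches its temperature limit, the clock speed is stepped down
# to reduce heat output. All thresholds and frequencies are invented.
def throttled_clock_mhz(temp_c, base_mhz=3600):
    if temp_c < 80:      # comfortably cool: full speed
        return base_mhz
    if temp_c < 90:      # warming up: back off a little
        return int(base_mhz * 0.75)
    if temp_c < 100:     # near the limit: heavy throttling
        return int(base_mhz * 0.5)
    return 0             # emergency shutdown to avoid damage

print(throttled_clock_mhz(65))   # → 3600 (full speed)
print(throttled_clock_mhz(95))   # → 1800 (heavily throttled)
```

The point is the cliff edge: once cooling can’t keep up, performance drops sharply – which is exactly what businesses renting the hardware can’t afford.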

So, cooling is absolutely essential for performance. But smart cooling implementation also leads to some other measurable benefits:

Server uptime and efficiency gains

Proper data centre cooling allows servers to stay online for longer and operate at peak performance – while ensuring active cooling is prioritised for hotspots, not just applied globally.

Longer-lasting hardware

Letting hardware overheat regularly causes it to fail – that’s just a fact of life. Not only does that impact uptime, it also impacts hardware expenditure and e-waste production.

Now – all that heat generated has to go somewhere. One day, it could be reclaimed and reused with heat pumps, but that’s a story for another day. For now, let’s look at how all this heat gets dealt with currently.

How data centres are cooled

Data centre cooling is a science and an art. It takes diligent monitoring, and may employ several different techniques to get the desired result. None of these methods are cheap to engineer – but some are more energy efficient than others.

Liquid cooling

Liquid cooling systems use water (or another coolant) to cool the servers. Heat is transferred to the liquid, which carries it away from the components to be dissipated elsewhere. Water has a far higher heat capacity than air and conducts heat much better, so the cooling effect can be more noticeable; but the same property works in the other direction – the warmed coolant holds onto its heat, so shedding it quickly can be difficult. Plus, we all know what water does to electronics!

Air cooling

Air conditioners designed for server rooms move warm air and humidity out, and use refrigerant coils to cool the air being pumped back in. This is effective for whole-room cooling, and helps maintain a temperature-controlled ecosystem within the data centre. Running them 24/7 is super expensive, though, and they need regular maintenance to work at their best.

Raised floors

A raised floor has several benefits – but for cooling, it creates a chilled space below the platform, where heat can be transferred away via chilled water pipework or server room air conditioners.


All the hot air has to go somewhere, so HVAC systems are pretty much essential across the board.

Hot and cold aisle arrangement

Simply designing the data centre around this layout can help reduce cooling costs and maintain performance. It works by alternating hot and cold aisles – server intakes draw from the cold aisle and exhaust into the hot aisle – so with good airflow, heat can be exchanged more efficiently. This can’t work everywhere, but purpose-built data centres, like the GS2 data centre in London, have maximised the performance gains of built-in cooling.

Of course there are loads of other ways to cool a data centre – like building it in the frigid glacial temperatures of Iceland, or sinking it way down into the ocean – but these are some of the most common methods.

Cooling at Verne (Volta) Data Centre, London

We operate rack space and Private Clouds at Verne Data Centre in London (formerly Volta) – which uses an innovative row-based cooling system. In-row cooling is not new tech by any means, but it is certainly uncommon; in fact, of all the data centres we’ve been to, this is the only one we’ve seen that uses it. Verne’s cooling was installed back in 2014, but even now, it is still one of the most effective and efficient systems in the industry.

Row-based cooling is a modular system that closely couples coolers to servers. At Verne, each cooler is no more than 4 metres from any server.

This allows for precise cooling control with less wastage, because each cooler is fed with the warmest possible air directly from the servers. Coolers can be coupled to the densest areas, with cooling priority cascading down from there. The hot air is cooled, then supplied directly back to the cold aisle – ensuring the racks always consume cool air.

The precision and efficiency boost means that each cooler can deliver up to 60kW of cooling using just 75 watts of power.
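To put that figure in context, here’s the arithmetic (assuming the 75 watts refers to the electrical power drawn by one in-row cooler, as the figures above suggest):

```python
# Ratio of cooling delivered to power consumed for one in-row cooler,
# using the figures quoted above: 60 kW of cooling from 75 W of power.
cooling_w = 60_000    # 60 kW of heat removed
fan_power_w = 75      # electrical power drawn by the cooler

ratio = cooling_w / fan_power_w
print(f"{ratio:.0f} watts of heat removed per watt consumed")  # → 800
```

In other words, each watt the cooler draws moves 800 watts of server heat – which is why tightly coupled cooling is so much cheaper to run than brute-force whole-room air conditioning.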

It’s smart, too. Each cooler continually monitors the whole cooling system – so if one module fails, its neighbours automatically adjust their capacity, to protect the servers and allow continuous operation.

The row-based cooling system has resilience of N+1 per unit, and uses 42% less fan power than the industry standard (CRAC units). It’s more reliable and more efficient than other systems, and being modular, it can easily and quickly be swapped out and repaired.
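The failover behaviour – neighbours picking up a failed unit’s load – can be sketched as a toy model (this is not Verne’s actual control logic; the unit count and loads are illustrative):

```python
# Toy model of N+1 in-row cooler failover: when one unit fails, its
# cooling load is shared evenly across the remaining healthy units,
# which run at higher capacity until the failed module is swapped out.
def redistribute_load(loads_kw, failed_index):
    healthy = [load for i, load in enumerate(loads_kw) if i != failed_index]
    extra_each = loads_kw[failed_index] / len(healthy)
    return [load + extra_each for load in healthy]

# Four coolers each handling 45 kW; unit 2 fails.
print(redistribute_load([45, 45, 45, 45], 2))  # → [60.0, 60.0, 60.0]
```

As long as the surviving units have headroom (the “+1” in N+1), the servers never notice the failure.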

View Verne Data Centre technical specifications

Volta/Verne’s state-of-the-art row-based cooling

Backup and redundancy is important, too. Story time…

We work with a lot of data centres around the country (well, the world actually), and no matter where the data centre is located, cooling is always a top concern.

Not long ago, we had an alert through our IT monitoring platform that the cooling system at a data centre (not Volta/Verne) probably wasn’t working at full capacity – we saw temperatures fluctuate and rise more than normal, and that triggered the alert once they went over our threshold. Thankfully, this data centre had cooling redundancy, and the other units were still operational – so although the data centre floor was warmer than we’d expect, it was still well within operational limits.
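The kind of threshold alert described here is simple in principle – a sketch (the threshold and readings are hypothetical values, not our actual monitoring configuration):

```python
# Minimal temperature-threshold alerting, of the kind an IT monitoring
# platform performs: flag any reading that exceeds the configured limit.
ALERT_THRESHOLD_C = 27.0  # hypothetical alert threshold

def check_readings(readings_c):
    """Return the readings that should trigger an alert."""
    return [t for t in readings_c if t > ALERT_THRESHOLD_C]

# Normal fluctuation, then a rise past the threshold.
print(check_readings([22.5, 23.1, 24.8, 27.6, 28.2]))  # → [27.6, 28.2]
```

Real platforms add sustained-breach windows and escalation on top, but the core idea is the same: catch the trend before it becomes an outage.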

But this is a pretty high-end data centre, so the cooling issue was fixed later that day – and it’s good to know that, even if the worst happens, there’s always some cooling available. Still, this is a good reminder that no data centre is 100% fail-proof; things can and do go wrong. What matters is that it’s monitored, addressed and fixed, while the built-in redundancies save the data centre from entering a full shutdown.

At deeserve, we keep a watchful eye over cooling at data centres – and we choose to work with data centre partners that operate at Tier 3 or higher, to give total peace of mind.

Reliable US and UK Data Centres

As well as GS2, deeserve operates rack space at Verne (formerly Volta) data centre in London, and overseas at Equinix MI6 in Miami. These are among the most reliable data centres in the world – all covered by our diligent IT monitoring team.

Want to know more? Just call us on 01509 80 85 86 or send your message to [email protected].

More services to help

We offer a comprehensive range of IT services to suit all businesses - from "helpdesk"-style IT support to data centre hosting services.

Proactive IT Systems

Established solution partners

  • Microsoft
  • HP
  • WatchGuard
  • Cisco

Want to work with us?

We are driven by creating experiences that deliver results for your business and for your customers.
Or just email us at [email protected]