Categories: Sanger Science | 28 May 2019

We’ve got the power

Running one of the largest biosciences data centres in Europe as efficiently as possible requires cool heads and even cooler machines


Sanger Institute's data centre. Image credit: Wellcome Sanger Institute, Genome Research Limited

Walking into the Sanger Institute’s Morgan building on the Wellcome Genome Campus, you’d be forgiven for not realising that this unassuming building encloses a huge data centre, almost hidden towards the back. What looks at first like an office building is home to four 250 m² halls rammed full of processors. A tall glass wall shows off one of the quadrants and its machines, all blinking green and blue lights.

In total, there are over 35,000 computer cores across the four halls, mounted in 400 racks, with hundreds of kilometres of fibre cable connecting everything. It is one of the largest biosciences data centres in Europe.

The data centre has grown alongside the scale of our science – we produce vast amounts of DNA and RNA sequence data that has to be analysed and stored. Efficiently managing the hardware, maintaining backup systems and monitoring the health of the data centre are essential tasks.

Simon Binley, Data Centre Manager at the Wellcome Sanger Institute

The data centre consumes 4 megawatts (MW) of power. To save on electricity costs and to reduce CO2 emissions, we operate a combined cooling, heating and power plant. This generates electricity and captures the waste heat, which is then used to heat the buildings that need it, or converted into energy for the cooling systems in the data centre.

Simon Binley, Data Centre Manager at the Wellcome Sanger Institute, writes about our data infrastructure and how he is working to constantly improve its efficiency. His team have recently installed a new system that will enable even further energy savings. Simon has just been awarded ‘Data Centre Manager of the Year’ at the Data Centre Solutions Awards.


By Simon Binley, Data Centre Manager at the Wellcome Sanger Institute

Data is at our very core

At the core of our technical infrastructure is a fleet of DNA sequencing machines which generate vast quantities of data. Our sequencing facility is one of the largest in the world.

That DNA sequence data is analysed on site, in our data centre; it’s the core of our work here. Our scientists and technicians study the data to provide insights into health and disease, evolution and the fundamental biology of life.

Maintaining the Sanger Institute data centre. Image credit: Wellcome Sanger Institute, Genome Research Limited

Every year, the sequencers generate data more quickly as the technology improves. We are sequencing more DNA, from more people, more organisms and more individual cells, than ever before. Data from DNA sequencing is set to soon become the biggest source of data on the planet, and Information Technology (IT) has to keep pace with that demand.

It’s hard to predict exactly what level of data centre capacity we will need over the next five to ten years, but all the trends point towards us needing greater IT capability; the more processing power you have, the greater the scientific capability.

Efficiency

Efficiency of operation is one of my key goals for the data centre. Simply put, any money saved on IT, in terms of either capital expenditure or operating costs, means more funding is available for key scientific research.

As part of our ongoing demand for greater processing power, we recently opened a fourth data hall within our data centre facility. With more than 400 racks consuming 4MW of power (the equivalent of powering 2,000 to 2,500 homes), we now house one of the largest charitable data centres in Europe.

Managing our hardware

As part of this upgrade, we worked with Efficiency IT, a specialist in data centre design and build, to install EcoStruxure IT Expert, a cloud-based data centre infrastructure management software platform from Schneider Electric. It provides management insight into the operation of all key infrastructure assets in the data centre, including racks, power distribution units (PDUs), uninterruptible power supplies (UPSs) and cooling equipment.

Previously, we had no unified management platform for our hardware assets; monitoring systems had grown organically as the data centre grew. It was perhaps natural that, before, the focus was on the IT equipment housed within the data centre and not necessarily on the power flowing to it. The new platform unifies those two important elements of the data centre.

Having the ability to see and manage more than 400 racks of critical IT equipment, and all the components contained within each, gives us a significant advantage and enables a massive improvement in how we operate the data centre, including its efficiency.

Beyond the data centre

As well as improving the operation of the data centre itself, the new systems and processes can also be extended to other assets throughout the Institute and beyond.

Several rooms containing communications equipment are distributed around the campus; they can all be brought under the visibility of the one system, and, importantly, so can the sequencers on which our science depends. Each is supported by an individual UPS that can be networked and brought under the control of EcoStruxure IT. Downtime of more than 12 hours would require the chemicals in a sequencer to be replaced at significant cost, so careful monitoring of the health of a UPS battery can help to avoid unnecessary expenditure.
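As a purely illustrative sketch of the kind of check this makes possible (the reading format, field names and thresholds below are assumptions for the example, not the EcoStruxure IT API):

```python
# Illustrative only: a simple health check over UPS readings of the sort a
# DCIM platform exposes. The reading format and thresholds are assumptions;
# this is not the EcoStruxure IT API.
MIN_RUNTIME_MINUTES = 30      # assumed minimum acceptable battery runtime
MIN_BATTERY_HEALTH = 0.8      # assumed minimum acceptable battery health (0-1)

# Hypothetical readings from two networked sequencer UPSs
ups_readings = [
    {"name": "sequencer-ups-01", "runtime_min": 45, "battery_health": 0.95},
    {"name": "sequencer-ups-02", "runtime_min": 12, "battery_health": 0.70},
]

def needs_attention(ups):
    """Flag a UPS whose battery runtime or health falls below threshold."""
    return (ups["runtime_min"] < MIN_RUNTIME_MINUTES
            or ups["battery_health"] < MIN_BATTERY_HEALTH)

for ups in ups_readings:
    if needs_attention(ups):
        # In practice this would raise an alert so the battery can be replaced
        # before an outage risks a long, costly sequencer stoppage.
        print(f"ALERT: {ups['name']} needs a battery check")
```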

Cooling power

Cooling, as ever, is a major point of focus when addressing operating efficiency. The payback for installing the new system and adopting our new processes will be seen through reduced operating costs, especially reduced energy consumption.

From a power and energy perspective, we expect to save between 5 and 10 per cent of power over the first two years in the data centre itself. We can do that by raising the room temperature in the data halls, currently set at 19°C, to 21°C. For a 4MW facility, the reduced cooling effort represents a significant cost saving, and as the platform matures we would look to increase that even further.
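As a rough check on those figures (a sketch only, assuming the 4MW facility load quoted above runs continuously all year), the saving works out at a few gigawatt-hours annually:

```python
# Back-of-the-envelope estimate of the expected saving, assuming the 4 MW
# facility load quoted above runs continuously, year-round.
FACILITY_LOAD_MW = 4.0        # total data centre load (from the article)
HOURS_PER_YEAR = 8760         # assumes continuous operation all year

for saving_fraction in (0.05, 0.10):   # the 5% and 10% targets above
    saved_mw = FACILITY_LOAD_MW * saving_fraction
    saved_mwh_per_year = saved_mw * HOURS_PER_YEAR
    print(f"{saving_fraction:.0%} saving ~ {saved_mw:.1f} MW "
          f"~ {saved_mwh_per_year:,.0f} MWh per year")
```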

We have ambitious targets for improving the Power Usage Effectiveness (PUE) rating of the data centre from between 1.6 and 1.8, where it is now, to between 1.4 and 1.6. Any improvement in PUE automatically reduces electrical spend and allows us to invest in other areas.
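PUE is the ratio of total facility power to the power delivered to the IT equipment, so a lower value means less overhead going on cooling and power distribution. A minimal sketch of what the target could mean, assuming the 4MW figure above and mid-range values for the current and target PUE:

```python
# PUE = total facility power / IT equipment power, so for a fixed IT load a
# lower PUE means less power spent on cooling and distribution overhead.
# Illustrative figures: the 4 MW total and the PUE ranges come from the
# article; the mid-range values chosen here are assumptions.
total_power_mw = 4.0
current_pue = 1.7               # mid-point of the current 1.6-1.8 range
target_pue = 1.5                # mid-point of the 1.4-1.6 target range

it_load_mw = total_power_mw / current_pue    # ~2.35 MW of IT equipment
new_total_mw = it_load_mw * target_pue       # ~3.53 MW at the target PUE

print(f"IT load: {it_load_mw:.2f} MW")
print(f"Total at target PUE: {new_total_mw:.2f} MW "
      f"(saving ~{total_power_mw - new_total_mw:.2f} MW)")
```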

We’re all part of a really exciting scientific endeavour which requires us all to be at the top of our game. I’m keen to keep us at the forefront of data centre management and efficiency; it means there is more resource available for more equipment and the science at the heart of our mission.


Image credit: Wellcome Sanger Institute, Genome Research Limited
