1. Energy Efficient Server Rooms at the University of Cambridge
David Green
dsg1000@eng.cam.ac.uk
Department of Engineering
2. Presentation Overview
• Electricity Incentivisation Scheme (EIS) at the University of Cambridge
• Design of Engineering’s Data Centre cooling system
• Energy use from 2010 onwards
• Next steps
3. The Electricity Incentivisation Scheme (EIS)
• Financial incentives to use electricity more efficiently
• Annual allowances at departmental level
• Financial reward for using less than the allowance
• Financial penalty for exceeding the allowance
• Implemented 1 August 2008
• Energy & Carbon Reduction Project
In 2010/11, electricity usage was 4.4% below target, saving:
• £0.51 million
• 4,950 MWh
• 2,678 tonnes CO2
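The reward/penalty mechanism can be sketched as a simple settlement calculation. This is an illustrative assumption, not the scheme's actual tariff structure: a symmetric per-MWh rate is applied to the difference between allowance and actual use, and the allowance and rate below are back-calculated from the quoted 2010/11 figures (4,950 MWh saved, ~£0.51 million) rather than taken from the scheme documents.

```python
def eis_settlement(allowance_mwh: float, actual_mwh: float,
                   rate_per_mwh: float) -> float:
    """Illustrative EIS settlement: positive = reward, negative = penalty.

    The symmetric per-MWh rate is an assumption for illustration; the
    presentation does not state the actual scheme tariff.
    """
    return (allowance_mwh - actual_mwh) * rate_per_mwh

# 4,950 MWh was 4.4% of target, implying an allowance of ~112,500 MWh,
# and £0.51m / 4,950 MWh implies an effective rate of roughly £103/MWh.
reward = eis_settlement(allowance_mwh=112_500, actual_mwh=107_550,
                        rate_per_mwh=103)
print(f"£{reward:,.0f}")  # of the order of £0.51 million
```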
4. Department of Engineering Overview
• Accounts for around 10% of the university
• Activities based in seven buildings
• Around 600 members of staff
• Four-year MEng course – around 1,200 students
• Postgraduate student numbers: 792 (2011), 830 (2012)
5. Server Room Cooling Project – Introduction
• The Problem
  • Increase cooling capacity to support future purchases
  • Minimise all aspects of running costs and carbon footprint
• The Solution
  • Review cooling arrangements, expand, and consider options
  • Alternative approach to cooling
• The Results
  • PUE of 1.1
6. The Problem
Background
• Initially a distributed arrangement
• Centralised computing resources in two computer rooms (34 racks, 12 racks)
Pre-2010 Cooling Arrangement
• Refrigerant-based CRAC system, full recirculation via under-floor plenum
• 63 kW plug load
Key Project Drivers
• University Electricity Incentivisation Scheme (EIS)
• Further server purchases planned
• IT electricity consumption is a significant part of the Department’s energy base load
Approach
• KJ Tait feasibility study
• Support from the University’s Estate Management
• Computing staff
• Salix funding
7. The Problem – Server Room Cooling Project
• Drive to reduce energy costs and carbon footprint
• Consolidation of server rooms
• Power management
• Existing DX cooling equipment could not cope with future plans
• To implement a solution in a live data centre
17. The Solution – Installation
• Mechanical cooling plant
• Data Centre (34 racks – 150 kW)
• Electrical supply distribution and metering
18. The Results – Key Points
• System has been operational since December 2010
• IT load has risen from 63 kW to 95 kW
• Mix of low- and medium-density servers
• Updated air filtration and humidity control
• Ambient conditions exceeded 30°C with high RH
• Cold aisle did not exceed 25°C
• Max RH 70%
• PUE of 1.1 over 2½ years
• Annual savings of 200 tonnes of carbon and ~£40K
• Some fan and equipment failures
• Some visible dust
19. The Results – Energy Use 2010
[Chart: June 2010 – kWh used per day, per consumer unit. Rack units roughly 1,500 kWh/day; air-conditioning units roughly 830 kWh/day. IT load ~63 kW; cooling and lighting ~35 kW.]
20. The Results – Energy Use 2011
[Charts: kWh used per day, per consumer unit, June 2010 vs June 2011. June 2010: racks roughly 1,500 kWh/day, air-con roughly 830 kWh/day – PUE of 1.65. June 2011: racks roughly 2,200–2,400 kWh/day, air-con roughly 100–180 kWh/day – PUE of 1.1.]
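The PUE figures follow directly from the metered daily totals: PUE is total facility energy divided by IT energy. A minimal sketch, using representative daily kWh values read from the June charts (rounded, so the results only approximate the quoted 1.65 and 1.1):

```python
def pue(it_kwh: float, overhead_kwh: float) -> float:
    """Power Usage Effectiveness = (IT + overhead energy) / IT energy."""
    return (it_kwh + overhead_kwh) / it_kwh

# Representative daily kWh figures from the June charts:
pue_2010 = pue(it_kwh=1500, overhead_kwh=830)   # DX CRAC era
pue_2011 = pue(it_kwh=2250, overhead_kwh=150)   # evaporative cooling
print(round(pue_2010, 2), round(pue_2011, 2))
```

Note that the 2011 IT load is higher than in 2010 even though total overhead fell: the evaporative system cut cooling energy while the rack load grew.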
21. Design Development 2012 Onwards – Temperature, Humidity & Air Quality Monitoring
• Enhanced filtration and air quality monitoring
• Humidity-limiting control algorithm and web interface
• Fan updates and flow dampers
• Low levels of equipment failure
• Hosting for other university departments
• Fire suppression
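The presentation does not detail the humidity-limiting control algorithm. A hedged sketch of the general idea behind such a control – running the evaporative stage only when doing so would not push relative humidity past a limit – might look like the following. The thresholds mirror the observed limits reported earlier (cold aisle ≤ 25°C, max RH 70%), but the decision logic itself is an illustrative assumption, not the installed algorithm:

```python
def select_cooling_mode(ambient_c: float, ambient_rh: float,
                        rh_limit: float = 70.0,
                        temp_limit_c: float = 25.0) -> str:
    """Hypothetical mode selection for an evaporative cooling unit.

    Illustrative only: the real controller's logic is not published in
    the presentation.
    """
    if ambient_c <= temp_limit_c and ambient_rh < rh_limit:
        return "fresh-air"        # free cooling, no water spray needed
    if ambient_rh < rh_limit:
        return "evaporative"      # spray lowers temperature, raises RH
    return "fresh-air-limited"    # RH too high: fans only, no spray

print(select_cooling_mode(20.0, 50.0))  # cool, dry ambient
print(select_cooling_mode(30.0, 40.0))  # hot, dry ambient
print(select_cooling_mode(30.0, 75.0))  # hot, humid ambient
```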
24. Initial Results – Contamination and Server Failure
• Filtration was initially limited; it is now extensive and multi-staged.
• Some visible dust and black particulates.
• Basic analysis showed the particulates to consist of dust, possibly pollen, and diesel engine exhaust particulates.
• There have been a small number of server fan failures, but these are difficult to attribute directly to the cooling system.
25. The Results – Reliability and Maintenance
• Maintenance was initially not comprehensively scheduled
• Location: a surprising amount of large fibres caught by the insect screen in spring
• Three-monthly maintenance of the equipment is required
• Routine ‘deep’ cleaning of the facility to ISO 7
• With an internal installation, room cleanliness needs to be maintained
26. Visibility of Building Performance – Energy Dashboard
• Visibility of actual building performance
• Digital signage
• Encourages individuals to ‘own’ and take responsibility
• ‘Buy-in’ now apparent in some equipment purchases
• Individual racks are metered
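Since individual racks are metered, per-rack totals for a dashboard can be aggregated along these lines. A minimal sketch: the function name, meter identifiers, and readings are hypothetical, and a real dashboard would read from the metering system rather than an in-memory list:

```python
from collections import defaultdict

def daily_kwh_per_rack(readings):
    """Sum (rack_id, kwh) interval readings into per-rack daily totals."""
    totals = defaultdict(float)
    for rack_id, kwh in readings:
        totals[rack_id] += kwh
    return dict(totals)

# Hypothetical half-hourly meter readings for one day:
readings = [("rack-01", 1.25), ("rack-02", 0.5), ("rack-01", 1.25)]
print(daily_kwh_per_rack(readings))
```

Exposing these totals per rack is what lets 'buy-in' show up in purchasing: the energy cost of each group's equipment becomes directly visible.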
27. Engineering’s Data Centre Electrical Loads
• 300 MWh electrical base load
• Pre-2010, server rooms accounted for 35% of base load
• Now two Data Centres, accounting for 23% of base load
• Purchasing vs energy performance
28. Summary
• Evaporative cooling has resulted in significant energy and carbon savings
• Engineering’s second Data Centre is now also based on this technology
• Interest from the academic and commercial sectors
• A catalyst for good practice in energy and carbon reduction
• Option to use the hot-air exhaust in a natural ventilation strategy – to purge/enhance stack ventilation