NREL is a national laboratory operated by the Alliance for Sustainable Energy, LLC for the US Department of Energy. This presentation discusses trends in data center design, including NREL examples of large energy savings. It covers environmental conditions, air management, cooling systems, electrical systems, and data center metrics, and provides guidance on optimizing data center efficiency and reducing costs through best practices in design and operation.
US Trends in Data Centre Design with NREL Examples of Large Energy Savings
1. NREL is a national laboratory of the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, operated by the Alliance for Sustainable Energy, LLC.
US Trends in Data Centre Design with NREL Examples of Large Energy Savings
Understanding and Minimising the Costs of Data Centre Based IT Services Conference, University of Liverpool
Otto Van Geet, PE
June 17, 2013
3. BPG Table of Contents
• Summary
• Background
• Information Technology Systems
• Environmental Conditions
• Air Management
• Cooling Systems
• Electrical Systems
• Other Opportunities for Energy Efficient Design
• Data Center Metrics & Benchmarking
5. Environmental Conditions
Data Center equipment's environmental conditions should fall within the ranges established by ASHRAE as published in the Thermal Guidelines book.
Environmental Specifications (°C), measured at equipment intake (Reference: ASHRAE 2008, 2011):
  Temperature (Data Centers): Recommended 18–27°C; Allowable 15–32°C (A1) to 5–45°C (A4)
  Humidity (RH) (Data Centers): Recommended 5.5°C DP to 60% RH and 15°C DP; Allowable 20–80% RH
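To make these bands concrete, here is a minimal sketch (not from the presentation; the thresholds are copied from the table above and the function name is illustrative) that classifies a measured inlet temperature against the recommended and allowable ranges:

```python
# Illustrative only: classify a server inlet temperature against the ASHRAE
# bands quoted on this slide (recommended 18-27 C; allowable 15-32 C for
# class A1, 5-45 C for class A4). Humidity limits would need dew point too.
ALLOWABLE_C = {"A1": (15.0, 32.0), "A4": (5.0, 45.0)}
RECOMMENDED_C = (18.0, 27.0)

def classify_inlet(temp_c: float, equipment_class: str = "A1") -> str:
    lo_r, hi_r = RECOMMENDED_C
    lo_a, hi_a = ALLOWABLE_C[equipment_class]
    if lo_r <= temp_c <= hi_r:
        return "recommended"
    if lo_a <= temp_c <= hi_a:
        return "allowable (outside recommended)"
    return "out of range"

print(classify_inlet(24.0))        # recommended
print(classify_inlet(30.0, "A1"))  # allowable (outside recommended)
```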
8. Estimated Savings
Baseline system: DX cooling with no economizer; load of 1 ton of cooling, constant year-round; efficiency (COP) of 3; total energy 10,270 kWh/yr.
Results by control zone (hours of operation and cooling energy) for the recommended and allowable envelopes:

  Zone                                          Recommended           Allowable
                                                Hours   Energy (kWh)  Hours   Energy (kWh)
  Zone 1: DX Cooling Only                       25      8             2       1
  Zone 2: Multistage Indirect Evap. + DX (H80)  26      16            4       3
  Zone 3: Multistage Indirect Evap. Only        3       1             0       0
  Zone 4: Evap. Cooler Only                     867     97            510     57
  Zone 5: Evap. Cooler + Outside Air            6,055   417           1,656   99
  Zone 6: Outside Air Only                      994     0             4,079   0
  Zone 7: 100% Outside Air                      790     0             2,509   0
  Total                                         8,760   538           8,760   160
  Estimated % savings                                   95%                   98%
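As a sanity check on the baseline row, the 10,270 kWh/yr figure follows from running 1 ton of cooling continuously at COP 3; a quick sketch of the arithmetic (the 3.517 kW conversion is a standard constant, not from the slide):

```python
# Rough check of the baseline: 1 ton of cooling, year-round, COP 3.
TON_KW = 3.517          # 1 ton of refrigeration, in kW of heat removed
COP = 3.0
HOURS_PER_YEAR = 8760

electrical_kw = TON_KW / COP
annual_kwh = electrical_kw * HOURS_PER_YEAR
print(round(annual_kwh))  # ~10,270 kWh/yr, matching the baseline above
```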
9. Data Center Efficiency Metric
• Power Usage Effectiveness (PUE) is an industry-standard data center efficiency metric.
• It relates the power used or lost by data center facility infrastructure (pumps, lights, fans, conversions, UPS, ...) to the power used by compute.
• Not perfect; some folks play games with it.
• A 2011 survey estimates the industry average is 1.8.
• In a typical data center, half of the power goes to things other than compute capability.

PUE = ("IT power" + "Facility power") / "IT power"
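A minimal sketch of the formula above, using made-up meter readings rather than any real facility's data:

```python
def pue(it_power_kw: float, facility_power_kw: float) -> float:
    """PUE = (IT power + facility power) / IT power."""
    return (it_power_kw + facility_power_kw) / it_power_kw

# Example: 1,000 kW of IT load with 800 kW of infrastructure overhead
# gives the ~1.8 industry-average PUE cited above.
print(pue(1000.0, 800.0))  # 1.8
```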
12. "I am re-using waste heat from my data center on another part of my site and my PUE is 0.8!"
ASHRAE & friends (DOE, EPA, TGG, 7x24, etc.) do not allow reused energy in PUE, and PUE is always > 1.0. Another metric has been developed by The Green Grid: ERE – Energy Reuse Effectiveness.
http://www.thegreengrid.org/en/Global/Content/white-papers/ERE
13. ERE – Adds Energy Reuse
[Diagram: energy flow from the utility through cooling, UPS, and PDU to the IT load, with measurement points (a)–(g) marking rejected energy and reused energy.]
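For reference, the Green Grid white paper linked above defines ERE in the same style as the PUE formula on slide 9, crediting the energy that leaves the facility for reuse:

ERE = ("IT power" + "Facility power" - "Reused energy") / "IT power"

Unlike PUE, ERE can legitimately fall below 1.0 when enough waste heat is reused, which is what the RSF (ERE = 0.9) and ESIF (ERE = 0.7) figures later in this deck report.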
14. DOE/NREL Research Support Facility
• More than 1,300 people in DOE office space on NREL's campus
• 33,445 m2
• Design/build process with required energy goals
  – 50% energy savings from code
  – LEED Platinum
• Replicable
  – Process
  – Technologies
  – Cost
• Site, source, carbon, cost ZEB:B
  – Includes plug loads and datacenter
• Firm fixed price – US $22.8/m2 construction cost (not including $2.5/m2 for PV from PPA/ARRA)
• Opened June 10, 2010 (first phase)
Credit: Haselden Construction
15. RSF Datacenter
• Fully contained hot aisle
  – Custom aisle floor and door seals
  – Ensure equipment is designed for cold aisle containment and installed to pull cold air, not hot air
  – 1.18 annual PUE
  – ERE = 0.9
• Control hot aisle based on a return temperature of ~90°F.
• Waste heat used to heat the building.
• Outside air and evaporative cooling.
• Low fan energy design.
• 176 sq m.
Credit: Marjorie Schott/NREL
18. Move to Liquid Cooling
• Server fans are inefficient and noisy.
  – Liquid doors are an improvement, but we can do better!
• Power densities are rising, making component-level liquid cooling solutions more appropriate.
• Liquid benefits:
  – Thermal stability, reduced component failures.
  – Better waste heat re-use options.
  – Warm water cooling, reduce/eliminate condensation.
  – Provide cooling with higher temperature coolant.
• Eliminate expensive & inefficient chillers.
• Save wasted fan energy and use it for computing.
• Unlock your cores and overclock to increase throughput!
19. Liquid Cooling – Overview
Water and other liquids (dielectrics, glycols and refrigerants) may be used for heat removal.
• Liquids typically use LESS transport energy (14.36 air-to-water horsepower ratio for the example below).
• Liquid-to-liquid heat exchangers have closer approach temperatures than liquid-to-air (coils), yielding increased outside air hours.
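To illustrate why liquids need less transport energy, the sketch below (generic fluid properties at room temperature; not the specific case behind the 14.36 horsepower ratio) compares the volumetric flow of air and water required to carry the same heat load at the same temperature rise:

```python
# Volumetric flow needed to remove Q watts at a given delta-T:
# V = Q / (rho * cp * dT). Property values are round numbers at ~25 C.
def flow_m3_per_s(q_w: float, rho: float, cp: float, dt_k: float) -> float:
    return q_w / (rho * cp * dt_k)

Q = 10_000.0   # 10 kW of IT heat
DT = 12.0      # temperature rise across the equipment, K

air = flow_m3_per_s(Q, rho=1.2, cp=1005.0, dt_k=DT)      # ~0.69 m^3/s
water = flow_m3_per_s(Q, rho=997.0, cp=4180.0, dt_k=DT)  # ~0.0002 m^3/s

print(f"air:   {air:.3f} m^3/s")
print(f"water: {water:.5f} m^3/s  (~{air / water:.0f}x less volume)")
```

Fan and pump power also depend on pressure drop, so the flow ratio is only part of the story, but it shows the scale of the advantage.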
20. 2011 ASHRAE Liquid Cooling Guidelines
NREL ESIF HPC (HP hardware) uses 24°C supply, 40°C return (classes W4/W5).
21. NREL HPC Data Center – Showcase Facility
• 10 MW, 929 m2
• Leverage favorable climate
• Use direct water-to-rack cooling
• DC manager responsible for ALL DC cost, including energy!
• Waste heat captured and used to heat labs & offices.
• World's most energy efficient data center, PUE 1.06!
• Lower CapEx and OpEx.
Leveraged expertise in energy efficient buildings to focus on the showcase data center. "Chips to bricks" approach.
High Performance Computing:
• Operational 1-2013, Petascale+ HPC capability in 8-2013
• 20-year planning horizon
  – 5 to 6 HPC generations.
22. Critical Data Center Specs
• Warm water cooling, 24°C
  – Water is a much better working fluid than air; pumps trump fans.
  – Utilize high quality waste heat, 40°C or warmer.
  – +90% of IT heat load to liquid.
• High power distribution
  – 480 VAC; eliminate conversions.
• Think outside the box
  – Don't be satisfied with an energy efficient data center nestled on a campus surrounded by inefficient laboratory and office buildings.
  – Innovate, integrate, optimize.
Dashboards report instantaneous, seasonal and cumulative PUE values.
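As a sketch of what such a dashboard computes, cumulative PUE over a period is an energy-weighted figure rather than an average of the instantaneous readings; the sample data below is illustrative, not ESIF telemetry:

```python
# Cumulative PUE over a period is total (IT + facility) energy divided by
# total IT energy -- not the average of the instantaneous PUE readings.
samples = [  # (it_kw, facility_kw) at hourly intervals; made-up values
    (900.0, 60.0), (950.0, 55.0), (1000.0, 70.0), (980.0, 40.0),
]

it_kwh = sum(it for it, _ in samples)            # 1 h per sample
facility_kwh = sum(fac for _, fac in samples)
cumulative_pue = (it_kwh + facility_kwh) / it_kwh
print(f"cumulative PUE = {cumulative_pue:.3f}")
```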
23. NREL ESIF Data Center Cross Section
• Data center equivalent of the "visible man"
  – Reveal not just boxes with blinky lights, but the inner workings of the building as well.
  – Tour views into the pump room and mechanical spaces.
  – Color-coded pipes, LCD monitors.
24. Data Center
• 2.5 MW day-one capacity (utility cost $500K/yr/MW)
• 10 MW ultimate capacity
• Petaflop
• No vapor compression for cooling
25. Data Center – Summer Cooling Mode
PUE: typical data center = 1.5–2.0; NREL ESIF = 1.04*
* 30% more energy efficient than your typical "green" data center
26. Data Center – Winter Cooling Mode
ERE – Energy Reuse Effectiveness: how efficiently are we using the waste heat to heat the rest of the building?
NREL ESIF ERE = 0.7 (we use 30% of the waste heat; more with future campus loops)
[Diagram: waste heat routed to the future campus heating loop, high bay heating loop, office heating loop, and conference heating loop.]
27. Data Center – Cooling Strategy
• Water-to-rack cooling for high performance computers handles 90% of the total load.
• Air cooling for legacy equipment handles 10% of the total load.
[Diagram: air loop shown at 95°F and 75°F.]
28. PUE 1.0X – Focus on the "1"
True efficiency requires 3-D optimization across Facility PUE, IT Power Consumption, and Energy Re-use.
• Facility PUE: we all know how to do this!
29. PUE 1.0X – Focus on the "1"
True efficiency requires 3-D optimization across Facility PUE, IT Power Consumption, and Energy Re-use.
• Facility PUE: we all know how to do this!
• IT Power Consumption: increased work per watt; reduce or eliminate fans; component-level heat exchange; newest processors are more efficient.
30. PUE 1.0X – Focus on the "1"
True efficiency requires 3-D optimization across Facility PUE, IT Power Consumption, and Energy Re-use.
• Facility PUE: we all know how to do this!
• IT Power Consumption: increased work per watt; reduce or eliminate fans; component-level heat exchange; newest processors are more efficient.
• Energy Re-use: direct liquid cooling, higher return water temps, holistic view of data center planning.
31. What's Next?
✓ Energy efficient supporting infrastructure.
✓ Pumps, large pipes, high voltage (380 to 480 V) electrical to the rack.
✓ Efficient HPC for the planned workload.
✓ Capture and re-use waste heat.
Can we manage and "optimize" workflows, with a varied job mix, within a given energy "budget"? Can we do this as part of a larger "ecosystem"?
Steve Hammond
32. Other Factors
DemandSMART: Comprehensive demand response. Balancing supply and demand on the electricity grid is difficult and expensive; end users that provide a balancing resource are compensated for the service.
[Chart: annual electricity demand as a percent of available capacity, by season (Winter, Spring, Summer, Fall).]
• 4 MW solar
• Use waste heat
• Better rates, shed load
• DC as part of the campus energy system
33. Parting Thoughts
• Energy Efficient Data Centers – been there, done that
– We know how, let’s just apply best practices.
– Don’t fear H20: Liquid cooling will be increasingly prevalent.
• Metrics will lead us into sustainability
– If you don’t measure/monitor it, you can’t manage it.
– As PUE has done, ERE, Carbon Usage Effectiveness (CUE), etc. will help drive
sustainability.
• Energy Efficient and Sustainable Computing – it’s all about the “1”
– 1.0 or 0.06? Where do we focus? Compute & Energy Reuse.
• Holistic approaches to Energy Management.
– Lots of open research questions.
– Projects may get an energy allocation rather than a node-hour allocation.
35. Water Considerations
"We shouldn't use evaporative cooling, water is scarce."
• Thermoelectric power generation (coal, oil, natural gas and nuclear) consumes about 1.1 gallons per kWh, on average.
• This amounts to about 9.6 M gallons per MW-year.
• We estimate about 2.5 M gallons of water consumed per MW-year for the on-site evaporative cooling towers at NREL.
• If chillers need 0.2 MW per MW of HPC power, then chillers have an impact of 2.375 M gallons per year per MW.
• Actuals will depend on your site, but evaporative cooling doesn't necessarily result in a net increase in water use.
• Low energy use = lower water use. Energy reuse uses NO water!
NREL PIX 00181
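The first two bullets are simple arithmetic; a quick sketch (rounding as on the slide):

```python
# Water embedded in grid electricity, per the slide: ~1.1 gallons per kWh.
GAL_PER_KWH = 1.1
HOURS_PER_YEAR = 8760

per_mw_year = GAL_PER_KWH * 1000 * HOURS_PER_YEAR        # gallons per MW-year
print(f"{per_mw_year / 1e6:.1f} M gallons per MW-year")  # ~9.6 M gallons

# So a site that spends ~2.5 M gallons/MW-year on evaporative cooling towers
# but avoids chiller electricity (and its upstream water) can still come out
# ahead on total water use, which is the slide's point.
```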
36. Data Center Efficiency
• Choices regarding power, packaging, cooling, and energy recovery in data centers drive TCO.
• Why should we care?
  – Carbon footprint.
  – Water usage.
  – Mega$ per MW-year.
  – Cost: OpEx ~ IT CapEx!
• A less efficient data center takes away power and dollars that could otherwise be used for compute capability.
37. Holistic Thinking
• Approach to cooling: air vs. liquid, and where?
  – Components, liquid doors or CRACs, ...
• What is your "ambient" temperature?
  – 55°F, 65°F, 75°F, 85°F, 95°F, 105°F ...
  – 13°C, 18°C, 24°C, 30°C, 35°C, 40.5°C ...
• Electrical distribution:
  – 208 V or 480 V?
• "Waste" heat:
  – How hot? Liquid or air? Throw it away or use it?
38. Liquid Cooling – New Considerations
• Air cooling
  – Humidity
  – Fan failures
  – Air-side economizers, particulates
• Liquid cooling
  – pH & bacteria
  – Dissolved solids
  – Corrosion inhibitors, etc.
• When considering liquid cooled systems, insist that providers adhere to the latest ASHRAE water quality spec, or it could be costly.
40. 2011 ASHRAE Thermal Guidelines
2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance. White paper prepared by ASHRAE Technical Committee TC 9.9.
41. Energy Savings Potential: Economizer Cooling
[Map: energy savings potential for the recommended envelope, Stage 1: Economizer Cooling. Source: Billy Roberts, NREL]
42. Data Center Energy
• Data centers are energy intensive facilities.
  – 10-100x more energy intensive than an office.
  – Server racks well in excess of 30 kW.
  – Power and cooling constraints in existing facilities.
• Data center inefficiency steals power that would otherwise support compute capability.
• Important to have the DC manager responsible for ALL DC cost, including energy!
43. Energy Savings Potential: Economizer + Direct Evaporative Cooling
[Map: energy savings potential for the recommended envelope, Stage 2: Economizer + Direct Evap. Cooling. Source: Billy Roberts, NREL]
44. Energy Savings Potential: Economizer + Direct Evap. + Multistage Indirect Evap. Cooling
[Map: energy savings potential for the recommended envelope, Stage 3: Economizer + Direct Evap. + Multistage Indirect Evap. Cooling. Source: Billy Roberts, NREL]
45. Data Center Energy Efficiency
• ASHRAE 90.1-2011 requires an economizer in most data centers.
• ASHRAE Standard 90.4P, Energy Standard for Data Centers and Telecommunications Buildings
  – PURPOSE: To establish the minimum energy efficiency requirements of data centers and telecommunications buildings for design, construction, and a plan for operation and maintenance.
  – SCOPE: This standard applies to new data centers and telecommunications buildings, new additions, and modifications to such buildings or portions thereof and their systems.
  – Will set a minimum PUE based on climate.
• More detail at: https://www.ashrae.org/news/2013/ashrae-seeks-input-on-revisions-to-data-centers-in-90-1-energy-standard-scope
46. Energy Conservation Measures
1. Reduce the IT load – virtualization & consolidation (up to 80% reduction).
2. Implement contained hot aisle and cold aisle layout.
   – Curtains, equipment configuration, blank panels, cable entrance/exit ports.
3. Install economizer (air or water) and evaporative cooling (direct or indirect).
4. Raise discharge air temperature. Install VFDs on all computer room air conditioning (CRAC) fans (if used) and network the controls.
5. Reuse data center waste heat if possible.
6. Raise the chilled water (if used) set-point.
   – Increasing chilled water temperature by 1°C reduces chiller energy use by about 3% (see the sketch after this list).
7. Install high efficiency equipment including UPS, power supplies, etc.
8. Move chilled water as close to the server as possible (direct liquid cooling).
9. Consider a centralized high efficiency water-cooled chiller plant.
   – Air-cooled = 2.9 COP, water-cooled = 7.8 COP.
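A minimal sketch of the rule of thumb in item 6; the ~3%-per-°C figure is from the list above, while the compounding treatment and the baseline number are illustrative assumptions:

```python
# Rule of thumb from item 6: each 1 C increase in chilled water setpoint
# cuts chiller energy by roughly 3%. Treated here as compounding.
def chiller_savings(annual_kwh: float, setpoint_increase_c: float,
                    pct_per_c: float = 0.03) -> float:
    remaining = annual_kwh * (1.0 - pct_per_c) ** setpoint_increase_c
    return annual_kwh - remaining

baseline_kwh = 500_000.0  # hypothetical annual chiller energy
print(round(chiller_savings(baseline_kwh, 4)))  # ~57,000 kWh for +4 C
```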
47. Equipment Environmental Specification
Air inlet to IT equipment is the important specification to meet. Outlet temperature is not important to IT equipment.
48. Key Nomenclature
• Recommended Range (statement of reliability): preferred facility operation; most values should be within this range.
• Allowable Range (statement of functionality): robustness of equipment; no values should be outside this range.
[Diagram: rack intake temperature scale running from min allowable through min recommended, the recommended range, and max recommended up to the max allowable rack intake temperature, with over-temp and under-temp excursions outside the recommended range.]
49. Improve Air Management
• Typically, more air is circulated than required.
• Air mixing and short circuiting lead to:
  – Low supply temperature
  – Low Delta T
• Use hot and cold aisles.
• Improve isolation of hot and cold aisles.
  – Reduce fan energy (see the sketch below)
  – Improve air-conditioning efficiency
  – Increase cooling capacity
Hot aisle/cold aisle configuration decreases mixing of intake & exhaust air, promoting efficiency.
Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf
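One reason better isolation cuts fan energy: variable-speed fans follow the affinity laws, so any reduction in required airflow cuts power roughly with the cube of flow. A small illustrative sketch (the fan size and flow fractions are assumptions):

```python
# Fan affinity law sketch: fan power scales roughly with the cube of airflow,
# so eliminating excess circulation pays off quickly. Values are illustrative.
def fan_power_kw(flow_fraction: float, rated_kw: float = 20.0) -> float:
    return rated_kw * flow_fraction ** 3

for frac in (1.0, 0.8, 0.6):
    print(f"{frac:.0%} flow -> {fan_power_kw(frac):.1f} kW")
# 80% flow needs ~51% of rated fan power; 60% flow needs ~22%.
```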
50. Isolate Cold and Hot Aisles
[Diagram: with isolation, supply air can be 70-80°F instead of 45-55°F, and return air 95-105°F instead of 60-70°F.]
Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf
51. Adding Air Curtains for Hot/Cold Isolation
Photo used with permission from the National Snow and Ice Data Center.
http://www.nrel.gov/docs/fy12osti/53939.pdf
53. Three (3) Cooling Device Categories
[Diagram: IT equipment racks showing rack containment, row containment, and passive door coolers, with cooling water connections and server fronts marked.]
1 – Rack Cooler
  • APC – water
  • Knürr (CoolTherm) – water
  • Knürr (CoolLoop) – water
  • Rittal – water
2 – Row Cooler
  • APC (2*) – water
  • Liebert – refrigerant
3 – Passive Door Cooler
  • IBM – water
  • Vette/Coolcentric – water
  • Liebert – refrigerant
  • SUN – refrigerant
Courtesy of Henry Coles, Lawrence Berkeley National Laboratory
54. "Chill-off 2" Evaluation of Close-coupled Cooling Solutions
[Chart: comparative energy use of the evaluated cooling solutions (lower = less energy use).]
Courtesy of Geoffrey Bell and Henry Coles, Lawrence Berkeley National Laboratory
55. Cooling Takeaways
• Use a central plant (e.g., chiller/CRAHs) vs. CRAC units.
• Use centralized controls on CRAC/CRAH units to prevent simultaneous humidifying and dehumidifying.
• Move to liquid cooling (room, row, rack, chip).
• Consider VSDs on fans, pumps, chillers, and towers.
• Use air- or water-side free cooling.
• Expand the humidity range and improve humidity control (or disconnect it).