One of our most popular webinar presentations on data center cooling: 2007 Data Center Cooling Study: Comparing Conventional Raised Floors with Close Coupled Cooling Technology.
If you're looking for a solution, it's simple physics: water is roughly 3,500 times more effective at cooling than air. But liquid cooling carries a large stigma, particularly because of the large price tag. And if you're like other data center managers, the words of Jerry Maguire may be ringing in your head: "Show me the money!"
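The "3,500 times" figure comes from comparing the volumetric heat capacity of the two fluids, i.e., how much heat a cubic meter of each can carry per degree of temperature rise. A quick back-of-the-envelope check (the property values are approximate, taken near room temperature):

```python
# Sanity-check the "water is ~3,500x more effective than air" claim by
# comparing volumetric heat capacity (density x specific heat).

# Approximate fluid properties near 20-25 C:
water_density = 998.0   # kg/m^3
water_cp = 4186.0       # J/(kg*K)
air_density = 1.204     # kg/m^3
air_cp = 1005.0         # J/(kg*K)

water_vhc = water_density * water_cp  # ~4.18e6 J/(m^3*K)
air_vhc = air_density * air_cp        # ~1.21e3 J/(m^3*K)

ratio = water_vhc / air_vhc
print(f"Water carries ~{ratio:,.0f}x more heat per unit volume than air")
```

The result lands in the neighborhood of 3,400 to 3,500, which is where the commonly quoted figure comes from.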
To view the recorded webinar presentation, please visit http://www.42u.com/data-center-liquid-cooling-webinar.htm
10. Cooling Options: A Tutorial. [Chart comparing cooling technologies by the rack densities they support, on a 0 to 40 kW scale: Active Air, Supplemental In-Row Solutions, Air Cooled, Rear Door, Close Coupled Cooling, and Chip + Enclosure Cooling.]
25. TCO: Real Estate Cost Analysis. Real Estate Savings: $129,617 to $195,946.
26. TCO: Real Estate Cost Analysis. [Diagram comparing footprints: with conventional cooling at 4 kW per rack, 40 kW requires 10 rack spaces; with 20 kW LCP units, 40 kW fits in a 3-rack-space layout and 40+ kW in a 2-rack-space layout, with 30 to 40 kW per enclosure being common.]
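The footprint comparison on the slide is simple arithmetic: divide the IT load by what each cooling approach supports per rack. The sketch below follows the slide's rack counts, but the square footage and cost-per-square-foot figures are hypothetical placeholders for illustration, not the inputs behind the slide's savings numbers.

```python
import math

def racks_needed(total_load_kw, kw_per_rack):
    """Number of enclosures needed to host a given IT load."""
    return math.ceil(total_load_kw / kw_per_rack)

TOTAL_LOAD_KW = 40
conventional = racks_needed(TOTAL_LOAD_KW, 4)    # ~4 kW/rack with room air cooling
close_coupled = racks_needed(TOTAL_LOAD_KW, 20)  # ~20 kW/rack with an LCP
close_coupled_footprint = close_coupled + 2      # add 2 rack positions for the LCP units

# Hypothetical real estate costs, purely illustrative:
SQFT_PER_RACK_POSITION = 25  # enclosure plus service clearance, assumed
COST_PER_SQFT = 200          # $/sqft of built-out data center space, assumed

saved_positions = conventional - close_coupled_footprint
savings = saved_positions * SQFT_PER_RACK_POSITION * COST_PER_SQFT
print(f"Rack positions: {conventional} conventional vs {close_coupled_footprint} "
      f"close-coupled, saving ~${savings:,.0f} in floor space")
```

With these assumptions the same 40 kW load drops from 10 rack positions to 4, which is the kind of shrinkage the slide's diagram depicts.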
27. TCO: Energy Cost Analysis. Energy savings by facility efficiency rating: Excellent: $69K; Average: $128K; Poor: $302K.
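The spread between "excellent," "average," and "poor" facilities comes down to how much extra power each burns per kilowatt of IT load: the less efficient the room-level cooling, the more a close-coupled retrofit saves. The sketch below shows the shape of that calculation; the IT load, overhead factors, and electricity rate are all assumptions for illustration, not the study's actual inputs.

```python
# Illustrative annual cooling-energy savings comparison. All numeric
# inputs here are assumed for the sketch, not taken from the study.

IT_LOAD_KW = 200
RATE_PER_KWH = 0.10     # $/kWh, assumed
HOURS_PER_YEAR = 8760

# Assumed kW of cooling power burned per kW of IT load:
conventional_overhead = {"excellent": 0.45, "average": 0.60, "poor": 0.90}
close_coupled_overhead = 0.35  # assumed: shorter air path, warmer chilled water

annual_savings = {
    rating: (factor - close_coupled_overhead) * IT_LOAD_KW * HOURS_PER_YEAR * RATE_PER_KWH
    for rating, factor in conventional_overhead.items()
}
for rating, dollars in annual_savings.items():
    print(f"{rating:>9}: ~${dollars:,.0f}/yr saved")
```

Note the pattern matches the slide: the worse the existing facility's efficiency, the larger the savings from moving cooling close to the load.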
30. Cooling: Total Cost of Ownership. [Chiller options diagram: free cooler, low-noise chiller, pump, heat recovery, pump station, buffer tank, emergency water, and building air conditioning.]
35. The Green Impact. The EPA estimates that, at the current data center growth rate, we will need 10 new power plants before 2012 just to support IT growth! http://www.energystar.gov/index.cfm?c=prod_development.server_efficiency_study Close coupled cooling loses less energy moving air through the data center, saving fan energy, and it also reduces the cost of producing chilled water. The result can be a significant reduction in the energy used to cool the IT load, which is where the biggest impact can be seen today.
40. Q&A. To receive a copy of today's presentation: [email_address]
Editor's Notes
Ladies and Gentlemen: Thank you for standing by, and welcome to today's session in the DirectNET Web Seminar Series. Today's presentation is entitled "2007 Data Center Cooling Study: Comparing Conventional Raised Floors with Liquid Cooling Packages." During the presentation, all participants will be in a listen-only mode. However, we encourage your questions or comments at any time through the "chat" feature located at the lower left of your screen. These questions will be addressed as time allows. As a reminder, this Web Seminar is being recorded today, November 14th, 2007, and a recording will be sent to all attendees within 48 hours.
<Jen> Before we get started today, I'd like to introduce our speaker. Joining us from Rittal Corporation is Daniel Kennedy, Rimatrix Solution Center Coordinator (IT Group). A degreed electrical engineer with a concentration in power systems, Daniel brings engineering support to the IT market, having been involved in all aspects of system design, installation, and support of high-density IT deployments. With Rittal for almost 4 years, Daniel brings fresh ideas for the design and implementation of data center facilities. He has provided training to end users, installation contractors, and design consultants in conferences, classrooms, and the field. Daniel has shared this knowledge with many different clients in financial services, Fortune 500 companies, and manufacturing organizations, as well as educational institutions and government agencies. Daniel has been the lead project engineer supporting Rittal's efforts to bring close-coupled liquid cooling systems to North America. Moderating today's conference is Jennifer Osborn, Data Center Infrastructure Consultant for DirectNET. In her 5 years at DirectNET, Jennifer has focused on server management and data center management solution design and implementation, including consultative needs analysis, project engineering, and support for enterprise applications. In that time she has attained several certifications from some of the largest vendors in the industry. Jennifer obtained her Bachelor of Science in Interior Design from Michigan State University. Prior to joining the team at DirectNET, Inc., Jennifer's initial career focus was in the commercial interior design industry, with an emphasis on IT spatial analysis and data center design. This developed into project management in the architecture and design industry. <LISA> Jennifer, I'll turn the conference over to you.
<Jen> Thanks, Lisa. Before we get started, let's quickly go over today's agenda. First, to set the stage for our discussion of liquid cooling, we're going to review how power is changing the landscape of our data centers. As part of that introduction we'll share a few very disturbing statistics, which, unfortunately, AREN'T just stats for many of you joining today's discussion. As we all know, the best way to counter the effects of increasing power and heat loads is to employ a bullet-proof cooling solution. But with so many cooling options out there, how do you know which is right for your organization? Well, we're going to walk you through a quick tutorial of your options and discuss the pros and cons of each, and where each is best applied based on density loads. After that tutorial of the available solutions, we're going to dive into liquid cooling, a solution that has a lot of buzz right now but just as many questions. What exactly is the architecture? How does it compare to conventional cooling solutions? What cost savings can you expect? We'll answer all of those questions. And as you'll learn, liquid cooling definitely changes the landscape by which you've conventionally measured cost and savings. We'll also give a brief introduction to Rittal's Liquid Cooling Package. Finally, we'll go over 8 considerations each data center should make before investing in a cooling solution. As a reminder, everyone attending today's conference will receive a copy of the liquid cooling study, sections of which we'll discuss today. TRANSITION: <Jen> Daniel, before we start talking about liquid cooling, let's start off by acknowledging some of the very extensive changes taking place in the data center environment. Specifically, we're hearing a lot of buzz around the word integration, integration, integration. What exactly does that mean, and why is it critical to today's data center?
<Daniel>….. <Jen> So, when you talk about data center integration, you're talking about the integration of several components: systems, infrastructure, cross-functional teams. There's a people, process, and technology component, correct? <Daniel>…. NO TRANSITION
<Daniel>….When you think about the data center ecosystem, there are many parts that have to work together, including:…….. TRANSITION <Jen> Yes, I would say this layout is very typical of many of the data centers I encounter. Conventionally speaking, it has worked very well. But more and more, I'm finding that these data centers are having problems with climate control.
<Daniel> Exactly, Jen. …. TRANSITION: <Daniel> Let me share a few interesting, if not concerning, stats around the issue of climate control.
<Daniel>……..In its simplest form, what's happening is that servers are more dense and drawing more power than ever before. And that trend is growing. NO TRANSITION
<Daniel>: Likewise, we're seeing a significant increase in the amount of mechanical space. <Jen>: This is an extremely important fact, Daniel. Basically, what you're seeing is that more space has to be allocated to the equipment that runs mission-critical applications, meaning you have to cut back on the IT hardware that can actually DRIVE business, right? <Daniel>…. TRANSITION <Jen>: Those are some rather grim statistics, Daniel, but I wouldn't be surprised if many of the folks joining us today have already felt the pressure of these changes. So, how do they remedy the situation?
<Daniel>: There are several options that can be employed to manage your cooling situation. Because there's not really a one-size-fits-all approach, I think we should spend a little time going through each of these.
<Daniel>:… <Jen>: That's a lot of options, Daniel. If someone attending today needs to know the best approach, how do you recommend they determine the solution that best meets their environment?
<Daniel>:… TRANSITION <Jen>: Because close coupled is probably a new term for folks in our audience, can you show us what you mean by that?
<DANIEL>: “Imagine if we only had to cool the rack, and not the entire room – that’s close-coupled cooling.”
TRANSITION <Jen>: It’s clear how these compare to CRAC units, based on the design, and how they can help limit the actual real estate needed. But one of the questions I frequently get about CRAC units is the issue of redundancy: what happens if a unit fails?
<DANIEL> “Failure depends on how you design it.”
<Daniel>:…. TRANSITION <Jen>: This all makes sense, but I know the question on everyone’s mind is cost. What does a solution like this cost?
TRANSITION <Jen>: This all makes sense. And I think we would all agree that cooling at the server level is far better than cooling at the room level. But I know the question on everyone’s mind is cost. What does a solution like this cost?
TRANSITION <Jen>: The real estate savings make intuitive sense if you’re cooling at the rack level. But what about energy? Energy cost is probably the largest cost driver for conventional cooling. Is this cost lowered with close-coupled cooling, or should we just expect the same types of spending?
<Daniel> In summary, we are talking about two distinct areas of savings: physical real estate and energy. <Daniel> But, Jen, these are just some of the expected cost savings areas. Close-coupled cooling also provides a very interesting savings opportunity: FREE energy. TRANSITION <Jen> FREE energy? I’ve never heard of such a thing. I think you’re going to have to show us what you mean by that one!
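The "free energy" in this exchange refers to free cooling (economizer operation): whenever the outdoor temperature is far enough below the required chilled-water temperature, a dry cooler can reject the heat without running the chiller compressor. The sketch below estimates how many hours a year that might apply; the setpoint, approach temperature, and monthly temperature profile are all illustrative assumptions, not data from the study.

```python
# Rough estimate of annual free-cooling hours. All numeric inputs are
# illustrative assumptions for a hypothetical mild climate.

CHILLED_WATER_SETPOINT_C = 15.0  # close-coupled systems can tolerate warmer water
DRY_COOLER_APPROACH_C = 5.0      # outdoor air must be this much colder than the water

free_cooling_limit = CHILLED_WATER_SETPOINT_C - DRY_COOLER_APPROACH_C  # 10 C

# Hypothetical monthly average outdoor temperatures (C), Jan..Dec:
monthly_avg_c = [2, 4, 8, 12, 17, 22, 25, 24, 20, 14, 8, 4]
HOURS_PER_MONTH = 730  # 8760 h / 12 months

free_hours = sum(HOURS_PER_MONTH for t in monthly_avg_c if t <= free_cooling_limit)
fraction = free_hours / (12 * HOURS_PER_MONTH)
print(f"~{free_hours} h/yr (~{fraction:.0%}) of potential compressor-free cooling")
```

Because close-coupled systems can run on warmer chilled water than room-level CRAC units, the free-cooling threshold rises and more hours of the year qualify, which is the point Daniel makes next.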
TRANSITION <Jen> That sounds great for California, but many of my customers are in the southeastern United States. What results can they expect?
TRANSITION: <Jen> So, close coupled cooling offers savings not only in real estate and energy; depending on outdoor temperatures, there’s also an opportunity for free cooling. I think the next logical question is how to put this in terms our audience can measure. What I mean by that is: how do they know how energy efficient their data center currently is?
TRANSITION <Jen>: For many of my customers, getting below 2.0 would be wonderful. If this is a way to get there, and at a cost savings, it seems like a very viable solution. But these customers are also telling me that any new solution needs to be environmentally friendly.
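The "2.0" Jen mentions is PUE (Power Usage Effectiveness): total facility power divided by the power actually reaching the IT equipment, so 1.0 is the theoretical ideal and anything above it is cooling, power distribution, and lighting overhead. A minimal sketch (the example kW figures are hypothetical):

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power.
    1.0 is the theoretical ideal; many legacy rooms run above 2.0."""
    return total_facility_kw / it_equipment_kw

# Hypothetical example: 500 kW at the utility meter, 240 kW reaching IT gear.
print(f"PUE = {pue(500, 240):.2f}")  # everything above 1.0 is overhead
```

A facility drawing 500 kW to power 240 kW of IT load sits just above 2.0, which is exactly the kind of number close-coupled cooling is pitched at improving.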
TRANSITION <Jen>: Another item of concern is the ‘newness’ of the technology. I think many folks may be apprehensive about being the first to adopt a new technology, particularly one that requires putting water so close to their servers.
TRANSITION <Jen>: That’s a really important point. Liquid cooling isn’t really the ‘uncharted territory’ we may think it is. You’ve overcome some misconceptions, but as with any solution, there have to be drawbacks. What do you consider some of the drawbacks of close-coupled cooling?
TRANSITION <Jen> I think you’ve given us some great information around close-coupled cooling: obtaining free cooling, savings areas we may not have originally considered, and the fact that this is NOT a new technology. For those listening today who are evaluating solutions, what are some of the things you would recommend we consider before purchasing a solution?