Green Computing
Eco Friendly Data Centre
Kunal Sahu 12BCL1034
Abstract
 The green data center has moved from the theoretical to the realistic, with IT leaders
being challenged to construct new data centers (or retrofit existing ones) with
energy-saving features, sustainable materials, and other environmental efficiencies
in mind.
 This project deals with the effects caused by data centers, how severe those
effects are, and how to overcome them. The measures provided are not limited to
construction; they also address other dimensions.
Introduction
In recent years, there has been rapid growth in the Information Technology (IT)
industry in India. Data centers, the key infrastructure component powering this
sector, operate continuously (24x7) throughout the year and are very energy
intensive. These high-tech facilities generally consume many times the energy of a
typical office building, as much as a hundred times more on a square-meter basis.
These facilities are experiencing significant growth in India, making them one of the
fastest growing energy-use sectors and straining electrical supply and distribution.
By one estimate, data centers consumed 0.8% of the world's electricity in 2005, at an
aggregate cost of 7.2 billion USD. These figures are nearly double the estimates for
2000, indicating an annual growth rate of 16%. Although the net consumption of data
centers in India has yet to be quantified, the IT-heavy nature of the country's economy
suggests that India's relative data center energy consumption exceeds the global average.
It’s not “speculation” any more…
 Atmospheric warming is now a broadly accepted trend.
 41% of the Earth’s population (2.3 billion) live in water-stressed areas; 3.5 billion will do
so by 2025.
 Water-stressed areas increasingly include developed regions such as the Western USA, Australia, and SE England.
 There is now increasing pressure on traditional energy resources, especially from rapidly
industrializing nations (China, India).
 Energy consumption is expected to increase by over 129% in parts of Asia by 2020.
 CEO/CFO focus is turning to energy costs & environmental costs (CO2 emissions).
 Major investors increasingly see poor green performance as a source of risk.
 Information technology (IT):
–accounts for 2% of anthropogenic CO2 emissions, roughly equivalent to the aviation industry;
–IT energy usage is projected to double in the next 4 years.
Energy Demands Have Data Centers at a
Tipping Point
 IT energy demand doubling every 9 – 24 months.
 6,500 US data centers consume electricity equal to that of the State of Utah.
 On average, 100 units of generated energy yield only 3 units of productive IT work.
 IT accounts for 2% of anthropogenic CO2 emissions.
 e-waste cannot be ignored – 1 billion computers potential scrap by 2010.
 Enough to fill the Rose Bowl each year.
 IBM investing to lead in power and cooling efficiency!
What is a Green Data Center?
 Diagnose: get the facts to understand your energy use and opportunities for improvement.
 Build: plan, build, and upgrade to energy-efficient data centers.
 Virtualize: implement virtualization and other innovative technologies.
 Cool: use innovative cooling solutions.
 Manage & Measure: seize control with energy management software.
How is energy typically used in the data
center?
[Chart: breakdown of energy use, showing the split between IT load and facility overhead (roughly 55%/45% at the data center level) and, within the server/storage hardware, the share drawn by the processor (roughly 70%/30%).]
Ways for Eco friendly Data Center
Applying a few simple design choices can improve facility efficiency, reduce costs,
and reduce your impact on the environment. Here are the top five best practices:
 Measure PUE
 Manage airflow
 Adjust the thermostat
 Use free cooling
 Optimize power distribution
1. Measure PUE
We can't manage what we don’t measure, so we have to be sure to track the data
center's energy use. The industry uses a ratio called Power Usage Effectiveness (PUE) to
measure and help reduce the energy used for non-computing functions like cooling
and power distribution. To effectively use PUE, it's important to measure often. It
should be sampled at least once per second. It’s even more important to capture
energy data over the entire year, since seasonal weather variations affect PUE.
It’s a measure of how effectively you deliver power and
cooling to the IT equipment.
 A service provider should focus on reducing energy use while serving the explosive
growth of the Internet. Most data centers use almost as much non-computing or
"overhead" energy (such as cooling and power conversion) as they do to power their
servers.
 According to the Uptime Institute's 2014 Data Center Survey, the global average of
respondents' largest data centers is around 1.7.
 Decreasing PUE requires a facility-specific approach, since sites differ in their
power and cooling infrastructure and in their climate. Seasonal weather patterns
also affect PUE values, which tend to be lower during cooler quarters. Some
operators, such as Google, have managed to maintain a low average PUE across their
entire fleet of data center sites around the world, even during hot, humid Atlanta
summers.
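As a minimal sketch, PUE can be computed from two power-meter readings; the figures below are hypothetical, chosen to land on the 1.7 survey average mentioned above.

```python
# Hypothetical sketch of a PUE calculation from two meter readings.
# PUE = total facility power / IT equipment power; 1.0 is the ideal.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness from two power readings in kW."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Example: 850 kW at the utility feed, 500 kW at the IT racks.
print(round(pue(850.0, 500.0), 2))  # 1.7, the survey average cited above
```

In practice the ratio would be sampled frequently and averaged over the whole year, since seasonal weather shifts the cooling overhead in the numerator.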
2. Manage airflow
Good air flow management is crucial to efficient data center operation. Minimize hot
and cold air mixing by using well-designed containment. Then, eliminate hot spots and
be sure to use blanking plates (or flat sheets of metal) for any empty slots in your rack.
A little analysis can have big payoffs. For example, thermal modelling using
computational fluid dynamics (CFD) can help quickly characterize and optimize air
flow for the facility without having to reorganize the computing room.
[Diagram: cooled air flowing into the racks and hot air flowing out of them.]
 One of the simplest ways to save energy in a data center is to raise the
temperature. It's a myth that data centers need to be kept chilly. According to
expert recommendations and most IT equipment manufacturers' specifications,
data center operators can safely raise the cold aisle temperature to 80°F or higher,
significantly reducing facility energy use.
 Another way is to use thermal modelling to locate "hot spots" and better understand
airflow in the data center. In the design phase, physically arrange equipment to
even out temperatures in the facility. Even after that, certain equipment such as
computer room air conditioners (CRACs) can be moved to reduce hot spots and
even out the ambient temperature, ultimately reducing the amount of time the
CRACs must run.
To cut cooling costs and save energy, prevent the "hot aisle" air behind the server racks
from mixing with the "cold aisle" air in front of them. Large data centers use
appropriate ducting and permanent enclosures; simpler measures are well suited to
smaller "closet"-style data centers. For instance:
 Use blanking panels (flat sheets of metal) to close off empty rack slots and prevent
hot aisle air from seeping into the cold aisle.
 Hang plastic curtains (like those used in commercial refrigerators) to seal off the cold
aisle.
 Enclose areas with components that run hotter (such as power supply units or PSUs)
with plastic curtains.
These efforts reduce the total amount of energy used for cooling while ensuring that
the air sent into the cold aisles is truly cool enough to do its job.
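A little instrumentation helps here. As an illustrative sketch (the rack names, supply temperature, and margin below are all assumptions), inlet sensors can flag racks where hot-aisle air is recirculating into the cold aisle:

```python
# Sketch: flag rack "hot spots" from inlet temperature readings.
# The sensor data, supply temperature, and margin are hypothetical.

SUPPLY_C = 22.0        # assumed cold-aisle supply temperature
RECIRC_MARGIN_C = 3.0  # an inlet this far above supply suggests mixing

inlet_temps = {"rack-01": 22.8, "rack-02": 27.1, "rack-03": 23.4}

# Racks whose inlet runs well above supply are likely pulling in hot-aisle air.
hot_spots = {rack: t for rack, t in inlet_temps.items()
             if t - SUPPLY_C > RECIRC_MARGIN_C}
print(hot_spots)  # {'rack-02': 27.1}
```

A rack flagged this way is a candidate for a blanking panel, curtain, or CRAC adjustment before any expensive CFD study.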
3. Adjust the thermostat
It has long been believed that IT equipment needs to run at low temperatures,
between 15°C/60°F and 21°C/70°F. However, the American Society of Heating,
Refrigerating and Air-Conditioning Engineers (ASHRAE) recommends cold aisle
temperatures of up to 27°C/81°F, which have been found to have no detrimental effect
on equipment. Most IT equipment manufacturers spec machines at 32°C/90°F or higher,
so there is plenty of margin. In addition, most CRACs are set to dehumidify the air
down to 40% relative humidity and to reheat air if the return air is too cold. Raising the
temperature and turning off dehumidifying and reheating provides significant energy
savings.
An elevated cold aisle temperature allows CRACs to operate more efficiently at higher
intake temperatures. Also, it allows for more days of “free cooling”—days where
mechanical cooling doesn’t need to run—if the facility has air- or water-side
economization.
The simple act of raising the temperature from 22°C/72°F to 27°C/81°F in a single
200kW networking room could save tens of thousands of dollars annually in energy
costs.
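That claim can be sanity-checked with back-of-the-envelope arithmetic. The cooling fraction, per-degree saving, and electricity price below are assumed rule-of-thumb values, not figures from the text:

```python
# Rough check of the setpoint-raise saving for a 200 kW room.
# All constants except the IT load are illustrative assumptions.

IT_LOAD_KW = 200.0        # networking room load from the text
COOLING_FRACTION = 0.5    # assume cooling draws ~50% of the IT load
SAVING_PER_DEG_C = 0.04   # assumed ~4% chiller energy saved per deg C raised
PRICE_PER_KWH = 0.10      # assumed electricity price in $/kWh

delta_c = 27 - 22                                  # 22C -> 27C setpoint raise
cooling_kw = IT_LOAD_KW * COOLING_FRACTION
saved_kw = cooling_kw * SAVING_PER_DEG_C * delta_c
annual_usd = saved_kw * 8760 * PRICE_PER_KWH       # 8760 hours per year
print(f"~${annual_usd:,.0f} per year")
```

Under these assumptions the saving lands in the "tens of thousands of dollars" range described above.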
4. Use free cooling
Chillers typically use the most energy in a data center's cooling infrastructure, so the
largest opportunity for savings lies in minimizing their use. Take advantage of "free
cooling" to remove heat from the facility without running a chiller. This can include:
 using low temperature ambient air,
 evaporating water, or
 a large thermal reservoir.
While there's more than one way to free cool, water- and air-side economizers are
proven and readily available.
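How much an economizer helps depends on the local climate. The sketch below estimates economizer hours from hourly ambient temperatures; the synthetic temperature series, setpoint, and approach margin are assumptions for illustration:

```python
import math

# Count the hours in a year when outdoor air alone can meet the cold-aisle
# setpoint (minus a margin for fan and coil approach). All inputs are toy data.

def free_cooling_hours(hourly_temps_c, setpoint_c=27.0, approach_c=5.0):
    """Hours where an air-side economizer can carry the load by itself."""
    limit = setpoint_c - approach_c
    return sum(1 for t in hourly_temps_c if t <= limit)

# Synthetic mild climate: a smooth annual swing between 5 C and 30 C.
temps = [17.5 + 12.5 * math.sin(2 * math.pi * h / 8760) for h in range(8760)]
hours = free_cooling_hours(temps)
print(f"{hours} of 8760 hours need no chiller")
```

A cold-climate coastal site would score near the full 8760 hours, which is one reason siting matters so much for free cooling.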
 Cooling with water—not chillers
The electricity that powers a data center ultimately turns into heat. Most data centers
use chillers or air conditioning units to cool things down, requiring 30-70% overhead
in energy usage. Many data centers instead use water as an energy-efficient way to
cool.
 Trap hot air and cool equipment with water
Google has designed custom cooling systems for its server racks, named "Hot Huts"
because they serve as temporary homes for the hot air that leaves the servers, sealing
it away from the rest of the data center floor. Fans on top of each Hot Hut unit pull hot
air from behind the servers through water-cooled coils. The cooled air leaving the Hot
Hut returns to the ambient air in the data center, where the servers can draw it in
again, completing the cycle.
 Take advantage of evaporative cooling: Evaporation is a powerful tool. In our
bodies, it helps us maintain our temperature even when outside temperatures are
warmer than we are. It works similarly in a data center's cooling towers. As hot
water from the data center flows down the towers through a material that speeds
evaporation, some of the water turns to vapor. A fan lifts this vapor, removing the
excess heat in the process, and the tower sends the cooled water back into the
data center.
 Use the natural cooling power of sea water: Evaporating water isn't the only way to
free cool. Some data centers use sea water to cool without chillers. As Google has
done, a data center can be established in a place like Hamina, Finland, chosen for its
cold climate and its location on the Gulf of Finland. The cooling system pumps cold
water from the sea to the facility, transfers heat from operations to the sea water
through a heat exchanger, and then cools this water before returning it to the sea.
This approach can provide all the needed cooling year round, with no mechanical
chillers installed at all.
5. Optimize power distribution
We can minimize power distribution losses by eliminating as many power
conversion steps as possible. For the conversion steps that remain, specify efficient
equipment such as transformers and power distribution units (PDUs). One of the
largest losses in data center power distribution is in the uninterruptible power
supply (UPS), so it's important to select a high-efficiency model. Also, keep high
voltages as close to the power supply as possible to reduce line losses.
Building custom, highly efficient servers: Servers are high-performance computers
that run all the time. They're the core of data centers and should be designed to use
as little energy as possible. This can be done by minimizing power loss and removing
unnecessary parts, and by ensuring that servers draw little energy while waiting for a
task instead of hogging power when there's less computing work to be done.
Optimize the power path: A typical server wastes up to a third of the energy it uses
before any of that energy reaches the parts that do the actual computing. Servers lose
the most energy at the power supply, which converts the AC voltage coming from a
standard outlet to a set of low DC voltages. They then lose more at the voltage
regulator, which further converts the power supply's output to the voltages required
by microchips. Designed with low efficiency standards in order to save on initial cost,
traditional servers end up costing much more in electricity in the long run.
Use efficient power supplies and highly efficient voltage regulator modules to ensure
that most of the power goes to the components that do the actual computing work.
Cut out two of the AC/DC conversion stages by putting back-up batteries directly on
the server racks. This is estimated to save over 500 kWh per server annually, about
25%, compared with a typical system.
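The effect of removing conversion stages compounds multiplicatively, which a few lines make concrete. The stage efficiencies below are illustrative assumptions, not measurements of any particular hardware:

```python
# Losses along the power path multiply: each stage passes on only a fraction
# of its input. Stage efficiencies here are assumed, illustrative values.

def delivered_fraction(stage_efficiencies):
    """Fraction of input power surviving a chain of conversion stages."""
    out = 1.0
    for eff in stage_efficiencies:
        out *= eff
    return out

traditional = delivered_fraction([0.90, 0.80, 0.93])  # UPS, PSU, VRM (assumed)
optimized = delivered_fraction([0.97, 0.94, 0.98])    # high-efficiency parts
print(f"traditional: {traditional:.0%}, optimized: {optimized:.0%}")
```

Under these assumed figures the traditional chain delivers only about two thirds of its input, consistent with the "wastes up to a third" figure above, while trimming each stage recovers most of that loss.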
Conclusion
The primary concern before and after converting to a green data center should be
power management. Most administrators are not comfortable monitoring power
usage because their main concern is performance and uptime, not power
consumption. This practice needs to change: power usage should be measured and
monitored where needed. Even after converting to a green data center, the
enterprise needs to keep monitoring usage to reap the benefits of the transition to
green.
Vendors can also help with power management by selling energy-efficient or
recyclable hardware. As the industry grows, power needs are increasing, and so is
the need to monitor power usage; as those needs increase, more earth-friendly
measures need to be taken to help the environment. Management and
administrators also need to discuss the increasing needs of data centers. The power
scheme currently used in a data center will change with the growth of the industrial
as well as the educational sector.

More Related Content

What's hot (20)

Green Computing
Green ComputingGreen Computing
Green Computing
 
Green computing
Green computingGreen computing
Green computing
 
Green computing
Green computingGreen computing
Green computing
 
IS GREEN COMPUTING GOOD FOR BUSINESS?
IS GREEN COMPUTING GOOD FOR BUSINESS?IS GREEN COMPUTING GOOD FOR BUSINESS?
IS GREEN COMPUTING GOOD FOR BUSINESS?
 
Green computing
Green computingGreen computing
Green computing
 
Green Computing
Green ComputingGreen Computing
Green Computing
 
Green computing 28thdec15_siddharth
Green computing 28thdec15_siddharthGreen computing 28thdec15_siddharth
Green computing 28thdec15_siddharth
 
Pathways to green computing2
Pathways to green computing2Pathways to green computing2
Pathways to green computing2
 
Green Computing
Green ComputingGreen Computing
Green Computing
 
Green computing
Green computingGreen computing
Green computing
 
Green computing ppt
Green computing  pptGreen computing  ppt
Green computing ppt
 
D Archana
D ArchanaD Archana
D Archana
 
Green computing
Green computingGreen computing
Green computing
 
Green computing
Green computingGreen computing
Green computing
 
Green computing ppt
Green computing pptGreen computing ppt
Green computing ppt
 
Green computing 1
Green computing 1Green computing 1
Green computing 1
 
Green computing
Green computingGreen computing
Green computing
 
Green computing ameera
Green computing ameeraGreen computing ameera
Green computing ameera
 
Green computing
Green computingGreen computing
Green computing
 
Final green computing slide by: Anurag.Saxena
Final green computing slide by: Anurag.SaxenaFinal green computing slide by: Anurag.Saxena
Final green computing slide by: Anurag.Saxena
 

Similar to Green computing

Optimizing The Data Centre Environment
Optimizing The Data Centre EnvironmentOptimizing The Data Centre Environment
Optimizing The Data Centre Environmentmixalisg
 
High Efficiency Indirect Air Economizer Based Cooling for Data Centers
High Efficiency Indirect Air Economizer Based Cooling for Data CentersHigh Efficiency Indirect Air Economizer Based Cooling for Data Centers
High Efficiency Indirect Air Economizer Based Cooling for Data CentersSchneider Electric
 
Green Data Centre for banks
Green Data Centre for banksGreen Data Centre for banks
Green Data Centre for banksPritam Raha Roy
 
Green data center_rahul ppt
Green data center_rahul pptGreen data center_rahul ppt
Green data center_rahul pptRAHUL KAUSHAL
 
In the Data Center: Five Keys to Achieving Ultra-Low PUEs
In the Data Center: Five Keys to Achieving Ultra-Low PUEsIn the Data Center: Five Keys to Achieving Ultra-Low PUEs
In the Data Center: Five Keys to Achieving Ultra-Low PUEsVantage Data Centers
 
Energy efficient cooling
Energy  efficient  coolingEnergy  efficient  cooling
Energy efficient coolingrohit goud
 
Full chapter in a single perfect format 2
Full chapter in a single perfect format 2Full chapter in a single perfect format 2
Full chapter in a single perfect format 2Snehasis Panigrahi
 
Green cloud computing
Green cloud computingGreen cloud computing
Green cloud computingJauwadSyed
 
HEAT TRANSFER ANALYSIS ON LAPTOP COOLING SYSTEM BEFORE AND AFTER INTRODUCING ...
HEAT TRANSFER ANALYSIS ON LAPTOP COOLING SYSTEM BEFORE AND AFTER INTRODUCING ...HEAT TRANSFER ANALYSIS ON LAPTOP COOLING SYSTEM BEFORE AND AFTER INTRODUCING ...
HEAT TRANSFER ANALYSIS ON LAPTOP COOLING SYSTEM BEFORE AND AFTER INTRODUCING ...IRJET Journal
 
Cooling a Data Center - DP Air
Cooling a Data Center - DP Air Cooling a Data Center - DP Air
Cooling a Data Center - DP Air dpsir
 
Actual Time Online Thermal Mapping Of significant Components In Data Hub
Actual Time Online Thermal Mapping Of significant Components In Data Hub Actual Time Online Thermal Mapping Of significant Components In Data Hub
Actual Time Online Thermal Mapping Of significant Components In Data Hub IRJET Journal
 
Datacenter Design - DP Air
Datacenter Design - DP AirDatacenter Design - DP Air
Datacenter Design - DP Airdpsir
 

Similar to Green computing (20)

Optimizing The Data Centre Environment
Optimizing The Data Centre EnvironmentOptimizing The Data Centre Environment
Optimizing The Data Centre Environment
 
Google ppt. mis
Google ppt. misGoogle ppt. mis
Google ppt. mis
 
High Efficiency Indirect Air Economizer Based Cooling for Data Centers
High Efficiency Indirect Air Economizer Based Cooling for Data CentersHigh Efficiency Indirect Air Economizer Based Cooling for Data Centers
High Efficiency Indirect Air Economizer Based Cooling for Data Centers
 
Green IT
Green ITGreen IT
Green IT
 
Green Data Centre for banks
Green Data Centre for banksGreen Data Centre for banks
Green Data Centre for banks
 
Green data center_rahul ppt
Green data center_rahul pptGreen data center_rahul ppt
Green data center_rahul ppt
 
Green IT
Green IT Green IT
Green IT
 
In the Data Center: Five Keys to Achieving Ultra-Low PUEs
In the Data Center: Five Keys to Achieving Ultra-Low PUEsIn the Data Center: Five Keys to Achieving Ultra-Low PUEs
In the Data Center: Five Keys to Achieving Ultra-Low PUEs
 
Energy efficient cooling
Energy  efficient  coolingEnergy  efficient  cooling
Energy efficient cooling
 
Build Energy Saving into Your Datacenter
Build Energy Saving into Your DatacenterBuild Energy Saving into Your Datacenter
Build Energy Saving into Your Datacenter
 
Data Centers
Data CentersData Centers
Data Centers
 
The next wave of GreenIT
The next wave of GreenITThe next wave of GreenIT
The next wave of GreenIT
 
Full chapter in a single perfect format 2
Full chapter in a single perfect format 2Full chapter in a single perfect format 2
Full chapter in a single perfect format 2
 
Green cloud computing
Green cloud computingGreen cloud computing
Green cloud computing
 
B7 merlin
B7 merlinB7 merlin
B7 merlin
 
HEAT TRANSFER ANALYSIS ON LAPTOP COOLING SYSTEM BEFORE AND AFTER INTRODUCING ...
HEAT TRANSFER ANALYSIS ON LAPTOP COOLING SYSTEM BEFORE AND AFTER INTRODUCING ...HEAT TRANSFER ANALYSIS ON LAPTOP COOLING SYSTEM BEFORE AND AFTER INTRODUCING ...
HEAT TRANSFER ANALYSIS ON LAPTOP COOLING SYSTEM BEFORE AND AFTER INTRODUCING ...
 
Green It V1 0
Green It   V1 0Green It   V1 0
Green It V1 0
 
Cooling a Data Center - DP Air
Cooling a Data Center - DP Air Cooling a Data Center - DP Air
Cooling a Data Center - DP Air
 
Actual Time Online Thermal Mapping Of significant Components In Data Hub
Actual Time Online Thermal Mapping Of significant Components In Data Hub Actual Time Online Thermal Mapping Of significant Components In Data Hub
Actual Time Online Thermal Mapping Of significant Components In Data Hub
 
Datacenter Design - DP Air
Datacenter Design - DP AirDatacenter Design - DP Air
Datacenter Design - DP Air
 

More from kunalsahu9883

Effect of masonry walls in the progressive collapse of a ten storied rc building
Effect of masonry walls in the progressive collapse of a ten storied rc buildingEffect of masonry walls in the progressive collapse of a ten storied rc building
Effect of masonry walls in the progressive collapse of a ten storied rc buildingkunalsahu9883
 
Urban planning using fuzzy ahp and gis
Urban planning using fuzzy ahp and gisUrban planning using fuzzy ahp and gis
Urban planning using fuzzy ahp and giskunalsahu9883
 
Water requirements and irrigation scheduling of pearl millet in rajasthan
Water requirements and irrigation scheduling of pearl millet in rajasthanWater requirements and irrigation scheduling of pearl millet in rajasthan
Water requirements and irrigation scheduling of pearl millet in rajasthankunalsahu9883
 
Flood vulnerability and risk mapping
Flood vulnerability and risk mappingFlood vulnerability and risk mapping
Flood vulnerability and risk mappingkunalsahu9883
 
E-commerce: Algorithm for Calculating Discount
E-commerce: Algorithm for Calculating DiscountE-commerce: Algorithm for Calculating Discount
E-commerce: Algorithm for Calculating Discountkunalsahu9883
 
Sustainable technology and design in auroville
Sustainable technology and design in aurovilleSustainable technology and design in auroville
Sustainable technology and design in aurovillekunalsahu9883
 
Influence line diagram for model arch bridge
Influence line diagram for model arch bridgeInfluence line diagram for model arch bridge
Influence line diagram for model arch bridgekunalsahu9883
 
Use of golden ratio in architecture
Use of golden ratio in architectureUse of golden ratio in architecture
Use of golden ratio in architecturekunalsahu9883
 

More from kunalsahu9883 (10)

Effect of masonry walls in the progressive collapse of a ten storied rc building
Effect of masonry walls in the progressive collapse of a ten storied rc buildingEffect of masonry walls in the progressive collapse of a ten storied rc building
Effect of masonry walls in the progressive collapse of a ten storied rc building
 
India next village
India next villageIndia next village
India next village
 
Urban planning using fuzzy ahp and gis
Urban planning using fuzzy ahp and gisUrban planning using fuzzy ahp and gis
Urban planning using fuzzy ahp and gis
 
Water requirements and irrigation scheduling of pearl millet in rajasthan
Water requirements and irrigation scheduling of pearl millet in rajasthanWater requirements and irrigation scheduling of pearl millet in rajasthan
Water requirements and irrigation scheduling of pearl millet in rajasthan
 
Flood vulnerability and risk mapping
Flood vulnerability and risk mappingFlood vulnerability and risk mapping
Flood vulnerability and risk mapping
 
Solar Roadways
Solar Roadways Solar Roadways
Solar Roadways
 
E-commerce: Algorithm for Calculating Discount
E-commerce: Algorithm for Calculating DiscountE-commerce: Algorithm for Calculating Discount
E-commerce: Algorithm for Calculating Discount
 
Sustainable technology and design in auroville
Sustainable technology and design in aurovilleSustainable technology and design in auroville
Sustainable technology and design in auroville
 
Influence line diagram for model arch bridge
Influence line diagram for model arch bridgeInfluence line diagram for model arch bridge
Influence line diagram for model arch bridge
 
Use of golden ratio in architecture
Use of golden ratio in architectureUse of golden ratio in architecture
Use of golden ratio in architecture
 

Recently uploaded

Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfgUnit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfgsaravananr517913
 
Indian Dairy Industry Present Status and.ppt
Indian Dairy Industry Present Status and.pptIndian Dairy Industry Present Status and.ppt
Indian Dairy Industry Present Status and.pptMadan Karki
 
Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.eptoze12
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024hassan khalil
 
Earthing details of Electrical Substation
Earthing details of Electrical SubstationEarthing details of Electrical Substation
Earthing details of Electrical Substationstephanwindworld
 
US Department of Education FAFSA Week of Action
US Department of Education FAFSA Week of ActionUS Department of Education FAFSA Week of Action
US Department of Education FAFSA Week of ActionMebane Rash
 
Transport layer issues and challenges - Guide
Transport layer issues and challenges - GuideTransport layer issues and challenges - Guide
Transport layer issues and challenges - GuideGOPINATHS437943
 
Application of Residue Theorem to evaluate real integrations.pptx
Application of Residue Theorem to evaluate real integrations.pptxApplication of Residue Theorem to evaluate real integrations.pptx
Application of Residue Theorem to evaluate real integrations.pptx959SahilShah
 
Arduino_CSE ece ppt for working and principal of arduino.ppt
Arduino_CSE ece ppt for working and principal of arduino.pptArduino_CSE ece ppt for working and principal of arduino.ppt
Arduino_CSE ece ppt for working and principal of arduino.pptSAURABHKUMAR892774
 
Concrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptxConcrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptxKartikeyaDwivedi3
 
Risk Assessment For Installation of Drainage Pipes.pdf
Risk Assessment For Installation of Drainage Pipes.pdfRisk Assessment For Installation of Drainage Pipes.pdf
Risk Assessment For Installation of Drainage Pipes.pdfROCENODodongVILLACER
 
Software and Systems Engineering Standards: Verification and Validation of Sy...
Software and Systems Engineering Standards: Verification and Validation of Sy...Software and Systems Engineering Standards: Verification and Validation of Sy...
Software and Systems Engineering Standards: Verification and Validation of Sy...VICTOR MAESTRE RAMIREZ
 
Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)
Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)
Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)dollysharma2066
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 
8251 universal synchronous asynchronous receiver transmitter
8251 universal synchronous asynchronous receiver transmitter8251 universal synchronous asynchronous receiver transmitter
8251 universal synchronous asynchronous receiver transmitterShivangiSharma879191
 
computer application and construction management
computer application and construction managementcomputer application and construction management
computer application and construction managementMariconPadriquez1
 
Introduction-To-Agricultural-Surveillance-Rover.pptx
Introduction-To-Agricultural-Surveillance-Rover.pptxIntroduction-To-Agricultural-Surveillance-Rover.pptx
Introduction-To-Agricultural-Surveillance-Rover.pptxk795866
 

Recently uploaded (20)

Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptxExploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
 
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Serviceyoung call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
 
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfgUnit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
 
Indian Dairy Industry Present Status and.ppt
Indian Dairy Industry Present Status and.pptIndian Dairy Industry Present Status and.ppt
Indian Dairy Industry Present Status and.ppt
 
Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024
 
Earthing details of Electrical Substation
Earthing details of Electrical SubstationEarthing details of Electrical Substation
Earthing details of Electrical Substation
 
US Department of Education FAFSA Week of Action
US Department of Education FAFSA Week of ActionUS Department of Education FAFSA Week of Action
Green computing

  • 1. Green Computing Eco Friendly Data Centre Kunal Sahu 12BCL1034
  • 2. Abstract  The green data center has moved from the theoretical to the realistic, with IT leaders being challenged to construct new data centers (or retrofit existing ones) with energy-saving features, sustainable materials, and other environmental efficiencies in mind.  This project deals with the effects caused by data centers: how severe these effects are and how to overcome them. The measures provided are not only from a construction point of view but also focus on other dimensions.
  • 3. Introduction In recent years, there has been rapid growth in the Information Technology (IT) industry in India. Data Centers, the key infrastructure component powering this sector, operate continuously (24x7) throughout the year and are very energy intensive. These high-tech facilities generally consume many times the energy of a typical office building, as much as a hundred times more on a square-meter basis. These facilities are experiencing significant growth in India, making them one of the fastest growing energy-use sectors and impacting electrical supply and distribution. By one estimate, data centers consumed 0.8% of the world’s electricity in 2005, at an aggregate cost of 7.2 billion USD. These figures are nearly double the estimates for 2000, indicating an annual growth rate of 16%. Although the net consumption of data centers in India has yet to be quantified, the IT-heavy nature of the country’s economy suggests relative data center energy consumption in India exceeds the global average.
  • 4. It’s not “speculation” any more…  Atmospheric warming is now a broadly accepted trend.  41% of the Earth’s population (2.3 billion) live in water-stressed areas; 3.5 billion will do so by 2025.  This increasingly includes developed areas such as the Western USA, Australia, and SE England.  There is now increasing pressure on traditional energy resources, especially from rapidly industrializing nations (China, India).  Energy consumption is expected to increase by over 129% in parts of Asia by 2020.  CEO/CFO focus is turning to energy costs and environmental costs (CO2 emissions).  Major investors increasingly see poor green performance as a source of risk.
  • 5.
  • 6.  Information technology (IT) accounts for: –2% of anthropogenic CO2, roughly equivalent to the aviation industry. –IT energy usage will double in the next 4 years.
  • 7. Energy Demands Have Data Centers at a Tipping Point  IT energy demand is doubling every 9–24 months.  6,500 US data centers consume electricity equal to that of the State of Utah.  On average, 100 units of energy generation yield 3 units of work for productive IT.  IT accounts for 2% of anthropogenic CO2 emissions.  E-waste cannot be ignored: 1 billion computers are potential scrap by 2010, enough to fill the Rose Bowl each year.  IBM is investing to lead in power and cooling efficiency!
  • 8. What is a Green Data Center? Diagnose: get the facts to understand your energy use and opportunities for improvement. Build: plan, build, and upgrade to energy-efficient data centers. Virtualize: implement virtualization and other innovative technologies. Manage & Measure: seize control with energy management software. Cool: use innovative cooling solutions.
  • 9. How is energy typically used in the data center? [Chart: facility energy split between data center overhead and server/storage hardware (55%/45%), and within the hardware between the processor and other components (70%/30%).]
  • 10. Ways to Build an Eco-Friendly Data Center By applying several simple design choices, you can improve the efficiency of the facility, reduce costs, and reduce your impact on the environment. Here are the top five best practices:  Measure PUE  Manage airflow  Adjust the thermostat  Use free cooling  Optimize power distribution
  • 11. 1. Measure PUE We can't manage what we don’t measure, so we have to be sure to track the data center's energy use. The industry uses a ratio called Power Usage Effectiveness (PUE) to measure and help reduce the energy used for non-computing functions like cooling and power distribution. It’s a measure of how effectively you deliver power and cooling to the IT equipment. To use PUE effectively, it's important to measure often: it should be sampled at least once per second. It’s even more important to capture energy data over the entire year, since seasonal weather variations affect PUE.
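The ratio itself is simple arithmetic: total facility energy divided by the energy delivered to IT equipment. A minimal sketch, using hypothetical meter readings (the kWh figures below are illustrative assumptions, not data from the slides):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    1.0 is the ideal (every joule reaches the IT load); the Uptime
    Institute survey cited above reports a global average around 1.7.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical yearly meter readings (kWh). Averaging over a full year
# smooths out the seasonal weather effects mentioned on the slide.
yearly_pue = pue(total_facility_kwh=8_500_000, it_equipment_kwh=5_000_000)
print(round(yearly_pue, 2))  # 1.7
```

Sampling the two meters frequently and aggregating over the year, rather than taking a one-off reading, is what makes the number comparable across seasons and sites.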
  • 12.  A service provider should focus on reducing energy use while serving the explosive growth of the Internet. Most data centers use almost as much non-computing or "overhead" energy (for cooling and power conversion) as they do to power their servers.  According to the Uptime Institute's 2014 Data Center Survey, the global average PUE of respondents' largest data centers is around 1.7.  To decrease PUE, a different approach should be taken. Facilities have different power and cooling infrastructures and are located in different climates. Seasonal weather patterns also affect PUE values, which tend to be lower during cooler quarters. Some operators have managed to maintain a low PUE average across their entire fleet of data center sites around the world, even during hot, humid Atlanta summers.
  • 13. 2. Manage airflow Good airflow management is crucial to efficient data center operation. Minimize hot and cold air mixing by using well-designed containment. Then, eliminate hot spots and be sure to use blanking plates (flat sheets of metal) for any empty slots in your racks. A little analysis can have big payoffs. For example, thermal modelling using computational fluid dynamics (CFD) can quickly help characterize and optimize airflow for the facility without having to reorganize the computing room. [Diagram: cooled airflow vs. hot airflow]
  • 14.  One of the simplest ways to save energy in a data center is to raise the temperature. It’s a myth that data centers need to be kept chilly. According to expert recommendations and most IT equipment manufacturers' specifications, data center operators can safely raise their cold aisle temperature to 80°F or higher. By doing so, we can significantly reduce facility energy use.  Another way is to use thermal modelling to locate “hot spots” and better understand airflow in the data center. In the design phase, physically arrange equipment to even out temperatures in the facility. Even after that, we can move certain equipment like computer room air conditioners (CRACs) to reduce hot spots and even out the ambient temperature, ultimately reducing the amount of time the CRACs must run.
  • 15.
  • 16. To cut cooling costs and save energy, prevent the “hot aisle” air behind the server racks from mixing with the “cold aisle” air in front of the server racks. In large data centers, this is done with appropriate ducting and permanent enclosures. In addition, simple measures are well-suited for smaller “closet” style data centers. For instance:  Use blanking panels (flat sheets of metal) to close off empty rack slots and prevent hot aisle air from seeping into the cold aisle.  Hang plastic curtains (like those used in commercial refrigerators) to seal off the cold aisle.  Enclose areas with components that run hotter (such as power supply units, or PSUs) with plastic curtains. These efforts help to reduce the total amount of energy used for cooling. At the same time, they ensure that the cooler air sent into the cold aisles is truly cool enough to do its job.
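The payoff of containment can be illustrated with a simple linear mixing model (an illustrative sketch, not taken from the slides): if a fraction of hot-aisle exhaust recirculates into the cold aisle, the effective server intake temperature rises accordingly, forcing the CRACs to supply colder air than necessary.

```python
def effective_intake_temp(supply_c: float, hot_aisle_c: float,
                          recirculation_fraction: float) -> float:
    """Server intake temperature when a fraction of hot-aisle exhaust
    leaks back into the cold aisle (simple two-stream mixing model)."""
    return ((1.0 - recirculation_fraction) * supply_c
            + recirculation_fraction * hot_aisle_c)

# Without blanking panels (assumed 20% recirculation of 35 C exhaust
# into 18 C supply air), the racks see noticeably warmer intake air:
print(round(effective_intake_temp(18.0, 35.0, 0.20), 1))  # 21.4
# With containment, recirculation is nearly eliminated and the intake
# temperature stays close to the supply temperature:
print(round(effective_intake_temp(18.0, 35.0, 0.02), 2))  # 18.34
```

The 20% and 2% recirculation fractions are hypothetical; the point is that reducing mixing lets you raise the supply setpoint without raising server intake temperatures.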
  • 17. 3. Adjust the thermostat It has long been believed that IT equipment needs to run at low temperatures, between 15°C/60°F and 21°C/70°F. However, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommends cold aisle temperatures of up to 27°C/81°F, which has been found to have no detrimental effect on equipment. Most IT equipment manufacturers spec machines at 32°C/90°F or higher, so there is plenty of margin. In addition, most CRACs are set to dehumidify the air down to 40% relative humidity and to reheat air if the return air is too cold. Raising the temperature and turning off dehumidifying and reheating provides significant energy savings.
  • 18. An elevated cold aisle temperature allows CRACs to operate more efficiently at higher intake temperatures. Also, it allows for more days of “free cooling”—days where mechanical cooling doesn’t need to run—if the facility has air- or water-side economization. The simple act of raising the temperature from 22°C/72°F to 27°C/81°F in a single 200kW networking room could save tens of thousands of dollars annually in energy costs.
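The tens-of-thousands-of-dollars figure can be sanity-checked with back-of-envelope arithmetic. The cooling-overhead fractions and electricity tariff below are hypothetical assumptions chosen for illustration, not measured values:

```python
HOURS_PER_YEAR = 8760

def annual_cooling_cost(it_load_kw: float, cooling_overhead: float,
                        tariff_usd_per_kwh: float) -> float:
    """Yearly cost of cooling energy, modelled as a fixed fraction
    (cooling_overhead) of the IT load running all year."""
    return it_load_kw * cooling_overhead * HOURS_PER_YEAR * tariff_usd_per_kwh

# Assumed: raising the cold aisle from 22 C to 27 C cuts cooling
# overhead from 40% to 25% of a 200 kW IT load, at $0.10/kWh.
before = annual_cooling_cost(200, 0.40, 0.10)
after = annual_cooling_cost(200, 0.25, 0.10)
print(round(before - after))  # 26280 USD/year saved
```

Even with these rough assumptions the savings land in the tens of thousands of dollars per year, consistent with the slide's claim for a single 200 kW networking room.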
  • 19. 4. Use free cooling Chillers typically use the most energy in a data center's cooling infrastructure, so the largest opportunity for savings lies in minimizing their use. Take advantage of "free cooling" to remove heat from the facility without using a chiller. This can include:  using low-temperature ambient air,  evaporating water, or  a large thermal reservoir. While there's more than one way to free cool, water- and air-side economizers are proven and readily available.
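An air-side economizer can only carry the load when the outdoor air is cool enough to meet the cold-aisle setpoint. A minimal sketch of that eligibility test, assuming a hypothetical setpoint and an "approach" penalty for fan heat and ducting losses:

```python
def free_cooling_hours(ambient_temps_c, supply_setpoint_c=27.0,
                       approach_c=5.0) -> int:
    """Count the hours in which an air-side economizer can meet the
    cold-aisle setpoint without running the chiller.

    approach_c models the temperature rise between outdoor air and the
    cold aisle; both parameters are illustrative assumptions.
    """
    threshold = supply_setpoint_c - approach_c
    return sum(1 for t in ambient_temps_c if t <= threshold)

# Hypothetical two-hourly ambient temperatures over one day (deg C):
day = [12, 13, 14, 16, 18, 21, 24, 26, 28, 27, 23, 19]
print(free_cooling_hours(day))  # 7 of 12 samples allow free cooling
```

Note how raising the supply setpoint (practice 3 above) directly increases the threshold and therefore the number of free-cooling hours, which is why the two practices reinforce each other.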
  • 20.  Cooling with water, not chillers The electricity that powers a data center ultimately turns into heat. Most data centers use chillers or air conditioning units to cool things down, requiring 30–70% overhead in energy usage. Data centers can instead use water as an energy-efficient way to cool.  Trap hot air and cool equipment with water Google has designed custom cooling systems for its server racks, named “Hot Huts” because they serve as temporary homes for the hot air that leaves the servers, sealing it away from the rest of the data center floor. Fans on top of each Hot Hut unit pull hot air from behind the servers through water-cooled coils. The chilled air leaving the Hot Hut returns to the ambient air in the data center, where the servers can draw it in again, completing the cycle.
  • 21.  Take advantage of evaporative cooling: Evaporation is a powerful tool. In our bodies, it helps us maintain our temperature even when outside temperatures are warmer than we are. It works similarly in a data center’s cooling towers. As hot water from the data center flows down the towers through a material that speeds evaporation, some of the water turns to vapor. A fan lifts this vapor, removing the excess heat in the process, and the tower sends the cooled water back into the data center.  Use the natural cooling power of sea water: Evaporating water isn't the only way to free cool. Some data centers use sea water to cool without chillers. Data centers can be established in places like Hamina, Finland, as Google has done, because of its cold climate and its location on the Gulf of Finland. The cooling system is designed to pump cold water from the sea to the facility, transfer heat from operations to the sea water through a heat exchanger, and then cool this water before returning it to the sea. This approach can provide all of the needed cooling year round, without installing any mechanical chillers at all.
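The physics behind the cooling tower is the latent heat of vaporization: each kilogram of evaporated water carries away roughly 2.26 MJ (about 0.63 kWh) of heat. A short worked calculation (standard physical constant; the 1 MWh heat load is an illustrative figure):

```python
# Latent heat of vaporization of water, converted from J/kg to kWh/kg.
LATENT_HEAT_KWH_PER_KG = 2.26e6 / 3.6e6  # ~0.628 kWh per kg evaporated

def water_needed_kg(heat_kwh: float) -> float:
    """Kilograms of water that must evaporate to reject heat_kwh of heat."""
    return heat_kwh / LATENT_HEAT_KWH_PER_KG

# Rejecting 1 MWh of server heat evaporates roughly 1.6 tonnes of water.
print(round(water_needed_kg(1000)))  # 1593 kg
```

This is why evaporative cooling trades energy for water consumption: the chiller electricity saved comes at the cost of make-up water for the towers.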
  • 22. 5. Optimize power distribution We can minimize power distribution losses by eliminating as many power conversion steps as possible. For the conversion steps that remain, be sure to specify efficient equipment such as transformers and power distribution units (PDUs). One of the largest losses in data center power distribution is in the uninterruptible power supply (UPS), so it's important to select a high-efficiency model. Also, keep high voltages as close to the power supply as possible to reduce line losses. Building custom, highly efficient servers: Servers are high-performance computers that run all the time. They're the core of data centers and should be designed to use as little energy as possible. This can be done by minimizing power loss and by removing unnecessary parts. Also ensure that servers use little energy when they're waiting for a task, rather than hogging power when there’s less computing work to be done.
  • 23. Optimize the power path: A typical server wastes up to a third of the energy it uses before any of that energy reaches the parts that do the actual computing. Servers lose the most energy at the power supply, which converts the AC voltage coming from a standard outlet to a set of low DC voltages. They then lose more at the voltage regulator, which further converts the power supply's output to the voltages required by microchips. Designed to low efficiency standards in order to save on initial cost, traditional servers end up costing much more in electricity in the long run. Use efficient power supplies and highly efficient voltage regulator modules to ensure that most of the power goes to the components that do the actual computing work. Cut out two of the AC/DC conversion stages by putting back-up batteries directly on the server racks. This is estimated to save over 500 kWh per server annually, or 25%, over a typical system.
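Because the stages are in series, the overall efficiency of the power path is the product of the per-stage efficiencies, which is why even modestly lossy stages compound into the "third of the energy wasted" cited above. A sketch with illustrative (assumed, not measured) stage efficiencies:

```python
from functools import reduce

def chain_efficiency(stage_efficiencies) -> float:
    """Overall efficiency of a series power path: the product of stages."""
    return reduce(lambda acc, eta: acc * eta, stage_efficiencies, 1.0)

# Assumed efficiencies for a traditional path: UPS, PDU, PSU, VRM.
traditional = chain_efficiency([0.90, 0.92, 0.85, 0.85])
# Assumed efficiencies after removing conversion stages (rack-level
# batteries) and specifying efficient PSUs and VRMs.
optimized = chain_efficiency([0.99, 0.98, 0.94, 0.92])

print(round(traditional, 3), round(optimized, 3))  # 0.598 0.839
```

In this sketch roughly 40% of the energy is lost before reaching the chips in the traditional path versus about 16% in the optimized one, illustrating how cutting conversion stages and raising per-stage efficiency multiply together.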
  • 24. Conclusion The primary concern before and after converting to a green data center should be power management. Most administrators are not comfortable monitoring power usage because their main concern is performance: they focus on performance uptime rather than power usage. This practice needs to change, and power usage should be measured and monitored where needed. Even after converting to a GDC, the enterprise needs to monitor usage to reap the benefits of the transition-to-green process.
  • 25. Vendors can also help with power management by selling energy-efficient or recyclable hardware. Given the growth of the industry, power needs are increasing, and thus there is a need to monitor power usage. As power needs increase, more earth-friendly measures need to be taken to help the environment. Also, management and administrators need to discuss the increasing needs of the data centers. The power scheme currently used in a data center will change with the growth of the industrial as well as the educational sector.