Presented at CERN, Geneva, Switzerland, March 27, 2009
Dean Nelson, Sr. Director, Global Lab & Datacenter Design Services (GDS)
Unique Challenge
Demand and Capacity Are Colliding... ...And Data Centers Are Right in the MIDDLE!
[Chart: power density climbing from 40 W/ft² (430 W/m²) in 2003 and 120 W/ft² (1,300 W/m²) in 2005 to 800 W/ft² (8,600 W/m²) for next-generation equipment]
Pressures: Power, Demand, Costs, Users, Space, Services, Heat, Access
Moore's Law in Action
[Comparison: 1997 E10K (64 threads, full-rack footprint, 9,620 W, ~150k tpm) vs. 5140 (128 threads, 16 cores, 720 W, ~300k tpm) vs. 6048 (768 cores, 28.5 kW, up to 30 kW each at peak utilization); roughly 30x footprint compression, system weights in the 1,800–2,200 lb range]
Reality: Heterogeneous Data Centers
Industry average is between 4–6 kW/cabinet; >20 kW "skyscrapers" will be integrated. Designs must deal with a mixed-load environment.
Why is this topic important?
Unprecedented Activity
• Sun datacenter briefings over 17 months (07/07–2/09)
> >675 total – an average of ~8 per week
> California: >4,000 people representing >400 customer companies have engaged in briefings and toured the Santa Clara, CA, India, and UK datacenters in 15 months
> Colorado: almost 1,000 people in less than two months
> Challenges: power, cooling, space, connectivity, and utility costs
> Interest: investment protection, future-proofing, efficiency
• Investments
> 21 of these companies are spending $19B on datacenter projects in the US alone
> Does not include Microsoft, Google, Facebook, or DRT
A different perspective
A single server is responsible for about the same amount of CO2 as a typical automobile driven for a year – and it is usually on 24x7.

Server: 440 W server, 3,942 kWh/year → 5.3 tonnes CO2
Auto travel: Toyota Camry, 15,000 miles/year (24,000 km/year) → 5.2 tonnes CO2
Air travel: commercial airliner, Vancouver–Toronto (7 trips) → 5.3 tonnes CO2
A different perspective
A single server is responsible for about the same amount of CO2 as a typical automobile driven for a year, and it is usually on 24x7 – BUT Moore's Law mandates efficiency gains. Over a 10-year period, the automotive equivalent of those gains is 163 MPG!
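The server figures above can be cross-checked with simple arithmetic. A minimal sketch in Python, using only the numbers quoted on the slides (the emission factor is derived from them, not assumed):

```python
HOURS_PER_YEAR = 24 * 365  # server is on 24x7

kwh_per_year = 3_942  # annual server energy, from the slide
tonnes_co2 = 5.3      # annual server CO2, from the slide

# Continuous draw implied by the slide's annual energy figure
avg_draw_w = kwh_per_year / HOURS_PER_YEAR * 1000
print(f"average draw: {avg_draw_w:.0f} W")  # 450 W, close to the 440 W nameplate

# Grid emission factor implied by the slide's CO2 figure
kg_co2_per_kwh = tonnes_co2 * 1000 / kwh_per_year
print(f"implied emission factor: {kg_co2_per_kwh:.2f} kg CO2/kWh")  # ~1.34
```

The implied ~1.34 kg CO2/kWh is well above typical grid averages, so the slide's figure plausibly folds in cooling and power-distribution overhead on top of the server's own draw – an inference, not something the slides state.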
Changing Priorities & Drivers
• $15B investment ($1.2B solar project)
• First carbon-neutral, waste-free, car-free city
• Investment in solar innovation will change the industry
Floating Data Centers
• Tier 1–Tier 3 ECO datacenters at US and international ports
• Capacity: 4,000 racks and over 350 SunMDs
• 75 MW of power, free cooling from ocean water
• Six months time to market, up to 40% less cost than a traditional build
• At the end of a dock instead of the end of a street
Strategy
A New Age
• Industrial Age: global production, global consumption
• Information Age: all things connected, a data storm building
• Participation Age: unprecedented contribution, unprecedented consumption
Top 20 Social Networks
1.3 billion users and growing
Source: http://en.wikipedia.org/wiki/List_of_social_networking_websites (as of 11/10/2008)
The Shift
[Chart, 1990–2010, log scale from 0.1 to 1M: demand curves for Internet infrastructure, high-performance computing, software as services, and global consumer services rise faster than Moore's Law (under-served; IT = weapon), while core enterprise apps fall below it (over-served; IT = cost)]
Innovate
A moment of silence...
• Raised floors are dead
> No longer required
> Go against physics
> Increasingly cumbersome
> Expensive
• Next-generation equipment requires a new way of thinking...
Pod Architecture
Modular data center building blocks – container and/or brick & mortar
Modular Pod Components
• Physical design: influenced by cooling, brick-and-mortar, and/or container
• Cooling – closely coupled: in-row or overhead, hot-aisle containment, and passive
• Power distribution: overhead or under-floor busway
• Cabling: localize switching in each pod
Cooling in the Sun Modular Datacenter
• Integrated cooling modules
• Circular airflow, 5 cooling zones per module
• Fan speed controlled on a per-fan basis
• Handles densities up to 25 kW/rack
Sun Pod Architecture
Closeup: Power Distribution
Modular overhead, hot-pluggable busway with conductors to handle multiple voltages and phases
• Requires no floor space or cooling
> Transformers moved outside the datacenter
• Snap-in cans with short whips
> Non-disruptive
> Reduced copper consumption
> No in-place abandonment
> Significant time reduction – from months to minutes
Closeup: Power Distribution
Modular overhead, hot-pluggable busway with conductors to handle multiple voltages and phases
• Supports multiple Tier levels
> Use multiple busways
• Scalable by removing jumpers
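The deck gives no electrical ratings for the busway, but its capacity follows from standard three-phase power math. A hedged sketch with assumed ratings (the 415 V / 225 A / 0.95 figures are illustrative, not from the deck):

```python
import math

# Illustrative busway ratings -- assumptions, not figures from the deck
line_voltage_v = 415   # line-to-line voltage
rated_current_a = 225  # busway continuous current rating
power_factor = 0.95    # assumed load power factor

# Three-phase power: P = sqrt(3) * V_LL * I * PF
capacity_kw = math.sqrt(3) * line_voltage_v * rated_current_a * power_factor / 1000
print(f"busway capacity: {capacity_kw:.0f} kW")  # ~154 kW

# At the deck's quoted density of up to 25 kW/rack, one such busway feeds
print(f"racks supported at 25 kW each: {capacity_kw / 25:.1f}")  # ~6 racks
```

This kind of back-of-envelope sizing is why multiple busways per pod (as the slide suggests for higher Tier levels) also buys capacity headroom, not just redundancy.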
Act
Design Services – Holistic Solution
Global Lab and Datacenter Design Services (GDS): bridging the gap between Facilities and IT/Eng
• Central design competency center
• Understand IT/Eng requirements, speak facilities' language
• Design services provide a holistic solution
• Aligned business organizations
> http://www.sun.com/aboutsun/environment/docs/aligning_business_organizations.pdf
History: Sun's Internal Challenge
• Facilities is Sun's second-largest expense
> Real estate, utility, tax, and support costs
• 20+ years of organic growth
> New products, reorgs, acquisitions
> Lack of design standards and control of implementations for global technical infrastructure
> Duplication and inefficiencies
• Multi-billion-dollar IT/R&D technical infrastructure portfolio
> 860k ft² (80k m²) of Eng and IT space globally (reduced from 1.4M ft² / 138k m²)
> 1,068 individual rooms (reduced from 1,685)
> IT space = 17% of the portfolio (143k ft² / 13k m² – 275 rooms)
> Engineering/Services = 83% of the portfolio (718k ft² / 67k m² – 793 rooms)
Corporate Drive to Reduce Costs
• Spotlight: Santa Clara, CA – 2007
> Shed 1.8M ft² (167k m²) of real estate
> Compress 202k ft² (18.8k m²) of datacenter space into <80k ft² (7,400 m²) of new datacenter space in Santa Clara
> Project included every major business unit in Sun
> 12-month project duration (complete by 06/30/2007)
• Approach
> Phase I: move 84k ft² (7,800 m²) of datacenters into existing space
> Phase II: compress and build <80k ft² (7,400 m²) of next-generation datacenter space in 12 months
Spotlight: Hardware Replacement
• 88% space compression
• 61% utility reduction
• $9M cost avoidance
• 2.2 MW to 500 kW
• 450% compute increase
• 550 racks to 65
• 3,227 tons CO2 reduced – 312 cars off the road
• Minimal downtime
• Completed in 3 months
• 2:1 server, 3:1 storage replacement ratios
• 2,915 devices replaced
Reality: ROI sweet spot = >4:1 replacement ratio
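The headline percentages on this slide are easy to verify. A quick sketch using only the slide's own numbers:

```python
# Figures from the slide
racks_before, racks_after = 550, 65
power_before_kw, power_after_kw = 2_200, 500  # 2.2 MW to 500 kW

# Rack-count compression matches the quoted 88% space compression
space_compression = 1 - racks_after / racks_before
print(f"rack compression: {space_compression:.0%}")  # 88%

# Raw power drop is ~77%; the slide's 61% "utility reduction" is therefore
# presumably a broader cost measure than IT power alone -- an inference
power_reduction = 1 - power_after_kw / power_before_kw
print(f"power reduction: {power_reduction:.0%}")  # 77%
```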
Bring Out Your Dead...
Global Consolidation
• $250M investment
• 41% global datacenter space compression
> 1.44M ft² to 858k ft²
• Scalable/future-proof
> 9 MW to 21 MW (CA)
> 7 MW to 10 MW (CO)
• Largest Liebert/APC installs
• 15 buildings to 2
• 152 datacenters to 14
• $1.2M utility rebates; $250k innovation award
• China, India, UK, Czech Republic, Norway
• Enabled company pace
• Reduced opex 30% (CA)
Colorado DC Consolidation
• Largest, most complex, and most costly consolidation in Sun's history
• 66% space and datacenter compression
> 496k ft² to 126k ft²
• Scalable/future-proof
> 7 MW to 10 MW
• First and largest Liebert XD dynamic cooling install
• Water treatment saves 600k gallons/year and eliminates chemicals
• Waterside economizer provides free cooling more than 1/3 of the year
• Compressed 165k ft² of raised floor to <700 ft² ($4M cost avoidance)
• Flywheel UPS eliminates batteries
• Chillers 32% more efficient at average load than the ASHRAE standard
• Removed 1M kWh per month – 5% of Sun's global carbon footprint
• 2 ACE Awards
Share