

  1. Art and Science of NUS Data Centres Building

  2. DATA CENTRE @ NUS Contents
     I. Nostalgia
     II. Science of Elements in a Data Centre
     III. NUS DC Live Migration – the Art and Challenges
     IV. Milestones in Pictures
     V. What’s next

  3. Nostalgia... Part I DATA CENTRE @ NUS

  4. DATA CENTRE @ NUS • Electrical System

  5. DATA CENTRE @ NUS • Raised Floor Cabling

  6. DATA CENTRE @ NUS • Air Conditioning System

  7. DATA CENTRE @ NUS • UPS System

  8. DATA CENTRE @ NUS • Layout and Space Utilization

  9. DATA CENTRE @ NUS • Tape Library

  10. DATA CENTRE @ NUS • Secure Printing of SmartCards

  11. DATA CENTRE @ NUS • Emergency Exits

  12. DATA CENTRE @ NUS • Surveillance Cameras

  13. DATA CENTRE @ NUS • Signage

  14. DATA CENTRE @ NUS Part II Science of Elements in a Data Centre

  15. Critical Elements in a Data Centre
      • Tiering of Data Centre
      • IT Area versus Non-IT Area
      • Ceiling Height and Raised Floor System
      • Power Design (supply + cabling)
      • HVAC Units (Temperature and Humidity)
      • Fire Detection and Suppression
      • Environmental Monitoring System
      • Structural Loading
      • Physical Security
      • Structured Cabling
      • Network Considerations

  16. Tiering of Data Centre • Defining the tiers:
      Tier I – a single path for power and cooling distribution, without redundant components, providing 99.671% availability.
      Tier II – a single path for power and cooling distribution, with redundant components, providing 99.741% availability.
      Tier III – multiple power and cooling distribution paths, but only one path active; has redundant components and is concurrently maintainable, providing 99.982% availability.
      Tier IV – multiple active power and cooling distribution paths; has redundant components and is fault tolerant, providing 99.995% availability.
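The availability percentages above translate directly into expected downtime per year. A quick sketch of that arithmetic, using the figures quoted on this slide:

```python
# Convert each tier's quoted availability into expected downtime per year.
HOURS_PER_YEAR = 24 * 365  # 8760 h (ignoring leap years)

tiers = {
    "Tier I": 99.671,
    "Tier II": 99.741,
    "Tier III": 99.982,
    "Tier IV": 99.995,
}

for tier, availability in tiers.items():
    downtime_hours = HOURS_PER_YEAR * (1 - availability / 100)
    print(f"{tier}: {downtime_hours:.1f} h of downtime per year")
```

This works out to roughly 28.8 h/yr for Tier I, 22.7 h for Tier II, 1.6 h for Tier III, and about 26 minutes for Tier IV, which makes concrete how much each redundancy step buys.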

  17. This chart illustrates tier requirements:

  18. This chart illustrates the tier attributes of the sites from which the actual availability numbers were drawn:

  19. [HP]

  20. Part III NUS DC Live Migration – the Art and Challenges

  21. Key Considerations and Difficulties Encountered
      • The data centre must have minimal downtime as it is running live.
      • The renovated DC must be able to cater for DC operation till year 2010.
      • Deciding the DC tier level.
      • Maximising the IT area.
      • Identifying possible/planned hot spots/areas.
      • Non-standard air flow racks/equipment.
      • Inter-rack dependencies.
      • Controlling/minimising dust in the operating IT area during the renovation.
      • Racks whose weight exceeds the floor’s structural design loading.
      • Air flow issues within a rack.
      • Standardising rack design.
      • Coordinating M&E vendors and IT vendors during renovation and IT migration in this live DC.

  22. DATA CENTRE @ NUS Old DC
      • Single incoming power source (electrical to IT)
      • Standalone UPSes; not all equipment on UPS
      • Equipment mostly on single UPS supply
      • Insufficient cooling with no humidity control
      • Generator does not meet existing IT load requirement
      • No environmental monitoring system
      • Building smoke detectors and Inergen gas system at selected areas
      • Unstructured cabling

  23. New DC
      • Two different incoming power sources to DC (electrical for IT)
      • New generator for IT electrical supply
      • Centralized 2N UPS with individual isolation transformer supplying power to all IT equipment in DC
      • Dual power source for all IT racks
      • Dedicated N+1 CRAC units with humidity control, backed up by a dedicated generator
      • EMS (temperature, humidity, water detection, PDU incoming and outgoing, CRAC units, FM200, etc.)
      • 400mm raised floor system
      • Fire detection and protection systems: FM200 and building smoke detectors
      • Structured cabling
      • Hot and cold aisles with ducted return
      • Fire-rated perimeters
      • Standalone earthing system
      • CCTV monitoring
      • Contactless door access system
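The N+1 notation above means enough units to carry the design load (N), plus one spare that can take over if any single unit fails; 2N means a fully duplicated set. A minimal sketch of N+1 sizing, where the load and per-unit capacity figures are illustrative assumptions, not values stated on the slides:

```python
import math

def crac_units_needed(it_load_kw: float, unit_capacity_kw: float) -> int:
    """N+1 sizing: enough units to carry the load (N), plus one redundant spare."""
    n = math.ceil(it_load_kw / unit_capacity_kw)
    return n + 1

# Illustrative numbers only: 300 kW of heat load, 100 kW per CRAC unit.
print(crac_units_needed(300, 100))  # 4 units: 3 to carry the load, plus 1 spare
```

The same ceiling-plus-spare arithmetic applies to UPS modules and generators; 2N sizing would instead double N outright.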

  24. DATA CENTRE @ NUS Interesting Facts...
      • Renovation duration: 1 year
      • DC NFA: 720 sqm
      • IT area attained: 60%
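From those two figures, the usable IT (white) space implied by the slide works out directly:

```python
# Usable IT floor area implied by the slide's figures.
nfa_sqm = 720      # net floor area of the renovated DC
it_ratio = 0.60    # 60% of NFA attained as IT area

it_area_sqm = nfa_sqm * it_ratio
print(it_area_sqm)  # 432.0 sqm of IT area; the remaining 288 sqm is M&E/support space
```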

  25. Data Centre Highlights... Part III

  26. Security: Contactless Card Access

  27. Security Cameras

  28. Two different incoming power sources to DC

  29. Centralized UPS for all IT equipment in DC

  30. Dual power source for all IT racks

  31. Dedicated A/C with humidity control (backed up by generator)

  32. New generator for project IT electrical requirement

  33. EMS (temperature, humidity, liquid detection, etc.)

  34. FM200 Gas Cylinders

  35. FM200 Piping above Ceiling Boards

  36. FM200 System

  37. FM200 Release Nozzle on Ceiling Board

  38. Water Detection Cable

  39. Structured cabling below raised floor

  40. Network Cabling

  41. Earth Cabling

  42. Return Air Duct for CRAC

  43. Internal View of a Typical PDU

  44. Milestones in Pictures... Part IV

  45. Installation of compressors for CRAC units

  46. Installation of compressors for CRAC units

  47. Installation of compressors for CRAC units

  48. Installation of compressors for CRAC units

  49. Delivery of CRAC units

  50. Phase 3

  51. Phase 4

  52. Media Room Entrance

  53. Media Room

  54. Network Room Entrance

  55. Network Room

  56. UPS Room

  57. What’s next... Part V DATA CENTRE @ NUS

  58. DATA CENTRE @ NUS
      DC @ NUS High School (NUSHS)
      • Disaster Recovery Site outside Kent Ridge Campus
      • Construction Duration: Jun – Oct ’05 (4 months)
      • NFA = 800 sqm
      • Operational in Nov/Dec 2005
      Primary Data Centre
      • Able to cater to Computer Centre’s 10-year IT plans
      • Proposed NFA = 2,000 sqm
      • Coming soon...

  59. DATA CENTRE @ NUS
