

  1. NSF XSEDE Campus Champions and Extreme Scale Research Computing
     D. Karres, Beckman Institute; J. Alameda, National Center for Supercomputing Applications; S. Kappes, National Center for Supercomputing Applications

  2. Outline
     • Research Computing @ Illinois
       – National Science Foundation Investments
         • Extreme Science and Engineering Discovery Environment (XSEDE)
         • Blue Waters
       – Research IT
         • Highlighted Core Services
       – Campus Champions
         • History/Motivation
         • Scope
         • Benefits

  3. Outline
     • Research Computing @ Illinois
       – National Science Foundation Investments
         • Extreme Science and Engineering Discovery Environment (XSEDE)
         • Blue Waters
       – Research IT
         • Highlighted Core Services
       – Campus Champions
         • History/Motivation
         • Scope
         • Benefits

  4. National Science Foundation Investments
     • Some Context: M. Parashar, NSF Town Hall, PEARC18, July 25, 2018
       – Leadership HPC
         • Blue Waters (through December 2019)
         • Phase 1: Frontera @ TACC (production ~mid 2019)
       – Innovative HPC
         • Allocated through XSEDE
         • Large Scale, Long-tail, Data Intensive, Cloud
       – Services
         • XSEDE: supporting innovative HPC resources
         • XD Metrics Service (XDMoD)
         • Open Science Grid

  5. Outline
     • Research Computing @ Illinois
       – National Science Foundation Investments
         • Extreme Science and Engineering Discovery Environment (XSEDE)
         • Blue Waters
       – Research IT
         • Highlighted Core Services
       – Campus Champions
         • History/Motivation
         • Scope
         • Benefits

  6. XSEDE Overview, Fall 2018
     Slides adapted from: Linda Akli, SURA Assistant Director, Education, Training, Outreach; Manager, XSEDE Broadening Participation Program

  7. What is XSEDE?
     Foundation for a National CI Ecosystem
     • A comprehensive suite of advanced digital services that federates with other high-end facilities and campus-based resources
     Unprecedented Integration of Diverse Advanced Computing Resources
     • An innovative, open architecture making possible the continuous addition of new technology capabilities and services

  8. XSEDE Leadership

  9. Mission and Goals
     Mission: accelerate scientific discovery
     Strategic Goals:
     • Deepen and Extend Use
       – Raise general awareness of the value of advanced digital services
       – Deepen use by existing communities and extend use to new communities
       – Contribute to the preparation of current and next generation scholars, researchers, and engineers
     • Advance the Ecosystem
     • Sustain the Ecosystem

  10. Total Research Funding Supported by XSEDE 2.0
     $1.97 billion in research supported by XSEDE 2.0, September 2016 - April 2018 (amounts in millions):
     • NSF: $754.3 (38%)
     • NIH: $432.0 (22%)
     • DOE: $325.0 (16%)
     • All others: $187.7 (10%)
     • DOD: $175.1 (9%)
     • DOC: $55.2 (3%)
     • NASA: $38.2 (2%)
     Research funding only. XSEDE leverages and integrates additional infrastructure, some funded by NSF (e.g. “Track 2” systems) and some not (e.g. Internet2).

  11. XSEDE Supports a Breadth of Research
     • Earthquake Science
     • Replicating Brain Circuitry to Direct a Realistic Prosthetic Arm
     • Molecular Dynamics
     • Nanotechnology
     • Plant Science
     • Storm Modeling
     • Epidemiology
     • Particle Physics
     • Economic Analysis of Phone Network Patterns
     • Large Scale Video Analytics (LSVA)
     • Decision Making Theory
     • Library Collection Analysis
     (Image: XSEDE researchers visualize massive Joplin, Missouri tornado)

  12. Recovering Lost History
     A collaboration of social scientists, humanities scholars, and digital researchers harnessed the power of high-performance computing to find and understand the historical experiences of black women by searching two massive databases of written works from the 18th through 20th centuries.

  13. XSEDE Visualization and Data Resources
     Visualization
     • Visualization Portal: remote, interactive, web-based visualization
     • iPython / Jupyter Notebook integration (see the sketch below)
     • R Studio integration
     Storage
     • Resource file system storage: all compute/visualization allocations include access to limited disk and scratch space on the compute/visualization resource file systems to accomplish project goals
     • Archival storage: archival storage on XSEDE systems is used for large-scale persistent storage requested in conjunction with compute and visualization resources
     • Stand-alone storage: stand-alone storage allows storage allocations independent of a compute allocation
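     To make the Jupyter Notebook integration concrete, here is a minimal sketch of a notebook cell a researcher might run through the visualization portal. The scratch-space path and column names are hypothetical placeholders, and pandas/matplotlib are assumed to be available in the notebook environment.

```python
# Hypothetical notebook cell run via the remote Jupyter integration.
# Data stays on the resource's file system; only the rendered plot
# reaches the browser.
import pandas as pd
import matplotlib.pyplot as plt

# Placeholder path on the resource's scratch file system
df = pd.read_csv("/scratch/myproject/run01/timeseries.csv")

ax = df.plot(x="step", y="energy", title="Energy vs. time step")
ax.set_xlabel("time step")
ax.set_ylabel("energy")
plt.savefig("energy_overview.png", dpi=150)  # saved next to the data, not downloaded
```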

  14. Compute and Analytics Resources
     • Bridges: featuring interactive on-demand access, tools for gateway building, and virtualization
     • Comet: hosting a variety of tools including Amber, GAUSSIAN, GROMACS, LAMMPS, NAMD, and VisIt
     • Jetstream: a self-provisioned, scalable science and engineering cloud environment
     • Stampede-2: Intel's new innovative MIC technology on a massive scale
     • Wrangler: data analytics system combining database services, flash storage, long-term replicated storage, and an analytics server; iRODS data management, Hadoop service reservations, and database instances
     • SuperMIC: equipped with Intel's Xeon Phi technology; cluster consists of 380 compute nodes

  15. Science Gateways
     The CIPRES science gateway: an NSF investment launching thousands of scientific publications with no sign of slowing down.
     https://sciencenode.org/feature/cipres-one-facet-in-bold-nsf-vision.php?clicked=title

  16. XSEDE High Throughput Computing Partnership: Open Science Grid
     • Governed by the OSG consortium
     • 126 institutions with ~120 active sites, collectively supporting usage of ~2,000,000 core hours per day
     • High-throughput workflows with simple system and data dependencies are a good fit for OSG (a submission sketch follows this list)
     • Access options:
       – OSG Connect: available to any researcher affiliated with a US institution and funded by a US funding agency
       – An OSG Virtual Organization such as CMS or ATLAS
       – XSEDE
     • https://portal.xsede.org/OSG-User-Guide
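     The following is a minimal sketch of what submitting such a high-throughput sweep might look like from an OSG Connect submit node using the HTCondor Python bindings; the script and file names are hypothetical, and the `Schedd.submit(job, count=...)` call assumes a reasonably recent HTCondor release.

```python
import htcondor  # HTCondor Python bindings (assumed available on the submit node)

# Describe one task of an embarrassingly parallel sweep; all file names are placeholders.
job = htcondor.Submit({
    "executable": "analyze.sh",                      # user-provided worker script
    "arguments": "$(Process)",                       # each task gets its own index
    "transfer_input_files": "inputs/sample_$(Process).dat",
    "should_transfer_files": "YES",                  # no shared file system is assumed on OSG
    "output": "logs/job.$(Process).out",
    "error": "logs/job.$(Process).err",
    "log": "logs/sweep.log",
    "request_cpus": "1",
    "request_memory": "2GB",
    "request_disk": "1GB",
})

schedd = htcondor.Schedd()               # the local submit point
result = schedd.submit(job, count=100)   # queue 100 independent tasks
print("Submitted cluster", result.cluster())
```

     Each task is small and independent, which is exactly the "simple system and data dependencies" profile that OSG's opportunistic resources handle well.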

  17. Accessing XSEDE - Allocations
     • Education
     • Champion
     • Startup
     • Research

  18. XSEDE User Support Resources
     • Technical information
     • Training
     • Extended Collaborative Support Service
     • Help Desk / Consultants

  19. Workforce Development: Training
     • XSEDE Training Course Catalog with all materials in a single location
     • Course Calendar for viewing and registering for upcoming training events
     • Online Training on materials relevant to XSEDE users
     • Badges available for completing selected training
     • Some events provide participation documentation
     • Training Roadmaps

  20. pearc19.pearc.org July 28 - Aug 1, 2019 Chicago, IL

  21. Welcome to XSEDE!

  22. Outline
     • Research Computing @ Illinois
       – National Science Foundation Investments
         • Extreme Science and Engineering Discovery Environment (XSEDE)
         • Blue Waters
       – Research IT
         • Highlighted Core Services
       – Campus Champions
         • History/Motivation
         • Scope
         • Benefits

  23. Blue Waters Overview

  24. Blue Waters
     • Most capable supercomputer on a university campus
     • Managed by the Blue Waters Project of the National Center for Supercomputing Applications at the University of Illinois
     • Funded by the National Science Foundation
     Goal of the project: ensure researchers and educators can advance discovery in all fields of study

  25. Blue Waters System
     Top-ranked system in all aspects of its capabilities, with an emphasis on sustained performance
     • Built by Cray (2011 – 2012); 45% larger than any other system Cray has ever built
     • By far the largest NSF GPU resource
     • Ranks among the Top 10 HPC systems in the world in peak performance despite its age
     • Largest memory capacity of any HPC system in the world: 1.66 PB (petabytes)
     • One of the fastest file systems in the world: more than 1 TB/s (terabyte per second)
     • Largest backup system in the world: more than 250 PB
     • Fastest external network capability of any open science site: more than 400 Gb/s (gigabits per second)

  26. Blue Waters Ecosystem (system diagram)
     • Petascale Applications
     • Computing Resource Allocations
     • EOT: Education, Outreach, and Training
     • GLCPC: Great Lakes Consortium for Petascale Computing
     • SEAS: Software Engineering and Application Support
     • User and Production Support: WAN connections, consulting, system management, security, operations, …
     • Industry partners
     • Software: visualization, analysis, computational libraries, etc.
     • Hardware: external networking, IDS, back-up storage, import/export, etc.
     • Blue Waters System: processors, memory, interconnect, online storage, system software, programming environment
     • National Petascale Computing Facility

  27. Blue Waters Computing System (system diagram)
     • Compute system: 13.34 PFLOPS peak, 1.66 PB memory
     • Online storage (Sonexion): 26 usable PB, >1 TB/sec
     • Near-line storage (Spectra Logic): 200 usable PB
     • Scuba subsystem: storage configuration for best user access
     • IB switch; external servers; 10/40/100 Gb Ethernet switch; 100 GB/sec
     • 400+ Gb/sec WAN

  28. Blue Waters Allocations: ~600 Active Users
     • NSF PRAC, 80%: 30 – 40 teams; annual request for proposals (RFP) coordinated by NSF; the Blue Waters project does not participate in the review process
     • Illinois, 7%: 30 – 40 teams; biannual RFP
     • GLCPC, 2%: 10 teams; annual RFP
     • Education, 1%: classes, workshops, training events, fellowships; continuous RFP
     • Industry
     • Innovation and Exploration, 5%
     • Broadening Participation: a new category for underrepresented communities
