Far more than Petaflops: The Jülich Supercomputing Centre, ScicomP 15 (PowerPoint PPT presentation)



SLIDE 1

Member of the Helmholtz Association

Thomas Lippert

Institute for Advanced Simulation, Jülich Supercomputing Centre

Far more than Petaflops:

The Jülich Supercomputing Centre

ScicomP 15 & SP-XXL

Barcelona Supercomputing Centre, May 20, 2009

SLIDE 2

Supercomputing Drives Basic Sciences

Geophysics, Solid State Physics, Chemistry, Particle Physics, Structure of Matter, Plasma Physics, Nuclear Physics, Astrophysics, Cosmology, New Physics

SLIDE 3

Supercomputing Drives Applied Science

Environment: Weather/Climatology, Pollution/Ozone Hole; Ageing Society: Medicine, Biology; Energy: Plasma Physics, Fuel Cells; Materials: Spintronics, Nano-Science

SLIDE 4

Supercomputing Drives Engineering and Business Competitiveness

Reducing design costs by virtual prototyping:

  • faster time to market

Allowing investigations where economics or ethics preclude experimentation:

  • imperative of supercomputing

SLIDE 5

20.5.2009 SP-XXL, Barcelona. Thomas Lippert, IAS/JSC

FROM JÜLICH TO EUROPE

SLIDE 6

Jülich in Brief

Largest civilian research centre in Europe: 360 million euro annual budget, 4,300 staff members

  • 1,200 scientists
  • 700 guest scientists from 50 countries

9 departments (institutes), including the Institute for Advanced Simulation

SLIDE 7

You might have heard of ….

SLIDE 8

Jülich Supercomputing Centre (JSC)

SLIDE 9

IAS Organisation

The Institute for Advanced Simulation (IAS) comprises the Jülich Supercomputing Centre (JSC) and groups for Soft Matter, Biophysics, Hadron Physics, and Nano/Materials Science.

SLIDE 10

Milestones

1961 Zentralinstitut für Angewandte Mathematik (ZAM)
1987 Höchstleistungsrechenzentrum (HLRZ)
1998 HLRZ becomes John von Neumann Institute for Computing (NIC)
2007 ZAM becomes Jülich Supercomputing Centre (JSC); member of the Gauss Centre for Supercomputing
2008 Institute for Advanced Simulation; coordinator of the PRACE project
2010 European Supercomputing Centre

SLIDE 11

  • German Research School for Simulation Science
  • Co-funded by NRW, BMBF and Helmholtz Association
  • PhD and Master students in two-year courses
SLIDE 12

MATSE Education

Organization JSC (current staff assignment); organisational chart listing:

  • Director; Secretaries, Administration
  • Large Scale Facility: HPC Systems (HPC Operations, HPC Data Management, HPC System Development, File & Archive Systems)
  • Grid & Infrastructures: Distributed Systems & Grid Computing, European HPC Infrastructure, UNICORE Development, Grid Research, D-Grid Operation
  • Computational Science: SimLab Biology, SimLab Plasma Physics, SimLab Molecular Systems, Complex Systems, NIC Research Group, Helmholtz Young Investigators Group, Research Group Quantum Information
  • Mathematical Methods: Numerical Algorithms, Mathematical Software, Modeling & Methods
  • Mathematics & Education: Education, HPC Application Support, Applied Visualization, Program Optimization, Programming Environments, SL Operation, Performance Analysis
  • Communication Systems: Distributed Systems, HPC Networking, JuNet & External Networks, Security, Network Technologies
  • Technology: Technology Development, Technical Infrastructure
  • NIC Coordination, Public Relations, User/Project Management

SLIDE 13

HPC Systems

SLIDE 14

SLIDE 15

Supercomputers

1956 First computer in Jülich
1989 Cray YMP, 0.003 Teraflop/s
1996 Cray T3E, 0.8 Teraflop/s
2003 IBM p690, 9 Teraflop/s
2006 BGL: JUBL, 46 Teraflop/s
2008 BGP: JUGENE, 223 Teraflop/s
2009 JuRoPA, 200 Teraflop/s; HPC-FF, 100 Teraflop/s; BGP: JUGENE, 1000 Teraflop/s
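The timeline implies a remarkably steady exponential. A quick back-of-envelope check, derived from the slide's own figures (the arithmetic is mine, not stated on the slide):

```python
# Growth from the Cray YMP (0.003 TFlop/s, 1989) to the petaflop
# JUGENE (1000 TFlop/s, 2009), per the timeline above.
import math

factor = 1000 / 0.003                     # ~333,333x in 20 years
doublings = math.log2(factor)             # ~18.3 doublings
months_per_doubling = 20 * 12 / doublings
print(f"{factor:,.0f}x, doubling roughly every "
      f"{months_per_doubling:.1f} months")
```

A thirteen-month doubling period is slightly faster than the classic Moore's-law cadence, consistent with growing parallelism on top of per-chip scaling.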

SLIDE 16

Developing Supercomputers @ JSC

2004: IBM Power 4+ JUMP, 9 TFlop/s (general-purpose)
2005/6: IBM Blue Gene/L JUBL, 45 TFlop/s (highly scalable)
2007/8: IBM Power 6 JUMP, 9 TFlop/s; IBM Blue Gene/P JUGENE, 223 TFlop/s
2009: IBM Blue Gene/P JUGENE, 1 PFlop/s; Intel Nehalem clusters: HPC-FF 100 TFlop/s, JUROPA 200 TFlop/s

File servers: GPFS, later GPFS and Lustre. Two tracks: general-purpose and highly-scalable systems.

SLIDE 17

JUGENE: Jülich’s Scalable Petaflop System

IBM Blue Gene/P JUGENE

  • 32-bit PowerPC 450 core, 850 MHz, 4-way SMP
  • 72 racks, 294,912 processors
  • 1 Petaflop/s peak
  • 144 TByte main memory
  • connected to a General Parallel File System (GPFS) with 5 PByte online disk capacity and up to 25 PByte offline tape capacity
  • Torus network

First Petaflop system in Europe
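The headline numbers are internally consistent. A sketch of the check, assuming 4 flops per cycle per PowerPC 450 core (the usual figure for its dual-pipe fused multiply-add FPU; this factor is my assumption, not on the slide):

```python
cores = 294_912            # 72 racks of Blue Gene/P
clock_hz = 850e6           # 850 MHz
flops_per_cycle = 4        # assumed: 2-way FMA per cycle

peak_pflops = cores * clock_hz * flops_per_cycle / 1e15
print(f"peak: {peak_pflops:.2f} PFlop/s")     # ~1.00, as advertised

mem_per_core_gb = 144e12 / cores / 1e9        # 144 TByte total memory
print(f"memory: {mem_per_core_gb:.2f} GByte/core")
```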

SLIDE 18

SLIDE 19

Juropa

2208 compute nodes

  • 2 Intel Nehalem-EP quad-core processors
  • 2.93 GHz
  • SMT (Simultaneous Multithreading)
  • 24 GB memory (DDR3, 1066 MHz)
  • IB QDR HCA (via Network Express Module)

17664 cores, 207 TF peak

  • Sun Microsystems Blade SB6048
  • Infiniband QDR with non-blocking Fat Tree topology
  • ParaStation Cluster-OS
SLIDE 20

HPC-FF

1080 compute nodes

  • 2 Intel Nehalem-EP quad-core processors
  • 2.93 GHz
  • SMT (Simultaneous Multithreading)
  • 24 GB memory (DDR3, 1066 MHz)

8640 cores, 101 TF peak

  • Bull NovaScale R422-E2
  • Infiniband QDR with non-blocking Fat Tree topology
  • ParaStation Cluster-OS
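The core counts and peak figures of both Nehalem clusters follow directly from their node counts. A sketch, assuming 4 flops per cycle per core (Nehalem's SSE width; my assumption, not on the slides):

```python
def peak(nodes, sockets=2, cores_per_socket=4,
         clock_ghz=2.93, flops_per_cycle=4):
    """Return (total cores, peak TFlop/s) for a homogeneous cluster."""
    cores = nodes * sockets * cores_per_socket
    return cores, cores * clock_ghz * flops_per_cycle / 1000

print("JuRoPA:", peak(2208))   # (17664 cores, ~207 TFlop/s)
print("HPC-FF:", peak(1080))   # (8640 cores, ~101 TFlop/s)
```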
SLIDE 21

Infiniband Topology

  • 23 x 4 QNEM modules, 24 ports each
  • 6 x M9 switches, 648 ports max. each, 468/276 links used
  • Mellanox MTS3600 switches (Shark), 36 ports, for service nodes
  • 4 Compute Sets (CS) with 15 Compute Cells (CC) each
  • CC with 18 Compute Nodes (CN) and 1 Mellanox MTS3600 (Shark) switch each
  • Virtual 648-port switches constructed from 54x/44x Mellanox MTS3600
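The port figures tie out against the node counts of the two clusters; reading the QNEM figures as JuRoPA's Sun blades and the Compute Set figures as HPC-FF's Bull nodes is my interpretation of the slide:

```python
juropa_nodes = 23 * 4 * 24   # 23 x 4 QNEM modules, 24 ports each
hpcff_nodes = 4 * 15 * 18    # 4 CS x 15 CC, 18 CN per cell

assert juropa_nodes == 2208  # matches the JuRoPA slide's node count
assert hpcff_nodes == 1080   # matches the HPC-FF slide's node count
print(juropa_nodes, hpcff_nodes)
```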

SLIDE 22

JUST – Jülich Storage Cluster

  • GPFS storage cluster for all our supercomputers
  • Supercomputers are remote clusters for GPFS
  • 1 PB capacity today, expansion to 6 PB in Q4 2009
  • 20 GB/s bandwidth, expansion to 66 GB/s in Q4
  • Tivoli Storage Manager (TSM) for backup, archive and HSM
  • 2 Sun tape libraries used with TSM

  • 16 PB capacity today
  • Can be expanded to 32 PB next year
SLIDE 23

Information and Technology Deputy Director JSC

SLIDE 24

Preparing Infrastructure for ….

SLIDE 25

Emerging multi- & many-core Architectures

Accelerators promise exciting performance at low power:

  • Cell Broadband Engine: 200 / 100 GFlop/s (100 W)
  • nVIDIA Tesla T10: 1000 / 80 GFlop/s (200 W)
  • AMD FireStream 9270: 1200 / 240 GFlop/s (220 W)
  • Programming paradigms: CUDA, Brook, Cell-SDK, CellSS, RapidMind, OpenCL, ...
  • Application kernels have to be adapted by hand
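"Low power" can be made concrete by dividing the slide's single-/double-precision GFlop/s figures by the quoted board power; the efficiency ratios below are derived, not stated on the slide:

```python
# (single-precision GFlop/s, double-precision GFlop/s, watts)
accelerators = {
    "Cell Broadband Engine": (200, 100, 100),
    "nVIDIA Tesla T10":      (1000, 80, 200),
    "AMD FireStream 9270":   (1200, 240, 220),
}
for name, (sp, dp, watts) in accelerators.items():
    print(f"{name}: {sp / watts:.2f} SP, {dp / watts:.2f} DP GFlop/s per W")
```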
SLIDE 26

Many-core prototypes @ JSC

QPACE / eQPACE

  • Special purpose computer for lattice QCD

Main design goal: energy- and cost-efficiency
Developed by SFB/TR "Hadron Physics"
3-D torus network based on FPGA (SPE-to-SPE communication)
Ultra-dense packaging: 25.6 TFlop/s per rack

  • Explore broader purpose capabilities within PRACE WP8

Enhanced communication: beyond nearest-neighbour and MEM-to-MEM; support of standard communication layers (MPI)

JUICEnext QS22 cluster

Cell based computational platform and test facility

SLIDE 27

Future Developments around JuRoPA

Cluster Management

  • ParaStation (incl. MPI)
  • GridMonitor

Operating System

  • SUSE SLES 11

Fighting Operating System Jitter

SLIDE 28

Building the D-Grid

Architecture (diagram): clients reach the JUGGLE cluster from the Internet through a firewall; a DMZ hosts the UNICORE 5 gateway (connecting to the other UNICORE sites), UNICORE 5 and UNICORE 6 services, a UNICORE 6 registry (SoftComp), and Globus and LCG/gLite middleware.

SLIDE 29

Communication Systems PRACE Project Manager

SLIDE 30

High-speed Supercomputer Connectivity

SLIDE 31

Pan-European Supercomputer Network Research and Provisioning

  • DEISA: design and operation of a pan-European 10 Gbit/s network
  • LOFAR: planning and operation of German-Dutch peering
  • Phosphorus: R&D in on-demand optical networking

(Maps: DEISA, Phosphorus, LOFAR)

SLIDE 32

Data Communication – JuNet by Numbers

  • JSC: overall responsibility for campus network JuNet & external connections
  • JuNet
    • 94 Ethernet switches, 1.5 Tbit/s
    • 9,500 ports in 60 buildings
    • 300 WLAN access points
  • Supercomputing centre
    • 95 Ethernet switches, >8 Tbit/s
    • 6,000 ports in 2 buildings
    • Infiniband, proprietary networks
  • External connectivity
    • 5 Gbit/s X-WiN (redundant)
    • Dark fibres to RWTH, TZJ, FhG-Birlinghoven
    • Project network operation: DEISA, LOFAR, Phosphorus
    • VPN and dial-in services
SLIDE 33

Thomas Eickermann, PRACE Project Coordination@FZ-Jülich

Towards the High-End HPC Service for European Science

SLIDE 34

Computational science infrastructure in Europe

The European Roadmap for Research Infrastructures is the first comprehensive definition at the European level. Research Infrastructures are one of the crucial pillars of the European Research Area. A European HPC service:

  • horizontal
  • attractive for research communities
  • supporting industrial development

SLIDE 35

PRACE PROJECT

  • Prepare the contracts to establish the PRACE permanent Research Infrastructure as a single legal entity from 2010 on, including governance, funding, procurement, and usage strategies.
  • Perform the technical work to prepare operation of the Tier-0 systems in 2009/2010, including deployment and benchmarking of prototypes for Petaflop/s systems and porting, optimising, and petascaling of applications.

SLIDE 36

PRACE – Initiative

New partners since May 2008

(Map legend: Principal Partners, tier 0; General Partners, tier 1; GENCI)

SLIDE 37

29.4.2009 Physics Colloquium, Universität Duisburg-Essen

HET: The Scientific Case

  • Weather, Climatology, Earth Science
    – degree of warming, scenarios for our future climate
    – understand and predict ocean properties and variations
    – weather and flood events
  • Astrophysics, Elementary Particle Physics, Plasma Physics
    – systems and structures which span a large range of different length and time scales
    – quantum field theories like QCD: LHC, FAIR
    – ITER
  • Materials Science, Chemistry, Nanoscience
    – understanding complex materials, complex chemistry, nanoscience
    – the determination of electronic and transport properties
  • Life Science
    – systems biology, chromatin dynamics, large-scale protein dynamics, protein association and aggregation, supramolecular systems, medicine
  • Engineering
    – complex helicopter simulation, biomedical flows, gas turbines and internal combustion engines, forest fires, green aircraft

SLIDE 38

Further PRACE Activities (diagram): PRACE Initiative, PRACE Project, General Partners; BSC, GENCI, EPSRC, NCF, GCS; RIS

SLIDE 39

HPC Application Support

SLIDE 40

User Research Fields

JUMP

~ 200 Projects

JUGENE

~40 Projects

SLIDE 41

Levels of User Support and Training

User Support Training

SLIDE 42

Simulation Laboratories: Community-oriented research and support units

SL Plasma Physics, SL Biology, SL Earth and Environment, SL Molecular Systems, and further SimLabs, grouped around the Supercomputing Centre

SLIDE 43

Example of Various Support Activities

Blue Gene/L Scaling Week, May 2006 Blue Gene Scaling Workshop, Dec 2006

  • Jointly with IBM and the Blue Gene/P Consortium (Argonne National Lab)
  • Scaling to 16k cores on Blue Gene/L
  • Usage of the performance tool SCALASCA
SLIDE 44

JUGENE Usage Snapshot

SLIDE 45

Scientific Visualization Simulation of Blood Flow in a Ventricular Assist Device (Prof. Marek Behr, RWTH Aachen)

SLIDE 46

Coordination Office, John von Neumann Institute

SLIDE 47

(Pie chart: Soft Matter Composites, DEISA, I3HP, Jülich, Initiative, Other)

  • Proposals for computer time accepted from Germany and Europe
  • Peer review by international referees
  • Allocated by an independent Scientific Council (NIC)

National and European User Groups

(Pie chart: Chemistry, Many-Particle Physics, Elementary Particle Physics, Biology/Biophysics, Materials Science, Soft Matter, Other)

SLIDE 48

GCS: Gauss Centre for Supercomputing

Germany's Tier-0/1 supercomputing complex

  • Association of Jülich, Garching and Stuttgart
  • A single joint scientific governance
  • Germany's representative in PRACE
  • More information: http://www.gauss-centre.de
SLIDE 49

National HPC Pyramid

Tiers of 3 / ~10 / ~100 centres:

European HPC Centre

National HPC Centres: Garching, Jülich, Stuttgart (Gauß Centre for Supercomputing)

Topical HPC centres, centres with regional tasks, HPC servers (Gauß Alliance): Aachen, DKRZ, Dresden, DWD, Erlangen, G-CSC Frankfurt, HLRN (Hannover, Berlin), Karlsruhe, MPG/RZG, Paderborn

University/institute level

SLIDE 50

Computational Science

SLIDE 51

Computational Science (organisational diagram):

Simulation Laboratories (new): Earth & Environment, Plasma Physics, Energy, Biology, Molecular Systems, NanoMikro, Astro-Particle

Cross-Sectional Teams: Methods & Algorithms, Parallel Performance

Research Groups: Quantum Information, NIC Group, Distributed Computing

Education & Training Programmes

SLIDE 52

Example 1: Simulation Lab Biology

Research

  • Protein folding & interaction
  • Structure prediction
  • Systems biology

Support

  • Libraries, bio databases → LSDF (Topic 2)
  • Benchmarking
  • Monte Carlo, FFT docking, Machine learning

Codes

  • PROFASI, SMMP

Outreach

  • FZJ: biological institutes (ISB, INM), Helmholtz groups

  • Regional: ABC of Life Science Informatics
  • International: UC Berkeley, Michigan Tech

Protein 1LQ7

SLIDE 53

Example 2: Simulation Lab Plasma Physics

Research

  • Kinetic methods: Particle-in-Cell, Vlasov, MD
  • Fluid + MHD models
  • Transport: Monte Carlo

Support

  • Plasma model porting & scaling
  • Code benchmarking, e.g. 3D PIC

Codes

  • PSC, ILLUMINATION, PEPC, racoon, EIRENE

Outreach

  • FZJ groups: IEF (Plasma), IKP (Nuclear)
  • Regional: Univs. Aachen, Bochum, Düsseldorf
  • National: GSI (HA-EMMI), Garching, FZ-Rossendorf

Laser-ion acceleration Solar flare modelling

SLIDE 54

Petawatt Laser on Thin Foil

(Dr. Paul Gibbon, FZJ)

SLIDE 55

Example 3: Research Group Quantum Information

Frontier research on quantum effects in computing

  • Mitigation of quantum effects in integrated circuits
  • Exploit the 'qubit' paradigm for algorithm acceleration
  • Massively parallel QC simulation

Research

  • Robustness of quantum algorithms

(Gate imperfections, decoherence, error correction)

  • First-principles simulation of real ion-trap quantum computers

Cooperations

  • New W2 professorship with RWTH Aachen (from June 2009)
  • FZJ, U. Groningen, U. Innsbruck

8-bit ion trap

SLIDE 56

Mathematical Methods

SLIDE 57

Cross-Sectional Group Mathematical Methods and Algorithms

Numerical Algorithms

  • Large-scale linear systems, eigenvalue problems
  • Error-controlled Fast Multipole Method
  • Kernels for new architectures (Cell Engine)
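A minimal sketch of the problem class the first bullet names, using NumPy's dense symmetric eigensolver as a stand-in (the group's actual parallel library and its API are not named on the slide):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((200, 200))
h = (a + a.T) / 2                 # symmetric test matrix

w, v = np.linalg.eigh(h)          # all eigenpairs, ascending order

# verify the defining relation H v = w v for the lowest pair
residual = np.linalg.norm(h @ v[:, 0] - w[0] * v[:, 0])
assert residual < 1e-10
print(f"lowest eigenvalue: {w[0]:.4f}, residual: {residual:.2e}")
```

At JSC scales the matrix is distributed and the solver parallel, but the correctness check (a small residual of the defining relation) is the same.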

Modelling

  • Pedestrian dynamics (simulation faster than real-time)

  • 3D soil-root water transfer

Software & Service

  • Parallel eigenvalue library, FMM program, toolboxes

  • Mathematical software, benchmarking

Cooperations

  • FZJ research groups, Univs. Wuppertal, Bonn, Cologne

  • FH Aachen, MPIKS Dresden, industrial partners

(Figures: FMM, evacuation modelling, soil-root interface)

SLIDE 58

SLIDE 59

Fundamental Diagram
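The fundamental diagram itself is a figure not reproduced in this transcript. As a sketch of the concept only: it relates pedestrian density to flow. The linear speed-density law and the constants below are common literature values, not the group's actual model:

```python
V_FREE = 1.34    # free walking speed, m/s (common literature value)
RHO_MAX = 5.4    # jam density, persons/m^2 (assumed)

def flow(rho):
    """Specific flow J = rho * v(rho) with a linear speed-density law."""
    v = V_FREE * max(0.0, 1.0 - rho / RHO_MAX)
    return rho * v

for rho in (0.5, 1.0, 2.0, 2.7, 4.0, 5.4):
    print(f"rho = {rho:.1f}/m^2 -> J = {flow(rho):.2f} 1/(m*s)")
# flow rises with density, peaks near half the jam density,
# then falls back to zero at the jam density
```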

SLIDE 60

Simulation of the 2nd Bosphorus Bridge

Motivation

The earthquake safety analysis of the 2nd Bosphorus Bridge in Istanbul is of great practical interest.

Strategy

  • A high-resolution FEM model of the bridge has to be developed
  • Empirical knowledge of typical earthquake loads at Istanbul provides input for the simulation
  • The resulting FEM calculations require supercomputer resources

Cooperation

Kandilli Observatory and Earthquake Research Institute, Bogazici University, Istanbul

SLIDE 61

Higher Education

German Research School for Simulation Sciences (GRS)

  • Joint foundation of FZJ and RWTH Aachen
  • Master courses, doctoral programme in simulation sciences

Cooperations with regional universities

  • JSC scientists with professorships at Aachen, Wuppertal and Bonn
  • Bachelor & Master programme in Technomathematics (Aachen U. Appl. Sci.)

  • Biennial graduate schools in Scientific Computing
SLIDE 62

Grid Technology and Infrastructures

SLIDE 63

Development and Usage of UNICORE

(Timeline 1999-2011)

  • More than a decade of German and European research & development and infrastructure projects: UNICORE, UNICORE Plus, EUROGRID, GRIP, GRIDSTART, OpenMolGRID, UniGrids, VIOLA, DEISA, NextGRID, CoreGRID, D-Grid IP, EGEE-II, OMII-Europe, A-WARE, Chemomentum, eDEISA, PHOSPHORUS, D-Grid IP 2, SmartLM, PRACE, D-MON, DEISA2, ETICS2, SLA4D-Grid, WisNetGrid
  • And many others

SLIDE 64

Eclipse-based UNICORE Rich Client (URC)

SLIDE 65

UNICORE architecture (diagram): scientific clients and applications (the Eclipse-based URC rich client, the UCC command-line client, portals such as GridSphere, and the HiLA programming API) reach per-site Gateways over X.509, SOAP, WS-RF and WS-I. Each site runs a UNICORE WS-RF hosting environment with XNJS, IDB, the UNICORE Atomic Services and OGSA-* interfaces, plus a Target System Interface to the local RMS (e.g. Torque, LL, LSF) via DRMAA; job working space (USpace) and transfers to external storage use GridFTP. Central services run in WS-RF hosting environments: Service Registry, Workflow Engine, Service Orchestrator, CIS Info Service (OGSA-RUS, UR, GLUE 2.0) and the UVOS VO service. Authorization uses XUUDB and XACML entities with SAML; authentication uses X.509 and proxies; further standards include JSDL, OGSA-ByteIO, OGSA-BES and HPC-P.

SLIDE 66

UNICORE usage in D-Grid

Core D-Grid sites committing parts of their existing resources to D-Grid:

  • Approx. 700 CPUs
  • Approx. 1 PByte of storage
  • UNICORE is installed and used

Additional sites received extra money from the BMBF for buying compute clusters and data storage:

  • Approx. 2000 CPUs
  • Approx. 2 PByte of storage

(Map shows sites including LRZ and DLR-DFD)

SLIDE 67

Usage in DEISA

Consortium of leading national HPC centres in Europe

  • Deploy and operate a persistent, production-quality, distributed, heterogeneous HPC environment

UNICORE as Grid middleware

  • On top of DEISA's core services: dedicated network, shared file system, common production environment at all sites
  • Used e.g. for workflow applications

IDRIS – CNRS (Paris, France), FZJ (Jülich, Germany), RZG (Garching, Germany), CINECA (Bologna, Italy), EPCC (Edinburgh, UK), CSC (Helsinki, Finland), SARA (Amsterdam, NL), HLRS (Stuttgart, Germany), BSC (Barcelona, Spain), LRZ (Munich, Germany), ECMWF (Reading, UK)

www.deisa.eu

SLIDE 68

Performance Analysis

SLIDE 69

Cross-Sectional Group Parallel Performance

Objective

  • Optimization tools for parallel codes with highest scalability

Research

  • Scalasca: performance analysis tool for large-scale systems

Current projects

  • ParMA, SILC (BMBF)
  • VI-HPS (Helmholtz)

(Scalasca Cube screenshot)

SLIDE 70

Blood Pump Code after Improvement

SLIDE 71

Some Fundamental Stuff

SLIDE 72

Approaching the Femto-Dimension

(Scale axis: nano, pico, femto, atto, zepto, yocto metres)

SLIDE 73

Well understood: era of BBN (nucleosynthesis). To be confirmed: era of the quark-hadron transition (QCD). Unknown: dark matter.

SLIDE 74

SLIDE 75

SLIDE 76

Some theory…

12.1.2009 Physics Colloquium, Ruhr-Universität Bochum

SLIDE 77

SLIDE 78


10 Breakthroughs of the Year 2008

SCIENCE VOL 322:

Proton's Mass 'Predicted'

"Starting from a theoretical description of its innards, physicists precisely calculated the mass of the proton and other particles made of quarks and gluons. The numbers aren't new; experimenters have been able to weigh the proton for nearly a century. But the new results show that physicists can at last make accurate calculations of the ultracomplex strong force that binds quarks...."

SLIDE 79

SLIDE 80

Member of the Helmholtz Association

Many thanks for your attention!