Physics Plans and Machines in Germany

Major Collaborations in Germany:
• BMW: Budapest, Marseille, Wuppertal
• CLS: Coordinated Lattice Simulations
• ETMC: European Twisted Mass Collaboration
• QCDSF: QCD Structure Functions
• RBG: RBC-Bielefeld-GSI
Physics

• BMW:
  – Thermodynamics (Wuppertal and Budapest)
  – T = 0 physics (Budapest-Marseille-Wuppertal)
• CLS:
  – K-, charm- and B-physics; structural properties of light baryons
• QCDSF and ETMC: very similar and broad physics plans, e.g.
  – Structure functions, form factors and generalised parton distribution functions of baryons and mesons
  – Distribution amplitudes of baryons and mesons
  – Disconnected contributions
  – Muon anomalous magnetic moment (see the aside after this list)
  – Resonances
  – Hadron mass spectrum, decay constants, quark masses, running coupling and Λ parameter
• RBC-Bielefeld-GSI:
  – Non-zero temperature and density
  – Equation of state
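Aside (not part of the original slide): the muon anomalous magnetic moment item above is usually attacked on the lattice through the leading-order hadronic vacuum-polarization contribution, for which the standard Euclidean-momentum representation reads

a_\mu^{\mathrm{HVP,\,LO}} = \left(\frac{\alpha}{\pi}\right)^{2} \int_{0}^{\infty} dQ^{2}\, f(Q^{2})\, \hat\Pi(Q^{2}), \qquad \hat\Pi(Q^{2}) \equiv \Pi(Q^{2}) - \Pi(0),

where \Pi(Q^{2}) is the hadronic vacuum-polarization function obtained from the lattice vector-current correlator and f(Q^{2}) is a known QED weight function, strongly peaked near Q^{2} \sim m_\mu^{2}; its explicit form is not reproduced here.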
Some results from Germany

[Figure: two plots. Left: light-hadron mass spectrum M [MeV] (π, K, ρ/K*, N, Λ, Σ, Ξ, Δ, Σ*, Ξ*, Ω), lattice QCD results versus experiment, with decay widths indicated. Right: the nucleon isovector momentum fraction ⟨x⟩_{u-d} versus m_π² [GeV²] for N_f = 2+1 DWF [RBC/UKQCD] (2.7 fm), N_f = 0 DWF [RBC] (2.4 fm), N_f = 2+1 Mix [LHPC] (2.5 fm), N_f = 2 Clover [QCDSF] (1.9-2.4 fm) and N_f = 0 Clover [QCDSF] (1.6 fm), compared with MRST 2002.]
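Added note for orientation (not part of the original figure): in one common convention the plotted quantity is the nucleon's isovector quark momentum fraction,

\langle x \rangle_{u-d} = \int_{0}^{1} dx\, x \left[ u(x) + \bar u(x) - d(x) - \bar d(x) \right],

i.e. the second Mellin moment of the unpolarised isovector parton distribution, extracted on the lattice from nucleon matrix elements of a local twist-two operator.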
Simulation Plans

BMW
  Action:         Symanzik improved gauge, stout improved N_f = 2+1 staggered and (clover) Wilson
  Parameters:     a ≳ 0.065 fm, m_π ≳ 190 MeV

QCDSF
  Action:         N_f = 2 NP-Clover
  Parameters:     m_π = 138 MeV, a ≈ 0.075 fm, V = 64³ × 96
  Plan:           N_f = 2+1 SLiNC
  Parameters:     m_π = 230 MeV, a = 0.08 fm, V = 64³ × 96, SU(3) point
  Configurations: all N_f = 2 ensembles uploaded

ETMC
  Action:         N_f = 2+1+1, maximally twisted mass QCD, Iwasaki gauge
  Parameters:     a = 0.05-0.10 fm, L ≈ 3.0 fm, 150 < m_π < 400 MeV
  Configurations: all N_f = 2 ensembles (4 lattice spacings) uploaded

CLS
  Action:         N_f = 2, NP O(a)-improved Wilson
  Parameters:     a = 0.04-0.08 fm, L ≲ 4.0 fm, m_π ≳ 200 MeV

RBG
  Action:         asqtad and p4
  Parameters:     N_τ = 6, 8, 12; m_π ≈ 150 MeV and m_π ≈ 220 MeV
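As a quick cross-check of the quoted parameters (an illustrative aside, not part of the slide), the physical box size follows from L = a N_s and the finite-volume parameter from m_π L, using ħc ≈ 197.327 MeV fm. The sketch below evaluates both for the QCDSF N_f = 2 ensemble quoted above.

# Illustrative sketch: box size and m_pi * L for the quoted QCDSF
# N_f = 2 ensemble (a = 0.075 fm, V = 64^3 x 96, m_pi = 138 MeV).
# Parameter values are taken from the slide; the script is an aside.

HBARC_MEV_FM = 197.327  # hbar*c in MeV*fm

def box_size_fm(a_fm: float, n_s: int) -> float:
    """Spatial box size L = a * N_s in fm."""
    return a_fm * n_s

def mpi_times_L(m_pi_mev: float, L_fm: float) -> float:
    """Dimensionless finite-volume parameter m_pi * L."""
    return m_pi_mev * L_fm / HBARC_MEV_FM

if __name__ == "__main__":
    L = box_size_fm(0.075, 64)                  # ~4.8 fm
    print(f"L      = {L:.2f} fm")
    print(f"m_pi*L = {mpi_times_L(138.0, L):.2f}")  # ~3.4

A commonly quoted rule of thumb is that m_π L ≳ 3-4 keeps finite-volume effects under control, which the quoted near-physical-mass ensemble satisfies.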
Supercomputer Infrastructure

• apeNEXT in Zeuthen (3 Teraflops) and Bielefeld (5 Teraflops) → dedicated to LGT
• NIC: 72-rack BG/P system at FZ Jülich, 1 Petaflops
• Cluster computer with 2208 Nehalem processors: 208 Teraflops
• Altix system at LRZ Munich: 62 Teraflops
• SGI Altix ICE 8200 at HLRN (Berlin, Hannover): 31 Teraflops
• QPACE: 4+4 racks in Jülich and Wuppertal, 1900 PowerXCell 8i nodes, 190 TFlops
• University machines, e.g. University of Mainz: 280 nodes dual quad-core AMD Barcelona + Infiniband, 15 TFlops
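Summing the peak figures quoted above gives a rough sense of scale, about 1.5 Petaflops aggregate peak; only part of this is dedicated to or allocated for lattice QCD. The small sketch below merely tabulates the slide's numbers and adds no new data.

# Aggregate the peak-performance figures quoted on the slide (Teraflops).
# These are the slide's numbers verbatim; actual shares available to
# lattice QCD differ (only apeNEXT is listed as dedicated to LGT).
peak_tflops = {
    "apeNEXT Zeuthen": 3,
    "apeNEXT Bielefeld": 5,
    "BG/P FZ Juelich (NIC)": 1000,
    "Nehalem cluster": 208,
    "Altix LRZ Munich": 62,
    "SGI Altix ICE 8200 HLRN": 31,
    "QPACE Juelich/Wuppertal": 190,
    "Univ. Mainz cluster": 15,
}

total = sum(peak_tflops.values())
print(f"Aggregate peak: {total} Teraflops (~{total / 1000:.1f} Petaflops)")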
ILDG Enthusiasm

Collaboration   ILDG
BMW             open
CLS             open
ETMC            YES
QCDSF           YES
RBG             open