Status of the ATLAS Experiment at the LHC Accelerator at CERN
E. Kneringer
Institut für Astro- und Teilchenphysik, Universität Innsbruck
FAKT-Tagung Langenlois, 24.9.2007
Contents
Introductory comments
Innsbruck group
Main part:
ATLAS collaboration
ATLAS installation schedule
ATLAS cavern
Milestones and commissioning
Subdetector and TDAQ systems
Offline computing, physics analysis on the Grid
Detector paper, CSC notes
Analysis strategies, statistics forum
Physics: QCD
For a smile
ATLAS Collaboration
Albany, Alberta, NIKHEF Amsterdam, Ankara, LAPP Annecy, Argonne NL, Arizona, UT Arlington, Athens, NTU Athens, Baku, IFAE Barcelona, Belgrade, Bergen, Berkeley LBL and UC, HU Berlin, Bern, Birmingham, Bologna, Bonn, Boston, Brandeis, Bratislava/SAS Kosice, Brookhaven NL, Buenos Aires, Bucharest, Cambridge, Carleton, Casablanca/Rabat, CERN, Chinese Cluster, Chicago, Clermont-Ferrand, Columbia, NBI Copenhagen, Cosenza, AGH UST Cracow, IFJ PAN Cracow, DESY, Dortmund, TU Dresden, JINR Dubna, Duke, Frascati, Freiburg, Geneva, Genoa, Giessen, Glasgow, LPSC Grenoble, Technion Haifa, Hampton, Harvard, Heidelberg, Hiroshima, Hiroshima IT, Indiana, Innsbruck, Iowa SU, Irvine UC, Istanbul Bogazici, KEK, Kobe, Kyoto, Kyoto UE, Lancaster, UN La Plata, Lecce, Lisbon LIP, Liverpool, Ljubljana, QMW London, RHBNC London, UC London, Lund, UA Madrid, Mainz, Manchester, Mannheim, CPPM Marseille, Massachusetts, MIT, Melbourne, Michigan, Michigan SU, Milano, Minsk NAS, Minsk NCPHEP, Montreal, McGill Montreal, FIAN Moscow, ITEP Moscow, MEPhI Moscow, MSU Moscow, Munich LMU, MPI Munich, Nagasaki IAS, Nagoya, Naples, New Mexico, New York, Nijmegen, BINP Novosibirsk, Ohio SU, Okayama, Oklahoma, Oklahoma SU, Oregon, LAL Orsay, Osaka, Oslo, Oxford, Paris VI and VII, Pavia, Pennsylvania, Pisa, Pittsburgh, CAS Prague, CU Prague, TU Prague, IHEP Protvino, Regina, Ritsumeikan, UFRJ Rio de Janeiro, Rome I, Rome II, Rome III, Rutherford Appleton Laboratory, DAPNIA Saclay, Santa Cruz UC, Sheffield, Shinshu, Siegen, Simon Fraser Burnaby, SLAC, Southern Methodist Dallas, NPI Petersburg, Stockholm, KTH Stockholm, Stony Brook, Sydney, AS Taipei, Tbilisi, Tel Aviv, Thessaloniki, Tokyo ICEPP, Tokyo MU, Toronto, TRIUMF, Tsukuba, Tufts, Udine, Uppsala, Urbana UI, Valencia, UBC Vancouver, Victoria, Washington, Weizmann Rehovot, FH Wiener Neustadt, Wisconsin, Wuppertal, Yale, Yerevan
35 countries, 164 institutions, 1900 scientific authors in total (1550 with a PhD, for the M&O share)
New application for CB decision: Göttingen, Germany. New Expressions of Interest: Santiago/Valparaíso, Chile; Bogotá, Colombia.
Diameter: 25 m
Barrel toroid length: 26 m
End-cap end-wall chamber span: 46 m
Overall weight: 7000 tons
Length of cables: 3000 km
Number of channels: ~10^8
ATLAS superimposed on the 5 floors of building 40.
Inner Detector (ID) tracker:
Calorimeter
Muon spectrometer
Magnets
The Underground Cavern at Pit-1 for the ATLAS Detector
Length = 55 m Width = 32 m Height = 35 m
Side A (toward the airport), Side C; "building a ship in a bottle"; the Globe
two years later
the cavern is full!
29 May 2007, on schedule: 15 m high, 5 m wide, 240 tons.
ATLAS installation schedule v9.2 (14 Sep 2007)
Schedule annotations: "now"; "close"; "finish installation end of year (small wheels)".
Milestone weeks (typically 2 weeks duration):
from subdetector to permanent storage
increasing number of subdetectors involved at each stage
12 subdetectors in total
+ computing + power and cooling infrastructure + detector control systems + safety systems
data taking (cosmic runs, triggers at 3 Hz for M3)
First week: stable running of systems previously integrated.
Second week: integration of new components into the global system.
For the first time, Tier-0 processing was included as part of the commissioning run and was run during most of it. The Tier-0 system picked up the data files written to CASTOR by the DAQ and ran the offline reconstruction (provided by the Offline Commissioning Group). The complete offline reconstruction chain was used to reconstruct cosmic-ray data from part of the inner detector, the calorimeters and the muon system. Tracks were fitted to the data from the inner detector and muon chambers.
An ATLANTIS event display showing a cosmic track fitted to hits in the ID and muon chambers.
The full monitoring chain, which will be used to check correct performance across the detector, was also used to produce the relevant monitoring histograms for each sub-detector. In a subsequent processing step, monitoring histograms produced by the individual reconstruction jobs were merged to provide longer-term data quality monitoring. The history plot, taken on Monday 18 June 2007, is an example from this Tier-0 monitoring, showing the synchronization among the different sub-systems.
Cosmic-ray tracks recorded in the barrel TRT during the M3 tests.
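The merging of per-job monitoring histograms mentioned above can be illustrated with a minimal sketch. This is hypothetical plain Python, not ATLAS software (the real chain merges ROOT histograms); a "histogram" is reduced here to a list of bin counts:

```python
# Sketch: merge monitoring histograms from several reconstruction jobs.
# Equal binning is assumed; merging just sums the bin contents.
# (Illustrative only; the actual Tier-0 chain uses ROOT-based tools.)

def merge_histograms(histograms):
    """Sum per-bin counts of equally binned histograms."""
    if not histograms:
        return []
    n_bins = len(histograms[0])
    if any(len(h) != n_bins for h in histograms):
        raise ValueError("all histograms must share the same binning")
    merged = [0] * n_bins
    for h in histograms:
        for i, count in enumerate(h):
            merged[i] += count
    return merged

# Example: three jobs each produce a 4-bin occupancy histogram.
job_outputs = [[1, 0, 2, 3], [0, 1, 1, 0], [2, 2, 0, 1]]
total = merge_histograms(job_outputs)
```

The merged histogram then serves the longer-term data-quality view, exactly because individual jobs only see their own slice of the run.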
Milestone runs (Dates | Systems integration | Detector configuration | Operations | Cosmic run):
M1 (11-19/12 2006): DAQ R/O, barrel LAr & Tile; barrel calorimeters; barrel calorimeters, barrel muon; barrel and end-cap calorimeters; barrel muon (5 & 6), EC muon MDT; barrel SCT, TRT; EC muon TGC; barrel & EC calorimeters + muon; barrel TRT, SCT R/O; Level-1 Mu, Calo; converge to ATLAS detector; as for M5; achieve combined run; 2 days; Tile cosmic trigger.
M2 (28/2 to 13/3 2007): DAQ/EB, DAQ v1.7; muon barrel (sector 13); monitoring/DQ; combined runs, mixed runs; 2 weekends; Tile cosmic trigger + RPC cosmic trigger; periodic cosmic runs after M2.
M3 (4/6 to 18/6 2007): barrel SCT, barrel TRT; muon EC (MDT, TGC); offline; 1st week focus on management, coordination between desks; 1 week; Tile + muon cosmic trigger (side A).
M4 (23/8 to 3/9 2007): 2-day setup, 2 weekends; Level-1 Calo, HLT, DAQ 1.8; Offline 13; ATLAS-like operations, use of DQ assessment; 1 week; try also calorimeter trigger.
M5 (16/10 to 23/10 2007): ID EC (TRT), Pixel; ATLAS-like operations; 1 week.
M6, new (November/December): end-cap magnets; ATLAS-like operations, run during magnet test; global cosmic run, magnets on.
weekly e-newsletter
as expected at HLT output
to separate streams, closing files at boundary of luminosity blocks
A complete test of the final chain of data handling, distribution and analysis from last stage of TDAQ to the user’s laptop.
First test with 3.6 million events.
Timescale: (4 phases) … Phase 3: 17-21 September 2007 Phase 4: 22-26 October 2007 (1-SFO) (29/10 - 6/11: N-SFOs)
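The streaming rule above (events written to separate streams, files closed at the boundaries of luminosity blocks) can be sketched in plain Python. `split_into_files` and the event tuples are illustrative inventions, not ATLAS software; real SFO/Tier-0 file handling is far more involved:

```python
# Sketch: route events to per-stream files, starting a new file
# whenever the luminosity block of a stream changes.
from collections import defaultdict

def split_into_files(events):
    """events: iterable of (stream_name, lumi_block, payload).
    Returns {stream: list of files}, each file a list of payloads;
    a stream's current file is closed when its lumi block changes."""
    files = defaultdict(list)   # stream -> list of files
    current_lb = {}             # stream -> lumi block of the open file
    for stream, lb, payload in events:
        if current_lb.get(stream) != lb:
            files[stream].append([])   # close old file, open a new one
            current_lb[stream] = lb
        files[stream][-1].append(payload)
    return dict(files)

events = [("egamma", 1, "e1"), ("egamma", 1, "e2"),
          ("muon",   1, "m1"), ("egamma", 2, "e3")]
out = split_into_files(events)
# "egamma" ends up with two files (lumi blocks 1 and 2), "muon" with one.
```

Closing files at lumi-block boundaries keeps every output file attributable to a single, well-defined slice of integrated luminosity.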
At the beginning of 2007 it became clear that an enhanced level of communication was needed between the ATLAS computing organisation and the Tier-1 centres.
→ Visits to Tier-1 computing centres around the world.
Main recurrent discussion points at the visits:
a general Tier-1 description
hardware and human resources
storage set-up
local and remote networking
databases and the 3D project
installed middleware and Grid services
the regional Tier-2/3 organisation
re-processing and how to organise the recall of data from tape
Issues addressed:
How are particle signatures (e, gamma, muons, taus, jets, ET^miss) going to be validated?
What strategies will be used to validate and understand the object ID (efficiencies, fake rates from early data)?
What physics processes will be used to do the validation?
What are the first measurements that will be performed with data corresponding to 10, 100 and 1000 pb^-1?
What strategies will be used to estimate the backgrounds from data?
How will the trigger efficiencies be evaluated?
Question to the Combined Performance Groups and Physics Groups: How are the different physics groups planning to organize the analysis of the first data, and what strategies (simple cut-based analysis, multivariate techniques, blind analysis) are used to perform the analysis?
Mandate:
Define ATLAS statistics standards in contact with Physics Coordination and ATLAS groups.
Develop, validate and approve standard statistical tools in close contact with the software group.
Represent ATLAS in discussions on statistical issues with CMS.
Discuss and make recommendations on statistical issues such as blind analyses, model-independent searches, etc.
Ongoing activities:
ATLAS Statistics Book
Higgs combination (different channels & ATLAS + CMS)
Software development (RooStats)
Sequential analysis
Abstract: This paper describes the ATLAS experiment as installed in its experimental cavern at point 1 at CERN.
Draft 1: 27 July 2007 (300 pages); still missing: chapter on performance (chapt. 10)
Draft 2: 19 Oct. 2007
Paper out: 1 Dec. 2007
1. Overview, main characteristics
2. Solenoid and toroid magnet system, magnetic field
3. Shielding and radiation levels
4. Inner tracking system
5. Calorimetry
6. Muon spectrometer
7. Hardware aspects of the trigger and data acquisition system
8. Main features of the infrastructure in the ATLAS cavern
9. Overview of global performance expected
10. Current status of installation and commissioning, expectations for the ultimate completion of the detector and its operation in 2007 and 2008
The main goal of CSC is to get ready for data, not to write (80) CSC notes.
The CSC notes are to be produced by the physics groups using data made for the Computing System Commissioning.
CSC requirement: analysis on the Grid with official Monte Carlo samples.
GANGA: utility for job submission on the Grid.
The system has become usable during 2007; for a Tier-3 site like Innsbruck, configuration details are different.
Setting up a Tier site needs a lot of specialists.
We can now routinely process 50k Monte Carlo events through the software chain (simulation, reconstruction); this takes ~1 week on 32 CPUs (Woodcrest Xeon 3 GHz), producing ~300 GB of output.
At the moment we prefer to do things locally.
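The quoted throughput can be cross-checked with a little arithmetic. This sketch only derives per-event numbers from the figures above (50k events, ~1 week on 32 CPUs, ~300 GB of output); it is a back-of-the-envelope check, not a measurement:

```python
# Back-of-the-envelope check of the local Monte Carlo production figures.
events = 50_000
cpus = 32
wall_seconds = 7 * 24 * 3600      # ~1 week of wall-clock time
output_gb = 300.0

# Total CPU time spread over all events, and output size per event.
cpu_seconds_per_event = cpus * wall_seconds / events
mb_per_event = output_gb * 1000 / events

print(f"{cpu_seconds_per_event:.0f} CPU-s/event")  # ~387 CPU-s/event
print(f"{mb_per_event:.1f} MB/event")              # 6.0 MB/event
```

So full simulation plus reconstruction costs a few hundred CPU-seconds and a few MB of output per event, which is why 50k events already occupy a 32-CPU farm for about a week.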
Stabilizing in ~1 year → far too long! Releases incompatible!
σ ~ 50 mb
Track reconstruction efficiency ~80%:
ε = (# tracks found after reconstruction and cuts) / (# generated MC primary charged tracks), shown as a function of η.
Plots by A. Moraes.
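The efficiency defined above is a simple ratio of counts per η bin. A minimal sketch with the usual binomial uncertainty; the bin contents here are hypothetical, not taken from the actual plots:

```python
# Track reconstruction efficiency in one eta bin:
#   eps = N(found after reco + cuts) / N(generated MC primary charged)
# with the binomial error sqrt(eps * (1 - eps) / N_generated).
import math

def efficiency(n_found, n_generated):
    if n_generated <= 0:
        raise ValueError("need at least one generated track")
    eps = n_found / n_generated
    err = math.sqrt(eps * (1.0 - eps) / n_generated)
    return eps, err

# Hypothetical bin: 800 tracks found out of 1000 generated -> eps = 0.8.
eps, err = efficiency(n_found=800, n_generated=1000)
```

Repeating this per η bin reproduces an efficiency-vs-η curve of the kind shown in the plots.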
by P. Francavilla
Impressive progress in the distributed analysis and the physics.
There are clearly also still a few major challenges to be fully ready for LHC.
Informal news on ATLAS is available in the ATLAS eNews letter at http://aenews.cern.ch/