Status of the ATLAS Experiment at the LHC Accelerator at CERN - PowerPoint PPT Presentation



SLIDE 1

Status of the ATLAS Experiment at the LHC Accelerator at CERN

  • E. Kneringer

Institut für Astro- und Teilchenphysik, Universität Innsbruck

FAKT-Tagung Langenlois, 24.9.2007

SLIDE 2

Outline

Introductory comments

Innsbruck group

Hauptteil

  • ATLAS collaboration
  • ATLAS installation schedule
  • ATLAS cavern
  • Milestones and commissioning

Subdetector and TDAQ systems

  • Offline computing, physics analysis on the Grid
  • Detector paper, CSC notes
  • Analysis strategies, statistics forum
  • Physics: QCD

Something to smile about

SLIDE 3

ATLAS Collaboration

Albany, Alberta, NIKHEF Amsterdam, Ankara, LAPP Annecy, Argonne NL, Arizona, UT Arlington, Athens, NTU Athens, Baku, IFAE Barcelona, Belgrade, Bergen, Berkeley LBL and UC, HU Berlin, Bern, Birmingham, Bologna, Bonn, Boston, Brandeis, Bratislava/SAS Kosice, Brookhaven NL, Buenos Aires, Bucharest, Cambridge, Carleton, Casablanca/Rabat, CERN, Chinese Cluster, Chicago, Clermont-Ferrand, Columbia, NBI Copenhagen, Cosenza, AGH UST Cracow, IFJ PAN Cracow, DESY, Dortmund, TU Dresden, JINR Dubna, Duke, Frascati, Freiburg, Geneva, Genoa, Giessen, Glasgow, LPSC Grenoble, Technion Haifa, Hampton, Harvard, Heidelberg, Hiroshima, Hiroshima IT, Indiana, Innsbruck, Iowa SU, Irvine UC, Istanbul Bogazici, KEK, Kobe, Kyoto, Kyoto UE, Lancaster, UN La Plata, Lecce, Lisbon LIP, Liverpool, Ljubljana, QMW London, RHBNC London, UC London, Lund, UA Madrid, Mainz, Manchester, Mannheim, CPPM Marseille, Massachusetts, MIT, Melbourne, Michigan, Michigan SU, Milano, Minsk NAS, Minsk NCPHEP, Montreal, McGill Montreal, FIAN Moscow, ITEP Moscow, MEPhI Moscow, MSU Moscow, Munich LMU, MPI Munich, Nagasaki IAS, Nagoya, Naples, New Mexico, New York, Nijmegen, BINP Novosibirsk, Ohio SU, Okayama, Oklahoma, Oklahoma SU, Oregon, LAL Orsay, Osaka, Oslo, Oxford, Paris VI and VII, Pavia, Pennsylvania, Pisa, Pittsburgh, CAS Prague, CU Prague, TU Prague, IHEP Protvino, Regina, Ritsumeikan, UFRJ Rio de Janeiro, Rome I, Rome II, Rome III, Rutherford Appleton Laboratory, DAPNIA Saclay, Santa Cruz UC, Sheffield, Shinshu, Siegen, Simon Fraser Burnaby, SLAC, Southern Methodist Dallas, NPI Petersburg, Stockholm, KTH Stockholm, Stony Brook, Sydney, AS Taipei, Tbilisi, Tel Aviv, Thessaloniki, Tokyo ICEPP, Tokyo MU, Toronto, TRIUMF, Tsukuba, Tufts, Udine, Uppsala, Urbana UI, Valencia, UBC Vancouver, Victoria, Washington, Weizmann Rehovot, FH Wiener Neustadt, Wisconsin, Wuppertal, Yale, Yerevan

35 countries, 164 institutions, 1900 scientific authors in total (1550 with a PhD, for M&O share)

New application for CB decision: Göttingen, Germany. New Expressions of Interest: Santiago/Valparaíso, Chile; Bogotá, Colombia.

SLIDE 4

The ATLAS detector

Diameter 25 m; barrel toroid length 26 m; end-cap end-wall chamber span 46 m; overall weight 7000 tons; length of cables 3000 km; number of channels ~10^8. (ATLAS superimposed on the 5 floors of building 40.)

Inner Detector (ID) tracker:

  • Si pixel and strip + transition radiation tracker
  • σ(d0) = 15 μm @ 20 GeV
  • σ/pT ≈ 0.05%pT ⊕ 1%

Calorimeter

  • Liquid Ar EM Cal, Tile Hadronic Cal
  • EM: σE/E = 10%/√E ⊕ 0.7%
  • Had: σE/E = 50%/√E ⊕ 3%

Muon spectrometer

  • Drift tubes, cathode strips: precision tracking +
  • RPC, TGC: triggering
  • σ/pT ≈ 2-7%

Magnets

  • Solenoid (ID) → 2T
  • Air core toroids (muon) → up to 4T
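The ⊕ in the resolution formulas above denotes addition in quadrature of the stochastic and constant terms. A minimal sketch of how the quoted resolutions evaluate numerically (the function names are illustrative, not part of any ATLAS software):

```python
import math

def id_pt_resolution(pt):
    """Inner detector: sigma(pT)/pT = 0.05% * pT  (quadrature)  1%, pT in GeV."""
    return math.hypot(0.0005 * pt, 0.01)

def em_resolution(E):
    """EM calorimeter: sigma_E/E = 10%/sqrt(E)  (quadrature)  0.7%, E in GeV."""
    return math.hypot(0.10 / math.sqrt(E), 0.007)

def had_resolution(E):
    """Hadronic calorimeter: sigma_E/E = 50%/sqrt(E)  (quadrature)  3%, E in GeV."""
    return math.hypot(0.50 / math.sqrt(E), 0.03)

# A 100 GeV electromagnetic shower is measured to ~1.2%,
# a 100 GeV hadronic shower to ~5.8%.
print(f"{em_resolution(100.0):.4f}")   # 0.0122
print(f"{had_resolution(100.0):.4f}")  # 0.0583
```

Note how the constant term dominates at high energy while the stochastic term dominates at low energy.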
SLIDE 5

The Underground Cavern at Pit-1 for the ATLAS Detector

Length = 55 m Width = 32 m Height = 35 m

Side A (“Airport” side), side C; “building a ship in a bottle”; the Globe

SLIDE 6

ATLAS cavern (Sept. 26, 2005)

Barrel Toroid

SLIDE 7

ATLAS cavern (Sept. 23, 2007)

Muon Big Wheel

two years later

the cavern is full!

SLIDE 8

End Cap Toroid A

29 May 2007, on schedule 15 m high, 5 m wide, 240 tons weight

SLIDE 9

ATLAS installation schedule v9.2 (14 Sep 2007)

Schedule chart annotations: “now”; “close”; “finish installation end of year (small wheels)”

SLIDE 10

Milestones

Milestone weeks (typically 2 weeks duration)

  • operating the experiment as a whole, from subdetector to permanent storage
  • increasing number of subdetectors involved at each stage (12 subdetectors in total)
  • plus computing, power and cooling infrastructure, detector control systems, safety systems
  • data taking (cosmic runs, triggers at 3 Hz for M3)

First week: stable running of systems previously integrated

Second week: integration of new components into the global system

SLIDE 11

ATLAS detector commissioning

SLIDE 12

Offline Monitoring and Reconstruction

For the first time, Tier-0 processing was included as part of the commissioning run and was run during most of the M3 data-taking period. The Tier-0 infrastructure picked up the data files written to CASTOR by the DAQ and ran the offline reconstruction (provided by the Offline Commissioning Group). The complete offline reconstruction chain was used to reconstruct cosmic-ray data from part of the inner detector, the calorimeters and the muon system. Tracks were fitted to the data from the inner detector and muon chambers.

An ATLANTIS event display showing a cosmic track fitted to hits in the ID and muon chambers.

The full monitoring chain, which will be used to check correct performance across the detector, was also used to produce the relevant monitoring histograms for each sub-detector. In a subsequent processing step, monitoring histograms produced by the individual reconstruction jobs were merged to provide longer-term data quality monitoring. The history plot, taken on Monday, June 18, 2007, is an example from this Tier-0 monitoring. Monitoring tools were also used to check synchronization among the different sub-systems.

Cosmic-ray tracks recorded in the barrel TRT during the M3 tests.
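The merging step described above can be sketched as follows (the data structures are simplified stand-ins for the actual monitoring histograms, which in practice are ROOT objects):

```python
# Sketch: merge per-job monitoring histograms by summing bin counts.
# Each reconstruction job produces a dict mapping histogram name -> bin counts.
from collections import defaultdict

def merge_histograms(job_outputs):
    """job_outputs: list of dicts {histogram name: list of bin counts}.
    Returns one dict with bin-by-bin sums across all jobs."""
    merged = defaultdict(lambda: None)
    for job in job_outputs:
        for name, bins in job.items():
            if merged[name] is None:
                merged[name] = list(bins)
            else:
                merged[name] = [a + b for a, b in zip(merged[name], bins)]
    return dict(merged)

job1 = {"trt_hits": [1, 2, 3], "mdt_hits": [0, 5]}
job2 = {"trt_hits": [4, 0, 1]}
print(merge_histograms([job1, job2]))  # {'trt_hits': [5, 2, 4], 'mdt_hits': [0, 5]}
```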

SLIDE 13

Milestones and integration schedule

Dates | Systems integration | Detector configuration | Operations | Cosmic run

M1

11-19/12 2006 DAQ R/O Barrel LAr & Tile Barrel calorimeters Barrel calorimeters Barrel Muon Barrel and End Cap calorimeters Barrel muon (5&6) EC muon MDT Barrel SCT, TRT EC muon TGC Barrel & EC calorimeters + muon Barrel TRT SCT R/O Level-1 Mu, Calo Converge to ATLAS detector As for M5 Achieve combined run 2 days Tile cosmic trigger

M2

28/2 to 13/3 2007 DAQ/EB DAQ v1.7 Muon barrel (sector 13) Monitoring/DQ Combined runs Mixed runs 2 weekends Tile cosmic trigger + RPC cosmic trigger Periodic cosmic runs after M2

M3

4/6 to 18/6 2007 Barrel SCT Barrel TRT Muon EC (MDT, TGC) Offline 1st week focus on operations, checklist management, coordination between desks 1 week Tile + Muon cosmic trigger (side A)

M4

23/8 to 3/9 2007 2-day setup, 2 weekends Level-1 Calo HLT DAQ 1.8 Offline 13 ATLAS-like operations Use of DQ assessment 1 week Try also calorimeter trigger

M5

16/10 to 23/10 2007 ID EC (TRT) Pixel ATLAS-like operations 1 week

M6

November/December End Cap magnets ATLAS-like Operations Run during magnet test Global cosmic run Magnets on

new

SLIDE 14

ATLAS M4 effort featured in ISG newsletter

weekly e-newsletter

SLIDE 15

Final Dress Rehearsal

  • Simulate 1 complete LHC fill (~10 hours of data taking) → ~7·10^6 events
  • Mix and filter events at MC generator level to get the correct physics mixture as expected at HLT output
  • Pass events through G4 simulation (realistic “as installed” detector geometry)
  • Produce byte streams → emulate raw data format
  • Send “raw data” to Point 1, inject at Sub-Farm Output (SFO), write out events to separate streams, closing files at boundaries of luminosity blocks

  • Send events from Point 1 to Tier-0; imitate final file structure and movement
  • Perform calibration and alignment at Tier-0/Tier-1s/Tier-2s
  • Run reconstruction at Tier-0/Tier-1s → produce ESD, AOD, TAGs
  • Distribute ESD, AOD, TAGs to Tier-1s and Tier-2s
  • Perform distributed analysis

A complete test of the final chain of data handling, distribution and analysis from last stage of TDAQ to the user’s laptop.

First test with 3.6 million events.

Timescale (4 phases): … Phase 3: 17-21 September 2007; Phase 4: 22-26 October 2007 (1 SFO); 29/10 - 6/11: N SFOs
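The SFO step, writing events to separate streams and closing output files at luminosity-block boundaries, can be sketched as follows (class and field names are illustrative, not the actual TDAQ software):

```python
# Sketch: per-stream output files are closed and reopened whenever an event
# arrives with a new luminosity block (LB), so each file spans exactly one LB.
from collections import namedtuple

Event = namedtuple("Event", ["stream", "lumi_block", "payload"])

class StreamWriter:
    def __init__(self):
        self.open_files = {}    # stream -> (lumi_block, list of payloads)
        self.closed_files = []  # finished (stream, lumi_block, payloads) tuples

    def write(self, event):
        current = self.open_files.get(event.stream)
        if current is not None and current[0] != event.lumi_block:
            # LB boundary crossed: close the current file for this stream
            self.closed_files.append((event.stream, current[0], current[1]))
            current = None
        if current is None:
            current = (event.lumi_block, [])
            self.open_files[event.stream] = current
        current[1].append(event.payload)

    def close_all(self):
        for stream, (lb, payloads) in self.open_files.items():
            self.closed_files.append((stream, lb, payloads))
        self.open_files = {}

w = StreamWriter()
for ev in [Event("muon", 1, "e1"), Event("muon", 1, "e2"), Event("muon", 2, "e3")]:
    w.write(ev)
w.close_all()
print(len(w.closed_files))  # 2 files: one for LB 1 (two events), one for LB 2
```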

SLIDE 16

Computing Coordination

Main recurrent discussion points at the visits are:

  • a general Tier-1 description
  • hardware and human resources
  • storage set-up
  • local and remote networking
  • databases and the 3D project
  • installed middleware and Grid services
  • the regional Tier-2/3 organisation
  • re-processing and how to organise the recall of data from tape

At the beginning of 2007 it became clear that an enhanced level of communication was needed between the ATLAS computing organisation and the Tier-1 centres.

→ Visits to Tier-1 Computing Centres around the world

SLIDE 17

Analysis strategies

Issues addressed:

  • How are particle signatures (e, gamma, muons, taus, jets, ET,miss) going to be validated?
  • What strategies will be used to validate and understand the object ID (efficiencies, fake rates from early data)?
  • What physics processes will be used to do the validation?
  • What are the first measurements that will be performed with data corresponding to 10, 100 and 1000 pb⁻¹?
  • What strategies will be used to estimate the backgrounds from data?
  • How will the trigger efficiencies be evaluated?

Question: How are the different physics groups planning to organize the analysis of the first data, and what strategies (simple cut-based analysis, multivariate techniques, blind analysis) are used to perform the analysis?

Combined Performance Groups, Physics Groups

SLIDE 18

ATLAS statistics forum (since March 2007)

Mandate: Define ATLAS Statistics Standards in contact with Physics Coordination and ATLAS groups. Develop, validate and approve standard statistical tools in close contact with the software group. Represent ATLAS in discussions on statistical issues with CMS. Discuss and make recommendations on statistical issues such as blind analyses, model-independent searches, etc.

Ongoing activities: ATLAS Statistics Book; Higgs combination (different channels & ATLAS + CMS); software development (RooStat); sequential analysis

SLIDE 19

ATLAS Detector Paper

Abstract: This paper describes the ATLAS experiment as installed in its experimental cavern at point 1 at CERN.

Draft 1: 27 July 2007 (300 pages); missing: chapter on performance, chapt. 10

Draft 2: 19 Oct. 2007

Paper out: 1 Dec. 2007

Outline:

1. Overview, main characteristics
2. Solenoid and toroid magnet system, magnetic field
3. Shielding and radiation levels
4. Inner tracking system
5. Calorimetry
6. Muon spectrometer
7. Hardware aspects of the trigger and data acquisition system
8. Main features of the infrastructure in the ATLAS cavern
9. Overview of global performance expected
10. Current status of installation and commissioning, expectations for the ultimate completion of the detector and its operation in 2007 and 2008

SLIDE 20

CSC notes

The main goal of CSC is to get ready for data, not to write (80) CSC notes.

The CSC notes are to be produced by the physics groups using data made for the Computing System Commissioning.

SLIDE 21

Physics analysis on the Grid

CSC requirement:

analysis on the grid with official Monte Carlo samples

GANGA: utility for job submission on the grid

Observations:

  • the system has become usable during 2007; for a Tier-3 site like Innsbruck, configuration details are different
  • setting up a Tier site needs a lot of specialists
  • we can now routinely process 50 k Monte Carlo events through the software chain (simulation, reconstruction)
  • this takes ~1 week on 32 CPUs (Woodcrest Xeon 3 GHz), producing ~300 GB of output
  • at the moment we prefer to do things locally
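As a quick sanity check, the quoted figures imply the following per-CPU and per-event rates (simple arithmetic on the numbers above):

```python
# Back-of-envelope check of the Tier-3 throughput figures quoted above
# (full simulation + reconstruction chain).
events    = 50_000   # Monte Carlo events per pass
cpus      = 32       # Woodcrest Xeon 3 GHz cores
days      = 7        # ~1 week wall time
output_gb = 300      # total output

cpu_hours = cpus * days * 24
print(round(events / cpu_hours, 1))         # ~9.3 events per CPU-hour
print(round(cpu_hours * 60 / events, 1))    # ~6.5 CPU-minutes per event
print(round(output_gb * 1000 / events, 1))  # ~6.0 MB per event
```

So full simulation plus reconstruction costs several CPU-minutes per event, which is why running locally on a small farm was still practical for samples of this size.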

SLIDE 22

Software releases (release 12)

stabilizing in ~1 year → far too long! Releases are incompatible!

SLIDE 23

QCD – first measurements (1)

  • 1. Minimum bias events (σ ~ 50 mb):
  • efficiency ε (≈ 80%), fake rate
  • measure dN/dη, dN/dpT

ε = (# MC primary tracks found after reconstruction and cuts) / (# generated MC primary charged tracks), shown as a function of η

plots by A. Moraes
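The efficiency definition above can be illustrated with a toy calculation (hypothetical numbers, not the actual analysis code):

```python
# Sketch: tracking efficiency as the fraction of generated primary charged
# tracks that are matched to a reconstructed track passing the cuts.
def efficiency(generated, reconstructed_matched):
    """generated: list of generated primary charged track ids;
    reconstructed_matched: set of generated ids found after reco + cuts."""
    found = sum(1 for t in generated if t in reconstructed_matched)
    return found / len(generated)

gen = list(range(1000))    # 1000 generated primary charged tracks
matched = set(range(800))  # 800 of them found after reconstruction and cuts
print(efficiency(gen, matched))  # 0.8, i.e. the ~80% quoted on the slide
```

In practice the same ratio is computed in bins of η and pT, which is what the quoted plots show.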

SLIDE 24

QCD – first measurements (2)

  • 2. Preliminary studies on the inclusive jet cross section

by P. Francavilla

SLIDE 25

ATLAS overview week – statement

Impressive progress on the installation and the commissioning of the detector, as well as on the preparation for the data collection, the distributed analysis and the physics.

There are clearly also still a few major challenges to overcome before we will be fully ready for LHC.

SLIDE 26

Selected Recent Problems

SLIDE 27

SLIDE 28


Informal news on ATLAS is available in the ATLAS eNews letter at http://aenews.cern.ch/

SLIDE 29

Bet on the Higgs!