MONARC PEP - Istituto Nazionale di Fisica Nucleare


1
ATLAS short term grid use-cases
The “production” activities foreseen till
mid-2001 and the tools to be used
L. Perini
DATAGRID WP8 Use-cases
19 Dec 2000
2
Status of ATLAS computing
• Phase of very active s/w development (ATHENA OO
framework, LHCb similarities)
– Physics TDR completed > 1 year ago (old s/w)
– HLT TDR postponed to around end 2001 (pending new s/w chain
availability)
• No global ATLAS production is going on now, but specific
detector, trigger and physics studies are active and willing
to exploit and test the GRID
– TDR and Tile Testbeam Objy federations
– B-Physics in Lund
– Muon barrel trigger (INFN responsibility)
4
Remote use of DB (Objy)
• Master federation soon available at CERN with TDR runs
– First Scenario:
• A user request for a particular run triggers a check for its presence in the
local federation. If it is not present, a grid-enabled transfer of the
corresponding database from the CERN master federation is initiated. As
in GDMP, the database must not only be transferred, but also
appropriately attached to the local Objectivity federation.
– Extension scenario:
• The particular run may not yet be available in the master federation at
CERN, either. A request for such a run might trigger a job at CERN that
first imports the Zebra version of the data into the master Objectivity
federation, before proceeding as in the scenario above.
5
Remote use of DB (Objy)
• In May, testbeam data for the Tile Calorimeter will be
available
– Remote subscriptions to data generated at CERN
• A remote institution wishes to "subscribe" to the testbeam data, so that
any new runs added to the master database at CERN are automatically
replicated in a federation at a remote site that has subscribed to these data
• Related and similar DB use-cases may be generated very
naturally on these time-scales; however, no resources are
yet committed in ATLAS for this kind of effort
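The subscription scenario amounts to a publish/subscribe replication loop. A minimal sketch, with purely illustrative class names (the real mechanism would be GDMP-style subscriptions, not these objects):

```python
# Toy publish/subscribe model of the Tile testbeam scenario: remote
# sites subscribe to the CERN master, and every run added there is
# automatically replicated to all subscribers.

class MasterFederation:
    def __init__(self):
        self.runs = []
        self.subscribers = []      # remote federations

    def subscribe(self, site):
        self.subscribers.append(site)

    def add_run(self, run_id):
        self.runs.append(run_id)
        for site in self.subscribers:   # automatic replication
            site.replicate(run_id)

class RemoteSite:
    def __init__(self, name):
        self.name = name
        self.runs = []

    def replicate(self, run_id):
        # Stand-in for the grid transfer + attach of the new run.
        self.runs.append(run_id)
```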
6
Use-case: B-physics study
• People involved: Lund University ATLAS group (Chafik Driouichi,
Paula Eerola, Christina Zacharatou Jarlskog, Oxana Smirnova)
• Process: BsJ/  , followed by J/ +– and 
• Main background: inclusive bb J/ X
• Needs sufficient CPU and storage space; so far CERN and Lund
computers are used
• Physics goal: estimate the χ angle of the unitarity triangle
• Main goal of this use-case:
– identify weak points in a Grid-like environment by direct experience and by
comparing performance with modelling using the MONARC tools
7
Generation
• Event generator: Atgen-B
• Signal:
– one job generates 200 000 parton-level events, which yields
~400 Bs → J/ψ φ events after Level-1 trigger cuts
– job parameters: ~10⁴ s CPU on a machine equivalent to 10
SpecINT95, 8 MB memory (max.), 10 MB swap (max.), 13 MB
ZEBRA output
– to simulate one day of LHC running at low luminosity:
~1.4·10³ Bs → J/ψ φ events ( Br(Bs → J/ψ φ) = 9·10⁻⁴ )
• Background:
– estimated nr. of events per day: ~4·10⁵ bb̄ → J/ψ X
• Both signal and background to be stored (on tapes)
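The generation figures above imply a modest daily workload; a worked check, using only the numbers stated on this slide:

```python
# Worked check of the generation numbers (all figures from the slide).

events_per_job = 200_000          # parton-level events per job
signal_per_job = 400              # Bs -> J/psi phi passing Level-1 cuts
signal_per_lhc_day = 1.4e3        # signal events for one LHC day, low lumi
cpu_per_job_s = 1e4               # seconds on a 10 SpecINT95 machine

jobs_per_day = signal_per_lhc_day / signal_per_job   # 3.5 jobs
cpu_per_day_s = jobs_per_day * cpu_per_job_s         # 3.5e4 s of CPU
```

So one simulated day of LHC signal costs roughly three and a half generation jobs, i.e. about 3.5·10⁴ s on the reference machine.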
8
Simulation
• Detector simulation: Geant3 – Dice
• Full detector simulation takes ~15 min of CPU per event on a
machine equivalent to 10 SpecINT95
• Only Inner Detector and Electromagnetic Calorimeter to
be simulated
• Majority of CPU time is spent on simulating the
calorimeters
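Combining the ~15 min/event figure here with the ~1.4·10³ signal events per simulated LHC day from the generation slide gives a back-of-envelope simulation cost (my combination, not stated on the slides):

```python
# Rough cost of fully simulating one LHC day of signal events,
# combining figures from the simulation and generation slides.

cpu_per_event_s = 15 * 60         # ~15 min CPU/event at 10 SpecINT95
signal_per_lhc_day = 1.4e3        # from the generation slide

total_cpu_s = signal_per_lhc_day * cpu_per_event_s   # 1.26e6 s
single_cpu_days = total_cpu_s / 86_400               # ~14.6 days on one CPU
```

Simulation thus dominates generation by two orders of magnitude, which is why distributing it over grid sites is attractive.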
9
Reconstruction and analysis
• Reconstruction:
– within the Athena framework (at the moment, AtRecon is used)
– information from two sub-systems to be stored in the Combined
Ntuple (CBNT)
• Analysis:
– estimation of the efficiencies of J/ψ, φ and Bs reconstruction
– acceptance, resolution, tagging, errors on sin(2χ) etc.
10
Workplan
• Define and understand the available Grid-configuration
and environment (requires interaction with all
counterparts)
– Lund, CERN, NBI, Milan ...
• Generation and simulation (signal and background)
– test and compare different job submission configurations
– compare with the modeling
• Reconstruction, analysis
11
Use-case: Barrel Muon Trigger Study
• Main goals:
– finalize the level-1 trigger logic in the barrel;
– optimize the level-2 algorithms in the barrel region and study
their possible extension to |η| > 1;
– evaluate the efficiencies of the different trigger levels (also
combining muon and calorimetric triggers) for single muons
and for relevant physics channels;
– estimate the different background contributions to the trigger
rate at various nominal thresholds;
– study the effects of different layouts on system performances;
– prototype (components of) distributed farms;
– evaluate distributed computing models.
12
Tasks
• Simulation of single muons for system optimization:
– ~10⁸ events, ~3·10⁹ SpecINT95·s, ~500 GB disk space.
• Generation and simulation of relevant physics channels
with muons in the final state for different studies; wait for
G4? (DataGrid release 1)
– B0d → J/ψ K0s, B0d → π+π− and B0s → J/ψ φ for CP violation;
– B0s → D−s π+ for B0s mixing;
– H → 4ℓ and pp → Z → μ+μ− for alignment, calibration, overall
performances;
– B+ → J/ψ(μμ)K+ and B0d → J/ψ(μμ)K*0 for tagging control;
– ~10⁶ events, ~10¹⁰ SpecINT95·s, ~1.5 TB disk space.
13
Tasks (2)
• Simulation of background:
– simulate particle fluxes in the cavern;
– ~10⁵ events, ~10¹⁰ SpecINT95·s, ~1 TB disk space.
• Production of ”complete” events, wait for GEANT4?
– physics and background events are merged at hit and digit level;
– ~10⁶ events, ~5·10⁹ SpecINT95·s.
• Reconstruction, continuing till TDR
– simulate level-1 trigger data processing;
– apply level-2 reconstruction and selection algorithms;
– ~10⁸ events.
• Analysis, continuing till TDR
– study performances: efficiency, resolution, rates...
14
Tools
• Detector simulation:
– GEANT3-based ATLAS simulation program (DICE) exists and
works; GEANT4 coming by end 2001.
• Background simulation:
– FLUKA + GEANT3: particle fluxes integrated over the detectors'
characteristic times.
• Reconstruction:
– trigger simulation programs running in conventional (ATrig) and
OO (Athena) framework.
• Analysis:
– PAW, ROOT(?).
15
Workplan
• Implement a Globus based distributed GRID architecture
and perform increasingly complex operations:
– submit event generation and simulation locally and remotely;
– store events locally and remotely;
– access remote data (e.g., background events, stored centrally);
– (partially) duplicate event databases;
– schedule job submission;
– allocate resources;
– monitor job execution;
– optimize performances;
– …
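The operations listed above can be exercised stepwise against a toy model before any real Globus deployment. A purely illustrative sketch (a real setup would drive the Globus job-submission and data-transfer services instead of these objects):

```python
# Toy model of the workplan's increasingly complex operations:
# submit jobs at local/remote sites, store output, monitor execution.
# All names are illustrative, not Globus APIs.

class Site:
    def __init__(self, name):
        self.name = name
        self.jobs = {}              # job id -> status
        self.storage = []           # stored event files

    def submit(self, job_id):
        # Stand-in for a Globus-mediated job submission.
        self.jobs[job_id] = "running"
        return job_id

    def store(self, filename):
        # Stand-in for local/remote event storage.
        self.storage.append(filename)

    def finish(self, job_id):
        self.jobs[job_id] = "done"

def monitor(sites):
    """Collect job status across all participating sites."""
    return {s.name: dict(s.jobs) for s in sites}
```

The monitoring step is the natural place to compare observed behaviour with the MONARC modelling mentioned earlier.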
16
Sites and Resources
• Sites involved in the different activities:
– simulation of physics events: Rome1, Rome2, Naples, Milan,
CNAF, CERN;
– background simulation: Rome1;
– reconstruction and analysis: Rome1, Rome2, Naples, Milan;
– HLT prototyping: Rome1, Pavia, Rome3 (Release 1)
• Available resources:
– Linux farms (a total of ~50 Pentium III 800 MHz processors);
– >1 TB disk store (>500 GB on disk servers).
• Manpower:
– ~5 FTE (physicists and computing experts).
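A back-of-envelope check of whether this farm can absorb the ~2.85·10¹⁰ SpecINT95·s of simulation work from slides 12 and 13. The ~30 SpecINT95 rating assumed for one Pentium III 800 MHz is my illustrative assumption, not stated in the slides:

```python
# How long the muon-trigger simulation workload would occupy the farm.
# specint95_per_cpu is an ASSUMED rating for a PIII 800 MHz, for
# illustration only; total_work is the sum from slides 12-13.

processors = 50
specint95_per_cpu = 30            # assumption, not from the slides
total_work = 2.85e10              # SpecINT95*s

farm_power = processors * specint95_per_cpu   # 1500 SpecINT95
wall_seconds = total_work / farm_power
wall_days = wall_seconds / 86_400             # order of 200 days
```

Even with generous per-CPU ratings the workload runs to months of farm time, which motivates spreading it across the listed sites.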