Report on the LCG Applications Area
Torre Wenaus, BNL/CERN
LCG Applications Area Manager
http://cern.ch/lcg/peb/applications
LHCC Meeting
May 22, 2003
Outline
Organization and overview
Implementing the Architecture Blueprint
Personnel
Schedule and planning
Brief project status
POOL, SEAL, PI, Simulation, SPI
External participation, collaboration
Concluding remarks
Applications Area Organisation
[Organisation chart: the Applications Area Manager, advised by the Architects Forum (strategy, decisions) and by the applications area meeting (consultation), directs the SPI, POOL, SEAL, PI and Simulation projects.]
Management and Communication
Architects Forum
Attendees: architects, project leaders, EP/SFT leader
Good atmosphere, effective, agreement generally comes easily.
No problems so far.
Minutes public after internal circulation
Meetings ~1-2/month
Applications area meeting
25-50 attendees local and remote
Project status, release news, activities of interest (internal and
external), software usage and feedback
Meetings 2-3/month
Many meetings at project and work package level
We promote having them in the afternoon with a phone
connection
Focus on Experiment Need
Project structured and managed to ensure a focus on real experiment
needs
SC2/RTAG process to identify, define (need-driven
requirements), initiate and monitor common project activities in a
way guided by the experiments themselves
Architects Forum to involve experiment architects in day to day
project management and execution
Open information flow and decision making
Direct participation of experiment developers in the projects
Tight iterative feedback loop to gather user feedback from
frequent releases
Early deployment and evaluation of LCG software in
experiment contexts
Success defined by experiment adoption and production
deployment
Substantive evaluation and feedback still to come, as is (of
course) adoption and production deployment
Applications Area Projects
Software Process and Infrastructure (SPI)
(operating – A.Aimar)
Librarian, QA, testing, developer tools, documentation, training, …
Persistency Framework (POOL)
(operating – D.Duellmann)
POOL hybrid ROOT/relational data store
Core Tools and Services (SEAL)
(operating – P.Mato)
Foundation and utility libraries, basic framework services, object
dictionary and whiteboard, math libraries, (grid enabled services)
Physicist Interface (PI)
(operating – V.Innocente)
Interfaces and tools by which physicists directly use the software.
Interactive analysis, visualization, (distributed analysis & grid portals)
Simulation
(launched – T.Wenaus et al)
Generic framework, Geant4, FLUKA integration, generator services …
The set of projects is complete unless/until a
distributed analysis project is opened
Project Relationships
[Diagram: within the LCG Applications Area, the POOL, PI, Simulation and other projects sit on top of Core Libraries & Services (SEAL) and Software Process & Infrastructure (SPI), interfacing with the LHC experiments and with LCG projects in other areas.]
Implementing the Architecture Blueprint
Use what exists: almost all work leverages existing software
ROOT, Gaudi/Athena components, Iguana components, CLHEP,
AIDA, HepUtilities, SCRAM, Oval, NICOS, Savannah, Boost,
MySQL, GSL, Minuit, gcc-xml, RLS, …
Component-ware: followed, and working; e.g. rapidity of integration
of SEAL components into POOL
Object dictionary: In place and meeting POOL needs; application
now expanding to interactivity
Component bus: both Python environment and its integration with
ROOT/CINT progressing well
Object whiteboard: Still to come
Distributed operation: essentially no activity, still not in scope
ROOT: The ‘user/provider’ relation is working: good ROOT/POOL
cooperation – POOL gets needed mods, ROOT gets
debugging/development input
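The component-ware and component-bus points above can be made concrete with a small sketch: components register themselves with a plugin manager by name and are instantiated on demand, so POOL can pick up SEAL components without compile-time coupling. This is an illustrative toy, not the SEAL API; the names `PluginManager` and `XmlFileCatalog` are invented for the example.

```python
class PluginManager:
    """Toy registry mapping component names to factory callables."""
    def __init__(self):
        self._factories = {}

    def register(self, name, factory):
        self._factories[name] = factory

    def create(self, name, *args, **kwargs):
        # Instantiate a component by name, with no static dependency on it
        if name not in self._factories:
            raise KeyError(f"no component registered under {name!r}")
        return self._factories[name](*args, **kwargs)

class XmlFileCatalog:
    """Stand-in component; a real catalog would resolve logical file names."""
    def lookup(self, lfn):
        return f"pfn-for-{lfn}"

mgr = PluginManager()
mgr.register("FileCatalog/XML", XmlFileCatalog)
catalog = mgr.create("FileCatalog/XML")
print(catalog.lookup("sample-001"))
```

The same registry pattern is what makes the "rapidity of integration of SEAL components into POOL" possible: clients depend only on names and interfaces, not concrete classes.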
Domain Decomposition
[Domain decomposition diagram. Domains: Event Generation, Detector Simulation, Reconstruction, Analysis, Persistency, Core Services, Grid Services, all built on Foundation and Utility Libraries. Components shown include EvtGen, engine, modeler, geometry, algorithms, fitter, event model, calibration, interactive services, GUI, scripting, NTuple, FileCatalog, StoreMgr, Dictionary, Whiteboard, Scheduler, PluginMgr, Monitor. Underlying products shown: ROOT, GEANT4, FLUKA, MySQL, DataGrid, Python, Qt, ...]
Products mentioned are examples, not a comprehensive list
Project activity in all expected areas except grid services
Applications Area Personnel Status
Personnel spreadsheets of apps area, EP/SFT and LCG merged into a
single LCG-managed spreadsheet
Consistent and current information on contributions, activities
LCG apps area hires essentially complete
22 working; target in Sep 2001 LCG proposal was 23
Contributions from UK, Spain, Switzerland, Germany, Sweden,
Israel, Portugal, US, India, and Russia
Similar contribution level from CERN (IT and EP/SFT people without
experiment affiliation)
Gathering of people in EP/SFT under John Harvey is working
very well
Similar again from experiments (including EP/SFT with expt
affiliation)
http://lcgapp.cern.ch/project/mgmt/AppManpower.xls
[superseded this week by merged LCG spreadsheet]
Personnel

                                                People   FTEs
Total LCG hires                                   22     21.4
  Working directly for apps area projects         17     16.6
  ROOT                                             2      2.0
  Grid integration work with experiments           3      2.8
Apps area project contributions from:
  IT                                               4      3.3
  EP/SFT not experiment specific                  18     16.6
  EP/SFT experiment specific                       8      5.0
  Experiments outside EP/SFT                      30     13.2
Total directly working on apps area projects      77     54.6
Overall applications area total                   82     59.4
Personnel Sources (FTEs)
[Pie chart: LCG 21, CERN 19.9, Experiment 18.2]
Current Personnel Distribution
[Chart: FTE distribution across POOL, SPI, SEAL, PI, Simu, Mgmt, ROOT and Grid]
FTEs by Source and Activity
[Table: FTEs on LCG apps area project activities, broken down by source (LCG funded personnel, IT personnel, EP/SFT non-expt specific, Experiments, Total) and by activity (POOL, SPI, SEAL, PI, Simu, Mgmt, ROOT, Grid).]
Comparing reality with the Blueprint resource plan
[Table: current FTEs by source (LCG funded personnel, IT personnel, EP/SFT non-expt specific, Experiments, Total) and by activity (POOL, SPI, SEAL, PI, Simu, Mgmt, ROOT, Grid).]
Comparing the plan from the Blueprint RTAG:

                                         POOL   SPI   SEAL   PI        Simu
Blueprint plan – present scope            10     7    7.5     4         15
Blueprint plan – full expected scope      11     7    9.5    12         17
i.e. adding                             condDB   --   grid   grid, vis  DD

i.e. 13 more FTEs for the full expected scope
Personnel Resources – Required and Available
[Chart: estimate of required effort in FTEs (0–60) per quarter ending, Sep-02 through Mar-05, stacked by POOL, SEAL, Simulation, Generator services, Physicist interface, Math libraries and SPI; blue = available and pledged effort. Completing the profile depends on the experiment ramp-up.]
Future estimate based on 20 LCG, 16 CERN, 23 experiments
i.e. present LCG+CERN, and reaching 10 ATLAS, 10 CMS, 3 LHCb
In addition, ALICE contributes 4.5 FTEs via ROOT
Messages in this data
The apps area is getting the mandated job done with the manpower
levels estimated as required by the blueprint RTAG
Manpower is being used efficiently, in the mandated places
In every project, the manpower external to the experiments (LCG, IT,
some EP/SFT) exceeds or equals that contributed by the experiments
Common resources used in common are making these projects
possible
Experiment participation in the projects exceeds 13 FTEs
The experiments are directly invested and participating: our
projects are their projects, and often use their software
Another ~13 FTEs needed to complete the anticipated scope
With much of it (~8 FTEs) grid-related
Contributions to experiment effort
In the 9/2001 LCG proposal, 6 FTEs to the experiments for
integration of experiment applications with the grid
Present status: ~5 FTEs identified (~3 working, ~2 to be hired)
Two full-time LCG people in ATLAS and ALICE (2 FTE)
One GDB/ALICE (mostly ALICE) LCG person (~0.8 FTE)
EP/SFT person half-time in CMS (0.5 FTE)
Made possible by LCG people filling the hole
Two LCG hires in priority list (INFN, Germany) for LHCb, CMS (2 FTE)
Will assign people from the projects to specific experiments to help
with take-up of LCG software
A fraction, not all of their time
The person may be an experiment person in some cases!
This is in the works now in POOL
+/- 90 Day Milestones
L1 Milestones (1)
[Milestone chart]
On target, except that serious evaluation of releases by the experiments has only recently started, and it will take work to make the software ‘production-capable’
Voided by continuing absence from scope
L1 Milestones (2)
[Milestone chart]
Voided by continuing absence from scope
Distributed Analysis Status
14 months after launch, there is (almost) no ‘G’ in the LCG applications area
This is disruptive to the project and contrary to expectations at launch
(estimate then: 2002Q4/2003Q1)
50% of our L1 LHCC milestones are out the window
Personnel idling, thanks to the SC2
Have to deflect people interested in collaborating
POOL needs an understanding of its metadata responsibilities; will come
when this gets attention
The LCG has no involvement in grid-enabled/grid-enabling applications.
The present “starting to develop contingency plans” situation with
middleware highlights the fact that the LCG should be involved in
applications – the level at which real-world problems are exposed and
workaround needs are identified and can then be acted on
The LCG cannot otherwise meet its responsibility to ensure that LHC
computing capability is in place
POOL Schedule Tracking
MS Project Integration – POOL Milestones
Apps area planning materials
Planning page linked from applications area page
Project plans for the various projects
WBS, schedule (milestones & deliverables)
Personnel spreadsheet
Applications area plan document: overall project plan
Incomplete draft
Applications area plan spreadsheet: overall project plan
Merged into overall LCG project spreadsheet
High level schedule, personnel resource requirements
Risk analysis (new)
http://lcgapp.cern.ch/project/mgmt/
Applications Area Risk Analysis
The only current Level 4 risk: licensing
We must have a GPL-type license for applications area
software
A position I have stated, and with which all four experiments have
decisively agreed in strong statements
We rely on GPL software such as GSL and MySQL
Commercial licensing for some of these, such as MySQL, is not a
defensible option in our community
Trends of recent years clearly indicate we should not build
barriers to using GPL software
It is an issue because an agreement with PPARC exists
that says the LCG will use essentially the EDG license,
which is non-GPL
Should be close to being settled (positively)
Persistency Framework (POOL) Project
Dirk Duellmann
To deliver the physics data store for ATLAS, CMS, LHCb
POOL V1.0 released last week
Preparatory to the L1 LHCC milestone next month
Almost ‘feature complete’ with respect to June needs; focus now
on debugging, performance, documentation
On target to meet the June milestone
All functionality asked for by the experiments for the June
production release should be there
Should provide stably supported (1 year) format for data files
Serious experiment trials of and feedback on releases have only just started
We will surely uncover bugs and surprises; we hope not major
Expect to be ready to deploy on LCG-1 in July
Initial users: CMS, ATLAS, later LHCb
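The "hybrid ROOT/relational" idea behind POOL can be sketched briefly: bulk object data is streamed to files (ROOT I/O in the real system), while a relational file catalog maps logical file names to physical locations so datasets can be relocated without touching user code. This is a toy model under those assumptions, not the POOL API; `sqlite3` stands in for the catalog database, and the file names are invented.

```python
import sqlite3

# Relational half of the hybrid store: a file catalog mapping
# logical file names (LFNs) to physical file names (PFNs).
catalog = sqlite3.connect(":memory:")
catalog.execute("CREATE TABLE files (lfn TEXT PRIMARY KEY, pfn TEXT)")

def register_file(lfn, pfn):
    catalog.execute("INSERT INTO files VALUES (?, ?)", (lfn, pfn))

def resolve(lfn):
    # User code asks for data by logical name; the catalog supplies
    # the physical location of the object file holding it.
    row = catalog.execute("SELECT pfn FROM files WHERE lfn=?", (lfn,)).fetchone()
    if row is None:
        raise KeyError(f"unknown logical file name {lfn!r}")
    return row[0]

register_file("lfn:cms/dc04/events-0001", "/castor/cern.ch/cms/dc04/events-0001.root")
print(resolve("lfn:cms/dc04/events-0001"))
```

The split is the design point: object streaming stays in the file layer, while bookkeeping that must be queried and updated concurrently lives in the relational layer.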
POOL (2)
The next several months of serious trials will tell how close we
are to a truly usable product
Manpower situation not bad
Temporary shortfalls in manpower from experiments
made up with increased contributions from IT/DB
Effort available as data migration of COMPASS et al is
completed
Dirk not asking for new manpower
Sent SC2 a proposal to bring common conditions database
work into project scope (would be a work package distinct
from POOL)
Core Libraries and Services (SEAL) Project
Pere Mato
Provide foundation and utility libraries and tools, basic framework
services, object dictionary, component infrastructure
Facilitate coherence of LCG software and integration with non-LCG software
Development uses/builds on existing software from experiments (e.g.
Gaudi, Iguana elements) and C++, HEP communities (e.g. Boost)
Basically on schedule, and manpower is OK
Successfully delivering POOL’s needs, the top priority
CLHEP accepted our proposal to ‘host’ the project
Also reflects appeal of SPI-supported services
Math library project incorporated into SEAL as work package
Attention to grid services on hold
SEAL Schedule

Release  Date      Status    Description (goals)
V 0.1.0  14/02/03  internal  Establish dependency between POOL and SEAL;
                             dictionary support & generation from header files
V 0.2.0  31/03/03  public    Essential functionality sufficient for the other
                             existing LCG projects (POOL); foundation library,
                             system abstraction, etc.; plugin management
V 0.3.0  16/05/03  internal  Improve functionality required by POOL;
                             basic framework base classes
V 1.0.0  30/06/03  public    Essential functionality sufficient to be adopted
                             by experiments; collection of basic framework
                             services; scripting support
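The dictionary deliverable above is worth a small illustration: an object dictionary gives persistency and scripting layers runtime type information (type names, data members) and instantiation by name, without compile-time knowledge of each class. In SEAL this is generated from C++ headers via gcc-xml; the Python below is only an analogy, and the `Dictionary` and `TrackHit` names are invented for the example.

```python
class Dictionary:
    """Toy type registry standing in for an object dictionary."""
    def __init__(self):
        self._types = {}

    def register(self, cls):
        self._types[cls.__name__] = cls
        return cls  # usable as a decorator

    def members(self, type_name):
        # Report data members discovered at runtime, not compile time
        cls = self._types[type_name]
        return sorted(getattr(cls, "__annotations__", {}))

    def instantiate(self, type_name, **kwargs):
        # Build an object knowing only its type name as a string
        return self._types[type_name](**kwargs)

dictionary = Dictionary()

@dictionary.register
class TrackHit:
    x: float
    y: float
    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y

print(dictionary.members("TrackHit"))
hit = dictionary.instantiate("TrackHit", x=1.0, y=2.0)
```

This is exactly the capability POOL consumes to stream arbitrary experiment classes, and what the interactive/scripting work now expands on.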
Physicist Interface (PI) Project
Vincenzo Innocente
Analysis Services – active
AIDA and its interfaces to ROOT, POOL, SEAL
Improvements to AIDA interface to histograms, tuples
implemented and offered to users for evaluation
Changes make AIDA more amenable to a ROOT implementation,
which is proceeding
Will be the supported LCG implementation of AIDA
Analysis Environment – partly on hold by SC2
Interactivity, visualization, bridge to/from ROOT
Evaluation in progress
Interactivity and ROOT bridge joint work with SEAL
Pool & Grid PI – on hold by SC2
Event & Detector Visualization – on hold by SC2
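The value of AIDA-style abstract interfaces can be sketched in a few lines: user code talks only to an abstract histogram interface, so the concrete implementation (a ROOT-backed one in the LCG plan) can be swapped without touching analysis code. The class and method names below are illustrative, not AIDA's actual API.

```python
from abc import ABC, abstractmethod

class IHistogram1D(ABC):
    """Abstract interface the user codes against."""
    @abstractmethod
    def fill(self, x, weight=1.0): ...
    @abstractmethod
    def entries(self): ...

class SimpleHistogram1D(IHistogram1D):
    """One concrete fixed-bin, in-memory implementation."""
    def __init__(self, nbins, lo, hi):
        self.nbins, self.lo, self.hi = nbins, lo, hi
        self.bins = [0.0] * nbins
        self._entries = 0

    def fill(self, x, weight=1.0):
        if self.lo <= x < self.hi:
            i = int((x - self.lo) / (self.hi - self.lo) * self.nbins)
            self.bins[i] += weight
        self._entries += 1  # in this toy, entries counts all fill calls

    def entries(self):
        return self._entries

h = SimpleHistogram1D(10, 0.0, 100.0)
for x in (5.0, 42.0, 97.0, 150.0):  # last value falls outside [0, 100)
    h.fill(x)
print(h.entries())
```

Replacing `SimpleHistogram1D` with a ROOT-backed class leaves the filling code unchanged, which is the point of offering the improved AIDA interfaces for evaluation.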
Simulation Project
Torre Wenaus et al
Activity is ramping up following a work plan approved by the SC2 in
March
Principal development activity will be a generic simulation
framework
Expect to build on existing ALICE work
Incorporates CERN/LHC Geant4 work
FLUKA team participating for framework integration, physics
validation
Simulation physics validation subproject very active already
Generator services subproject starting up under MC4LHC oversight
Shower parameterisation subproject not yet fleshed out
Project Organization
[Organisation chart: the Simulation Project Leader directs the subprojects — Framework, Geant4, FLUKA integration, Physics Validation, Shower Param, Generator Services — each broken into work packages, with consultation links to the Geant4 project, the FLUKA project, experiment validation activity, and MC4LHC.]
Simulation Subprojects
Generic simulation framework
Subproject leader: Andrea Dell’Acqua
Geant4
Subproject leader: John Apostolakis
FLUKA integration
Subproject leader: Alfredo Ferrari
Physics validation
Subproject leader: Fabiola Gianotti
Generator services
Subproject leader: Paolo Bartalini
Simulation Project High Level Milestones
2003/6: Decide generic framework high level design,
implementation approach, software to be reused
2003/6: Generator librarian and alpha version of support
infrastructure in place
2003/7: Simulation physics requirements revisited
2003/8: Detector description proposal to SC2
2003/9: 1st cycle of EM physics validation complete
2003/12: Generic simulation framework prototype available with G4
and FLUKA engines
2004/1: 1st cycle of hadronic physics validation complete
2004/3: Simulation test and benchmark suite available
2004/9: First generic simulation framework production release
2004/12: Final physics validation document complete
Software Process and Infrastructure (SPI) Project
Alberto Aimar
All tools and services in place, most in use, some (nightly build system) still
being deployed
Important contributions from experiments (SCRAM, Oval, NICOS, …)
Policies on code standards and organization, testing, documentation almost
complete
Good QA activity, applied mainly to POOL so far
Currently navigating through personnel transitions
Planned transitions of personnel to other projects
Generally with a continuing ‘maintenance’ level of SPI participation
Two new LCG full-timers added in Dec/Jan, another expected in June,
plus two new participants from EP/SFT
Manpower level is OK; we just have to sustain it
Plan and personnel time profile due at the end of this month
Savannah portal a great success with 54 projects, 275 users at present
Used by LCG-App, -GD, -Fabric, 3 experiments, CLHEP
Savannah
Documentation and training
All projects instrumented with code referencing tools (LXR,
Doxygen, ViewCVS), but work needed on written documentation
June releases targeted for complete ‘user’ documentation
Documentation requirements being formalized with SPI providing
guidelines and templates
Documentation requirements as prerequisites to a release
Doing the same with testing
Active training program
Very successful ROOT course; second has now been scheduled
SCRAM course ready, public course soon
POOL, SEAL courses will follow, after the June major releases
Starting to record courses for web presentation (Syncomat)
Apps area document registry established
External Participation
Examples:
POOL collections (US)
POOL RDBMS data storage back end (India)
POOL tests (UK)
POOL-driven ROOT I/O development & debugging (US)
SEAL basic framework services (France)
SPI tools – Oval, NICOS (France, US)
Math libraries (India)
Opportunities:
Throughout the simulation project
Several PI work packages
Unit and integration tests, e.g. POOL storage manager, persistency
manager, data service
Tried to engage an external group in one; it didn’t work out
End-user examples
External Participation
Engaging external participation well is hard, but we are
working at it
Problems on both sides
Being remote is difficult
More than it needs to be… e.g. VRVS physical facilities
issue – improving this month, but more improvements
needed and much too difficult to arrange
Remote resources can be harder to control and
fully leverage, and may be less reliably available
We work around it and live with it, because we must
support and encourage remote participation
Collaborations
Examples…
Apart from the obvious (the experiments, ROOT)…
GD: Requirements from apps for LCG-1; Savannah
Fabrics: POOL and SPI hardware, Savannah-Castor
GTA: Grid file access, …
Grid projects: EDG-RLS, EDG testbed contribution,
software packaging/distribution
Geant4
FLUKA
CLHEP hosting
Take-up in the experiments
The real measure of success or failure…
Experiments now actively engaged in evaluation, integration, and
planning deployment
ATLAS integrating POOL and SEAL into experiment framework
and infrastructure
No significant problems so far, but not yet exercising POOL/SEAL
functionality strenuously
CMS evaluated POOL and decided this month to target it for
early (July) deployment in simulation production (‘PCP’ leading
up to DC04)
LHCb will begin this summer
Beginning to define project milestones measuring take-up, tied to
these experiment programs
As mentioned, will assign project people to assist
Coherence and Focus
We are developing the coherent, component based
architecture we are charged to build
Coherence evident in cross-project (POOL, SEAL, PI)
collaboration on defining, developing, deploying
components
Apart from “ROOT already does it all” arguments, apps
area development efforts are not redundant or duplicative
Approach/design is chosen, in some cases after short
explorations of the options, and set as the basis for
work
Efforts that are divergent, redundant, or under
consideration are isolated (in contrib) and not
accounted as project effort (and are few in number)
Concluding Remarks
POOL, SEAL and some PI software is now out there
Take-up is starting
On target for major June POOL/SEAL releases
L1 milestone set a year ago should be met
Manpower is appropriate
is at the level the experiments themselves estimated to be
required
is being used effectively and efficiently for the common
good
is delivering what we are mandated to deliver
Apps area is impatient to have grid-related work in scope