LHCb Computing Status Report
DAQ, ECS, Software, Facilities
John Harvey
CERN/EP
Meeting with LHCC Referees
27-Nov-2000
LHCb Trigger/DAQ/ECS Architecture
[Architecture diagram: the LHC-B detector (VDET, TRACK, RICH, ECAL, HCAL, MUON) is read out by the Front-End Electronics at 40 MHz (~40 TB/s). The Level-0 trigger (fixed latency 4.0 µs) reduces the rate to 1 MHz; data pass through Front-End Multiplexers (FEM) and front-end links to the Read-out Units (RU) at ~1 TB/s. The Level-1 trigger (variable latency <1 ms) reduces the rate to 40 kHz, giving ~6 GB/s into the Read-out Network (RN) and on to the Sub-Farm Controllers (SFC). The CPU farm runs the Level-2 (~10 ms) and Level-3 (~200 ms) trigger / event filter and writes ~50 MB/s to storage. Timing and Fast Control distributes the L0/L1 decisions and collects throttle signals; the farm is connected by a LAN and the whole chain sits under common Control & Monitoring.]
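As a quick cross-check of the rates in the diagram (my own arithmetic, not from the slide: the per-event sizes are simply throughput divided by event rate), a minimal sketch:

```python
# Back-of-envelope check of the data rates in the architecture diagram.
# The per-event sizes are inferred here (throughput / event rate); they are not stated on the slide.
stages = {
    "Level-0 input":   (40e6, 40e12),  # 40 MHz, 40 TB/s
    "Level-1 input":   (1e6, 1e12),    # 1 MHz, 1 TB/s
    "Readout network": (40e3, 6e9),    # 40 kHz, 6 GB/s
}

for name, (rate_hz, throughput_bytes_per_s) in stages.items():
    event_size_kb = throughput_bytes_per_s / rate_hz / 1e3
    print(f"{name:16s}: ~{event_size_kb:,.0f} kB per event")
```

This gives ~1 MB per event at the Level-0 and Level-1 inputs and ~150 kB per event into the farm, i.e. the quoted throughputs are mutually consistent.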
Slide 2
LHCb TFC system
- Readout and Throttle Switches
  - Programmable for 'Partitioning'
  - Design reviewed in October '00
  - Prototypes in February '01
- Readout Supervisor
  - Functional Specification done
  - Design in progress
  - Review scheduled for April '01
  - Prototype scheduled for Oct '01
- TFC system test
  - Acquire components
  - Test L1 broadcast via channel B
  - Start June '01
[TFC diagrams: the LHC clock with bunch crossing (BC) and bunch counter reset (BCR) is fanned out to the Readout Supervisors, which also receive the L0 and L1 trigger decisions (plus an optional local trigger). A TFC switch and L0/L1 throttle switches connect the supervisors to per-subdetector TTCtx transmitters (SD1...SDn, plus dedicated L0 and L1 TTCtx) and optical couplers, which drive the TTCrx receivers on the front-end electronics (L0E/L1E front-end chips, ADCs, L1 buffers, DSPs) and on the DAQ; throttle signals from the front-ends are combined in a Throttle OR. The system test combines the TTC system with emulated front-end chains.]
Slide 3
LHCb Readout Unit
- New version of RU design (v2)
  - Fewer chips, fewer layers, lower cost
- First prototype in the first week of December '00
- Programming of FPGA code implementing the readout protocols is underway
- Working and tested modules expected by end of Jan '01
- Integration tests will start Mar '01
Slide 4
LHCb Event Builder
- Studied Myrinet (buffers needed)
- Now focusing on Gbit Ethernet
- Test setup between 2 PCs (PC/Linux, each with CPU, memory and a GbE NIC on the PCI bus, connected via the CERN network)
  - Use >95% of nominal bandwidth for frames >512 bytes (512 bytes -> 230 kHz)
  - Can send out frames at frequencies up to 1.4 MHz for 64-byte frames
- Implement event building in the NIC
  - Frequency of ~100 kHz demonstrated
  - Event building at Gbit speeds demonstrated for frames > ~1 kB
- Tested 1-on-1 event building over a switch in the CMS test bed
  - Fixed protocol with RU
  - Presented results at DAQ2000
- Now studying flow control in the switch and making a full-scale test on the CMS test stand (16 on 16)
- More detailed simulation of a full-scale GbE readout network to be done
[Plot: NIC-to-NIC throughput (bytes/µs) versus frame size (bytes, 1 to 10000), showing the measured data, a fit, and an extrapolation without the minimum frame size.]
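The throughput curve above is dominated by per-frame overhead at small frame sizes. A minimal model (my own sketch, not the test-bed code) that reproduces its shape, using standard Ethernet framing overhead and the ~1.4 MHz maximum frame rate quoted on the slide:

```python
# Simple throughput-vs-frame-size model for Gigabit Ethernet: wire-limited at large
# frames, sender-limited at small ones. The overhead values are standard Ethernet
# framing numbers; the maximum frame rate is the ~1.4 MHz quoted on the slide.

WIRE_RATE = 125.0        # bytes/us carried by a 1 Gbit/s link
FRAME_OVERHEAD = 38      # bytes/frame on the wire: preamble 8 + header 14 + CRC 4 + inter-frame gap 12
MAX_FRAME_RATE = 1.4     # frames/us the sending host can sustain (~1.4 MHz)

def throughput(payload_size: int) -> float:
    """Payload throughput in bytes/us for a given frame payload size in bytes."""
    wire_limited = WIRE_RATE * payload_size / (payload_size + FRAME_OVERHEAD)
    sender_limited = MAX_FRAME_RATE * payload_size
    return min(wire_limited, sender_limited)

for size in (64, 128, 256, 512, 1024, 1500):
    rate_khz = throughput(size) / size * 1e3   # resulting frame rate in kHz
    print(f"{size:5d} B -> {throughput(size):6.1f} bytes/us ({rate_khz:5.0f} kHz)")
```

With these numbers, 512-byte frames come out at roughly 230 kHz and close to the nominal 125 MB/s, in line with the figures quoted above.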
Slide 5
ECS interface to Electronic Modules
- Select a reduced number of solutions
- Support (HW and SW) for the integration of the selected solutions
- Ethernet to credit-card PC – for use in the counting room
  - Test board being developed
  - Study interface of CC-PC to parallel bus, I2C, JTAG
  - Test functionality (RESET) and measure noise
  - Prototype expected in January '01
  - Results end of March '01
- N.B. two other solutions considered for use in high-level-radiation areas
  - SPAC + long-distance I2C/JTAG
  - CMS tracker CCU
[Diagrams: an Ethernet link from the control PC to a credit-card PC that masters JTAG, I2C and a parallel bus on the module; alternatively, a master PC drives a serial slave on the module providing the same JTAG/I2C/parallel-bus interfaces.]
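To illustrate the kind of register-level access the CC-PC route would provide, a minimal sketch of an I2C read/modify/write from an embedded Linux PC; the smbus2 package, bus number, device address and register layout are assumptions for this example, not LHCb specifics:

```python
# Illustrative only: toggle a RESET bit on a front-end module over I2C from an embedded
# Linux PC (e.g. a credit-card PC). The bus number, device address and register layout
# are invented for this sketch; they are not LHCb definitions.
from smbus2 import SMBus

I2C_BUS = 1          # /dev/i2c-1 on the CC-PC (assumption)
MODULE_ADDR = 0x40   # 7-bit I2C address of the front-end module (assumption)
REG_CONTROL = 0x00   # hypothetical control register
RESET_BIT = 0x01     # hypothetical reset bit within that register

with SMBus(I2C_BUS) as bus:
    ctrl = bus.read_byte_data(MODULE_ADDR, REG_CONTROL)                      # read current control word
    bus.write_byte_data(MODULE_ADDR, REG_CONTROL, ctrl | RESET_BIT)          # assert RESET
    bus.write_byte_data(MODULE_ADDR, REG_CONTROL, ctrl & ~RESET_BIT & 0xFF)  # release RESET
```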
Slide 6
ECS Control System Prototype
- Aim is to distribute with the SCADA license a framework where users can easily implement sub-systems.
- First prototype comprises:
  - Configuration rules and conventions (naming, colors, etc.)
  - Tools for Device Integration
  - Hierarchical Control & Partitioning, based on Finite State Machines and SCADA (illustrated in the sketch below)
  - Automatic UI Generation, based on SCADA
- Plans are to use the prototype in the LHCb test beam
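A toy sketch of the hierarchical FSM control idea (plain Python rather than SCADA/PVSS; states, commands and device names are invented for illustration): commands are fanned out down the tree and states are summarised upwards.

```python
# Toy hierarchical FSM: commands fan out from a parent node to its children; the parent's
# state is a summary ("worst" state) of its children. States, commands and device names
# are invented; the real prototype is built on SCADA and its FSM tools, not this code.

STATE_ORDER = ["ERROR", "NOT_READY", "READY", "RUNNING"]   # worst ... best

class ControlNode:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.state = "NOT_READY"

    def command(self, cmd):
        """Send a command down the tree, then refresh this node's state."""
        for child in self.children:
            child.command(cmd)
        if not self.children:     # a leaf models a real device
            self.state = {"configure": "READY",
                          "start": "RUNNING",
                          "stop": "READY"}.get(cmd, self.state)
        else:                     # a parent summarises its children
            self.state = min((c.state for c in self.children),
                             key=STATE_ORDER.index)

# Example partition: one sub-detector node controlling two devices.
velo = ControlNode("VELO", [ControlNode("HV_channel"), ControlNode("FE_board")])
velo.command("configure")
print(velo.name, velo.state)      # -> VELO READY
```

In this picture, 'partitioning' amounts to operating a chosen sub-tree of the hierarchy independently of the rest.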
Slide 7
Online team members
- DAQ
  - Beat Jost – project leader (CERN staff)
  - Jean-Pierre Dufey – Readout Network (CERN staff)
  - Marianna Zuin – Readout Network (technical student)
  - Richard Jacobsson – TFC (CERN staff)
  - Niko Neufeld – Readout Network (CERN Fellow)
  - EP/ED group – Readout Unit (CERN staff)
  - Zbigniew Guzik – engineer (Warsaw)
  [Slide annotation: "Leaving in 2000"]
- ECS
  - Clara Gaspar – project leader (CERN staff)
  - Wolfgang Tejessy – JCOP (CERN staff)
  - Richard Beneyton – SCADA in test beam (Cooperant)
  - Sascha Schmeling – SCADA test beam (CERN Fellow)
  [Slide annotation: "Arriving in 2000"]
Slide 8
Software Framework - GAUDI
- GAUDI v6 released on Nov 10th
  - Enhanced features, e.g. event tag collections, XML browser, detector geometry and event display
  - 110,000 lines of code, 460 classes
- Good collaboration with ATLAS
  - New services by ATLAS – auditors, histograms in ROOT, HepMC
  - Moving to an experiment-independent repository
- The most urgent tasks for the next release are:
  - Event model – with subdetector groups
  - Detector description – with subdetector groups
  - Conditions database – with CERN/IT
  - Consolidation and enhancements (code documentation with Doxygen)
- Further contributions expected from ATLAS
  - Scripting language for interactive work
- HARP, GLAST and OPERA are also users of GAUDI
Slide 9
Software Applications
- GAUDI-based event reconstruction (BRUNEL)
  - BRUNEL v1r5 released this month
  - Physics functionality entirely based on wrapped FORTRAN code
  - First public release of C++ track fit integrated & tested
  - Pile-up implemented and spill-over being implemented
  - Available for production tests
- Migration of detector software to C++
  - Progress in all areas – digitisation, geometry description, …
  - Tracking – digitisation and pattern recognition almost ready for public release
  - e.g. VELO, CAL: ~complete event model and geometry description
- Current activities reflect the TDR schedule
  - VELO, MUON (and Tracking) now on hold until after the TDRs are produced
  - RICH and CAL – new software getting higher priority
Slide 10
Software Applications – GEANT4
- Developing interface to G4 for GAUDI applications (GiGa)
  - Isolates G4 code from GAUDI
  - Way to input detector geometry and kinematics to G4
  - Handles passing of commands to G4 and retrieval of events from G4
- More generally, Geant4 physics being tested
  - Now: by BaBar, ATLAS, ALICE, space applications
  - Some disagreements with data and with G3 seen and being studied
- Plans in LHCb
  - Calorimeter – simulation of shower production in the prototype and comparison with existing simulations (G3) and data from the test beam
  - RICH – studying production and detection of Cherenkov photons in RICH1 using the TDR geometry – compare results
  - Integration of these developments in GAUDI using GiGa
  - Measure performance
Slide 11
LHCb CORE Software Team
- Pere Mato – project leader (CERN staff)
- Florence Ranjard – code librarian (CERN staff)
- Marco Cattaneo – BRUNEL (CERN staff)
- Agnieszka Jacholowska – SICb (Orsay)
- Markus Frank – GAUDI (CERN staff)
- Pavel Binko – GAUDI (CERN staff)
- Rado Chytracek – GAUDI (doctoral student)
- Gonzalo Gracia – GEANT4 (CERN fellow)
- Stefan Probst – GAUDI (technical student)
- Gloria Corti – GAUDI (CERN fellow)
- Sebastien Ponce – GAUDI (doctoral student)
- Ivan Belyaev (0.5) – GAUDI/GEANT4 (ITEP)
- Guy Barrand (0.5) – event display (Orsay)
[Slide annotations: "Left in 2000" and "Arriving in 2000" mark individual team members]
Slide 12
Computing Facilities
- Estimates of computing requirements updated and submitted to the Hoffmann LHC Computing Review
- NT farms being decommissioned at CERN and at RAL
  - Migrating production tools to Linux now
- Starting production of 2M B-inclusive events at Liverpool
- Farm of 15 PCs for LHCb use at Bologna early 2001
  - Long-term planning in INFN ongoing (location, sharing, etc.)
- Farm of 10 PCs to be set up early 2001 at NIKHEF for LHCb use
  - Developing overall NIKHEF strategy for LHC computing / GRIDs
- Grid computing in LHCb
  - Participation in the EU DataGrid project – starts Jan 2001 (3 years)
  - Deploy grid middleware (Globus) and develop production application
  - Started a mini-project between Liverpool/RAL/CERN for testing remote production of simulation data and transfer between sites
Slide 13
LHCb Computing Infrastructure Team
- Frank Harris – coordination (Oxford)
- Eric van Herwijnen – MC production/Grid (CERN staff)
- Joel Closier – system support/bookkeeping (CERN staff)
Slide 14