
Veliki hadronski sudarivač vrhunska tehnologija za vrhunsku
znanost
Guy Paić
Instituto de Ciencias Nucleares
UNAM
A small reminder
• The building blocks of matter make up 5% of our Universe
• The other 95% are unknown subscribers…
[Figure: matter → atom → nucleus → pion; a nucleon with its quarks and gluons; labels g, e, γ, π]
About the name LHC
• Collider: two bunches of particles travelling along opposite paths cross
at the site of the experiment
• Hadron: a particle that interacts with other particles via the strong force
From idea to realization
[Timeline: 1990 first ideas; Rubbia 1991; 1993; 2007; 2008; first beam on
10 September 2008; tunnel ~100 m underground]
ACCELERATION OF PARTICLES
• Charged particles are influenced by applied electric and magnetic fields
according to the Lorentz force: F = q (E + v × B) = dp/dt
E field → energy gain, B field → curvature
• Simple particle gun: the energy gained over a voltage V = 1 V is
1 eV = 1.6 × 10^-19 J
Energy per beam particle at the LHC = 7 TeV
(i.e. 7,000,000,000,000 eV)
• Use the B field to deflect particles in a ~circular orbit, so that they pass
the accelerating gap many times,
e.g. the cyclotron (1929)
• Vary the fields with time to keep the
particles inside a small beam pipe
→ synchrotron (since the 1950s)
• Dipole magnets are used to deflect the particles
Radius r [m] = 3.33 p [GeV] / B [T]
• For the LHC, the machine has to fit in the
existing 27 km tunnel, about 2/3 of which is
used for active dipole field → r ~ 2800 m
So to reach p = 7 TeV requires B = 8.3 T
• Beams are focused using quadrupole magnets
By alternating Focusing and Defocusing
quadrupoles, one can focus in both the x and y views
The LHC has 1232 dipoles and 392 quadrupoles
[Figure: cross-section of a quadrupole magnet, with alternating N- and S-poles
around the beam in the x–y plane]
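The bending relation r [m] = 3.33 p [GeV] / B [T] can be checked against the LHC numbers quoted above; a minimal sketch (the factor 3.33 m·T/GeV follows from p = 0.3 B r for a unit-charge particle):

```python
# Dipole field B [T] needed to bend momentum p [GeV] on radius r [m],
# for a singly charged particle: r = 3.33 * p / B
def dipole_field(p_gev, r_m):
    return 3.33 * p_gev / r_m

# LHC: p = 7000 GeV on an effective bending radius of ~2800 m
b_lhc = dipole_field(7000, 2800)
print(f"Required dipole field: {b_lhc:.2f} T")  # ~8.3 T
```

This reproduces the 8.3 T quoted for the LHC dipoles.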
LHC:
Bunches: 2808
Protons/bunch: 1.15 × 10^11
Proton energy: 7000 GeV
Stored energy: 360 MJ ~ 100 kg of TNT!
RMS bunch length: 7.6 cm
RMS beam size: 17 µm
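The stored-energy figure follows directly from the beam parameters above; a quick sketch (the TNT equivalent of ~4.2 MJ/kg is an assumed standard value, giving roughly 90 kg, of the order of the ~100 kg quoted):

```python
# Stored energy of one LHC beam = bunches x protons/bunch x energy/proton
EV_TO_J = 1.602e-19          # joules per electron-volt

n_bunches = 2808
protons_per_bunch = 1.15e11
proton_energy_ev = 7e12      # 7 TeV

stored_j = n_bunches * protons_per_bunch * proton_energy_ev * EV_TO_J
print(f"Stored energy per beam: {stored_j / 1e6:.0f} MJ")   # ~360 MJ

# TNT equivalent, assuming ~4.184 MJ per kg of TNT
print(f"TNT equivalent: ~{stored_j / 4.184e6:.0f} kg")
```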
The technological challenges of the LHC demand breaking new ground in
superconductivity, high-speed electronics, cryogenics, supercomputing,
vacuum technology, materials science and many other disciplines
On cooling
On superconducting cables
The filaments
• Around each filament there is a 0.0005 mm layer of high-purity copper.
• The cable is 1.510 cm broad, the mid-thickness being 1.480 mm; the
tolerances are only a few micrometers.
• Copper is an insulating material between the filaments in the
superconducting state, when the temperature is below −263 °C. When leaving
the superconducting state, copper acts as a conductor, carrying the electric
current and the heat.
• Niobium-titanium is a superconducting alloy; each strand contains 6,426
twisted niobium-titanium filaments.
[Figure: cable → strand → filaments]
WHAT ARE WE LOOKING FOR
The Higgs mechanism
However, the Standard Model is not complete:
Originally formulated for massless particles, but while m_γ = 0, m_W,Z ~ 100 GeV
Mechanism of electroweak symmetry breaking:
a Higgs field is added to give the particles mass
→ existence of a neutral scalar particle H
The Higgs mass m_H is a free parameter,
but m_H < 1 TeV
• Total cross section at the LHC:
σ(pp → anything) ~ 0.1 barn
• So a 1 pb Higgs cross section
corresponds to one Higgs being produced
every 10^11 interactions!
(further reduced by BR × efficiency)
• Experiments have to be designed
so that they can separate such a
rare signal process from the
background
• Rate = L × σ
where the luminosity L (units cm^-2 s^-1) is
a measure of how intense the
beams are
LHC design luminosity = 10^34 cm^-2 s^-1
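The rate formula Rate = L × σ ties these numbers together; a short sketch using the cross sections quoted above (1 barn = 10^-24 cm^2, 1 pb = 10^-36 cm^2):

```python
# Rate = L * sigma, with the LHC design numbers
L = 1e34                 # design luminosity [cm^-2 s^-1]
sigma_total = 0.1e-24    # 0.1 barn total pp cross section [cm^2]
sigma_higgs = 1e-36      # 1 pb Higgs cross section [cm^2]

total_rate = L * sigma_total     # interactions per second
higgs_rate = L * sigma_higgs     # Higgs bosons per second
fraction = sigma_total / sigma_higgs

print(f"Total interaction rate: {total_rate:.0e} Hz")   # ~1e9 Hz
print(f"Higgs production rate:  {higgs_rate:.0e} Hz")   # ~0.01 Hz
print(f"One Higgs per {fraction:.0e} interactions")     # ~1e11
```

This is why the detectors must dig a ~0.01 Hz signal out of a ~GHz background.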
An approximate and naive picture
Analogue of mass generation
Let us imagine the same room again: a rumour enters.
Soon we have a cluster of people discussing it:
the analogue of the Higgs boson.
Thanks to D. Miller and CERN
© Photo CERN
Other SM extensions
There are plenty of other candidates for New Physics, including:
• Extra dimensions
– Motivated by attempts to unify the SM with gravitation
Gravity has only been tested down to a scale of ~0.1 mm
– Postulate > 4 space-time dimensions
The extra dimensions are curled up with a small radius
– Interactions at the LHC could give gravitons in the final state
→ missing energy, dilepton resonances…
• Miniature Black Holes
– A possible consequence of extra dimensions:
production of microscopic black holes at the LHC
– Expected to decay thermally by Hawking radiation
→ spherical, high-multiplicity events
with democratic particle types, lifetime ~ 10^-27 s
Beyond the Standard Model
• Apart from the missing Higgs boson, there are other reasons for thinking
that the Standard Model is not the complete story, including:
1. Dark matter
– Astrophysical measurements of the rotations of galaxies indicate that normal
“baryonic” matter makes up only ~4% of the total energy density of the
Universe; what is the rest?
2. Gravity
– Gravity is not part of the Standard Model
Why is the natural scale of gravity, m_P = √(ħc/G) ~ 10^19 GeV (the Planck
mass), so much larger than the electroweak scale ~ 10^2 GeV?
This is known as the “hierarchy problem”
3. Baryogenesis
– Why is the world we observe made up almost entirely of matter,
while equal quantities of matter and antimatter are expected to have been
produced in the Big Bang?
From Prof. Petković – thank you!
H → bb event
The complexity
• p-p collisions @ √s = 14 TeV
• bunch spacing of 25 ns
• Luminosity
– low luminosity: 2 × 10^33 cm^-2 s^-1 (first years)
– high luminosity: 10^34 cm^-2 s^-1
• ~23 minimum-bias events per bunch crossing
• ~1000 charged tracks per event
H → bb event @ high luminosity,
plus 22 minimum-bias events
How to get the data from the detector?
The detectors will sense the collisions of proton bunches every 25 ns, i.e.
with a frequency of 40 MHz. With ~23 pp collisions in every bunch crossing,
this means a pp collision rate of almost 1 GHz. A few GHz is the clock
frequency of current computer processors, so how could it be possible to
collect and process the data from such a huge detector?
One should bear in mind that new beam particles arrive at the interaction
region at the speed of light, but the signals from the detector always move
through the cables more slowly. One could therefore expect that information
from the detector will accumulate inside and sooner or later “explode”.
The solution is quite “human”: concentrate on the most interesting events
and forget about all the others. This task is performed by the trigger
system. The trigger planned for ATLAS has three levels, and in these three
steps it reduces the event rate to about 100–200 events per second, which
are written to storage media. The size of the data from one event is
about 1 MB.
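The rate arithmetic above can be made explicit; a small sketch using the quoted 40 MHz crossing rate, ~23 collisions per crossing, a ~200 Hz output rate and ~1 MB events:

```python
# Trigger arithmetic for the quoted ATLAS numbers
bunch_rate = 40e6        # bunch crossings per second (one every 25 ns)
pp_per_crossing = 23     # minimum-bias pp collisions per crossing
stored_rate = 200        # events/s surviving the three trigger levels
event_size_mb = 1.0      # ~1 MB per stored event

pp_rate = bunch_rate * pp_per_crossing       # raw pp collision rate
reduction = bunch_rate / stored_rate         # trigger rejection factor
storage_mb_s = stored_rate * event_size_mb   # MB written per second

print(f"pp collision rate: {pp_rate:.1e} Hz")       # ~1 GHz
print(f"Trigger reduction: 1 in {reduction:.0e}")   # ~2e5
print(f"Storage rate: {storage_mb_s:.0f} MB/s")
```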
ATLAS superimposed on the 5 floors of building 40
• Experiments at the LHC are
– Big
– Heavy
Eiffel tower: ~7300 tons
ALICE magnet: ~8000 tons
Atlas
ATLAS is about 45 meters long, more than 25 meters high,
and weighs about 7,000 tons. It is about half as big as the
Notre Dame Cathedral in Paris and weighs the same as
the Eiffel Tower or 100 Boeing 747 jets (empty).
CMS
• Compact Muon Solenoid
Compact compared to ATLAS, but ~2× heavier:
21 m long, 12,500 tons
Solenoidal B field
[Figure: transverse slice through the CMS detector]
LHCb
• A dedicated experiment for the study of B physics at
the LHC
Dipole B field
Baryogenesis
• Big Bang (~14 billion years ago) → matter and antimatter equally produced
Followed by annihilation → n_baryon/n_γ ~ 10^-10
• Why didn’t all the matter annihilate (luckily for us)?
No evidence has been found for an “antimatter world” elsewhere in the Universe
• One of the requirements to produce an asymmetric final state (our world)
from a symmetric matter/antimatter initial state (the Big Bang)
is that CP symmetry must be violated [Sakharov, 1967]
• CP is violated in the Standard Model, through the weak mixing of quarks
For CP violation to occur there must be at least 3 generations of quarks
So the problem of baryogenesis may be connected to why three generations
exist, even though all normal matter is made up from the first (u, d, e, ν_e)
• The way to probe CP violation is through the study of quark mixing
In particular, hadrons containing the b-quark show large CP asymmetries
• However, the CP violation in the SM is not sufficient for baryogenesis
Other sources of CP violation are expected → a good field in which to search
for new physics
ALICE
• A Large Ion Collider Experiment
Optimized for the study of heavy-ion collisions, such as Pb-Pb
Lots of fun: it is all observational!
The Making of ALICE
• Pre-History
– early 80’s: Large Hadron Collider pp machine in the LEP tunnel (Lausanne WS)
– 1986: start of heavy-ion physics at the SPS & AGS (light ions, 16O and 32S)
• Conceptual Studies
– 1990: RHIC approved for construction at BNL; call for experiment LoIs
– 1990: first ideas developed for HI@LHC (LHC WS, Aachen)
• Conclusion, Theory (Convener H. Satz):
– ‘A heavy-ion collider is the best possible tool for statistical QCD.
The LHC is unique in many respects’
• Conclusion, Experiment (Convener H.J. Specht):
– ‘A general-purpose detector for all observables seemed impossible at the
LHC. Actually, such a detector concept could be developed’
– 1992: Expression of Interest (Evian)
• Options considered:
1) re-use of a modified LEP experiment (Delphi): impossible
2) use of a pp experiment (CMS): seemed possible for selected hard signals (μμ)
3) dedicated general-purpose HI detector => ALICE
It even works!!!
The computing challenge
LHC data (simplified)
[Figure: height comparison – a balloon at 30 km; a stack of CDs holding
1 year of LHC data, ~20 km tall; Concorde at 15 km; Mt. Blanc at 4.8 km]
The LHC data correspond to about 20 million CDs each year
Where will the experiments store all of these data?
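The CD-stack comparison can be sanity-checked; a sketch assuming a standard 1.2 mm CD thickness and ~700 MB capacity (both assumed values, not from the slides):

```python
# How tall is one year of LHC data stored on CDs?
n_cds = 20e6             # ~20 million CDs per year (from the slide)
cd_thickness_mm = 1.2    # assumed standard CD thickness
cd_capacity_gb = 0.7     # assumed ~700 MB per CD

stack_km = n_cds * cd_thickness_mm / 1e6   # mm -> km
data_pb = n_cds * cd_capacity_gb / 1e6     # GB -> PB

print(f"Stack height: {stack_km:.0f} km")      # ~24 km, of order the ~20 km shown
print(f"Data volume:  {data_pb:.0f} PB/year")
```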
LHC data processing
LHC data analysis requires a computing power equivalent to ~70,000 of
today's fastest PC processors
Where will the experiments find such computing power?
[Diagram: data handling and computation for physics analysis – the detector
produces raw data; selection & reconstruction turn them into event summary
data; batch physics analysis and event reprocessing produce processed data
and analysis objects (extracted by physics topic), which feed interactive
physics analysis; event simulation feeds back into reconstruction]
Grid @ CERN
• LHC Computing Grid (LCG) – the flagship project
• Enabling Grids for E-Science in Europe (EGEE)
– started in April 2004 with 70 partners and 32 M€ of EU funding
– will provide the next-generation middleware
– will run a 24/7 Grid service together with LCG
• CERN openlab for DataGrid applications
– funded by CERN and industry
– main project: opencluster
– new project: openlab security (under preparation)
HP HPC Forum 2005
High level trigger
HLT overview
ALICE data rates (example: TPC)
• Central Pb+Pb collisions
– event rate: ~200 Hz (past/future protected)
– event size: ~75 MByte (after zero suppression)
– data rate: ~15 GByte/s
The TPC data rate alone exceeds by far the total DAQ bandwidth
of 1.25 GByte/s
[Diagram: detectors → HLT → DAQ → mass storage; event selection based on a
software trigger; efficient data compression]
The TPC is the largest data source, with 570,132 channels, 512 time bins
and a 10-bit ADC value.
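The channel count above fixes the raw TPC event size, and the quoted rate and zero-suppressed size fix the data rate; a short sketch (the zero-suppression factor is only implied by the slide's numbers):

```python
# Raw TPC data volume per event, from channels x time bins x ADC bits
channels = 570132
timebins = 512
adc_bits = 10

raw_bytes = channels * timebins * adc_bits / 8
print(f"Raw TPC event size: {raw_bytes / 1e6:.0f} MB")  # ~365 MB before zero suppression

# After zero suppression the event shrinks to ~75 MB; at ~200 Hz:
event_mb = 75
rate_hz = 200
data_gb_s = event_mb * rate_hz / 1e3
print(f"TPC data rate: {data_gb_s:.1f} GB/s")  # ~15 GB/s vs a 1.25 GB/s DAQ
```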
High Level Trigger system
Purpose:
• online event reconstruction and analysis
• 1 kHz event rate (minimum-bias Pb-Pb, pp)
• provision of trigger decisions
• selection of regions of interest within an event
• performance monitoring of the ALICE detectors
• lossless compression of event data
• online production of calibration data
[Diagram: the HLT receives raw event data and sends analyzed events /
trigger decisions to the DAQ and mass storage; interfaces to ECS, DCS
and Trigger]
High Level Trigger system
Hardware architecture:
• PC cluster with up to 500 computing nodes
– AMD Opterons at 2 GHz (dual board, dual core)
– 8 GByte RAM
– 2 × 1 GBit Ethernet
– Infiniband backbone
• infrastructure nodes for maintaining the cluster
– 8 TByte AFS file server
• dedicated portal nodes for interaction with the other ALICE systems
• the cluster organization matches the structure of the ALICE detectors
What others think of us
Which do you think is the greatest wonder of the modern world? (CNN poll)

The World Wide Web             47%   3750
The CERN particle accelerator  16%   1230
The bionic arm                 11%    887
The development of Dubai        7%    568
China's Three Gorges Dam        5%    362
The Channel Tunnel              4%    286
France's Millau viaduct         2%    191
None of the above               8%    626

Total votes: 7900
In place of a conclusion
An enormous technological and scientific opportunity, which we are not
using nearly as much as we should and could.