What you need to know
about large scale structure
Licia Verde
University of Pennsylvania
www.physics.upenn.edu/~lverde
Outline
1) Motivation and basics; large-scale structure probes (spherical cows)
2) Real-world effects (less spherical cows)
3) Measuring P(k) & statistics
The standard cosmological model
96% of the Universe is missing!!!
Major questions: questions that can be addressed exclusively by looking up at the sky.
1) What created the primordial perturbations?
2) What makes the Universe accelerate?
These questions may not be unrelated.
CMB is great and told us a lot, but large-scale structure is still useful:
Check the consistency of the model.
If this test is passed, combine the two to reduce the degeneracies.
We will concentrate on dark energy and inflation
On blackboard:
Power spectrum (for DM) definitions
Gaussian random fields
Linear perturbation growth
Transfer function
Primordial power spectrum: P(k) = A k^n
[Sketch: ln P(k) vs ln k is a straight power law; A sets the amplitude (convention dependent), n sets the slope.]
Primordial power spectrum with running: P(k) = A k^n(k), with running α = dn/d ln k
[Sketch: ln P(k) vs ln k now curves; A and n are convention dependent.]
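A minimal numerical sketch of this parameterization (the pivot scale, the default values, and the factor 1/2 in the exponent are one common convention chosen here for illustration, not the lecture's definitions):

```python
import numpy as np

# Power-law primordial spectrum with optional running, in one common convention:
#   P(k) = A * (k/k_pivot)**(n + 0.5*alpha*ln(k/k_pivot)),
# so that d ln P / d ln k = n + alpha*ln(k/k_pivot), i.e. alpha = dn/dlnk at the pivot.
def primordial_pk(k, A=1.0, n=1.0, alpha=0.0, k_pivot=0.05):
    lnk = np.log(k / k_pivot)
    return A * (k / k_pivot) ** (n + 0.5 * alpha * lnk)

k = np.logspace(-4, 0, 5)                      # wavenumbers (1/Mpc, say)
print(primordial_pk(k))                        # n = 1: scale-invariant (HZ) power law
print(primordial_pk(k, n=0.96, alpha=-0.02))   # tilted, with a small running
```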
CONSTRAINTS ON NEUTRINO MASS
[Figure: CDM density vs neutrino mass from WMAP II and WMAP + high-l experiments (Spergel et al. '07), and with the addition of SDSS main, 2dFGRS, and SDSS LRG.]
[Figure: CMB + SDSS LRG gives a neutrino mass limit of 0.9 eV (95% CL); Tegmark et al. '07.]
Flatness
[Figure: constraints on flatness from WMAP II, WMAP II + H, 2dFGRS ('02), and SNIa (Riess et al. '04); from Spergel et al. '07.]
How about dark energy?
The observed value is roughly 120 orders of magnitude below the naive Planck-scale estimate
(at the EW scale it's only 56 orders of magnitude).
If it had dominated earlier, structures would not have formed.
And it’s moving fast
What’s going on?
A non-exhaustive list of possibilities:
We just got lucky.
"Landscape": there are many other vacuum energies out there with more reasonable values.
It is a slowly varying dynamical component (quintessence).
Einstein was wrong (we still do not understand gravity).
Quintessence
Equation-of-state parameter: w = p/ρ
w = -1 is the cosmological constant; what other options should we consider?
Clustering? Couplings?
If dark energy properties are time dependent, so are other basic physical parameters.
Varying fine-structure constant α
Oklo natural reactor:
1.8 billion years ago there was a natural water-moderated fission reactor in Gabon.
Isotopic abundances constrain the 149Sm neutron-capture cross section and thus α.
Dark energy
[Figure: dark energy constraints combining WMAP II, 2dFGRS, an H prior, and SNe; also shown allowing DE clustering.]
Why are the dark energy constraints from the CMB so weak?
The limitation of the CMB in constraining dark energy is that the CMB is located at z = 1090.
We need to look at the expansion history (i.e. at least two snapshots of the Universe).
What if one could see the acoustic peak pattern also at lower redshifts?
Baryonic Acoustic Oscillations
For those of you who think in real space: evolution of a single perturbation; imagine a superposition. (Courtesy of D. Eisenstein)
If baryons are ~1/6 of the dark matter, these baryonic oscillations should leave some imprint in the dark matter distribution.
For those of you who think in Fourier space:
[Figure: the matter power spectrum, showing the matter-radiation equality turnover and the acoustic horizon at last scattering; data from Tegmark et al. 2006 and DR5 data from Percival et al. 2006.]
Robust and insensitive to many systematics.
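As a rough illustration of where this standard-ruler scale comes from, here is a sketch that integrates the sound speed over the expansion history down to decoupling; the cosmological parameter values are assumptions chosen for illustration, not numbers from the lecture:

```python
import numpy as np
from scipy.integrate import quad

# Comoving sound horizon at last scattering for an illustrative flat LCDM background.
c = 299792.458                     # km/s
h = 0.7
om_m, om_b = 0.30, 0.045           # assumed matter and baryon density parameters
om_g = 2.47e-5 / h**2              # photon density parameter
om_r = om_g * (1.0 + 0.2271 * 3.046)   # photons + massless neutrinos
om_L = 1.0 - om_m - om_r
H0 = 100.0 * h                     # km/s/Mpc
z_star = 1090.0                    # decoupling redshift quoted in the slides

def H(z):
    return H0 * np.sqrt(om_m*(1+z)**3 + om_r*(1+z)**4 + om_L)

def c_s(z):
    R = 0.75 * (om_b / om_g) / (1.0 + z)   # R = 3 rho_b / (4 rho_gamma)
    return c / np.sqrt(3.0 * (1.0 + R))

r_s, _ = quad(lambda z: c_s(z) / H(z), z_star, np.inf)
print(f"comoving sound horizon at z_* ~ {r_s:.0f} Mpc")   # roughly 145-150 Mpc here
```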
THE SYMPTOMS, or OBSERVATIONAL EFFECTS of DARK ENERGY:
Recession velocity vs brightness of standard candles: d_L(z)
CMB acoustic peaks: D_A to last scattering; D_A to z_survey (a distance sketch follows below)
LSS: perturbation amplitude today, to be compared with the CMB; perturbation amplitude at z_survey
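A minimal sketch of how the first two "symptoms" respond to the equation of state w, assuming a flat universe with constant w and illustrative parameter values (radiation is ignored, so the z = 1090 number is only indicative):

```python
import numpy as np
from scipy.integrate import quad

c = 299792.458      # km/s
H0 = 70.0           # km/s/Mpc (assumed)
om_m = 0.3          # assumed matter density

def H(z, w):
    return H0 * np.sqrt(om_m*(1+z)**3 + (1 - om_m)*(1+z)**(3*(1+w)))

def comoving_distance(z, w):
    return quad(lambda zp: c / H(zp, w), 0.0, z)[0]   # Mpc, flat universe

def d_L(z, w):      # luminosity distance: standard candles
    return (1 + z) * comoving_distance(z, w)

def D_A(z, w):      # angular diameter distance: standard rulers (CMB peaks, BAO)
    return comoving_distance(z, w) / (1 + z)

for w in (-1.0, -0.8):
    print(f"w={w}: d_L(1) = {d_L(1.0, w):6.0f} Mpc, D_A(1090) = {D_A(1090.0, w):5.1f} Mpc")
```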
Galaxy cluster number counts
Galaxy clusters are rare events:
P(M,z) ∝ exp[-δ_c² / σ(M,z)²]
In here there is the growth of structure.
Beware of systematics!
“What’s the mass of that cluster?”
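A toy illustration of why the counts are so sensitive to the growth of structure (and why the cluster mass, which enters through σ(M,z), must be known well). The Press-Schechter-like exponential with its factor of 1/2 and the numerical values below are assumptions for illustration:

```python
import numpy as np

delta_c = 1.686          # linear collapse threshold (standard value, assumed)
sigma_today = 0.5        # assumed rms fluctuation on a cluster mass scale

def relative_abundance(growth_ratio):
    """Abundance relative to today when sigma(M,z) = growth_ratio * sigma_today."""
    s = growth_ratio * sigma_today
    return np.exp(-delta_c**2 / (2*s**2)) / np.exp(-delta_c**2 / (2*sigma_today**2))

# A ~10% change in the growth (or in the mass calibration that sets sigma)
# changes the abundance of these rare objects by a factor of a few:
for g in (0.9, 1.0, 1.1):
    print(g, relative_abundance(g))
```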
Inflation
V(φ)
H ~ const
Solves cosmological problems (horizon, flatness).
Cosmological perturbations arise from quantum fluctuations and evolve classically.
Guth (1981), Linde (1982), Albrecht & Steinhardt (1982), Sato (1981), Mukhanov & Chibisov (1981), Hawking (1982), Guth & Pi (1982), Starobinsky (1982), Bardeen, Steinhardt & Turner (1983), Mukhanov et al. (1992), Parker (1969), Birrell & Davies (1982)
Horizon problem
Flatness problem
Structure Problem
Seeing (indirectly) z>>1100
Information about the shape of the inflaton potential is
enclosed in the shape and amplitude of the primordial power
spectrum of the perturbations.
Information about the energy scale of inflation (the height of the potential) can be obtained from the amplitude of the B-mode polarization.
In general, the observational requirement of N_e-fold > 50 requires the potential to be flat (not every scalar field can be the inflaton). But detailed measurements of the shape of the power spectrum can rule different potentials in or out.
But the spacing of the fluctuations (their power as a function of scale) depends on how fast they exited the horizon (H), which in turn depends on the inflaton potential.
The shape of the primordial power spectrum encloses information on the shape of the inflaton potential!
Specific models critically tested
[Figure: constraints in the (n, r) plane, with dn_s/d ln k = 0 and with running allowed; monomial models V(φ) ~ φ^p (p = 2, 4) are shown for 50 and 60 e-foldings (p fixed with N_e varying, and p varying with N_e fixed), together with the Harrison-Zel'dovich (HZ) point.]
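A sketch of how such points are placed in the (n_s, r) plane, using the textbook leading-order slow-roll results for monomial potentials (these formulas are standard approximations, not taken from the slides):

```python
# Leading-order slow-roll predictions for V(phi) ~ phi^p after N_e e-foldings:
#   n_s ~ 1 - (p + 2)/(2*N_e),   r ~ 4*p/N_e
def monomial_ns_r(p, Ne):
    ns = 1.0 - (p + 2.0) / (2.0 * Ne)
    r = 4.0 * p / Ne
    return ns, r

for p in (2, 4):
    for Ne in (50, 60):
        ns, r = monomial_ns_r(p, Ne)
        print(f"p={p}, Ne={Ne}: n_s={ns:.3f}, r={r:.3f}")
# The Harrison-Zel'dovich (HZ) point is n_s = 1, r = 0.
```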
Possible probes of large scale structure
Galaxy surveys
Cluster surveys (SZ, thermal and kinetic)
Lyman-alpha surveys
Weak lensing surveys (***)
21 cm (HI) surveys (far future)
Weak lensing (cosmic shear)
Very near future:
Atacama Cosmology Telescope (& South Pole Telescope, & Planck)
High-resolution maps of the CMB.
Use the CMB as a background light to "illuminate" the growth of foreground cosmological structures.
Coma Cluster: T_electron ≈ 10^8 K
Thermal Sunyaev-Zeldovich
Kinetic SZ
CMB gravitational lensing
Summary
Large-scale structure (LSS), in combination with the CMB, can be used to test the consistency of the model (ΛCDM) and, if that holds, to better constrain cosmology.
Two problems, dark energy and inflation, can be addressed exclusively by looking up at the sky.
In the future, expect an avalanche of LSS data (and acronyms).
So far we have seen the basic theory behind LSS
Next time: real world effects
Redshift-space distortions
Fingers-of-God
Great walls
In linear theory: enhancement of P(k) along the line of sight, Kaiser (1987):
P(k) → P(k) (1 + (2/3) f + (1/5) f²)
Redshift-space distortions (Kaiser 1987):
z_obs = z_true + δv/c
δv ∝ Ω_m^0.6 δρ/ρ = Ω_m^0.6 b⁻¹ δn/n (b is the bias)
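A minimal sketch of the angle-averaged Kaiser boost quoted above, written with β = f/b ≈ Ω_m^0.6 / b so that the slide's expression corresponds to the unbiased (b = 1) case:

```python
# Angle-averaged linear-theory Kaiser (1987) enhancement of P(k) in redshift space.
def kaiser_boost(omega_m, bias=1.0):
    beta = omega_m**0.6 / bias
    return 1.0 + (2.0/3.0)*beta + (1.0/5.0)*beta**2

print(kaiser_boost(0.3))            # unbiased tracer, Omega_m = 0.3: boost ~ 1.4
print(kaiser_boost(0.3, bias=2.0))  # weaker boost for a highly biased tracer
```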
[Figure: redshift-space distortion of spherical shells, in real and Fourier space: linear squashing along the line of sight ("great walls") vs the non-linear Fingers-of-God.]
What's bias?
Measured for the 2dFGRS (Verde et al. 2002)
“If tortured sufficiently,
data will confess to almost
anything”
Fred Menger
Treat your data with respect
(Licia Verde)
Interpretation:
CMBFAST or CAMB to get P(k)
Likelihood analysis
Bayes' theorem: P(α_i | Data) = P(α_i) P(Data | α_i) / P(Data)
Here P(α_i | Data) is the posterior (what you really want), P(α_i) the prior, and P(Data | α_i) the likelihood.
You should not forget:
Likelihood: Gaussian vs non-Gaussian.
What is the probability distribution of your data?
Examples: C_l, a_lm, d, etc.
Gaussian likelihood:
L = 1/[(2π)^(n/2) (det C)^(1/2)] exp[-(1/2) xᵀ C⁻¹ x],   x = (data - theory)
If the data are uncorrelated… much simpler.
Central Limit Theorem: the distribution will converge to a Gaussian.
Best-fit parameters: maximize the likelihood.
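A minimal sketch of evaluating this Gaussian likelihood in log form (the Cholesky factorization is just a numerically convenient way to get xᵀC⁻¹x and det C; the toy numbers are illustrative):

```python
import numpy as np

def gaussian_loglike(data, theory, cov):
    """ln L for a Gaussian likelihood with covariance matrix cov (assumed positive definite)."""
    x = data - theory                       # x = (data - theory)
    chol = np.linalg.cholesky(cov)
    y = np.linalg.solve(chol, x)            # so that y.y = x^T C^-1 x
    logdet = 2.0 * np.sum(np.log(np.diag(chol)))
    n = len(x)
    return -0.5 * (y @ y + logdet + n * np.log(2.0 * np.pi))

# If the data are uncorrelated, cov is diagonal and this reduces to
# -0.5 * sum((data - theory)**2 / sigma**2) up to a constant.
data = np.array([1.0, 2.1, 2.9])
theory = np.array([1.0, 2.0, 3.0])
cov = np.diag([0.1**2, 0.1**2, 0.1**2])
print(gaussian_loglike(data, theory, cov))
```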
Likelihood analysis
Error bars and Confidence Levels
Why errors?
[Figure: likelihood of a parameter a_i, peaked at the estimate â_i and offset from the true value; the 1σ, 2σ, 3σ intervals contain 68.3%, 95.4%, 99.73%. Error sources: cosmic variance, noise (ignoring approximations, mistakes, etc.).]
Joint or marginalized?
Errors
(From: "Numerical Recipes", Ch. 15)
Errors
If the likelihood is Gaussian and the covariance is constant: χ² = -2 ln L
(From: "Numerical Recipes", Ch. 15)
Example: for a multi-variate Gaussian.
Errors
There is a BIG difference between χ² and the reduced χ²
(the identification above holds only for a multi-variate Gaussian with constant covariance).
Statistical and systematic errors
Examples of statistical (random) errors: cosmic variance, instrumental noise, round-off (!), …
Examples of systematic errors: approximations, incomplete modeling, numerics, … (these introduce biases)
As you add more data points (or improve the S/N)
the statistical errors become smaller
but the systematic errors do not.
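A toy numerical illustration of this point, with an assumed constant offset standing in for a systematic error:

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 1.0
noise_sigma = 0.5        # statistical scatter per data point (assumed)
systematic = 0.1         # assumed constant bias in every measurement

for N in (10, 100, 10000):
    data = true_value + systematic + noise_sigma * rng.standard_normal(N)
    stat_err = noise_sigma / np.sqrt(N)
    print(f"N={N:6d}: mean = {data.mean():.3f} +/- {stat_err:.3f} (truth = {true_value})")
# The quoted statistical error shrinks as 1/sqrt(N),
# but the estimate stays ~0.1 away from the truth.
```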
Errors
Operationally:
Grid-based approach: e.g., 2 params, a 10 x 10 grid.
[Figure: likelihood evaluated on a grid in the (Ω_m, σ_8) plane.]
What if you have (say) 7 parameters? You've got a problem!
Markov Chain Monte Carlo (MCMC)
Random walk in parameter space.
At each step, sample one point in parameter space.
The density of sampled points traces the posterior distribution.
FAST: before, ~10^7 likelihood evaluations; now, ~10^5.
Marginalization is easy: just project the points and recompute their density.
Adding external data sets is often very easy.
Operationally:
1. Start at a random location in parameter space: α_old, with likelihood L_old.
2. Try to take a random step in parameter space: α_new, with likelihood L_new.
3a. If L_new ≥ L_old: accept (take and save) the step, "new" → "old", and go to 2.
3b. If L_new < L_old: draw a random number x, uniform in [0,1].
If x ≤ L_new/L_old: do as in 3a.
If x > L_new/L_old: do not take the step (i.e. save "old" again) and go to 2.
KEEP GOING… (a code sketch of this recipe follows below)
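A minimal sketch of steps 1-3 for a toy two-parameter Gaussian "likelihood"; in a real analysis the likelihood call would involve CMBFAST/CAMB and the data, and the target, step sizes, and chain length here are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

def loglike(params):
    # toy target: independent Gaussians centred on hypothetical values (0.3, 0.8)
    return -0.5 * (((params[0] - 0.3) / 0.02)**2 + ((params[1] - 0.8) / 0.05)**2)

step = np.array([0.02, 0.05])     # proposal width: crucial for efficiency (see below)
old = np.array([0.5, 0.5])        # 1. start at a random location
logL_old = loglike(old)
chain = []

for _ in range(20000):
    new = old + step * rng.standard_normal(2)   # 2. try a random step
    logL_new = loglike(new)
    # 3a/3b. accept if L_new >= L_old, otherwise accept with probability L_new/L_old
    if np.log(rng.uniform()) < logL_new - logL_old:
        old, logL_old = new, logL_new
    chain.append(old.copy())      # save "old" again if the step was rejected

chain = np.array(chain)[5000:]    # discard burn-in before doing parameter estimation
print(chain.mean(axis=0), chain.std(axis=0))   # density of points traces the posterior
```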
“Take a random step”
The probability distribution of the step is the
“proposal distribution”, which you should not change once
the chain has started.
The proposal distribution (the step size) is crucial to the MCMC efficiency:
steps too small → poor mixing; steps too big → poor acceptance rate.
MCMC
When the MCMC has forgotten about the starting location
and has well explored the parameter space
you’re ready to do parameter estimation.
USE a MIXING and CONVERGENCE criterion!!!
Burn-in
(From Verde et al 2003)
Beware of DEGENERACIES
[Figure: the degeneracy between Ω_c and h; the combination Ω_c h² is well constrained.]
Reparameterization, e.g., Kosowsky et al. 2002
Once you have the MCMC output:
The density of points in parameter space gives you the posterior distribution.
To obtain the marginalized distribution, just project the points.
To obtain confidence intervals: integrate the "likelihood" surface, or compute where e.g. 68.3% of the points lie.
Give each point in parameter space sampled by the MCMC a weight proportional to the number of times it was saved in the chain.
To add another data set to the analysis (one that does not require extra parameters), reweight each point by the "likelihood" of the new data set; no need to re-run CMBFAST! (A sketch follows below.)
Warning: if the new data set is not consistent with the old one, you get nonsense.
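A sketch of these post-processing steps (projection for marginalization, weighting, and importance reweighting by a new data set). The chain array and the extra "new data" likelihood below are hypothetical placeholders:

```python
import numpy as np

def new_data_loglike(sample):
    # hypothetical extra constraint on parameter 0, e.g. a Gaussian measurement
    return -0.5 * ((sample[0] - 0.30) / 0.03)**2

def reweight(chain):
    """Weight each saved MCMC point by the likelihood of the new data set."""
    logw = np.array([new_data_loglike(s) for s in chain])
    w = np.exp(logw - logw.max())           # shift for numerical stability
    return w / w.sum()

def marginalized_mean_std(chain, weights, i):
    """Marginalized constraint on parameter i: project the points, use the weights."""
    m = np.sum(weights * chain[:, i])
    v = np.sum(weights * (chain[:, i] - m)**2)
    return m, np.sqrt(v)

# Usage (with the toy chain from the MCMC sketch above):
# w = reweight(chain); print(marginalized_mean_std(chain, w, 0))
```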
Thermal Sunyaev-Zeldovich effect
Our Tools
Expansion rate of the Universe, a(t):
ds² = -dt² + a²(t)[dr²/(1-kr²) + r²dΩ²]
Einstein (Friedmann) equation:
(ȧ/a)² = H² = (8πG/3) ρ_m + δH²(z) = (8πG/3) ρ_m + C exp{3 ∫ d ln a [1 + w(a)]}
Growth rate of density fluctuations: g(z) = (δρ_m/ρ_m)/a; it obeys a second-order differential equation (a numerical sketch follows below).
Poisson equation: ∇²Φ(a) = 4πG a² δρ_m = 4πG ρ_m(0) g(a)
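A minimal numerical sketch of that second-order growth equation, written for a flat universe with constant w; the form of the equation in terms of the scale factor and E = H/H0, and the parameter values, are standard assumptions rather than taken from the slide:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Linear growth of delta = drho_m/rho_m for a flat universe with constant w (assumed):
#   D'' + (3/a + E'/E) D' - (3*Om_m)/(2*a**5*E**2) D = 0,   E = H/H0,   ' = d/da
om_m, w = 0.3, -1.0

def E(a):
    return np.sqrt(om_m / a**3 + (1.0 - om_m) * a**(-3.0 * (1.0 + w)))

def dlnE_da(a):
    eps = 1e-4 * a
    return (np.log(E(a + eps)) - np.log(E(a - eps))) / (2.0 * eps)

def rhs(a, y):
    D, Dp = y
    Dpp = -(3.0 / a + dlnE_da(a)) * Dp + 1.5 * om_m / (a**5 * E(a)**2) * D
    return [Dp, Dpp]

# Start deep in matter domination, where D ~ a (so D' = 1)
a_init = 1e-3
sol = solve_ivp(rhs, (a_init, 1.0), [a_init, 1.0], rtol=1e-8)
print("D(a=1):", sol.y[0, -1])   # ~0.78 here, vs 1 for an Omega_m = 1 universe:
                                 # dark energy suppresses the growth of structure
```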