
Lecture 6
Atmospheric Effects and Corrections
Terminology
•Radiant flux
•Irradiance
•Radiance
•Reflection
•Transmittance
Radiance received at a remote sensor
Radiance (LT) from paths 1, 3, and 5 contains intrinsic valuable spectral information about the target of interest. Conversely, the path radiance (Lp) from paths 2 and 4 includes diffuse sky irradiance or radiance from neighboring areas on the ground. This path radiance generally introduces unwanted radiometric noise in the remotely sensed data and complicates the image interpretation process.
Radiance received at a remote sensor
•Path 1 contains spectral solar
irradiance ( Eo) that was attenuated
very little before illuminating the
terrain within the IFOV.
•We are interested in the solar
irradiance from a specific solar
zenith angle ( θo)
•The amount of irradiance reaching
the terrain is a function of the
atmospheric transmittance at this
angle (Tθo).
•If all of the irradiance makes it to
the ground, then the atmospheric
transmittance equals one. If none
of the irradiance makes it to the
ground, then the atmospheric
transmittance is zero.
Radiance received at a remote sensor
•Path 2 contains spectral diffuse sky irradiance (Ed) that never reaches the target study area because of scattering in the atmosphere.
• This energy is often scattered into the IFOV of the sensor system.
• Rayleigh scattering of blue light contributes much to this diffuse sky irradiance. Hence the blue band image produced by a remote sensor system is often much brighter than any of the other bands and contains much unwanted diffuse sky irradiance that was scattered into the IFOV of the sensor system.
• Therefore, if possible, we want to minimize its effects. This quantity is referred to as the upward reflectance of the atmosphere (Edu).
Radiance received at a remote sensor
•Path 3 contains modified
energy from the Sun that has
undergone some Rayleigh, Mie,
and/or nonselective scattering
and perhaps some absorption
and reemission before
illuminating the study area.
• Its spectral composition and
polarization may be somewhat
different from the energy that
reaches the ground from path 1.
• This quantity is also referred to
as the downward reflectance of
the atmosphere (Edd).
Radiance received at a remote sensor
• Path 4 contains radiation that
was reflected or scattered by
nearby terrain covered by snow,
concrete, soil, water, and/or
vegetation into the IFOV of the
sensor system.
• The energy does not actually
illuminate the study area of
interest. Therefore, if possible,
we would like to minimize its
effects.
•Path 2 and Path 4 combine
to produce what is
commonly referred to as
Path Radiance, Lp.
Radiance received at a remote sensor
Path 5 is energy that was also
reflected from nearby terrain into
the atmosphere, but then
scattered or reflected onto the
study area.
Generally insignificant.
Radiance received at a remote sensor
Images are arrays of pixels, where each pixel is represented by a brightness value or grey level, generally between 0 and 255. These values are called DNs.
We can determine the radiance at the sensor for any pixel from its DN value, between 0 and 255:

L_{pxl} = (DN_{pix} \times k) + L_{min}

where

k = \frac{L_{max} - L_{min}}{DN_{max}}

L_{max} and L_{min} are the maximum and minimum measurable radiances of the sensor. k and L_{min} are also called the gain and offset of the detector. This information is provided by the sensor manufacturer.
Radiance received at a remote sensor
Preflight TM-4 and TM-5 spectral range values (from NASA, 1986, Table C-8):

BAND   Lmin (W/m2/sr/μm)   Lmax (W/m2/sr/μm)
1      -1.5                152.1
2      -2.8                296.8
3      -1.2                204.3
4      -1.5                206.2
5      -0.37               27.19
7      -0.15               14.38
DN value of a pixel in bands 1 and 7 is 100. Maximum DN value in both bands is 255.
Radiance at the pixel in band 1? In band 7?

For band 1, k = (152.1 + 1.5)/255 = 0.602353
L_{pix} = (100 \times 0.602353) - 1.5 = 58.73 W/m2/sr/μm

For band 7, k = (14.38 + 0.15)/255 = 0.05698
L_{pix} = (100 \times 0.05698) - 0.15 = 5.54 W/m2/sr/μm
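The DN-to-radiance conversion is easy to script. The following minimal Python/NumPy sketch reproduces the band 1 and band 7 numbers above, with the gains and offsets taken from the preflight TM table; the function name is my own.

```python
import numpy as np

def dn_to_radiance(dn, l_min, l_max, dn_max=255):
    """Convert a digital number to at-sensor radiance: L = DN * k + L_min,
    with gain k = (L_max - L_min) / DN_max."""
    k = (l_max - l_min) / dn_max          # gain, W m-2 sr-1 um-1 per DN
    return np.asarray(dn) * k + l_min     # add back the offset L_min

# Worked examples from the table (TM bands 1 and 7, DN = 100)
print(dn_to_radiance(100, l_min=-1.5,  l_max=152.1))   # ~58.7 W m-2 sr-1 um-1
print(dn_to_radiance(100, l_min=-0.15, l_max=14.38))   # ~5.5  W m-2 sr-1 um-1
```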
The irradiance (E_{sun\lambda}) of the sun at a specific wavelength (λ) and a solar zenith angle of θ_0 is

E_{sun\lambda}\,\cos\theta_0

Remote sensing systems sense wavebands, rather than specific wavelengths. The available irradiance (E_o) in a specific waveband between λ_1 and λ_2 in the area of interest is

E_o = \frac{\int_{\lambda_1}^{\lambda_2} E_{sun\lambda}\,\cos\theta_0 \; d\lambda}{d^2} \quad \text{or} \quad \frac{E_{sun\Delta\lambda}\,\Delta\lambda\,\cos\theta_0}{d^2}

where Δλ = λ_2 − λ_1 is very small and E_{sunΔλ} is the average irradiance in the band Δλ. d^2 (d in AU) accounts for the varying distance of the earth from the Sun. If the reflectance of the pixel of interest is R, then the radiant exitance of the pixel is:

E_{pxl\lambda} = \frac{E_{sun\Delta\lambda}\,\Delta\lambda\,\cos\theta_0}{d^2}\,R

We know that

E_{pxl\lambda} = \int_{\phi=0}^{2\pi}\int_{\theta=0}^{\pi/2} L_{pxl\lambda}\,\cos\theta\,\sin\theta\;d\theta\;d\phi = L_{pxl\lambda}\,\pi

so

L_{pxl\lambda} = \frac{E_{pxl\lambda}}{\pi} = \frac{E_{sun\Delta\lambda}\,\Delta\lambda\,\cos\theta_0\,R}{\pi\,d^2}
Radiance received at a remote sensor
However, the atmosphere scatters and absorbs a proportion of the solar irradiance. If the downward scattered or diffuse sky irradiance is E_{dd} and T_{\theta_0} is the atmospheric transmission, i.e., the proportion of radiance transmitted by the atmosphere in the direction θ_0, then the total irradiance at the pixel is

E_{\lambda(incident)} = \frac{E_{sun\Delta\lambda}\,\Delta\lambda\,\cos\theta_0}{d^2}\,T_{\theta_0} + E_{dd}

The radiance from the pixel due to this irradiance is

L_{pxl\lambda} = \frac{E_{\lambda(incident)}\,R}{\pi} = \frac{\left(E_{sun\Delta\lambda}\,\Delta\lambda\,\cos\theta_0\,T_{\theta_0} + E_{dd}\,d^2\right) R}{\pi\,d^2}
R is the reflectance. If the atmospheric transmission in the direction θ_v is T_{\theta_v}, then the radiance L_{sensor} arriving at the sensor after traversing the atmosphere is:

L_{sensor\lambda} = \frac{\left(E_{sun\Delta\lambda}\,\Delta\lambda\,\cos\theta_0\,T_{\theta_0} + E_{dd}\,d^2\right) R\,T_{\theta_v}}{\pi\,d^2} + L_{path}

where L_{path} is the path radiance.
Radiance received at a remote sensor
Since L_{sensor\lambda} = L_{pxl\lambda}\,T_{\theta_v} + L_{path}, we have L_{pxl\lambda} = (L_{sensor\lambda} - L_{path})/T_{\theta_v}, and solving for the reflectance gives

R = \frac{\pi\,(L_{sensor\lambda} - L_{path})\,d^2}{T_{\theta_v}\left(E_{sun\Delta\lambda}\,\Delta\lambda\,\cos\theta_0\,T_{\theta_0} + E_{dd}\,d^2\right)}

Assume E_{dd} is negligible:

R = \frac{\pi\,(L_{sensor\lambda} - L_{path})\,d^2}{T_{\theta_v}\,E_{sun\Delta\lambda}\,\Delta\lambda\,\cos\theta_0\,T_{\theta_0}}

Assume T_{\theta_v} = 1 (is it justified?):

R = \frac{\pi\,(L_{sensor\lambda} - L_{path})\,d^2}{E_{sun\Delta\lambda}\,\Delta\lambda\,\cos\theta_0\,T_{\theta_0}}

L_{sensor\lambda} is known, E_{sun\Delta\lambda} is known, θ_0 is known, and d is known.
T_{\theta_0} and L_{path} are unknown. Atmospheric correction involves the estimation of T_{\theta_0} and L_{path}.
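As a practical companion to this final expression, here is a minimal Python/NumPy sketch (variable names are my own, not from the lecture) that converts at-sensor radiance to surface reflectance once T_{\theta_0} and L_{path} have been estimated.

```python
import numpy as np

def surface_reflectance(l_sensor, l_path, esun, sun_zenith_deg, d_au, t_theta0=1.0):
    """R = pi * (L_sensor - L_path) * d^2 / (E_sun * cos(theta_0) * T_theta0).

    l_sensor       : at-sensor radiance, W m-2 sr-1 um-1 (scalar or array)
    l_path         : estimated path radiance, same units
    esun           : exo-atmospheric solar irradiance, W m-2 um-1
    sun_zenith_deg : solar zenith angle in degrees
    d_au           : Earth-Sun distance in astronomical units
    t_theta0       : atmospheric transmittance along the solar path (1.0 assumes none)
    """
    cos_theta = np.cos(np.radians(sun_zenith_deg))
    return np.pi * (np.asarray(l_sensor) - l_path) * d_au**2 / (esun * cos_theta * t_theta0)
```

For sensible inputs the result should fall between 0 and 1; values outside that range usually indicate a poor path-radiance or transmittance estimate.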
Objectives of atmospheric corrections
The ultimate goal of remote sensing: to identify the composition of objects on the ground from remote sensing data.
• Spectral reflectance curves are used for this purpose
• However, radiance-at-the-sensor is contaminated by path radiance due to the atmosphere, hence spectral reflectance estimated from remote sensing data is incorrect
• We have to correct the radiance-at-the-sensor to remove atmospheric effects
When is the atmospheric correction really required??
• Mono-temporal data : NO
• Classification: NO
• Change monitoring and detection: YES
• Composition mapping, spectral analysis: YES
Radiometric calibration
DN
↓ Sensor calibration (gain and offset)
Radiance at sensor
↓ Atmospheric correction (image measurement, ground measurements, atmospheric models; sensor view path atmospheric radiance, sensor view path atmospheric transmittance, solar exo-atmospheric spectral irradiance, solar path atmospheric transmittance, down-scattered radiance)
Radiance at ground
↓ Solar and topographic correction (solar angle, DEM)
Surface reflectance
Atmospheric corrections: Techniques
• Histogram minimum method aka dark object subtraction – the bootstrap approach
• Empirical line method
• Radiative transfer models – physical-based approach
Estimation of LP : Dark object subtraction
•Dark-object subtraction techniques derive the corrected DN (digital number) values solely from the digital data with no outside information.
•This type of correction involves subtracting a constant DN value from the entire digital image.
•The assumption is that there is a high probability that at least a few pixels within an image should be black (0% reflectance). If there are no pixels with zero values, that is the effect of atmospheric scattering.
•For example, there are about 45 million pixels in a single TM band, so there is a very high probability that at least one of them should be black.
Estimation of LP : Dark object subtraction
Water bodies have 0% reflectance in the IR region (due to water absorption), hence zero DN. Non-zero values over water bodies in the IR are a consequence of path radiance.

LANDSAT ETM+ BANDS:
Band 1  450-515 nm
Band 2  525-605 nm
Band 3  630-690 nm
Band 4  775-900 nm
Band 5  1550-1750 nm
Band 7  2090-2350 nm

Subtract the non-zero value over water bodies from all pixels. That would make the water body perfectly non-reflecting.
In visible bands, shadows should be black in the absence of path radiance. Hence non-zero values over shadowed areas can be used for dark pixel correction.
Estimation of LP : Dark object subtraction
• Histograms of pixel values in all bands
• pixel values of low reflectance areas near zero
• exposures of dark colored rocks
• deep shadows
• clear water
• Lowest pixel values in the visible and near-infrared are an approximation to the atmospheric path radiance
• Minimum values subtracted from image
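A minimal dark-object (histogram minimum) subtraction sketch in Python/NumPy, assuming the band is already held as a DN array; the tiny-percentile guard against isolated noisy zero pixels is my own assumption, not part of the slide.

```python
import numpy as np

def dark_object_subtraction(band_dn, percentile=0.01):
    """Histogram-minimum (dark object) correction for one band.

    band_dn    : 2-D array of digital numbers
    percentile : small lower percentile used instead of the absolute minimum,
                 to avoid being fooled by a few noisy or blank pixels (assumption)
    """
    dark_dn = np.percentile(band_dn, percentile)   # approximate path-radiance offset in DN
    corrected = band_dn - dark_dn                  # shift histogram so dark objects go to ~0
    return np.clip(corrected, 0, None), dark_dn

# corrected_band, offset = dark_object_subtraction(band1_dn)
```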
Estimation of LP : Dark object subtraction
How will you calculate path radiance for all bands ??
For example, calculate reflectance for a pixel whose DN value is 53 in band 1.
BAND   Lmin (W/m2/sr/μm)   Lmax (W/m2/sr/μm)
1      -1.5                152.1
2      -2.8                296.8
3      -1.2                204.3
4      -1.5                206.2
5      -0.37               27.19
7      -0.15               14.38
Estimation of LP : Dark object subtraction
How will you calculate path radiance for all bands ??
For example, calculate path radiance for a pixel whose DN value is 53 in band 1.
Earth-Sun distance (in AU) by day of year:

Day of Year  Distance    Day of Year  Distance    Day of Year  Distance    Day of Year  Distance    Day of Year  Distance
1            .98331      74           .99446      152          1.01403     227          1.01281     305          .99253
15           .98365      91           .99926      166          1.01577     242          1.00969     319          .98916
32           .98536      106          1.00353     182          1.01667     258          1.00566     335          .98608
46           .98774      121          1.00756     196          1.01646     274          1.00119     349          .98426
60           .99084      135          1.01087     213          1.01497     288          .99718      365          .98333
R = \frac{\pi\,(L_{sensor\lambda} - L_{path})\,d^2}{E_{sun\lambda}\,\cos\theta_0\,T_{\theta_0}}
ETM+ Solar Spectral Irradiances (watts/(m2 * μm)):
Band 1: 1997
Band 2: 1812
Band 3: 1533
Band 4: 1039
Band 5: 230.8
Band 7: 84.90
Band 8: 1362
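To tie the tables together, here is a hedged worked example in Python for the band 1 pixel with DN = 53. The day of year (74), the solar zenith angle (35°) and the dark-object DN (51) are made-up values used only to illustrate the arithmetic; they are not given in the lecture.

```python
import numpy as np

# Band 1 calibration (preflight TM table above)
l_min, l_max, dn_max = -1.5, 152.1, 255
k = (l_max - l_min) / dn_max                 # gain ~0.6024

l_sensor = 53 * k + l_min                    # radiance of the pixel of interest
l_path   = 51 * k + l_min                    # radiance of the assumed dark object (DN = 51)

esun1 = 1997.0                               # W m-2 um-1, ETM+ band 1 (table above)
d     = 0.99446                              # Earth-Sun distance, assumed day of year 74
theta = np.radians(35.0)                     # assumed solar zenith angle

# Surface reflectance, assuming T_theta0 = T_thetav = 1 and E_dd negligible
R = np.pi * (l_sensor - l_path) * d**2 / (esun1 * np.cos(theta))
print(R)                                     # ~0.0023 for these made-up inputs
```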
Estimation of LP : Dark object subtraction
Regression technique
• DN values of correlated bands are plotted
• Least-squares line fit using standard regression methods
• The resulting offset is an approximation of the atmospheric path radiance, and this offset is subtracted from the image
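A small sketch of this regression approach, assuming the two correlated bands are available as NumPy arrays; the band pairing in the usage comment is only an example.

```python
import numpy as np

def regression_offset(band_x, band_y):
    """Fit band_y = slope * band_x + offset by least squares and return the
    offset, which approximates the additive path-radiance (haze) component
    of band_y relative to band_x."""
    slope, offset = np.polyfit(band_x.ravel(), band_y.ravel(), deg=1)
    return offset

# haze_offset = regression_offset(band7_dn.astype(float), band1_dn.astype(float))
# band1_corrected = band1_dn - haze_offset
```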
Empirical line method
One dark (X1) and one bright (X2) object are selected on the image which can also be clearly identified on the ground.
Ground reflectance of X1 and X2 is measured using a field radiometer (R_X1 and R_X2).
Radiance-at-the-sensor of X1 and X2 is calculated from the image (L_X1 and L_X2).
The two points are plotted on a graph and joined by a line, and the slope (s) and intercept (a) of the line are measured.
The equation of the line is derived and used for converting all radiance values into reflectance values:

R_i = L_i s - a

R - Reflectance
a - Offset
s - Slope = (R_X1 - R_X2)/(L_X1 - L_X2)
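A minimal sketch of the empirical line calculation described above; the target radiances and field reflectances in the usage line are placeholders.

```python
import numpy as np

def empirical_line(l_image, l_x1, l_x2, r_x1, r_x2):
    """Convert at-sensor radiance to reflectance with the empirical line:
    R_i = L_i * s - a, where s and a come from the dark (X1) and bright (X2)
    field-calibration targets."""
    s = (r_x1 - r_x2) / (l_x1 - l_x2)     # slope of the line through the two targets
    a = l_x1 * s - r_x1                   # offset so the line passes through (L_x1, R_x1)
    return np.asarray(l_image) * s - a

# reflectance = empirical_line(radiance_band, l_x1=12.0, l_x2=85.0, r_x1=0.03, r_x2=0.45)
```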
Estimation of LP : Dark object subtraction
Regression technique
Does it always work?
The key criterion of an atmospheric correction algorithm: quantify atmospheric influences on satellite image radiometry while at the same time remaining insensitive to surface reflection effects.
Estimation of LP : Dark object subtraction
Regression technique
So how to correct this image?
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
Y. Zhang et al., 2002 (RSE)
 Manually select several clear and hazy area pixels in the image
Two spectral bands are selected based on the following criteria:
• The spectral responses of different land cover types, under clear atmospheric conditions, should
be highly correlated in the two bands. This will result in a well-defined surface response vector in
spectral space called “clear line” (CL)
• The effect of haze should be markedly different in the two bands so that increased atmospheric
contamination manifests in increased shift away from the CL
• Typically we would select blue and red bands
 Apply a transformation whose coefficients define a direction orthogonal to the CL and whose
response magnitude is proportional to the deviation from this line
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
Y. Zhang et al., 2002 (RSE)
Schematic diagram of the TM1 – TM3 spectral space illustrating the conceptual components of the HOT. Under clear sky conditions, radiances of common surface cover types, coded as A – K, exhibit high correlation and define a 'clear line' (CL). The effect of haze of increasing optical depth, illustrated by the numerical sequences 1 – 18, is to cause pixels to 'migrate' away from the CL. The HOT quantifies the atmospheric contamination level at a pixel location by its perpendicular distance, in spectral space, from the CL.
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
Y. Zhang et al., 2002 (Rem Sens Env)
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
1. Select two correlated bands (bands
showing similar reflectance
characteristics for all objects) but
affected by scattering due to
atmospheric components to different
degrees.
Example: Bands 1 (Blue) and 3 (Red) of
ETM/TM
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
2. Mask out areas with obvious haze
3. Select some very clear areas that are unaffected by clouds/haze
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
4. Plot DN (Blue band – X axis) vs DN (Red band – Y axis) of pixels from the clear areas
5. Fit the pixel DNs to the clear line generated by linear regression (slope = α and offset = β on the x axis)
6. The haze vector is orthogonal to the clear line
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
7. Plot the clear line
8. Plot DN (Blue) vs DN (Red) for all pixels in the image
9. The haze vector is orthogonal to the clear line, hence haze pixels can be identified
10. Calculate HOT for all pixels as the offset of a pixel from the clear line in the haze-vector direction:

HOT = (DN_{blue} - \beta)\,\sin\varphi - DN_{red}\,\cos\varphi

or, disregarding the offset,

HOT = DN_{blue}\,\sin\varphi - DN_{red}\,\cos\varphi

where φ is the slope angle of the clear line (tan φ = α).
[Scatter plot of Band 1 (Blue) vs Band 3 (Red) DNs showing the clear line with slope α and offset β]
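Steps 4 to 10 can be sketched in a few lines of Python/NumPy: fit the clear line from the user-selected clear pixels, convert its slope to an angle, and compute each pixel's HOT value as its offset from that line in the haze-vector direction (offset β disregarded here). Array and mask names are placeholders.

```python
import numpy as np

def hot_transform(blue, red, clear_mask):
    """Haze Optimization Transform (Zhang et al., 2002).

    blue, red  : 2-D DN arrays of the two correlated bands (e.g. TM1, TM3)
    clear_mask : boolean array marking haze-free pixels used to fit the clear line
    Returns the HOT image: HOT = DN_blue * sin(phi) - DN_red * cos(phi),
    where phi is the slope angle of the clear line (offset disregarded).
    """
    slope, offset = np.polyfit(blue[clear_mask], red[clear_mask], deg=1)
    phi = np.arctan(slope)                        # slope angle of the clear line
    return blue * np.sin(phi) - red * np.cos(phi) # offset from the clear line

# hot_image = hot_transform(band1_dn.astype(float), band3_dn.astype(float), clear_mask)
```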
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
11. Generate HOT Image and determine the HOT values for clear
areas and hazy areas
(Not the same image as in the previous slide)
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
12. Plot histograms of the different HOT levels for clear and hazy areas.
Increasing HOT => increasing haze (clear areas have low HOT values, hazy areas high HOT values).
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
13. Plot the histogram lower bound versus HOT for bands TM1–TM3.
14. Estimate the radiometric adjustment using a method similar to dark object subtraction to normalize the image to the radiometric level of the clearest areas.
From the Step 13 plot, note that, for band TM1 (Blue), the histogram lower bound for clear pixels (i.e., HOT = 30) is approximately 20 DNs. Consider a hazy pixel with an observed HOT level of 40. It is a member of a histogram with a lower bound of 27. This implies that this hazy pixel should have its band 1 DN level reduced by 7 during the radiometric adjustment phase. This procedure can be used to adjust all bands for which the histogram analysis has been done.
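Steps 13 and 14 might be sketched as follows. This is an illustrative reading of the procedure, not the exact published implementation; the HOT bin width and the use of the 1st percentile as the "histogram lower bound" are my assumptions.

```python
import numpy as np

def hot_dehaze(band, hot, clear_hot_level, bin_width=5, lower_pct=1.0):
    """Normalize a band to the radiometric level of the clearest areas.

    For each HOT bin, the histogram lower bound (here the 1st percentile) is
    compared with the lower bound of the clear-pixel bin, and the difference
    is subtracted from every pixel in that bin (e.g. 27 - 20 = 7 DN in the
    slide example).
    """
    corrected = band.astype(float).copy()
    bins = np.floor(hot / bin_width).astype(int)
    clear_bin = int(np.floor(clear_hot_level / bin_width))
    clear_lb = np.percentile(band[bins == clear_bin], lower_pct)
    for b in np.unique(bins):
        lb = np.percentile(band[bins == b], lower_pct)   # lower bound of this HOT level
        corrected[bins == b] -= (lb - clear_lb)          # shift down to the clear level
    return np.clip(corrected, 0, None)
```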
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
Estimation of LP : Haze Removal Algorithm
Haze Optimization Transform (HOT)
Results
Advantages and disadvantages of image-based techniques??
Model-based atmospheric corrections – Absorption
Transmittance (T):

T = \frac{\text{Transmitted radiation}}{\text{Incident radiation}} = \frac{I}{I_0}

Absorbance (A):

A = \log_{10}\!\left(\frac{1}{T}\right) = -\log_{10} T \qquad \text{or, for gases,} \qquad A = \ln\!\left(\frac{1}{T}\right) = -\ln T

Beer's Law: For monochromatic plane-parallel light entering a medium perpendicular to the surface of the medium:

A = \alpha c L

where c is the molar concentration, L is the light path length, and α is the molar absorption coefficient for the medium.
If L = 1 cm, then A = αc, or c = A/α.
Since A = αcL and A = -\ln T, we have T = e^{-\alpha c L}. Hence Beer's law can be used to estimate concentrations.
Molar absorption coefficient is sometimes called molar extinction coefficient – however, this is only in the
idealized case when scattering is zero.
Model-based atmospheric corrections
Molar absorption coefficient, extinction coefficient and attenuation coefficient

T = e^{-\alpha c L} \quad \text{in the absence of scattering.}
However, transmittance is function of absorptance + scattering, hence we need to define a new
term, called “optical depth (or optical thickness, τ) ”, as a measure of transmittance.
Optical depth is defined as the negative natural logarithm of the fraction of radiation that is not
scattered or absorbed on a path.
Hence optical depth is dimensionless, and in particular is not a length, though it is a
monotonically increasing function of path length, and approaches zero as the path length
approaches zero.
Hence, optical depth is conceptually analogous to absorbance, but not the same: it includes both absorption and scattering.

\tau = -\ln\!\left(\frac{I}{I_0}\right) = -\ln T \qquad\Longleftrightarrow\qquad T = e^{-\tau}

\tau = \varepsilon c L

where ε is the extinction (or attenuation) coefficient.
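A tiny numerical check of the relation between optical depth and transmittance, using assumed τ values:

```python
import numpy as np

tau = np.array([0.05, 0.2, 0.5, 1.0])   # assumed optical depths (dimensionless)
T = np.exp(-tau)                        # fraction neither scattered nor absorbed
tau_back = -np.log(T)                   # recover optical depth from transmittance

print(T)          # ~[0.951 0.819 0.607 0.368]
print(tau_back)   # [0.05 0.2  0.5  1.  ]
```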
Optical thickness or Optical depth
Optical thickness (δ) has three components:

\delta = \delta_{\text{molecular scattering}} + \delta_{\text{aerosol}} + \delta_{\text{molecular absorption}}
Optical thickness due to molecular scattering by atmospheric gases
• Mainly affects shorter wavelengths
Optical thickness due to molecular absorption by atmospheric gases
• Mainly due to 7 gases:
water vapour (H2O), carbon dioxide (CO2), ozone (O3), nitrous oxide (N2O), carbon monoxide (CO),
methane (CH4) and oxygen (O2)
• Water vapour absorption is significant and varies with time and space.
Optical thickness due to atmospheric aerosol
Aerosol scattering is significant and varies with time and space.
Figure: Optical depth due to molecular absorption by atmospheric gases.
Figure: Optical depth due to molecular scattering (Rayleigh scattering) and absorption by atmospheric gases.
Strong water vapor bands are located near 1.38 and 1.88 micron; no signals are detected under clear sky conditions.
Figure: Radiance spectrum over a pixel (mineral kaolinite) and the reflectance spectrum of kaolinite after atmospheric corrections.
Ground pixel reflectance is given by:

R = \frac{\pi\,(L_{sensor\lambda} - L_{path})\,d^2}{E_{sun\lambda}\,\cos\theta_0\,T_{\theta_0}}
What is known?
• Sun-earth distance
• Radiance at the sensor
• Zenith angle
• Incoming solar spectral irradiance
What is unknown?
• Path radiance
• Transmittance
Radiative transfer codes are used to estimate the unknowns
Radiative Transfer
• The physical phenomenon of energy transfer in a medium
• In our case, it refers to electromagnetic radiation in the atmosphere
• The propagation of the radiation through the atmosphere is affected by
the processes of absorption and scattering, as well as atmospheric
emissions
• The equations of radiative transfer describe the interactions
mathematically
However, we need not worry about the radiative transfer equations themselves (leave that to atmospheric physicists), because computer codes are available that can model the atmospheric transmission of solar radiation using radiative transfer equations. We do, however, need to worry about providing input variables to these equations.
Radiative Transfer codes
• MODTRAN (Moderate resolution atmospheric transmission)
• 6SV (Second Simulation of a Satellite Signal in the Solar Spectrum)
• DISORT (Discrete Ordinates Radiative Transfer Program for a Multi-Layered
Plane-Parallel Medium)
Radiative transfer codes simulate the Path Radiance (Lpath) and atmospheric
transmittance (Tθ), based on user provided values for different atmospheric
parameters.
Atmospheric correction algorithms
Atmospheric correction algorithms are used for estimating the values of atmospheric parameters, based on user-provided inputs.
Main algorithms are (all commercial):
• ATmospheric CORrection (ATCOR): PCI Geomatica, ERDAS
• ATmosphere REMoval (ATREM)
• Fast Line-of-sight Atmospheric Analysis of Spectral Hypercubes (FLAASH) (ENVI)
Flow chart of radiative-transfer based models:
AC algorithm → key atmospheric parameters → RT code → LUTs → AC algorithm
(AC = Atmospheric Correction; RT = Radiative Transfer)
Atmospheric parameters to be input to Radiative Transfer codes for simulation of path radiance and transmittance.
Inputs to radiative transfer models:
• Solar azimuth
• Location
• Wavelength (bands)
• Ground elevation
• Sensor view angle
• Atmospheric optical depth
With the above parameters, radiative transfer models simulate path radiance and transmittance for estimating surface reflectance.
Estimation of atmospheric optical depth
Atmospheric optical depth is due to:
A. Molecular scattering by gases
B. Molecular absorption by gases
C. Absorption and scattering by aerosols
Since optical depth is a function of concentration (τ = εcL), we need to estimate the concentration of the seven important gases (O2, O3, N2O, CO2, CO, CH4, H2O) plus aerosols in the direction of solar incidence.
PROBLEM: The atmosphere is not homogeneous, vertically or horizontally. It is not realistically possible to measure the concentrations of all of the above gases and aerosols over the entire atmospheric column (at least up to 100 km height) for every pixel.
PRACTICAL SOLUTION USED IN ALL ALGORITHMS:
• Define standard atmospheres
• mid-latitude summer atmosphere
• US standard atmosphere 1976
• standard tropical atmosphere
• desert tropical (arid) atmosphere
• fall (autumn) atmosphere
• mid-latitude winter
• subarctic winter
• Vertically profile different standard atmospheres at a number of locations for:
• air pressure,
• air temperature,
• Gaseous concentrations (except H2O)
• τ due to aerosols?
• τ due to water vapour?
The unknown parameters:
• Aerosol type and concentration
• Water vapour concentration
Water vapour concentration
The sensor must have bands in one of the following ranges:
•1050-1210 nm (for the 1135 nm water feature)
•870-1020 nm (for the 940 nm water feature)
•770-870 nm (for the 820 nm water feature)
The band depth can be used to estimate the water vapour content (pixel-wise).
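One way such a band depth is computed is by continuum interpolation: estimate the radiance the absorption band would have without absorption from two shoulder bands, then ratio the measured value against that estimate. The sketch below assumes the 940 nm feature with shoulder bands near 865 and 1040 nm; the band choice and names are illustrative, not the exact bands of any particular algorithm.

```python
import numpy as np

def band_depth_940(r_865, r_940, r_1040):
    """Continuum-interpolated band depth for the 940 nm water-vapour feature.

    The continuum at 940 nm is a linear interpolation between the 865 nm and
    1040 nm shoulder bands; the deeper the absorption, the smaller the ratio
    r_940 / continuum and the larger the returned depth.
    """
    w = (940.0 - 865.0) / (1040.0 - 865.0)        # interpolation weight
    continuum = (1.0 - w) * np.asarray(r_865) + w * np.asarray(r_1040)
    return 1.0 - np.asarray(r_940) / continuum    # 0 = no absorption, -> 1 = strong absorption

# depth = band_depth_940(rad_865, rad_940, rad_1040)   # per-pixel, arrays or scalars
```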
Aerosol : nature and concentration

\delta_{aerosol} \approx \varepsilon_{aerosol}\,C_{aerosol}

• For ε_aerosol: estimate from visibility
  • Get the user input for visibility
  • Use the Koschmieder equation (VIS = 3.912/ε) to estimate the extinction coefficient from visibility (see the sketch after this list)
  • Aerosol optical depth is not critical if the visibility is high, that is >40 km
• For C_aerosol: concentration is difficult to estimate for every atmospheric condition, therefore standard types are used: rural, urban, desert, maritime
  • The concentration of aerosols is measured for different visibility ranges and different aerosol types, and the values are stored in lookup tables
• Typical user inputs values are:
•sensor type (LANDSAT/ASTER/etc etc)
•solar azimuth
• sensor viewing angle
• Latitude-Longitude
• standard atmosphere in the image
• visibility
• aerosol type
• average ground elevation
• Water vapour absorption channels (if the sensor type is provided, this is taken from the header file)
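The Koschmieder relation referenced above (VIS = 3.912/ε) can be sketched directly; the example visibilities are assumed values.

```python
import numpy as np

def extinction_from_visibility(vis_km):
    """Koschmieder equation: VIS = 3.912 / epsilon, so epsilon = 3.912 / VIS.
    Returns the extinction coefficient in km^-1 for a visibility given in km."""
    return 3.912 / np.asarray(vis_km, dtype=float)

print(extinction_from_visibility([5, 10, 23, 40]))   # ~[0.78 0.39 0.17 0.10] km^-1
```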
ATCOR-2 LUT derived using MODTRAN
The catalogue (LUT) consists of atmospheric correction functions for:
1. Different standard atmospheres (altitude profiles of pressure, air temperature, gas concentrations):
   • mid-latitude summer atmosphere
   • US standard atmosphere 1976
   • standard tropical atmosphere
   • desert tropical (arid) atmosphere
   • fall (autumn) atmosphere
   • mid-latitude winter
   • subarctic winter
2. Different aerosol types: rural, urban, desert, maritime
3. Different aerosol concentrations (aerosol optical depth) defined by the visibility. The range provided is 5-40 km; calculated values are 5, 7, 10, 15, 23, and 40 km. Values for 4 and 80 km are obtained by linear extrapolation. The conditions range from hazy to very clear.
4. Water vapour concentrations (calculated from absorption band depths – optionally user defined)
5. Different ground elevations ranging from 0 to 1 km (calculated values are for 0, 0.5, and 1 km ASL; other values interpolated)
6. Solar zenith angles ranging from 0° to 70° in steps of 10°
7. Different functions for each sensor and each band – the atmospheric correction functions depend on the spectral response of the sensor, thus there are different functions for each sensor and each band
The above parameters can be specified by the user, or are read from the image header.
For illustration - in ATCOR-2, the number of entries in the look-up tables for the six reflective bands of Landsat TM is about 9000, i.e.
12 x 7 x 6 x 3 x 6 = 9072, including 12 atmospheres, 7 solar zenith angles, 6 visibilities, 3 ground elevations, and 6 bands.
Measured atmospheric data can also be used to calculate new files of look-up tables for the catalogue.
After estimation of the path radiance, global flux and atmospheric transmission, apply the following equation to derive surface reflectance:

R = \frac{\pi\left\{d^2\,(c_0 + c_1\,DN) - L_{path}\right\}}{T_{\theta_0}\,E_g}
Bidirectional Reflectance Distribution Function
The bidirectional reflectance distribution function (BRDF) is a theoretical concept that describes
the relationship between
1) the geometric characteristics of the solar irradiance, and
2) the remote sensing system viewing geometry;
hence the bidirectional terminology (Sandmeier, 1996; Jensen, 2000)
f(\theta_i, \phi_i; \theta_r, \phi_r; \lambda) = \frac{dL(\theta_i, \phi_i; \theta_r, \phi_r; \lambda)}{dE(\theta_i, \phi_i)}
Bidirectional Reflectance Distribution Function
Very difficult to acquire BRDF information about a surface because
a) the Sun is constantly moving across the sky, and/or
b) it is difficult to acquire multiple images of the terrain from various angles of view in a short
period of time.
This problem resulted in the invention of the goniometer, a specialized instrument that measures spectral reflectance in a specified number of directions distributed throughout the hemisphere above a particular surface in a very short time (5-10 minutes), allowing scientists to generate a useful BRDF for that surface.
Bidirectional Reflectance Distribution Function
Bidirectional reflectance factor (R)

R(\theta_i, \phi_i; \theta_r, \phi_r; \lambda) = \frac{dL(\theta_i, \phi_i; \theta_r, \phi_r; \lambda)}{dL_{ref}(\theta_i, \phi_i; \theta_r, \phi_r; \lambda)}

dL is the energy reflected from a surface in a specific direction; it is divided by the radiance dL_ref reflected from a loss-less Lambertian reference panel measured under identical illumination geometry. R_ref is a calibration coefficient determined for the spectral reflectance panel used.
The bidirectional reflectance factor (R) is then normalized to an anisotropy factor (ANIF) to analyze the spectral variability in BRDF data. The ANIF is calculated by normalizing the bidirectional reflectance data R to the nadir reflectance Ro using the equation (Sandmeier et al., 1998a; Sandmeier and Itten, 1999):

ANIF(\theta_i, \phi_i; \theta_r, \phi_r; \lambda) = \frac{R(\theta_i, \phi_i; \theta_r, \phi_r; \lambda)}{R_o(\lambda)}
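As a small illustration of the ANIF normalization, assuming the bidirectional reflectance factors are held in a NumPy array together with the nadir value:

```python
import numpy as np

def anisotropy_factor(brf, brf_nadir):
    """ANIF = R(theta_i, phi_i; theta_r, phi_r; lambda) / R_o(lambda):
    bidirectional reflectance factors normalized by the nadir reflectance."""
    return np.asarray(brf) / brf_nadir

# anif = anisotropy_factor(goniometer_brf, goniometer_brf_nadir)
```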
Bidirectional Reflectance Distribution Function
Figure (Jensen and Schill, 2000): Reflectance versus wavelength for view zenith angles of -75°, -45°, -15°, 0°, 15°, 45°, and 75° (Sun zenith angle 75°); atmospheric absorption bands are indicated.
Bidirectional Reflectance Distribution Function
Figure (Jensen and Schill, 2000): Reflectance versus wavelength for view azimuth angles of 0°, 30°, 60°, 90°, 150°, and 270°; atmospheric absorption bands are indicated.
Bidirectional Reflectance Distribution Function
Bidirectional reflectance factor (R)
An understanding of BRDF is needed in remote sensing to correct for Sun illumination angle and sensor viewing angle effects for:
• Mosaicking images,
• Deriving albedo,
• Improving land cover classification accuracy,
• Enhancing cloud detection,
• Correcting for atmospheric conditions, and
• Identifying bands that are least impacted by BRDF and recognizing optimal sun/sensor angles of view
(Myneni et al., 1995; Woodcock et al., 1997).
Bidirectional Reflectance Distribution Function
The accurate computation of BRDF is required for:
• Making corrections to reflectance measurements of features measured from nadir or off-nadir pointing remote sensing systems.
• Identifying bands that are least impacted by BRDF, recognizing optimal sun/sensor angles of view, and providing insight into radiometrically adjusting remotely sensed data to minimize BRDF effects.