
U.S. Geological Survey
ASPRS LiDAR Calibration and QA Telecon Results
ASPRS, 4 May 2011
Greg Stensaas
Remote Sensing Technologies Project Manager
Data Management Branch
USGS/EROS Center
Sioux Falls, SD
U.S. Department of the Interior
U.S. Geological Survey
Background
· Currently, the LiDAR system calibration is defined by a handful of parameters, and there is an ongoing effort to derive these parameters consistently for every project.
· However, there is a gap in the understanding of how these parameters relate to the accuracy of the final data and derived products on the ground.
· This has severely restricted the ability of local and state governments to fully leverage the potential of LiDAR data.
· The solution requires a thorough analysis and definition of the calibration parameters and their effects on ground accuracy, and the definition of a common process via ASPRS.
Background
· Discussion at Fall ASPRS (and many previous ASPRS presentations and committee meetings)
  · on the strong need for common cal/val and QA processes
  · USGS Specification v13 and associated QA/QC needs
· 4 monthly telecons and ILMF discussions
· LiDAR QA/QC Face-to-Face meeting, Friday, May 2, 2011, ASPRS annual conference, Milwaukee
Objective
· The objective of the Face-to-Face meeting was to elaborate on the LiDAR QA/calibration activities of the last few months and to assign tasks and actions. The meeting sought to establish the need and actions for solving and documenting the LiDAR cal/val and QA issues, and to define how to get it done.
· Agenda:
  · Welcome, Introduction, and Purpose of the Meeting
  · LiDAR QA/QC issues/problems
  · Summarize the previous 4 telecons
  · Discuss LiDAR specifications: write, review, and edit LiDAR QA/calibration terms and processes
  · The LiDAR Calibration spreadsheet
  · Call for volunteers for writing, reviewing, and editing
  · Discuss the format of specifications for the QA/calibration documents; models include LAS, etc.
  · Define the Way Forward
Purpose
· During the telecons and the F2F meeting, we agreed on the need for a coordinated QA/calibration process for LiDAR.
· This effort has generated a great deal of important discussion that will hopefully lead to ASPRS documentation of the QA/calibration processes for LiDAR.
· The members attending the telecons are currently defining the salient terminology, definitions, and concepts required to unambiguously describe the QA/calibration process.
· A matrix sheet synthesizing these activities has been generated, and members are requested to volunteer to document, review, and edit these LiDAR QA/calibration processes, terms, and definitions based on the matrix.
Error Process Spreadsheet
The spreadsheet captures each error type under the following columns:
· Error Class: the category the error belongs to (Spatial Accuracy, Radiometric Accuracy, Point Pattern).
· Identification: What is it that I don't want to see in the data? What do we name this type of defect? What are the characteristics of the identified type of error?
· Schematic Illustration of Manifested Error: What does the error look like? How do I know it when I see it? What is the math model that describes it?
· Quantification: What level of this effect is acceptable in the data set? Is it a percentage of swath? A percentage of another spec (e.g., the elevation spec)? A fixed value?
· Measurement: How do I determine what amount of this identified anomaly is acceptable in the data? Algorithms? Sample size (number)? Sample size (area and shape)? Location of samples? Number of samples? What is the nature of the control data required? Control features versus control points, and the accuracy of the control data required.
· Reporting: How should the result be reported? A readable document? A point cloud processor log file or registry files? On a flight-line-by-flight-line basis? On a lift-by-lift basis? What units should be used to describe the extent of any given error type?
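To make the matrix structure concrete, here is a minimal sketch of how one row of the spreadsheet might be represented in code (Python; the class name ErrorMatrixRow, the field names, and the example values are illustrative assumptions, not part of the ASPRS/USGS material):

```python
from dataclasses import dataclass

@dataclass
class ErrorMatrixRow:
    """One row of the LiDAR QA/calibration error spreadsheet (illustrative only)."""
    error_class: str      # e.g., "Spatial Accuracy"
    identification: str   # what the defect is and what we call it
    illustration: str     # what the error looks like / math model reference
    quantification: str   # acceptable level (percent of swath, fixed value, ...)
    measurement: str      # how the amount of the anomaly is measured (algorithm, samples, control)
    reporting: str        # how, and in what units, the result is reported

# Hypothetical example entry drawn from the Spatial Accuracy class
row = ErrorMatrixRow(
    error_class="Spatial Accuracy",
    identification="elevation error to ground control",
    illustration="vertical offset of the LiDAR surface relative to surveyed checkpoints",
    quantification="fixed value or percentage of the elevation spec (to be defined)",
    measurement="compare interpolated LiDAR elevations to surveyed control points",
    reporting="mean error and RMSE per project or per lift in a readable document",
)
print(row.error_class, "-", row.identification)
```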
Error Class: Spatial Accuracy
· latitude error to ground control
· longitude error to ground control
· elevation error to ground control
· roll error
· pitch error
· heading error
· "pop-ups" on reflective striping
· scan-to-scan errors due to internal alignments of the laser to the scanning device
· intra-scan errors due to internal alignments of the laser to the scanning device
· positive-to-negative scan differences (typically elevation differences between left-bound and right-bound scans, typically worst near the edge of the swath)
· harmonic effects that may go in and out of phase along the flight line; normally observed as an anomaly from scan line to scan line
· harmonic effects that may go in and out of phase within a scan; normally observed as an anomaly from scan line to scan line
· etc.
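As one example of how the Measurement and Reporting columns could be filled in for the "elevation error to ground control" item above, a minimal sketch follows (Python with NumPy; the function name, the assumption that LiDAR elevations have already been interpolated at the checkpoint locations, and the reported statistics are illustrative, not an ASPRS-prescribed procedure):

```python
import numpy as np

def vertical_error_stats(lidar_z, control_z):
    """Summarize elevation error of LiDAR-derived heights against surveyed control.

    lidar_z   : LiDAR surface elevations interpolated at the checkpoint locations
    control_z : surveyed checkpoint elevations
    Returns mean error, RMSE, and the 95th-percentile absolute error (illustrative).
    """
    dz = np.asarray(lidar_z, dtype=float) - np.asarray(control_z, dtype=float)
    return {
        "mean_error_m": float(np.mean(dz)),
        "rmse_m": float(np.sqrt(np.mean(dz ** 2))),
        "abs_error_p95_m": float(np.percentile(np.abs(dz), 95)),
        "n_checkpoints": int(dz.size),
    }

# Hypothetical usage with five checkpoints
print(vertical_error_stats([101.12, 98.40, 105.03, 99.87, 102.51],
                           [101.05, 98.52, 104.98, 99.90, 102.60]))
```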
Error Class: Radiometric Accuracy
· consistent radiometry flight line to flight line
· "blooming" or saturation, similar to over-exposure in photos
· "drop-outs" on low-reflectivity surfaces
· intensity output calibration, if there is a desire to output a reflectivity value as opposed to an intensity value
· scan-line-to-scan-line radiometry variation (stripes of over- or under-exposed data)
· intra-scan-line radiometric consistency ("feathers" at high-contrast boundaries)
· positive-to-negative scan differences (typically intensity differences between left-bound and right-bound scans, typically worst near the edge of the swath)
· etc.
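To illustrate how flight-line-to-flight-line radiometric consistency (the first item above) might be screened, here is a minimal sketch (Python with NumPy; the per-line median comparison and the 10% tolerance are assumptions for illustration only, not values agreed in the telecons):

```python
import numpy as np

def flag_inconsistent_flight_lines(intensities_by_line, tolerance=0.10):
    """Flag flight lines whose median return intensity deviates from the
    project-wide median by more than `tolerance` (fractional). Illustrative only.

    intensities_by_line : dict mapping flight-line ID -> array of return intensities
    """
    medians = {line: float(np.median(vals)) for line, vals in intensities_by_line.items()}
    overall = float(np.median(list(medians.values())))
    return {line: m for line, m in medians.items()
            if abs(m - overall) > tolerance * overall}

# Hypothetical usage with three flight lines
lines = {
    "L1": np.array([120, 125, 118, 130]),
    "L2": np.array([122, 127, 121, 126]),
    "L3": np.array([150, 160, 155, 158]),  # noticeably brighter line
}
print(flag_inconsistent_flight_lines(lines))  # expect {'L3': ...}
```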
Error Class: Point Pattern
· average point density
· worst-case point density
· worst-case cross-track spacing
· worst-case along-track spacing
· worst-case cross-track to along-track spacing ratio, or equivalent
· max cut-off
· etc.
NOTE: A Fugro Horizons paper on point-pattern nominal post spacing quantifies deviation from an idealized raster pattern.
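To show how the average and worst-case point density items above might be computed, a minimal sketch follows (Python with NumPy; the gridding approach, the 1 m cell size, and the random test points are assumptions for illustration, not a specification):

```python
import numpy as np

def point_density_stats(x, y, cell_size=1.0):
    """Grid the point-cloud footprint and report average and worst-case
    point density in points per square unit (illustrative sketch).

    x, y      : planimetric coordinates of the LiDAR returns
    cell_size : edge length of the analysis grid cells (same units as x, y)
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x_edges = np.arange(x.min(), x.max() + cell_size, cell_size)
    y_edges = np.arange(y.min(), y.max() + cell_size, cell_size)
    counts, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges])
    density = counts / (cell_size ** 2)
    return {
        "average_density": float(density.mean()),
        "worst_case_density": float(density.min()),  # sparsest cell
    }

# Hypothetical usage with random points over a 10 m x 10 m tile
rng = np.random.default_rng(0)
pts = rng.uniform(0, 10, size=(200, 2))
print(point_density_stats(pts[:, 0], pts[:, 1]))
```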
Next Steps
· Continue monthly telecons; establish working group face-to-face meetings
· Include additional interested Airborne and Mobile Mapping LiDAR Sub-committees and PDAD members
· Define the outline and matrix the work
· Provide an enhanced work matrix and obtain documents
· Data link: ftp://edcftp.cr.usgs.gov/edcuser/stensaas/outgoing/LiDAR%20Calibration/
· Compile input and peer review
· Continue to support ASPRS LiDAR QA and calibration guidelines and best practices
  · Many support groups, including work by the NGA GWG
  · Draft by Fall ASPRS
Questions?
Data Provider Evaluation & Cal/Val Range Creation
· During the research effort, ranges were prepared to support Sensor Assessment and Data Provider Evaluation
· The operational Data Provider evaluation process is now stopped
· Research evaluation of sensors only
· Developing Cal/Val Range standards & 5 National Ranges
· Dual use for hi-res ortho & satellite, and LiDAR cal/val
[Figure: USGS Cal/Val basemap range (hi-res imagery and LiDAR data) with geometric targets and control; large-area geometric test range covering Minnehaha County, Sioux Falls, and Lincoln County at 12-, 6-, and 3-inch resolutions.]
USGS National Range Locations
· Sioux Falls, SD; Rolla, MO; and Pueblo, CO ranges completed
· Airy, North Carolina and Rochester, NY ranges in process
Terrestrial LiDAR collected by USGS, Vivian Queija
· USGS EROS by Vivian Queija on June 21-22, 2010
· Sioux Falls, SD on June 23, 2010
· Test data only; interested in the point cloud and test range model; what do we need for targets and good performance testing?
· Note: Images are quick-look only and are not fully processed
Front view of the water tower colored with RGB.
Same scan, rotated for back view.
Same scan with points colored by intensity.