Quality, Uncertainty and Bias Representations of Atmospheric Remote Sensing Information Products
Peter Fox, and … others
Xinformatics 4400/6400, Week 12, April 22, 2014
Reading
• Audit / Workflow
• Information Discovery
  – Information discovery graph (IDG)
  – Projects using information discovery
  – Information discovery and Library Sciences
  – Information discovery and retrieval tools
  – Social Search
• Metadata
  – http://en.wikipedia.org/wiki/Metadata
  – http://www.niso.org/publications/press/UnderstandingMetadata.pdf
  – http://dublincore.org/
Acronyms
AOD – Aerosol Optical Depth
MDSA – Multi-sensor Data Synergy Advisor
MISR – Multi-angle Imaging SpectroRadiometer
MODIS – Moderate Resolution Imaging Spectroradiometer
OWL – Web Ontology Language
REST – Representational State Transfer
UTC – Coordinated Universal Time
XML – eXtensible Markup Language
XSL – eXtensible Stylesheet Language
XSLT – XSL Transformation
Where are we in respect to the data challenge?
“The user cannot find the data; if he can find it, he cannot access it; if he can access it, he doesn't know how good they are; if he finds them good, he cannot merge them with other data.”
– The Users View of IT, NAS, 1989
PROBLEM STATEMENT
Data quality is an ill-posed problem because:
• It is not uniquely defined
• It is user dependent
• It is difficult to quantify
• It is handled differently by different teams
• It is perceived differently by data providers and data users
User question: Which data or product is better for me?
QUALITY CONCERNS ARE POORLY ADDRESSED
Data quality issues have lower priority than building an instrument, launching rockets, collecting/processing data, and publishing papers using the data. Little attention is paid to how validation measurements are passed from Level 1 to Level 2 and higher, and to how quality propagates in time and space.
USER'S PERSPECTIVE
“There might be a better product somewhere, but if I cannot easily find it and understand it, I am going to use whatever I have and know already.”
(Some) Facets of Quality
• Accuracy: closeness to truth (see the sketch below)
  – Bias: systematic deviation
  – Uncertainty: non-systematic deviation
• Completeness: how well data cover a domain
  – Spatial
  – Temporal
• Consistency
  – Spatial: absence of spurious spatial artifacts
  – Temporal: absence of trend, spike and offset artifacts
• Resolution
  – Temporal: time between successive measurements of the same volume
  – Spatial: distance between adjacent measurements
• Ease of use
• Latency: time between data collection and receipt
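To make the bias/uncertainty split concrete, here is a minimal Python sketch; the co-located retrieval and "truth" values are made up for illustration, not from any real instrument:

```python
import numpy as np

# Hypothetical co-located retrievals vs. "truth" (e.g., ground-based) values;
# numbers are invented for illustration.
truth = np.array([0.10, 0.25, 0.40, 0.55, 0.70])
retrieved = np.array([0.14, 0.27, 0.46, 0.57, 0.76])

error = retrieved - truth
bias = error.mean()              # systematic deviation from truth
uncertainty = error.std(ddof=1)  # non-systematic (random) deviation

print(f"bias = {bias:+.3f}, uncertainty = {uncertainty:.3f}")
# bias = +0.040, uncertainty = 0.020
```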
Pretend you’re a museum curator...
...and you’re putting together an exhibit on wildfires with some cool satellite data
Which data quality facet is most important to you?
A – Accuracy
B – Resolution (spatial and/or temporal)
C – Completeness (spatial and/or temporal)
D – Latency
E – Ease of Use
You’re an operational user and...
...you want to use satellite wildfire data to direct HotShot team deployments
Which data quality facet is most important to you?
A – Accuracy
B – Resolution (spatial and/or temporal)
C – Completeness (spatial and/or temporal)
D – Latency
E – Ease of Use
You’re an operational user and...
...you want to use satellite wildfire data to estimate burn scar areas for landslide prediction
Which data quality facet is most important to you?
A – Accuracy
B – Resolution (spatial and/or temporal)
C – Completeness (spatial and/or temporal)
D – Latency
E – Ease of Use
You’re an ecology researcher and...
...you want to use satellite wildfire data to predict extinction risk of threatened species
Which data quality facet is least important to you?
A – Accuracy
B – Resolution (spatial and/or temporal)
C – Completeness (spatial and/or temporal)
D – Latency
E – Ease of Use
You’re a remote sensing researcher...
...you want to perfect an algorithm to detect and estimate active burning areas at night with visible and infrared radiances
Which data quality facet is least important to you?
A – Accuracy
B – Resolution (spatial and/or temporal)
C – Completeness (spatial and/or temporal)
D – Latency
E – Ease of Use
Giovanni Earth Science Data Visualization & Analysis Tool
• Developed and hosted by NASA Goddard Space Flight Center (GSFC)
• Multi-sensor and model data analysis and visualization online tool
• Supports dozens of visualization types
• Generates dataset comparisons
• ~1500 parameters
• Used by modelers, researchers, policy makers, students, teachers, etc.
Giovanni Allows Scientists to Concentrate on the Science

The Old Way (pre-science tasks stretch from January into October): find data; retrieve high-volume data; learn formats and develop readers; extract parameters; perform spatial and other subsetting; identify quality and other flags and constraints; perform filtering/masking; develop analysis and visualization; accept/discard/get more data (satellite, model, ground-based); exploration and initial analysis; then DO SCIENCE: use the best data for the final analysis, derive conclusions, write the paper, submit the paper.

The Giovanni Way: web-based services read data, extract parameters, filter by quality, subset spatially, reformat and reproject in minutes, with days for exploration (visualize, explore, analyze); then DO SCIENCE: use the best data for the final analysis, derive conclusions, write the paper, submit the paper.

Web-based tools like Giovanni allow scientists to compress the time needed for pre-science preliminary tasks: data discovery, access, manipulation, visualization, and basic statistical analysis. Scientists have more time to do science!
EXPECTATIONS FOR DATA QUALITY
What do most users want? Gridded data (without gaps) with error bars in each grid cell.
What do they get instead?
• Level 2 swath data in satellite projection with poorly defined quality flags
• Level 3 monthly data with a lot of suspicious aggregations, and standard deviation as an uncertainty measure (a fallacy, illustrated below: standard deviation mostly reflects the variability within the grid box)
• Little or no information on sampling (Level 3)
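A toy aggregation, with made-up pixel values, shows why standard deviation is a poor stand-in for uncertainty in a Level 3 grid cell:

```python
import numpy as np

# Hypothetical Level 2 AOD pixels falling into one Level 3 grid box.
pixels = np.array([0.12, 0.45, 0.08, 0.33, 0.51, 0.09])

cell_mean = pixels.mean()      # the aggregated Level 3 value
cell_std = pixels.std(ddof=1)  # often mislabeled "uncertainty": it mostly
                               # reflects geophysical variability in the box
n_pixels = pixels.size         # sampling information users rarely receive

print(f"mean = {cell_mean:.2f}, std = {cell_std:.2f}, n = {n_pixels}")
# mean = 0.26, std = 0.19, n = 6
```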
The effect of bad-quality data is often not negligible.
Example: Hurricane Ike, 9/10/2008 – Total Column Precipitable Water (kg/m²) screened at different quality levels: Best, Good, Do Not Use.
Data Usage Workflow
Data Discovery → Access → Manipulation → Visualization → Analyze, with Assessment supporting each step.
• Manipulation includes: Subset/Constrain, Reformat, Re-project, Filtering, Integration
• The Intended Use drives Precision Requirements, Quality Assessment Requirements, and Integration Planning, which feed the Assessment stage
Challenge
• Giovanni streamlines data processing, performing required actions on behalf of the user – but automation amplifies the potential for users to generate and use results they do not fully understand
• The assessment stage is integral for the user to understand the fitness-for-use of the result – but Giovanni did not assist in assessment
• We were challenged to instrument the system to help users understand results
Producers perform Quality Control to establish Fitness for Purpose (the trustee side); Consumers perform Quality Assessment to establish Fitness for Use (the trustor side).
Definitions – for an atmospheric scientist
• Quality – is in the eyes of the beholder: the worst-case scenario… or a good challenge
• Uncertainty – has aspects of accuracy (how accurately the real-world situation is assessed; it also includes bias) and precision (down to how many digits)
Quality Control vs. Quality Assessment
• Quality Control (QC) flags in the data (assigned by the algorithm) reflect the “happiness” of the retrieval algorithm, e.g., all the necessary channels indeed had data, there were not too many clouds, the algorithm converged to a solution, etc.
• Quality Assessment is done by analyzing the data “after the fact” through validation, intercomparison with other measurements, self-consistency checks, etc. It is presented as bias and uncertainty. It is reported rather inconsistently and is scattered across papers and validation reports.
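A minimal screening sketch in Python; the 0/1/2 flag convention below is an assumption, loosely mirroring the Best/Good/Do Not Use levels in the Hurricane Ike example above:

```python
import numpy as np

# Hypothetical retrieved values with per-pixel QC flags
# (0 = Do Not Use, 1 = Good, 2 = Best -- an assumed convention).
values = np.array([31.2, 55.8, 48.1, 62.3, 12.9])
qc = np.array([2, 1, 0, 2, 0])

# Screening: keep Good/Best pixels, mask out Do Not Use.
screened = np.where(qc >= 1, values, np.nan)
print(screened)  # [31.2 55.8  nan 62.3  nan]
```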
Definitions – for an atmospheric scientist
• Bias has two aspects:
  – Systematic error resulting in the distortion of measurement data, caused by prejudice or faulty measurement technique
  – A vested interest, or strongly held paradigm or condition, that may skew the results of sampling, measuring, or reporting the findings of a quality assessment:
    • Psychological: for example, when data providers audit their own data, they usually have a bias to overstate its quality
    • Sampling: sampling procedures that result in a sample that is not truly representative of the population sampled (Larry English)
Data quality needs: fitness for use
• Measuring climate change:
  – Model validation: gridded contiguous data with uncertainties
  – Long-term time series: bias assessment is a must, especially for sensor degradation, orbit change, and spatial sampling change
  – Studying phenomena using multi-sensor data: cross-sensor bias is needed
• Realizing societal benefits through applications:
  – Near-real-time use for transport/event monitoring: in some cases, coverage and timeliness might be more important than accuracy
  – Pollution monitoring (e.g., air quality exceedance levels): accuracy
  – Educational use (users generally are not well-versed in the intricacies of quality; just taking all the data as usable can impair educational lessons): only the best products
Level 2 data
• Swath for MISR, orbit 192 (2001)
Level 3 data
MODIS vs. MERIS
Same parameter, same space and time – different results. Why?
A threshold used in MERIS processing effectively excludes high aerosol values.
Note: MERIS was designed primarily as an ocean-color instrument, so aerosols are “obstacles”, not signal.
Spatial and temporal sampling – how to quantify it to make it useful for modelers? (MODIS Aqua AOD vs. MISR Terra AOD, July 2009)
• Completeness: the MODIS dark-target algorithm does not work over deserts
• Representativeness: monthly aggregation is not enough for MISR, and even for MODIS
• Spatial sampling patterns differ between MODIS Aqua and MISR Terra: “pulsating” areas over ocean are oriented differently due to the different orbital directions during day-time measurement
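One simple way to start quantifying sampling for modelers, sketched here with made-up numbers, is a completeness score: the fraction of grid cells containing any valid retrieval:

```python
import numpy as np

# Hypothetical monthly Level 3 grid; NaN marks cells with no valid retrieval.
grid = np.array([[0.21, np.nan, 0.18],
                 [np.nan, np.nan, 0.35],
                 [0.27, 0.30, np.nan]])

coverage = np.isfinite(grid).mean()  # fraction of cells actually sampled
print(f"spatial completeness: {coverage:.0%}")  # spatial completeness: 56%
```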
Cognitive bias
Three projects with a data quality flavor
• Multi-sensor Data Synergy Advisor
  – Product-level quality: how closely the data represent the actual geophysical state
• Data Quality Screening Service
  – Pixel-level quality: algorithmic guess at the usability of a data point
  – Granule-level quality: statistical roll-up of pixel-level quality (sketched below)
• Aerosol Statistics
  – Record-level quality: how consistent and reliable the data record is across generations of measurements
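As a sketch of the pixel-to-granule roll-up idea (the flag convention is again an assumption, and the flags here are randomly generated):

```python
import numpy as np

# Hypothetical pixel-level QC flags for one granule
# (0 = Do Not Use, 1 = Good, 2 = Best -- same assumed convention as above).
flags = np.random.default_rng(0).integers(0, 3, size=10_000)

# Granule-level quality: the share of pixels at each flag value.
rollup = {flag: float((flags == flag).mean()) for flag in (0, 1, 2)}
print(rollup)  # roughly a third at each level for this random example
```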
Multi-Sensor Data Synergy Advisor (MDSA)
• Goal: provide science users with clear, cogent information on salient differences between data candidates for fusion, merging and intercomparison
  – Enable scientifically and statistically valid conclusions
• Develop MDSA on current missions: NASA Terra, Aqua, (maybe Aura)
• Define implications for future missions
How does MDSA work?
MDSA is a service designed to characterize the differences between two datasets and advise a user (human or machine) on the advisability of combining them. Within the Giovanni online analysis tool, MDSA:
• Describes parameters and products
• Documents the steps leading to the final data product
• Enables better interpretation and utilization of parameter difference and correlation visualizations
• Provides clear and cogent information on salient differences between data candidates for intercomparison and fusion
• Provides information on data quality
• Provides advice on available options for further data processing and analysis
Correlation – same instrument, different satellites
Anomaly: the MODIS Level 3 data-day definition leads to an artifact in correlation…
…which is caused by an overpass time difference.
Effect of the Data Day definition on the correlation of Ocean Color data with Aerosol data
Only half of the Data Day artifact is present, because the Ocean group uses the better Data Day definition!
(Correlation between MODIS-Aqua AOD (Ocean group product) and MODIS-Aqua AOD (Atmosphere group product); pixel count distribution.)
Research approach
• Systematizing quality aspects
  – Working through the literature
  – Identifying aspects of quality and their dependence on measurement and environmental conditions
  – Developing data quality ontologies
  – Understanding and collecting internal and external provenance
• Developing rulesets that allow us to infer which pieces of knowledge to extract and assemble
• Presenting the data quality knowledge with good visuals, statements, and references
Semantic Web Basics
• The triple: { subject predicate object }
  – Interferometer is-a optical instrument
  – Optical instrument has focal length
• W3C is the primary (but not sole) governing organization for the languages
  – RDF – programming environments for 14+ languages, including C, C++, Python, Java, JavaScript, Ruby, PHP, … (no Cobol or Ada yet ;( )
  – OWL 1.0 and 2.0 – Web Ontology Language – with programming support for Java
• Query, rules, inference…
• Closed World – where complete knowledge is known (encoded); classical AI relied on this, whereas the SW promotes an Open World view
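A minimal sketch of building and serializing the slide's triples with Python's rdflib; the namespace and property names are illustrative, not from an actual deployed ontology:

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDFS

EX = Namespace("http://example.org/instruments#")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# { subject predicate object } triples from the slide:
g.add((EX.Interferometer, RDFS.subClassOf, EX.OpticalInstrument))  # is-a
g.add((EX.OpticalInstrument, EX.hasProperty, EX.FocalLength))      # has focal length

print(g.serialize(format="turtle"))
```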
Ontology Spectrum
From weak to strong semantics: Catalog/ID → Terms/glossary → Thesauri (“narrower term” relation) → Informal is-a → Formal is-a → Formal instance → Frames (properties) → Value Restrictions → Selected Logical Constraints (disjointness, inverse, …) → General Logical Constraints
Originally from the AAAI 1999 Ontologies Panel by Gruninger, Lehmann, McGuinness, Uschold, Welty; updated by McGuinness.
Description in: www.ksl.stanford.edu/people/dlm/papers/ontologies-come-of-age-abstract.html
Model for Quality Evidence
Data Quality Ontology Development (Quality flag)
Working together with Chris Lynnes's DQSS project, we started from the pixel-level quality view.
Data Quality Ontology Development (Bias)
http://cmapspublic3.ihmc.us:80/servlet/SBReadResourceServlet?rid=1286316097170_183793435_22228&partName=htmltext
Modeling quality (Uncertainty)
Link to other cmap presentations of the quality ontology: http://cmapspublic3.ihmc.us:80/servlet/SBReadResourceServlet?rid=1299017667444_1897825847_19570&partName=htmltext
MDSA Aerosol Data Ontology Example
Ontology of aerosol data made with the cmap ontology editor: http://tw.rpi.edu/web/project/MDSA/DQ-ISO_mapping
Multi-Domain Knowledgebase
Spans the Provenance Domain, the Data Processing Domain, and the Earth Science Domain.
RuleSet Development
// Jena-style rule: if a requested service takes two data selections whose
// source datasets come from sun-synchronous deployments with different
// nominal equatorial crossing times (NEQCT), attach an advisory.
[DiffNEQCT:
  (?s rdf:type gio:RequestedService),
  (?s gio:input ?a), (?a rdf:type gio:DataSelection),
  (?s gio:input ?b), (?b rdf:type gio:DataSelection),
  (?a gio:sourceDataset ?a.ds), (?b gio:sourceDataset ?b.ds),
  (?a.ds gio:fromDeployment ?a.dply), (?b.ds gio:fromDeployment ?b.dply),
  (?a.dply rdf:type gio:SunSynchronousOrbitalDeployment),
  (?b.dply rdf:type gio:SunSynchronousOrbitalDeployment),
  (?a.dply gio:hasNominalEquatorialCrossingTime ?a.neqct),
  (?b.dply gio:hasNominalEquatorialCrossingTime ?b.neqct),
  notEqual(?a.neqct, ?b.neqct)
  ->
  (?s gio:issueAdvisory giodata:DifferentNEQCTAdvisory)
]
Advisor Knowledge Base
Advisor Rules test for potential anomalies and create associations between service metadata and anomaly metadata in the Advisor KB.
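For intuition, the DiffNEQCT logic can be mirrored in plain Python over hypothetical metadata records; the field names and values are illustrative, not the actual gio: ontology terms, and the production rule runs over the semantic knowledge base:

```python
# Hypothetical deployment metadata for two requested data selections.
terra = {"name": "MODIS-Terra", "orbit": "sun-synchronous", "neqct": "10:30"}
aqua = {"name": "MODIS-Aqua", "orbit": "sun-synchronous", "neqct": "13:30"}

def advisories(a, b):
    """Mirror of DiffNEQCT: flag differing nominal equatorial crossing times."""
    issued = []
    if (a["orbit"] == b["orbit"] == "sun-synchronous"
            and a["neqct"] != b["neqct"]):
        issued.append("DifferentNEQCTAdvisory")
    return issued

print(advisories(terra, aqua))  # ['DifferentNEQCTAdvisory']
```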
Assisting in Assessment
The data usage workflow, now instrumented: Intended Use drives Precision Requirements, Quality Assessment Requirements, and Integration Planning; Data Discovery → Access → Manipulation (Subset/Constrain, Reformat, Re-project, Filtering, Integration) → Visualization → Analyze; the Assessment and Re-Assessment stages are supported by Provenance & Lineage Visualization and the MDSA Advisory Report.
Thus – Multi-Sensor Data Synergy Advisor
• Assemble a semantic knowledge base
  – Giovanni service selections
  – Data source provenance (external provenance – low detail)
  – Giovanni planned operations (what the service intends to do)
• Analyze the service plan
  – Are we integrating/comparing/synthesizing?
    • Are similar dimensions in data sources semantically comparable? (semantic diff)
    • How comparable? (semantic distance)
  – What data usage caveats exist for the data sources?
• Advise on caveats regarding general fitness-for-use and data usage
Semantic Advisor Architecture (RPI)
… complexity
Presenting data quality to users
• Global or product-level quality information, e.g., consistency, completeness, etc., that can be presented in tabular form.
• Regional/seasonal. This is where we've tried various approaches:
  – maps with outlined regions, one map per sensor/parameter/season
  – scatter plots with error estimates, one per combination of AERONET station, parameter, and season, with different colors representing different wavelengths, etc.
Advisor Presentation Requirements
• Present metadata that can affect the fitness for use of the result
• In comparison or integration of data sources:
  – Make obvious which properties are comparable
  – Highlight differences (that affect comparability) where present
• Present descriptive text (and, if possible, visuals) for any data usage caveats highlighted by the expert ruleset
• Presentation must be understandable by Earth scientists!! Oh, you laugh…
Advisory Report
• Tabular representation of the semantic equivalence of comparable data source and processing properties (comparable input parameters and their semantic equivalence)
• Advises of and describes potential data anomalies/biases (Expert Advisories)
Advisory Report (Dimension Comparison Detail): comparable input parameters and their semantic equivalence
Advisory Report (Expert Advisories Detail)
Quality Comparison Table for Level 3 AOD (Global example)
• Platform – MODIS: Terra and Aqua; MISR: Terra
• Time Range – MODIS: 2/2/2000–present (Terra), 7/2/2002–present (Aqua); MISR: 2/2/2000–present
• Local Revisit Time – MODIS: 10:30 AM (Terra), 1:30 PM (Aqua); MISR: 10:30 AM
• Coverage / Revisit – MODIS: global coverage of the entire Earth in 1 day, with coverage overlap near the poles; MISR: global coverage of the entire Earth in 9 days, and coverage in 2 days in the polar regions
• Swath Width – MODIS: 2330 km; MISR: 380 km
• Spectral AOD – MODIS: AOD over ocean for 7 wavelengths (466, 553, 660, 860, 1240, 1640, 2120 nm) and AOD over land for 4 wavelengths (466, 553, 660, 2120 nm); MISR: AOD over land and ocean for 4 wavelengths (446, 558, 672, 866 nm)
• AOD Uncertainty or Expected Error (EE) – MODIS: ±0.03 ± 5% (over ocean, QAC ≥ 1), ±0.05 ± 20% (over land, QAC = 3); MISR: 63% fall within 0.05 or 20% of AERONET AOD, 40% within 0.03 or 10%
• Successful Retrievals (completeness) – MODIS: 15% of the time; MISR: 15% of the time (slightly more, because of retrieval over the glint region also)
What they really like!
Summary
• Quality is very hard to characterize; different groups will focus on different and inconsistent measures of quality
  – Modern ontology representations and reasoning to the rescue!
• Products with known quality (whether good or bad) are more valuable than products with unknown quality
  – Known quality helps you correctly assess fitness-for-use
• Harmonization of data quality is even more difficult than characterizing the quality of a single data product
Summary
• The Advisory Report is not a replacement for proper analysis planning
  – But it provides benefit for all user types by summarizing general fitness-for-use, integrability, and data usage caveat information
  – Science user feedback has been very positive
• Provenance trace dumps are difficult to read, especially for non-software engineers
  – Science user feedback: “Too much information in the provenance lineage; I need a simplified abstraction/view”
• Transparency → Translucency – make the important stuff stand out
Current Work
• Advisor suggestions to correct for potential anomalies
• Views/abstractions of provenance based on specific user group requirements
• Continued iteration on visualization tools based on user requirements
• Present a comparability index / research techniques to quantify comparability