SeaDataNet Technical Task Group meeting

JRA1 Standards Development
Task 1.2 Common Data Management Protocol
(for dissemination to all NODCs and JRA3)
• Data quality checking methodology
• Quality flag scale protocol
• Definition of a common tagging (identification) system
Only the first two are dealt with here.
Data quality checking methodology
Work plan
• Review current schemes in place at NODCs
• Review other known schemes (e.g. WG MDM guidelines, World Ocean Database, GTSPP, Argo, WOCE, etc.)
• Data types: profile and time series data
• Present first version to kick-off meeting in June
• Revise on the basis of comments/feedback
• Finalise draft document by end of July (β version)
• Distribute to partners for further feedback
• Finalise document by end of November 2006
Data quality checking methodology
Progress to date
The following QC procedures/documents have been examined:
• WG MDM Guidelines (covering 12 data types)
• BODC
• SISMER (and MEDATLAS)
• IOC/IODE Programmes (GTSPP and GOSUD)
• World Ocean Database and GODAR
• Argo
To be taken into consideration:
• For metocean data (e.g. currents, waves, met. buoys) use of the SIMORC Quality Control document is suggested
• For sea level data use of the ESEAS and GLOSS documentation is suggested
• Feed in information from QARTOD (Quality Assurance of Real Time Oceanographic Data) at qartod.org
• etc.
Summary list of information required to accompany data:
• Where the data were collected: location (preferably as latitude and
longitude) and depth/height
• When the data were collected (date and time in UTC or clearly
specified local time zone)
• How the data were collected (e.g. sampling methods, instrument
types, analytical techniques)
• How you refer to the data (e.g. station numbers, cast numbers)
• Who collected the data, including name and institution of the data
originator(s) and the principal investigator
• What has been done to the data (e.g. details of processing and
calibrations applied, algorithms used to compute derived
parameters)
• Watch points for other users of the data (e.g. problems encountered
and comments on data quality)
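As an illustration only, this accompanying information could be held in a simple record structure. A minimal sketch in Python follows; all field names are assumptions for illustration, not part of any SeaDataNet specification:

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Optional

    @dataclass
    class AccompanyingInfo:
        """Illustrative record of the information listed above."""
        # Where the data were collected
        latitude: float                         # decimal degrees (preferred form)
        longitude: float                        # decimal degrees (preferred form)
        depth_or_height_m: Optional[float] = None
        # When the data were collected (UTC preferred)
        time_utc: Optional[datetime] = None
        # How the data were collected
        sampling_method: str = ""
        instrument_type: str = ""
        analytical_technique: str = ""
        # How the data are referred to
        station_number: str = ""
        cast_number: str = ""
        # Who collected the data
        originator_name: str = ""
        originator_institution: str = ""
        principal_investigator: str = ""
        # What has been done to the data (calibrations, algorithms, processing)
        processing_history: list = field(default_factory=list)
        # Watch points for other users (problems, comments on quality)
        quality_comments: str = ""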
Examples of Automatic QC tests
(based on Argo, derived from GTSPP; see IOC Manuals and Guides No. 22). A code sketch of two of these tests follows the list:
• Impossible date — Tests for sensible observation date and time values
• Impossible location — Tests for sensible observation latitude and longitude values
• Position on land — Tests whether the observation position is on land
• Impossible speed — Tests for a sensible distance travelled since the previous profile
• Global range — Tests that the observed temperature and salinity values are within the expected extremes encountered in the oceans
• Regional range — Tests that the observed temperature and salinity values are within the expected extremes encountered in particular regions of the oceans
• Deepest pressure — Tests that the profile does not contain pressures higher than the highest value expected for a float
• Pressure increasing — Tests that pressures from the profile are monotonically increasing
• Spike — Tests salinity and temperature data for large differences between adjacent values
• Gradient — Tests whether the gradient between vertically adjacent salinity and temperature measurements is too steep
• Digit rollover — Tests whether the temperature and salinity values exceed a float's storage capacity
• Stuck value — Tests for all salinity or all temperature values in a profile being the same
• Density inversion — Tests for the case where the calculated density at a higher pressure in a profile is less than the calculated density at an adjacent lower pressure
• Sensor drift — Tests temperature and salinity profile values for a sudden and significant sensor drift
• Frozen profile — Tests for the case where a float repeatedly produces the same temperature or salinity profile (with very small deviations)
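A minimal sketch of two of these tests in Python, assuming the Argo global-range limits (temperature -2.5 to 40 °C, salinity 2 to 41 PSU) and the Argo/GTSPP spike test value; the function names and the example threshold are illustrative:

    # Argo global-range limits: temperature -2.5..40 degC, salinity 2..41 PSU.
    TEMP_RANGE = (-2.5, 40.0)
    SAL_RANGE = (2.0, 41.0)

    def global_range_test(values, limits):
        """Return True for each value inside the expected oceanic extremes."""
        lo, hi = limits
        return [lo <= v <= hi for v in values]

    def spike_test(values, threshold):
        """Flag points that differ sharply from both vertical neighbours.

        Uses the Argo/GTSPP spike test value:
            |v2 - (v3 + v1)/2| - |(v3 - v1)/2|
        where v2 is the value being tested.
        """
        suspect = [False] * len(values)
        for i in range(1, len(values) - 1):
            v1, v2, v3 = values[i - 1], values[i], values[i + 1]
            if abs(v2 - (v3 + v1) / 2.0) - abs((v3 - v1) / 2.0) > threshold:
                suspect[i] = True
        return suspect

    # Example: an obvious salinity spike at index 2 is caught
    # (0.9 PSU is the Argo upper-ocean salinity spike threshold).
    print(spike_test([35.1, 35.2, 38.9, 35.2, 35.3], threshold=0.9))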
Quality flag scale protocol
Work plan
• Review current schemes in place at NODCs
• Review other known schemes (e.g. Argo, GTSPP, WOCE, MEDATLAS, etc.)
• Data types: profile and time series data
• Present to kick-off meeting in June
• Revise on the basis of comments/feedback
• Finalise draft document by end of July (β version)
• Distribute to partners for further feedback
• Finalise document by end of November 2006
Quality flag scale protocol
Progress to date
Preliminary examination of the following schemes:
• BODC
• SISMER and MEDATLAS
• GTSPP
• WOCE (CTD and water bottle)
• WOCE (Surface meteorology)
• WOCE (Floats)
• Argo
• ESEAS
The following tables show examples of flagging schemes.
ESEAS flags
• 0 - no quality control (mandatory: no)
• 1 - correct value (mandatory: yes)
• 2 - interpolated value (mandatory: yes)
• 3 - doubtful value (mandatory: no)
• 4 - isolated spike or wrong value (mandatory: yes)
• 5 - correct but extreme value (mandatory: no)
• 6 - reference change detected (mandatory: no)
• 7 - constant values for more than a defined time interval (mandatory: no)
• 8 - out of range (mandatory: no)
• 9 - missing value (mandatory: yes)

MEDATLAS flags
• 0 - NOT CONTROLLED VALUE
• 1 - CORRECT VALUE
• 2 - VALUE INCONSISTENT WITH STATISTICS
• 3 - DUBIOUS VALUE
• 4 - FALSE VALUE
• 5 - VALUE MODIFIED DURING QC (only for profile headers)
• 6-8 - Not used
• 9 - NO VALUE

WOCE CTD flags
• 1 - Not calibrated
• 2 - Acceptable measurement
• 3 - Questionable measurement
• 4 - Bad measurement
• 5 - Not reported
• 6 - Interpolated over >2 dbar interval
• 7 - Despiked
• 8 - Not assigned for CTD data
• 9 - Not sampled

SISMER flags
• 0 - No quality control
• 1 - Good
• 2 - Probably good
• 3 - Probably bad
• 4 - Bad
• 5 - Changed
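For side-by-side comparison, such scales can be transcribed into simple lookup tables. A sketch in Python covering two of the schemes above; the variable names are illustrative:

    # Flag value -> meaning, transcribed from the tables above.
    WOCE_CTD_FLAGS = {
        1: "Not calibrated",
        2: "Acceptable measurement",
        3: "Questionable measurement",
        4: "Bad measurement",
        5: "Not reported",
        6: "Interpolated over >2 dbar interval",
        7: "Despiked",
        8: "Not assigned for CTD data",
        9: "Not sampled",
    }

    SISMER_FLAGS = {
        0: "No quality control",
        1: "Good",
        2: "Probably good",
        3: "Probably bad",
        4: "Bad",
        5: "Changed",
    }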
Quality flag scale protocol
Conclusions
• Preliminary review of data quality flagging schemes shows small variations on a theme for oceanographic data, but more complicated and detailed schemes do also exist.
• The most straightforward solution would be a simple scheme, perhaps comprising the following quality flags (an illustrative mapping is sketched below):
  • Not checked
  • Good/correct value
  • Doubtful/suspect value
  • Bad value
• For some organisations this will involve mapping their schemes to this simple scheme.
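As an illustration of such a mapping, a sketch in Python; the numeric codes for the simple scheme and the treatment of intermediate SISMER flags are assumptions, not an agreed SeaDataNet convention:

    # Hypothetical numeric codes for the proposed simple scheme;
    # these are assumptions, not an agreed SeaDataNet scale.
    NOT_CHECKED, GOOD, DOUBTFUL, BAD = 0, 1, 2, 3

    # One possible mapping of the SISMER scale onto the simple scheme.
    # The treatment of "Probably good", "Probably bad" and "Changed"
    # is a judgement call each organisation would make for itself.
    SISMER_TO_SIMPLE = {
        0: NOT_CHECKED,  # No quality control
        1: GOOD,         # Good
        2: GOOD,         # Probably good
        3: DOUBTFUL,     # Probably bad
        4: BAD,          # Bad
        5: DOUBTFUL,     # Changed
    }

    def to_simple(sismer_flag: int) -> int:
        """Translate a SISMER flag into the simple four-flag scheme."""
        return SISMER_TO_SIMPLE[sismer_flag]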