QUALITY MANAGEMENT, CALIBRATION, TESTING AND COMPARISON OF INSTRUMENTS AND OBSERVING SYSTEMS
WMO TECHNICAL CONFERENCE ON METEOROLOGICAL AND ENVIRONMENTAL INSTRUMENTS AND METHODS OF OBSERVATION (TECO-2005)
C. Bruce Baker, NOAA, USA
The Backbone
Note the stability
Functions of an International/National Backbone
• Infrastructure in place for quality measurements
• Collects open-access data and provides consistent quality assurance and quality control
• Distributes data and information (via multiple paths) in real time (varies with parameter) and ensures archival
• Abides by national/international standards and fosters the adoption of those standards by local and regional observing systems
Key Components
• Management of Network Change
• Parallel Testing
• Metadata
• Data Quality and Continuity
• Integrated Environmental Assessment
• Complementary Data
• Continuity of Purpose
• Data and Metadata Access
VOCABULARY
• Management: documentation, performance measures, and requirements
• Program Policy: determined by international or national policy and science-driven directives
• Quality Management System / Quality Manual: personnel, hardware, ingest, and dissemination requirements documents
• Quality Control: automated, manual, maintenance
• Research: testing, intercomparisons, transfer functions, overlapping measurements
• Quality Assurance: documented metadata, performance measures
• Implementation: program infrastructure
• Functional Requirements
  – Systems: parameters, ranges, accuracies, resolutions, expandability, design life, maintainability
  – Program: number of systems, cost and schedule targets, communications
• Commissioning
  – Defines the decision point at which data become official
  – Sustained operation: data from each site 95% of the time within one hour and/or successful entry into the archives within 30 days (see the sketch below)
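A minimal sketch of how the one-hour availability criterion might be checked, assuming one expected report per site per hour; the 95% and one-hour figures come from the slide, while the function name and data layout are illustrative.

    # Hypothetical check of the commissioning criterion: a site's data must
    # arrive within one hour at least 95% of the time.
    def meets_commissioning_criterion(latencies_minutes, threshold=0.95):
        """latencies_minutes: delivery latency of each expected hourly report;
        a missing report can be encoded as float('inf')."""
        on_time = sum(1 for lat in latencies_minutes if lat <= 60)
        return on_time / len(latencies_minutes) >= threshold

    # Example: 97 of 100 hourly reports arrived on time -> criterion met.
    print(meets_commissioning_criterion([30] * 97 + [float("inf")] * 3))  # True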
Configuration Management
• Change management of hardware and software items; metadata management responsibilities and procedures for the Configuration Control Board (CCB)
Test and Evaluation Phase
• Conducted by the Evaluation Team
• Reviewed by an Ad Hoc Science Working Group
• Six areas evaluated:
  – Site Selection
  – Site Installation
  – Field Equipment and Sensors
  – Communications
  – Data Processing and Quality Control
  – Maintenance
Components of Data Quality Assurance (QA)
• Laboratory Calibration
• Routine Maintenance and In-Field Comparisons
• Automated Quality Assurance
• Manual Quality Assurance
• Metadata, Metadata, Metadata
• Ability to Integrate New Technology
Laboratory Calibration
• Every sensor is calibrated before deployment to verify, and improve upon, the manufacturer's specifications (see the sketch below)
• Sensors are routinely rotated from the field back into the lab for recalibration
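The slide does not spell out the correction procedure, so the following is only a hedged illustration of one common approach: fitting a linear correction against a traceable reference in the lab, then applying it to field readings. All names and values are invented.

    import numpy as np

    # Illustrative lab calibration: fit a linear correction between a sensor
    # under test and a traceable reference across a set of bath temperatures.
    sensor_readings = np.array([-30.12, -10.05, 0.03, 10.08, 30.15])  # degC, invented
    reference_truth = np.array([-30.00, -10.00, 0.00, 10.00, 30.00])  # degC, invented

    slope, intercept = np.polyfit(sensor_readings, reference_truth, deg=1)

    def calibrate(raw):
        """Apply the lab-derived linear correction to a field reading."""
        return slope * raw + intercept

    print(round(calibrate(20.10), 3))  # corrected field reading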
Routine Maintenance and In-Field Comparisons
• Site maintenance passes: three visits scheduled annually
• Trouble ticket or emergency repairs:
  – Malfunctioning sensor
  – Lightning strike
  – Communication problems
  – Theft and vandalism
Site Maintenance Passes: Sensor Inspection
Air temperature and humidity sensors are inspected for dust accumulation, spider webbing, and wasp nests. The radiation shields of these sensors are also cleaned.
Trouble Ticket or Emergency Repairs
• Trouble tickets are issued by the Data QA Manager
• Repair priorities range from 2 to 30 business days, depending on the sensor
• The QA Manager provides a description of the problem
• Technicians complete the form with the time of the fix, the serial numbers of the sensors involved, and a description of the repairs made
• Technicians may also generate tickets in the field and submit them to the QA Manager
A sketch of such a ticket record follows.
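As an illustration only, the record below carries the fields the slide mentions; the class and field names are invented, not USCRN's actual ticket schema.

    from dataclasses import dataclass, field
    from typing import Optional, List

    @dataclass
    class TroubleTicket:
        ticket_id: int
        issued_by: str                  # Data QA Manager, or a field technician
        problem: str                    # problem description
        priority_days: int              # 2-30 business days, based on the sensor
        fixed_at: Optional[str] = None  # completed by the technician
        sensor_serials: List[str] = field(default_factory=list)
        repairs_made: Optional[str] = None

    ticket = TroubleTicket(1, "Data QA Manager", "Gauge reading drifting", priority_days=5)
    ticket.fixed_at = "2005-04-26T14:00"
    ticket.sensor_serials.append("SN-12345")
    ticket.repairs_made = "Replaced transducer"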
Quality Assurance of Instruments
• Documented in the Anomaly Tracking System Users Manual
• Reports of incidents are collected and evaluated; maintenance is performed as needed
• Metadata records are updated

Quality Control of Data
• Documented in Data Management: Ingest to Access
• Data ingest: tests for proper message form, communication errors, etc.
• Automated (sketched below):
  – Limits: gross limits check
  – Variance: limits for individual parameters
  – Redundancy: data inter-comparison relying on multiple sensors
• Manual: Handbook of Manual Monitoring
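A hedged sketch of the three automated checks named above; the thresholds and the three-sensor median logic are illustrative assumptions, not USCRN's published procedures.

    import statistics

    def gross_limit_check(value, lo=-60.0, hi=60.0):
        """Limits: flag a temperature outside plausible physical bounds (degC)."""
        return lo <= value <= hi

    def variance_check(values, max_stdev=5.0):
        """Variance: flag a parameter whose short-term spread exceeds its limit."""
        return statistics.stdev(values) <= max_stdev

    def redundancy_check(t1, t2, t3, tolerance=0.3):
        """Redundancy: inter-compare three redundant sensors; each reading
        should agree with the median of the trio to within a tolerance."""
        med = statistics.median([t1, t2, t3])
        return [abs(t - med) <= tolerance for t in (t1, t2, t3)]

    print(redundancy_check(12.1, 12.2, 14.0))  # [True, True, False]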
Metadata Management
Survey to Operations
[Diagram: Data flow from field sites to the user community. Each field site (instrument suite, processing unit, communications device) sends data over the communications network to ingest and processing; raw data go to an offline raw-data archive and quality-controlled data to an online flagged-data archive; quality control drives maintenance notification to the maintenance provider; the user community accesses data via the Internet.]
Performance Measures
114 CONUS geographic locations required:
• Captures 98% of the variance in monthly temperature and 95% of the variance in annual precipitation for the CONUS
• Average annual error <0.1°C for temperature, <1.5% for precipitation
• Trend "errors" <0.05°C per decade (see the trend sketch below)
• For context, the IPCC projects warming of 0.1-0.3°C/decade and precipitation changes of 0-2%/decade for the CONUS
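To make the trend-error requirement concrete, a decadal trend can be estimated by least squares as below; the synthetic series is invented, and only the 0.05°C/decade threshold comes from the slide.

    import numpy as np

    years = np.arange(1975, 2005)  # a 30-year record
    rng = np.random.default_rng(0)
    # Synthetic anomalies: 0.02 degC/yr trend plus noise (invented data).
    anomalies = 0.02 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

    slope_per_year = np.polyfit(years, anomalies, deg=1)[0]
    trend_per_decade = 10.0 * slope_per_year
    print(f"estimated trend: {trend_per_decade:.3f} degC/decade")

    # The network requirement is that the *error* of such an estimate,
    # relative to the full observing system, stays below 0.05 degC/decade.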
Determine the Actual Long-Term Changes in Temperature and Precipitation of the Contiguous U.S. (CONUS)
FY2005 target: capture more than 96.9% of the temperature trends and 91.1% of the precipitation trends.
RESEARCH
[Figure: Cumulative precipitation gauge comparison at Sterling, VA, weekly from 11/1/2003 to 4/24/2004 (y-axis 0.00-24.00). Gauges compared: Geonor #1, Geonor #2, Ott-704, Ott-705, Ott-706, Ott-754, TB#1, TB#2, Frise-C1, Frise-D3, 8"S, 8"N, 8" Std, 8" DFIR.]
[Photo: Tretyakov shield with Ott gauge.]
[Photo: Double Alter shield with Geonor gauge.]
[Figure: Air temperature & RH monitoring at the High Plains Regional Climate Center (Lincoln). Systems compared: DewTrack, MET2010, Standard RMY, USCRN Shield, PMT, New ASOS, Standard HMP243.]
[Figure: Temperature system comparison: ASOS, MMTS, CRS, Gill.]
Network Integration
Cross-Network Transfer Functions
Cooperative Observer Network (~10,000 stations)
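Transfer functions between networks are typically derived from overlapping, co-located observations. Below is a hedged sketch fitting a simple linear mapping from a legacy-network sensor to the reference sensor; the paired values are invented.

    import numpy as np

    # Overlapping measurements at a co-located site (invented values, degC).
    coop = np.array([15.4, 18.2, 21.9, 25.1, 28.7])   # legacy network sensor
    uscrn = np.array([15.0, 17.9, 21.5, 24.8, 28.3])  # reference sensor

    # Linear transfer function: uscrn_equivalent = a * coop + b
    a, b = np.polyfit(coop, uscrn, deg=1)

    def to_reference(coop_value):
        """Map a legacy-network reading onto the reference scale."""
        return a * coop_value + b

    print(round(to_reference(20.0), 2))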
[Map: Planned USCRN stations at the end of 2008 (114* stations), showing installed paired and installed single locations as of April 26, 2005. *Does not include Alaska, Canada, Hawaii, and GCOS stations.]
Experimental Product
Siting Standards Documents: Representativeness
• Network Plan
• Site Acquisition Plan
• Site Information Handbook
• Site Survey Plan
• Site Survey Handbook
• Site Survey Checklist
• Site Acquisition Checklist
Major Principles of Station Siting
• Site is representative of the climate of the region
• Minimal microclimatic influences
• Long-term (50-100 year) land tenure
• Minimal prospects for human development
• Avoids agriculture, major water bodies, major forested areas, and basin terrain
• Accessible for calibration and maintenance
• Stable host agency or organization
• Follows WMO climate station siting guidelines
Objective Site Scoring
• An objective scoring sheet was developed based on the Leroy method; the score for a station becomes part of that station's metadata (see the sketch below)
• Stations are re-scored during the annual maintenance visit, allowing changes over time in the representativeness of the station's meteorology to be tracked
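The Leroy method grades a site by its exposure (distance to heat sources, shading by obstacles, and so on). The sketch below shows only the general shape of such an objective classifier; the thresholds and class boundaries are invented and are not Leroy's actual criteria.

    def site_class(dist_to_heat_source_m, obstacle_shading_deg):
        """Return an illustrative exposure class, 1 (best) to 5 (worst).
        Thresholds are invented stand-ins for the Leroy criteria."""
        if dist_to_heat_source_m >= 100 and obstacle_shading_deg < 5:
            return 1
        if dist_to_heat_source_m >= 30 and obstacle_shading_deg < 7:
            return 2
        if dist_to_heat_source_m >= 10 and obstacle_shading_deg < 15:
            return 3
        if dist_to_heat_source_m >= 5:
            return 4
        return 5

    # Re-scoring at each maintenance visit appends to the station metadata,
    # so drift in representativeness can be tracked over time.
    metadata_scores = [{"date": "2005-04-26", "class": site_class(120, 3)}]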
International Cooperation, Collaboration and Partnerships
• U.S. representative serves on the Canadian National Monitoring Change Management Board
• Canadian Reference Climate Network program participates on the USCRN Science Review Panel
• USCRN hardware architecture incorporated into the Canadian Climate Monitoring Network
• The two nations will exchange and co-locate reference climate stations in FY04
This is a first step in international cooperation: establishing commonality among the surface observing systems used to monitor climate change.
QUESTIONS
• How do we continue to expand international and national partnerships?
• What is the best way to exchange information?
• How do we glue the system of systems together?
E-Mail: [email protected]
URL: http://www.ncdc.noaa.gov/oa/climate/uscrn/index.html
Network Characteristics
• Benchmark network for temperature and precipitation
• Anchor points for the USHCN and the full COOP network
• Long-term stability of observing sites (50+ years), likely to be free from human encroachment
• Sensors calibrated to traceable standards
• Planned redundancy of sensors and of selected stations
• Network performance monitoring, hourly and daily
• Strong science & research component