Top Ten For Warfighters


Know the Earth

NATIONAL IMAGERY AND MAPPING AGENCY

Show the Way

NIMA Commercial and Civil Applications Project (CCAP) Geopositional Accuracy Evaluations

Terry Lehman, December 4, 2003

ISPRS International Workshop on Radiometric and Geometric Calibration

CCAP Mission

• CCAP is the NIMA process to assess the utility of emerging civil and commercial remote sensing systems
• DoD Directive 5105.60 states that NIMA shall:
  – Assess the applicability of evolving commercial capabilities to meet imagery and geospatial information needs of the Department of Defense and the Intelligence Community
• CCAP partners include USGS and NASA-Stennis:
  – JACIE (Joint Agency Commercial Imagery Evaluation) team
  – Space Act agreement
• Elements of the evaluations:
  – Image interpretation for intelligence, military, and civil applications
  – Feature extraction for mapping
  – Geopositional accuracy
  – Radiometric fidelity

Approach

• The evaluation of geopositional accuracy of basic products is based on a comparison to known ground control points of higher accuracy
• Evaluation support provided by NIMA's:
  – Precision Engagement staff (PTNT)
  – Front-end Processing Environment (FPE)
  – Image Quality & Utility (AEAI)
  – InnoVision (IDR)
• SOCET SET® was the primary tool used to measure coordinates in the imagery products

Approach (cont.)

• 13 scenes
  – Various locations having features with known ground-truth geo-coordinates
  – Scenes vary by terrain elevation characteristics
• 25 drop points geo-located per scene
  – Ground-truth geo-coordinate data derived from controlled base
  – Only latitude and longitude determined – no stereo
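As an illustration of how the per-scene drop-point comparisons can be reduced to accuracy statistics, the following is a minimal sketch (not from the briefing): it converts measured-versus-truth latitude/longitude differences to horizontal error in meters and compiles summary numbers. The specific metrics (RMSE, CE90) and the flat-Earth conversion are assumptions; the slides only state that statistics were compiled for all GCPs measured.

```python
import math

def horizontal_error_m(truth, measured):
    """Approximate horizontal distance in meters between two (lat, lon) points,
    using a local flat-Earth (equirectangular) approximation."""
    lat_t, lon_t = truth
    lat_m, lon_m = measured
    m_per_deg_lat = 111_320.0                        # rough meters per degree of latitude
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat_t))
    dn = (lat_m - lat_t) * m_per_deg_lat             # north error component
    de = (lon_m - lon_t) * m_per_deg_lon             # east error component
    return math.hypot(dn, de)

def scene_statistics(pairs):
    """Compile per-scene horizontal error statistics from (truth, measured) pairs.
    RMSE and CE90 are illustrative choices, not metrics named in the briefing."""
    errors = sorted(horizontal_error_m(t, m) for t, m in pairs)
    n = len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    ce90 = errors[min(n - 1, int(math.ceil(0.9 * n)) - 1)]   # 90th-percentile circular error
    return {"n": n, "rmse_m": rmse, "ce90_m": ce90}

# Example with two of the ~25 drop points a scene would have (hypothetical values):
pairs = [((34.6695, -77.3222), (34.66957, -77.32228)),
         ((34.6701, -77.3210), (34.67003, -77.32110))]
print(scene_statistics(pairs))
```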

Imagery Matrix

SCENE                  LATITUDE     LONGITUDE
Abu Musa, TC           25.8756N     55.0323E
Antananarivo, MA       18.9103S     47.5411E
Auckland, NZ           36.9635S     174.7969E
Bandar Abbas, IR       27.1697N     56.2082E
Belem, BR              1.3900S      48.4600W
Buenos Aires, AR       34.5791S     58.4032W
Callao, PE             12.09952S    77.137W
Camp Lejeune, NC       34.6695N     77.3222W
Christchurch, NZ       43.4822S     172.5150E
Colon, PM              9.3600N      79.8660W
Durban, SF             29.9689S     30.9508E
Fallon, NV             39.4170N     118.7000W
Ft Hood, TX            31.1420N     97.7639W
Ft Irwin, CA           35.26N       116.69W
Ft Lewis, WA           47.1100N     122.5383W
Hickam AFB, HI         21.3478N     157.9522W
Jakarta, ID            6.1500S      106.8333E
Keflavik, IC           64.02037N    22.59077W
Kirkuk, IZ             35.4699N     44.3460E
Langley, VA            37.0845N     76.3567W
Miami, FL              25.7900N     80.2000W
Montevideo, UY         34.8500S     56.1750W
Nellis AFB, NV         36-14-00N    115-02-00W
Norfolk, VA            36.9370N     76.2895W
San Diego, CA          32.7200N     117.1500W
Santiago de Cuba       20.0000N     75.8500W
Sioux City, IA (US)    42.4026N     96.3812W
Soufriere Hills, MH    16.7000N     62.1666W
St. Simons Isl, GA     31.1400N     81.3952W
Sunnyvale, CA          37.4166N     122.0500W
Sydney, AS             33-55-00S    151-17-00E
Tuzla, BK              44.5400N     18.6750E
Utapao, TH             12.6814N     100.9978E
Villa Dolores          39.9143S     65.1000W
Winslow, AZ            35.0333N     110.1500W
Yongbyong, KN          39.7971N     125.7501E
Yuma, AZ               32.6519N     114.6120W

BOLD: Geopositional Products

Scene Locations


Geopositional Accuracy

• Basic products
  – Imported into SOCET SET®
  – For each GCP:
    • Measurement cursor elevation set to GCP elevation
    • Operator selects image pixel representing GCP
    • Horizontal coordinates computed using image rapid positioning capability (RPC) data, GCP elevation, and image line/sample
  – Statistics compiled for all GCPs measured
• Ortho products
  – Imported into SOCET SET®
  – For each GCP:
    • Operator selects image pixel representing GCP
    • Horizontal coordinates computed using image RPC data and image line/sample
  – Statistics compiled for all GCPs measured
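The basic-product measurement step – holding the cursor at the GCP elevation and computing horizontal coordinates from the image RPC data and line/sample – is carried out inside SOCET SET®. A minimal sketch of the underlying idea, assuming a ground_to_image(lat, lon, h) function that evaluates the product's RPC polynomials (not shown here), is:

```python
def image_to_ground(line, sample, elevation, ground_to_image,
                    lat0, lon0, iters=10):
    """Invert an RPC-style ground-to-image model at a fixed elevation.

    ground_to_image(lat, lon, h) -> (line, sample) is assumed to wrap the
    product's RPC polynomials.  lat0/lon0 seed the iteration, e.g. scene-center
    coordinates from the image metadata.  This mirrors the basic-product
    workflow: elevation is held at the GCP value and only the horizontal
    coordinates are solved for.  A sketch, not the SOCET SET implementation.
    """
    lat, lon = lat0, lon0
    d = 1e-6                                          # finite-difference step in degrees
    for _ in range(iters):
        l0, s0 = ground_to_image(lat, lon, elevation)
        rl, rs = line - l0, sample - s0               # residuals in image space
        # Numerical Jacobian of (line, sample) with respect to (lat, lon)
        l_lat, s_lat = ground_to_image(lat + d, lon, elevation)
        l_lon, s_lon = ground_to_image(lat, lon + d, elevation)
        a, b = (l_lat - l0) / d, (l_lon - l0) / d
        c, e = (s_lat - s0) / d, (s_lon - s0) / d
        det = a * e - b * c
        if abs(det) < 1e-12:
            break
        lat += ( e * rl - b * rs) / det               # 2x2 Newton update
        lon += (-c * rl + a * rs) / det
    return lat, lon
```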

Camp Lejeune


Christchurch


Feature Selection

• Features in this evaluation fall into broad coverage categories as listed in the Feature and Attribute Coding Catalog (FACC)
• Some overlap of features exists between coverage categories
• All confidence ratings and yes/no attribute responses for features were grouped by coverage category and averaged for each product type
  – This allowed for comparisons of products by general mapping applications
• Multiple-choice attribute responses were grouped by attribute category, compared with predetermined ground truth, and averaged for each product type
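A small sketch of the grouping-and-averaging described above, assuming hypothetical response records of (product type, coverage category, confidence rating); the field names and the 1–5 scale are illustrative assumptions, not details given in the briefing:

```python
from collections import defaultdict

# Hypothetical analyst responses: (product_type, coverage_category, confidence 1-5).
responses = [
    ("Pan",           "Transportation", 4),
    ("Pan",           "Transportation", 5),
    ("MSI",           "Transportation", 3),
    ("Pan-Sharpened", "Hydrography",    4),
    ("MSI",           "Hydrography",    5),
]

sums = defaultdict(lambda: [0, 0])        # (product, category) -> [total, count]
for product, category, confidence in responses:
    entry = sums[(product, category)]
    entry[0] += confidence
    entry[1] += 1

# Average confidence per coverage category for each product type.
averages = {key: total / count for key, (total, count) in sums.items()}
for (product, category), mean in sorted(averages.items()):
    print(f"{product:14s} {category:15s} mean confidence = {mean:.2f}")
```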

Feature Selection, cont.

• 6 geospatial analysts/cartographers
• 265 image chips; 29 unique geographic locations
  – 112 Pan
  – 112 MSI
  – 41 Pan-sharpened MSI
• Data were analyzed to determine the suitability and information content of standard imagery products in support of extraction and attribution tasks
• Categories derived from the FACC

Image Chips

Images copyright DigitalGlobe 2003

Coverage Categories

• Nine FACC coverage categories used:
  – Ground Obstacle
  – Hydrography
  – Industry
  – Physiography
  – Population
  – Surface Drainage (SDR)
  – Transportation
  – Utility
  – Vegetation

Attribute Categories

• Fourteen attribute categories used:
  – Accuracy
  – Existence
  – Hydrology
  – Infrastructure
  – Location
  – Material Composition
  – Other
  – Product
  – Structure/Shape
  – Surface Condition
  – Surface Type
  – Usage
  – Vegetation Characteristics
  – Weather Type

Scene Locations


Image Interpretability

• 10 imagery analysts (IAs)
• 53 images; 28 unique geographic locations
  – 28 Pan
  – 10 MSI
  – 15 Pan-sharpened MSI
  – All images at near-nadir collection geometry
  – Resampling kernel: nearest neighbor
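For reference, a minimal sketch of what a nearest-neighbor resampling kernel does; this is purely illustrative – the actual products were resampled in the provider's processing chain, not with this code, and the scale factor here is an assumption:

```python
import numpy as np

def resample_nearest(image, scale):
    """Nearest-neighbor resampling of a 2-D image array by a scale factor."""
    rows, cols = image.shape
    out_rows, out_cols = int(round(rows * scale)), int(round(cols * scale))
    # Map each output pixel back to the nearest input pixel index.
    r_idx = np.minimum((np.arange(out_rows) / scale).round().astype(int), rows - 1)
    c_idx = np.minimum((np.arange(out_cols) / scale).round().astype(int), cols - 1)
    return image[np.ix_(r_idx, c_idx)]

# Example: upsample a small test array by 2x.
img = np.arange(16, dtype=np.uint8).reshape(4, 4)
print(resample_nearest(img, 2.0).shape)   # (8, 8)
```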

Image Interpretability, cont.

• One to five sub-scenes chipped from each image
  – "Chip" size ranged from 2048² to 6144² pixels, depending on the product
  – Total of 213 sub-scenes
• National Image Interpretability Rating Scale (NIIRS) evaluation
  – Images displayed on certified and calibrated, non-destructive softcopy displays
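A sketch of extracting a fixed-size sub-scene ("chip") from a larger image array; the chip center and the 2048-pixel default are illustrative assumptions, chosen only to match the size range quoted above:

```python
import numpy as np

def chip(image, center_row, center_col, size=2048):
    """Extract a size x size sub-scene centered on a pixel, clamped so the
    window stays inside the image bounds."""
    rows, cols = image.shape[:2]
    half = size // 2
    r0 = min(max(center_row - half, 0), max(rows - size, 0))
    c0 = min(max(center_col - half, 0), max(cols - size, 0))
    return image[r0:r0 + size, c0:c0 + size]

# Example on a synthetic "full scene":
scene = np.zeros((10000, 10000), dtype=np.uint16)
print(chip(scene, 5000, 5000, size=2048).shape)   # (2048, 2048)
```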

Image Interpretability, cont.

• 1-meter imagery will satisfy NIIRS 5 requirements about half the time; the other half will satisfy high NIIRS 4 requirements.
  – Assumes near-nadir collection geometry

NIIRS 5 criteria:
• Distinguish between a MIDAS and a CANDID by the presence of refueling equipment (e.g., pedestal and wing pod).
• Identify TOP STEER or TOP SAIL air surveillance radar on KIROV-, SOVREMENNY-, KIEV-, SLAVA-, MOSKVA-, KARA-, or KRESTA-II-class vessels.
• Identify radar as vehicle-mounted or trailer-mounted.
• Distinguish between SS-25 mobile missile TEL and Missile Support Vans (MSVs) in a known support base, when not covered by camouflage.
• Identify individual rail cars by type (e.g., gondola, flat, box) and/or locomotives by type (e.g., steam, diesel).
• Identify, by type, deployed tactical SSM systems (e.g., FROG, SS-21, SCUD).

QB Super Resolution

Three Gorges: super resolution and standard products. Images copyright 2003 DigitalGlobe.

Conclusions

• CCAP evaluates all commercial and civil satellite imagery that NIMA's customers are interested in using
• CCAP evaluations ensure that the delivered products meet the supplier's specifications, particularly with respect to geopositional accuracy
• CCAP evaluations quantify the utility of commercial imagery for image analysis and mapping applications

Present and Future Projects

• OrbView-3
• SPOT-5
• EROS A1
• ISTAR/ADS-40
• GeoSAR
• STAR-3i
• RADARSAT-2

Know the Earth

NATIONAL IMAGERY AND MAPPING AGENCY

Show the Way

Questions?