Transcript Lecture 36

ASEN 5070: Statistical Orbit Determination I
Fall 2015
Professor Brandon A. Jones
Lecture 36: SNC Example and Solution Characterization
University of Colorado Boulder

Homework 11 due on Friday 12/4
◦ Sample solutions will be posted online

Exam 3 posted on Friday 12/4
◦ In-class students: due December 11 by 5pm
◦ CAETE students: due 11:59pm (Mountain) on 12/13

Final Project due December 14 by 12 noon
Homework 11

Leverage code from HW10
◦ New data set generated with a different force model
◦ Otherwise, same format, data noise, etc.

Process observations in existing filter
◦ Do not add J3 to your filter model!
◦ Observe the effects of such errors on OD
◦ Add process noise to improve state estimation accuracy (a sketch of the SNC covariance update follows below)
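
As a rough guide (a minimal sketch, not the required implementation), SNC-style process noise can be folded into the covariance time update of the HW10 filter roughly as follows; the function name, the state ordering (position then velocity), and the value of sigma_u are assumptions to be tuned:

```python
import numpy as np

def snc_process_noise(dt, sigma_u):
    """SNC process noise for a 6-state (position, velocity) filter,
    assuming the unmodeled acceleration is constant over the gap dt:
    Q_k = Gamma(dt) @ (sigma_u**2 * I3) @ Gamma(dt).T,
    Gamma(dt) = [dt**2/2 * I3; dt * I3]."""
    I3 = np.eye(3)
    Gamma = np.vstack((0.5 * dt**2 * I3, dt * I3))  # 6x3 map: acceleration -> (pos, vel)
    return Gamma @ (sigma_u**2 * I3) @ Gamma.T      # 6x6 process noise matrix

# Covariance time update (Phi, P from the existing filter; sigma_u tuned by trial):
# P_bar = Phi @ P @ Phi.T + snc_process_noise(dt, sigma_u=1e-8)
# If the filter state includes other parameters, pad Q_k with zero rows/columns.
```

A common heuristic is to tune sigma_u until the post-fit residuals look consistent with the measurement noise and the state errors stay within the covariance bounds.
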
Application of SNC to Ballistic Trajectory

[Figure: ballistic trajectory with unknown start/stop, observed from ground stations; the start of the filter is marked, and a red band indicates the time with available observations.]


Object in a ballistic trajectory under the influence of drag and gravity

Nonlinear observation model
◦ Two observation stations

Outliers
◦ Mitigate through prediction-residual 3σ editing (a sketch follows below)

An observation bias in Station 1 range
◦ Still estimating the bias in the filter
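
A minimal sketch of the 3σ prediction-residual edit, assuming a linearized measurement mapping H_tilde and a pre-update covariance P_bar are available at the observation time (all names are illustrative):

```python
import numpy as np

def accept_observation(prefit_residual, H_tilde, P_bar, R):
    """3-sigma prediction (prefit) residual editing: reject the observation
    when any residual component exceeds 3x its predicted standard deviation,
    taken from the predicted residual covariance H P_bar H^T + R."""
    S = H_tilde @ P_bar @ H_tilde.T + R   # predicted residual covariance
    sigma = np.sqrt(np.diag(S))
    return bool(np.all(np.abs(prefit_residual) <= 3.0 * sigma))
```

Observations that fail the check are typically skipped in the measurement update, while the state and covariance are still propagated to the next observation time.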


Now use an EKF

We will vary the truth model to study the benefits of SNC (the SNC formulation is sketched below)
◦ Look at two cases:
 Error in gravity (g = 9.8 m/s² vs. 9.9 m/s²)
 Error in drag (b = 1e-4 vs. 1.1e-4)
◦ Run each with and without a process noise model
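
For reference, a sketch of the standard SNC formulation (the specific spectral-density values used in these runs are not shown here): white-noise accelerations \( \mathbf u(t) \) are added to the modeled dynamics,

\[
\ddot{\mathbf r} = \mathbf a_{\text{model}}(\mathbf r, \dot{\mathbf r}, t) + \mathbf u(t),
\qquad
E\!\left[\mathbf u(t)\,\mathbf u^{T}(\tau)\right] = Q\,\delta(t-\tau),
\]

and, treating \( \mathbf u \) as constant over a short measurement gap \( \Delta t \), they contribute

\[
Q_k = \Gamma(\Delta t)\, Q\, \Gamma^{T}(\Delta t),
\qquad
\Gamma(\Delta t) = \begin{bmatrix} \tfrac{1}{2}\Delta t^{2}\, I \\ \Delta t\, I \end{bmatrix},
\]

to the time-updated covariance, \( \bar P_k = \Phi\, P_{k-1}\, \Phi^{T} + Q_k \).
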
[Figure: Station 1 and Station 2 observation residuals for the gravity-error case, no process noise (blue = range, green = range-rate).]

Added SNC to the filter

Why is the term for the x-acceleration smaller?
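
As an illustration only (the actual spectral-density values used in the example are not given here, and the planar x, y layout is an assumption), a diagonal SNC spectral density has the form

\[
Q = \begin{bmatrix} \sigma_{u,x}^{2} & 0 \\ 0 & \sigma_{u,y}^{2} \end{bmatrix},
\qquad
Q_k = \Gamma(\Delta t)\, Q\, \Gamma^{T}(\Delta t).
\]

A smaller \( \sigma_{u,x} \) is plausible here because the mismodeled gravity acceleration acts only in the vertical direction, so less compensating noise is needed along x.
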
[Figure: Station 1 and Station 2 observation residuals for the gravity-error case with SNC added (blue = range, green = range-rate).]

178.26 vs. 0.85 meters RMS (without vs. with process noise)
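
A minimal sketch of how an RMS position-error metric like this could be computed against the truth trajectory; the array layout (positions in the leading columns) is an assumption:

```python
import numpy as np

def rms_position_error(est_states, true_states, n_pos=2):
    """RMS of the position-error magnitude over the fit span.
    est_states, true_states: (N, n) arrays whose first n_pos columns
    hold the position components (layout assumed)."""
    err = est_states[:, :n_pos] - true_states[:, :n_pos]
    return float(np.sqrt(np.mean(np.sum(err**2, axis=1))))
```
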
[Figure: Station 1 and Station 2 observation residuals for the drag-error case, no process noise (blue = range, green = range-rate).]

Added SNC to the filter
[Figure: Station 1 and Station 2 observation residuals for the drag-error case with SNC added (blue = range, green = range-rate).]

27.63 vs. 1.26 meters RMS (without vs. with process noise)

Mitigation of the gravity acceleration error yielded better results than the drag error case. Why could that be?
Solution Characterization

Sources of error in an OD solution:

Truncation error (linearization)

Round-off error (fixed precision arithmetic)

Mathematical model simplifications (dynamics and measurement model)

Errors in input parameters (e.g., J2)

Amount, type, and accuracy of tracking data

For the Jason-2 / OSTM mission, the OD fits are quoted to have errors of less than a centimeter in the radial direction
◦ How do they arrive at that accuracy estimate?
◦ Residuals?
 Depends on how much we trust the data
 Provides information on the fit to the data, but what about solution accuracy?
◦ Covariance matrix?
 How realistic is the output covariance matrix?
 (Actually, I can make the output matrix whatever I want through process noise or other means.)

Characterization requires a comparison to an independent solution
◦ Different solution methods, models, etc.
◦ Different observation data sets:
 Global Navigation Satellite Systems (GNSS) (e.g., GPS)
 Doppler Orbitography and Radio-positioning Integrated by Satellite (DORIS)
 Satellite Laser Ranging (SLR)
 Deep Space Network (DSN)
 Delta-DOR
 Others…

Provides a measure based on solution precision

Jason-2 / OSTM position solutions generated by/at:
◦ JPL – GPS only
◦ GSFC – SLR, DORIS, and GPS
◦ CNES – SLR, DORIS, and GPS

Algorithms/tools differ by team:
◦ Different filters
◦ Different dynamic/stochastic models
◦ Different measurement models
[Image: Bertiger et al., 2010]

1 cycle ≈ 10 days
Differences are on the order of millimeters

Compare different fit intervals:

Consider the “abutment test”:



Each data fit at JPL uses 30 hours of data, centered at noon
This means that each data fit overlaps the previous/next fit by six hours
Compare the solutions over the middle four hours (a comparison sketch follows below)
◦ Why?
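
A minimal sketch of such an overlap comparison, assuming the two consecutive fits have been interpolated to a common epoch grid spanning their six-hour overlap (names and units are illustrative):

```python
import numpy as np

def abutment_rms(t_hours, pos_a, pos_b, window_hours=4.0):
    """Difference two overlapping orbit fits over the middle of their overlap.
    t_hours: (N,) common epochs covering the overlap; pos_a, pos_b: (N, 3)
    positions from the two fits at those epochs (same frame)."""
    center = 0.5 * (t_hours[0] + t_hours[-1])
    mask = np.abs(t_hours - center) <= 0.5 * window_hours
    diff = pos_a[mask] - pos_b[mask]
    return float(np.sqrt(np.mean(np.sum(diff**2, axis=1))))  # RMS 3-D difference
```
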
[Image: Bertiger et al., 2010]

Histogram of daily overlaps for almost one year

The overlaps imply a solution consistency of ~1.7 mm

This is an example of why it is called "precise orbit determination" instead of "accurate orbit determination"


In some cases, we can leverage observations (ideally not included in the data fit) to estimate accuracy
How might we use SLR to characterize the radial accuracy of a GNSS-based solution? (One possible approach is sketched below.)
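
One possible approach, sketched under assumed inputs: the GNSS-based orbit interpolated to the SLR epochs, station coordinates in the same frame, and laser ranges already corrected for atmospheric delay (all names are illustrative):

```python
import numpy as np

def slr_range_residuals(sat_pos, stn_pos, measured_range):
    """Check a GNSS-based orbit against withheld SLR ranges.
    sat_pos, stn_pos: (N, 3) satellite/station positions at the SLR epochs;
    measured_range: (N,) corrected laser ranges. For high-elevation passes the
    line of sight is nearly radial, so these residuals bound the radial error."""
    computed_range = np.linalg.norm(sat_pos - stn_pos, axis=1)
    return measured_range - computed_range
```
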
[Image: Bertiger et al., 2010]

Results imply that the GPS-based radial error is on the order of millimeters
Why is the DORIS/SLR/GPS solution better here?


Must consider independent state estimates and/or observations
Not an easy problem, and the method of characterization is often problem-dependent
◦ How do you think they do it for interplanetary missions?