Transcript, March 3, 2005

LAT HSK Data Handling
from
B33 Cleanroom
ISOC Software Architecture
[Figure: Top-Level ISOC Data Flows (9 Feb 2005). The diagram shows the MOC delivering RT HK, diag and alert telemetry, L0 packages, cmd execution tlm, and upload confirmation tlm to L0 File Ingest and the L0 Raw Archive; the GSSC exchanging the Integrated Observatory Timeline, Spacecraft Timeline, As-Flown Timeline, Flight Dynamics products, Preliminary Science Timeline, and the LAT Timeline Package with Mission Planning; L0 Data Processing passing data arrival and packet processing info, EU values, limits, and alerts to Telemetry Monitoring, Trending, and Logging; L1&2 Product Generation turning L0 Sci Data (LDF files) into L1&2 Sci Data (FITS files), handed to L1&2 Product Transmission as the L1&2 Data Products Package; plus Planning Production and LAT Configuration Monitoring.]
Import Flow Diagram
[Figure: Import flow. Flight hardware in the Cleanroom (B33) is polled by LATTE HSK Servers, which store register values in a MySQL database; a cron database dump writes Tlm.out and Val.out files to the DMZ drive; an HSK Export cron job on glast02 moves them to the U12 NFS volume and submits a task and run ID to the Pipeline Scheduler, backed by the Pipeline Database (Oracle) on glast04; Trending Import jobs on the LSF batch farm post changes and statistics (src, time, value, validity) to the Trending Database (Oracle) via SQL; the Elog and Trending WebApps serve the data to a web browser over HTTP, with URL links into the Elog Database (Oracle).]
Backup Slides
HSK Data Collection
• Flight hardware in the cleanroom is polled by the LATTE HSK Server.
• Sampled HSK values are stored in a MySQL database.
• Every two hours, new data is dumped to binary files on the DMZ drive.
– This shared volume is mounted by LATTE workstations in the cleanroom and by a small number of designated hosts on the SLAC network.
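The two-hour dump step might look like the sketch below: select samples newer than the last dump and write them to a packed binary file. This is an illustration only; sqlite3 stands in for the real MySQL database, and the table schema and record layout are assumptions, not the actual LATTE formats.

```python
# Sketch of the periodic HSK dump: read samples taken since the last
# dump and write them to a binary file on the shared volume.
# sqlite3 stands in for MySQL; schema and record layout are illustrative.
import os
import sqlite3
import struct
import tempfile

RECORD = struct.Struct("<d d")  # hypothetical (timestamp, value) record

def dump_new_samples(conn, out_path, last_dump_ts):
    """Write all samples taken after last_dump_ts to a binary file."""
    rows = conn.execute(
        "SELECT ts, value FROM hsk_samples WHERE ts > ? ORDER BY ts",
        (last_dump_ts,),
    ).fetchall()
    with open(out_path, "wb") as f:
        for ts, value in rows:
            f.write(RECORD.pack(ts, value))
    return len(rows)

# Demo with fake data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hsk_samples (ts REAL, value REAL)")
conn.executemany(
    "INSERT INTO hsk_samples VALUES (?, ?)",
    [(1.0, 20.5), (2.0, 20.6), (3.0, 20.7)],
)
out = os.path.join(tempfile.mkdtemp(), "Tlm.out")
n = dump_new_samples(conn, out, last_dump_ts=1.0)  # only rows after ts=1.0
```

Tracking the last-dump timestamp (rather than rewriting the whole table) keeps each dump incremental, which matches the "new data is dumped" wording above.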
HSK Data Export
• A cron job on the ‘glast02’ host moves the database-dump files from the DMZ volume to the U12 volume.
– This volume is accessible to (most) GLAST-project hosts and to the LSF batch farm.
• The cron job also injects a new instance of the ‘b33-hsk-import’ task into the pipeline.
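The export job's two duties (move the files, inject a task) can be sketched as below. The directory paths and the inject_task callback are hypothetical stand-ins; the real cron job works against the actual volumes and the pipeline database.

```python
# Sketch of the glast02 export cron job: move dump files from the DMZ
# volume to the U12 volume, then register one 'b33-hsk-import' pipeline
# task per moved file.  Paths and inject_task are hypothetical.
import shutil
import tempfile
from pathlib import Path

def export_dumps(dmz_dir, u12_dir, inject_task):
    """Move *.out dump files and inject one import task per file."""
    moved = []
    for dump in sorted(Path(dmz_dir).glob("*.out")):
        dest = Path(u12_dir) / dump.name
        shutil.move(str(dump), str(dest))    # file leaves the DMZ volume
        inject_task("b33-hsk-import", str(dest))
        moved.append(dest.name)
    return moved

# Demo with temporary directories standing in for the two volumes.
dmz = Path(tempfile.mkdtemp())
u12 = Path(tempfile.mkdtemp())
(dmz / "Tlm.out").write_bytes(b"\x00")
(dmz / "Val.out").write_bytes(b"\x00")

tasks = []
moved = export_dumps(dmz, u12, lambda name, arg: tasks.append((name, arg)))
```

Moving (rather than copying) the files means a file's presence on the DMZ drive marks it as not-yet-exported, so the cron job needs no extra bookkeeping.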
HSK Data Import
• The pipeline schedules execution of the
Oracle-import program (written in Python)
on the LSF batch farm.
– Processing status can be tracked through the
pipeline web pages.
• The Python program reads the binary data
file, creates change records and statistics
records, and posts them to Oracle.
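The import program's core logic (parse the binary file, build change records and statistics records) might look like the following sketch. The binary layout is an assumed (timestamp, value) format, and posting to Oracle is replaced by returning plain dicts.

```python
# Sketch of the Oracle-import step: parse a binary dump into samples,
# emit a change record whenever the value differs from the previous
# sample, and one statistics record summarizing the file.  The record
# layout is illustrative; database posting is omitted.
import struct

RECORD = struct.Struct("<d d")  # hypothetical (timestamp, value) record

def import_dump(data):
    samples = [RECORD.unpack_from(data, off)
               for off in range(0, len(data), RECORD.size)]
    changes = []
    prev = None
    for ts, value in samples:
        if value != prev:                     # value changed: record it
            changes.append({"ts": ts, "value": value})
            prev = value
    values = [v for _, v in samples]
    stats = {                                 # per-file statistics record
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": sum(values) / len(values),
    }
    return changes, stats

raw = b"".join(RECORD.pack(*s) for s in [(1.0, 5.0), (2.0, 5.0), (3.0, 7.0)])
changes, stats = import_dump(raw)
```

Storing change records instead of every sample keeps slowly-varying housekeeping channels compact, while the statistics records preserve a summary of the raw sampling.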
Web Data Access
• The test-data processing pipeline populates the ‘ELogReport’ database table with information about each test run.
– This information can be queried to provide in-test time-spans of interest for querying trending data.
– It can also be used to determine the serial numbers of hardware under test (with some limitations).
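A query against such a table might look like the sketch below, which pulls the time-span of one test run so it can bound a trending query. sqlite3 stands in for Oracle, and the column names (run_id, start_ts, end_ts, serial_no) are assumptions, not the real ELogReport schema.

```python
# Sketch of looking up a test run's time-span from an ELogReport-style
# table, to use as bounds for a trending query.  sqlite3 stands in for
# Oracle; the schema is illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ELogReport
                (run_id INTEGER, start_ts REAL, end_ts REAL,
                 serial_no TEXT)""")
conn.executemany(
    "INSERT INTO ELogReport VALUES (?, ?, ?, ?)",
    [(101, 10.0, 20.0, "SN-001"),
     (102, 30.0, 45.0, "SN-002")],
)

def run_span(conn, run_id):
    """Return the (start, end) time-span recorded for one test run."""
    return conn.execute(
        "SELECT start_ts, end_ts FROM ELogReport WHERE run_id = ?",
        (run_id,),
    ).fetchone()

span = run_span(conn, 102)  # bounds for a subsequent trending query
```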