
Data Quality
The Logistics Imperative
Elaine S. Chapman
Defense Logistics Information Service
Chief, Data Integrity Branch
October 26, 2006
DNA of the DOD Supply Chain

“Data is the DNA of supply chain management”

• Acquisition
• Financial management
• Hazardous material
• Freight & packaging
• Maintenance
• Sustainability
• Disposal
• Demilitarization

[Diagram: Weapon System Lifecycle Management – Define New Requirements, Design, Build, Test, Deploy, Sustain, Retire – with DLIS connecting the many suppliers at every stage]

The questions the data must answer:
• Who is the customer?
• What is needed?
• How many are needed?
• Where is it needed?
• What meets the requirement?
• How many do we have, and where? Or, where/how can we obtain them?
• How must it be handled?
[Diagram: Material Supply and Services Management – suppliers feed ongoing Requirements & Demand Management; Acquisition Management (contract, provision, purchase); Maintenance & Configuration; Materials Management & Warehousing; Distribution & Transportation Management; Disposal; and Retail, supported by Quality, Finance, and Reporting]
DATA INTEGRITY . . . It’s About Parts

From a logistics perspective . . . supporting an F-15 is about 171,000 parts flying . . . and a Bradley is about 14,000 parts rolling in close formation.
How Expensive Can Bad Quality Data Be?

In this case, $125,000,000: the price of a Mars Climate Orbiter.

The same Mars Orbit Insertion Burn, recorded two ways (both Earth Receive Time, 10 min. 49 sec. delay):

                 M/D/Y HH:MM:SS PDT     YYYYMMDD EDT
Begin/Start      9/23/99 02:01:00       19990923 05:01:00
End/Finish       9/23/99 02:17:23       19990923 05:17:23
Distance         121,900,000 miles      196,200,000 km
Speed (start)    12,300 miles/hr        5.5 km/sec
Speed (finish)   9,840 miles/hr         4.4 km/sec
Force            143.878 pounds         640 Newtons
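The orbiter was lost because one ground system reported values in imperial units while the receiving navigation software assumed SI units. A minimal sketch of that failure mode; the function names and the pass-through are illustrative, not the actual flight software:

```python
# Illustrative sketch of a unit-mismatch defect like the one that
# doomed the Mars Climate Orbiter. Names and numbers are hypothetical.

LBF_TO_NEWTONS = 4.44822  # 1 pound-force in newtons

def reported_force_lbf() -> float:
    """One system reports force in pounds (imperial)."""
    return 143.878  # the slide's force figure, in pounds

def trajectory_model(force_newtons: float) -> None:
    """The consuming model silently assumes SI units."""
    print(f"Modeled force: {force_newtons:.0f} N")

# Bug: the raw number is passed through, off by a factor of ~4.45
trajectory_model(reported_force_lbf())                   # Modeled force: 144 N

# Fix: convert explicitly at the interface
trajectory_model(reported_force_lbf() * LBF_TO_NEWTONS)  # Modeled force: 640 N
```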
Root Causes of Poor Data Quality

• Shared data problems
• Interface disconnects

[Diagram: systems exchanging shared data across interfaces; diagram credit: Novaces, LLC]
Benefits

• Saving money right from the start
– $1 to correct an error at data entry
– $10 to correct errors after the fact with batch processing
– $100 is the cost of never correcting the error
• Benefits
– Eliminates time spent reconciling data
– Alleviates customer dissatisfaction
– Prevents loss of system credibility
– Eliminates system downtime
– Prevents some revenue loss
– Assists with compliance issues
Master Data Management

• Authoritative sources
• Data standards
• Metadata

[Diagram: BSM master data shared with Army, Navy, and Air Force systems]

…No New Stovepipes
Vendor Master Example

Data input:
• On-line DUNS validation
• On-line D&B parent linkage
• CAGE INTERNATIONAL
• Public web search
• 3rd-party data validation/certification: SBA, CAGE, USPS, Federal Reserve, IRS

Processing: > 500 business rules, > 500 edits

Data output:
• Daily extracts
• XML transactions to ERP, CCR tools, NOC, DLA Business Systems Modernization, and the Army Logistics Modernization Program

NO ANNUAL UPDATE = INACTIVE
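The vendor master is, in effect, a rules pipeline. A minimal sketch of what two such edits and the annual-update rule might look like; the field names and the specific rules are illustrative assumptions, not the actual CCR edits:

```python
from datetime import date, timedelta
import re

def validate_vendor(record: dict) -> list:
    """Apply a few illustrative edits to a hypothetical vendor record.
    The production system applies more than 500 business rules/edits."""
    errors = []
    # DUNS numbers are nine digits.
    if not re.fullmatch(r"\d{9}", record.get("duns", "")):
        errors.append("DUNS must be exactly 9 digits")
    # CAGE codes are five alphanumeric characters.
    if not re.fullmatch(r"[A-Z0-9]{5}", record.get("cage", "")):
        errors.append("CAGE must be 5 alphanumeric characters")
    # Slide rule: NO ANNUAL UPDATE = INACTIVE.
    last_update = record.get("last_update")
    if last_update is None or date.today() - last_update > timedelta(days=365):
        errors.append("no annual update: mark record INACTIVE")
    return errors

print(validate_vendor({"duns": "123456789",
                       "cage": "1ABC",                   # fails: too short
                       "last_update": date(2005, 1, 1)}))  # fails: stale
```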
DLIS Data Quality Process

• Knowledge exchanges with the experts – universities, Gartner, others
• Plan addresses People-Process-Technology:
– Management priority / visibility
– Program managers: overall responsibility
– Data stewards: analyze, measure, report, and support PMs
– Elaborate, fact-based methodology / measures
– Edits, profiling tools, and system checks
Data Quality Methodology

The Process
• Define – Identify data issues.
• Measure – Apply appropriate metrics (a sketch follows this list).
• Improve – Address needed enhancements.
• Implement – Initiate approved changes/corrections.
• Monitor – Re-measure for effectiveness.
• Report – Document status, improvements, and cost savings.
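As a concrete illustration of the Measure step, here is a minimal sketch that scores one data element for completeness and accuracy and maps each score onto the grading scale used in the assessments below; the sample records and the validity rule are hypothetical:

```python
# Sketch of the Measure step: score a data element, then grade it on
# the 90/80/70/60 scale shown on the assessment slides. All data and
# the set of valid codes are hypothetical.

VALID_SHELF_LIFE_CODES = {"0", "A", "B", "C"}   # illustrative domain

def grade(percent: float) -> str:
    if percent >= 90: return "A Green"
    if percent >= 80: return "B Yellow"
    if percent >= 70: return "C Orange"
    if percent >= 60: return "D Pink"
    return "E Red"

records = [{"shelf_life": "A"}, {"shelf_life": "Z"},
           {"shelf_life": None}, {"shelf_life": "B"}]

populated = [r["shelf_life"] for r in records if r["shelf_life"]]
completeness = 100 * len(populated) / len(records)
accuracy = 100 * sum(c in VALID_SHELF_LIFE_CODES for c in populated) / len(populated)

print(f"Completeness {completeness:.0f}% -> {grade(completeness)}")
print(f"Accuracy {accuracy:.0f}% -> {grade(accuracy)}")
```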
25 Jan 05 – Overall J6B quality assessment of FLIS on DLA-managed NIINs/DRNs where FLIS or BSM is the authoritative source

Process Step – Measure/Baseline
A – Accuracy, CN – Consistency, CR – Currency, CM – Completeness, NM – Not Measured

[Scorecard: each DQ issue is scored for A / CN / CR / CM / Overall against a 100% system/product benchmark (JTAV first-look status). Recoverable values: Shelf Life Code – A 85%, CN 100%, CR 100%, CM 86%, overall 92.7%; Jump to Code and Order of Use Code – NM; most measured reference-number and stock-class cells at 100%, with low readings of 63.1% and 87.7% among the reference-number rows; remaining cells NM.]

DQ issues assessed:
1. Shelf Life Code
2. Jump to Code
3. Order of Use Code
4. Demil Code
5. Precious Metal Indicator Code
6. Quantity Per Assembly
7. Federal Stock Class
8. Reference Numbers – BSM Data Cleansing (BR2) PID project
9. Reference Number Category Code
10. Reference Number Variation Code

Example of grading scale (percent or grade):
90-100% A Green
80-89% B Yellow
70-79% C Orange
60-69% D Pink
59%-0% E Red
Not Established – White

People – Process – Technology

AFTER: FLIS status of system/product DQ as baseline, with action plan, baselines, DCB benchmarks, trends, gaps, and quarterly changes

[Chart: Accuracy, Consistency, Currency, Completeness plotted against the Benchmark on a 0-100 scale]

Recommendations: Begin checking additional DRNs
Issues/Concerns:
PM/DS: J6B/Wendy Ball/Roy Marko/Lori Rowley
Participants: Wendy Ball/Lori Rowley
Root Cause Analysis

PM/DS: Mary Faber, Brad Williams / Lori Rowley
Participants: DLIS-SIQ/SXS
Revision:   Date:

Target Population – Example: FLIS
Process – Describe the process
Problems/Errors – Describe the problems or deficiencies found
Measurable observations – Annotate the findings
Issues/Needs/Concerns – Address any known conflicts regarding the suggested improvement, any methods or tools required, and overall concerns

[Flowchart: each identified data quality issue is asked, in turn, whether it is a training problem, a policy problem, a procedure problem, an internal system problem, or an interface problem. A “yes” leads to the checks “Does the training/policy/procedure/edit/interface exist?” and “Adequate/current?”; if not, establish or administer the adequate training, policy, procedure, edits, or interface required, with system/program approval/assistance. Issues that fit no branch are recorded as an unassigned error characteristic.]
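The triage flow reduces to a small decision procedure. A sketch of it as code; the categories and their order follow the flowchart, while the issue representation and outcome strings are illustrative assumptions:

```python
# Sketch of the root cause triage flowchart. An issue is tested against
# each problem category in turn; unmatched issues fall out as an
# "unassigned error characteristic". The dict shape is hypothetical.

CATEGORIES = ["training", "policy", "procedure", "internal system", "interface"]

def triage(issue: dict) -> str:
    for cat in CATEGORIES:
        if issue.get(f"{cat} problem"):
            if not issue.get(f"{cat} exists"):
                return f"establish {cat} (system/program approval/assistance)"
            if not issue.get(f"{cat} adequate/current"):
                return f"administer adequate {cat}"
            return f"correct within existing {cat}"
    return "unassigned error characteristic"

print(triage({"procedure problem": True,
              "procedure exists": True,
              "procedure adequate/current": False}))
# -> administer adequate procedure
```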
The Results

Dimension      Grade   Percentage   Methodology
Accuracy       G       68%          Complete extract, all NSNs
Consistency    G       82%          Complete extract, all NSNs
Currency       G       95%          Complete extract, all NSNs
Completeness   G       75%          Complete extract, all NSNs

Overall Grade:
DCB Recommendations:
Desired Improvement/Need – State the desired improvement or need
PM/DS: Mary Faber, Brad Williams, DLIS-SXS
Participants: SXS/SIQ
Date Briefed:
DQ Applied

• Identified the top five queries for the program
• Worked with the Program Manager to prioritize data elements
• Broke them into small pieces that can be measured
• Worked with the contractor
• Began building metrics
• Got downstream systems involved in reviewing/implementing solutions
Quality Assessment Oct 06

Process Step – Measure/Baseline
A – Accuracy, CN – Consistency, CR – Currency, CM – Completeness, NM – Not Measured

DQ issues assessed (asset visibility):
• 11 character in RF ITV
• Wrong van owner data
• No van owner data
• Leading zeros (see the sketch after this slide)
• Cross dock operations
• AV DODAAC
• A0 VLIPS flow to AV
• AS VLIPS flow to AV
• AE VLIPS flow to AV
• DR VLIPS flow to AV

[Scorecard: issues scored for A / CN / CR / CM / Overall; most cells NM. Recoverable readings include 45%, 30%, 4%, 48%, 49% (across all dimensions of one issue), 98%, 52%, 24%, and 0%.]

Grading scale:
90-100% A Green
80-89% B Yellow
70-79% C Orange
60-69% D Pink
59%-0% E Red
Not Established – White

Issues/Concerns: We have identified 5 of 117 public queries and researched 3 of the 105 data elements of those 5 queries.
DCB Recommendations:
Data Date: 1 Jun 06
Baseline Grade: %
PM/DS: Teresa Lindauer / Rich Hansen
Participants:
Date Briefed: 5 Jun 06
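The “leading zeros” issue above is a classic interface defect: an identifier stored as a number silently loses its leading zeros and no longer matches across systems. A minimal sketch with a hypothetical document number, not actual ITV data:

```python
# Sketch of the "leading zeros" defect class: storing an identifier
# in a numeric column drops leading zeros, so the same document no
# longer matches downstream. The value and width are hypothetical.

doc_number = "0012345678901"        # identifier with leading zeros

as_integer = int(doc_number)        # a numeric column strips the zeros
round_tripped = str(as_integer)     # "12345678901" -- two characters short

print(doc_number == round_tripped)  # False: downstream joins now fail

# Fix at the interface: treat identifiers as fixed-width text
restored = round_tripped.zfill(len(doc_number))
print(doc_number == restored)       # True
```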
Root Cause Analysis – Doc ID

System/Product: Asset Visibility / In-Process – Doc ID
Revision: 1   Date: 15 May 06
Error types: 4 – Interface, 5 – Procedure

[Flowchart: each Doc ID error is triaged through the same questions – training problem? policy problem? procedure problem? interface problem? Two branches were exercised:
• Interface problem – “The Doc IDs are not flowing from DMARS/SOMA to AV” (analysis step, DQ indicators 1 and 2) leads to a proposed solution.
• Procedure problem – “The Doc IDs A4x and ACx are not in the AV ODS” (analysis step, DQ indicators 3 and 4) leads to a proposed solution.
Each resolution is documented and the problem closed, improvement is monitored via metric, and anything unmatched is logged as “other error”.]
Template for Analysis of System Potential Issues
Analyze and Improve

DQ Indicator 1: Doc IDs not flowing from DMARS/SOMA to AV

Root Cause: AV’s “In Process” functionality is primarily designed to answer questions that support the requisitioning organization. AV processes AE1, AE2, and AE3 Doc IDs, which provide the requisitioning organization status by document number. The Doc IDs AE6, AE8, and AE9 provide depot-level status and do not flow to AV.

Current Status: Because we were unable to evaluate SOMA raw data at entry to AV, we performed the next best available comparison: AV to WEBVLIPS. WEBVLIPS and DMARS/SOMA are both DAASC products. Comparisons were made using equivalent queries to both AV and WEBVLIPS, as sketched below. The AV DS briefed the DQ indicator (1) results to the AV PMO.

Proposed Solution: Short term – keep the status quo, since AV’s “In Process” functionality is primarily designed to answer questions that support the requisitioning organization and it meets that requirement. Long term – combine the WEBVLIPS and AV capability into AV and eliminate WEBVLIPS.
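The comparison method described above, equivalent queries run against two systems and diffed, can be sketched as follows; the fetch functions and result sets are placeholders, not real AV or WEBVLIPS interfaces:

```python
# Sketch of the AV-to-WEBVLIPS comparison: run equivalent queries
# against both systems and diff the result sets. The fetch functions
# below return hard-coded stand-in data.

def fetch_av(query: str) -> set:
    return {"AE1-0001", "AE2-0002"}

def fetch_webvlips(query: str) -> set:
    return {"AE1-0001", "AE2-0002", "AE6-0003"}  # includes depot-level status

query = "status by document number"
av, vlips = fetch_av(query), fetch_webvlips(query)

print("In WEBVLIPS but not AV:", sorted(vlips - av))  # the gap to explain
print("In AV but not WEBVLIPS:", sorted(av - vlips))
```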
Using Standards to Ensure Quality

• The NATO Codification System is the foundation of an international standard for product and service descriptions
• The eOTD is an open standard for encoding product data through the life cycle of a product – from design through disposal
ISO 8000*
*in development

• Labeling (illustrated in the sketch below): Each data element must be tagged using a globally unique identifier that can be resolved to its terminology through a free (anonymous) internet interface
• Originating and cataloging organizations*: The originating and cataloging organizations for each data element must be identified using globally unique identifiers that can be resolved to contact information through a free (anonymous) internet interface
• Origination and cataloging date*: The origination and cataloging date of each data element must be specified
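A minimal sketch of what a data element labeled this way could look like; the identifier scheme, field names, and values are illustrative assumptions, not the ISO 8000 specification:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative shape for a labeled data element: a globally unique,
# resolvable concept identifier plus originating/cataloging org
# identifiers and dates. All identifiers below are hypothetical.

@dataclass
class LabeledDataElement:
    concept_id: str     # resolvable to its terminology
    value: str
    originated_by: str  # resolvable to contact information
    cataloged_by: str
    originated_on: date
    cataloged_on: date

element = LabeledDataElement(
    concept_id="0161-1#02-017331#1",  # eOTD-style identifier (made up)
    value="25 mm",
    originated_by="org:0056",
    cataloged_by="org:1410",
    originated_on=date(2006, 3, 1),
    cataloged_on=date(2006, 4, 15),
)
print(element.concept_id, "->", element.value)
```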
Requirements of ISO 8000

Information quality at the level of the organization
• Assessment of the level (grade) of information management capabilities
What can you do?

• Insist on access to quality information
• Participate in the development of Standard Identification Guides to define your data requirements
• Promote alignment and interoperability among standards efforts by insisting on standard data labeling (internally and externally)
• Encourage your data providers to prepare for ISO 8000
Data Quality
Questions?