Transcript [TEAM NAME]

PLCS Implementers Forum
Kick-Off Meeting
March 16-18, 2011
Agenda Wednesday, March 16, 8:30 - 5:00
Mount Pleasant & Berkley Rooms
• Welcome and Introductions 8:30 – 8:45 – Mary Mitchell
• Goals and Scope of the PLCS-IF 8:45 – 9:30 – Phil Rosche
• Identification of potential PLCS-IF participants, stakeholders, and sponsors – Mary Mitchell
• Break – 9:30 – 9:45
• Requirements Definition for PLCS-IF Breakouts 9:45 – 11:30:
  – Implementors – Phil Rosche
  – Users – Mary Mitchell
• Lunch – 11:30 – 1:00
• Requirements Prioritization – 1:00 – 2:30
  – Implementors – Phil Rosche
  – Users – Mary Mitchell
• Recap of Requirements Definition and Prioritization – 2:30 – 3:00
• Break – 3:00 – 3:15
• Use Case and DEX Identification and Prioritization – Scot Motquin – 3:15 – 5:00
Workshop Participants
• U.S. Army LOGSA
• NIST
• UK MOD
• Norwegian Defense LO
• Swedish Defence Materiel Admin
• ATI
• ICF International
• BAE Systems
• IBM
• Engisis
• Top Quadrant
• Naval Surface Weapons Systems
• Boeing
• Eurostep
• NG Shipbuilding & Tech Services
• Lockheed Martin
• DNV
• SAAB
• Transcendata
• Cassidian
• PILOG USA
• Costvision
PLCS-IF Kickoff Meeting
Ground Rules
• Some items and topics that come up may be put into the parking lot for future discussion
• Everyone should try to make note of action items
• Provide requirements and other important items in a short, succinct manner
Meeting Outcomes
• Implementor requirements
• User requirements
• Use Case and DEX Identification
• Public Web Site Requirements
• Input to System Design Document
• Input to Concept of Operations Document
• Initial Schedule
• Initial list of potential PLCS-IF participants, stakeholders, and sponsors
PLCS-IF Scope
• To Verify and Validate STEP AP 239
PLCS
– Verify and Validate the PLCS Standard
• PLCS Templates
• PLCS Reference Data
• PLCS DEXs
– Verify and Validate PLCS Software
Implementations
• Target group: STEP AP 239 PLCS DEX
Implementors/Developers
Goals
• Reduce barriers to and risk of implementations
• Provide feedback to PLCS standards developers based on implementation experience, including pre-standard testing
• Verify that software implementations support DEX-specific capabilities
• Verify interoperability of PLCS software
• Publish recommended practices
• Provide support for testing multiple formats including file
exchange and web services
• Out of scope:
– Duplication or overlap with DEX LIB functions (i.e., validation of
DEX)
– Any data that falls under Export Control
Objectives
• Provide a neutral and open environment to accelerate
development and implementation of PLCS adapters
• Establish a vendor independent forum
• Ensure Quality Assurance of, and build confidence in the
PLCS Standard and PLCS Software Implementations
• Ensure that users' requirements are satisfied through
real world scenario based testing
PLCS-IF Deliverables
• Flexible testing framework and an approach to how to
use it
• Test plan to include prioritized schedule
• Report Test Results (Executive Summary/ Statistics) to
PLCS TC
• Confidential Detailed QA Report to Vendor
• Recommended Changes to PLCS (Templates, DEXs,
STD) to OASIS PLCS TC
• Testing documentation to include:
– Recommended Practices to aid implementors
– Library of reference data (known good files)
– Use cases
– Test cases
Project Landscape
[Diagram: project landscape linking the user community (key users & service providers, pilot projects, e.g. downstream scenarios, the PSI/PDES user community), vendor development & testing, STEP standards, and the world through requirements, approaches & experience, results, and recommended practices.]
PLCS-IF in its context
[Diagram: the PLCS Implementor Forum in its context – it receives QA requests (DEX, template) from and reports issues and statistics to the OASIS PLCS TC, PDES Inc (SEDS), ISO (AP239 edition n), AIA, ASD, LOTAR, and the public; users and vendors supply software for testing and QA input.]
[Diagram: testing process cycle – START; Define User Requirements; Develop Test Case Documentation; Develop/Refine Translators; Output PLCS Files; Check Syntax and Structure; Perform Interoperability Testing and Semantics; Publish Testing Results and Recommendations.]
Requirements Definition for
PLCS-IF
The following notation for Involvement/Interested Parties is
used:
• None - All participants
• A – Acquiring officials
• D – Vendors COTS Software, 3rd party & Adapter
Implementers, Internal developers
• EU – End Users
• IA – Industry Info. Architects
• Q – Information/Data Quality
• S – Standards Developers (OASIS TC, SC4, etc.)
• T – Test operators
• X – DEX Template Builders (spec writers)
User Requirements
• Ability to evaluate syntax and semantics – D, IA, T
• Provide a broad open forum for all users and product categories – EU
• Compatibility/ability to consume the entire data set (process all types) – D
• Well documented, stable test data sets – A, D, IA, Q, X, T
  – Clear definition of the information assets
  – Repository of test data and other reference data
• Report whether a DEX is implementable / mapped to the right constructs – IA, Q, S, X
• Are the semantics of specific business needs out of scope? – D, EU, X, T
  (i.e., business rules not in the DEX; exchange agreement in addition to the DEX)
• Contract-ability – data to be exchanged (ability to exchange) – A, IA
• Provide a testing framework to allow ‘sandbox’ test environment - D, S,
T, X
• Forum to present issues uncovered during implementation
• Provide common tools / neutral testbed for system to system test - D, IA,
Q, S, T
• Definition of what is needed to participate in the test environment
Notes:
• Be Pragmatic (domain specific exchange has failed to this point)
• Suggested Approach: testing at the core template level before full
exchange is more efficient and establishes modular test capability
User Requirements (cont)
• DEX-specific entities captured, recommended implementation practices, transform between multiple formats (map P21, P28, DEXLIB2/SysML, possibly OWL) – D, IA, S, T
  – Production quality tools for mapping between different representation formats
  – State of the art tools / consistent implementation of underlying components
• Analysis tools for the test environment (to assess that an exchange is good) – D, Q, T, X
• Communicate desired features & priorities missing in COTS implementations
  – Develop a roadmap to implementation
  – Summarize business need priorities and enhance visibility to industry
• Production (API, adapters) implementation testing – D, T, X
• Communication / promote results, harmonization needs, clarify interpretation
• Recommended practices / measurement approach to “goodness” of data
  – Validate properties of DEX content, e.g., product structure, item ID, etc. (NG will provide audit LCA/CATIA)
• Develop standard error and result reporting formats
• Contribute to harmonization of DEXs to eliminate overlap (business drivers, easier to see what to implement) – IA, S, X
User Requirements (cont)
• Forum active engagement to resolve issues (operational model)
• Barriers: complexity, competitiveness, trust, proprietary
– Address by transparency and openness
• Ship test cases included (test scripts, use cases) – D, T, X
• Provide basic open source tools to support testing & implementers – D, T
• Provide tools to support test development and execution – D, T
• Define exchange processes (given data drives the underlying process)
  – A candidate structure for this may be the RosettaNet Partner Interface Process (PIP),
    http://www.rosettanet.org/dnn_rose/Support/ImplementingRosettaNetStandards/TechnicalFAQs/tabid/577/Default.aspx, or
  – AIA
• Assign Requirements to Specific uses /class of user
• Market data on capabilities - EU, A
• Knowledge base of past implementations
Implementor Requirements
• Support a variety of implementations/be open
to a variety of implementation platforms
• Initial scope will support Part 21 and 28
• Testing will always be based on a valid data
model and reference data
• Reference data will not only be from OASIS,
but may be from a variety of other sources
• Reference data should be tested as well
• Exchange files that use the reference data
should be tested as well.
Implementor Requirements (cont)
• Business usage requirements need to be
supported as well (business rules)
• File comparison for initial implementations
• PLCS-IF WILL NOT support ITAR and similar
requirements/system portability
• Support the testing of multiple files
• [support for associated files pointed to]
• Downloadable test suite
• Support AP 239 edition 2
• Support AP 233 implementations
• Part 28 edition two format for initial test cases
Other things to consider
• Flexible testing framework that is
configurable (e.g., support Business DEXs)
• Sandbox for other implementation forms,
and standard forms (data model
specifications)
• Ontologies and other logic based
approaches
• Impact of DEXlib 2 and Platform Specific
Model
Results of Syntax and Structure
Checking
• Instance count by type
• Data type checking
• Dangling pointers
• Orphans
• Mandatory attributes
• Boundary checks for attributes
• Valid Part 21/XML file
• Comparison between reports (bug fixing)
• Current scope is file exchange, with web services soon after
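These checks lend themselves to simple tooling. Below is a minimal sketch, in Python, of a few of them (instance count by type, dangling pointers, orphans) applied to a STEP Part 21 file; the regex-based parsing and the sample entities are illustrative assumptions rather than the forum's actual checker, which would work from the EXPRESS schema.

```python
import re
from collections import Counter

# Minimal sketch of a few of the structure checks listed above, run against a
# STEP Part 21 file. The regex-based "parsing" is illustrative only; a real
# checker would use a proper Part 21 parser and the governing EXPRESS schema.
ENTITY_RE = re.compile(r"#(\d+)\s*=\s*([A-Z0-9_]+)\s*\((.*?)\)\s*;", re.S)
REF_RE = re.compile(r"#(\d+)")

def check_p21(text: str) -> dict:
    instances = {}                     # id -> (entity type, raw argument string)
    for m in ENTITY_RE.finditer(text):
        instances[int(m.group(1))] = (m.group(2), m.group(3))

    counts = Counter(t for t, _ in instances.values())   # instance count by type

    referenced, dangling = set(), []
    for iid, (_, args) in instances.items():
        for ref in map(int, REF_RE.findall(args)):
            referenced.add(ref)
            if ref not in instances:                      # dangling pointer
                dangling.append((iid, ref))

    orphans = [iid for iid in instances if iid not in referenced]  # never referenced

    return {"counts": counts, "dangling": dangling, "orphans": orphans}

if __name__ == "__main__":
    # Hypothetical two-entity sample, not real AP239 population.
    sample = "#1=PRODUCT('P1','valve','',(#2));\n#3=PRODUCT_CONTEXT('',#4,'');"
    print(check_p21(sample))
```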
Future Requirements: Web Services
• Special interest group initially
• Define relationship with organizations like
OASIS, OMG, Open-Services, OAGIS
• Inputs: OMG PLM web services, OASIS
PLCS/PLM web services, semantic web
services, Engisis, Eurostep Boeing report,
Open-Services PLM reference model
(ALM), Jotne PLM web services, OAGIS
Types of Semantic Analysis
• Part 21 – EXPRESS rules
• XML – Schematron, XSLT
• Semantic web – SPARQL
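As an illustration of the semantic web branch, the sketch below uses Python with rdflib to run a SPARQL rule over an RDF/OWL rendering of exchange data; the ontology terms (ex:Part, ex:hasSerialNumber) are hypothetical stand-ins, not actual PLCS reference data identifiers.

```python
from rdflib import Graph

# Illustrative semantic check over an RDF/OWL rendering of exchange data.
# The vocabulary is hypothetical and stands in for whatever PLCS reference
# data / DEX-derived ontology is actually used.
data = """
@prefix ex: <http://example.org/plcs#> .
ex:valve1 a ex:Part ; ex:hasSerialNumber "SN-001" .
ex:valve2 a ex:Part .
"""

g = Graph()
g.parse(data=data, format="turtle")

# Semantic rule: every Part must carry a serial number.
query = """
PREFIX ex: <http://example.org/plcs#>
SELECT ?part WHERE {
    ?part a ex:Part .
    FILTER NOT EXISTS { ?part ex:hasSerialNumber ?sn }
}
"""
for row in g.query(query):
    print(f"Rule violation: {row.part} has no serial number")
```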
Use Case, DEX Identification and Prioritization
• Business model, governing principles, etc.
• Review templates and prioritize which ones are used most frequently
• Tools to help testers implement the standard
• Allow Edition 2
• Have reference data, but need simplified data first
• Resolve differences in Part 28 formats
• Verify the core reference data on DEXLIB
• Expand DEX development to include limited test data
• Potential candidates for testing implementations
– LOGSA DEX product breakdown for support, DEX for provisioning and
cataloging (6 months) PLM export, LOGSA Catalog, Import Legacy System
– LOGSA next GIA 0007 DEX
– UKMOD Quality of Data files; Legacy and Rolls Royce
– Business DEX ASD 1 and 3
– Ship data exchange to NAVSEA LEAPS 12 months from award
– LOTAR is not yet ready; expectation is 18 months
– ASD Tech Data Package
– MILSTD 31000
– FMV – DEX Item of Supply
– Multiple Users - Item Identification
DEX Development
[Diagram: concepts/functions, business exchange requirements, DEX specification templates, and the DEX schema.]
Summary of Use Case Discussion
• Found a lack of implementations that need to be tested
today
– Production efforts do exist but are not cleanly mapped into
AP239 and standard DEXs
• LOGSA DEXs (provisioning, product breakdown for
support)
– will have Reference Data 3rd quarter, 2011
– Expect an adapter to be available in 6 months
• Joint military programs need a minimum set of common exchanges to support deployed systems
• Shared logistics chain – NATO Logistic Functional Area
Services (LOGFAS), AIA members
Discussion (Cont.)
• Need to make decisions on what is common and
what flexibility is needed
• Partner Interface Process
• What issues are driving the need for interoperability?
– DEX harmonization should be an early priority (i.e.,
ensure entities are used the same way)
– Pick one DEX and Focus on this as the 1st Target
– Develop the Building Blocks and Reference Data
Agenda Thursday, March 17, 8:30 to ???
Mount Pleasant & Dorchester Rooms
• Transition to future editions of PLCS – Tor-Arne
• NAVSEA Test Cases – Ben Kassel
• Jump start building blocks
– Internal DEXs w/ 3rd party – Mark
– Generic small prototype, steps to develop template and business process
– Archiving of data also important; schema at point of archiving, EN 9300
– LOTAR Preservation planning & Governance
– Configuration management
• Deliverables - Defer
• Public Web Site Requirements Definition - Mary Mitchell
• Testing Reference Data, PLCS Implementation Approaches – David Price
• Update from PLCS TC meeting – Howard Mason
• Open forum – Simon Frechette
A Test Case Framework for
PLCS-IF
Ben Kassel -NAVSEA
A testcase framework
for the PLCS-IF
Ben Kassel
Naval Surface Warfare Center
Carderock Division
Distribution Statement A : Approved For Public Release; Distribution is unlimited
Digital Product Model Data
A brief review
Product Model data is the combination of 3D geometry and non-graphic attributes to define ship
objects such as a piece of equipment, deck, bulkhead, etc. Product Model data can be organized to
define interim products and ultimately the entire ship.
Part & System Definition (Caterpillar
3512, Starboard Main Engine,
Propulsion System)
Design Definition (12 cylinder 4 stroke
diesel engine )
Physical (Geometry, material
connections, etc.)
Engineering Definition (1175 HP,
6464kg, 170mm bore, 190mm
stroke)
Process Definition (Starting
instructions, shaft alignment)
Logistics Support (FGC, SCLSIS, etc.)
Advocates anticipate substantial economies from Product-Model-based design, construction, and service-life support activities due to better integration and reduction of engineering effort to locate, verify, and transform information.
Digital Product Model Data
… it's more than just design and construction
The DPM should be the primary source of data for all pre milestone B activities.
The DPM should be used by NAVSEA to validate the design during the Detail
Design and Ship Production phases.
The DPM should be the authoritative source of data in support of the Situation
Incident Room upon delivery of the ship.
The DPM should be the authoritative source of data for technical manuals,
training.
[Chart: DoD 5000 / Navy ship acquisition lifecycle – program initiation at MS A or B; exploratory design, force architecture, concept/feasibility studies, and analysis of alternatives; capabilities development and critical technology development; preliminary and contract design; detail design, ship production, and lead/follow ship construction; design readiness and full rate production decision reviews; trials and DT&E/LFT&E/IOT&E/FOT&E; IOC and FOC; life cycle operations & support, sustainment, and disposal; funded through RDT&E, SCN, and O&MN (BA 3, BA 4, BA 5); governed by the ICD, CDD, and CPD.]
The Digital Product Model can support the entire ship's lifecycle.
Uses of the Digital Product Model
Rear Admiral Eccles affirmation
NAVSEA Instruction 9040.3A
Acquisition and Management of Product Model and other Technical Data
Ship and ship system design, acquisition, and fleet support
activities shall procure and accept product model data in accordance
with ISO 10303, Standard for the Exchange of Product model data
(STEP) format, native Computer Aided Design (CAD) files, and/or
Leading Edge Architecture for Prototyping Systems (LEAPS) format.
This should be based on solutions that provide the best technical
and cost performance as determined by a NAVSEA business case
analysis.
Provide guidance for the acquisition of product model and related technical data.
This instruction applies to product models and technical data derived directly from the
product model such as engineering analysis, bills of material, and drawings.
This instruction implements the DON POLICY ON DIGITAL PRODUCT/TECHNICAL
DATA issued in 2004 and the NAVSEA SHIP DESIGN AND TOOLS GOALS issued in
2008.
This instruction does not specify a format explicitly, but instead requires Navy
stakeholders to reach consensus on the definition and delivery of product model
data.
Balances cost, data utility, and data exchange technology.
Product Model Data Definition and Exchange
A NAVSEA perspective
A two level approach for the exchange of product model data
First level : Support configuration management, logistics support, provisioning,
spares, and repairs through the use of STEP for geometry, product
structure, non graphical attributes, and to manage configuration items
of the as-built / as-maintained ship.
Second level : Deliver the as-designed class model of
1) molded forms suitable for defining a general arrangement
2) scantling level of detail of structure to support structural (and
other types of) analysis
3) functional distributed systems model (i.e. path, components, and
connections)
4) compartmentation, including accesses, opening, and tightness
5) plates, stiffeners, brackets, collars, and other structural
components as parts
6) distributed system components, fittings, and equipment as parts.
Implementing the Policy
Ships Specification 098 – 3D Product Model
The Digital Product Model shall be delivered in both a
native and neutral format. The neutral format shall
comply with the Department of the Navy Policy on
Digital Product/Technical Data dated 23 October 2004.
ISO 10303 Part 214 shall be used to define the Digital
Product Model geometry. ISO 10303 Part 239 shall be
used to define product structure, the relationship
between objects, and configuration management data.
The Builder shall provide a list in the PPM1 of each data
exchange specification (DEX) that will be used to
support the ISO Part 239 exchange. In the event the
contractor can demonstrate the need for an additional
DEX, then the contractor shall develop a NAVSEA
approved DEX.
Archiving Ship Product Model Data
Mining ShipConstructor Product Model Definition
[Figure: native ShipConstructor® data, with extracted non-graphical attributes and extracted geometry.]
A testcase framework for PLCS-IF
Test case ultimate depth and breadth
A testcase framework for PLCS-IF
A small distributed system
Electrical Components ● Mechanical Components ● Piping Components
Systems ● Subsystems ● Assemblies
A testcase framework for PLCS-IF
A sample USE case
(Product Operational Information)
The useful life of gate valves used in fuel oil service is 20 years. Gate valves between 10 and
20 years old should be inspected periodically, and their condition reported. This is not a critical
issue so the repair does not need to be made immediately. However, planning is required to
purchase new seals or replacement valves depending on their condition. From the product
model repository
Attributes to support USE case
Monitor Gate Valve condition (a sketch of this logic follows the attribute lists below):
1. Locate all of the valves in fuel oil service
2. Determine the date the valve was placed in service
3. If the valve is more than 20 years old, flag it to be replaced
4. If the valve is between 10 and 20 years old, issue a service bulletin to inspect for signs of seepage and report back
5. For all reports that indicate seepage, schedule valve replacement during the next scheduled maintenance period
6. Determine the next time the ship has maintenance scheduled
7. For all reports that indicate no problem, make a notation in the ship's maintenance logs
Ship: Hull Number; Date put in service; Location; Availability
System Attributes: Type (SWBS?); Fluid Type
Part Attributes: Seal type; Model number; Serial number; Seal material; Install date; Overhaul date; Size
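A minimal sketch of the monitoring logic above, using the listed attributes; the part records, dates, and helper function are illustrative placeholders for queries against a real product model repository.

```python
from datetime import date

# Sketch of the gate-valve monitoring steps above. The record layout mirrors
# the attribute lists (fluid type, install date, seal data); repository access
# is stubbed out with an in-memory list of hypothetical parts.
TODAY = date(2011, 3, 16)

parts = [
    {"id": "V01", "type": "Valve Gate", "fluid": "fuel oil",
     "install_date": date(1988, 6, 1), "seal_type": "PTFE"},
    {"id": "V02", "type": "Valve Gate", "fluid": "fuel oil",
     "install_date": date(1996, 9, 15), "seal_type": "PTFE"},
    {"id": "V03", "type": "Valve Gate", "fluid": "sea water",
     "install_date": date(2005, 1, 10), "seal_type": "EPDM"},
]

def age_in_years(installed: date) -> float:
    return (TODAY - installed).days / 365.25

replace, inspect = [], []
for part in parts:
    if part["type"] != "Valve Gate" or part["fluid"] != "fuel oil":
        continue                      # step 1: only valves in fuel oil service
    age = age_in_years(part["install_date"])   # step 2: date placed in service
    if age > 20:
        replace.append(part["id"])    # step 3: past useful life, flag for replacement
    elif age >= 10:
        inspect.append(part["id"])    # step 4: issue inspection service bulletin
# Steps 5-7 (scheduling replacements and log notations) would follow from the
# inspection reports and the ship's maintenance schedule.

print("Replace:", replace)   # -> ['V01']
print("Inspect:", inspect)   # -> ['V02']
```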
A testcase framework for PLCS-IF
Scope like stuff
Provide sufficient data to develop a product model and collect all the “other technical
data” necessary to perform the aforementioned task. The test shall be of sufficient
breadth and depth that the output products required to support the aforementioned
tasks can be generated automatically from the product data. Additionally, there will be
sufficient information to associate the other technical data to the product data and to
allow simultaneous configuration management of the product data and other technical
data.
The test case was developed to exercise:
• Parts
• Assemblies
• Systems
• Configuration Management
• Multiple product structures
• Workflow
• Revision control
• Parts placed in a system
• Parts placed in an assembly
• Assemblies placed in a system
• Systems divided into subsystems
The test case includes information required to:
• Design the system
• Develop the work package to assemble a system
• Purchase components
• Maintain the system
• Repair the system
• Operate the system
A testcase framework for PLCS-IF
Bill of Material (well almost)
[Flattened spreadsheet: for each placed instance, the original table gives the Component Item Name, a position (columns T0, T1, T2), an orientation (columns R00 through R22, apparently the elements of a 3x3 rotation matrix), and, for pipe pieces, pipe end x/y/z and length. For example, E01 (Tank, 700 gallon) has T = (48, 0, 0) and R rows (0, -1, 0), (1, 0, 0), (0, 0, 1). The instance / component name pairs are:]
E01 Tank, 700 gallon
E02 10 GPM Pump and Motor Unit
E03 Tank, 200 gallon
E04 Breaker Box, 300 amp
E05 Strainer Y, 150lb, 1in, FF
F01 Elbow, SCH40, BW, 1in
F02 Elbow, SCH40, BW, 1in
J01 Gasket 1/16in, 150lb, 1in
J02 Flange, 150lb, FF, 1in
J03 Flange, 150lb, FF, 1in
J04 Gasket 1/16in, 150lb, 1in
J05 Gasket 1/16in, 150lb, 1in
J06 Flange, 150lb, FF, 1in
J07 Flange, 300lb, FF, 1in
J08 Gasket 1/16 in, 300lb, 1in
J09 Gasket 1/16 in, 300lb, 1in
J10 Flange, 300lb, FF, 1in
J11 Flange, 150lb, FF, 1in
J12 Gasket 1/16in, 150lb, 1in
J13 Gasket 1/16in, 150lb, 1in
J14 Flange, 150lb, FF, 1in
J15 Flange, 150lb, FF, 1in
J16 Gasket 1/16in, 150lb, 1in
P01–P07 Pipe, SCH40, BW, 1in
V01 Valve Gate, 150lb, 1in, FF
V02 Valve Gate, 150lb, 1in, FF
W01 Wire
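Assuming the T0–T2 columns are a translation vector and R00–R22 the rows of a 3x3 rotation matrix (an inference from the column names, not stated in the source), a placement can be applied as in the short sketch below.

```python
# Sketch of how the per-instance placement columns could be applied, assuming
# T0-T2 are a translation vector and R00-R22 the rows of a 3x3 rotation matrix
# (an assumption based on the column names, not confirmed by the source data).

def place_point(p, T, R):
    """Transform a point p from part coordinates into model coordinates."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + T[i] for i in range(3))

# Example: instance E01 (Tank, 700 gallon) from the table above.
T_e01 = (48.0, 0.0, 0.0)
R_e01 = ((0.0, -1.0, 0.0),
         (1.0,  0.0, 0.0),
         (0.0,  0.0, 1.0))

print(place_point((1.0, 0.0, 0.0), T_e01, R_e01))   # -> (48.0, 1.0, 0.0)
```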
A testcase framework for PLCS-IF
Systems of Parts
A testcase framework for PLCS-IF
Assemblies of Parts
A testcase framework for PLCS-IF
Parts Libraries
A testcase framework for PLCS-IF
Catalogs
A testcase framework for PLCS-IF
Drawings
A testcase framework for PLCS-IF
Ordering Information
A testcase framework for PLCS-IF
Configuration Management
A testcase framework for PLCS-IF
Configuration Management
A testcase framework for PLCS-IF
Connections and Connectors
Connections table (From, To, Node 1 UID, Node 2 UID, Unique ID, Connection Description):
• J03 Port 1 to J04 Port 2; distPort_20, distPort_23; distJoint_5; "J03 Port 1 - J04 Port 2"
• J04 Port 1 to V01 Port 1; distPort_22, distPort_62; distJoint_6; "J04 Port 1 - V01 Port 1"
• V01 Port 2 to J05 Port 1; distPort_63, distPort_24; distJoint_7; "V01 Port 2 - J05 Port 1"
• J05 Port 2 to J06 Port 1; distPort_25, distPort_26; distJoint_8; "J05 Port 2 - J06 Port 1"
[Diagram: joints distJoint_5 through distJoint_8 chaining the ports of J03, J04, V01, J05, and J06.]
Node table (Component Name, Instance, Node, Node Unique ID, xPart, yPart, zPart, xModel, yModel, zModel): lists the port locations, in part and model coordinates, for flanges J03 and J06, gaskets J04 and J05, and gate valve V01 (ports distPort_20 through distPort_27 and distPort_62/distPort_63).
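One possible way to represent this port/joint connectivity in code is sketched below; the class and field names are illustrative choices, and only the identifiers (distJoint_5, distPort_20, etc.) come from the test case data.

```python
from dataclasses import dataclass

# Illustrative representation of the port/joint connectivity in the table
# above: each connection (distJoint_*) joins exactly two ports (distPort_*),
# each of which belongs to a component instance. The classes are just one
# possible modelling choice, not the test case's actual data model.

@dataclass(frozen=True)
class Port:
    uid: str          # e.g. "distPort_20"
    instance: str     # e.g. "J03"
    name: str         # e.g. "J03 Port 1"

@dataclass(frozen=True)
class Connection:
    uid: str          # e.g. "distJoint_5"
    node1: Port
    node2: Port

    def describe(self) -> str:
        return f"{self.node1.name} - {self.node2.name}"

j03_p1 = Port("distPort_20", "J03", "J03 Port 1")
j04_p2 = Port("distPort_23", "J04", "J04 Port 2")
joint5 = Connection("distJoint_5", j03_p1, j04_p2)
print(joint5.uid, ":", joint5.describe())   # distJoint_5 : J03 Port 1 - J04 Port 2
```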
Day 2 – Jump start
User Group
• Underlying technology to facilitate proper implementation
• Joint programs need a minimum of common exchanges
to support deployed systems with local support
• Need to make decisions on what is common and what
flexibility is needed – where do we need interoperability
– DEX harmonization should be an early priority (use entities the
same way)
– Pick 1 DEX and focus on this as 1st target
– Building blocks (Ben’s example of ICF)
• Shared logistics chain – LOGFAS, AIA
• Sample simple DEX and data (Xenia) to remove
implementor barriers
Day 2 – Jump start
User Group
• Context, NG developed:
  1. Internal DEXs w/ 3rd party – Mark
  2. Provide a generic small prototype (dummy business level), with steps to develop the template and business process
  3. Archiving of data; must save the schema at the point of archiving. See EN 9300 (Howard recommended looking at LOTAR Preservation Planning & Governance)
• Navy will use PLCS for delivery of data to the Government:
  1. Configuration management data in the DEX; need to know version and applicability
  2. Ask industry to select standard DEXs for delivery of data (ask them to use those that exist, but authorize them to build what they need)
  3. Preference is to release to the world and work with the standards body to harmonize
• Issued six months ago; award not expected for 18 months
• Clearing house for DEXs and actual uses. The PLCS-TC DEXLIB is intended to be used by those building DEXs; PLCS-resources.org
• Capture business models and map to available DEXs
• LOGSA: 2 DEXs (provisioning, product breakdown for support) with reference data in the 3rd quarter; adapter available in 6 months
• LOGSA PLCS-IF Forum inception phases: 1st, 6-8 weeks for the ConOps; 2nd, tools & architecture; 3rd phase, stand up
Day 2 – Jump start
User Group
• NDLO
  – Exploit the NAVSEA test case framework and data
  – DEXs should be delivered with real sample data
• Jerry – pick one DEX to start with for fleshing out testing
  – Focus on interoperability rather than pure compliance
  – Credible path for commercial translators
  – Pursue the plumbing example
  – Work out the ConOps rules based on this example (capture the scenario)
• UID example
  – Role based implementation guidance
  – Howard Mason will streamline the AIA Scenario Process as a basis
• LOTAR Validation & Verification work may contribute to Quality of Data
• Can the forum be used to generate business cases for why AP239/DEX (value measures)?
Day 2 – Jump start
Implementors Group
• Address implementation issues
• Look for technology to facilitate
development and testing
• Develop sample DEXs and data sets for training and education (sample apps)
• PLCS for Dummies
• Spread the knowledge of PLCS – Need
more experts
Day 2 – Jump start - Implementors Group
What artifacts can we give developers to assist their internal testing, prior to
formal PLCS-IF testing?
Phase II
• Test cases
  – Good DEX
  – Part 28 (P21 and OWL derived from P28) example files
  – Reference data
  – Description of the test case with pictures
• A few simple business rules (from the DEX and implementation agreements) in English
• Rules also written in EXPRESS, Schematron, SPARQL, and Java
  – Sample code (macros in a spreadsheet)
  – Source data in a spreadsheet
  – Target “model” and resulting data
  – Style sheets for presenting P28 files in an attractive, human readable manner
• “How to read a DEX” – for implementors
• List of validation tools for P21 and P28 files, with guidance
• Open information sharing forum focusing on implementation (e-mail and Wiki)
• Links to tools (open source and commercial) to aid implementors
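As a rough stand-in for the "source data in a spreadsheet" and target "model" items above, the sketch below reads spreadsheet-style rows from CSV, builds a small in-memory target model, and applies one business rule; the column names and the rule are hypothetical examples, not artifacts the forum has defined.

```python
import csv
import io

# Stand-in for the spreadsheet-to-target-model idea above: read rows exported
# from a spreadsheet (CSV here) and build a tiny target model keyed by part id.
# Column names are hypothetical examples.
source_csv = """part_id,description,serial_number
V01,"Valve Gate, 150lb, 1in, FF",SN-1001
V02,"Valve Gate, 150lb, 1in, FF",
"""

target_model = {}
for row in csv.DictReader(io.StringIO(source_csv)):
    target_model[row["part_id"]] = {
        "description": row["description"],
        "serial_number": row["serial_number"],
    }

# A simple business rule in English: "every part must have a serial number".
missing = [pid for pid, rec in target_model.items() if not rec["serial_number"]]
print("Parts missing serial numbers:", missing or "none")   # -> ['V02']
```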
Day 2
Phase III
• Application built on top of target or source data
• “Hello World” DEX (how to build a simple DEX)
• Access to PLCS implementation “experts”
• Take over and enhance “Instructions for
Software Implementors” (On DEXlib currently)
• Web based tool independent PLCS training
• Get educational establishments involved
Day 2
1. Instance count by type
2. Data type checking
3. Dangling pointers
4. Orphans
5. Mandatory attributes
6. Boundary checks for attributes
7. Valid Part 21/XML file
8. Where rules
9. Uniqueness
10. Comparison between reports (bug fixing)
11. Current scope is file exchange, with web services soon after
Public Web Site Requirements Definition
• Guidance and best practices
• Implementation guidelines
• Collaboration Space for Implementors
• Source Forge
Agenda Friday, March 18, 8:30 to 12:00
Mount Pleasant & Dorchester Rooms
• PLCS TC meeting – 8:30 – 9:00
• DEXLIB 2 – Tor-Arne Irgens
• PLCS-IF Roles and Responsibilities – Phil Rosche
• Schedule and Test Plan discussion and development
• Break – 10:00 – 10:15
• Testing System Walk-through from an Implementor Perspective – Phil Rosche
• Wrap-up, Next Steps, and Action Items –
11:00 – 12:00 – Phil Rosche
Testing Reference Data,
PLCS Implementation
Approaches – David Price
See Attachment A
DEXLIB 2
Tor-Arne Irgens
See Attachment B
PLCS-IF Roles and
Responsibilities
Phil Rosche
PLCS-IF Actors
Implementor – one who writes code and develops software
User – A system user from an organization that has or will
implement PLCS
Stakeholder – An organization that has “skin in the game”
(OEMs, tiers of the supply chain, etc.)
Sponsor – Any organization that provides funding and
resources for the development and sustainment of the
PLCS standard and/or the PLCS-IF
PLCS-IF Organization
[Organization chart: the PLCS-IF Governing Board oversees the PLCS-IF Forum Director, who is advised by the PLCS-IF Advisory Board (e.g., representatives from PDES, Inc., OASIS TC, ASD, AIA) and supported by Management Infrastructure & Support and Technical Infrastructure & Support.]
PLCS-IF Key Roles
• Governing Board – Made up of key stakeholders to provide strategic guidance and operational direction
• Forum Director – Responsible for organizing and facilitating the testing activities of the forum, and for internal and external communication and marketing
• Advisory Board – Provides tactical advice and recommendations to the forum; source of requirements and scenarios
• Management Infrastructure and Support
  – Supports the Forum Director by providing cost and schedule reports
  – Provides day to day support to the Forum Director
• Technical Infrastructure and Support
  – Development and maintenance of testing tools and infrastructure
  – Data model expert to provide guidance to implementors
PLCS-IF Membership
• PLCS-IF is open to any organization developing software based on the PLCS standard and any organization providing support to the PLCS-IF
• Participants may include, but are not limited to:
  – Department or Ministry of Defense Organizations
  – OEM software vendors
  – First and lower tier commercial and defense contractors
  – Middleware software vendors
• Ongoing Forum operations and maintenance will be supported through a Membership and Participant fee structure similar to the CAX-IF
Schedule and Test Plan
discussion and
development
Revised timeline
• Inception Phase I (3 months): Project Plan; Requirements & Systems Design; Concept of Operations; Meetings
• Build Phase II (4 months): Establish Public Website; Develop syntax & structure checker; DEX identification and prioritization; Develop business case; Create sample data sets; Develop/prioritize use cases; Develop initial testing documentation; Identify and recruit stakeholders
• Build Phase III (5 months): Establish Secure Private Website and Infrastructure; Develop semantic analyzer; Integrate RDL; Develop statistics and results module; Gather and validate test data, create data sets; Develop testing documentation; Plan one round of testing and verify the test system against the concept of operations
PLCS-IF Formal Testing
[Diagram: System User A logs in, selects a test, and submits files (P21 or P28, plus reference data); the files pass a syntax & structure check, are stored in the file vault, and a results report is produced. System Users B, C, and D exchange DEX 1 / DEX 2 files (P21, P28) through semantic analysis, which produces its own results report.]
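The flow in the diagram can be read as a simple pipeline; the sketch below mirrors the step names, with stub functions standing in for the real syntax checker, file vault, and semantic analyzer.

```python
# Sketch of the formal-testing flow shown above as a simple pipeline: a user
# submits a file for a selected test, it passes a syntax & structure check,
# is stored in the file vault, then goes through semantic analysis, and a
# results report is produced. All steps are stubs, not the forum's real tools.

def syntax_and_structure_check(path: str) -> list[str]:
    return []                      # issues found; empty means the file passed

def store_in_file_vault(path: str) -> str:
    return f"vault://{path}"       # returns a vault reference

def semantic_analysis(vault_ref: str) -> list[str]:
    return []                      # semantic findings

def run_test(user: str, dex: str, path: str) -> dict:
    issues = syntax_and_structure_check(path)
    if issues:
        return {"user": user, "dex": dex, "passed": False, "issues": issues}
    vault_ref = store_in_file_vault(path)
    findings = semantic_analysis(vault_ref)
    return {"user": user, "dex": dex, "passed": not findings, "issues": findings}

print(run_test("User A", "DEX 1", "sample_p21_file.stp"))
```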
PLCS-IF Informal Testing
[Diagram: the same flow as the formal testing system above, with a boundary indicating the scope of informal testing.]
Potential to Draw From Other Standards
Test Environments
PLCS Implementers Forum
Path Forward
• Phase 1 – Develop detailed plan, test
system design, and concept of operations
• Phase 2 – Develop initial infrastructure for
Informal Testing
• Phase 3 – Expand infrastructure to
support Formal Testing
• Phase 4 – Commence Formal Testing
• Inception – Phase 1: Collect requirements; develop detailed plan
• Build – Phase 2: Develop infrastructure, tools, and resources for Informal Testing
• Build – Phase 3: Develop infrastructure, tools, and resources for Formal Testing
• Develop business case; ID and get stakeholders on board; pursue funding
• Sustain – Phase 4: Commence Formal Testing
Identification of potential
PLCS-IF participants,
stakeholders, and sponsors
ADD from NOTES
STEP AP 239 PLCS Implementers Forum
Scope:
• To Verify/Validate STEP AP 239 PLCS
  – V/V the PLCS Standard
    • PLCS Templates
    • PLCS Reference Data
    • PLCS DEXs
  – V/V PLCS Software Implementations
• Target group: STEP AP 239 PLCS DEX Implementers/Developers

Objectives:
• Provide a neutral and open environment to accelerate development and implementation of PLCS adapters
• Establish a vendor independent forum
• Ensure Quality Assurance of, and build confidence in, the PLCS Standard and PLCS Software Implementations
• Ensure that users' requirements are satisfied through real world scenario based testing

Potential Interested Participants:
• US ARMY (LOGSA)
• US NAVY (NAVSEA)
• Jotne
• Eurostep
• LSC
• Boeing
• IBM
• EADS
• LOTAR

Deliverables:
• Report Test Results (Executive Summary/Statistics) to PLCS TC
• Confidential Detailed QA Report to Vendor
• Recommended Changes to PLCS (Templates, DEXs, STD) to OASIS PLCS TC
• Recommended Practices to avoid potential conflicts, provided to OASIS PLCS TC for action
Wrap-up, Next Steps, and
Action Items
Phil Rosche