The Pinocchio Project and Verifying Complex Multi-Material Flow Simulations


LA-UR–05–6750
Requirements and Issues for Automation of Complex Multi-Material Flows
Presented at the Workshop on Numerical Methods for Multi-Material Fluid Flows
St Catherine’s College, Oxford, UK, 8 September 2005
George M. Hrbek
X-8 Computational Science Methods
Los Alamos National Laboratory
The Pinocchio Project
 Verification module of Quantitative Simulation, Analysis, and Testing (QSAT)
Mission of QSAT
 Identify, create, and maintain products, and provide services that aid in analyzing and certifying coupled-physics simulation codes.
Products are Analytic Test Functions (ATFs)
 ATFs are analytical tests performed on physics-simulation codes
 Essential to our simulation efforts
   Aid in interpretation and application of relevant theory
 ATFs include
   Code- and calculation-verification analyses (e.g., convergence studies)
   Error-ansatz characterization (formulating discretization-error models)
   Sensitivity analysis
   Uncertainty quantification
Properties of ATFs
 Generally ATFs are
   Mathematically complex
   Require multiple procedural steps
   Each procedure requires specialized software
 ATFs may require significant computing resources to generate the underlying or foundational simulations
 ATFs are limited by their complexity and computationally intensive nature.
   -> Automation can help
 Frequent and independent testing is necessary throughout the development, assessment, and deployment of physics-simulation codes.
   -> Automation is essential
QSAT Focus Areas
 Apply cutting-edge analysis methods incorporated in the ATF modules to interpret experiments through simulation
   Demonstrates importance to experimental and simulation efforts
   Aids in interpretation and application of relevant theory
 Automate as appropriate
   Scripting
 Streamline and merge similar elements
   Common – spawn jobs, manage results, spawn ATFs, write reports
   Unique – determine number of jobs, input templates, create ATF analysis tools (see the sketch below)
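As a minimal sketch of this common/unique split (the class and method names here are hypothetical illustrations, not the actual QSAT interfaces), a common driver can own job spawning, result handling, and report writing, while each ATF supplies only its unique pieces:

```python
from abc import ABC, abstractmethod

class ATFDriver(ABC):
    """Common driver: spawn jobs, manage results, run the ATF, write a report."""

    def run(self):
        jobs = self.plan_jobs()                      # unique: how many jobs, which inputs
        results = [self.spawn(job) for job in jobs]  # common: job submission
        analysis = self.analyze(results)             # unique: the ATF analysis itself
        self.write_report(analysis)                  # common: report generation

    def spawn(self, job):
        # Common element: submit one simulation job and collect its results.
        # (Placeholder; a real system would invoke the batch scheduler here.)
        return {"job": job, "output": f"results_for_{job}"}

    def write_report(self, analysis):
        # Common element: uniform report generation.
        print("ATF report:", analysis)

    @abstractmethod
    def plan_jobs(self):
        """Unique element: decide the number of jobs and fill input templates."""

    @abstractmethod
    def analyze(self, results):
        """Unique element: the specific ATF analysis (e.g., a convergence study)."""

class ConvergenceStudy(ATFDriver):
    def plan_jobs(self):
        return [f"grid_{n}" for n in (32, 64, 128)]  # one job per grid resolution

    def analyze(self, results):
        return f"{len(results)} runs completed; compute observed order here"

ConvergenceStudy().run()
```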
QSAT Focus Areas (continued)
 Extract common and unique processes across
   Codes
   Platforms
   Organizations
 Abstract processes and create a common code framework
Requirements for QSAT ATF Modules
 Invoked through CTS (Collaborative Testing System)
 Cross-platform compatibility
 Works with ALL ASC and Legacy projects
 Meets or exceeds best software-engineering practices
   Documentation
   Maintainability
   Code reuse
Requirements for QSAT ATF Modules (continued)
 Requires uniform test problems and methods
   Uniform application of ATFs, coded exact analytics, and report-generation software
   Standard template for adding new problems
 Increases functionality for ATF analyses and report generation
Operational Requirements
 Run in automatic and interactive modes
 Improves frequency and ease of
   Use
   Reporting
   Archiving
   Traceability
 Uses good software practices
   Increases reliability
   Maintainability
   Addition of upgrades and new features
Fig. 1. Flowchart of Automatic Verification
Results of a
Physical
Simulation
from CTS
Grid Points
are
extracted
Exact
Analytic
Program
Perform Verification Analysis
Exact Analytic
Solutions for
Grid Points
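A minimal sketch of the loop in Fig. 1 (the dump layout and field names are illustrative assumptions, not the actual CTS format):

```python
import numpy as np

def verify(simulation_dump, exact_solution):
    """Fig. 1 as code: extract grid points from a simulation result,
    evaluate the exact analytic solution at those points, and compare."""
    x = simulation_dump["grid_points"]          # extracted grid points
    computed = simulation_dump["temperature"]   # simulated field values
    exact = exact_solution(x)                   # exact analytic program
    error = computed - exact
    # Verification analysis: discrete error norms
    return {
        "L1": np.mean(np.abs(error)),
        "L2": np.sqrt(np.mean(error**2)),
        "Linf": np.max(np.abs(error)),
    }

# Hypothetical usage with a fabricated dump and a made-up exact solution:
dump = {"grid_points": np.linspace(0.0, 1.0, 11),
        "temperature": np.linspace(0.0, 1.0, 11) ** 2 + 1e-3}
print(verify(dump, exact_solution=lambda x: x**2))
```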
Fig. 2. The Pinocchio Project (Verification): a specialized decision module (how many jobs?) spawns jobs from a templates library; the jobs run, results are stored, and a verification control deck is written; the verification analysis module (the Pinocchio Project) reads the simulation results and the control deck, and a report is written.
Fig. 3. The Pinocchio Project – Major Modules and their Function
 Jiminey – scripts
 Geppetto – analytic solutions
 Figaro – data acquisition and parsing tools
 Collodi – automation tools and scripts repository
 Cleo – verification tools
Problems Automated to Date
 Crestone Project: six of the seven tri-lab test problems.
   Frank Timmes T-DO, Jim Kamm X-7, and Ron Kirkpatrick X-3
   Noh
   Sedov
   Reinicke Meyer-ter-Vehn
   Su-Olson
   Coggeshall 8
   Mader
 Shavano Project: one of the seven tri-lab test problems.
   Jim Kamm X-7 and Jerry Brock X-7
   Noh
Fig. 4. The General ATF Flowchart: the ATF modules (verification via the Pinocchio Project, validation, uncertainty, regression, …) feed specialized decision modules (how many runs?); a GUI spawns jobs from a templates library; simulation jobs and results are managed through the Collaborative Testing System (CTS); the simulation results and an ATF control deck are produced, and a report is written.
How do we automate an ATF?
 Recognize that all code projects seem to implement common ATFs in unique ways.
   Separate serendipity from real code-dependent requirements (e.g., data structures, file formats)
   Identify real code-dependent requirements that affect implementation of ATFs
 Break down ATFs into steps or processes that are clearly defined and understood by an independent agent
 Drill down into each process and identify it as either a common or a unique element.
What elements should we automate in an ATF?
 ONLY the UNIQUE elements particular to the specific ATF analysis
   Which jobs to run?
   Details of the ATF analysis
 Common processes that include a ‘translator’ to handle cell-, vertex-, and face-centered data
   Code-unique dump files need to be read
   Move towards a ‘universal’ format (see the sketch below)
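As a sketch of what such a ‘translator’ might look like in 1-D (the centering conventions and names here are illustrative assumptions, not the actual dump-file formats):

```python
import numpy as np

def to_cell_centered(values, centering):
    """Translate 1-D vertex- or face-centered data to cell centers,
    so downstream ATF tools can assume one 'universal' layout."""
    if centering == "cell":
        return values                        # already in the common format
    if centering in ("vertex", "face"):      # in 1-D, faces coincide with vertices
        return 0.5 * (values[:-1] + values[1:])  # average the two bounding values
    raise ValueError(f"unknown centering: {centering}")

vertex_data = np.array([0.0, 1.0, 4.0, 9.0])    # values at 4 vertices -> 3 cells
print(to_cell_centered(vertex_data, "vertex"))  # [0.5 2.5 6.5]
```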
How do we automate an ATF? (continued)
 Identify individual ATFs
 Break each ATF down into individual processes that can be clearly defined
   YOU tell people like ME what you need to do.
   People like ME aid YOU (i.e., the Experts) in explaining each step in excruciating detail!
 Identify each process as either a common or a unique element.
   That’s why I’M here
What is Verification?
 Demonstrates that the code
   Solves the governing equations correctly
   Shows the accuracy of the implementation
 Two types
   Code verification
     Forward analytical problems
     Backward analytical problems
   Calculation verification
     No analytical solution
     Often self-convergence (see the sketch below)
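When no analytical solution exists, the observed order can still be estimated by comparing solutions on three grids; a minimal sketch, assuming the solutions are sampled at common (coarse-grid) points and refined at a constant ratio:

```python
import numpy as np

def self_convergence_order(coarse, medium, fine, ratio=2.0):
    """Calculation verification without an exact solution: estimate the
    observed order of accuracy from three grids at constant refinement ratio.
    All three arrays must be sampled at the same (coarse-grid) points."""
    d_cm = np.linalg.norm(coarse - medium)   # change coarse -> medium
    d_mf = np.linalg.norm(medium - fine)     # change medium -> fine
    return np.log(d_cm / d_mf) / np.log(ratio)

# Fabricated example: a 'solution' whose error shrinks 4x per refinement,
# i.e. second-order behavior.
truth = np.linspace(0.0, 1.0, 5) ** 3
coarse = truth + 16e-3
medium = truth + 4e-3
fine = truth + 1e-3
print(self_convergence_order(coarse, medium, fine))  # ~2.0
```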
Why do we need to verify code?
 It is the only way to realistically measure and demonstrate how well a physics code approximates the variables for a particular physical regime.
Why do we need to do it so often?
 Demonstrate that the code has not changed when
   New features are added
   Problems are fixed
 Demonstrate that the instantiation of algorithms is properly achieved
   Second-order algorithms achieve second-order accuracy (see the sketch below)
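A minimal sketch of that check, using the standard two-grid estimate p = ln(E_coarse/E_fine) / ln(h_coarse/h_fine); the numbers are illustrative only:

```python
import math

def observed_order(error_coarse, error_fine, h_coarse, h_fine):
    """Observed order of accuracy p from errors on two grids:
    E ~ C h^p  =>  p = ln(E_coarse/E_fine) / ln(h_coarse/h_fine)."""
    return math.log(error_coarse / error_fine) / math.log(h_coarse / h_fine)

# A second-order scheme should show p ~ 2 when the grid is halved:
print(observed_order(error_coarse=4.0e-4, error_fine=1.0e-4,
                     h_coarse=0.1, h_fine=0.05))   # 2.0
```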
What are convergent variables?
 In addition to space and time
   Temperature
   Pressure
   Velocity
   …
 It really depends on the test problem!
As an example…
 Consider the ‘instability triad’
   Richtmyer-Meshkov
   Rayleigh-Taylor
   Kelvin-Helmholtz
 What would constitute the parameter space and range of validity of these phenomena?
 What are the ranges of validity of the instantiated algorithms?
 Is there proper overlap? (i.e., does every algorithm stay within the range of validity of the phenomena?)
 What are the ‘Universal’ Test Problems?
A test problem is said to be universally usable when it
 Can be understood by all serious researchers in a particular field of research
 Can be implemented on all physical-simulation codes that are ready for meaningful scientific investigation
 Can generate unambiguous information about the physical or mathematical phenomena
 Can fulfill three requirements:
   Is unambiguously defined
   Is documented
   Is certified as correct

How do we do this?
Forward vs. Backward Problems
 The Forward Problem
   Classical method of solving PDEs (e.g., solving the heat-conduction equation using separation of variables for given ICs, BCs, and coefficients)
 The Backward Problem
   Solved through the Method of Manufactured Solutions (MMS)
The “Forward Problem”
 Direct comparison of code with exact solutions to real problems
 Limitations
   Simplification of general problem space
   Primitive physical domains
   Existence of singularities
   Many special cases needed to test BCs and/or ICs
   Difficult if not impossible to design a full-coverage test suite
The “Backward Problem”
 Method of Manufactured Solutions (MMS)
 Allows one to test the most general code capability that one intends to use (i.e., the Cardinal Rule of verification)
 Limitations
   Must think about the ‘types of terms’ that will be exercised in the most general use of the code
   Requires code developers to insert a source term into the appropriate differencing equations
   Must prevent users from accessing this source term as a ‘knob’
The 10 Steps of the MMS
1. Determine the governing equations and the theoretical order of accuracy
2. Design a suite of coverage tests
3. Construct an exact solution
4. Perform the test and calculate the error
5. Refine the grid
6. Compute the observed order of accuracy
7. Troubleshoot the test implementation
8. Fix the test implementation
9. Find and correct coding mistakes
10. Report results and conclusions of verification tests
Design a suite of coverage tests
 Define what code capabilities will and will not be tested
 Determine the level at which each capability will be tested
 List and describe what tests will be performed
 Describe the subset of the governing equations that each test will exercise
Example of MMS: 1-D Steady-State Thermal Slab
 Manufactured solution (made up):
   $T[x] = C\,(A \cos\lambda x + B \sin\lambda x)$
 Steady-state condition and BCs:
   $\dfrac{\partial^2 T[x]}{\partial x^2} = 0, \qquad T[0] = T_0, \quad T[L] = T_1$
 Steady-state solution (after applying the BCs to fix the constants):
   $T[x] = T_0 \cos\lambda x + \csc\lambda L \,\sin\lambda x \,(T_1 - T_0 \cos\lambda L)$
Determining the Source Function Q[x]
 The source function Q is defined from
   $\dfrac{\partial^2 T[x]}{\partial x^2} = Q[x]$
 We obtain the corresponding source function
   $Q[x] = -\lambda^2 \left( T_0 \cos\lambda x + \csc\lambda L \,\sin\lambda x \,(T_1 - T_0 \cos\lambda L) \right)$
 For this case ONLY (in general this is NOT true!)
   $Q[x] = -\lambda^2\, T[x]$
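The algebra above can be checked symbolically; a short sketch using SymPy (my choice of tool, not necessarily what the project used):

```python
import sympy as sp

x, lam, L, T0, T1 = sp.symbols("x lambda L T_0 T_1", positive=True)

# Manufactured solution with the BCs T[0] = T0, T[L] = T1 already applied:
T = T0*sp.cos(lam*x) + sp.csc(lam*L)*sp.sin(lam*x)*(T1 - T0*sp.cos(lam*L))

Q = sp.diff(T, x, 2)                 # Q[x] = d^2 T / dx^2
print(sp.simplify(Q + lam**2 * T))   # 0: confirms Q[x] = -lambda^2 T[x] here
print(sp.simplify(T.subs(x, 0)))     # T_0: BC at x = 0 holds
print(sp.simplify(T.subs(x, L)))     # T_1: BC at x = L holds
```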
Using the Source Function Q[x]
 For difference equations
   $\dfrac{\partial^2 T[x_n]}{\partial x^2} \approx \dfrac{T[x_{n+1}] - 2\,T[x_n] + T[x_{n-1}]}{\Delta x^2} = Q[x_n], \qquad n = 1 \ldots 10$
 We compute Q and insert it into every zone:
   $Q[x_n] = -\lambda^2 \left( T_0 \cos\lambda x_n + \csc\lambda L \,\sin\lambda x_n \,(T_1 - T_0 \cos\lambda L) \right)$
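Putting the pieces together, a minimal sketch of the verification loop for this slab (parameter values taken from Fig. 5 below; everything else is an illustrative assumption):

```python
import numpy as np

T0, T1, L, lam = 100.0, 375.0, 10.0, 0.1    # values from Fig. 5

def exact(x):
    # Manufactured solution T[x] with the BCs already applied
    return T0*np.cos(lam*x) + np.sin(lam*x)/np.sin(lam*L)*(T1 - T0*np.cos(lam*L))

def solve_slab(n):
    """Solve d2T/dx2 = Q on n interior zones with T[0]=T0, T[L]=T1,
    inserting the manufactured source Q into every zone."""
    x = np.linspace(0.0, L, n + 2)              # n interior points + 2 boundaries
    dx = x[1] - x[0]
    Q = -lam**2 * exact(x[1:-1])                # manufactured source, interior zones
    # Standard second-order central-difference stencil as a tridiagonal system
    A = (np.diag(-2.0*np.ones(n)) + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1)) / dx**2
    b = Q.copy()
    b[0] -= T0 / dx**2                          # fold boundary values into the RHS
    b[-1] -= T1 / dx**2
    T = np.linalg.solve(A, b)
    err = np.max(np.abs(T - exact(x[1:-1])))    # max-norm error vs. exact solution
    return dx, err

dx1, e1 = solve_slab(10)    # the n = 1...10 zones from the slide
dx2, e2 = solve_slab(20)    # refined grid
print("observed order:", np.log(e1 / e2) / np.log(dx1 / dx2))   # ~2 if correct
```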
Fig. 5. Temperature and Source Function Profiles (T0 = 100 °C, T1 = 375 °C, L = 10 m)
[Two panels: ‘1D Slab Temperature Profiles vs λ’ (Temperature, °C) and ‘1D Slab Source Function Profiles vs λ’ (Source Function Q, °C/m²), each plotted against Position (m) from 0 to 10 for λ = 0.01, 0.05, 0.1, 0.15, 0.2.]
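The profiles in Fig. 5 follow directly from the closed-form T[x] and Q[x] above; a sketch that reproduces them (matplotlib is my choice of plotting tool, not necessarily what produced the original figure):

```python
import numpy as np
import matplotlib.pyplot as plt

T0, T1, L = 100.0, 375.0, 10.0
x = np.linspace(0.0, L, 200)

fig, (ax_T, ax_Q) = plt.subplots(1, 2, figsize=(10, 4))
for lam in (0.01, 0.05, 0.1, 0.15, 0.2):
    # Closed-form manufactured solution and its source function
    T = T0*np.cos(lam*x) + np.sin(lam*x)/np.sin(lam*L)*(T1 - T0*np.cos(lam*L))
    ax_T.plot(x, T, label=f"lambda = {lam}")
    ax_Q.plot(x, -lam**2 * T, label=f"lambda = {lam}")

ax_T.set(title="1D Slab Temperature Profiles vs lambda",
         xlabel="Position (m)", ylabel="Temperature (deg C)")
ax_Q.set(title="1D Slab Source Function Profiles vs lambda",
         xlabel="Position (m)", ylabel="Q (deg C/m^2)")
ax_T.legend(); ax_Q.legend()
plt.show()
```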
The ‘Bottom Line’
 Test problems are
   Expensive to implement
   Tedious => essential to automate!
   Tend to be redundant between ATFs
Conclusions
 We need to perform ATFs often, so we must automate the processes
 We must choose problems carefully to properly cover physical regimes and parameter spaces
 The development of a single automated ATF framework should allow for easy incorporation of additional ATFs