© IBM Corporation, 2010
IBM Advanced Technical Skills
WP101783 at ibm.com/support/techdocs
An illustration:
IBM WebSphere Compute Grid for z/OS
Integrated with an Enterprise Scheduler such as IBM Tivoli Workload Scheduler

See a narrated video of this on YouTube … search on ATSDemos
Preview of Technical Message
• IBM WebSphere Compute Grid z/OS has an MDB (message-driven bean) interface intended for integration with enterprise schedulers
• The WSGRID utility program is what connects the enterprise scheduler to Compute Grid
• WSGRID forms up a job submission message and places it on a queue; the MDB picks it up and the job is submitted inside Compute Grid
• WSGRID stays active while the job executes in Compute Grid, feeds output to the JES spool, and alerts the enterprise scheduler of the Java batch job's status
• This design allows Compute Grid Java batch to be integrated with traditional batch in a broader batch process
Please Note …
• IBM WebSphere Compute Grid is supported on all platforms supported by WebSphere Application Server
• The focus of this presentation will be Compute Grid for z/OS
• Our focus will be on integration with Tivoli Workload Scheduler, but this integration design works with any scheduler capable of submitting JCL to JES
• Our focus will also be on using WebSphere MQ as the JMS provider, but there is also a solution involving the internal messaging provider of WebSphere Application Server
[Diagram: Tivoli Workload Scheduler and JES with its spool sit alongside WebSphere Application Server z/OS, which hosts the Compute Grid Scheduler (job submission and dispatching) and Compute Grid End Points running batch applications against data systems such as DB2, CICS, IMS, MQ, and VSAM, all on System z and z/OS with its facilities and functions (WLM, RRS, SAF, RMF, Parallel Sysplex, etc.)]

The WebSphere Compute Grid scheduler function has a browser interface. In addition to a browser interface, Compute Grid also provides:
• Command line interface
• Web Services interface
• RMI Client interface
• MDB interface

The MDB interface is the interface of particular interest for integration with enterprise schedulers.

The question is this: what ties TWS + JES to Compute Grid?
The answer: WSGRID, a utility program supplied with Compute Grid.

Two versions of WSGRID are provided: a C/C++ native implementation on z/OS, and one implemented in Java.

[Diagram: TWS submits a job with PGM=WSGRID to JES; the native WSGRID utility exchanges messages with the MDB of the Compute Grid Scheduler over MQ input and output queues; the Scheduler dispatches the work to Compute Grid End Points running batch applications against the data systems (DB2, CICS, IMS, MQ, VSAM, etc.) on System z and z/OS.]

The native WSGRID utility interacts with Compute Grid using MQ in BINDINGS mode. BINDINGS means this is very fast: the utility attaches to a queue manager on the same system through a local, cross-memory connection rather than over a network channel.
Let’s take a high-level look at how this works,
then we’ll dig into some of the details
[Diagram: the same topology, with the WSGRID job and the MQ input and output queues sitting between JES and the MDB of the Compute Grid Scheduler.]

1. TWS submits the WSGRID JCL to JES (details on the JCL coming)
2. The JCL names PGM=WSGRID, which results in the program being launched
3. WSGRID forms up a message (details coming) and places it on the input queue
4. The MDB in the scheduler fires and pulls the message off the input queue
5. The job is dispatched, executes, and completes
6. The scheduler feeds output back to MQ in a series of messages
7. WSGRID pulls the messages off the output queue and writes them to the JES spool
8. WSGRID ends, and JES alerts TWS of the job return code (for example, Job RC = 0)
9. If desired, the normal JES spool archive process may take place
Let’s see what the JCL for WSGRID looks like,
and start to demystify how this works.
The JCL is annotated as follows:
• Standard JOB card
• EXEC PGM=WSGRID
• STEPLIB to the WSGRID load module and the MQ SCSQLOAD and SCSQAUTH libraries
• SYSPRINT DD to JES
• Name the QMGR and the input / output queues
• Specify the input xJCL path and file name
• Provide any substitution properties you wish to pass into the xJCL
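As a rough sketch, JCL along these lines would line up with the annotations above. The dataset names, queue names, control DD name, and property keys shown here are illustrative assumptions, not confirmed WSGRID syntax; the WP101783 techdoc shows the exact format.

  //WSGRIDA  JOB (ACCT),'WSGRID SUBMIT',CLASS=A,MSGCLASS=H
  //*  Standard JOB card above, then EXEC PGM=WSGRID
  //RUNGRID  EXEC PGM=WSGRID
  //*  STEPLIB to the WSGRID module and the MQ libraries
  //STEPLIB  DD DSN=WSGRID.LOADLIB,DISP=SHR
  //         DD DSN=MQ.SCSQLOAD,DISP=SHR
  //         DD DSN=MQ.SCSQAUTH,DISP=SHR
  //*  SYSPRINT DD to the JES spool
  //SYSPRINT DD SYSOUT=*
  //*  Name the QMGR and queues, point at the xJCL, and pass
  //*  substitution properties (key names here are assumptions)
  //SYSIN    DD *
  queue-manager-name=MQW1
  scheduler-input-queue=WASIQ
  scheduler-output-queue=WASOQ
  xJCL=/u/batch/xjcl/JavaBatchA.xml
  substitution-prop.checkpoint.interval=15
  /*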
The output ends up in the JES spool, and is
viewable like any other JES spool
[Diagram: the same topology; the WSGRID job's output lands in the JES spool on System z and z/OS.]

• The first part of the job output shows the xJCL and the substitution variables
• The second part of the job output shows the return codes for each step as well as the overall job return code
Are jobs submitted through WSGRID controllable
from the Job Management Console?
[Diagram: a Cancel Job action issued against the Compute Grid Scheduler flows back through MQ and WSGRID to JES and TWS.]

Yes! Jobs submitted through WSGRID are controllable through the Job Management Console (JMC). And actions taken in the JMC are fed back to JES and TWS through WSGRID.
What if you don’t have MQ as a part of your
enterprise messaging infrastructure?
Then use the Java client with the built-in WAS messaging
[Diagram: TWS submits JCL that launches the Java WSGrid.sh client under BPXBATCH or JZOS; the client places a message on a JMS destination on the WebSphere SIBus; the MDB in the Compute Grid Scheduler pulls the message, and the job is dispatched and executed on a Compute Grid End Point.]

• Run the supplied WSADMIN script (wsgridConfig.py) to create the messaging components inside the Compute Grid Scheduler
• TWS integration using JCL is the same as before. The difference is the job now launches a Java client rather than the native MQ client.
• The WSGrid Java client forms the message and places it on the JMS destination. The MDB fires, pulls the message, and submits the job.
• If using JZOS, then output can be directed back to the JES spool. If BPXBATCH, then output goes to the file system.
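As a minimal sketch, the launching JCL for the BPXBATCH case might look like the following. The WSGrid.sh path, the properties file it is handed, and the output paths are placeholders and assumptions, not confirmed values.

  //WSGRIDJ  JOB (ACCT),'WSGRID JAVA',CLASS=A,MSGCLASS=H
  //*  Launch the Java WSGrid client under BPXBATCH. With
  //*  BPXBATCH, STDOUT/STDERR go to the file system; run the
  //*  client under JZOS instead to direct output to JES spool.
  //RUNGRID  EXEC PGM=BPXBATCH
  //STDPARM  DD *
  SH /u/wasadmin/bin/WSGrid.sh /u/batch/props/JavaBatchA.props
  /*
  //STDOUT   DD PATH='/u/batch/logs/javabatcha.out',
  //            PATHOPTS=(OWRONLY,OCREAT,OTRUNC),PATHMODE=SIRWXU
  //STDERR   DD PATH='/u/batch/logs/javabatcha.err',
  //            PATHOPTS=(OWRONLY,OCREAT,OTRUNC),PATHMODE=SIRWXU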
Integration with Traditional Batch
• We know that Tivoli Workload Scheduler (TWS) is a powerful enterprise scheduler
• We've seen how it integrates with WebSphere Compute Grid
• Now let's see how we can use the power of TWS to integrate Compute Grid and traditional batch into a larger batch process
• Finally, we'll simplify the pictures a bit to reduce clutter and focus on the key points
Imagine you have a mixed-batch environment,
with Compute Grid and traditional batch
[Diagram: TWS and JES with its spool; the WCG Scheduler with its MDB; WCG Endpoints hosting Java batch applications A, B, and C; traditional COBOL (1), Assembler (2), and C/C++ (3) batch jobs; and WSGRID with its MQ input and output queues, all on System z and z/OS.]

• You have Tivoli Workload Scheduler and other z/OS functions (JES)
• You have a series of traditional batch jobs
• You have WebSphere Compute Grid in place with several batch applications deployed to the batch endpoints
• You plan to integrate TWS with WebSphere Compute Grid, so you have the WSGRID program ready with MQ input/output queues defined
Imagine further that you have a TWS batch
workflow defined with mixed Java and native batch
[Diagram: a JCL job library feeds TWS. It holds COBOL Batch 1 JCL, Assembler Batch 2 JCL, C/C++ Batch 3 JCL, and WSGRID JCL for Java jobs A, B, and C.]

• You assemble the JCL for your traditional native batch jobs so TWS has access to submit it to JES
• And you assemble the JCL to invoke an instance of WSGRID for each Java batch job in WCG

Your TWS batch workflow mixes the Java jobs A, B, and C with COBOL job 1, Assembler job 2, and C/C++ job 3.
Let’s now walk through an illustration of how TWS
would integrate traditional and Java batch …
[Diagram: the JCL job library and the same topology as before, with Java job A running first.]

• The TWS process is initiated and the WSGRID job is initiated
• The message is formed based on properties inline with the JCL or in a named properties file
• The job is dispatched to the end point where the application is deployed
• The job executes and completes, the job output goes to the spool, and the WSGRID job is spun down
• Tivoli Workload Scheduler readies itself to proceed in the workflow … COBOL Job 1 … that's next
TWS moves on to the next job in its process – a
traditional COBOL batch job
[Diagram: the same topology, with COBOL batch job 1 running.]

• JES initiates the batch job
• The job executes and completes, and the job output goes to the spool
• Tivoli Workload Scheduler readies itself to proceed in the workflow … a simultaneous submission is next
A TWS process may consist of multiple jobs run
simultaneously. It makes no difference to TWS if
the jobs are mixed Java and native.
[Diagram: the JCL job library and the same topology. Note: we're going to speed this up quite a bit.]

The simultaneously submitted jobs run in the same manner as before, and their job output goes to the spool.
The processing of the final two jobs in this batch
flow unfolds just like the first two did …
[Diagram: the JCL job library and the same topology one last time.]

The Tivoli process completes with all jobs ending RC=0, and the job output goes to the spool.
Summary of this Show …
• Integration with Enterprise Schedulers is provided by the WSGRID function
• WSGRID is a module that's easily submitted with batch JCL
• One option is a thin MQ client that puts a message on an MQ queue; the Compute Grid MDB picks it up and submits the job
• There is a Java-based client that does not require MQ. It's not as fast as the native MQ client, however.
• WSGRID feeds Compute Grid job output back to the JES spool and informs the enterprise scheduler of the return code
• Because of this model, Compute Grid may be integrated with traditional batch using Enterprise Scheduler process flows
End