roundtable-SCOH present v8

Transcript roundtable-SCOH present v8

The AASHTO Guide to
Systems Operations & Management
NCHRP 3-94
Steve Lockwood – PB
Phil Tarnoff – University of Maryland
Rich Margiotta, Erin Flanigan – CSI
John Conrad – CH2MHill
Scott Rawlins – SSOM and Panel Chair
Basic Mitigation Strategy

Effective strategies are evolving to maximize the performance of the existing system.

Impact of Operations Strategy (Best Practice) on Total Delay – causes and solutions:
• Flow control/ramp metering: 5-6%
• Traffic-responsive signals: 1%
• Incident management: 5-6%
• Work zone traffic management: 1-2%
• Weather information: 1%
• Traveler information: 1%
These impacts are comparable to many capacity investments
and can be applied on a large proportion of the total
network.
2
State of the Practice is Advancing
1. Performance Management critical for maximizing
mobility – especially as capacity lags
2. SO&M will significantly reduce delays and
disruptions
3. Large gap between average applications and best
practice
4. Review of DOT experience: absence of appropriate
processes & institutional arrangements is a major
barrier to effective SO&M
5. A formalized core SO&M program is the key to
success
3
Supporting Effectiveness Through Maturing
Processes and Institutional Arrangements
Program – SO&M Program Performance: a needs-responsive, performance-driven, comprehensive and cost-effective (C/E) statewide SO&M program.

Processes – Necessary Technical and Business Processes: the business and technical processes and systems required to facilitate the program qualities above.

Institutions – Supportive Institutional & Organizational Arrangements: the values, capabilities, arrangements and resources required to support and sustain the required business processes.
4
The Need for an SO&M “Green Book”
• Objective: maximizing the effective use (performance) of the existing system
• Research shows that top management actions regarding processes and institutional arrangements are key to effectiveness
• Guidance to identify steps managers can/should take to improve SO&M programs
• Develop existing material into an accessible and user-friendly product
• Web-based approach with self-evaluation as the point of departure
5
The Operations Capability Framework:
Key Elements

Business/Technical Process Capabilities:
• Scope of Activities
• Business Processes
• Technology/Systems
• Performance Measurement

Institutional/Organizational Arrangements:
• Culture/Leadership
• Organization/Staffing
• Resources
• Partnerships
6
Concept of Capability Maturity Levels

• Level 1 – Ad Hoc (most of today’s agencies): some strategies; no performance measurement; fragmented organization
• Level 2 – Transitioning: agency commitment; wide understanding; some measurement
• Level 3 – Managed (today’s best practice): formal program; full range; processes standard
• Level 4 – Mainstreamed (possible target; goal for the future): strategies integrated; outcomes known; continuous improvement; accountability
7
Maturity Model Framework
(Has now been applied in practice)
• State DOT experience indicates that certain process & institutional features are closely linked to effective programs and to sustaining continuous improvement
• Key is internalizing continuous and measurable improvement in customer-related performance across the complete range of operations needs
• Point of departure self-evaluated
• Defines next level of arrangements to improve
effectiveness (stepwise) and strategies to achieve it
8
Substantive Structure

PROGRAM MATURITY
• Needs basis
  - Customer needs related
  - Network/context related
• Support infrastructure
  - Sharing across applications
  - Functional integration
• Applications aggressiveness
  - Technology and procedures balance
  - Benchmarking
  - Continuous improvement
• Comprehensiveness and consistency
  - Standardization
  - Applications consistency
  - Application range

PROCESS MATURITY
• Business processes
  - Planning
  - Programming and budgeting
  - Procurement
  - Operations
• Systems and technology
  - Regional architectures
  - Project systems engineering process
  - Standards/interoperability
  - Testing and validation
• Performance
  - Measures definition
  - Data acquisition and management
  - Measures utilization

INSTITUTIONAL MATURITY
• Culture
  - Mission understanding
  - Leadership
  - Program status
  - DOT authority
  - Continuous improvement
• Organization and staffing
  - Executive level
  - Organizational structure & accountability
  - Core competencies
• Resource allocation
  - Formal program-level budget estimate
  - Sustainable resourcing from state funds
  - Project investment trade-offs
• Partnerships
  - Operational roles and procedures with PSAs, local government, MPOs
  - Rationalize staff versus outsourcing
9
Levels of Maturity
for Each of the 3 Dimensions
• Level 1 (L1) - characterizes agencies that have not yet
begun to develop an SO&M program
• Levels 2 (L2) and 3 (L3) - describe the level of maturity
found in the top 10 percent of State DOTs
• Level 4 (L4) - an ideal target for achievement in terms of
institutionalized, sustainable, continuously improving
SO&M activities
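To make the framework concrete for the web tool discussed later, here is a minimal illustrative sketch (Python) of the dimensions, elements, and level descriptions as a data structure. The dimension, element, and level wording is taken from the slides above; the data structure itself and the `list_elements` helper are hypothetical, not part of the Guide.

```python
# Hypothetical sketch: the capability framework as a simple data structure.
# Names are drawn from the slides; the structure is illustrative only.

LEVEL_DESCRIPTIONS = {
    1: "has not yet begun to develop an SO&M program",
    2: "maturity found in the top 10 percent of State DOTs",
    3: "maturity found in the top 10 percent of State DOTs",
    4: "ideal target: institutionalized, sustainable, continuously improving SO&M",
}

CAPABILITY_FRAMEWORK = {
    "Business/Technical Process Capabilities": [
        "Scope of Activities",
        "Business Processes",
        "Technology/Systems",
        "Performance Measurement",
    ],
    "Institutional/Organizational Arrangements": [
        "Culture/Leadership",
        "Organization/Staffing",
        "Resources",
        "Partnerships",
    ],
}

def list_elements():
    """Yield every (dimension, element) pair in the framework."""
    for dimension, elements in CAPABILITY_FRAMEWORK.items():
        for element in elements:
            yield dimension, element

if __name__ == "__main__":
    for dimension, element in list_elements():
        print(f"{dimension}: {element}")
    for level, description in LEVEL_DESCRIPTIONS.items():
        print(f"Level {level}: {description}")
```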
10
PROGRAM ELEMENTS
• Needs-driven – L1: ad hoc, limited; L2: C/E priorities identified; L3: full range by context; L4: synergies applied
• Support infrastructure – L1: minimal support; L2: integrated on systematic basis; L3: applications optimized; L4: cost-effective linkages
• Application aggressiveness – L1: not considered; L2: state of the practice; L3: optimized to location; L4: performance-driven
• Comprehensive/consistent – L1: no standard; L2: standardized by type; L3: consistent application statewide; L4: complete functions statewide

PROCESS ELEMENTS
• Business processes – L1: no custom-tailored processes; L2: functions and criteria for processes identified; L3: processes standardized & documented; L4: processes streamlined & updated
• Technology and systems – L1: unique to project; L2: SE employed for conops & architecture; L3: SE & standards employed statewide; L4: architectures and standards updated
• Performance – L1: no performance measurement; L2: some measurement via outputs with limited use; L3: output data used for evaluation and reporting; L4: mission-related outcomes utilized and archived

Key question: What types of processes (planning, systems engineering, performance measurement) are needed to improve the program to the next level – and to continue to improve?

INSTITUTIONAL ELEMENTS
• Operations culture – L1: value of SO&M not understood; L2: SO&M appreciated by key staff “heroes”; L3: visible agency leadership & operations as formal program; L4: customer service mission key focus as core program
• Organization and staffing – L1: legacy fragmentation & limited capacities; L2: need for coordination & core capacities understood; L3: units aligned & staff professionalized; L4: top-level manager, CO & districts with accountability
• Resource allocation – L1: ad hoc for occasional projects; L2: ad hoc annual allocation to projects; L3: transparent, sustainable line-item resource allocation; L4: operations as trade-off with other investments
• Partnerships – L1: based on temporary personal relationships; L2: MOUs for specific functions; L3: rationalization by formal agreements and laws; L4: agency-level coordination and accountability statewide

Note: these are the elements; each one is broken down into components for guidance purposes.
11
(The capability matrix above is repeated on this slide.)

Key question: What are the leadership, organization and staff features, resources and relationships needed to support and maintain the needed processes?
12
Guidance Based on Current Best Practice and Beyond
INSTITUTIONAL DIMENSIONS (L-1: Ad Hoc/Legacy-based; L-2: Rationalized/Restructuring; L-3: Mainstreamed/Fully Supportive; level transitions labeled “Transitioning,” “Managed,” and “Integrated”)

Operations Culture
• L-1 (Legacy, hero-driven): operations acknowledged (including the value of reliability) but without strategic commitment or top-level leadership; adherence to legacy roles among transportation and public safety entities
• L-2 (Championed/internalized across disciplines): visible agency leadership citing operations leverage, cost-effectiveness and risks across disciplines; rationalization of responsibilities by formal agreements across institutions (transportation agency, PSAs, private)
• L-3 (Mobility committed): customer mobility service commitment accepted as core program; clear legal authority for operations roles and actions among transportation agency, PSAs and local government clarified

Organization and Staffing for Operations
• L-1 (Fragmented, understaffed): some fragmentation of key functions and boundaries, horizontal and vertical; reliance on key individuals for technical knowledge and on champions for leadership
• L-2 (Aligning, trained): TMC focus with vertical/horizontal alignment for operations, including P/B/D/C/O/M; core capacities established with KSA specs, training and performance incentives
• L-3 (Professionalized): top-level management position with operations authority/responsibility established in central office and districts; professionalization and certification of operations core-capacity positions

Resource Allocation to Operations
• L-1 (Project-level): funds at project level, ad hoc, unpredictable; ad hoc resource allocation with operations as a secondary priority
• L-2 (Criteria-based program): budget allocation for operations driven by transparent criteria on a life-cycle needs basis; operations claim on agencies’ resources for mobility support established on timing, extent, cost-effectiveness
• L-3 (Sustainable budget line item): operations as a formal, visible, sustainable line item in agencies’ budget (capital, operating and maintenance); operations as a key trade-off investment with other improvements in terms of “mobility management”; trade-offs between operations and capital expenditure considered as part of the planning process

Partnerships for Operations
• L-1 (Informal, unaligned): non-transportation entities unaligned with transportation objectives, procedures relying on an informal personal basis; private sector utilized for isolated functions
• L-2 (Formal, aligned): transportation agencies assert leadership in partnerships via formal written agreements with PSAs, EM; private-sector capabilities in technology and management tapped
• L-3 (Consolidated): high level of operations coordination among owner/operators (state, local, private) with TMC consolidation; clear outsourcing role developed while maintaining agencies’ core capacities

PROCESS DIMENSIONS

Scope
• L-1 (Narrow and opportunistic): ad hoc operations activities based on regional initiatives, with limited central office support; narrow/ITS-project based, low-hanging fruit
• L-2 (Needs-based and standardized): operations as a mobility-needs-based, multi-strategy program; standardized agency programs or strategies related to specific problems and desired outcomes
• L-3 (Full-range core program): full staged program of synergizing functionalities

Business Processes
• L-1 (Informal, undocumented): projects/issues handled on a firefighting basis with only modest formal regional/district planning (but no standard template)
• L-2 (Planned, documented): strategic planning and budgeting of staged improvements, including maintenance and construction implications
• L-3 (Integrated): integrated operations-related planning, budgeting, staffing, deployment and maintenance, both within operations and with statewide and metro planning

Technology and Systems
• L-1 (Qualitative, opportunistic): technologies selected at project level, evaluated on a qualitative basis; limited understanding of operating platform needs; minimal conops and architecture; procedures ad hoc with no consistency
• L-2 (Evaluated platforms): basic stable technology for existing strategies; systematic evaluation/application of best technology/procedure combinations; standard technology platforms developed/maintained; architectures and related processes developed, including major communications structure
• L-3 (Standardized, interoperable): identification of standardized, statewide interoperable operating platforms and related procurement procedures; full documentation of key conops, architecture, procedures and protocols

Performance
• L-1 (Outputs reported): measurement of outputs only, with limited analysis/remediation; output measures reported
• L-2 (Outcomes used): outcome measures developed and used for improvement (requires intra- and inter-agency after-action analysis); outcome measures reported
• L-3 (Performance accountability): continuous improvement perspective adopted; accountability and benchmarking at unit and agency level via regular outcome performance reporting – internal and public
14
User-based Self-Evaluation

Indicate user’s position (check one):
• Top management (CO or district)
• Program manager (CO or district)
• Project manager (CO or district)

Select capability element as point of departure (examples):
• Performance measurement
• Standardization/documentation
• Sustainable budget for planning
• Clear lines of responsibility
• Aligned partnerships with PSAs

Indicate agency’s current state of play (level) regarding the selected element:
• Measures not defined or utilized
• Output measures utilized only for some activities, unlinked to policy
• Output measures for all activities, linked to policy

Result: detailed strategy guidance for the selected element to move to the next level.
15
“Black Box” Determines User’s Current Level
and Next Steps to Improved Capability

Current level of performance measurement maturity:
• L-1 → L-2: PM program based on output measures only
• L-2 → L-3: PM program based on output measures and a few outcome measures only
• L-3 → L-4: PM program based on a full set of both output/outcome measures, linked to agency operations
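As a rough illustration of this “black box” logic, the sketch below (Python) maps a self-evaluation answer for the performance measurement element to a current level and to the guidance for reaching the next level. The answer texts and transition summaries are taken from these slides; the function name, data layout, and everything else are hypothetical, since the presentation does not describe the tool’s actual implementation.

```python
# Illustrative sketch of the "black box": map one self-evaluation answer
# to a current maturity level and to next-level guidance. Hypothetical
# code, not the web tool's actual implementation.

# Self-evaluation choices for the performance measurement element
# (wording from the slides), keyed to the level they indicate.
PERFORMANCE_MEASUREMENT_CHOICES = {
    "Measures not defined or utilized": 1,
    "Output measures utilized only for some activities, unlinked to policy": 2,
    "Output measures for all activities, linked to policy": 3,
}

# Guidance summaries for each level transition (wording from the slides).
TRANSITION_GUIDANCE = {
    (1, 2): "PM program based on output measures only",
    (2, 3): "PM program based on output measures and a few outcome measures only",
    (3, 4): "PM program based on a full set of both output/outcome measures, "
            "linked to agency operations",
}

def next_step(answer: str) -> tuple:
    """Return (current level, guidance for moving to the next level)."""
    current_level = PERFORMANCE_MEASUREMENT_CHOICES[answer]
    return current_level, TRANSITION_GUIDANCE[(current_level, current_level + 1)]

if __name__ == "__main__":
    level, guidance = next_step("Measures not defined or utilized")
    print(f"Current level: L-{level}; to reach L-{level + 1}: {guidance}")
```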
16
Standard Strategy Templates for Each
Element/Level Combination (100+)

PERFORMANCE ELEMENT
Performance Measures Definition: Strategy from Level 1 to 2

Function of component: development of performance metrics for operations activities and the impact operations activities have on users.

Why important for CIP: development of an effective operations program must be based on matching needs to solutions. Performance measurement does this by cost-effectively directing operations investments. Having an ongoing performance process means that new deployments and policies can be continually evaluated, rather than having to undertake piecemeal evaluations.

Strategy (from Layer 3): identify output performance measures for the selected operations activities.

Detailed description of activity to move up a level: the philosophy here is, “if you’re going to manage it, you’re going to measure it.” For Level 1 to Level 2, the limited number of operational activities that will be monitored must first be identified. For Level 2 to 3, all operations activities engaged in will have performance measures defined for them. Output measures for both level transitions should be relatively simple and easy to collect. These are often referred to as “activity-based” measures since they monitor the extent of activities undertaken and their immediate consequences. Therefore, the measures selected will include at a minimum:
• incident duration
• incident and work zone characteristics (number, type, severity)
• operational activity (website hits, service patrol stops, messages)
• equipment locations and status
• equipment downtime

Responsibility: operations staff
Relationships: N/A
References:
• Guide to Benchmarking Operations Performance Measures, http://www.trb.org/TRBNet/ProjectDisplay.asp?ProjectID=1218
• NCHRP Web-Only Document 97, “Guide to Effective Freeway Performance Measurement: Final Report and Guidebook”, http://www.trb.org/news/blurb_detail.asp?id=7477
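To show how one of these 100+ templates could be held as structured data, here is a minimal hypothetical sketch (Python); the `StrategyTemplate` dataclass and its field names mirror the template above but are illustrative, not the Guide’s actual data model.

```python
# Hypothetical sketch: one element/level strategy template as a record.
# Field names mirror the template shown above; the dataclass is illustrative.
from dataclasses import dataclass, field

@dataclass
class StrategyTemplate:
    element: str            # e.g. "Performance"
    component: str          # e.g. "Performance Measures Definition"
    from_level: int
    to_level: int
    issue: str              # function of the component
    why_important: str
    strategy: str
    detailed_description: str
    responsibility: str
    references: list = field(default_factory=list)

performance_measures_definition = StrategyTemplate(
    element="Performance",
    component="Performance Measures Definition",
    from_level=1,
    to_level=2,
    issue="Performance metrics for operations activities and their impact on users",
    why_important="Matches needs to solutions, cost-effectively directs operations "
                  "investments, and supports continual evaluation",
    strategy="Identify output performance measures for the selected operations activities",
    detailed_description="Identify the limited number of operational activities to be "
                         "monitored and select simple, activity-based output measures",
    responsibility="Operations staff",
    references=[
        "Guide to Benchmarking Operations Performance Measures",
        "NCHRP Web-Only Document 97",
    ],
)

print(f"{performance_measures_definition.component}: "
      f"L{performance_measures_definition.from_level} -> "
      f"L{performance_measures_definition.to_level}")
```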
17
Process is repeated for the next element of capability
• “Model” uses user self-evaluation to determine current level
• Model defines next steps (based on current best practice)
• User only sees material related to his or her agency’s level (see the sketch after the lists below)

Business & Technical Process Capabilities:
• Scope of Activities
• Business Processes
• Technology/Systems
• Performance Measurement

Institutional/Organizational Arrangements:
• Culture/Leadership
• Organization/Staffing
• Resources
• Partnerships
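A minimal hypothetical sketch (Python) of this repetition: evaluate each element in turn and surface only the guidance for the user’s current level. The element names come from the lists above; `evaluate_element`, the guidance store, and the rest are illustrative placeholders, not the actual tool.

```python
# Illustrative sketch: repeat the self-evaluation for every capability
# element and show only the guidance relevant to the user's level.
# All names are hypothetical placeholders.

ELEMENTS = [
    "Scope of Activities", "Business Processes",
    "Technology/Systems", "Performance Measurement",
    "Culture/Leadership", "Organization/Staffing",
    "Resources", "Partnerships",
]

# Hypothetical guidance store: (element, current level) -> strategy summary.
GUIDANCE = {
    ("Performance Measurement", 1):
        "Identify output performance measures for the selected operations activities",
    # ... one entry per element/level combination (100+ templates in the Guide)
}

def evaluate_element(element: str) -> int:
    """Placeholder for the user's self-evaluation of one element (returns level 1-4)."""
    return 1

def run_self_evaluation() -> dict:
    """Walk every element and collect only the material for the user's current level."""
    visible_material = {}
    for element in ELEMENTS:
        level = evaluate_element(element)
        guidance = GUIDANCE.get((element, level))
        if guidance:
            visible_material[element] = guidance
    return visible_material

if __name__ == "__main__":
    for element, guidance in run_self_evaluation().items():
        print(f"{element}: {guidance}")
```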
18
Benefits of Web-Based Approach
• Avoids lengthy paper documents with big
charts
• Relationships among elements built in
• Custom tailored to user
• Users self-evaluate SDOT state of play
• Increasing levels of detail displayed on
automated basis
• Hyperlinks to supporting documents
19
Need for Senior State DOT Input
• Importance of SO&M to your program?
• Your management objectives related to SO&M?
• Principal challenges you see?
• Lessons you have learned to date?
• Utility of the Guidance to you?
We look forward to following up with you.
20