Interoperability Performance Management


Performance Management for
Justice Information Sharing
David J. Roberts
Global Justice Consulting
Steve Prisoc
Chief Information Officer
New Mexico State Courts
2007 BJA/SEARCH
Regional Information Sharing Conference
June 4, 2007, Jacksonville, Florida
“Measurement is the first step that leads to
control and eventually to improvement. If
you can’t measure something, you can’t
understand it. If you can’t understand it,
you can’t control it. If you can’t control it,
you can’t improve it.”
—H. James Harrington
WHY evaluate performance?
• Information is control
• Provides feedback to improve program performance
• Provides information for resource allocation
• Enables effective planning
• Tests generalizations based on experiences and assumptions
• Markets the program and builds support among funding bodies, constituents, and staff
Landscape of Performance
Management
• Investment appraisal and benefits
realization
– What is the actual investment we’re making?
– How are benefits going to be collected and
tracked?
• Solid program management and tracking
– Is the project on track?
– How do we ensure it remains on track?
• Achievement of the strategic objectives
– Fundamentally, what is it that we’re trying to
achieve in our information sharing initiative?
Process vs. Impact Evaluations
• Process evaluations focus on how the initiative was executed: the activities, efforts, and workflow associated with the response. Process evaluations ask whether the response occurred as planned, and whether all components worked as intended. Fundamentally, a process evaluation poses the question, “Are we doing the thing right?”
• Impact evaluations focus on the results of the initiative: the outputs (products and services) and outcomes (results, accomplishments, impact). Did the problem decline or cease? And if so, was the response the proximate cause of the decline? Fundamentally, an impact evaluation poses the question, “Are we doing the right thing(s)?”
Balanced Scorecard
Originally developed in business by Kaplan &
Norton
1. Financial – How do we look to stakeholders?
2. Customer – How well do we satisfy our
internal and external customers’ needs?
3. Internal Business Process – How well do we
perform at key internal business processes?
4. Learning and Growth – Are we able to sustain
innovation, change, and continuous
improvement?
Balanced Scorecard for
Law Enforcement (Mark Moore et al.)
1. Reduce criminal victimization
2. Call offenders to account
3. Reduce fear and enhance personal
security
4. Guarantee safety in public spaces
5. Use financial resources fairly, efficiently,
and effectively
6. Use force and authority fairly, efficiently,
and effectively
7. Satisfy customer demands/achieve
legitimacy with those policed
Trial Court Performance Standards
1. Access to Justice
2. Expedition and Timeliness
3. Equality, Fairness and Integrity
4. Independence and Accountability
5. Public Trust and Confidence
Corrections Performance
• Security
– Drug Use
– Significant Incidents
– Community Exposure…
• Safety
– …of inmates
– …of staff
– …of environment…
• Order
– Inmate misconduct
– Use of force
– Perceived control…
• Care
– Stress & illness
– Health care
– Dental care…
• Activity
– Work & industry
– Education & training
– Religion…
• Justice
– Staff fairness
– Use of force
– Grievances (# & type)...
• Conditions
– Space
– Pop density
– Freedom of movement…
• Management
– Satisfaction
– Stress & burnout
– Turnover…
Universal IJIS Elements
• Definition: The ability to access and share
critical information at key decision points
throughout the whole of the justice enterprise.
• Scope: Recognition that the boundaries of the
enterprise are increasingly elastic—engaging not
only justice, but also emergency & disaster
management, intelligence, homeland security,
first responders, health & social services, private
industry, the public, etc.
• Goal: Get the right information, to the right
people, all of the time—underscores the need for
dynamic information exchange.
Information Sharing Objectives
• What is the problem we’re addressing?
• What information do we have regarding
current levels of performance?
• What is it that we’re trying to do?
• 3 Universal Objectives:
– Improve Public Safety and Homeland Security;
– Enhance the Quality and Equality of Justice;
– Gain Operational Efficiencies, Effectiveness, and
demonstrate Return on Investment (ROI).
Sample Public Safety Measures
• Increase the percentage of
court dispositions that can
be matched to an arrest—
this will improve the
quality of the
computerized criminal
history records
• Decrease the average
response time to establish
a positive identification
following an arrest
• Reduce the number of
incidents of criminal
records being associated
with the wrong person
• Reduce recidivism
• Reduce the fear of crime
in target neighborhoods
• Decrease the amount of
time it takes to serve a
warrant
• Decrease the amount of
time for law enforcement to
have details on protection
orders.
• Reduce the amount of time
it takes users of the
integrated justice system to
respond to a request from
the public
• Reduce the time it takes to
complete a criminal history
background check
• Reduce the number of
agencies that can’t
communicate with each
other.
JNET: Improved Public Safety
& Homeland Security
• Notifications
– Timely notification of critical events (arrest, disposition, warrant, violation, death, etc.)
– Offender accountability and increased public safety
• Confirmed Notifications
– FY01/02: 3,645
– FY02/03: 18,349
– FY03/04: 29,980
– FY04/05: 33,264
– FY05/06: 46,424
– Total = 178,339 confirmed notifications
Sample Quality of Justice Measures
• Reduce the number of civilian complaints against local law enforcement
• Reduce the number of continuances per case that result from scheduling conflicts between the courts, law enforcement, and prosecution
• Reduce the number of cases without a next scheduled event
• Reduce the average number of days or hours from arrest to arraignment
• Reduce the average time a defendant is held while waiting for a bond decision
• Reduce the time it takes for correctional facility intake
• Reduce the number of days it takes to process cases from arrest to disposition
• Reduce the number of false arrests
• Reduce the amount of missing data
JNET: Improvement in the
Quality of Justice
• Improved decision making
– At key decision points, providing the required information in
a timely, usable method
• Traffic Stop
– Who is this person? Positive identification (photo, SID, etc.)
– Is this person wanted? Outstanding warrants/wants
– Is this person a threat? Previous history of violent behavior, firearms, etc.
• Enhanced Overall Data Quality
– Reduction of errors
– Accurate and timely statistical reporting
• Improve Business Process
– Minimize offender processing time
– Reduction in “holding” time
Sample Efficiency/Effectiveness Measures
• Reduce the number of hours
that staff spend entering data
manually or electronically
• Reduce the costs of copying
documents for justice
organizations
• Reduce the number of hours
spent filing documents
manually
• Reduce the number of hours
spent searching other
governmental databases
• Increase the number of law
enforcement personnel
performing community
policing tasks, instead of
administrative tasks
• Reduce the amount of missing
information in criminal justice
databases
• Reduce the number of
corrections needed in
databases maintained by CJIS
agencies
• Decrease the number of
warrants that never get
entered into the state registry
• Increase the number of query
hits on each agency database
• Reduce the number of hours it
takes to enter a court
disposition into the state
criminal history repository
JNET: Efficient and Effective ROI
• PennDOT (DMV) Certified Driver History via JNET
– In 2003, PennDOT processed 157,840 certified driving history requests for local police, district attorneys, and the courts.
– One clear performance measure is the dramatic reduction in processing costs for PennDOT. The personnel cost metric is based on the time required to process a paper copy of a driver history request, including the manual application of an embossed certification seal. PennDOT calculates its personnel cost at $1.50 per certified history processed; adding a combined printing and mailing cost of $0.50 per copy brings the total cost to manually generate a certified driver history to $2.00 per request (recomputed in the sketch below).
– During August 2006, the 56,126 certified driving history requests processed via JNET saved PennDOT $112,252 in monthly operating expenses. Only 4,767 were processed in the traditional fashion.
– PennDOT has reallocated personnel to support and process other areas of business, such as “paid” requests from individual citizens and pre-employment screeners.
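As a quick sanity check on the arithmetic above, here is a minimal sketch (Python, using only the figures quoted on this slide) that reproduces the reported monthly savings:

```python
# Recomputing the PennDOT savings from the figures on this slide.
PERSONNEL_COST = 1.50   # dollars per certified history processed manually
PRINT_AND_MAIL = 0.50   # dollars per copy, printing plus mailing
COST_PER_MANUAL_REQUEST = PERSONNEL_COST + PRINT_AND_MAIL  # $2.00 total

jnet_requests_aug_2006 = 56126  # requests handled via JNET in August 2006
savings = jnet_requests_aug_2006 * COST_PER_MANUAL_REQUEST

print(f"Monthly savings: ${savings:,.0f}")  # Monthly savings: $112,252
```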
Critical Assumptions
• Baseline data exist regarding current or historical
performance of the system
• Access, ability and willingness to capture data
regarding on-going performance
• Timely, accurate and complete data collection
• Appropriate and sufficiently detailed analysis
techniques
• Staff to conduct the analysis and reports
• Effective communication mechanisms to:
– Monitor on-going baseline performance
– Constantly assess the impact and operations
• Political will and operational capacity to do
something as a result of what the measures
show!
Performance Dashboards
What we’re NOT talking about:
[Image: color-coded threat advisory — “The threat level in the airline sector is HIGH or Orange,” 3/1/07]
What we ARE talking about…
Sample Performance Dashboard
Draft dashboard assessing performance on a series of dimensions that have been agreed upon by key decision makers. This requires effective data collection and routine reporting from operational systems in place throughout the County, and agreement that we’re going to do something with the data in order to respond to critical performance elements. (A toy status computation follows.)
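To make the “do something with the data” point concrete, here is a toy sketch of how one dashboard cell might be colored. The thresholds and the lower-is-better assumption are illustrative, not from the presentation:

```python
# Toy red/yellow/green status for one dashboard measure.
# Assumes lower values are better (e.g., days from arrest to disposition)
# and that target/warning thresholds were agreed upon by stakeholders.
def status(value: float, target: float, warn: float) -> str:
    if value <= target:
        return "green"   # meeting the agreed target
    if value <= warn:
        return "yellow"  # slipping, but inside the warning band
    return "red"         # requires a response

print(status(value=95, target=90, warn=110))  # -> yellow
```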
Establishing a Performance
Management Program
The Six Steps to Establishing a Performance-Based Management Program
Source: Will Artley, DJ Ellison and Bill Kennedy, The Performance-Based
Management Handbook, Volume 1: Establishing and Maintaining a
Performance-Based Management Program (Washington, DC: U.S.
Department of Energy, 2001)
Outcomes and performance measures
• Outcomes are the benefits or results gained by reaching goals, achieving objectives, and resolving strategic issues
• Performance measures are specific, measurable, time-bound expressions of future accomplishment that relate to goals, objectives, and strategic initiatives
• Goals, objectives, and strategic initiatives should ideally lead to outcomes
• Pragmatic performance measurement planners recognize that not all things that need to be measured can always be empirically linked to outcomes
Not all outcomes easily lend themselves to
measurement
Everything that can be counted does
not necessarily count; everything that
counts cannot necessarily be counted.
—Albert Einstein
It is important that performance measures be based
on criteria that correspond to desired outcomes;
however, it is often difficult or even impossible to
obtain objective measures of certain key
outcomes.
Populated logic model
Program Logic Model and Chain of Events (category: Measures)
• Program Feature and Activity: Rap sheet information of appropriate scope, timeliness, accuracy, and ease of use available to the magistrate judge at the first court appearance/bond hearing
• Initial Outcomes: 1. Greater use of rap sheet information when setting bail/bond and conditions of release
• Intermediate Outcomes: 1. More appropriate conditions of release and establishment of bail/bond appropriate to both the arrest charges and the criminal history and past warrant information
• Intermediate Outcomes II: 1. Fewer crimes committed by those awaiting trial; 2. Fewer failures to appear; 3. More timely disposition of criminal cases
• Final Outcomes Reached: 1. Enhanced justice process; 2. Positive influence on lessening the total number of crimes committed
Use scenario approach to reach agreement
and define performance
• Bring stakeholders together to reach consensus
on the desired state of integration
• Define the current state of integration (baseline)
• Quantify gap between current state and desired
state
• Define desired outcomes
• Develop objectives and performance measures
that can be linked to desired outcomes
Stakeholders must agree on
performance measures in advance
• Perceived performance is an agreed-upon
construct
• Criteria for defining performance should be
negotiated by stakeholders (and governing body)
prior to developing measures
• Stakeholders will value outcomes differently
depending on their role within (or relative to) the
justice enterprise
Characteristics of good measures
• Measures link back to goals, objectives and mission
statements
• Measures drive the right behavior from employees,
partners and consultants
• Collecting data on measures is feasible and cost effective
• Measures are understood by most employees
• Measures lend themselves to being tracked on an
ongoing basis so that drops in performance can be
detected when there is time to do something about it.
• Measures represent aspects of performance that we can
actually change
Performance measurement caveats
• Most people (including your employees and
consultants) can learn to make measures come out the
way they think you want them to, without actually
improving a process
• Always question the measures you’ve defined, keeping
in mind that the people applying them could find ways
of boosting the measures without really improving
anything
• Test each measure to determine whether it operates as expected: does it always move one way when things get better and the other way when things get worse? (A minimal test sketch follows.)
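The directional test in the last bullet can be automated with synthetic scenarios. A minimal sketch, assuming a hypothetical “average days to disposition” measure (the caseload data is invented):

```python
# Directional test for a candidate measure: feed it two synthetic caseloads
# where we already know which one is better, and check it moves the right way.
def avg_days_to_disposition(cases):
    return sum(c["days"] for c in cases) / len(cases)

better_caseload = [{"days": d} for d in (30, 45, 50)]   # known-good scenario
worse_caseload = [{"days": d} for d in (90, 120, 150)]  # known-bad scenario

# Lower is better here, so the better caseload must score lower than the worse.
assert avg_days_to_disposition(better_caseload) < avg_days_to_disposition(worse_caseload)
print("measure moves in the expected direction")
```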
The Russian Nail
• Manipulating a single metric allowed Soviet managers to appear successful even though their efforts did not always lead to expected outcomes.
• Success was typically measured by singular metrics of gross output, such as weight, quantity, square feet, or surface area. Gross output indicators played havoc with assortments, sizes, quality, etc., and frequently resulted in products like Khrushchev’s chandeliers, so heavy “that they pull the ceilings down on our heads.”
• A famous Soviet cartoon depicted the manager of a nail factory being given the Order of Lenin for exceeding his tonnage. Two giant cranes were pictured holding up one giant nail.
—Paul Craig Roberts, “My Time with Soviet Economics,” The Independent Review, v. VII, n. 2 (Fall 2002), pp. 259–264
Behavior driven the wrong way
• The Soviet Union wasted billions searching for oil because it rewarded drilling crews on the basis of the number of feet drilled. Because it is easier to drill many shallow wells than a few deep ones, drillers sank lots of shallow wells, regardless of what was advisable geologically.
• A 1983 Chicago Sun-Times article reported that a Soviet hospital had turned away a seriously ill patient because “they were nearing their yearly quota for patient deaths—and would be criticized by authorities if they exceeded it.”
Family of related measures
• Produce x widgets per hour
• Produce x widgets per hour without exceeding y dollars
• Produce x widgets per hour without exceeding y dollars with only one full-time employee
• Produce x widgets per hour without exceeding y dollars with only one full-time employee and generating z units of waste
• Produce x widgets per hour without exceeding y dollars with only one full-time employee and generating z units of waste and at a zero defect rate
• Produce x widgets per hour without exceeding y dollars with only one full-time employee and generating z units of waste and at a zero defect rate and without contributing to global warming
The “widget” family
• Number produced within a specified time period
• Cost of producing widgets
• People required
• Waste generated
• Defect rate
• CO2 produced
(A sketch treating this family as a single record, with a combined improvement check, follows.)
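One way to read the widget family is as a single multi-dimensional record, where “improvement” means output rises while no other dimension regresses. A minimal sketch with illustrative field names and values:

```python
from dataclasses import dataclass

@dataclass
class WidgetMeasures:
    widgets_per_hour: float  # output
    cost_dollars: float      # spend
    staff: int               # people required
    waste_units: float       # waste generated
    defect_rate: float       # quality
    co2_kg: float            # CO2 produced

def genuine_improvement(before: WidgetMeasures, after: WidgetMeasures) -> bool:
    # Output must rise while every other dimension holds or improves;
    # gaming a single metric (the "Russian nail") fails this check.
    return (after.widgets_per_hour > before.widgets_per_hour
            and after.cost_dollars <= before.cost_dollars
            and after.staff <= before.staff
            and after.waste_units <= before.waste_units
            and after.defect_rate <= before.defect_rate
            and after.co2_kg <= before.co2_kg)

before = WidgetMeasures(100, 500.0, 1, 10.0, 0.02, 40.0)
after = WidgetMeasures(120, 500.0, 1, 9.0, 0.0, 38.0)
print(genuine_improvement(before, after))  # True
```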
Specific Justice Example
Legislatively imposed measure
Number of calls from users requesting assistance (a lower number indicates superior performance)
Replacement multi-dimensional measure
Measure: Length of time to resolve a call for
service and the quality of service call resolution
as measured by the following two dimensions:
1. Average time from the opening of a service ticket to the
closing of a service ticket. JID will also report the median
and standard deviation with the average.
2. The quality of service as measured by regular user surveys designed to gauge the quality of the service provided to the caller. Survey respondents are selected randomly. (A computation sketch for dimension 1 follows.)
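Here is a minimal sketch of dimension 1, computing the average, median, and standard deviation of ticket resolution times; the timestamps are made up for illustration:

```python
import statistics
from datetime import datetime

# Hypothetical service tickets as (opened, closed) timestamp pairs.
tickets = [
    (datetime(2007, 6, 1, 9, 0), datetime(2007, 6, 1, 11, 30)),
    (datetime(2007, 6, 1, 10, 0), datetime(2007, 6, 2, 10, 0)),
    (datetime(2007, 6, 2, 8, 0), datetime(2007, 6, 2, 9, 0)),
]

# Resolution time in hours for each ticket.
hours = [(closed - opened).total_seconds() / 3600 for opened, closed in tickets]

print(f"average: {statistics.mean(hours):.1f} hours")   # reported together with
print(f"median: {statistics.median(hours):.1f} hours")  # the median and
print(f"std dev: {statistics.stdev(hours):.1f} hours")  # standard deviation
```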
Strategic Goal 3: Identify and recommend cost-effective biometric identification applications
Objective 3.1:
By September 2004, research, identify, and recommend technological
applications that support biometrics for rapid identification.
Objective 3.2:
By September 2004, research, identify, and evaluate the costs and benefits of
biometric identification applications.
Outcomes:
• Increased knowledge of biometric technologies
• Improved cost-effective biometric identification solutions
Performance Measures:
• Number of research projects on biometric technological solutions completed
by September 2004
• Number of research projects on costs and benefits of biometrics completed by
September 2004
• Number of research reports presented to the Governing Body
Justice performance measures
• Average law enforcement response time to calls for
service for incidents involving a threat to citizen safety
• Percent of arrest records in the state repository with final dispositions (see the sketch after this list)
• Number of automated information exchanges within
and between criminal justice agencies
• Number of crimes cleared using AFIS system(s)
• Number of arrests made of wanted individuals
resulting from the use of electronically available
warrant and detainer information
• Number of electronic checks of justice databases
performed to identify high risk individuals
• Average time from arrest to final case disposition for felony arrests
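The disposition-matching measure above reduces to a simple ratio. A minimal sketch, with assumed record fields rather than a real repository schema:

```python
# Percent of arrest records in the repository that carry a final disposition.
# The records and field names here are illustrative only.
arrest_records = [
    {"arrest_id": 1, "final_disposition": "convicted"},
    {"arrest_id": 2, "final_disposition": None},  # disposition never reported
    {"arrest_id": 3, "final_disposition": "dismissed"},
    {"arrest_id": 4, "final_disposition": "acquitted"},
]

matched = sum(1 for r in arrest_records if r["final_disposition"] is not None)
pct = 100.0 * matched / len(arrest_records)
print(f"{pct:.1f}% of arrest records have a final disposition")  # 75.0%
```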
What makes a performance measure
effective?
• First and foremost, to be effective a measure must
be an incentive to a person or group of persons to
change behavior in such a way that things really
improve.
• A performance measure should provide feedback
to a person or group of persons. Without feedback
no information is available on whether the target
implied by the measure is being met.
• A performance measure (or family of measures)
should be precise and comprehensive so as to
prevent the possibility of the measure being met
without actually leading to expected outcomes.
Three-legged Stool
• Strategic Planning
• Project Management
• Performance Management
The role of project plans
• Project plans can augment a performance plan by
ensuring that outputs are completed on time and
on budget
• Rigorous project management can ensure that
tasks are actually performed before they are
measured.
• Project planning, along with strategic planning, is
an essential adjunct to any performance
management program.
There’s more to management
than measurement
If you can’t measure it,
you can’t manage it.
—Peter Drucker
Drucker’s saying has convinced some managers that measurement is management, which is a bit of an overstatement; measurement is, however, one of the most powerful tools in the management toolbox.
Final points
1. If you don’t monitor your performance, it will probably get worse.
2. You can’t devise performance measures in a vacuum; you must involve stakeholders and measure what’s valued.
3. Don’t devise measures for which you lack data.
4. Performance measurement can be expensive and time consuming, so don’t bother unless you intend to use the results to provide ongoing process feedback.
5. Errors in devising measures will lead to unexpected consequences.
So inscrutable is the arrangement of
causes and consequences in this world
that a two-penny duty on tea, unjustly
imposed in a sequestered part of it,
changes the condition of its inhabitants.
—Thomas Jefferson
Performance Measurement
BJA’s Perspective
Michael Dever
Policy Advisor
Purposes: Performance Measures
• Linking people and dollars to performance
• Linking programs and resources to results
• Justification of continued funding
• Learning and management tools for us, for you
What Does BJA Do With the Data?
• GPRA: Government Performance and
Results Act
• PART: Program Assessment Rating Tool
(www.expectmore.gov)
• Budget formulation
• MD&A: Management Discussion and
Analysis
How You’ll Report Performance
Measures
• Via the semi-annual progress report submitted electronically via GMS (Grant Management System), due January 30 and July 30
• Report only on grant-funded activities during the specified reporting period
• Progress reports will not be accepted without complete data
Resources
• Will Artley, DJ Ellison, and Bill Kennedy, The Performance-Based Management Handbook, Volume 1: Establishing and Maintaining a Performance-Based Management Program (Washington, DC: U.S. Department of Energy, 2001), at http://www.orau.gov/pbm/pbmhandbook/pbmhandbook.html
• John E. Eck, Assessing Responses to Problems: An Introductory Guide for Police Problem-Solvers (Washington, DC: Center for Problem-Oriented Policing, no date), at http://www.popcenter.org/Tools/tool-assessing.htm
• Michael Geerken, The Art of Performance Measurement for Criminal Justice Information System Projects (Washington, DC: U.S. Department of Justice, Bureau of Justice Assistance, 2006 [forthcoming])
• Robert H. Langworthy (ed.), Measuring What Matters: Proceedings from the Policing Research Institute Meetings (Washington, DC: NIJ/COPS, July 1999, NCJ 170610), pp. 37–53
• David J. Roberts, Law Enforcement Tech Guide: Creating Performance Measures that Work! A Guide for Law Enforcement Executives and Managers (Washington, DC: U.S. Department of Justice, Office of Community Oriented Policing Services, 2006), at http://www.cops.usdoj.gov/mime/open.pdf?Item=1968