Results and Performance Accountability
Results Based Accountability
Basics
A One Day Presentation
Standard Training Slides Sponsored by the Ministry of
Social Development
Results Based Accountability
The Fiscal Policy Studies Institute
Santa Fe, New Mexico
Websites
raguide.org
resultsaccountability.com
Book - DVD Orders
sheapita.co.nz
amazon.com
resultsleadership.org
How could RBA add value to you?
SIMPLE STEPS
COMMON SENSE
PLAIN LANGUAGE
MINIMUM PAPER
TALK TO ACTION!
Results Based Accountability
is made up of two parts:
Population Accountability
about the wellbeing of
WHOLE POPULATIONS
For Communities – Cities – Districts – Countries
E.g. All Rangatahi/Youth in Te Tai Tokerau, All Migrants in Nelson
Performance Accountability
about the wellbeing of
CLIENT GROUPS/CUSTOMERS
For Teams - Providers – Programmes - Agencies – Service Systems
E.g. Clients of Services, Collectives, Ministries or the Health System
The Language Trap
Too many terms. Too few definitions. Too little discipline
Terms: Benchmark, Outcome, Result, Indicator, Measure, Goal, Objective, Target
Modifiers: Measurable, Core, Urgent, Qualitative, Priority, Programmatic, Targeted, Performance, Incremental, Strategic, Systemic
Lewis Carroll Center for Language Disorders
(Word cloud: "Your made-up jargon here", surrounded by modifiers such as measurable, qualitative, systemic, urgent, strategic, core, objectives and indicators.)
Results Based Accountability
COMMON LANGUAGE
COMMON SENSE
COMMON GROUND
Definitions: Population and Performance
• RESULT / OUTCOME
– A condition of wellbeing for children, adults, families or communities
E.g. All Tamariki in Hamilton are Born Healthy; Safe Roads; Nurturing Whānau/Families; A Prosperous Economy
• INDICATOR / BENCHMARK
– A measure which helps quantify the achievement of a result.
Rate of low-birth weight babies, Rate of road crashes,
Rate of child abuse and neglect, Unemployment rate
• PERFORMANCE MEASURE
– A measure of whether a programme, agency or service system is working. Three types:
1. How much did we do?
2. How well did we do it?
3. Is anyone better off? = Client Results / Outcomes
Translation Guide/Rosetta Stone
Not the Language Police
Ideas and group labels: the same idea (1. a condition of well-being for children, adults, families and communities) may be called a RESULT by Group 1, an OUTCOME by Group 2 and a GOAL by Group 3, and so on for ideas 2, 3, etc. Translation means going from each group's label back to the shared idea.
From Ends to Means
Performance
Population
From Talk to Action
ENDS: Result / Outcome; Indicator / Benchmark
MEANS: Performance Measure
At the service level: client / customer results = ends; service delivery = means
Results – Indicators – Performance Measures in Māori, Fijian and Tuvaluan
Result, Indicator, Strategy or Performance
Measure?
Result 1. A Safe Community
Indicator 2. Percentage of Total Recorded Offences
Perf Measure 3. Average Police response time
Result 4. An Educated Workforce
Strategy 5. Installing street lights to make people feel safe
Result 6. People have living wage jobs and income
Indicator 7. % of people with living wage jobs and income
Perf Measure 8. % of participants in job training programme who
get living wage jobs
Key RBA concepts
• 2 key types of accountability and language discipline:
– Population accountability - results / outcomes and
indicators
– Performance accountability - performance measures
• 3 types of performance measures:
– How much did we do?
– How well did we do it?
– Is anyone better off?
• 7 questions from ends to means:
– baselines and turning the curve – to make life better for our
families / whānau, children / tamariki, and communities.
Population Accountability
For whole populations in a geographic area
Mark Friedman (author)
www.resultsaccountability.com
www.raguide.org
The 7 Population Accountability
Questions
1. What are the quality of life conditions we want for the children, adults and families who live in our community? (Population & Results)
2. What would these conditions look like if we could see them? (Experience)
3. How can we measure these conditions? (Population Indicators)
4. How are we doing on the most important of these measures? (Baseline Data and Story)
5. Who are the partners that have a role to play in doing better? (Partners)
6. What works to do better, including no-cost and low-cost ideas? (What works)
7. What do we propose to do? (Action Plan)
Christchurch City Community Outcomes
• A safe city
• A city of inclusive and diverse communities
• A city of people who value and protect the natural
environment
• A well governed city
• A prosperous city
• A healthy city
• A city of recreation, fun and creativity
• A city of lifelong learning
• A city that is attractive and well designed
Kotahitanga Whānau Ora Collective
positive statements - positive focus
• All whaanau in Te Puuaha ki Manuka
(Greater South Auckland) are ...
– Mana Ora: Healthy and safe
– Mana Motuhake: Economically secure
– Mana Tangata: Culturally confident
– Mana Rangatiratanga: Knowledgeable and skilled
– Mana Whānau: Connected, engaged and entrepreneurs
Acknowledgement: Kotahitanga Collective
Members: Turuki Healthcare Trust, Papakura Marae,
Huakina Development Trust and Te Kaha O Te
Rangatahi Trust, South Auckland, New Zealand.
Aranui Community Trust
Acknowledgement:
Aranui Community
Trust Inc Society
(www.actis.org.nz)
Implementing RBA Aranui
Result areas and indicators
Result Area 1:
A community that is spiritually and socially strong
Indicator:
% of police callouts for family violence
Result Area 2:
A community full of knowledge and learning
Indicator:
% of students with NCEA level 1 @year 11
% primary school students performing at national
average for literacy and numeracy
Result Area 3:
A great physical environment
Indicator:
Expenditure on repairs and maintenance to city
property in the Aranui Burwood Pegasus area
% of $ R & M that is due to damage
Result Area 4:
People who know and fit in Aranui
Indicator:
% of people who offer to participate in local events
Result Area 5:
A community that is healthy
Indicator:
% of Aranui residents presenting at Accident and
Emergency with no trauma needs/ concerns
# of total acute inpatient admissions
Results for Children, Families and
Communities
A Working List
Healthy Births
Healthy Children and Adults
Children Ready for School
Children Succeeding in School
Young People Staying Out of Trouble
Stable Families
Families with Adequate Income
Safe and Supportive Communities
Georgia Policy Council for Children and
Families
RESULTS
Healthy children
Children ready for school
Children succeeding in school
Strong families
Self-sufficient families
Tip for Drafting Population Accountability
Results
All [insert your Population] in [insert your Geographic area] are [insert your Condition of Wellbeing]
E.g. All Families in Tauranga are Economically Secure
Examples of Means not Ends
1. COLLABORATION
2. SYSTEMS REFORM
3. SERVICE INTEGRATION
4. DEVOLUTION
5. FUNDING POOLS
Leaking Roof
(Results thinking in everyday life)
Experience: not OK
Measure: inches of water
Forecast: fixed?
Turning the Curve
Story behind the baseline (causes):
Partners:
What Works:
Action Plan:
3 criteria for choosing Indicators
Communication Power
Does the indicator communicate to a broad range of audiences?
Proxy Power
Does the indicator say something of central importance about the result?
Does the indicator bring along the data HERD?
Data Power
Quality data available on a timely basis.
Choosing Indicators
Worksheet
Outcome or Result: e.g. A Safe Community
For each candidate indicator (Measure 1, Measure 2, ... Measure 8), rate its Communication Power, Proxy Power and Data Power as High, Medium or Low (H / M / L). Candidates that rate high on communication and proxy power but low on data power go on the Data Development Agenda.
Three Part Indicator List for each Result
Part 1: Primary Indicators
• 3 to 5 “Headline” Indicators
• What this result “means” to the community
• Meets the Public Square Test
Part 2: Secondary Indicators
• Everything else that’s any good
(Nothing is wasted.)
• Used later in the Story behind the Curve
Part 3: Data Development Agenda
• New data
• Data in need of repair (quality, timeliness etc.)
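To make the selection mechanics above concrete, here is a minimal sketch in Python (not part of the RBA materials; the candidate indicators, H/M/L ratings and the scoring rule are invented for illustration) that sorts candidates into primary indicators, secondary indicators and a Data Development Agenda.

```python
# Illustrative sketch only: sort candidate indicators into the three-part list.
# Candidate names, H/M/L ratings and the selection rule are assumptions.

RANK = {"H": 3, "M": 2, "L": 1}

candidates = [
    # (indicator, communication power, proxy power, data power)
    ("Rate of recorded offences",         "H", "H", "H"),
    ("% residents feeling safe at night", "H", "M", "L"),
    ("Average police response time",      "M", "M", "H"),
    ("Rate of repeat family violence",    "M", "H", "L"),
]

primary, secondary, data_development_agenda = [], [], []
for name, comm, proxy, data in candidates:
    if data == "L":
        # good idea, but the data is new or needs repair
        data_development_agenda.append(name)
    elif RANK[comm] + RANK[proxy] + RANK[data] >= 8:
        primary.append(name)      # 3 to 5 "headline" indicators
    else:
        secondary.append(name)    # everything else that's any good

print("Primary (headline):", primary)
print("Secondary:", secondary)
print("Data Development Agenda:", data_development_agenda)
```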
What do we mean by a baseline?
(Baseline chart: vertical scale H / M / L; history to the left, forecast to the right. Is the forecast OK or not OK? Turning the curve means doing better than the forecast, rather than making point-to-point comparisons.)
Baselines have two parts: history and forecast
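A minimal sketch of the baseline idea, assuming invented data and a simple straight-line forecast (neither comes from the RBA materials): build the baseline from history, extend it with a forecast, and check whether later actual values turn the curve.

```python
# Illustrative sketch only: baseline = history + forecast, plus a
# "turning the curve" check. The data and the simple trend forecast are assumed.

def forecast_linear(history, years_ahead):
    """Extend a series using the step between its last two points."""
    step = history[-1] - history[-2]
    return [history[-1] + step * (i + 1) for i in range(years_ahead)]

# History of an indicator where lower is better, e.g. road crashes per 10,000 people
history = [42.0, 44.5, 46.0, 47.8]              # past four years (invented)
baseline_forecast = forecast_linear(history, 3) # where we head if nothing changes

actuals = [47.0, 45.2, 43.9]                    # observed after the action plan (invented)

# Turning the curve = doing better than the baseline forecast
for year, (expected, actual) in enumerate(zip(baseline_forecast, actuals), start=1):
    turned = actual < expected
    print(f"Year +{year}: forecast {expected:.1f}, actual {actual:.1f}, "
          f"{'curve turned' if turned else 'not yet'}")
```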
Indicator Reports
Neighbourhood: Kruidenbuurt, Tilburg, Netherlands
City: Portsmouth, UK
Country: New Zealand
Other: MADD
Key RBA concepts
• 2 key types of accountability:
– Population – results / outcomes and indicators
– Performance – performance measures
• 3 types of performance measures:
– How Much Did We Do?
– How Well Did We Do It?
– Is Anyone Better Off?
• 7 questions that take you from talk to action
• From Ends to Means: Change our thinking and approach
- from what we do, to what we achieve
Population Accountability
QUICK EXERCISE
Tip for Drafting Population
Accountability Results
All [insert your Population] in [insert your Geographic area] are [insert your Condition of Wellbeing]
How would you experience this outcome? What would be different?
How would you measure success? What indicator would you use?
Performance Accountability
For clients of programmes, agencies, teams and
service systems
Mark Friedman (author)
www.resultsaccountability.com
www.raguide.org
Results Based Accountability
is made up of two parts:
Population Accountability
about the wellbeing of
WHOLE POPULATIONS
For Communities – Cities – Districts – Countries
E.g. All Rangatahi/Youth in Te Tai Tokerau, All Migrants in Nelson
Performance Accountability
about the wellbeing of
CLIENT GROUPS/CUSTOMERS
For Teams - Providers – Programmes - Agencies – Service Systems
E.g. Clients of Services, Collectives, Ministries or the Health System
The 7 Performance Accountability
Questions
1. Who are our clients? (Client Group/Customers)
2. How can we measure if our clients are better off? (Client/Customer Result / Outcome)
3. How can we measure if we are delivering services well? (Quality Measures)
4. How are we doing on the most important of these measures? (Baseline Data and Story)
5. Who are the partners that have a role to play in doing better? (Partners)
6. What works to do better, including no-cost and low-cost ideas? (Common sense ideas & research where available)
7. What do we propose to do? (Action Plan)
Performance Accountability
Getting from talk to action
Client Group/Customers
“All performance measures
that have ever existed
for any programme
in the history of the universe
involve answering two sets of
interlocking questions.”
Performance Measures
Quantity: How much did we do? (#)
Quality: How well did we do it? (%)
Performance Measures
Effort: How hard did we try?
Effect: Is anyone better off?

Crossing Effort/Effect with Quantity/Quality gives four kinds of measures:
Quantity of Effort (input): How much service did we deliver?
Quality of Effort: How well did we deliver it?
Quantity of Effect (output): How much change / effect did we produce?
Quality of Effect: What quality of change / effect did we produce?
Social Services Example
How much did we do? (Quantity of Effort): # of young people (clients) receiving job training / mentoring services
How well did we do it? (Quality of Effort): % of clients who complete the job training / mentoring programme
Is anyone better off? (Effect): # and % of clients who move off a working age benefit and into employment (at 6 months and at 12 months)
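As a rough illustration only, the quadrant numbers for an example like this can be computed from client-level records. The sketch below uses invented records and assumed field names (completed_training, in_employment_6m); it is not an implementation of any particular service's reporting.

```python
# Illustrative sketch only: compute the quadrant measures for the job
# training / mentoring example from invented client-level records.

clients = [
    {"completed_training": True,  "in_employment_6m": True},
    {"completed_training": True,  "in_employment_6m": False},
    {"completed_training": False, "in_employment_6m": False},
    {"completed_training": True,  "in_employment_6m": True},
]

how_much = len(clients)                                     # # clients receiving services
completed = sum(c["completed_training"] for c in clients)
how_well = 100 * completed / how_much                       # % completing the programme
better_off_n = sum(c["in_employment_6m"] for c in clients)  # # in employment at 6 months
better_off_pct = 100 * better_off_n / how_much              # % in employment at 6 months

print(f"How much did we do?    {how_much} clients received services")
print(f"How well did we do it? {how_well:.0f}% completed the programme")
print(f"Is anyone better off?  {better_off_n} clients ({better_off_pct:.0f}%) in employment at 6 months")
```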
Education Example
How much did we do?: Number of students
How well did we do it?: Student-teacher ratio
Is anyone better off?: Number of graduates; Percent of graduates
Drug/Alcohol Treatment Programme
How much did we do?: Number of persons treated
How well did we do it?: Percent of staff with training/certification
Is anyone better off?: Number and percent of clients off alcohol & drugs (at exit; 12 months after exit)
Education Example #2
How much did we do?: Number of students
How well did we do it?: Student-teacher ratio
Is anyone better off?: Number and percent of secondary school students who graduate on time and enter Uni or employment after graduation
Primary Care Practice
How much did we do?: Number of children aged 0-2 enrolled
How well did we do it?: Percent of children who did not attend
Is anyone better off?: # and % of children aged 8 months immunised (in the practice)
Not all performance measures are created equal
How much did we do?: Least Important
How well did we do it?: Also Very Important
Is anyone better off?: Most Important
The matter of control
How much did we do? / How well did we do it? (Effort): Most Control
Is anyone better off? (Effect): Least Control
PARTNERSHIPS
Separating the Wheat from the Chaff
Types of performance measures found in each quadrant
How much did we do?
  # Clients/customers served
  # Activities (by type of activity)
How well did we do it?
  % Common measures, e.g. client-staff ratio, workload ratio, staff turnover rate, staff morale, % staff fully trained, % clients seen in their own language, worker safety, unit cost
  % Activity-specific measures, e.g. % timely, % clients completing activity, % correct and complete, % meeting standard
Is anyone better off? (# and %, point in time vs. point-to-point improvement)
  Skills / Knowledge (e.g. parenting skills)
  Attitude / Opinion (e.g. toward drugs)
  Behaviour (e.g. school attendance)
  Circumstance (e.g. working, in stable housing)
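A small sketch of how these quadrants can be held as a data structure; the example measures and the helper mapping are assumptions for illustration, not part of RBA.

```python
# Illustrative sketch only: place example performance measures into the
# four RBA quadrants. The measures and this helper are assumptions.

from collections import defaultdict

QUADRANT = {
    ("effort", "quantity"): "How much did we do?",
    ("effort", "quality"):  "How well did we do it?",
    ("effect", "quantity"): "Is anyone better off? (#)",
    ("effect", "quality"):  "Is anyone better off? (%)",
}

measures = [
    ("# clients served",                    "effort", "quantity"),
    ("% staff fully trained",               "effort", "quality"),
    ("# clients in stable housing at exit", "effect", "quantity"),
    ("% clients in stable housing at exit", "effect", "quality"),
]

by_quadrant = defaultdict(list)
for name, dimension, kind in measures:
    by_quadrant[QUADRANT[(dimension, kind)]].append(name)

for label in QUADRANT.values():
    print(label)
    for name in by_quadrant[label]:
        print("  -", name)
```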
Choosing Headline Measures and the Data Development Agenda
List candidate measures in each quadrant: How much did we do? (Measures 1-7), How well did we do it? (Measures 8-14), Is anyone better off? (Measures 15-21, each as both # and %).
From the full list, choose your headline measures (#1, #2, #3 Headline): the most important measures for which you have good data. Measures that matter but lack good data go on the Data Development Agenda (#1, #2, #3 DDA).
The matter of use
1. The first purpose of performance
measurement is to
improve performance.
2. Avoid the performance measurement
equals punishment trap.
● Create a healthy organisational environment
● Start small
● Build bottom-up and top-down simultaneously
RBA categories account for all performance measures
(in the history of the universe)
Quantity / Quality of Effort: TQM, Cost, Process, Input; Efficiency, Admin overhead, Unit cost; Staffing ratios, Staff turnover, Staff morale, Access, Waiting time, Waiting lists, Worker safety
Client Satisfaction (quality service delivery & client benefit)
Effect: Product, Output, Impact; Benefit value, Cost / Benefit ratio, Return on investment; Client results / outcomes, Effectiveness, Value added, Productivity
The same crosswalk, with one addition: the world's simplest complete client satisfaction survey.
1. Did we treat you well?
2. Did we help you with your problems?
Comparing Performance
1. To Ourselves First: Can we do better than our own history? (Use a baseline: the chart on the wall.)
2. To Others: only when it is a fair apples/apples comparison.
3. To Standards: only when we know what good performance is.
Reward? Punish?
The matter of standards
1. Quality of Effort standards are sometimes WELL ESTABLISHED, e.g. childcare staffing ratios, application processing time, handicap accessibility, child abuse response time.
BUT
2. Quality of Effect standards are almost always EXPERIMENTAL, e.g. hospital recovery rates, employment placement and retention rates, recidivism rates.
AND
3. Both require a LEVEL PLAYING FIELD and an ESTABLISHED RECORD of what good performance is.
Advanced Baseline Display
(Chart: your baseline and a comparison baseline, with a goal line and a target or standard marked.)
Create targets only when they are FAIR & USEFUL.
Avoid publicly declaring targets by year if possible. Instead: count anything better than baseline as progress.
Key RBA concepts
• 2 key types of accountability:
– Population – results / outcomes and indicators
– Performance – performance measures
• 3 types of performance measures:
– How Much Did We Do?
– How Well Did We Do It?
– Is Anyone Better Off?
• 7 questions that take you from talk to action
• From Ends to Means: Change our thinking and approach
- from what we do, to what we achieve
Performance Accountability
QUICK EXERCISE
Performance Measures for my ... (insert the name of your Programme or Service here)
How much did we do?: # clients/customers served (Who are your clients?)
How well did we do it?: % client satisfaction with xxx (What would you put in here?)
Is anyone better off? Choose one:
  # / % Skills / Knowledge (e.g. parenting skills)
  # / % Attitude / Opinion (e.g. toward drugs)
  # / % Behaviour (e.g. school attendance)
  # / % Circumstance (e.g. working, in stable housing)
How Population & Performance
Accountabilities Fit Together
THE LINKAGE Between POPULATION and PERFORMANCE

POPULATION ACCOUNTABILITY
Population result: Healthy Safe Young People
Indicator: youth crime rates

Contribution relationship, alignment of measures, appropriate responsibility

PERFORMANCE ACCOUNTABILITY
Mentoring Programme for Young Offenders
How much: # young people on programme
How well: % meeting weekly with mentor
Client results/outcomes: # reoffending, % reoffending
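One possible way to hold this linkage in data, sketched below; the dictionary layout and field names are assumptions for illustration, not an RBA specification.

```python
# Illustrative sketch only: linking a population result and indicator to a
# contributing programme's measures. The layout and names are assumptions.

population = {
    "result": "Healthy, safe young people",
    "indicators": ["Youth crime rate"],
}

programmes = [
    {
        "name": "Mentoring programme for young offenders",
        "how_much": ["# young people on programme"],
        "how_well": ["% meeting weekly with mentor"],
        "better_off": ["# reoffending", "% reoffending"],  # aligned with the youth crime rate
    },
]

for prog in programmes:
    print(f"{prog['name']} contributes to: {population['result']}")
    print("  Population indicator(s):", ", ".join(population["indicators"]))
    print("  Aligned client result measures:", ", ".join(prog["better_off"]))
```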
Every time you make a presentation, use a two-part approach.

Part 1: Population Accountability to which you contribute most directly
Result:
Indicators:
Story:
Partners:
What would it take?:
Your role: as part of a larger strategy.

Part 2: Performance Accountability
Programme:
Performance measures:
Story:
Partners:
Action plan to get better:
Different kinds of progress
1. Data
a. Population indicators:
Reporting on curves turned: % increase or decrease of the graphed data (e.g. the
baseline).
b. Performance measures:
Client group progress and improved service delivery:
How much did we do?
How well did we do it?
Is anyone better off? E.g. Skills/Knowledge, Attitude/Opinion, Behaviour Change,
Circumstance Change
2. Accomplishments
Other positive activities accomplished, not included above.
3. Stories
Real stories that sit behind the statistics that show how individuals are better off e.g.
case studies, vignettes, social media clips.
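Reporting on curves turned, as in item 1a above, often reduces to a simple percentage change across the graphed period. A minimal sketch with invented numbers:

```python
# Illustrative sketch only: % increase or decrease of graphed indicator data.
series = [120, 118, 113, 109]   # e.g. family violence callouts per year (invented)

change_pct = 100 * (series[-1] - series[0]) / series[0]
direction = "decrease" if change_pct < 0 else "increase"
print(f"{abs(change_pct):.1f}% {direction} over the period")
```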
What’s next?
A Basic Action Plan for Results Based Accountability
TRACK 1: POPULATION ACCOUNTABILITY
• Establish results
• Establish indicators, baselines and charts on the wall
• Create a result card
• Set tables (action groups) to turn curves
TRACK 2: PERFORMANCE ACCOUNTABILITY
• Performance measures, and charts on the wall for
programmes, agencies and service systems
• Use 7 Questions manager by manager, and programme by
programme, in management, budgeting and strategic
planning
IN CLOSING
Kia ora / thank you!
WEBSITES:
www.raguide.org
www.resultsaccountability.com
BOOK /DVD ORDERS:
www.sheapita.co.nz
www.trafford.com
www.amazon.com
Exercises
Fiscal Policy Studies Institute
Santa Fe, New Mexico
www.resultsaccountability.com
www.raguide.org
3 Options
1. Turn the Curve: Population Accountability
2. Turn the Curve: Performance Accountability
3. Develop a Performance Measure Framework for a programme
or service (the 4 Quadrants)
Turn the Curve Exercise: Population Results
5 min: Starting Points
- timekeeper and reporter
- geographic area
- two hats (yours plus partner’s)
10 min: Baseline
- pick a result and a curve to turn
- forecast – OK or not OK?
15 min: Story behind the baseline
- causes/forces at work
- information & research agenda part 1 - causes
15 min: What works? (What would it take?)
(Two pointers to action)
- what could work to do better?
- each partner's contribution
- no-cost / low-cost ideas
- information & research agenda part 2 – what works
10 min: Report: convert notes to one page
ONE PAGE Turn the Curve Report: Population
Result: _______________
Indicator (lay definition):
Indicator baseline:
Story behind the baseline (list as many causes as needed):
Partners (list as many as needed):
Three Best Ideas – What Works:
1. -------------- 2. -------------- 3. --------- No-cost / low-cost 4. --------- Off the Wall
(Sharp Edges)
Turn the Curve Exercise: Programme Performance
5 min: Starting Points
- timekeeper and reporter
- identify a Programme to work on
- two hats (yours plus partner’s)
10 min: Performance measure baseline
- choose 1 measure to work on – from the lower right quadrant
- forecast – OK or not OK?
15 min: Story behind the baseline
- causes/forces at work
- information & research agenda part 1 - causes
15 min: What works? (What would it take?)
(Two pointers to action)
- what could work to do better?
- each partner's contribution
- no-cost / low-cost ideas
- information & research agenda part 2 – what works
10 min: Report: convert notes to one page
ONE PAGE Turn the Curve Report: Performance
Programme: _______________
Performance measure (lay definition):
Performance measure baseline:
Story behind the baseline (list as many causes as needed):
Partners (list as many as needed):
Three Best Ideas – What Works:
1. -------------- 2. -------------- 3. --------- No-cost / low-cost 4. --------- Off the Wall
(Sharp Edges)
Develop a Performance Measure Quadrant
1. Decide on a service or programme that you would like to create a set of performance measures for.
2. Draw the four quadrants on a flip chart.
3. Brainstorm with your group the range of measures in each quadrant.
4. Work with the group to identify which measures you currently have data for.
5. Circle these and create the list of your most 'vital few' (e.g. the 3-5 most important measures from your perspective).
6. Put these on to a clean piece of flip chart paper and you now have your performance measures for your service or programme.
OTHER HELPFUL SLIDES
The first step in performance accountability is to
DRAW A FENCE
Around something that has
ORGANISATIONAL OR FUNCTIONAL IDENTITY
(Diagram: the whole organisation, Divisions A, B and C, and units within a division; the fence can be drawn around any of these.)
What Kind of PERFORMANCE MEASURE?
● # of people served: Upper Left (How much did we do?)
● % participants who got jobs: Lower Right (Is anyone better off? %)
● staff turnover rate: Upper Right (How well did we do it?)
● # participants who got jobs: Lower Left (Is anyone better off? #)
● % of children reading at grade level: Lower Right
● cost per unit of service: Upper Right
● # applications processed: Upper Left
● % patients who fully recover: Lower Right
Bridging across all professions and partners
RESULT: 1. A condition of well-being for children, adults, families and communities
INDICATOR: 2. A measure that helps quantify the achievement of a result
PERF. MEASURE: 3. A measure of how well a programme, agency or service system is working
HOW MUCH DID WE DO?: 4. A measure of the quantity of effort, how much service was delivered
HOW WELL DID WE DO IT?: 5. A measure of the quality of effort, how well the service was delivered, how well the functions were performed
IS ANYONE BETTER OFF?: 6. A measure of the quantity and quality of effect on clients' (customers') lives
BASELINE: 7. A visual display of the history and forecast(s) for a measure
TURNING THE CURVE: 8. Doing better than the baseline forecast
The cost of bad results
The costs of remediating problems after they occur.
(Chart: cost vs revenue over time, United States 1970 to 2010; labels include Convergence of Cost & Revenue, $300 billion, 2008, and an Investment Track.)
Invest in prevention to reduce or avoid out-year costs.
The business case for investment in prevention.
All data have two incarnations: a lay definition and a technical definition.
Lay definition: University Graduation Rate
Possible technical definitions:
- % enrolled 01 June who graduate 15 June
- % enrolled 30 Sept who graduate 15 June
- % of enrolled Bachelors students who graduate by date x
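A small sketch of the lay/technical split, assuming one of the technical definitions above (enrolled by 30 September, graduated by 15 June) and invented student records:

```python
# Illustrative sketch only: one lay measure, one assumed technical definition.
# Graduation rate = % of students enrolled by 30 Sept who graduate by 15 June.
from datetime import date

students = [
    {"enrolled": date(2023, 9, 12), "graduated": date(2024, 6, 10)},
    {"enrolled": date(2023, 9, 28), "graduated": None},
    {"enrolled": date(2023, 10, 5), "graduated": date(2024, 6, 14)},  # enrolled too late to count
]

cohort = [s for s in students if s["enrolled"] <= date(2023, 9, 30)]
graduates = [s for s in cohort if s["graduated"] and s["graduated"] <= date(2024, 6, 15)]

rate = 100 * len(graduates) / len(cohort)
print(f"Graduation rate (technical definition): {rate:.0f}%")
```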
Select 3 to 5 Performance Measures at each level of the organisation.
Be disciplined about what's most important. Don't get distracted. ("Get over it!")
Pick the 3 to 5 most important of the 9 to 15 measures, or create composites.
Framework Crosswalk Analysis
(Example framework to crosswalk against: a Logic Model: Goal, Outcome, Output, Activity, Input.)

Population Results (for population well-being, across communities, across systems)
1. Population
2. Results (Outcomes, Goals)
3. Indicators (Benchmarks); Data Development Agenda; Report Card
4. Baseline
5. Story behind the baseline; Cost of Bad Results; Research Agenda Part 1
6. Partners
7. What works; Research Agenda Part 2
8. Action Plan (strategy)
9. Funding Plan (budget)

Program Performance (for programs, agencies and service systems)
1. Customers (Clients)
2. Performance measures: Customer results, Quality of Effort, Quantity of Effort; Data Development Agenda
3. Baseline
4. Story behind the baseline; Research Agenda Part 1
5. Partners
6. What works: agency/program actions, partners' actions; Research Agenda Part 2
7. Action Plan (strategy)
8. Funding Plan