Transcript Slide 1
APPLYING PERFORMANCE BASED FUNDING
TO LITERACY AND ESSENTIAL SKILLS
BORIS PALAMETA, KAREN MYERS, NATALIE CONTE
January 16, 2013
Transition to a new economy, in which skills are the new
currency
Many jurisdictions moving towards integrated service delivery
models along an employment continuum
Interest in re-aligning incentives to improve outcomes for jobseekers, employers, and taxpayers
Can performance-based funding (PBF) drive system-wide
change?
What can we learn from the experiences of other
jurisdictions?
Today’s presentation draws on a State of Knowledge Review conducted in partnership with
Workplace Education Manitoba and funded by HRSDC’s Office of Literacy and Essential Skills
Literature review – Review of evidence on various PBF
models (Canada, US, UK, Australia)
Expert review – Interviews with PBF experts in other
jurisdictions
Consultations – Consultations with practitioners and
government officials in Manitoba and Nova Scotia
Expert panel – Canadian Economics Association
conference, June 2012
Throughout the process, our approach was guided by input from the project
reference group, which comprised officials from MB & NS.
A tool for allocating resources to service providers
based on measurable performance targets
Shifts the focus from inputs to outcomes
The assumption is that this shift will drive innovation in
service delivery and achieve desired long-term
outcomes
1. Design matters – PBF systems are complex and vary widely in design and effectiveness
2. Better design can mitigate risk – PBF risks generating unintended consequences, but ‘second-generation’ designs are more successful in mitigating these risks
3. Promising approaches – Establish meaningful links
between practice and performance by paying for client
progress along employment and learning pathways
Intermediate outcome milestones (“tipping points”) as
performance indicators
State of knowledge
1. PBF systems are complex and vary widely
SYSTEM GOALS
Policy objectives – examples: “work first” / job placement; human capital development; poverty reduction; productivity
Target population – examples: employment status; income status; work readiness; human capital; demographics
Outcomes of interest – process measures; client outcomes (immediate, short-term, longer-term)
Performance indicators

DESIGN OF INCENTIVE SYSTEM
Type of incentive – financial; non-financial (e.g. star ratings)
Scale of risk – % service-based payments vs. % outcome-based payments
Performance targets – benchmark attainment (x$ if x% of clients achieve outcome A); payment per outcome (y$ per each client achieving outcome A); see the sketch below
Payment weighting – by outcome; by client characteristics; by speed of placement
Adjustment for factors outside provider control – by local economic conditions; by client characteristics
Competition for incentive – payment based on absolute performance; payment based on relative performance

PROCUREMENT MODEL
Ranges from less competitive / non-market to open competition / quasi-market
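To make the two performance-target structures concrete, here is a minimal sketch; the client counts, threshold, and dollar figures are invented for illustration and do not come from the review:

```python
# Illustrative contrast between the two performance-target structures.
# All figures are hypothetical.

clients = 200             # clients served in the funding period
achieved = 84             # clients achieving outcome A

# Benchmark attainment: a lump sum x$ is paid only if at least x% of
# clients achieve outcome A (an all-or-nothing threshold).
benchmark_rate = 0.40     # the x% threshold
benchmark_award = 50_000  # the x$ lump-sum payment
attainment_pay = benchmark_award if achieved / clients >= benchmark_rate else 0

# Payment per outcome: y$ is paid for each client achieving outcome A,
# so revenue scales smoothly with performance instead of jumping at a cliff.
per_outcome_rate = 600    # the y$ rate per client
per_outcome_pay = per_outcome_rate * achieved

print(f"Benchmark attainment: ${attainment_pay:,}")   # $50,000 (84/200 = 42% >= 40%)
print(f"Payment per outcome:  ${per_outcome_pay:,}")  # $50,400
```

The cliff in the first structure is what invites threshold gaming: a provider at 39% earns nothing while one at 41% earns the full award, whereas per-outcome payment rewards every additional success equally.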
U.S. Job Training Partnership Act (1982-1998)
Context: employment services
Policy objectives: ROI in human capital development
Performance indicators: post-program (employment and earnings at 13 weeks)
Funding model & scale of risk: 6% of state allocations for incentive awards
Performance payment: for meeting federal or state-set targets (% attaining outcome)
Leveling the playing field: targets adjusted to local conditions and client characteristics
Competition between providers: in some states

Australia Job Network / Job Services (1998-current)
Context: employment services
Policy objectives: work first; job placement; cost cutting
Performance indicators: post-program (13 & 26 week job placements)
Funding model & scale of risk: competitive bidding; up to 50% for outcome payments
Performance payment: per outcome achieved
Leveling the playing field: client tiering; payment weighted by type of client
Competition between providers: yes; star ratings

Washington State Student Achievement Initiative (2007-current)
Context: adult education (college system)
Policy objectives: educational attainment leading to labour market attachment
Performance indicators: in-program continuum of learning outcomes, from basic skills to credential attainment
Funding model & scale of risk: 2% funding cut reallocation
Performance payment: per point gained within the learning continuum (momentum points model)
Leveling the playing field: momentum points determined by the principle of equivalent effort
Competition between providers: each college assessed against its own historical performance
Even small amounts of PBF may change behaviour, but not all
changes are in the desired direction
Early models particularly fraught with unintended consequences
(cream-skimming, parking, gaming)
Second generation models are more promising with built-in features
that aim to avoid these pitfalls
Key to mitigating risk is not only careful system design, but also
commitment to continuous improvement
Choice of measures crucial in determining incentive
architecture
◦ Poorly chosen performance measures may create conflicting
incentives – obtaining performance payments vs. serving clients
Performance measures have often been:
◦ 1) Outside provider control; i.e. based entirely on outcomes that
happen after clients leave the program
No clear connection between the services providers offer and the
outcomes they are paid for
◦ 2) Based on attainment of levels rather than gains from a starting
point
Incentives to pick “winners”
Performance measures have often been:
◦ 3) Poor proxies for quality
Program outcomes of interest are often long-delayed
Performance measures typically use short-term proxies (e.g.
employment at 13 weeks) for outcomes of interest (e.g. longer-term
employment)
But chain of evidence is often lacking
No clear connection between the short-term outcomes providers are
paid for and longer-term program impacts
“Hitting the target, missing the point”
1) Use in-program performance measures > In-program
measures establish a more immediate and meaningful connection
between day-to-day practice and performance
Allow providers to track progress in a timely fashion, understand
where and why learners succeed and where they falter, and design
interventions to accelerate progress
2) Measure gains not levels > Most measures have focused on
levels attained by clients at the time performance is assessed.
Need measures that include starting points and magnitudes of
improvement to convey information about the provider’s impact on
learner achievement (see the sketch below).
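A minimal sketch of the gains-versus-levels distinction, using invented pre/post assessment scores and an invented passing threshold:

```python
# Hypothetical pre/post assessment scores for two learners. A
# levels-based measure looks only at the exit score; a gains-based
# measure credits improvement from each learner's starting point.

learners = {
    "A": {"pre": 70, "post": 78},  # entered strong, modest growth
    "B": {"pre": 35, "post": 62},  # entered with low skills, large growth
}

passing_level = 75  # attainment threshold under a levels-based measure

for name, s in learners.items():
    attained = s["post"] >= passing_level
    gain = s["post"] - s["pre"]
    print(name, "| level attained:", attained, "| gain:", gain)

# Levels-based: only A counts (78 >= 75), so providers are pushed to
# recruit likely "winners" such as A. Gains-based: B shows the larger
# gain (27 vs. 8), better reflecting the provider's contribution.
```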
3) Measure what counts > Avoid mission narrowing by ensuring
that performance measures recognize the full range of program
objectives
PBF changes the cost/benefit calculus and may encourage the development of
costly but innovative services; on the other hand, what you do not
pay for may be left undone
4) Identify key milestones > Identify intermediate milestones that
can be used to track the progress of clients who may enter at
different points (e.g. with different levels of skill, employment
readiness, etc.)
Select milestones based on points along the pathway where
learners stall or struggle – meaningful transitions
5) Monitor system performance > Build a continuous learning
process to respond to unplanned behaviour
E.g. ‘teaching to the test’
6) ‘Right-size’ incentives > Ensure performance incentives are
neither too big nor too small
Too big: encourages risk management rather than innovation
Too small: if the costs of meeting performance targets exceed the
performance bonuses, the incentives will be ignored
7) Flexible approach to performance targets > Pre-set
performance targets are often either too ambitious or not ambitious
enough; both can lead to strategic behaviour
A more open-ended approach encourages continuous improvement
E.g. awarding performance dollars according to ‘momentum points’
(i.e. total number of milestones achieved along a learning pathway)
8) Ensure all targeted clients are served > ‘Level the playing field’
Design incentives using the principle of equivalent effort whereby
each momentum point should require roughly the same intensity of
effort to attain. Recognizes that clients with more barriers may
require greater effort to transition between milestones.
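A minimal sketch combining point 7 (open-ended momentum points) with point 8 (the principle of equivalent effort); the milestone names, point values, and dollar rate below are hypothetical:

```python
# Momentum-points payment sketch. Each milestone carries one point
# because milestones are chosen (hypothetically here) so that earning
# any point requires roughly the same intensity of effort, wherever a
# client starts on the pathway.

POINTS = {
    "basic_skills_gain": 1,
    "pre_college_gain": 1,
    "transition_to_college": 1,
    "first_15_credits": 1,
    "first_30_credits": 1,
}

DOLLARS_PER_POINT = 400  # hypothetical rate

def performance_payment(milestones_achieved):
    """Open-ended payment: no pre-set target, so every additional
    milestone any client reaches earns the provider more."""
    points = sum(POINTS[m] for m in milestones_achieved)
    return points * DOLLARS_PER_POINT

# Two clients entering at different points on the pathway: early gains
# by a client facing more barriers are worth as much as later credits.
low_entry = ["basic_skills_gain", "pre_college_gain", "transition_to_college"]
high_entry = ["first_15_credits", "first_30_credits"]

print(performance_payment(low_entry))   # 1200
print(performance_payment(high_entry))  # 800
```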
9) Build provider capacity > Providers may lack knowledge or
resources to respond effectively to incentives
Limit competition for performance dollars; encourage collaboration
to build tools and practices
10) Link in-program measures to post-program impacts > Use
longitudinal research to establish a chain of evidence between
intermediate milestones (potential ‘tipping points’) and longer-term,
post-program impacts (e.g. employment, earnings, etc.)
Follow up with learners to establish the connection between
measured performance and client success in the long term
Use results to refine and improve the performance measurement
framework
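One way to begin building that chain of evidence, sketched under the assumption that follow-up records link milestone attainment to later employment (the records below are invented, and a real analysis would adjust for client characteristics):

```python
# Compare longer-term outcomes for clients who did and did not reach
# an intermediate milestone, using hypothetical follow-up records.

records = [
    # (reached_milestone, employed_two_years_later)
    (True, True), (True, True), (True, False), (True, True),
    (False, False), (False, True), (False, False), (False, False),
]

def employment_rate(group):
    return sum(employed for _, employed in group) / len(group)

reached = [r for r in records if r[0]]
missed = [r for r in records if not r[0]]

print("Reached milestone:", employment_rate(reached))  # 0.75
print("Missed milestone: ", employment_rate(missed))   # 0.25

# A persistent gap, estimated with proper controls over a large
# longitudinal sample, is the evidence that would justify treating the
# milestone as a 'tipping point' worth paying for.
```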
Basic skills gains → Pre-college math/English gains → Transition to college level → 15 college-level credits (0.5 yr) → 30 college-level credits (1 yr)

Key transition milestones within a student’s pathway (identified by research as ‘tipping points’). Provides incentives to focus on the full range of skill levels.
Rewards achievement of key milestones – Encourages client
progress by rewarding milestones that, if reached, are associated
with further progress and, ultimately, long-term labour market success
Focuses on a balanced set of ‘in-program’ measures – Helps
providers understand where clients succeed and where they
falter, and thus provides the data to drive innovation
Driven by a balance of competition and collaboration – Allocates
performance dollars according to the total number of milestones
achieved. Thus, while providers have strong incentives to innovate,
they are not in competition with each other; indeed, they may be
motivated to collaborate to improve outcomes