Transcript Document

Demonstrating the Economic Value
of Career Services
Bryan Hiebert
Vice-president, IAEVG
Professor Emeritus, University of Calgary
Adjunct Professor, Educational Psychology, University of Victoria
Docent of Education (Research), University of Jyväskylä
Member, Canadian Research Working Group
on Evidence-based Practice in Career Development
[email protected]
1
What Counts: Accountability, Evaluation, and Service Delivery Intertwined
2
Background and Rationale
A challenge by Canadian Policy Makers:
“You haven’t made the case
for the impact and value
of career development services”
A research team formed in 2004 to follow up:
• The Canadian Research Working Group for Evidence-Based Practice in Career Development
• 10 researchers from 7 universities & 1 foundation
3
State of Practice: Measuring Outcomes
 84% of agencies report collecting data
• Frequency counts, e.g., number of clients
served/month, number of clients who found
employment, number of client action plans
created, number of clients who completed
programs
• Employment status
4
2005 Study: Agencies & practitioners
What are the 3 most important outcomes you report?
1. Change in employment or educational status of the client
and marginally:
2. Skill development; financial independence, connectedness, self-confidence
3. Number of clients served
4. Client satisfaction
5. Program completion
6. Service delivery
7. Cost-benefit
5
What outcomes are you achieving that are going unreported or unmeasured?
(From 2005 CRWG State of Practice study)
• Client empowerment
• Client skill development
  • personal self-management skills
• Client increased self-esteem
• Client changes in attitudes
  • about their future
  • about the nature of the workforce
• Client knowledge gains
• Financial independence
• Creation of support networks
• More opportunities for clients
These are legitimate areas for intervention.
6
Outcomes of Counselling
1. Client learning outcomes
   • Knowledge
   • Skills
2. Impact on client's life
   • Client presenting problem
   • Economic factors
   • Third party factors
+
Precursors / Personal Attributes
(intervene between learning outcomes & impact outcomes)
   • Attitude
   • Motivation
   • Self-esteem
   • Stress
   • Internal locus of control
   • Belief that change is possible
7
Evidence-Based Outcome-Focused Practice
Input → Process → Outcome
Need to link process with outcome
8
Definitions
• Outcome: Specific result or product of an intervention, including changes in client competence, client situation, and/or broader changes for the client and/or community
• Input: Resources available for achieving outcomes
• Process: Activities engaged in to achieve outcomes
• Intervention: Intentional activity implemented in the hopes of fostering client change
• Output: Products produced during the intervention, e.g., resume, sample cover letter, action plan
9
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Indicators of client change
1. Learning outcomes
   • Knowledge and skills linked to intervention
2. Personal attribute outcomes
   • Changes in attitudes
   • Intrapersonal variables (self-esteem, motivation, independence)
3. Impact outcomes
   • Impact of #1 & #2 on client's life, e.g., employment status, enrolled in training
   • Societal, economic, relational impact
10
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Activities that link to outputs or deliverables
Generic interventions
• Working alliance, microskills, etc.
Specific interventions
1. Interventions used by service providers
   • Skills used by service providers
   • Home practice completed by students
2. Programs offered by school
3. Involvement by 3rd parties
4. Quality of service indicators
   • Stakeholder satisfaction, including students
11
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Specific interventions
1. Career decision making
2. Work-specific skills enhancement
3. Work search
4. Job maintenance
5. Career-related personal development
6. Other
12
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Resources available
1. Staff
   • Number of staff, level of training, type of training
2. Funding
   • Budget
3. Service guidelines
   • Agency mandate
4. Facilities
5. Infrastructure
6. Community resources
13
Intervention-Evaluation-Service Delivery:
Merged Framework
[Diagram] Context: Client Needs → Client Goals; Counsellor Actions → Client Actions, mapped onto Inputs (Resources) → Processes → Outcomes
Client Outcomes
• Knowledge
• Skills
• Attributes
• Impact
Service Delivery
• Client flow
• Accessibility
• System factors
• Client satisfaction
14
Quality Service Delivery
1. Accessibility
   • Regular hours
   • Extended hours
   • Physical accessibility
   • Resources in alternate format
   • Ease of access, who can access
2. Timeliness
   • % calls answered by 3rd ring
   • Wait time for appointment
   • Wait time in waiting room
3. System requirements
   • Adherence to mandate
   • Completion of paperwork
4. Service standards
   • Staff credentials, competencies, resources
5. Service delivery
   • Client volumes
   • Client presenting problems
   • Number of sessions
6. Responsiveness
   • Respect from staff
   • Courteous service
   • Clear communication
7. Overall satisfaction
   • % rating service good or excellent
   • % referrals from other clients
15
Quality Service Standards
Are all components equally important?
Performance Management System (Ontario)
Three broad dimensions of service delivery success
1. Effectiveness (50%)
a. Participant Suitability (15%)
b. Service Impact (35%)
2. Customer Satisfaction (40%)
a. Customer Satisfaction (15%)
b. Service Coordination (25%)
3. Efficiency (10%)
a. Assisted Services Intake (5%)
b. Information Session/Workshop Activity (5%)
16
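A minimal sketch, in Python, of how the Ontario weights above could be rolled into a single composite score. The weights are taken from the slide; the 0-to-1 sub-scores and the simple weighted-sum scoring rule are assumptions made for illustration, not part of the actual Ontario Performance Management System.

# Illustrative only: the Ontario PMS weights from the slide above, combined
# into one composite score via a weighted sum (the combining rule is assumed).
PMS_WEIGHTS = {
    "participant_suitability": 0.15,          # Effectiveness (50% total)
    "service_impact": 0.35,
    "customer_satisfaction": 0.15,            # Customer Satisfaction (40% total)
    "service_coordination": 0.25,
    "assisted_services_intake": 0.05,         # Efficiency (10% total)
    "info_session_workshop_activity": 0.05,
}

def composite_score(sub_scores: dict) -> float:
    """Weighted sum of sub-measure scores, each expressed on a 0-to-1 scale."""
    return sum(PMS_WEIGHTS[name] * score for name, score in sub_scores.items())

example = {name: 0.8 for name in PMS_WEIGHTS}   # hypothetical agency scoring 0.8 on every sub-measure
print(f"Composite score: {composite_score(example):.2f}")   # -> 0.80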
Outcome Focused Evidence-Based Practice
Quality Improvement
Input (Resources) → Process (Counsellor: skills, knowledge, interventions, programs) → Outcome (Client change: skill, attribute, impact)
17
Outcome Focused Evidence-Based Practice
Dynamic and Interactive
[Diagram] Context: Client Needs → Client Goals; Inputs (Resources) → Process (Counsellor Actions ↔ Client Actions) → Outcome (Knowledge, Skills, Attributes, Impact); the intervention links process to outcome
18
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Intervention = Process + Outcome
What will I do? + How is it working?
Professional Practitioner
19
Professional Practitioner
(Local Clinical Scientist)
• Intervening in a systematic manner
  • Documenting what you did
• Paying attention to what happened
  • Tracking the effects
• Looking for associations between what you did & the effects that happened
• Across time and across clients
  • Acquire the ability to make predictions linking interventions & outcomes
• Each client is an n = 1 experiment (investigation, exploration)
  • Multiple replications provide predictability
20
Professional Practitioner
(Local Clinical Scientist)
• Approach your practice in a scientific manner
  • Be clear about the nature of the change clients desire
  • Be clear about what you will do to meet client goals
  • Document what you do
  • Document how well it works
• Your own practice becomes your data source for predicting client outcomes
This is a viable, perhaps even preferable, alternative to RCTs.
21
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Need to link process with outcome
1. What will I do?
2. What are the expected client changes?
   • What do I expect clients to learn?
   • What sorts of personal attributes do I want my clients to acquire?
   • What will be the impact on their lives?
3. How will I tell?
22
Evidence policy makers can relate to
Concepts under development
• Return on investment
• Employment Equivalence (Career Self-Sufficiency Index)
Food for thought and discussion
23
Return on investment: Community Agency
A community agency
• Career development services for welfare recipients to help them integrate into the labour market
• Government investment was about $1,300 per client
• Return on investment came from two sources
  • clients who found employment, were no longer on welfare, earned higher income, and paid income tax
  • service providers employed to deliver the program
• Clients provided a copy of their pay stubs before and after the program
• Return on investment was between $1.14 and $1.46 for each $1.00 spent (times number of years employed); see the sketch below
24
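A minimal sketch of the return-on-investment arithmetic above, assuming the $1.14 to $1.46 return per dollar recurs for each year the client remains employed, as the slide's "times number of years employed" suggests. The five-year horizon in the example is hypothetical.

# Illustrative sketch of the community-agency ROI figures above.
def cumulative_return(return_per_dollar: float, investment: float, years_employed: int) -> float:
    """Total return on an initial investment, given an annual return per dollar invested."""
    return return_per_dollar * investment * years_employed

investment_per_client = 1_300            # government investment per client (from the slide)
for ratio in (1.14, 1.46):               # low and high estimates per $1.00 spent
    total = cumulative_return(ratio, investment_per_client, years_employed=5)   # 5 years is hypothetical
    print(f"${ratio:.2f} per $1.00 over 5 years: ${total:,.0f} returned on ${investment_per_client:,}")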
Return on Investment for
High School Career Education Programs
School funding is based on student enrolment (person-courses)
• 2 years after implementing a career education program
  • Completion rates increased by 15%
  • Number of students in their Registered Apprenticeship Program increased
• Increased funding provided
  • 1.5 additional staff (1 counsellor + 0.5 support staff)
  • more preparation time for teachers
  • a more positive perceived work climate
25
Career Self-Sufficiency Index
(Employment Equivalence)
• Consider a client who receives career guidance and
  • decides to return to school so he can
  • find a better job that pays more money and has less likelihood of unemployment
• Employment status does not change
  • Considered a failure
26
Career Self-Sufficiency Index
(Employment Equivalence)
Consider instead
• In Canada, men 30 years old are 34% more likely to be employed if they have a high school education (compared to men with no high school diploma)
  • Employment Equivalence (CSSI) for taking training is .34
• Consider also that men 30 years old who have a high school education earn on average $6,000 more per year
  • Return on investment = $6,000 times years worked, perhaps 30 years = $180,000 (see the sketch below)
This is evidence of success
27
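A minimal sketch of the two quantities above: the employment-equivalence credit for the training, and the earnings differential accumulated over a working life. The 0.34, $6,000, and 30-year figures are the slide's; the code only restates the arithmetic.

# Illustrative sketch of the Career Self-Sufficiency Index example above.
def employment_equivalence(extra_employment_likelihood: float) -> float:
    """CSSI credit for an intervention, expressed as a fraction of one job."""
    return extra_employment_likelihood

def lifetime_earnings_return(annual_differential: float, years_worked: int) -> float:
    """Earnings differential accumulated over the years worked."""
    return annual_differential * years_worked

print(f"Employment equivalence for taking training: {employment_equivalence(0.34):.2f}")      # 0.34
print(f"Lifetime earnings return: ${lifetime_earnings_return(6_000, years_worked=30):,.0f}")  # $180,000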
Return on Investment for
Post-Secondary Student Services
Post-secondary leavers vs. completers
• Leavers were 50% more likely to have difficulty keeping up with the workload
  • CSSI = 0.50 for completing a study skills program
• Leavers reported being unsure of what they wanted to do
• The #1 reason for leaving school was “lack of fit”
• Completers were 45% more likely to report having a career plan that was a good match for their program
  • CSSI = 0.45 for completing a program that helps increase fit between career plans and program of study
28
Return on Investment for
Post-Secondary Student Services
Post-secondary leavers vs. completers (continued…)
• PSE graduates earn on average $5,512 more per year than those who do not graduate
• Return on investment for completing study skills programs would be .50 x 5,512 = $2,756 per person per year
• Return on investment for completing programs that promote congruence between students' career plans and their course of study would be .45 x 5,512 = $2,480 per person per year (see the sketch below)
29
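A minimal sketch of the per-person, per-year figures above: the expected annual return is the CSSI credit multiplied by the graduate earnings differential. The $2,480 figure follows the slide's rounding of 2,480.40.

# Illustrative sketch of the post-secondary ROI arithmetic above:
# expected annual return per person = CSSI credit x graduate earnings differential.
EARNINGS_DIFFERENTIAL = 5_512    # PSE graduates vs. non-graduates, $ per year (from the slide)

def annual_return(cssi: float, differential: float = EARNINGS_DIFFERENTIAL) -> float:
    """Expected return per person per year for an intervention with the given CSSI credit."""
    return cssi * differential

print(f"Study skills program:    ${annual_return(0.50):,.0f}")   # $2,756
print(f"Career-plan fit program: ${annual_return(0.45):,.0f}")   # $2,480 (2,480.40 rounded)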
Applied Career Transitions Program
(on-line program for unemployed university grads)
For Module 1
• All together there were 10 (items) x 29 (participants) = 290 ratings
• Pre: 144 Unacceptable ratings – Post: 3 Unacceptable ratings
• Unacceptable ratings decreased from 50% to 1% of all ratings
• Pre: 6 Exceptional ratings – Post: 130 Exceptional ratings
• Exceptional ratings increased from 2% to 44% of all ratings (see the sketch below)
30
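A minimal sketch of the percentage arithmetic above, using the pre/post counts from the slide against the 290 total ratings (10 items x 29 participants); it shows why the unacceptable ratings fall to roughly 1% of all ratings after the program.

# Illustrative check of the Module 1 rating percentages above.
TOTAL_RATINGS = 10 * 29    # 10 items x 29 participants = 290 ratings

def share(count: int, total: int = TOTAL_RATINGS) -> float:
    """A rating count expressed as a percentage of all ratings."""
    return 100 * count / total

print(f"Unacceptable: {share(144):.0f}% pre -> {share(3):.0f}% post")    # 50% -> 1%
print(f"Exceptional:  {share(6):.0f}% pre -> {share(130):.0f}% post")    # 2% -> 45% (the slide reports 44%)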
Results: Impact Outcomes
• Employment status
  • 27 out of 29 were employed
  • a 93% employment rate
• Quality of job
  • 13 of the jobs lined up well with the client's career vision
  • 48% of jobs were a good fit with the career vision
31
Attribution for Change
To what extent would you say that any changes in the
ratings on the previous pages are a result of your
participation in this research project, and to what extent
were they a function of other factors in your life?
Program            Mostly other   Somewhat other   Uncertain   Somewhat this   Mostly this
                   factors        factors                      program         program
ACT                0              0                0           10              19
LMI-Assisted       0              1                4           19              42
LMI-Independent    3              2                11          28              38
32
Building cause and effect cases
• We have data on the process used
  • Counsellor adherence to program
  • Client engagement in program
• We have data on the outcomes
  • KSAs: Knowledge, Skills, Personal Attributes
• We have data on the impact
  • Employment status
• We have economic data
  • Career Self-Sufficiency Index (Employment Equivalence)
We have a clear link between process and outcome
33
Possible Career Self-Sufficiency Index
(Employment Equivalence)
Element                                                                  Employment equivalent
Take further training in institution with student counsellors on staff  0.34 for each year
Take further training in institution with no counsellors on staff       0.25
Complete career guidance program                                         0.45
Complete Job Finding Club                                                0.80
Complete ACT                                                             0.90
34
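A minimal sketch of how a table like the one above could be put to use. The employment-equivalence values come from the slide; pairing them with the $5,512 per year graduate earnings differential from the earlier post-secondary example is an assumption made purely for illustration.

# Illustrative use of the possible CSSI table above. Multiplying each equivalence by the
# $5,512/year differential is an assumption for illustration, not something the slides do.
CSSI_TABLE = {
    "Further training, institution with student counsellors on staff": 0.34,   # per year of training
    "Further training, institution with no counsellors on staff": 0.25,
    "Complete career guidance program": 0.45,
    "Complete Job Finding Club": 0.80,
    "Complete ACT": 0.90,
}

ANNUAL_DIFFERENTIAL = 5_512    # $ per year, borrowed from the post-secondary slides

for element, cssi in CSSI_TABLE.items():
    print(f"{element}: {cssi:.2f} job-equivalent, roughly ${cssi * ANNUAL_DIFFERENTIAL:,.0f} per year")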
Future Possible Directions
35
Possible Career Self-Sufficiency Index
(Employment Equivalence)
Element                                                Employment equivalent
Completes career program with modest self-confidence  0.60
Completes career program with good self-confidence    0.70
Completes ACT                                          0.90
36
Question to ponder
Is it logical that a Career Self-Sufficiency Index (Employment Equivalence) could be greater than 1?
• If the goal is employment, a job = 1
• A good job with prospects for permanency and advancement should carry a bonus
• Consult tables of labour turnover for various occupations
  • Turnover for a labourer might happen every 6 months
  • For other categories it might be, say, 12 months
  • People getting low-level jobs would get an equivalent value of 1 and the latter an equivalent value of 2 (see the sketch below)
What do you think of this idea?
37
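One way to read the turnover example above is that the employment equivalent scales with expected job tenure, with a six-month, high-turnover job as the baseline of 1. The linear rule below is an assumption inferred from the slide's two data points (6 months gives 1, 12 months gives 2), offered only as food for thought.

# Illustrative sketch of the "greater than 1" idea above: credit jobs in proportion to their
# expected tenure, with a six-month job counting as 1.0. The linear rule is an assumption.
BASELINE_TENURE_MONTHS = 6

def employment_equivalent(expected_tenure_months: float) -> float:
    """Employment-equivalence credit for a job with the given expected tenure."""
    return expected_tenure_months / BASELINE_TENURE_MONTHS

print(employment_equivalent(6))     # 1.0 - high-turnover labourer job
print(employment_equivalent(12))    # 2.0 - more stable job, as in the slide's example
print(employment_equivalent(24))    # 4.0 - a job with real prospects of permanency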
Possible Career Self-Sufficiency Index
(Employment Equivalence)
Element                                         Employment equivalent
Obtains job in firm with fewer than 20 workers  1.0
Obtains job in firm with more than 500 workers  1.25
Job obtained in unionized firm                  Add 0.25
38
The Problem
• Agency managers and counsellors agree that evaluation of services is important
BUT
• Counsellors do not evaluate their work with clients in a way that permits making a connection between
  • what counsellors do
  and
  • the client changes that take place
• Perhaps these ideas will help integrate evaluation into service delivery
39
Professional Identity:
What we do defines who we are
• Most practitioners define their job as delivering services
  • So … they do not evaluate the impact of their services on clients
What is career development all about?
• The answer needs to include BOTH process and outcome
  • What will I do to facilitate client change?
  +
  • How well is it working?
Answers need to be a negotiated consensus between practitioners and clients.
40
What have we learned?
From Practitioners
• Structure and checklists are foreign at first
  • But later help them to be more focused
• Service providers are willing research partners
  • Most said they would do it again if given the opportunity
• Service providers are happy to follow procedures that result in meaningful evidence of client change
From Clients
• Structure and timelines motivate action and a sense of progress
• Giving clients hands-on tools is motivating
41
Demonstrating Value
It is really, really unfortunate when …
There is an excellent program
• that everyone knows is working
• which is filling an important need
but
• the program is cancelled because there is no evidence to support the positive claims
42
To demonstrate value, we need to develop a culture of evaluation
• Without efficacy data, career services are vulnerable
• It is in our best interest to gather evidence attesting to the value of the services we provide
We need to reach the state where
• Identification of outcomes is an integrated part of providing services
• Measuring and reporting processes and outcomes is integrated into practice
• Outcome assessment is a prominent part of counsellor education
• Reporting processes and outcomes is a policy (and funding) priority
This needs to be a priority in all sectors.
43
Don’t worry about getting it right, just
start and improve it as you use it
1. Small steps are OK
2. Several small steps = one BIG STEP
3. Share your success stories
• with the people who need to hear them,
• in language they can understand
4. Be persistent
5. Build support for yourself
44
Don’t Ever
Give Up
45
Discussion
1. What do you think of this idea?
2. Would this general evaluation model work for you?
3. How could you use employment equivalence in
your work?
4. Other …
questions, comments, suggestions?
[email protected]
46
Demonstrating the Economic Value of
Career Services
What Counts:
Accountability, Evaluation, and Service Delivery
Intertwined
Bryan Hiebert
Vice-president, IAEVG
Professor Emeritus, University of Calgary
Adjunct Professor of Educational Psychology, University of Victoria
Docent of Education (Research), University of Jyväskylä
[email protected]
47