NRS Data Monitoring for Program Improvement


Unlocking Your Data
Objectives—Day 1
1. Describe the importance of getting involved with and
using data;
2. Identify four models for setting performance standards as
well as the policy strategies, advantages, and
disadvantages of each model;
3. Determine when and how to adjust standards for local
conditions;
4. Set policy for rewards and sanctions for local programs;
5. Identify programmatic and instructional elements
underlying the measures of educational gain, NRS follow-up, enrollment, and retention.
Agenda—Day 1
 Welcome, Introduction, Objectives, Agenda Review
 The Power of Data
– Why Get Engaged with Data? Exercise
– The Data-driven Program Improvement Model
– Setting Performance Standards
– Adjusting Standards for Local Conditions
– Establishing a Policy for Rewards and Sanctions
 Getting Under the Data
– Data Pyramids
– Data Carousel
 Evaluation and Wrap-up for Day 1
Objectives—Day 2
1. Distinguish between the uses of desk reviews
and on-site monitoring of local programs;
2. Identify steps for monitoring local programs;
3. Identify and apply key elements of a change
model; and
4. Work with local programs to plan for and
implement changes that will enhance program
performance and quality.
Agenda—Day 2
 Agenda Review
 Planning for and Implementing Program Monitoring
– Desk Reviews Versus On-site Reviews
– Data Sources (small group work)
– Steps and Guidelines for Monitoring Local Programs
 Planning for and Implementing Program Improvement
– A Model of the Program Improvement Process
– State Action Planning
 Closing and Evaluation
STOP!
Why Get Engaged with Data?
Question for Consideration
Why is it important to be able to
produce evidence of what your state
(or local) adult education program
achieves for its students?
The Motivation Continuum
Intrinsic ↔ Extrinsic
Which is the more powerful force for change?
NRS Data-driven Program
Improvement (Cyclical Model)
STEPS
– Set performance standards
– Examine program elements underlying the
data
– Monitor program data, policy, and
procedures
– Plan and implement program improvement
– Evaluate progress and revise, as necessary,
and recycle
What’s Under Your Data?
The Powerful Ps
Performance (Data)
Program Policies
Procedures
Processes
Products
NRS Data-driven Program
Improvement Model
[Cyclical diagram: NRS Data sits at the center of the cycle Set Performance Standards → Examine Program Elements Underlying the Data → Monitor Program Data, Policy, Procedures → Plan and Implement Program Improvement; Evaluate Improvement → back to Set Performance Standards.]
Educational Gains for ESL Levels
and Performance Standards
[Exhibit 1-2: Bar chart comparing two series, Program (actual performance) and Performance Standards, for educational gains at each ESL level (Beginning Literacy, Beginning, Low Intermediate, High Intermediate, Low Advanced, High Advanced), on a 0%–100% scale.]
Questions Raised by Exhibit 1-2
 How were performance standards set? Based on past
performance?
 Are standards too low at the higher levels?
 Is the performance pattern similar to that of previous
years? If not, why not?
 What are the program’s assessment and placement
procedures? Same assessments for high and low ESL?
 How do curriculum and instruction differ by level?
 What are student retention patterns by level?
The Power of Data: Setting
Performance Standards
Essential Elements of
Accountability Systems
• Goals
• Measures
• Performance Standards
• Sanctions and Rewards
National Adult Education
Goals
Reflected in NRS Outcome Measures of
 educational gain,
 GED credential attainment,
 entry into postsecondary
education, and
 employment.
Performance Standards
 Similar to a “sales quota”: how well are you
going to perform this year?
– Should be realistic and attainable, but
– Should stretch you toward improvement
 Set by each state in collaboration with ED
 Each state’s performance is a reflection of
the aggregate performance of all the
programs it funds
Standards-setting Models
Continuous Improvement
Relative Ranking
External Criteria
Return on Investment (ROI)
Continuous Improvement
 Standard based on past performance
 Designed to make all programs improve
compared to themselves
 Works well when there is stability and a
history of performance on which to
base standard
 Ceiling reached over time, resulting in
little additional improvement
Relative Ranking
 Standard is mean or median performance of
all programs
 Programs ranked relative to each other
 Works for stable systems where median
performance is acceptable
 Improvement focus mainly on low-
performing programs
 Little incentive for high-performing
programs to improve
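To make the contrast between these two data-based models concrete, here is a minimal sketch using hypothetical local-program gain rates; the program names, the rates, and the 3-point "stretch" increment are illustrative assumptions, not figures from the training.

```python
from statistics import median

# Hypothetical educational-gain rates for five local programs (illustrative only).
gain_rates = {
    "Program A": 0.31,
    "Program B": 0.27,
    "Program C": 0.44,
    "Program D": 0.38,
    "Program E": 0.22,
}

# Continuous improvement: each program's new standard is its own past
# performance plus a modest stretch (3 percentage points here -- an assumption).
continuous_standards = {name: round(rate + 0.03, 2) for name, rate in gain_rates.items()}

# Relative ranking: every program is measured against the median of all programs.
ranking_standard = median(gain_rates.values())

print(continuous_standards)  # e.g. {'Program A': 0.34, 'Program B': 0.3, ...}
print(ranking_standard)      # 0.31
```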
External Criteria
 Set by formula or external policy
 Promotes a policy goal to achieve a
higher standard
 Used when large-scale improvements are
called for, over the long term
 No consideration of past performance; may be
unrealistic or unattainable
Return on Investment
 Value of program relative to cost of program
 A business model; answers the question: Are the
services or program worth the investment?
 Can be a powerful tool for garnering
funding (high ROI) or for losing funding
(low ROI)
 May ignore other benefits of program
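As a back-of-the-envelope illustration of this ratio (the dollar figures below are hypothetical, not from the training):

```python
# Hypothetical figures, for illustration only: the ROI comparison is simply
# the estimated value the program produces divided by what it costs.
program_value = 300_000   # estimated value of outcomes (e.g., wage gains, credentials)
program_cost = 200_000    # total program cost

roi = program_value / program_cost
print(roi)  # 1.5 -- a ratio above 1.0 suggests the program returns more than it costs
```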
Decision Time for State Teams
1. Which model(s) do you favor for setting
standards for/with locals?
2. Is it appropriate to use one statewide
model or different models for different
programs?
3. How will you involve the locals in
setting the standards they will be held
to?
Question for Consideration
How do the standard-setting
model(s) that states select represent
a policy statement on the
relationship between performance
and quality that states want to instill
in local programs?
Adjusting Standards for
Local Conditions
Research suggests that standards
often need to be adjusted for local
conditions before locals can work to
improve program quality.
WHY IS THIS SO?
Factors that May Require
Adjustment of Standards
 Student Characteristics
– An especially challenging group
– Students at lower end of level
– Influx of different types of students
 Local Program Elements
 External Conditions
Shared Accountability
State and locals share responsibility
to meet accountability requirements
– State provides tools and environment
for improved performance
– Locals agree to work toward
improving performance
Locals should know…
 The purpose of the performance standards;
 The policy and programmatic goals the
standards are meant to accomplish;
 The standard-setting model that the state
adopts; and
 That State guidance and support are available
to locals in effecting change.
Shared Accountability
 Which state-initiated efforts have been
easy to implement at the local level?
 Which have not?
 What factors contributed to locals’
successfully and willingly embracing the
effort?
 What factors contributed to a failed
effort?
Shared Accountability
[2×2 matrix: Local Program Involvement (low to high) plotted against State Administrative Control (low to high)]
– High involvement, low control: "Locals Out of Control??"
– High involvement, high control: "Hot Dog!! We’re really moving!"
– Low involvement, low control: "Anything Happening Out There??"
– Low involvement, high control: "Get OFF our backs!!"
What About Setting
Rewards and Sanctions?
 Which is the more powerful motivator: rewards or
sanctions?
 List all the different possible reward structures you
can think of for local programs.
 How might sanctioning be counter-productive?
 List sanctioning methods that will not destroy locals’
motivation to improve or adversely affect
relationships with the state office.
Variations on a Theme Exercise
 (Refer to H-10). Brainstorm as many possible rewards
or incentives as you can for recognizing local programs
that meet their performance standards.
 Then brainstorm sanctions that the state might impose
on local programs that do not meet their performance
standards.
 Select a recorder for your group to write one reward
per Post-It Note and one sanction per Post-It Note.
 When you have finished, wait for further instructions
from the facilitator.
Summary of Local Performance
Standard-setting Process
Procedure → Goal
– Select standard-setting model → Reflect state policies; promote program improvement
– Set rewards and sanctions policy → Create incentives; avoid unintended effects
– Make local adjustments → Ensure standards are fair & realistic for all programs
– Provide T/A → Create atmosphere of shared accountability
– Monitor often → Identify and avoid potential problems
Getting Under the Data
NRS data, as measured and
reported by states, represent the
product of underlying
programmatic and instructional
decisions and procedures.
Four Sets of Measures
1. Educational gain
2. NRS Follow-up Measures
– Obtained a secondary credential
– Entered and retained employment
– Entered postsecondary education
3. Retention
4. Enrollment
Educational Gain
Underlying program elements:
– Assessment Policies and Approach
– Assessment Procedures
– Goal Setting and Placement Procedures
– Retention
– Class Organization
– Professional Development
Follow-up Measures
Measures: GED, Employment, Postsecondary
Underlying program elements:
– Instruction
– Support Services
– Tracking Procedures
– Retention
– Professional Development
Retention
Underlying program elements:
– Students
– Class Schedules and Locations
– Placement Procedures
– Support Services
– Retention Support and Policies
– Professional Development
Enrollment
Underlying program elements:
– Community Characteristics
– Class Schedules and Locations
– Instruction
– Professional Development
Data Carousel
Question for Consideration
How might it benefit local programs
if the State office were to initiate
and maintain a regular monitoring
schedule to compare local program
performance against performance
standards?
Regular Monitoring of Performance
Compared with Standards
 Keeps locals focused on outcomes and
processes;
 Highlights issues of importance;
 Increases staff involvement in the process;
 Helps refine data collection processes and
products;
 Identifies areas for program improvement;
 Identifies promising practices;
 Yields information for decision-making;
 Enhances program accountability.
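A minimal sketch of what such a regular comparison might look like in practice; the program names, gain rates, and standards below are hypothetical assumptions, not data from the training.

```python
# Hypothetical data: each local program's reported educational-gain rate
# alongside the performance standard negotiated with the state.
programs = [
    {"name": "Program A", "gain_rate": 0.34, "standard": 0.30},
    {"name": "Program B", "gain_rate": 0.25, "standard": 0.30},
    {"name": "Program C", "gain_rate": 0.29, "standard": 0.26},
]

# Flag any program falling short of its standard so it can be followed up
# during the next desk review or on-site visit.
for program in programs:
    status = "meets standard" if program["gain_rate"] >= program["standard"] else "below standard"
    print(f'{program["name"]}: {program["gain_rate"]:.0%} vs. {program["standard"]:.0%} ({status})')
```

A routine pass like this could feed a desk review, with flagged programs prioritized for technical assistance or an on-site visit.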
BUT…
 How can states possibly monitor
performance of all local programs?
 Don’t we have enough to do already??
 Where will we find staff to conduct the
reviews?
 You’re kidding, right??
Not!
So… Let’s Find Some Answers
 How can you monitor performance of
locals without overburdening state staff?
 What successful models are already out
there??
 How does your state office currently
ensure local compliance with state
requirements?
 Can you build on existing structures?
Approaches to Monitoring
Desk Reviews
– Ongoing process
– Useful for quantitative data
• Proposals
• Performance measures
• Program improvement plans
• Staffing patterns
• Budgets

On-site Reviews
– Single event, lasting 1–3 days
– Useful for qualitative data
– Review of processes & program quality
– Input from diverse stakeholders
Advantages and Disadvantages
of Desk Reviews
Advantages
– Data, reports, proposals, etc., already in state office
– Review can be built into staff’s regular workload
– Data is quantitative; can be compared to previous years
– No travel time or costs required

Disadvantages
– Assumes accurate data that reflect reality
– Local staff and stakeholders not heard
– Static view of data; no interaction in context
– No team perspective
Advantages and Disadvantages
of On-site Reviews
Advantages
– Data is qualitative; review of processes & program quality
– Input from perspectives of diverse stakeholders
– State works with locals to explore options for improvement; provides T/A
– Opportunity to recognize strengths; offer praise; identify best practices

Disadvantages
– Stressful for local program and team
– Arranging site visits and team is time-intensive for both locals and state
– Requires time out-of-office
– Incurs travel costs
Data Collection Strategies
for Monitoring
1. Program Self-Reviews (PSRs)
2. Document Reviews
3. Observations
4. Interviews
Program Self-Reviews
 Conducted by local program staff
 Review indicators of program quality
 Completed in advance of monitoring
visit and can help focus the on-site
review
 Results can guide the program
improvement process
Document Reviews
 Can review from a distance:
– Proposals
– Qualitative and quantitative reports
– Improvement plans
 Can review on-site:
– Student files
– Attendance records
– Entry and update records
– Course evaluations
Qualitative and Quantitative Data
Observations
 Interactions
– During meetings
– At intake and orientation
– In hallways and on grounds
– In the classroom
 Link what is observed to
– Indicators of quality
– Activities in the program plan
– Professional development workshops
Interviews
 Help clarify or explore ambiguous
findings
 Provide information re: stakeholders’
opinions, knowledge, and needs
– Administrative, instructional, and support staff
– Community partners
– Community agencies (e.g., employment, social
services)
– Learners
Fill in the Boxes: Monitoring
with Indicators of Program Quality
In teams of 4-5 and using H-12, fill
in the data sources you would
expect to use, the questions you
would ask locals, and the strategies
you would use in conducting a desk
review versus an on-site review.
Steps for Monitoring
Local Programs
1. Identify state policy for monitoring; gather support from stakeholders.
2. Consider past practices when specifying scope of work for monitoring.
3. Identify persons to lead and participate in monitoring.
4. Identify resources available for monitoring locals.
5. Determine process for collecting data with clearly defined criteria for rating; conduct monitoring.
6. Report findings and recommendations.
7. Follow up on results.
Data Help…
 Measure student progress
 Measure program effectiveness
 Assess instructional effectiveness
 Guide curriculum development
 Allocate resources wisely
 Promote accountability
 Report to funders and to the community
 Meet state and federal reporting requirements
 Show trends
BUT…
Data do not help:
 If the data are not valid and reliable;
 If the appropriate questions are not
asked after reviewing the data; or
 If data analysis is not used for making
wise decisions.
A Word about the Change Process
Factors that allow us to accept change:
1. There is a compelling reason to do so;
2. We have a sense of ownership of the change;
3. Our leaders model that they are serious about supporting the change;
4. We have a clear picture of what the change will look like; and
5. We have organizational support for lasting systemic change.
Stages of Change
1. Maintenance of the old system
2. Awareness of new possibilities
3. Exploration of those new possibilities
4. Transition to some of those possibilities
or changes
5. Emergence of a new infrastructure
6. Predominance of the new system
A Word of Caution
 Start small; don’t overwhelm locals with a “data dump.”
 Begin with the core issues, such as educational gain.
 Listen to what the data tell about the big picture; don’t get lost in too many details.
 Work to create trust and build support by laying data on the table without fear of recrimination.
 Provide training opportunities for staff on how to use data.
 Be patient, working with what is possible in the local program.
Source: Spokane, WA School Superintendent Brian Benzel
Planning and Implementing
Program Improvement
Stages of the Program Improvement Process
1. Planning;
2. Implementing;
3. Evaluating; and
4. Documenting Lessons Learned
and Making Adjustments, as
needed
Planning Questions
 Who should be included on your
program improvement team?
 How will you prioritize areas needing
improvement?
 How will you identify and select
strategies for effecting improvement?
Guiding Questions for Strategies
Is the strategy:
 Clear and understandable to all users?
 One specific action or activity, or dependent on other activities? (If so, describe the sequence of actions.)
 An activity that will lead to accomplishing the goal?
 Observable and measurable?
 Assignable to specific persons?
 Based on best practices?
 One that all team members endorse?
 Doable—one that can be implemented?
Implementation Questions
 Who will be responsible for taking the
lead on ensuring that the change is
implemented?
 Who will be members of the “change”
team and what will be their roles?
 How will expectations for the change be
promoted and nurtured?
 How will the change be monitored?
Evaluation Questions
 How will the changes that are
implemented be evaluated?
 How will the team ensure that both
short- and long-term effects are
measured?
 Who will interpret the results?
 Who will be on the look-out for
unintended consequences?
Possible Evaluation Results
 Significant improvement with no
significant unintended
consequences: Stay the course.
 Little or no improvement: Stay the
course OR scrap the changes?
 A deterioration in outcomes: Scrap
the changes.
Documenting the Process
Document
 what worked and what didn’t;
 lessons learned; and
 logical next steps or changes to the
plan.
Use as guide for future action.
State Planning Time
In your state teams, consider the questions on
H-14 and begin planning.
 Consider the stakeholders you want to include
in your planning for data monitoring and
program improvement.
 Consider the problems you anticipate facing
and propose solutions to those problems.
 Complete H-14 to the best of your ability and
be prepared to report on your plan in one hour.
Thank you
 Great Audience!
 Great Participation!
 Great Ideas!
 Live Long and Prosper!
 Good Luck!!