Diving Deep into NRS Data

Transcript: Diving Deep into NRS Data

Welcome to San Francisco!
NRS 2010 Regional Training
Diving Deep into the NRS
Local Data Pool: Attendance,
Educational Gain, and
Transition Outcomes
2010 Regional Training
Objectives
At the end of this training, participants will be able to:
• Dive Deeper: Explore local data, analyze results
• Produce an Analysis: Develop a “My Program” summary on
attendance, level gain, or transition
• Customize Tools/Exercise: Adapt tools for use by programs
or classes within the state
• Plan a Local Level Change: Plan to roll out the NRS Local
Data Tool to make a change and create new levels of data
use at the local level
• Collaborate: Share ideas with other states and programs;
work together across and within state teams
Agenda
• Tuesday
– History of Data Training: Why Look at Data
– Taking a Dip in the Water: Data Carousel
– Wading Further In: Asking Questions of Your Data &
Creating Data Exercises for Local Program Staff
• Wednesday
– Diving Deep: Using the Local Program Data Tool
– Creating a Local Program Data Report
• Thursday
– Identifying Local Program Changes
– Building an Action Plan
Test your NRSAbility!
Instructions
1. Select a judge/timer/scorekeeper from
your group.
2. First person draws a card and must
describe the first word without using
any word on the card or any variation
of one (no “ABE”).
3. Everyone at the table tries to guess
the word within 1 minute. If guessed
right, the table earns a point. If not, the
card goes to the bottom of the pile.
4. Next person selects a card and repeats.
5. Table with the most points wins!
(Sample card: the word/concept to describe at the top, with the words you can’t say listed below it.)
Before You Arrived
NRSWeb
• History of the NRS
• What is the NRS? (and what it is not)
• Data Quality Guide Review, Chapter 2
– Were there any surprises?
– Things you learned that you did not know?
– Where were you (in the adult education system)
when the NRS took shape? And has your opinion
about the NRS changed since then?
What is the NRS?
What is the National Reporting
System (NRS) for Adult Education?
• The accountability system for the federally supported
adult education program
– Provides a framework for reporting information on
students and their outcomes to a federal audience.
• Includes core (required) and secondary (optional)
measures
• Outlines methodologies for collecting the measures
The NRS Does Not
• Preclude states from collecting other measures and
using them for state accountability.
• Preclude use of other educational functioning levels.
• Specify which test or assessment must be used.
• Require which software or record-keeping system to
use.
• Specify skill areas or topics to teach students.
Brief History of the NRS
Brief History: Overview of the NRS
• Wave 1: Development
– Pre-NRS: Emergence of accountability
– Development process
• Wave 2: Implementation
– Refinement of measures and procedures (1999–2001)
– State implementation (2000–2003)
– Training and T/A on requirements, data quality & use
• Wave 3: Focus on Improvement
• Wave 4: Maturity and Growth
History of Data Diving
• Data Quality – 2002
– “What!? You can’t make me!”
• Data Use – 2003
– “Huh? Are we still talking about data?”
• Data Monitoring for Program Improvement – 2004
– “Oh, data can do that?”
• Developing an MIS – 2004
– “Web-based? Nah, my floppy disk works fine.”
• Developing State and Local Report Cards – 2005/2007
– “Now this is fun! But don’t tell Mike we said so.”
History of Data Diving (Cont’d.)
• Learning to be an NRS Data Detective – 2006
– “I’m really good at this data work.”
• Desk Monitoring – 2007
– “How useful! I’m saving time and money.”
• Building and Sustaining Quality in the NRS – 2008
– “It’s review, but we’ve got new staff who need
it…great!”
• The Local Connection: Building a Data Use Learning
Community – 2009
– “We need to build a culture of local data use.”
• Diving Deep into the NRS Local Pool – 2010
– Local programs dive deep into data to make changes
NRS Data Diving
Why Look at Data?
Data help us to…
• Replace hunches and anecdotes with facts
concerning the changes that are needed;
• Identify root causes of problems;
• Identify whether student or program goals
are being met; and
• Tell our stakeholders, including students,
about the value of our programs and the
return on their investments.
Data: A Carrot or a Stick?
Data may be used…
• To highlight, clarify, and explain what’s happening
in your program
OR
• To show what’s not happening in your program.

“However beautiful the strategy, you should
occasionally look at the results.”
–W. Churchill
Data Tell You
• Where you’ve been
• Where you are
• Where you’re going
• How to get there
Data can help you design a quality program that
meets learners’ goals.
The Importance of Data
The importance of data for administrators,
policymakers, and teachers in the
classroom—to be able to break data down
and know where the strengths and
weaknesses reside—is crucial if you want
to make any kind of improvement.
-Monte Moses, Superintendent,
Cherry Creek (CO) School District
Applied to Adult Education…
Data can:
• Guide you to improve instruction
• Measure program success & effectiveness
• Tell you if what you are doing is making a
difference
• Tell you which classes are getting the results
you want—and which are not
• Get to the root of problems, such as poor
retention, low educational gains, or low
transition rates
Starting the Dive
Starting the Dive
• Attendance
• Educational Gain
• Transition Outcomes
Attendance
• Contact hours of instruction the learner
receives (NRS)
• Includes intensity and duration
• Can help to tell us whether:
– Instruction is successful
– Content and materials are relevant
– Students are motivated
– Students are reaching their goals
See Guide pages 10-20
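Where a program can export raw attendance records, intensity and duration are easy to compute. Below is a minimal Python/pandas sketch, assuming a hypothetical table with one row per student per week attended; all column names are illustrative, not the NRS tool’s actual layout.

```python
import pandas as pd

# Hypothetical attendance records: one row per student per week attended.
attendance = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 3],
    "week":       [1, 2, 3, 1, 2, 1],
    "hours":      [6, 4, 5, 8, 8, 3],
})

per_student = attendance.groupby("student_id")["hours"].agg(
    total_hours="sum",       # total contact hours (the NRS measure)
    weeks_attended="count",  # duration: number of weeks with any attendance
    hours_per_week="mean",   # intensity: average hours per attended week
)
print(per_student)
```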
Examples: What Increases Attendance
• Quality instruction and relevant content
• Well-trained teachers
• Clear goals set at intake, revisited regularly,
and matched to teachers and content
• Reduction of obstacles – flexibility in
programming, support services, and access to
site off-hours
(NCREL; Lieb, 1991; Comings, 2007; Beder, 1988; Beder, 1991; Comings,
Parella, & Soricone, 1999; Kerka, 2005; Thoms, 2001; Porter, Cuban &
Comings, 2005)
Educational Gain
• Advancement through 12 educational functioning
levels
• Core NRS measure
• Can tell us:
– Whether the program/students are meeting goals
– Which sites/classes/teachers are most effective
– Extent of student progress
– Impact of changes
See Guide pages 20-31
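As an illustration, level completion is a simple calculation once entering and exiting educational functioning levels are recorded. A minimal Python/pandas sketch, using hypothetical records with levels coded as integers:

```python
import pandas as pd

# Hypothetical records: entering and exiting educational functioning level.
students = pd.DataFrame({
    "site":     ["A", "A", "B", "B", "B"],
    "pre_efl":  [2, 3, 2, 4, 1],
    "post_efl": [3, 3, 3, 4, 3],
})

# A student "completes a level" if the exit level exceeds the entry level.
students["completed_level"] = students["post_efl"] > students["pre_efl"]

# Share of students advancing at least one level, by site.
print(students.groupby("site")["completed_level"].mean())
```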
Examples: What Increases Ed Gain
• Make classes learner-centered
• Focus on relevant knowledge
• Opportunity for practice and application
• Coherence
• Sufficient intensity and duration
(NRC, 1999; Garet, Porter, Desimone, Birman, & Yoon, 2001)
Transition Outcomes
• Four “follow-up” measures: entered employment,
retained employment, entry into postsecondary
education, attainment of a secondary credential
• Only track students with the related goal
• Transition outcomes can tell you
– Whether the program/students are meeting goals
– Which sites/classes/teachers are most effective
– The extent of student progress
– The impact of changes made in the program
See Guide pages 32-44
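Because these measures count only students with the related goal, the denominator matters. A minimal sketch of that rule for one measure, using hypothetical column names:

```python
import pandas as pd

# Hypothetical student records for the "entered employment" measure.
students = pd.DataFrame({
    "goal_enter_employment": [True, True, False, True],
    "entered_employment":    [True, False, False, True],
})

# Denominator: only students who set the related goal.
with_goal = students[students["goal_enter_employment"]]
rate = with_goal["entered_employment"].mean()
print(f"Entered employment rate: {rate:.0%}")  # 67% of goal-setters
```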
Examples: What Increases Outcomes
• Match student needs and goals
• Use SMART goal setting
• Tie content to employment or academic skills
and necessary background knowledge
• Utilize college-based instructional practices (for
college transition)
• Build strong relationships with external entities
(Zafft, Kallenbach, and Spohn, 2006)
Discussion: Challenges & Strategies
• Refer to “Challenges & Strategies” Handout
• At your table (20 minutes)
– Choose one topic: Attendance, Educational Gain,
or Transition Outcomes
– Describe challenges you have had in this area
– Describe strategies you used to address this issue
• Report out—one person per table
See Handout 1: “Challenges & Strategies: Attendance,
Educational Gain, and Transition Outcomes”
Levels of Diving
Diving Deep: Analyzing and Improving
Performance
• Data disaggregation – diving deep requires access to
disaggregated data to analyze the site,
classroom/teacher, and student information to
identify issues and evaluate solutions
• Data comparisons – meaningful analysis requires
comparison data from other programs or classes, the
state, or a defined standard
• Questions – a good analytic plan for program
improvement begins by asking specific questions
about data and program performance
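To make the first two points concrete, here is a minimal Python/pandas sketch of disaggregating a completion rate by site and comparing it to a state benchmark; all names and figures are made up:

```python
import pandas as pd

# Hypothetical class-level results, disaggregated by site.
classes = pd.DataFrame({
    "site":            ["North", "North", "South", "South"],
    "class_id":        ["AM", "PM", "AM", "PM"],
    "completion_rate": [0.52, 0.31, 0.44, 0.48],
})
state_average = 0.42  # comparison data: a defined standard

by_site = classes.groupby("site")["completion_rate"].mean()
print(by_site - state_average)  # gap vs. the benchmark, by site
```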
Levels of Diving: Data Quality
Data Quality
Data Quality = Good Data Collection Process
Good Data Collection requires:
1. Staff training in data collection, administration,
and review; staff roles are clear
2. Standardized processes for entering, checking,
and correcting information
3. Clear, common understanding of data definitions,
forms, and coding through written documentation
4. Continuous monitoring by staff at multiple levels
(program director, data administrators, teachers,
students)
Do You Trust Your Data?
Data analysis is only as good as the
original data allow.
Keys to good data collection systems include:
• Clear policies and procedures for data entry
• Data is entered & reviewed daily, weekly, or monthly
• Teachers, staff, and administrators all have access to data
and review it regularly
• Teachers share data with students
What does your program do to ensure data is
accurate, reporting is timely, and staff have
access to the data?
Tips for Monitoring Data Quality
1. Perform error and validity checks: Check ranges
2. Look for trends over time: Flag changes, e.g., > 10%
3. Compare data within and across programs
Look for the red flags.
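A minimal Python/pandas sketch of the first two tips, using made-up scores and enrollment counts; the valid score range for a real assessment would come from the test publisher:

```python
import pandas as pd

# Tip 1: error and validity checks (check ranges).
scores = pd.Series([512, 530, 487, 1030, 544])     # hypothetical posttest scores
out_of_range = scores[~scores.between(200, 800)]   # 200-800 is an assumed range
print("Out-of-range scores:", out_of_range.tolist())

# Tip 2: look for trends over time; flag changes greater than 10%.
enrollment = pd.Series({"PY 2006": 410, "PY 2007": 455, "PY 2008": 390})
change = enrollment.pct_change()
print("Red flags:", change[change.abs() > 0.10].index.tolist())
```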
Dive into the National Pool
For each of the next few slides
showing national data,
write down five observations
for discussion
• What do you see?
• What is interesting or unusual?
• Do any questions or hypotheses come to mind
as a result?
See Handout 2: “Dive
into the National Pool”
Write 5 Observations/Thoughts
Educational Gain Trends for States with the Highest Percent of
Students Completing a Level in PY 2007-08: PY 2003-04 to PY 2007-08*
Write 5 Observations/Thoughts
Educational Gain Trends for States with the Lowest Percent of Students
Completing a Level in PY 2007-08: PY 2003-04 to PY 2007-08*
Write 5 Observations/Thoughts
National Trends for Transition Outcomes: PY 2003-04 to PY 2007-08
Write 5 Observations/Thoughts
States with Highest Percentage of Students Entering Employment
in 2007-2008: PY 2003-04 to PY 2007-08
Write 5 Observations/Thoughts
Attendance Trends for States Reporting the Highest Attendance
Hours per Students Enrolled in PY 2007-08: PY 2003-04 to PY 2007-08
Take a Dip in the Data Pool
Take a Dip in the Data Pool
Data Carousel Activity
• Participants count off 1 to 5, then split into 5
groups
• Each group begins at one wall chart
• Write observations, possible causes, and next
steps (spend only 3 minutes per graph)
• Cycle to next graph until all 5 graphs are analyzed
• Return to first graph when done; report out on
observations, possible causes, next steps
See Handout 3: “Data Carousel Exercise”
Take a Dip in the Data Pool (cont’d.)
Tell Your Program’s Story
• Each table is assigned one program (Capitol
City, Rugbyville, Quaintly, Brasiltown, Pronto)
• Consider the observations, possible causes,
and next steps for your assigned program
• Write the “story” of your program based on
data
• Report out your program’s story
Handout 3 (cont’d.)
Diving Deep into Classroom Data
Data Carousel Activity: Class Level Data
• Exercise for Local Programs to use with
program staff and teachers
• Instructions for activity same as first Data
Carousel
• Use fake data first (it is non-threatening), then
substitute real data and alternate graphs
Wading Further In…
NRS as Inquiry
Inquiry-based learning is learning
directed by students’ questions. It
requires students to work together to
solve problems (based on their
questions) rather than receiving direct
instructions on what to do from the
teacher.
NRS as inquiry helps local programs
to ask questions about their data and
discover what’s happening in the site
and classroom—by looking at NRS
data.
(Source: http://en.wikipedia.org/wiki/Inquiry-based_learning )
Begins with a Question
» What is a central question you
have about your program or sites
in the area of attendance,
education gain, or transition
measures?
» Your question may address
whether the data are of high
quality, but it may not. If the data
quality is poor, you will end up at
this question quickly.
Levels of Diving: Students, Teachers,
Instruction, and Program & Policies
Questions: Attendance & Retention

Data collection & quality
• Sample: Who enters attendance data at each site? How often is attendance data entered?
• Further: Who checks the data? How often?

Students
• Sample: How does attendance differ by student type (ESL vs. ABE)?
• Further: When in the term do students tend to drop/stop-out most? Is this the same across sites?

Teachers
• Sample: Which classes have very high (or low) attendance?
• Further: Do teachers with high attendance have greater educational gains?

Instruction
• Sample: Does attendance vary by instructional content (e.g., GED, workplace) or level?
• Further: How many hours does it take to achieve a goal, on average?

Program
• Sample: What is the average attendance for my program?
• Further: Are my program’s attendance hours similar to other programs’?

Program policy
• Sample: Are my managed enrollment classes more successful than open classes?
• Further: Does managed enrollment result in higher ed gains or greater goal achievement?
Questions: Educational Gain

Data collection & quality
• Sample: What is the range of pre/posttest scores in my program/site?
• Further: Are all the test scores within the correct range for the test and class level?

Students
• Sample: Which students are most likely to complete a level (student characteristics)?
• Further: Do students with higher contact hours have greater completion rates?

Teachers
• Sample: What teacher characteristics are most related to level completion?
• Further: How high is teacher turnover at each site? Which sites retain teachers longest/best?

Instruction
• Sample: Which instructional approaches have the greatest impact on gain?
• Further: Do assessments match course content?

Program
• Sample: How many hours of PD do our teachers participate in?
• Further: Which PD have the greatest impact on student learning?

Program policy
• Sample: Do placement policies differ among sites?
• Further: Which placement policies have an impact on educational gains?
Questions: Transition Outcomes

Data collection & quality
• Sample: How are data collected, e.g., who administers the survey or oversees the match?
• Further: How do response rates or matches differ by site? Who verifies the data?

Students
• Sample: Which types of students have which goals?
• Further: Which student variables are related to the goals set, e.g., postsecondary placement?

Teachers
• Sample: How do goals set compare across similar classes (i.e., ESL to ESL)?
• Further: Does goal achievement differ by class?

Instruction
• Sample: Do college transition classes show greater goal achievement for post-secondary?
• Further: How does college transition classes’ course content differ from other ASE classes?

Program
• Sample: How do sites compare with each other and the program average on goal achievement?
• Further: For sites with high goal achievement, are there programmatic differences?

Program policy
• Sample: What linkages exist between the program and colleges/businesses?
• Further: Do sites/classes with linkages show different outcomes than those without linkages?
What Do I Want to Know?
With what question would you like to
begin your local program line of
inquiry?
See Handout 4: “What do I want to know?”
One question leads to more questions…
…and it becomes a Data Dive.
Make A Dive Plan
How well are we serving students who
are seeking a GED?
• Create (individually) a dive plan for this question
• Share your dive plans and reflect
– Did the dive plans begin the same way?
– Did they cover the same topics? Same order?
– What additional data did others consider?
See Handout 5: “Make A Dive Plan”
Examples to begin…
How well are we serving
students seeking a GED?
• What are the percentages of students with GED as
a goal by class type (e.g., Low ASE),
averaged over the last three years?
• How do our pass rates for the GED
compare across classes and to the general
public? What percentage of scores are
above 3000?
• What is the average number of contact hours
for GED attainment by class? Do day or
evening programs have a higher
attainment rate?
Wading Further In…(cont’d.)
Ways to Cut the Data
1. What does it look like now?
– What are the mean, median, mode, and range (i.e., the
shape and range of the distribution)?
2. How does it compare?
– Over time
– To peers, now and over time
– To the state/nation or a benchmark, now and over time
3. How do the different subgroups contribute?
– Redo 1 and 2 with subgroups
4. Are any subgroups doing better in certain
circumstances? (i.e., look for interactions)
Ways to Cut the Data (Cont’d.)
5. What are the key relationships (ratios) to outcomes?
– Speed of improvement (outcomes/time)
– Efficiency (outcomes/money)
– Impact (outcomes/participants or dosage)
6. What other factors might be related?
– E.g., attendance on completion; professional
development on student completion
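As an illustration, the three ratios in item 5 are simple divisions. A minimal Python sketch with entirely made-up program figures:

```python
# Hypothetical annual program figures.
outcomes     = 180      # e.g., students completing a level
months       = 12
dollars      = 250_000  # program funding spent
participants = 600

print("Speed:",      round(outcomes / months, 1),           "completions/month")
print("Efficiency:", round(outcomes / (dollars / 1000), 2), "completions per $1,000")
print("Impact:",     round(outcomes / participants, 2),     "completions per participant")
```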
Checklist: Ways to Look at your Data
 Quality: Are the data credible?
 Averages: What is the average?
 Variation: What are the ranges?
 Rates: What are the outcomes by time, cost, or
resource?
 Benchmarks: On each of the above, how do we
compare to the standard or our best peers?
 Trend: On each of the above, how do the numbers
change over time?
Strengths and Areas to Target
• Consistently identifying the highest and lowest
performers will provide information you might
use later in developing a plan for making
changes, for example:
– Building on your strengths, including people
(e.g., Gallup)
– Being targeted (tiered?) in providing
technical assistance to lowest performers
(e.g., response to intervention)
Visualize Your Dive
Diving Details: Analysis, Graphing
 Qualitative and Quantitative
 Choosing the right graph
 Data display recommendations
Quantitative vs. Qualitative
Select quantitative or qualitative data,
depending on the goals
Quantitative | Qualitative
Number of contact hours | Classroom environment
Retention rates | Intake and orientation
Educational gain | Professional development
Pre-posttest rates | Teacher interview reports
Follow-up outcomes | Follow-up processes
Goal setting/goal attainment | Other services (e.g., child care)
Student demographics | Community partnerships
Teacher characteristics | State or program policies
Knowing Which Graph to Choose
For example:
• Line graph: Use with trend data or correlations ONLY
where the x-axis is a continuous variable (called
“time series” when y is percent/quantity and x is
time)
• Bar graph: Use the same as a line graph except bar
graphs can also handle discontinuous (categorical)
data.
• Pie chart: When comparing the relative contributions
to a whole at one point in time. If over time, either do
a row of pie charts or bar charts that show
composition (stacked bar charts).
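A minimal matplotlib sketch of the three chart types with made-up data; the right choice depends on whether the x-axis is continuous, categorical, or parts of a whole:

```python
import matplotlib.pyplot as plt

fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(12, 3))

# Line graph: continuous x-axis (a time series).
years = [2004, 2005, 2006, 2007, 2008]
ax1.plot(years, [38, 41, 40, 44, 46], marker="o")
ax1.set_title("Level completion trend (line)")

# Bar graph: categorical (discontinuous) x-axis.
ax2.bar(["ABE", "ASE", "ESL"], [42, 38, 47])
ax2.set_title("Completion by program type (bar)")

# Pie chart: relative contributions to a whole at one point in time.
ax3.pie([42, 38, 20], labels=["ABE", "ASE", "ESL"], autopct="%1.0f%%")
ax3.set_title("Enrollment composition (pie)")

plt.tight_layout()
plt.show()
```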
Further resources:
• http://extremepresentation.typepad.com/blog/2006/09/choosing_a_good.html
• Visualization Methods (click on each cell to see examples):
http://www.visual-literacy.org/periodic_table/periodic_table.html
Plan Your Dive
Reminder: Levels of Diving
Create Your Dive Plan
With what question would you like to
begin your data dive?
• Reference questions selected on Handout 4
• Thoughts on what to include in your dive plan
• Draft your dive plan
• Peer-to-peer sharing of dive plan
• Revise your dive plan
See Handout 6: “Create Your Dive Plan”
Diving Deep into Data
Diving Deep: NRS Local Data Tool
Open up the NRS Local Data Tool from
http://tinyurl.com/divingdeep
Introduction to “My Program” Report
Building “My Program” Report – a report to track the
change you have identified; serves as a data review
tool for you and your staff in your local program
My Program Report includes:
 Standard, regularly reviewed data and charts
 Ad hoc charts that change regularly, depending on
staff needs/interests
Using the Tool: Identify and Describe
Strengths and Challenges for Local Programs
• Pick one local program and use other programs and
state data for comparison charts.
• Identify the area of focus: Attendance, Ed Gain, or
Follow Up Measures (Goals)
• Make observations, identify additional data to review,
consider your plans.
• At the end of the section, explain what you learned—
what are the strengths and challenges of your program?
• Identify at least one change.
• Explain how you will measure that change: what data
will be used to verify that the change was implemented
and to evaluate its impact.
Telling Your Local Program Story
Each breakout room assigned one data dive (Attendance,
Educational Gain, or Transition Outcomes)
• Address the question identified from “What do I want to
know about my program?”
• Using local program data, create 3-5 graphs, tables, or
charts to investigate your questions; identify strengths
and areas for improvement; notice patterns/themes in
your data
• Report out on your local program story
Continue until you have completed your “My Program”
Report with questions on each: Attendance, Educational
Gain, and Transition Outcomes
By the end of the day…
You should have the “My Program” report completed
with at least three graphs.
Report-outs in breakout rooms using the My Program
booklet or PowerPoint slides.
Deep Diving to Make a
Change
The greatest danger for most of us
is not that our aim is too high
and we miss it,
but that it is too low and we reach it.
Michelangelo
Making Data Diving Changes:
Attendance
Making changes to improve attendance or
persistence? Consider the research to ensure
that
• Instruction is successful;
• Content and materials are relevant;
• Students are motivated to show up and persist over
time; and
• Students are reaching their goals.
See Guide pages 10-20
Making Data Diving Changes:
Educational Gain
Making changes to improve educational gain?
Consider the research to ensure that
• Students have clear goals for achievement,
• Classes are student-centered and learner-driven,
• Tests measure student learning and progress,
and
• Students are engaged and learning is relevant.
See Guide pages 20-31
Making Data Diving Changes:
Transition Measures
Making changes to improve transition outcomes and
goal-setting? Consider the research to ensure that
• Students are setting goals,
• Transition models or pipelines help students to meet
their goals,
• Instruction is effective and bridges the gaps in
academic skills and background knowledge,
• Students are effectively assessed,
• Changes in student progress are tracked, and
• Programs build effective relationships with external
entities (colleges, workforce).
See Guide pages 32-44
Making a Change
Re-examine your data and clarify your focus:
• What changes do you want to make in your local
program?
• What are the strengths of your program and how
will you replicate them?
• State Teams Report Out–3-5 minutes:
– What we learned
– What we will change, and
– How we will evaluate/measure that change
See Handout 7: “Making a Change in
Your Local Program”
What Do You Need to Make a Change?
• Local Program Director: Identify the data,
resources, and other tools you will need to
support your plans for local program change
and prioritize.
• State Team: Discuss the list of local program
needs and priorities as a state team. Identify
which resources the state can provide in the
near-term, the long-term, or not at all currently.
Share highlights with the group.
See Handout 8: “What do you Need to
Make a Change? The Wish List”
Making a Splash!
Action Plan
• Local Plan for Making a Change – as a local
program director, how will you introduce this
change to your staff? How will you involve them
in reviewing data, recognizing the change,
implementing the change, tracking data to
review and evaluate the change?
• State Plan (for state teams without local staff in
attendance) – Professional Development on
Creating a Local Data Report
Local Program Making a Change Plan
Example plan entry:
• Change Identified: New goal-setting process
• My Program Report Data: Goals, follow-up, attendance
• Resources Needed: Access to MIS; reports on similar programs’ data
• Staff to Review/Update: Program director, MIS person, teachers (names here)
• Timeframe to Implement: Fall & spring terms; review monthly
• Evaluation Method/Measure: 20% increased ed gain with ABE students; student & teacher survey feedback on goal-setting

See Handout 9A: “Action Plan:
Local Program Making a Change”
Feedback on Action Plans
Provide feedback to your colleagues
• What is the change identified? Is it achievable?
• What data was provided to support this change and
what additional data will need to be reviewed?
• Is access to the needed data and resources available?
• Is the timeframe for this change realistic?
• What is the follow-up plan to measure and evaluate this
change?
• What additional questions do you have for the state
team/local program?
• What suggestions do you have for the state team/local
program?
See Handout 9B: “Feedback Form:
Local Program Change Plan”
State Roll-out of PD Plan for Local
Programs
Plan columns:
• Activities Planned
• Staff Responsible
• Data Needed for Locals in PD
• Participants
• Timeframe to Implement
• Evaluation Method/Measure
See Handout 10: “Action Plan: State Roll-out of
Professional Development Plan for Local Programs”
Next Steps, Wrap-up, and
Evaluation
• Follow-up Webinar: State & Local Program staff
report on:
– Breaking through data culture barriers in
Local Programs
– Successful changes
– Lessons learned
• Evaluation Form