I Have the Data and Now What?


Facilitating Ongoing Data Collection and Collaborative Conversations to Foster/Focus on School Improvement
Presenters:
Sylvia De La Pena
Betty Damiani
Data: We All Have It!
Now What?
On Your Own:
“Collect Your Thoughts”
My PSP Role(s)
Types of data the campus I serve collects/uses
What the campus does with the data
The campus collects data (when)…
Table Activity # 1
Talk It Up:
“Share Your Thoughts”
PSP Roles
CAM:
TAP:
CIT:
Here’s what data I collect/use:
So what does/might the data tell me?
Now what do I do to facilitate the use of the collected data?
Session Descriptor
This session will provide participants with data tools, strategies, processes, and guiding questions to facilitate data collection and data-driven discussions with the campus and LEA, shift to STAAR thinking, and foster campus data literacy. The presenters will show how they have used data in their various PSP roles to focus the campus on the road to school improvement.
How Do We Keep the Data Conversation and Thinking Going?
Tip # 1 – On the Road Checklist
Presenters will share how they:
O Facilitate data-driven dialogue with campus staff.
O Recognize the different levels of data literacy and provide support for continuous data literacy growth.
O Create templates/activities to collect a variety of data.
O Examine and modify current tools to address the needs of STAAR.
O Use guiding questions to facilitate data analysis, dialogue, reflection, and school process examination.
O Display/use data walls to spark dialogue and focus school improvement efforts.
O Value working with a PSP Learning Partner for continuous growth.
Tip # 2 - Set your GPS
Facilitation Skills + Guiding Questions + Tools = Ongoing Data Conversation and Collaboration
What Do Facilitators Do?
Learning
O Help participants develop a vision for the work
O Foster self-reflection
O Facilitate protocols
O Help the group track its progress
O Check in with participants in and out of meetings to gather feedback
O Provide resources and research
Longevity
O Help participants identify how the work is related to school goals and initiatives
Logistics
O Distribute relevant materials before, during, and after meetings
O Ensure meetings are scheduled into the school calendar
Strategy/Plan (Month of Completion)
O Identify all 10th grade students who did not pass 9th grade Math TAKS (Aug./Sept.) (a filtering sketch follows this list)
O Schedule non-masters into the TAKS intervention class (July-Sept.)
O Determine whether intervention will occur daily, the time of day, and who will be the instructor (July/Aug.)
O Schedule protected time for core teachers to meet and collaborate on curriculum, student needs, and student progress as revealed by data (July/Aug.)
O Identify students by special populations and TAKS test administered; audit credits and AARs of LEP and special education students (Sept.)
O Review class schedules to ensure targeted students are receiving support within the school day (July/Aug.)
O Meet with core teachers and ensure they have and understand the curriculum to be taught, the resources available, and the expectations for teaching and student learning (Aug.-May, ongoing)
O Create a folder for each student and become familiar with historical data, items missed, scale scores, and attendance (Sept./Oct.)
O Create a profile folder/chart for each student so that he/she can monitor progress toward goals (Sept./Oct.)
O Develop a calendar of skill-check tests or campus mini-assessment dates (Aug./Sept.)
O Develop an instructional calendar indicating alignment to the district curriculum, testable TEKS, skill checks, and spiraling (every nine weeks)
O Identify weekly collaborative meeting dates for core teachers; require a facilitator, agenda, and minutes/summary of the meeting discussion (Aug.-May, ongoing)
O After each FMA or campus skill check, use item analysis to develop growth targets and identify student needs to plan instruction and interventions (Sept.-May, ongoing)
O Identify the system used to enroll crisis/at-risk students and determine the type of intervention needed (July/Aug.)
O Identify students by category: Passed TAKS, Made Growth; Passed TAKS, Missed Growth; Failed TAKS, Made Growth; Failed TAKS, Missed Growth. Discuss the data for these categories (July/Aug.)
O Focus on embedded professional development as a means of developing professional growth through data discussion and analysis, TEKS rigor, curriculum alignment, lesson plan design, student engagement, and utilization of best practices (Aug.-May, ongoing)
O Identify all students by cohort (Sept.)
O Disaggregate non-graduates by sub-populations and by factors which may have contributed to non-completion: attendance, disciplinary removals, pregnancy, loss of credits, failure to pass TAKS (Sept./Oct.)
O Develop a plan to provide ongoing support and monitor potential non-completers (Sept./Oct.)
O Collaborate with administrative staff on walkthrough data collection; determine professional development needs based on accumulated data (Sept.-May)
O Administer the No Senior Left Behind form to senior teachers after spring break (April)
O Develop and implement a six-weeks Countdown to TAKS plan: identify students for appropriate grouping; offer flexible grouping, Saturday camps, tutoring, and targeted instruction during class; notify parents; adjust classes and… (Feb./March, ongoing)
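Several of the early steps in this timeline are simple roster filters. The sketch below is a minimal illustration in Python/pandas, assuming a hypothetical roster extract with made-up columns (student_id, grade_level, taks_math_met_standard); an actual campus would pull these fields from its own student information system.

```python
import pandas as pd

# Hypothetical roster export; column names are illustrative, not from a real system.
roster = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "grade_level": [10, 10, 10, 9],
    "taks_math_met_standard": [False, True, False, True],
})

# Identify all 10th graders who did not pass 9th-grade Math TAKS.
non_masters = roster[(roster["grade_level"] == 10)
                     & (~roster["taks_math_met_standard"])]

# This list feeds the next steps: scheduling the TAKS intervention class and
# verifying support within the school day.
print(non_masters["student_id"].tolist())  # -> [101, 103]
```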
 Fostering Data Literacy
Recognize the different levels of data
literacy and provide support for
continuous data literacy growth
Data Location
O Teachers in case study schools generally were adept at finding information shown explicitly in a table or graph.
Data Comprehension
O Teachers in case study schools sometimes had difficulty responding to questions that required manipulating and comparing numbers in a complex data display (e.g., computing two percentages and comparing them).
O Some case study teachers’ verbal descriptions of data suggested that they failed to distinguish a histogram from a bar graph or to consider the difference between cross-sectional and longitudinal data sets.
Data Interpretation
O Many case study teachers acknowledged that sample size affects the strength of the generalization that can be made from a data set and suggested that any individual student assessment administration may be affected by ephemeral factors (such as a student’s illness).
O Case study teachers were more likely to examine score distributions and to think about the potential effect of extremely high or low scores on a group average when shown individual students’ scores on a class roster than when looking at tables or graphs showing averages for a grade, school, or district. An implication of this finding is that teachers will need more support when they are expected to make sense of summaries of larger data sets as part of a grade-level, school, or district improvement team.
O Case study teachers’ comments showed a limited understanding of such concepts as test validity, score reliability, and measurement error. Without understanding these concepts, teachers are susceptible to invalid inferences, such as assuming that any student who has scored above the proficiency cutoff on a benchmark test (even if just above the cutoff) will attain proficiency on the state accountability test.
Data Use for Instructional Decision Making
O Many case study teachers expressed a desire to see assessment results at the level of subscales (groups of test items) related to specific standards and at the level of individual items in order to tailor their instruction. After years of increased emphasis on accountability, these teachers appeared quite sensitive to the fact that students will do better on a test if they have received instruction on the covered content and have had their learning assessed in the same way (e.g., same item format) in the past.
O Many case study teachers talked about differentiating instruction on the basis of student assessment results. Teachers described grouping strategies, increased instructional time for individual students on topics they are weak on, and alternative instructional approaches.
Question Posing
O Many case study teachers struggled when trying to pose questions relevant to improving achievement that could be investigated using the data in a typical electronic system. They were more likely to frame questions around student demographic variables (e.g., “Did girls have higher reading achievement scores than boys?”) than around school variables (e.g., “Do student achievement scores vary for different teachers?”).
Excerpt from U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Teachers’ Ability to Use Data to Inform Instruction: Challenges and Supports, Washington, D.C., 2011.
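The “data comprehension” hurdle noted above (computing two percentages from a display and comparing them) is worth making concrete. A small worked example, with invented numbers:

```python
# Worked example of the comparison task described above (numbers invented).
# Group A: 45 of 60 students met standard; Group B: 38 of 50 did.
met_a, tested_a = 45, 60
met_b, tested_b = 38, 50

pct_a = 100 * met_a / tested_a  # 75.0
pct_b = 100 * met_b / tested_b  # 76.0

# Group B's rate is one point higher even though Group A passed more students.
print(f"Group A: {pct_a:.1f}%  Group B: {pct_b:.1f}%")
```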
How Does the PSP “Move” the Campus to the Next Level?
Data → Information → Knowledge
Process of Transforming Data Into Knowledge
Collecting → Organizing → Summarizing → Analyzing → Synthesizing → Decision Making
Adapted from Keeping Teachers in the Center: A Framework of Data-Driven Decision Making, Daniel Light, Education Development Center, Inc., Center for Children and Technology, USA, 2004
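One way to make Light’s continuum concrete is to trace one small data set through its first four stages. A minimal sketch, with hypothetical item scores; the column names are placeholders:

```python
import pandas as pd

# Collecting: raw item-level results (hypothetical scores).
raw = pd.DataFrame({
    "student": ["Ana", "Ben", "Cam", "Dee"],
    "item_1": [1, 0, 1, 1],  # 1 = correct, 0 = incorrect
    "item_2": [0, 0, 1, 0],
})

# Organizing: reshape so each row is one student-item response.
organized = raw.melt(id_vars="student", var_name="item", value_name="correct")

# Summarizing: percent correct per item.
summary = organized.groupby("item")["correct"].mean().mul(100)

# Analyzing: flag items below a threshold for the reteach conversation.
print(summary[summary < 50])  # item_2 at 25% becomes the discussion focus

# Synthesizing and decision making remain with the teacher team, which decides
# what the weak item implies for instruction.
```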
Table Activity # 2
Take 3 minutes to reflect on where in
the continuum your campus is most
comfortable and tends “to rest”
Collecting
Organizing
Summarizing
Analyzing
Synthesizing
Decision making
 Data / Data Tools
Create templates and
activities to collect a
variety of data.
Proposed Road Trip Norms for Looking at Data: Why is this important?
Table Activity # 3
O Generate a list of norms for looking at data
O Discuss why this is important
Proposed Norms for Looking at Data – Why is This Important?
O Describe only what you see, i.e., just describe the data in front of you.
O Resist the urge to immediately work on solutions.
O Seek to understand differences.
O Ask questions when you don’t understand.
O Surface the lenses and experiences you bring to the data.
O Surface assumptions and use data to challenge and support them.
What Data Might Be Considered at the Beginning of the 2011 School Year for STAAR Preparation?
TAKS or Pre-STAAR Beginning-of-Year Data Review
Types of Data
• 2011 TAKS Performance (Field Guide – Knezek): item analysis (Student Expectations) for All Students, Commended, Economically Disadvantaged, LEP, and teacher-level data
• Pearson Reports
• PEIMS Report
• End-of-Year Grade Report
• Preliminary AYP Data
• Preliminary AEIS Data
• 2-Year TAKS Performance
Data Skills
Location
Comprehension
Interpretation
Tip #5 - Be Prepared for the Unexpected
Ongoing Data Review
Watch: Student Groups and Student Expectations! Will this change with STAAR?
O Campus assessments
O District benchmarks
O Grades
O Failure reports
O Discipline data
O Attendance data
O Dropout data
All Students
Commended (# and %)
College Ready (# and %)
Met Expectations (# and %)
Missed by 1-5 questions (# and %)
Missed by 6 or more questions (# and %)
Special Populations
Commended
College Ready
Met Expectations
Below Expectations
Missed by 6 or more questions
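Once each student is labeled with a performance category, the “# and %” breakdowns above reduce to a count-and-divide. A sketch assuming a hypothetical results table; the category labels mirror the slide, but the data is invented:

```python
import pandas as pd

# Hypothetical results; in practice the category comes from scale-score rules.
results = pd.DataFrame({
    "student_id": range(1, 9),
    "category": ["Commended", "Met Expectations", "Met Expectations",
                 "Missed by 1-5", "Missed by 1-5", "Missed by 6+",
                 "Commended", "Met Expectations"],
})

# Build the slide's "# and %" columns for every category.
counts = results["category"].value_counts()
percents = counts.div(counts.sum()).mul(100).round(1)
print(pd.DataFrame({"#": counts, "%": percents}))
```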
What Does the Data Tell You?
Taking it to the Next Level
Table Activity # 4: Case Study
Use the Question Template to
Facilitate Data Analysis
What might be the data implications for:
O instructional decisions?
O curricular decisions?
O student learning?
O school processes?
O school culture?
O professional development?
O school leadership?
What is the Story Being
Told by the Case Study
Data?
Why do you think the story played itself out this way?
Will the campus be ready for STAAR? Why or why not?
Facilitate Data-Driven Dialogue
Use guiding questions to
facilitate data analysis,
dialogue, reflection, and
school process examination.
You don’t need an advanced
degree in statistics and a room
full of computers to start asking
data-based questions about your
school, and using what you learn
to guide reform.
- Victoria Bernhardt
Using TAKS/Benchmark Guiding
Questions to Shift from TAKS to
STAAR
Benchmark Guiding Questions
O How much of the tested curriculum had been taught? To what level of depth and complexity?
O Analyze TEKS benchmark performance.
O Compare the analysis to the level of instruction.
O Compare the analysis to district/state performance.
O How do you prioritize SEs/TEKS to be taught/reviewed/further tested?
O Analyze Pre-AP/AP performance. Are passers at the college-ready or commended level?
O Identify the types of questions most missed, by SE/TEKS (a grouping sketch follows this list).
O Are identified groups of students missing certain types of questions? Yellow, red, SPED, LEP? Blue, green?
O How are interventions tailored to specific needs?
O Are all yellow and red 2010 TAKS students receiving support/intervention within the school day?
O Based on data and causal factors, what are the instructional implications?
O How do you ensure instruction is paced to maximize teaching and learning?
O How do you ensure that all SEs/TAKS objectives will be taught, reviewed, and assessed?
O What does differentiation look like in your classroom?
O How timely is your assessment feedback to your students?
O What does assessment feedback and student self-monitoring look like in your classroom?
O Did you notice any difference in performance on the last half/fourth of the test?
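Several of these questions (e.g., identifying the question types most missed by SE/TEKS) come down to grouping item results. A minimal sketch with invented miss rates; the SE codes are placeholders:

```python
import pandas as pd

# Hypothetical item-level benchmark results tagged by Student Expectation (SE).
items = pd.DataFrame({
    "item": [1, 2, 3, 4, 5, 6],
    "se": ["A.1", "A.1", "B.2", "B.2", "B.2", "C.3"],
    "pct_missed": [12, 18, 44, 51, 47, 9],
})

# Average miss rate per SE, highest first: B.2 surfaces as the SE to
# prioritize for reteaching and further testing.
by_se = items.groupby("se")["pct_missed"].mean().sort_values(ascending=False)
print(by_se)
```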
Discussion Questions for 2011 End of Year Meeting
1. Review TAKS results in core areas: provide an analysis of performance by # testing, # passing, and % passing, with and without TPM, for All Students, Hispanic, White, African American, and Economically Disadvantaged.
2. Did any cells meet standard through required improvement?
3. Compare results to ’10 and identify strengths and weaknesses.
4. Compare results to district performance and feeder school performance.
5. Identify TEKS not mastered. How will this information be used to address student needs?
6. What was the average scale score in each core area? How does it compare to ’10?
7. What was commended performance in each core area? How does it compare to ’10?
8. What implications does TAKS data have for instructional focus next year?
9. Review data by teacher. Are there any significant findings that will impact teacher assignment for 2011-’12?
10. Did you find attendance and/or discipline to be a factor with TAKS non-masters and/or TAKS masters who did not show growth?
11. What instructional practices contributed to the performance results (strengths and weaknesses)?
12. How is walkthrough data used to improve instructional practices? Is there evidence?
13. What professional development contributed to success in low-performing areas? Is professional development external training or job-embedded? Explain.
14. Describe the ’10-’11 intervention program and any changes based on data for next school year.
15. How do you ensure fidelity to the curriculum and quality planning and assessment (written, taught, tested)?
16. What multiple data sources are you using to determine school culture and climate?
17. Is there a common vision and purpose for the campus among staff?
18. Is there a common understanding and action about what will happen when a student experiences difficulty in learning?
19. Are there specific commitments the staff must honor to achieve purpose and vision? If so, how is this approached?
20. What are the most essential conditions and factors which must be monitored on an ongoing basis? Do you have systems in place to do this?
21. If you are using common assessments, answer the following:
O What evidence do you have that the results of common assessments are used to identify students who require additional time and support for learning?
O What evidence do you have that the results of common assessments are used to identify instructional strengths and weaknesses?
22. What is the correlation between TAKS performance and student grades? (A computation sketch follows this list.)
23. What support did you receive from the district this year?
24. Did anything change for teachers this year?
25. What core values did administration/staff hold to?
26. What support will administration need to move the campus to the next level?
Some questions taken from Learning by Doing, Solution Tree and Texas Education Agency Turnaround Center; others developed by Sylvia De
La Pena, External CIT
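Question 22 reduces to a single statistic once TAKS scores and course grades sit in one table. A sketch with invented data; the column names are placeholders, not fields from any real report:

```python
import pandas as pd

# Hypothetical pairing of TAKS scale scores with course grade averages.
df = pd.DataFrame({
    "taks_scale_score": [2100, 2250, 1980, 2400, 2050, 2300],
    "course_grade": [78, 85, 70, 92, 74, 88],
})

# Pearson correlation: a value near +1 suggests grades track TAKS performance;
# a weak value invites a conversation about grading practices.
r = df["taks_scale_score"].corr(df["course_grade"])
print(f"correlation: {r:.2f}")
```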
Table Activity # 5
1. Using the case study data, which guiding question(s) could be used to facilitate data-driven discussion?
2. How can some of these questions be changed to reflective questions?
 Data Triangulation
Help campuses
examine
data
through multiple lenses
It is irresponsible for a school to mobilize, initiate, and act without any conscious way of determining whether such expenditure of time and energy is having a desirable effect.
- Carl Glickman
Observations about the Data
External Data: 10th Grade 2011 TAKS Math Performance
Internal Data: 10th Grade Math March Benchmark
Classroom Data: 10th Grade Math 6-Weeks Common Assessments
10th Grade 2011 TAKS Math (All Students)
• When was the test administered?
• Which skills were tested?
• How many items were tested?
• Did the campus meet the AYP standard?
• What was the standard?
• Analyze special population performance.
• What % met standard?
• What % met commended performance?
District Benchmark (All Students)
• When was the test administered?
• Which skills were tested?
• How many items were tested?
• Did the campus meet the AYP standard?
• What was the standard?
• Analyze special population performance.
• What % met standard?
• What % met commended performance?
For each data source, list observations that can be shared.
Next, analyze each source of data in more detail.
6-Weeks Common Assessment (All Students)
• When was the test administered?
• Which skills were tested?
• How many items were tested?
• Did the campus meet the AYP standard?
• What was the standard?
• Analyze special population performance.
• What % met standard?
• What % met commended performance?
• Did one or more classes perform better than another class?
• How many students are 2-3 items from meeting standard?
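Triangulation of the three sources above can start with a simple per-student merge. A minimal sketch with invented values, including the “bubble student” check from the last question; the column names are hypothetical:

```python
import pandas as pd

# Three hypothetical data sources for the same 10th-grade math students.
taks = pd.DataFrame({"student_id": [1, 2, 3],
                     "taks_met_standard": [True, False, False]})
benchmark = pd.DataFrame({"student_id": [1, 2, 3],
                          "benchmark_pct": [82, 64, 58]})
common = pd.DataFrame({"student_id": [1, 2, 3],
                       "items_correct": [38, 31, 29],
                       "items_needed": [33, 33, 33]})

# Triangulate: one row per student across all three lenses.
merged = taks.merge(benchmark, on="student_id").merge(common, on="student_id")

# Flag students 2-3 items from meeting standard on the common assessment.
gap = merged["items_needed"] - merged["items_correct"]
print(merged.loc[gap.between(2, 3), "student_id"].tolist())  # -> [2]
```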
Tip # 6 - Know Your Road Trip
Emergency Contacts and Resources
O Campus Support
O LEA Support
O ESC Support
O PSP Support
O SES
O Funding
SUBJECT:    GRADE LEVEL:    SCHOOL YEAR:    CAMPUS:
Here’s what the 2011 TAKS data says (consider all student/group performance, TEKS and objective performance, teacher performance, etc.):
Here’s what the spring benchmark data says:
Here’s what the grade-level semester/yearly performance says:
So what support, processes, structures, initiatives, etc. did the campus provide? How was support monitored for implementation and effectiveness?
So what support, processes, structures, initiatives, etc. did the district provide? How was support monitored for implementation and effectiveness?
Now what campus support should continue, change, be modified, be included, etc.?
Now what district support should continue, change, be modified, be included, etc.?
 Some Final Thoughts
Data Walls:
Display/use data to spark dialogue and focus school improvement efforts.
Professional Reading:
Foster the development of professional learning communities
around data through professional articles and book studies.
Collaborate:
Value working with a PSP Learning Partner for continuous growth
Thanks for sharing and participating with us
this afternoon. We hope you take away an idea
or two to help you better serve in your role as a
PSP.
Your PSP Learning Partners
Betty and Sylvia
Presenter Contact Information
[email protected]
[email protected]
References
• Allen, David, and Tina Blythe. The Facilitator's Book of Questions. New York: Teachers College Press, 2004.
• Bernhardt, Victoria. Data Analysis, 2nd ed. New York: Eye on Education, Inc., 2004.
• Cartoons. “Digesting the Data.” This Week in Education, 2010-11. http://scholasticadministrator.typepad.com
• DuFour, Richard, et al. Learning by Doing: A Handbook for Professional Learning Communities at Work. Bloomington, IN: Solution Tree Press, 2006.
• Evidence Project Staff. A Collaborative Approach to Understanding and Improving Teaching and Learning. Cambridge, MA: Harvard Project Zero, 2001.
• Sagor, Richard. Guiding School Improvement with Action Research. Association for Supervision and Curriculum Development, March 2000: 3-7.
• School Improvement Resource Center. Principal Planning Guides: A Powerful Resource for Principals. http://www.sirctexas.net/resources_forms.htm
• Texas Education Agency. Accountability Monitoring Intervention Activities: Focused Data Analysis Template. http://www.tea.state.tx.us/index2
• U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Teachers’ Ability to Use Data to Inform Instruction: Challenges and Supports. Washington, D.C., 2011.
• Vashisht, Rashmi. “The Importance of Triangulation and Multiple Measures for Data Assessment.” http://datacenter.spps.org/Rashmi Vashisht.ppt
• Wellman, Bruce, and Laura Lipton. Data-Driven Dialogue: A Facilitator’s Guide to Collaborative Inquiry. Connecticut: MiraVia, LLC, 2004.