Evaluating School Psychologists Within a Multi-tiered System of Supports Delivery Model: A New Era of Accountability
SSPEM
FASP Annual Conference
David Wheeler
George Batsche
Student Support Services Project
Overview
Student Success Act (S.B. 736): Setting the stage for professional personnel evaluations in Florida
Multi-Tiered System of Supports: Common Language/Common Understanding
Alignment Between the MTSS Model and Student Services Delivery in Florida
Why MTSS? Why Now?
School Psychologists’ Skills/Role in a Multi-Tiered Support System
Florida’s New Evaluation System
Overview of the Student Services Personnel Evaluation Model (SSPEM)
Florida’s New Evaluation System
The Student Success Act (SB 736)
1012.34, F.S.
Purpose of Student Success Act
For the purpose of increasing student learning growth by improving the quality of instructional, administrative, and supervisory services in public schools of the state.
1012.34(1), F.S.
Evaluation System Requirements
Designed to support effective instruction and student learning growth, and must be used in developing School Improvement Plans.
Provide appropriate instruments, procedures, and criteria for continuous quality improvement of professional skills, and results must be used when identifying professional development.
Include a mechanism to examine performance data from multiple sources, including parents when appropriate.
Identify teaching fields for which special evaluation procedures and criteria are necessary.
Non-classroom Instructional Personnel (Student Services)
Student learning growth data (50%) assigned over three years
OR
Combination of student learning growth data (30%) & measurable student outcomes specific to the assigned position
Instructional practice based on FEAPs and specific job expectations
Professional & job responsibilities
1012.01(3)(a)1.b, F.S.
Non-classroom Instructional Personnel (Student Services) - 1012.01(3)(a)1.b, F.S.
Student performance (50%)
Student learning growth as assessed by statewide or district assessments
OR
Combination of student learning growth data (30%) & other measurable student outcomes specific to the assigned position
Instructional practice (non-classroom instructional personnel)
Florida Educator Accomplished Practices (FEAPs)
May include specific job expectations related to student support
Professional and job responsibilities
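To make the statutory weighting concrete, here is a minimal sketch of how the components listed above might combine into a single score. The 50% and 30% + 20% splits come from the slide; the 0-100 scale, the function name, and the split of the remaining 50% between instructional practice and professional responsibilities are assumptions for illustration only, since the actual combination rules are set in each district's approved plan.

```python
# Hypothetical composite score on a 0-100 scale (illustration only;
# districts define the actual combination rules in their approved plans).
def composite_evaluation(growth, practice, responsibilities, other_outcomes=None):
    """Combine evaluation components using the statutory splits summarized
    above (50% growth, or 30% growth + 20% other measurable outcomes)."""
    if other_outcomes is None:
        performance = 0.50 * growth                           # straight 50% growth option
    else:
        performance = 0.30 * growth + 0.20 * other_outcomes   # 30% + 20% option
    # Splitting the remaining 50% between instructional practice (35%) and
    # professional/job responsibilities (15%) is an assumption for illustration.
    return performance + 0.35 * practice + 0.15 * responsibilities

# Example with invented ratings on a 0-100 scale
print(composite_evaluation(growth=80, practice=85, responsibilities=95, other_outcomes=90))  # 86.0
```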
Challenges for Student Services Personnel Evaluations
FEAPs do not adequately reflect the job responsibilities & practices of student services personnel
Impact on student performance is indirect
Student services personnel typically assigned to multiple schools
Measuring student outcomes related to the job
What Informs Instructional Practice?
NASP (and other student services) Professional Practice Standards
Florida Educator Accomplished Practices
School Psychologist Competency Areas
Multi-Tiered System of Supports Delivery System in Florida
Domains of Practice that incorporate professional standards and Florida’s Multi-tiered System of Supports
Domains of Practice
Data-based Decision Making and Evaluation
Instruction/Intervention Planning & Design
Instruction/Intervention Delivery & Facilitation
Learning Environment
Professional Learning, Responsibility, & Ethics
Florida Educator Accomplished Practices (FEAPs)
Quality of Instruction
Instructional Design and Lesson Planning
The Learning Environment
Instructional Delivery and Facilitation
Assessment
Continuous Improvement, Responsibility and Ethics
Continuous Professional Improvement
Professional Responsibility and Ethical Conduct
School Psychologist Competencies: Florida Selected
Data-Based Decision-Making and Accountability
Knowledge of Curricula and Instruction
Knowledge of Evidence-Based Interventions
Consultation, Collaboration and Problem-Solving
Professional School Psychology and Ethical Decision-Making
MTSS: Common Language/Common Understanding
MTSS
A Multi-Tiered System of Supports (MTSS) is a term used to describe an evidence-based model of schooling that uses data-based problem-solving to integrate academic and behavioral instruction and intervention.
The integrated instruction and intervention is delivered to students in varying intensities (multiple tiers) based on student need.
“Need-driven” decision-making seeks to ensure that district resources reach the appropriate students (schools) at the appropriate levels to accelerate the performance of ALL students to achieve and/or exceed proficiency.
Why Organize an Evaluation System Around an MTSS Model?
Research supports that an integrated (academic/behavior/social-emotional) service delivery system has greater impact on student performance than separate systems
Services and personnel in schools already are organized by levels of intensity of service delivery
Tier 1: What everybody gets; typically general education teacher led
Tier 2: What “some” get; typically more intensive, smaller groups
Tier 3: What “few” get; typically most intensive, specialized
Why Organize an Evaluation System Around an MTSS Model? (cont.)
Existing and proposed statutes, regulations and practices support a multi-tiered system
IDEIA
NCLB
Learn Act
Achievement Through Prevention (PBIS) Act (SB 541)
Florida Educator Accomplished Practices (FEAPs) and School Psychologist Competencies
NASP Model
Evaluation systems require clear responsibility for levels of service delivery and the “stakeholders” who are one focus of the evaluation process
Why Organize an Evaluation System Around an MTSS Model? (cont.)
Instructional support staff of all types typically provide instruction/intervention at all levels (Tiers 1, 2 and 3) in a school and/or district
School-based research that identifies evidence-based practices is conducted at levels aligned with the tiers
School-wide (e.g., PBIS, crisis prevention)
Classroom level (e.g., group procedures, instructional strategies)
Group level (e.g., academic instruction, social skills training, group work)
Very small group/individual (e.g., therapeutic, intense psychological skills training, academic skills)
MTSS: Critical Elements
The Four Corners of the “Frame”
Parts of the “Frame”
• 3 Tiers of service delivery into which all academic and behavioral instruction/intervention “fit.”
– Content is not defined by the model
• A structured Problem-Solving Process used to develop, implement, and monitor instruction/interventions
Parts of the “Frame” (cont.)
Instruction/interventions are modified, intensified, and/or dropped based on student performance data
Instruction is integrated and systematically planned across the tiers
MTSS & the Problem-Solving Process
ACADEMIC and BEHAVIOR SYSTEMS
Tier 3: Intensive, Individualized Interventions & Supports. The most intense (increased time, narrowed focus, reduced group size) instruction and intervention, based upon individual student need, provided in addition to and aligned with Tier 1 & 2 academic and behavior instruction and supports.
Tier 2: Targeted, Supplemental Interventions & Supports. More targeted instruction/intervention and supplemental support, in addition to and aligned with the core academic and behavior curriculum.
Tier 1: Core, Universal Instruction & Supports. General academic and behavior instruction and support provided to all students in all settings.
Revised 12/7/09
Problem-Solving Process (cycle)
Identify the Goal: What do we want students to know and be able to do?
Problem Analysis: Why are they not doing it? Identify variables that contribute to the lack of desired outcomes.
Implement Plan: Implement as intended; progress monitor; modify as necessary.
Evaluate: Response to Intervention (RtI).
Steps in the Problem-Solving Process
1. Problem Identification
Identify replacement behavior
Data: current level of performance
Data: benchmark level(s)
Data: peer performance
Data: GAP analysis
2. Problem Analysis
Develop hypotheses (brainstorming)
Develop predictions/assessment
3. Intervention Development
Develop interventions in those areas for which data are available and hypotheses verified
Proximal/distal
Implementation support
4. Response to Intervention (RtI)
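The data pieces listed under Problem Identification lend themselves to a simple gap analysis. The sketch below is a minimal, hypothetical illustration of that step; the function name, the example benchmark values, and the "significant gap" cut point are assumptions, not part of the model.

```python
def gap_analysis(current_level, benchmark, peer_median):
    """Compare a student's current performance with the expected benchmark
    and with typical peer performance (Step 1: Problem Identification)."""
    benchmark_gap = benchmark / current_level   # e.g., 2.0 = performing at half the expected level
    peer_gap = peer_median / current_level
    return {
        "benchmark_gap": round(benchmark_gap, 2),
        "peer_gap": round(peer_gap, 2),
        "significant": benchmark_gap >= 2.0,    # assumed cut point for illustration
    }

# Example: 30 words correct per minute vs. a benchmark of 70 and a peer median of 65
print(gap_analysis(current_level=30, benchmark=70, peer_median=65))
# {'benchmark_gap': 2.33, 'peer_gap': 2.17, 'significant': True}
```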
Why MTSS? Why Now?
MTSS: Integrating Two Evidence-Based Models to Improve the Academic and Behavior Outcomes for ALL Students
• Challenging Times In Which to Educate America’s Children and Youth
– Performance Evaluations Tied to Student Growth
– Economic Crises resulting in reduction of resources
– Alternatives to Public K-12 Education
– AYP Projections and Expectations
– Recruitment and Retention of Qualified Professionals
– Common Language/Common Understanding with Educators, Parents and the Community
Strategies for Successfully Addressing these Challenges
Align and allocate effective resources with student needs: Return on Investment (ROI) Model
Anticipate the Future: prevention is cost-effective
Use of Highly Effective Practices: identify them, reward them
Efficient Delivery of those Practices
Data to Evidence Effectiveness of Practices
Strong Professional Development and Support to Sustain Effective Practices aligned with district priorities
Communicating Clearly and Frequently with Stakeholders
Use of professional personnel evaluation models that demonstrate impact of evidence-based practices aligned with district mission
Highly Effective Practices: Research
High quality academic instruction (e.g., content matched to student success level, frequent opportunity to respond, frequent feedback) by itself can reduce problem behavior (Filter & Horner, 2009; Preciado, Horner, Scott, & Baker, 2009; Sanford, 2006)
Implementation of school-wide positive behavior support leads to increased academic engaged time and enhanced academic outcomes (Algozzine & Algozzine, 2007; Horner et al., 2009; Lassen, Steele, & Sailor, 2006)
“Viewed as outcomes, achievement and behavior are related; viewed as causes of the other, achievement and behavior are unrelated.” (Algozzine et al., 2011)
Children who fall behind academically will be more likely to find academic work aversive and also find escape-maintained problem behaviors reinforcing (McIntosh, 2008; McIntosh, Sadler, & Brown, 2010)
School-wide Behavior & Reading Support
The integration/combination of the two:
Are critical for school success
Utilize the three-tiered prevention model
Incorporate a team approach at the school level, grade level, and individual level
Share the critical feature of data-based decision making
Produce larger gains in literacy skills than the reading-only model
(Stewart, Benner, Martella, & Marchand-Martella, 2007)
School Psychologists’ Role in a Multi-tiered Support System
Emerging Leadership Themes
Multi-tiered Systems of Support
Evidence-based practices
Implementation science
Rob Horner, Futures of School Psychology Conference 2012
Professional Development: Core Skill Areas for ALL Staff
• Data-Based Decision Making Process
• Coaching/Consultation
• Problem-Solving Process
• Collection, Management and Use of Integrated Data Systems
• Instruction/Intervention Development, Support and Evaluation
• Instruction/Intervention Fidelity
• Staff Training
• Effective Interpersonal Skills
Student Services Role in an MTSS System
Academic performance of students (an educator appraisal factor) is influenced significantly by social, emotional and behavior factors, the focus of the professional practices of student services personnel
Combining evidence-based instructional strategies with evidence-based strategies to enhance student engagement results in the most dramatic student gains (LESSON STUDY)
Enhancing student engagement (at all levels) is a primary role of student services personnel
Student Services Role in an MTSS System (cont.)
The continued viability and importance of student services personnel is influenced strongly by the impact of their practices on student performance, particularly academic performance
Services provided by student services personnel have a strong, evidence-based relationship with student academic performance
A blueprint for a clear, explicit relationship between the provision of evidence-based student services practices and positive student outcomes is critical in the context of school accountability
Student services personnel must PLAN in such a way as to demonstrate ACCOUNTABILITY and COMMUNICATE those outcomes.
Multi-tier System of Student Supports (MTSSS): Response to Instruction/Intervention (RtI)
An Overview of Data-based Problem-solving within a Multi-tier System of Student Supports in Florida’s Public Schools
Intensive, Individualized Supports
• Intensive interventions based on individual student needs
• Students receiving prolonged interventions at this level may be several grade levels behind or above the one in which they are enrolled
• Progress monitoring occurs most often to ensure maximum acceleration of student progress
• If more than approximately 5% of students are receiving support at this level, engage in Tier 1 and Tier 2 level, systemic problem-solving
Targeted, Supplemental Supports
• Interventions are based on data revealing that students need more than core, universal instruction
• Interventions and progress monitoring are targeted to specific skills to remediate or enrich, as appropriate
• Progress monitoring occurs more frequently than at the core, universal level to ensure that the intervention is working
• If more than approximately 15% of students are receiving support at this level, engage in Tier 1 level, systemic problem-solving
Core, Universal Supports
• Research-based, high-quality, general education instruction and support
• Screening and benchmark assessments for all students
• Assessments occur for all students
• Data collection continues to inform instruction
• If less than approximately 80% of students are successful given core, universal instruction, engage in Tier 1 level, systemic problem-solving (see the sketch below)
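The approximate 80/15/5 decision rules above can be expressed as a simple screening check. The sketch below is a hypothetical illustration only; the function name and data layout are assumptions, and the thresholds are the approximate values from the slide, not fixed statutory cut points.

```python
def tier_health_check(n_students, n_tier2, n_tier3, n_core_successful):
    """Flag when tier proportions suggest systemic (Tier 1/Tier 2)
    problem-solving, using the approximate thresholds listed above."""
    flags = []
    if n_core_successful / n_students < 0.80:
        flags.append("Tier 1: fewer than ~80% successful with core instruction")
    if n_tier2 / n_students > 0.15:
        flags.append("Tier 2: more than ~15% receiving supplemental supports")
    if n_tier3 / n_students > 0.05:
        flags.append("Tier 3: more than ~5% receiving intensive supports")
    return flags or ["Tier proportions within expected ranges"]

# Example: 500 students, 90 in Tier 2, 35 in Tier 3, 370 successful with core instruction
for flag in tier_health_check(500, 90, 35, 370):
    print(flag)
```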
Critical Role in Addressing Barriers to Learning
Engage in collaborative problem-solving at district, school, and individual levels.
Provide culturally competent services (academic, social-emotional, behavioral) to students, schools, and families within a multi-tier model of service delivery.
Develop and implement evidence-based interventions at each tier.
Conduct assessments that inform instruction (screening, progress monitoring, diagnostic).
Assess fidelity and effectiveness of instruction and intervention.
Critical Role in Addressing Barriers to Learning (cont.)
Assist in the design and use of data systems (data collection, display, and interpretation).
Provide leadership implementing policies and practices that result in effective and equitable outcomes.
Provide services and supports to reengage disconnected students.
Engage families.
Advocate for evidence-based and culturally competent practices.
Florida’s New Evaluation System
The Student Success Act
Purpose for Personnel Evaluations
As set forth in the Student Success Act and Race to the Top, teacher evaluations are:
Designed to support effective instruction and student learning growth.
Used when developing district and school level improvement plans.
Used to identify professional development.
Purpose for Personnel Evaluations (cont.)
Measure sound educational principles and research in effective practice in three major areas:
Performance of students
Instructional Practice (FEAPs)
Professional & job responsibilities
Evaluations must differentiate among 4 levels of performance:
Highly Effective
Effective
Needs Improvement or Developing (first 3 years)
Unsatisfactory
Major Components of the Evaluation System
Instructional Practice (50%): measured by the district’s instructional practice framework
Performance of Students (50%): measured by student learning growth
Instructional Practice
Section 1012.34, F.S., requires that instructional practice evaluate the following:
For classroom teachers, excluding substitutes: Florida Educator Accomplished Practices (FEAPs)
For instructional personnel who are not classroom teachers: FEAPs; may include specific job expectations related to student support
Instructional Framework goal: an expectation that all teachers can increase their expertise from year to year, which produces gains in student achievement from year to year with a powerful cumulative effect
Performance of Students
At least 50% of a performance evaluation must be based upon data and indicators of student learning growth assessed annually and measured by statewide assessments or, for subjects and grade levels not measured by statewide assessments, by district assessments as provided in s. 1008.22(8), F.S.
Section 1012.34(3)(a)1., Florida Statutes
Performance of Students (cont.)
The performance of students represents 50% of a teacher’s evaluation, with performance based on student learning growth
– Growth data for 3 years of students assigned to the teacher.
– If less than 3 years of data are available, the years for which data are available must be used, and the percentage of the evaluation based on growth may be reduced to not less than 40% (see the sketch below).
To meet the above requirement, the development of a fair and transparent measure of student growth is essential.
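The reduction rule above (the growth weight may drop below 50% but not below 40% when fewer than three years of data exist) can be read as a simple lookup. The sketch below is only an assumed illustration: the statute sets the 50% default and the 40% floor, while the intermediate value used here for two years of data is a placeholder, since districts decide the exact reduced percentage.

```python
def growth_weight(years_of_data):
    """Weight of the student learning growth component: 50% with three or
    more years of data; reduced, but never below 40%, with fewer years.
    The intermediate value for two years is a placeholder."""
    if years_of_data >= 3:
        return 0.50
    if years_of_data == 2:
        return 0.45  # placeholder between the 40% floor and the 50% default
    return 0.40      # statutory floor

print(growth_weight(1), growth_weight(3))  # 0.4 0.5
```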
Florida’s Value-Added Model
A value-added model measures the impact of a teacher on student learning by accounting for other factors that may impact the learning process.
These models do not:
Evaluate teachers based on a single year of student performance or proficiency (status model).
Evaluate teachers based on a simple comparison of growth from one year to the next (simple growth).
Advantages of Value-Added Models
Teachers teach classes of students who enter with different levels of proficiency and possibly different student characteristics.
Value-added models “level the playing field” by accounting for differences in the proficiency and characteristics of students assigned to teachers.
Value-added models are designed to mitigate the influence of differences among the entering classes so that schools and teachers do not have advantages or disadvantages simply as a result of the students who attend a school or are assigned to a class.
Value-Added Example
[Bar chart: for Teacher X and Student E, prior performance, current performance, and predicted performance plotted on a 0-500 score scale.]
The predicted performance represents the level of performance the student is expected to demonstrate after statistically accounting for factors through a value-added model.
The difference between the predicted performance and the actual performance represents the value added by the teacher’s instruction.
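Put plainly, the value-added estimate for a student is actual performance minus predicted performance. The sketch below illustrates only that subtraction; it is not Florida's covariate-adjusted model. The stand-in prediction (an average of up to two prior scores plus an adjustment term) and all numbers are invented for the example.

```python
def predicted_score(prior_scores, adjustment=0.0):
    """Stand-in prediction: the mean of up to two prior-year scores plus an
    adjustment for student/classroom/school covariates. Florida's actual
    model estimates this statistically rather than by simple averaging."""
    return sum(prior_scores) / len(prior_scores) + adjustment

def value_added(actual, prior_scores, adjustment=0.0):
    """Value added = actual performance minus predicted performance."""
    return actual - predicted_score(prior_scores, adjustment)

# Example with invented scores: prior years 290 and 310, covariate adjustment +15,
# actual current-year score 335 -> value added of 20 points
print(value_added(335, [290, 310], adjustment=15))  # 20.0
```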
Florida’s Value-Added Model (cont.)
Begins by establishing expected growth for each student based on historical data each year.
Represents the typical growth seen among students who have earned similar test scores the past two years and share the other characteristics identified by the committee.
Accounts for student, classroom, and school characteristics (factors outside the control of the teacher).
Factors Identified to “Level the Playing Field”
Student Characteristics
Up to two prior years of achievement scores (the strongest predictor of student growth)
The number of subject-relevant courses in which the student is enrolled
Students with Disabilities (SWD) status
English Language Learner (ELL) status
Gifted status
Attendance
Mobility (number of transitions)
Difference from modal age in grade (as an indicator of retention)
Classroom Characteristics
Class size
Homogeneity of students’ entering test scores in the class
Factors Identified by the SGIC to “Level the Playing Field”
The model recognizes that there is an independent factor related to the school that impacts student learning: a school component.
Statistically, it is simply the factors already controlled for in the model, measured at the school level by grade and subject.
May represent the impact of the school’s leadership, the culture of the school, or the environment of the school on student learning.
Acts as another covariate, just like all other factors.
Overview of SSPEM
Developing a State Model for Student Support Services Personnel Evaluations
Student Support Services
Fundamental Principles
Fundamental purpose: improve academic & behavioral outcomes for students
Reflect a Multi-tiered System of Supports (MTSS) framework.
Align with evidence- and research-based practices and professional standards linked to positive student outcomes.
Integrate common practice standards across student services professions.
Support professional growth and continuous improvement.
Fundamental Principles (cont.)
Offer a state-approved evaluation framework that is dynamic (flexible & fluid) and complies with the Student Success Act, for districts to adopt, adapt, or use as a guide.
Developing the Model
Focus on the “practices” component
Crosswalk Professional Practice Standards with FEAPs, Professional Competencies, & Teacher/Principal models
Identify:
Domains of practice; practices; indicators for each practice (levels of performance/proficiency)
Research/evidence supporting practice
Develop an evaluation rubric
Vet model rubric with key stakeholders
Relevance of Professional Standards
Purpose
Establishes foundation
Authenticates
Lends credibility and agreement
Integrates
Research linked to positive student outcomes
Evidence-based strategies
Best practices
Contributors/Partners
Bureau of Educator Recruitment & Student Support Services (BEESS)
Task: Develop Process and Model
Core Development Workgroup (District Coordinators from each of the Student Service disciplines)
Task: Develop Draft Model
Focus Group (Student Services Directors; Coordinators from Student Service disciplines; Administrators; Other stakeholders)
Task: Feedback on Draft Model
Conceptual Model
Domains (5 Domains) – broad categories used to organize professional practices and help structure the evaluation.
Practices (25 Practices) – standards of practice within a domain related to a specific area of professional skill.
Indicators – continuum of descriptive statements that assist in differentiating levels of performance for each practice (Highly Effective, Effective, Emerging, Ineffective).
Domains of Practice
Data-based Decision Making and Evaluation
Instruction/Intervention Planning & Design
Instruction/Intervention Delivery & Facilitation
Learning Environment
Professional Learning, Responsibility, & Ethics
Scoring Rubric for Indicators
Highly Effective – practice has broader, systemic impact (school-wide/district-wide) OR facilitates effective practice of others through mentoring and/or training
Effective – demonstrates the essential elements of the practice competently and independently with individual students and groups
Emerging – developing practice competence but requires additional supervision, support, or training
Ineffective – does not demonstrate the practice or demonstrates the practice poorly
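One way to see how indicator-level ratings could roll up into a practice or domain score is sketched below. This is purely illustrative: the SSPEM scoring protocol (Form 1) defines the actual procedure, and the 0-3 point values and simple averaging here are assumptions.

```python
# Hypothetical point values for the four rating levels (assumption; Form 1 in
# the SSPEM guide defines the actual scoring protocol).
LEVEL_POINTS = {"Highly Effective": 3, "Effective": 2, "Emerging": 1, "Ineffective": 0}

def domain_score(indicator_ratings):
    """Average the numeric values of the indicator ratings within one domain."""
    points = [LEVEL_POINTS[rating] for rating in indicator_ratings]
    return sum(points) / len(points)

# Example: four rated practices in a single domain
print(domain_score(["Effective", "Highly Effective", "Effective", "Emerging"]))  # 2.0
```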
Evaluation Rubric (sample rubric pages)
Resources/Tools in Guide
Research Support for Model (Appendix B)
Crosswalks
Professional Practice Standards (Table 2)
FEAPs, Marzano, & Danielson (Table 3)
Methods and Sources of Evidence for Evaluating Professional Practice (Table 1)
Scoring Protocols
Evaluation Rubric Scoring Protocol (Form 1)
Student Growth Protocol
Summative Evaluation (Form 3)
Recommendations for District Use
The Evaluation Cycle Process
SSPEM and the District Framework
Student Growth Component
SSPEM is designed to:
Establish practices and expectations that are linked to student outcomes (academic & behavioral) and based on research.
Develop evaluation procedures that align with professional standards and accomplished educator practices (FEAPs).
Provide feedback to the professional that recognizes effective performance, identifies areas for improvement, and directs professional growth activities.
Provide support to supervisees/practitioners not meeting performance expectations.
Evaluation Cycle Process
Student Growth Component
The student learning growth component must account for 50% of the evaluation (modified if less than three years of data).
Must be based on students assigned to the professional.
Measurable student outcomes – up to 20% of the student learning growth component may be based on measurable student outcomes specific to the position.
Problem-Solving Process: Step 1
Desired Outcome: What Do We Want the Student(s) to Know and Be Able to Do? (Tier 1 Goal: impact on academic growth?)
Current Level of Performance (Prior/History) – Metrics
Desired Level of Performance (Expected) – Metrics
Peer Comparisons, Other Data
Rate of Growth (see the sketch below)
Actual (Value Added?)
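A rate-of-growth comparison like the one implied by Step 1 can be written out as simple arithmetic. The sketch below is a hypothetical illustration; the function name, the weekly units, and the example numbers are assumptions, not part of the SSPEM protocol.

```python
def growth_rates(current, goal, weeks_to_goal, baseline, weeks_elapsed):
    """Compare the growth rate needed to reach the goal with the rate
    actually observed so far (units per week)."""
    needed = (goal - current) / weeks_to_goal
    actual = (current - baseline) / weeks_elapsed
    return {"needed_per_week": round(needed, 2),
            "actual_per_week": round(actual, 2),
            "on_track": actual >= needed}

# Example: 42 words correct per minute now, goal of 70 in 14 weeks,
# baseline of 30 measured 8 weeks ago
print(growth_rates(current=42, goal=70, weeks_to_goal=14, baseline=30, weeks_elapsed=8))
# {'needed_per_week': 2.0, 'actual_per_week': 1.5, 'on_track': False}
```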
Table Top Activity
For each combination of domain (Behavior, Academic) and tier (Tier 1, Tier 2, Tier 3), complete the following columns:
Setting
Service
Expected Outcome
Personnel Involved (Multiple?)
Data Collection