The New York State School Improvement Grant Initiative

Scientific and Evidence-Based Evaluation of SIG/SPDG Initiatives: One State’s Response
Office of Professional Research & Development, Syracuse University, NY
REVISITING NY SIG AS DESIGNED: PROGRAM & EVALUATION
The NYS SIG Initiative is designed to:
• Reduce the achievement gap between special and general education students in high- and low-need schools.
• Reduce or eliminate the disproportionality of language and ethnic minority students in classification and placement practices.

The Evaluation Logic Model

[Diagram: NYSED (VESID) provides resources, TA, and oversight through the State Improvement Grant, linking LEAs, Regional School Support Centers (RSSCs), Special Education Quality Assurance (SEQA), Special Education Resource and Training Centers (SETRCs), Institutions of Higher Education (IHEs), and the Higher Education Support Center. The model flows from root cause analysis, strategic planning & goal setting, through implementation of activities, to data collection on student outcomes & performance indicators.]
Two ‘Strands’
SIG resources and partnerships applied to two strands:
• In-service teachers
• Pre-service teachers
Evaluation Goals
• Tracking implementation
• Identifying areas/strategies for program improvement
• Capturing outcomes (according to stage of development of the program)
Evaluation Goals
• Implementation to date
• Degree of match between overarching goals of SIG and program activities
• Degree of match between district stated goals and activities
• Strengths of the SIG
• Challenges to the program and strategies to meet these challenges
• Lessons learned and emerging themes
• Intermediate outcomes & performance measure attainment
Targets for Evaluation
• District readiness
• Partnership development with IHEs
• Utility/effectiveness of technical assistance entities (VESID funded, HESC in particular)
• Outcomes (particularly student improvement)
Multi-Method Evaluation Approach
• Formative assistance
• Surveys (IHE faculty, recent teacher preparation graduates, school administrators, training participants)
• Interviews and focus groups
• Document analysis
• Site studies (including interviews with SIG District grant recipients: administrators, staff, TA providers, and parents)
• Quantitative analysis of student performance data
Evolution of a Systems Change
Stages, from earliest to most mature:
• Awareness
• Relationship building
• Desire for change and setting of goals (buy-in)
• Implementation of planned activities to address goals
• Initial changes (perceptions/data)
• Sustained change
Reporting: Using the Logic Model Framework
• Faithfulness of implementation
• Effectiveness of activities
• Lessons learned/ingredients for success
• Challenges noted through qualitative data collection
• Outcome data (according to developmental stage of the Initiative)
• Emerging policy issues
• Recommendations
Participating IHEs, with assistance from the HESC, are expected to:
• Develop, implement, and/or sustain inclusive teacher preparation programs
• Link with identified districts in their area to provide professional development and research assistance
• Provide for professional development among other teacher educators at their university or college
Document Analysis
• Grant applications
• IHE agreements
• Initiative and School Grant reports
• Meeting minutes:
– Management meetings
– Statewide Task Force IHE meetings
– Regional Task Force meetings
Surveys, Focus Groups, and Interviews

WITH THE IHEs
• Perceptions of the status of partnerships, their role, and change in districts
• Changes in their teacher preparation programs
• Usefulness of assistance from the HESC
• Promising practices related to partnerships and teacher preparation programs

WITH RECENT GRADUATES
• Perceptions of preparedness
• Identification of high-impact activities and strategies in teacher preparation programs
Participating School Districts, with assistance from the SIG Teams, are expected to:
• Undertake a root cause analysis process supported by SETRC Professional Development Specialists/RSSC Special Education Specialists.
• Develop a plan for professional development and submit a SIG District grant application.
• Work with RSSC, SETRC, & SIG Teams to determine the most effective use of SIG Team resources.
• SIG Teams then provide job-embedded (sequential, ongoing, and specifically job-relevant) professional development and tracking in the field.
Site Visit Component
• Annual selection of SIG Schools for site visits
• Based on a variety of variables:
– Geographical location
– Need
– Urban, suburban, rural
– Promising practices
Logic Model for Instrument Development and Data Collection

Interview Protocol for Site Visits
• Developed using the Logic Model
• Items tapping into documenting:
– Activities
– Outputs of activities
– Intermediate outcomes
– End outcomes
Outputs of Activities
Changes in teacher, parent, and administrator perceptions or beliefs:
• Recognizing a need to be addressed
• Recognizing a new method to address the need
• Recognizing the need for new instructional practices
Intermediate Outcomes
• Changes in actual practices
– e.g., instructional practices across buildings and at the individual classroom level
• Changes in stakeholders working together
– e.g., increased parent involvement
• Student classroom outcomes
– e.g., daily behavior, homework, attitude
End Outcomes
• Changes in classification, placement, and declassification practices
– Compared with state averages
– Disproportionality (a metric sketch follows this list)
• Changes in performance on State ELA and Math assessments
– General education students
– Students with special needs
– Minority students
– ESL students
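
Disproportionality in classification is often summarized with a risk ratio: the classification rate for a given group divided by the rate for all other students. The slides do not specify the exact metric NY used, so the sketch below is only a hypothetical illustration, with invented counts:

```python
# Hypothetical sketch of a common disproportionality metric (risk ratio).
# The slides do not state NY's exact calculation; all counts are invented.

def risk_ratio(group_classified: int, group_total: int,
               others_classified: int, others_total: int) -> float:
    """Classification risk for one group relative to all other students.
    Values well above 1.0 suggest over-representation; well below 1.0,
    under-representation."""
    group_risk = group_classified / group_total
    other_risk = others_classified / others_total
    return group_risk / other_risk

# Example with made-up district counts:
rr = risk_ratio(group_classified=120, group_total=800,
                others_classified=300, others_total=4200)
print(f"Risk ratio: {rr:.2f}")  # 0.150 / 0.071 ≈ 2.10
```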
Benefits
• Able to document patterns between district and partners
• Able to clearly delineate improvement by professional development method
– Focus efforts on one building or across the entire district
• Able to “tell the story” over a period of time and see changes
Just the numbers…
Scope of the effort:
• Total SIG school districts, Years One–Five: 51 (not counting the Big 5)
• Total schools in the ‘Big 5’: 19
• Total IHEs partnering with schools: 44
• Total IHEs participating in SIG: 65
RESPONSIVE EVALUATION DESIGN

Other Evaluation Methods
• Regional Think Tanks
• SIG Team Interviews
• School Outcome Data Tracking
• Participant Training Survey
• SIG Team Document Review
• Observation & Participation
Review of Methods
Revisiting the Original SIG Evaluation Design: Review of Methods, Years One–Five
Responsive Evaluation Model
During the five years of the evaluation, the design and methodologies needed to respond to:
• External shifts/expectations/needs
• Concurrent internal programmatic changes/shifts
Some of these shifts were anticipated and worked into the original design; some were not…
The NY SIG Responsive Evaluation Model
Internal Shifts: Program Changes & Evaluation Responses
• Partnerships: Challenges engaging parents → Regional Think Tanks.
• Roles: Changes to the way NY thought about roles and responsibilities of SIG Teams and RSSC/SETRC partners → SIG Team Interviews, Training Surveys.
• Practice: Challenges connecting with schools across vast geographical areas → SIG Team Interviews.
• Programming: Introduction of new program components → Observation.
More Internal Shifts & Responses…
• Start-up: Time needed for start-up led to school grants for 3+ years → Document Review.
• Roll-out: Changes to NYCBOE structure and school functioning → Document Review & Observation.
• Reporting: Institution of school reporting mechanisms → Document Review.
A special note…
Reality check: many of the internal shifts noted were linked to external shifts and the needs of ‘external’ stakeholders…
Stakeholder Needs: Understanding Impact & Outcomes
• Stakeholders: NYSED, USDOE, WESTAT
• Challenges:
– Identifying what the treatment is.
– Identifying what people are doing and why.
– Identifying who is receiving services, who isn’t, and how much.
– Identifying the evidence base of said activities.
– Identifying impact on schools and students without use of an experimental design.
– Identifying changes in practice tied to SIG initiatives.
– Availability (or lack) of data.
– Identifying responsiveness and alignment to State needs.
Responses of Stakeholders to Challenges: External Shifts
• National Evaluation.
• Encouragement of alignment with the State Performance Plan.
• Development of Federal Performance Measures.
• ‘Collective call’ to utilize scientific and evidence-based practice.
National Evaluation
• Requirement: Understand outcomes/change in: child performance, teacher or administrator behavior, systems functioning, and scaling up of successful practices.
• Evaluation Responses:
– Provision of summary and full reports.
– Phone conference interviews.
– Attendance at a 3-day multi-state workshop.
– Responding to written information & clarification requests.
– Move from sample data analysis to cohort data analysis.
State Performance Plan
• Requirement: Respond to indicators including: graduation, drop-out, assessment participation, suspension/expulsion, LRE, disproportionality, parent involvement, etc.
• Evaluation Responses:
– ‘Retrofit’ of the evaluation plan to align with new and developing program components.
– Regional Think Tanks on: parent involvement, disproportionality, and culturally responsive practice.
– Participation in the Disproportionality Learning Community.
– Tracking new initiatives in response to the SPP (i.e., Learning Communities) → Observation & Participation, Document Review.
– Informed student and school outcome data points.
– Move from sample data analysis to cohort data analysis (sketched after this list).
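
The shift from sample to cohort analysis means following every student in a defined group over time rather than analyzing a drawn sample. Below is a minimal sketch of what cohort-level aggregation might look like, assuming a pandas workflow; the column names and scores are invented for illustration:

```python
# Hypothetical sketch of cohort-level outcome tracking.
# Cohort analysis follows the full defined group of students across years
# rather than analyzing a drawn sample. Columns and values are invented.
import pandas as pd

records = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3],
    "cohort":     ["2004-SIG"] * 6,   # all students entering SIG schools in 2004
    "year":       [2004, 2005, 2004, 2005, 2004, 2005],
    "ela_score":  [640, 655, 610, 630, 700, 705],
})

# Mean ELA score per cohort per year: every cohort member counts, no sampling.
trend = records.groupby(["cohort", "year"])["ela_score"].mean()
print(trend)
```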
Performance Measures
• Requirement: Collect evidence in the areas of: scientific and evidence-based opportunities offered and personnel trained, sustainability of efforts, teacher retention, and alignment with the State Performance Plan.
• Evaluation Responses:
– SIG Team Interviews to further understand implementation and outcomes.
– Development of the SIG Team Effort To Date Rubric.
– Regional Think Tanks on sustainability.
– Strengthening of IHE and School Survey instruments in the areas of documenting change in pre-service and in-service teacher practice & retention.
Scientific & Evidence-Based Practice
• Requirement: Personnel trained under programs supported by SPDG will have the knowledge and skills to deliver scientifically or evidence-based practices to children with disabilities.
• Evaluation Responses:
– Development of the SIG Team Effort To Date Rubric, which tracks session topics, evidence base of content, # trainings, and # personnel (a data sketch follows this list).
– Documentation of practice through SIG Coordinator and SIG Team interviews.
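
A minimal sketch of how the rubric's four tracked fields might be captured for tallying; the field names and example records are assumptions for illustration, not the actual NY instrument:

```python
# Hypothetical sketch of the fields the Effort To Date Rubric tracks,
# per the slide: session topics, evidence base of content, # trainings,
# and # personnel. Field names and example records are invented.
from dataclasses import dataclass

@dataclass
class RubricEntry:
    session_topic: str      # e.g., "Differentiated instruction"
    evidence_base: str      # citation or rating of the content's evidence base
    num_trainings: int      # number of training sessions delivered
    num_personnel: int      # number of personnel trained

entries = [
    RubricEntry("Differentiated instruction", "peer-reviewed", 4, 35),
    RubricEntry("Positive behavior supports", "peer-reviewed", 2, 18),
]

# Simple roll-up for reporting: total trainings delivered and personnel reached.
print(sum(e.num_trainings for e in entries),   # 6
      sum(e.num_personnel for e in entries))   # 53
```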
Where did we meet stakeholder needs?
– Identifying what the treatment is.
– Identifying what people are doing and why.
– Identifying who is receiving services, who isn’t, and how much.
– Identifying changes in practice tied to SIG initiatives.
– Identifying responsiveness and alignment to State needs.
Where do we still struggle?
– Identifying the evidence base of said activities.
– Identifying impact on schools and students without use of an experimental design.
– Availability (or lack) of data, i.e., ‘mining’ data down to the individual level to respond to SPP indicators involving IEPs, student outcomes, transition, etc.
Beyond SIG
– Identifying the evidence base of said activities → incorporation of the requirement into grant applications and site selection processes.
– Identifying impact on schools and students without use of an experimental design → strengthening district reporting requirements.
– Availability (or lack) of data → changes to cohort size, data collection directly from schools and/or regions.