Assessment Tools

Applying the Research to Maximize Efficiency and to Best Meet Your School and District Needs

Kim Gulbrandson, Ph.D.

Wisconsin RtI Center

Objectives

• To provide a general overview of the research behind the tools
• To share strengths and weaknesses of the current assessment tools
• To provide resources to support schools/districts in using these tools in a coordinated way

BoQ

• Sound development process (multiple stages)
• Sound psychometrics
  • Good test-retest reliability (.94)
  • High inter-rater reliability (above 90%)
  • Good internal consistency reliability (.70 or above)
  • PBS Team: the only scale with low reliability
• CFA and EFA
  • Items with low factor loadings eliminated
  • New Classroom Critical scale added
  • Current 10-factor structure is solid

BoQ

• Best tool for distinguishing among schools implementing with fidelity
• Detailed scoring criteria (rubric)
• Found to be a valid instrument even when administered using diverse methods
  • When administration varied from the validated method, scores did not change significantly (if the Scoring Guide was used)

BoQ

• Schools with higher BoQ scores tend to show greater decreases in ODRs (office discipline referrals) than schools with lower BoQ scores
• No district support, CR (culturally responsive), or coaching items
• Family engagement items
• Highly correlated with the TIC and SET

BoQ and SET

• Offer good cross-comparisons (several subscales represent similar elements)
• BoQ and SET scores are significantly correlated with one another
• BoQ measures PBIS areas with more specificity than the SET
• BoQ measures critical features of implementation not covered by the SET:
  • Faculty buy-in
  • Lesson plans
  • Crisis plans
  • Evaluation

BoQ and SET

• BoQ is better able than the SET to distinguish among schools that are implementing with fidelity
• SET can be used to validate BoQ reporting
• BoQ can be used to identify additional areas in need of improvement that may not have been identified on the SET
  • If done within the same time frame

SET

• Considered more sensitive for initial implementation than for sustained implementation
• Fairly strong psychometrics
• Drawback: a school can score 80% on the SET without having some of the critical features of PBIS in place
• Limited feedback on the implementation process
• Items most appropriate for elementary schools (less interpretable for middle schools)

SET

• Use caution with the Expectations Taught and Management subscales
• Time intensive
• Less interpretable and reliable for large schools
• Includes a district support component
  • Yields high scores
  • Only 2 items
• No family engagement, CR, or coaching items

TIC

• Primarily looks at startup activities (only 6 questions track ongoing development)
• Less useful for fully implementing schools or for examining sustainability
• Limited empirical research examining its reliability and validity
  • One study examined internal consistency reliability
• Mixed criticisms about being too lenient
• 3 family engagement items
• No district-level, coaching, or CR items

SAS

• The only tool that clearly breaks implementation down into 4 different systems
• Limited reliability and validity data
• Higher reliability for Improvement Priority than for Current Status
• Nonclassroom Settings and Individual Student subscales had the lowest reliability and greatest variability across staff
• Suggestion: look at individual items

SAS

• Item 8: interpret with caution
• Has been used to identify specific strategies associated with reductions in racially disproportionate suspensions
• 3 family engagement items
• No district-level, CR, or coaching components

BAT

• Limited reliability (low test-retest reliability for subscales)
• Not yet validated (Tier 3 is most problematic)
• Tier 3 FBA/BIP scores are consistently high/overinflated
• Suggestion: have people with specific knowledge of FBAs/BIPs complete the BAT
• 6 family engagement items
• No coaching, CR, or district items

MATT

• No formal reliability and validity work has been done
• 3 family engagement items
• Scoring concerns (inflated implementation scores)
• Suggestion: look at Tier 2 and Tier 3 organization and critical elements subscale scores separately, or at individual items

RtI All Staff Survey

• 5 family engagement items
• 5 CR items
• Aligns with the SIR (29 questions)
• Aligns with the state graphic/model
• Multiple levels

RtI All Staff Survey

• Has reliability and validity information, but less than the SIR
• No coaching items
• Few leadership items

SIR

• Aligns with the RtI All Staff Survey
• 5 family engagement items
• Includes leadership items
• Includes CR items
• Multiple levels

SIR

• Reliable and valid
• Modified CR items have not been re-tested; be careful comparing across years
• Missing district-focused items

Considerations

• Which is most important for you to measure?
  • Initial implementation
  • Sustainability
  • District- and/or school-level factors
  • Different settings
  • All-staff or team perceptions
  • Family engagement
  • Culturally responsive practices
  • Leadership

Assessment Tool Review

• See handout

Using Assessments to Action Plan