I-RtI Network
Tier 1 Assessment/Data Systems
Tier 1 Grade Level Data Teams
January 2013
Facilitated/Presented by: Insert name(s) here

The Illinois RtI Network is a State Personnel Development Grant (SPDG) project of the Illinois State Board of Education. All funding (100%) is from federal sources. The contents of this presentation were developed under a grant from the U.S. Department of Education, #H325A100005-12. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. (OSEP Project Officer: Grace Zamora Durán)

Check-In
[Graphic: Making connections • Applying • Review]
One of the best ways to remember something is to test yourself.

Outcomes
Review Pre-Meeting Survey Results

Assessment
TIER 1 INTEGRATED DATA SYSTEMS

Where Are We?
1. Understand what kinds of data are used in Tier 1 RtI, why, & how frequently, and use each effectively & accurately.
2. Understand the limitations of tests & the importance of reliability, validity, fairness, & multiple measures.
3. Understand basic data-analysis & statistics concepts & use them to accurately observe & interpret data.
4. Make effective use of graphs & charts to display data.
5. Understand what different levels of student learning data (aggregated & disaggregated, for today) are used to drill down & how to use each effectively & accurately.
Survey adapted from Love, et al. (2008)

Integrated Data Systems
INTRODUCTION

BIG IDEA: Integrated Assessment Systems
This is what we've had. This is what we want.
[Diagram: Aligning Assessment and Instruction (Assessment and Instruction as interlocking pieces)]

Key Features of Data Systems
• Efficient system for data collection and entry
• Adequate training for use of the system
• Data are accessible when needed
• Data are accurate
• Data are easy to collect
• Data are used for decision making

Schools Use Specific Tools for Specific Assessment Purposes
Type: Screening
– Features: Reliable, valid, low cost, accurate, production-type responses, sensitive to between-persons differences
– Examples: CBM family members; common assessments; ODRs
Type: Diagnostic
– Features: Lots of items, production-type responses
– Examples: Placement tests; curriculum assessments; Can't Do/Won't Do
Type: Progress Monitoring
– Features: Reliable, valid, low cost, accurate, production-type responses, REPEATABLE, sensitive to within-persons differences
– Examples: CBM family members; common formative assessments
Type: Outcome
– Features: Used for program evaluation
– Examples: ISAT; CBM; common assessments; GPA; ODRs

Data Sources
TYPES OF TIER 1 DATA

Types of Tier 1 Data Sources
• State & district annual assessment data (summative): aggregated, disaggregated, strand, item-level, & student work
• Data about people, practices, perceptions: demographics & enrollment data, walk-through data, teacher evaluations, staff/process self-assessments, surveys, interviews, observations
• Benchmark common assessments (summative & formative): CBM, MAP, end-of-unit tests, common grade-level tests
• Formative common assessments: math problem-of-the-week, writing samples, science journals, other student work
• Formative classroom assessments: student self-assessments, descriptive feedback, selected response, written response, personal communications, observations of performance
Adapted from Love, et al. (2008)
Data System: Diagnostic Assessment at District/Building Level

Instruction
• Walk-through data focused on differentiation
• Walk-through data on identified instructional practices
• Summary review of teacher evaluation data
• Review of lesson plans (differentiation, pacing)
• Evidence of use of data to inform classroom instruction

Curriculum
• Review of minutes of instruction by grade level and content
• Review of instructional materials utilized by grade level and content
• Review of curriculum maps (pacing, outcomes)
• Review of alignment of curriculum with CCSS
• Evidence of use of data to inform classroom instruction

Data System: Diagnostic Assessment at District/Building Level

Environment
• Office discipline referrals
• Attendance, tardies
• Truancy data
• Walk-through classroom management data
• Data on active instructional engagement (walk-through, observation)
• Data on positive vs. negative classroom feedback

Learner
• Subgroup performance
• Grade-level performance
• Grade-level trends and patterns

Data Tips
FIDELITY OF ADMINISTRATION, SCORING, & REPORTING
• AIMSweb Accuracy of Implementation (AIRS)
• Parent Involvement in Assessment

Data Tips
EFFICIENT ADMINISTRATION, SCORING, & REPORTING

Data Tips
REFLECT ON YOUR CURRENT TIER 1 INTEGRATED DATA SYSTEM

Activity: Assessment Inventory
• Guiding questions
– What assessment tools do you have?
– Are you currently using them for the right purposes?
– What assessment tools do you need?
– Are there any you are currently using that you could get rid of?
– Are you triangulating data?

Assessment Inventory (template)
Columns: Subject or Data Type | Screening | Diagnostic | Progress Monitoring | Outcome

Example Assessment Inventory

Reading
• Screening: R-CBM, DIBELS, Maze, Vocabulary Matching
• Diagnostic: ISEL, MAP, walk-throughs, fidelity self-checks
• Progress Monitoring: R-CBM, DIBELS, Maze, Vocabulary Matching
• Outcome/Accountability: R-CBM, ISAT, PSAE, EXPLORE/PLAN

Math
• Screening: Early Numeracy, M-CBM, Yearly Progress Pro
• Diagnostic: MAP, Yearly Progress Pro
• Progress Monitoring: Early Numeracy, M-CBM, Yearly Progress Pro
• Outcome/Accountability: MAP, M-CBM, ISAT

Writing
• Screening: W-CBM, Spelling CBM
• Diagnostic: 6 Trait Writing Rubric
• Progress Monitoring: W-CBM, Spelling CBM
• Outcome/Accountability: W-CBM, Spelling CBM

Behavior
• Screening: Office Discipline Referrals (ODR), SSBD, Homework Completion %
• Diagnostic: ODR, FBA
• Progress Monitoring: ODR, Homework Completion %
• Outcome/Accountability: ODR

Assessment Audit (template)
Data Type: INSERT AREA YOU WOULD LIKE TO AUDIT (Reading, Writing, Math, Behavior, Other systems)
Purpose of Assessment: Screening | Diagnostic | Progress Monitoring | Outcomes
For each purpose, record: What we have, by system | Redundancies? | Gaps? | Fully utilized for decision making? | Communication to stakeholders

Assessment Audit - Example

Data Type: Engagement
• Screening (what we have, by system): walk-throughs; attendance; tardies; +/- student feedback
• Diagnostic: specific walk-throughs/observations; teacher evaluations
• Progress Monitoring: same as above
• Outcomes: (blank)
• Redundancies: putting tardies into 2 different systems
• Fully utilized for decision making: not being utilized; create an action plan item for using walk-through data for Tier 1 problem analysis
• Communication to stakeholders: create more time-sensitive communication to staff after engagement walk-throughs

Data Type: Reading
• Screening: CBM; MAP; Fountas & Pinnell
• Diagnostic: IRIs; MAP; ISEL; etc.
• Redundancies: Are CBM & F&P measuring the same thing as we're utilizing them?
• Gaps: need more vocabulary & comprehension diagnostics; not using multiple sources
• Communication to stakeholders: sharing screening data at P/T conferences
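To make the audit questions concrete, a data team could encode its inventory and scan it mechanically. The following is a minimal sketch, assuming Python; the dictionary layout, the audit function, and the "more than two tools" redundancy threshold are illustrative assumptions, with tool names borrowed from the example inventory above.

```python
# Hypothetical sketch: encode an assessment inventory, then flag possible
# redundancies (many tools serving one purpose) and gaps (a purpose with
# no tool). Layout and threshold are illustrative, not a prescribed format.

INVENTORY = {
    "Reading": {
        "Screening": ["R-CBM", "DIBELS", "Maze", "Vocabulary Matching"],
        "Diagnostic": ["ISEL", "MAP"],
        "Progress Monitoring": ["R-CBM", "DIBELS", "Maze"],
        "Outcome": ["ISAT", "PSAE"],
    },
    "Behavior": {
        "Screening": ["ODR", "SSBD", "Homework Completion %"],
        "Diagnostic": ["ODR", "FBA"],
        "Progress Monitoring": ["ODR", "Homework Completion %"],
        "Outcome": [],  # an empty cell surfaces as a gap
    },
}

def audit(inventory):
    """Scan each subject/purpose cell for gaps and possible redundancies."""
    for subject, purposes in inventory.items():
        for purpose, tools in purposes.items():
            if not tools:
                print(f"GAP: {subject}/{purpose} has no assessment tool")
            elif len(tools) > 2:
                print(f"REDUNDANCY? {subject}/{purpose} uses "
                      f"{len(tools)} tools: {', '.join(tools)}")

audit(INVENTORY)
```

The point is the habit the code encodes, not the code itself: every purpose should have at least one tool, and several tools for one purpose should prompt the question the example audit asks about CBM and F&P: are they measuring the same thing as we're utilizing them?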
RIOT/ICEL Inventory
Grade/Department: _________ Subject/Course: __________ Year: _____
Matrix rows: Instruction, Curriculum, Environment, Learner
Matrix columns: Review, Interview, Observe, Test

ANALYZING DATA
Exploring IIRC

Reading
• Background information handout

IIRC Exploration
• http://iirc.niu.edu/
• Activity: identify at least one strength and one weakness (aggregate and disaggregate)
– for your district
– for your school

1. Go to Your District or School

Pop Quiz
[Questions refer to an IIRC screenshot]
• What does the color coding represent?
• What does this line represent?
• What do these numbers mean?

Trends by Cohort
Trends by Subject
Trends by District, School
Disaggregate Data

Decision Rules
EARLY WARNING/CONVERGENT DATA SYSTEM

Kennelly, L., & Monrad, M. (2007, October). Approaches to dropout prevention: Heeding early warning signs with appropriate interventions. Washington, DC: National High School Center at the American Institutes for Research. www.betterhighschools.org

What are Early Warning Systems?
Systems which:
• Utilize routinely available data housed at the school
• Help identify students at risk for dropping out, utilizing highly predictive data
• Allow districts and schools to target interventions that support off-track or at-risk students while they are still in school
• Allow districts and schools to uncover patterns and root causes that contribute to disproportionate drop-out rates at a particular school or within a particular group of students

Risk Factors (Hendry County Schools):
1. Disengagement (20% absenteeism)
2. Behind in credits (particularly core course failures)
3. GPA less than 2.0
4. Failed FCAT

Risk levels [pyramid graphic]:
• Extreme Off Track: 2-3 years behind; no chance for graduation in a traditional school setting
• High Off Track: 3 or more risk factors indicated
• Off Track: 2 of 4 risk factors indicated; includes students entering with 20% absenteeism and/or 2 or more F's in 8th grade
• At Risk for Off Track: 1 of 4 risk factors indicated
• On Track: no risk factors indicated

Early Warning Systems Data [bar chart of on-track vs. off-track students by grade; counts below]:
• Grade 9: On Track 348; At Risk 39; Off Track 53; Dropout 0%
• Grade 10: On Track 147; At Risk 53; Off Track 157; Dropout 1%
• Grade 11: On Track 150; At Risk 27; Off Track 95; Dropout 8%
• Grade 12: On Track 200; At Risk 26; Off Track 49; Dropout 6%

Elementary - Convergent Data Decision Rules
• How can we use our multiple data sources to create decision rules at the elementary level?
High Risk = ____
Some Risk = ____
On Track = no risk factors

Activity: Create or Evaluate Your Early Warning System or Convergent Data Decision Rules
• Using your Assessment Inventory or Assessment Audit & Early Warning ideas, create a draft of an Early Warning System or Convergent Data Decision Rules. If you already have one, evaluate its components.
• How does/could your school utilize early warning system data for Tier 1 improvement?
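The decision rules above translate directly into code. Here is a minimal sketch, assuming Python; the Student fields are hypothetical stand-ins for routinely available school data, and the "behind in credits" factor is simplified to counting core course failures. Thresholds follow the Hendry County slide: 20% absenteeism, core course failures, GPA below 2.0, and a failed FCAT.

```python
# Minimal sketch of Hendry County-style risk-level decision rules.
# Field names and the credits simplification are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Student:
    absence_rate: float         # proportion of days absent (0.20 = 20%)
    core_course_failures: int   # stand-in for "behind in credits"
    gpa: float
    passed_fcat: bool

def risk_factor_count(s: Student) -> int:
    """Count how many of the four risk factors the student shows."""
    return sum([
        s.absence_rate >= 0.20,       # 1. disengagement: 20% absenteeism
        s.core_course_failures >= 1,  # 2. behind in credits (core failures)
        s.gpa < 2.0,                  # 3. GPA less than 2.0
        not s.passed_fcat,            # 4. failed FCAT
    ])

def classify(s: Student) -> str:
    """Map the risk-factor count onto the slide's track labels."""
    n = risk_factor_count(s)
    if n == 0:
        return "On Track"
    if n == 1:
        return "At Risk for Off Track"
    if n == 2:
        return "Off Track"
    return "High Off Track"  # 3 or more risk factors

print(classify(Student(absence_rate=0.25, core_course_failures=2,
                       gpa=1.8, passed_fcat=True)))  # High Off Track
```

An elementary team could swap in its own convergent data sources (benchmark scores, attendance, ODRs) for the four factors while keeping the same counting logic.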
Teams
TIER 1 GRADE LEVEL/DEPARTMENT DATA TEAMS

2 Types of Data Team Meetings

Data Review Meetings
• Big picture: How are all students doing?
• Big picture: Who needs support & enrichment?
• Smaller picture: How are groups doing?

Instructional Planning & Review Meetings
• Unit planning
• Assessment FOR instruction

Continuous Improvement Cycle [diagram combining three elements]
• PLC Questions: What do we want our students to learn? How will we know they have learned it? How will we respond when some students don't learn? How will we respond when some students have clearly achieved the intended outcomes?
• Collaborative Instructional Planning cycle: Review Standards/Assessment → Planning → Teaching → Mid-Unit Assessment → Adjust Teaching → End-of-Unit Assessment → Analysis & Reteaching
• Problem-Solving Process: Problem Identification → Problem Analysis → Plan Development → Plan Evaluation

Comparison of Low & High Capacity Data Use

Low-Capacity Data Use → High-Capacity Data Use
• Accepts achievement gaps as inevitable → Responds to achievement gaps with immediate concern and corrective action
• Uses single measures to draw conclusions → Uses multiple sources of data before drawing conclusions
• Uses only summative measures → Uses formative and summative measures
• Blames students and external causes for failure → Looks for causes of failure that are within educators' control
• Draws conclusions without verifying hypotheses with data → Uses student work & data about practice and research to verify hypotheses
• Fails to monitor implementation/results → Regularly monitors implementation & results
• Prepares for tests by drilling students on test items → Aligns curriculum with standards & assessments; implements research-based improvements in curriculum, instruction, & assessment
• Tutors only those students just missing the cutoff for proficiency ("bubble kids") → Differentiates instruction; provides extra help and enrichment for all who need it
• Responds as individual administrators & teachers → Responds in teams & as a system
Adapted from Love, et al. (2008)

Grade Level Data Team Don'ts
• Don't use data to blame or punish (students, schools, staff)
• Don't make decisions without ample data
• Don't use data as an excuse for quick fixes. Focus on improving instruction!
Adapted from Love, et al. (2008)

District 45 Tier I Green Team Agenda and Documentation
Meeting Date: ____  Benchmark period: Fall / Winter [X]
Academic Area/Behavior: Language Arts/Reading
• Percentage of students at proficient level based on benchmark/standard: (Present and attach building triangle.)
• Goal for next benchmark, percentage of students at proficient level based on benchmark/standard: (Set a goal for the next benchmark period.)
• Reflect on your current practices in this area. (Refer to the grade-level IPF for the subject area/behavior discussion.) Discuss and record changes in the areas of Instruction, Curriculum, and Environment. Determine ways to increase the number of students who meet Tier I standards:
– Are instructional practices scientifically based?
– Is core being implemented with integrity?
– Is enough instruction time being allocated to ensure student success?
– Are there more effective ways to make sure that the "big ideas" of curriculum/positive behaviors are being instructed?
– Are additional grouping/differentiation options needed?
– Are classroom management/transitions being implemented with fidelity?
– Are students engaged in the learning process?
• Modify the grade-level IPF based on your reflections. Consider changes in the areas of Instruction, Curriculum, and Environment, and note them in the differentiation column of the IPF.
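Before the completed example that follows, here is what the form's benchmark check looks like as arithmetic. This is a minimal sketch, assuming Python and hypothetical variable names; the proficiency numbers mirror the D45 example below, where the goal is 70% proficient on all measures.

```python
# Illustrative check of benchmark proficiency percentages against the
# grade level's goal; values mirror the completed D45 example below.
goal = 0.70  # goal: 70% proficient on all measures
proficiency = {"ISAT": 0.68, "R-CBM": 0.60, "MAP": 0.65}

for measure, pct in proficiency.items():
    gap = goal - pct
    status = "meets goal" if gap <= 0 else f"{gap:.0%} below goal"
    print(f"{measure}: {pct:.0%} proficient ({status})")

if all(pct >= goal for pct in proficiency.values()):
    print("Goal met on every measure.")
else:
    print("Goal not yet met on every measure; move to problem analysis.")
```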
D45 example (6th grade, Language Arts/Reading):
• Percentage of students at proficient level: Spring ISAT = 68% proficient; R-CBM = 60% proficient; MAP = 65% proficient
• Goal for next benchmark: 70% proficient on all of the above measures
• Reflections: may need to work on classroom management across the grade level; each teacher seems to be using curriculum materials differently; need small-group instructional opportunities; need a common IPF across 6th-grade teachers

Tier I Differentiation Instruction (D45 document)
Differentiation groupings for this month:
• We have a group of 20 students who scored below expectations on at least 2 measures (R-CBM range = 6-72 wrc).
• The lowest group of 5 students has a range of 6-19 wrc because they lack basic decoding skills.
• Another group of 12 students has low reading scores because they need additional practice with the reading strategies taught in Tier 1.
• 3 students have reading scores slightly below grade-level standards because they need to build their fluency skills.

Tier I Differentiation Instruction (D45 document)
Differentiation groupings for this month:
• Group 1 = Guided Reading group targeting intensive decoding (15 min; 5x/wk)
• Group 2 = Guided Reading group targeting extra practice (20 min; 4x/wk)
• Group 3 = Guided Reading group targeting extra practice (20 min; 4x/wk)
• Group 4 = Guided Reading group targeting fluency (15 min; 1x/wk)

How to Assess Fidelity of Implementation [diagram]
• Before: What are we saying we will do? Document.
• During: Are we doing it? Walk-throughs, self-assessment, observations.
• After: How do we know? Evidence, data.

Example: Tier 1 GLT Documentation
Tier 1 Building & Grade Level Data Teams

Problem Identification/Screening: Review of multiple sources of student outcome data.
• What is the level of proficiency? (Data 1-4) Source: ______ % = ____
• What is your goal level of proficiency on one or more of the outcomes? (Data 1-4) Source: ______ Goal % = ____

Problem Analysis/Diagnostics:
• Review disaggregated data for the sources above. (Data 1-4) Source: ______ Analysis: ____
• Review related systems data, including walk-throughs, instructional fidelity, surveys, aggregate teacher evaluation, etc. (Data 1-4) Source: ______ Analysis: ____

Hypotheses:
Plan Development:
Progress Monitoring Plan:
• Formative
• Summative
Plan Evaluation Decision Rule:

Data Teams for Secondary

High School
• Usually by subgroups of departments, aligned by:
– Courses
– Grade level of students
– Honors, AP, IB, electives

Middle School
• Core Teams: teachers who teach a constant group of students and have students in common (e.g., ELA, Math, Science)
• Content Areas: teachers who have content in common (e.g., 3 teachers who teach the same grade's math, science, etc.)

Closing Activities