Status Update
February 5, 2008

PBS in North Carolina
By the end of 06-07:
• 81 of 115 LEAs had at least one school participating (70%)
• 252 new school teams had begun training
• Legislature approved a full-time PBS consultant

Schools Implementing PBS in NC
| 00-01 | 01-02 | 02-03 | 03-04 | 04-05 | 05-06 | 06-07 | 07-08 |
|     1 |     5 |     9 |    27 |   147 |   293 |   548 |   768 |

Office Discipline Referral Risk in NC (percent of students)
|         | 04-05 | 05-06 | 06-07 | Control |
| 6+ ODR  |     5 |     3 |     4 |      10 |
| 2-5 ODR |    12 |     9 |    11 |      23 |
| 0-1 ODR |    83 |    87 |    85 |      67 |

Office Discipline Referral Risk in NC (percent of students)
|         | SET Met (n=7) | SET Not Met (n=5) |
| 6+ ODR  |             2 |                 4 |
| 2-5 ODR |             7 |                14 |
| 0-1 ODR |            91 |                82 |

Third Grade EOG Results in PBS Schools
|                   | Reading | Math  |
| SET Met (n=7)     |   86.73 | 83.1  |
| SET Not Met (n=5) |    71.7 | 58.34 |

How do we ensure the continued growth and success of PBS in NC?

[Organizational chart: a State Leadership Team with three areas of focus (Visibility & Political Support, Training & Coaching, Assessment & Evaluation) supports the State PBS Consultant, who supports Regional Coordinators, LEA Coordinators, External Coaches, and Trainers. Each school's Team Leader (In-School Coach) works with the School Administrator and a team of School Staff, Students, Parents, and Community members, including a Recorder, Data Manager, Time-keeper, and Communication Coordinator.]

Vision: All schools in North Carolina will implement Positive Behavior Support as an effective and proactive process for improving social competence and academic achievement.

Mission: To provide leadership, professional development, resources, and on-going support in order for schools to successfully implement Positive Behavior Support.

State Leadership Team: Visibility & Political Support Goals
• Maintain an up-to-date PBS website
• Coordinate and inform all departments/divisions at DPI regarding PBS updates
• Increase awareness of North Carolina's mission and vision for PBS
• Facilitate networking among all PBS stakeholders

State Leadership Team: Training & Coaching Goals
• Current registry of trainers/coaches
• Current registry of participating LEAs, contact people/coordinators, and schools
• Provide training, support, and networking opportunities for trainers, coaches, and coordinators
• Fidelity of training and implementation
• Inclusion of IHEs
• Inclusion of PBS in standards for education leadership candidates and preservice/graduate personnel
• Support creation of durable systems

State Leadership Team: Assessment & Evaluation Goals
• Determine specific data to be collected statewide
• Create a plan for obtaining a thorough evaluation of the PBS Program in North Carolina

State PBS Consultant
Position now filled by Heather Reynolds Solone as a result of legislative action. The PBS consultant is part of the Behavior Support & Special Programs Section of the EC Division, led by Chief Diann Irwin.

Regional Coordinators
Regional responsibilities include the following:
• 2/3 of the time working with PBS implementation in the region and state (14 or 15 work days per work month).
• Attend PBS coordinator meetings and training.
• Host state and regional meetings for implementing schools.
• Coordinate regional PBS training.
• Provide PBS awareness presentations in the region.
• Work with PBS LEA trainers to complete School Evaluation Tools.
• Help plan PBS Summer Institute and conference presentations.
• Visit implementing schools in other LEAs, as possible.
• Provide PBS technical assistance and support in the region.
• Coordinate data collection for the region.
• Assist local PBS trainers with using and understanding data.
• Stay informed about national PBS research.
• Coordinate with the Behavior Support Consultant from the region.

Regional Coordinators
Expected LEA responsibilities for the position include:
• Coach participating schools in the LEA.
• Help train new schools in the LEA.
• Direct data management and program evaluation.
• Chair the school system PBS Leadership Team.
• Work with PBS trainers and chairpersons in the LEA.
• Link between schools, the leadership team, and the leadership of the school system.
• See that the School Evaluation Tool is completed for each implementing school in the LEA.
• Manage the school system action plan.

LEA Coordinator
• Coordinate with the PBS Regional Coordinator from the region.
• Attend PBS coordinator meetings and training.
• Coordinate LEA PBS training.
• Provide PBS awareness presentations in the LEA.
• Provide PBS technical assistance and support in the LEA.
• Host LEA meetings for implementing schools.
• Visit implementing schools.
• Work with PBS coaches, trainers, and chairpersons in the LEA.
• Assist local PBS teams with using and understanding data.
• Stay informed about national PBS research.
• Direct data management and program evaluation.
• Chair the school system PBS Leadership Team.
• See that the School Evaluation Tool is completed for each implementing school in the LEA.
• Coordinate data collection for the LEA and send it to the Regional Coordinator.
• Link between schools, the leadership team, and the leadership of the school system.
• Manage the school system action plan.

External Coach
• Coordinate with the PBS LEA Coordinator.
• Attend PBS coach meetings and training.
• Attend LEA PBS Leadership Team meetings.
• Coordinate LEA PBS training.
• Provide PBS technical assistance and support in the LEA.
• Facilitate LEA meetings for implementing schools.
• Attend implementing school team meetings.
• Work with PBS trainers and school teams in the LEA.
• Assist local PBS teams with using and understanding data.
• Stay informed about national PBS research.
• Complete the School Evaluation Tool for each implementing school in the LEA.
• Coordinate data collection for school teams and send it to the LEA Coordinator.
• Link between schools and the LEA Coordinator.
• Assist schools with action planning.
Trainer
• Work with the PBS Regional Coordinator and LEA Coordinator to plan trainings.
• Complete the train-the-trainer process:
  - Participate in all 3 Modules as a team member.
  - Co-train all 3 Modules with an experienced trainer.
  - Achieve competence and train independently.
• Attend trainer refreshers and updates.
• Provide support and technical assistance for school teams.
• Complete annual self-assessment and competency requirements.

Team Leader (In-School Coach)
• Coordinate with the LEA/External Coach and LEA Coordinator.
• Attend PBS coach meetings and training.
• Facilitate team meetings for your school.
• Assist teams with using and understanding data.
• Stay informed about national PBS research.
• Coordinate completion of the School Evaluation Tool.
• Coordinate data collection and send it to the LEA Coach.

The School Team
School Administrator, Team Leader (In-School Coach), Recorder, Time-keeper, Communication Coordinator, Data Manager, School Staff, Students, Parents, and Community.

What's next?
• Recognition system
• PBS Evaluation Manual
• Effective individual assessment
• Effective coaching
• District planning, coordination, and assessment

NC PBS DATA COLLECTION
Presented by Laura Phipps

Objectives
Understanding of…
• The NC PBS Data Collection Manual
• The NC PBS Recognition Program
• Strategies for using data for effective action planning

NC DATA COLLECTION MANUAL
SECTION I: History and Purpose

Manual History
Historically, we have had inconsistent data collection due to:
• Different-sized LEAs
• Different data collection sources
• Various levels of data system knowledge
This led to challenges for schools in assessing data and the effectiveness of PBS implementation, and made it difficult for the state to draw state-wide conclusions about PBS outcomes.

Manual Purpose
Not intended to add work, but to organize the work you are already doing.
• Toolbox: Describes different types of data you may want to collect and the rationale for how each will help you.
• Road Map: Provides guidelines for completing a thorough assessment of PBS implementation.
• State-Wide Goal: Increase consistency of data collection across the state and guide support from DPI and PBS Coordinators.
Using Data at the School Level
A continuous cycle: collect data to determine need → create action plan and steps → implement → collect data to assess the plan → make needed revisions.

Using Data at the State Level
A continuous cycle: collect data from all regions → analyze and summarize patterns → determine needed supports → create action steps/plan → implement the plan.

Manual Overview
5 sections:
1. Purpose of the data collection manual
2. Implementation data
3. System level outcome data
4. Small group/individual outcome data
5. System-wide implementation data
For each type of data, the manual covers the what and why, and the how-to.

Recognition Program
The goal is to motivate schools to provide data collection SO THAT we can increase sustainability of implementation. The manual provides specific data requirements for meeting state implementation standards.

Recognition Program Components
• Systems: training, team
• Data: implementation data, outcome data
• Practices: SET score, implementation level

Recognition Program: Documents
• Recognition Program Requirements
• Application for State Recognition
• Data Requirements
On the website: http://www.ncpublicschools.org/positivebehavior/ and in the Manual.

NC DATA COLLECTION MANUAL
SECTION II: Implementation Data

Implementation Data Rationale
To ensure that the implementation of PBS at any given school is being done with reliability and accuracy.

How will collecting this data impact:
• School administrators: Provides clear information about the fidelity of implementation of PBS and guides decision making regarding use of time and resources.
• PBS teams: Provides specific information regarding areas for improvement in order to create meaningful action plans.
• Teachers: Helps the PBS team move as quickly and efficiently as possible towards creating a sustainable model, improving school climate and overall student outcomes.
• Students, parents, communities: Highly accurate implementation will quickly transition schools towards a more positive climate.
Implementation Data vs. Outcome Data

Implementation Data
• Designed to measure fidelity of implementation
• Goal is to develop action steps

Outcome Data
• Measures progress on specific school-wide goals
• Allows schools to determine the impact of PBS implementation
• Documents the effectiveness of PBS on overall school climate
• Used by LEAs to make system-wide decisions

Implementation Data Schedule
| Month      | Academic Year | Data                                  | Who             | Action                                            |
| November   | Current       | Implementation Inventory (baseline)   | School Staff    | Keep for comparison                               |
| November   | Current       | School Survey (baseline)              | School Staff    | Send to LEA Coordinator* and enter results online |
| March-June | Current       | SET (due by June 1st)                 | School Staff    | Send to LEA Coordinator*                          |
| April      | Next          | Future Training List                  | LEA Coordinator | Send to LEA PBS Coordinator*                      |
| May        | Current       | Implementation Inventory (comparison) | School Staff    | Keep for comparison                               |
| June       | Current       | Trainer Report                        | PBS Trainer     | Send to LEA Coordinator*                          |
| June       | Current       | Application for Recognition           | LEA Coordinator | Send to LEA Coordinator*                          |

Implementation Data Tools
• Implementation Checklist/Inventory
• School Survey (EBS Survey/Self-Assessment)
• Trainer Report
• SET
• Future Training List
See Data Manual pages 10-14.

NC DATA COLLECTION MANUAL
SECTION III: System Level Outcome Data

System Level Outcome Data Rationale
To determine how prevention and intervention strategies are impacting the school environment.

How will collecting this data impact:
• School administrators: By evaluating system level outcome data, you can make sure that your school resources are being used most efficiently.
• PBS teams: To know what kind of prevention and intervention strategies are needed based on your specific school population.
• Teachers: Helps the PBS team make accurate decisions about practices to use in the school and classroom based on your specific student population.
• Students, parents, and communities: Data will help choose or modify strategies to ensure the best academic and behavioral outcomes.

Outcome Data Schedule (all in June of the current academic year, collected by school staff)
| Data                                             | Action (Internal or External)       |
| Achievement Data (pg. 19)                        | Send to LEA Coordinator (E)         |
| Suspension/Expulsion (pg. 21)                    | Send to LEA Coordinator (E)         |
| Staff Retention Data (pg. 28)                    | Keep for internal data analysis (I) |
| Climate Surveys (pg. 26)                         | Keep for internal analysis (I)      |
| Special Education Referrals/Eligibility (pg. 27) | Keep for internal analysis (I)      |
| Referrals Data (pg. 22)                          | Send to LEA Coordinator (E)         |
| Attendance (pg. 29)                              | Keep for internal analysis (I)      |

System Level Outcome Data Tools
• Achievement data
• Suspension/expulsion data
• Referral data: how to collect using SWIS vs. NC Wise (or another system); see the sketch after this list
• Climate surveys
• Special education referrals/eligibility data
• Staff retention data
• Attendance data
See Data Manual pages 19-29.
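Referral data feeds both the risk "triangle" reports shown earlier (the percentage of students with 0-1, 2-5, and 6+ ODRs) and the system-wide reports described later. A minimal sketch of that banding arithmetic, assuming a hypothetical referral export that lists one student ID per referral rather than the actual SWIS or NC Wise export format:

```python
from collections import Counter

def odr_triangle(referral_student_ids, enrollment):
    """Band students by yearly office discipline referrals (ODRs).

    referral_student_ids: one entry per referral (hypothetical export);
    enrollment: total students, so zero-referral students land in 0-1.
    Returns the percentage of students in each risk band.
    """
    per_student = Counter(referral_student_ids)
    bands = {"0-1 ODR": enrollment - len(per_student), "2-5 ODR": 0, "6+ ODR": 0}
    for n in per_student.values():
        if n <= 1:
            bands["0-1 ODR"] += 1
        elif n <= 5:
            bands["2-5 ODR"] += 1
        else:
            bands["6+ ODR"] += 1
    return {band: round(100 * count / enrollment, 1) for band, count in bands.items()}

# 500 enrolled students; the log lists the student involved in each referral
print(odr_triangle(["s01", "s01", "s02", "s03", "s03", "s03"], enrollment=500))
# {'0-1 ODR': 99.6, '2-5 ODR': 0.4, '6+ ODR': 0.0}
```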
NC DATA COLLECTION MANUAL
SECTION IV: Small Group and Individual Level Outcome Data

Small Group and Individual Level Outcome Data Rationale
• Allows better identification of which students are in need of the most support.
• Need to be able to better assess how interventions are working at the individual and small group level prior to problem behavior.
• PBS traditionally only collects data after problem behavior has occurred, preventing the ability to know what interventions will work at the classroom level.

How will collecting this data impact:
• School administrators: Document the educational and behavioral progress of at-risk students and identify which interventions are most effective in working with at-risk students.
• PBS teams: Determine the effectiveness of function-based behavioral supports and address problem areas through a team-based approach.
• Teachers: Provides a clear way to focus time and energy on interventions that are shown to be effective, and a clear way to communicate progress to other staff and parents.
• Students, parents, and communities: Improves the quality of interventions for children and gives teachers and parents a common way to communicate about progress.

PBS and RTI
RTI and PBS need to work together:
• PBS is good at universal implementation; RTI is good at small group and targeted interventions.
• PBS: whole school/class-wide assessment, intervention, and implementation. RTI: individual child and small group assessment, intervention, and implementation.
• RTI has historically been better at assessment and use of data; PBS has historically been better at implementing and getting schools to do something different.
We need to learn from each other and adopt both methods.

Small Group and Individual Level Outcome Data Tools
• Direct Behavior Ratings (DBR)
• Other options: permanent products, systematic direct observation
See Data Manual pages 33-35.

NC DATA COLLECTION MANUAL
SECTION V: System Wide Implementation

Documenting System Wide Implementation Rationale
• For large school systems (LEAs) implementing system wide, it may be helpful to write a single report summarizing progress.
• This allows LEA administrators to assess the overall impact of PBS implementation in order to better provide support and resources.
• A system wide report should not replace collecting and assessing data on an individual school basis, but can be an additional tool in creating sustainability of PBS.

Documenting System Wide Implementation
Possible components of a system wide report:
• Executive summary
• Overview of the number of schools implementing (elementary, middle, and high)
• Combined SET scores for each of the seven areas, by elementary, middle, and high
• Combined referral information by elementary, middle, and high: average per day and quarter, by location, by problem behavior
• Combined suspension/expulsion data
• Triangle data and analysis by elementary, middle, and high
• Achievement data (PBS schools compared to non-PBS schools)
• Staff impact data (e.g., retention, morale)

COMMON CHALLENGES

Challenges and Solutions
Challenges:
• Time consuming
• Incompatible data collection systems
• Fear
• Data does not feel meaningful
• Belief that it is just restating the obvious
• Resistance to technology and numbers

Solutions:
• Reframe thinking about TIME: effective use of data will save time in the long run.
• The PBS team and administrators need to work together to streamline data collection methods.
• Build trust among staff members and administrators through frequent sharing of data: data is information, NOT judgment.
• Make sure all data collection is CLEARLY connected to tangible action steps.
• Show staff how objective information (data) increases staff investment and makes implementation more meaningful.
• Demystify the word "data."

ACTION PLANNING: Using Data Effectively

Effective Action Plans
Effective action plans are…
• used regularly.
• frequently reviewed and updated.
• accessible to all staff.
• made up of specific, doable action steps with clear timelines.
• generated using data from staff and team.
Using Self-Assessment Survey Data to Generate Action Steps
• Once the survey is closed (on pbssurveys.org), you will be able to access reports using the same login number.
• There are three separate reports.
• You can see overall trends as well as specific numbers for each item.
• Once you identify items and areas needing improvement, the team should prioritize action steps.
• Some items can be addressed through information only; others will need revisions or completion of tasks.

Sample Action Item
One school's survey data showed that many staff indicated both "high priority for improvement" and "not in place" for the item that read "Data on problem behaviors are collected and summarized within an ongoing system." Since the school was using SWIS, the team decided the issue was one of awareness. They created the following action item.

| Goal | Steps | Who | Resources Required | By When | Evaluation Measure |
| Increase staff awareness of SWIS data | 1. Present SWIS big 5 graphs at next staff meeting | Debbie (Data Manager) | Copies of big 5; discussion questions | March 24 | Staff will increase requests for big 5 for grade level meetings |

Using the Implementation Inventory to Generate Action Steps
• The team completes the Implementation Inventory.
• After calculating percentages of implementation, focus on the areas scoring below 80%.
• Within each section, look at the items marked 1 or 0 and create a prioritized list. (A sketch of this arithmetic follows the sample below.)
• Cross-reference the team's list with the results from the staff survey.
• Create action steps, starting with the highest priority.

Sample Action Item
After completing the Implementation Inventory, the team found that they scored 61% in universal practices. Upon further review, they found that all the items marked 1 or 0 had to do with specific teaching of school-wide expectations. They also noticed from the staff survey a focus on the need to improve non-classroom setting routines. The team developed the following action item.

| Goal | Steps | Who | Resources Required | By When | Evaluation Measure |
| Increase consistent use of expected behaviors in the cafeteria | 1. Create lesson plans for cafeteria expectations. 2. Create a schedule for all teachers to teach expectations in the cafeteria. | Lori and cafeteria TAs | Time to meet with cafeteria TAs; sample lesson plans from other schools; lesson plan template from PBS website | March 24th; first week after spring break | All staff will complete a feedback form after completion of the lesson; reduction of referrals from the cafeteria during spring semester |
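As referenced above, a minimal sketch of the percentage-and-prioritization arithmetic, assuming hypothetical section names and 0/1/2 item scores (the actual Implementation Inventory items live in the Data Manual):

```python
def inventory_summary(sections, threshold=80):
    """Score each Implementation Inventory section as a percent of points
    possible (items assumed scored 0/1/2) and, for sections under the
    threshold, list the 0- and 1-scored items as the prioritized work."""
    for name, items in sections.items():
        pct = 100 * sum(items.values()) / (2 * len(items))
        print(f"{name}: {pct:.0f}%")
        if pct < threshold:
            for item, score in sorted(items.items(), key=lambda kv: kv[1]):
                if score <= 1:
                    print(f"  prioritize ({score}): {item}")

# Hypothetical section and item names, for illustration only
inventory_summary({
    "Universal practices": {
        "Expectations taught in all settings": 0,
        "Acknowledgement system in use": 1,
        "Expectations posted": 2,
    },
    "Team functioning": {"Regular meetings": 2, "Data reviewed monthly": 2},
})
```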
Using SWIS Data to Generate Action Steps
• Regularly review the SWIS big 5 graphs.
• For each graph, create a list of questions the data generates.
• Create custom graphs to answer those questions. (A sketch of this step follows the sample below.)
• Bring the data back to the team and/or staff for discussion of patterns.
• List possible action items.
• Prioritize the list and develop action steps for the highest priority.

Sample Action Item
After reviewing the SWIS big 5, one team discovered that the majority of problem behaviors were occurring in the classroom. They created a custom graph to determine what specific behaviors were occurring in the classroom. They found that noncompliance/disrespect was the biggest issue. The team generated the following action item.

| Goal | Steps | Who | Resources Required | By When | Evaluation Measure |
| Increase compliance with requests in classrooms | 1. Have the counselor create social skills lessons on specific skills (compliance, accepting no, asking for help). 2. Have all teachers schedule a lesson with the counselor. | Carmine; counselor | Lesson plan template; social skills curriculum | First week after spring break | Decrease in classroom referrals for noncompliance/disrespect |
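As referenced above, a minimal sketch of the custom-graph step, assuming a hypothetical referrals.csv export with date, location, and problem_behavior columns rather than the actual SWIS export schema:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export: one row per office discipline referral
referrals = pd.read_csv("referrals.csv", parse_dates=["date"])

# Big-5-style view: referral counts per month
print(referrals.groupby(referrals["date"].dt.to_period("M")).size())

# Custom graph: which problem behaviors are driving classroom referrals?
classroom = referrals[referrals["location"] == "Classroom"]
classroom["problem_behavior"].value_counts().plot(
    kind="bar", title="Classroom referrals by problem behavior")
plt.tight_layout()
plt.savefig("classroom_behaviors.png")
```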
Activity: Action Planning
Using the SAMPLE school data provided (Implementation Inventory results, SWIS Big 5, and PBS Self-Assessment Survey), create at least one action item. Use the blank action plan provided.

Data Collection Next Steps
• Review the Data Collection Manual with your LEA administrators and school teams.
• Contact your PBS Regional Coordinators with any questions or concerns: http://www.ncpublicschools.org/positivebehavior/coordinator/
• Look for upcoming trainings on data collection: PBS Summer Institute, regional trainings.

School-Based Behavioral Assessment: Informing Intervention and Instruction
S. Chafouleas, T. Chris Riley-Tillman, G. Sugai
Direct Behavior Rating (DBR) presentation by C. Riley-Tillman
Presented by C. McCamish

Section IV: Small Group and Individual Level Outcome Data
Small group and individual level outcome data:
• Assess effectiveness of interventions
• Document educational and behavioral progress
• Determine effectiveness of function-based behavioral supports
• Communication tool

Direct Behavior Rating (DBR) Background
• Effective behavioral assessment and intervention procedures in applied settings require the use of empirically-supported yet feasible techniques.
• To date, feasible assessment of behavior skills has focused on ODR data, which may not be sensitive enough to capture all behaviors of interest.
• To date, support for feasible, formative assessment of academic skills is available (e.g., CBM), but attention has not been directed toward social behaviors.

Defining Characteristics of the DBR
The DBR involves a brief rating of target behavior over a specified period of time:
• a behavior (or behaviors) is specified
• rating of the behavior typically occurs at least daily
• obtained information is shared across individuals (e.g., parents, teachers, students)
• the card is used to monitor the effects of an intervention and/or as a component of an intervention
(Chafouleas, Riley-Tillman & McDougal, 2002)

Direct Behavior Ratings
Refer to a hybrid of assessment tools that combine characteristics of systematic direct observation and behavior rating scales. SDO is a method of behavioral assessment that requires a trained observer to identify and operationally define a behavior of interest, use a system of observation in a specific time and place, and then score and summarize the data in a consistent manner (Salvia & Ysseldyke, 2004; Riley-Tillman, Kalaber, & Chafouleas, 2006).
These tools are designed to be used in a formative (repeated) fashion to represent behavior that occurs over a specified period of time (e.g., 4 weeks) and under specific and similar conditions (e.g., 45 min. morning seat work). Using these tools requires rating target behavior on a scale (e.g., rating the degree to which Johnny was actively engaged). So, teachers might be asked to rate, on a scale from 1 (not at all) to 5 (almost always), the degree to which Johnny was actively engaged in work activities during independent seat work this morning.

Other Names for the DBR
• Home-School Note
• Behavior Report Card
• Daily Progress Report
• Good Behavior Note
• Check-In Check-Out Card
• Performance-based behavioral recording

Example: Standard DBR

Who already uses the DBR?
• 60% of teachers surveyed already use DBRs to change student behavior; 32% to monitor or observe student behavior
• 81% to identify positive behaviors, 77% to identify negative behaviors
• 86% use with individual students, 19% with the whole class, 9% with small groups
• 32% use DBRs "routinely" as part of a classroom management plan
(Chafouleas, Riley-Tillman, & Sassu, 2006)

Many Potential Uses for the DBR
• Increase communication (teacher-student, home-school)
• As a component of an intervention package, particularly in self-management
• Provide "quick" assessment of behaviors, especially those not easily captured by other means
• Monitor student behavior over time
• Flexible: K-12, increasing (+) or decreasing (-) behavior, 1 student or a larger group, a range of behaviors
The DBR is flexible, feasible, effective, and evidence based, serving communication, intervention, and assessment.

A systematic DBR possesses the following 4 characteristics:
1. The behavior of interest must be operationally defined.
2. The observations should be conducted under standardized procedures.
3. The DBR should be used in a specific time, place, and at a predetermined frequency.
4. The data must be scored and summarized in a consistent manner.

Guiding Questions
1. Why do you need the data?
2. Which tools are the best match to assess the behavior of interest?
3. What decisions will be made using the data?
4. What resources are available to collect the data?

Design Flexibility
• What is the target behavior and goal? Focus on a specific behavior (e.g., calling out) or a cluster term for behaviors (e.g., disruption); the goal may be to increase or decrease the behavior.
• Who is the focus of the rating? Individual, small group, or class-wide.
• What is the period for rating? Specific school period, daily, or other.
• What is the setting of observation? Classroom or other location.
• How often will data be collected? Multiple times a day, daily, weekly.
• What scale for rating will be used? Checklist, scale, continuous line.
• Who will conduct the rating? Classroom teacher, aide, or other educational professional.
• Will ratings be tied to consequences? Consequences must be consistently delivered by the person responsible.

Considerations When Using a DBR
• Ensure that use is "systematic":
  - Identify and operationally define a behavior of interest.
  - Use a system of observation in a specific time and place.
  - Score and summarize the data in a consistent manner.
  (Similar to the criteria that define systematic direct observation; Salvia & Ysseldyke, 2004)
• Provide checks on integrity and acceptability.
• Understand correspondence with other data sources.

How are Direct Behavior Ratings data summarized?
Data can be quantified, compared, combined, and summarized for summative and formative purposes. For example, DBR data on Susie's disruptive behavior over the week can be summarized into a statement of average daily or weekly rating (6 out of 9 points) or, if multiple ratings per day are taken, the most likely period of high or low disruption (just before lunch). Since DBRs involve rating on some scale, data are summarized relative to the scale. For example, a simple yes/no checklist can be easily depicted through a bar chart, whereas rating information might be plotted on a line graph, with the intervals on the y-axis indicating the DBR scale.
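A minimal sketch of that summarization, assuming hypothetical ratings of disruptive behavior on the 9-point scale mentioned above, recorded three times a day:

```python
from statistics import mean

# Hypothetical week of DBR ratings of disruptive behavior (0-9 scale),
# recorded three times a day
ratings = {
    "Mon": {"morning": 4, "before lunch": 8, "afternoon": 5},
    "Tue": {"morning": 3, "before lunch": 7, "afternoon": 4},
    "Wed": {"morning": 5, "before lunch": 8, "afternoon": 6},
}

# Average daily and weekly rating: the summary statement described above
daily = {day: mean(periods.values()) for day, periods in ratings.items()}
print(f"Weekly average: {mean(daily.values()):.1f} out of 9")

# With multiple ratings per day, the most likely period of high disruption
period_avg = {p: mean(day[p] for day in ratings.values()) for p in ratings["Mon"]}
worst = max(period_avg, key=period_avg.get)
print(f"Highest-rated period: {worst} ({period_avg[worst]:.1f})")
```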
Summary of Strengths and Weaknesses of Use in Assessment
Strengths:
• Highly flexible
• Highly feasible, acceptable, and familiar
• Minimal cost given the potential amount of, and uses for, data
• Reduced risk of reactivity (atypical behavior)
• Can be used in assessment, intervention, and communication
Weaknesses:
• Rater influence (history)
• Limited response format
• Limited knowledge about psychometric adequacy

Case Example
Mr. Cohen is the sole school psychologist in Sunnyvale, a small, rural district. One of the teachers in the elementary school, Ms. Yoon, recently implemented a token economy in her classroom in an effort to increase pro-social behaviors among a small group of her students during cooperative learning activities. Although Ms. Yoon thinks that the intervention has been successful (she told Mr. Cohen that "the classroom environment feels more positive"), she would like to know for sure and asks Mr. Cohen to help her collect data to support this belief. Mr. Cohen is pleased that Ms. Yoon has sought him out and certainly wants to help, but his schedule is barely manageable over the next few weeks given other commitments. Thus, Ms. Yoon and Mr. Cohen agree to have Ms. Yoon collect data using a DBR, with Mr. Cohen coming in periodically (i.e., once per week) to supplement the DBR data with systematic direct observations.

Points to Consider
• The DBR measures perception of behavior: a rating moved "from 3 to 7," not "he is a 7."
• Academics have absolutes; social behavior has no absolutes.
• Rater effect: raters vary on their initial rating and may consistently overestimate, but tend to respond consistently to changes in behavior.

Resources
www.interventioncentral.org: offers an extensive resource on using behavior ratings in the Classroom Behavior Report Card Manual.
Chafouleas, S.M., Riley-Tillman, T.C., & Sugai, G. (in press). Behavior assessment and monitoring in schools. New York: Guilford Press.
Crone, D.A., Horner, R.H., & Hawken, L.S. (2004). Responding to problem behavior in schools: The behavior education program. New York: Guilford Press.
Jenson, W.R., Rhode, G., & Reavis, H.K. (1994). The tough kid tool box. Longmont, CO: Sopris West.
Kelley, M.L. (1990). School-home notes: Promoting children's classroom success. New York: Guilford Press.
Shapiro, E.S., & Cole, C.L. (1994). Behavior change in the classroom: Self-management interventions. New York: Guilford Press.

For More Information
Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction.
[email protected]