Excellent Educators for New Jersey (EE4NJ): November Update


Educator Evaluation Reform in New Jersey

November 16, 2012

The Case for Reforming Teacher Evaluation Systems: Impact

Nothing schools can do for their students matters more than giving them effective educators

• Principal and teacher quality account for nearly 60% of a school’s total impact on student achievement¹
• The effect of increases in teacher quality swamps the impact of any other educational investment, such as reductions in class size²
• Replacing one poor teacher with an average one increases a classroom’s lifetime earnings by ~$266,000³

Top educators have a lasting impact on their students’ success – in academics and in life

1. Marzano et al., 2005
2. Goldhaber, 2009
3. Chetty et al., 2011

Evolution of Evaluation Reform in New Jersey

2010-11: NJ Educator Effectiveness Task Force work; teacher evaluation pilot opportunity announced
2011-12: Teacher evaluation pilot in progress; capacity-building requirements announced for all districts to follow in 2012-13
2012-13: Cohort 2 teacher evaluation and new principal evaluation pilots in progress; districts building capacity; new tenure legislation in effect
2013-14: Statewide implementation of new evaluation system

2012-13 Teacher Evaluation Pilot Weights

Tested Grades and Subjects – equal weighting (50% TP / 50% SA)

Teaching Practice (TP) includes the following components, totaling 50% of the pie:
• Teaching Practice Evaluation Framework (40% - 45%)
• Other Measures of Teaching Practice (5% - 10%)

Student Achievement (SA) includes the following components, totaling 50% of the pie:
• Growth on NJ Assessments as measured by SGP (35% - 45%)
• School-Wide Performance Measure (5% - 10%)
• Other Performance Measures, optional (0% - 10%)

Non-Tested Grades and Subjects – variable weighting (districts have discretion; at least 50% TP, at least 15% SA, 35% district discretion)

Teaching Practice (TP) includes the following components, totaling 50-85% of the pie:
• Teaching Practice Evaluation Framework (45% - 80%)
• Other Measures of Teaching Practice (5% - 10%)

Student Achievement (SA) includes the following components, totaling 15-50% of the pie:
• Student Achievement Goals (10% - 45%)
• School-Wide Performance Measure (5% - 10%)

Districts determine how much of the remaining 35% of the pie is allocated to TP and/or SA.
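As an illustration only: assuming a district combines component scores with a simple weighted sum on a 1-4 rating scale (the presentation does not specify the combination method, and these component scores are hypothetical), a tested-grades teacher whose district sets weights of 45% framework, 5% other practice measures, 40% SGP, 5% school-wide measure, and 5% optional measures would receive:

Summative score = 0.45 × 3.2 + 0.05 × 3.0 + 0.40 × 2.8 + 0.05 × 3.0 + 0.05 × 3.5
                = 1.44 + 0.15 + 1.12 + 0.15 + 0.175
                ≈ 3.04 on the assumed 1-4 scale

Note that the chosen weights fall within the allowed ranges above and sum to 50% TP and 50% SA, as required for tested grades and subjects.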

2012-13 Teacher Evaluation Pilot: Changes from First Cohort

Based on learning from the 2011-12 pilots and national best practices, the 2012-13 pilot includes:
• Flexibility in minimum duration for classroom observations
• Fewer required observations for teachers of non-core subjects
• Use of double scoring
• Unannounced observations
• Use of external evaluators
• Flexibility in weighting for tested and non-tested grades and subjects

Importance of Training

“Training of evaluators is key! Training was ongoing and included an eclectic approach: whole group that included teacher leaders, coaching by Superintendent, and instrument provider. Ongoing debriefing and double-scoring for training purposes were key strategies to support the learning of all administrators. The alignment between curriculum, lesson planning, assessment was essential in guiding our work.”
– Cohort 1 Survey Response

Many districts are also using turnkey training to save time and money, and to engage educators in the process.

2012-13 Evaluation Pilot Feedback Loops

Sources of Feedback

• State Evaluation Pilot Advisory Committee (EPAC) provides recommendations on pilot and statewide implementation
• Each pilot district convenes a District Evaluation Advisory Committee (DEAC)
  – DEACs meet monthly to discuss pilot challenges and provide feedback
  – Districts convene one DEAC to cover both teacher and principal evaluation work
• External evaluator (Rutgers for 2011-12) studies pilot activity and provides reports

Outcomes

• Assess impact of new observation and evaluation protocols
• Convey best practices and lessons learned for the rest of the State
• Inform proposed regulations for 2013-14 and subsequent school years

DEAC Impact: Cohort 1 Pilot Survey

“Having a balanced representation of parents, teachers, administrators, and community members has allowed us to address the needs and ideas from every stakeholder in the district. Parents were able to cite students' positive reaction to the evaluators in the classroom. Parents were happy to know that we were using student achievement as a part of the teacher evaluation system. The DEAC process enabled stakeholders to share information.”
– Cohort 1 Survey Response

DEACs: Required for ALL New Jersey Districts

Districts must convene DEACs by October 31, comprised of the following:

• Teachers from each school level represented in the district
• School administrators conducting evaluations (this must include one administrator who participates on the School Improvement Panel and one special education administrator)
• Central office administrators overseeing the teacher evaluation process
• Supervisor
• Superintendent
• Parent
• Member of the district board of education

The mission of the DEAC is to:

• Solicit input from stakeholders
• Share information
• Guide and inform evaluation activities
• Generate buy-in

School Improvement Panel: Required for all Districts

Charge: Establish in each school in the district by February 1, 2013; ensure the effectiveness of the school’s teachers

Composition: School principal or designee; assistant/vice principal; teacher (Note: the teacher will not participate in evaluation activities except with approval of the majority representative)

Duties: Oversee mentoring; conduct evaluations; identify professional development opportunities; conduct a mid-year evaluation of any teacher rated ineffective or partially effective in the most recent annual summative evaluation

Summary of Lessons Learned from Cohort 1 Pilots

• Stakeholder engagement is critical; open district advisory committee meetings facilitate transparency and trust
• Developing measures for non-tested grades and subjects is challenging
• Timely, comprehensive, and quality training of educators and evaluators must be emphasized
• Administrators face capacity challenges
• Selection of a teaching practice instrument takes time and should include stakeholder input

Lessons Learned from Cohort 1: End of Year Reports

Key Successes: Training and Communication

Strategies for training:
• Onsite vendor support
• Online video exemplars
• District Q&A sessions
• Analysis of double-scoring

Strategies for communication:
• Working with DEACs
• Sharing information with EPAC
• Engaging with NJDOE supports
• Sharing resources and information

Key Challenges: Time Constraints and Assessing Non-Tested Grades and Subjects (NTGS)

Time constraints:
• Scheduling training
• Scheduling/completing observations
• Gaps between training and observations
• Scheduling DEAC meetings

Assessing NTGS:
• Identifying/developing assessments
• Aligning with new standards
• Updating technology platforms
• Seeking guidance from NJDOE

Future Pathways for Evaluation

2013-14

• Final report on pilots
• Support for statewide implementation
• Learn from implementation challenges
• Appropriate course adjustments
• Possible additional regulatory changes
• Learn from implementation results
• Cycle of continuous improvement

2014-15

• Ongoing data collection and analysis
• Applying lessons learned and modifying policies as needed

Website and Contact Information

Website: http://www.state.nj.us/education/evaluation

Contact information:
• For general questions, please email [email protected]
• Phone: 609-777-3788