MACTE Fall Conference - Missouri State University



Next Steps for the Teacher Performance Assessment Consortium in Missouri
MACTE Fall Conference, October 2011

Sam Hausfather, Dean, School of Education, Maryville University of St. Louis
Kendyll Stansbury, Teacher Performance Assessment Consortium, Stanford University

Where TPAC fits in

TPAC is working to develop and implement at scale a way of assessing teaching that…

• Provides evidence of teaching effectiveness
• Supports teacher preparation program improvement
• Informs policy makers about qualities of teaching associated with student learning

TPAC is ONE example of an assessment system that is designed to leverage the alignment of policies and support program renewal.

Stanford Center for Assessment, Learning and Equity 2011

Standards and TPAC

• Common Core alignment
• InTASC alignment
• Missouri Teaching Standards
• NCATE/CAEP endorsement
• SPA endorsement
• Content focus
• US DOE Blueprint

TPAC Lineage

• National Board for Professional Teaching Standards (NBPTS): portfolio assessments for accomplished teachers
• Connecticut BEST: assessment for teachers at the end of induction
• Performance Assessment for California Teachers (PACT): assessment for pre-service teachers

Partnering States

Highlights of Pearson’s Role in the TPA

• Pearson has been selected as Stanford's operational partner.
• Support Stanford and AACTE with assessment development and technical review.
• Train and certify scorers, provide a scoring platform, and report results for the operational TPA.


Design Principles for Educative Assessment

• Discipline specific and embedded in curriculum
• Student centered: examines teaching practice in relationship to student learning
• Analytic: provides feedback and support along targeted dimensions
• Integrative: maintains the complexity of teaching
• Affords a complex view of teaching based on multiple measures

What do candidates do?

1. Identify a meaningful chunk of instruction around a big idea or essential question
   • Learning segment of 3-5 days
   • Identify content and academic language
   • Describe strategies and materials tailored to students in class
2. Collection of artifacts and commentaries
3. A summative assessment of teaching practice

TPAC Artifacts of Practice

Planning

• Instructional and social context
• Lesson plans
• Handouts, overheads, student work
• Planning Commentary

Instruction

• Video clips
• Instruction Commentary

Assessment

• Analysis of whole class assessment
• Analysis of learning and feedback to two students
• Instructional next steps
• Assessment Commentary
• Daily Reflection Notes
• Analysis of Teaching Effectiveness Commentary
• Evidence of Academic Language Development

Conceptual Framework of Assessment

• What? – The candidate describes plans or provides descriptions or evidence of what the candidate or students did.
• So what? – Rationale for plans in terms of knowledge of students and research/theory; explanation of what happened in terms of student learning or how teaching affected student learning.
• Now what? – What the candidate would do differently if they could do it over, next instructional steps based on assessment, feedback to students.

Multiple Measures Assessment System

• Embedded Signature Assessments: child case studies, analyses of student learning, curriculum/teaching analyses
• TPAC Capstone Assessment: integration of planning, instruction, assessment, and analysis of teaching, with attention to academic language
• Observation/Supervisory Evaluation & Feedback

Targeted Competencies

PLANNING
1. Planning for content understandings
2. Using knowledge of students to inform teaching
3. Planning assessments to monitor and support student learning

INSTRUCTION
4. Engaging students in learning
5. Deepening student learning during instruction

ASSESSMENT
6-7. Analyzing student work
8. Using assessment to inform instruction

REFLECTION
9. Analyzing teaching effectiveness

ACADEMIC LANGUAGE
10. Identifying language demands
11-12. Evidence of language use

Rubric Levels

Is the candidate ready for independent teaching (i.e., to be the teacher of record)?

• Level 1 – Struggling candidate, not ready to teach
• Level 2 – Some skill, but needs more practice to be teacher of record
• Level 3 – Acceptable level to begin teaching
• Level 4 – Solid foundation of knowledge and skills
• Level 5 – Stellar candidate, in the top 5% of candidates; sophisticated practice

Academic Language: Thinking about Students

• Language of school
• Language and structures of tests, questions, and instructions
• Language of subject-specific oral and written formats

"School is where you go to learn a secret language but they don't tell you that it's there. You have to figure it out on your own. It's like an initiation to a secret club." – Maya, 8th grade

"I knew I was gone fail that test when I got to the third question and I didn't even know what they was asking me." – Karah, 11th grade

"It ain't that I don't know nothin', it's that I can't say it right." – Mitch, 7th grade

Collected by Melanie Hundley, Vanderbilt University

Academic Language

Academic language is the oral and written language used in school that is necessary for learning content. This includes the "language of the discipline" (vocabulary and forms/functions of language associated with learning outcomes) and the "instructional language" used to engage students in learning content.

Maryville University: Why we piloted

• Had required a Teacher Work Sample (TWS) for 15 years
• Required in student teaching and used as NCATE evidence
• Noted lack of clear differentiation provided by the TWS
• Faculty agreed to use the TPA for all 36 spring 2011 student teachers
• DESE created common student release forms and a letter to principals and superintendents (emphasizing video)
• Introduced all teacher education faculty to the structure, rubrics, and academic language
• Student teaching seminar instructor took major responsibility for assisting candidates – a huge job!

Perceived TPA Positives

• A model of teaching and learning better matched to what research tells us now
• Excellent rubrics get at important concepts
• Less about our candidates, more about their students and student learning
• Analysis of teaching (video) lacking in the TWS
• Structured assessment plan more in-depth than the TWS
• Academic language component important in today's schools

Perceived TPA Concerns

• Doesn't align well with what was emphasized in programs previously
• Candidates not asked to write about certain things before (e.g., connections among elements of plans and lessons; academic language)
• Academic language not well infused previously in programs
• Process and directions overwhelming to candidates – need conceptual or graphic organizers
• Lesson plan frameworks very open-ended; unclear where to include them in the submission
• Subgroups very open-ended and don't emphasize the achievement gap
• Seminar leader not a content expert in all candidate fields

Faculty involvement in TPA alignment

• One day of the May faculty retreat spent learning to score
• All faculty scored TPAs in their content area
• Backward mapping the TPA into programs
• Revising common lesson plan format / journaling
• Focusing more on assessment
• Giving good feedback to students during lessons
• Differentiating assessments
• Infusing academic language into the teacher education curriculum

Maryville syllabi considerations

• Try to embed language from the TPA prompts and rubrics into your courses
• Consider the reflection questions when candidates teach lessons
• Help our candidates align big ideas/essential questions to their objectives and plans
• Help our candidates understand and recognize academic language and how to support academic language development in their planning and teaching
• Require candidates to video their teaching and identify 15-minute clips that they can respond and reflect on
• Require candidates to analyze student work samples
• Require candidates to create assessments focused on what students do and do not understand, along with assessments that go beyond the knowledge level

Missouri Context

• MoSPE – The Missouri Assessment Plan
• Common Core Standards for students = higher expectations for student performance
• The new standards lead to a new teacher and administrator evaluation based on performance assessment and student growth
• Missouri's interest in developing "a set of common statewide assessments" – this is why the state is interested in TPAC!
• The TPA will provide candidate data for these new demands and provide IHEs information about program quality.
• A common statewide assessment to connect to the performance assessment and student growth measures used in our public schools.

Development Timeline

• 2009-10: Small-scale tryout tasks and feedback from users.
• 2010-11: Development of six pilot prototypes based on feedback. Piloted in 20 states. User feedback gathered to guide revisions.
• 2011-12: National field test of 13 prototypes, producing a technical report with reliability and validity studies and a bias and sensitivity review. National standard setting.
• 2012-13: Validated assessment ready for adoption.

Field Test Participation

Subject areas to be field tested:

• Elementary Literacy, Elementary Mathematics, English/Language Arts, History/Social Studies, Secondary Mathematics, Science
• Special Education, Early Childhood Development, Middle Grades (Science, ELA, Math, and History/Social Studies), Art, Performing Arts (Music, Dance, Theater), Physical Education, and World Language

Field Test Participation

• Pearson will support scoring training and scoring stipends for a national sample of 18,000 candidates
• Scoring training and certification online (some synchronous events)
• Scorers to include IHE faculty, field supervisors, cooperating teachers, principals, NBCTs, and others with pedagogical content knowledge and experience with beginning teacher development
• Local, state, and national scoring

Framing Reliability and Validity Research

• Current policies in play
• Evidence needed to support TPA use for accreditation and licensure decision-making
• Potential role for VAM and other predictive validity measures

Next Steps

• Join TPAC Online (http://tpaconline.ning.com/)
• Field test commitments
• Technical assistance
• AACTE affiliate meetings
• Ongoing webinars and Ning discussions
• PACT/TPAC Implementation Conference – October 20-21 in San Diego
• AACTE Annual Meeting – February 17-19, 2012

Questions/Discussion

• Sam Hausfather: [email protected]
• Kendyll Stansbury: [email protected]
• Institutions considering the spring field test: stay for additional discussion