Transcript PPT - BC TEAL
Shawna Williams BC TEAL Annual Conference May 24, 2014
Agenda
Introduction
Assessment terminology – definitions
Principles of Assessment
My assessment history…
Language Assessment: Principles and Classroom Practices
Assessment Terminology
Assessment ≠ Testing
Assessment is… “appraising or estimating the level or magnitude of some attribute of a person.” (Mousavi, 2009) “an ongoing process of collecting information about a given object of interest according to procedures that are systematic and substantively grounded.” (Brown & Abeywickrama, 2010)
Assessment “A good teacher never ceases to assess students, whether those assessments are incidental or intended.” (Brown & Abeywickrama, 2010)
Assessment and Learning
[Diagram relating tests, measurement, assessment, evaluation, and teaching] (Brown & Abeywickrama, 2010, p. 6)
Function of Assessment
Informal
Incidental, unplanned comments
Coaching
Impromptu feedback on homework
Nonjudgmental

Formal
Systematic and planned
Gives the teacher and students an appraisal of achievement
Tests and assignments
Function of Assessment
Formative
“evaluating students in the process of ‘forming’ their competencies and skills with the goal of helping them to continue that growth process”
Feedback on performance
Future continuation of learning

Summative
“aims to measure, or summarize, what the student has grasped”
End of course or unit
Looking back and taking stock
(Brown & Abeywickrama, 2010, p. 7)
How do you know if an assessment task is effective, appropriate, useful, or . . . “good”?
Practicality
Reliability
Validity
Authenticity
Washback
A PRACTICAL TEST . . .
stays within budgetary limits
has appropriate time constraints
has clear directions for administration
appropriately utilizes human resources
does not exceed available material resources
considers time and effort for design and scoring
(Brown & Abeywickrama, 2010, p. 26)
A RELIABLE TEST . . .
is consistent across two or more administrations
has clear directions for scoring/evaluation
has uniform rubrics for scoring/evaluation
features consistent application of rubrics by the scorer
is unambiguous to the test-taker
(Brown & Abeywickrama, 2010, p. 27)
Reliability cont’d
Rater Reliability
  Inter-Rater Reliability
  Intra-Rater Reliability
Student-Related Reliability
Test Administration Reliability
Test Reliability
A VALID TEST . . .
measures exactly what it proposes to measure
does not measure irrelevant or “contaminating” variables
relies on empirical evidence (performance)
samples performance from the test’s criterion (objective)
offers useful, meaningful information about the test-taker’s ability
is supported by a theoretical rationale or argument
(Brown & Abeywickrama, 2010, p. 30)
Validity cont’d
Content-Related Evidence
Criterion-Related Evidence
Construct-Related Evidence
Consequential Validity (Impact)
Face Validity
AN AUTHENTIC TEST . . .
uses language that is as natural as possible
has items that are contextualized rather than isolated
offers meaningful, relevant, interesting topics
provides thematic organization
includes real-world tasks
(Brown & Abeywickrama, 2010, p. 37)
A TEST THAT PROVIDES BENEFICIAL WASHBACK . . .
positively influences teachers’ teaching
positively influences learners’ learning
allows learners to adequately prepare
gives feedback for language development
is more formative than summative
provides conditions for peak performance
(Brown & Abeywickrama, 2010, p. 38)
Applying Principles to Creation of Assessment Tools
1. Practical test procedures?
2. Test is reliable?
3. Rater reliability?
4. Content validity?
5. Impact has been accounted for?
6. Procedure is “biased for best”?
7. Test tasks are authentic?
8. Test offers beneficial washback?
9. See Brown & Abeywickrama, Chpt. 2
Shawna Williams [email protected]