Assessment & Technology - University of Hawaii at Manoa


Assessment & Technology

UH-M

COLLEGE OF EDUCATION

COE Outreach & Technology

Electronic Exhibit Room

• Systematic assessment of evidence of student learning at multiple points in program
• Program assessment data compiled to an internal website
• NCATE reviewers access at their leisure
• Data remain available between reviews
• Easy for program faculty to maintain

Program Assessment

 Assessed by:
– Candidates (exit surveys, course evaluations)
– Alumni (survey, focus groups)
– Employers
– Mentor Teachers
– COPR Process
– Learned Societies

Candidate Learning Outcomes Review

Assessing Programs by Assessing Candidates

Candidate Assessment

– Evidence Collection = Portfolio
  Grade Reports
  Exam Scores
  Work Samples
  Faculty Observation Summaries
  Students’ Work Samples
– Candidate Portfolio Tools
  PowerPoint (hyperlinks, external files, branching)
  Task Stream (online)
  CD-R

Assessing Learning Outcomes

1. Define Program Objectives
2. Define Points of Measurement
3. Define Evidence for Objectives
4. Define Rubric for Assessing Evidence
5. Delineate Who Evaluates and When

Example

(part 1: Program defines Objectives)

 ABC Program

Objective 1:
– (Knowledge) Candidates know . . .
– (Skill) Candidates are able to . . .
– (Disposition) Candidates exhibit . . .

Objective 2:
Objective 3:
Objective 4:

SECONDARY PROGRAM OBJECTIVES (curriculum matrix: objectives mapped to courses)

Courses (columns): EDUC 401 Introduction to Secondary Education; EDPSYC 311/611 Educ Psychology; EDUC 402 Field; EDUC 404 Methods; TECS 440 Multicultural; SPED 445 Special Education; ETEC 414 Educ Technology; EDUC 405-6 Student Teaching

Objectives (rows): Professional Legal Responsibilities; Foundations of Secondary Education

An "x" marks each course in which an objective is addressed.

Example

(part 2: Program Chooses Points of Measurement)

Program will assess candidates:
– Beginning (defined: immediately upon admission)
– Middle (defined: conclusion of EDUC XXXX course and/or prior to student teaching)
– End (defined: conclusion of field experience XXXX)

Example

(part 3: Program assigns evidence to objectives)

 Objective 1: Professional Legal Responsibilities

The teacher candidate demonstrates an understanding of (knowledge), ability to apply (skill), and ability to model (disposition) legal responsibilities expected of professional educators.

Sample artifacts

Program Objective 1: Professional Legal Responsibilities. The teacher candidate demonstrates an understanding of and ability to apply and model legal responsibilities expected of professional educators.

Suggested Artifact Evidence:
 Case study responses
 Reflective journals and logs
 Performance evaluation
 Evidence of ability to maintain required reports, records, and legal documents
 IEP from a case study report

Rubric Scale: 1 2 3

Example

(part 4: Program defines rubric scale for evidence)

3 Target: Evidence reflects in-depth knowledge and understanding of standard; outstanding data and evidence of application

2 Acceptable: Evidence indicates knowledge and understanding of standard; satisfactory data and evidence of application

1 Unacceptable: Evidence shows little or inadequate knowledge of standard; limited data and evidence of application

Example

(part 5: Program states who will measure and when)

 Candidate Outcomes Review Faculty assigned to review candidate outcomes
 Review Committee determines program completion for candidate
 Candidate outcomes aggregated
 Summary data provided to Associate Dean on cohort

Example

(part 6: Composite candidate scores defined, measured)

 Summarize each Candidate’s Assessment

Mid-Point Assessment:
– e.g. Overall Unacceptable: 1 or more unacceptables
– e.g. Overall Acceptable: 0 unacceptables, <4 superiors
– e.g. Overall Superior: 0 unacceptables, 5 or more superior scores
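The composite cut-offs above are a simple decision rule, and could be sketched in code. This is a minimal illustration, not the program's actual tool: it assumes the 1-3 rubric scale from part 4 (1 = unacceptable, 3 = superior/target) and the example thresholds on this slide, and `overall_rating` is a hypothetical name.

```python
def overall_rating(scores):
    """Composite mid-point rating for one candidate from per-objective rubric scores.

    Assumes scores use the part-4 scale: 1 = unacceptable, 2 = acceptable,
    3 = superior/target. Thresholds follow the example cut-offs on the slide.
    """
    unacceptables = sum(1 for s in scores if s == 1)
    superiors = sum(1 for s in scores if s == 3)
    if unacceptables >= 1:
        return "Overall Unacceptable"
    if superiors >= 5:
        return "Overall Superior"
    # The slide's example cut-offs leave exactly 4 superiors undefined
    # ("<4 superiors" vs "5 or more"); this sketch treats that case as acceptable.
    return "Overall Acceptable"
```

Note that one unacceptable score on any objective dominates the composite, regardless of how many superior scores the candidate earned elsewhere.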

Example

(part 7: Summary data submitted)

ABC Program Midpoint Assessment 2004

Incomplete: 1
Overall Unacceptable: 1
Overall Acceptable: 23
Overall Superior: 12
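The aggregation step (candidate outcomes rolled up into cohort summary data for the Associate Dean) amounts to counting composite ratings per category. A minimal sketch, using illustrative data rather than the actual 2004 counts:

```python
from collections import Counter

# Hypothetical composite ratings for one cohort (illustrative only).
ratings = [
    "Overall Acceptable", "Overall Superior", "Overall Acceptable",
    "Overall Unacceptable", "Incomplete", "Overall Acceptable",
]

# Tally how many candidates fall into each summary category.
summary = Counter(ratings)

for category in ("Incomplete", "Overall Unacceptable",
                 "Overall Acceptable", "Overall Superior"):
    print(f"{category}: {summary.get(category, 0)}")
```

Because the summary keeps only counts per category, the same report can be regenerated each review cycle and compared across the 5-year measurement period without retaining individual candidate records in the public view.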

NCATE Review Website (mock-up)

Measurements: 5-year period

Use of Technology

 Candidate use to collect and present evidence
 Program use to assess candidate learning
 Program use of aggregated data for program review
 College use to maintain data over time for accreditation purposes

Challenges

 Requires a shift in thinking: from grades to authentic assessment of learning outcomes
 Program objectives must be made explicit
 Faculty agreement on rubrics and scales
 Need to identify ways to manage the process
 Technology must be helpful, not burdensome