Improving Your Program Assessment Report
UNIVERSITY ASSESSMENT COMMITTEE
DEBRA BALLINGER AND ADAM MCGLYNN
Purpose
…to share what the UAC has learned from reviewing reports across campus.
…to learn from others’ experiences.
…to work on reports, ask questions, and get real‐time help before reports are due October 15.
…to share how to submit reports and seek assistance, if needed.
…to learn how the evaluation process is conducted.
What UAC has learned
Inconsistent formatting across departments makes reliable reviewing difficult. Some departments used previous reporting formats. (Cutting and pasting from accreditation reports is not sufficient.)
Many departments misunderstand the purpose and use of the reports: they are critical to writing university accreditation reports and to making informed decisions about university assessment, curriculum, and student learning. (*A new Culture of Assessment.)
There is still a lack of understanding about direct and indirect assessments, and that BOTH are recommended for all departmental assessments.
Departments often don’t realize that reviewers do not have backgrounds in the disciplines being assessed (too much professional jargon, too many assumptions). Write the report for “outside eyes”.
Some departments did not report student learning outcome goals, or reported goals that were not aligned to the University SLOs. In some reports, SLOs are not being assessed at all; instead, departments report on program assessments or accreditation reports containing information unrelated to student learning outcomes.
Today and the Future
Although you may learn what you need to know today to write your report, more questions may arise once you are back in your office or work group.
Good news: there is additional assistance available!
UAC Assessment Consultant Team (ACT) members can help, but they need “advance notice” before the report-writing deadline.
To learn more about the ACT or to schedule a meeting/session with an ACT member, contact Chris Dudley: [email protected]
Getting started: the Report Template
Handout – report template
I. Program Information
◦ Report on the LAST academic year (2013–2014) only.
◦ Please include e-mail and phone contact information for the Chair and the Department Assessment Coordinator.
II. Program-Specific Student Learning Outcomes (Educational Objectives) Assessed During the Last Academic Year
◦ If not all SLOs are assessed each year, please explain why and when they are assessed.
◦ List, in tabular or bullet form, the specific program SLOs and tie them to the University SLOs.
Report Template, cont.
III. Direct Measures Used
◦ Course-embedded assessments (tests, projects, oral presentations, graded homework)
◦ Rubrics
◦ Direct observations of internships with a rubric
◦ Portfolios
◦ External certification/licensure exams
◦ Other?
**The key is that assessments have observable behaviors with defined scoring rubrics.
◦ Rubrics ideally have been reviewed and agreed upon by all department faculty.
◦ Training in the use of assessments (such as internship evaluations) usually occurs.
Report Template, cont.
IV. Indirect Measures used
◦ Surveys (e.g., SurveyMonkey)*
◦ Student opinion polls*
◦ Faculty course evaluations
◦ Site supervisor/internship placement personnel feedback/questionnaires
◦ Other?
Using Course Grades as Assessments
Course grades are a valid assessment measure ONLY if departments clearly explain how those
grades are derived and which part of the grade is indicative of achievement of which of the
program’s SLOs.
◦ Show the scoring criteria for the portion of the grade derived from each SLO.
◦ Report the number and percentage of students passing the item tied to the SLO, not the number and percentage passing the course (a hypothetical worked example follows below).
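A hypothetical worked example (the numbers are invented for illustration only): if 40 students attempted the exam item aligned to one program SLO and 32 of them met the rubric’s passing score, report 32 of 40 (80%) for that item, even if 38 of 40 (95%) passed the course overall.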
Why?
Grading criteria are often neither clear nor the same for each professor teaching a course.
Grades often comprise a variety of components, some directly related and some not at all related to one or more specific student learning outcomes.
Grades typically “commingle” different assessments into a final score, and this is not a reliable or valid measure of achievement of any one specific student learning outcome.
Using APPENDICES
Appendices should only be used in rare circumstances.
Assessment documentation, results, and student learning outcomes should be included within
the report, not as a separate document.
Use appendices only when absolutely necessary, and refrain from repeated citations of appendices in the report (e.g., “See Appendix 1 for explanation”).
If an item needs to be explained, please do so in the report.
PLEASE REMEMBER – reviewers are volunteer faculty members who, like you, are extremely busy. The time available for reviews is limited, and clarity, simplicity, and completeness are helpful to their review process.
Report Template, cont.
V. Student Performance Outcomes
◦ Using the table, fill in the assessment name, target or passing scores, and the number of students assessed.
◦ Results (% who met or did not meet standards); see the example table below.
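A hypothetical illustration of the Section V table (the assessment names, targets, and figures are invented to show the format only, not taken from any department’s report):

Assessment (Direct/Indirect)        Target/Passing Score            # Assessed    % Met Standard
Capstone research paper (direct)    3 of 4 on each rubric trait     28            82%
Senior exit survey (indirect)       Mean of 4.0 on 5-point scale    30            73%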
VI. Key Findings
◦ Briefly summarize (no more than a page) the results and how they compare to the SLOs.
Report Template, cont.
VII. Describe the Process Used by Program Faculty to Discuss and Interpret Key Findings
◦ (Be sure to include how, when, and how often faculty met to discuss the results.)
◦ Discuss why faculty believe the results were as they were.
VIII. Changes Made as a Result of the Key Findings/Actions Taken
◦ Program changes being recommended?
◦ Changes in protocol?
◦ Changes in assessments?
◦ Changes in timeline for assessment?
◦ Follow-up from previous report findings and suggestions?
◦ Other?
Report Template, cont.
IX. Adjustments to/Deviations from the Department Assessment Plan
◦ Ex.: An on-site visit by an accreditation team precluded an assessment or added additional ones.
◦ Faculty changes and teaching assignments may have interrupted the assessment cycle.
◦ Other?
What the rating levels mean:
If your report was rated as:
Level 4 – The program report demonstrates a consistent and significant process for the use of assessment data to improve student learning. This process includes the use of multiple direct and indirect measures to assess student learning. This program has created a “Culture of Assessment,” where assessment has become institutionalized in order to achieve continual improvement of student learning. (*You are a model for other departments. :})
Level 3 – This program has developed an assessment program and has provided substantial evidence to demonstrate that it is employing assessment measures and using the data from these measures to improve student learning. This includes the use of both direct and indirect measures to assess student learning.
Level 2 – This program has developed an assessment program and has provided some evidence that it is employing assessment measures to improve student learning. However, the assessment report demonstrates an inconsistent or sporadic implementation of its assessment plan. This may include an unbalanced use of direct or indirect measures.
Level 1 – This program is in the early phases of developing an assessment program and has not yet provided evidence that it is employing assessment measures to improve student learning.
** Remember, the ratings are based on the report that was submitted, which is the only information the reviewers have on which to base their decisions/ratings.
Submitting the report
Reports must be electronically sent to the University Assessment
Committee e-mail address:
[email protected]
They are then uploaded to Trac Dat, where reviewers access and review them.
Once completed, reviews are returned to the Department Chair and the Department Assessment Coordinator.
Eventually, each department will have access to Trac Dat.
Group work
Leaders circulate to help and respond to questions.
Your questions?