Transcript Slide 1

May 7, 2015
Chicago, Illinois
1
The JRCERT promotes excellence in education and elevates the quality and safety of patient care through the accreditation of educational programs in radiography, radiation therapy, magnetic resonance, and medical dosimetry.
Laura S. Aaron, Ph.D., R.T.(R)(M)(QM), FASRT
• Chair
Stephanie Eatmon, Ed.D., R.T.(R)(T), FASRT
• 1st Vice Chair
Tricia Leggett, D.H.Ed., R.T.(R)(QM)
• 2nd Vice Chair
Darcy Wolfman, M.D.
• Secretary/Treasurer
Laura Borghardt, M.S., CMD
Susan R. Hatfield, Ph.D.
Bette A. Schans, Ph.D., R.T.(R)
Jason L. Scott, M.B.A., R.T.(R)(MR), CRA, FAHRA
Loraine D. Zelna, M.S., R.T.(R)(MR)
4
Leslie F. Winter
CEO
Jay Hicks
Executive Associate Director
Traci Lang
Assistant Director
Barbara Burnham
Special Projects Coordinator
Tom Brown
Accreditation Specialist
Jacqueline Kralik
Accreditation Specialist
Brian Leonard
Accreditation Specialist
Radiography – 619
Radiation Therapy – 76
Magnetic Resonance – 8
Medical Dosimetry – 18
7
Total Considerations – 378
• Initial – 9
• Continuing – 80
• Progress Reports – 29
• Interim Reports – 151
• Other – 109
8
8 Year – 59
5 Year – 13
3 Year – 6
2 Year – 2
Probation – 5
Involuntary Withdrawal – 3
9
11
“Educational values should drive not only what we choose to assess but also how we do so. Where questions about educational mission and values are skipped over, assessment threatens to be an exercise in measuring what is easy, rather than a process of improving what we really care about.” (New Leadership Alliance, 2012)
12
A process that provides information to participants, allowing clear evaluation of the process, an understanding of its overall quality, and the opportunity to identify areas for improvement. (New Leadership Alliance, 2012)
13
14
The ongoing process of:
1. Establishing clear, measurable, expected SLOs
2. Systematically gathering, analyzing, and interpreting evidence to determine how well students’ learning matches expectations
3. Using the resulting information to understand and improve student learning
4. Reporting on processes and results
15
• Making your expectations explicit and public
• Using the resulting information to document, explain, and improve performance
16
• Information-based decision making
• “The end of assessment is action”
• Do not attempt to achieve the perfect research design… gather enough data to provide a reasonable basis for action.
(Walvoord, 2010)
17
• Compliance with external demands
• Gathering data no one will use
• Making the process too complicated
18
• Course grades cannot pinpoint concepts that students have or have not mastered
• Grading criteria
  ◦ Attendance, participation, bonus points
• Inter-rater reliability or vague grading standards
• Not holistic
• Do grades have a place in an assessment program?
19
[Curriculum map table: courses RAD 150, RAD 153, RAD 154, RAD 232, RAD 234, RAD 250, and RAD 255 mapped against SLO 1–SLO 4, with each cell carrying a code from the legend below]

“I” = introduce
“R” = reinforce, practice
“M” = mastery
“A” = assessed for program assessment

Alternative emphasis scale:
0 = no emphasis
1 = minor emphasis
2 = moderate emphasis
3 = significant emphasis
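A map like this can also be kept as plain data so coverage gaps surface automatically. Below is a minimal Python sketch, not part of the original presentation; the course numbers, SLO names, and the rule that every SLO needs an “I,” an “M,” and an “A” somewhere in the sequence are illustrative assumptions.

# Hypothetical curriculum map: course -> {SLO: set of legend codes}.
CURRICULUM_MAP = {
    "RAD 150": {"SLO 1": {"I"}},
    "RAD 250": {"SLO 1": {"M"}, "SLO 2": {"M"}},
    "RAD 255": {"SLO 1": {"M", "A"}, "SLO 2": {"M", "A"}},
}

def coverage_gaps(curriculum_map, slos=("SLO 1", "SLO 2")):
    """List SLOs that are never introduced (I), mastered (M), or assessed (A)."""
    gaps = []
    for slo in slos:
        seen = set()
        for codes_by_slo in curriculum_map.values():
            seen |= codes_by_slo.get(slo, set())
        for required in ("I", "M", "A"):
            if required not in seen:
                gaps.append(f"{slo}: no course marked '{required}'")
    return gaps

print(coverage_gaps(CURRICULUM_MAP))  # -> ["SLO 2: no course marked 'I'"]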
20
Student Learning – what students will do or achieve:
• Knowledge
• Skills
• Attitudes

Program Effectiveness – what the program will do or achieve:
• Certification Pass Rate
• Job Placement Rate
• Program Completion Rate
• Graduate Satisfaction
• Employer Satisfaction
21
Formative Assessment
• Gathering of information during the progression of a program.
• Allows for student improvement prior to program completion.

Summative Assessment
• Gathering of information at the conclusion of a program.
22
23
Mission Statement

Our program is an integral part of the School of Allied Health Professions and shares its values. The program serves as a national leader in the education of students in the radiation sciences and provides learning opportunities that are innovative and educationally sound. In addition to exhibiting technical competence and the judicious use of ionizing radiation, graduates provide high-quality patient care and leadership in their respective areas of professional practice.

Consideration is given to the effective use of unique resources and facilities. Strong linkages with clinical affiliates and their staff are vital to our success. Faculty and staff work in a cooperative spirit in an environment conducive to inquisitiveness and independent learning to help a diverse student body develop to its fullest potential. The faculty is committed to the concept of lifelong learning and promotes standards of clinical practice that will serve students throughout their professional careers.
24
Mission Statement

The mission of our program is to produce competent entry-level radiation therapists.
25
Goals
• Broad statements of student achievement that are consistent with the mission of the program
• Should address all learners and reflect clinical competence, critical thinking, communication skills, and professionalism
26
Goals
• Contain assessment tools
• Contain increases in achievement
• Contain program achievements
27
Goals
• The program will prepare graduates to function as entry-level ___.
• The faculty will assure that the JRCERT accreditation requirements are followed.
• Students will accurately evaluate images for diagnostic quality.
• 85% of students will practice age-appropriate patient care on the mock patient care practicum.
28
Program Effectiveness Measures (Outcome → Measurement Tool)
• Graduates will pass the national certification exam on the 1st attempt. → ARRT or MDCB 1st Time Pass Rates
• Of those pursuing employment, graduates will be gainfully employed within 12 months post-graduation. → Graduate Survey (Question 18)
• Students will complete the program within 150% of the stated program length. → Retention Rate
• Students will be satisfied with their education. → Graduate Survey (Question 1)
• Employers will be satisfied with the graduate’s performance. → Employer Survey (Question 5)
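Each of these measures reduces to simple arithmetic over graduate records. The sketch below uses invented records and field names purely for illustration; the rates a program must actually meet are set by the JRCERT standards, not by this example.

# Hypothetical graduate records for one cohort (invented for illustration).
graduates = [
    {"passed_first_attempt": True,  "seeking_job": True,  "employed_12mo": True},
    {"passed_first_attempt": True,  "seeking_job": False, "employed_12mo": False},
    {"passed_first_attempt": False, "seeking_job": True,  "employed_12mo": True},
]

# 1st-attempt certification pass rate over the whole cohort.
pass_rate = sum(g["passed_first_attempt"] for g in graduates) / len(graduates)

# Job placement rate counts only graduates actively pursuing employment.
seeking = [g for g in graduates if g["seeking_job"]]
placement_rate = sum(g["employed_12mo"] for g in seeking) / len(seeking)

print(f"1st-attempt pass rate: {pass_rate:.0%}")              # 67%
print(f"12-month job placement rate: {placement_rate:.0%}")   # 100%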
29
Specific
Measurable
Attainable
Realistic
Targeted
31
SLOs
Students will ___(action verb)___ ___(something)___.
32
33
KNOWLEDGE: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline

COMPREHENSION: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate

APPLICATION: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write

ANALYSIS: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Inspect, Inventory, Question, Separate, Summarize, Test

SYNTHESIS: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write

EVALUATION: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate

34
The same verb chart, annotated by level:
• Lower division course outcomes: Knowledge and Comprehension verbs
• Upper division course / program outcomes: Application, Analysis, Synthesis, and Evaluation verbs

35
36
SLOs
• Students will be clinically competent.
• Students will complete 10 competencies with a grade ≥ 75% in RAD 227.
• Graduates will be prepared to evaluate and interpret images for proper evaluation criteria and quality.
• Students will demonstrate the ability to operate tube locks.
37
The most important criterion when selecting an assessment method is whether it will provide useful information – information that indicates whether students are learning and developing in ways faculty have agreed are important. (Palomba & Banta, 2000)
38
Direct Assessment Measurements
◦ Demonstrate learning
◦ Performance learning allows students to demonstrate their skills through activities

Indirect Assessment Measurements
◦ Provides reflection about learning
39
Direct
◦ Rubrics
◦ Unit or Final Exams
◦ Capstone Courses
◦ Portfolios
◦ Case Studies
◦ Embedded Questions

Indirect
◦ Surveys (Graduate, Employer)
◦ Self-Evaluations
◦ Exit Interviews
◦ Focus Groups
◦ Reflective Essays
40
From (Unconsciously): Making the program look good on paper.
41
42
You cannot determine how to improve the
program until you know how well the
students have learned.
43
• MUST measure the outcome
• Should represent your students’ achievements as accurately as possible
• Should assess not only whether the students are learning, but how well
44
• A point of reference from which measurements may be made
• Something that serves as a standard by which others may be measured or judged
45
• Standard of Performance
• Realistic yet Attainable
• External or Internal
• No Double Quantifiers (Qualifiers)
• Rating Scale
46
Tool | Benchmark | Timeframe
• Capstone Course – Final Portfolio | ≥ 94.5 pts (100 scale) | 5th Semester
• Clinical Evaluation Form (Section 2) | ≥ 4.0 (5.0 scale) | 5th Semester
• Debate Rubric | ≥ 31.5 points (35 possible points) | 4th Semester
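Checking a benchmark is just comparing a summary statistic with the stated threshold. A minimal sketch using the Clinical Evaluation Form row above; the individual scores are invented for illustration:

# Hypothetical 5th-semester Clinical Evaluation Form (Section 2) scores, 5.0 scale.
scores = [4.6, 4.2, 3.9, 4.8, 4.4]
BENCHMARK = 4.0  # threshold from the assessment plan above

average = sum(scores) / len(scores)
status = "benchmark met" if average >= BENCHMARK else "benchmark not met"
print(f"Cohort average: {average:.2f} ({status})")  # Cohort average: 4.38 (benchmark met)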
47
48
• Report the actual data
  ◦ On the assessment plan
  ◦ On a separate document
• Should facilitate comparison
  ◦ Comparison of cohorts
  ◦ Comparison of students attending a certain clinical setting
• Show dates
50
• What do the data say about your students’ mastery of subject matter, of research skills, or of writing and speaking?
• What do the data say about your students’ preparation for taking the next career step?
• Do you see areas where performance is okay, but not outstanding, and where you’d like to see a higher level of performance?

UMass-Amherst, OAPA: http://www.umass.edu/oapa/oapa/publications/
51
Primary Uses
• Curriculum Review
• Requests to Curriculum Committee
• Accreditation Reports and Reviews

Secondary Uses
• Recruiting
• Alumni Newsletter
• Other publications
• Grants and other Funding

UMass-Amherst, OAPA
52
• Identify benchmarks met
  ◦ Sustained effort
  ◦ Monitoring
  ◦ Evaluate benchmarks
• Identify benchmarks not met
  ◦ Targets for improvement
  ◦ Study the problem before trying to solve it!!
  ◦ Evaluate benchmark
• Identify 3 years of data (trend)
53
Data Collection and Analysis?

Outcome: Students will demonstrate radiation protection.
Benchmark: 85% of students will average a score of ≥ 5.0 (6.0 scale)
Results: 100% of students scored 5 or better.
Analysis/Action: Benchmark met.

Outcome: Students will select appropriate technical factors.
Benchmark: 75% of students will average a score of 85% or better
Results: 100% of students scored 85% or better.
Analysis/Action: Continue to monitor.

Outcome: Employers will find our graduates proficient in radiation protection and safety.
Benchmark: 80% of employer surveys will rate grads as Above Average or Excellent
Results: 100% of employers rate our grads as being Above Average or Excellent in proficiency of radiation protection skills.
Analysis/Action: No Action Needed.
54
2009 – 2010 Data Collection and Analysis Example

Outcome: Students will demonstrate radiation protection.
Benchmark: ≥ 5.0 (6.0 scale)
Results: 5.28
Analysis/Action: Benchmark met. For the past 2 years this result has increased (07/08: 4.90; 08/09: 5.15). This may be attributed to an increased emphasis on radiation protection throughout this semester.

Outcome: Graduates will manipulate the ‘typical’ examination protocol to meet the needs of a trauma patient.
Benchmark: ≥ 4.0 (5.0 scale)
Results: 3.40
Analysis/Action: Benchmark not met. This result continually improves with each cohort (07/08: 3.25; 08/09: 3.33). The increase in this result could be attributed to the increased amount of lab time throughout the curriculum. Continue to monitor.
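The year-over-year comparison in this example is easy to script so each cohort’s report is produced the same way. A minimal sketch using the numbers from the slide; the tooling itself is an illustrative assumption, not part of the presentation:

# Yearly cohort averages for "Students will demonstrate radiation protection."
results = {"07/08": 4.90, "08/09": 5.15, "09/10": 5.28}
BENCHMARK = 5.0  # 6.0 scale, from the assessment plan

for year, avg in results.items():
    status = "met" if avg >= BENCHMARK else "not met"
    print(f"{year}: {avg:.2f} (benchmark {status})")

# Flag the trend: has the result increased with every cohort?
values = list(results.values())
print("Trend: increasing" if all(a < b for a, b in zip(values, values[1:]))
      else "Trend: mixed")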
55
56
When it is ongoing.
58
• Is cumulative
• Is fostered when assessment involves a linked series of activities undertaken over time
• May involve tracking progress of individuals or cohorts
• Is done in the spirit of continuous improvement
59
• The process of drawing conclusions should be open to all those who are likely to be affected by the results – the communities of interest.
• Analysis of the assessment data needs to be shared and formally documented, for example in meeting minutes from the Assessment or Advisory Committee.
60
• Evaluate the assessment plan itself to assure that assessment measures are adequate.
• Evaluation should assure that assessment is effective in measuring student learning outcomes.
• Document with meeting minutes.

Objective 5.5
61
• Is our mission statement still applicable for what the program is trying to achieve?
• Are we measuring valuable outcomes that are indicative of the graduate we are trying to produce?
• Do we like the plan?
• Does the plan provide us with the data that we are seeking?
• Are the student learning outcomes still applicable?
• Are the SLOs measurable?
62
• “Do we want to use new tools to collect data for the SLOs?”
• “Are our benchmarks appropriate?”
• “Do our benchmarks need adjustment?”
• “Are the appropriate personnel collecting the data?”
• “Are the data collection timeframes appropriate?”

Make recommendations based on the answers.
63
Keeping Your Documentation
For each year:
1. Copy of the assessment plan.
2. Actual tools for each one identified in the plan – do calculations on this tool.
3. Example of each tool (blank).
4. Meeting minutes that document analysis and sharing of the SLO and PED data.
5. Documentation of examples of changes that were implemented as a result of data gleaned from the assessment process.
6. Meeting minutes documenting that the assessment plan has been evaluated to assure that measures are adequate and that the process is effective in measuring SLOs.
64
20 North Wacker Drive,
Suite 2850
Chicago, IL 60606-3182
(312) 704-5300
[email protected]
www.jrcert.org
65
Thank you for supporting excellence in education and quality patient care through programmatic accreditation.
66
Allen, M. J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker Publishing Company, Inc.
New Leadership Alliance. (2012). Assuring quality: An institutional self-assessment tool for excellent practice in student learning outcomes assessment. Washington, DC: New Leadership Alliance for Student Learning and Accountability.
Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass.
Walvoord, B. E. (2010). Assessment clear & simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.
Wonderlic Direct Assessment of Student Learning. (n.d.). YouTube. Retrieved May 6, 2014, from http://www.youtube.com/watch?v=JjKs8hsZosc
67