
Beginning, Developing,
Exemplary: Using a Targeted
Grading Rubric to Assess the
Information-Seeking Skills of
First-Year Medical Students
Jack Bullion, MFA, MLS
Instruction Librarian, Gibson D. Lewis Health Science Library
Christine Savi, PhD
Assessment Specialist
University of North Texas Health Science Center
Fort Worth, Texas
Presentation Outline
• Overview of the Medical Informatics course
• Grading challenges for librarians
• Implementation of grading rubrics
• Examples: “Research Writeup” Rubric
• Results of rubric implementation
• Continuing challenges
Course Background:
Medical Informatics
• Added to the TCOM medical education curriculum for the 2007-2008 academic year
– Required for all 1st-year D.O. students
• Pass/Fail
• Blended course
– Blackboard assessments
– 4 face-to-face lectures, 2 online learning modules
• 2010-11: integrated with the Introduction to Research Methods course
Grading Challenges
• The “Research Portfolio” (presented at the SCC/MLA 2009 Meeting, Tulsa, OK) solved some problems but created others1
• Some groups were clearly better than others: what factors separated them? What made the better ones superior?
• Lack of a method for delivering feedback to students
HOT: Higher Order Thinking
• Campus-wide initiative to identify learning outcomes and evaluate students’ work
– Quality Enhancement Plan (QEP)
• Higher-order thinking/critical thinking: “something more sophisticated than recit[ing] facts memorized from lectures or the textbook”2
Teaming with the
Center for Learning & Development
• Responsible for implementing Quality Enhancement Plan (QEP) initiatives
• Committed to faculty development activities targeting the application of higher-order thinking strategies, technologies, and assessment techniques
– Instructional integration and implementation assistance
– Assessment of instrument effectiveness and data analysis
– Summative and formative feedback systems
Using Rubrics to Assess Learning
• Rubric: a scoring tool listing criteria for varying levels of performance on a task
– Narrative statements describe each level of quality
– Defines qualitative differences in levels of targeted performance
– Contains levels that apply to the indicators for each strategy
• For each indicator, there is a description of the levels of performance based on predetermined criteria
• Raters use these descriptions to determine the level of accomplishment on each indicator, e.g., ratings of “Beginning”, “Developing”, and “Exemplary”
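To make this structure concrete, here is a minimal sketch (in Python) of an analytic rubric modeled as plain data: each indicator carries a narrative descriptor for the “Beginning”, “Developing”, and “Exemplary” levels, and a rater records one level per indicator. The indicator shown and its descriptor text are hypothetical placeholders, not the actual UNTHSC rubric language.

# Minimal sketch of an analytic rubric as a data structure.
# The level names come from the slide above; the indicator and its
# descriptors are hypothetical placeholders, not the real rubric text.

LEVELS = ("Beginning", "Developing", "Exemplary")

rubric = {
    "Search Strategy": {
        "Beginning": "Single keyword search; no controlled vocabulary.",
        "Developing": "Combines search terms with Boolean operators.",
        "Exemplary": "Uses MeSH terms, Boolean logic, and synonyms.",
    },
    # ...one entry per indicator, each with three descriptors
}

def descriptor(indicator, level):
    """Return the narrative statement a rater matched the work to."""
    if level not in LEVELS:
        raise ValueError("Unknown level: " + level)
    return rubric[indicator][level]

print(descriptor("Search Strategy", "Developing"))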
Why Use Rubrics?
• Used to:
– Define quality within the area being rated
– Articulate the same target goals for improvement for everyone (consistency)
– Track change or improvement over time
– Provide a common set of definitions across all outcomes
– Guide self-assessment and planning
Step 1: Identify Learning Outcomes
• Goals of the course
– Students should be able to recognize when information is needed and have the ability to efficiently and effectively locate, evaluate, and apply the information for a specific purpose (Research Writeup)
– Students should gain the knowledge and skills needed to locate, synthesize, and present current best evidence in a clinical setting (Student Grand Rounds Presentation)
• Core competencies
– ACRL Information Literacy Competency Standards for Higher Education
– Blumenthal JL, Mays BE, Weinfeld JM, Banks MA, Shaffer J. Defining and assessing medical informatics competencies. Med Ref Serv Q. 2005;24:95-102.
Step 2:
Determine Which Rubric to Use
• Holistic
– “scores the overall process as a whole”3
– Provides “only limited feedback”
• Analytic
– “divides product or performance into essential traits or dimensions so that they can be judged separately”
– Information-seeking is a step-by-step, linear process, so it divides naturally into such traits
Research Writeup (rubric example slide)
Step 3:
Identify the Areas Being Assessed
• Citation
• Information Need
• Search Strategy
• Limits
• Justification
• Credibility of Resource
Step 4: How are these components
used to classify performance?
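Since the course is pass/fail, one plausible way to classify performance is to score each component’s rating and compare the total against a threshold. The sketch below illustrates that roll-up using the six components from Step 3; the point values and the passing threshold are assumptions chosen for illustration, not the actual UNTHSC mapping.

# Hypothetical roll-up of per-component ratings into a pass/fail
# classification. The component names come from Step 3; the point
# values and passing threshold are illustrative assumptions.

SCORES = {"Beginning": 1, "Developing": 2, "Exemplary": 3}

COMPONENTS = ["Citation", "Information Need", "Search Strategy",
              "Limits", "Justification", "Credibility of Resource"]

def classify(ratings, pass_fraction=0.7):
    """Pass if the writeup earns at least pass_fraction of the
    maximum possible points across all six components."""
    earned = sum(SCORES[ratings[c]] for c in COMPONENTS)
    maximum = len(COMPONENTS) * max(SCORES.values())
    return "Pass" if earned / maximum >= pass_fraction else "Fail"

example = {c: "Developing" for c in COMPONENTS}
example["Search Strategy"] = "Exemplary"
print(classify(example))  # 13/18, about 0.72 -> "Pass" under these assumptions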
Results of Implementation
• Clarifies “what is expected and what is valued” for students4
• Gives teaching faculty objective, consistent outcomes to focus on
• Streamlines the grading process
• Enhances the library’s role in the campus-wide QEP project
• Easily adaptable for one-shot searching assignments
• Other UNTHSC librarians can use the rubrics to evaluate students (inter-rater reliability)
Inter-rater Reliability = Consistency
• Ensure all grading participants agree on the rubric
• Verify agreement using sample papers from the pilot group
• Review and/or revise the rubric based on pilot feedback
• Continue testing and revising to maintain consistency
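Percent agreement and Cohen’s kappa are standard ways to quantify how well two raters agree, which is one way to carry out the “verify using sample papers” step above. The sketch below computes both for two raters scoring the same pilot papers; the ratings shown are invented for illustration.

# Quantifying agreement between two raters who scored the same set
# of pilot papers. The ratings below are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    levels = set(rater_a) | set(rater_b)
    expected = sum(freq_a[lv] * freq_b[lv] for lv in levels) / n ** 2
    return (observed - expected) / (1 - expected)

a = ["Exemplary", "Developing", "Developing", "Beginning", "Exemplary"]
b = ["Exemplary", "Developing", "Beginning", "Beginning", "Exemplary"]
agreement = sum(x == y for x, y in zip(a, b)) / len(a)
print("Percent agreement: {:.0%}".format(agreement))       # 80%
print("Cohen's kappa: {:.2f}".format(cohens_kappa(a, b)))  # ~0.71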
Continuing Challenges
• Rubrics must evolve to meet student/faculty needs
• The rubric focuses on the “science, not art” of searching5
• Librarians may require more training “to consistently and accurately use rubrics”
• Clinical faculty involvement and collaboration
– Feedback on the Student Grand Rounds grading rubric
Most Recent Research Using Rubrics
Library Science
• Information literacy
– Oakleaf (2009); Knight (2006); Buchanan, Luck, & Jones (2002)
• Student learning outcomes
– Yoshina & Harada (2007); Avery (2003); Choinski, Mark, & Murphey (2003)
Medical Education
• Facilitating problem-solving
– Saunders et al. (2003); Macklin (2001)
• Validating evidence-based practices
– Boulet et al. (2006); O’Sullivan et al. (2004); Hunt, Haidet, & Coverdale (2003); Ramos, Schafer, & Tracz (2002)
Collaboration with Other Departments
• Extension of the Faculty-Librarian Collaboration model
– Plan ahead when considering staff, technology, facilities, and time
– Understand the curriculum, course content, and supporting resources
– Support the faculty member’s role
– Utilize other teams or departments
• Incorporate strengths
• Utilize shared technologies and other resources
– Implement pilot projects
• Assemble representative test groups
• Collect data
• Obtain feedback and revise instruments
– Analyze data and employ further revisions (where needed)
Success is a work in progress
References
1. Bullion J. “How did you search for this particular item?”: using a “research portfolio” to assess the information-seeking skills of first-year medical students. Presented at: SCC/MLA 2009 Meeting; 2009; Tulsa, OK.
2. Bissell A, Lemons P. A new method for assessing critical thinking in the classroom. BioScience. 2006;56(1):66-72.
3. Oakleaf M. Using rubrics to collect evidence for decision-making: what do librarians need to learn? Paper presented at: 4th International Evidence Based Library & Information Practice Conference; May 2007; Chapel Hill-Durham, NC.
4. Callison D. Rubrics. School Library Media Activities Monthly. 2000;17(2):40-42.
5. Oakleaf M. Using rubrics to assess information literacy: an examination of methodology and interrater reliability. Journal of the American Society for Information Science & Technology. 2009;60:969-983.