
ASSESSMENT TOOL OR
EDUTAINMENT TOY
USING PERSONAL RESPONSE DEVICES FOR
LIBRARY INSTRUCTION ASSESSMENT
Patrick Griffis
August 6, 2008
Outline

Overview of Personal Response Systems/Clickers

Scalable Options of Clicker Systems

Benefits/Drawbacks for Using Clickers

Using Clickers for Assessment

Experimenting with Clickers/Lessons Learned
Personal Response Systems ‘Clickers’

System allowing instructors to pose questions to students, which students can answer anonymously
The anonymous response feature encourages participation from students, which enhances student engagement in the classroom and provides instructors with immediate feedback
Personal Response Systems were developed as a tool to increase student feedback and engagement in large class settings
Scalable Options: Common Model

Clicker Software + Receiver + Clicker Devices
  Standard model
  Classroom set of devices and receiver
  Instructor responsible for receiver and software
  Requires software download to instructor’s computer
Scalable Options: Clicker Only

Clicker Devices Only
  LCD screen handheld receiver
  Computer and projector not required
  Answers from student keypads displayed and stored on handheld receiver and can be exported via USB to a computer (see the sketch below)
  Easiest system to implement / can be used anywhere
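Once responses are off the handheld receiver, a few lines of scripting can summarize them. A minimal sketch, assuming a hypothetical CSV export named responses.csv with a keypad_id column followed by one column per question; the real export format varies by vendor:

```python
# Illustrative sketch only: the receiver's actual export format is vendor-specific.
# Assumes a hypothetical CSV file with rows like  keypad_id,Q1,Q2,Q3
import csv
from collections import Counter

def tally_by_question(path: str) -> dict[str, Counter]:
    """Count how many keypads chose each answer for every question column."""
    tallies: dict[str, Counter] = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for question, answer in row.items():
                if question == "keypad_id":
                    continue
                tallies.setdefault(question, Counter())[answer.strip().upper()] += 1
    return tallies

if __name__ == "__main__":
    # Print the distribution of answers for each question in the export.
    for question, counts in tally_by_question("responses.csv").items():
        print(question, dict(counts))
```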
Scalable Options: Software Only

Clicker Software + Virtual Response Keypads
  Receiver not required and Clicker Devices not required
  Local Area Network and/or Wireless Network
  Student computers connect via broadcasted IP Address (see the sketch below)
  Requires software download to instructor computer and Virtual Clicker download to student computers
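To make the virtual-clicker idea concrete, here is a minimal sketch of how question broadcast and answer collection could work over a local network. This is not any vendor’s actual protocol; the port numbers and message format are assumptions for illustration only:

```python
# Illustrative only: a toy "virtual clicker" over UDP on a local network.
# Ports and message format are assumptions, not a vendor protocol.
import socket

BROADCAST_PORT = 50007  # hypothetical port chosen for illustration
ANSWER_PORT = 50008     # hypothetical port chosen for illustration

def broadcast_question(text: str) -> None:
    """Instructor side: send the question text to every host on the local subnet."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(text.encode("utf-8"), ("<broadcast>", BROADCAST_PORT))

def send_answer(instructor_ip: str, choice: str) -> None:
    """Student side: send a single-letter answer back to the instructor's machine."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(choice.encode("utf-8"), (instructor_ip, ANSWER_PORT))

def collect_answers(expected: int, timeout: float = 60.0) -> list[str]:
    """Instructor side: gather answers until `expected` arrive or the timeout passes."""
    answers: list[str] = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind(("", ANSWER_PORT))
        s.settimeout(timeout)
        try:
            while len(answers) < expected:
                data, _addr = s.recvfrom(64)
                answers.append(data.decode("utf-8").strip().upper())
        except socket.timeout:
            pass
    return answers
```

In practice the vendor’s virtual keypad software handles this exchange behind the downloads noted above; the sketch only shows why a receiver and physical devices are not required.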
Alternative Web Polling Tools
  DyKnow Vision (Licensed Software - One-Time Cost)
  SurveyMonkey (Subscription Web Survey Tool)
  WordPress Polling (Add-On to WordPress Blog Software)
Tool: Clicker Benefits
  Can increase student interactivity and engagement in bibliographic instruction with a ‘workshop’ feel
  Can help the instructor assess what students already know coming into the session, allowing instruction to be tailored to what students don’t know (see the sketch below)
  Can help the instructor assess what students have retained/learned in a session as well as areas that need reinforcement
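As an illustration of tailoring instruction from a pre-session poll, the sketch below applies a simple skip-or-emphasize rule to hypothetical tallies. The question labels, counts, and 70% threshold are assumptions, not data from the presentation:

```python
# Illustrative only: turning a pre-session clicker tally into a teaching decision.
# The answer counts and the 70% threshold are assumed values for the example.
PRE_SESSION_TALLY = {
    "Q1 (identify a scholarly article)": {"correct": 18, "total": 24},
    "Q2 (build a keyword search)":       {"correct": 9,  "total": 24},
    "Q3 (cite a source)":                {"correct": 21, "total": 24},
}
SKIP_THRESHOLD = 0.70  # assumed rule of thumb

for question, counts in PRE_SESSION_TALLY.items():
    share = counts["correct"] / counts["total"]
    action = "skim or skip" if share >= SKIP_THRESHOLD else "emphasize"
    print(f"{question}: {share:.0%} correct -> {action}")
```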
Toy: Clicker Drawbacks
  Learning curve for students can take time away from an instruction session that might already be hurting for time
  Some instructors feel that Clickers are another of many slick gadgets/technologies that already serve to distract students in the classroom
  Clickers good, cell phones bad: a mixed message for students
Toy: Clicker Drawbacks
  Clicker use is more expensive for Libraries, which bear the burden of purchasing Clicker Devices
    It is more feasible for classes that meet regularly to require students to purchase clicker devices
    It costs more to use Clickers for Library Instruction Sessions than for Term Courses, and Library Instruction Sessions have a much smaller window of opportunity to use these devices
  Nascent use creates a need to experiment for feasibility
View from Library Instructors: Tool


“A Personal Response System (PRS), or clickers, is an
effective method for gathering assessment data
during library instruction sessions” (Page 258)
“A PRS is a useful tool for gathering data and
customizing instruction to student needs” (Page 260)
Julian, Suzanne, and Kimball Benson. "Clicking your way to library
instruction assessment: Using a Personal Response System at Brigham
Young University." College and Research Libraries News 69, no. 5 (2008):
258-61.
View from Library Instructors: Toy


“New technology is entertaining but can quickly
become the focus of a session if not used as part of
an appropriate learning activity” (Page 260)
“We discovered it was important to carefully
monitor the amount of time being devoted to clicker
questions and evaluate if their use enhanced the
instruction” (Page 260)
Julian, Suzanne, and Kimball Benson. "Clicking your way to library
instruction assessment: Using a Personal Response System at Brigham
Young University." College and Research Libraries News 69, no. 5 (2008):
258-61.
Critical Question

Can the use of clickers enhance library instruction enough to be worth the time needed to use them?
  5-10 minutes?
  10-15 minutes?
Critical Response

Effective Clicker use allows library instructors to identify a few key areas needing the most emphasis for student needs.
  Reduced need to cover everything
  How much time would be saved in a 50-minute session if the session’s student learning objectives were cut by a third or even by half? (A rough calculation follows.)
Clickers could actually enhance the effectiveness of sessions and save time.
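A back-of-the-envelope answer to the question above, assuming a 50-minute session originally planned around six learning objectives; the numbers are illustrative only:

```python
# Illustrative arithmetic only: assumed session length and objective count.
SESSION_MINUTES = 50
OBJECTIVES_PLANNED = 6
minutes_per_objective = SESSION_MINUTES / OBJECTIVES_PLANNED  # roughly 8 minutes each

for label, fraction_cut in (("a third", 1 / 3), ("half", 1 / 2)):
    dropped = round(OBJECTIVES_PLANNED * fraction_cut)
    minutes_saved = dropped * minutes_per_objective
    print(f"Cutting objectives by {label} drops {dropped} objectives "
          f"and frees about {minutes_saved:.0f} minutes, "
          f"comfortably covering 5-10 minutes of clicker questions.")
```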
An Unexpected Use of Clickers for
Library Instruction Assessment

Julian & Benson claim that they serendipitously
discovered that Clickers can be an effective
tool for instructional assessment.

Original intention was to add interactivity to instruction
sessions to increase student engagement and retention.

Authors began considering using Clickers for instructional
assessment.

This purpose will become a new area of consideration and
experimentation.
Julian, Suzanne, and Kimball Benson. "Clicking your way to library instruction
assessment: Using a Personal Response System at Brigham Young University."
College and Research Libraries News 69, no. 5 (2008): 258-61.
Critical Question

Is a Clicker System an Effective, Sustainable, Practical Tool for Assessing Library Instruction?
  Depends on the type/purpose of Instructional Assessment
    Formative Assessment
    Summative Assessment
Formative vs. Summative Assessment


Formative assessments are on-going assessments, reviews, and
observations in a classroom. Teachers use formative assessment to
improve instructional methods and student feedback throughout
the teaching and learning process. The results of formative
assessments are used to modify and validate instruction.
Summative assessments are typically used to evaluate the
effectiveness of instructional programs and services at the end of
an academic year or at a pre-determined time. The goal of
summative assessments is to make a judgment of student
competency--after an instructional phase is complete. Summative
evaluations are used to determine if students have mastered
specific competencies and to identify instructional areas that need additional attention.
Critical Questions

Is Instruction Assessment more Formative or Summative?
Are Clicker Systems Designed more for Formative or Summative Assessment?
ACRL Information Literacy Standards
[Bloom’s Taxonomy]
“In implementing these standards, institutions need to recognize
that different levels of thinking skills are associated with
various learning outcomes--and therefore different instruments
or methods are essential to assess those outcomes.
For example, both "higher order" and "lower order" thinking
skills, based on Bloom’s Taxonomy of Educational Objectives,
are evident throughout the outcomes detailed in this document. It
is strongly suggested that assessment methods appropriate to
the thinking skills associated with each outcome be identified
as an integral part of the institution’s implementation plan.”
Bloom’s Taxonomy
Original         Revised
Knowledge        Remember: recalling the information
Comprehension    Understand: explaining the ideas and/or concepts
Application      Apply: using the newly acquired knowledge in another familiar situation
Analysis         Analyze: comparing and differentiating between constituent parts
Evaluation       Evaluate: justifying a decision or course of action
Synthesis        Create: generating new ways of creating products, ideas, or ways of viewing things
View from Clicker Instructors
Higher Order & Lower Order Thinking Skills


“Different questions elicit different responses and require different
levels of cognitive engagement. Knowledge-level questions ask for
simple recall of facts and data without assessing them critically,
whereas analysis, synthesis and evaluation questions require critical
thinking and judgment.”(Page 5)
“When faculty are simply assessing students’ basic understanding, a
knowledge-level question may be appropriate. But when faculty wish
to engage students in thinking critically about course content, a
knowledge-level question may fall short of reaching the goal.”(Page 5)
Zhu, Erping. "Teaching With Clickers." Center for Research on Learning and
Teaching Occasional Papers (2007): 1-7.
http://www.crlt.umich.edu/publinks/CRLT_no22.pdf.
Critical Questions

Are Clickers an Effective Instrument for Measuring Information Literacy Outcomes Requiring Higher Order Thinking Skills?
  What Type of Questioning Requires Higher Order Thinking Skills?
  What Type of Questioning is Possible with Clickers?
Questioning Capabilities of Clickers

Initial Models supported Fixed Response Questions
  Multiple Choice
  True/False
Newer Models support Alpha-Numeric Answers
  10-15 Character Text Entry
  LCD Display Screen
What About...
  Matching Answers
  Short Answer/Essay Questions
Will Clickers Evolve to Fully Support Summative Assessment?

Applications are being developed to enhance the assessment capabilities of Clickers.
  TurningPoint’s ‘TestingPoint’ Application
    Higher Education Assessment Tool
    With TestingPoint, professors and lecturers can create and administer self-paced assignments, quizzes and tests with ease, allowing for real-time summative and formative assessment.
TestingPoint

TestingPoint supports questions in multiple formats:
  Multiple Choice
  Multiple Response
  Numeric Response
  Matching
  Fill in the Blank
  Bimodal
  True/False
  Short Answer
  Essay
http://www.turningtechnologies.com/k12studentresponsesystem/testingpoint.cfm
Summative Assessment Considerations

Clickers were initially designed for Formative Assessment
Instructors have serendipitously experimented with using Clickers for Formative and Summative Assessment
Clicker developers are taking the need for Summative Assessment capabilities to heart
Library Instructors will soon have the opportunity to experiment fully with Clicker Summative Assessment
My Own Experience


Chaired a campus committee tasked with evaluating
Clicker systems and selecting one to be the campus
standard.
Developed knowledge about Clicker systems and a
desire to try them in library instruction to increase
interactivity and student engagement.
Experimentation Issues: Trial Kit Size

Vendor ‘Trial Kits’ limited in scale
  Trial Kits typically no larger than 10 Clicker Devices
  Could not experiment with using Clickers in the large class settings for which Clicker Systems are intended
  Could not negotiate a site license trial of a Virtual Clicker
  Trial Kits allow for familiarity with a Clicker System before purchase but do not allow for real class feasibility experimentation
Experimentation Issues: Trial Duration

Vendor Trials Limited in Time
  Typically one to three month duration
  Limited time for experimentation after the learning curve
  Only enough time to learn how to use the Clicker System, leaving little time for significant feasibility experimentation
Experimentation Recommendation

Collaborate with Professors Already Using Clickers
  Simply visit their classroom and use their Clicker System
  Requires familiarity with their Clicker System
  Students have Clickers and know how to use them
  No learning curve for students taking away instruction time
  No cost; an ideal solution to experimentation issues
  Real classroom feasibility experimentation
Further Considerations

Experiment with a Personal Response System which is widely used on campus
  Which system is used most on campus?
  Has your campus adopted a standard system?
If your campus is in the process of adopting a campus standard, try to arrange for a representative from your library system to be on the campus working group/committee
Favorable Conditions for my Experimentation with Clickers in Library Instruction

COM 101 Public Speaking Course has a requirement for library instruction
The Director of the COM 101 Program was on the Clicker Standardization Committee
I have conducted library instruction for these sessions numerous times
Challenges for my Experimentation with Clickers in Library Instruction

COM 101 library instruction sessions have a graded quiz assignment, which is a summative assessment of information literacy learning outcomes
The quiz already takes considerable time from instruction sessions
Quiz questions are in formats not widely supported by Clicker systems
A Silver Lining

The COM 101 Course Director is considering making the required quiz a take-home assignment for students
Future Clicker enhancements might fully support a quiz like this, making the back-end grading work for library instruction assessors much more efficient and sustainable without compromising effectiveness
QUESTIONS/COMMENTS
Patrick Griffis
Business Librarian
University of Nevada, Las Vegas
4505 Maryland Parkway
Las Vegas, NV 89154-7014
(702) 895-2231
[email protected]