2008 Library Assessment
Conference, Seattle, WA
Using iSkills to Measure Instructional Efficacy: One Example from the University of Central Florida
Penny Beile
University of Central Florida
Background
– Institutional description
– SACS Quality Enhancement Plan
– Accreditation-driven initiative
– Four programs initially selected
– Multiple assessments
A (very brief) Comment on Types of Direct Measures…

                 Objective          Interpretive
Costs            $$ to purchase     Labor to score
Administration   Large scale        Smaller numbers
Results          Wide and thin      Narrow and deep
Domain           Knowledge          Performance
Methods – Nursing
– Goal is to collect baseline data, design curricular interventions, and reassess to evaluate instructional efficacy
– Nursing students matriculate as a cohort; ~120 students enter the BSN program each semester
– Analyze at the cohort level and across cohorts
2007–2012 Plan

           Program Entry                                Program Exit
Cohort 1   Baseline, no intervention; design            Maturation; possibly control for
           instruction to target deficiencies (2007)    noninstructional variables later (2009)
Cohort 2   Intervention effect (2008)                   Growth in program (2010)
Cohort 3   Intervention effect (2009)                   Growth in program (2011)
Cohort 4   Intervention effect (2010)                   Growth in program (2012)
Still Early in the Project
(2007–2012 plan table repeated)
Results of 2007 Administration
– 114 students in class; 107 completed iSkills
– Scores ranged from 485 to 625 (M = 561.36, SD = 29.94)
– Established cut score of 575
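
As a rough illustration, a few lines of Python are enough to produce this kind of summary and to count students falling below the cut score. The score list below is a hypothetical placeholder, not the actual 2007 data:

```python
from statistics import mean, stdev

# Placeholder scores; the 2007 administration had 107 completed
# tests ranging from 485 to 625 (M = 561.36, SD = 29.94).
scores = [560, 585, 530, 610, 545, 575, 490, 600]

CUT_SCORE = 575  # cut score established for the nursing cohort

print("n =", len(scores))
print("range =", min(scores), "to", max(scores))
print("mean =", round(mean(scores), 2), " sd =", round(stdev(scores), 2))
print("below cut =", sum(s < CUT_SCORE for s in scores))
```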
How Data Are Being Used
– To identify where instruction is needed
– Over time, to assess the efficacy of interventions and instructional models
– To provide evidence that we are meeting our instructional goals
Implications for Practice
Assessment offers a critical and objective way to see how effective we are in meeting our instructional goals:
– Do libraries contribute to the academic mission of the institution?
– How effective are our current models?
– Assessment can lead us to explore new frameworks
2008 Library Assessment
Conference, Seattle, WA
Using iSkills to Measure Instructional Efficacy: The CSU Experience
Stephanie Brasley
California State University, Office of the Chancellor
[email protected]
Background – California State University
– 23 Campuses
– Information Competence (IC) Pioneers
– Sponsoring Partner with ETS on iSkills Assessment
– IC Grant Program, iSkills Focus, 2006–2008
  • 9 Campuses
Snapshot of Use
– California Maritime Academy – Small campus
– CSU Los Angeles – Medium-Sized campus
– San Jose State – Large campus
California Maritime Academy (CMA) – Approach
Mindy Drake – [email protected]
– Advanced test used as a pre-test
– Goal: Baseline data and current skill set
– Test Groups:
  • Freshmen in COM 100 and ENGR 120: 151 tested; 137 analyzed (57% of incoming freshmen)
  • Seniors in capstone courses: 80 tested; 49 analyzed (32% of the senior population)
CMA – Deliverables
– Information Fluency and Communication Literacy learning objectives
– Rubric for assessing the development of information and communication technology skills within course assignments
– Modified COM 100 and ENG 120 assignments and supplemental materials
– Syllabus and iSkills-influenced learning objectives for the newly developed LIB 100: Information Fluency in the Digital World course
CMA – Summary Results
iSkills data used in four ways:
– Development of learning objectives
– Baseline for ICT literacy of incoming freshmen
– Determining the ICT literacy skill set of current seniors
– Catalyst for innovation in the design of ICT literacy instructional activities for freshmen
CMA – More Results
CSU Los Angeles – Approach
Catherine Haras – [email protected]
– Advanced test used as a pre/post test
– Goal: Evaluate ICT literacy-related instructional interventions
– Target Group: 234 students enrolled in Business Communications (juniors and seniors)
  • Approx. 60% transfer students and 70% ESL students
– Study run over three quarters (Fall 2006, Winter 2007, Spring 2007)
CSU Los Angeles Study Methodology
(All three sections took the iSkills pretest and posttest.)

Treatment (day) – Instructor A:
  Business 305 curriculum; 1.5 hr library lecture; two library workshops; information literacy project
Treatment (evening) – Instructor B:
  Business 305 curriculum; 1.5 hr library lecture; two library workshops
Control – Instructor A:
  Business 305 curriculum; 1.5 hr library lecture
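
Since each section sat the same pretest and posttest, the instructional effect can be read as gain relative to the control section. Below is a minimal Python sketch of that comparison; the pre/post pairs are entirely hypothetical stand-ins for the real data:

```python
from statistics import mean

# Hypothetical (pretest, posttest) pairs per section; the actual
# study used iSkills Advanced scores from Business 305 students.
sections = {
    "Treatment (day)":     [(540, 585), (555, 590), (530, 575)],
    "Treatment (evening)": [(520, 545), (515, 540), (530, 550)],
    "Control":             [(550, 560), (545, 555), (560, 565)],
}

for name, pairs in sections.items():
    gains = [post - pre for pre, post in pairs]
    print(f"{name}: mean gain = {mean(gains):.1f}")
```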
CSU Los Angeles – Summary Results
[Line chart: mean iSkills scores at pretest and posttest for the Treatment (day), Control, and Treatment (evening) sections; vertical axis from 510 to 590.]
CSU Los Angeles – More Results
[Line chart: pretest and posttest means for each section, broken out by home language (English vs. another language); vertical axis from 500 to 600.]
Submitted June 2008 to the Journal of Education for Business.
San Jose State University
Toby Matoush – [email protected]
– Advanced test used as a post-test
– Goal: Determine gaps and develop instructional interventions
– Test Groups:
  • Freshmen: MUSE students (59); Eng. 1B (100)
  • Sophomores – Juniors
Grade         N    Mean    Std. Dev.   95% CI for Mean       Min   Max
                                       Lower       Upper
10th grade      2  555.00   21.213     364.41      745.59    540   570
12th grade      4  520.00   48.305     443.14      596.86    480   590
Freshman      154  554.77   31.645     549.73      559.81    455   620
Sophomore      93  547.90   26.696     542.41      553.40    465   615
Junior        192  548.15   32.345     543.55      552.76    470   615
Senior        149  548.56   32.290     543.33      553.78    475   620
Grad            4  580.00   44.347     509.43      650.57    525   625
Other           5  544.00   29.240     507.69      580.31    515   575
Total         603  549.92   31.623     547.39      552.45    455   625
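
The confidence bounds in a table like this follow from the usual t-based formula, CI = mean ± t(0.975, n−1) · sd/√n. As a check, here is a minimal Python sketch (assuming SciPy is available) that reproduces the Freshman row's bounds from its summary statistics:

```python
from math import sqrt
from scipy import stats

def ci_95(n, mean, sd):
    """95% confidence interval for a mean, computed from summary
    statistics with the t distribution on n - 1 degrees of freedom."""
    t = stats.t.ppf(0.975, df=n - 1)
    half_width = t * sd / sqrt(n)
    return mean - half_width, mean + half_width

# Freshman row: n = 154, mean = 554.77, sd = 31.645
lo, hi = ci_95(154, 554.77, 31.645)
print(f"{lo:.2f} to {hi:.2f}")  # ~549.73 to 559.81, matching the table
```

The very wide intervals for the 10th grade, 12th grade, Grad, and Other rows reflect their tiny sample sizes (n ≤ 5), which is why the freshman-through-senior rows carry the interpretive weight.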