
Thinking Skills Tests for University Admission
Assisting Admissions Decisions

Robert Harding
Director, TSAT Project and ITAL Unit (Interactive Technologies in Assessment and Learning)
http://tsa.ucles.org.uk/
Thinking Skills Assessment: recent history in Cambridge

- Need for additional information to select applicants
  - BMAT (BioMedical Admissions Test): now also used by Oxford, UCL and the Royal Veterinary College
  - About 4,000 candidates in 2003; 6,000 expected in 2004
- TSA numbers (used in some Cambridge Colleges):
  - 289 in 2001
  - 478 in 2002
  - Over 1,500 in 2003
- There is evidence that, in the Cambridge context, the test is a better predictor of performance than either A-Levels or interview scores.

Some admissions tests in the rest of the world

- The United States Medical Licensing Examination (USMLE)
  http://www.nbme.org/about/about.asp
- The Law School Admission Test (LSAT)
  http://www.lsac.org/
- Graduate Medical School Admissions Test (GAMSAT)
  http://www.acer.edu.au/tests/university/gamsatuk/intro.html
- “The SAT is a three-hour test that measures verbal and mathematical reasoning skills students have developed over time and skills they need to be successful academically.”
  http://www.collegeboard.com/student/testing/sat/about/SATI.html

BMAT: what is tested?

- Section 1, Thinking Skills: 40 multiple-choice questions, 1 hour
- Section 2, Science and Maths core knowledge: 30 questions, 30 minutes
- Section 3, Essay: capacity to develop ideas and to communicate them effectively in writing, 30 minutes

Thinking Skills Assessment: what is tested?

- Critical Thinking and Problem Solving
- 50 multiple-choice questions, online or on paper
- One ‘paper’: the test lasts 90 minutes
- Origins in academic research around the mid-1980s
- MENO (1993/4) tested six ‘thinking skills’
- Today:
  - OCR’s AS Level in Critical Thinking
  - CIE’s AS Level in Thinking Skills
- Evidence that these skills are highly relevant to successful study.

Thinking Skills Assessment: is it useful?

“The TSA Project has been running on a pilot basis since
2001 in Cambridge. Its results are used carefully and
conditionally by Colleges as additional information and
never replace any traditional criteria on which
admission decisions are based. More research and
evaluation is being done in this pilot project to
establish its value for admissions purposes. Although
this research is not yet complete, the use of the TSA
test within Cambridge has grown each year as
confidence in its usefulness grows.”

Thinking Skills Assessment: what does this do for ‘access’?

- Provides a uniform measure of readiness for Higher Education
- Can be self-tested, which encourages applicants
- Low-cost support materials are widely available, e.g. via the web
- These skills are worth learning
- Facilitates access research.

[Scatter plot: TSA score (x-axis) vs Tripos mark (y-axis), R = 0.33]
Data for 48 candidates who took the TSA in December 2001 and their first University examination (the Tripos) in the summer of 2003, all Computer Science.

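For reference, the R reported on these plots is the Pearson correlation coefficient between TSA score and later Tripos performance. A minimal sketch of how such a coefficient is computed, using invented scores rather than the 48-candidate data shown here:

import numpy as np

# Invented example data (not the candidate data behind the figure):
# TSA scores and later Tripos marks for a handful of hypothetical students.
tsa_scores   = np.array([45, 52, 58, 61, 67, 73, 80])
tripos_marks = np.array([120, 150, 170, 160, 210, 230, 260])

# Pearson's R: +1 is a perfect positive linear relation, 0 no linear
# relation, -1 a perfect negative relation. np.corrcoef returns the
# 2x2 correlation matrix; the off-diagonal entry is R.
r = np.corrcoef(tsa_scores, tripos_marks)[0, 1]
print(f"R = {r:.2f}")
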
[Scatter plot: TSA score vs Tripos mark, R = 0.33]
2001/2003 data for 48 candidates, showing the top 12 TSA scorers and the bottom 12 TSA scorers.

[Scatter plot: TSA score vs Tripos mark, R = -0.002]
2001/2003 data for 48 candidates, but with Tripos marks replaced by randomised values.

[Scatter plot: TSA score 2002 (x-axis) vs Tripos grade 2004 (y-axis), R = 0.27]
Scatter plot of 2002 TSA score vs 2004 Tripos score: 478 candidates took the TSA; 83 went on to first-year exams in Natural Sciences, Computer Science and Mathematics courses.

[Histogram: TSA 2003 total scores distribution. Overall Mark: Mean = 60.62, Std. Dev. = 8.216, N = 1,551]

[Histograms: TSA 2003 scores distribution by section. Problem Solving: Mean = 62.8, Std. Dev. = 10.227, N = 1,551; Critical Thinking: Mean = 59.23, Std. Dev. = 10.005, N = 1,551]

What’s the CAA perspective?

- The whole process is computer-assisted:
  - Item banking and test construction
  - Administration and registration
  - Answer sheet scanning (ICR)
  - Results return and reporting
- System developed by Cambridge’s CARET (Centre for Applied Research in Educational Technology)
- To be made open source and QTI compliant
- MCQs, so the test can be taken on paper or online, and on-line and paper performance can be compared (a toy marking sketch follows).
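
As an illustration only, here is a toy sketch of the marking step in such a workflow: turning a candidate’s captured multiple-choice responses into a total and per-skill percentages. The item codes, the 25/25 split between skills and the percentage scoring are assumptions for the example, not details of the CARET system.

# Toy marking sketch (illustrative only; not the CARET system).
# Assumes a 50-item MCQ paper split evenly between Problem Solving (PS)
# and Critical Thinking (CT) items -- the split is a made-up example.

ANSWER_KEY = {f"PS{i:02d}": "A" for i in range(1, 26)}        # hypothetical PS items
ANSWER_KEY.update({f"CT{i:02d}": "B" for i in range(1, 26)})  # hypothetical CT items

def mark(responses):
    """Return percentage scores: overall total plus one per skill."""
    correct = {item: responses.get(item) == key for item, key in ANSWER_KEY.items()}

    def pct(prefix):
        items = [item for item in ANSWER_KEY if item.startswith(prefix)]
        return 100.0 * sum(correct[i] for i in items) / len(items)

    return {
        "Total": 100.0 * sum(correct.values()) / len(ANSWER_KEY),
        "Prob. Solv.": pct("PS"),
        "Crit. Think.": pct("CT"),
    }

# Example: a candidate who gets every PS item right and 12 of the 25 CT items.
demo = {f"PS{i:02d}": "A" for i in range(1, 26)}
demo.update({f"CT{i:02d}": ("B" if i <= 12 else "C") for i in range(1, 26)})
print(mark(demo))  # {'Total': 74.0, 'Prob. Solv.': 100.0, 'Crit. Think.': 48.0}
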
TSA 2003 scores by delivery mode (paper vs online)

Delivery mode   Score type     N      Mean   Std Dev
Paper           Total          1114   60.4   8.3
Paper           Prob. Solv.    1114   62.5   10.2
Paper           Crit. Think.   1114   59.0   10.1
Online          Total          437    61.0   8.1
Online          Prob. Solv.    437    63.4   10.2
Online          Crit. Think.   437    59.5   9.6
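
The table supports the comparison flagged on the ‘CAA perspective’ slide. As a rough, after-the-fact illustration (not an analysis from the presentation), one could check whether the paper/online gap in total score is larger than chance with a Welch t-test on the summary statistics, assuming SciPy is available and treating the two groups as independent samples:

from scipy.stats import ttest_ind_from_stats

# Summary statistics for the Total score, taken from the table above.
paper_mean, paper_sd, paper_n = 60.4, 8.3, 1114
online_mean, online_sd, online_n = 61.0, 8.1, 437

# Welch's t-test from summary statistics (no equal-variance assumption).
t, p = ttest_ind_from_stats(paper_mean, paper_sd, paper_n,
                            online_mean, online_sd, online_n,
                            equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")  # small difference; p shows whether it stands out from noise

# Sanity check: weighting the per-mode Problem Solving means by N reproduces
# the whole-cohort mean (about 62.8) seen on the distribution slide.
ps_overall = (62.5 * 1114 + 63.4 * 437) / (1114 + 437)
print(round(ps_overall, 1))  # 62.8
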
TSAT in 2004 and beyond?

- UCLES is planning to continue pilot testing the TSA in Cambridge, on a similar basis to 2003
- The BMAT continues in 2004 with additional universities
- We will continue to consult interested parties and stakeholders
- We will continue to develop the technology
- We may look for interest from other universities, and will consider overseas testing
- We will look at issues of timing and location
- We will continue to research admissions issues.

Thinking Skills Admissions Testing

Some URLs:
http://www.ucles.org.uk/
http://tsa.ucles.org.uk/
http://bmat.org.uk/
http://www.caret.cam.ac.uk/

Acknowledgements and thanks…
To many colleagues in UCLES and to our partners in CARET, who played very full parts in this work.