ECDL Test Development and
Validation Process
Paul Davis,
Oxford University Computing Services
Garry Cleere
The European Computer Driving Licence Foundation Ltd.
Test development process
Syllabus Definition
Soft Characterisation
Core Item Definition
Model Test Grids
Cut Score Definition
Test Blueprint (CTT)
Test Development
* Test Item Development
* Whole Test Development
* Expert Validation
* Pilot Testing
Pilot Test Validation
Field Test Validation
CAA Conference
Syllabus Definition (Expert Opinion)
Soft Test Characterisation
• Non-threatening test experience
• Predominantly performance based
• Straight forward language
• No trick questions
• Coherent outputs
• Sustains the metaphor of the driving licence
• Ample time
Forum score

[Chart: Score by engine (290 tests) - scores 0-100% for Modules 1-7, with engine types Simulation, In-App and MCQ / HS indicated.]
Test characteristics – real life

[Chart: Average Time (Passed & Failed) - minutes, 0-45, for Modules 1-7 and Total, with separate series for Passed and Failed candidates.]
Forum duration of tests

[Chart: Time comparison (290 tests) - minutes, 0-45, for Modules 1-7, with engine types Simulation, In-App and MCQ / HS indicated.]
Mean duration

[Chart: Number of tests (0-4,500) by completion time in 2-minute bins, 0-2 up to 28-30 minutes, with series for #success, #failure and total.]
Criticality (Core Item Identification) Exercises

  REF.     TASK ITEM                                                                    ITEM STATUS

SKILL SET: 3.1.1 First Steps with Word Processing
  3.1.1.1  Open (and close) a word processing application.                              NE
  3.1.1.2  Open one, several documents.                                                 Option
  3.1.1.3  Create a new document (default template).                                    Core
  3.1.1.4  Save a document to a location on a drive.                                    Option
  3.1.1.5  Save a document under another name.                                          Core
  3.1.1.6  Save a document in another file type such as: text file, Rich Text Format,
           HTML, template, software specific file extension, version number.            Core
  3.1.1.7  Switch between open documents.                                               Option
  3.1.1.8  Use available Help functions.                                                Option
  3.1.1.9  Close a document.                                                            Option

SKILL SET: 3.1.2 Adjust Settings
  3.1.2.1  Change between page view modes.                                              Core
  3.1.2.2  Use magnification/zoom tools.                                                Option
  3.1.2.3  Display, hide built-in toolbars.                                             Option
  3.1.2.4  Display, hide non-printing characters.                                       Option
  3.1.2.5  Modify basic options/preferences in the application: user name, default
           directory/folder to open, save documents.                                    Option

SKILL SET: 3.2.1 Insert Data
  3.2.1.1  Insert text.                                                                 Core
  3.2.1.2  Insert special characters, symbols.                                          Core
Model Tests Exercise

• SMEs create 4 streams of tests
• Spread core items across streams
• Provide a reasonable coverage in each test
• Provide syllabus coverage across streams
• Average number of items
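The "spread core items across streams" step above can be sketched as a simple round-robin allocation. This is illustrative only: the item refs and the allocation rule below are assumptions, not the SMEs' actual procedure.

```python
# Hypothetical round-robin spread of core items across the 4 model-test
# streams; the item refs below are invented, not the real ECDL core set.
from itertools import cycle

core_items = ["3.1.1.3", "3.1.1.5", "3.1.1.6", "3.2.1.1",
              "3.2.1.2", "3.1.2.1", "3.4.1.1", "3.4.2.2"]  # hypothetical
streams = {f"MT {n}": [] for n in range(1, 5)}

# Deal items out one per stream in turn so each stream gets an even share.
for item, stream in zip(core_items, cycle(streams)):
    streams[stream].append(item)

for name, items in streams.items():
    print(name, items)
```

Each stream ends up with an equal share of the core items; real balancing would also weigh syllabus coverage per test, as the bullets above note.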
Model Tests Grids Scheme

REF.     TASK ITEM                                                             MT 1  MT 2  MT 3  MT 4
3.1.1.1  Open (and close) a word processing application.                       NE    NE    NE    NE
3.1.1.2  Open one, several documents.                                          1     1     1     1
3.1.1.3  Create a new document (based on default, other available template).   1
Item Count: 36 Items
• Item Count dictated by coverage requirement
• Dictated by congruency and criticality in tests
• 100% Syllabus coverage throughout ECDL tests
• Balanced, discriminating test profiles for Syllabus V.4
• Tests Syllabus Version 4.0 rigorously
• High reliability in tests and between tests
• Item Count standardisation in all Modules
Result against completion time

[Chart: %success, %failure and %total (0-100%) against completion time in 2-minute bins, 0-2 up to 28-30 minutes.]
Cut Score / Pass Mark
• Systematic approach and industry-based cut score technique
(modified Angoff Method)
• Underpinned by expert opinion
• Cut Score exercises carried out for each module
• Borderline Candidate, or Minimally Competent Candidate (MCC)
• Meaningful Cut Scores based on profile of ECDL tests derived from
ECDL V4 Syllabus
• Item Count standardisation in all Modules
• An auditable, standard cut score process
• Common cut score across all ECDL core level modules, (M1 - 7)
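A rough sketch of how a modified Angoff exercise yields a cut score: each expert estimates the probability that a Minimally Competent Candidate answers each item correctly, and the cut score is the sum of the per-item averages. The ratings below are invented for illustration; the actual ECDL exercise data is not shown here.

```python
# Modified Angoff sketch with invented expert ratings (not ECDL data).
# Each value is one expert's estimate of the probability that a Minimally
# Competent Candidate (MCC) answers the item correctly.
ratings = {
    "item_01": [0.70, 0.80, 0.75],
    "item_02": [0.90, 0.85, 0.95],
    "item_03": [0.60, 0.65, 0.70],
}

item_means = {item: sum(r) / len(r) for item, r in ratings.items()}
raw_cut = sum(item_means.values())       # expected MCC score, in marks
cut_pct = 100 * raw_cut / len(ratings)   # cut score as a percentage

print(f"cut score: {raw_cut:.2f}/{len(ratings)} marks ({cut_pct:.1f}%)")
```

In practice the resulting percentage is reviewed against expert opinion and rounded to a common figure, which is how a single cut score across all modules becomes workable.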
Cut Score Exercises
Model Tests Collation and Balancing
Characterisation Test Template Version 4.0

1. CONTEXT
1.1. ECDL Programme: The ECDL ‘core’ Programme (See Mission Statement.)
1.2. Subject: ECDL Syllabus Version 4.0 Ref: SWG110159
1.3. Cognitive Level (Construct Specification): Module 1: Knowledge-based test. Modules 2-7: Skills-based practical test.

2. TEST SPECIFIC CHARACTERISTICS
2.1. Types of Questions / Number of Questions:
Module 1: 36 knowledge-based questions/test.
Modules 2-7: 36 predominantly skills-based questions/test. Skills-based question items target a Syllabus task item with appropriate question item formats such as: Hotspot, Image supported MCQ, In application, Simulation.
2.2. Number of Streams & Question Sequencing:
Module 1: Knowledge-based question items/test. 4 tests each with 36 questions (144 questions in total). The same Question Item is not repeated in a test. The Question Items are presented in random order in a test.
Modules 2-7: Skills-based question items/test. 4 tests each with 36 questions (144 questions/module).
2.3. Test Duration: Modules 1-7:
2.4. Marking / Cut-off: Modules 1-7: One mark per correct question answered. Pass Mark: 75%
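As a quick check of the marking rule in the template above (one mark per correct item, 36 items, 75% pass mark), the pass threshold works out as follows:

```python
# Marks needed to pass under the template's rule: 36 items, one mark each,
# 75% cut score; round up since only whole marks are awarded.
import math

items_per_test = 36
pass_mark = 0.75
threshold = math.ceil(items_per_test * pass_mark)  # marks needed to pass
print(f"{threshold}/{items_per_test} marks to pass")  # 27/36
```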
Item / Test Development
Dependent upon:
• Syllabus
• Soft Test Characterisation
• CTT
• ECDL-F Evaluations
• Outcome Probability Grids (expected pass
rate from question)
Expert Review of Items / Tests
• Syllabus Congruence
• Syllabus Coverage
• Compliance with CTT
• Appropriateness of Item format
• Technical Accuracy
• Language & Clarity
• Text Content
Expert Review : ATES Systems

• Test Presentation / Interface Features
• Timing & Information
• Appropriateness of ATES Test Content
• Test Content
• Marking Algorithm
• Candidate Information and Guidance
• Practice Tests
• Documentation and Training Materials
• ATES Technical Support
• Reporting
• Test Generation
• Security Features
Perceptions of tests

[Chart: Candidate perception scores, 0-60 (max 60), for Modules 2-7, with engine types In App, Sim, MCQ/HS and Manual indicated.]
Choice of test engine (V3)

Rank      Simulation   In-Application   Rich Simulation   Web-based MCQ/HS   MQTB
First         12             12                1                  3            3
Second         8              7                5                  5            5
Third          5              4               11                  6            3
Fourth         3              5               12                  6            3
Fifth          1              1                0                  9           15
Expert Review : Test Design / ATES Systems

• ATES System Overview
• ATES System Hardware Requirements
• ATES Operating System Requirements
• Other Software Requirements
• Minimum Screen Resolution
• Assistive Technologies
  • Screen Design and Visual Impairment
Assembly and Administration of Pilot Tests
• Tests are made available for Pilot Test activity
• Qualitative analysis
• Test taker and administrator observation sheets / questionnaires
• Test data requirements
Remember:
• Across 4 streams for 7 tests, in 100 countries and many languages.
Pilot Tests Review
Pilot Test results are reviewed by the ATES Evaluations Group.
Any changes requested in the tests based on Pilot testing are then
undertaken before the live certification tests are administered.
To ensure statistical validity, some on-the-fly pre-testing is used until
results stabilise; maintenance then takes over.
Empirical Evidence Evaluation
ECDL Tests are evaluated and monitored on an ongoing basis by
the ATES Evaluations Group using recognized psychometric
measurements.
CTT Version 4.0 - Test behaviour statement
Ongoing Maintenance of Tests
ECDL tests are monitored on an ongoing basis by the ECDL
Foundation ATES Evaluations Group. Tests are updated in line
with Syllabus revisions and also where test results or empirical
evidence indicates that items or tests are not functioning in line with
test behaviour parameters.
Process Steps and the Validity Argument

• Documented process - expert opinion
• Detailed test data reporting requirements
• Data fed back to the ECDL Foundation
• Industry Standard techniques
• Process Loop
• Valid ECDL Tests
Contacts

Dr Paul Davis
Deputy Head, LTG, OUCS
13 Banbury Road
Oxford OX2 6NN
UK
Tel: 01865 283414
[email protected]

Garry Cleere
Programmes Manager
ECDL Foundation
107 The Windmill, Sir John Rogerson's Quay
Dublin 2
Ireland
Tel: 00 353 1 679 2847 / 00 353 1 679 2921
[email protected]