IELTS and the Academic Reading Construct
Tony Green
Cyril Weir
Centre for Research in English Language Learning and Assessment
The researchers would like to acknowledge the support of the British Council in funding this study
Test validation from the user perspective
A CRELLA programme of research exploring how far the IELTS academic reading test reflects the reading practices of university students:
• analysis of undergraduate texts vs IELTS academic reading texts
• analysis of student tasks vs IELTS academic reading tasks
• student reading processes vs IELTS academic reading test-taking processes
CRELLA University of Bedfordshire
EALTA Athens May 2008
Comparisons between IELTS and undergraduate reading
Weir et al. (2007) compared IELTS academic reading to student experiences, based on a survey of 1,000 UoB students.
IELTS was said to under-represent:
• expeditious reading skills (the test requires an average reading speed of c. 60 wpm)
• integration of information beyond the sentence level
• information at the level of the whole text
• information accessed across texts
The current study was intended to extend the self-report data to a larger sample of test takers in a variety of contexts.
Instruments
IELTS academic reading test
IELTS academic reading has 3 parts.
Each Test Part has an input text of c. 800 words (min 586, max 1036) and 13 or 14 associated questions.
We used 2 IELTS academic reading tests from C.U.P. Cambridge Practice Tests for IELTS: Volume 2 (released material that has passed through Cambridge ESOL test development procedures). These…
• Only employed currently approved Q types (see www.ielts.org)
• Required both explicit and implicit information sources
• Were judged to encourage both expeditious and careful reading types
• Contained texts well within typical IELTS ranges for readability, vocabulary range and syntactic complexity (illustrative measures are sketched below)
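The text measures mentioned in the last bullet can be approximated with very simple statistics. A minimal Python sketch, assuming word count, mean sentence length and type-token ratio as rough proxies for length, syntactic complexity and vocabulary range; this is for illustration only, not the Cambridge ESOL selection procedure, and the sample passage is invented:

```python
import re

def text_statistics(text: str) -> dict:
    """Rough proxies for text length, syntactic complexity and vocabulary range.

    Chosen for illustration only; not the measures used in IELTS text selection.
    """
    # Split into sentences and word tokens with simple regular expressions.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    return {
        "word_count": len(words),                          # IELTS parts: c. 586-1036 words
        "mean_sentence_length": len(words) / max(len(sentences), 1),
        "type_token_ratio": len(set(words)) / max(len(words), 1),
    }

# Invented passage, purely to make the example run.
sample = ("Biometric systems identify individuals from physical characteristics "
          "such as fingerprints. Their growing use raises practical questions. "
          "It also raises ethical ones.")
print(text_statistics(sample))
```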
Instruments
Retrospection form
Groups of students were administered one Test Part (20 minutes).
Test Part = 1 text + up to 4 Sections of different Q types = 13/14 Qs
Followed by a retrospection form eliciting… (one such record is sketched after this list)
• Background information (age, gender, L1, nationality, previous IELTS, uni. subject)
• Text preview – did test takers read the text before looking at the questions?
• Strategies for responding – how did test takers go about looking for the answers?
• Information base for the response – where did the test takers find the information they needed to answer the questions?
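One way to picture the data each completed form yields is as a single record per test taker per Test Part. The sketch below is only a hypothetical representation: the field names and types are our assumptions, not the layout of the actual CRELLA form.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RetrospectionRecord:
    """One completed retrospection form for a single Test Part.

    Field names are illustrative; they mirror the categories listed above,
    not the wording of the actual form.
    """
    age: int
    gender: str
    first_language: str
    nationality: str
    previous_ielts: Optional[float]          # previous IELTS band, if any
    subject: str                             # university subject
    preview: str                             # "PR1", "PR2" or "PR3"
    strategies: List[str] = field(default_factory=list)     # e.g. ["ST2", "ST10"]
    info_location: List[str] = field(default_factory=list)  # e.g. ["L1", "L2"], per item
    score: int = 0                           # raw score out of 13 or 14

example = RetrospectionRecord(
    age=22, gender="F", first_language="Chinese", nationality="China",
    previous_ielts=6.0, subject="Business", preview="PR2",
    strategies=["ST2", "ST3", "ST10"], info_location=["L1", "L2"], score=7,
)
print(example.preview, example.strategies)
```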
Participants
Background and score levels
352 participants
40 - 74 participants per Test Part
16 first languages: 79% L1 Chinese, 4% Arabic, 4% Thai
59% female
Median age 22
Divided into 3 broad score levels, loosely interpreted (based on equivalences suggested at www.ielts.org; the mapping is sketched below) as representing…
0-5 points: c. IELTS 5.5 or below
6-8 points: c. IELTS 6.0
9+ points: c. IELTS 6.5 or above
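A minimal sketch of this banding, using the raw-score cut-offs above (the function name and return labels are ours):

```python
def broad_level(raw_score: int) -> str:
    """Map a raw Test Part score (out of 13 or 14) to the three broad levels,
    following the cut-offs shown above."""
    if raw_score <= 5:
        return "c. IELTS 5.5 or below"   # 0-5 points
    if raw_score <= 8:
        return "c. IELTS 6.0"            # 6-8 points
    return "c. IELTS 6.5 or above"       # 9+ points

print(broad_level(4), "|", broad_level(7), "|", broad_level(11))
```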
Text Preview
PR1 read the text or part of it slowly and carefully
PR2 read the text or part of it quickly and selectively to get a general idea of what it was about
PR3 did not read the text
Text Preview
• Over half of all test takers report quickly and selectively previewing the text
• Highest scoring test takers were less likely to preview the text
• Lowest scoring test takers were most likely to preview slowly and carefully (a sketch of how such an association could be tested follows)
(Chart: preview behaviour by score level; 1 = slowly, carefully; 2 = quickly, selectively; 3 = no preview)
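One way such an association between score level and preview behaviour could be checked is a chi-squared test on a level-by-preview contingency table. A minimal sketch with invented counts, purely to make the example run (not the analysis reported here):

```python
from scipy.stats import chi2_contingency

# Rows: broad score levels (0-5, 6-8, 9+ points).
# Columns: preview behaviour (PR1 slow/careful, PR2 quick/selective, PR3 none).
# Counts are invented for illustration only.
observed = [
    [30, 55, 10],   # 0-5 points
    [25, 70, 20],   # 6-8 points
    [12, 60, 30],   # 9+ points
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```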
Response strategies
ST1 match words that appeared in the question with exactly the same words in the text
ST2 quickly match words that appeared in the question with similar or related words in the text
ST3 look for parts of the text that the writer indicates to be important
ST4 read key parts of the text such as the introduction and conclusion
ST5 work out the meaning of a difficult word in the question
ST6 work out the meaning of a difficult word in the text
ST7 use my knowledge of vocabulary
ST8 use my knowledge of grammar
ST9 read the text or part of it slowly and carefully
ST10 read relevant parts of the text again
ST11 use my knowledge of how texts like this are organised
ST12 connect information from the text with knowledge I already have
Response strategies
Most and least popular strategies (the underlying calculation is sketched below)
83% use ST2: quickly match words that appeared in the question with similar or related words in the text
77% use ST10: read relevant parts of the text again
76% use ST3: look for parts of the text that the writer indicates to be important
8% use ST8: use my knowledge of grammar
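These figures are simple proportions of respondents ticking each strategy on the retrospection form. A minimal pandas sketch with invented 0/1 responses; only the calculation is of interest:

```python
import pandas as pd

# Each row is one respondent; 1 = strategy reported, 0 = not reported.
# The responses are invented for illustration.
responses = pd.DataFrame({
    "ST2":  [1, 1, 1, 0, 1],
    "ST10": [1, 1, 0, 1, 1],
    "ST3":  [1, 0, 1, 1, 1],
    "ST8":  [0, 0, 1, 0, 0],
})

# Percentage of respondents reporting each strategy, most popular first.
usage = (responses.mean() * 100).round(1).sort_values(ascending=False)
print(usage)
```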
Response strategies
Differences by level
ANOVA reveals differences in strategy use by level (an illustrative comparison is sketched after this list) for:
Used more often by higher scoring test takers
ST2 quickly match words that appeared in the question with similar or related words in the text
ST10 read relevant parts of the text again
Used more often by lower scoring test takers
ST5 work out the meaning of a difficult word in the question
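A minimal sketch of the kind of comparison described above: a one-way ANOVA on reported use of a single strategy (coded 1 = used, 0 = not used) across the three broad score levels. The group data below are invented for illustration:

```python
from scipy.stats import f_oneway

# Reported use of ST2 at each broad score level (values invented).
low  = [1, 0, 0, 1, 0, 1, 0, 0]   # 0-5 points
mid  = [1, 1, 0, 1, 1, 0, 1, 1]   # 6-8 points
high = [1, 1, 1, 1, 0, 1, 1, 1]   # 9+ points

f_stat, p_value = f_oneway(low, mid, high)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```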
Response strategies
Patterns by item type (Test Section)
Example
ST3 look for parts of the text that the writer indicates to be important
ST4 read key parts of the text such as the introduction and conclusion
Both associated with higher scores on the following item set:
Choose the most suitable heading for paragraphs A-G from the list of headings below.
i Common objections
ii Who's planning what
iii This type sells best in the shops
iv The figures say it all
v Early trials
vi They can't get in without these
vii How does it work?
viii Fighting corruption
ix Systems to avoid
x Accepting the inevitable
Location of necessary information
L1 within a single sentence
L2 by putting information together across sentences
L3 by understanding how information in the whole text fits together
L4 without reading the text
L5 could not answer the question
Location of necessary information
Test E
[Table: location of the necessary information (within sentence / across sentences / whole text), marked for each of Test Sections E1.1, E1.2, E1.3, E2.1, E2.2, E2.3, E2.4, E3.1 and E3.2]
Location of necessary information
Test F
[Table: location of the necessary information (within sentence / across sentences / whole text), marked for each of Test Sections F1.2, F2.1, F2.2, F3.1 and F3.2]
Conclusions
Response strategies cannot be assumed from item type or predicted with sufficient accuracy via expert judgement.
Protocol forms are potentially of great value in routine piloting:
• Can highlight issues with particular items as part of the item QA process, e.g. 'guessability'
• Can help to confirm that the required range of reading skills is addressed in every test form
IELTS test takers do…
• locate necessary information across sentences, but the whole-text level is not always required
• use more expeditious reading strategies than predicted by Weir et al. (2007), but few items require these