Assessing Reading Literacy in
the Language of Instruction
John H.A.L. de Jong
Intergovernmental Conference: Languages of Schooling
towards a Framework for Europe Strasbourg, 16-18 October 2006
LANGUAGE TESTING SERVICES
John H.A.L. de Jong 2006
Overview
• Background of the PISA studies
• Design of the PISA studies
• PISA approach to Reading Literacy
• Some central findings of PISA 2000
• Extensions for PISA 2009
• Relationships between PISA & CEF
• Applying CEF in Upper Secondary Vocational
Background of PISA study
OECD: Organisation for Economic
Co-operation and Development
• 30 member countries committed to democracy and
market economy
• Provide comparative data and analyses
• For countries to:
– compare policy experience;
– seek answers to common problems;
– identify good practice;
– co-ordinate policies.
• To support economic growth, boost employment,
raise living standards.
History OECD
• 1948 Organisation for European Economic Co-operation
(OEEC), set up with support from the United States and
Canada to co-ordinate the Marshall Plan for the
reconstruction of Europe after World War II
(www.oecd.org/history).
• 1961 Created as an economic counterpart to NATO, the
OECD took over from the OEEC
• Early recognition of the importance of educating its
citizens
• 1985 Started gathering indicators on investment in education
• Education at a Glance published yearly
• 2000 First PISA
PISA Goals
After studying resources and investments in
education for a number of consecutive years,
the OECD wished to know the yield of education:
at the end of compulsory education, how well
are students prepared to continue into further
education or to start working life?
PISA assessment schedule
2000: Reading, Maths, Science
2003: Reading, Maths, Science
2006: Reading, Maths, Science
2009: Reading, Maths, Science
2012: Reading, Maths, Science
Participation in OECD/PISA
PISA Year   OECD countries   Partner countries
2000        28               15
2003        30               10
2006        30               28
Definition
Reading literacy is
understanding, using and reflecting
on written texts,
in order to achieve one’s goals,
to develop one’s knowledge and potential
and to participate in society.
Three aspects of reading literacy:
• Retrieving information
• Interpreting text
• Reflection & evaluation
Let’s look at Reading
In the activity of reading we can distinguish four main
elements:
1 there is a reader
2 there is a text
(the author is part of the text, but not present in the activity)
Both of these are observable.
3 the reader has a goal
(a self-chosen reason for reading the text)
4 the reading activity will have a result
(a change in the reader: more knowledge, satisfaction, etc.)
These are not observable.
Reading in ‘real’ life:
Text – Reader – Goal – Result
Assessing Reading:
Text – Test taker – Task – Response – Scoring rules – Score
• The task is not chosen by the reader, but imposed by the test
• The response must conform to the task and must be observable
• The score reflects the difference between task and observed response
A Reading item
• refers to (part of) a text
• sets a task pertaining to the text,
requiring particular subskill(s)
• may require linking information from the
text to prior knowledge and experience
These elements should be relevant to the intended trait.
• specifies a format for the overt response
demonstrating application of the subskill
• specifies rules for scoring the response
These elements should interfere as little as possible.
Only then are score differences likely to
reflect differences in true ability on the
intended trait: i.e. reliable and valid
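The chain sketched on the previous slide (task → overt response → scoring rules → score) can be illustrated with a minimal sketch. The item, the accepted answers, and all names here are invented for illustration; they are not taken from PISA materials:

```python
# Minimal sketch of applying explicit scoring rules to an observed response.
# The scoring rules are a hypothetical example, not an actual PISA scoring guide.

def score_response(response: str, scoring_rules: dict) -> int:
    """Map an overt response to a score using explicit scoring rules.

    scoring_rules: {credit_points: set of accepted normalized answers}.
    Anything not covered by a rule receives no credit.
    """
    normalized = response.strip().lower()
    for credit, accepted in scoring_rules.items():
        if normalized in accepted:
            return credit
    return 0

# Hypothetical open-response item: "Why was the bridge closed, according to the text?"
rules = {
    2: {"because the bridge was unsafe"},  # full credit
    1: {"the bridge", "it was unsafe"},    # partial credit
}

print(score_response("Because the bridge was unsafe", rules))  # 2
print(score_response("no idea", rules))                        # 0
```

The point of making the rules explicit, as the slide argues, is that the scoring step then interferes as little as possible: score differences come from the response, not from rater judgement.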
Aspects of item difficulty
[Diagram: the difficulty of an item depends on the interaction between text, task, reader, response format and scoring rules. Reader processes range from Comprehension through Find / Connect / Assess to Evaluate / Hypothesize. Difficulty of comprehension depends on the knowledge and experience of the reader; difficulty of the task depends on the skills required to perform it; difficulty of scoring depends on the specificity of the scoring guide; difficulty of the text depends on its complexity, linguistic characteristics, number of elements, familiarity, and register (formal / public).]
PISA 2000: Relation SES & Reading
SES       Reading score
High      540
Medium    500
Low       460
Mean PISA 2003 scores per educational track in
the Netherlands
Mean Reading Literacy score for Dutch-born
students and for immigrant students
Language proficiency in upper
secondary vocational education
CINOP Study using CEF
• what is the actual language proficiency of
students?
• is there a discrepancy between actual and
required language proficiency for further
education and professional occupation?
Research design
- written questionnaires for language
teachers and vocational teachers, N = 210
- self-assessment of language proficiency
by students, N = 345
- followed by a reading test, N = 328
- oral interviews with teachers, staff
members and management, N = 40
Teachers’ views
• 60 to 70% estimate students’ language proficiency
(reading A2–B1) as insufficient for school
• 10% estimate the language proficiency at
a higher level than needed for school
• 80% estimate the language proficiency
insufficient for professional occupation
Self-assessment by students
• > 2/3 of students estimate their reading
proficiency level at B2
• In reality:
– 7% at A1
– 24% at A2
– 52% at A2–B1
– 17% at B1–B2
CEF appropriate for description of
language of instruction
• Coherent and workable structure for teachers
(contributes to improvement of their professional
competences)
• Competence based descriptions more suitable for
language of instruction than linguistic descriptions of L1
• Descriptors applicable
• More transparent and motivating for students
(especially in combination with a language portfolio)
• Offers opportunity for reflection on language learning by
students
Shortcomings of CEF in describing language
of instruction in vocational education
• Can-do statements are geared to academic learning
routes: not well adjusted to describing specific tasks in
vocational education and in the workplace
• More situations and examples need to be offered
for a range of vocational sectors
“Milestones”
• 2000 1st PISA Cycle: Reading Major Domain
• 2002 Publication Reading for Change
• 2004 CINOP Dutch in Vocational Education
• 2006 ALARM: Language skills at risk