ELA Update: TEKS and TAKS



STATE OF TEXAS ASSESSMENTS OF ACADEMIC READINESS (STAAR™)
Grades 4 and 7 Writing; English I, II, and III
Victoria Young, Director of Reading, Writing, and Social Studies Assessments, Texas Education Agency


Writing Administration Issues

- Students writing compositions on the incorrect page in the answer document (e.g., composition #1 in the space for composition #2)
- Students recording answers to multiple-choice questions and/or writing prompts in the test booklet but not transferring answers/compositions to the answer document
- Test administrators not allowing students to keep dictionaries during the revising and editing sections


Initial Rangefinding Discoveries

- Expository writing caused the most difficulty across the board, grade 4 through English II
- Central idea/controlling idea/thesis statement:
  - So broad and nebulous that it caused us difficulty discerning which ideas in the essay actually functioned as support/development
  - Weak or evolving controlling idea led to inclusion of extraneous information


Initial Rangefinding Discoveries

- Essay “jumpy” from idea to idea, a result of too many ideas in one page and a lack of meaningful transitions (e.g., “One reason that…,” “Another reason that…”)
- Student included a personal narrative that simply told a story rather than explained


Initial Rangefinding Discoveries

Potential solutions:
- Honing the central idea/controlling idea/thesis statement
- Writing “narrow and deep”
- Teaching the difference between a personal anecdote that explains and one that doesn’t


STAAR Writing Rubric

Score Point 1: VERY LIMITED
Score Point 2: BASIC
Score Point 3: SATISFACTORY
Score Point 4: ACCOMPLISHED


Scoring Model for STAAR

TAKS compositions were scored using the “perfect agreement” model. Two readers read each paper, and if the scores did not agree, a third reader (and sometimes a fourth) read the paper to determine the final score.

STAAR compositions will be scored using the “adjacent scoring” model. Perfect agreement does not have to be reached. With this method, districts will receive a more accurate description of each student’s writing performance.

Scoring Model for STAAR

SCORE 1 | SCORE 2 | SUMMED SCORE | CSR RATING
0       | 0       | 0            | Nonscorable Performance
1       | 1       | 2            | Very Limited Performance
1       | 2       | 3            | Between Very Limited and Basic Performance
2       | 2       | 4            | Basic Performance
2       | 3       | 5            | Between Basic and Satisfactory Performance
3       | 3       | 6            | Satisfactory Performance
3       | 4       | 7            | Between Satisfactory and Accomplished Performance
4       | 4       | 8            | Accomplished Performance
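To make the adjacent-scoring arithmetic concrete, here is a minimal sketch (a hypothetical helper, not a TEA system) that combines two readers' scores and looks up the CSR rating from the table above. The function name and error handling are assumptions for illustration only.

```python
# Minimal sketch (hypothetical): combine two readers' scores under the
# "adjacent scoring" model and map the summed score to a CSR rating,
# using the pairings shown in the table above.

CSR_RATINGS = {
    0: "Nonscorable Performance",
    2: "Very Limited Performance",
    3: "Between Very Limited and Basic Performance",
    4: "Basic Performance",
    5: "Between Basic and Satisfactory Performance",
    6: "Satisfactory Performance",
    7: "Between Satisfactory and Accomplished Performance",
    8: "Accomplished Performance",
}

def csr_rating(score_1: int, score_2: int) -> str:
    """Return the CSR rating for two reader scores (0-4 each).

    Assumes the scores agree exactly or are adjacent, as in the table above;
    any other combination has no rating in the table and raises an error.
    """
    summed = score_1 + score_2
    if abs(score_1 - score_2) > 1 or summed not in CSR_RATINGS:
        raise ValueError(f"No CSR rating for scores {score_1} and {score_2}.")
    return CSR_RATINGS[summed]

print(csr_rating(2, 3))  # Between Basic and Satisfactory Performance
```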

STAAR Summary Reports

[Table: constructed-response results by CSR rating (0, 2, 3, 4, 5, 6, 7, 8, and Total), reported as the number (#) and percentage (%) of students at each rating.]

Coding Student Expectations on STAAR

Example #1 (Grade 7 Poetry)
Question: The repetition of the line “xxxx” suggests that the speaker –
Coded as 4A
SE assessed 4(A): explain the importance of graphical elements (e.g., capital letters, line length, word position) on the meaning of a poem

Coding Student Expectations on STAAR

Example #2 (Grade 7 Poetry)
Question: Because the poem is written from the speaker’s point of view, the reader is better able to understand –
Coded as 4FD
SE assessed Figure 19(D): make complex inferences about text and use textual evidence to support understanding
The “4” indicates the genre.
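Going only by these two examples, an item code appears to combine a leading number with either a student-expectation letter (4A for SE 4(A)) or an “F” plus a letter for a Figure 19 expectation (4FD for Figure 19(D), with the “4” indicating the genre). The sketch below is a hypothetical illustration of that apparent pattern, not an official TEA decoding scheme.

```python
import re

def describe_item_code(code: str) -> str:
    """Hypothetical sketch: interpret codes like "4A" or "4FD" using only the
    pattern visible in the two examples above (a leading number, an optional
    "F" marking a Figure 19 expectation, and a sub-expectation letter)."""
    match = re.fullmatch(r"(\d+)(F?)([A-Z])", code)
    if not match:
        raise ValueError(f"Unrecognized code: {code}")
    number, figure_flag, letter = match.groups()
    if figure_flag:
        return f"Figure 19({letter}); the leading {number} indicates the genre"
    return f"Student expectation {number}({letter})"

print(describe_item_code("4A"))   # Student expectation 4(A)
print(describe_item_code("4FD"))  # Figure 19(D); the leading 4 indicates the genre
```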

Determining Appropriate Readability Level on STAAR

Steps used in the test-development process to ensure an appropriate readability level:
- Approximately 6 months before the submission deadline for selections and questions, reading selections are presented at “early passage review” to reading content teams before development of questions begins; selections are accepted, rejected, or moved, and specific edits are requested on commissioned selections

Determining Appropriate Readability Level on STAAR

- Internal review by TEA and contractor reading content teams to edit selections and questions
- Educator committees meet to determine the appropriateness of proposed reading selections and questions; committees approve, edit, or reject selections and questions, and may also recommend that a selection and its questions be moved to another grade or course

Determining Appropriate Readability Level on STAAR

- Approved selections and questions are field-tested within the actual test
- Field-test student performance data are analyzed to determine appropriateness
- Questions are accepted into the item bank or rejected
- If rejected, a selection and its questions can be redeveloped at a different grade level and put back through the entire development process the next year

Determining Appropriate Readability Level on STAAR

Bottom line:
- Educators, not readability formulas, are the primary determiners of the grade-level appropriateness of reading level
- The test-development process is a “fail-safe” system, with multiple levels of review incorporated from different perspectives
- A selection and its questions are never placed into the item bank until they have successfully “passed through” all stages of the development process

STAAR Reading Design

STAAR reading assessments will emphasize students’ ability
- to make connections within and across texts
- to think critically/inferentially about different types of texts (almost all test questions go beyond literal understanding)
- to understand how writer’s craft affects meaning
- to understand how to use text evidence to confirm the validity of their ideas

STAAR Reading Performance: Short Answer Questions

Score Point 0: INSUFFICIENT
Score Point 1: PARTIALLY SUFFICIENT
Score Point 2: SUFFICIENT
Score Point 3: EXEMPLARY

“Perfect” agreement between two readers is required during scoring.

A total of 56 points on the reading test: multiple-choice questions are worth 38 points (68% of the total score) and short-answer questions are worth 18 points (32% of the total score).
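The arithmetic behind those percentages is easy to verify; the snippet below simply reproduces the point values stated above.

```python
# Point composition of the STAAR reading test as described above.
MULTIPLE_CHOICE_POINTS = 38
SHORT_ANSWER_POINTS = 18

total = MULTIPLE_CHOICE_POINTS + SHORT_ANSWER_POINTS  # 56 points

# Shares round to the 68% / 32% split quoted above.
print(f"Multiple choice: {MULTIPLE_CHOICE_POINTS / total:.0%}")  # 68%
print(f"Short answer: {SHORT_ANSWER_POINTS / total:.0%}")        # 32%
```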

STAAR Short Answer Questions: Idea

By English I, all students must understand what constitutes a credible idea in response to a question. An idea represents the quality and depth of thinking and understanding.
- Idea for a score of 3: perceptive, coherent, discerning, clearly analytical
- Idea for a score of 2: reasonable and specific; goes beyond literal reading

STAAR Short Answer Questions: Idea

- Idea for a score of 1: lacks explanation or specificity; represents only a literal reading of the text
- Idea for a score of 0: doesn’t answer the question; incorrect or invalid reading of the text; too general, vague, or unclear to judge whether it is reasonable

STAAR Short Answer Questions: Text Evidence

By English I, all students must be able to use text evidence to prove that their ideas are valid. Text evidence substantiates the reader’s ideas; it reflects the degree to which the reader can connect his or her own ideas with the pieces of the text that best support the analysis.

STAAR Short Answer Questions: Text Evidence

- Text evidence for a score of 3: specific and well chosen
- Text evidence for a score of 2: accurate and relevant
- Text evidence for a score of 1: only a general reference, too partial, weakly linked, or wrongly manipulates the meaning of the text
- Text evidence for a score of 0: not evident or not attached to an idea

Analytical Writing

- A combination of expository writing and interpretation of one aspect of a literary or expository text (really a hybrid of writing and reading)
- Analytical prompts contain a literary or informational text (approximately 350-450 words), which students must analyze
- Score based on (1) the student’s ability to interpret the text and support it with relevant textual evidence (15C) AND (2) the quality of the writing (criteria under expository writing in 15A)


CONTACT INFORMATION

Victoria Young
Director of Reading, Writing, and Social Studies Assessments
Texas Education Agency
512-463-9536
[email protected]