Transcript Slide 1

Using Data to Improve Student Achievement & to Close the Achievement Gap

Tips & Tools for Data Analysis, Spring 2007

Looking at the BIG Picture

Factors students bring with them: before/after-school activities & duties, transportation, experiences, attitudes, prior success/failure, living situation/family structure/family size, socioeconomic status, mobility, physical, mental, & social health, special needs, background knowledge, language fluency, and beliefs.

4-Step DDDM Process

STEP 1: What evidence shows that students learned? (DATA)

STEP 2: Who is and is not achieving? (ANALYSIS)

STEP 3: Why aren't students achieving? (HYPOTHESIS)

STEP 4: What are we going to do about the lack of achievement? (PLANNING & IMPLEMENTATION)

Multiple Measures

• Demographics: enrollment, attendance, drop-out rate, ethnicity, gender, grade level
• Perceptions: perceptions of the learning environment, values & beliefs, attitudes, observations
• Student Learning: standardized tests (NRT/CRT), teacher observations of abilities, authentic assessments
• School Processes: description of school programs & processes

Criterion-Referenced Data

What's required?

• Proficiency percentages for the combined population & identifiable subgroups by:
  • Test
  • Year (for the latest 3 years)
• Analysis of the test by:
  • Passage type & type of response for literacy
  • Writing domain & multiple choice for literacy
  • Strand & type of response for math

…in order to identify trends and draw conclusions based on results over a 3-year period


Norm-Referenced Data

What's required?

• National percentile rank & standard score for the combined population & identifiable subgroups by:
  • Test
  • Year
• Analysis of the test by:
  • Content subskill & skill cluster

…in order to identify trends, measure growth, and draw conclusions based on results over a 2-year period


Disaggregated Data Tools

• CRT
  • ACSIP Template: # and % of students non-proficient/proficient for combined and subgroup populations (see the sketch below)
  • ACSIP Strand Performance Report: combined and subgroup performance averages by test, passage type/domain/strand, & type of response
  • Data Analysis Set: [email protected]
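To make the ACSIP Template numbers concrete, here is a minimal sketch of the computation it reports, assuming a student-level table; the column names and sample values are invented for illustration, not an actual Arkansas data format.

```python
# Sketch: # and % proficient/non-proficient for the combined population
# and each subgroup. Data below is invented for illustration.
import pandas as pd

students = pd.DataFrame({
    "subgroup": ["African-American", "Caucasian", "Caucasian",
                 "Hispanic", "African-American", "Caucasian"],
    "level": ["proficient", "basic", "advanced",
              "below basic", "proficient", "basic"],
})

# P = proficient & advanced; NP = everything else (matches the report key).
students["P"] = students["level"].isin(["proficient", "advanced"])

def summary(flags: pd.Series) -> str:
    n, p = len(flags), flags.sum()
    return f"# {n}, P {100 * p / n:.0f}%, NP {100 * (n - p) / n:.0f}%"

print("Combined:", summary(students["P"]))
for name, group in students.groupby("subgroup"):
    print(f"{name}:", summary(group["P"]))
```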


[Table: Data Summary Report, Math Benchmark Results for the combined population, 2004-2006, by grade level and End-of-Course test (EOC Algebra, EOC Geometry). KEY: # = actual number of students; NP = percentage of non-proficient students; P = percentage of proficient & advanced students.]

[Table: 4th Grade Math Benchmark Results, 2004-2006, disaggregated by subgroup: African-American, Caucasian, Hispanic, Special Services, Economically Disadvantaged, ELL. KEY: # = actual number of students; NP = percentage of non-proficient students; P = percentage of proficient & advanced students.]

[Table: Literacy Benchmark Results, 2004-2006, by passage type (Literary, Content, Practical) and item type (M/C = multiple choice, O/R = open response), for the combined population and the Caucasian, Special Services, and Economically Disadvantaged subgroups.]

Disaggregated Data Tools

• NRT
  • ITBS ACSIP Report: # & % of students performing above the 50th percentile on each test and content subskill for combined & subgroup populations (see the sketch below)
  • Performance Profile: standard score & NPR on each test and content subskill for the combined population
  • School Coded Summary: standard score & NPR on each test for subgroup populations
  • Data Analysis Set: [email protected]
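As a quick illustration of the ITBS ACSIP count, a sketch with invented national percentile ranks, assuming "above the 50th percentile" excludes the 50th itself:

```python
# Sketch of the "# & % above the 50th percentile" count;
# the NPRs are invented sample values, not real ITBS results.
nprs = [23, 51, 67, 88, 45, 50, 72]

above = sum(1 for r in nprs if r > 50)  # strict: the 50th itself doesn't count
print(above, f"{100 * above / len(nprs):.0f}%")  # 4 57%
```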

NRT Growth & Assessment

Year  2003  2004  2005  2006
SS     180   200   215   230

Standard Scores: Show relative development over time
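Because standard scores sit on a single developmental scale, the year-to-year differences are directly meaningful; a small sketch using the scores from the table above:

```python
# Year-over-year growth in standard-score points, using the slide's values.
years  = [2003, 2004, 2005, 2006]
scores = [180, 200, 215, 230]

for (y0, s0), (y1, s1) in zip(zip(years, scores), zip(years[1:], scores[1:])):
    print(f"{y0}->{y1}: +{s1 - s0} standard-score points")
# 2003->2004: +20, 2004->2005: +15, 2005->2006: +15
```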


SLE Analysis

SLE    ITEM #   PERCENTAGE CORRECT
1.5       2          81.5
1.5      11          74.1
1.5      36          95.6
TOTAL…               83.7

Total the percentage correct column & divide by 300 (in this case: 3 items × 100 possible points each).
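Here is that arithmetic spelled out as a minimal sketch, using the three items from the table above:

```python
# Worked version of the calculation described above: total the
# percentage-correct column and divide by the maximum possible
# (100 per item, so 300 for the three items shown).
item_pct_correct = {2: 81.5, 11: 74.1, 36: 95.6}  # items mapped to SLE 1.5

total = sum(item_pct_correct.values())                    # 251.2
meeting_standard = total / (100 * len(item_pct_correct))  # divide by 300 here
print(f"{meeting_standard:.1%}")                          # 83.7%
```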


SLE Analysis…

Percentage Meeting Standard — Suggested Action to be Taken

• 0-34%: Align curriculum & classroom instruction; curriculum has not been taught or does not exist; indication that instruction is "textbook-driven"
• 35-49%: Coordinate curriculum objectives across grade levels & subject areas, making sure all objectives are taught (horizontal/vertical alignment)
• 50-69%: Implement high-yield instructional strategies in all classrooms; there is probably a high percentage of lecture, whole-group, & direct teaching
• 70-84%: Spend more quality time on instructional strategies to yield greater results; check "learning minutes" in the schedule & the nature of tasks on which students spend their time
• 85-100%: Provide aligned enrichment; add depth & breadth; review pacing; reteach for mastery; be sure distributed practice is occurring

Source: Learning 24/7
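Turned into a lookup, the bands above read like this; the function name and the shortened action strings are ours, a sketch rather than Learning 24/7's exact wording:

```python
# Map a "percentage meeting standard" to the suggested action band above.
def suggested_action(pct_meeting_standard: float) -> str:
    bands = [
        (35,  "Align curriculum & classroom instruction"),
        (50,  "Coordinate curriculum objectives across grade levels & subjects"),
        (70,  "Implement high-yield instructional strategies in all classrooms"),
        (85,  "Spend more quality time on instructional strategies"),
        (101, "Provide aligned enrichment; review pacing; reteach for mastery"),
    ]
    for upper, action in bands:   # bands are checked lowest to highest
        if pct_meeting_standard < upper:
            return action
    raise ValueError("percentage must be 0-100")

print(suggested_action(83.7))  # falls in the 70-84% band
```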

Digging Deeper

• CRT Item Analysis
  • Content Standard
  • Language of Question
  • Level of Questioning
  • Distracters

Content Standard

• What is it that the student must know or be able to do?
• When is this introduced in the curriculum?
• How is it paced?
• Is it a "power standard"?
• What instructional strategies are used to help students master this standard?
• Have I given students the "tools" (e.g., calculator skills, writing tips, test-taking skills, etc.) necessary to respond appropriately?
• Can this standard easily be integrated into other curricular areas?


Language of Question

• How is the question worded on the test?
• Are there vocabulary words used that may hinder comprehension?
• Do I teach and test using the same language?
• Do I have word/learning walls in my content area to support this standard and related vocabulary?

Level of Questioning

• According to Bloom's, what is the level of questioning used to measure mastery of the standard?
• Highlight the verb(s) in the question. Do I use those same verbs in my teaching and testing?
• Have I taught "key" or "clue" words that will help students understand what is being asked of them?
• Is the question "multi-layered"?


Distracters

• Are there items that "distract" the student from identifying what is being asked, or items that may "confuse" the student as he/she makes an answer choice?
  • Labels
  • Additional information
  • Multi-layered tasks
  • Conversions
  • "Not"

SLE Correlation: NPO 1.3 (prior to 2004 revisions), which states…

Apply and master counting, grouping, place value, and estimation.

Item Analysis:
• Content Standard: What must the student know or be able to do?
• Language of the Question: How is the question worded on the test?
• Level of Questioning: According to Bloom's, what is the level of questioning used to measure mastery of the standard?
• Distracters: Are there items that "distract" the student from identifying what is being asked, or items that may "confuse" the student as he/she makes an answer choice?


Digging Deeper

• NRT Item Analysis
  • Building Item Analysis
  • Identify items that have a negative value of 10 or more, as indicated by the bar falling to the left of the 0 mark (see the sketch below)
  • Analyze the results of all related items
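A minimal sketch of that flagging rule, assuming each item's "value" is the signed difference shown by the bar (negative = below the reference point); the item labels and numbers are invented:

```python
# Flag items whose value is negative by 10 or more; invented sample data.
item_values = {"item_03": -12, "item_07": 4, "item_12": -10, "item_15": -6}

flagged = {item: v for item, v in item_values.items() if v <= -10}
print(flagged)  # {'item_03': -12, 'item_12': -10}
# Next step per the slide: analyze the results of all related items together.
```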


Peeling the Data: Levels of Looking at Data

• District
• K-12 Feeder Patterns
• School Levels
• Grade Level
• Programs & Tracks
• Classroom/Teacher
• Student
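"Peeling" through the levels above maps naturally onto successive group-bys of the same student-level file; a sketch with invented columns and values:

```python
import pandas as pd

# Invented student-level extract; a real file would come from the district SIS.
df = pd.DataFrame({
    "school":     ["Elem A", "Elem A", "Elem B", "Elem B", "Elem B"],
    "grade":      [4, 4, 4, 5, 5],
    "teacher":    ["Smith", "Jones", "Lee", "Lee", "Lee"],
    "proficient": [True, False, True, True, False],
})

# Each pass "peels" one level deeper: school -> grade -> classroom/teacher.
for level in (["school"], ["school", "grade"], ["school", "grade", "teacher"]):
    rates = df.groupby(level)["proficient"].mean() * 100  # percent proficient
    print(rates.round(0), "\n")
```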

"Data analysis should not be about just gathering data. It is very easy to get analysis paralysis by spending time pulling data together and not spending time using the data."

-Bernhardt, 2004, p. 19

Peeling the Data: Questions to Ask

• Are there any patterns by racial/ethnic groups? By gender? By other identifiers?
• What groups are doing well?
• What groups are behind? What groups are on target? Ahead?
• What access and equity issues are raised?
• Do the data surprise you, or do they confirm your perceptions?
• How might some school or classroom practices contribute to successes and failures? For which groups of students?
• How do we continue doing what's working and address what's not working for students?

Peeling the Data: Dialogue to Have

• How is student performance described? (by medians, quartiles, levels of proficiency, etc.)
• How are different groups performing? Which groups are meeting the targeted goals?
• What don't the data tell you? What other data do you need?
• What groups might we need to talk to? (students, teachers)
• What are the implications for:
  • Developing or revising policies
  • Revising practices and strategies
  • Reading literature
  • Visiting other schools
  • Revising, eliminating, or adding programs
  • Dialogues with experts
  • Professional development goal setting and monitoring progress
• How do we share and present the data to various audiences?

Sample Questions from a School’s Data Team

• Are there patterns of achievement based on Benchmark scores within subgroups?
• Are there patterns of placement for special programs by ethnicity, gender, etc.?
• What trends do we see with students who have entered our school early in their education vs. later? Is there a relationship between the number of years at our school and our Benchmark scores?

Sample Questions from a School’s Data Team

• Is there a relationship between attendance/tardiness and achievement?
• How do students who have been retained do later?
• How do our elementary students do in middle school?
• Do findings in our NRT results support findings in our CRT results?
• Can our findings be directly linked to curriculum? Instruction? Assessment?
• What are our next steps?

Making It Personal for Teachers…

Teachers can use their own data to…
• Identify the strengths of their own students
• Identify the challenges of their own students
• Identify common misconceptions & error patterns
• Identify their own successful teaching methods
• Pinpoint areas needed for professional development

Making It Personal for Students…

Students can use their own data to…
• Reflect on their own knowledge & test-taking strategies
• Reflect on their strengths & weaknesses
• Set goals for improvement

Necessary Variables for Data-Driven Decision-Making

CULTURAL CHANGE requires:
• Know How
• Want To
• Success
• Time
• Leadership
• ACSIP Planning/Funding Sources

Candie Watts [email protected]

Arch Ford Education Service Cooperative http://af1.afsc.k12.ar.us
