
Understanding Value-Added Lesson 2: How Value-Added Works

Office of Accountability

Recap: What is Value-Added?

  Value-Added is the District’s measure of elementary school and teacher growth.

Value-Added is a nationally recognized way of measuring growth.

Academic Growth from 2012 to 2013 = Student Learning

 Emphasizes continual student improvement

 Provides information to understand what drives continual improvement

Measuring Growth, Not Attainment

In this school, the percent meeting state standards is 25% in both Year 1 and Year 2.

Attainment is unchanged – but are students learning?

[Chart: distributions of scale scores, 200 to 300, for Year 1 and Year 2.]

Analyzing growth provides this information.

Accounting for Student Populations

 Student academic growth varies by grade, prior performance, and demographics.

 The goal of the Value-Added metric is to measure the school or teacher’s impact on student learning independent of student demographic factors.

 Value-Added accounts for the following student factors:

‐ Prior Reading Score
‐ Prior Math Score
‐ Grade Level
‐ Gender
‐ Race/Ethnicity
‐ Low-Income Status
‐ ELL Status
‐ IEP Status
‐ Homelessness
‐ Mobility

 Controlling for the factors above gives proper credit for growth to low-attainment schools and schools that serve unique populations.


How it Works

 Value-Added is not a comparison to similar schools.

‐ We do not look for a comparison group of schools that match each other on all 10 student factors…such a group might not exist.

 Rather, Value-Added compares the growth of students in each school to the growth of students across the District, controlling for the list of student factors.

 To do this, we utilize a regression methodology, developed in collaboration between CPS and academic experts from the University of Wisconsin.

What is Regression?

 By measuring the impact of each student factor, the regression model isolates the impact of the teacher on student growth.

 In other words, some growth is explained by external factors. We can measure the average impact of these external factors on growth at the District level and subtract that impact from the teacher’s absolute growth.

 The growth that is left over after removing the impact of these factors is attributed to the teacher. This is the value added by the teacher.
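The subtract-and-attribute logic described above can be sketched in a few lines of Python. This is a simplified illustration with made-up numbers and a single external factor (prior score), not the actual CPS model, which controls for all ten student factors:

```python
import numpy as np

# Hypothetical data: observed growth for six students and one external
# factor (prior-year score). All values are illustrative only.
prior_score = np.array([190.0, 200.0, 210.0, 220.0, 230.0, 240.0])
growth      = np.array([14.0, 12.5, 11.8, 10.2, 9.1, 8.4])

# Fit growth = a + b * prior_score by ordinary least squares.
X = np.column_stack([np.ones_like(prior_score), prior_score])
coef, *_ = np.linalg.lstsq(X, growth, rcond=None)

# Growth explained by the external factor (the "expected" growth)...
expected = X @ coef
# ...and the residual growth left over, which the model attributes
# to the teacher.
value_added = growth - expected

print(np.round(value_added, 2))
```

Students whose residual is positive grew faster than the external factor alone would predict; the full model does the same subtraction across all controlled factors at once.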


For More on Regression…

 Two other presentations on this topic are available at http://cps.edu/Pages/valueadded.aspx

 For an illustrative example of regression, view the “Oak Tree Analogy” presentation, which illustrates the Value-Added model using an analogy of two gardeners tending oak trees.

 For technical details, view “Lesson 3: Technical Specifications of the Value-Added Regression Model.”

Some Things to Know

Tested Students
 All students making normal grade progression who took the ISAT or NWEA MAP in both the previous year and the current year are included in the analysis.

Mobile Students
 Mobile students count toward the Value-Added score in each school they attended, but are weighted in the analysis by the amount of time they were in the school during the year.
 At the teacher level, mobile students count toward the Value-Added score for each teacher who provided instruction to that student, but are weighted in the analysis by the time they were in the school and the amount of instruction provided by each teacher.

English Language Learners
 For the ISAT: ELL students in Program Years 0 through 5 are excluded from the analysis.
 For NWEA MAP: students with an ACCESS literacy score below 3.5 are excluded.

Students with Disabilities
 IEP status is differentiated by type of IEP. For example, the impact of a severe and profound disability is considered separately from the impact of a speech and language disability.
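The time-weighting rule for mobile students can be illustrated with a small sketch. The residuals and year fractions below are made up for illustration; the actual CPS weighting scheme may differ in detail:

```python
# Each student's value-added residual, weighted by the fraction of the
# school year spent in this school. All numbers are illustrative only.
students = [
    {"residual": 1.2,  "fraction_of_year": 1.0},   # full-year student
    {"residual": -0.4, "fraction_of_year": 0.5},   # mobile: half a year
    {"residual": 0.8,  "fraction_of_year": 0.25},  # mobile: one quarter
]

total_weight = sum(s["fraction_of_year"] for s in students)
school_score = sum(
    s["residual"] * s["fraction_of_year"] for s in students
) / total_weight

print(round(school_score, 3))
```

The full-year student influences the weighted score four times as much as the student who attended for one quarter.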

Value-Added Scores

Value-Added measures the difference between the growth of students for whom a school or teacher provided instruction and the growth of similar students across the District.

A positive score indicates a school or teacher whose students are growing at a faster pace than similar students. Zero (0) is the District average. A score near zero indicates a school or teacher whose students are growing at about the same pace as similar students.

A negative score indicates a school or teacher whose students are growing at a slower pace than similar students.


Standardization of Scores

 Growth is measured in scale score points (for NWEA, these are called “RIT” scores).

[Chart: Student A “grew” by 35 scale score points on a 200–240 scale.]

 However, one scale score point of growth is more difficult to obtain in some grade levels than in others.

 As a result, standardization is used to ensure that all Value-Added scores are on the same scale.


Standardization of Scores

 Standardization is a common statistical process. In this case, it is used to convert scale score points to a standard scale.

 The unit of measure is the “standard deviation,” which is a measure of distance from the mean.

‐ i.e., how much does School A’s score deviate from the mean?

 This places all scores on the same scale, allowing for more precise comparisons between scores at different grade levels.
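A minimal sketch of the standardization step, using Python’s standard library. The raw growth values are invented for illustration:

```python
import statistics

# Raw growth in scale score points for five schools in one grade level
# (illustrative values only).
raw_growth = [3.0, 5.0, 7.0, 9.0, 11.0]

mean = statistics.mean(raw_growth)
sd = statistics.pstdev(raw_growth)  # population standard deviation

# Standard-scale score: distance from the mean in standard deviations.
standardized = [(x - mean) / sd for x in raw_growth]

print([round(z, 2) for z in standardized])
```

After standardization the scores have mean 0 and standard deviation 1, so scores from different grade levels can be compared on the same scale.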


The Standard Scale

Features of the Standard Scale

 Zero (0) is the District average.

 About 68% of scores fall between -1 and 1.

 About 95% of scores fall between -2 and 2.

 About 99% of scores fall between -3 and 3.

 Only about 1% of scores are less than -3 or more than 3.

[Chart: normal curve divided into segments of 2.5%, 13.5%, 34%, 34%, 13.5%, and 2.5%.]
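The percentages above are the standard properties of the normal (bell-curve) distribution, and can be checked with a short calculation using the standard normal cumulative distribution function:

```python
from math import erf, sqrt

def phi(z: float) -> float:
    """Cumulative distribution function of the standard normal."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Share of scores within 1, 2, and 3 standard deviations of the mean.
for k in (1, 2, 3):
    share = phi(k) - phi(-k)
    print(f"within ±{k}: {share:.1%}")
```

This prints roughly 68.3%, 95.4%, and 99.7%, matching the approximations on the slide.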

Reading the Value-Added Reports (School-Level Report)

Value-Added Score

Percentile: This is the percent of scores that fall below this score. Percentiles range from 0th to 99th.

Number of Students in the calculation

Confidence Interval: This is explained in the next set of slides.

Confidence Intervals

 The Value-Added model controls for factors that CPS can measure, but there are some factors that cannot be measured, such as:

‐ Motivation to learn
‐ Family circumstances
‐ Health

 In addition, the Value-Added model is a statistical estimation of the school or teacher’s impact on student learning and therefore contains a certain amount of random error.

 For these reasons, the Value-Added model includes confidence intervals.

Real World Example: Political Polling

A political polling company surveys a representative random sample of 1,000 community households about how they plan to vote on Election Day. The question they pose is:

If the election were held today, for whom would you cast your ballot?

The responses break down as follows:

 Candidate Jones would receive 54% of the vote.
 Candidate Smith would receive 46% of the vote.
 There is a +/- 3% margin of error.


Confidence Intervals in Political Polling

With a margin of error of +/- 3%, the percentage of people who plan to vote for each candidate falls in the following ranges:

 Candidate Jones would receive between 51% and 57% of the vote.
 Candidate Smith would receive between 43% and 49% of the vote.

The confidence intervals do not overlap. Therefore the race is NOT “too close to call.” We can predict with a high degree of confidence that Candidate Jones will win the race.
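The overlap check described above can be written out explicitly, using the values from the polling example:

```python
# Poll results and margin of error from the example above.
margin = 0.03
jones = (0.54 - margin, 0.54 + margin)   # (0.51, 0.57)
smith = (0.46 - margin, 0.46 + margin)   # (0.43, 0.49)

# Two intervals overlap when each one starts before the other ends.
overlap = jones[0] <= smith[1] and smith[0] <= jones[1]

print("too close to call" if overlap
      else "Jones leads outside the margin of error")
```

Because Jones’s lower bound (51%) sits above Smith’s upper bound (49%), the intervals do not overlap and the lead is meaningful.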


Confidence Intervals in Value-Added

 A confidence interval is a range of scores around the Value-Added estimate.

Example:

 The Value-Added estimate is 1.0.
 The confidence interval is ± 0.3.
 The confidence interval range is from 0.7 to 1.3.
 The District average (0) is not in the confidence interval, so we are 95% confident that the school’s effectiveness is different from the average (above average in this example).

 We are 95% confident that the true Value-Added score falls within the confidence interval range.

 The confidence interval is “n”-dependent, meaning larger samples yield smaller confidence intervals.

‐ This is because in larger samples, a score that differs from the average is less likely to be due to random error alone.
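The “n”-dependence can be illustrated by computing a 95% interval as roughly the estimate ± 1.96 standard errors, where the standard error of a mean shrinks as 1/√n. The residual standard deviation below is a made-up value for illustration, not a CPS parameter:

```python
from math import sqrt

# Hypothetical values: a Value-Added estimate of 1.0 and a
# student-level residual standard deviation of 1.5.
estimate = 1.0
residual_sd = 1.5

# The 95% confidence interval narrows as the sample grows, because
# the standard error of the mean is sd / sqrt(n).
for n in (25, 100, 400):
    half_width = 1.96 * residual_sd / sqrt(n)
    print(f"n={n:>3}: {estimate - half_width:.2f} to {estimate + half_width:.2f}")
```

Quadrupling the number of students cuts the width of the interval in half, which is why scores for small classrooms carry wider intervals than school-level scores.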


Statistical Significance

 If the confidence interval does not include zero, we say that the score is statistically significant, meaning we are 95% confident that the score is different from zero.

 A color is associated with each score based on its statistical significance:

Green
• Entire confidence interval is above zero.
• Score is positive and statistically significant at the 95% confidence level.

Yellow
• Confidence interval includes zero.
• Score is not statistically different from zero at the 95% confidence level.

Red
• Entire confidence interval is below zero.
• Score is negative and statistically significant at the 95% confidence level.
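The color rule maps directly to a small function. This is a sketch of the decision rule, not CPS’s actual reporting code:

```python
def score_color(lower: float, upper: float) -> str:
    """Color a Value-Added score by whether its 95% confidence
    interval is entirely above, entirely below, or includes zero."""
    if lower > 0:
        return "green"   # positive and statistically significant
    if upper < 0:
        return "red"     # negative and statistically significant
    return "yellow"      # not statistically different from zero

print(score_color(0.7, 1.3))    # interval entirely above zero -> green
print(score_color(-1.9, 0.8))   # interval includes zero -> yellow
```

The two calls use intervals that appear elsewhere in this lesson: the 1.0 ± 0.3 example and the reading score with interval -1.9 to 0.8.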


How Confidence Intervals are Reported

This is how Value-Added scores are displayed in the school-level reports.

This school has a Value-Added score of -0.5 in reading (the score is ½ of a standard deviation below the District average). The confidence interval ranges from -1.9 to 0.8.

 Because the confidence interval includes zero, we say that this school’s score is not statistically different from zero at the 95% confidence level.

 For that reason, the bubble is yellow.


For More Information

More lessons and other resources for understanding Value-Added are available at: http://cps.edu/Pages/valueadded.aspx

 Lesson 2 (Part 2): Oak Tree Analogy
 Lesson 3: Technical Specifications of the Value-Added Regression Model