How the teachers were evaluated

The Times used a statistical approach called value-added to estimate the effectiveness of teachers in Los Angeles schools. The approach has been around since the 1970s but has recently grown more popular -- and controversial -- as school districts have increasingly used it to evaluate teachers.

What is value-added analysis?

Value-added estimates a teacher’s effectiveness by looking at the test scores of his or her students. Each student’s past test performance is used to project that student’s performance in the future. The difference between the child’s actual and projected results is the estimated “value” the teacher added or subtracted during the year. A teacher’s rating reflects the average of those results across a statistically reliable number of students.
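
To make the arithmetic concrete, here is a minimal sketch in Python using made-up scores and a deliberately simplified projection -- a straight-line fit on a single prior-year score. The Times’ actual model is more elaborate and is described in its technical paper; the teacher names and scores below are hypothetical.

```python
# A simplified value-added calculation on hypothetical data.
import numpy as np

# Hypothetical records: (teacher_id, prior_year_score, current_year_score)
records = [
    ("T1", 310, 330), ("T1", 280, 305), ("T1", 350, 360),
    ("T2", 300, 295), ("T2", 340, 330), ("T2", 290, 292),
]

prior = np.array([r[1] for r in records], dtype=float)
current = np.array([r[2] for r in records], dtype=float)

# Step 1: project each student's current score from past performance
# (here, a least-squares fit on the prior-year score alone).
slope, intercept = np.polyfit(prior, current, 1)
projected = slope * prior + intercept

# Step 2: actual minus projected is the estimated "value" added
# or subtracted for that student during the year.
residuals = current - projected

# Step 3: a teacher's rating is the average of those differences
# across his or her students.
for teacher in sorted({r[0] for r in records}):
    mask = np.array([r[0] == teacher for r in records])
    print(teacher, round(residuals[mask].mean(), 1))
```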

Will a teacher’s score be affected by low-performing students, English-language learners or other students with challenges?

No. Because value-added measures the growth of each child compared with that child’s own past performance, it levels the playing field between high- and low-achieving students.

Aren’t the standardized tests on which value-added is based flawed?

No achievement test is perfect, and many produce raw scores that reflect socioeconomic background more than classroom learning. But because value-added compares students to themselves in previous years, rather than to other students with different backgrounds, it overcomes one of the major flaws with current uses of raw achievement scores.

What if a student has a bad year due to behavioral problems or difficulties at home? Would the teacher’s score suffer?

Probably not. The Times developed scores only for teachers who had taught 60 or more students, so no single student’s test scores should dramatically change a teacher’s ranking.
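
As a small illustration, with hypothetical numbers, of why averaging over 60 students damps the effect of any one student’s bad year:

```python
# Why a 60-student minimum matters: one extreme result barely moves the average.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "actual minus projected" differences for 60 students,
# centered near zero with typical year-to-year noise.
residuals = rng.normal(loc=0.0, scale=10.0, size=60)
baseline = residuals.mean()

# Suppose one student has a very bad year: 40 points below projection.
residuals_with_outlier = residuals.copy()
residuals_with_outlier[0] = -40.0
shifted = residuals_with_outlier.mean()

# With 60 students, the teacher's average shifts only slightly.
print(round(baseline, 2), round(shifted, 2), round(shifted - baseline, 2))
```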

Why did you decide to publish teachers’ scores?

Teachers are the single most important school-related factor in a child’s education, but until now, parents had little objective information about instructors’ effectiveness. Los Angeles Unified has had the data to estimate teacher effectiveness for years but has not done so.

What are some of the limitations of the value-added approach?

Scholars continue to debate the reliability of various statistical models used for value-added estimates. Each has an inherent error rate that is difficult to measure. Value-added estimates may be influenced by students’ not being randomly assigned to classes, or by students’ moving from class to class during a single year. Likewise, they could be misleading for teachers who team-teach. Even many critics of the approach, however, say value-added is a vast improvement on the current evaluation system, in which principals make subjective judgments based on brief pre-announced classroom visits every few years.

Test scores notoriously drop in third grade. Does that hurt the value-added score of third-grade teachers?

No. By first ranking teachers relative to their grade-level peers, the approach controls for grade-level differences in the state test. Although third-graders tend to score lower on the California Standards Test compared with second- and fourth-graders statewide, this does not result in a disadvantage for third-grade teachers.
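
Here is a rough sketch, with hypothetical estimates, of how comparing teachers only with same-grade peers keeps a grade-wide dip in raw scores from counting against them. The percentile conversion is an illustrative choice, not the Times’ exact procedure.

```python
# Ranking teachers only against same-grade peers, on hypothetical estimates.
import numpy as np

# Hypothetical value-added estimates (average gains vs. projection) by grade.
# Third-grade numbers run lower across the board in this made-up example,
# mimicking a grade-wide dip in the state test.
by_grade = {
    3: np.array([-8.0, -5.0, -2.0, -6.0]),
    4: np.array([2.0, 5.0, 1.0, 4.0]),
}

# Converting each estimate to a percentile among same-grade peers means a
# third-grade teacher is compared only with other third-grade teachers,
# so the grade-wide dip does not drag down his or her ranking.
for grade, scores in by_grade.items():
    ranks = scores.argsort().argsort()               # 0 = lowest in grade
    percentiles = 100.0 * ranks / (len(scores) - 1)  # 0-100 within grade
    print(grade, percentiles)
```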

Can value-added be used to evaluate all teachers?

No. It can only be used for teachers whose subjects are tested annually -- in this case, English and math. Some districts have accounted for this limitation by giving teachers of non-tested subjects a score based on schoolwide gains, but it is generally a very small factor in their overall evaluations.

Why does the database contain only third- through fifth-grade teachers in L.A.?

The Times first analyzed elementary school teachers in L.A. Unified, the nation’s second largest school district. In coming months, The Times plans to publish similar rankings for the district’s math and English teachers in higher grades. Statewide scores and scores for other school districts are not currently available.

Does this tell me anything about how L.A. Unified compares to other districts?

No. The approach allows for relative comparisons across the district but is not an absolute measure of performance.

Do you need a doctorate to understand the statistics behind the value-added approach?

Not to understand the basic principles, but perhaps to grasp the more technical points. This “black box” problem is a concern to many opponents -- and proponents -- of value-added. The Times has posted a technical paper online at latimes.com/media/acrobat/2010-08/55538493.pdf showing the details of our methodology and findings. We will be posting additional information that explains the methodology for a lay audience.
