Does L.A. count its homeless, or make its best guess? A little of both, it turns out

Gospel musician Clemmie Williams was living homeless in North Hollywood Park in January, when a count of homeless people is overseen by the Los Angeles Homeless Services Authority.
(Genaro Molina / Los Angeles Times)

Keeping track of the number of people who are homeless in Los Angeles is an exercise in uncertainty.

Not only do the numbers change from year to year, presumably reflecting real shifts in the homeless population, but, once published, they can change months or years later — based not on actual changes in the population, but on changes in how it’s calculated.

So although the original number might hang around on the internet — especially in news articles — revised numbers pop up that contradict it. Casual searchers, beware.


In the latest change, the Los Angeles Homeless Services Authority, which conducts the annual count, has reduced the number it published for the city of Los Angeles last year by 615, giving a new total of 35,550.

Although the change does not affect the countywide totals reported to the U.S. Department of Housing and Urban Development, it highlights an uncomfortable truth about LAHSA’s annual reports: The much-discussed “counts” of homeless people, which are expressed with absolute precision, aren’t truly counts, but estimates, subject to statistical error.

Now, for the first time, LAHSA’s statisticians at USC have acknowledged that the numbers were imprecise by publishing a margin of error for the new city totals.

“I thought we really should be recording this error,” said Patricia St. Clair, statistical analysis director for the USC Homeless Count team.

What causes these numbers to change is the evolution of the statistical method USC uses to convert data collected by thousands of volunteers each January into the estimates that are loosely characterized as “counts.”

The volunteers record every tent and lean-to they see as well as every car, van or RV they deem to be occupied. They do not attempt to determine the number of people sharing those dwellings. That’s USC’s job.


The USC data team conducts a survey of about 4,000 homeless people across the county to learn the demographics of homelessness and to estimate how many people, on average, live in each type of dwelling. This year, for example, the survey found that tents held an average of 1.484 people, RVs and campers 1.702, and so on.
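That multiplier method amounts to simple arithmetic: multiply each dwelling category's tally by its survey-derived average occupancy, then sum. In this sketch the dwelling tallies are hypothetical; only the two per-dwelling averages come from this year's survey.

```python
# Sketch of the multiplier method: dwelling tallies (hypothetical here)
# times survey-derived average occupancy, summed into a people estimate.
dwelling_counts = {"tent": 1000, "rv_camper": 500}         # hypothetical tallies
people_per_dwelling = {"tent": 1.484, "rv_camper": 1.702}  # from the survey

estimated_people = sum(
    count * people_per_dwelling[kind]
    for kind, count in dwelling_counts.items()
)
print(round(estimated_people))  # 1,484 + 851 -> 2335 people
```

Because the averages come from a sample, the resulting total inherits their statistical error, which is why the published figure is an estimate rather than a count.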

Because those numbers are based on a sample — though an unusually large one — they are subject to statistical error, a fact that LAHSA has never acknowledged in its annual presentations, which report an exact number of how many people were “counted.”

Behind the scenes, USC’s Data Core, which won the homeless contract in 2017, has worked diligently to refine and improve an endeavor that, by its nature, cannot achieve perfect accuracy. USC’s statisticians have tracked down errors and tweaked their methodology without paying a lot of attention to how the changes might be perceived.

The new acknowledgment of the system’s frailties came in the form of side-by-side tables, which address numbers only for the city of L.A., not the entire county. They offer a new and more accurate picture of the uncertainty in counting the homeless population. They’re also dizzyingly hard to follow and don’t exactly reconcile with the data posted elsewhere on LAHSA’s website.

The tables show the estimated growth of homelessness from 2018 to 2019 by two different methods. This year’s methodology, applied retroactively, would show an increase in the homeless population of 16%. Using last year’s method, the increase would have been 20%.

Neither of those numbers is consistent with LAHSA’s 2019 report, still posted on the agency’s website. It says that the increase in the city was only 14%.


The reason for the discrepancies is buried in the 2018 count, for which there are now three numbers for the city’s homeless population, ranging from 29,937 to 31,285, depending on the methodology used.
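To see how one year of growth can be reported as 14%, 16% or 20%, it helps to run the percent-change arithmetic against the different bases. The pairings below are illustrative, not LAHSA's own reconciliation; the inputs are the article's extremes for the three 2018 city estimates (29,937 to 31,285) and the two published 2019 city totals (36,165 as originally published, 35,550 as revised).

```python
def pct_change(new, old):
    """Percent growth from old to new."""
    return (new - old) / old * 100

# 2018 bases range from 29,937 to 31,285; 2019 totals are
# 36,165 (original) and 35,550 (revised).
for old in (29937, 31285):
    for new in (36165, 35550):
        print(f"{old} -> {new}: {pct_change(new, old):.1f}%")
# Growth ranges from about 13.6% to 20.8% depending on the pairing.
```

The same underlying data can thus support several different growth figures, which is exactly the discrepancy the side-by-side tables expose.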

“That the numbers don’t all line up is OK because there is error in every number and it’s really a range,” St. Clair said. “For policy decisions I think you want the best numbers you can get, so that’s why we try to make them more precise.”

The adjustment that LAHSA just announced for 2019 came from a change in statistical reasoning. In calculating a margin of error for the first time, the USC team found that the accuracy of its estimates for the city could be improved.

Numbers for each of the city’s 15 council districts had been calculated from the survey responses within the district. The team concluded they would be more accurate if they were based on surveys from a wider area. The resulting change may have masked some of the differences among council districts, but produced a more accurate — and lower — number for the whole city. The margin of error dropped from more than 5% to less than 2%.

St. Clair said she was gratified to see that the range of numbers all fell within the margin of error, a rule-of-thumb way of saying they are really not different.
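That rule of thumb can be made concrete: treat each estimate as a range — the estimate plus or minus its margin of error — and call two estimates statistically indistinguishable if their ranges overlap. The sketch below applies the pre-revision margin of roughly 5% to the two extremes of the 2018 city figures; the function and the pairing are illustrative, not USC's actual test.

```python
def ranges_overlap(est_a, est_b, moe=0.05):
    """True if the intervals [est*(1-moe), est*(1+moe)] overlap."""
    lo_a, hi_a = est_a * (1 - moe), est_a * (1 + moe)
    lo_b, hi_b = est_b * (1 - moe), est_b * (1 + moe)
    return lo_a <= hi_b and lo_b <= hi_a

# The lowest and highest of the three 2018 city estimates, checked
# against the roughly 5% margin reported before the revision:
print(ranges_overlap(29937, 31285))  # True: within the margin of error
```

By this yardstick, the three 2018 figures describe the same underlying population, just measured with different methods.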

LAHSA has yet to publish a margin of error for the countywide total of 66,436, which was not affected by the recent change. One drawback to doing so would be that it, too, could give a false sense of accuracy.


As St. Clair and team leader Benjamin Henwood acknowledge, the statistical modeling is only one part, and possibly the smallest part, of the error in the annual “count.”

There’s no way of accounting for the mistakes that come from the annual outpouring of civic engagement that generates the raw numbers for USC’s analysis. Thousands of lightly trained volunteers fanning out over dark streets must make spur-of-the-moment judgments: Is someone living in the RV parked on a dark residential street? Are the three tarps strung side-by-side three lean-tos or one with three rooms?

“We’re aware of that,” St. Clair said.

“Obviously, there is kind of noise in the count every year,” Henwood said. “But the question is, ‘Would there be some systematic difference each year?’ Hopefully, if the method is relatively consistent, that’s less of a concern.”

In other words, hopefully the errors will cancel out.