COLUMN ONE : Waste Pile of Data on Pollution : Thousands of environmental test results are questionable--or wrong. Incompetence, fraud and poor testing methods contribute to the problem.

TIMES STAFF WRITER

For years Marvin and Anita Rudd were key players in the government’s efforts to make environmental decisions wisely.

The Bay Area couple provided $3-million worth of bottles used to collect soil and water samples at the nation’s most hazardous waste sites.

The bottles were supposed to be checked for contamination that could generate false readings in laboratory tests. But, according to a federal investigation, the Rudds did not perform the checks and for two years did not even own the necessary equipment.

The case casts doubt on the accuracy of thousands of lab tests and highlights a problem now coming into focus after two decades of massive testing required by environmental laws.

Each year the government spends an estimated $500 million on testing, and companies that discharge chemicals into the nation’s air, soil and waterways spend $5 billion more. But many test results are questionable, and many are wrong.

The result, scientists and government officials say, is a billion-dollar pile of painstakingly gathered misinformation.

The consequences are far-reaching. Test results are used to make important environmental decisions such as: How much pollution should be allowed in a river or a stream? When should a cleanup be ordered? Who should pay for it?

To make sure wrong data did not lead to wrong decisions, millions of dollars have been spent on retesting and new studies. Nonetheless, bad data may have saddled industry with unnecessary cleanup costs or put the environment at risk.

“There is uncertainty that we have overregulated, but there is just as much uncertainty that we’re under-regulating for some pollutants,” said Sam Luoma, a U.S. Geological Survey biologist.

Contributing to the bad data were:

* Fraud. To save money, many private laboratory operators take scientific shortcuts, then cover them up, said Charles Aschwanden, a lawyer for the Environmental Protection Agency’s National Enforcement Investigations Center. “Something strange happens when a chemist becomes an entrepreneur.”

In the past four years, six laboratories and 17 individuals doing work for the Superfund cleanup program involving the nation’s most toxic sites have been convicted of taking fraudulent shortcuts. About three dozen other labs are reported to be under investigation.

Despite the enforcement effort, the chances of laboratories getting away with fraud are “excellent,” Aschwanden said.

* Poor testing methods. The government, in approving test methods that are relatively easy to perform, has sacrificed precision. Experts say government-approved test methods often fail to detect significant levels of pollutants because the methods are not sensitive or flexible enough.

For example, labs monitoring the Kesterson National Wildlife Refuge gave no warning of a major selenium problem in the early 1980s because the government-approved tests for selenium did not measure the specific form of the element that was poisoning thousands of waterfowl.

* Incompetence. In proficiency tests, laboratories on which regulators depend for accurate information get the answer wrong 15% to 20% of the time, EPA statisticians say. Worse yet, such labs probably make far more errors when they do not know they are being reviewed, according to scientists and government officials.

Environmental officials said that even more mistakes take place in the field than in laboratories because samples are taken improperly and do not accurately reflect pollution levels and types.

“There are far more bad data out there than good data,” said Herbert Windom, professor of oceanography at Skidaway Institute of Oceanography near Savannah, Ga.

He recently demonstrated, along with other scientists, that 20 years’ worth of information gathered by the U.S. Geological Survey vastly overestimated the extent of metals pollution in the nation’s rivers.

“We just crank out a lot of numbers and we think we’re doing the real wise thing environmentally, and we’re not,” Windom said. “We’re just spending a lot of money.”

In a critical 1988 study of the EPA, the prestigious National Academy of Sciences concluded that the agency often failed to maintain data quality and “on occasion functioned badly” as a result. An EPA advisory committee in 1991 said samples often are collected improperly and inappropriate tests are often performed.

Peter Preuss, director of the EPA’s Office of Technology Transfer and Regulatory Support, said he is certain that bad data is being used to make decisions, but the agency has not assessed the unnecessary costs and cleanup delays that may have resulted. The EPA has “a long way to go” to improve data quality, Preuss said.

Incorrect data often does no harm. For example, a lab might understate the level of pollution at a junkyard, but the test results still might be high enough to require a cleanup.

On the other hand, accuracy is critical to some environmental decisions--and bad data has hampered important cleanup efforts and led to unnecessary scares, delays and spending.

A nationwide EPA study in the mid-1980s of ground water monitoring at 24 landfills, for example, found that bad sampling and laboratory analyses had prevented accurate definitions of the pattern of pollution, resulting in poor cleanup decisions.

In San Francisco Bay, poor data has left scientists and regulators bewildered about what pollution control steps to take beyond the billions of dollars already spent on improving water quality.

Researchers in the 1970s and early 1980s grossly overestimated some pollutants in the bay, said UC Santa Cruz environmental toxicologist Russell Flegal, making it impossible to determine how much cleanup efforts during the same period had improved water quality.

Waste-water treatment clearly has resulted in some improvements, but has it been worth the cost? “Did we need to spend $4 billion, or do we need to spend $4 billion more to have some impact at the aquatic organism level?” said Steve Ritchie, executive officer of the San Francisco Bay Regional Water Quality Control Board. “That’s what we don’t know.”

In the Chesapeake Bay, the mystery for years was whether nitrogen or phosphorus was most responsible for harming aquatic life.

The EPA’s test methods showed the culprit was phosphorus. But scientists such as Christopher D’Elia, director of the Maryland Sea Grant College research program, complained that the EPA’s methods were inaccurate, and that better methods would show nitrogen was to blame.

It was not until 1987 that the government acknowledged a mistake, which D’Elia says delayed cleanup of the bay for years.

On a much smaller scale, residents of the San Joaquin Valley town of Dos Palos got a scare two years ago when a private lab found very high levels of arsenic in drinking water.

Officials feared that residents complaining of flu-like symptoms were being poisoned. But other labs found that the first lab had overestimated arsenic by 50 times because of what a government investigation concluded was arsenic contamination inside the laboratory.

The same lab also reported a selenium problem in the California Aqueduct, touching off a 1990 scare that water carried to Southern California might not be safe to drink. A government inquiry later determined there was no selenium hazard.

Just the opposite problem occurred in 1988 near Mojave, where the state was trying to determine causes of a cancer cluster. The initial test badly underestimated dioxin at a smelting plant. This caused a 2 1/2-year delay in abandoning the site and aggravated a public health threat, state officials said. Dioxin levels turned out to be comparable to those that prompted evacuation of Times Beach, Mo.

Sometimes lab errors have merely wasted money.

The Envirite Corp. of Connecticut had to spend $1 million to “double-treat” its waste in 1989 and 1990 and pay a $60,000 fine because an EPA lab erroneously concluded the waste was hazardous. In court, a federal judge later condemned the EPA for hiding its mistake.

In the early 1980s, the U.S. Bureau of Reclamation spent $400,000 to find out how much selenium and heavy metals were in crops in the western San Joaquin Valley. But two labs assigned the task reached vastly different conclusions, and there were not enough suitable samples for the results to be rechecked by a third. “We didn’t get much out of it,” said John Fields, a quality assurance specialist for the bureau.

Errors have led the EPA to incorrectly classify some hazardous waste sites as potential Superfund sites, which industry must pay to clean up, said Larry Reed, director of site evaluations for Superfund.

Stakes are so high in cleanups, and errors so commonplace, that big companies with potentially large environmental liabilities increasingly test the labs they use, or hire consultants to audit labs’ work.

Environmental Standards Inc. of Valley Forge, Pa., a lab auditing firm, has found sampling or testing problems in all but one or two of more than 1,000 sets of data it has examined for polluted sites over a decade, said co-owner Rock Vitale. “It’s frightening,” he said.

Shell Oil Co. recently found that about half of the 24 private, commercial laboratories it uses failed simple tests for substances such as oil.

Pressures and Fraud

Polluters generally are responsible for monitoring themselves--and some are cheats, or they hire people who cheat.

In recent testimony to the Los Angeles County Grand Jury, environmental tester Earl Ortloff said he certified that hundreds of gasoline tanks under Southern California service stations were properly sealed, although he had never tested them.

The grand jury indicted the alleged owners of 150 Buy Rite Service Stations--Gary Lazar and Divine Grace Lazar--accusing them of hiring Ortloff to cover up badly leaking tanks. An investigator testified that soil and water contamination was severe around several of the stations, but not all the sites have been tested.

Carmen Trutanich, an attorney for Gary Lazar, said only that “it appears the Lazars have done things correctly.” An attorney for Grace Lazar did not return calls.

No one knows how big the fraud problem is, although the government in the last decade has won convictions against six labs in the Superfund program, another lab with a contract to monitor a landfill, and scores of companies and individuals who lied about the amount of pollution they discharged.

H. Brooks Griffin, an EPA assistant inspector general for investigations, said it appears that “every time we look (at a private lab in the Superfund program) we find a crook.”

A common reason for fraud is commercial pressure. Labs win government contracts by submitting low bids, and some labs cut corners to enhance profits.

At U.S. Testing of Hoboken, N.J., a lab convicted in connection with Superfund site testing, a vice president instructed employees to “use any method or shortcut” that would make it appear the required testing had been performed and that the company was entitled to be paid, the government charged.

Employees of another convicted lab, Metatrace, of Earth City, Mo., told investigators that they routinely altered data. This bad data, according to EPA grants administrators, led to “potentially life-threatening decisions,” such as how fast to clean up a polluted site.

But EPA quality-control officials say there usually is so much testing of each Superfund site that frauds do not lead to bad decisions. They acknowledge only that some frauds have resulted in costly delays and some retesting.

At Stringfellow Acid Pits, officials at Science Applications International Corp. of San Diego admitted that some employees were “time traveling”--or holding volatile organic samples longer than permitted under government contracts and hiding the practice. The practice risks evaporation that could produce low readings of pollutants, scientists say.

It cost the government $1 million to recheck the results, officials said.

The case against the Rudds, the Bay Area couple, also grew out of Superfund. Now divorced, they owned and operated the I-Chem laboratory, sole supplier of sample bottles for Superfund sites between 1983 and 1987.

Contracts required the bottles to be tested for contamination. But the government alleges that the Rudds did not obtain the testing instrument for two years, and, when they finally obtained the equipment, no one at the company was trained to operate it. At one point, the government says, I-Chem’s laboratory became contaminated and customers complained.

Marvin Rudd has pleaded not guilty to federal fraud charges, and his attorney said that neither he nor Rudd would have any comment.

Anita Rudd and I-Chem have pleaded guilty to a single count of filing a false claim for a $35,086 payment. The count charged that $3,456 worth of bottles were not tested. She faces up to five years in prison and a fine of up to $250,000. I-Chem faces a fine of up to $500,000.

In a statement to The Times before her guilty plea, Anita Rudd suggested that no harm had been done. She noted that companies using I-Chem’s bottles also were required to perform quality checks. “Accordingly, the chances are negligible that, if any I-Chem containers . . . were contaminated, such contamination would have gone undetected,” she said.

Test Methods Vary

The quality of EPA’s standard testing methods varies greatly, “sort of like recipes for bread,” said Nancy Wentworth, director of the agency’s quality assurance management staff. “Some make good bread. Some make doorstops.”

The EPA’s strategy has been to devise tests that are relatively easy to use. The trade-off, critics say, has been quality for quantity.

“That is the most insidious issue,” said James G. Sanders, director of the Academy of Natural Sciences’ Benedict Estuarine Research Laboratory on the Chesapeake Bay. “You end up with methods put into place primarily because they are (easily) reproducible . . . not because they’re good at measuring what’s there.”

Many tests cannot detect low levels of pollution. As a result, many labs report that pollutants cannot be detected when they may, in fact, be present at harmful levels.

Government computers are “loaded with such useless data” about the quality of the nation’s waterways, D’Elia, the Chesapeake Bay researcher, testified before a congressional committee last year. He cited a 1987 study in which labs found that more than half the samples taken from Chesapeake Bay tributaries had “undetectable” levels of trace metals suspected of polluting the bay.

Using such data to calculate the bay’s pollution is “futile,” said D’Elia.

Another major problem is that many test methods are not capable of spotting all the chemicals in heavily polluted samples.

Some government-approved methods are able to identify only one-tenth to one-third of the different kinds of pollutants in a sample, said Joe Lowry, chief chemist for the EPA’s National Enforcement Investigations Center.

A tragedy unfolded at Kesterson National Wildlife Refuge in the early 1980s because an EPA method was not sufficiently flexible to identify two varieties of the same pollutant in a sample.

Regulatory officials were monitoring selenium at Kesterson, but tests were geared only to detect one form of the element--selenite--which was present in small amounts. The tests could not detect another form, selenate, which was 10 times more prevalent. Regulators did not realize the magnitude of selenium poisoning at Kesterson until thousands of waterfowl and other wildlife had been mutilated or killed.

A decade later, some labs in California were still making the same mistake, said John Fields of the U.S. Bureau of Reclamation.

Simple clerical errors also lead to mistakes.

For about 15 years, a “miscalculation or typographical error” in the U.S. Geological Survey’s approved test for phosphorus pollution went undetected. The result, the USGS acknowledged this summer, was an underestimation of the amount of phosphorus--a fertilizer--in waterways.

Much of the old data is unsalvageable, David Rickert, the USGS’s water quality chief, wrote in a bulletin to data users.

Last year, the National Academy of Sciences found that government analyses were severely underestimating hydrocarbons--precursors to ozone, a key component of smog. Better measurements could require “a fundamental change in the nation’s ozone reduction strategy,” the academy concluded.

“Southern California has the most detailed monitoring network available anywhere, but even here the kinds of measurements that are needed are not being routinely made,” said John Seinfeld of Caltech, chairman of the academy panel.

Incompetent Labs

To gauge how good a job labs do in testing for pollutants in water, the government regularly gives them the scientific equivalent of open book tests. Labs are asked to use EPA-approved methods to analyze water samples spiked with certain pollutants.

Critics say that because the labs get to choose which pollutants they will measure, the tests amount to exercises in calibrating equipment.

Still, there are plenty of errors. California labs that test drinking water get more than half the answers wrong 11% of the time, and get all answers correct only 12% of the time, according to an analysis of records obtained under the Freedom of Information Act. Times Director of Computer Analysis Richard O’Reilly performed the analysis.

Labs that test soil do even worse. In recent tests on soil samples containing PCBs, 30% of participating California labs drastically underestimated the contamination. These labs reported levels that would not trigger cleanups, when PCBs were present at twice their legal limit.

Many labs would do far worse if they did not know they were being tested, scientists say. One lab in Albuquerque was off by 12% when it knew it was being tested, and off by 270% when the test was a surprise.

EPA scientists advocate surprise testing as the best way to assure good lab performance. But such testing costs more and is harder to conduct, and so is seldom used.

EPA officials are also considering proposals to accredit laboratories or analysts.

Meanwhile, the EPA is cautioning data users to beware.

The biggest problem is “uneducated users” who do not recognize the data’s shortcomings, said Phillip Ross, director of EPA’s environmental statistics and information division.

There are plenty of such users. Environmental data carries the cachet of science and is routinely relied on by government and health care workers, as well as engineers, lawyers and consultants.

“You’re forced into a situation of trusting the data,” said consultant Joel Hirschhorn, former head of hazardous waste for Congress’ Office of Technology Assessment.

Environmental Testing

Many federal, state and local environmental laws have been established to protect soil and water from pollution. At the heart of most laws is a system for detecting and monitoring contamination, testing the levels of pollution and assessing the need for remedial action. Typically it works like this: The government or owners of regulated businesses, such as factories, take samples of substances that are discharged into the environment or known to be in the ground.

The samples are sent to hundreds of private and government laboratories throughout the country and tested for a variety of pollutants, ranging from solvents to heavy metals.

Test results are forwarded to the proper government agency for evaluation. The data is used to decide important questions such as whether a site should be cleaned up and who should pay for it.

The problem is that much of the data is erroneous, resulting in unnecessary cleanup costs and delays in addressing environmental hazards.

Testing the Testers

Twice a year, the Environmental Protection Agency sends contaminated water samples to laboratories to check their ability to measure pollution. The labs know they are being tested and usually get to choose which contaminants they will measure.

More than 1,700 labs participate in the drinking water tests, more than 2,000 in the polluted water tests. They conducted 200,000 tests in 1991. The EPA says it expects labs to get 5% of the answers wrong, but in fact the labs were wrong much more often:

POLLUTED WATER TESTS

Substance tested for: % wrong
* Trace metals: 15%
* PCBs: 36%
* Pesticides: --
* Overall: 13%

DRINKING WATER TESTS

Substance tested for: % wrong
* Trace metals: 20%
* Insecticides: 18%
* Herbicides: 20%
* Overall: 18%
