COLUMN ONE: Protecting Human Guinea Pigs: Radiation tests raise the chilling question: Can it happen again? Probably not, experts say, citing a wave of reforms. But ethical gray areas remain whenever people experiment on people.

The experiment would have proceeded quietly had Ernest Prentice not put a stop to it. He thought it was immoral, maybe even illegal.

The concept was simple: A University of Nebraska pediatrician wanted to test human growth hormone on girls with Turner syndrome, a genetic defect that causes retardation, stunted sexual development and extreme shortness. Half the girls would get three hormone injections a week for 18 months. The other half, the “controls,” would get placebo shots of saline solution.

In the annals of scientific horrors--a legacy that includes the grotesque medical experiments of the Nazis, the deliberate failure of U.S. doctors to treat poor black men for syphilis and, now, shocking revelations about human radiation testing--this study, proposed in 1988, would hardly merit a footnote.

Nonetheless, it troubled Prentice. As vice chairman of the review board that approves all research at Nebraska, the cell biologist did not think the pediatrician’s plan complied with stringent federal regulations designed to protect child research volunteers. He was uncomfortable with the idea of children in the placebo group getting so many injections that would do them no good.

He urged his board to turn down the request. The experiment was canned before it began.

At the same time the Nebraska experiment was scuttled, an ethics panel at the National Institutes of Health in Bethesda, Md., confronted the same decision but reached a very different conclusion. The NIH approved the human growth hormone experiment, placebo injections and all, saying it did not pose a danger to the children. The study is under way.

Today, Prentice remains dumbfounded by the ruling.

“What it comes down to,” he said, “is can you ask kids, particularly kids who are retarded, to have hundreds of injections, which are obviously painful, for benefit to society? There’s an ethical question and a regulatory question and my answer to both is no.”

*

The Nebraska controversy illustrates the delicate, and extremely subjective, ethical choices that accompany any experiment in which people are the subjects. Yet the fact that the debate occurred--and that Prentice had the authority to halt a study he believed was improper--is testimony to dramatic advances made in the past two decades to protect Americans who serve as guinea pigs in thousands of scientific studies each year.

While the nation confronts the chilling disclosures of Cold War radiation testing conducted on children, prisoners and others without their knowledge, many are shaking their heads and asking the obvious question: Could this happen today?

The answer, according to medical ethicists, is probably not--at least not easily, or in as egregious a fashion. Clearly, gray areas exist in interpreting ethical codes. And there still is room for abuse. Although scientists might be able to dance around the edges of ethics, experts say that moral breaches on a grand scale are unlikely.

No longer is science left up to the conscience of individual scientists. After a series of scandals that broke in the mid-1970s--including the radiation tests, which first became public then, albeit in much scantier detail than now--a web of federal regulations and ethics boards was established to govern nearly every research endeavor in the nation.

Scientists who flout the rules may lose their privilege to conduct future research, have their funding slashed and, if the violation is severe enough, face criminal prosecution. “We’ve already had our public response (to the radiation experiments),” said Robert J. Levine, a medical ethicist at Yale University, “and we have taken corrective action.”

Even so, as a congressional inquiry into the radiation experiments began last week, Sen. Edward M. Kennedy (D-Mass.), who is chairing the hearings, indicated that further legislation may be necessary, particularly to protect those who were often victimized--children, pregnant women, the mentally retarded and prison inmates.

Already, experiments involving these groups come under especially intense scrutiny. Research on prisoners is an area with vast potential for abuse, as Oregon and Washington scientists demonstrated in the 1960s, when they paid inmates $5 a month to have their testicles dipped in irradiated water.

Prison experimentation is now so tightly regulated that it rarely occurs. Many states, including California, ban the practice. And in a curious backlash, some AIDS-infected inmates have called for a loosening of the rules so they can participate in tests of experimental drugs.

“People say ethics is fuzzy and that it doesn’t get us anywhere,” said Arthur L. Caplan, a University of Minnesota ethics professor and author of “When Medicine Went Mad: Bioethics and the Holocaust.” “If you want to see a place where ethics has gotten somewhere, human experimentation is a good place to look. I’m not going to tell you that everybody comes to the right answer and that everything is rosy, but it is much better.”

Even Dr. Neal Barnard, president of the Physicians Committee for Responsible Medicine, which frequently criticizes biomedical research, concurs--up to a point.

“We have come a long way, there’s no doubt about that,” Barnard said. “We have rooted out every egregious abuse that we could find. But what we are left with are those abuses that we don’t find, because they are done by the military and are secret for that reason. And we haven’t rooted out the abuses that are subtle, that have not yet become the substance of ethical discussions.”

Science, however, did not accomplish this transformation on its own. It took a sordid history, public outrage, a crusading anesthesiologist and, ultimately, an act of Congress to do it.

*

There is inherent tension in biomedical research; each study is like a tug of war, with the researcher’s quest for knowledge on one side and the rights of the subjects on the other. Medical literature is punctuated with cases in which the rope got pulled too far in the wrong direction.

“Human curiosity is such a powerful force,” Barnard said. “It is the driving force behind science. The history of science is the history of curiosity often running amok.”

Moral matters often shape modern science. Take the case of AZT. When government researchers first tested this drug on AIDS patients, they pulled the plug on the experiment when they discovered that the medicine seemed to be working--it would have been unethical to withhold the drug from the patients who were getting a placebo. Doctors would no doubt have learned much more had the study continued, but as is often the case, the research had to be stopped before they answered all their questions.

The most basic principle of medical ethics--and the one most often violated when science does run amok--is that of “informed consent.” The idea is simple: Researchers must obtain the consent of the people who participate in their experiments, and inform them of the risks.

This concept--clearly ignored by some scientists who conducted the radiation research--dates back at least to the turn of the century, to a series of classic experiments performed by Army Maj. Walter Reed.

In 1900, Reed, a physician, was asked to head a commission to investigate the cause of yellow fever, which killed thousands every year. He suspected mosquitoes played a role in transmitting the deadly tropical disease, and so he asked his soldiers if they would permit themselves to be bitten by insects that had bitten yellow fever patients.

“The soldiers were told they could die from it, and at least one did,” said James Whorton, a medical historian at the University of Washington. “This was a very unusual situation. You were intentionally giving somebody a disease which had no treatment. That stirred doctors’ consciences.”

The experiment proved that mosquitoes do carry yellow fever, and it won Reed a page in the history books. Yet it would never have been permitted today. Nor, for that matter, would some other famous experiments, including Jonas Salk’s pioneering research on a polio vaccine, in which thousands of schoolchildren volunteered to be injected with a killed form of the virus before tests had proved it was safe.

Moreover, by current standards, Reed’s methods of obtaining consent were crude at best. Says Whorton: “It’s never been clear to me how much pressure there might have been from the Army for people to volunteer.”

It was not until four decades later, when Nazi doctors went on trial at Nuremberg, that the notion of informed consent was formalized. The trial disclosed the grisly medical legacy of the Third Reich: To study hypothermia, doctors dumped concentration camp prisoners in icy vats of water and watched them freeze to death. To see what would happen to pilots at high altitude, they sealed inmates in low-pressure chambers to simulate sudden decompression; the subjects’ lungs exploded. They shot prisoners with different bullets to examine the wounds.

The Nazis advanced three defenses: They said they were justified because the subjects were doomed to die, because the experiments were necessary for national defense, and because the participants were a drain on society.

“Those arguments all got rejected in 1947 when the Nuremberg tribunal issued the Nuremberg Code,” Caplan said. “So when somebody says that the ethics were different in 1950 (when many of the U.S. radiation tests took place) than they were in 1994, forget it. We knew exactly what the Germans had argued at the trial and we knew exactly what we had put forward as a response, which was to say that none of those arguments were persuasive.”

Still, serious abuses continued long after Nuremberg. Many were documented by Dr. Henry K. Beecher, a Harvard Medical School anesthesiologist who wrote a landmark 1966 article in the New England Journal of Medicine outlining 22 of what Caplan calls “the most miserable, rotten experiments” that took place after World War II. All endangered subjects without their knowledge.

Beecher’s revelations--including one study in which penicillin was intentionally withheld from military men who had rheumatic fever and another in which healthy babies were X-rayed so doctors could study their bladders--rattled the American medical community. But it was not until the 1970s that the issue of ethics in medicine exploded, with a series of newspaper and television exposés.

One of the most notorious cases was cited in Beecher’s article: In 1955, at the Willowbrook State School in Staten Island, N.Y., mentally retarded children were deliberately infected with hepatitis, without their knowledge, as part of an experiment to develop a vaccine. At the Jewish Chronic Disease Hospital in New York, terminally ill elderly patients were not informed that they were being injected with live cancer cells.

But the experiment that outraged Americans more than any other was the “Tuskegee Study of Untreated Syphilis in the Negro Male,” a breach of ethics that lasted 40 years.

In 1932, doctors for the U.S. Public Health Service set out to research syphilis in 400 poor, illiterate black men in Tuskegee, Ala. They lured their subjects with the false promise of free therapy; among the “treatments” were spinal taps with no anesthesia.

Most horrifying was that these men were denied penicillin--after it became clear in the 1940s that the antibiotic would have cured them--so the researchers could study the natural progression of the disease. This went on until July, 1972, when a whistle-blower blew the lid off the experiment.

In 1974, the disclosures prompted Congress to create the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. After countless public hearings, the panel issued recommendations that became federal regulations governing research paid for with taxpayer dollars.

Although the Department of Defense, which underwrote many of the radiation experiments, now must comply with those rules, the agency initially resisted and operated under its own guidelines until 1991.

“They said: ‘Look, a lot of our research has to do with classified stuff and, therefore, how can we put it under public view the way biomedical research has to be done?’ ” said Albert R. Jonsen, an ethicist at the University of Washington and commission member. “And we essentially said: ‘Well, you’ve got to do it.’ ”

The commission changed the culture of medical research in two ways. First, scientists could no longer simply claim that their work would benefit society; the research had to benefit the subjects involved.

Second, all research would be subject to the approval of a so-called Institutional Review Board, such as the one Prentice serves on at Nebraska. Today, there are thousands of these “IRBs” across the nation, from the tiniest community hospitals to the best-known universities to huge federal agencies.

*

But no system is fail-safe, and these boards are only as diligent as the people who serve on them. At a hearing held in a Boston suburb Thursday, Kennedy said he is considering a recommendation made by Dr. Kenneth Ryan, a Harvard University ethicist, to institute site visits by federal regulators to review boards that sign off on research.

Scientists can flout the rules--and some have--though not without consequences. As Caplan said: “There’s not an ethics cop at every lab.”

Consider the case of Martin J. Cline, a UCLA professor of medical oncology who in 1980 proposed a gene therapy experiment on people with beta-thalassemia, a genetic blood disorder. At the time, gene therapy was highly controversial, and the review board at UCLA turned down Cline’s proposal, telling him to conduct further studies on animals first.

So Cline went to Italy and Israel and performed the experiment--unsuccessfully--on patients there. The UCLA board insisted that Cline needed its approval to conduct the research, even overseas, and the National Institutes of Health disciplined the researcher by requiring him to undergo a more cumbersome review when he applied for grant money.

The episode, Cline said, derailed his career. He abandoned gene therapy, a field of science that today is booming. Instead, he researches leukemia.

“Basically, it was very difficult to retain a laboratory, to retain funding,” Cline said. He added that he believes his experiment was ethical, because the participants were fully informed and the research was intended to benefit them.

But as the Nebraska case and others illustrate, once bottom-line questions such as informed consent are settled, the business of ethics can grow murky. One scientist’s honorable pursuit is another’s dance with the devil.

At the NIH, controversy continues to rage over experiments with human growth hormone. In addition to the Turner syndrome studies, NIH scientists are now testing the hormone on so-called “short-stature” children who have no medical problems but are simply short.

This has provoked an intense ethical debate, as well as a lawsuit. In the wake of the suit, which claims that the hormones could expose healthy youngsters to the risk of disease, the NIH temporarily suspended the research and appointed an independent panel to investigate. The panel found nothing wrong, and the studies resumed in June.

“This is outrageous,” said Jeremy Rifkin of the Washington-based Foundation for Economic Trends, which brought the lawsuit. “Everyone is talking about these radiation experiments. . . . Why is this any different?”

Questioning Experiments

A University of Nebraska review board turned down a study involving human growth hormone for mentally retarded girls who have a genetic defect called Turner syndrome. Although the research was approved by the National Institutes of Health, Nebraska rejected it after considering the following questions, raised by federal regulations adopted in the 1970s to protect people who serve as human guinea pigs in medical experiments. Here are the university’s answers:

* 1: Does the research involve only minimal risk?

Answer: No.

* 2: Does the research involve greater than minimal risk but offer direct benefit to the subjects?

Answer: No, because girls who got injections of placebo would not benefit.

* 3: If the research offers no direct benefit, is it likely to benefit society by yielding general knowledge about the disorder being studied?

Answer: Yes.

* 3A: If the above answer is yes, does the research pose only a minor increase over minimal risk?

Answer: No.

* 3B: Are the procedures involved in the experiment reasonably similar to what the subjects would undergo in their actual medical treatment?

Answer: No.

* 3C: Is the experiment of vital importance to understanding the disorder?

Answer: No.
