
Shortchanging Basic Science: Why Young Scientists Are Discouraged

It’s impossible to predict any payoff from pure science, but based on later applications like genetic engineering, life is better for it.

Barry G. Hall is a professor of biology at the University of Rochester.

If President George Bush’s science budget proposal for the coming fiscal year survives congressional scrutiny intact, taxpayers will be supporting basic science to the tune of $8.5 billion, inarguably a lot of money. The National Science Foundation, the agency financing much of the basic research performed in this country, will see its first major real-dollar increase in several years.

Still, these are not fat times for basic science. Probably only the top 10% to 12% of projects approved on the basis of scientific merit will ever receive grant money; as recently as 1979, the figure was closer to 50%. Researchers at the top of their fields are finding themselves cut off from funds, and promising young scientists’ careers are withering.

Taxpayers may wonder, however, why they should support basic research at all when there are so many urgent claims on public funds--learning how to protect our environment from destruction, for example, or developing alternative energy sources, or finding cures for cancer, kidney disease or AIDS.


Failure to put tax dollars into basic research, which investigates the nature of life and the universe without any particular application in mind, is shortsighted. It would jeopardize our future, harming us not just in some gauzily abstract sense, but in vividly understandable ways.

Imagine how different our world would be without X-rays, a technology whose origins depended upon basic research. X-rays have not only revolutionized medical diagnostics, they are also used to judge the integrity of structures that support buildings and bridges and those that hold airplanes together. Our lives literally depend upon being able to see through and into solid objects.

Physicist Wilhelm Roentgen, who discovered X-rays in 1895, certainly wasn’t looking for them when he was experimenting with vacuum tubes. He noticed that something invisible coming from the tubes caused a nearby screen to glow, and that this “something” apparently could travel through a cardboard box separating the tube from the screen. It didn’t take long after that to develop more efficient X-ray tubes, and to demonstrate that the mysterious rays could penetrate solid objects and leave images on photographic film.

Within a short time, X-ray images of the nails inside a solid block of wood were available, and the business of applied research into X-rays began. The point is that this seemingly accidental discovery was not an accident. It was the product of a careful investigation of a part of the nature of the universe that interested Roentgen at the time. Indeed, it is highly unlikely that any deliberate search for a way to see into solid objects would ever have been undertaken, because people of that era never even entertained the concept.

This is one crucial reason to fund basic research: You can’t look for a practical solution until you believe that it is possible for one to exist. On the other hand, you can look at all sorts of things out of pure curiosity.

Since the practical applications of X-rays were almost instantly appreciated, they are perhaps not typical of how basic research pays off in unexpected ways. But consider two other examples--lasers and genetic engineering.


Lasers today are used in hundreds of industrial applications, plus a few familiar to nearly everyone. They read compact discs to provide the cleanest sound available; they perform microsurgery with less tissue damage than conventional surgery; and they record purchases at the supermarket by scanning bar codes. Yet the building blocks of knowledge used in the construction of lasers lay around for 20 years before anyone saw how they might be put together.

The genetic engineering story is similar. Today, biotechnologists make medically important hormones such as insulin and human growth hormone in quantities not previously possible. Gene-replacement therapy for genetically determined diseases is already being attempted on patients, and genetic engineering is speeding up crop improvements. All these applications depend on one thing: the ability to precisely cut apart genes on a strand of DNA, rearrange them and put them back together. Gene splicing, in turn, depends on molecular tools called restriction enzymes--tools as essential to the biotechnology revolution as were machine tools to the Industrial Revolution.

Like an enormous jigsaw puzzle, the picture of how restriction enzymes work was pieced together over many years by different scientists who had no idea their work would clear the path for a breakthrough in insulin production that happened in 1976. The Lilly Company, producers of insulin, anticipated that traditional methods of producing the hormone from the pancreas of pigs or cattle would be insufficient to keep up with worldwide demand from diabetics. Building on the work of three Nobel laureates--S.E. Luria, Werner Arber and Hamilton Smith--scientists encouraged by Lilly successfully isolated the gene for human insulin and spliced it into bacterial DNA, so turning bacteria into tiny factories for making human insulin.

Despite these examples of basic research producing tangible rewards, some may still wonder whether it wouldn’t be better to have universities pick up the tab, as they did up through the early 20th Century, rather than the government.

But universities have become increasingly unable to foot the bill. Especially since World War II, pushing back the frontiers of knowledge has come to depend on expensive, high-technology resources that are beyond the means of any one institution to support.

Given current fiscal pressures, some may well ask whether basic research isn’t a luxury that we should put on hold for a while. It is not. Taken case by case, any specific bit of basic research is unlikely to yield practical dividends; but taken together, it is almost certain that some of the work being done at any instant will indeed pay off.


I remember, as a child, reading Pearl Buck’s novels about China, and her account of what peasant farmers did when there was a famine. Although their families were starving, they never ate the last of the available grain. Instead, they sealed it into a jar as seed for the future, locked the door and left as refugees hoping to stay alive until the famine was over. Nobody was such a barbarian that he would break into a hut and take the seed because that grain was the future.

No matter how strapped we feel by budget crises, we must set aside money for basic research. Without it there is no future development beyond the limits of our current poor vision.
