COLUMN ONE: Sleuths on the Trail of Weather: The need to understand global warming has made meteorology a growth industry. But significant gaps in data are limiting forecasters’ abilities.

TIMES STAFF WRITER

It is not a great deal bigger than a dishwasher, an ordinary-looking box no more impressive than its own packing crate. It would fit nicely into an empty corner and make a good perch for a potted plant or--in the old days--an ashtray.

But looks are deceiving.

The unimpressive black cube, to be delivered to a government laboratory at Princeton University in two months, is in fact a new Cray Y-MP supercomputer. It can perform a billion mathematical computations in one second and solve millions of equations in mere minutes, tasks that used to take computers years.

The trouble is, for the government laboratory that ordered it, even the Cray Y-MP is--in a manner of speaking--obsolete. Jerry D. Mahlman, director of the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Lab here, said: “We are already looking toward machines that will give us a trillion operations a second.”

What goes on here? What are Mahlman and his associates up to when the fastest new supercomputer is overwhelmed before it is uncrated?

The answer is that Mahlman and hundreds of other climate experts in laboratories around the world are engaged in one of the most enormous and complicated tasks scientists have ever attempted.

They are trying to determine if Earth is becoming dangerously warm.

Many scientists fear that man-made carbon dioxide and methane are accumulating in the atmosphere and, like the glass walls of a greenhouse, trapping heat near the ground. It is a concern that has transformed weather forecasting from an obscure combination of art and science into a growth industry.

Preliminary reports to the United Nations’ Intergovernmental Panel on Climate Change, which recently met for three days in Washington, indicated a growing consensus that Earth’s temperature will rise 1.5 degrees to 4.5 degrees centigrade by the middle of the next century unless steps are taken to reduce such “greenhouse gas” emissions.

The potential repercussions of overall warming--rising ocean levels as the polar ice caps melt and a northward shift of the Northern Hemisphere’s prime agricultural zone--are the subject of furious debate, and of international decisions of a complexity never before undertaken.

“Maybe, a decade from now, we will be able to say whether continental dryness, such as we experienced in the summer of 1988, will be very probable in the future,” Mahlman said. “Or maybe we will have found that is not the case at all.”

To determine whether, when and how much the Pacific Ocean will rise at Santa Monica and whether the farming conditions of the Corn Belt will move up to North Dakota, scientists must:

--Learn from the past. To predict the future, meteorologists must understand what has come before. They are studying everything from Antarctic ice cores to tax records from ancient Egypt.

--Understand clouds. Perhaps more than any other variable, they affect Earth’s temperature.

--Study the oceans. They moderate temperature swings--potentially good news for the Southern Hemisphere.

Mahlman’s laboratory on the Princeton campus is one of eight centers--four are in the United States--that are trying to unravel these mysteries. They are on the cutting edge of an unprecedented scientific enterprise: creating three-dimensional mathematical grids around the planet in an effort to understand how the oceans and the atmosphere will respond to a continued outpouring of carbon dioxide and methane.

Based on current models, Mahlman has put the major potential effects of doubling the amount of atmospheric carbon dioxide into the four categories--possible, probable, very probable and nearly certain.

He considers global warming, increased precipitation worldwide, a warming of the northern polar region and a reduction of sea ice all very probable--a likelihood greater than 90%. In the probable, or 80%, category, he places increased precipitation in the Northern Hemisphere, drought in North America and a rise in the average sea level. In the possible category he places regional changes in vegetation cover and more tropical storms.

Learning from the Past

Computer technology is not all that limits these scientists’ ability to forecast the so-called greenhouse effect.

There are vast gaps in the raw information that must go into the global models. International science is not only collecting contemporary data on an unprecedented scale, from the ocean depths to the outer limits of the atmosphere, but also trying to reconstruct the ebb, flow and pulsation of climate over the last 160,000 years.

What researchers have accomplished so far is the scientific equivalent of ingenious detective work.

Joseph Fletcher, now director of NOAA’s Environmental Research Laboratories, found priceless historic readings of ocean temperatures in British Admiralty records. He spent years compiling temperature readings, scrupulously taken by sea captains at their boilers’ intakes, and produced a landmark collection of evidence for oceanographers and modelers.

Equally imaginative sleuths have found fish harvest records in Peruvian monasteries that can be used to infer fluctuations in ocean temperatures. Others have examined fossils, studied core samples from trees, combed Egyptian tax records for clues to flooding in the Nile Valley and analyzed pollen left behind by ancient forests.

For more recent evidence in the United States, the search has taken scientists into county and state archives for records of water levels in aquifers and wells during extended droughts. They have also gleaned information from records of debates over Colorado River water rights.

“We just must have a better record of the last 1,000 years, and a very much better record of the last 200,” said John Eddy, director of interdisciplinary Earth studies at the University Corp. for Atmospheric Research.

So far, climate history’s nearest approximation of a Rosetta Stone is a 2,000-meter ice core extracted during the 1980s from the Antarctic ice sheet at the Soviet Union’s Vostok Station.

Because it contains bubbles of air trapped as long as 160,000 years ago and across the intervening ice ages, the Vostok core has enabled scientists to establish a link between carbon dioxide and methane levels and surface temperature.

This stark, frozen record shows that carbon dioxide content increased from 180 parts per million during the last great Ice Age (about 160,000 years ago) to 280 p.p.m. before the Industrial Revolution in the 18th Century. That level has reached some 350 p.p.m. and is rising at a rate of about 0.4% a year.
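
As a rough illustration of what those figures imply, the sketch below (not part of the article, and it assumes the 0.4%-a-year growth rate simply holds steady) compounds the present concentration forward and estimates when the pre-industrial level of 280 p.p.m. would be doubled, the scenario Mahlman’s categories address.

```python
# Back-of-the-envelope sketch, not from the article: compound the CO2
# concentration forward from roughly 350 parts per million at about
# 0.4% a year, and estimate when the pre-industrial level of 280 p.p.m.
# would be doubled (560 p.p.m.). The constant growth rate is an assumption.

import math

PREINDUSTRIAL_PPM = 280.0   # before the Industrial Revolution (from the article)
CURRENT_PPM = 350.0         # approximate present level cited in the article
GROWTH_RATE = 0.004         # about 0.4% a year

def ppm_after(years, start=CURRENT_PPM, rate=GROWTH_RATE):
    """Concentration after a given number of years of steady compound growth."""
    return start * (1.0 + rate) ** years

# Years until the pre-industrial concentration has doubled.
target = 2 * PREINDUSTRIAL_PPM
years_to_double = math.log(target / CURRENT_PPM) / math.log(1.0 + GROWTH_RATE)

print(f"CO2 in 50 years:  {ppm_after(50):.0f} p.p.m.")
print(f"CO2 in 100 years: {ppm_after(100):.0f} p.p.m.")
print(f"Years until doubled pre-industrial CO2 (560 p.p.m.): {years_to_double:.0f}")
```

Under that simple assumption, the concentration passes 500 p.p.m. in roughly a century and doubles the pre-industrial level in a little under 120 years.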

Other core samples have been taken from the Himalayas in Tibet, from the Peruvian Andes and from the ocean floor. Cores taken in the mountains of Tibet by Lonnie Thompson of Ohio State University’s Byrd Polar Research Center indicate that temperatures have been warmer during the last 50 years than at any time since the end of the last Ice Age, about 10,000 years ago.

Now, under the sponsorship of the National Science Foundation, an American team led by University of New Hampshire scientists is working to extract a 3,000-meter core sample from the interior of Greenland. Only a few miles from the site, Danish and Swiss scientists are leading another coring project.

Clouds and Oceans

The data gaps that truly bedevil the model makers, however, are not the vagaries of climate history but a lack of the information needed to fathom climatic feedback processes and the contributions of oceans and clouds to warming and cooling.

The No. 1 priority, scientists generally agree, is to learn more about clouds. This is where the scarcity of information is most profound.

The broader questions begging to be answered, said Gordon J. MacDonald, a former chairman of the U.S. Council on Environmental Quality who began studying global warming nearly 20 years ago, concern the balance of cloud types and the effect of climate warming on their formation and distribution.

Although it is established that clouds have an overall cooling effect by reflecting sunlight, they are not yet adequately represented in global climate models. Because clouds of all types are so small and so widely scattered, scientists have yet to put together a coherent picture of their formation, their characteristics and their dissipation. There is vast ignorance about the reflectivity of small clouds, the processes that go on inside them and their local effects.

“We are just beginning to understand the physical processes and the relevance of the altitudes and types,” said Guy Brasseur, an atmospheric chemist at the National Center for Atmospheric Research.

The role of the oceans is somewhat better understood, although information is also limited.

“Sea surface data on a global scale is very spotty,” said Robert W. Correll, vice chairman of the Committee on Earth Sciences, which coordinates U.S. global change research. “There are many areas of the world where we have no data at all. Zilch.”

The most sophisticated ocean modeling is done at the Geophysical Fluid Dynamics Lab, where scientists have been at it for 20 years.

In a study there last year, Syukuro Manabe and his colleagues linked an ocean model with an atmospheric model and projected carbon dioxide buildup at a rate of 1% a year for the next 200 years. The experiment predicted a dramatic warming of the Northern Hemisphere.

In the test case, the Southern Hemisphere, particularly in the southernmost latitudes, stubbornly resisted warming. The conclusion was that the much larger expanses of oceans below the Equator kept the atmosphere from heating up.

“Down in the region of the Antarctic,” Mahlman said, “there is an upwelling, a mixing, bringing up old water that hasn’t seen the surface in perhaps 500 years, water that you might say hasn’t had time to be warmed by the greenhouse effect. The temperature of the air is determined by the temperature of the water. This is a stunner, because it says that the ocean contributes strongly to the distribution of climate change.”
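
The 1%-a-year assumption in that experiment compounds quickly. A minimal, purely illustrative sketch (not drawn from the study itself) shows the arithmetic:

```python
# Illustrative only, not taken from the Manabe study: show how quickly
# carbon dioxide accumulates when it is increased 1% a year, the buildup
# rate used in the 200-year experiment described above.

RATE = 0.01    # 1% a year, as in the experiment
for year in (70, 140, 200):
    factor = (1.0 + RATE) ** year
    print(f"After {year:3d} years: {factor:.1f} times the starting CO2 level")

# Roughly: doubled after about 70 years, quadrupled after about 140,
# and more than 7 times the starting level by year 200.
```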

The dearth of crucial data on clouds and oceans may persist for years. Many modelers scoffed when the George C. Marshall Institute concluded last year that increased investment in supercomputers could pay off in firm scientific evidence to back up policy decisions in as little as three years.

The report was controversial in another respect as well: It concluded that current forecasts are not accurate enough to form bases for policy decisions.

That conclusion was said to have made a deep impression on White House Chief of Staff John H. Sununu, one of the principal makers of environmental policy in the George Bush Administration. It was heatedly challenged by environmentalists, who feared it would provide an excuse for delaying needed action.

In that debate, scientists, for the most part, remain on the sidelines.

Many of them counsel against trying to overwhelm the problem with massive infusions of money.

“We are hardware-limited, we are data-limited and we are talent-limited,” Mahlman said, “but I have made myself unpopular with people who want to throw money at the climate problem. If somebody told me they wanted to double the (Geophysical Fluid Dynamics Lab) budget next year, I would say no.”

The overall effort to produce reliable models will be massively expensive nevertheless. The cost of collecting the crucial ocean and cloud data will dwarf the outlays for computers and modeling laboratories.

The Bush Administration’s proposed budget would invest half of a $2-billion increase in funding for environmental programs in global warming research.

Yet $1.03 billion is barely the tip of the iceberg. Estimates are that development and 15 years of operation of the National Aeronautics and Space Administration’s Earth Observation System will cost $32 billion. Except for the manned exploration of the moon, it would be the most expensive program in the agency’s history.

Earth Observation System instruments will provide scientists with the precise kinds of information they need to build the high-resolution models that will begin to make regional climate projections feasible.

The instruments are expected to supply global cloud counts, measure cloud altitudes and identify clouds by type. They will be able to observe clouds as small as 0.25 kilometer in diameter, compared with the 25-kilometer limit of today’s satellites.

But the first Earth Observation System platform is not due to be launched until 1997. In the meantime, exciting new data on the ocean is expected to come from a U.S. TOPEX satellite to be launched by the French.

The satellite, scheduled to begin operation in 1992, will carry an advanced altimeter to measure the height of the ocean surface to within about 2 centimeters. This information may shed light on questions such as how water carried north in the Gulf Stream flows back to the Equator.

Climate-modeling centers and satellite operators also face problems in managing their archives. In the basement of the Geophysical Fluid Dynamics Lab, 30,000 tapes are stored. In the archives of the National Center for Atmospheric Research at Table Mesa, near Boulder, Colo., the tape collection now numbers 75,000.

The massive accumulation of data is only a hint of things to come. One Earth Observation System platform will produce 10 trillion bytes of data daily. The plan is to have two of them in orbit at a time for 15 years.
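
A back-of-the-envelope tally, assuming a simple 365-day year and perfectly steady output (neither detail is spelled out in the plan itself), gives a sense of the scale:

```python
# Rough arithmetic sketch: total data volume if two Earth Observation
# System platforms each return 10 trillion bytes a day for 15 years.
# The 365-day year and steady output are assumptions for illustration.

BYTES_PER_PLATFORM_PER_DAY = 10e12   # 10 trillion bytes (from the article)
PLATFORMS = 2                        # two in orbit at a time (from the article)
YEARS = 15                           # planned operating span (from the article)
DAYS_PER_YEAR = 365

total_bytes = BYTES_PER_PLATFORM_PER_DAY * PLATFORMS * YEARS * DAYS_PER_YEAR
print(f"Total: {total_bytes:.2e} bytes, "
      f"about {total_bytes / 1e15:.0f} thousand trillion bytes")
```

That works out to roughly 110,000 trillion bytes over the planned 15 years.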

Only by compiling more and more data can scientists hope to achieve the necessary precision in their climatological models. The coarse grids now in use have data points that are about 500 kilometers apart.

“If we are to understand more specifically the regional responses to climate change, then we must have higher resolution,” said Guy Brasseur of the National Center for Atmospheric Research. “When we reduce the distance between grid points by a factor of two, our requirement for computer power increases by a factor of 10. We think the prediction of regional response requires a resolution of 50 to 100 kilometers, so we need two or three new generations of computers.”
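
Brasseur’s estimate can be checked with a little arithmetic. The sketch below takes only the scaling rule from his quote (each halving of the grid spacing costs roughly ten times the computing power); everything else is illustrative and not spelled out in the article.

```python
# Sketch of the arithmetic behind Brasseur's estimate. The scaling rule
# (a 2x finer grid needs about 10x more computing) comes from his quote;
# the rest is an illustration, not a statement of any center's method.

import math

CURRENT_SPACING_KM = 500   # today's coarse grids (from the article)
COST_PER_HALVING = 10      # each halving of grid spacing: about 10x more computing

def compute_factor(target_spacing_km):
    """How many times more computer power a finer grid would demand."""
    halvings = math.log2(CURRENT_SPACING_KM / target_spacing_km)
    return COST_PER_HALVING ** halvings

for target_km in (100, 50):
    print(f"{target_km} km grid: about {compute_factor(target_km):,.0f} "
          f"times today's computing power")

# Roughly 200 times for a 100-km grid and 2,000 times for a 50-km grid,
# on the order of two or three successive tenfold jumps in computer power.
```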

Equally important to some scientists is understanding what the models are saying.

Mahlman recently tried to run a high-resolution model on the old computer at the Geophysical Fluid Dynamics Lab, the one that is to be replaced with the new Cray Y-MP. “I have been able to run it, but I can’t make sense of it,” he said.

“The weakest link in all of this may be the inability to get at the numbers. It’s not enough just to crunch the numbers, you have to know what they mean.”
