
War sparks leaps of technology

Sun-Sentinel

Necessity is the mother of invention, but war is often a surrogate mom.

For centuries, military conflict has propelled nations and leaders in search of technological solutions to end the battle or the threat. Often those solutions -- from canned food to computers -- outlast the fight and alter society.

Historians who focus on technology believe the war on terrorism could have such a profound impact, perhaps for the first time since the Cold War.

“When war comes along, there is this incredible amount of money thrown at things,” said Paul Ceruzzi, a curator at the Smithsonian Institution’s National Air and Space Museum in Washington. “Often, it is very inelegant, but it does speed up the process.”


Predicting the long-term impact of current events with certainty is impossible. But since the hijackings Sept. 11 that killed an estimated 5,000 people in New York, Washington and Pennsylvania, expectations have risen for companies that develop systems to protect people and data and to aid communications.

Futurists suspect that security concerns will influence areas as diverse as skyscraper design and travel for years to come. High-speed magnetic levitation trains, discussed in Baltimore and other cities for years, might get an infusion of support and dollars if air travel fears persist.

Similarly, political battles that have kept the United States second-rate in wireless technology might give way to a more standardized, reliable network, industry executives say.

More than twice as much venture capital investment was announced for wireless companies in the 30 days after Sept. 11 compared with the 30 days before -- $330 million vs. $160 million, according to Venture Reporter, an industry newsletter based in New York.

“I don’t think that’s coincidental,” said Douglas Mintz, a Baltimorean who is co-executive editor of Venture Reporter. “The way the market played out is directly related to the events.”

Historians and futurists have been discussing how the war on terrorism might shape the next generation of innovation. Some contend that research can become misdirected by anxieties and a massive infusion of money. Others argue that war doesn’t spark the process of scientific experimentation, but only lubricates it.


The subject became the focus of a recent conference in Silicon Valley held by the Society for the History of Technology, the pre-eminent organization in that field of study, based at the Johns Hopkins University. A “teach-in” on the topic also was held this month at the Massachusetts Institute of Technology, the Cambridge university that led the emergence of military research on U.S. campuses in World War II.

The Cold War’s impact was as pervasive as that of any declared war, influencing technology from 1945 to 1990.

President Dwight D. Eisenhower’s 1950s program to facilitate moving military forces and evacuating cities in case of nuclear emergency -- formally called the Dwight D. Eisenhower National System of Interstate and Defense Highways -- spurred auto travel and laid the groundwork for large-scale suburban migration.

President John F. Kennedy’s vow to put a man on the moon by 1970 also was in direct response to military fears of the era.

Several space historians point out that once the Soviet challenge waned, U.S. trips to the moon stopped. They wonder whether the force-fed billions and frenetic pace had a downside, and whether a more consistent program with more frequent trips to the moon would have resulted absent the Soviet threat.

The shape of the computer was also greatly influenced by war and its shadow. World War II led the United States to invest $500,000 to develop one of the first electronic computers.


“There was a need to make a computation device quickly, so the federal government poured tons and tons of money into it. Size and expense became no object,” said Alan I. Marcus, director of the Center for Historical Studies of Technology and Science at Iowa State University.

The result in 1946: the Electronic Numerical Integrator and Computer, or ENIAC, a 30-ton behemoth at the University of Pennsylvania whose operation caused brownouts in Philadelphia.

Its mission was to compute ballistics tables for Aberdeen Proving Ground. Previously, the Army scoured women’s colleges -- men were in battle -- to locate math majors to compute the tables that soldiers used to mechanically aim large guns. The women’s job titles, “computers,” became the name of the machine, the Smithsonian’s Ceruzzi said.

ENIAC was successfully completed, but not until after World War II. The miniaturization of computers followed, but not until a generation later, in response to the space race and a new set of global fears.

Reliance on technology to quell war anxieties is not a wholly modern phenomenon.

Leonardo da Vinci engineered the forerunners of submarines, tanks and machine guns around the turn of the 16th century. Napoleon’s armies were among the first to use canned food, replacing the need to loot the nearest village to eat. Balloonist Thaddeus Sobieski Lowe introduced President Abraham Lincoln to the idea of using hydrogen-filled balloons to record the whereabouts of Confederate troops from thousands of feet in the air.

In 1915, as Americans grew worried about fighting in Europe, Thomas A. Edison told a New York Times correspondent who had asked for thoughts on the conflict that the nation should look to science.


“The government should maintain a great research laboratory,” said Edison, whose inventions, including the light bulb and the phonograph, came in peacetime. “In this could be developed ... all the technique of military and naval progression without any vast expense.”

Edison later helped advise the Navy as it created the Naval Research Laboratory, which did pioneering work in high-frequency radio and underwater sonar and built the first practical radar equipment in the country.

World War II was instrumental in advancing research in airplanes, radar and prosthetic medicine. Atomic energy was being studied before the war, but its development as a weapon of mass destruction ended the war and created a new world order.

August Giebelhaus, a history professor at the Georgia Institute of Technology, recalled President Jimmy Carter’s attempt in 1977 to harness the power of war without entering into one, when a cartel of Arab nations conspired to drive up the price of oil.

“One reason the war on the energy problem did not succeed was because the American people never made that leap,” Giebelhaus said. “We never did see it as the ‘moral equivalent of war.’”

Television -- a technology whose growth was stalled for years by World War II after its coming-out at the 1939 New York World’s Fair -- has become a powerful weapon in the current crisis. The Bush administration appeared anguished by terror suspect Osama bin Laden’s televised warnings, and Americans were horrified as they watched the World Trade Center towers collapse before their eyes.


“The real weapon here is not just the airplane, it’s the television -- the power of the imagery and symbolism,” said Rosalind H. Williams, an MIT history professor.

Some contend that the recent attack betrayed an overemphasis on technology: A nation that became a “superpower” through its scientific prowess was turned upside down by hijackers armed with box cutters.

Nevertheless, historians believe the concerns that gripped the nation beginning the morning of Sept. 11 will speed technology’s cycle of trial and error -- and the money coursing to it.

“We plod along, we get this massive headache and then we pop the pill,” Giebelhaus said.

On this subject, he likes to quote his mentor, the late Melvin Kranzberg, credited as one of the first to study the field: “Technology is the answer, but that’s not the question.”
