
Systems Easily Tripped in Error Bring Death in a Lake, Warning Us of Automatic War

Walter Reich is a psychiatrist and a senior research associate in the International Security Studies Program of the Woodrow Wilson International Center for Scholars in Washington.

After the downing of an Iranian airliner by an American warship over the Strait of Hormuz, Adm. William J. Crowe Jr., the chairman of the Joint Chiefs of Staff, summarized the rules of engagement that were set for U.S. military forces soon after their entry into the Persian Gulf. American commanders, he explained, were given “sufficient latitude to protect their people and equipment when hostile intent was manifested. They do not have to be shot at before responding.” In the case of the Iranian airliner, he pointed out, the people in the ship’s command center and those operating its radar “had about four minutes from the time they picked this target up until it was declared hostile.” In fighting in the gulf, Crowe said, “we’re fighting in a lake.”

When adding up the pros and cons of entering that lake, White House and Pentagon officials must have listed, somewhere in the “con” column, just this kind of occurrence. They must have envisioned the possibility of an innocent act or circumstance that, in the absence of sufficient information for full analysis and against the background of heightened tensions, would be interpreted by our forces as hostile, provoking a defensive American response that would cause needless death.

Clearly, though, the “pros” had it; that column was judged longer, or at least weightier, and we entered the lake, guns cocked and increasingly, as we suffered casualties, set on the hair-trigger that the “con” column must have predicted.


Did we err in entering that lake? Did we lack the imagination necessary to truly see the worst consequences of doing so, or attach to those consequences probabilities that were unrealistically low? It’s still too early to say. But it’s certainly not too early to learn something important from this incident, not only about our dilemma in the gulf but also about the new face, and the new reality, of war.

More than anything else, it’s a face behind which it’s ever harder to be sure of the reality. What looks like hostility may in fact be innocence. But the consequence of error could be our own destruction, and so we’re forced to assume the worst and react accordingly--with results that, when we’re wrong, may be devastating for scores, or hundreds, of innocents.

We respond to potential hostility that way because we have too little time to respond in any other way, and we have too little time because the rush of technology has endowed all combatants, including those arrayed against us, with weapons that are breathtakingly fast and accurate. With the same technology that produced those weapons we have produced systems that assess their threat, and we have taught those systems, all too imperfectly, to decide which kinds of actions are in fact threatening and which are instead innocent. To reduce the likelihood of mistakes, we have ordered those systems to gather as much information as possible. But time is short and the danger great, and, even without confirmation by human means, we accept their judgments and destroy the target that, they advise us with blindingly reassuring speed, is otherwise likely to destroy us.

And destroy it we do, as we did in the case of the Iranian airliner. Moreover, we’ll do it again, and in time it will be done to us.

But the really big lessons about the new face of war are the ones that we can never afford to learn. Hundreds of innocent deaths are hundreds too many, but still only hundreds, and perhaps an acceptable possibility in the painful calculus that precedes the decision to project military force for political ends. But the same principles of technological war--defensive decisions made by machines that we teach to make them, based on imperfect assumptions and uncertain information and resulting in almost automatic responses to calculated threats--undergird the structure of weapon systems immensely more powerful than the ones arrayed in the Persian Gulf. And the dimensions of death that can result from such systems tripped in error, or through misperceptions of reality, are uncountably greater than those that can result from a downed airliner or a sunk ship.

Those systems, the nuclear ones, have never been tested in real life; the less-complicated ones, floating in that far-off Middle Eastern lake, now have. For the sake of that lake’s inhabitants, as well as the world’s, it might do to be a bit less sure about the brilliant friends that we have invented to be our eyes, and our minds, in our sudden moments of automatic war.
