The Sims Take on Al Qaeda

Times Staff Writer

Inside a concrete-and-glass laboratory at the Naval Postgraduate School, a computer simulation of Osama bin Laden’s Al Qaeda terrorist network is beginning to take shape.

Scientists are preparing to conjure deserts, urban landscapes, communications networks, weapon systems, immigration patterns and an army of terrorists cunning enough to design plots of mass destruction. They also are fashioning millions of potential victims who will be preyed on thousands and thousands of times.

In the new war against terrorism, with its infinite possibilities for unpredictable violence, the military is attempting to understand jihad through the infinitely patient and dogged computer.

“Interesting things happen,” said Michael Zyda, who is leading the Navy’s simulation project here, “things you didn’t expect.”

Military strategists have long used computers to wage virtual war, modeling the clash of armies and the devastation of nuclear weapons.

But terrorists aren’t fighting on traditional battlefields. They aren’t organized into traditional fighting units. And, as the attacks on the World Trade Center and the Pentagon demonstrated, they don’t care whether they survive.

The new breed of virtual war game is attempting to push into that unexplored terrain, drawing from a burgeoning field of artificial intelligence known as “agent technology.”

The goal is to create a framework flexible enough to probe the possibilities for attacks in any setting. Researchers at Argonne National Laboratory already are using this approach to scan the country’s energy distribution system for vulnerabilities that could be exploited by saboteurs.

Though many particulars about Bin Laden and Al Qaeda remain a mystery, the programs need only understand the broad outlines of how they work. The details of their strategies are supplied by the simulations, which run through millions of possible terrorist configurations to find the ones that are most threatening and destructive.

The terrorist simulations are similar to the popular computer game “The Sims,” in which players create their own digital worlds and populate them with autonomous characters that roam about and grow, often with surprising results.

Zyda and his fellow researchers suspect the same simple yet unpredictable interactions that make “The Sims” so lifelike have the potential to illuminate the unpredictable methods of terrorists.

In essence, they are creating their own “Sim Osama.”

“Some of the very best games have very, very simple rules,” said Will Wright, creator of “The Sims.” “But amazingly elaborate strategies emerge that you can’t predict.”

The hub of the military’s effort is an obscure research center known as the Modeling, Virtual Environments and Simulation Institute in Monterey. It is one of several groups, including the Defense Modeling and Simulation Office in Virginia and the Army’s Simulation, Training and Instrumentation Command in Orlando, Fla., dedicated to producing military simulations.

Zyda, a 47-year-old engineer with the demeanor of a gung-ho dot-commer, presides over more than 30 researchers who study the various ingredients of simulated reality. Their specialties include human movement, terrain re-creation, surround sound and casualty estimation.

On a computer screen in one of the institute’s spartan offices, 200 red and blue dots march across a tan grid, representing some foreign terrain. Zyda watches as the blue dots devise their own attack strategy to gain control of a coveted red army stronghold.

The scenario will take less than a minute to resolve. It unfolds differently each time, although the blue dots, which have a slight advantage in numbers and skill, are usually victorious.

The simulation program, known as GI Agent Editor, is the seed for Sim Osama, a long-term research project that might not be completed before Bin Laden is captured but will provide valuable information for the inevitable conflicts of the future.

It has taken decades of computer research to reach this point. When Zyda began work at the Naval Postgraduate School 17 years ago, war games were like elaborate choose-your-own-adventure stories. Each program could be played out only within a well-defined range of possibilities.

One training simulation that Zyda worked on had Army infantrymen move into an enemy building while under fire from a digital sniper. Though the sniper could adjust his strategy based on how the infantrymen advanced on his building, all he could do was shoot from a window.

The weakness of these war games has long been understood. Though they served as useful training exercises, the simulations were unable to accommodate anything new or unusual. They certainly couldn’t serve up a scenario that planners hadn’t anticipated.

Even before the advent of computers, military strategists understood that these limitations could have dire consequences. One of the best-known warnings came from U.S. Navy Adm. Chester Nimitz after World War II.

“The war with Japan had been enacted in the game rooms at the War College by so many people and in so many different ways that nothing that happened during the war was a surprise--absolutely nothing except the Kamikaze tactics toward the end of the war,” Nimitz said. “We had not visualized these.”

Before Sept. 11, no one had visualized the potential of multiple suicide attacks using hijacked jetliners.

Zyda awoke at 6:30 that morning to get his daughter ready for school and turned on the radio. The first plane had crashed into the World Trade Center’s north tower, and when he heard the news he instantly suspected terrorism.

“I started thinking right then, ‘How do we model this?’ ” he said.

Many of the critical pieces were already in place in GI Agent Editor, which was developed by Army Capt. Joel Pawloski, a former scout platoon leader and air cavalry troop commander.

Pawloski wanted the program to solve tactical problems, such as the most effective way to deploy nine snipers among a 94-person attack force. He created a blue army and a red army, and with a few mouse clicks, he set the weapons range, movement range, durability and marksmanship for each soldier.

Click on any of the dots and up pops a screen that displays what’s going on in the soldier’s digital mind. One set of boxes shows the numerical values for its personality traits, such as independence and aggressiveness. Another keeps track of where the soldier is on the battle grid and where it’s trying to go. A third box lists the goals--engage enemy, stay healthy--and shows which is of highest priority.
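
As a rough sketch of how such an agent's state might be organized (the class and field names below are illustrative assumptions, not GI Agent Editor's actual code), each soldier reduces to a handful of numbers and a prioritized goal list:

    # Illustrative sketch only; the field names are assumptions, not GI Agent Editor's real data model.
    from dataclasses import dataclass, field

    @dataclass
    class Soldier:
        team: str                   # "blue" or "red"
        weapons_range: float        # how far it can shoot
        movement_range: float       # how far it can move each turn
        durability: float           # how much damage it can absorb
        marksmanship: float         # chance that a shot hits
        independence: float = 0.5   # personality traits, on a 0-to-1 scale
        aggressiveness: float = 0.5
        position: tuple = (0, 0)    # where it is on the battle grid
        destination: tuple = (0, 0) # where it is trying to go
        goals: list = field(default_factory=list)  # e.g. [("engage enemy", 0.8), ("stay healthy", 0.6)]

        def top_goal(self):
            """Return the goal currently assigned the highest priority."""
            return max(self.goals, key=lambda g: g[1]) if self.goals else None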

Second by second, the dots spread themselves across the screen until the blue soldiers have surrounded their target and the red soldiers have retreated to the nearby foothills or perished.

Pawloski ran the simulation 165 times, with the blue army’s snipers deployed in a variety of schemes. It turns out that the blue army had a 96% success rate when the snipers were deployed among nine-member squads, with far lower success rates when they operated out of larger units.
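
The experiment follows a simple Monte Carlo pattern: replay the same scenario many times under each deployment scheme and compare success rates. A minimal sketch of that pattern, in which run_battle is a hypothetical stand-in for one run of the actual simulation:

    # Sketch of the batch experiment; run_battle is a placeholder, not the real simulation.
    import random

    def run_battle(deployment, seed):
        """Stand-in for one simulation run; returns True if the blue army wins."""
        random.seed(seed)
        return random.random() < 0.5   # placeholder outcome so the sketch runs

    def success_rate(deployment, runs=165):
        wins = sum(run_battle(deployment, seed) for seed in range(runs))
        return wins / runs

    for deployment in ("snipers with nine-member squads", "snipers in larger units"):
        print(deployment, success_rate(deployment))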

To the 14-year veteran, the results rang true.

“The snipers bring increased range of vision and firepower,” he said. “When the snipers are at the squad level, the stuff they see gets communicated up to the leadership earlier, and that helps.”

Planning small-scale assaults on Al Qaeda positions is the most obvious use for GI Agent Editor in the war on terrorism. The simulation’s virtual terrain can be adjusted to mirror actual places where U.S. forces are planning attacks, though it will take some time for the technology to migrate from the lab to the battlefield.

But by adjusting other variables, the war game can begin to approximate broader geopolitical factors. Soldiers can be accompanied by hordes of civilians, who respond to bombing raids by flooding refugee camps. Terrorists can be distinguished from Taliban fighters by downplaying the value they place on self-preservation and boosting their ability to operate outside traditional war venues.
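
In agent terms, that kind of tuning amounts to giving each actor type a different parameter profile. The names below are invented for illustration, not the simulation's actual variables:

    # Invented parameter names, purely to illustrate the idea of actor profiles.
    taliban_fighter = {"self_preservation": 0.8, "operates_outside_war_venues": 0.3}
    terrorist       = {"self_preservation": 0.1, "operates_outside_war_venues": 0.9}
    civilian        = {"self_preservation": 0.9, "flees_to_refugee_camp_when_bombed": True}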

Making the jump from a single battlefield to the global stage isn’t a matter of simply stretching the physical terrain. The key is re-creating the range of ephemeral social, economic and political forces that are at the core of terrorist conflict.

The task, in essence, requires teaching a computer to understand the meaning of fear, hatred, bigotry and other emotions that fuel terrorism.

“What happens if there’s a little more racism in society?” said Ian Lustick, a political science professor at the University of Pennsylvania who has created a virtual Middle Eastern country to experiment with such kinds of social upheaval. “What happens if we open our borders to more immigrants? Or if we ban contacts between one group and another?”

Answers to those questions reveal themselves to Lustick as brightly colored blocks in a 50-by-50 square grid on a computer screen. Using a pair of programs called Agent-Based Identity Repertoire and Ps-i--short for political science identity--Lustick defines each square as a person, a village or some other unit of humanity. The color of a square indicates its allegiances.

The grid is a stand-in for a composite of Tunisia, Syria, Jordan, Egypt and Iraq, and it is populated with about a dozen kinds of people. Some are bureaucrats, who are loyal to the government. Some are fundamentalists, who live in rural areas and aren’t influenced by the government. Some are fanatics, who can influence other agents but cannot be influenced themselves.

As Lustick starts the program, the grid of colors begins to bubble in seemingly random patterns. Gradually, the appearance melts into larger clumps of color. Squares blink as each individual reevaluates the shifting social forces surrounding it, then decides whether to change itself.
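
A generic sketch of the update rule such a grid model might use, in the spirit of Ps-i but not its actual code: each square polls its neighbors' allegiances and switches only under strong local pressure, while fanatics never change.

    # Generic identity-grid sketch; assumptions throughout, not the ABIR or Ps-i programs.
    import random

    SIZE = 50
    IDENTITIES = ["government", "fundamentalist", "fanatic", "unaligned"]
    grid = [[random.choice(IDENTITIES) for _ in range(SIZE)] for _ in range(SIZE)]

    def neighbors(r, c):
        """Yield the allegiances of the eight surrounding squares (the grid wraps at the edges)."""
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if (dr, dc) != (0, 0):
                    yield grid[(r + dr) % SIZE][(c + dc) % SIZE]

    def step():
        """Each square reevaluates the allegiances around it and decides whether to change."""
        new = [row[:] for row in grid]
        for r in range(SIZE):
            for c in range(SIZE):
                if grid[r][c] == "fanatic":
                    continue   # fanatics influence others but cannot be influenced
                local = list(neighbors(r, c))
                dominant = max(set(local), key=local.count)
                if local.count(dominant) >= 5:   # switch only under strong local pressure
                    new[r][c] = dominant
        grid[:] = new

    for _ in range(100):
        step()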

Lustick has run these types of programs more than 10,000 times in the last three years to examine the effect of social trends and government policies on anti-American sentiment and terrorism in the Middle East. He is looking for ways that seemingly small actions have big consequences.

“I think about terrorism in terms of popcorn,” he said. “You assume you’ll always have some kernels that are going to pop. How much lower does the temperature have to get before you have a dramatic decrease in the ability of terrorists to operate?”

His research has found that when the underlying relationships between color blocks are constantly shifting, the blocks look to the government as an anchor and their colors mesh into a pattern of support. But if the blocks share a common concern about risks from the outside world, they are more likely to become disaffected and blend with dissident groups.

Lustick’s flashing grid is conflict in its most abstract form. That turns out to be its greatest strength--as well as its most glaring weakness. Researchers are painfully aware that their models omit the messy edges of real life, and some of them might turn out to be critical.

“In practice, it’s hard to get the information from the political scientists into the hands of the computer scientists,” said Marcus Daniels, director of the Swarm Development Group, a spinoff of the Santa Fe Institute that focuses on agent software.

In these simulated worlds, filtering out scenarios that are truly implausible requires human judgment, which is fallible. The simulations are meant to augment, not replace, the intuition of seasoned military and intelligence experts.

“We could have a detailed blow-by-blow story, and it could be seductively misleading,” said John Hiles, a research professor at the Modeling, Virtual Environments and Simulation Institute. “The danger is that you’d use [simulations] as a substitute for your own thought.”

In one of the early runs of GI Agent Editor, Pawloski was confronted with a stunning rout of the blue army. Instead of fighting their way to their usual victory, the blue soldiers scattered into the woods and cowered.

Pawloski was puzzled at the development and immediately opened the program’s “brain lid” to peer into the thinking of the retreating troops.

What he discovered was a logical flaw in the program. The blue soldiers were programmed to follow their leader. But when that dot was killed, the troops didn’t know how to choose a new dot to follow. Leaderless, they ran into the woods.

In real life, soldiers are trained to follow the next in line of command.
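
A minimal sketch of what such a fix might look like, assuming each agent carries an ordered chain of command (the names and structure here are invented, not Pawloski's actual code):

    # Invented illustration of leader succession, not the actual GI Agent Editor fix.
    def current_leader(agent, roster):
        """Return the first agent in this agent's chain of command who is still alive."""
        for name in agent["chain_of_command"]:
            candidate = roster.get(name)
            if candidate is not None and candidate["alive"]:
                return candidate
        return None   # no leaders left; fall back on the agent's own goals

    roster = {
        "lieutenant": {"alive": False, "chain_of_command": []},
        "sergeant":   {"alive": True,  "chain_of_command": ["lieutenant"]},
        "private":    {"alive": True,  "chain_of_command": ["lieutenant", "sergeant"]},
    }

    leader = current_leader(roster["private"], roster)
    print(leader is roster["sergeant"])   # True: with the lieutenant dead, follow the sergeant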

“That’s the type of stuff you see, then realize you have to go fix the program,” Zyda said.

Pawloski fixed the bug. In short order, the troops were thrown back into the fray to wage their virtual war.
