War on Terrorism Highlights FBI’s Computer Woes
In the frantic days after the terrorists struck, FBI agents scrambled to box up investigative files at their New York office a few blocks from the World Trade Center and haul them to safety. In the FBI’s paper-driven culture, many of the documents had never even been entered into the bureau’s aging computer system.
In Tampa, Fla., meanwhile, agents were scurrying to send photos of the 19 hijackers by overnight mail to 56 FBI offices around the country so agents could chase down possible conspirators. Frustrated agents had been unable to e-mail the photos because the FBI’s computer system wasn’t designed to handle such a basic task.
The Sept. 11 attacks and their aftermath have exposed the FBI’s computers as a national laughingstock, a system so antiquated and inefficient that U.S. senators quip that their kids get more bang for their byte than the nation’s vaunted G-men.
FBI Director Robert S. Mueller III has laid out an ambitious three-year plan for overhauling the bureau’s beleaguered system. But the severity of the problem, and its threat to national security, have long been known to top FBI officials.
Indeed, newly disclosed records and interviews show that years of warnings at the highest levels of the FBI often have gone unheeded and that the bureau allegedly diverted tens of millions of dollars from computer upgrades to manpower needs that it deemed more important.
Former Atty. Gen. Janet Reno became so frustrated by the FBI’s inertia that she wrote then-Director Louis J. Freeh a highly unusual and strongly worded series of internal memos about the problem. In a May 2000 memo obtained by The Times, titled “Threats to U.S. National Security Interests,” Reno told Freeh that it was “imperative that the FBI immediately develop the capacity” to search its files, analyze security threats and be able to share information with other intelligence agencies.
“I think our national security requires that we get started immediately on this effort,” Reno told Freeh in a memo foreshadowing the intelligence failures that would be revealed 16 months later by the World Trade Center and Pentagon attacks.
Yet not much has changed, and the threat to national security looms even greater. How the FBI reached such a state of technological lethargy is a story of institutional arrogance, misguided priorities, missed warning signs, overmatched technical advisors and a soured relationship with increasingly distrustful benefactors in Congress.
Dating back nearly a decade, officials warned in private communication and in public reports that the bureau was severely hampered by agents’ inability to do such basic tasks as thoroughly searching case records and receiving e-mail. The shortcomings have played a part in virtually every high-profile misstep by the FBI in recent years, including missing Oklahoma City bombing documents, the Robert Philip Hanssen spy scandal and the Wen Ho Lee espionage investigation.
Investigations are still largely paper-driven, and many agents use dinosaur-era computers or even write reports longhand in this era of high-speed Pentium processors. The FBI has 42 databases that often run on incompatible software and hardware. Simple searches--allowing an agent in Minneapolis, for instance, to see whether the words “flight training school” show up in case files--are unwieldy, if not impossible.
Experts inside and outside of the FBI say myriad financial, political, technological and cultural factors explain the logjam, among them:
* The FBI, unable to pay the top salaries the private sector doled out through the 1990s, lacked the in-house technical expertise to manage complex upgrades. Until the last few years, officials often believed, mistakenly, that their people could do the job themselves without the help of outside experts.
* A distrustful Congress, grown weary of huge cost overruns after doling out $1.7 billion on FBI computer projects since 1993, has kept the bureau on a tighter financial leash, refusing to fund new projects until higher standards were met.
* And, perhaps most critical, the bureau experienced cultural resistance to letting machines take the place of solid, old-fashioned police work, an attitude shared by many top officials and street agents alike.
As one veteran agent said, the FBI has been dominated by an old-school attitude that “real men don’t type. The only thing a real agent needs is a notebook, a pen and gun, and with those three things you can conquer the world. That was the mind-set for a long time, and the computer revolution just passed us by because of it.”
Sept. 11 Attacks Provided a ‘Sense of Urgency’
The FBI itself realized as early as 1996 that a newly installed case-file system had glaring holes. It sent in a special “red team” of experts and agents to analyze the problems, according to law enforcement sources familiar with the review. Six years later, the case system, with many of the same holes, has not yet been replaced.
FBI officials acknowledge that the Sept. 11 attacks forced them to rethink their priorities in rebuilding their information system. “There was always the recognition that we needed to do this. The sense of urgency is what’s different now,” said Mark Tanner, the FBI’s deputy chief information officer.
Mueller’s overhaul plans--built around “paperless” files and artificial intelligence to “predict” terrorist activity--call for a full-speed sprint. But first, “we’ve got to get walking,” Robert J. Chiaradio, one of the FBI’s top systems gurus, admitted in a recent interview before leaving for the private sector. “You cannot [use technology to fight terrorism] unless you’ve got the foundation. So we’re building this foundation.”
The stakes are enormous. Many believe the FBI’s success or failure, after more than a decade of fits and starts, will be a pivotal factor in deciding the outcome of the war on terrorism.
“I do not think the FBI can manage its responsibilities in the intelligence arena and the law enforcement arena, where national security’s involved, without being sure that its technology is successfully upgraded to perform its mission,” said William H. Webster, a former director of the FBI and CIA who is widely respected in Washington.
CIA officials have indicated in recent closed-door testimony that they are reluctant to share some sensitive information with the FBI because of concerns about safeguarding the data, according to a congressional source.
And FBI agents in the New York field office have simply refused to put some national security information into the system for fear it could be compromised, according to a review in March by the Webster Commission, appointed by the Justice Department to look into security issues. The concerns were driven home several years ago when an FBI college intern, given ordinary access to the system to test its vulnerabilities, penetrated restricted files in a single afternoon.
In recent weeks, scrutiny of the FBI’s dilapidated system has set off a gut-wrenching exercise in “what if” scenarios: What if the FBI had a nimble, secure, well-integrated system in place before Sept. 11?
Could agents in Minneapolis, Phoenix and Oklahoma City, each harboring suspicions about Middle Eastern flight students, have pooled their resources to detect a pattern? Would the FBI, working more quickly with the CIA, have found two of the 19 hijackers-in-training who were living quietly in San Diego after showing up on a watch list? Would FBI analysts have been able to decipher a spike in terrorism intelligence “chatter” to predict an attack?
“It just makes my jaw drop to think that on 9/11 ... the kind of technology that is available to most schoolkids, and certainly every small business in this country, wasn’t available to the FBI,” Sen. Charles E. Schumer (D-N.Y.), who has pressed the FBI for years on the issue, told Mueller at a recent hearing.
Unless the FBI overhauls its system--and does it even more quickly than Mueller’s three-year timetable--the nation risks “another horrible attack,” Schumer warned.
Computers Were Not a Priority for Ex-Director
Mueller admitted he was shocked to find the bureau’s system in such disarray when he took over last year, a week before Sept. 11. “We are way behind the curve,” Mueller told lawmakers.
With millions of pieces of information collected by FBI investigators but no good way to sort it all out, officials admit that “we don’t know what we know.”
Many blame former Director Freeh for fostering an anti-computer attitude during his tenure from 1993 through last year.
Freeh, a dogged investigator who rose from the ranks of FBI agents and took an active role in top-priority probes as director, eschewed the use of computers himself. “I never saw him use one,” said Robert “Bear” Bryant, his top deputy.
To Freeh’s credit, the FBI’s ranks grew significantly under his leadership, as he gave the bureau a much-expanded international presence. But his perceived lack of interest in the FBI’s computer woes became a growing source of frustration for Reno, according to officials familiar with their discussions.
“She was always pushing them to do more in that area and, sadly, she was right,” said a former high-ranking official at the Justice Department under Reno, who asked not to be identified. “The results just weren’t there.”
Jamie Gorelick, the No. 2 official at the Justice Department in the mid-1990s, said in an interview: “Director Freeh’s priorities were putting agents on the ground and building the [FBI’s overseas] operations. He was simply less interested in, frankly, what was the more boring work, of infrastructure development.”
Reno became particularly incensed in 1997 when the FBI began investigating allegations that the Chinese government had tried to illegally buy influence in U.S. elections, several former aides said. Congress wanted relevant documents on the issue from the Justice Department for its own investigation, but the FBI repeatedly failed to locate records in its own files, aides said. At one point, CIA Director George J. Tenet told an embarrassed Reno that his agency had found a relevant FBI document in its own files and was turning it over to Congress. The FBI apparently didn’t even know of the document’s existence, aides to Reno said.
Freeh declined requests for an interview. Bryant and several other former aides said that Freeh, contrary to his critics’ perceptions, did understand the importance of upgrading the FBI’s computer capabilities.
But, according to a former aide who supports Freeh, “it would never be a top priority. He didn’t care about it enough to devote his own time to it” because he was so often immersed in major investigations.
Money Intended for Technology Was Diverted
Publicly, Freeh spoke of the need to ramp up FBI technology. But privately, law enforcement sources disclosed, he allowed the FBI to raid its computer budget repeatedly, taking money intended by Congress for systems and infrastructure upgrades and using it instead to fund shortfalls in staffing and international offices.
The diverted money, much of it designated for vital computer upgrades, totaled $60 million in 2000, with millions more in other years, according to a former senior official at the Justice Department.
Members of Congress referred to the practice as “hollow” budgeting because it allowed the FBI to artificially inflate its manpower budget. Tensions became so great that the Bush administration, under pressure from Congress, last fiscal year quietly cut the maximum number of authorized agent positions by more than 400 to prevent the bureau from diverting more computer money, officials said.
“Louis Freeh wanted more cops on the beat, and he was robbing from the equipment side to pay for people,” said Rob Nabors, an FBI budget specialist with the Republican staff of the House Appropriations Committee. “We saw it as an end run around the appropriations process. Legally, he didn’t do anything wrong, but he was clearly violating the will of the appropriations committees.”
FBI officials denied that they improperly diverted any money, but they declined to discuss the issue in detail or provide a breakdown of how computer money has been spent.
By the late 1990s, members of Congress were fed up with the money pit that the FBI’s computer overhaul had become. The agency had suffered two black eyes in the development of its fingerprinting and criminal background check systems, which came in years behind schedule and $300 million over budget, according to Justice Department figures.
“It was just unconscionable,” according to the former senior official.
With in-house people running the programs, the official said, “the bureau suffered from the mentality that an FBI agent can do anything.”
Part of the problem, officials said, is that systems jobs were, until recently, not seen as plum assignments. The bureau often relied on agents with limited technical backgrounds who were at or near the bottom of the career ladder.
“The FBI agents want to do cases. [Top officials] have not traditionally paid much attention to getting the best people in these jobs,” said Harvard University management professor Steven Kelman, who oversaw federal procurement in the Clinton administration.
Frustrated, Congress placed tight restrictions on FBI computer funds in the late 1990s and demanded unprecedented scrutiny of how the money would be spent. With Congress reluctant to give the green light, the FBI shelved two plans for replacing its problem-riddled case system, which was only a few years old.
After the string of failures, “there was some skepticism [in Congress] as to whether we could actually deliver a major project,” the FBI’s Tanner acknowledged.
Retired IBM Executive Changed the Culture
A sea change came in 2000, when Freeh brought on retired IBM executive Bob Dies to oversee technical operations. Dies is credited with restoring the FBI’s battered credibility in Congress and freeing up tens of millions of dollars for new automation systems before leaving the bureau this spring.
“Bob Dies was really the beginning of an evolution in terms of bringing substantial numbers of people in from the private sector,” said Assistant FBI Director John Collingwood.
But the scars from years of neglect remain, much to the frustration of agents who believe their warnings have fallen largely on deaf ears.
One FBI agent complained that he didn’t have access to office e-mail to communicate with the parents of a kidnapping victim, so he resorted to using his personal e-mail account.
Another agent said he recently couldn’t get access to PowerPoint software to give an important presentation on weapons of mass destruction, so he had to bootleg the software.
And still another agent said that after the FBI finally gave him a new laptop, he couldn’t get requisition authority for a battery to operate it.
Nancy Savage, president of the FBI Agents Assn., said agents often waste hours trying to resolve technical glitches.
“This is a problem we’ve been screaming about for years,” she said.
“You’re not getting your bang for your buck when you’re paying agents to deal with faulty automation instead of putting people in jail.”
Savage’s predecessor, Agent John Sennett, also hammered that theme repeatedly in internal communications, warning in a 1999 bulletin that the FBI “is stuck in the slow lane.”
But the response was minimal, and the results were often disastrous.
In 1999, for instance, when Angel Maturino Resendiz was caught sneaking across the New Mexico border, U.S. Border Patrol agents sent him back to Mexico--even though the FBI had a warrant out for his arrest in connection with three slayings in Texas and Kentucky.
The so-called railway killer went on to kill four more people in the United States in a cross-country railroad trek of murder and rape before his surrender.
Although the Immigration and Naturalization Service bore the brunt of the criticism for allowing Maturino Resendiz to get away, a March 2000 report by the Justice Department inspector general concluded that the failure of the FBI and the INS to integrate their fingerprinting systems was a critical problem in the deadly chain of events. The two systems are still not fully integrated.
Less than a year later, computer troubles haunted the FBI again with the arrest of longtime agent Hanssen, who had given the Russians reams of secrets on his way to becoming one of the most damaging spies in U.S. history.
It turned out that Hanssen, an adept computer user, had routinely plugged his own name and spy terms such as “dead drop” into the FBI’s computer system to determine whether the FBI was onto him.
The FBI lacked basic computer-auditing safeguards that might have caught such suspicious activity, helping Hanssen’s espionage go undetected for 22 years.
And in the Oklahoma City bombing, the FBI’s inability to find more than 4,000 pages of documents and properly turn them over to Timothy J. McVeigh’s attorneys forced a delay in McVeigh’s execution last year.
Many victims’ families, who had been hoping for a sense of finality, waited in anguish for nearly a month before the execution was allowed to proceed.
Again, the FBI promised reforms. But for all the major lapses the FBI has suffered in its computer systems, critics say it shouldn’t have taken a crisis of the magnitude of Sept. 11 to light a fire under the bureau.
“It defies logic to think that an agency with the world at its feet has let things deteriorate to this point,” said Nabors of the House Appropriations Committee.
“There was a train wreck coming, and they should have seen it coming from a mile away.”
Computer Problems an Issue for Years
The FBI’s computer systems have been dogged by problems in recent years. Some key events:
The FBI experiments with developing “artificial intelligence” to predict criminal activity, but after devoting significant resources with minimal results, abandons the research.
The development of two major FBI systems, aimed at upgrading the checks done on fingerprints and criminal backgrounds, suffers delays and cost overruns.
The FBI’s automated case system, designed to replace the bureau’s paper-laden system, goes online. An internal study the next year finds the system fraught with problems, many of which remain unresolved today.
The FBI’s national instant check system, mandated by Congress to verify the backgrounds of gun purchasers, goes online. The system is criticized by both the gun lobby and gun-control activists.
The National Infrastructure Protection Center, run by the FBI, is created to combat cyber-crime. It improves the federal government’s computer forensic abilities but struggles to earn the respect and cooperation of the private sector.
Congress and the FBI commit more than $86 million to overhauling the troubled automated case system, but the money is never released because of concerns over the FBI’s history of problems with new systems.
The FBI’s troubled fingerprinting and criminal background check systems go online in the same month--on schedule and at a cost of $823 million--but parts of the fingerprinting system are incompatible with other U.S. and European systems.
Retired IBM executive Bob Dies joins the FBI to oversee technological upgrades and is widely credited with shoring up the bureau’s credibility with members of Congress. Dies left the agency this spring.
The FBI begins its third attempt to replace the automated case system. In the wake of Sept. 11, Director Robert S. Mueller III vows to speed up the bureau’s systems overhaul with new efforts at data-mining and artificial intelligence.
Congressional investigators examine the role that the FBI’s systems problems may have played in the intelligence failures surrounding Sept. 11.
Monday: The FBI plans to turn its massive collection of data into a mother lode of predictive intelligence.