FBI Draws Heat Over Pricey Software Trouble

Times Staff Writers

Members of Congress reacted angrily Thursday to new disclosures of trouble with a massive upgrade of the FBI computer system, accusing bureau officials of misleading them about the situation while acknowledging that millions in additional funds will be needed to fix it.

FBI Director Robert S. Mueller III, addressing the latest software problem for the first time, said he was frustrated.

His comments came as the bureau began damage control after the Los Angeles Times reported Thursday that FBI officials had concluded they might have to scrap a $170-million computer program designed to help agents share information to ward off terrorist attacks.

The software is part of a four-year, $581-million computer system overhaul that has been one of Mueller’s priorities in the agency’s reorganization after the Sept. 11 attacks. But the project has been mired in cost overruns and delays, and the software, Virtual Case File, is now considered outdated and inadequate.

“I am frustrated by the delays,” Mueller said Thursday at a news conference in Birmingham, Ala., according to the Associated Press. “I am frustrated that we do not have on every agent’s desk the capability of a modern case-management system.”

Mueller said the bureau was hoping to salvage the software, but other officials, who spoke on condition of anonymity, said there was a good possibility that it would have to be replaced.

The FBI will need another four months to decide on a new strategy, the officials said, including whether to search for a new software partner to replace the original contractor, Science Applications International Corp. of San Diego.

Sen. Patrick J. Leahy (D-Vt.), who met with bureau officials Thursday, said that he had been assured in May that the software would be completed by the end of 2004 -- a year behind schedule -- and that it would give the FBI “cutting-edge technology.”

“Now, we learn that the FBI began to explore new options last August, because it feared that VCF was going to fail,” said Leahy, the top Democrat on the Senate Judiciary Committee.

Congress will have to funnel more money into the project to “get the job done,” he said, adding: “Bringing the FBI’s information technology into the 21st century should not be rocket science.”

A Republican member of the Judiciary Committee agreed with Leahy’s assessment. “I hope we haven’t just been pouring money down a rathole at taxpayers’ expense,” said Sen. Charles E. Grassley of Iowa.

Grassley said he had asked the Government Accountability Office “to look at how this happened.”

The computer-system overhaul was launched at a time when the bureau was reeling from a series of high-profile security breaches.

FBI agent Robert P. Hanssen, for example, was able to spy for the Russians undetected for 22 years, in part because he could search the aging computer system to see whether investigators were on to him.

But the Sept. 11 attacks changed the FBI’s mission from catching spies and arresting bank robbers to tracking terrorists before they had the opportunity to strike. That required a far more complex computer system.

The FBI wanted the Virtual Case File software to be built from scratch to maximize the safety and security of information. But the custom design proved extraordinarily expensive, and over the years software companies have been able to develop comparable off-the-shelf software for a fraction of the cost.

A preliminary report from Aerospace Corp., a federally funded nonprofit research firm in El Segundo hired by the FBI to assess its options, has identified commercially available programs that could meet the FBI’s requirements, sources familiar with the study said. Using such programs would also enable the FBI to integrate its software with that of other agencies doing similar work -- a far more complicated task if it chose to stick with a custom product.

The FBI’s computer woes go back decades, reflecting a deep-seated resistance to change among traditionalists.

“You’ve got a culture there that has been technophobic for some time,” said Eugene Spafford, a computer science professor at Purdue University and a member of the President’s Information Technology Advisory Committee. “It has created a climate over time that has put them in a deep hole.”

In 1995, the agency’s vast paper files were partly loaded into a searchable electronic database. That system was soon found to be inefficient and riddled with software bugs, and some top officials never adopted it.

Until recently, much of the rest of the agency -- including some field agents -- lacked such basic features as e-mail.

After the Sept. 11 attacks, the FBI’s hardware, software and communications networks, built up at the cost of hundreds of millions of dollars, were revealed as severely outmoded.

Many important records were never loaded onto the system. Key clues within the FBI’s files, including the revelation that Osama bin Laden was sending students to the United States for flight training, were overlooked.

The system was unable to search across the agency’s vast data stores, housed in many separate systems -- impeding the investigation of the hijackers’ likely accomplices.

But the bureau’s ambitions were far greater than just fixing existing problems. In 2002, FBI officials said that within three years they would be able to tie together case files with modern database software that would also check other government databases and the Internet to instantly learn about suspects or explore possible leads.

The linchpin of the planned computer system was the Virtual Case File software, designed for use on fast computers and networks to make it widely accessible across the FBI’s scores of offices. The plan was to include images, video and sounds, crucial components of a modern investigation.

The agency’s ultimate goal, it has said, is to build predictive intelligence into its systems -- discerning terrorist plans from billions of data points in order to prevent an attack.

But predictive data mining, regarded by computer scientists as a holy grail of the field, is still largely unproven, Spafford said.

“Data mining has a lot of potential,” he said, “but it’s probably not capable of a lot of the things people say it is.”

*

Schmitt reported from Washington and Piller from San Francisco.
