Smudge science

Jason Felch has written extensively about forensic evidence, especially DNA, for The Times

When Thomas and Ann Farrow were found murdered in their paint shop, their heads crushed with a blunt object, the only clue was a bloody right thumbprint on the store’s empty cash box.

The brazen murder shocked the people of Deptford, a sooty industrial suburb of London. They clamored for police to find the killer.

The year was 1905. Forensic science was in its infancy. Scotland Yard had only recently begun collecting carefully pressed fingerprints from criminals, stashing the cards in pigeonholes of a makeshift filing system.


But Scotland Yard Inspector Charles Collins believed that the bloody print could help him solve the crime. After learning that a man named Alfred Stratton had been seen near the crime scene, he collected the unemployed ruffian’s thumbprint and compared it with the one left at the crime scene. A close inspection showed there were 11 minute features that the two prints shared.

The prosecutor at Stratton’s trial told jurors the similarities left “not the shadow of a doubt” that the crime-scene print belonged to Stratton.

But the defense had a surprising ally at their table: Henry Faulds, a Scottish doctor who two decades earlier was the first to propose using fingerprints to solve crimes.

Faulds believed that even if fingerprints were unique -- there was, after all, no scientific basis for the popular assumption -- the same was not necessarily true of “smudges,” the blurry partial prints accidentally left behind at crime scenes in blood, sweat or grease.

A single bloody thumbprint, he felt, was not enough evidence to convict anyone of murder.

Stratton’s trial would be the first test of the new science of fingerprinting, and it raised concerns that, more than a century later, still have not been addressed.

Today, fingerprints are once again on trial.

In 2007, a Maryland judge threw out fingerprint evidence in a death penalty case, calling it “a subjective, untested, unverifiable identification procedure that purports to be infallible.”


The ruling sided with the scientists, law professors and defense lawyers who for a decade had been noting the dearth of research into the reliability of fingerprinting. Their lonely crusade for sound science in the courtroom has often been ignored by the courts, but last month it was endorsed by the prestigious National Academy of Sciences.

The question is not whether fingerprints are unique -- most scientists agree they probably are, though that assumption remains largely unstudied. The issue is whether the blurry partial prints often found at crime scenes -- what Faulds called “smudges” -- are sufficient to identify someone with any reliability.

The answer: No one knows. There are no national standards for declaring a fingerprint “match.” As a result, fingerprint identifications are largely subjective.

For ages, people have marveled at the immutable ridges, arches, loops and whorls embedded in every fingertip. Believing them unique, ancient Babylonians pressed their fingers into wet clay tablets to sign legal contracts.

But it was not until the 1880s that Faulds discovered their utility as a forensic tool. He had begun cataloging the curious impressions when someone stole alcohol from his laboratory, according to Colin Beavan, the author of a book about Faulds and the Stratton trial. Faulds used the fingerprints left on the glass vial to identify the culprit -- the first known use of latent prints to solve a crime.

But by the time of Stratton’s trial in 1905, fingerprinting had moved from the realm of scientists to that of police agencies.


Faulds was sitting silently at the defense table, Beavan wrote, stewing bitterly. The limitations of his technique were being ignored.

“The least smudginess in the printing of them might easily veil important divergences ... with appalling results,” Faulds wrote in a book that year. Police were “apt to misunderstand or overstrain, in their natural eagerness to secure convictions.”

His warnings were ignored. Jurors took just two hours to decide Stratton’s fate, with the fingerprint as the only piece of evidence linking him to the crime. He and his brother Albert, who was tried alongside him, were hanged 19 days later.

The concerns Faulds raised would go unanswered and largely ignored for decades as fingerprints became definitive proof of identity. What had started as a hypothesis for 19th century scientists became an article of faith for forensic scientists and the courts in the 20th century, says Michael Saks, the author of several articles on the social history of identification sciences.

When fingerprints were first used in an American court in a 1920 Chicago murder trial, a juror told reporters that “fingerprints and fingerprints alone convinced us.” Ever since, experts have claimed their power to eliminate any doubt.

That air of certainty soon carried over to other emerging forms of forensic identification. Handwriting, shoe prints, tire tracks, bite marks -- all were asserted to be reliable identifiers, based largely on faith and police experience rather than any rigorous scientific study. Even the hard science of DNA evidence gained credibility in its early days by calling itself “genetic fingerprinting.”


Even today, fingerprint experts present their conclusions as nothing short of certainty. Many testify that fingerprinting has an error rate of zero. Few judges have been willing to question such statements, fearful of contradicting a century of legal precedent.

Only recently, with the advent of DNA evidence, have the “appalling results” that Faulds warned of begun to come to light.

In 2004, the Boston Police Department was forced to shut down its fingerprint lab after a “glaring mistake” led to a wrongful conviction. That same year, the FBI’s top fingerprint analysts were forced to admit that they were wrong after claiming to be “absolutely confident” that a fingerprint had linked a lawyer in Oregon to the Madrid train bombings. The Los Angeles Police Department is now reviewing nearly 1,000 fingerprint cases after an internal review that found two people had been wrongfully accused by fingerprint “matches.”

If the National Academy report succeeds in forcing the courts to ponder questions first raised a century ago, Faulds, who went to his grave in 1930 still angry that the limits of “smudges” were being ignored, might finally rest in peace.

