Law-enforcement technology
Computational forensics determines the rarity of a fingerprint
Crime scene forensic analysis has long functioned on the premise that a person’s unique identity is hidden in the tiny loops and swirls of their fingerprints, but teasing that information out of the incomplete prints left at crime scenes is still an inexact science, at best.
Now, a University at Buffalo professor — who in 2001 provided the first scientific evidence that fingerprints truly are unique — has developed a way computationally to determine the rarity of a particular fingerprint and, thus, how likely it is to belong to a particular crime suspect.
The paper, “Evaluation of Rarity of Fingerprints in Forensics,” was presented by Sargur N. Srihari, Ph.D., co-author and SUNY Distinguished Professor in the UB Department of Computer Science and Engineering, at the Neural Information Processing Systems (NIPS) conference held Monday in Vancouver.
By combining machine learning with the automated extraction of specific patterns or features in a fingerprint, and then comparing those features against large databases of random fingerprints, Srihari and co-researchers can compute the probability that a specific fingerprint would randomly match another in a database of a given size.
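To make the arithmetic behind that statement concrete, the sketch below computes the chance of at least one random match in a database of a given size. The per-comparison probability p and the database size n here are hypothetical placeholders, not figures from the UB study, whose model estimates such probabilities from the specific features of a print.

```python
# Minimal sketch (illustrative only, not the UB model): if a fingerprint
# configuration has probability p of matching one randomly chosen print,
# the chance of at least one match among n independent prints is
# 1 - (1 - p)^n.

def prob_random_match(p: float, n: int) -> float:
    """Probability of at least one random match in a database of n prints,
    given a per-comparison match probability p."""
    return 1.0 - (1.0 - p) ** n

# Hypothetical numbers for illustration:
p = 1e-7         # assumed per-comparison random-match probability
n = 1_000_000    # assumed database size
print(f"P(at least one random match among {n:,} prints) = "
      f"{prob_random_match(p, n):.4f}")   # ~0.0952
```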
The UB research is the first attempt to determine the rarity of a fingerprint using computational tools.
“Current procedures for forensics do not provide a measured accuracy for fingerprint analysis,” says Srihari.
The UB research lays the groundwork for the development of computational systems that could, for the first time, quickly and objectively reveal just how meaningful the fingerprint evidence in a given case is.
The research directly addresses some of the profound shortfalls identified by the National Academy of Sciences’ Committee on Identifying the Needs of the Forensic Science Community, on which Srihari served with other national experts from 2007 to 2009. Some of the committee’s recommendations dealt specifically with fingerprints, including the need for baseline standards to be used with computer algorithms to map, record, and recognize features in fingerprint images.
“When we look at DNA, we can say that the likelihood that another person might have the same DNA pattern as that found at a crime scene is one in 24 million,” Srihari explains.
“Unfortunately, with fingerprint evidence no such probability statement can be made. Our research provides the first systematic approach for computing the rarity of fingerprints in a scientifically robust and reliable manner.”
Part of the difficulty is due to the intrinsic nature of fingerprint evidence, he says: latent prints are often invisible to the naked eye and have to be lifted using either powder or ultraviolet illumination.
According to Srihari, two types of uncertainty are involved in fingerprint analysis: the similarity between two fingerprints and the rarity of a given configuration of ridge patterns.
“Human examiners describe the results of their analyses in one of three ways: likely to confirm identity, called individualization; unlikely to confirm identity, called exclusion; or inconclusive,” he says. “A probability statement as to how rare a specific fingerprint is would be a dramatic improvement in the way that such evidence is currently described to juries.”
Forensic analysis depends on something called a likelihood ratio, which is the ratio between the probability that the evidence found at the scene and the known data — for example, a suspect’s fingerprint — come from the same source and the probability that they come from different sources.
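In symbols, that ratio is LR = P(evidence | same source) / P(evidence | different sources); values above 1 favor the same-source hypothesis, values below 1 favor a different source. A minimal sketch, with hypothetical probabilities standing in for the model’s estimates:

```python
# Illustrative likelihood-ratio calculation (hypothetical numbers, not
# output of the UB system):
#
#   LR = P(evidence | same source) / P(evidence | different sources)

def likelihood_ratio(p_same: float, p_diff: float) -> float:
    """Ratio of the probability of the observed fingerprint similarity
    under the same-source hypothesis to its probability under the
    different-source hypothesis."""
    return p_same / p_diff

# Assumed values for illustration: the observed similarity is fairly
# likely if the prints share a source, and very unlikely otherwise.
lr = likelihood_ratio(p_same=0.85, p_diff=0.002)
print(f"Likelihood ratio = {lr:.0f}")   # 425: evidence favors same source
```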
The new method developed at UB uses machine learning, a branch of artificial intelligence in which machines learn from examples through the use of statistics and probability. The UB researchers used machine learning to predict the core point, usually found near the center of the fingerprint, around which the ridges flow.
“In forensic analysis, the fingerprints are usually incomplete,” Srihari explains. “Thus a guess has to be made as to which part of the finger it came from. Our approach allows us to predict the core point and thus orient the print for further analysis.”
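The article does not detail the predictor itself, so the sketch below uses a classical, non-learning baseline instead: the Poincaré index over a block-wise ridge-orientation field, a standard way to locate a core point. The numpy-based detector and its orientation-field input are assumptions for illustration, not the authors’ machine-learning method.

```python
# Illustration only: a classical Poincare-index core-point detector, NOT
# the machine-learning predictor described in the UB work. It scans a
# precomputed ridge-orientation field (one angle in radians per image
# block, defined modulo pi) and flags blocks where the orientation turns
# through +pi around a 2x2 neighborhood, the signature of a core point.
import numpy as np

def poincare_index(orientations: np.ndarray) -> np.ndarray:
    """Poincare index at each interior block of an orientation field."""
    h, w = orientations.shape
    index = np.zeros((h, w))
    for i in range(h - 1):
        for j in range(w - 1):
            # Angles around the 2x2 block, traversed as a closed loop.
            loop = [orientations[i, j], orientations[i, j + 1],
                    orientations[i + 1, j + 1], orientations[i + 1, j],
                    orientations[i, j]]
            total = 0.0
            for a, b in zip(loop, loop[1:]):
                d = b - a
                # Wrap each difference into (-pi/2, pi/2], since ridge
                # orientations are only defined up to a half turn.
                while d > np.pi / 2:
                    d -= np.pi
                while d <= -np.pi / 2:
                    d += np.pi
                total += d
            index[i, j] = total
    return index

def find_core(orientations: np.ndarray):
    """Return (row, col) of the block with Poincare index closest to +pi
    (a core point), or None if no block qualifies."""
    idx = poincare_index(orientations)
    i, j = np.unravel_index(np.argmax(idx), idx.shape)
    return (int(i), int(j)) if idx[i, j] > np.pi / 2 else None
```

Once a core point is located by whatever means, the partial print can be rotated and translated into a common frame around it, which is the role Srihari describes for the predicted core point.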
Srihari’s co-author is Chang Su, a doctoral candidate in the UB Department of Computer Science and Engineering in the School of Engineering and Applied Sciences.
The research was supported by a grant from the U.S. Department of Justice.
— Read more in Sargur N. Srihari, “Beyond C.S.I.: The Rise of Computational Forensics,” IEEE Spectrum (December 2010); and Chang Su and Sargur N. Srihari, “Evaluation of Rarity of Fingerprints in Forensics” (paper presented at the Neural Information Processing Systems (NIPS) conference, 6 December 2010, Vancouver, B.C., Canada)