
Is forensic science tipping the scales of justice? A new report reignites the call to make forensic science more quantitative, and casts doubt on work done in the lab.

For decades, an expert's testimony under oath was good enough in an American courtroom. An analyst could compare bite marks or ballistics, link evidence from a crime scene to a suspect, and tell a jury the two matched, based solely on the expert's opinion. The defense attorney's role was to cross-examine that expert and try to cast doubt on the testimony.

But those dynamics began to change in 2009. A report by the National Research Council (NRC) contended that some aspects of science used in courtrooms did not have a quantitative, or sufficiently scientific, foundation. As DNA science continued to improve, and prisoners continued to be exonerated and released after lengthy times behind bars for crimes they didn't commit, a sea change gradually began to crest.

Seven years later, the wave is still swelling. There are still calls to quantify forensic evidence. An entirely new report released in September 2016 by the President's Council of Advisors on Science and Technology (PCAST) reignited the call for a statistical approach to justice. Efforts are underway to make forensic science more quantitative, in order to provide juries with clear probabilities that the court system has gotten the right person.

But only time will tell how this shift balances--or tips--the scales of American justice.

PCAST report

The 2009 report by NRC essentially recommended a complete overhaul of the forensic sciences. "Strengthening Forensic Science in the United States: A Path Forward" called for major reforms to the criminal justice system, and the establishment of national forensic scientific standards.

The report upended the traditional weighing of some forensic evidence that had been presented in American courtrooms for decades. Most notably, hair analysis and bite marks were singled out.

The FBI admitted in April 2015 that flawed science tainted hundreds of cases over the course of two decades, prompting multiple states to review thousands of other convictions involving the subjective techniques.

It wasn't just a handful of cases, and these were not short stints behind bars. Santae Tribble was convicted of the 1978 felony murder of taxicab driver John McCormick in the nation's capital. At his 1980 trial, an FBI-trained expert said hairs found at the scene had a "1-in-10-million chance" of not being from Tribble. Decades in prison later, DNA proved otherwise, and Tribble was released in 2012.

The reliability of bite marks has also been shaken to its foundations. Exonerations from overturned dental evidence have freed prisoners after decades behind bars. Keith Allen Harward served 33 years in prison, three of them on death row, for a murder and rape he did not commit, convicted based solely on bite mark evidence. He was declared innocent by the Virginia Supreme Court in April, and walked away a free man.

"The fact that this case involved an innocent man who faced the death penalty should terrify everyone, not just in the state of Virginia but also in the 31 other states that still have the death penalty," Olga Akselrod, an Innocence Project attorney, told Laboratory Equipment.

Multimillion-dollar lawsuits from freed inmates have proliferated in the past two years. Tribble won a $13.2-million wrongful conviction lawsuit against Washington, D.C. Most often, as in the Tribble and Harward cases, it is newly unearthed DNA evidence and representation by groups like the Innocence Project that have resulted in the reversals of fortune for the wrongly convicted.

Critics of forensic science, and of the criminal justice system at large, contend it's not just hair and bite marks: other forensic sub-disciplines also need a complete retooling. DNA has set the bar for quantitative assurances ("one-in-a-million" chances) in courtrooms. Indeed, seven years after that National Research Council report, there are still calls to quantify forensic evidence--and for the scientific method to validate those probabilities. And that's where the PCAST report has planted its flag.

The PCAST report was written by many of the same authors as the 2009 National Research Council report, counting among its number nine federal judges, a former U.S. Solicitor General, a state supreme court justice, law school deans and statisticians. Overall, the new report reiterates the need to balance the scales, which the authors contend are still tilted in favor of prosecutors and law enforcement.

For instance, the 160-page report argues it's no longer enough for an expert to compare two pieces of ballistic evidence and tell a jury that they match. Instead, the expert should describe the similarity to the jury, and then put into context how likely it is that another gun could have been used in the crime and mistaken for the accused's firearm.

Overall, the report contends that the culture needs to change to focus on fact-finding and numbers crunching in the laboratory--not on-the-job experience during criminal investigations. "Casework is not scientifically valid research, and experience alone cannot establish scientific validity," the report states. "In particular, one cannot reliably estimate error rates from casework because one typically does not have independent knowledge of the 'ground truth' or 'right answer.'" Forensic science results must be "repeatable, reproducible and accurate," the authors add.

Among other things, the report's forensic science recommendations included:

* DNA mixtures. The combined probability of inclusion (CPI) analysis of complex mixtures, which involves some subjective estimations by an analyst, does not have a valid foundation, the report found. Instead, probabilistic genotyping software needs to be further validated by third parties. Some advanced pieces of software like TrueAllele crunch the complex calculations to standards, but PCAST contends the programs need more vetting.

* Bitemarks and hair analysis do not meet scientific standards, the panel said. They left little room for ever using those comparison methods in criminal trials--unless DNA samples could be taken from saliva in the bitemarks or from hair follicles.

* Shoeprints and footwear comparisons, beyond size and make, are not scientifically valid.

* Latent fingerprints are "heading in the right direction" from a forensic science standpoint, especially given a recent FBI "black box" analysis.

* Firearms and ammunition comparisons need to be better explained to juries, especially in a quantitative sense. But a fully automated analysis method could arrive in the near future, which would provide further validation.
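The CPI statistic at the center of the DNA-mixtures recommendation is simple enough to sketch. Below is a minimal illustration with invented allele frequencies and hypothetical function names; it is not TrueAllele's probabilistic-genotyping approach, which models mixtures far more rigorously:

```python
# Sketch of the combined probability of inclusion (CPI) statistic
# criticized in the PCAST report. Allele frequencies below are
# invented for illustration, not real population data.

def locus_pi(allele_freqs):
    """Probability of inclusion at one locus:
    (sum of the observed allele frequencies) squared."""
    return sum(allele_freqs) ** 2

def combined_pi(loci):
    """CPI across independent loci is the product of per-locus values."""
    cpi = 1.0
    for freqs in loci:
        cpi *= locus_pi(freqs)
    return cpi

# Hypothetical three-locus mixture.
mixture = [
    [0.10, 0.20],        # locus 1: two alleles observed
    [0.05, 0.15, 0.25],  # locus 2: three alleles observed
    [0.30, 0.10],        # locus 3: two alleles observed
]
print(combined_pi(mixture))  # roughly 0.0029: about 0.3% of people "included"
```

The weakness PCAST highlights is visible even in this toy version: CPI treats every allele in the mixture as equally usable, so the analyst's subjective call about which peaks to include directly changes the number reported to a jury.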

The PCAST panel touted a massive 2002 FBI study of hair analysis as a "landmark of forensic science" due to its scientific standards. That study eventually led to the Bureau's admission of the hair analysis problem last year, and the complete reconsideration of thousands of cases.

Science answers the call

The PCAST group also recommended that the FBI Laboratory assume more of a lead role in the ongoing "overhaul" of the forensic sciences through training, research and validation of existing and future technologies. More government funding should be directed to agencies to further forensic scientific study, they added.

"The total level of federal funding by NIJ, NIST and NSF to the academic community for fundamental research in forensic science is extremely small," they write. "Substantially larger funding will be needed to develop a robust research community and to support the development and evaluation of promising new technologies."

Taking their cue, a number of scientific organizations and government agencies have already begun their own forays into the fundamentals of the forensic sciences.

The American Association for the Advancement of Science (AAAS) has undertaken an investigation of 10 evidentiary categories in U.S. forensic science. Ballistics and tool marks, latent fingerprints and arson investigations are the first three. The next seven are: bloodstain pattern analysis, digital evidence, footwear and tire tracks, bitemark analysis, fiber trace evidence, hair trace evidence, and trace evidence of paint and other coatings. The results will be released on a rolling basis, said Mark Frankel, director of the association's project.

"We expect the reports emerging from the project to encourage basic research and contribute to improving the quality of forensic science used in the legal system," Frankel said last year. "The project's impact could be transformational for the criminal justice system, enabling the public to have confidence that the ability to convict the guilty and exonerate the innocent is advanced."

The National Institute of Standards and Technology (NIST) is another agency working to develop quantitative platforms that automatically generate information usable in any American courtroom. NIST hosted its annual forensic conference in November, putting on display many of its working models for standardizing and quantifying some of the realms of evidence.

"With a lot of these, you can collect data, and feel very confident that what you're looking at is viable. All of them are done automatically, to a greater or lesser extent, but there's always a human being there to quantify a search," Robert Thompson, senior forensic science research manager at NIST, told Laboratory Equipment. "How do we tell the jury, 'Here, my opinion is this,' and back it up with a scientific, objective measurement that is standardized?"

The new processing methods include those for fingerprints, ballistics, DNA mixtures, footwear impressions, drug identification, fiber trace evidence, more sensitive DNA analysis and bloodstain pattern recognition:

* SHOECALC is a software program currently under development. The system is touted as going beyond simple side-by-side comparison to include matching algorithms, similarity scores and metrics that even take into account the quality of the image itself. The goal is statistical comparisons that go beyond just the make and size of a shoe, instead using the minutest details to compare unique shoes to impression evidence from the scene.

* Drug analysis using nuclear magnetic resonance spectroscopy is also being explored by NIST scientists, especially in reference to emerging synthetic drugs. Other presentations at the conference assessed the quality of existing gas chromatography mass spectrometry libraries in analyzing seized drugs, and toxicology exams.

* Fingerprint biometrics were the subject of a handful of presentations at the NIST conference. One of the featured projects is determining likelihood ratios for friction-ridge details. Elham Tabassi, a NIST engineer giving one of the presentations, said in an interview that the fingerprint comparisons will be automatically processed to a degree beyond simple match probability. "These are really powerful algorithms," Tabassi said. "We're trying to automate the process as much as possible."

* Ballistics and toolmark comparisons are another major category of evidence. The National Integrated Ballistic Information Network (NIBIN) run by the Bureau of Alcohol, Tobacco, Firearms and Explosives is rapidly being upgraded in both capability and efficiency. But there was also a promising new open-access 3-D ballistics database unveiled earlier this year by NIST scientists Thompson and Alan Zheng. Still, courtroom use is years away: only 1,600 test fires have been uploaded into the new dataset thus far. Thompson estimates that it will take five years to get a big enough population of firearms to start making advanced statistical analyses with algorithms. Part of the bottleneck is the relative scarcity of the microscopes needed to complete the standard analysis: only a handful are currently available to U.S. law enforcement agencies. Once the sampling is big enough, it may take another five years for the database to appear in American courtrooms.
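The likelihood-ratio framing NIST researchers describe for friction-ridge comparison can be sketched in miniature. In this hedged illustration, the similarity-score distributions, their parameters and the function names are all invented for the example; they are not NIST's actual algorithms:

```python
import math

def normal_pdf(x, mean, std):
    """Density of a normal distribution at x."""
    coeff = 1.0 / (std * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((x - mean) ** 2) / (2.0 * std ** 2))

def likelihood_ratio(score, same=(0.8, 0.1), different=(0.3, 0.1)):
    """LR = P(score | same source) / P(score | different source).

    The (mean, std) pairs model how similarity scores distribute for
    true matches vs. non-matches; the numbers are purely illustrative."""
    return normal_pdf(score, *same) / normal_pdf(score, *different)

# A score near the "same source" distribution yields LR >> 1,
# favoring the same-source hypothesis; a low score favors the defense.
print(likelihood_ratio(0.75) > 1)   # True
print(likelihood_ratio(0.35) < 1)   # True
```

The appeal of this framing is that it answers Thompson's question directly: instead of "my opinion is this," the examiner can report a number comparing how well the evidence fits the two competing hypotheses.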

Battle lines drawn in the courtroom

Already there are conflicts over how this new trend in crime labs will play out. For instance, ballistics became a point of contention in the trial of an infamous serial killer earlier this year.

The Grim Sleeper was a killer who terrorized tough areas of Los Angeles off and on from the 1980s into the 2000s. Lonnie Franklin, Jr., now 63, was identified and linked to the murders of 10 young women through advanced familial DNA database searching.

The defense argued that other men's DNA profiles were found on the women, many of whom were sex workers. But the defense attorneys also contended that the ballistics work linking the same gun to seven of the fatal shootings was not scientific.

Los Angeles County prosecutors presented several firearms experts who linked the bullets in seven victims to the same .25-caliber handgun.

However, defense expert witness David Lamagna, owner of the Massachusetts-based firm American Forensic Technologies, testified that the prosecution experts' analyses were incomplete. They had used only 2-D microscope techniques to analyze the tool marks on the bullets, he said--and only 3-D-mapping electron microscopes or other advanced technologies can be trusted, he testified.

"I believe it's mostly subjective in nature at this point," said Lamagna about the state's firearms experts during his testimony.

Lamagna also noted that the Association of Firearm and Tool Mark Examiners had "no mathematical standards" for visual comparison.

The jury ultimately dismissed the defense's argument and convicted Franklin on all counts. He was later sentenced to death, and awaits execution.

Forensic practitioners say the U.S. criminal justice system has been appropriately using science like 2-D ballistics matches for decades.

The National District Attorneys Association (NDAA) slammed the PCAST report, saying that the panel had thrown out reams of scientific findings accumulated by analysts at the scene and in the lab.

"It is unfortunate that members of PCAST, none of whom are forensic practitioners who have been trained or tested for competence in the forensic disciplines, ignored vast bodies of research, validation studies and scientific literature authored by true subject matter experts," said Mike Ramos, the NDAA president.

The FBI itself released a statement agreeing that its laboratory needed more funding, but went on to blast most of the PCAST allegations.

"The report makes broad, unsupported assertions regarding science and forensic science practice," the Bureau wrote. "The PCAST criteria define 'black box' studies as the benchmark to demonstrate foundational validity, but provide no clarification on how many studies are needed or why some studies that have been conducted do not meet their criteria.

"These criteria seem to be subjectively derived and are therefore inconsistent and unreliable," the FBI added.

For some, the PCAST report is simply the opening of the dialogue to continue to refine science to its best standards. The American Academy of Forensic Sciences (AAFS), which represents some 7,000 of the most experienced forensic experts in the country, cautiously welcomed improvements to the science.

"The PCAST report is an important start to the discussion of scientific validity and we look forward to continuing that discussion with the larger community of forensic science practitioners," reads the October AAFS statement.

by Seth Augenstein, Senior Science Writer
COPYRIGHT 2017 Advantage Business Media
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Author:Augenstein, Seth
Publication:Laboratory Equipment
Article Type:Cover story
Date:Jan 1, 2017
