Sunday, August 18, 2019

First, do no harm

In an interesting article over at The Guardian, Hannah Fry, an associate professor in the mathematics of cities at University College London, noted that mathematicians, computer engineers, and scientists in related fields should take a Hippocratic oath to protect the public from powerful new technologies under development in laboratories and tech firms. She went on to say that "the ethical pledge would commit scientists to think deeply about the possible applications of their work and compel them to pursue only those that, at the least, do no harm to society."

I couldn't agree more. I would add forensic analysts to the list of people who should take that oath.

I look at the state of the digital / multimedia analysis industry and see places where this "do no harm" pledge would re-orient the relationship that practitioners have with science.

Yes, as someone who swore an oath to protect and defend the Constitution of the United States (as well as that of the State of California), and as someone who had Bill Bratton's "Constitutional Policing" beaten into him (not literally, people), I understand fully the relationship between the State and the Citizen. In the justice system, it is for the prosecution to offer evidence (proof) of its assertions. This simple premise - innocent until proven guilty - separates the US from many "first world" countries.

I've been watching several trials around the country and noticed an alarming trend - junk procedures. Yes, junk procedures, not junk science, because there seems to be no science behind the procedures - which serve merely as a pretty frame for lofty rhetoric. This trend can be beaten back if both sides agree to stick to the rules and do no harm.

Realizing that I've spent the majority of my career as an analyst in California, and that California is a Frye state, I'll start there in explaining how we, as an industry, can avoid junk status and reform ourselves. Let's take a look.

You might remember that prior to Daubert, Frye was the law of the land. The Frye standard is commonly referred to as the “general acceptance test” under which generally accepted scientific methods are admissible, and those that are not sufficiently established are inadmissible.

The Frye standard comes from the case Frye v. United States, 293 F. 1013 (D.C. Cir. 1923), in which the defendant, who had been charged with second-degree murder, sought to introduce testimony from the scientist who conducted a lie detector test.

The D.C. Court of Appeals weighed expert testimony regarding the reliability of lie detector test results. The court noted: "Just when a scientific principle or discovery crosses the line between the experimental and demonstrable stages is difficult to define…. [W]hile courts will go a long way in admitting expert testimony deduced from a well-recognized scientific principle or discovery, the thing from which the deduction is made must be sufficiently established to have gained general acceptance in the particular field in which it belongs."

The last part of that sentence is where I want to go with Frye - "in the field in which it belongs."

There is an emerging trend, highlighted in the Netflix series Exhibit A, where [ fill in the type of unrelated technician ] is venturing into digital / multimedia analysis and working cases. They're not using the generally accepted methods within the digital / multimedia analysis community. They're not following ASTM standards / guidelines. They're not following SWGDE's best practices. They're doing the work from their own point of view, using the tools and techniques common to their discipline. Oftentimes, their discipline is not scientific at all, and thus there is no research or validation history on their methods. They're doing what they do, using the tools they know, but in a field where it doesn't belong. Their tools and techniques may be fine in their own discipline - but there has been no research on their use in ours. Thus, before they engage in our discipline, they should validate those tools and techniques appropriately - in order to do no harm.

Let's look at this not from the standpoint of my opinion on the matter. Let's look at this from the five-part Daubert test.

1. Whether the theory or technique in question can be and has been tested. Has the use of [ pick the method ] been tested? Remember, a case study is not sufficient testing of a methodology according to Daubert.

2. Whether it has been subjected to peer review and publication. There are so few of us publishing papers, and so few places to publish, that this is a big problem in our industry. Combine that with the fact that most publications are behind paywalls, making research on a topic very expensive.

3. Its known or potential error rate. If there is no study, there really can't be a known error rate.

4. The existence and maintenance of standards controlling its operation. If it's a brand new trend, then there really hasn't been time for the standards bodies to catch up.

5. Whether it has attracted widespread acceptance within a relevant scientific community. The key word for me is not "community" but "scientific." There are many "communities" in this industry that aren't at all "scientific." Membership organizations in our discipline focus on rapidly sharing information amongst members, not advancing the cause of science.

So pick the emerging trend. Pick "Headlight Spread Pattern." Pick "Laser Scan Enabled Reverse Projection." Jump into any research portal - EBSCO, ProQuest, or even Google Scholar. Type in the method being offered. See the results ...

The problem expands when someone finds an article, like the one I critiqued here, that seemingly supports what they want to do, whilst ignoring the article's limitations section or the other articles that may refute its assertions. This speaks to the need for a "research methods" requirement in analysts' certification programs.

If you're venturing into novel space, did you validate your tool set? Do you know how? Would you like training? We can help. But remember that people's lives, liberty, and property are at stake (and that the accused are innocent until proven guilty). Can we at least agree to begin our inquiries from the standpoint of "first, do no harm"?
