First, the definition of forensic science again: "Forensic science is the systematic and coherent study of traces to address questions of authentication, identification, classification, reconstruction, and evaluation for a legal context."
Forensic science thus includes:
- authentication
- identification
- classification
- reconstruction
- evaluation
The type of work performed in the examples on ACE's website clearly indicates that the Camera Match Overlay is a tool for reconstruction. This is how the product is being positioned in the market. Camera Match Overlay is an addition to ACE, not part of its basic functionality. If you're not involved in reconstruction, you can skip the Overlay tool and save a few bucks.
What ACE's basic functionality excels at is "evaluation." What's in the container? How should it be handled? Those file-triage-type questions. Once answered, it's a short trip to repackaging the data in a format that is playable for the end user. But remember, evaluation has its own set of rules.
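To make the file triage idea concrete, here is a minimal sketch of asking a container what it actually holds. This is not how ACE does it internally; it's just an illustration using Python and the widely available ffprobe utility, and the file name is hypothetical.

```python
import json
import subprocess

def probe_container(path: str) -> dict:
    """Ask ffprobe what container format and streams a media file holds."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

# Hypothetical proprietary DVR export. The triage questions: what codecs,
# frame rates, and streams are actually inside this file?
info = probe_container("dvr_export.dav")
for stream in info["streams"]:
    print(stream["codec_type"], stream.get("codec_name"), stream.get("avg_frame_rate"))
```

Once you know what's inside, "repackaging the data in a format that is playable" is often a lossless rewrap of the original streams into a standard container rather than a re-encode - consistent with the point that evaluation has its own set of rules.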
ACE is also really good at reconstruction - syncing and linking separate video streams. Reconstruction has its own rules, workflow, and toolset. Reconstruction attempts to illustrate a theory of the sequence of events in question. Reconstruction is not authentication, identification, or classification - each of which has its own rules, workflows, and toolsets.
With that in mind, the second set of questions deals with training and tools.
A 16-hour training session on Camera Match Overlay's operation and use is likely sufficient for a technician to know which buttons perform which functions across a variety of use cases. What it is not is a comprehensive education in photogrammetry. Because the focus of tool-specific training is the tool, we've split off the foundational education side into separate, non-tool-specific deep dives so that you get an unbiased exploration of the discipline from a neutral third party. If you're giving technician-level testimony (no opinion offered), tool-specific training is likely enough. But if you're offering an opinion (even passively), then you need a foundational education in the discipline in which you're engaged.
The third set of questions deals with the legal aspects of evidence hearings.
Keeping in mind that I'm not an attorney, consider the rules that govern evidence hearings (Frye / Daubert). Both types of hearings have as a foundational element what is commonly known as the "general acceptance test." Generally accepted scientific methods are admissible, and those that are not sufficiently established are inadmissible.
Can a tool or technique without a history of publication or validation be "sufficiently established"?
Camera Match Overlay technology is new. It's the "shiny new object" for reconstruction exercises. The resulting videos become an amazing demonstrative aid to one's testimony, using the power of stunning visuals to illustrate one's theory of a case. But bear in mind that it's only a demonstrative illustration of a single theory. There may be other theories worthy of exploration. If you're engaged in science, Daubert requires that you explore those other theories. If you're just engaged in trial support, and thus have no opinion, then go right ahead and create those stunning visuals.
All of this requires a bit of honesty. When I've simply retrieved files, I'm engaged in technician-level work. When I've clarified and enlarged a frame, I've engaged in technician-level work. These activities can support an analysis, and thus help to illustrate one's opinion, but they're not "analysis" in and of themselves. From the Frye ruling, "while courts will go a long way in admitting expert testimony deduced from a well-recognized scientific principle of discovery, the thing from which the deduction is made must be sufficiently established to have gained general acceptance in the field in which it belongs." When I want to offer an opinion, I must use tools and techniques that have been sufficiently established in my field. If I want to use "reconstruction" tools to reinforce my opinion in an "identification" exam, those tools must be sufficiently established within the realm of "identification." At this time, there are no studies validating the use of the Camera Match Overlay technology and methods for "identification" or "classification."
There are no studies involving the product at all. It's brand new. I'm certainly open to participating in validation studies, if anyone wants to engage our services. But for now, Camera Match Overlay seems to belong to the world of reconstruction until validated otherwise.
Thanks for reading. Have a great day, my friends.