
Wednesday, June 11, 2014

Validation tests and comparing results

As a follow-up to yesterday's post: any time you're heading down a new path or planning to start using a new tool, it's important to validate it against a known data set. As an example, the folks at Digital Assembly publish a comparison of their tool's results against those of other popular tools. They go the extra step of linking to the reference data sets so that you can conduct the tests yourself.
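
To make that concrete, here's a minimal sketch of what "comparing results against a known data set" can look like in practice: hash every file your carving tool recovered and intersect those digests with the reference set's published hash list. The names below (carved_output, reference_hashes.txt) are placeholders, not anything from Digital Assembly's materials, and SHA-1 is just a common choice for this kind of manifest.

import hashlib
import pathlib

def sha1_of(path):
    """Return the SHA-1 hex digest of a file's contents."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def load_manifest(manifest_path):
    """Load the known-good hashes, one hex digest per line (extra columns ignored)."""
    with open(manifest_path) as f:
        return {line.split()[0].lower() for line in f if line.strip()}

def recovery_rate(recovered_dir, manifest_path):
    """Fraction of the reference set that the tool recovered bit-for-bit."""
    known = load_manifest(manifest_path)
    recovered = {sha1_of(p) for p in pathlib.Path(recovered_dir).rglob("*") if p.is_file()}
    return len(known & recovered) / len(known)

if __name__ == "__main__":
    # Hypothetical paths - point these at your tool's output and the data set's hash list.
    rate = recovery_rate("carved_output", "reference_hashes.txt")
    print(f"Recovered {rate:.1%} of the known file set")

Note that a hash match only credits bit-perfect recoveries; files carved partially or with padding will score as misses, so a full validation would also compare file counts and inspect the near-misses by hand.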

If you're an EnCase 6 user and you validate the tool against this disk image, you'll miss almost two thirds of the recoverable images. (This also speaks to the importance of keeping your programs up to date - and of validating the updates.)

If you don't validate your tools before using them on casework, you're headed for trouble. Just because you haven't been asked the validation questions in court doesn't mean that you won't be in the future. You've just been lucky. What would happen to your case and your reputation if you're just trying to get something done with EnCase 6 - but the opposing expert is using APF? "How do you account for the fact that your tools / techniques couldn't recover the correct number of files, or correctly recover the files in question in this case?" "Are your tools / techniques not reliable and repeatable?" What would happen if the frame or frames in question were dropped by your tool and you didn't know it? When there's blood in the water ...

Before you go down this road, reach out to folks with experience validating their tools. Get known data sets to use in testing. Test, test, re-test ... If the tool has issues, don't use it on casework.
