Thursday, September 27, 2012

FourMatch - the marketing review


I realize that I'll probably upset some people with the comments that I'm about to make. But I saw the marketing graphics for the new FourMatch Photoshop plug-in from Fourandsix ... and I was a little shocked. Read the graphic: "Assess reliability of social media images." And, in the same picture: "Authenticate images instantly."

Now, what does that mean to you? The picture of the person with the mobile phone, the text ... with FourMatch, I should be able to authenticate pictures that I find on social media ... right?

WRONG.

From their own blog, "It currently can only analyze images from digital cameras, mobile devices, and tablets ..."

Remember Beckley? Beckley involved the police's use of images from Facebook. The police downloaded pictures from the suspect's Facebook account to use during sentencing - showing gang affiliation - for a sentence enhancement. Big problem. The defendant claimed that the image had been manipulated.

In California, authentication requires one of the following: the person in the scene says ... yes, that's me and that's the scene as I remember it. Or, the photographer says ... yes, that's my picture and it shows the scene accurately. Or, some independent person applies some scientific method to authenticate the image - like you and me.

Authenticate:
1. to establish as genuine.
2. to establish the authorship or origin of conclusively or unquestionably, chiefly by the techniques of scholarship: to authenticate a painting.
3. to make authoritative or valid.

With that in mind, I became interested in the FourMatch project/product. I need something to authenticate images from social media sites. Something that the court will recognize as accurate - and something that will work within the scientific method. Does FourMatch fit the bill for me? Unfortunately, no.

Of the five factors within the scientific method, FourMatch fails two rather significant ones.

First, I can refute the results presented in the window - in a manner of speaking. Here's my test. Let's say that I take a picture with my own phone. I upload the picture to my Facebook account. I download the picture to my computer. I load the downloaded image into Photoshop and use FourMatch - it flags it with the yellow "probably not authentic, it's been processed" flag. What?! It's my photo.

I'm the photographer, it's my phone, and it's my Facebook account. Who is right? Me, or the FourMatch determination that it's been manipulated and might not be authentic? I worry that, given the people involved in the project, more weight would be given to the results of the plug-in's tests than to my own testimony that it's my picture. Nevertheless, the content and context weren't changed in any way. The only thing that happened was the upload - where Facebook recompresses the image. Facebook does that to every image. With this in mind, how can the marketing statement in the graphic (above, from their web site) be true?

In a blog post, the creators say, "It provides objective evidence that a file was not touched by any software application since the time it was first captured." If that's the case - if the plug-in works only on images that come straight from the camera, untouched by any application (as noted above, from their blog) - then you can't use the plug-in to authenticate, or even assess, images from social media. Social media sites recompress images on upload.
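To see why the recompressed copy trips the yellow flag, compare the file's observable traits before and after the Facebook round trip described above. Here's a minimal Python sketch of that comparison - my own illustration, not FourMatch's method - and the file names are just placeholders:

import hashlib

from PIL import Image  # pip install Pillow


def jpeg_signature(path):
    """Collect a few observable traits that recompression typically changes."""
    with open(path, "rb") as fh:
        digest = hashlib.sha256(fh.read()).hexdigest()
    with Image.open(path) as img:
        return {
            "sha256": digest,
            "quant_tables": dict(getattr(img, "quantization", {})),
            "exif_tag_count": len(img.getexif()),
            "pixel_size": img.size,
        }


# Placeholder file names: the same shot straight off the phone, and the copy
# downloaded back from Facebook after upload.
original = jpeg_signature("IMG_0001_from_phone.jpg")
roundtrip = jpeg_signature("IMG_0001_from_facebook.jpg")

for key in original:
    same = "same" if original[key] == roundtrip[key] else "DIFFERENT"
    print(f"{key:15}: {same}")

# Expect the hash, quantization tables, and EXIF tag count to differ after the
# round trip, even though the photographer and the scene never changed.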

Then, there's the error rate issue. What's the incidence of false negatives or false positives? FourMatch is a database-driven product. It compares the image against a database of camera information. For my assessment, using a popular, US-purchased mobile phone to take an image, the signature from the phone's imager will likely be found in the FourMatch database and it should give the image the green light. But what about images from other, more obscure phones (not contained within the database) that get flagged yellow - further review needed? This is the same problem that I had with JPEG Snoop. Does the yellow "needs additional inquiry" flag qualify as a false negative? Sure, for the Facebook tests FourMatch is right - the file has been touched. But it's possible to authenticate the image by other means. Again, does this qualify as a false negative? Would you be comfortable arguing this point in court?
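To make the database point concrete, here's a toy Python sketch of what a signature-lookup triage can and can't tell you. The signature table is entirely hypothetical (it is not FourMatch's database or algorithm), but it shows why a camera that's missing from the database can only ever produce an inconclusive yellow, never a proven fake:

from PIL import Image
from PIL.ExifTags import TAGS

# Hypothetical signature records for illustration only:
# (Make, Model) -> set of Software strings seen in untouched captures
KNOWN_SIGNATURES = {
    ("Apple", "iPhone 4S"): {"5.1.1", "6.0"},
    ("SAMSUNG", "GT-I9300"): {"I9300XXALE8"},
}


def triage(path):
    """Return 'green' for a known, untouched capture signature, else 'yellow'."""
    with Image.open(path) as img:
        exif = {TAGS.get(tag_id, tag_id): value
                for tag_id, value in img.getexif().items()}
    make = str(exif.get("Make", "")).strip()
    model = str(exif.get("Model", "")).strip()
    software = str(exif.get("Software", "")).strip()

    expected = KNOWN_SIGNATURES.get((make, model))
    if expected is None:
        return "yellow"   # camera not in the database: inconclusive, not proven fake
    if software in expected:
        return "green"    # consistent with a file untouched since capture
    return "yellow"       # known camera, altered signature: needs further review


print(triage("IMG_0001_from_phone.jpg"))  # placeholder file name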

If you hadn't read this, and you owned FourMatch, and it yellow-flagged your image - would you use the image in your case? Would you confidently proceed? Or would you move on? In triage, it says the image has been processed - touched by software. What would you do?

So, does FourMatch provide objective evidence that a file wasn't touched by any software since capture? For the most part, yes - as long as the capture device's information is in the FourMatch database. In my tests, the database's problems were centered around mobile phones - precisely the types of photos that we're concerned with. More people are ditching their point-and-shoot cameras in favor of the camera in their mobile phone. After all, who wants to carry around two pieces of gear when cell phone pics can be uploaded directly to social media? People tend to flow to the easiest option. Remember, social media images largely come from we the people, not photojournalists with professional cameras.

Does FourMatch live up to its own marketing slogan, "Assess reliability of social media images"? No - it can't. Social media images are recompressed - touched by software.

If FourMatch really did work to authenticate social media images, it would be worth the price. I'd even pay a little more. But most of my authentication requests come from folks who do not have the camera - or the allegations involve content/context issues (the other 2 of the 3 F's). As such, I'd have trouble justifying the expense for a database-oriented triage tool.

For media outlets looking to verify the integrity of photos received from their field photographers and other sources ... I'm sure that this is a great tool and the editors will appreciate its ease of use. Generally, the cameras used by photojournalists will be present in the FourMatch database. But for law enforcement and criminal justice employees looking to authenticate social media images ... sorry. Wrong product.

2 comments:

Kevin Connor said...

We've actually tried very hard to be as precise as possible in our language about what FourMatch will and won't do. If you get the green light in FourMatch, then you've got very compelling evidence that the file has not been tampered with in any way since it was first captured by the camera. If you don't, then--as we say quite clearly on our site--the results are less conclusive, though we provide as many clues as possible to aid in further investigation. Thus, we specifically avoided catchy slogans like "Find fake photos fast," because FourMatch can't tell you whether a file was edited in a way that fundamentally alters the truth of the picture. A negative result in FourMatch does not mean a file is not authentic. But a positive result is indeed a form of authentication, because it tells you that you're seeing what the camera saw. Our approach is intentionally conservative, so as to avoid false positives.

I also must emphasize that the billboard on our home page has three rotating screens, each designed to speak primarily to one audience. The first screen is aimed at law enforcement (hence the photo of police tape), and it says "Validate photographic evidence." If, for example, you're dealing with a child pornography case, and you need to validate that the photos found on the suspect's computer are authentic photos of children rather than manipulated illustrations, then FourMatch will be invaluable, and the technique it relies upon has already been court-tested.

The second screen, which you feature above in your blog post, is aimed primarily at people in news organizations. We intentionally use the softer language of "Assess reliability" rather than "validate," because we acknowledge that often--though not always--files traveling through social media will be changed in ways that FourMatch can't validate. (That said, we've given public demonstrations of photos we found from last year's Vancouver riots online, some of which we were effectively able to validate in FourMatch.) Elsewhere on our site, we also emphasize that the best practice for media firms is to chase down the source of an image to see if they can obtain an original file, and then test that with FourMatch. Generally, responsible media organizations should always be making every effort to track down the source anyway.

I'm certainly sorry if you find the marketing misleading, but I still believe that we've been fairly precise in our messaging, and anyone who takes the time to read the rest of the marketing material on our site--as they certainly should!--will get a full picture of what FourMatch does and how it works.

Jim Hoerricks, PhD said...

I think I understand your point ... but it's lost in the graphics and links. Each of the rotating banners has the same footer text about authentication. The "find out more" link for each goes to the same page. The info page does not say, "Media - do this; LE - do that." You may infer that message; others may not.

Here, I was only going into the marketing and the web site. I've yet to actually review and show the results of my tests.