Saturday, March 9, 2019

Simple image and video processing

I'm back doing product reviews and comparisons. Yay!

In the blog post announcing my joining the Amped team a few years ago, its CEO noted "Jim is well known also for his direct, unfiltered and passionate style."


It's been a while, but the directness and passion are still here; and I have yet to find a filter. ;) ... and, in case you missed it, I'm no longer part of the Amped team. More on that in a future post.

Today's test - Input-Ace vs Amped FIVE for simple image / video processing. *

To facilitate the test, I've enlisted the help of an analyst from a private lab that has both Input-Ace and FIVE. I just needed a few stills and screenshots to work with. The test begins with a video that I extracted from a black box 2CIF DVR, the kind that is rather ubiquitous here in the US. It's one of my test/validate files, so I know the values that we are starting with.

The task: "I just need a still image of the vehicle for a BOLO." This should be a two-minute process ... or less.

Now, the process and tools are a bit different when working in Input-Ace vs Amped FIVE. But, I devised a test of something I used to do several times per day at the LAPD - process 2 CIF video for a flyer. The resize / aspect ratio problems inherent in 2 CIF resolution can be fixed in either tool.

My experiment is two-fold.
  1. How fast / easy is it to load a video, fix the resolution issue (restore the missing information that happens when the DVR is set to ignore every other line of resolution - 2 CIF), and output a still for a BOLO?
  2. What's the difference in quality between the two processes? Is there a difference in the results? If so, what is it?

The "workflow engine" way of working is not natural to me. But, my friend is rather proficient with the tool and noted that fixing the resolution issue was a two-step process - first restore the aspect ratio and then restore the size. Each step in Input-Ace utilized Nearest Neighbor interpolation. The time to configure the filters and output a still was less than 30 seconds.

Notes:
Nearest neighbor simply copies the value of the closest pixel in the position to be interpolated. (Anil. K. Jain, “Fundamentals of Digital Image Processing”, Prentice Hall, pp. 253–255, 320–322, 1989. ISBN: 0-13-336165-9.)
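To make the idea concrete (this is only a sketch of the concept, not Input-Ace's actual implementation), restoring the missing lines of a 2 CIF frame with nearest neighbor amounts to repeating every stored line - a few lines of Python / NumPy:

    import numpy as np

    def double_lines_nearest(frame: np.ndarray) -> np.ndarray:
        # frame: H x W (grayscale) or H x W x 3 (color) array holding the
        # half-height 2 CIF image; returns an array twice as tall.
        # Repeating every row once along axis 0 is exactly what nearest
        # neighbor interpolation does for a 2x vertical upscale.
        return np.repeat(frame, 2, axis=0)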

For FIVE, this solution is equally easy - the Line Doubling filter (Deinterlace > Line Doubling). Line Doubling utilized Linear interpolation. As with Input-Ace, the time to configure the filters and output a still was less than 30 seconds.

Notes:
Linear interpolates two adjacent lines. (E. B. Bellers and G. de Haan, “Deinterlacing - An overview”, in Proceedings of the IEEE, Vol. 86, No. 9, pp. 1839–1857, Sep. 1998. http://dx.doi.org/10.1109/5.705528.)
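Again, purely as a sketch of the concept (not FIVE's actual filter code), linear line doubling keeps the stored lines and fills each missing line with the average of its neighbors above and below:

    import numpy as np

    def double_lines_linear(frame: np.ndarray) -> np.ndarray:
        # Double the vertical resolution: keep every stored line and insert a
        # new line between each pair, set to the average of its two neighbors.
        f = frame.astype(np.float32)
        out = np.empty((f.shape[0] * 2,) + f.shape[1:], dtype=np.float32)
        out[0::2] = f                        # original lines stay in place
        out[1:-1:2] = (f[:-1] + f[1:]) / 2   # interpolated lines
        out[-1] = f[-1]                      # bottom line has no neighbor below; repeat it
        return np.clip(out, 0, 255).astype(frame.dtype)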

In terms of speed and ease of use - it's a tie. 

Remember, this task isn't an "analysis" as such. This process is one of those common requests - we just need an image, quickly, for a BOLO. But, you want to fix the problems with the file before giving the investigator their image. Sending out a 2 CIF image without correcting / restoring the resolution could lead to problems with recognition as the images appear "squashed."

Next, I wanted to know if there was a qualitative difference between the two resulting files. This is where FIVE excels - analysis. FIVE's implementation of Similarity Metrics (Link > Video Mixer) was used.


The results:
  • SAD (Sum of Absolute Differences - mean) (0 .. 255): 2.0677.
  • PSNR (Peak Signal to Noise Ratio - dB): 28.7395.
  • MSSIM (Mean Structural Similarity) (0 .. 1): 0.9335.
  • Correlation (Correlation Coefficient) (-1 .. 1): 0.9895.
A rather definitive result. As regards the correlation coefficient, a value of exactly 1.0 means there is a perfect positive relationship between the two variables. A value of -1.0 means there is a perfect negative relationship between the two variables. If the correlation is 0, there is no relationship between the two variables. A correlation coefficient with an absolute value of 0.9 or greater represents a very strong relationship. In this case, the value is 0.9895 ... or very nearly 1. The other results can confirm quantitatively what your eyes can see qualitatively - to the eye, the results are virtually identical. Same truck. No visual loss of details.
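If you want to see what these numbers actually measure, rough equivalents can be computed with a few lines of Python (NumPy plus scikit-image). This is a sketch of the general metrics only; FIVE's exact implementations and settings may differ:

    import numpy as np
    from skimage.metrics import structural_similarity  # scikit-image

    def similarity_metrics(a: np.ndarray, b: np.ndarray) -> dict:
        # a, b: two same-sized grayscale images with values in 0..255.
        a = a.astype(np.float64)
        b = b.astype(np.float64)
        mse = np.mean((a - b) ** 2)
        return {
            "SAD (mean, 0..255)": np.mean(np.abs(a - b)),
            "PSNR (dB)": 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf"),
            "MSSIM (0..1)": structural_similarity(a, b, data_range=255),
            "Correlation (-1..1)": np.corrcoef(a.ravel(), b.ravel())[0, 1],
        }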

From a qualitative standpoint - it's a tie.

Thus, if the two pieces of software can deliver what is visually the same result, in the same amount of time, then what's the tie breaker? Features? Service? Price?

The tie breaker is for you to decide. What features are "must haves?" What are the terms of service? Are they acceptable to your agency? What's the better value for your agency, your caseload, and your workflow?

For features, does it matter if there are 130 filters / tools if you only use about a dozen of them on a regular basis? I'm in a different place in my casework now - more "analysis" than "processing." For me, the value proposition based on features still tilts towards Amped's tools. Besides, I've had my license for years now. I'm not coming at the tool for the first time. For many at the local police level, it's more processing than analysis - with analysis being done at the prosecutor's office. Input-Ace as a production tool is quite well positioned. As an analysis tool, you'll need something else for now. The testimony of Input-Ace's primary evangelist, Grant Fredericks, will confirm that it's a major part of his tool-set, but not the only tool he uses.

For the terms of service, examine each product's End User License Agreement - otherwise known as "that thing you don't read as you install the software." The EULA is the company's promise to you - the terms of what you're getting. Are the terms acceptable? If you're in North America, can you get someone on the phone to help during business hours? Input-Ace vs Amped contact pages are linked here for your convenience. Are you OK with a web portal as your only option?


For price, it can only be published price vs. published price. I am told that the quoted price for an annual subscription of Input-Ace is US$3k. The generally quoted price of a perpetual license of FIVE is EUR 9000. The EUR / USD exchange has been quite volatile of late, so the dollar price has come down to a bit under US$11k on the exchange. I believe that a perpetual license of Input-Ace is available, but I don't have information on that price ... nor do I know an agency that has gone that route.

Then there's the ordering component of price. Price doesn't matter if you can't order the product. Can your agency pre-pay for goods via an office shared in Brooklyn, NY, with 40+ other entities at any price? States like New Jersey and Illinois forbid such pre-payment explicitly. Counties and cities like Los Angeles forbid pre-payment practically. Did you check up on the business? Run a Bizapedia search on the business. Do you get a result? Do you recognize the names? Does anything look odd? FIVE's EULA indicates that support will be done via the web portal, not at their "NY office." Are those terms acceptable? Then check the provider of the competition. Do the same thing. Run a Bizapedia search. Recognize any names? It may not be your money that your agency is spending, but due diligence is necessary nonetheless.

I'm of the firm belief that any true comparison of products must include the total experience - comparing the features as well as the user / customer experience. Features are pretty straightforward. User / customer experiences vary from person to person.  I've shared mine with you. Feel free to share yours with me.

*An important note: this was a simple case study. Its results were valid for what specifically was studied. It's not meant to validate either tool for use on a particular case, or your particular case. The opinions expressed herein are those of the author alone.
