A random request via email prompts Round 2 of the product comparison - Input-Ace vs Amped FIVE. In this case, I received a .re4 file via Dropbox with the question - "what is license plate?" Well, not those words, but I distilled a few paragraphs down to the testable question.
Working the workflow, we start with File Triage - can I even view this file? The person who emailed noted that he couldn't find a way to play it (he's likely the last person in the world who hasn't heard of Larry Compton's web site, where you'll find three flavors of .re4 players). As I've said for years, it's common to not want to install players / codecs. I know that the .re4 format is just H.264 video wrapped in a crazy proprietary container. Thus, for File Triage, I know that I can load it into Input-Ace or FIVE with no issue.
For File Triage of this random .re4 file - it's a tie(ish). Input-Ace takes about 30 seconds to decode and load the file. FIVE needs to "convert" the file, and takes about 45 seconds to create the "clean" file, the proxy, and the report.
The next step in the workflow is Content Triage, or "can I answer the question / satisfy the request with the given file?" I noted that the file's resolution was 2CIF (704x240 in NTSC). I also noted that the vehicle in question was about 50' from what was likely a 2mm-4mm lens. The pixel dimensions of the area likely containing a license plate were about 12x7px.
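That pixel estimate is easy to sanity-check with the pinhole-camera approximation. Here's a rough sketch - the 1/3" sensor width (~4.8mm), the ~12" (0.305m) US plate width, and 2CIF's 704px horizontal resolution are all assumed values for illustration, not facts pulled from this file:

```python
def pixels_on_target(target_width_m, distance_m, focal_mm, sensor_width_mm, image_width_px):
    """Approximate pixels spanning a target, via the pinhole-camera model."""
    return target_width_m * focal_mm * image_width_px / (distance_m * sensor_width_mm)

plate_width = 0.305        # ~12" US plate (assumed)
distance = 50 * 0.3048     # 50 feet in meters

for focal in (2.0, 4.0):   # the likely 2mm-4mm lens range noted above
    px = pixels_on_target(plate_width, distance, focal, 4.8, 704)
    print(f"{focal}mm lens: ~{px:.1f} px across the plate")
```

At 4mm the plate spans roughly 12px across, right in line with the 12x7px measured in the image; at 2mm it drops to roughly 6px. Either way, nowhere near enough pixels for character identification.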
With every other line of resolution not recorded in the first place, and the target too far away from the lens, identification questions will not be answered with this file. The available video data will not support a conclusion. But, let's pretend that there is enough data. As this is a product comparison, let's run a Macroblock Analysis.
The file was about 500MB, containing about an hour of 2CIF footage. Again, with the aid of a trusted analyst fluent in Input-Ace, the set-up of the analysis as well as the generation of results took less than 30 seconds. The same file took FIVE 4.5 minutes to complete the task. We checked a select number of frames in each tool and found the results to be the same.
For Macroblock Analysis on this random .re4 file - score one for Input-Ace.
There were just a few lossless encoded blocks in the target area. With every other line of resolution missing and no usable data in the target area, there's no conclusion possible. All you're left with is "white car." Sure, you can attempt a Vehicle Determination Examination. You may get make, model, and a range of model years. That might help narrow the search a bit.
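For readers unfamiliar with the technique, the core idea behind a macroblock analysis can be sketched in a few lines. This is my own toy model, not either vendor's implementation: treat each frame as a grid of macroblock types and ask how often the blocks covering the target region carry fresh (intra-coded) picture data, as opposed to content merely predicted or copied from earlier frames.

```python
# Macroblock types (simplified): 'I' = intra-coded (fresh picture data),
# 'P' = predicted from a prior frame, 'S' = skip (copied unchanged).
ROI = [(0, 0), (0, 1)]  # macroblock coordinates covering the target (hypothetical)

def frames_with_fresh_data(frames, roi):
    """Count frames where at least one ROI macroblock is intra-coded."""
    return sum(1 for grid in frames if any(grid[r][c] == 'I' for r, c in roi))

# Synthetic example: four frames of a 2x2 macroblock grid
frames = [
    [['I', 'I'], ['I', 'I']],  # key frame: everything intra
    [['S', 'S'], ['P', 'S']],  # ROI blocks merely copied
    [['P', 'I'], ['S', 'S']],  # one fresh block in the ROI
    [['S', 'S'], ['S', 'P']],  # nothing new in the ROI
]
print(frames_with_fresh_data(frames, ROI))  # → 2
```

When that count is near zero for the target area - as it was here - the "detail" you see on screen is recycled prediction data, not new information about the plate.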
Given that a majority of cases die at the Content Triage step, getting quickly and easily to and through that step is vital. With Input-Ace and this file, we were done with the case in less than 2 minutes. With FIVE and this file, the unexplained processing time for the Macroblock Analysis more than doubled the time it took to get to "no, sorry."
Of course, the rebuttal will be that FIVE contains so many more tools vs. Input-Ace - which is true. But, who cares about fixing fisheye or rectifying the image if the question, "what is license plate," is unanswerable? 90%-95% of the time, folks just need a quick answer ... that's actually quick.
Now, I know what you're thinking. Amped's DVRConv is their product for the quick creation of proxy files. Again, you're correct(ish). DVRConv does not support all of the formats that FIVE does. In this case, loading the .re4 file into DVRConv caused the program to crash. If all you had was DVRConv, you'd be out of luck with this file.
As a final aside, I do like that Input-Ace reports the non-standard frames/second tag as "Unknown" (see the above graphic), as opposed to the FFmpeg default of 25fps (as FIVE does). Sure, I can restore the frame rate in both programs, but the untrained person may assume that 25fps is correct and not take steps to restore the proper rate. Remember, the triage steps aren't always performed by trained analysts. They're often performed ahead of the decision to engage with the analyst. [shameless plug - seats are open for our upcoming training sessions - click here for more info or to sign up]
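Why the 25fps default matters: it silently distorts the timeline. A quick illustration, using a hypothetical DVR that actually recorded at 7.5fps - an assumed value for the example, not taken from this file:

```python
true_fps = 7.5      # hypothetical actual recording rate (assumed)
assumed_fps = 25.0  # the default applied when the container carries no rate

recorded_seconds = 3600                   # one hour of real time
frames = recorded_seconds * true_fps      # 27,000 frames on disk
playback_seconds = frames / assumed_fps   # how long it *appears* to run

print(playback_seconds / 60)  # → 18.0 minutes
```

An hour of real time plays back as 18 minutes, with every event appearing to happen at more than 3x speed - which is why restoring the true rate matters before anyone draws timing conclusions from the footage.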
That's going to about do it for the Input-Ace vs FIVE comparisons. By now, I think you get the points. Besides, there's a cool new plug-in from Chris Russ that deserves a look.