
Monday, June 30, 2014

Apple to end support for Aperture

This just in from Yahoo News: "In a brief statement, the company said that it will stop updating and developing the affordable professional photo-editing software when the next version of OSX -- Apple's desktop operating system -- and its supporting apps are launched.

"With the introduction of the new Photos app and iCloud Photo Library, enabling you to safely store all of your photos in iCloud and access them from anywhere, there will be no new development of Aperture," Apple said. "When Photos for OS X ships next year, users will be able to migrate their existing Aperture libraries to Photos for OS X."

During its World Wide Developers Conference earlier this month, Apple focused heavily on photography and the need to offer consumers a new way of saving, sharing, sorting and editing images, and the statement suggests that what it's got up its sleeve will offer some of the functionality that was built into Aperture."

"Apple isn't axing Aperture completely. When the next version of OSX (Yosemite) becomes available to download, so will a compatibility patch so that existing Aperture users will be able to launch the app, but that will be the final update. And while the next version of OSX might have one or two more image-editing and classification tools that should fill an Aperture-shaped void for hobbyists, that might not be the case for pro users.

Indeed, following its initial statement, Apple also confirmed that it would help Aperture users simply migrate their files to Adobe Lightroom -- Aperture's closest direct rival. And, perhaps unsurprisingly, Adobe used the announcement to underline its own support to Mac-using photographers.

And while Lightroom is also a very good pro-level application, unlike Aperture, it's not accessible as a one-off payment. Instead, users must sign up to Adobe's creative cloud platform and its monthly subscription model. However for $9.99 (€12.29) a month, those that sign up also get access to Photoshop and the ability to use both applications as iPhone and iPad apps."

Wednesday, June 25, 2014

Supreme Court bans warrantless cell phone searches

This just in from the Washington Times: "The Supreme Court ruled Wednesday that police cannot go snooping through people’s cell phones without a warrant, in a unanimous decision that amounts to a major statement in favor of privacy rights.

Police agencies had argued that searching through the data on cell phones was no different than asking someone to turn out his pockets, but the justices rejected that, saying a cell phone is more fundamental.

The ruling amounts to a 21st century update to legal understanding of privacy rights.

“The fact that technology now allows an individual to carry such information in his hand does not make the information any less worthy of the protection for which the Founders fought,” Chief Justice John G. Roberts Jr. wrote for the unanimous court. “Our answer to the question of what police must do before searching a cell phone seized incident to an arrest is accordingly simple— get a warrant.”

Justices even said police cannot check a cellphone’s call log, saying even those contain more information than just phone numbers, and so perusing them is a violation of privacy that can only be justified with a warrant.

The chief justice said cellphones are different not only because people can carry around so much more data — the equivalent of millions of pages of documents — that police would have access to, but that the data itself is qualitatively different than what someone might otherwise carry.

He said it could lay bare someone’s entire personal history, from their medical records to their “specific movements down to the minute.”

The chief justice cited court precedent that found a difference between asking someone to turn out his pockets versus “ransacking his house for everything which may incriminate him” — and the court found that a cellphone falls into that second category.

Complicating matters further is the question of where the data is actually stored. The Obama administration and the state of California, both of which sought to justify cell phone searches, acknowledged that remotely stored data couldn’t be searched — but Chief Justice Roberts said with cloud computing, it’s now sometimes impossible to know the difference.

The court did carve out exceptions for “exigencies” that arise, such as major security threats."

Tuesday, June 24, 2014

Pardon the interruption


Things tend to slow down during the World Cup.

Thursday, June 19, 2014

Installing the 2014 Release of Creative Cloud


If you're a Creative Cloud subscriber, you may have noticed that your Apps list now looks like this after installing yesterday's "updates."

As Adobe's Julieanne Kost explained in her blog, "Most of you are probably noticing that when you install the 2014 release of Creative Cloud (Photoshop, InDesign, Premiere etc.), via the Creative Cloud desktop app, you’re actually installing NEW versions of the application. Yes, that’s correct, the new 2014 versions of CC apps will be installed in addition to (and can run along side of) the previous CC versions (they will not replace them). So, unlike the past few updates, the 2014 release will install a new, stand-alone version of most applications (such as Photoshop, InDesign etc.), and that’s also why it lists them separately in the CC desktop app. The fact that the 2014 release of Photoshop is a separate install might be why some of you aren’t seeing your custom plugins etc. that you might have installed with Photoshop CC."

Now, I like to keep the different versions installed. But if you don't, and want to just have the latest version installed, this post explains what to do.

RTFM solves Omnivore issue

Every once in a while, we have to be reminded to RTFM.

I had a piece of DME with a weird codec from a company that has long since closed its doors. It has audio, which is more important than the video. Omnivore was going fine for a bit, then started dropping frames like crazy. I have a few Omnivores, so I tried each one. Same issue. So, I tried different computers. Same issue. Then, a helpful voice reminded me to RTFM. It turns out that Windows Aero was getting in the way of a proper capture - the fix for which is found in the Omnivore-Guide.pdf that ships on the Omnivore.


Right click on the desktop and select Personalize.


Choose one of the basic themes - like Windows Classic.

This solved my problem.

So, if you're having Optimization or Capture problems on Windows 7, check your Personalization settings - or RTFM.
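If you'd rather not change themes by hand, desktop composition can also be toggled programmatically on Windows 7. Here's a minimal Python sketch using ctypes and the DWM API - this isn't from the Omnivore guide, just an alternative approach, so test it on your own capture box before relying on it:

import ctypes

# Composition actions from dwmapi.h
DWM_EC_DISABLECOMPOSITION = 0
DWM_EC_ENABLECOMPOSITION = 1

dwmapi = ctypes.windll.dwmapi

# Turn Aero (desktop composition) off before starting the capture ...
dwmapi.DwmEnableComposition(DWM_EC_DISABLECOMPOSITION)

# ... run the capture ...

# ... then turn composition back on when you're done.
dwmapi.DwmEnableComposition(DWM_EC_ENABLECOMPOSITION)

Either way, the goal is the same: get Aero out of the way before you capture.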

Wednesday, June 18, 2014

Adobe Announces Largest Software Release Since CS6

For those still using Adobe apps, here's the latest on the Creative Cloud update.

"Today Adobe announced all new versions of 14 CC desktop applications, 4 new mobile apps, the immediate availability of creative hardware, and new offerings for Enterprise, education and photography customers.

Of course this includes new features, enhancements and updates to both Photoshop and Lightroom for design and photography including the new Spin and Path Blurs in Blur Gallery, new typographic controls including Font Search and Typekit integration, enhancements to Smart Objects, Smart Guides, and Layer Comps, improved Content-Aware technologies, new selection capabilities using Focus Mask, as well as hidden gems and workflow timesavers."

Who gets DME better, SWGIT or SWGDE?

Yesterday's post featured an avalanche of new documents from SWGDE. One of note, Digital and Multimedia Evidence (Digital Forensics) as a Forensic Science Discipline, raised a few eyebrows around here. "Video Forensics" has largely been the domain of SWGIT. Now, out of the blue, comes SWGDE with their take on DME.

From the SWGDE document: "The purpose of this paper is to provide an abstract to assist the reader in understanding that digital forensics is a forensic science and to address confusion about the dual nature of the application of digital forensics techniques as both a forensic science and as an investigatory tool."

I love that they break down what they mean in clear terms: "As with other forensic science disciplines, the key attributes of digital forensics applied throughout the entire examination process, from collection through analysis and reporting, are:

  • Use of a quality management system containing standard operating procedures and an effective quality assurance program.
  • Proficient analysts with appropriate training, expertise, and experience.
  • Use of validated tools, processes, and methodologies.
  • Objectivity – the forensic analyst must be insulated from work-related undue pressures that could compromise the quality of work."

To help translate the document a bit, they try to differentiate between "forensic science" and "investigatory tool." I would argue that there should be no such difference. To me, when I hear "investigatory tool," I think "just trying to get something done." I think: untested, unvalidated, unreliable.

By way of example, let's take mobile phones. An officer recovers a mobile phone from a suspect. He takes the phone, starts browsing through the messages and photos, and finds a photo in the gallery that seems to aid in the investigation. Not having training in mobile phone analysis, nor access to someone within that "search incident to arrest" time frame, the officer takes a picture of the phone's display with his own mobile phone.

For many investigations, it stops there. They have the picture they need. No further analysis is requested ... or maybe they don't have an analyst on staff or lack proper tools.

But, how can you answer questions about the photo on the suspect's phone? How did it get there? Did the phone generate it? Did an app generate it? Is it contextually authentic? You won't know without the phone and the original photo. You got something done, but you might have gotten it completely wrong.

Just something to consider.

Tuesday, June 17, 2014

Digital and Multimedia Evidence (Digital Forensics) as a Forensic Science Discipline

The Scientific Working Group on Digital Evidence (SWGDE) recently concluded its June 2014 meeting. The following eight draft documents were approved to be posted on the SWGDE website in order to solicit feedback from the Digital & Multimedia Evidence community:

Digital and Multimedia Evidence as a Forensic Science Discipline V2-0
The purpose of this paper is to provide an abstract to assist the reader in understanding that digital forensics is a forensic science and to address confusion about the dual nature of the application of digital forensics techniques as both a forensic science and as an investigatory tool.

SWGDE Best Practices for Handling Damaged Hard Drives
The purpose of this document is to describe the best practices for handling magnetic media hard drives when the data cannot be accessed normally.

SWGDE Recommended Guidelines for Validation Testing V2-0
This paper discusses the importance of validation testing and introduces a validation methodology.

SWGDE Best Practices for Computer Forensics V3-1
The purpose of this document is to describe the best practices for collecting, acquiring, analyzing and documenting the data found in computer forensic examinations.

SWGDE Capture of Live Systems V2-0
The purpose of this document is to provide guidance to the forensic community on acquiring data from live computer systems.

SWGDE Focused Collection and Examination of Digital Evidence
The purpose of this document is to provide the examiner with considerations to address when dealing with the review of large amounts of data and/or numerous devices.

SWGDE Mac OS X Tech Notes V2
The scope of this document is to describe the procedures for imaging and analyzing Macintosh computers. This document is restricted to the OS X operating system.

SWGDE Best Practices for Forensic Audio v2.15
The purpose of this document is to provide forensic audio practitioners recommendations for the handling and examination of forensic audio evidence in order to successfully introduce such evidence in a court of law.

Thursday, June 12, 2014

Amped FIVE Update: Reports in PDF and DOC, new Deblurring Modes, and more

Amped Software just launched a new version of Amped FIVE today, with a bunch of new filters and improvements. The main changes are:

  • Saving reports in PDF and DOC, as well as the current HTML format.
  • New modes for Motion Deblurring when there is a replica effect.
  • New Nonlinear Deblurring to use when motion is not linear.
  • A new CLAHE (contrast limited adaptive histogram equalization) filter - see the sketch below for a sense of what CLAHE does.
Check out the details by clicking here.
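For those unfamiliar with CLAHE, here's a minimal Python sketch of the general technique using OpenCV's implementation - this is just an illustration of what the filter family does, not Amped's code, and the file names are placeholders:

import cv2

# Load a frame as grayscale; CLAHE works on single-channel data.
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

# Contrast Limited Adaptive Histogram Equalization: equalize contrast
# locally, tile by tile, while clipping each tile's histogram so that
# noise in flat regions isn't over-amplified.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(frame)

cv2.imwrite("frame_clahe.png", enhanced)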

Wednesday, June 11, 2014

Validation tests and comparing results

As a follow-up to yesterday's post, any time you're heading down a new path or plan to start using a new tool, it's important to validate it vs. a known data set. As an example, the folks at Digital Assembly publish a comparison of their tool's results vs. the results of other popular tools. They go the extra step and give you the links to the reference data sets so that you can conduct the tests yourself.

If you're an EnCase 6 user and you're validating the tool vs. this disk image, you'll miss almost two thirds of the recoverable images. (This also speaks to the importance of keeping your programs up to date - and to validating the updates.)
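Whatever tool you're testing, the scoring itself can be simple. Here's a minimal Python sketch of one way to do it, assuming the reference data set publishes a plain-text list of file hashes - the file and folder names below are placeholders:

import hashlib
import pathlib

def sha1(path):
    # Hash a recovered file so it can be matched against the reference list.
    return hashlib.sha1(path.read_bytes()).hexdigest()

# Hashes published with the reference data set (one hash per line).
expected = set(pathlib.Path("reference_hashes.txt").read_text().split())

# Hashes of everything your carving tool actually recovered.
recovered = {sha1(p) for p in pathlib.Path("carved_output").iterdir() if p.is_file()}

missed = expected - recovered
print(f"Recovered {len(expected & recovered)} of {len(expected)} known files")
print(f"Missed {len(missed)} - investigate before using this tool on casework")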


If you don't validate your tools before using them on casework, you're headed for trouble. Just because you haven't been asked the validation questions in court doesn't mean that you won't in the future. You've just been lucky. What would happen to your case and your reputation if you're just trying to get something done and you have EnCase 6 - but the opposing expert is using APF? "How do you account for the fact that your tools / techniques couldn't recover the correct number of files, or correctly recover the files in question in this case?" "Are your tools / techniques not reliable and repeatable?" What would happen if the frame / frames in question were dropped by your tool and you didn't know it? When there's blood in the water ...

Before you go down this road, reach out to folks with experience validating their tools. Get known datasets to use in testing. Test, test, re-test ... If the tool has issues, don't use it on casework.

Tuesday, June 10, 2014

Manual data carving - DVRs vs. Phones

A reader sent a note asking how to explain the difference between computer forensics' ability to find deleted files on hard drives and the team's relative inability to recover files from a retrieved DVR hard drive. It seems that someone had retrieved a hard drive from one of those systems that will format the drive when you plug it back in, so they needed to retrieve the files without accessing the DVR's hardware (never mind the hardware decoding issues).

Popular mobile phone and computer forensic programs offer the ability to manually carve files of known types from the raw data. When folks delete text messages, images, and videos, forensic experts can often retrieve the files from the raw data dump of the device. This is largely due to the fact that common file types are coded in a certain way.


Because of the standards that are in place, we know that if we can find the JPEG's header (FF D8) and footer (FF D9) in the raw data, we can use our tools to extract / carve the image and save it out to a separate file. In this way, rarely is anything really deleted.
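To make that concrete, here's a minimal (and deliberately naive) Python sketch of the idea. Real carvers also handle fragmented files, embedded thumbnails, and false positives; the file names here are placeholders:

SOI = b"\xFF\xD8\xFF"   # JPEG start-of-image marker (plus the first byte of the next segment)
EOI = b"\xFF\xD9"       # JPEG end-of-image marker

with open("raw_dump.bin", "rb") as f:
    data = f.read()

count = 0
start = data.find(SOI)
while start != -1:
    end = data.find(EOI, start)
    if end == -1:
        break
    count += 1
    with open(f"carved_{count:04d}.jpg", "wb") as out:
        out.write(data[start:end + 2])  # include the 2-byte footer
    start = data.find(SOI, end)

print(f"Carved {count} candidate JPEGs")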


Also because of the standards, there are tools made specifically for carving multimedia files or for recovering multimedia files from hard drives or removable storage media - some are free, some are cheap, some are quite expensive.

The problem with applying this paradigm to DVRs is that the encoding - the header / footer for the proprietary file type - is not generally published and is certainly not standard. If you're able to manually find and carve data from a Q-SEE DVR, the information gathered will not be of much use if you're trying to carve a raw dump from a Pelco DVR. Because of this high variability, the industry standard computer / mobile forensic tools aren't much help in automatic mode. It also means that it will likely take a considerable amount of time to decipher the encoding and begin the retrieval. In private practice, folks might not want to pay for that many hours of work. In public service, command staff might not have the patience required when waiting for results that might take a week or two to materialize.
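If you do decide to dig into an unknown dump yourself, one simple place to start is looking for byte sequences that repeat at regular boundaries. Here's a hypothetical Python sketch - the 512-byte sector size and file name are assumptions, and recurring values are only candidates for frame or block headers, not proof:

from collections import Counter

counts = Counter()
with open("dvr_dump.bin", "rb") as f:
    while True:
        sector = f.read(512)
        if len(sector) < 4:
            break
        # Tally the first four bytes of each sector; proprietary formats
        # often start frames or index entries on sector boundaries.
        counts[sector[:4]] += 1

# The most common values are worth chasing down in a hex editor.
for magic, n in counts.most_common(10):
    print(magic.hex(), n)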

But, if you have the time and the money to get into this type of work, there are a few training options out there. Your first stop should be with Jimmy and Jason at DME Forensics. They're offering classes on byte level analysis of DVRs. You also have the option to purchase their core product, DVR Examiner. I'd recommend that you do both if you're looking to get into this line of work.

Friday, June 6, 2014

Machine-created evidence is not hearsay

Whilst the article from Arstechnica.com is about red light cameras, one can see the argument coming for CCTV systems.

"The ATES-generated photographs and video introduced here as substantive evidence of defendant's infraction are not statements of a person as defined by the Evidence Code. (§§ 175, 225.) Therefore, they do not constitute hearsay as statutorily defined. (§ 1200, subd. (a).) Because the computer controlling the ATES digital camera automatically generates and imprints data information on the photographic image, there is similarly no statement being made by a person regarding the data information so recorded. Simply put, '[t]he Evidence Code does not contemplate that a machine can make a statement.'"

"Goldsmith's attorneys also argued that, because the Redflex technician in charge of preparing evidence didn't show up at her trial, the images could not be admitted. What's more, Goldsmith's attorneys said that she had the constitutional right to face her accuser. In this case, her accuser is a machine.

She also challenged the character of Redflex, which has a prior record of falsifying speed camera documents (PDF) in Arizona.

The court didn't bite on that argument, either."

"It would be pure conjecture to conclude that all evidence generated by Redflex ATES technology and handled by Redflex employees for Inglewood is suspect because of the actions of a single errant notary public in a different state regarding a different type of technology and documentation. We have denied defendant’s request for judicial notice and reject her argument that the involvement of Redflex in this case requires a different constitutional conclusion."

Thursday, June 5, 2014

Working for the defense

I often get asked about my role as a scientist in light of my primary employer. "Have you ever worked for the defense?" "How does it feel working for law enforcement?" These are just a few of the questions that I've faced in trial.

As a scientist, I really don't have a dog in the fight. My answer to that line of questioning usually goes like this: "Regardless of whose signature is on my pay cheque, I work for the Trier of Fact - assisting the judge and jury in correctly interpreting these complex pieces of evidence. The results of my tests are grounded in science. They are reliable and repeatable. My tools and techniques are based on generally accepted, peer reviewed image science. The academic references for the algorithms used, for each of the steps performed, are noted in my report."

That being said, I have assisted the court in uncovering fraudulent evidence presented as impeachment evidence in People v. Abdullah (BA353334). It could be said, in that case, that I was working in the defense of the accused. But again, I was there to assist the Trier of Fact in correctly interpreting the evidence. In that case, the correct interpretation was that it was a forgery. In Hor. v. City of Seattle, I assisted the Trier of Fact in correctly answering the question about if/when a particular sound is heard in a recording (10-2-34403-9SEA) - seemingly in the defense of the City of Seattle - but more correctly in defense of the facts of the matter.

Trier of Fact n. the judge or jury responsible for deciding factual issues in a trial. If there is no jury the judge is the trier of fact as well as the trier of the law. In administrative hearings, an administrative law judge, a board, commission, or referee may be the trier of fact.

Taken a step further, there are certain trade groups geared towards law enforcement that will expel a member who is perceived or accused of having worked "for the defense." The perception is that law enforcement are the "good guys" and the criminal defendants are the "bad guys." Yet, to an image scientist, a 1 or a 0 is neither good nor bad. They're just numbers. I've worked a few cases where the government's "experts" got everything completely wrong, their work product was neither repeatable nor grounded in science, and thus their conclusion was complete rubbish (scientifically speaking). In these cases, who's the "good guy" and who's the "bad guy?"

In the famous treason trial of Aaron Burr, he was defended by Edmund Randolph and Luther Martin, both delegates to the Constitutional Convention and among the most prominent men of the day. The Burr trial is one of the more famous examples of how politics and ego can enter into court proceedings.

But back to the point, if you're one of those scientists that think in terms of "good guys" and "bad guys," are you not biased towards a presupposed outcome - good will overcome evil and the bad guys will be punished? Is this form of presuppositional bias a good thing or a bad thing for scientists? I am certainly not one of those types of scientists. I work the case and the facts are the facts, regardless of who is signing my paycheck.

In the end, A either equals A or it doesn't.

Wednesday, June 4, 2014

LEEDIR Certified


Your humble host is now LEEDIR Certified.

Having sat through the training, I see LEEDIR as extending the concept of LEVA's IRIT beyond LEVA's group of Avid trained analysts. If someone has never seen the LEEDIR platform (or is not an FVA), they can be up and running with LEEDIR in less than two hours. If you've never worked on an Avid MC, you're not going to be very helpful to the IRIT.

For standard video/image formats, it's very easy to use and works great. It doesn't (yet) support proprietary video. But for those nasty proprietary files with known players, an Omnivore or VideoScanner 2 will do nicely for low cost screen captures. For the next instance, it would be much cheaper to ship a bunch of USB sticks out to the troops vs. shipping a bunch of FVAs to U Indy. Amped's user community could also be leveraged (via Citizen Global's cloud storage) for those files that have no player and need conversion.

With all the moving parts, the LEEDIR platform helps keep everything on one page - literally.

Tuesday, June 3, 2014

Mismatched parts problem

What happens to the video when you take this DVR mated to this camera at the wrong record setting? Do you remember how much aspect ratio was drilled into you during the LEVA Level 1?

The camera delivers what is effectively a 720x480 signal to the DVR. The DVR is, in this case, set to record WD1 ... which for this manufacturer (North America) means 960x480. OOPS! The DVR stretches the signal to fit the record dimensions. Not good.

Initially, the investigator thought that the video was just being stretched by the wide aspect monitor. But, further analysis revealed the stretch was happening to the recorded video. Remember, don't take things at face value.

I'm pointing this out as many DVR manufacturers are adding support for WD1. They're making YouTube videos showing just how cool their recorders are ... but really they're illustrating how their recorders distort the incoming signal.

This is an easy fix in FIVE, or any other software. But, if you don't know that it's broken, you won't know to fix it.
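If you need to do it outside of FIVE, here's a minimal Python sketch using OpenCV, assuming you've exported a WD1 (960x480) frame to an image file - the file names are placeholders:

import cv2

# Frame exported from the DVR at its recorded (stretched) size of 960x480.
frame = cv2.imread("wd1_frame.png")

# The camera delivered roughly 720x480; the DVR stretched it horizontally
# to 960. Undo the stretch by scaling the width back down, and document
# the interpolation method you used in your report.
restored = cv2.resize(frame, (720, 480), interpolation=cv2.INTER_AREA)

cv2.imwrite("restored_720x480.png", restored)

Keep in mind that 720x480 itself uses non-square pixels, so a further resize (typically to 640x480 for 4:3 material) may still be needed for correct display. Whatever correction you apply, note it in your report.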

Enjoy.

Monday, June 2, 2014

Application security in the news

PCWorld has a story this morning noting that "Nice Systems of Israel said it patched remaining critical flaws in its call recording software used by law enforcement, but the consultancy that discovered the risky flaws hasn’t verified the fixes."

"The firm’s advisory describes nine vulnerabilities in Recording eXpress, six of which were ranked as serious. Some of the flaws could allow attackers to access call recordings and crack open a database showing the names of people whose calls are being monitored, which could potentially wreck a law enforcement investigation.

Over the course of three months earlier this year, Nice Systems patched a few of the problems, but some remained. Last week, SEC Consult went public with its findings, warning organizations to not use the software until at least five outstanding issues were fixed."

So, not only do you have to worry about validation of your tools, you should also be concerned about application security ... especially when your applications contain sensitive or personal information.