Detecting Fake Photos with Digital Forensics

A Q&A with Hany Farid on photo forensics
March 23, 2011

As photography has gone digital, it has become ever easier to manipulate images with Photoshop and other tools. Digital photographs used in the news industry are often adjusted for aesthetic reasons—a contrast adjustment here, a color alteration there. But they can also be altered with the aim of deceiving editors or readers. Luckily, digital detection technology is advancing quickly as well.

Hany Farid, a mathematician and digital-forensics specialist who teaches computer science at Dartmouth College, has developed a host of tools that accurately identify images that have been altered. He will be speaking—along with Santiago Lyon, director of photography for the AP—at an MIT symposium on April 5 called “Ethics and Forensics in the Age of Photoshop Photojournalism.” Assistant editor Lauren Kirchner spoke with Prof. Farid to learn more about the science involved in photo forensics. This is an edited transcript of that conversation.

Can average news readers or viewers ever tell whether a photograph has been altered? Are there tell-tale signs to look for?

Yes and no. Your brain is actually fairly good at noticing certain inconsistencies in a photo. For example, we’ve all seen what I like to call “floating head syndrome,” where someone’s head is pasted onto someone else’s body and it looks disembodied, like it’s literally floating. Those things just pop out at us, and we don’t need anyone to tell us there’s something wrong. But those examples are deceptive, because they are just bad fakes. The problem is that there are other aspects of the visual system—the part of your brain that processes images—where it’s just really bad at determining whether something is consistent or not. We have done a variety of studies, and developed forensic software, to determine how good people are at visually assessing authenticity. There are things we are very good at, and there are other things we are very bad at.

For example, we’re very bad at light and shadows. If I show you a photograph, and I ask you, “Are the shadows here consistent or inconsistent?”, you basically will have no idea. You just can’t tell. But here’s the really dangerous part: it’s not just that you can’t tell, it’s that you’re consistently wrong. You will look at something where the shadows are absolutely correct, and you will think, “Nope, something’s wrong here,” and you will say that consistently. So it’s worse than guessing, because you are wrong, you are sure that you’re right, and that’s the worst combination. I like to call that “the arrogance and ignorance effect.” And so the danger of relying on your brain to assess authenticity based on things like shadows and perspective and texture and lighting is that we’re just not actually that good at it.

So while bad fakes are very easy to detect, good fakes are very difficult to detect; and, worse, really good pictures are often declared fake, because of this failure to reason about things like lighting and reflections and shadows and perspective. You see this effect in photojournalism now, where everybody sees a remarkable photograph and says, “No, that can’t be real.” There’s almost a knee-jerk reaction in the opposite direction.

So if those are not reliable ways to detect a fake, then what are?

That’s the beauty of mathematics and physics and computer science: we can quantify and measure whether things are consistent or not. We know how to write down equations that quantify how shadows are cast, we know how to write down equations that describe perspective projection, we know how to write down equations that describe JPEG compression, and so on. With all of this we can actually determine whether the things we see are physically correct or incorrect. Now, the issue with these tools, of course, is that they are not yet at the stage where you just push a button and get an answer. It’s not like CSI on TV; it’s actually a fair amount of work.
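[Editor’s note: To make “writing down the equations” concrete, here is a minimal sketch, in Python, of one such geometric constraint: under a single point light source, the line drawn from a point on a cast shadow through the corresponding point on the object must pass through the projected light source, so all such lines should nearly meet at a common point. The function names, threshold, and sample data are invented for illustration; this is a sketch of the idea, not Farid’s software.]

    import numpy as np

    def line_through(shadow_pt, object_pt):
        # Homogeneous line through two 2D image points: the cross
        # product of their homogeneous coordinates.
        p = np.array([shadow_pt[0], shadow_pt[1], 1.0])
        q = np.array([object_pt[0], object_pt[1], 1.0])
        return np.cross(p, q)

    def shadow_consistency(pairs, tol=0.05):
        # pairs: (shadow_point, object_point) correspondences; at least
        # three are needed for the test to be meaningful. If the shadows
        # come from one point light, every connecting line passes through
        # the projected light source, so the stacked line equations
        # A @ x = 0 share a common solution x.
        A = np.array([line_through(s, o) for s, o in pairs])
        A /= np.linalg.norm(A, axis=1, keepdims=True)
        _, sing, vt = np.linalg.svd(A)
        light = vt[-1]                 # best common point (homogeneous)
        residual = sing[-1] / sing[0]  # near zero if lines nearly concur
        return residual < tol, light   # threshold is arbitrary here

    # Invented data: three shadow/object point pairs from one image.
    pairs = [((10, 80), (14, 40)), ((50, 82), (52, 45)), ((90, 85), (86, 50))]
    consistent, light = shadow_consistency(pairs)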

Having said that, there are certain techniques we’ve developed that are extremely automated, that we can run in batch mode. I’ll describe one of those. Digital cameras are great for many reasons, but one is that they are all really different. Even within the Nikon family of cameras, they will all generate digital photographs in different ways. JPEG is the dominant format in which images are produced, and there’s probably a sense that a JPEG image is a JPEG image, but that’s actually not the case. In fact, there are huge differences in the underlying encoding of a JPEG from camera manufacturer to camera manufacturer. That’s really cool, because when a camera creates a JPEG, the image may look the same visually, but to us its encoding looks very different—different from a JPEG that has been opened in Photoshop, altered, and then re-saved.

So we have a technology that has essentially learned, from millions and millions of examples of images created by cameras, what the statistics of a camera-original JPEG look like. When an image is altered in any way—and this could be as minor as cropping or a contrast adjustment, or, more nefariously, removing someone from the photograph or inserting someone into it—we can detect that.
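[Editor’s note: For illustration only, here is a minimal Python sketch, using the Pillow imaging library, of the simplest piece of such a signature: the quantization tables embedded in a JPEG file, which vary from one camera maker to another and differ from the tables an editor such as Photoshop writes when it re-saves an image. The file names are hypothetical, and this is a sketch of the idea, not Farid’s software.]

    from PIL import Image  # Pillow imaging library

    def jpeg_quant_tables(path):
        # Return the quantization tables stored in a JPEG file, as a
        # dict mapping table id to 64 coefficients. These tables are
        # one part of the encoding "signature" that differs between
        # camera firmware and desktop editors.
        with Image.open(path) as im:
            if im.format != "JPEG":
                raise ValueError(f"{path} is not a JPEG")
            return {tid: list(tbl) for tid, tbl in im.quantization.items()}

    def same_signature(path, known_tables):
        # Crude comparison: does this file match tables previously
        # recorded for a known camera model? A mismatch means the file
        # was re-encoded somewhere after it left the camera.
        return jpeg_quant_tables(path) == known_tables

    # Hypothetical usage: record a signature from a known camera
    # original, then screen an incoming submission against it.
    # camera_sig = jpeg_quant_tables("known_camera_original.jpg")
    # print(same_signature("submitted_photo.jpg", camera_sig))

[A matching table does not prove authenticity on its own, and the full technique learns statistics over many more encoding parameters; a mismatch, though, is a cheap red flag that can be checked in batch, which is the behavior Farid describes.]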

Now, this technique is good for certain things and not good for others. It is good, for example, if you are in a court of law and want to make sure that digital evidence has not been altered in any way since the time of recording. If you are the AP or Reuters or another news agency and you are getting photographs from citizen journalists, and you want to make sure they have not been altered in any way whatsoever, it is good for that.

But where it is not good is if you are the AP or Reuters and you have a photojournalist you’ve been working with for ten years, who of course is going to do manipulations: crop the image, enhance the contrast, remove dust, and so on. Well, then, we can’t tell the difference between that and some other manipulation. So while this works in batch mode, and we can process tens of thousands of images in five minutes, the more complex forensic analysis—the kind that looks at lighting and shadows and geometry and so on—is a much more nuanced process that requires an analyst to actually be in front of a computer.

Is this JPEG-analyzing technology already in use, or are you still developing it?

I use it in my day-to-day consulting work, but we are in the early stages of creating a company to commercialize it. There are two primary audiences: law enforcement and media outlets. We hope that by the end of the year we will have a version of this software available.

When I think of digital alteration, I tend to think of airbrushing in fashion magazines—techniques that make photo subjects look thinner or younger. But what are the most common types of forgery or alteration in photojournalism?

While I don’t work in news, the most high-profile examples I’ve seen have been aesthetic. You take a picture, and you say to yourself, “If only that cloud weren’t there,” or “Why is there a pole sticking out of that guy’s head?”, or “The guy in this photo looks really cool, and the guy in that other photo looks really cool, and they were taken just a few seconds apart, so maybe I can composite them into one.” It’s usually about composition and aesthetics, to get that “bang” effect. Now, I’m sure there are examples of people fundamentally altering photographs—complete fabrication. But many of the cases are more like: you could have taken that photo, it could have happened, but you just didn’t. I’m not defending that practice, though—I don’t actually think that’s okay.

There was a great example recently: a photograph in [the Egyptian daily newspaper] Al-Ahram of Mubarak, President Obama, and a few other leaders walking through the White House. Mubarak was trailing off to the right and President Obama was in the lead, but when Al-Ahram published the photograph, they spliced Mubarak out and put him in front, and the headline read “Mubarak Leads Peace Talks,” or something like that. So you could argue that, you know, they were all there, and maybe it’s just a question of composition—but no, obviously that’s way off limits. Usually, though, a lot of it comes down to composition, and making those sexy and flashy photographs. Which probably says something about the state of photojournalism—that to get things published, there has to be a big “wow factor” in these photographs.

Do you happen to know whether photography awards take this kind of thing into consideration?

Yes, I have been contacted by a number of photography awards and competitions. In most that I’ve seen, it’s an absolute no-no. Most say that you can crop, and maybe adjust contrast globally, but nothing else is allowed. I do think most prizes want to stay true to the nature of traditional photography. It’s not better or worse, but creating a composite photo is a different skill set. A digital artist’s or a graphic designer’s skill set is different from a photojournalist’s.

You have a great visual history on your website, Photo Tampering Throughout History, showing that this kind of thing has been going on for well over a century, even though the methods were obviously different….

Yes, altering photographs dates back to the 1800s—essentially, as soon as photography started, people started altering photographs. Some of these old ones are amazing. Hitler did it, Mussolini did it, Mao—all the great dictators did it.

What did they do?

Stalin famously airbrushed people out of photographs who fell out of favor with him. That was true also of Hitler and Mao; Castro did it, too. Whenever people pissed them off—it was this really childish reaction. [Laughs] But I think that the reason they altered photographs was that they understood the power of photography. They understood that if you can change photographs, you can change history.

Lauren Kirchner is a freelance writer covering digital security for CJR. Find her on Twitter at @lkirchner