Why Narrow Depth of Field Makes Large Objects Look Tiny


37 Signals' review of 2006 posts reminded me of their post about Olivo Barbieri's photography, where he uses a tilt-shift lens to bring only part of an aerial photograph into focus.

The most obvious effect is that the large object now looks like a tiny model. There are various tutorials on how to achieve a similar effect in Photoshop with a gradient lens blur. See, for example, Fake Model Photography.

But why does this work? Why does a narrow depth of field make large objects look tiny?

For a fixed film (or sensor) size, depth of field depends on the focal length and aperture of the lens and how far the subject is from the camera.

As photographers well know, a photo taken with a 200mm lens will have a much narrower depth of field than one taken at 16mm. Similarly, a photo taken at f/1.2 will have a much narrower depth of field than one taken at f/22.

But it's the role of subject distance in depth of field that makes photos like the one above look like models. If we assume 35mm film (or a full-frame sensor) and a 50mm lens at f/4, then focusing at 1m gives a depth of field of around 10cm, whereas focusing on a subject at 10m gives a depth of field of around 10m (from roughly 7m to 17m). In fact, for our 50mm f/4 case, any subject over 25m away will have its background completely in focus.
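These numbers can be checked with the standard thin-lens depth-of-field formulas. A minimal sketch, assuming a 0.03mm circle of confusion for 35mm film (the post's own figures come from Dudak's calculator, which may assume a slightly different value, so expect small differences):

```python
# Depth of field from the hyperfocal distance, using the standard
# thin-lens approximations. All distances are in millimetres.
# Assumption: circle of confusion = 0.03mm (a common 35mm-film value).

def hyperfocal(f_mm, n, coc_mm=0.03):
    """Hyperfocal distance for focal length f_mm and f-number n."""
    return f_mm * f_mm / (n * coc_mm) + f_mm

def dof_limits(f_mm, n, subject_mm, coc_mm=0.03):
    """Near and far limits of acceptable focus.

    Past the hyperfocal distance, the far limit is infinite.
    """
    h = hyperfocal(f_mm, n, coc_mm)
    near = subject_mm * (h - f_mm) / (h + subject_mm - 2 * f_mm)
    if subject_mm >= h:
        return near, float("inf")
    far = subject_mm * (h - f_mm) / (h - subject_mm)
    return near, far

# 50mm lens at f/4, subject at 1m: depth of field of roughly 10cm
print(dof_limits(50, 4, 1000))   # ~ (956mm, 1048mm)
# Same lens, subject at 10m: depth of field spanning many metres
print(dof_limits(50, 4, 10000))
# Past the ~21m hyperfocal distance, everything behind is in focus
print(dof_limits(50, 4, 25000))  # far limit is inf
```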

So if you see an image of a subject that's 25m away and anything beyond that quickly goes out of focus, then either you've got a very wide aperture or a very long lens (or both).

Now imagine the subject is 500m away and is starting to get out of focus by, say, 510m. What sort of focal length or aperture would that require? A 50mm lens would need an aperture on the order of f/0.004, i.e. a physical aperture about 12.5m across! If you wanted a more reasonable aperture like f/2.8, your lens would need a focal length of around 1350mm.

So you can see that a photo where a point 500m away is in focus but starting to get out of focus by 510m away is impractical to take. The eye knows this. So an alternative interpretation when looking at the photo is to assume it's not 500m away but is actually, say, 500mm away and is much smaller. Having a point 50cm away in focus and 51cm away starting to get out of focus is easily achievable with, say, a 50mm lens at f/4.
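The miniature interpretation is easy to sanity-check with the same formulas: the focus falloff that is impossible at 500m is trivial at 500mm. A minimal check, assuming a 0.03mm circle of confusion:

```python
# Scale the scene down 1000x: a subject at 50cm with a 50mm lens at f/4.
# Assumption: circle of confusion = 0.03mm. Distances are in millimetres.

def dof_limits(f_mm, n, subject_mm, coc_mm=0.03):
    h = f_mm * f_mm / (n * coc_mm) + f_mm  # hyperfocal distance
    near = subject_mm * (h - f_mm) / (h + subject_mm - 2 * f_mm)
    far = subject_mm * (h - f_mm) / (h - subject_mm)
    return near, far

near, far = dof_limits(50, 4, 500)
print(f"50mm at f/4, subject at 50cm: sharp from {near:.0f}mm to {far:.0f}mm")
```

The far limit lands right around 51cm, matching the claim that this falloff is easily achievable at model scale.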

Note that I'm being generous with these calculations. When I say "starting to get out of focus", I'm talking about the earliest point at which you could tell on standard 35mm film. This may actually be closer to the subject than I've assumed, but that only makes the point stronger. If we were to require the focus at 500m to be going soft by 501m, for example, even an f/1.2 lens would have to be 2.7m long.

(btw, I did all these calculations using Dudak's Depth-of-Field Calculator)

The original post was in the category "photography", but I'm still in the process of migrating categories over.