I did cringe slightly as I wrote that headline, as it would be more correctly stated as ‘How to get shallow depth of field’ – bokeh being the particular appearance of the out-of-focus areas. But, thanks to Apple, the world is using one term to refer to the other, so I gave in where the headline was concerned.
Only the iPhone 7 Plus, with its dual-camera system, has the ability to generate artificial shallow depth of field, an effect activated in yesterday’s beta. But all iPhones are capable of generating optical shallow depth of field in very limited circumstances, and it’s actually really easy to do so …
If you’re not interested in the ‘why’ but just want to get to the ‘how,’ then skip down to the sentence above the leaf photo. But if you’re interested in understanding how this whole shallow depth of field thing works, read on.
So, why does the 7 Plus need to create fake shallow depth of field anyway? Why does shallow depth of field tend to be found only in expensive cameras, not cheaper ones, and not smartphones?
The answer to that question is down to two factors:
- Lens aperture
- Sensor size
To correctly expose a photograph, a camera needs to let in the right amount of light. One way to do this is to adjust the shutter speed – have the shutter open for a shorter or longer amount of time. The other way is to change the size of the hole through which the light passes. That’s known as the aperture, and a traditional lens uses metal blades like this:
The bigger the hole, the wider the aperture – and the more light the lens lets in. Also, the wider the aperture the shallower the depth of field. Shoot on a conventional camera with a wide aperture and you’ll blur the background.
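If you like to see the numbers behind that, here’s a quick sketch in Python – my own illustration, not anything from Apple. The f-number is simply the lens’s focal length divided by the diameter of the aperture opening, and the light gathered scales with the area of that opening. The 50mm focal length is just an example value.

```python
# Illustrative sketch: the f-number is the ratio of focal length to the
# diameter of the aperture opening, so a lower f-number means a physically
# wider hole – and more light.

def aperture_diameter_mm(focal_length_mm: float, f_number: float) -> float:
    """Diameter of the aperture opening: focal length divided by f-number."""
    return focal_length_mm / f_number

# An example 50mm lens at f/1.8 vs f/8:
wide = aperture_diameter_mm(50, 1.8)   # ~27.8 mm opening
narrow = aperture_diameter_mm(50, 8)   # 6.25 mm opening

# Light gathered scales with the *area* of the opening, i.e. with 1/N²,
# so f/1.8 lets in roughly (8/1.8)² ≈ 20x as much light as f/8.
light_ratio = (8 / 1.8) ** 2
```

That area relationship is why photographers talk about ‘fast’ lenses: dropping from f/8 to f/1.8 lets in roughly twenty times the light – and, as above, brings shallower depth of field with it.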
Smartphones don’t let you change the aperture. But the iPhone 7 has a wide-aperture lens, at f/1.8. On a DSLR, that would get you a beautifully blurred background, like this photo I took in Cambodia with a whole bunch of people standing just 10 feet or so behind the child.
On an iPhone, however, normally almost everything is in focus despite that f/1.8 lens, so what gives? The answer is the second factor: sensor size.
For any given aperture, the larger the sensor, the shallower the depth of field. A full-frame DSLR has a large sensor, so it permits shallow depth of field. Compact cameras have smaller but still sizeable sensors, so they can still achieve shallow focus, just not to the same degree.
Smartphones, though, have tiny sensors. Yes, the iPhone’s is bigger than many, but it’s still tiny compared to a full-size camera’s. So even at f/1.8, pretty much the whole image is in focus.
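A handy rule of thumb here – again a rough sketch of my own, not from any spec sheet – is the ‘equivalent aperture’: multiply the f-number by the sensor’s crop factor (the ratio of a full-frame sensor’s diagonal to the smaller sensor’s diagonal) to see how the depth of field compares. The crop factor below is an assumed round number for a phone-sized sensor.

```python
# Rough sketch: for depth-of-field comparisons, multiply the f-number by the
# sensor's crop factor to get the full-frame-equivalent aperture.

def equivalent_f_number(f_number: float, crop_factor: float) -> float:
    """Full-frame-equivalent f-number for depth-of-field purposes."""
    return f_number * crop_factor

PHONE_CROP_FACTOR = 7.0  # assumed: a round figure for a phone-sized sensor

equivalent = equivalent_f_number(1.8, PHONE_CROP_FACTOR)  # roughly f/13
```

In other words, for depth-of-field purposes the iPhone’s f/1.8 behaves more like f/13 on a full-frame DSLR – which is why everything ends up sharp.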
There is, however, an exception: when you focus on something very close to the camera. That’s what I did in the photo at the top of this piece, and here’s another example.
In both cases, the thing I’ve focused on is literally about a foot away from the camera, while everything in the background is around 100 feet away or more.
But you can get significant background blurring even with much less space to play with. In this shot, taken in my office, the cat is just 5-6 feet away from the iPhone, yet it’s still significantly blurred.
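For the curious, a back-of-the-envelope sketch – mine, with assumed iPhone-7-ish lens numbers – shows why the close-subject trick works so well. The size of the blur disc a distant object casts on the sensor is roughly proportional to (1/subject distance − 1/background distance), so moving the point of focus closer blows the background blur up fast.

```python
# Rough sketch: the blur-disc diameter on the sensor for an out-of-focus
# background is approximately A * f * (1/s - 1/D), where A is the aperture
# diameter, f the focal length, s the focus distance and D the background
# distance (all distances much larger than f). Lens numbers are assumed,
# iPhone-7-ish values.

def blur_disc_mm(focal_mm, f_number, subject_mm, background_mm):
    aperture = focal_mm / f_number  # physical opening diameter
    return aperture * focal_mm * (1 / subject_mm - 1 / background_mm)

FOCAL, F_NUM = 3.99, 1.8  # assumed phone-camera lens

close = blur_disc_mm(FOCAL, F_NUM, 300, 30_000)    # subject 1 ft, bg ~100 ft
far = blur_disc_mm(FOCAL, F_NUM, 2_000, 30_000)    # subject ~6.5 ft, same bg

# close ≈ 0.029 mm, far ≈ 0.004 mm: focusing close makes the background's
# blur disc roughly 7x larger, even though nothing else changed.
```

On a sensor with pixels a couple of microns across, a 0.029mm blur disc spans dozens of pixels – visibly soft – while 0.004mm is only a few pixels, barely noticeable. That’s the whole trick in one inequality: subject close, background far.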
The bad news is that you can’t use this technique in the most useful situation: with people. Unless you’re going to get close enough to photograph their pores – not something most people appreciate – you’re only going to be able to achieve very slight blurring of the background.
But if you want to shoot flowers or similar, you can get great results with this technique. Just get the iPhone very close to the object you’re photographing, and angle things so that the background is as far away as possible.
If you’ve used this technique yourself, do post some links in the comments.
You can find more iPhone photography tips in a piece I wrote earlier in the year, with my colleague Jeff chipping in with his own.