The future is blurred: how smartphone cameras create next-gen background blur
Making your photos look arty since 2014
Call it a portrait mode, bokeh mode or background blur: almost every big-name smartphone has a camera feature that blurs out the background of your images. It makes portraits pop and nature images look arty, and can even make photos of everyday objects seem larger than life.
But how does it work?
These modes emulate the effects you can get with a DSLR or mirrorless camera. On those cameras, the two main factors that dictate the strength of the blur effect are the size of the sensor and the size of the aperture, or lens opening.
The latest phones, like the Huawei Mate 20 Pro and Pixel 3 XL, have incredible cameras that can produce great shots in almost all conditions.
Even these have far smaller sensors than full-size cameras, though, which limits their ability to blur backgrounds optically – so they need a little help.
The two-lens effect
Most phone cameras with a blur mode use two rear cameras to create the effect, and the idea behind this is simple.
The two cameras perceive depth much as our eyes do. The lenses are slightly offset, giving each a slightly different perspective on the scene. The closer an object is, the greater the disparity in its position and appearance between the two 'eyeballs'.
This difference is then analyzed by the phone’s processor, just as our brains process data from our eyes, to create a depth map of the scene.
You won’t see this as you shoot, but you can think of it as a rudimentary 3D model, or like a map where the terrain is represented as a series of contour rings. An algorithm then goes to work, blurring out the parts further away from your subject.
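To make that concrete, here's a minimal sketch of the two steps – disparity to depth map, then depth-weighted blur – using OpenCV's stereo block matcher. This is an illustration of the principle, not any phone maker's actual pipeline, and the filenames and parameters are assumptions:

```python
# Minimal depth-from-stereo plus depth-based blur sketch using OpenCV.
# 'left.png' and 'right.png' are hypothetical rectified frames standing
# in for the phone's two rear cameras.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching finds, for each pixel, how far a patch has shifted
# between the two views - the disparity. Nearby objects shift more.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Normalise to 0..1: 1.0 = close to the camera, 0.0 = far away.
depth = cv2.normalize(disparity, None, 0.0, 1.0, cv2.NORM_MINMAX)

# Crude 'portrait mode': blend the sharp image with a blurred copy,
# giving far pixels more of the blurred version.
colour = cv2.imread("left.png").astype(np.float32)
blurred = cv2.GaussianBlur(colour, (31, 31), 0)
weight = (1.0 - depth)[..., None]  # far pixels get more blur
out = colour * (1.0 - weight) + blurred * weight
cv2.imwrite("portrait.png", out.astype(np.uint8))
```

Real phones layer calibration and edge-aware filtering on top, but blend-by-depth is the core of it.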
These algorithms have improved hugely since dual-lens cameras with blur modes started to appear on phones in 2014, with the HTC One M8 being the early front-runner.
You’ll now see convincing progressive blurring, with objects slightly behind or in front of your subject only a little out of focus, while those further away get the true 'bokeh' effect, taking on a lovely smooth blur – this is also known as a 'shallow depth of field'.
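One way to approximate that progressive fall-off is to slice the depth map into bands and blur each band more heavily as it recedes. Another toy sketch, assuming a colour image and a `depth` map like the ones computed above (1.0 = subject plane, 0.0 = far background):

```python
import cv2
import numpy as np

def progressive_blur(img, depth, bands=4, max_kernel=41):
    # Assign each pixel to a depth band: 0 = farthest, bands - 1 = subject.
    idx = np.minimum((depth * bands).astype(int), bands - 1)
    out = np.zeros_like(img)
    for i in range(bands):
        # Farther bands get a bigger (always odd) Gaussian kernel.
        k = 1 + 2 * int((bands - 1 - i) * max_kernel / (2 * bands))
        layer = cv2.GaussianBlur(img, (k, k), 0) if k > 1 else img
        out[idx == i] = layer[idx == i]
    return out
```

Phone pipelines blend far more smoothly than four hard bands, but the principle – blur strength driven by distance from the subject – is the same.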
When used correctly, the term bokeh refers to the quality of the blur effect created by a camera lens. All sorts of adjectives can be attached, so you'll hear talk of beautiful bokeh, creamy bokeh and so on – and thankfully the best phones are starting to edge towards the bokeh effects of dedicated cameras.
Some phones, including the Huawei Mate 20 Pro and Samsung Galaxy Note 9, also let you choose the level of the blur – the equivalent of adjusting the aperture on a camera lens – while the latest iPhones let you change it after the snap is taken.
As yet only a few phones, among them the Galaxy Note 9 and the Nokia N86 from 2009, have had cameras with genuinely variable apertures – where the opening through which light reaches the sensor can be made wider or narrower.
However, the idea on those phones is to narrow the aperture – letting in less light to handle ultra-bright conditions – rather than to widen it, which is what creates more blur.
Even the widest-aperture phones, like the f/1.5 Samsung Galaxy S9, only have the lens chops to capture natural-looking shallow depth-of-field blur when shooting close-up.
If you want to talk about this to a friend and make it sound like you know your stuff, be sure to use the term 'crop factor'.
This refers to the sensor size relative to the size of standard 35mm film, and is a solid indicator of both how well a camera can natively deal with low-light conditions without software aid, and how pronounced a blur effect you’ll see at a given f-stop rating, again without software.
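As a back-of-the-envelope illustration (the sensor dimensions below are rough, typical figures, not a specific phone's spec sheet):

```python
import math

def crop_factor(width_mm, height_mm):
    full_frame_diag = math.hypot(36.0, 24.0)  # 35mm film, ~43.27mm diagonal
    return full_frame_diag / math.hypot(width_mm, height_mm)

# A typical 1/2.55-inch phone sensor is roughly 5.8 x 4.3mm.
cf = crop_factor(5.8, 4.3)
print(f"crop factor: {cf:.1f}")  # ~6.0
# For depth of field, an f/1.8 phone lens behaves roughly like
# f/1.8 * crop factor on a full-frame camera.
print(f"f/1.8 behaves like f/{1.8 * cf:.1f} for background blur")
```

That is why even a 'fast' f/1.8 phone lens natively produces only the shallow depth of field of a very modest full-frame aperture.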
These days, every type of smartphone shooting is enhanced by software, but you can appreciate just how clever the background blur effect of some phones is when you ask them to deal with small points of light.
These aren't just blurred – they bloom into artsy-looking balls of light. Get a few of these into your bokeh images and you're onto a winner.
This sort of light treatment demonstrates that phones like the iPhone XS don’t just emulate a lens, but the elements inside a lens.
A camera lens is not a single piece of glass, but a whole series of them that direct light from the wider opening onto the smaller sensor. The arrangement, and quality, of these elements affects the character of the out-of-focus parts of an image.
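You can see the difference a kernel's shape makes in a toy example: a plain Gaussian smears highlights into nothing, while convolving with a disc – a crude stand-in for a round aperture – makes bright points bloom into those tell-tale balls. This is the shape of the idea only, not any phone's real optics model, and the filename is hypothetical:

```python
import cv2
import numpy as np

def disc_kernel(radius):
    # A filled circle, normalised - mimicking a round lens aperture.
    k = np.zeros((2 * radius + 1, 2 * radius + 1), np.float32)
    cv2.circle(k, (radius, radius), radius, 1.0, -1)
    return k / k.sum()

img = cv2.imread("night_scene.png").astype(np.float32)
boosted = img ** 1.5  # exaggerate highlights so points of light bloom
bokeh = cv2.filter2D(boosted, -1, disc_kernel(15))
cv2.imwrite("bokeh.png", np.clip(bokeh ** (1 / 1.5), 0, 255).astype(np.uint8))
```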
As you can see, software bokeh blurring is much more than just a simple Instagram-style filter.
Tracing outlines
All phones with a background blur mode tend to struggle when dealing with scenes in which the subject’s outline is very complicated, though. Most phones have depth sensors of a lower resolution than the main camera, meaning the depth map created is somewhat rudimentary.
And even with the best depth systems, you'll often see a slightly rough outline where the in-focus subject meets the blurred background. As the effect isn't optical, background blurring is always, to some extent, an informed guess, which leads to the weird 'cut-out' edges you might sometimes see.
Other methods
There are other background blur methods that don't rely on a dual-camera setup, and these place even greater emphasis on that clever guesswork. Google's is the best implementation of a single-camera blur mode.
This doesn’t just use object and outline recognition, though. The Pixel 2 and Pixel 3 rear cameras use their dual-pixel autofocus to work out which areas of an image are part of the subject, and which are further away. This is based on the principle of phase detection.
Each pixel in the phone’s camera sensor is made of two photodiodes, the elements that detect light. This lets each pixel separate light received from the left and right sides of the lens.
When a pixel can tell whether the light it's capturing is in focus or not, it can tell whether that area of the image is part of the subject, and how far removed it is from the focal point.
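In rough terms, the left and right half-views a dual-pixel sensor records behave like a microscopic stereo pair: the further a patch has to be shifted to line the two up, the more defocused – and the further from the focal plane – that patch is. A toy sketch, with hypothetical image arrays:

```python
import numpy as np

def patch_shift(left_patch, right_patch, max_shift=4):
    # Try small horizontal offsets; the best match is the disparity.
    best, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(right_patch, s, axis=1)
        err = np.mean((left_patch - shifted) ** 2)
        if err < best_err:
            best, best_err = s, err
    return best  # ~0 means in focus; larger |shift| means more defocus
```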
Google also packs blurring into the Pixel 3’s front camera, though, and that doesn’t have dual-pixel AF.
Here we get the pure software effect, developed using a neural network designed to recognize people and pets.
This is a pure version of the 'informed guesswork' mentioned earlier, and that it works about as well as the dual-camera version for portraits shows how clever Google’s software is.
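As a stand-in for that idea – emphatically not Google's actual network, which is proprietary – an off-the-shelf segmentation model can produce a person mask and a passable software portrait effect. A hedged sketch using torchvision's DeepLabV3 and a hypothetical selfie.jpg:

```python
import torch
import torchvision
from torchvision import transforms
from PIL import Image, ImageFilter

# Downloads pretrained weights on first run.
model = torchvision.models.segmentation.deeplabv3_resnet50(weights="DEFAULT").eval()
img = Image.open("selfie.jpg").convert("RGB")
prep = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
with torch.no_grad():
    out = model(prep(img).unsqueeze(0))["out"][0]
mask = (out.argmax(0) == 15).numpy()  # class 15 = 'person' in PASCAL VOC
blurred = img.filter(ImageFilter.GaussianBlur(12))
mask_img = Image.fromarray(mask.astype("uint8") * 255)
Image.composite(img, blurred, mask_img).save("portrait_selfie.jpg")
```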
There's another method too, one that has thankfully fallen out of favor. Some older single-camera phones with a blur mode cycle through the lens's focus range, capturing exposures throughout and analyzing which parts of the image lose sharpness as the focal point retreats from your position. It's a lot slower than a good dual-camera setup, and the results are often not as good.
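The principle, sometimes called depth-from-focus, can be sketched like this – the focal-stack filenames and the focus measure are illustrative assumptions, not any specific phone's pipeline:

```python
import cv2
import numpy as np

# A 'focal stack': frames captured while sweeping the lens through
# its focus range (hypothetical files).
stack = [cv2.imread(f"sweep_{i}.png", cv2.IMREAD_GRAYSCALE) for i in range(10)]

# Per-pixel sharpness at each focus position: local energy of the
# Laplacian, a standard focus measure.
sharpness = []
for frame in stack:
    lap = cv2.Laplacian(frame.astype(np.float32), cv2.CV_32F)
    sharpness.append(cv2.GaussianBlur(lap * lap, (15, 15), 0))

# The index of the sharpest frame per pixel is a coarse depth map:
# regions that snap into focus early in the sweep sit at a different
# distance from those that sharpen late.
depth_index = np.argmax(np.stack(sharpness), axis=0)
```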
- Some of the best phones for background blur effects: the Huawei Mate 20 Pro, iPhone XS and Samsung Galaxy Note 9.