Samsung A8 and the Depth of Field Scandal

I saw a Boing Boing article lambasting Samsung for a marketing spot they did.

Evidently, the image Samsung used to showcase the A8’s simulated depth of field was actually taken with a digital SLR.  They used a stock photo, changed the background, and simulated depth blur in an image editor.  Some argue that this was done simply to demonstrate the feature, but it is very sneaky.

The whole scandal is covered in the post by the original photographer.  It’s an interesting read.  However, this opened a whole can of worms.  Namely, could the Samsung A8 take pictures like the one in the article?

[original article]

I personally don’t own a Samsung A8, but I do have a new-generation smartphone, the Google Pixel 3.  The Pixel 3 also simulates depth of field in a “portrait mode.”  Given this, I set out to correct some knee-jerk reactions from various photographers.

Smartphone Doubts

Many believe that a smartphone can’t possibly replicate the depth of field produced by a real lens.  There’s also doubt about the quality of a smartphone image compared to that of a dedicated camera.

On paper, they’re right.  A larger sensor will capture more detail and output higher quality images.

Yet I’m reminded of a similar argument: back in the day, I was a film guy and I despised the idea that a DSLR could compete with film.  I used to complain that those tiny APS-C sensors couldn’t match a true 35mm frame, let alone a medium format film frame.

In time, however, I agreed that digital camera sensors had come a long way.

Now the argument is, “there’s no way a smartphone could take an image of quality comparable to a DSLR/Rangefinder/special camera… just no way.”

While a small smartphone sensor cannot equal a dedicated camera with a good-sized sensor, they come awfully close in image quality!

Depth of Field

Below you’ll find several images with depth of field.  I took these images at varying focal lengths on various camera bodies, and some were taken with a Google Pixel 3.

Look the images over and try to decide which ones were taken with a smartphone and which were taken with a dedicated camera body.  Be honest.  Also look past the content of the picture… don’t ask, “could a phone take an image like that?”  Instead, focus on the depth of field and ask, “is this the result of a camera lens, or a simulation?”

If you’re honest with yourself, can you tell me which one was taken on a Canon 5D Full Frame DSLR?

What about from my Nikon P900 superzoom camera?

How about my Sigma DP3M stacked-sensor camera?

But which one is a smartphone faking depth of field?

Certainly there are some giveaways.  Could I zoom in on that lemur or bird with a smartphone?  Probably not.

What about the people shots?  Maybe you’re looking for telltale signs that some of the shots are selfies, such as “is this shot from arm’s length away?”  But if you’re honest, and just looking at the bokeh or the depth of field, I think it’s very hard to spot which one is a smartphone picture and which one is a real depth of field shot.

Why Did They Use Stock Photos?

Honestly, the image in the article that I cited at the beginning could have been taken on a Google Pixel 3… and I assume it probably could have been taken on a Samsung phone as well (although I haven’t used the A8).

So why did they use a stock photo and mock it up to simulate a demonstration of depth of field from the camera?

Simple!  It was a marketing team behind the footage.

Marketing teams are not technology teams.  They are simply trying to sell a product.  They probably didn’t even have an A8 on hand to play with.  The marketers are told that they need to sell the new phone.  They go out and find some images, hand them to an artist, and the artist cuts out the background to simulate the change in depth.

This is all pulled into one marketing push and sent out.

“Oh, but Apple doesn’t do that.”  Oh yeah?  Are you sure?  I would wager that most technology companies have marketing that unfairly positions their product.  Remember those old Apple commercials where someone running Windows has it crash, and then someone in the audience yells, “Get a Mac”?  I grew up with those commercials.  During that time, I administered several computer labs at my college… and the Mac labs had the most downtime and issues of all.  The advertising, however, sold the idea that only PCs crash.

How about all those billboards with shots supposedly taken with an iPhone?  Shots of a guy skiing down a mountain?  Consider that the iPhone 5s (the phone that supposedly took that shot) has a lens comparable to a 24mm–70mm lens.  The photographer would have had to be feet away from the skier.  We all know a telephoto lens was used… someone attached a telephoto to the phone and shot through it.  Sure, an iPhone was used, but is it fair marketing?  Can you or I take such a picture with our phones?  Not unless we add another $1000 to it.

This is how marketing works.  It’s not fair, it’s not transparent – but that’s how marketing teams operate.  It’s easier for them to license some footage from Fotolia, Getty, etc., than to actually use the phone to take a picture.

As for the shots… here’s how my pictures correspond to the camera bodies:

Canon 5D shots


Nikon P900 Shots

Sigma DP3M Shots

Google Pixel 3 Smartphone Shots


One view some hold is that the image quality from a smartphone isn’t going to be close to that of a DSLR or another dedicated camera body.  Certainly the lens is small and the sensor is small… and if you’re blowing an image up to a poster print, it may not translate as well.  But at output sizes of 8.5″×11″ and below, you’ll get very good results with a smartphone.  Many smartphones today also shoot in RAW mode.

RAW mode shots can be manipulated and pushed further than a normal JPEG.  But is the quality there?
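The reason RAW files tolerate heavy edits comes down to bit depth.  Here’s a minimal Python sketch (NumPy only, using synthetic data rather than an actual camera file – the bit depths are illustrative) that quantizes the same dark gradient at JPEG-like 8-bit and RAW-like 12-bit precision, then pushes both three stops brighter.  The higher-precision version retains far more distinct tonal steps, which is why it survives the push without banding:

```python
import numpy as np

def push_exposure(signal, bit_depth, stops):
    """Quantize a [0, 1] signal to bit_depth, then brighten by `stops` (x 2**stops)."""
    levels = 2 ** bit_depth - 1
    quantized = np.round(signal * levels) / levels
    return np.clip(quantized * (2 ** stops), 0.0, 1.0)

# A dark gradient, like an underexposed shadow region
shadow = np.linspace(0.0, 0.1, 10_000)

jpeg_like = push_exposure(shadow, bit_depth=8, stops=3)   # 8-bit JPEG precision
raw_like = push_exposure(shadow, bit_depth=12, stops=3)   # 12-bit RAW precision

# The higher bit depth preserves many more distinct tonal steps after the push
print(len(np.unique(jpeg_like)), len(np.unique(raw_like)))
```

In the 8-bit case, the whole shadow region collapses into a few dozen levels before the push even starts – those missing steps show up as posterized bands once you brighten the image.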

Raw Mode

Consider the image below… it was taken with the Google Pixel 3 as a RAW image.  Then, using some post-processing, it was transformed into what you see.  Only a RAW image could have been pushed so hard without creating imperfections – click in on the detail and see the 100% crop:


The most detailed shots I’ve ever taken were from my Sigma DP3M.  That camera is very special, and truly unique.  It’s not a very user-friendly camera, but it does produce the highest level of detail and sharpness.  It’s so damn sharp that I have to soften images in post-processing or I’ll show every pore on a person’s face!  Just look at the example below!

Smartphone Disadvantages

Click on the above image and check out the full resolution.  The level of detail is amazing… that’s what you get from a camera whose sensor has three stacked layers, each collecting a different light frequency, combined into one highly detailed image.  To the left is a crop from that image; notice that you can see all the pores in the skin, as well as amazing detail in the eye.  A smartphone today can’t produce this level of detail.  It’s just impossible considering the sensor size and capabilities.

Another limitation of a smartphone is actual zoom.  Zoom in smartphones is achieved either by software cropping or by secondary lenses.  These never produce the quality you’ll get with a dedicated 200mm+ lens.
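Software zoom, in essence, is just a center crop scaled back up to full size – no new detail is captured.  The toy function below (NumPy, nearest-neighbour upsampling; not any phone’s actual pipeline) makes the pixel math explicit:

```python
import numpy as np

def digital_zoom(image, factor):
    """Crop the center 1/factor of the frame, then upsample back to full size."""
    h, w = image.shape[:2]
    ch, cw = h // factor, w // factor
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = image[top:top + ch, left:left + cw]
    # Nearest-neighbour upsample: the frame gets bigger, the detail does not
    return crop.repeat(factor, axis=0).repeat(factor, axis=1)

# A 64x64 test frame where every pixel has a unique value
img = np.arange(64 * 64, dtype=float).reshape(64, 64)
zoomed = digital_zoom(img, 4)
# Output is still 64x64, but only (1/4)**2 = 1/16 of the pixels carry real information
print(zoomed.shape, len(np.unique(zoomed)))
```

A 4× software zoom throws away 15/16 of the sensor data, which is why it can’t compete with a real telephoto lens putting the full sensor behind a narrow field of view.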

With regard to depth of field, smartphones are simulating depth.  The simulation works in an ideal situation, but will fail if you introduce complex patterns, objects, or glass that refracts the background.  It’s not perfect.  Having said that, a phone user can mitigate most of the problems by changing the shot to fit the scene.
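At its core, simulated depth of field is: segment the subject, blur everything else, composite.  The sketch below is a deliberately simplified model (a hand-made binary mask and a uniform Gaussian blur via SciPy) – real phones estimate a per-pixel depth map instead of a flat mask, and that estimation step is exactly what complex edges and refractive glass trip up:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_depth_of_field(image, subject_mask, blur_sigma=8.0):
    """Composite a sharp subject over a Gaussian-blurred background.

    image: float array (H, W, 3), values in [0, 1]
    subject_mask: float array (H, W), 1.0 = keep sharp, 0.0 = blur
    """
    blurred = np.stack(
        [gaussian_filter(image[..., c], blur_sigma) for c in range(3)], axis=-1
    )
    # Soften the mask edge so the transition isn't a hard cutout
    soft_mask = gaussian_filter(subject_mask, 2.0)[..., None]
    return soft_mask * image + (1.0 - soft_mask) * blurred

# Synthetic example: a square "subject" region on a noisy background
rng = np.random.default_rng(0)
img = rng.random((64, 64, 3))
mask = np.zeros((64, 64))
mask[20:44, 20:44] = 1.0
result = simulate_depth_of_field(img, mask)
```

Everything inside the mask stays untouched while the background is smoothed away – and any pixel the segmentation misclassifies gets the wrong treatment, which is where the telltale artifacts around hair, fences, and eyeglasses come from.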

Smartphone Advantages

So yes, there are quality differences.  Where the smartphone really competes is in the 24mm–50mm focal length range.  In this range, a smartphone can produce an image whose source the average person cannot discern.  Was it shot on a camera?  Is it film or digital?  Full frame, APS-C, or smartphone?  It can be very challenging to tell.  Even in low light these days, smartphones are making advances with less noise.

Smartphones also allow us to immediately share or print our work.  If you want, you can export to Luminar or Photoshop and work with a shot in post, but most of us are content with what the onboard filters and editing tools can do – and honestly, those tools work very well on a Google Pixel.

Using AI, simulated depth of field, onboard filters and a variety of other technologies, smartphones are competing in their focal ranges.  Who knows what the future will hold, but smartphones are certainly getting better.


