The Samsung Galaxy S23 Ultra just solved one of my biggest camera problems
I don’t know precisely what Samsung did with the Galaxy S23 Ultra’s camera hardware and software, but it’s fixed something that’s been bothering me about portrait mode photos for a long time.
Samsung put a lot of emphasis on the Galaxy S23 Ultra’s new 200MP camera during its presentation at Galaxy Unpacked earlier this month, but one moment that really captured my attention was when (23 minutes in, to be precise) it promised a clean separation of a subject’s glasses from the background in portrait mode shots.
As a spectacles wearer myself, I’ve often found using portrait mode for selfies unreliable when my face isn’t dead-on to the camera, so this sounded very interesting. And of course I had to give it a try when the phone arrived for testing as part of our Galaxy S23 Ultra review.
To give you the bottom line early: Samsung’s telling the truth. The Galaxy S23 Ultra already tops our best camera phones guide, and the fact that it captures glasses better in portrait mode than an iPhone 14 Pro Max (our second-placed phone) gives you more freedom when taking selfies or posing for the phone’s main cameras. That only reinforces the S23 Ultra’s position as the top camera phone you can buy.
Making a spectacle of portrait mode
I first noticed this when I went hands-on with the Galaxy S23 Ultra prior to the phone’s release and took this selfie in front of a leafy pot plant. While you can debate whether I look better in the iPhone or Galaxy’s image, what’s obvious is that my glasses are cut out near-perfectly in the Samsung photo, whereas there’s a strange blurry artifact on my specs’ right side in the iPhone photo.
This problem doesn’t come up if you take an image straight-on, as you can see in the comparison below, this time with a plain white background. But having to always take photos while staring straight down the lens isn’t fun, let alone practical.
Turning my head to the side in this 3x lens shot, we see that even with a nice simple background for the phones to apply the bokeh blurring effect to, the iPhone still can’t get it right. While my glasses frames are cut out properly, it doesn’t have my eye in focus, which looks very odd when the rest of my face is.
Going back to the selfie cameras, we again see the iPhone struggling to pick out the edge of my glasses, although my eye is in focus this time (minus my eyelashes). The Galaxy S23 Ultra however gets it right, just as I’d hoped.
To be fully transparent, one of my Samsung portrait shots didn’t come out quite so well. So even if the Galaxy S23 Ultra can get the bokeh around a glasses-wearing subject right, it’s still not flawless. Maybe we’ll have to wait for future Samsung phones for this to be fully reliable.
In the current smartphone world, where flagship models across brands offer similarly effective cameras, tiny differences — such as being able to apply portrait mode to people accurately — can be make or break for a sale.
I’m not going to abandon my iPhone off the back of this test, but it’s more evidence that Apple’s photography lead has been eroded by the Galaxy S23 Ultra. And that Cupertino’s finest, despite being led by the glasses-wearing Tim Cook, has missed something crucial that needs to come to the iPhone 15.