We're starting off with a scene where our model (he doesn't often get called that) is leaning against a wall and the photo is taken at an angle. The idea is to see how the different algorithms handle the roll-off of the blur as the distance increases, and whether (and how much) any of the phones will try to blur the wall in front of the plane of focus.
Looking at the telephoto implementations and the Pixel 3's pseudo-tele images, we can see that the Note9 applies the least amount of background blur. The iPhone's background progressively melts away, but at the default setting the effect isn't as strong as that from the Pixel, which is also the most dependent on distance.
The Pixel is also the only phone that applies blur to the section of the wall that's in the foreground. However, the wall should be sharp where the plane of focus intersects it, and that's not the case in the Pixel's image - so the effect is a bit overdone, in fact.
The wide-angle portraits immediately stand out with their very different perspective, and in the case of the S9, we couldn't quite replicate the framing because of the proximity requirement of its 'Selective focus' mode. And since we're on the S9, let's just say that it doesn't really bother much with depth detection - instead, it keeps the face sharp and blurs the rest.
The V40, Mate 20 Pro and OnePlus 6T apply progressively more blur the further the background goes, and the effect is the strongest on the Mate. It also applies some amount of vignetting by default, even if you've disabled its light effects, drawing the eyes towards the center of the frame.
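The behavior described above - blur growing progressively stronger the further the background sits behind the plane of focus - can be sketched in a few lines. This is purely an illustration of the general approach, with made-up parameters, not any phone's actual processing pipeline:

```python
# A minimal sketch of distance-dependent background blur, assuming blur
# strength scales with estimated depth behind the plane of focus.
# The function name and parameters are illustrative, not a real phone API.

def blur_radius(depth_m, focus_m, strength=2.0, max_radius=25.0):
    """Blur radius (px) that grows with distance behind the focal plane."""
    if depth_m <= focus_m:
        return 0.0  # at or in front of the plane of focus: keep sharp
    return min(max_radius, strength * (depth_m - focus_m))

# The further away the background, the stronger the simulated defocus:
for d in (1.5, 2.0, 4.0, 10.0):
    print(d, blur_radius(d, focus_m=1.5))
```

A real implementation would feed a per-pixel depth map through something like this and apply a variable-radius blur; how aggressively `strength` ramps up is exactly where the Mate 20 Pro and the Pixel differ from the Note9.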
It's worth mentioning that all phones did a great job isolating the subject's head from the background, but oddly enough, not all managed to properly separate the clothes - the hood proved particularly troublesome.
For this second scene we threw in some Christmas lights in the background to examine how the portrait modes will render those - that's one area where just blurring things doesn't get you close to the real deal. On top of that, Angie's hair is infinitely more challenging than the well-defined outline of our first test subject, and it shows.
Christmas lights first. We'd say the iPhone is the undisputed king in this respect producing the nicest bokeh balls of the bunch - even the reflections of the LEDs in the table top are rendered as big circles of defocused goodness (well, simulated, but still). And that's at the iPhone's default simulated aperture of f/4.5 - pick a lower f-number and they grow bigger.
Interestingly enough, it's the OnePlus 6T that's produced the second most pleasing rendition of those. Meanwhile, the Galaxies draw them as a general mushiness of light, and the Pixel has completely obliterated them into an abyss of blur - you can't really tell there are lights in there at all.
None of the phones has done a perfect job of rendering the hair, and all have failed one way or another. The most difficult bit is in the left of the frame, where her hair is set against the wood panelling of the wall and the phones are at a loss as to where to start blurring.
The idea behind the third scene was to have more distracting objects in the background to try and confuse the algorithms into thinking they're part of the subject. We were shooting for some hair extensions courtesy of the dry plant in the back, to complement the subject's already bush of a hairstyle, and possibly create issues with the shelves running behind the head.
Due to framing limitations imposed by the varying focal lengths and concepts we couldn't strictly replicate the same challenges for all phones, but it is what it is.
The iPhone's take on the shrub challenge is mostly successful, aided in no small part by Apple's way of handling portraits in the first place - render a sharp oval in the middle of the face and gradually blur away as you move out. That makes the hair-or-twig conundrum easier to handle as the hair is already pretty blurry in that area. It is a bit shaky, however, with the jaw line where skin and beard meet shelf.
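The sharp-oval-with-gradual-falloff behavior described above can be approximated with a radial blend weight that is zero inside an ellipse around the face and ramps up further out. The function and all its parameters here are illustrative assumptions, not Apple's actual implementation:

```python
import math

# Sketch of a "sharp face oval, blur grows outward" weight map.
# All names and parameters are assumptions for illustration only.

def radial_blur_weight(x, y, face_cx, face_cy, rx, ry, falloff=1.5):
    """Return 0 inside the face oval (sharp), ramping toward 1 outside."""
    # Normalized elliptical distance: values <= 1 lie inside the oval.
    d = math.hypot((x - face_cx) / rx, (y - face_cy) / ry)
    if d <= 1.0:
        return 0.0  # the face itself stays sharp
    # Gradual ramp from sharp to fully blurred as we move away.
    return min(1.0, (d - 1.0) / falloff)
```

The upside, as noted, is that stray hair near the edge of the oval is already partly blurred, so a hair-or-twig misclassification there is barely visible; the downside shows up at hard borders like the jaw-meets-shelf area, where a smooth ramp is the wrong answer.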
The Note9 has done better with the shelf, but runs into problems with the hair, where it fails to recognize that the background is a mostly flat color (so it doesn't really-really need to blur it) and makes a bit of a mess of the unruly strands, cutting them off into blur at arbitrary lengths. Even so, it was able to tell the plant in the background isn't part of the subject. The Pixel wasn't fooled by the plant either, plus it did a much better job with the hair, though it got oddly bested by a green leaf on the side. It had no issues with the shelves.
The wide ones all did fine with the hair, the S9 actually being slightly superior at that than its fellow Galaxy. Minor imperfections at the shelf-to-face border can be observed on all when examined up close, but there are no outright blunders.
We devised Scene 4 to test how the portrait modes would handle objects that are in the same plane of focus as the main subject. With actual optics, those should be in sharp focus as well, but if the software is tuned to find a face, keep it sharp, and then mostly blur the rest, the results would be different.
This ended up being the case with the iPhone XS Max, just as we suspected. We understand the drive behind that - you want nothing to distract from the person's face. Even so, provisions should be made in the software to account for uncommon cases like this one, no?
The only other phone that got fooled was the Galaxy S9, though it did so in a really weird way. The concrete column is half sharp, half blurred - what's that about?