The iPhone 7 Plus came out in 2016 with the company's first secondary cam, and at the time hardly anyone was doing telephoto cams. Apple wanted you to take portraits with blurred backgrounds and determined the normal+telephoto pairing to be the best fit for the task. It's the third generation of that setup that we're seeing on the iPhone XS and the XS Max we have in our hands now.
Meanwhile, Samsung took its time, and it wasn't until the Note8 that a secondary cam popped up on a Galaxy flagship - conceptually the same design as on the iPhones. The Note9 borrows that same configuration.
The Galaxy's telephoto cam may have been inspired by the iPhone's, but when it comes to the primary shooter, it's the other way around. Samsung made waves in 2016 with the Galaxy S7 and its dual pixel sensor and that's essentially what the company's high-end phones have used since, and now the iPhone too.
The Note9 uses a Type 1/2.55" sensor with 1.4µm pixels, each of them devoting a portion to phase detection. The imager is placed behind a 26mm equivalent lens that can change its aperture from f/2.4 in good light to f/1.5 in darker settings. The lens is stabilized.
Same on the iPhone. Well, not quite - there's no dual aperture action here, just a fixed f/1.8. The rest of it, however, appears identical - same sensor and pixel size, same effective focal length, dual pixel autofocus, image stabilization. This new sensor replaces the small-ish 1/3-inch one used in previous iPhones.
The phones share a seemingly identical telephoto camera, at least going by the specsheets. It starts with another 12MP sensor, a smaller Type 1/3.4" one with 1.0µm pixels. The lens offers a field of view equivalent to a 52mm lens in 35mm camera terms (so effectively a 2x 'zoom' compared to the main module), it has an f/2.4 aperture, and is also stabilized.
Oh, one semi-notable difference between the XS Max and the Note9 - the iPhone boasts a quad LED flash, while the Galaxy makes do with just the one LED.
Under the hood, both phones employ some form of image stacking, thereby increasing dynamic range and cancelling out noise.
Apple says the iPhone constantly keeps a four-frame buffer, so that it can offer zero (or at least minimal) shutter lag. Its Smart HDR processing uses those frames, plus some extra ones captured in between at different exposure levels - shorter ones to preserve highlight detail and a longer one to bring out the shadows - from which it then selects the best combination to blend into a final image with improved overall dynamic range.
Samsung's been doing something along these lines since the Galaxy S8 and is no longer really advertising it. Let's just say that the Note9 is no stranger to taking multiple frames, analyzing them and blending them together. That goes by 'HDR (rich tone)' in the camera settings, the smarts are implied.
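Neither maker publishes its blending math, but the general idea behind this kind of multi-frame processing can be sketched in a few lines. The snippet below is our own illustrative toy (the `blend_exposures` function and its mid-gray weighting are assumptions, not either company's actual pipeline): each frame's pixels are weighted by how well-exposed they are, then averaged, so highlights come from the short exposures and shadows from the long one.

```python
import numpy as np

def blend_exposures(frames, sigma=0.2):
    """Toy exposure fusion: weight each pixel by how close it sits
    to mid-gray (a 'well-exposedness' score), then blend the stack.
    frames: list of same-shaped float arrays with values in [0, 1]."""
    stack = np.stack(frames)                       # shape (N, H, W)
    # Gaussian weight peaking at 0.5 - clipped shadows/highlights count less
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Simulated bracket: an underexposed and an overexposed 'frame'
dark = np.full((2, 2), 0.1)
bright = np.full((2, 2), 0.9)
fused = blend_exposures([dark, bright])            # lands at mid-gray, 0.5
```

A real implementation would also align the frames first and work on multi-scale pyramids, but the weighting-and-blending core is the same.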
Fire up the camera app on an iPhone and it's all very familiar. Sometimes that's a good thing, but in this case we'd prefer to see some changes - like maybe, just maybe, having the camera settings inside the camera app?
Anyway, swiping left or right switches between modes, but a flick up or down won't take you to the selfie cam like it would on the Note - here you need to tap on the toggle next to the shutter button for that. At least there's a dedicated mode for video recording so you can properly frame your clips.
Not so on the Galaxy - we've been whining about the shared stills/video viewfinder for as long as we can remember. Sure, having the record button right there in the photo viewfinder means one fewer step to start recording, and holding it will give you a preview of the video frame, but we still don't find that ideal.
Changing modes works the same way here, and as an added bonus you get to pick what modes to have available and rearrange them to your liking in settings. There's a Pro mode with lots of manual control over shooting parameters - good luck getting that from Apple. The '2x' button for the telephoto cam is a bit too small and distant, though we could file this in the nitpicking folder.
Apple made a big deal at this year's keynote of a feature it's added to portrait mode. Or rather, to the gallery, when editing shots taken in portrait mode - you can change the level of background blur to simulate the effect you'd get from lenses of different apertures, all the way down to f/1.4. Huawei's Aperture mode is unimpressed. Mind you, Samsung offers a similar post-shot adjustment, only in units from 1 to 7. Unlike on the iPhone, though, it's also available in the viewfinder as you're framing your portraits.
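To show why this adjustment can work after the shot, here's a minimal sketch of the principle (our own hypothetical `simulate_aperture` function, not Apple's or Samsung's code): the phone stores a depth map alongside the photo, so the editor can re-blur the background with a radius that scales inversely with the chosen f-number - a wider simulated aperture means a stronger blur.

```python
import numpy as np

def simulate_aperture(image, subject_mask, f_number, max_radius=8):
    """Re-blur the background of a 'portrait' after the fact.
    subject_mask: True where the subject is (kept sharp).
    Blur radius scales inversely with f-number, mimicking how a
    wider aperture gives a shallower depth of field."""
    radius = max(1, int(round(max_radius / f_number)))
    k = 2 * radius + 1
    # Naive box blur: sum shifted copies of an edge-padded image
    padded = np.pad(image, radius, mode='edge')
    h, w = image.shape
    blurred = np.zeros_like(image)
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= k * k
    # Subject pixels stay sharp; background pixels get the blur
    return np.where(subject_mask, image, blurred)
```

In the real phones the depth map is continuous rather than a binary mask, so the blur ramps up gradually with distance, but the idea is the same: sharpness decisions are deferred until edit time.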
Alright, now that we're reasonably acquainted with the hardware and software, let's go out and take some shots while there's still sunshine.