You keep hearing about AI-powered cameras, but it often sounds like the Wizard of Oz is working behind the curtain. Now Samsung has posted a fairly detailed explanation of how the Galaxy S21 series uses AI to create impressive portrait shots.
The selfie camera processing is pretty straightforward. First, the software identifies faces in the image and marks them for further processing (this is called segmentation). Second, detail in hair, eyes and facial features is enhanced. The camera also tweaks the white balance to achieve natural skin tones regardless of the ambient light.
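Samsung does not say which white-balance algorithm it uses, but a classic baseline for "natural colors regardless of ambient light" is gray-world white balance: scale each color channel so its average matches the overall gray level. The sketch below is a minimal numpy illustration of that idea, not Samsung's actual method; the function name is my own.

```python
import numpy as np

def gray_world_white_balance(img):
    """Rescale each channel so its mean matches the overall mean.

    A classic automatic white-balance baseline. `img` is an H x W x 3
    float array in [0, 1]. The method Samsung actually uses is not public.
    """
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gray = channel_means.mean()
    # Scale each channel toward the global gray level, then clip to [0, 1].
    balanced = img * (gray / channel_means)
    return np.clip(balanced, 0.0, 1.0)

# A bluish image: the blue channel's mean is higher than red's or green's.
bluish = np.zeros((4, 4, 3))
bluish[..., 0] = 0.2  # red
bluish[..., 1] = 0.3  # green
bluish[..., 2] = 0.6  # blue
out = gray_world_white_balance(bluish)
# After balancing, the three channel means are equal.
```

On a real photo this would be applied after face detection, so skin regions can be weighted more heavily; here the whole frame is balanced uniformly for simplicity.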
Portrait mode is much more involved. It starts with segmentation so later stages know which parts of the image are humans, which are pets and which are just background. This way the correct processing can be applied where needed. This segmentation map is used to create a rough “seed map”.
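A typical way a segmentation network's output becomes a rough seed map is a per-pixel argmax over class scores, then a binary mask for the subject class. The toy sketch below assumes hypothetical classes (background, person, pet) and hand-made scores; it only illustrates the shape of the data, not Samsung's model.

```python
import numpy as np

# Hypothetical per-pixel class scores from a segmentation network:
# channel 0 = background, 1 = person, 2 = pet. Toy-sized 4x4 frame.
scores = np.zeros((4, 4, 3))
scores[..., 0] = 0.5        # background score everywhere by default
scores[1:3, 1:3, 1] = 0.9   # a "person" blob in the middle of the frame

labels = scores.argmax(axis=-1)        # per-pixel class id (segmentation map)
seed = (labels == 1).astype(np.uint8)  # rough binary seed: subject vs rest
```

In a real pipeline the seed map would then be refined by the tri-map and matte stages described next; on its own it is deliberately coarse.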
Next is the “tri-map”, which is important as it highlights the border between the subject and background. Then the matte map traces fine detail within that border – it keeps hair and facial features from blending into the background. Finally, the depth estimation pass calculates the distance to objects, which will be used to create the shallow depth of field effect.
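A common way to derive a tri-map from a binary subject mask (though Samsung does not spell out its method) is morphology: erode the mask to get pixels that are definitely subject, dilate it to bound the definitely-background region, and mark the band in between as unknown. Below is a minimal numpy sketch using shift-based dilation; the function names and kernel size are my own choices.

```python
import numpy as np

def dilate(mask, r=1):
    """Binary dilation with a (2r+1)x(2r+1) square kernel via shifts.
    np.roll wraps at the edges, which is harmless here because the toy
    mask does not touch the image border."""
    out = mask.copy()
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

def erode(mask, r=1):
    # Erosion is dilation of the complement, complemented again.
    return ~dilate(~mask, r)

def make_trimap(mask, r=1):
    """255 = definite subject, 0 = definite background, 128 = unknown band."""
    trimap = np.full(mask.shape, 128, dtype=np.uint8)
    trimap[erode(mask, r)] = 255
    trimap[~dilate(mask, r)] = 0
    return trimap

# Toy 8x8 seed mask: a 4x4 "subject" square in the middle.
seed = np.zeros((8, 8), dtype=bool)
seed[2:6, 2:6] = True
trimap = make_trimap(seed)
```

The grey (128) band is exactly where the matting stage then does its fine work, recovering hair strands and other sub-pixel detail.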
If you’ve encountered less than perfect portrait modes, you’ll know how often strands of hair and other fine details are blurred into the background, giving the image a very artificial feel. It’s the accuracy of the matte map that sets Samsung’s portrait shots apart from the rest.
Below is an example tri-map (center image), white for the subject, black for the background and grey for uncertain areas. On the right is the resulting matte map, which refines the subject/background separation.
The phone uses the seed map to apply image enhancements to the subject and to blur the background. Then it adds the matte map to create a sharp division between subject and background. All these layers are processed and combined into the final image in 3 seconds.
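The final compositing step described above amounts to alpha blending: blur the whole frame, then blend the sharp original back in wherever the matte says "subject". Here is a minimal numpy sketch of that idea, with a naive box blur standing in for the real bokeh simulation; all names are my own and this is an illustration, not Samsung's implementation.

```python
import numpy as np

def box_blur(img, r=1):
    """Naive box blur (average of shifted copies); a stand-in for bokeh.
    np.roll wraps at the edges, acceptable for this toy example."""
    acc = np.zeros_like(img, dtype=np.float64)
    n = 0
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            n += 1
    return acc / n

def composite(img, alpha, r=2):
    """Alpha-blend the sharp subject over a blurred background.
    `alpha` is the matte map in [0, 1]: 1 = subject, 0 = background."""
    blurred = box_blur(img, r)
    a = alpha[..., None]  # broadcast the matte over the color channels
    return a * img + (1 - a) * blurred

# Demo: a random 8x8 image with a square "subject" matte in the middle.
rng = np.random.default_rng(0)
img = rng.random((8, 8, 3))
alpha = np.zeros((8, 8))
alpha[2:6, 2:6] = 1.0  # subject region, e.g. taken from the matte map
result = composite(img, alpha)
```

Because the matte is continuous rather than binary in practice, pixels along a hair strand can get alpha values between 0 and 1, which is what avoids the hard, cut-out look.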
There is a problem with portrait mode selfies. Selfies taken in portrait mode just come out as regular photos, with no blur or any other effect. It also says "looking for face" and never says "ready" once it finds a face. A reboot fixes it, but on...
Samsung has offered this 'Live Focus' mode since the Galaxy Note8, so it's really good to see how much they've improved it by the S21 series. Especially with the front camera, which doesn't have a depth sensor; that's the really tricky part.
As an engineer, it would be a pleasure to help make this possible for us all. I just need to get to know the right people willing to work on it, so with that said, maybe we can have a call sometime, yeah? 🙂