Google has started rolling out the February update, which enables the Pixel Visual Core on the Pixel 2 and Pixel 2 XL. This allows third-party developers of applications that access the camera API to tap into the Visual Core hardware and use the HDR+ image processing that has been available in the main camera app since launch.

[Image: without HDR+ vs. with HDR+ on the Pixel Visual Core]
The Pixel Visual Core is a custom image processor in the Pixel 2 devices that provides hardware acceleration for various image processing algorithms, primarily the HDR+ effect, which greatly increases the dynamic range of an image while reducing noise. It also includes RAISR, which uses machine learning to upscale digitally zoomed images and make them sharper.
All of these effects have been available in the main camera app since launch, but third-party apps that accessed the camera could only take a standard picture without any of this additional processing, which noticeably hurt image quality, since the Pixel 2 camera without HDR+ processing isn't anything to write home about. Developers can now update their apps to access this hardware, so any app that takes pictures can produce the same high-quality images as the main camera app.
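For developers, no Pixel-specific API is involved: with the Visual Core enabled, HDR+ is applied to still captures made through the standard Android Camera2 API. As a rough sketch (the fields cameraDevice, captureSession, imageReader, and backgroundHandler are hypothetical names for objects an app would set up elsewhere), a still-capture request could look like this:

```java
// Sketch of a standard Camera2 still-capture request on Android.
// On a Pixel 2 with the Visual Core enabled, HDR+ processing is applied
// to captures like this transparently; there is no special HDR+ call.
// cameraDevice, captureSession, imageReader, and backgroundHandler are
// assumed to be initialized elsewhere (hypothetical for this example).
CaptureRequest.Builder builder =
        cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
builder.addTarget(imageReader.getSurface());
// Leave the standard automatic control mode on (auto-exposure,
// auto-white-balance); the HDR+ pipeline works with the auto modes.
builder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
captureSession.capture(builder.build(), null, backgroundHandler);
```

This is framework code that only runs inside an Android app with camera permission, so it is illustrative rather than a drop-in implementation.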
In the coming days, Google will also be rolling out new AR stickers with a winter sports theme. These will only be available in the main camera app.