Fearghast, 22 Aug 2020: Better than Sony fans, that's for sure :D

Oh yes, I so can relate to that! xD
Will it be a metal and glass construction?
Anonymous, 24 Aug 2020: Facts? It is a fact that the Imx 689 has 4 subpixels per microlens. This was shown at the pres…

As brief as I wanted it to be, here's what you should know:
The Pixel's dual-pixel sensors use a standard Bayer color filter array and have Dual-Pixel AF (DPAF). They DO NOT use pixel binning to achieve 12MP.
The IMX689 has a Quad Bayer color filter arrangement. The microlenses are meant to improve phase-detect AF (PDAF); they have been implemented since the IMX686, they are called 2x2 OCL, and they are very specific to Quad Bayers. Why microlenses? Because they need to be small enough to fit the 0.8um pixel size (at first, in the IMX686, they are smaller compared to a lens that spans one color tile with 4 photosites behind it). Just don't treat "microlens" as a blanket term for OCL; it isn't. There are 4 photosites per color tile, and each color tile has one on-chip lens (OCL) over it (as in the IMX586). The 686 improves on AF with one OCL per photosite, which means 4 OCL per color tile. The 689 improves on sensor area, which means bigger photosites than before.

No matter how much phone manufacturers overhype this tech, calling it whatever flashy name they want, they can't hide the fact that it is still an improvement meant to address each photosite for PDAF on Quad Bayers. Addressing each pixel for PDAF has long been the case on standard Bayer filters; that's why they can do Dual-Pixel AF (the least amount of pixels needed to triangulate the subject's distance), and that tech has already been refined in hardware and in the back-end algorithms, which makes it fast, accurate and reliable, while the Quad Bayers have yet to get there in terms of speed and accuracy in overall image processing.
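To make the binning point concrete, here's a toy sketch (purely illustrative, not any vendor's actual pipeline): on a Quad Bayer CFA each colour tile covers a 2x2 block of photosites, and "binning" simply averages each same-colour block into one output pixel, which is why a 48MP Quad Bayer sensor outputs 12MP by default.

```python
import numpy as np

def quad_bayer_bin(raw):
    """Average each 2x2 block of an HxW raw mosaic -> (H/2)x(W/2)."""
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0
    # Group the mosaic into 2x2 tiles and average within each tile.
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Toy 4x4 "48MP" mosaic binned down to a 2x2 "12MP" output.
raw = np.arange(16, dtype=float).reshape(4, 4)
binned = quad_bayer_bin(raw)
print(binned.shape)  # (2, 2)
```

The quarter-resolution output is the trade-off the comment describes: you give up spatial resolution per shot in exchange for larger effective pixels.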
Here's where you're wrong:
1. You know nothing about the previous sensors used by Pixels that use Dual-Pixel AF (DPAF). In fact, you know nothing about DPAF.
2. Due to #1, you keep insisting that 12MP dual-pixel AF sensors are really 24MP and use pixel binning to get 12MP. Haha.
3. The microlenses are not what's responsible for pixel binning. Lol.
4. Maybe you should have just hit the search bar and looked up the Bayer filter arrangement and DPAF. They're totally different.
Now go back and read your previous statements and see why they're laughable.
DroidBoye, 23 Aug 2020: It's NOT a weak answer especially when you, yourself, doesn't know where you're…

Facts? It is a fact that the IMX689 has 4 subpixels per microlens. This was shown at the presentation of the Oppo Find X2 Pro. Furthermore, there are no scientific articles about the IMX689, but there are scientific articles about microlenses. Maybe you shouldn't believe every piece of nonsense you read when you "start hitting the search bar in your web browser".
A very big downgrade from the Pixel 4. Very disappointed with Google; I think I will have to switch to the iPhone 12.
Anonymous, 23 Aug 2020: Weak answer. You even don't try to defend your arguments. I already explained that microl…

It's NOT a weak answer, especially when you yourself don't know where you're wrong. I would say it was never meant to be "strong"; my previous statements are meant to show facts, not strong arguments. Now you might consider yours to be "strong", but was it even factual? Maybe stop embarrassing yourself and start hitting the search bar in your web browser.
DroidBoye, 23 Aug 2020: You are explaining everything on a wrong context. Maybe it's time for you to hit the sear…

Weak answer. You don't even try to defend your arguments. I already explained that microlenses completely change the behaviour. That's a fact! And it's not a different context at all. You cannot take 24-megapixel photos with a 12-megapixel dual-pixel sensor. When a microlens covers multiple subpixels, each subpixel sees the scene from a different perspective; that's why a 12-megapixel dual-pixel sensor, which is actually a 24-megapixel sensor, cannot take 24-megapixel photos. If one removed the microlenses from the Pixel 4 or iPhone 11 sensor, it could take 24-megapixel photos. So a 48-megapixel sensor with one microlens above 4 subpixels behaves physically very differently from a 48-megapixel sensor where each subpixel has its own microlens. It's nonsense to say "it is the same sensor, just with better autofocus". No, by changing how many subpixels are covered by a microlens, you totally change the physics of capturing a photo. You cannot ignore the microlenses.
Anonymous, 23 Aug 2020: "Just don't make it as if IMX686 rivals the Quad Bayers," As I said it is fund…

No one cares; it's marketed as Quad Bayer in the market.
It's one of the best on the market giving balanced performance with very few shortcomings.
Anonymous, 23 Aug 2020: "Just don't make it as if IMX686 rivals the Quad Bayers," As I said it is fund…

You are explaining everything in the wrong context. Maybe it's time for you to hit the search bar in your browser so you can learn the differences.
DroidBoye, 23 Aug 2020: IMX689 is a quad-bayer sensor with incremental improvement on AF to fix the slow AF problems t…

"Just don't make it as if IMX686 rivals the Quad Bayers,"

As I said, it is fundamentally different. Four subpixels under one microlens capture totally different signals than one microlens per subpixel. That's the same reason a 12-megapixel dual-pixel sensor cannot take 24-megapixel photos. A 12-megapixel dual-pixel sensor actually has 24 megapixels and uses pixel binning to produce 12-megapixel photos. Because one microlens covers two subpixels, you cannot get 24-megapixel photos. Physically, it makes a huge difference how many subpixels are covered by a microlens.
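A toy 1-D sketch of the dual-pixel idea being argued here (purely illustrative; the signal values and the brute-force matcher are invented, not a real sensor pipeline): one microlens covers a left and a right photodiode, the output pixel is the sum of the two (the "binning"), and the shift between the left-only and right-only views is the phase-detect signal used for AF.

```python
import numpy as np

# Left-subpixel readings along one scanline (made-up values).
left = np.array([0., 1., 3., 7., 3., 1., 0.])
# Out-of-focus: the right subpixels see the scene displaced sideways.
right = np.roll(left, 2)

# The image pixel delivered to the photo is the binned sum of both halves.
image = left + right

def phase_shift(l, r, max_shift=3):
    """Brute-force search for the shift that best aligns the two half-images."""
    return min(range(-max_shift, max_shift + 1),
               key=lambda s: np.sum((l - np.roll(r, s)) ** 2))

print(phase_shift(left, right))  # -2: the AF system moves the lens to cancel it
```

When the scene is in focus the two half-images coincide and the detected shift is zero, which is the condition the AF loop drives toward.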
Anonymous, 23 Aug 2020: The Imx 689 has four subpixels per microlens. Quad Bayer sensors don't offer that. Quad B…

The IMX689 is a Quad Bayer sensor with an incremental improvement on AF to fix the slow-AF problems that exist on quad (and above) Bayer setups compared to dual-pixels. Samsung's 108MP Nona Bayer implementation had focusing problems because they prioritized upping the pixel count and sensor size instead of fixing the in-sensor AF issue, which only Sony does with the 689. The microlens improvement is just for that: mainly to improve AF, and Sony advertises it as rivalling the speed of dual-pixels. Just don't make it as if the IMX686 rivals the Quad Bayers; no, it's a subset, an incremental but vital improvement within the Quad Bayer category of sensors.
Google is NOT obsessed with dual-pixel sensors. I would say standard Bayer sensors were chosen because they're the best sensors for the job Google does with its algorithms, which requires speed (as fast as possible) and top-of-the-line AF accuracy. Quad (and above) Bayer implementations are simply slower compared to standard Bayer implementations.
Berserker, 22 Aug 2020: What difference it makes if it's called Google pixel or Google nexus? Zero Selling it un…

Nexus phones were cheap budget devices. The move to Pixel was supposed to signify premium, high-end phones. So yes, it does matter, as Google is now offering mid-range specs on a phone placed at the top end of the market. Therefore a name change would be appropriate.
Anonymous, 23 Aug 2020: The Imx 689 has four subpixels per microlens. Quad Bayer sensors don't offer that. Quad B…

Fair point, mate...
Xiaomi is doing great lately, and without Sony. First they invested millions in an ISOCELL sensor (Samsung manufactures it), and now they have invested in a custom OmniVision sensor, and it seems they did a great job, judging by those DXO sample pics: excellent exposure, very accurate and natural white balance, same with colours, unlike some of the competition.
Anonymous, 23 Aug 2020: Exactly this is a downgrade compared to any Android flagships out there.

It's not marketed as a flagship phone. It's just an unfortunate naming coincidence on Google's part. They couldn't name it Pixel 5a, because then people would have questions about the Pixel 4a name.
The best solution for Google would have been to never release this phone (the Pixel 5).
If they had put all the bells and whistles of a flagship phone into the Pixel 5, then they would have priced it at $1000. Who would have bought it then? It would have been the Pixel 4 case all over again.
I think what Google is trying to say is that the Pixel 4a is an entry-level mid-range phone and the Pixel 5 is an upper-level mid-range phone, if something like that can exist. But IMO their thinking here is wrong.
Anonymous, 23 Aug 2020: I can not explain that Google again misses the opportunity to use a periscope tele camera. It…

Maybe a 3x tele at least would be enough for the Pixel 5 (due to size issues, a periscope is still more expensive and bigger). It is quite sad though; imagine if this Pixel 5 sported a triple-camera combo of wide, tele and ultra-wide.
T M, 22 Aug 2020: LOL! Google isn't going to release a newer phone with a faster SoC, 5g, 90 htz, bigger s…

Given that Samsung demands $1000 for the Note 20, I will not be surprised if the Pixel 5 costs $599. At that price point it will be another failed phone from Google, like the Pixel 4.
You've got the OnePlus Nord, with the same chip and much more affordable, and you've got the Pixel 4a if you want that Google camera experience. IMO, I don't know why Google is releasing this Pixel 5 model now.
If the Pixel 5 really has an ultra-wide camera, I imagine this could be a reason why Marc Levoy left Google. I thought there were rumors that Osterloh wasn't happy? Marc Levoy was right that a tele camera is more useful than an ultra-wide; I guess Osterloh is responsible for the stupid decision.
Anonymous, 23 Aug 2020: The Imx 689 has four subpixels per microlens. Quad Bayer sensors don't offer that. Quad B…

"The Samsung Gn1 (72mm², 50 megapixel Quad Bayer Dual Pixel) or the sensor of the Huawei P40 Pro (75mm², 50 megapixel Quad Bayer Dual Pixel)."
I want to add that the Omnivision OV64C (41mm², 16 megapixel Quad Pixel) or OV64B (32mm², 16 megapixel Quad Pixel) could be better alternatives. These Omnivision sensors could maybe resolve more detail and should also be better for the depth in Google's portrait mode, due to the 2x2 microlenses.