AnonD-909757, 27 Jul 2020Those two comments were really appreciable answer, thanks a lot !
I didn't knew for th... moreYou don't want a disastrous wide aperture.
It will give you more problems than benefits.
Samsung's 1/1.33" sensor (35mm equivalent) with an f/1.4 aperture can give you a great shallow depth of field, but it will make your photos soft. If you're close enough, it will put only your nose in focus and everything else out of focus, even your eyes. This lens will have terrible dynamic range and will over-expose your image.
It would be unusable for landscape, close-up (macro) shots, architecture and things like that.
AnonD-909757, 26 Jul 2020But even with the same tech, wouldn't a cheaper sensor that have inferior tech (less ligh... moreFirst para was hilarious !
How can you use inferior tech in one sensor when the condition is that both sensors must have the same tech?
About Google and Sony: neither really has a better camera than the Galaxy overall.
Google is ahead with the main camera in some situations thanks to its extraordinary software. Sony is no good now; they were among the best in the old days, but they have faded away.
The Xperia 1 was bad. The Xperia 1 II is a great improvement over it, but not really competitive with current flagships from Huawei and Samsung.
About high-MP ultrawide cameras: these aren't real cameras, they're mobile cameras, and they have limitations. When you put in a big sensor it is hard to make it actually ultrawide. Huawei, for example, put in a high-resolution ultrawide, but that camera is far less wide than others'.
A regular ToF sensor isn't for focusing. A specially purposed ToF can assist focus, but they're not great.
Your AI and 3D theory is interesting, but I think the SoCs aren't powerful enough yet.
You're slightly wrong about the S7 sensor vs the S10 sensor. They are the same resolution, but they're not the same tech, so they will cost differently.
The S7 and S8 use an ISOCELL sensor, while the S9, S10 and later use an ISOCELL Plus sensor.
About sensor size difference with the same pixel size.
I used to think the same way too.
But Sony offers only a 1/2.0" sensor, while Samsung offers both 1/2.0" and 1/2.25".
Why? If the 1/2.25" were better, then why didn't Samsung use the 1/2.25" in the S20 Ultra instead of the 1/2.0"?
Anonymous, 26 Jul 2020No, it is not okay. You are wrong. If you are only interested in which camera captures more li... moreNow let's give some straight answer.
Was his comparison result (not the equation itself) wrong?
Can you give an example of a comparison of 2 sensors where his equation gives the opposite result? I mean, makes the worse sensor come out better?
Still, I would also like to see the true equation that you think is right.
Anonymous, 27 Jul 2020A (for example 48 megapixel) Quad Bayer sensor usually works by reading out 4 pixels (pixel bi... moreThose two comments were really appreciable answer, thanks a lot !
I didn't know that about the pixel size; I am used to the size of transistors in CPUs in nm, which isn't an accurate measurement but some sort of approximation, where some measure the gate itself, others from the center of one transistor to the center of the next, etc...
I mixed up noise and interference; it makes sense!
I guess a considerable part of the light is lost: if a "green-frequency" photon hits the red filter, it won't go through, so I guess it may be that a third of the light is lost, leading to inconsistencies and making noise.
I am thinking about combining both light field capture and full-color detection per pixel. I see various ideas; one using chromatic aberration would lead to large sensors, but it would get almost all photons to each pixel, not a quarter for R and B and half for Y/G...
That might significantly reduce the second type of noise, I guess.
And well, since larger pixels are often better, isn't there a diminishing-returns point at which binning becomes detrimental?
Before I initially posted about Quad Bayer, I looked up the difference between it and Tetracell, and I did see an image illustrating how the pixels are rearranged. Indeed, it doesn't seem that great, as some pixels are moved quite a few positions away; on top of the artifacts, it might make the details inaccurate and cause a big lack of sharpness, I guess.
Wow, this is a serious sensor then. I guess it is the ISOCELL GW2; Wikipedia lists it as 64MP and as the main camera of the S20, with regular RGB tech rather than Tetracell.
Since you know quite a lot about sensors, I've got a question: if the aperture is quite large, does the sensor size really matter?
I mean, if we have two 12MP sensors behind the same aperture size, but one is big and the other tiny, as long as all the light is focused on the sensor, isn't it technically true that both get the same amount of light per pixel anyway?
Or is there something else/more specific that makes bigger pixels better?
AnonD-909757, 26 Jul 2020I know, from what I've seen, the Bayer is the part over the pixels that separate light ra... moreA (for example 48 megapixel) Quad Bayer sensor usually works by reading out 4 pixels (pixel binning) together. This happens on a hardware level. This reduces the read noise compared to a traditional sensor with 48 megapixels, but only gives you a resolution that is comparable to a traditional 12 megapixel sensor. So the sensor can behave like a traditional 12 megapixel Bayer sensor. There's also a high resolution 48 megapixel mode, but these modes often don't work well due to the different color filter array: they rearrange the pixels artificially in order to get the traditional Bayer array ("remosaic"), so they can use the traditional Bayer algorithms. But this leads to artifacts. Therefore Quad Bayer sensors often capture much less detail in their high resolution mode than a traditional Bayer sensor with the same resolution.
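The hardware binning described above can be illustrated with a toy sketch (my own illustration with made-up pixel values, not any vendor's actual pipeline): each 2x2 same-color block of a Quad Bayer frame is averaged into one output pixel, turning a 48MP-style array into a 12MP-style Bayer frame.

```python
def bin_2x2(raw):
    """Average each 2x2 block of a 2D list of pixel values."""
    h, w = len(raw), len(raw[0])
    return [
        [
            (raw[y][x] + raw[y][x + 1] + raw[y + 1][x] + raw[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A 4x4 Quad Bayer tile: 2x2 same-color blocks of R, G, G, B values.
tile = [
    [10, 12, 20, 22],   # R R | G G
    [11, 13, 21, 23],   # R R | G G
    [30, 32, 40, 42],   # G G | B B
    [31, 33, 41, 43],   # G G | B B
]
print(bin_2x2(tile))  # [[11.5, 21.5], [31.5, 41.5]] - a 2x2 RGGB Bayer tile
```

The remosaic step for the full-resolution mode is far more involved (it has to interpolate a plausible RGGB pattern from the clustered layout), which is exactly where the artifacts mentioned above come from.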
Currently only the Galaxy S20 has a Bayer high resolution sensor (64 megapixel wide angle camera). It might be the first phone with a Bayer high resolution sensor since the Nokia Lumia 1020 (though the Xiaomi Note 10 can operate in a 27 megapixel Bayer mode).
AnonD-476622, 26 Jul 2020The 64mp zoom is very poor, i compared it to my P30 Pro, and image quality is a real let down.... moreI didn't read your thread, but the Ultra doesn't have a 64 megapixel camera. That's the S20. Furthermore the 64 megapixel camera is a 64.1 megapixel wide angle (not a tele) camera and has a field of view of approximately 27.6mm. This means that you get 6.9 effective megapixels at an 84mm field of view. So nobody should expect that this captures very distant objects as detailed as the 6.9 megapixel 135mm field of view of the P30 Pro (it's advertised as 8 megapixel 125mm, but in auto mode it is cropped for 135mm=27x5).
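The crop arithmetic above is easy to verify: cropping to a narrower field of view divides the resolution by the square of the equivalent-focal-length ratio. A minimal check (function name is my own):

```python
def cropped_megapixels(full_mp, native_eq_fl_mm, target_eq_fl_mm):
    """Effective megapixels left after cropping to a narrower field of view."""
    return full_mp / (target_eq_fl_mm / native_eq_fl_mm) ** 2

# 64.1 MP at a 27.6mm-equivalent view, cropped to an 84mm-equivalent view:
print(round(cropped_megapixels(64.1, 27.6, 84.0), 1))  # 6.9
```

So the quoted 6.9 effective megapixels at 84mm follows directly from the numbers in the comment.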
The 64mp zoom is very poor, i compared it to my P30 Pro, and image quality is a real let down. The main camera is good enough but the tele 3X module is disappointing in a big way. See comparisons here https://www.dpreview.com/forums/thread/4505497?page=4#forum-post-64175591
AnonD-909757, 26 Jul 2020This is a really informative and awesome answer, thanks !
I didn't wanted to go as far a... moreThe pixel size is actually called pixel pitch. So when you read about a sensor with 1.4 micrometer pixels, then 1.4 micrometer is the distance from the center of a pixel to the center of the next pixel in the row as far as I know. So it is usually correct to use (pixel size)² x resolution = sensor area.
There are basically two types of noise.
As far as I know, objects (or each mm² of an object) emit/reflect a slightly different number of photons every millisecond. This leads to slightly different intensities every millisecond, which is perceived as noise. So this noise actually belongs to reality; it is caused by the light (the number of emitted/reflected photons) itself.
But most cameras are also not able to measure the intensities 100% correctly. This is a problem in low-light conditions or dark shadows, and/or when the exposure is too low or the pixels are too small. In those cases only a small number of photons reaches a pixel, so a very small inaccuracy of the sensor/camera can have extreme consequences. This type of noise is caused by the sensor.
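The two noise types above combine in the standard photon-transfer SNR model (this is textbook math, not from the post; the 3-electron read noise is an arbitrary example value): shot noise grows as the square root of the photon count, the sensor adds a fixed read noise, and at low photon counts the sensor's contribution dominates.

```python
import math

def snr(photons, read_noise_e):
    """Signal-to-noise ratio of one pixel: shot noise sqrt(N) plus read noise."""
    return photons / math.sqrt(photons + read_noise_e ** 2)

for n in (10, 100, 10_000):
    # SNR improves roughly as sqrt(n) once shot noise dominates.
    print(n, round(snr(n, 3.0), 1))
```

This is why small pixels and short exposures hurt: with few photons per pixel, the read-noise term under the square root is comparatively huge.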
AnonD-754814, 26 Jul 2020Actually to do comparison he doesn't need to be precise. There is a reason why I said Oka... moreNo, it is not okay. You are wrong. If you are only interested in which camera captures more light per pixel (and both sensors have the same aspect ratio), then instead of sensor area/(resolution x f-number²) it is also correct to compare
sensor diagonal / (f-number x sqrt(resolution)). So sqrt(resolution) instead of (resolution) makes a difference! A camera with twice the resolution and sqrt(2) x the diagonal would lead to the same amount of light per time per pixel, whereas in his formula a camera with twice the resolution and sqrt(2) x the diagonal would lead to a smaller(!) amount of light per pixel. That's not correct at all! You can't simply alter the exponent of a factor; it leads to different results.
- Furthermore, as I said, the relationship between sensor type names and sensor diagonals is not completely linear, so this can also lead to wrong results.
- Yes, it is correct to compare (pixel size)² / (f-number)² = sensor area/(resolution x f-number²).
- No, you need the real focal length for comparing the amount of light from an object, as smartphone cameras use different sensor sizes. Instead of using (real focal length)² it is also correct to calculate the sensor area that an object covers. Comparing (equivalent focal length / f-number)² is wrong!
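The sqrt(resolution) point above can be checked numerically. This is a minimal sketch with made-up sensor numbers (an 8mm-diagonal f/2 12MP baseline, chosen arbitrarily), assuming both sensors share the same aspect ratio:

```python
import math

def light_per_pixel_score(diagonal_mm, f_number, resolution_mp):
    """Relative light per pixel: (diagonal / (f-number * sqrt(resolution)))^2."""
    return (diagonal_mm / (f_number * math.sqrt(resolution_mp))) ** 2

a = light_per_pixel_score(8.0, 2.0, 12)                  # baseline sensor
b = light_per_pixel_score(8.0 * math.sqrt(2), 2.0, 24)   # 2x res, sqrt(2)x diagonal
print(math.isclose(a, b))  # True: same light per pixel, as the comment says
```

The flawed diagonal/(f-number x resolution) comparison would instead rank sensor b lower, which is exactly the error being pointed out.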
AnonD-754814, 26 Jul 2020First of all, There is no better small sensor(bayer) if it use the same type of technology. I ... moreBut even with the same tech, wouldn't a cheaper sensor that has inferior tech (less light sensitivity, a poor Bayer filter, bad analog parts or a bad digital converter) be susceptible to a big quality gap despite being the same size and resolution?
Well, the Exynos thing is proof they try to make a profit out of the tiniest little penny they can, as it costs them less to make while they sell it at the same price as the Snapdragon variant.
Oh, ok, it makes sense. Also, I don't get why every company except Google and Sony, which have excellent photography capabilities on their smartphones, insists on putting low-resolution cameras on the (ultra)wide and telephoto. The (ultra)wide literally means fewer pixels per degree, so it should actually be the one that gets the high-MP cameras. And since hybrid and digital zoom are often preferred over optical zoom by most brands, the telephoto should also get more than 12MP, to output 12MP while keeping finer details.
But for some reason they keep insisting on putting in lower-resolution sensors?
It is like the ToF camera: they put in something like a 0.2MP sensor. Like, what? They should make a 12MP one to enable per-pixel focus correction in software. No wonder the software bokeh is crappy; there is so much they could do with depth sensors by feeding them to an AI trained for specific tasks, from building a pseudo-3D model for smart lighting correction, to fixing focus, etc...
I was rather thinking they would cost more because of higher resolution or a newer model (not newer technology); like, an ISOCELL 2L1 from the S7 era should cost way less than an ISOCELL 2L4 from the S10 era, despite having the same tech, size and resolution, I guess?
Wouldn't it be because of the size of the pixel separation?
With that many pixels on such a tiny surface, even an almost insignificant size difference might add up significantly, though I don't know the margins on sensors, so I am just guessing.
Like the difference between those two electron microscope shots of sensors:
The second picture looks like it has a seriously lower density because of the separation between each pixel.
Anonymous, 25 Jul 2020I just noticed that other users wrote that your calculation is correct. No, it's not.
Yo... moreActually, to do a comparison he doesn't need to be precise. There is a reason why I said "okay", not "correct". As he's using the same measurement in both cases, the comparison result comes out right. Also, you don't need the real focal length to compare, because the focal length for all small sensors is converted the same way.
But if one needs to measure the actual amount, then one needs all the real measurements.
One doesn't need to do that much math to calculate light per pixel.
The pixel sizes are already given: 0.8 and 1.0 micron. Why on earth would one need the entire sensor size to calculate captured light per pixel?
AnonD-909757, 26 Jul 2020Yup, I looked online anyway for the bayer, when I am not sure about something this is the bett... moreFirst of all, there is no better small sensor (Bayer) if it uses the same type of technology. I mean, with the same technology, bigger is always better as long as the lens can keep up.
And trust me, it's for saving money. Saving money, or in plain words being cheap, is what they've been doing since last year. There is a lot of proof. Want some?
1. The Exynos 9820/9825 had only 2 Mongoose M4 cores while they could have used 4 and made the performance better. Same goes for the GPU: they could have used 14-16 G76 cores in the Exynos 9820/9825. Those two choices could have made the SoC more efficient and more powerful at the same time, but it would have cost a few dollars more.
2. They fired the custom-core division too early to save money. That resulted in the half-baked Exynos 990 this year, which is even worse than last year's Exynos 9825 in many cases. There is a reason why the M5 custom cores in the 990 are so power hungry.
3. Using Samsung's own displays, others are delivering 10-bit color at 1440p@120Hz. It's probably because of the DSI they're using.
If I kept going, this could go on all day.
Now about the 16MP sensor: by "1/2.55" 12MP sensor" I was referring to a specific sensor. It is already being used as the ultrawide sensor in the S20s. Also, making a new sensor with existing tech is no big deal; if they want 16MP they can get it.
About the sensors' cost, you're slightly mistaken here. Sensors aren't priced by how big they are or how long ago a particular sensor size was announced; it depends on what type of technology they use. For example, a 12MP 1/2.55" ISOCELL Plus sensor will cost more than a 12MP ISOCELL sensor.
Also, the 13MP 1/3.4" sensor has been there since the S9+.
Also, I'm confused about one thing; if you know, you can tell me.
A 48MP QB/Tetracell sensor with the same 0.8 micron pixel size can be found in 2 different sizes:
* 1/2.0" and
* 1/2.25"
Why the overall size difference if it has the same pixel size?
Anonymous, 25 Jul 2020"6. At base Iso you would only need to compare the focal length (the real focal length) f... moreThis is a really informative and awesome answer, thanks !
I didn't want to go as far as really calculating the light per pixel, only giving a ratio, but I didn't know the sensor sizes were not linear; I thought they really were fractions of a 1" sensor...
And as you pointed out, I should have used only one dimension of the resolution, since I didn't use the area.
Also, I am not sure whether the inter-pixel separation should be taken into account if I were to calculate all that accurately...
What is read noise? Is it interference in the analog part of the camera module? Or does it come from the sensor itself?
AnonD-754814, 25 Jul 2020By that he might have meant regular bayer.
Bayer and Quad Bayer are two different thing.Yup, I looked online for the Bayer anyway; when I am not sure about something, that's the better thing to do, though I haven't looked into the math part.
Paradoxically, I love math but I don't do much with it apart from in programming, so...
I see what you mean; there are so many variables to take into account: the pixel size, the ratio of the external lens to the aperture, aperture to sensor size, the Bayer to Quad Bayer size difference and many ratios of all those; even the focal length changes things relative to sensor size and all the rest...
As for what Samsung did, I don't know; maybe they'll use a good sensor that gives good results despite being smaller and having a lower resolution. I don't think it is only to save money, as it is a premium/high-end device where people expect top performance for the high cost; that wouldn't make sense. Unless they see the iPhone 12 as such a big threat that they want to counter it already, by squeezing in as much cost saving as possible to make their phone more affordable, since the iPhone 12 is expected to be way more affordable than the iPhone 11 or the Galaxy S10/S20 and Note10/20. Apple and Samsung are in much more direct competition than any other Android brand is with Apple, so it could still be possible.
Anyway, as long as smartphone sensors aren't seriously bigger than they currently are, most of that isn't that important, since the software is literally what makes or breaks sensors...
But a ~1/2.5" sensor with a 12MP resolution would be quite good, and 16MP would be even better for a telephoto, as the extra resolution allows for better hybrid zoom without losing too many details in a 12MP picture.
Though there is a catch: I looked at the Samsung sensor list on Wikipedia, and there isn't any ~16MP sensor around the 1/2.5" mark that is recent and high-end enough to replace the 48MP; they are lacking in this segment.
Those that could do it are in the ~12MP range, and there is the 12.2MP ISOCELL 2L4, which is 1/2.55" and was the main sensor of the S10s.
The 12.2MP ISOCELL 2L3 and ISOCELL 2L2, also 1/2.55" but older (certainly meaning cheaper), were the main sensors of the S8, S9, Note 8 and Note 9, and are the other, slightly older alternatives that could work well as a telephoto.
Most are too old or too premium/expensive, like the ISOCELL 2LD.
Also, looking for 12MP at 1/3.4" (and not 1/3.6" as Anonymous JT5 pointed out) leaves only two sensors in the Wikipedia list: the ISOCELL 3M3, which is the telephoto of the Xperia 1 and 5, and the ISOCELL 3M5, which is the telephoto of the Xiaomi Mi 9 and the LG G9 ThinQ.
But since they are already a year old, either Samsung has a new one ready, or they really want to go cheap.
Anonymous, 25 Jul 2020Bayer and quad Bayer are different I know; from what I've seen, the Bayer filter is the part over the pixels that separates the light rays of each color between the pixels, basically made to only let R, G and B reach their respective pixels, with the pixels alternating in a repeating 2x2 pattern of RGGB (or RYYB).
While Quad Bayer is the name of a technique that groups 4 pixels of the same color together as a single one (which also uses algorithms to take advantage of the virtually larger pixels).
So basically Quad Bayer is the 2x2 pattern scaled up to 2x2 blocks: RRGG/RRGG/GGBB/GGBB (or with Y instead of G).
I guess the idea is to get the best of both worlds: larger binned pixels, while also getting finer/more precise details by splitting the pixels apart.
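The two layouts being contrasted above can be written out side by side; this is just my own toy printout of the filter patterns, nothing vendor-specific:

```python
# Regular Bayer: R, G, G, B alternate pixel by pixel in a 2x2 tile.
bayer = ["RG",
         "GB"]

# Quad Bayer: the same tile scaled up so each color fills a 2x2 block.
quad_bayer = ["RRGG",
              "RRGG",
              "GGBB",
              "GGBB"]

for row in bayer:
    print(row)
print()
for row in quad_bayer:
    print(row)
```

Seen this way, binning the 2x2 blocks of the second layout obviously recovers the first one, which is why the binned mode behaves like a regular Bayer sensor.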
The approach is interesting, though personally, if I were to make camera sensors, I would rather try a different approach to combine light field capture and per-pixel color detection.
I see multiple ways to do that; one would take advantage of chromatic aberration, but would result in a thick sensor.
Mobilemaster, 25 Jul 2020So the Note 20 will have a 3x hybrid zoom??? What a shitty move, Samsung! Other flagships have... moreI think you are overreacting.
First of all, the Note 20 will have a pure 64MP sensor with 2x optical zoom, 3x-6x hybrid zoom and up to 30x digital zoom.
The quality of the 3x zoom will not suffer because of the high resolution.
Other phone brands implement 3x optical zoom but with a lower resolution (8/12MP).
And note that there is not a single phone that can do a 30x hybrid zoom, so I don't know why you are telling us there's a bunch of phones that can do 30x hybrid.
(All of them are digital.)
The phone with the highest hybrid zoom tops out at 20x (the P40 Pro+).
Not getting better specs after half a year is the same as getting worse ones, but these aren't even the same.
I like the 12MP+64MP configuration, as it gives the best performance in low light as well as daylight; a non-quad, ordinary Bayer sensor is great for raw shooting and for 8K.
If I got a jack instead of the tele camera, I'd be all in, though.