Well, a 3D ToF sensor is a 3D ToF sensor; I have no idea what can be innovated there.
Hardware Hypervisor, 20 Sep 2020: Apple ordered the ToF sensor and will claim it as their technology. And their fans will say A... more
Does this article have anything to do with Apple?
Please, just go away, sad, sad Apple haters like you. Huawei is using the same LiDAR sensor and charging the same price, yet you don't say anything about it.
Instead of depth and macro cameras, putting in a ToF is a better thing.
Hardware Hypervisor, 20 Sep 2020: Apple ordered the ToF sensor and will claim it as their technology. And their fans will say A... more
Just because you order a part from someone doesn't mean you didn't do anything to it or invent it. Just because you don't have your own plants doesn't mean you can't invent things and have them produced. Apple doesn't have their own manufacturing, but they can walk up to a company that's producing what they want and say "we're going to order 800 million units of display glass if you make them behave like X and have Y properties". And that will essentially be their design and their technology. Same with displays: Apple doesn't make them, but they can go to Samsung or LG and say they want displays of X resolution, Y type, with Z subpixel pattern, and they'll make them for them. It's just a matter of cost and negotiation really.
It's nothing new; other industries have operated this way for decades, if not centuries.
DroidBoye, 20 Sep 2020: If the inclusion of ToF for S21 lineup were true, it simply means that Samsung has no plans of... more
I don't think they will use ToF for the S21. ToF isn't good enough to do focusing.
As they're making the pixel size even smaller, I think it will be harder to improve the PDAF unless they use the area that's left for more focusing pixels.
Hardware Hypervisor, 20 Sep 2020: Apple ordered the ToF sensor and will claim it as their technology. And their fans will say A... more
It actually happened to me in the past: some fan saying Samsung made the display for the iPhone to Apple's design.
Though it's true that Apple has their own OLED design. LG's OLED is a waste. The OLED called LTPO OLED (in the Apple Watch) is actually Apple's design, but it's not good enough to use in a smartphone.
If the inclusion of ToF in the S21 lineup were true, it would simply mean that Samsung has no plans of improving the in-sensor AF hardware of their Quad (or higher) Bayer filters for this series.
Apple ordered the ToF sensor and will claim it as their technology.
And their fans will say Apple designed the sensor and Sony manufactures them.
When in reality Apple just orders them and did nothing else. Just like they did with OLED technology, RAM, "the toughest glass in any smartphone ever", etc. They did no R&D but just order, order, and order, and slap on exorbitant prices.
HuaW3I Suxs, 19 Sep 2020: In few words, MARKETING, and too powerful, to brain wash to their consumers, telling with fanc... more
TBH, "Retina" is so fancy.
Yeah, so in the past ToF was that much of a big deal, and now that they have a solution they consider putting it back? I smell a price increase for this...
Shui8, 19 Sep 2020: OMG people suddenly becomes confuse between LiDAR & ToF. They are general terms for som... more
Not exactly, but I agree, people are soooooo confused about those terms.
ToF = Time of Flight, which means anything that uses a (preferably) constant-speed emission of any kind to get distance from the travel time: from the sensor (emitter), to the object, and back to the sensor (receiver).
https://upload.wikimedia.org/wikipedia/commons/thumb/f/f1/20200501_Time_of_flight.svg/1200px-20200501_Time_of_flight.svg.png
It can be sound, light, microwaves (radar), magnetic fields, electrostatic fields or many other things.
https://httpsak-a.akamaihd.net/3816841626001/3816841626001_6175688663001_5837149644001-vs.jpg
Even neutrons :
https://www.eng.hokudai.ac.jp/labo/QBMA/Bragg-edge/image/method_image.jpg
A laser autofocus or any laser rangefinder is almost always ToF, as they measure the time from emission to reception to get the distance. It isn't necessarily pulsed: modulating the signal and measuring the phase shift is also a common technique.
https://www.sfxrescue.com/wp-content/uploads/2017/03/tera-ranger-comparison.png
https://terabee.b-cdn.net/wp-content/uploads/2020/01/The-sensor-measures-4-distances.png
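To make the two ToF variants above concrete, here is a minimal sketch of the distance math for both a pulsed rangefinder and a phase-shift (continuous-wave) one. The function names and example numbers are mine, not from any product mentioned here:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def pulse_distance(round_trip_s: float) -> float:
    """Pulsed ToF: the pulse travels out and back, so d = c * t / 2."""
    return C * round_trip_s / 2.0

def phase_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Continuous-wave ToF: the phase shift of a modulated beam
    encodes the round trip: d = c * phi / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# A pulse that returns after 10 ns has covered ~3 m of path, so ~1.5 m away.
print(f"{pulse_distance(10e-9):.3f} m")
```

The same d = c·t/2 relation underlies every light-based ToF device; only how the return time is recovered (direct timing vs. phase) differs.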
________________
LiDAR, which can mean "Laser Imaging, Detection, And Ranging" or "Light Detection And Ranging", refers to any method that uses emitted light to gather 3D depth data.
It can be a ToF technology using many points (yes, ToF can also produce a point cloud), where each point is measured by its return time, giving a depth map whose resolution equals the number of lasers and whose precision comes from the resolution of the sensor.
Note that technically this describes real-time LiDAR 3D depth scanning, which in this case is a depth sensor. A "non-real-time" one could use a single line of lasers that gathers depth data as the scanner moves (like aerial mapping), or a single point that spins around; those aren't "3D depth sensors", as extra steps beyond the sensor itself are needed to get the 3D depth data.
Here is an example of a non-depth-sensor LiDAR using ToF:
https://www.youtube.com/watch?v=1lDO1UevAJI
And here is a homemade one:
https://www.youtube.com/watch?v=GZvgMjkyJgI
There is also the "Flash LiDAR" method, which consists of emitting a large beam (either from an unfocused laser or from a simple LED) and gathering the return time per pixel (probably using phase detection, a little like PDAF), allowing the depth resolution to equal the sensor resolution (precision comes from how fast your components work without too much interference); the common point is that they use high-speed memory in the camera sensor.
https://www.researchgate.net/figure/LiDAR-vs-Flash-LiDAR-technologies-Courtesy-wwwfosternavnet_fig3_261333968
It should be the one used for computational-photography ToF/LiDAR, as its resolution can easily be scaled up; it can allow a depth-data resolution of 12 MP or higher, matching the pictures 1:1 for proper depth-related processing.
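The flash-LiDAR idea boils down to applying the same c·t/2 conversion independently at every pixel. A toy sketch (the array sizes and times are invented for illustration):

```python
C = 299_792_458.0  # speed of light, m/s

def flash_depth_map(return_times_s):
    """Flash LiDAR sketch: every pixel records its own round-trip time,
    so the depth map resolution equals the sensor resolution."""
    return [[C * t / 2.0 for t in row] for row in return_times_s]

# A toy 2x2 "sensor" of round-trip times (seconds) becomes per-pixel metres.
times = [[10e-9, 20e-9],
         [12e-9, 14e-9]]
depths = flash_depth_map(times)
```

A real sensor does this conversion in hardware, which is why the text above stresses high-speed in-sensor memory rather than computing power.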
It can also be structured light (what we have in mind when talking about point clouds), which can be dots, lines or other patterns, where the distance between the sensor and the objects is detected by how the projected light is distorted. It requires the IR camera (though it could be done in visible or ultraviolet light too) to work together with a computing unit that finds depth information by analyzing each dot.
And even that is a simplification; there are so many different implementations of LiDAR that it isn't surprising everyone is confused!
https://www.knowmade.com/wp-content/uploads/edd/2018/05/Main-IP-players-involved-in-promising-LIDAR-technologies.jpg
https://www.eetimes.com/wp-content/uploads/media-1304429-automotivelidarplayers-yole.jpg
________________
So there isn't any real difference between the terms ToF, LiDAR and Depth Sensor (not to be mixed up with "Depth Camera"); it is like the difference between Vehicles, Cars, Motorcycles and Machines. Those aren't even the only ones, and the real difference in how we gather depth data is more like this:
https://www.cnx-software.com/wp-content/uploads/2016/09/3D-Imaging-Technology-Stereo-Vision-Structured-Light-Time-of-flight.png
*Stereo vision (stereoscopic) uses two cameras but requires intensive computational power to derive depth data from them, and the depth precision isn't high, as it is difficult to make out distances; distance precision increases with the camera separation, and note that more than two cameras can be used.
It is usually perfect for when you directly need to use the images, like 3D videos/movies.
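The reason camera separation improves stereo precision can be seen in the textbook rectified-stereo relation; this sketch uses made-up numbers, not any phone's actual geometry:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Rectified two-camera stereo: Z = f * B / d.
    For a fixed depth Z, a wider baseline B produces a larger disparity d,
    so each pixel of disparity error costs less depth error."""
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 1000 px focal length, 10 cm baseline, 50 px disparity.
z = stereo_depth(1000.0, 0.1, 50.0)  # 2.0 m
```

Doubling the baseline doubles the disparity for the same 2 m object, which is exactly the precision gain the comment describes.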
*Structured light projects a special shape (structure) like lines, dots, circles or anything else, and computes the depth. It requires a little bit of computing power, but the precision is extremely high, as it is quite easy to tell how far away/how deformed each part of the structure is, per pixel.
*ToF requires almost no computing power; in fact, most of it can be done in near real time inside the sensor itself, depending on the implementation. Its depth precision, though vastly superior to stereo, is not as good as structured light, which is perfect for 3D face recognition (it is what the iPhone's Face ID uses). ToF is not exactly a middle ground, as it can be used for totally different things. Its precision limit comes from the incredibly short times involved when using light: a nanosecond of round trip already corresponds to about 15 cm of depth, so measuring centimetres needs timing precision of tens of picoseconds, meaning memory that works that fast and really precise synchronisation between emission and reception.
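To put a number on how tight that timing is, here is the arithmetic for how much round-trip time one step of depth resolution represents (function name is mine):

```python
C = 299_792_458.0  # speed of light, m/s

def timing_precision_needed(depth_step_m: float) -> float:
    """Round-trip time corresponding to one step of depth resolution:
    dt = 2 * dz / c -- the clock precision a ToF sensor must resolve."""
    return 2.0 * depth_step_m / C

# One centimetre of depth is only ~67 picoseconds of round-trip time.
dt_ps = timing_precision_needed(0.01) * 1e12
print(f"{dt_ps:.1f} ps")
```

This is why the limiting factor is the speed of the in-sensor electronics and the emitter/receiver synchronisation, not computing power.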
*Other methods of gathering depth information also exist, even with a single camera: a good AI can make out depth, though the quality of the algorithm is the limit, as it can easily misanalyse some objects and therefore mess up the depth; on pictures, the edges of haircuts often get this issue, making the software-assisted autofocus/background blur accidentally blur them.
https://www.youtube.com/watch?v=QSVrKK_uHoU
Active stereo, a combination of structured light and stereo vision, is also a thing: it projects a light pattern and uses two cameras to gather distances more efficiently. It increases the precision and slightly reduces the computing power needed (since you only look for dots, it doesn't need the ultra-complex object/pixel matching algorithm that the standard stereoscopic method uses, and with two cameras it is easier to spot the changes in the dots/lines).
https://community.arm.com/resized-image/__size/1040x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-20-66/Depth-Sensors.png
It can even be used with two patterns:
pesches132, 19 Sep 2020: Apple introduced it as "LiDAR" because they wanted to show that they "invented"... more
I think it is important to demonstrate any new feature that is introduced during keynotes. I'm not sure the LiDAR feature was demonstrated with a practical application rather than merely telling everyone about its capabilities. I didn't watch the iPad Pro keynote segment on LiDAR, but so far no one I know using an iPad Pro, nor users in various forums, has claimed to have used this feature. Whereas when ToF was first launched in the Honor 20 Pro (or 30 Pro), it was clearly demonstrated during the launch with a workout-session example.
Nice
Samsung should learn how to make a proper design this year and listen to their customers/fans. Developing a sensor that already exists is meh...
pesches132, 19 Sep 2020: Apple introduced it as "LiDAR" because they wanted to show that they "invented"... more
In a few words: MARKETING, and it's too powerful, brainwashing their consumers, telling them with fancy words that they have "innovative" technology that's "not equal to the competitors'", but no.
Apple introduced it as "LiDAR" because they wanted to show that they "invented" something new compared to other smartphone makers' ToF sensors, just like how they say "Retina display" to differentiate and make people think it's a "new" and "different" kind of display from the others. Simple as that.
Anonymous, 19 Sep 2020: The so-called "LiDAR" sensor is only available on the iPad Pro and I haven't co... more
Because currently there are no apps that use its features fully (at least that's what reviewers said at the time; there may be some by now).
Here is more info for you.
https://www.cnet.com/news/the-ipad-pro-can-scan-your-house-and-future-iphones-might-too/
OMG, people suddenly become confused between LiDAR & ToF.
They are general terms for certain kinds of detection & scanning technology; the names weren't invented by any smartphone company.
- https://youtu.be/FOxxqVzDaaA
In ToF detection the whole scene is captured with each laser pulse, as opposed to point-by-point with a scanned laser beam as in scanning LiDAR systems, which makes LiDAR more advanced in that regard.
This ISOCELL Vizion is indeed a ToF sensor from Samsung.
Anonymous, 19 Sep 2020: Bats and whales have radar system. Both uses ultrassound. Ships use too. Researchers tha... more
You are confusing radar with sonar. Sonar, which is used by ships, submarines, torpedoes, etc., relies on emitting and sensing sound; animals use a similar process. Radar uses microwave radiation to determine location, speed, acceleration, position, etc.
AnonD-754814, 19 Sep 2020: Retina display is a completely new word. So, no one cares if they're using that name. But... more
The so-called "LiDAR" sensor is only available on the iPad Pro, and I haven't come across any reviews of it functioning like LiDAR. I don't think anyone even knows what it does or what it's for.