The future Apple iPhone may add a time-of-flight camera – this is what it could do

We're a few months away from Apple announcing its iPhones for 2019, but rumors have already begun for next year's models, with the ever-reliable Apple analyst Ming-Chi Kuo claiming in his latest report that two of the 2020 iPhones will feature a rear-facing time-of-flight (ToF) 3D depth sensor for better augmented reality features and portrait photos, via MacRumors.


This isn't the first time we've heard that Apple is considering a ToF camera for its 2020 phones. Bloomberg reported a similar rumor back in January, and reports of a 3D camera system for the iPhone have been circulating since 2017. Other companies have also beaten Apple here, with a number of phones already on the market that have ToF cameras. But given the prevalence of Apple hardware and the impact it has on the industry, it's worth looking at what this camera technology is and how it works.

What is a ToF sensor and how does it work?

Time of flight is a blanket term for a type of technology that measures how long it takes for something (whether it's a laser, light, liquid, or gas particle) to travel a certain distance.

In the case of camera sensors specifically, an infrared laser array is used to send out a laser pulse that reflects off the objects in front of it and bounces back to the sensor. By measuring how long it takes for that laser to travel to the object and back, you can calculate how far it is from the sensor (since the speed of light in a given medium is constant). And by knowing how far away all the different objects in a room are, you can calculate a detailed 3D map of the room and all the objects in it.
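The distance math behind that explanation is simple: the pulse travels out and back, so the one-way distance is half the round trip. A minimal sketch in Python (the function name and sample timing are illustrative, not from any real ToF API):

```python
# Speed of light in a vacuum, in meters per second.
SPEED_OF_LIGHT = 299_792_458.0

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given a laser pulse's round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after roughly 6.67 nanoseconds corresponds
# to an object about one meter away.
print(distance_from_round_trip(6.67e-9))
```

The tiny time scales involved (nanoseconds per meter) are why ToF sensors need very fast, precise timing hardware; a full sensor simply repeats this calculation for every pixel to build the depth map.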

The technology is mostly used in cameras for things like drones and self-driving cars (to prevent them from bumping into things), but we have recently seen it appear on phones.

How is it different from Face ID?


Face ID (and other similar systems) uses an IR projector to flash a grid of thousands of dots, which the phone then photographs in 2D and uses to calculate a depth map.

Time-of-flight sensors work differently: by using the time-of-flight data to calculate how long the lasers take to reach an object, they capture real-time 3D depth data instead of a 2D map that is then converted to three dimensions.

This has several advantages: thanks to the laser-based system, it works at a greater range than Apple's grid-based system for Face ID, which only works about 10 to 20 centimeters from the phone. (If the subject is too far away, the dots of the grid are spaced too far apart to provide usable resolution.) In theory, it also provides more accurate data than IR grid systems. A good example is the LG G8, which uses a ToF sensor for its motion-sensing gestures. The ToF system makes it possible to track and distinguish each individual finger in 3D in real time to enable those gestures.

Why does Apple want it?

The rumors from both Kuo and Bloomberg say that Apple wants to add the ToF sensor to the rear camera on the 2020 iPhones, not to replace the existing IR system used for Face ID on the front (which the new iPhones are said to keep).

Apple is said to be focused on enabling new augmented reality experiences: a ToF sensor could enable room-scale tracking, letting a future iPhone scan a room, create an accurate 3D rendering of it, and use that for a much more immersive and accurate augmented reality implementation than current models allow.

As an added bonus, a ToF sensor would also allow for better depth maps for portrait photos (something Huawei is already doing with the P30 Pro) by capturing full 3D maps to better separate the subject from the background, as well as better portrait mode video.


Who else uses it?

Several phone makers already have ToF sensors in their devices. As noted earlier, LG uses one in the front camera on the G8 to enable motion gestures and better portrait photos. (It also uses the same IR laser system for vein mapping for the phone's unique "palm recognition" unlock feature.)

Huawei's P30 Pro also has one as part of the array of cameras on its back, where it's used for depth maps for portrait effects. That said, Huawei also claimed some AR ambitions for the sensor at the time of launch, noting that the P30 Pro can measure the height, depth, volume, and area of real-world objects with greater than 98.5 percent accuracy.

Sony – which supplies image sensors for a wide range of smartphones, including the iPhone – announced earlier this year that it was planning to ramp up production of 3D laser-based ToF chips this summer, which would be perfect timing for inclusion in a 2020 iPhone.
