Apple reported to be working on a laser-based 3D sensor for the 2019 iPhone


With the iPhone X in 2017, Apple already has the public on its feet. The hype created by Face ID and Animoji still lingers among users and will take its sweet time to wear off. But despite the impact already made, the tech giant is rumored to be working on even greater stuff for the 2019 iPhone.

According to a Bloomberg report, Apple is developing a rear-facing 3D sensor system based on lasers. The new system would beam lasers onto surrounding objects and measure how long the light takes to bounce back.

In doing so, an iPhone would be able to map its surroundings much more quickly and accurately than solely by using its rear-facing cameras.

As described, the system would work on the time-of-flight principle: by firing laser pulses out of the device and measuring the time each reflection takes to return, it builds a depth map of the scene.
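The arithmetic behind that principle is simple: light travels at a known speed, and the pulse covers the distance twice (out and back), so distance is half the round trip. A minimal sketch, with illustrative numbers that have nothing to do with Apple's actual hardware:

```python
# Time-of-flight depth: distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting surface, given the laser's round-trip time."""
    return C * round_trip_s / 2.0

# A pulse that returns after roughly 6.67 nanoseconds hit something about 1 m away.
print(tof_distance_m(6.67e-9))
```

The hard part in a real sensor is resolving such tiny time intervals per pixel, which is why this takes dedicated hardware rather than a formula.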

Similar depth information can already be obtained in other ways: the iPhone X derives it from its infrared-based TrueDepth system for Face ID and from its dual rear cameras, while Google's new Pixel phones use dual-pixel sensors for the same purpose.
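Dual-camera depth works differently from time-of-flight: the two lenses see the scene from slightly offset positions, and the shift (disparity) of an object between the two images is inversely proportional to its distance. A minimal sketch of that relation, with made-up camera parameters purely for illustration:

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from a rectified stereo pair: Z = f * B / d,
    where f is focal length (pixels), B the distance between the
    two lenses (meters), and d the pixel disparity between images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With a 1000-px focal length and 1 cm baseline, a 10-px shift means ~1 m away.
print(stereo_depth_m(1000.0, 0.01, 10.0))
```

Because disparity shrinks as distance grows, this method loses precision at range, which is one reason a time-of-flight sensor could produce more accurate maps.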

You might then wonder why Apple is working on something that is already out there on the market. Simply put, the proposed technique should give more precise and accurate results than what we have now.

The new sensor system will enable better depth detection for augmented reality apps, and a more accurate type of autofocus for photography, according to the source.

Bloomberg says that Apple would use different technology for the rear-camera 3D system than the technology used by the TrueDepth camera system.

The TrueDepth system projects about 30,000 infrared dots outward; their reflections are captured and analyzed, and the distortion of the dot pattern yields the depth map. This is how Face ID gets the 3D data it needs to check for a facial match and unlock the device. The new sensor planned for the 2019 iPhone's rear camera instead relies on a time-of-flight approach, measuring how long a laser pulse takes to hit an object and bounce back.

There is no word on whether Apple would abandon the dual-lens camera system if it adopted the infrared laser 3D sensor, which could create much more accurate depth maps than the disparity maps produced by the parallax of two lenses.

Bloomberg says that the tech could debut as soon as 2019, and the new iPhone models in question would also retain the iPhone X’s front-facing TrueDepth system.

It is unlikely Apple would abandon the dual-camera system, however, as it is used for more than just measuring depth: the second lens enables 2x optical zoom, and light data from both sensors is sometimes combined to produce better-quality single photos.
