Apple is working on bringing a 3D sensor system to the iPhone’s rear camera as early as 2019, Bloomberg reports. People familiar with the plan said the sensors would likely differ from those behind the iPhone X’s front-facing TrueDepth system, which powers Face ID and other features.
Instead of using the structured-light technique that projects thousands of laser dots onto the user’s face, the rear-facing sensor would use time-of-flight measurements: it would time how long laser light takes to bounce off objects and return, and use those measurements to build a 3D image of the environment. Sources said Apple has already reached out to prospective suppliers for the new sensors.
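The underlying arithmetic is straightforward: since the light travels to the object and back, the distance is half the round-trip time multiplied by the speed of light. The sketch below illustrates that calculation only; the function name and sample values are illustrative and have nothing to do with Apple’s actual sensor design.

```swift
import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Convert a measured round-trip time (in seconds) for a laser pulse
/// into a distance estimate (in meters). The pulse travels out to the
/// object and back, so the one-way distance is half the round trip.
func depthFromTimeOfFlight(roundTripSeconds: Double) -> Double {
    return speedOfLight * roundTripSeconds / 2.0
}

// Example: a pulse returning after ~6.67 nanoseconds corresponds to
// an object roughly one meter away.
let roundTrip = 6.67e-9
print(depthFromTimeOfFlight(roundTripSeconds: roundTrip)) // ≈ 1.0 m
```

Repeating that measurement across a grid of points yields the depth map that makes up the 3D image of the scene.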
In other 3D sensor speculation, comments from the president of Apple HomePod maker Inventec have led to speculation that a future version of the home speaker will include some form of facial and image recognition.
Nikkei Asian Review reports that President David Ho told reporters, “We see trends that engineers are designing smart speakers that will not only come with voice recognition but also incorporate features such as facial and image recognition. Such AI-related features are set to make people’s lives more convenient and to make the product easier to use.” While Ho didn’t specify which products he was talking about, the fact that Inventec makes Apple’s HomePod has set the rumor mill turning, with analysts saying the feature could make its way into the speaker as early as 2019.