I was reading a rumor site and they predicted that Time-of-Flight cameras are coming to the next generation of the Apple iPhone. Cool, but what's a Time-of-Flight camera?
I saw that same discussion from analyst Ming-Chi Kuo (here’s CNBC’s article) and was also intrigued by the terminology. Is Time-of-Flight something to do with dropping your phone? Could it be related to pictures you take on airplanes? Does it measure how quickly you can walk up a flight of stairs? Nope, nope and nope. Turns out that it’s actually a really interesting depth-sensing system that’s used on robots and self-driving vehicles, and is just starting to show up on mobile devices. Time-of-Flight (or “ToF” as it’s called in the biz) is currently available on very few smartphones — the LG G8 ThinQ, the Honor View 20, the Huawei P30 Pro, and the Oppo RX17 Pro — but by the end of 2019 both Apple and Samsung are expected to support ToF cameras on their higher-end phones.
Time-of-Flight uses a tech called Lidar to measure the depth of various points in an image by illuminating the scene with infrared light for a fraction of a second. Think of how a bat’s sonar works and you’ll have a rough idea. Lidar, btw, stands for Light Detection and Ranging, and it’s “a remote sensing method that uses light in the form of a pulsed laser to measure ranges”. A pretty modest laser, so don’t worry about blinding your photographic subjects!
The tech itself is astonishingly precise because it works with light, so times are measured against the speed of light. Consider: if you have a subject that is 5 meters away, the delay between the light leaving the camera and bouncing back is roughly 33 nanoseconds. That’s 0.000000033 seconds. And this data is captured for each and every pixel in the image. Impressive!
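You can verify that 33-nanosecond figure yourself. Here’s a quick sketch of the arithmetic a ToF sensor is effectively doing (the function names are mine, just for illustration):

```python
# Round-trip timing for a ToF pulse: light travels to the subject and back,
# so the measured time covers twice the distance.
C = 299_792_458  # speed of light in m/s

def round_trip_time(distance_m):
    """Time for light to reach a subject and return, in seconds."""
    return 2 * distance_m / C

def depth_from_time(t_seconds):
    """Invert the measurement: depth = c * t / 2."""
    return C * t_seconds / 2

t = round_trip_time(5.0)      # subject 5 meters away
print(f"{t * 1e9:.1f} ns")    # prints 33.4 ns -- the figure above
```

The sensor runs this backwards, of course: it measures the delay and divides out the speed of light to get depth, pixel by pixel.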
Photographers obsess over something called “depth of field” and Time-of-Flight is really the digital equivalent in a lot of ways. With a ToF camera system, for example, you can have a lot of control over “bokeh”, the ability of a camera to blur the background while keeping the foreground image in tight focus (see cat picture, below). This is a common portrait photography trick and is already available in the most modern Android and iPhone devices through other methods. ToF offers a better solution to this sort of process, however, and will let you take even better portrait shots — and even adjust them after the fact, not just as you’re shooting.
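To see why per-pixel depth makes bokeh easy, here’s a toy sketch: blur only the pixels whose depth exceeds a threshold. Real phone pipelines are vastly more sophisticated than this, and all the numbers are made up, but it shows the principle:

```python
# Toy depth-based bokeh on one row of pixels: background pixels (deeper than
# a threshold) get a simple box blur; foreground pixels pass through sharp.

def box_blur_1d(row, radius=1):
    """Average each pixel with its neighbors (a crude blur)."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

def apply_bokeh(row, depths, threshold_m=2.0):
    """Blur only pixels whose ToF depth exceeds the threshold."""
    blurred = box_blur_1d(row)
    return [blurred[i] if depths[i] > threshold_m else row[i]
            for i in range(len(row))]

pixels = [10, 200, 10, 200, 10]     # one row of brightness values
depths = [1.0, 1.0, 5.0, 5.0, 5.0]  # meters, from the ToF sensor
print(apply_bokeh(pixels, depths))  # first two (foreground) stay sharp
```

Because the depth map tells you exactly which pixels belong to the subject, the blur can be re-applied at any strength later, which is why after-the-fact portrait adjustment works.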
Where ToF is really going to shine, however, is in other uses of depth in imagery. For example, imagine taking a photo of an object and having its dimensions accurately calculated, or facial recognition working when people are further away from the camera. And augmented reality (think Pokemon Go and Harry Potter: Wizards Unite) is going to be dramatically improved as Time-of-Flight camera tech makes it into AR code bases. You’ll have virtual objects appear in logical places with correct dimensions – and even lighting – that will make these games hugely more immersive.
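That “measure an object from a photo” trick is basic geometry once you have depth. Here’s a hedged sketch using a simple pinhole-camera model; the lens and sensor numbers below are invented for illustration, not the specs of any actual phone:

```python
# Pinhole-camera sketch: with depth from a ToF sensor, an object's real size
# follows from its size in pixels. real_size = depth * size_on_sensor / focal_length.

def real_size_m(size_px, depth_m, focal_length_mm, pixel_pitch_um):
    """Estimate an object's real-world size from its pixel extent and depth."""
    size_on_sensor_mm = size_px * pixel_pitch_um / 1000.0
    return depth_m * size_on_sensor_mm / focal_length_mm

# An object spanning 500 px, 2 m away, with a 4 mm lens and 1.4 micron pixels:
width = real_size_m(500, 2.0, 4.0, 1.4)
print(f"{width:.2f} m")  # prints 0.35 m
```

Without ToF, the depth term in that formula is the unknown that makes single-photo measurement so hard; with it, the whole calculation becomes one multiplication per object.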
This will also enable gesture recognition: if the front-facing camera on your phone is always on, you could wave your hand or wink at your phone to answer a call or delete an email. In fact, most of the big smartphone companies are talking about gesture recognition, though their plans seem a little half-baked at this point (not to mention that you only want it to react to your gestures, not someone else’s!)
So that’s why there’s a big hullabaloo over Time-of-Flight camera technology making it onto smartphones. We have bits of it now, but in the next year or two we’re going to see smarter image processing, better portraits, more sophisticated facial recognition and definitely improved augmented reality (AR). Will it actually show up in the next gen iPhone from Apple too? We’ll have to see. Meanwhile, let’s get 5G properly deployed first, eh?