Last year, Apple launched the first iPhone with LiDAR. This technology measures the distance between the sensor and objects using light, hence its name: light detection and ranging. It's a sensor the company has positioned in various ways, with a particular focus on improving photography and enabling new augmented reality experiences.
This year, we once again find LiDAR in the iPhone 13 Pro and iPhone 13 Pro Max, as expected. But its benefits were hardly mentioned, perhaps because they are already taken for granted. Even the silence around augmented reality in this year's presentation may raise suspicions among many users. The reality, though, is that the iPhone's LiDAR continues to offer great benefits.
An iPhone with LiDAR that understands the world around you
We have driven the industry for years with depth technology. Today, we are continuing that innovation with the LiDAR scanner. [LiDAR] measures the time it takes for light to reach an object and return.
We have adopted this technology for the iPhone, and with the machine learning and depth frameworks of iOS 14, the iPhone understands the world around you and creates an accurate depth map of the scene. It enables scanning of objects and rooms, photo and video effects, and precise placement of AR objects.
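The time-of-flight principle described in the quote above can be sketched in a few lines. This is a simplified illustration, not Apple's implementation: the sensor emits a light pulse and times its round trip, so the distance is the speed of light multiplied by that time, divided by two (the pulse travels to the object and back).

```python
# Time-of-flight distance estimation, as used by LiDAR scanners:
# the pulse travels out and back, so distance = (c * round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second, in a vacuum

def distance_from_round_trip(seconds: float) -> float:
    """Distance to an object given the round-trip time of a light pulse."""
    return SPEED_OF_LIGHT * seconds / 2.0

# A pulse returning after roughly 33.4 nanoseconds came from about 5 meters away.
print(round(distance_from_round_trip(33.356e-9), 2))  # → 5.0
```

These time scales explain why LiDAR needs dedicated hardware: resolving centimeters of depth means timing light pulses with sub-nanosecond precision.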
The LiDAR sensor included in the iPhone is similar to the one already found in the 2020 iPad Pro, which our colleague Julio César analyzed at the time. The scanner works together with the processor and its Neural Engine to offer more possibilities in augmented reality.
During its presentation, Apple highlighted several applications that use this technology. One in particular, JigSpace, lets you simulate what complex workspaces would look like. Instead of requiring weeks of study at a cost of tens of thousands of dollars, with this app and an iPhone with LiDAR a business can do it "in hours" at a fraction of the cost.
Beyond these professional applications, LiDAR offers a superior experience in educational apps, where it is widely used to explore how models of rivers, mountains, animals and more work. But what really catches the eye, and will appeal most to the average user, is applying the scene scanner to photography.
LiDAR in cameras to improve everyday photography
LiDAR allows iPhones and iPads to "see" in the dark, a perfect fit for low-light situations. In recent years, low-light photography has been one of the areas smartphone manufacturers have pushed hardest.
An example image shared by Apple: a portrait-mode photo taken in low light with the iPhone 12 Pro Max. Two years ago, Apple unveiled Night Mode on the iPhone 11 Pro. And since last year, it has aspired to go even further with the inclusion of the LiDAR scanner for photography. According to the company, LiDAR in iPhone cameras contributes the following:
- Increased precision when focusing a scene.
- Reduced time required to take a photo.
- Autofocus in low light up to six times faster.
- Portrait-mode photos in Night Mode.
iPhones with this sensor enhance the low-light photography experience, one of the battlegrounds of smartphone photography, where noise and slow capture are two major obstacles. With LiDAR on the iPhone, Apple tackles these problems head-on: photos are faster, with less "grain" and much sharper.
It is certainly an improvement whose clear beneficiary is low-light photography. We have been using it for a year now, and we can assure you the change is very noticeable.