This is how Portrait mode works on the iPhone SE, according to Halide's analysis

oriXone


The new iPhone SE is a mix of several devices, reusing parts of older models while adding more advanced components. Thanks to these improvements, above all the A13 Bionic processor, the iPhone SE is able to offer Portrait mode and many of the effects of the latest iPhone 11 models while using a camera module from the iPhone 8.

There has been a lot of speculation about this camera sensor and what it can do. Many of us assumed it would work just like the iPhone XR's sensor, since until now that was the only single-camera iPhone capable of using Portrait mode, but according to thorough research by the developers of Halide, it works differently on the iPhone SE.

In case you didn't know, Halide is one of the best camera apps available on the App Store, bringing advanced photography options to iPhones both new and old, so its developers have been familiar with this subject for a long time.

Portrait mode on the iPhone SE

As they explain in their in-depth analysis of the iPhone SE camera, the way Portrait mode is produced with a single lens is different on the iPhone SE than on the iPhone XR, and it all comes down to the A13 Bionic processor, which is capable of things we have never seen on an iPhone before.

"This iPhone goes to where no iPhone has ever been with "single Image Monocular Depth Estimation". That is, this is the first iPhone to produce an image effect using more than one 2d image."


Most of us would think this is exactly what the iPhone XR did, but it is not. Despite having a single lens, the iPhone XR does have hardware capable of capturing depth, thanks to the so-called "Focus Pixels". The iPhone SE's camera hardware does not support this technique, so Apple had to use another strategy to deliver Portrait mode.
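Apple's own depth pipeline is not something developers can see inside, but third-party camera apps do read whatever depth the system is willing to deliver through AVFoundation. Below is a minimal Swift sketch of requesting depth data alongside a photo capture; the capture-session setup is omitted, and whether depth is actually delivered depends on the device and configuration.

import AVFoundation

// Minimal sketch: receive depth data with a photo capture.
final class DepthCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let depth = photo.depthData else {
            print("No depth data delivered for this capture")
            return
        }
        // Convert to 32-bit disparity for easier inspection.
        let disparity = depth.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
        let map = disparity.depthDataMap
        print("Depth map size: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }
}

func capturePhotoWithDepth(using output: AVCapturePhotoOutput,
                           delegate: AVCapturePhotoCaptureDelegate) {
    // Depth delivery must be enabled on the output before configuring the settings.
    guard output.isDepthDataDeliverySupported else { return }
    output.isDepthDataDeliveryEnabled = true

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = true
    output.capturePhoto(with: settings, delegate: delegate)
}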

The new iPhone SE builds its depth map entirely through machine learning. In other words, having been trained on thousands of images, the system is able to estimate the depth a scene should have. This is easy to see when you take a picture of another photograph: the new iPhone SE infers depth and produces a convincing portrait effect, something that would not happen with an iPhone 11 or XR, since those measure depth with hardware and would see the scene as completely flat.
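Apple's Portrait mode network lives inside iOS and is not exposed to developers, but the general technique of single-image depth estimation can be reproduced with any monocular depth model converted to Core ML and run through the Vision framework. Here is a minimal Swift sketch, assuming a hypothetical compiled model class named MonoDepthModel that outputs its depth map as a pixel buffer.

import Vision
import CoreML
import UIKit

// Sketch of single-image (monocular) depth estimation with a Core ML model.
// MonoDepthModel is a placeholder for any third-party depth-estimation model
// bundled with the app; Apple's own Portrait mode network is not public.
func estimateDepth(from image: UIImage, completion: @escaping (CVPixelBuffer?) -> Void) {
    guard let cgImage = image.cgImage,
          let coreMLModel = try? MonoDepthModel(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // A depth model typically returns a single-channel map as a pixel buffer.
        let observation = request.results?.first as? VNPixelBufferObservation
        completion(observation?.pixelBuffer)
    }
    request.imageCropAndScaleOption = .scaleFill

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}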


It may interest you | First camera comparison: iPhone SE vs iPhone 11

However, this machine learning system is not perfect and there are situations where it does not get things right. Precisely for that reason, Apple limits Portrait mode on the iPhone SE to photos of people, while third-party apps such as Halide can also apply it to pets or objects.
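That people-only behavior is also visible to developers: the system only generates its portrait effects matte when it detects a person in the frame, which is why an app that wants blur on pets or objects, as Halide does, has to compute its own depth. A minimal Swift sketch of requesting and checking the matte follows; depth delivery is enabled as well, since matte delivery builds on it.

import AVFoundation

// Sketch: ask for the system-generated portrait effects matte, which iOS only
// produces when a person is detected in the frame.
func configureMatteDelivery(on output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    if output.isDepthDataDeliverySupported {
        output.isDepthDataDeliveryEnabled = true
    }
    if output.isPortraitEffectsMatteDeliverySupported {
        output.isPortraitEffectsMatteDeliveryEnabled = true
    }

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
    settings.isPortraitEffectsMatteDeliveryEnabled = output.isPortraitEffectsMatteDeliveryEnabled
    return settings
}

// Called from the AVCapturePhotoCaptureDelegate callback:
func inspectMatte(in photo: AVCapturePhoto) {
    if let matte = photo.portraitEffectsMatte {
        let buffer = matte.mattingImage
        print("Matte delivered: \(CVPixelBufferGetWidth(buffer)) x \(CVPixelBufferGetHeight(buffer))")
    } else {
        print("No portrait matte: most likely no person was detected")
    }
}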

The A13 Bionic processor, and especially its Neural Engine, is what makes Portrait mode possible on the iPhone SE, automatically lifting the photos taken by a three-year-old sensor to a level of quality no iPhone had achieved with such hardware before.
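For third-party machine learning work of this kind, the Neural Engine is reached through Core ML rather than addressed directly: an app leaves all compute units available and the system decides whether to run the model on the CPU, GPU or Neural Engine. A short sketch, reusing the hypothetical MonoDepthModel from above:

import CoreML

// Sketch: allow Core ML to schedule the model on the A13's Neural Engine.
// .all means CPU, GPU and Neural Engine are all permitted; the OS picks.
let configuration = MLModelConfiguration()
configuration.computeUnits = .all

// MonoDepthModel is the same hypothetical depth-estimation model as above.
let depthModel = try? MonoDepthModel(configuration: configuration)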
