We’ve spent the last few years watching manufacturers redesign their photo apps, adding new algorithms to improve the final quality of our photographs. In this regard, there is a somewhat controversial topic involving Samsung and its “moon mode”, a feature built into the best-known phones in its Galaxy lineup.
It all started on Reddit, where a user posted a thread arguing that the images of our satellite taken with the Korean firm’s phones are essentially fake. At first nobody gave the user an explanation, but Samsung has since spoken up, and now we know how the processing trick works.
Samsung steps in to explain the controversy
For its part, the Korean manufacturer wanted to give its version of events, showing in an official publication how its smartphones’ processing makes it possible to take clear images of the natural satellite. Its technical explanation goes as follows.
The Reddit user tried to replicate Samsung’s technology (without the same success), which takes a blurry image of the moon as a base and combines it with several techniques from an advanced processing model. As the brand confirms, this AI-assisted mode can be turned off.
Specifically, when we point a Samsung phone at the night sky, the camera recognizes the moon and merges up to 10 photos taken at 25x magnification or higher, so the “original photograph” captured with digital zoom comes out sharper. This technology is called Super Resolution, and it works together with another technique used by the brand’s phones.
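Samsung does not publish the merging algorithm itself, but the core idea of combining several frames can be illustrated with a minimal sketch (not the brand’s actual code): averaging a burst of noisy shots of an essentially static subject like the moon reduces noise, which is one of the ingredients of multi-frame super resolution. Everything in the snippet, from the function name to the simulated burst, is a hypothetical illustration.

```python
import numpy as np

def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Merge a burst of digitally zoomed frames by simple averaging.

    Averaging several noisy exposures of a (roughly) static subject such as
    the moon reduces sensor noise, one ingredient of multi-frame super
    resolution. Real pipelines also align the frames sub-pixel and sharpen
    the result; this sketch skips those steps.
    """
    stack = np.stack([f.astype(np.float32) for f in frames], axis=0)
    merged = stack.mean(axis=0)
    return np.clip(merged, 0, 255).astype(np.uint8)

# Example: ten simulated noisy shots of the same scene.
rng = np.random.default_rng(0)
scene = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
burst = [np.clip(scene + rng.normal(0, 20, scene.shape), 0, 255).astype(np.uint8)
         for _ in range(10)]
sharper = merge_burst(burst)
```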
The other technique refers to the artificial intelligence features that kick in when we turn on the “Scene Optimizer” mode. Between these two modes, a good-quality lunar photograph is obtained. These implementations are clearly necessary, because the phones’ optics are not able to capture the moon on their own.
This is how Samsung’s artificial intelligence learned to recognize the moon
In the same publication, Samsung gives its Korean users full details of how its AI model was trained, so we know how it learned to recognize the celestial body we see in the sky every night.
First, the model had to learn to distinguish the moon in its different phases, from a full moon to a waxing crescent. It was trained on a variety of photographs illustrating all of these cases.
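Samsung has not released the model or its dataset, so the following is only a rough sketch of what a moon-phase classifier of this kind could look like: a small convolutional network trained on labelled photographs. The architecture, the class names and the random stand-in batch are assumptions made purely for illustration.

```python
import torch
from torch import nn

# Hypothetical set of moon-phase classes (not Samsung's actual labels).
PHASES = ["new", "waxing_crescent", "first_quarter", "waxing_gibbous",
          "full", "waning_gibbous", "last_quarter", "waning_crescent"]

# A small convolutional classifier over 64x64 grayscale crops.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(PHASES)),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in batch: random tensors in place of the labelled moon photographs
# Samsung would have collected for each phase.
images = torch.rand(8, 1, 64, 64)
labels = torch.randint(0, len(PHASES), (8,))

# One training step: predict a phase, compare with the label, update weights.
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
```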
The next problem to solve was the moon’s brightness. Photographing it requires a lot of digital zoom, which costs us quality, and on top of that its intense glare also makes the shot difficult. The phone therefore lowers the screen brightness, allowing the user to see the moon properly.
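The exact exposure logic isn’t documented either. Purely as a rough illustration of the idea, metering on the brightest region of the frame (the moon) instead of the whole dark scene naturally leads to dimming the image; the function below is a hypothetical sketch, not Samsung’s implementation.

```python
import numpy as np

def moon_exposure_gain(frame: np.ndarray, target: float = 0.6) -> float:
    """Pick a display/exposure gain from the brightest region of the frame
    (the moon) rather than from the mostly black sky, so the disc is not
    blown out. Pixel values are normalized to the 0..1 range."""
    moon_level = float(np.percentile(frame, 99))  # ~ brightness of the moon itself
    if moon_level <= 0:
        return 1.0
    return min(1.0, target / moon_level)  # only ever dim, never brighten

# Example: a dark frame with an almost-clipping moon disc.
frame = np.zeros((120, 120))
frame[50:70, 50:70] = 0.98
dimmed_preview = np.clip(frame * moon_exposure_gain(frame), 0.0, 1.0)
```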
Samsung’s technology then sets the focus to infinity to keep the moon sharp. And it doesn’t stop there: when we apply 100x zoom, the user’s hand shake makes the shot extremely difficult, so Samsung applies its Zoom Lock mode, which keeps the moon centered and free of motion without having to use a tripod.
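Zoom Lock’s implementation isn’t public. Just to illustrate the general idea behind this kind of digital stabilization, the sketch below estimates the drift between two frames with FFT-based cross-correlation and rolls it back; it is an assumption-laden toy, not Samsung’s method.

```python
import numpy as np

def estimate_shift(ref: np.ndarray, frame: np.ndarray) -> tuple[int, int]:
    """Estimate the integer (dy, dx) translation of `frame` relative to `ref`
    using FFT-based cross-correlation, the core idea behind digital frame
    alignment."""
    corr = np.fft.ifft2(np.fft.fft2(frame) * np.conj(np.fft.fft2(ref)))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    h, w = ref.shape
    # Map large positive indices back to negative shifts.
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def stabilize(ref: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """Undo the measured drift so `frame` lines up with `ref` again."""
    dy, dx = estimate_shift(ref, frame)
    return np.roll(frame, shift=(-dy, -dx), axis=(0, 1))

# Synthetic example: a bright disc ("moon") drifts 5 px down and 3 px right.
yy, xx = np.mgrid[0:240, 0:240]
moon = ((yy - 120) ** 2 + (xx - 120) ** 2 < 30 ** 2).astype(np.float64)
drifted = np.roll(moon, shift=(5, 3), axis=(0, 1))
aligned = stabilize(moon, drifted)
assert estimate_shift(moon, drifted) == (5, 3)
```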
Once we press the shutter, our Galaxy phone determines whether or not to apply its detail enhancement engine; that is, not every moon photo passes through the “AI filter”. At the same time, Samsung rejects the accusations coming from Reddit: although the images are enhanced by different methods, they are real.
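Again as an illustration only, the sketch below runs a stand-in detail enhancer (a basic unsharp mask) only when the scene has been recognized as the moon with enough confidence and the mode is enabled; both the threshold and the enhancer are hypothetical, not Samsung’s engine.

```python
import numpy as np

# Hypothetical confidence threshold; Samsung does not publish its actual value.
MOON_CONFIDENCE_THRESHOLD = 0.8

def enhance_details(frame: np.ndarray) -> np.ndarray:
    """Stand-in for the detail enhancement engine: a simple unsharp mask."""
    blurred = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0) +
               np.roll(frame, 1, 1) + np.roll(frame, -1, 1)) / 4.0
    return np.clip(frame + 0.5 * (frame - blurred), 0.0, 1.0)

def process_shot(frame: np.ndarray, moon_confidence: float,
                 scene_optimizer_on: bool) -> np.ndarray:
    """Only run the enhancement step when the scene is recognized as the
    moon with enough confidence and the user has left the mode enabled."""
    if scene_optimizer_on and moon_confidence >= MOON_CONFIDENCE_THRESHOLD:
        return enhance_details(frame)
    return frame  # any other shot is saved as-is

# Example: a confident moon detection triggers the enhancement path.
photo = np.random.rand(64, 64)
result = process_shot(photo, moon_confidence=0.93, scene_optimizer_on=True)
```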
This is a clear example of how our phones’ cameras are improving step by step. They obviously have their limits, but software improvements alleviate some of those issues, letting us capture images we never imagined possible from the palm of our hand.
In Xataka Android | Every Samsung Galaxy that will get Samsung’s most professional photography app
Cover image | Pepu Ricca with Midjourney