You may have seen the headlines this week about the Samsung Galaxy S23 Ultra taking so-called “fake” photos of the moon. Since the Galaxy S20 Ultra, Samsung has offered a feature called Space Zoom, which combines the phone’s 10X optical zoom with heavy digital zoom to reach 100X. In marketing shots, Samsung showed its phone taking nearly crystal-clear photos of the moon, and users could do the same on a clear night.
But a Redditor has shown that Samsung’s incredible Space Zoom uses a bit of trickery. It turns out that when you take a picture of the moon, Samsung’s AI-powered Scene Optimizer does a lot of work to make the shot look like it was taken with a high-resolution telescope rather than a smartphone. So when someone photographs the moon, whether in the sky or on a computer screen as in the Reddit post, Samsung’s computational engine takes over and fills in the craters and contours that the camera missed.
In a follow-up post, the Redditor makes a convincing case that Samsung does indeed add “moon” imagery to photos to make the shot look clearer. As they explain, “The computer vision/AI module recognizes the moon, you take the picture, and at that point a neural network trained on countless images of the moon fills in the details that weren’t available optically.” It’s a little more “fake” than Samsung lets on, but it’s also to be expected.
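To put that in concrete terms, here is a rough Python sketch of the kind of pipeline being described: recognize the moon, then let a trained model fill in texture the optics never captured. To be clear, this is not Samsung’s actual code; the detection heuristic and the unsharp-mask stand-in for the neural network are placeholders invented purely for illustration.

```python
# Hypothetical sketch of the flow described above: detect the moon, then let a
# trained model supply texture the optics never captured. Not Samsung's code.
import numpy as np
from scipy.ndimage import gaussian_filter


def looks_like_the_moon(frame: np.ndarray) -> bool:
    # Placeholder heuristic: a bright, roughly round blob against a dark sky.
    bright_fraction = (frame > 0.8).mean()
    return 0.02 < bright_fraction < 0.5


def learned_detail_fill(frame: np.ndarray) -> np.ndarray:
    # Stand-in for a neural network trained on countless moon photos. A real
    # model would synthesize plausible craters; here we just exaggerate edges
    # with an unsharp mask so the example stays self-contained.
    blurred = gaussian_filter(frame, sigma=2.0)
    return np.clip(frame + 1.5 * (frame - blurred), 0.0, 1.0)


def scene_optimizer(frame: np.ndarray) -> np.ndarray:
    # If the scene is recognized as the moon, add detail the sensor never saw;
    # otherwise, pass the frame through untouched.
    return learned_detail_fill(frame) if looks_like_the_moon(frame) else frame
```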
Even without the investigative work, it should be pretty obvious that the S23 can’t naturally take clear photos of the moon. While Samsung claims that Space Zoom shots using the S23 Ultra are “capable of capturing images at an incredible distance of 330 feet,” the moon is, on average, about 239,000 miles away, or roughly 1.26 billion feet. It’s also about a quarter the diameter of the Earth. Smartphones have no problem taking clear photos of skyscrapers much farther than 330 feet away, after all.
Of course, the moon’s distance doesn’t tell the whole story. The moon is essentially a bright light source against a dark background, so the camera needs a little help capturing a clear image. Here’s how Samsung explains it: “When you take a picture of the moon, your Galaxy device’s camera system uses this deep learning-based AI technology, as well as multi-frame processing, to further enhance the details. Read on to learn more about the multiple steps, processes, and technologies that go into delivering high-quality images of the moon.”
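Setting the AI part aside for a moment, multi-frame processing in the generic sense means capturing a burst of frames and merging them so random sensor noise averages out. Here is a toy sketch of that idea; it is not Samsung’s pipeline, and it assumes the frames have already been aligned.

```python
# Toy illustration of multi-frame merging: average a burst of aligned frames so
# random sensor noise cancels out. Real pipelines add robust alignment and
# per-pixel weighting; this is just the core idea.
import numpy as np


def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    # With N aligned frames, averaging reduces random noise by roughly sqrt(N),
    # which is why a merged burst looks cleaner than any single exposure.
    stack = np.stack(frames, axis=0).astype(np.float32)
    return stack.mean(axis=0)
```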
It’s not that different from features like Portrait Mode, Portrait Lighting, Night Mode, Magic Eraser, or Face Blur. They all use computational processing to add, adjust, and alter things that aren’t really there. In the case of the moon, it’s easy for Samsung’s AI to make the phone look like it’s taking amazing photos, because Samsung’s AI knows exactly what the moon looks like. It’s the same reason why the sky sometimes looks too blue or the grass too green: the photo engine applies what it knows to what it sees, mimicking a high-end camera and compensating for normal smartphone shortcomings.
The difference here is this: while it’s common for photo-processing algorithms to segment an image into parts and apply different adjustments and exposure controls to each, Samsung is also using a limited form of AI image generation on the moon, blending in detail that was never in the camera data to begin with. You wouldn’t know it, though, because the moon’s details look the same every time we see it from Earth.
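For contrast, conventional scene-aware processing looks more like the sketch below: segment the frame, then nudge each region differently (a bluer sky, greener grass). What sets the moon case apart is the extra generative step that synthesizes texture the sensor never recorded. The segment labels and gain values here are invented for the example.

```python
# Illustrative per-region adjustment: apply a different gain to each recognized
# segment of the image. The labels and gains are made up for this example.
import numpy as np

ADJUSTMENTS = {
    "sky": 1.15,    # brighten/boost sky pixels a touch
    "grass": 1.10,  # give foliage a small lift
    "other": 1.00,  # leave everything else alone
}


def apply_per_region(image: np.ndarray, labels: np.ndarray) -> np.ndarray:
    # `labels` holds a segment name per pixel, as a scene-segmentation model
    # might produce. Each region gets its own adjustment, but no new detail is
    # invented, which is the key difference from generative moon enhancement.
    out = image.astype(np.float32).copy()
    for name, gain in ADJUSTMENTS.items():
        mask = labels == name
        out[mask] = np.clip(out[mask] * gain, 0.0, 1.0)
    return out
```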
What will Apple do?
Apple is rumored to be adding a periscope zoom lens to the iPhone 15 Ultra for the first time this year, and this controversy will no doubt weigh on how it trains its AI. But you can rest assured that the compute engine will do a lot of the work behind the scenes, as it does now.
That’s what makes smartphone cameras so great. Unlike point-and-shoot cameras, our smartphones have powerful brains that can help us take better photos and make bad photos look better. They can make night shots look like they were taken in good lighting and simulate the bokeh of a lens with a super-fast aperture.
And that’s what will allow Apple to achieve incredible results with 20X or 30X zoom from a 6X optical camera. Given that Apple has so far steered clear of astrophotography, I doubt it will go so far as to sample higher-resolution moon photos to help the iPhone 15 take clearer shots, but you can be sure that its Photonic Engine will be working hard behind the scenes to clean up edges, preserve detail, and push its telephoto capabilities. And based on what we’re getting from the iPhone 14 Pro, the results are sure to be spectacular.
Whether it’s Samsung or Apple, computational photography has enabled some of the biggest photography breakthroughs in recent years, and we’ve only scratched the surface of what it can do. None of it is strictly real. And if it were, we’d all be a lot less impressed with the photos we take with our smartphones.