“The award-winning Pixel camera delivers the first AI-powered camera experience in history.” Those were the words of Shenaz Zack, executive director of product management at Google, showing off the photographic capabilities of the new Pixel 9. In addition to reviewing the elements that differentiate the models in the family, such as the Pixel 9 Pro XL’s telephoto lens, the Mountain View firm detailed the improvements made to both the famous Pixel Camera app and its Google Photos service, which acts as both cloud storage and gallery.
Since a picture is worth a thousand words, I took the Pixel 9 that I recently reviewed and set out to see whether the experience is as distinctive as Google would have us believe. These are the possibilities of the cameras of Google’s new phone, which, as we have already said, comes loaded with AI and many functions we would never have imagined. I’ll tell you about my experience, which was a lot of fun, to say the least.
The Magic Eraser is not new, but it is still very useful
At the ‘Made by Google’ 2024 keynote where we met the members of the Pixel 9 series, the search engine company once again brought out an old acquaintance: the Magic Eraser, which you can now use on any Android phone. It is not a new feature as such, although it remains one of the star features of Google Photos.
That hasn’t changed now that we are using it on one of the Pixel 9s: it still performs just as well. Still, in case you haven’t tried it yet, Google has showcased it for the umpteenth time. You may notice better cropping and AI blending compared to other phones, although it could also be a placebo effect.
Night Sight to solve the main problem of phone cameras
“We have also improved Night Sight for low-light photos,” announced Kenny Sulaimon, referring to an improvement to an already existing feature. Night Sight is the Pixel’s night mode, which debuted with the third generation and has now received some refinements.
It is difficult to draw firm conclusions here. I will say that, as we can see in the image below, the photos look more realistic than ever. Instead of the artificial lighting that used to accompany the results, the dark areas are now preserved as they are, with the rest of the scene properly balanced.
Comparing it to my little Pixel 6a, and setting aside the hardware difference, a yellow tone continues to dominate night photos on the Pixel, something that is also common in other manufacturers’ solutions. That said, the AI enhancement no longer seems as aggressive, and the level of detail is greater.
A panorama mode with HDR+, adapted for low light
This is one of the least used modes, and yet Google wanted to improve it. I’m talking about the mode known as “Panorama” on the Google Pixel. It now uses HDR+, the processing that has ruled ‘Powered by Google’ photography in recent generations. It also adds a redesigned interface for capturing a larger image than usual.
It won me over when I used it: this mode is fast, very intuitive, and the results are better than before. The wide dynamic range is noticeable, with plenty of detail in both dark and light areas, and well-calibrated colors. The stitching of the different shots I took is also commendable: the AI does a spectacular job, and it is impossible to tell where each seam is.
Magic is the key to Google photography: the Google Photos Magic Editor
We now turn to Google’s corner of magic: Google Photos. Among the new tools added to the service and the app, “Auto Frame” stands out, a function that can get a lot of use. With the introduction of AI, it seems that less and less skill is needed to take good shots.
This feature, as Google itself explains, tries out different compositions for our photos.
Not convinced by Google Photos’ suggestions? There is a button at the end of the carousel to restart the framing process. In the portrait below you can see the generative fill, which, all things considered, works wonders.
I haven’t forgotten another of the Photos “Magic Editor” functions: the ability to change parts of a photograph, such as the sky or simply the ground we walk on. By pressing the colored button at the bottom of the gallery app, replacing the sky is easy.
The truth is that this automatic editing largely lives up to expectations: even if it moves us away from the original, it can be used to improve photos that didn’t turn out the way we wanted. What is photography? The question makes more sense once we try these features for ourselves.
Note that it also detects bodies of water, whether rivers, lagoons or ponds; here I did not really like the results. It tries to stylize the water, its reflections and lighting, but the execution usually falls short (it depends a lot on the shot).
If you want to “reimagine” the background of a photograph, you have to circle it in the “Magic Editor”, although its effectiveness will vary depending on how well the area is recognized. For example, I was less precise with my finger in the photo below, and the result speaks for itself.
It is also worth mentioning that the generative AI doesn’t quite match the real photograph: you can see at a glance which elements have been artificially inserted.
Google Photos’ latest magic trick isn’t about improving our photos or giving them a new look; instead, the AI brings out its most creative side to completely transform what we get from the Pixel 9 cameras. This is how the “Stylized” mode works.
From a beach we travel to Mars, then swap the sand for water. These are the results of this creative function of the “Magic Editor”. Is this a photograph? Allow me to doubt it. In fact, it is much the same as generating an image with a tool like Midjourney or Image Creator; the only difference is that here we start from the photo taken by the Pixel 9 camera.
Can we say that Google has redefined mobile photography? Without a doubt, at least in the sense that there are many changes and new features focused on the photography side of its new phones, and they shift the paradigm compared to what we had a few years ago. As we have seen, AI in photography can be a good assistant for improving our skills and for correcting flaws, but above all it has had an impact on processing.
This is just the beginning, and I am left with only one question: which of these features are really useful and will remain so in a few years? Users will be the judge; after all, these functions will stop being a novelty once they have spread to other manufacturers and more affordable phones. Google can claim it was first: in fact, all this magic is nothing more than an evolution of what began almost a decade ago with the GCam and smart Photos features like the “Magic Eraser”.
In Xataka Android | This Pixel 9 feature went unnoticed but it puts an end to one of the most common mobile phone problems: we tested it
In Xataka Android | Understanding Google Pixel Updates: What Are Removed Features, What Android QPR Means, and How Often They Are Updated