Lately we can’t stop watching endless videos about apps that work wonders on our photos: some increase their resolution and detail, others recover faded color, and still others let us literally become someone else. However, these applications carry a hidden danger, and not only the privacy one. What is the dark side of AI in photos?
One of the best-known quirks of artificial intelligence is that it hallucinates data, and we don’t mean that as a joke. It comes from how these systems work: a model is trained so it can make predictions. With sound, for example, we can have a model listen to several complete songs and then ask it to finish incomplete ones. The result? New works, but they will rarely be convincing without massive training. The same goes for photos and videos: what the model outputs is based on a prediction, not on real information.
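A toy sketch makes the point concrete. The function below is purely illustrative (it is not how any real app works): it “restores” a missing pixel by predicting it from its neighbours. The prediction is statistically plausible, but it is a guess, not the lost original.

```python
# Illustrative sketch: filling in a missing pixel by prediction.
# The model has no access to the real lost value; it can only
# produce a plausible guess from the surrounding data.

def inpaint_missing(row, missing_index):
    """Replace a missing pixel (None) with the mean of its neighbours."""
    left = row[missing_index - 1]
    right = row[missing_index + 1]
    return round((left + right) / 2)

# Suppose the true scanline was [10, 10, 200, 10] and pixel 2 was lost.
damaged = [10, 10, None, 10]
predicted = inpaint_missing(damaged, 2)
print(predicted)  # prints 10: plausible, yet the real pixel was 200
```

The bright pixel is gone forever; the “reconstruction” quietly replaces it with what the surrounding data suggests, which is exactly the hallucination problem at miniature scale.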
Why is AI in photos and videos dangerous?
There is a subject nobody talks about, the preservation of our memories, whether personal or collective, and it is the big problem posed by AI manipulation of photos. Under that premise, historical information can be altered for political reasons: people, symbols, or even historical records can be removed at the source, which turns the much-desired digitization of media into a problem. A crime-scene video? Voilà, the political dissident now appears in the footage, ready to be judged.
In a world where information flies daily, and there is so much of it that it is impossible to check it all, fake news has more power than ever thanks to what artificial intelligence can now do with media: create seemingly irrefutable false evidence, forcing journalists into a titanic effort to report and debunk it. At the same time, we find people destroying the past by “upgrading” old images and low-resolution films. Our children may no longer get to appreciate these works as they were made, only the doppelgängers produced with artificial intelligence.
In fact, this writer knows of a family member who made the mistake of destroying old photos after giving them the AI enhancement treatment. The result? A set of family photos that resemble the people in them but are not them: portraits of people who never really existed. Now imagine the same thing applied to historical records.
The importance of monitoring
The problem with all of these AI photo apps is that they lack a monitoring component. Every learning and inference process needs an evaluator: something that tells the algorithm it has generated an erroneous result and rejects it as a basis for future conclusions. Many services give us no way to judge whether a reconstruction is correct, and the general feeling is that, however powerful and impressive the AI may be, it is only loosely faithful to reality: rather than reconstructing images, it partly invents them, half real and half made up.
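The missing evaluator step can be sketched in a few lines. This is a hedged, minimal example assuming a reference image exists to compare against; the metric (mean absolute error) and the threshold are illustrative choices, not anyone’s actual implementation.

```python
# Sketch of an evaluator: accept a reconstruction only when its
# pixel-wise error against a reference stays below a threshold.

def mean_abs_error(a, b):
    """Average absolute per-pixel difference between two images (flat lists)."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def evaluate(reconstruction, reference, threshold=5.0):
    """Return True if the reconstruction is close enough to the reference."""
    return mean_abs_error(reconstruction, reference) <= threshold

reference      = [10, 10, 200, 10]
reconstruction = [10, 10, 10, 10]   # a plausible but wrong fill
print(evaluate(reconstruction, reference))  # prints False: rejected
```

An evaluator like this is trivial when the original survives; the article’s point is precisely that for old photos no such reference exists, so the check can never run.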
This capability should even be built into hardware: a mechanism that, given two images, can guarantee whether one resembles the other. Yet such capabilities are completely absent from image-reconstruction applications, especially for old photos. After all, we have no original to compare against; the originals are lost in the mists of time, so whatever we obtain in AI-modified photos will never be a reconstruction of the real image.
In summary: be careful with AI and historical memories. Used at massive scale, it can end up creating false ones.