Artificial intelligence is here to stay: whether we like it or not, it will be part of our lives for years to come. Today, thanks to AI, we can generate images, text, and even video with a few simple lines of text (prompts), but these systems are still far from being able to replace humans.
Although many users believe that AI is capable of reasoning and thinking for itself, this is not really the case, since its operation relies on the information it has been trained with. The more information an AI has been trained on, the greater its ability to propose answers based on the language model used, with ChatGPT being the most popular and best known of all.
If the information an artificial intelligence was trained with is limited, includes erroneous statements, or is restricted to a certain school of thought, the answers that AI offers will always be biased, failing to take into account all the information available on a given topic.
Copilot accused of being anti-Semitic
An AI that has been trained on thousands of photographs of white people, without considering other races, will not be able to correctly generate an image of an Asian person or a person of color when asked, because people of other races are missing from its training data, or the number of photographs of them is so low that the AI cannot interpret them correctly.
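As a deliberately simplified illustration of this point (our own sketch, not part of any test described in this article), the toy Python snippet below mimics a "model" that simply reproduces whatever dominates its training data; the labels and the 99-to-1 split are invented for the example:

```python
# Minimal sketch: how class imbalance in training data skews a model's output.
# The dataset and labels below are hypothetical, chosen only for illustration.
from collections import Counter
import random

random.seed(0)

# Hypothetical training set: 990 samples of one group, only 10 of another.
training_data = ["majority_group"] * 990 + ["minority_group"] * 10

def naive_sample(data, k=5):
    """Draw k random training examples and return the majority label,
    mimicking a model that reproduces whatever dominates its data."""
    draw = random.choices(data, k=k)
    return Counter(draw).most_common(1)[0][0]

# Query the "model" 1000 times; the underrepresented group almost never appears.
outputs = Counter(naive_sample(training_data) for _ in range(1000))
print(outputs)  # e.g. Counter({'majority_group': 1000})
```

With a 99-to-1 imbalance, the minority label virtually never wins the majority vote: the same failure mode, in miniature, as an image generator that has barely seen photographs of certain groups.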
The latest AI embroiled in controversy is Microsoft’s Copilot. Avram Piltch, editor-in-chief of Tom’s Hardware, tested different image-generating AIs with prompts about Jewish people, and Microsoft’s Copilot Designer received the worst rating because it still reproduces the classic stereotypes about Jewish people.
The prompts Avram used that produced very negative results were “Jewish boss” and “authoritarian Jewish boss”, which generated caricatures of Jews behind a table with the classic stereotypes associated with them. Other terms, such as “Jewish banker”, did not produce offensive images, although “Jewish moneylender” or “Jewish capitalist” did.
Fortunately, after this same editor contacted Microsoft a few months ago to report Copilot’s misbehavior when generating images of Jewish people, some of the terms the AI accepted back then are now blocked, including “Jewish pig”, “Hebrew pig”, and “Orthodox rat”.
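To make the mechanism concrete, here is a naive sketch of how a prompt blocklist of this kind might operate; this is an assumption on our part, since Microsoft has not published how Copilot’s filter actually works, and only the blocked terms cited above come from the article:

```python
# Naive illustration of prompt-term blocking (Copilot's real filter is not public).
BLOCKED_TERMS = {"jewish pig", "hebrew pig", "orthodox rat"}  # terms cited above

def is_blocked(prompt: str) -> bool:
    """Reject a prompt if it contains any blocked term (case-insensitive)."""
    lowered = prompt.lower()
    return any(term in lowered for term in BLOCKED_TERMS)

print(is_blocked("draw a Jewish pig"))    # True  -- request refused
print(is_blocked("draw a Jewish baker"))  # False -- request allowed
```

Simple substring matching like this is easy to bypass with synonyms or misspellings, which is presumably why such filters are updated reactively as new offensive prompts are reported, exactly the pattern described here.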
Avram says no other image-generating AI showed the stereotypes exhibited by Microsoft’s Copilot Designer. DALL-E, the OpenAI model that Microsoft uses in Copilot, oddly enough does not show Copilot’s ideological biases.
Not the only AI with problems
Weeks ago, Gemini, Google’s artificial intelligence, stirred up a striking controversy by generating images of non-Caucasian Nazi soldiers, a female pope, and Black American senators from the 1800s. A few days before that controversy, Meta’s AI showed it was unable to generate an image of an Asian woman with a Caucasian man, which happens to describe Mark Zuckerberg and his wife.
America’s founding fathers, the Vikings and the Pope according to Google AI: https://t.co/lw4aIKLwkp
Although it may seem otherwise, AI still has a long way to go, a path that will be paved as these kinds of errors, which stem largely from the content these models were trained on, are detected and corrected.