Google apologizes for Gemini images that showed black people as Nazis

It’s been a couple of weeks since Gemini, Google’s AI formerly known as Bard, gained the ability to generate images.

Although the results are generally good, the chatbot has been criticized for how it depicts certain historical scenes.

As The Verge points out, Gemini has produced strange results, such as depicting a group of Nazi-era German soldiers as black people, among many other examples.

The American company was quick to apologize for what it describes as “inaccuracies in some historical image generation depictions” in Gemini, saying its attempt to produce a “wide range” of results has fallen short.

“We’re aware that Gemini is offering inaccuracies in some historical image generation depictions,” Google posted on X.

This could have remained a mere anecdote, a simple case of an AI model not being tuned correctly, but it has gone further.

The controversy has been amplified largely, though not exclusively, by far-right figures attacking a technology company they perceive as liberal.

Earlier this week, a former Google employee posted on X that it is “embarrassingly difficult to make Google Gemini recognize that white people exist,” sharing a series of prompts such as “generate an image of a Swedish woman” or “generate an image of an American woman.”

In those cases, the results consisted exclusively or overwhelmingly of AI-generated black people.

Google has not pointed to specific images it considers errors, but it is plausible that Gemini makes a general attempt to boost diversity because of a chronic lack of it in generative artificial intelligence.

After all, these image generators are trained to produce the best possible fit for a given prompt, which means they are often prone to amplifying stereotypes.

A Washington Post report last year found that prompts for “a productive person” produced images of almost exclusively white, male figures, while prompts for “a person at social services” overwhelmingly produced black people.

Some users who have criticized Google concede that it is a good thing to “portray diversity in certain cases,” but complain that in the 1940s German soldier example, “Gemini is not doing it in a nuanced way.”

Right now, Gemini is refusing certain image requests.
