Wednesday, November 30

AI seems to propagate gender and race stereotypes | Digital Trends Spanish

Experts have claimed that popular AI image generators such as Stable Diffusion are not adept at avoiding gender and cultural biases when using machine learning to create art.

Many text-to-art generators let you enter a phrase and produce an image on the other end. However, these generators often rely on stereotyped biases, which affect how machine learning models fabricate images. The resulting images tend to be Westernized or to favor certain genders or races, depending on the phrases used, Gizmodo noted.

What’s the difference between these two groups of people? Well, according to Stable Diffusion, the first group represents an “ambitious CEO” and the second a “supportive CEO.”
I made a simple tool to explore biases ingrained in this model:

— Dr. Sasha Luccioni 💻🌎✨ (@SashaMTL) October 31, 2022

Sasha Luccioni, an AI researcher at Hugging Face, created a tool that demonstrates AI bias in action in text-to-art generators. Using her Stable Diffusion Explorer as an example, entering the phrase “ambitious CEO” returned results showing only various types of men, while the phrase “supportive CEO” returned results showing both men and women.

Similarly, the DALL-E 2 generator, created by OpenAI, has shown male-centric biases for the term “builder” and female-centric biases for the term “flight attendant” in image results, despite the fact that there are female builders and male flight attendants.

While many AI image generators appear to simply take a few words, apply machine learning, and output an image, there is much more going on behind the scenes. Stable Diffusion, for example, is trained on the LAION image dataset, which hosts “billions of images, photos, and more pulled from the Internet, including art and image hosting sites,” Gizmodo noted.

Racial and cultural bias in online image searches has been a constant theme since long before the growing popularity of AI image generators. Luccioni told the publication that systems trained on datasets such as LAION are likely to home in on the 90 percent of images related to a prompt and use those for the image generator.
