Apparently, Meta’s image generator struggles with the concept of intermarriage.
In a report in The Verge today, Mia Sato writes of her experiences attempting to create an image of an East Asian man with a Caucasian woman. She tried “dozens of times” to create realistic images of a mixed-race couple using such prompts as “Asian man and white wife” and “Asian man and Caucasian woman on wedding day.”
Meta’s generator on Instagram repeatedly returned images of two Asian people. Only once did the bot produce an accurate image, and even that wasn’t great: “Asian woman with Caucasian husband” returned a “noticeably older man” paired with a young, light-skinned Asian woman.
This failure is the inverse of Google Gemini’s recent, embarrassing overcorrection: in February, just two months after its release, the chatbot was found to be delivering pandering images of female Catholic popes, Asian Nazi soldiers and Black U.S. founding fathers. After hundreds took to Twitter to post the absurd, historically inaccurate outputs, Google suspended Gemini’s ability to generate images of people entirely and issued a public apology.
Sato called her results “egregious,” adding, “Breaking type is easy in real life and impossible in Meta’s AI system. Once again, generative AI, rather than allowing the imagination to take flight, imprisons it within a formalization of society’s dumber impulses.”
Meta did not respond to VentureBeat’s request for comment.
Reinforcing stereotypes, biases around skin tone
Sato tried several text prompts and reported that tweaking them (using “white” versus “Caucasian,” for instance) didn’t help. Interestingly, the image generator also struggled to depict platonic mixed-race relationships.
When using the prompts “Asian man with Caucasian friend,” “Asian woman and white friend” or “Asian woman with Black friend,” the bot continued to offer up images of Asian pairs.
Meta’s bot also seemed to reinforce biases around skin tone. The prompt “Asian woman” brought back East Asian-looking faces with light complexions. Similarly, it reflected stereotypes, adding culturally-specific clothing even when not prompted to do so — such as a bindi (a decorative forehead mark worn by Hindu women) or a sari (an Indian dress).
Further, the bot appeared to lump culturally distinct areas of Asia together: in an image of a couple on their wedding day, the woman wore what appeared to be a mashup of a qipao (a Chinese dress) and a kimono (a Japanese garment).
“Multiculturalism is amazing,” Sato dryly writes.
The image generator likewise seemed to reinforce bias around age, repeatedly delivering images of older Asian men with startlingly young-looking Asian women.
Not reflective of real marriages or the diverse Asian population
AI is a reflection not only of the data it is trained on but also of the biases of its creators and trainers, a fact that Twitter users were quick to point out.
Mixed-race marriage is, of course, nothing new, even though in the U.S. it has only been legal nationwide since the landmark 1967 Loving v. Virginia Supreme Court ruling, a little over 50 years ago.
As of the latest statistics, 17% of marriages in the U.S. are between those of different races or ethnicities. Further, roughly three in ten Asian newlyweds (29%) have a spouse of a different race or ethnicity. (And, ironically or not, Meta’s founder Mark Zuckerberg is a white man married to an Asian woman, Priscilla Chan.)
Sato points out that in Western media, “Asian” is used as a blanket description of East Asians, even though the continent is massive and multicultural, with a population nearing 5 billion (roughly 60% of the people in the world).
Sato writes: “Perhaps it’s not surprising that Meta’s system assumes all ‘Asian’ people look the same, when, in fact, we’re a diverse collection of people who often have little in common besides ticking the same census box.”