Yes, San Francisco is a nexus of artificial intelligence innovation, but it’s also one of the queerest cities in America. The Mission District, where ChatGPT maker OpenAI is headquartered, butts up against the Castro, where crosswalks are painted with rainbows and older nude men can often be seen milling about.
And queer people are joining the AI revolution. “So many people in this field are gay men, which is something I think few people talk about,” says Spencer Kaplan, an anthropologist and PhD student at Yale who moved to San Francisco to study the developers building generative tools. Sam Altman, the CEO of OpenAI, is gay; he married his husband last year in a private beachfront ceremony. Beyond Altman, and beyond California, more members of the LGBTQ community are now involved with AI projects and connecting through groups like Queer in AI.
Queer in AI was founded in 2017 at a leading academic conference, and a core part of its mission is to support LGBTQ researchers and scientists who have historically been silenced, specifically transgender people, nonbinary people, and people of color. “Queer in AI, honestly, is the reason I didn’t drop out,” says Anaelia Ovalle, a PhD candidate at UCLA who researches algorithmic fairness.
But there is a gap between the queer people building artificial intelligence and the way their industry’s tools represent them. When I asked the leading AI image and video generators to envision queer people, they consistently responded with stereotypical depictions of LGBTQ culture.
Despite recent improvements in image quality, AI-generated images frequently present a simplistic, whitewashed version of queer life. When I used Midjourney, another AI tool, to create portraits of LGBTQ people, the results amplified commonly held stereotypes: lesbian women are shown with nose rings and stern expressions; gay men are all fashionable dressers with killer abs; and basic images of trans women are hypersexualized, with lingerie outfits and cleavage-focused camera angles.
How image generators depict humans reflects the data used to train the underlying machine learning algorithms. This data is mostly collected by scraping text and images from the web, where depictions of queer people may already reinforce stereotypical assumptions, like gay men appearing effeminate and lesbian women appearing butch. When using AI to produce images of other minority groups, users might encounter issues that expose similar biases.
AI-generated image made with the prompt: “A Front-Facing Photo of a Bisexual Person.” (Reece Rogers via Midjourney)
Author: Reece Rogers
Source: https://www.wired.com/story/artificial-intelligence-lgbtq-representation-openai-sora/