Chanel’s second-ever female global CEO, Leena Nair, recently discovered a stark contrast between her commitment to gender diversity and AI’s portrayal of leadership in the luxury industry. During a visit to Microsoft’s Seattle headquarters, Nair and her team tried out OpenAI’s ChatGPT, but the results were disappointing: when prompted to generate an image of Chanel’s senior leadership team, the AI produced an all-male group dressed in business suits—hardly representative of Chanel, a brand where 76 per cent of employees are women, including its CEO.
In a conversation at Stanford’s Graduate School of Business, Nair described the AI’s response as surprisingly outdated. “We’re like, ‘Show us a picture of a senior leadership team from Chanel visiting Microsoft’—it is all men in suits,” she shared. “It was a 100 per cent male team, not even in fashionable clothes,” she added with humour. “Like, come on. This is what you’ve got to offer?”
Nair’s trip to Silicon Valley also included visits to Google and other tech firms, all part of Chanel’s strategy to integrate AI across its operations. In 2021, Chanel rolled out Lipscanner, an AI-powered app that lets users try on lipstick shades virtually. Even as the brand embraces AI, Nair acknowledges that challenges such as bias and accuracy are part of the process. With a clientele that is 96 per cent female, she sees removing biases that clash with the company’s identity as essential.
OpenAI responded to Fortune with a statement acknowledging that addressing AI bias is an ongoing task. “We are continuously iterating on our models to reduce bias and mitigate harmful outputs,” said a spokesperson. When Fortune repeated Nair’s prompt to ChatGPT, the resulting image included both men and women, a sign of improvement, though issues clearly persist.
AI biases against women in leadership aren’t new. A 2023 study by UCLA found that ChatGPT and Stanford’s Alpaca model tended to favour traditionally gendered language when describing men and women. For instance, the terms “expert” and “integrity” appeared frequently in descriptions of male candidates, while words like “beauty” and “delight” were more often used for female candidates. AI models also show an inclination to assume traditionally male-dominated roles are occupied by men, often using “he” and “him” pronouns automatically for positions like doctors or engineers.
Since joining Chanel in 2021 after a three-decade career at Unilever, Nair has increased female representation among Chanel managers from 38 per cent to nearly 60 per cent, in line with her long-standing commitment to workplace diversity. Her appointment marks a break from Chanel’s historically male-dominated leadership: only one other woman, Maureen Chiquet, has served as global CEO, from 2007 to 2016. Nair is also Chanel’s first Indian CEO, a notable milestone in an industry whose leadership has traditionally lacked diversity.
For Nair, addressing AI biases is crucial not only for her own company but also for the tech industry at large. “It’s so important that we keep the ethics and integrity of what we’re doing,” she said. “I constantly talk to my friends in tech, all the CEOs, saying, ‘Come on, guys, you gotta make sure that you’re integrating a humanistic way of thinking in AI.’”