Shereen Wu, a Taiwanese-American model, didn’t ask for money to participate in an October fashion show. Then 21, the Californian was in it for the exposure.
So Wu was stunned a few days later when she saw a video of the show posted on the fashion designer’s Instagram account. There was Wu, walking down a runway in a black Michael Costello dress. Except, it wasn’t Wu: Her face had been replaced with the face of a White woman she didn’t recognize.
“Am I supposed to know this model? Who is it?” Wu said she asked her mother, who had first alerted her to the video.
Wu’s story reflects the rapidly increasing use of artificial intelligence in the $2.5 trillion modeling industry, a change regarded by some as transformative. Brands including Levi’s, Louis Vuitton and Nike have already teamed up with AI modeling companies and say one benefit is the ability to showcase their products on a diverse group of models.
But in a field that’s traditionally idolized physical perfection, AI technology is creating new, and more threatening, realities.
Nearly three quarters of the fashion executives in a survey published by McKinsey in November named generative AI a priority for their companies in 2024, and more than a quarter said they already use it in creative design and development. The cost benefits are unmistakable: Where human models might start at $35 per hour and, at the top end, can command thousands for a single day, one agency is offering use of its AI models for $29 a month.
Some in the industry already see the use of AI-generated “people” as violating models’ name, image, and likeness rights. A September preliminary survey by the Model Alliance, a nonprofit advocacy group, found nearly 18% of the 106 responding models reported being asked to undergo a body scan for a 3D model of their body or face, without knowing how the scan would be used. Fewer human models also mean fewer jobs for stylists, makeup artists, and other industry-related professionals.
The emerging technology has compounded a “lack of transparency and accountability” that models have been battling for decades, said Sara Ziff, a model who founded the Model Alliance in 2012, and discussed the issues at a Federal Trade Commission roundtable on generative AI last fall. The group has been pushing New York lawmakers to enact a bill that gives models and other creatives baseline labor protections, including against exploitation through AI.
Without momentum on the legislative or legal front, the speed at which AI is expanding might be too much to overcome.
Ken Girardin, who studies organized labor for the Empire Center, a nonprofit think tank, compared the modeling profession to whale oil dealers in the mid-1800s. Those suppliers believed they were selling an irreplaceable energy source—then saw the discovery of petroleum abruptly collapse their industry.
“Ultimately, modeling may end up having been a short-term phenomenon,” said Girardin.
The model Shudu has become the face of ad campaigns for brands such as Karl Lagerfeld, BMW and Paco Rabanne, and has more than 241,000 Instagram followers. She’s also not real, but rather was digitally generated by former fashion photographer Cameron James Wilson in 2017, inspired by real models including Grace Jones and Alek Wek.
Shudu’s unexpected popularity, and the need for more diversity in modeling, inspired Wilson to launch their AI and 3D modeling company, The Diigitals, six years ago. The company also created Kami, the world’s first virtual influencer designed to have physical features associated with Down Syndrome, in collaboration with Down Syndrome International and creative agency Forsman & Bodenfors.
Wilson says their intent isn’t to replace human models. The Diigitals often pays real models to stand in for Shudu as “muses,” with Shudu’s face dropped in for the final image.
“I really don’t want to be seen as taking away anything,” said Wilson. “I feel like AI and 3D modeling has the potential to have this negative impact and it’s down to us to basically have the moral standing to make sure that doesn’t happen.”
Similarly, Michael Musandu said he helped establish Lalaland.ai, a company that creates AI models for fashion e-commerce brands, in part to build greater representation in fashion. His company pays people from different communities for their body data to create its AI models.
“As a person of color, I never got to see models that look like myself when I shopped online,” said Musandu, the company’s CEO.
He’s been creating AI models for four years. But the company—and the questions about how the modeling profession is changing—burst into the spotlight in March, when Levi’s announced its partnership with Lalaland.ai to “supplement human models” and “increase the number and diversity” of its models. Critics saw it as a cheap and insincere solution to the bigger challenge of diversifying the profession.
Amid the outrage, Levi’s pulled back, insisting in a statement the company didn’t “see this pilot as a means to advance diversity or as a substitute for the real action that must be taken to deliver on our diversity, equity and inclusion goals.”
Still, the incident set off alarm bells about AI’s potential impact, even beyond traditional diversity.
“If this process could be automated by technology, I’m afraid anyone who doesn’t fit standardized and antiquated height and measurement standards would be first to go in order to cut costs,” Jane Belfry, founder of BTWN, an agency that specializes in body diverse models, said in an email to Bloomberg Law.
Given that companies are under no legal obligation to disclose which of their images are created by AI, Belfry is also concerned about the impact on the consumer experience.
“Using computer-generated images to signal diversity and improve your optics is the exact opposite of any meaningful diversity and inclusion initiative,” she said. “Not only is it creating a bizarre user experience for consumers where you’re not seeing the actual garment on an actual person, but it is ridiculous to claim body diversity on an AI body.”
Musandu said that the use of AI models “accelerates the representation that we’ve all been missing within the fashion industry” but doesn’t replace it.
But the low cost of AI and 3D models makes it tough for real models to compete. Musandu’s company charges clients between 600 and 5,000 euros per month for AI models, depending on their needs. Deep Agency, an AI modeling agency currently in closed beta testing, offers models for $29 a month.
Even cheaper is ZMO.ai, a company that launched in 2020 and offers an online AI art generator. It lets subscribers create three models a month for free.
Beyond being completely replaced, some models are concerned that AI companies could be using their image and likeness without their knowledge.
Ziff, from the Model Alliance, said models typically hand over power-of-attorney to their agencies when they sign a representation agreement and rarely see their contracts with the brands.
The alliance’s September survey found that models who had been body scanned hadn’t been given information about how the scan would be used and were concerned about unknowingly handing away rights to their image, particularly given the rise in pornographic deepfakes.
There aren’t any image or likeness cases currently being litigated against AI companies over the use of body scans, but similar issues arise in copyright suits over the training data used by AI generators built by companies such as OpenAI.
“To create an AI model, presumably the tool that’s being used is using scraped data that has been used to train a large language model,” said Vivek Jayaram, intellectual property attorney and founder of Jayaram Law. “If you sort of transpose the theories in the copyright cases, somebody might say, ‘Hey, using my face to train a model that’s creating fake people, that’s a violation.’”
AI versions of celebrities such as Scarlett Johansson, Ryan Reynolds, and Tom Hanks have been used in ads without their permission.
Johansson took legal action against an AI art generator company she said used her image without permission.
Such actions, and a growing call for Congress or the US Copyright Office to enact new protections, might help models by creating a baseline from which they can argue for their own right of publicity, said Sarah Odenkirk, co-head of Cowan DeBaets Abrahams & Sheppard LLP’s art law practice group.
“We already are seeing some companies that are trying to figure out ways to package name, image, and likeness and help celebrities and other people who make money in that way to manage their images within this new landscape,” said Odenkirk.
Wu’s experience, after participating in the Art Hearts Fashion show in Los Angeles last fall, speaks to a different kind of appropriation. The contract she signed said she had agreed to “walk for exposure, images and any available sponsored goods,” but included no specific language about the images being altered through artificial intelligence. She sees it as the designer benefiting at the expense of someone trying to build her career.
“By not using my face, he’s taking advantage of models without the same influence,” she posted in a now-viral TikTok video.
Wu said she planned to talk to lawyers but has not yet taken steps to file an action. Costello, the designer, meanwhile threatened her with legal action after her TikTok post.
The experience left Wu “terrified” about the prospect of modeling again. “I don’t know if I want to go back,” she said.
A statement sent by Costello’s representatives said: “The allegation of digitally altering a model’s image to change her ethnicity and identity is a serious one, and we want to clarify unequivocally that neither Michael Costello nor our team was responsible for such alteration.”
The statement did not explain how the digitally altered photos might have ended up on Costello’s Instagram account. It also said the firm’s commitment to diversity, respect, and integrity “is at the heart of everything we do.”
Following a months-long strike that paralyzed Hollywood and sparked conversation about the impact of artificial intelligence, SAG-AFTRA, the union that represents actors, performers, broadcast journalists and thousands of other media professionals, won protections giving its members control over, or compensation for, the use of their likeness.
Despite being in an industry that’s often intertwined with such celebrities, models can’t unionize because, as independent contractors, they fall outside the National Labor Relations Act. This means that they aren’t granted “protections against discharge, termination of contracts, or other kinds of discipline for unionizing,” said Marion Crain, a labor and employment professor at Washington University in St. Louis’ School of Law.
One reason is that their work is typically short-lived.
“If you’re not looking at a long-term relationship between employer and employee, the current labor law doesn’t really do much for you,” said Girardin.
In that vein, Ziff said, the industry is “really like the wild west” for workers, which is why the alliance has sought legislative support.
The Model Alliance’s signature bill, the Fashion Workers Act, would establish basic labor protections for models and content creators in New York’s fashion industry. It passed the state Senate last spring but stalled in the Assembly.
In response to the growing threat of AI in the modeling industry, new provisions were added to the legislation that would require management companies and brands to obtain clear written consent to create or use a model’s digital replica, and to detail the scope, purpose, rate of pay, and duration of use. Separate written consent would be required to alter or manipulate a model’s digital replica using AI.
Additionally, the Model Alliance plans to develop policy recommendations through a research study in partnership with the Worker Institute at Cornell.
“People think the fashion industry is glamorous, so they assume that people in our industry don’t have serious concerns,” said Ziff. “But the fact is, this is a multi-trillion-dollar global industry that’s mostly built on the backs of women and girls. Workers in our community deserve basic rights and protections just like anyone else who works for a living—and this is the next big challenge for us.”