
Over the past two years, artificial intelligence (AI) has threatened to replace every facet of humanity it can. It has forced writers to change the way they write, teachers to change the way they teach, and eliminated position after position in the workforce. Yet AI has so far been unable to touch the performing arts: we still need singers, instrumentalists, cast, crew, and everything else that makes a performance a work of art.
A new AI “actress,” Tilly Norwood, represents the first serious challenge.
You’d be forgiven for thinking that AI has no chance of succeeding in this area: the first film written entirely by generative AI—“Post Truth,” released earlier this year—garnered plenty of press attention despite amounting to an awful lot of nothing. Review aggregators Metacritic and Rotten Tomatoes don’t list a single review for the film. An AI film “starring” Norwood, “AI Commissioner,” similarly fell flat. A review in the Guardian described Norwood’s performance as “someone whose perfect teeth keep blurring into a single white block in their mouth” being used to “deliver sloppily written, woodenly delivered dialogue.”
Other forays into AI-generated audio and performance art have similarly fallen flat. AI “rapper” FN Meka began using anti-Black language within two weeks of signing with Capitol Music Group and was promptly dropped by the label. The human behind the AI’s voice (it wasn’t just an AI after all) was never fairly compensated for his work on the endeavor.
Norwood’s “existence,” if one can even call it that, raises serious ethical concerns. AI talent studio Xicoia is considering signing the computer program, a process usually reserved for flesh-and-blood actors. It is telling that the industry’s first foray into “AI talent” is not some ordinary, kinda-okay-looking-if-you-squint actor. Instead, the first AI making the rounds with talent agencies presents as a girl in her late teens or early twenties, designed to draw eyes.
It should not surprise anyone that the first AI created for acting purposes is designed to be sexualized. Artificial intelligence programs are not designed to push back against their sexualization, as a human performer might do. Fiona Sturges of the Independent sums it up nicely: “Here is an actor who will not set unreasonable terms for her employment. She won’t insist on a script that passes the Bechdel test, or on financial parity with her male co-stars. There will be no need for insurance, or stunt safety, or intimacy coordinators.” I question Sturges’s use of the pronoun “she” for a computer program, but concede that people have been using these pronouns for other computer programs (e.g., Siri, Alexa).
This is the next logical move for AI businesses: use their infinite content-generation machines to sell sex, or things that look like it. OpenAI is actively loosening its ethical standards, enabling users to generate AI “erotica for verified adults” (read: pornography). Deepfake technology already exists and has been used to generate sexually explicit images of children. Grok has a “spicy mode,” whatever that means. Adults had already tried to propose to ChatGPT’s voice chatbots even before OpenAI loosened its restrictions on erotica.
If you’re an AI company looking to make a quick buck, exploiting human loneliness sounds like a great way to do it. You don’t even have to worry about regulation, because there isn’t any! (The Trump administration repealed Biden’s executive order on safe AI.)
Artificial intelligence is a tool: used properly, it can automate work that is tedious, or that humans simply don’t want to do. But it should not come at the cost of human interaction or involvement, and it should be built with safety and informed consent in mind. I want an AI to fold my laundry so I can pursue art, not the other way around.
I do not want Hollywood executives telling a computer program masquerading as a barely-adult “actress” how to behave. Humans make art; programs do not.
