LOS ANGELES — As artificial intelligence (AI) continues to make inroads into day-to-day applications, its use for creating avatars, especially those that could mimic specific individuals in a sexually explicit context, is getting a closer, more wary look.
Writing for The Guardian in an article titled “AI in the adult industry: porn may soon feature people who don’t exist,” author Katie Bishop argues that while the adult industry often drives technological innovation, research into deepfakes raises questions about the technology’s potential for misuse.
“These people don’t exist,” Bishop explained. “They are the product of an algorithm, a network of images competing against each other to create convincing fakes — and experts believe that they could soon replace pictures of real people in everything from the profiles that we match on dating apps to the bodies that we watch in porn.”
Bishop points to the work of former Uber engineer Phil Wang on ThisPersonDoesNotExist.com, which uses StyleGAN, a generative adversarial network (GAN) architecture, to create an endless array of faces.
“Our sensitivity to faces, when you really think about it, is a product of evolution for successful mating,” Wang notes. “What the site really demonstrates is that even for data distribution we are so well-crafted to understand as human beings, the machine can pick apart relevant features and recompose them in a way that’s coherent.”
While Wang’s application of the advanced tech seems benign, GANs are notorious for their use in creating deepfakes, in which the faces of individuals, from celebrities to private citizens, have been mapped, without their knowledge or consent, onto explicit, often pornographic videos.
GANs can also generate full bodies and not just faces, though their use for porn remains cost-prohibitive at scale. Many observers nonetheless see this as a future hot spot for the industry, especially for niche material; Bishop notes this technology “makes it easier to create extreme content that consumers seek, but that some performers might not be prepared to participate in.”
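The adversarial setup described above, in which a generator learns to mimic real data by competing against a discriminator, can be sketched in a deliberately tiny form. The following is a hypothetical one-dimensional toy, not the StyleGAN code behind Wang's site: the "real" data are numbers drawn near 4, the generator shifts random noise by a single learned parameter, and the discriminator is a simple logistic classifier.

```python
import numpy as np

# Toy GAN-style adversarial loop (hypothetical 1-D illustration).
# Real data ~ N(4, 0.5); generator g(z) = theta + z; D(x) = sigmoid(w*x + b).

rng = np.random.default_rng(0)

REAL_MEAN = 4.0
theta = 0.0            # generator parameter (starts far from the real mean)
w, b = 0.0, 0.0        # discriminator parameters

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

LR_D, LR_G, BATCH = 0.02, 0.05, 32

for step in range(1500):
    real = rng.normal(REAL_MEAN, 0.5, BATCH)
    fake = theta + rng.normal(0.0, 1.0, BATCH)

    # Discriminator gradient ascent on: log D(real) + log(1 - D(fake))
    d_real = sigmoid(w * real + b)
    d_fake = sigmoid(w * fake + b)
    w += LR_D * np.mean((1 - d_real) * real - d_fake * fake)
    b += LR_D * np.mean((1 - d_real) - d_fake)

    # Generator gradient ascent (non-saturating): log D(fake)
    fake = theta + rng.normal(0.0, 1.0, BATCH)
    d_fake = sigmoid(w * fake + b)
    theta += LR_G * np.mean((1 - d_fake) * w)

# As the two sides compete, the generator's output mean drifts toward
# the real mean, i.e. the fakes become harder to tell from the real data.
print(round(theta, 2))
```

The same two-player dynamic, scaled up to deep convolutional networks operating on images rather than single numbers, is what produces the convincing synthetic faces Bishop describes.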
However, according to adult industry insiders, this won’t be happening very soon for a variety of reasons, among them job preservation.
“Many of the characters in our experiences are computer-generated so [full-body generated deepfakes] are similar to what we do,” explains Ela Darling, an adult performer, technologist and VR evangelist.
“Some people are concerned that we’re going to reach a place where we don’t even need performers anymore,” she observes, “because you can create AI humanoids, and I think that’s something that could be damaging to performers in the industry.”
This caution over the fate of performers reflects a more nuanced discussion of the topic.
“As debate heightens on how adult content can warp our perception of consensual and enjoyable sex by showing scenes that objectify women and feature problematic sexual activity, and with concerns of exploitation in the industry growing, the idea of introducing lifelike images of people who can be bent to the viewer or producer’s will is somewhat worrying,” Bishop notes. “By allowing artificial intelligence into the equation we could be opening up our screens to ever more extreme content and perhaps [to] making real-life performers feel that they have to compete with their cyber counterparts.”
Darling, however, perceives a deeper message of empowerment, provided models’ rights are taken into account as this game-changing technology moves forward.
“The whole deepfake situation is a deeply unsettling concept because it’s mostly men using women to harm other women,” Darling declared. Her company follows strict guidelines to keep nonconsensual deepfake technology out of any of its products.
The deepfakes debate has largely focused on the “impact of the person being victimized,” Darling observes, “and we don’t consider the [broader] power structures being drawn on to create these experiences. As we move into the future and nascent technology becomes more widespread, we need to make sure that performers are stakeholders.”
Read Bishop's story for the Guardian here.