The porn of the future (and its ethical dilemmas)

I didn’t know we could generate pornographic content based on our desires.

Like everyone else, I was unsettled when fake erotic images of Taylor Swift circulated last January, but I thought it was a (sordid) insider affair. I didn’t know that anyone could now use artificial intelligence to bring their fantasies to life.

I learned this at the sixth edition of the Sexualities and Technologies conference, organized by the charity Les 3 sex*, and more precisely during a talk by Arnaud Anciaux, professor in the Department of Information and Communication at Université Laval.

The researcher, who studies among other things the economic strategies surrounding the pornographic industries, taught me that there are hundreds of platforms offering us the chance to create erotic images using artificial intelligence (and, in much rarer cases, videos a few seconds long). Two approaches dominate:

1) We choose from a set of options. How many people do we want to see? How old are they? What are their physical attributes? What would we like to see them do? And in what setting? Some sites offer more than forty such categories, each with dozens of possibilities.

2) We write a prompt. That is, we describe precisely what we want to see. (I could, for example, write: “Informed consent between three emotionally mature people who seek to give each other pleasure on the set of the soap opera The little life.” Note that I did not test it, for ethical reasons; perhaps the algorithms are not there yet…)

While videos are still in their infancy, the production of pornographic images using artificial intelligence has been thriving since 2017, Arnaud Anciaux explained to me. That was the year code facilitating the creation of deepfakes was made public. A new milestone was reached in the fall of 2022, when tools aimed at the general public began to multiply… No need to know how to code anymore: you could now visit a website, check boxes or write prompts to bring the deepfakes of your dreams into being.

PHOTO PROVIDED BY ARNAUD ANCIAUX

Arnaud Anciaux, professor in the Department of Information and Communication at Université Laval

Platforms have not hesitated to run marketing campaigns on social networks to promote their services. Some featured images of celebrities, such as the actress Emma Watson, without their consent… These stories caused a stir. As a result, several sites now block queries that contain the names of stars or words like “minor”. However, according to the researcher’s observations, users share tips on discussion forums for circumventing these “anti-celebrity” filters…

(I give you permission to imagine me letting out a very long sigh.)

Using the image of a non-consenting person in a pornographic context is unspeakably violent and vile. That much we understand.

But Arnaud Anciaux made me aware of another abuse tied to this practice: a deepfake that grafts one woman’s features onto a naked body that is not hers involves using the image of another woman, often a sex worker. That woman sees her body used to hurt someone else. Some actresses in the porn industry have described how violent the process was for them too.

Arnaud Anciaux speaks of issues of both “control and ethics”. Not to mention the question of remuneration: these platforms charge money, but who gets it?

To train algorithms to generate images, artificial intelligence must rely on databases. It needs a vast number of pornographic references in order to render the requested scenarios.

Some sites are more transparent than others about the origin of the content used and the people involved. These are often platforms created with the agreement of sex workers. Once we move away from these sites, however, things become much murkier.

The researcher spoke with platform managers who say they use databases built up over 5, 10 or 20 years of work in the pornographic industry… At the time, the actors and actresses had signed contracts stipulating that they approved “current and future uses” of their image. Fine, but can we honestly believe they imagined it would one day be used to create AI-generated content?

Other platforms instead use photos that are easily accessible online, for example on forums where Internet users post revealing images of themselves. We can safely assume these people never agreed to train any artificial intelligence!

The bodies created may be false, but they are based on very real flesh.

In January, Arnaud Anciaux attended trade shows in Las Vegas and Los Angeles. The member of the Interuniversity Research Centre on Communication, Information and Society observed that players in the pornographic industry are working on different tools. Rather than inviting consumers to enjoy images built on the work of people who never consented to them, they are developing digital clones.

Concretely? You can chat with the clone of an actor or actress. Artificial intelligence generates text messages, images and voice memos that let us maintain a relationship with this “being” who turns us on.

“In theory, it seems less questionable from an ethical and legal point of view, since we have the agreement of the people depicted,” says Arnaud Anciaux.

Except it’s not perfect either.

The researcher gives a concrete example: say the clone is asked for a photo of itself in a place the actor behind the clone has never visited. The artificial intelligence will then have to draw inspiration from images taken by people who have not necessarily consented. (Think of the fan of The little life who happily shares photos of the set on Reddit, and whose work gets diverted for erotic purposes because I wrote a weird prompt…)

It is difficult to fully escape the ethical questions raised by these databases.

The future of porn is complicated.

