STRYCHNINE LOOKS LIKE SUGAR

THE POTENTIAL OF PORN

 

In 2017, I worked with Yavuz Kurtulmus, Saif Rangwala, and Gregor Schmidinger, as well as a team of artists, to create the Porn Film Festival Vienna. This festival generated a significant amount of controversy, not only in the culture and art scenes but also within the political world of Vienna. The festival pushed a topic into the public discourse that everyone is familiar with but that remains strictly taboo. Over 4,000 guests per year prove that a vital interest in the topic exists in Austria. Significantly, this festival includes feminist productions, by and starring women, in which it’s of utmost importance that the production is pleasurable, honest, and consensual while respecting the agency of all participants. Our goal is to spotlight a positive image of female sexual desire and to promote and support feminist film production. It’s also important for us to highlight diversity in terms of bodies, age, and sexual orientation. Another essential aspect of the festival is creating a space where open discussion of the personal experience of sexuality, sexual identity, and pornography is encouraged. To come together to view, discuss, and reflect is at the heart of the festival. Diversity continues to be a central pillar, regardless of age, ethnic heritage, religion, body norms, sexual orientation, or gender.

A DOMINATING PERSPECTIVE

 

In our daily life, we experience a strange paradox. Nudity and sexualized images, especially of women, are everywhere. Still, it’s difficult for us to talk about sex and sexual desire. The nude human body remains provocative despite its significant exploitation in advertising, in film, and on the internet. The experience of mainstream pornography is often associated with a feeling of shame, and it’s seen as disreputable and dirty. Being embarrassed and ashamed is a no-go in our society. In porn, the boundaries of reality are blurred by the suggestion that mainstream productions are realistic, as if everything shown is natural and socially acceptable. However, the perspective is masculine, suggesting that women can be used, that their bodies and desire are available for use. It’s important to be very clear that mainstream porn is a reflection of our male-oriented society and all the problems this patriarchy entails, including sexism, racism, misogyny, inhuman behavior, and violence. This society is one that most women were socialized in, one silent about women’s desire, their bodies, and sex. More and more, men also want new depictions of sexuality and intimacy that portray a man as something other than a constantly horny, patriarchal, and possessive stud.

“With such downplaying, crimes against women and their sexual agency are seen as normal and without consequences.”

STRYCHNINE 01
“In an attempt to ruin my reputation in the society, if some extremist group makes a deepfake video of me forcefully trying to have sex with a woman and puts it up on the internet, you literally have no way of not believing that it’s me. And while there is nothing wrong with having sex, consent is the line between human behavior and bestiality. Suddenly, all my words and ideas would turn meaningless in your eyes. The only thing that may—just may—keep you from not believing your eyes, is your understanding of my work. However, that’s exactly the kind of world we are heading towards, where anyone can cook up any kind of video of someone to ruin their reputation. Keeping this in mind, we must proceed. We must raise our children with all the courage we can muster so that they can tackle the dark side of technology without committing suicide.” —Abhijit Naskar, The Gospel of Technology

THE DEEPEST OF FAKES

 

Revenge porn is a pornographic or suggestive video or picture of a person, usually a woman, that has been posted without that person’s permission as an act of revenge. These images or videos are spread via social networks, on porn sites, or on the dark web. Another tactic is intentionally sharing the content with people from the target’s work or private life. Either the victim is immediately aware of what has happened, or the act remains a secret and the victim learns of it only with difficulty. To obtain content, the perpetrator might resort to hacking, something that is more likely in the case of celebrities. However, another possibility is to create content using video-editing or deepfake technology. The first and most common use of deepfakes is face-swapping. In visual material, such as video or photos, the face of one person is replaced with the generated face of another, portraying the other person in a new situation. Media created in this way has enormous destructive potential, such as by creating fake pornographic content. Although deepfakes are a new phenomenon, they have sparked a wide-ranging debate regarding their use and their dangers for politics, the economy, and society. In politics and industry, there is a strong drive to develop ways of recognizing deepfakes, with the intent of limiting their use and punishing their illicit creation.

In the fall of 2017, an anonymous Reddit user released several porn videos under the pseudonym “deepfakes.” These included apparently pornographic videos of the Wonder Woman actress Gal Gadot, as well as of other actresses such as Emma Watson, Taylor Swift, and Scarlett Johansson. The content was quickly shown to be fake, created using artificial intelligence: the user had trained a neural net on footage of the actresses, then combined the output with porn films. By December of 2017, the story was taken up by the mainstream media. At the start of 2018, another Reddit user presented a program called FakeApp. This software enabled users with limited technical knowledge to swap the faces of actors and actresses in porn clips, and it made it increasingly difficult to tell whether content was real or fake. Using a subreddit created for the purpose, more than 50,000 users shared deepfake videos until the subreddit was taken down on February 7, 2018.

 

LAW FALLS SILENT ON THE FAKE

 

That the law has been of little use in the defense of potential victims has been clear for quite some time. For illustration, consider the following story. Romeo and Juliet are alive in 2020 and have been in a long-term relationship. Romeo is a fan of digital photography, and wherever he and Juliet go, through the streets of Verona or picnicking in the garden of his castle, he takes photos of Juliet with his smartphone. Juliet has consented to all these pictures, but she never allowed any photos that included nudity or were sexually explicit. After Romeo convinces himself that Juliet is having an affair with Mercutio, the two break up. To take revenge on Juliet, Romeo uploads the pictures he has taken of her to an app that replaces the face of a porn star with Juliet’s. He then shares the resulting deepfake video, along with all of Juliet’s personal information, on several social media sites. When Juliet goes to the authorities in the US to have the pornographic deepfake removed and Romeo arrested, she learns that what he has done isn’t illegal. He can be prosecuted only for sharing her data without her consent.

Victims of deepfake revenge porn also face a difficult path through many legal systems, for example in Austria. The primary problem is that the crime has yet to be defined in criminal law. It’s possible to prosecute based on libel or slander, but in that case, the victim has to appear in court as the plaintiff. Doing so requires seeking a civil injunction, which in turn means retaining a lawyer and covering legal costs that might never be recouped from the perpetrator. Successful prosecution assumes the perpetrator can be identified, which is often difficult or impossible in these cases. Time is not on the victim’s side: while the perpetrator remains anonymous and uncharged, it’s impossible to force this person to remove the content. Even if an indictment occurs, the revenge porn can remain available on the internet and might continue to be shared. For this reason, it often makes more sense to report the fake content to the platform or provider facilitating the sharing. Unfortunately for the victim, even this minor step can be difficult, for example when the provider is located in a country like Ukraine.

 

An exception is the US State of Virginia. There, lawmakers made it illegal to distribute pornography without the mutual consent of all parties, and they expanded the definition of content to encompass realistic fake videos and photos, including computer-generated deepfakes. Since 2014, it has been illegal in Virginia to share or distribute nude pictures or videos to coerce, harass, or intimidate. Lawmakers made clear—through a change in the law on July 1, 2019—that the prohibition included faked videos or pictures, including deepfake videos, altered images, and numerous other kinds of falsified media. Breaking this law is a class 1 misdemeanor, and conviction can lead to up to a year in prison and a fine of $2,500.

Kanye West
In case you somehow missed this drama, or need an update, the TL;DR version goes something like this: Kanye West released a song called “Famous” containing the lyrics, “I feel like me and Taylor might still have sex / Why? / I made that bitch famous.” He claimed to have called Taylor asking for permission to use the lyrics and said she approved it. Taylor released a statement saying that she had warned Kanye of the song’s “misogynistic message” and that he’d never asked permission to use the line, “I made that bitch famous.” Kim Kardashian then released a video of parts of their phone call in which Taylor seemingly gave permission for Kanye to use the “might still have sex” lyric, but not the reference to her as “that bitch.” Taylor then disappeared from public view for almost a year before returning with an album, Reputation, which appeared to address the feud in a couple of tracks. Around a month before Kim shared the audio clips from Kanye and Taylor’s phone call, Kanye released the “Famous” video. It featured nude waxworks of prominent celebrities in a bed, and on Kanye’s right was a waxwork depicting Taylor. Taylor never publicly commented on the video, but many of her fans expressed outrage that Kanye had used a naked mock-up of her without consent, labeling it “sexual harassment” and “revenge porn.”

“Nudity and sexualized images, especially of women, are everywhere. Still, it’s difficult for us to talk about sex and sexual desire.”

AN APP FOR THAT

DeepNude is an app that takes harmless photos of women and turns them into realistic nude photos. Its release provoked a storm of outrage, which led the developer to withdraw the program; a few days after the app was announced, the Linux and Windows versions were taken offline. The creator, who works under the pseudonym “Alberto,” claimed that he underestimated the interest in his program and that he now realizes the potential for abuse was too high. The servers dedicated to running the app crashed due to overuse. “Alberto” claimed the world was not yet mature enough for his app. Hopefully, it never will be. This app is a good example of rape culture, a term referring to a social environment that downplays the sexual violence of men against women through jokes and trivialized portrayals in film, music, and apps. With such downplaying, crimes against women and their sexual agency are seen as normal and without consequences.

OPEN QUESTIONS, OPEN WOUNDS

Should we create stricter laws to regulate fabricated data? Should we regulate the creation of this kind of data or just its use? Who owns the intellectual property this fabricated data represents? Does it belong to the algorithm or to the developer of the algorithm? These are challenging questions, and they still fail to encompass the social effects of AI and of the phenomena, such as deepfakes, that this technology facilitates. Technological progress can’t be stopped, and innovations in digital editing inevitably race ahead of the laws that regulate them. It’s impossible to have a dialog about sexuality and pornography without examining patriarchy and capitalism. We are sexual beings. The focus of politics will always be the body—see Foucault’s term biopolitics—and especially the bodies of women and those of fringe groups. Pornography, that remarkable and everyday art, reminds us that we are part of a larger power structure. Pornography is not to be blamed for this societal malaise, but it illustrates that inequality is deeply rooted in our civilization and continues to dominate our view of sexuality. To paraphrase Annie Sprinkle, the answer to bad porn isn’t no porn but better porn. Stay alert and reflective. Doing so is key to moving our society toward true equality. In this vein, don’t trust everything you see. Even strychnine looks like sugar.

“Media created in this way has enormous destructive potential, such as by creating fake pornographic content.”

STRYCHNINE 03

JASMIN HAGENDORFER is a contemporary artist, curator, and festival organizer who lives in Vienna. She’s the founder and creative director of the Porn Film Festival Vienna. In 2019, she became the artistic director of the Transition International Queer & Minorities Film Festival. Her work is a meeting place of themes originating in politics and art. She’s an expert in the field, and so we sought her analysis of the dark side of deepfakes—computer-generated revenge porn.

Follow Jasmin on Twitter: @jhagendorfer or @pffvienna

MEMO 01 - JULY 2020
Copyright 2020 TFLC
Ideas for change