
Deepfakes are fake, though

Published by marco on

Deepfakes are fake. It’s right in the name. So why are we getting our panties in a bunch about them?

The article There’s Probably Nothing We Can Do About This Awful Deepfake Porn Problem by Freddie deBoer (Substack) was surprisingly superficial. It deals only with the question of whether we should mount a “war on drugs”-style campaign against deepfakes—a hopeless and utterly ineffective crusade that causes misery for the innocent and pours money into the coffers of the usual suspects—or whether it’s completely hopeless because there’s nothing you can effectively do to censor without undoing the entire Internet. I’d hoped for more analysis of whether we even want to ban it, and why.

DeBoer lays out his basic tenet: there is no stopping anything on the internet, at least not in anything approaching a permanent way.

“The internet makes the transmission of information, no matter how ugly or shocking or secret, functionally impossible to stop. Digital infrastructure is spread out across the globe, including in regimes that do not play ball with American legal or corporate mandates, and there’s plenty of server racks out there in the world buzzing along that are inaccessible to even the most dedicated hall monitors”

He tells of a colleague who’d worked hard to eradicate pictures “shared for a prurient purpose.”

“He sometimes worked with a group that sought to address the phenomenon of “jailbait” content on the internet: technically legal images of underaged women that contain no nudity or explicit sexual acts but which are nonetheless clearly shared for a prurient purpose.”

Cast the net wide and you’re bound to catch something. Can you prove the prurient interest? Is it illegal? Can you prosecute? Do you even need to when you can just post someone’s face to all of their friends on Facebook with an allegation? For example, what if you get turned on by pictures of traffic cones? You’d be posting the hell out of these for your own prurient interest, but you can’t be prosecuted for it; I’m almost certain that no-one seriously believes that you can harm a traffic cone.

“Some of the more popular independent sites had been shuttered, often through applying pressure to web hosting companies. Google had made it much more difficult to search for such things by delisting certain terms.”

To me, this sounds like China’s censorship technology, no? Do you really think that Google uses its blocking technology only on “jailbait”? Of course not. There are certain topics you’ll never find on most search engines unless you really work at it. If this works for “legal-but-unsavory images”, then there’s nothing stopping someone from taking down your site of legal-but-unsavory writings.

Let’s look at some more citations from deBoer’s essay:

“Instagram has in fact had a problem with actual, honest-to-god illegal child pornography, in part because of this very difficulty in having too many holes in the dyke and not enough fingers. At precisely the point in our history that entities like Reddit or various web hosting companies were getting serious about the “jailbait” problem, social networks dedicated to images and video were attracting huge user bases and opening up all kinds of new opportunities for spreading it. The problem had not been solved; it had simply been distributed on a vast scale.
As this issue is specifically about images that are legal but indecent, there’s also the problem that indecency is a moving target and difficult to define through policy. How do you write a terms of service that fairly adjudicates what is an appropriately or inappropriately provocative image, and can you possibly adjust that definition depending on the age of the person in the picture?”
“The volume problem comes from another direction, too. My friend told me that what really caused him to despair was the sheer percentage of high school students who seemed to be taking nude or even sexual photos and videos of themselves and sharing them with someone else via their phones, photos and videos which very often end up being shared all over their schools.”

Young people don’t care about the things they’ve been told to care about. Well, they do, but their pea-sized brains are awash in hormones telling them to win at sex, to win at hierarchy. The combination of powerful hormones provided by millions of years of evolution with the heavily propagandized media-scape of the modern Internet yields incoherent, self-destructive, and, to an outside observer, nearly amoral behavior. They don’t do it because it’s right or wrong; they do it because they’ve been told that it’s personally beneficial and that something being personally beneficial is the pinnacle of human achievement.

“Does that mean you give up on, in particular, trying to shut down actual child pornography? No, of course not. Just like you don’t stop trying to arrest and prosecute murderers even though we know we’ll never fully eliminate murder. But… we know we’ll never fully eliminate murder, and it’s way, way harder to stop someone from looking at an AI fake porn video of an actress in a WhatsApp chat than it is to prosecute a murder.”

Just because something’s difficult to combat doesn’t necessarily mean we should give up. Steter Tropfen höhlt den Stein (constant dripping wears away the stone), but man, you’d better enjoy the trip because the destination is really far away. And the degree to which a societal goal can be achieved is also not distributed evenly—in modern western societies, which are growing more and more unequal, opportunity is distributed incredibly unevenly.

“[…] as a practical matter, justice has been to one degree or another unobtainable for any and all human beings for the entirety of human history. Life’s not fair. Yet there’s a lot of people in contemporary times who seem to have lost sight of the basic wisdom that we can always do more good, but aren’t entitled to a solution to any particular problem.”

No-one is entitled to a solution to a particular problem, but it can be particularly grating to watch some people get that problem solved immediately while others see the solution sit on the horizon for their entire lives, never moving any closer.

But to come back to the original topic: I wonder why people are so up-in-arms about deep-fake porn? I’ve heard people say that it’s because it’s not of real people, that people are masturbating to something that’s not real, so that’s not healthy. News flash: (nearly) everyone you’ve ever masturbated to is not real, in the sense that you have never seen them, you will never meet them, and they might as well not be real as far as you’re concerned. How would you know the difference?

Our society metes out punishment for being associated with porn (i.e., you won’t get certain jobs, you’ll be ostracized from certain circles, etc.). Deep-fake porn of real people who are most definitely not associated with pornography is a problem in that regard: people will end up being punished by society for something that they never did. Because of technology and the sheer distributive power of the Internet, people you do know will now be able to masturbate to those people, probably depicted doing things that they would never do and aren’t proud to be shown doing. No-one would really complain if there were a deep-fake video of them rescuing puppies from a burning building.

The problem kind of comes down to the level of shame that a given society associates with sex. That’s the only reason deep-fake porn has any power over us, right? If it were a video of you jogging somewhere, no-one would care. If it were a video of you boxing, no problem. Boxing toddlers and blasting them out of a ring? Nope. Hanging out on a dinner date? Holding hands on a nighttime stroll? No problem. Smooching? Borderline. Fucking? Absolutely not. You will be ostracized by everyone you know, even if they know the video is fake. That’s how people do.

Some food for thought:

Can I think about an illegal picture? Yes. Can I describe it to a friend? Yes. Can I publish that description online? Maybe. Can I draw it? Yes. Can I use Photoshop? Yes. Can I use an online LLM? No? Can I use a local one? Maybe?

This has already happened to a large degree, and almost no-one has noticed. Think about the tools that you use. There are so-called guardrails all over them, preventing you from even cursing in private conversations with other adults. Try swipe-typing the word “fuck” on an Apple device. It will never work. Try to get auto-correct to suggest the word “fuck” when you write “fuc”. It won’t do it. You can fool it by adding the rule “fuck” => “fuck” to your personal text replacements, but you have to do it for every single forbidden word…and it still doesn’t work reliably. This is just the tip of the iceberg of how the corporate nanny-state controls how you express yourself and, inevitably, how you think.

Where do we draw the line? Is distribution the problem? Is it monetization? Or are we prohibiting wrongthink?