I cannot find the exact wording of the law. Governor Murphy said on the local news that other states had passed such laws, but that they had been struck down as violations of the First Amendment.
That is one reason I oppose such laws. Besides that, I do not see the need for them. If somebody uses a deep fake in committing a crime, they have already committed a crime. This means that they can already be arrested, charged, and convicted under existing laws.
I also see an upside to deep fakes. Yes, deep fakes can be used to claim that somebody posed nude or engaged in various sex acts when they did no such thing. I acknowledge that. However, that statement is far from the whole issue. First, now that basically anybody can make a very believable deep fake, the proper response is “That is not me. I never did that. It is a deep fake!” Second, now that anybody can make a very believable deep fake, anybody can very believably claim that any images, audio, or video of them are a deep fake, even when they are not. So, I definitely see an upside. If whatever security precautions you took with those files were not enough, just say it was a deep fake. If the person or persons you shared them with should leak them, just say it’s a deep fake.
I expect that many people will disagree with me on this.
Here is a link to Governor Murphy’s official site, and his speech about the law-
ETA
I found the text of the law. I now oppose it more than ever.
Until the audience has been educated to no longer believe their eyes, deepfakes are pure bad and nothing good.
Making the creation and/or dissemination of them a separate crime is good, because unlike your silly strawman, the primary use of deepfakes is not in furtherance of another crime, but in furtherance of destroying someone’s reputation, which might be a tort but is not a crime.
Of course in a post-truth post-law autocratic kleptocracy, the very idea that laws exist or are passed is simply quaint, not useful. Every law becomes a feel-good measure and nothing more.
First, people tend to believe what they see, especially if it looks plausible. A group did a survey of teens about deepfakes and two-thirds responded that they would assume a nude photo of a fellow student was legitimate, especially if they thought the victim was “the type” to send nude photos. Sure, you can claim it’s a fake, but you’re putting the burden on yourself to prove your innocence (as the phrase goes, not to cast a moral light on consensually sending nudes).
Second, people have been using “It’s photoshopped!” as an excuse since the 90s and it rarely gets more than an amused “Sure, Jan” from the public.
Third, creating a fake nude that’s going to fool your mom, doctor, and a forensic team isn’t really the goal. The teen girl mentioned in the governor’s press release was a victim of AI nudes in her school along with multiple other girls. The photos used were pictures from school dances and other events. Presumably the boys sending them to one another didn’t actually think these girls stripped under the balloon arch at the Spring Fling, but that didn’t stop them from making and distributing the pictures.
Fourth, which ties into the third, the harm is done regardless of whether you can convince people it’s a fake image or not. The goal is usually to objectify, embarrass, and/or exert control over the victim: “I can make realistic-looking photos of you doing whatever I want.”
This isn’t to debate the text of this specific law; it just sounds misguided to present the easy creation of AI-generated nude images of nonconsenting people as a silver lining.
What you describe is not an upside, it’s another downside.
And all the legalistic debate ignores what’s really wrong with deepfakes - the utter violation inherent in parading around pornographic images purporting to depict oneself.
That violation - usually of women - is why a law like this is a good thing. Fuck the Patriarchy!
Again, I strongly disagree. Obviously, I think Nazis are bad people. However, I strongly support their right to speak and assemble. Even Governor Murphy noted that other states had passed laws against deep fakes, but they had been ruled unconstitutional violations of the First Amendment.
I am partly here to debate the exact wording of this specific law. That is why I linked to it in the OP.
I also started this thread to debate whether laws like this are good in principle and whether they can be passed without violating the right to free speech. Obviously, so far everybody but me thinks it is a good idea.
I am against the Patriarchy just as I am against racism and Nazis. There are plenty of ways to attack the Patriarchy without violating the constitutional right to free speech.
The wording of the law makes it seem like generating AI fakes is only against NJ law if you’re doing so in order to commit a crime.
A natural person commits a crime of the third degree if, without license or privilege to do so, the person generates or creates, or causes to be generated or created, a work of deceptive audio or visual media with the intent that it be used as part of a plan or course of conduct to commit any crime or offense, including but not limited to:
So, if I want to make a thousand fake images/videos of my coworker and let them sit on my PC then there’s not a violation. If someone else is on my PC and sees them, the police aren’t going to do anything about it. If I start sending them to my other coworkers saying “Check out what a whore she is” then I may be committing a crime of harassment and generating/including those images is a knock-on violation to that offense carrying additional penalties.
This doesn’t seem like an egregious violation of the First Amendment. It also, of course, isn’t just about sexual content. I could get hit with additional penalties if I use AI deepfake stuff to commit fraud, in the course of committing theft, to create public hysteria, etc.
I agree with LSLGuy that most fakes aren’t made to conduct a crime. However, I think that “destroying someone’s reputation” could be a crime (harassment) depending on the state law.
I assume most pornographic AI fakes are made for private use of the creator or perhaps sharing in a way that may not rise to the level of a crime by itself. This law doesn’t seem like it would do much to combat that and others may feel that stricter laws prohibiting AI fake creation are warranted.
Maybe there could be a law that if you make a deepfake based on a real person, there has to be an obvious watermark with your real name and a statement that it’s a deepfake. If the pic had a watermark which said “Deepfake created by Joe Doe”, it would make it obvious that it was a deepfake and would not interfere with first amendment rights. It wouldn’t be a crime to create a deepfake. It would be a crime to not include a watermark on the deepfake.
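Just to show that the mechanical part of that would be trivial, here is a rough sketch (the name “Joe Doe” and the file names are placeholders, not anyone’s real workflow) of stamping that sort of notice onto an image with Python’s Pillow library:

```python
# Toy sketch: stamp an obvious "this is a deepfake" notice onto an image.
# The file names and "Joe Doe" are placeholders for illustration only.
from PIL import Image, ImageDraw, ImageFont

img = Image.open("fake.png").convert("RGB")
draw = ImageDraw.Draw(img)
notice = "DEEPFAKE - created by Joe Doe"
font = ImageFont.load_default()               # any large, legible font would do
draw.text((10, img.height - 20), notice, fill=(255, 0, 0), font=font)
img.save("fake_watermarked.png")
```

A real mandate would obviously have to spell out size, placement, and legibility, but the technical burden on the person making the image is close to zero.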
A couple of issues with that are (a) stuff like the “nudify” apps and services are hosted overseas so they don’t care about a US law saying you need a watermark and (b) it’s trivially easy to generate this content from home using even a half-decent computer. If I make an image and send it off into the wild, you might be able to trace it back to me but you probably wouldn’t be able to prove I generated it.
It’s not being violated worse than the actual human beings involved are. People trump paper, every time.
Plus the “right to free speech” has always had exceptions, including against obscenity, defamation and false statements of fact. And deepfakes are obscene, defamatory, lying speech. These are literally acts of sexual violence you’re defending here.
Obscenity is decided on a case-by-case basis. OTTOMH, I don’t know if anybody has been convicted of obscenity in the past few decades. Defamation, as pointed out above, is a civil matter, not a criminal one. “False statements of fact” are against the law in certain circumstances. This is not universal.
Obscenity is a legal term. There are plenty of things I am violently against that are not obscene under the law. Are you saying a deep fake depicting somebody nude is obscene but a genuine photograph of a nude person is not? Certainly there are differences. But the legal definition of obscenity is based on the image and what it depicts, not how it was made. You can say deep fakes are immoral, evil, and wrong. But obscenity has a specific legal meaning.
As for defamatory, we already have a civil remedy for that. Why do deep fakes merit a new criminal charge? I can now travel to Jersey and defame somebody all I want, as badly as I want, and only face civil consequences. Yet if I make a deep fake, I face criminal charges.
Are deep fakes lies? Of course they are. Is all lying against the law? Of course it is not.
I could not disagree more strongly. I have been in the mental health system for many decades. It is rather difficult to spend all that time around other patients and not meet many people whose mental problems were caused by sexual abuse. IMHO, comparing being raped by your father, pimped out by your uncle, or raped by one of the popular boys at school to having somebody start a rumor about you and back it up with a deep fake is both ludicrous and an insult to all the survivors of rape and other sexual abuse.
Perhaps they should legislate that software used to create these things should incorporate things like those hidden dots that printers add to each printout. It’s well known that you can hide codes in digital images that are easy to find with a computer but don’t show up to normal human observers, so it wouldn’t even affect the quality of your fake porn.
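To show how little the “hidden code” part asks of the software, here’s a toy sketch (my own illustration, not what any real app or forensic watermark actually does) of tucking an ID string into the least significant bits of an image’s pixels with Python’s Pillow library; the function names and the tag string are made up:

```python
# Toy example: hide a short ASCII tag in the lowest bit of each channel value.
# Invisible to the eye, trivial for software to read back. Real forensic
# watermarks are far more robust than this sketch.
from PIL import Image

def embed_id(in_path, out_path, tag):
    """Hide a short ASCII tag in the low bits of an RGB image."""
    img = Image.open(in_path).convert("RGB")
    bits = "".join(f"{b:08b}" for b in tag.encode("ascii")) + "00000000"  # null terminator
    flat = [c for px in img.getdata() for c in px]       # every channel value
    if len(bits) > len(flat):
        raise ValueError("image too small for this tag")
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & ~1) | int(bit)              # overwrite only the lowest bit
    out = Image.new("RGB", img.size)
    out.putdata([tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)])
    out.save(out_path, "PNG")                            # lossless format, so the bits survive

def read_id(path):
    """Read the hidden tag back out of the low bits."""
    flat = [c for px in Image.open(path).convert("RGB").getdata() for c in px]
    chars = []
    for i in range(0, len(flat) - 7, 8):
        byte = int("".join(str(c & 1) for c in flat[i:i + 8]), 2)
        if byte == 0:
            break
        chars.append(chr(byte))
    return "".join(chars)
```

Something like `embed_id("fake.png", "tagged.png", "made by Joe Doe")` followed by `read_id("tagged.png")` would recover the string while the two images look identical on screen. As a policy mechanism it’s weak, though: re-encoding to JPEG or resizing wipes it out, which is one reason the printer-dots analogy only goes so far.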
Why do you need to compare them? It’s not a zero-sum game where being trafficked for sex becomes any less an act of violence if you also say that grabbing someone’s ass or distributing fake nudes of them is an example of sexual violence. People who have had fake nudes distributed of them experience shame, mistrust, withdrawal, and suicidal thoughts. Children have had to switch schools as a result. Saying “It’s not like you were raped” seems like a terrible response.
RAINN, which is made up of numerous survivors of horrific sexual crimes, is strongly against deepfake nudes and has worked to try to get legislation against them so it seems like they’re not on the “That just cheapens real sexual violence” bandwagon.
Maybe but, again, it gets to be a global thing and there’s already a lot of open source software that can be used for this. Probably far too late to stick the genie back in the bottle when anyone can just download a copy of A1111, Flux, Comfy or other image generators.
But imagine they used your name while doing it. Imagine they gave reporters your name to use in newspaper articles; imagine them wearing a “Hi, my name is DocCathode” tag on the robe. Imagine if your friends, relatives, coworkers, and anyone else you interact with assumes that’s you.
It’s not about someone expressing themselves, it’s about someone pretending to be you expressing yourself.
If I got a billboard that said “I’m DocCathode and I rob banks, cheat on my taxes and drown puppies”, that would be slander (I hope), but should my freedom of expression protect me? It’s art!
Pornographic deepfakes of nonconsenting women meet my definition of an obscene case.
What does that matter? Defamation is nevertheless an existing exception to absolute free speech.
And this should be one of them.
It’s also a plain English term.
If the person is nonconsenting in both cases, they are both obscene.
I have exactly zero problem with nudity. I appreciate and celebrate willing nudity. That’s not the issue here.
Yes - in the case of deepfakes, it depicts violation of consent. That’s obscene to me.
So, violation of consent is not obscene to you?
So? It doesn’t go far enough in the case of deepfakes, clearly. So criminalize it too.
Did I say all lying was? No. But some lying is. Deepfakes should be in that set.
Nice try at playing a completely non sequitur mental health card; here’s my rape victim trump card:
I’m an actual survivor of multiple rapes and I’m telling you that no, it is not an insult and you do not speak for all survivors. Or any, for that matter, if you’re not one.
I see this analogy as so flawed as to be useless. Unless you can find cites for deep fakes of ordinary people being used for purposes other than porn? The general use of deep fakes is to make false images or videos of people nude and/or engaged in sex acts. That would be the rumor going around.
Again, I haven’t seen or heard of a case where anybody was ‘pretending to be’ anybody else. AIUI, in general, the misuse of a deep fake is: boy makes deep fake of girl. He passes it around and says the image is real, that she either gave it to him as part of a sexual relationship or that he got it from another boy who was in a sexual relationship with her. At no point does he actually claim to be her.
Sigh. Again, nobody is saying ‘I am DocCathode! And I do these things!’ They produce a deep fake of an act and say ‘That is real! That is DocCathode! And look at what he is doing!’
OTTOMH, that would be libel, not slander. Libel is written, like in a library. Slander is spoken. No, if you put up a billboard making claims you know to be false, you would be civilly liable. I could and would argue that strangers would believe those claims. Those who know me would already know that, regardless of my position on cheating on my taxes, robbing banks, and drowning puppies, I just don’t have the kind of money it takes to rent a billboard.
I don’t know why this statement is here.
On Preview
I see that MrDibble has responded. Besides the usual problem of people interrupting my web surfing just because I work in a call center and it is business hours (the nerve of them! I may complain to my supervisor), I am having problems with the SDMB freezing (I started a thread about it). I now need to type all my posts in Notepad and then copy and paste them. I am not ignoring you. I will post this, then compose and post a response when I am able.
More commonly, he makes no attempt to say it’s a real photo and just passes it around (or keeps it) so everyone can say “Hehehehe… that’s what Cindy looks like naked” and get their jollies seeing a depiction of Cindy naked regardless of its validity.
Again, most of these images are made from photos taken from social media accounts and the like. Then you run it through some app (or Telegram service, etc.) that uses AI to “remove” the clothing. It’s still obvious that it’s a photo taken at a dance, on a boat, at a party, or out with friends at a Starbucks. The goal isn’t to trick people into thinking Melissa got naked at a Chili’s; it’s just to say “Here’s what Melissa looks like if she were naked”.
Shame those X-ray glasses sold in the back of kids’ mags in the ’50s and ’60s didn’t work. We’d have dealt with this problem back then and it would not be novel now.