This is true but a good part of that is technological. A decent video deepfake used to take hundreds of high-quality images to produce. It was a lot easier to get a thousand high-def images of a celebrity by taking frames from movies, videos, interviews, etc. than to get enough quality photos of the girl on the bus. Even still images were more difficult and, until fairly recently, it made more sense to just use Photoshop than to collect enough good images to make a model for image generation. And not many people have the time/patience/skill for Photoshop like that. Until recently, nearly 100% of deepfakes were of celebrities because you could usually only get enough material of celebrities to make a deepfake.
Technology is now at the point where you can make single images from a single face pic, or just take a photo and use AI to “remove” the clothing, replacing it with a nude body. It’s even getting better and better at video replacement off a single image. With the popularity of “Nudify” apps that remove clothing and the growing ability to create video with little source material, I’d expect those numbers to shift. I wouldn’t be surprised if they’ve already shifted significantly for still images.
The shift in technology is also why you see more press for laws now. “Common citizens” are more likely to be affected and lack the money and tools a celebrity might have to get media removed and pursue legal action against the creators.
As I have said in this thread and elsewhere, I was a bullied outcast throughout high school. I noticed who disliked or hated me (it was nearly every other student), and I noticed the intensity with which they hated or disliked me. The various methods they used to express those feelings all ran together into a blur.
I can understand the existence of an image being hurtful. That is what the person who created and distributed it intended it to do. But the quality of the image making a difference?
How often it is committed isn’t relevant to the effect that punishing it would have. It just means more perpetrators punished. A good thing.
How does it prevent anything if it’s not targeted at the perpetrators?
Do you think reducing the “market” is going to curb the hateful fucks who do this?
That’s not what your very last post indicated, obsessed as it still was with the degree of realism and the difference between drawing and photomanipulation.
What’s not workable about taking a victim statement in each case?
And..? The law is not a robot.
Sure. It can be “carefully worded” to include victim statements as evidence.
If the image obviously looks like a fake image, the victim doesn’t have to worry about convincing people that it’s a fake. But with deepfakes, the image looks like a real photograph. The victim has to convince people that even though the picture looks like a real photograph, it’s actually computer generated. Perhaps computer-savvy people will understand the difference, but not all of the victim’s family, friends, coworkers, etc. will. Granny is going to think that her granddaughter is doing nudes because she doesn’t understand how a computer can make a fake photo.
If you don’t view nudity as problematic, you may not view deepfake nudes as a problem. So instead of nudity, think of something which you would have a problem with. For instance, imagine someone posts pictures of you deliberately harming an animal on the SDMB. One picture is a crudely drawn image and the other picture is a deepfake that looks exactly like a real photograph. With the crudely drawn image, you don’t have to prove that it’s not real. Anyone looking at it knows it’s not real. But with the deepfake, you have to convince everyone that it’s a fake image. Not everyone will believe you when you say it’s a deepfake. Some people will think you’re lying. They’ll believe the deepfake more than your words and they’ll think it’s a real photo of you harming animals.
It was (almost) literally impossible to make deepfake videos without a large number of high-quality images from 2017 to c. 2022 or 2023. That’s a six-year head start. There’s also the basic notion that fakes of celebrities are more likely to be posted/shared since more people care about Famous Woman than “a woman I know”.
I’m not saying that celebrities don’t make up the lion’s share, I’m saying that the 94% comes with a number of caveats.
I’d be curious to know what the percentages look like on the nudify apps but, of course, we’ll never have the information.
I’m happy to agree that the percentage can change.
I doubt it’d ever drop below an overwhelming majority, though.
And my main point - it’s never going to be mostly teenage boys making it. This buys into some myth that teenage boys are the only uncontrollable sex monsters, which goes against absolutely everything I know about Rape Culture.
That doesn’t matter to the victim. The person in the deepfake is worried about people thinking it’s a real photo. The victim doesn’t care about what the creator’s intentions are. The victim is worried about the photo being seen by people they care about and worrying what those people will think.
The cite I used delved into creator forums and chatrooms as well as the front-facing sites.
They also, according to their own testimony, are just hit with shock and shame on the sight of their own image used that way, at the violation of their own person. That’s separate from what others might think, it seems.
There’s validity to this. Someone takes a photo of a girl at Homecoming and uses AI to make her appear nude and then sends it to his friends. Said friends are probably smart enough to guess that the victim was not posing nude at the homecoming dance and the photo isn’t genuine. But the victim still has the fear that someone somewhere will see the photo and probably doesn’t want to rely on “Well, they’ll figure out it was fake”. It’s not that the AI image was generated with the intent of fooling people, but the victim is still fearful of the image out in public.
I saw. I don’t think those are going to create an accurate portrayal either since celeb faking is more “acceptable” in a public forum. It’s also from 2023, which is an age ago in AI tech terms. In any event, it’s probably not worth litigating further.
The same way teaching kids about drunk driving before they are of legal age to drink or drive reduces drunk driving. That is what I meant by prevention.
No, nor did I say anything about reducing the “market”. Again, the idea was aimed at educating students before they started.
As Jophiel proved (with a cite), the issue is NOT deep fakes being passed off as real images anyway, so the point is moot.
Why the word “obsessed”? A deep fake is very realistic. It is often realistic enough that people might think it a photograph. However, again, Jophiel showed me that most deep fakes are presented as fake and are not intended to fool the viewer, only to harm the victim. So, the question remains: “How is a deep fake, which is criminalized in the New Jersey law, legally different from any other image?”
I don’t see that as helping. Under the law you propose, an image is or is not illegal based on how the victim feels about the image. So, there is no question that the image exists or who created it. The only question is how the victim felt. You then have a judge or jury deliberating not on who did what, but on “how did the victim feel?”, and whether they should or not, we would inevitably get a judge or jury asking “was the victim right to feel this way?”
It kind of is. Judges are not robots. Juries are not robots. The law kind of is. I remember when some celebrity was killed by a man who had been stalking her for some time. She had gone to the police many times. At that time stalking was not a crime and he wasn’t breaking any existing laws. After she was killed, stalking laws were passed.
I am talking about the wording of the actual law. It would not name or include evidence of any kind. It would need to be carefully worded with clear and defined terms. I linked to the recently passed New Jersey law. It includes definitions of all terms: deep fake, distribute, etc.
Yes, I believe women. I also believe a law based entirely on somebody’s subjective experience would never pass. Remember how, before recreational use of marijuana became legal in many places, head shops would sell pipes, hookahs, bongs, etc. and have big signs everywhere saying “For Tobacco Use Only”? Everybody, including the police, knew those shops existed to sell drug paraphernalia. As no marijuana was being sold there, the police could not prove anything.
And you’re only targeting a small percentage of perpetrators that way.
Because you keep bringing it up.
Was this meant to be a full sentence?
You don’t see how assessing the impact on the victim of a crime helps in judging the crime? Are you serious?
No. I’m not saying that’s the be-all and end-all of the matter. But it should be a significant component.
The law has absolutely no effective existence outside of its interpretation by judges and juries. You admit neither of those are robots. Therefore the law can not be a robot.
I’m sure that was a great comfort to her cold corpse.
You make the case for enacting such laws now for me. Enact them before more people are violated.
All laws are subjective. Because humans make them.
You’re just discounting the subjectivity of the actual victims.
Not enough to enact laws to combat their violation, it would seem.
What does this complete non sequitur have to do with deepfake porn?
Your link does not appear to have posted properly.
Again, the plan (which I abandoned after Jophiel posted a cite) was prevention.
I am not obsessed. The law I linked to criminalizes deep fakes in certain circumstances. Asking why deep fakes and not other images is an entirely valid question.
Yes. It should be:
So, the question is: “How is a deep fake, which is criminalized in the New Jersey law, legally different from any other image?”
My objection to your proposed legal standard was, and remains, that whether or not an image is illegal is based entirely on how the victim feels about it. I don’t see that as remotely workable. A statement from the victim would indeed confirm and clarify how they feel. I’m not disputing that.
I asked-
You replied-
That is detail not given in your previous reply. So, I ask again: at what point should an image be criminalized?
A law open entirely to subjective verification will either never pass or be immediately struck down. Judges and juries are not robots. However, they are bound by the law as written. The written law is a robot.
That was exactly my point.
The woman, quite understandably, found the stalker’s behavior threatening. She went to the police several times. They could not do anything because the stalker had not broken any existing laws. Once again, the law is a robot.
If you want a law to pass, stay on the books and be enforced it is not enough for the intent to be a good one. The wording and definitions must be precise.
Again, look at the New Jersey deep fake law I linked to at the end of the OP. Like the text of most laws, it attempts to define every term as clearly and strictly as possible and to leave as little room as possible for subjective interpretation.
No, I am not. As I already quoted earlier in this post, I asked at what point an image should be criminalized. You answered. That answer was based entirely on a subjective measurement. You said, in the post I am replying to, that you meant details not present in your answer.
Again, a law based upon
would not pass.
It is not a non sequitur. It is another example of the law being a robot. Everybody, including the police, knew that the store was selling bongs, hookahs, etc that buyers would use with marijuana. The police did not and could not charge the store owners with anything because so long as they insisted everything was for tobacco use only, they were not technically breaking the law.
Yes it is. I have given examples of it being a robot above. One of those examples includes a woman being murdered because the law is a robot.
If you’re only targeting a small percentage of perpetrators, while ignoring the most prolific, you’re not preventing jack.
You didn’t just ask that question. You posted quite a bit more about it.
In any case, the answer to your question is because deepfakes are sexual violation and other images are not.
Nope. Violation is one component, obscenity another.
As soon as it’s either violative, or of a class of pictures that would be violative. I don’t require that potential victims actually be violated in order to criminalize.
I realize that’s the nitpick you’re going to jump on now, that I said “make” and not “would make” or “could make”. Go wild, but do acknowledge that I’ve just posted this clarification.
Obscenity laws are on the books right now and have passed Congressional muster. You’re flat-out wrong.
The Miller Test is hardly a robot. Like I said, flat-out wrong.
If you think that’s why the police did nothing, I have to question whether you know what the words The Patriarchy actually mean.
If the police wanted to stop the stalker, an appropriate law would have been found.
The law is not a robot. It’s a tool. One that Power is happy to wield, or not, as best suits its ends.
You’ve literally been arguing about the correct interpretation of “psychological injury” in this very thread. Not subjective, my ass.
Yes, you are. You’ve indicated that their pain doesn’t matter as much to you as Freeze Peach does.
Then say that, rather than just dropping it in in response to a different point on my part.
And head shops got raided all the time.
No, it is not. You’ve given examples of people using the law as a shield for their lack of action in specific instances, but that’s entirely a choice of the people - plenty of head shops have been raided anyway, plenty of stalkers have been arrested in other cases. Cops choose which laws to use all the time - just ask any Black person who’s ever been the victim of selective policing while a White person skated on the same offense how robotic the law is.
The study you linked doesn’t say education programs don’t help. It says badly designed programs don’t help.
DARE was a failure. I don’t think we need a study to tell us that. Looking into exactly how and why it failed was certainly worth studying.
I disagree. Preventing people from repeating an action is separate from preventing the next generation from ever doing it in the first place. It was a long-term plan.
Once again, I have to disagree. Other images that are sexual violation remain legal. How are deep fakes different? That is why I was posting details and asking further questions. This morning, the answer hit me. (I am sure somebody will correct me if any of this is wrong)
Under the law, AI-generated images cannot be copyrighted. They are considered to be produced by the AI. AIs are not legally people. They have no rights of any kind.
Deep fakes are produced by AI. The right to free speech guaranteed by the First Amendment does not apply to them. So, declaring deep fakes generated by a machine illegal accomplishes a ban that easily withstands any objections on the grounds that it infringes on free speech. A badly done cut and paste in Photoshop, or a photorealistic deep fake drawn freehand, are different under this law because they are the work of human beings with Constitutional rights.
Again, this is information not present in your original answer.
I asked
You answered
I was responding to what you had posted previously.
If you want to change your answer to
That’s fine. But you need to define “violative”.
I was aware of both obscenity laws and the Miller test. I knew that they were still on the books. I didn’t know they had been used recently.
[quote="MrDibble, post:78, topic:1016544"]
If you think that’s why the police did nothing, I have to question whether you know what the words The Patriarchy actually mean.
[/quote]
I have to wonder just what you think The Patriarchy is. Is it bad in many ways? Absolutely. Is it the cause of a long list of bad things? Yes, it is. Is it the cause of every bad thing that happens? I would have to go with a no on that one.
I can certainly see cops hassling him, despite his not actually having broken any laws. But they would arrest him and charge him with what, exactly?
I have been arguing about the interpretation of a term neither I, nor any other poster in this thread ever used?
It’s a long strange trip you’ve taken.
You said
I replied
Where in that post do you get “their pain doesn’t matter as much to you as Freeze Peach does”? I am not seeing any such thing in the reply you quoted from. Also, “Freeze Peach”? Really?
Apparently, the lawmakers in New Jersey feel that way. Again, the law applies only to machine-generated images. Any image made by a human being is outside the scope of this law.
You have a cite for that? A quick Google showed this-
The shops in question were actually selling marijuana (recreational use is illegal in this state).
So they were breaking the law and neighbors were complaining about it. I don’t know whether that qualifies as a raid.
Without specific laws against stalking? Do you have a cite for that?
That one I will have to give you. The reason Philly’s ‘stop and frisk’ policy was struck down was that police were found to be targeting black men. Sadly, this is actual progress. I did not live in Philly during Rizzo’s term as mayor. The fact that people love a racist thug like him is baffling.