AI pornography that is otherwise illegal

Except IMO that predators (a) already just go ahead and use regular online porn in grooming — “see, that’s what big girls do! You’re a big girl, right?” — or (b) will not hesitate to use real CSAM if they have some at hand.

I tend to favor the SCOTUS standard from the start of the century (Ashcroft v. Free Speech Coalition, 2002): if no actual child, then not CP. (Perhaps we could extend the existing ban on using partial images of actual minors to create porn to cases where the state can show that the generative model used was “trained” on real minors.)

I have not finished reading the thread, but I find this to be a compelling answer.

An obvious big part of the reason porn involving actual children, or actual people being tortured, is illegal is the harm done to those actual people.

If no actual participants are harmed, does it do enough other harm that it should be illegal? I lean towards yes, but it’s not as obvious why.

The problem with this is that it only affects first-time child pornographers, because experienced ones can use their own CSAM as seed stock, so to speak. But it’s a moot point, because there is already so much of this crap out there that there’s no need for AI to add any more.

What should be the status? Personally, I think that porn that never involved illegal acts to make it, and that doesn’t appear to be about any identifiable person, should be legal, and adults should be able to view it in privacy. So I don’t think I care whether a human artist or an AI created that image of a child having intercourse with a goat, so long as no real child is involved (and no real goat). If the artist or AI has previously looked at pictures of children playing on swings and running and swimming to understand what children look like and how their joints work, that’s okay.

But I also don’t believe that the existence of those images makes it more likely children will be raped. I’m open to evidence that it does, but what little evidence I’ve seen argues the other way.

I figured the point was, it’s a scenario where any real photos are already criminalized (citing to the harm to an actual child) and are direct evidence of yet another crime (said harm to said child) — and so they wouldn’t be safe for a predator to keep around — but computer-generated stuff wouldn’t be direct evidence of crimes being committed, and so could be kept at hand.

So a criminal won’t commit a crime if it involves breaking the law?

It’s true that pornographers are sometimes caught because of the images on their computers.

I have a friend whose ex-husband was involved in selling (and probably producing) CSAM. She knew he was ignoring their joint puzzle business for his “other project”, and at some point stumbled upon his photos. She called the police, and lived in fear that he’d find out until they got the warrant and raided her house, taking him away. Later, they had her look at faces of children in his photos (with black paper screens over the rest of the images) to see if she recognized any of the children. I don’t think she did. But anyway, it was his photos on the computers in his home that led to his arrest.

Of course, it’s possible that if he were in the same business today he’d have only used AI, and not actually involved real children at all. But if he’d been doing both, it might have been harder to get that warrant.

I guess I’m saying it’s sort of like the get-Al-Capone-on-taxes thing: if the cops have reason to investigate a suspect, and don’t really find evidence of various crimes that he’s in fact committed aside from his illegal porn that features real-life kids, they can bust him for that. And if they investigate, and don’t find evidence of crimes he’s in fact committed, and only find computer-generated porn featuring images of computer-generated kids, then — what? Do we want him to get busted in that case, or do we want to say, “hey, that’s pretty shrewd of you, making sure to stick to computer-generated porn; thanks for your time; incidentally, where’d you get so clever an idea?”

Most of which are the children themselves these days…

~Max

I don’t think this is true. People abuse children for the power over them; that’s what makes it sexually exciting to them. Sharing photos afterwards is just a way to extend the experience. I doubt there are many abusers taking photos and thinking “I’m not really into this, but it’s a living; I wish there were a better way”.

The addition of AI means a lot of extra CSAM to clog the pipes and waste police time, but no reduction in the number of people making actual material, because the CSAM images were never the actual goal; the act of abuse was.

I’m not sure that’s true. I think there are people who are sexually oriented towards children, and who like to look at sexual images of children for all the same reasons that other people look at sexual images of adults.

When my MIL was 5, she was led away from her home in Brooklyn by a man offering candy. (Literally.) But then he led her into a dark basement and told her to take her clothes off, and she realized this was not a place she wanted to be. She refused, and burst into tears. He took pity on her, and told her she could leave. She replied that she wasn’t allowed to cross Flatbush Avenue by herself, so he took her back, and helped her cross Flatbush Avenue, and then she went home.

A coworker who trained kids in safety said that she told them if they were in a bad situation they should cry and threaten to wet themselves, because that makes it hard for the pedophile to maintain the fiction that the child is also enjoying the situation.

And there’s a huge business in CSAM; it’s not just guys sharing trophy photos. I think most of it is guys (and probably a few women) who want to look at those photos.

If they can find out which people formed the composite of the AI person, can charges be brought on their behalf?

They wouldn’t have a leg to find standing on.

Right. I’m saying you have producers, people who are actively abusing children, and consumers, people who enjoy the material but maybe wouldn’t commit the actual abuse. The CSAM the consumers are looking at is a side product of the producers’ abuse, but the producers would largely be abusing anyway. It’s like if I were really into digging holes and offered up the excavated dirt. If I get something extra for the dirt, then great, but I’m going to dig holes regardless.

AI-generated CSAM adds to the content available to the consumers, but I’m skeptical that it would lower the amount that producers are creating. I’ll admit this is just my perspective, and I don’t have training, etc., in the field.

Due to how model training works, this isn’t possible. It’s not a collage of images; what’s stored in the model is more of a “this is the general concept of X.”
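To put that in rough, concrete terms, here’s a minimal toy sketch (assuming PyTorch; the tiny autoencoder is a stand-in, not any actual image model). The point is that a model is a fixed-size set of weights, and training only nudges those weights toward lower loss; no copy of any individual training image is filed away.

```python
import torch
import torch.nn as nn

# Toy stand-in for a generative model: a tiny autoencoder.
# Its capacity is a FIXED set of weights; training nudges them,
# but never stores the training images themselves.
model = nn.Sequential(nn.Linear(64, 8), nn.ReLU(), nn.Linear(8, 64))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

n_params = sum(p.numel() for p in model.parameters())
print(f"capacity is fixed at {n_params} parameters")  # same before and after training

for step in range(100):
    batch = torch.rand(32, 64)           # stand-in "training images"
    loss = loss_fn(model(batch), batch)  # learn to reconstruct them
    opt.zero_grad()
    loss.backward()
    opt.step()

# Whether it sees 100 batches or 100 million, `model` still holds the same
# 1,096 numbers: aggregate statistics ("the general concept"), not a
# searchable archive of inputs.
```

So there’s generally no per-image record to trace back to; identifying “the people who formed the composite” isn’t how the math works.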

I’ve seen an argument that the availability of AI-generated images, or even human-generated animations, might increase the market for the real thing and thus should be banned. I don’t know if that is true; the argument dates from when the images were not so realistic.

My personal thoughts are that the reasoning for making CSAM illegal is already stretched about as far as it can go. We don’t criminalize photos of other crimes. In fact, it is generally permitted to watch videos of most crimes for entertainment purposes.

I’m not saying I don’t understand why this is a special case, but it is a special case nonetheless. The further out we go, the harder it becomes to justify. Even the current argument that allowing the real stuff increases abuse is rather shaky.

I honestly believe that, for some people, the problem with CSAM is that they find it disgusting. They find it disgusting in all of these other non-realistic forms. And that’s entirely normal and I don’t disagree. But then they seem to think this is why it should be illegal, and I cannot disagree more.

I am actually fully capable of seeing pedophiles who don’t harm children the same way I see people into consensual nonconsent, or other fetishes for things that would be horrible if done in real life. And if someone wants to make some of that horrible stuff, even in a realistic way, with AI, I don’t care. As long as they maintain the distinction.

The only possible issue for me is abusers who use AI CSAM to hide the real stuff.

There was a program called Fractal Design Poser (which has gone through several owners since then). It uses 3D models that are rigged for motion, so that you can pose joint positions and facial features through a control panel of sliders. You can also manipulate the size and shape of body parts and change the “skins”/textures applied to the wireframes. A company called Zygote used to sell lots of Poser models, but it looks like they don’t anymore. The models were often used to make really terrible-looking covers for self-published books, but could (and were) also be used for all sorts of terrible-looking pornography.
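For anyone curious how that slider-driven posing works under the hood, here’s a toy sketch of the general idea (simple 2D forward kinematics; the names are made up and this is not Poser’s actual code):

```python
import math
from dataclasses import dataclass

# Each "slider" in the UI would set one joint angle; forward kinematics
# then walks the chain, accumulating angles, to place every joint.
@dataclass
class Joint:
    name: str
    length: float           # distance to the next joint
    angle_deg: float = 0.0  # the value a pose slider would control

def forward_kinematics(chain):
    """Return the (x, y) position of each joint along the chain."""
    x = y = total_angle = 0.0
    positions = [(x, y)]
    for joint in chain:
        total_angle += math.radians(joint.angle_deg)
        x += joint.length * math.cos(total_angle)
        y += joint.length * math.sin(total_angle)
        positions.append((x, y))
    return positions

# An "arm" posed by three sliders: shoulder, elbow, wrist.
arm = [Joint("shoulder", 1.0, 40.0), Joint("elbow", 0.8, -60.0), Joint("wrist", 0.3, 10.0)]
for name, pos in zip(["root"] + [j.name for j in arm], forward_kinematics(arm)):
    print(f"{name}: ({pos[0]:.2f}, {pos[1]:.2f})")
```

The real software does this in 3D and typically adds morph targets for the body-shape sliders, but the principle is the same: a handful of numbers fully describes the pose.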

I don’t know. If there is a way to profit from CSAM, then I suspect some of them may be in it for the money.