AI pornography that is otherwise illegal

I guess. It’s a big world, so I’m sure someone is. I’m not sure a significant number are in it purely (or primarily) for the money, though.

Actually, I’ll revise that. I was thinking of someone actively engaged in the act with a minor. I could easily see a shitty, exploitative parent photographing their kid for money and having them perform for the camera, or some similar circumstance.

Well, that’s more than I wanted to think about that today.

And I’m saying that I think you are wrong about the motives of the producers, at least some of the time. That guy who took my MIL back home? He wanted to photograph her naked, and probably to look at those photos afterward. He didn’t want to hurt her or hold power over her, because if that’s what he’d wanted, he would have done it. And he chose not to.

And my friend who found out her husband was selling photos of kids? She thought he was primarily in it for the money.

I think that if you could produce the same materials without hurting children, a lot of the current buyers would prefer that, and less “real” CSAM would be produced.

Mind you, I assume this would have been decades before the internet existed. Options were a lot different then.

I have no interest in seeing CSAM but, AI notwithstanding, if I were, I’m sure it would be far easier and safer to find it online than to lure children into my basement. I’d never be on the production side.

Oh yes, of course. That would have been in the 1940s, or maybe the 1930s. Copies of photos cost real money.

But my point is that I think a lot of the motive to create CSAM is actually sex, not power. (And profit, of course.)

Interesting article.