Vulgar pix of celebs? Why?

As I understand it, Emma Watson is also a frequent target due to her outspoken feminism. Yes, she is also an attractive woman, but the sheer volume of material well exceeds that of other attractive Hollywood women. I believe 4chan had regular events to fake images of her. No, I don’t hang out there; don’t ask me why my brain absorbs the internet detritus it does.

I’m a little surprised there hasn’t been a thread about AI fakes yet, just because I’ve seen several news stories about high schools grappling with the issue as students pass around fake nudes of other students (made with AI, of course). Some states have passed laws explicitly about it, and Senators Murray and Cruz have a bipartisan bill to outlaw it at the federal level, but many schools were caught flat-footed, uncertain whether a fake image represented an actual crime, what the crime would be, and how to report it. Even going to the police would sometimes result in a shrug.

I realize, of course, that I could just as easily have started a thread, but it was more of a “huh” thing side-adjacent to the whole AI art topic than anything I was jazzed up enough to start a thread over.

I saw this story recently (involving using a nude filter on a photo of a 17-year-old):

Yeah, if the OP hadn’t mentioned “vulgar poses,” then I would have thought it was that. It’s the poses that make me think it’s likely a different face.

The software you describe will use a vast database of nude photos to extrapolate what the person would look like if they were nude. And, of course, it’s been trained to make it more and more realistic.

But it keeps them in the same position. That’s the whole point.

BTW on that, altering a real minor’s picture to create porn with it has been illegal for a long time, and “simulation indistinguishable from a real photograph” was also added as part of the PROTECT Act in 2003.

And the courts will have to rule definitively, but I would imagine that “using a real minor’s picture to train an AI model to create porn” would count as “altering a real minor’s picture to create porn”, so far as the law is concerned.

You could go about it in a variety of ways. There are apps and bots that just “remove” the clothing from a photo while leaving the pose/background the same. This is the stuff schools are dealing with right now: take someone’s homecoming dance photo off her Instagram, run it through, and pass it around.

There’s faceswap apps and bots that will put Face A onto Body B in a photo/video. This would be used in “Vulgar poses”, deepfake video porn, etc.

Then you could split the difference and use AI to generate a generic nude person matching the body type of the victim and swap the face into it as part of the generation. This would probably require you to be running your own software versus a phone app or Discord/Telegram bot but maybe I’m underestimating.

There are apps/websites where you can start with a fully-trained AI model, and then add a few extra pics (called a “lora”) to the training, so it knows how to include the subject of the pics that way. In the AI images thread, at least one poster has used a lora of himself to create pictures of himself posing with various celebrities. That’s harmless, of course, but it’d be the same process to take a few pictures of a classmate and use them for the lora.

I think I disagree, but with no firm foundation or real vehemence. Call it a hunch, not a conclusion.

So far the courts have been pretty resistant to the idea that the output of an AI bears any direct causal connection to its training data. Accepting that logic would basically require the AI folks to be paying everyone for all the stuff they now scrape for free off the internet. That would be bad for commerce.

If training data for task [whatever] is not an “input” that leads to an “output”, then no minor’s picture was altered on the way to creating a tool that can alter minors’ pictures at will. Despite the fact that perhaps tens of thousands of minors’ pictures were fed in.

Now taking this trained AI and feeding it a picture of Jane Doe age 12 of Smallsville USA, with the intent of making a porn of her? Totally illegal by the plain language.

Although federal law prohibits realistic depictions of underage nudity/acts, that’s not necessarily the case on the state level. For example, from one article about eighth graders getting expelled:

The Beverly Hills Police Department and the Los Angeles County district attorney’s office are still investigating the incident, but no arrests have been made or charges brought. California’s laws against possessing child pornography and sharing nonconsensual nude pictures do not specifically apply to AI-generated images, which legal experts say would pose a problem for prosecutors.

And, of course, none of that is helpful if you are 18+ and your classmates, a coworker, a spurned ex, or some dude who found you on Instagram start making and/or distributing images. Or, to take it back to the top, if you’re a celebrity.

Yeah, that’s what I meant. If you just ask the AI for a picture of a 12-year-old, then the resulting picture is going to be a composite of many, many 12-year-olds, in a way that you can’t even pin down an individual source for any given feature. But if you specifically feed it pictures of Jane Doe, and then ask it for a picture of Jane Doe, and as a result get something that looks like it’s actually a photograph of Jane Doe, then that’s something else.

I’m actually a bit surprised about that, because California has pretty strong laws protecting a person’s right to their likeness, and those laws apply regardless of the source of the depiction. I’d have thought that there’d be something the courts could use there.

That’s a really good article from WIRED that aceplace57 posted above and the most persuasive part is that it’s over 20 years old, describing a phenomenon that had obviously been going on for a long time by then. It’s unimaginable that tremendous advances in photo manipulation haven’t been made almost daily since 2003 that are available to the casual hobbyist. From a technical viewpoint, without any interest in producing porn or soft porn, I could see how someone interested in testing the limits of what’s possible might be intrigued into trying his hand at matching Celeb Alister’s face with Pornoc Lip’s body, if only as a way to test his own abilities.