Vulgar pix of celebs? Why?

As I understand it, Emma Watson is also a frequent target due to her outspoken feminism. Yes, she is also an attractive woman, but the sheer volume of material well exceeds that of other attractive Hollywood women. I believe 4chan had regular events dedicated to faking images of her. No, I don’t hang out there; don’t ask me why my brain absorbs the internet detritus it does.

I’m a little surprised there hasn’t been a thread about AI fakes yet, if only because I’ve seen several news stories about high schools grappling with the issue as students pass around fake nudes of other students (made with AI, of course). Some states have passed laws explicitly addressing it, and Senators Murray and Cruz have a bipartisan bill to outlaw it at the federal level, but many schools were caught flat-footed, uncertain whether a fake image represented an actual crime, what the crime would be, and how to report it. Even going to the police would sometimes result in a shrug.

I realize, of course, that I could just as easily have started a thread, but it was more of a “huh” thing adjacent to the whole AI art topic than anything I was jazzed up enough to start a thread over.

I saw this story recently (involving using a nude filter on a photo of a 17-year-old):

Yeah, if the OP hadn’t mentioned “vulgar poses,” then I would have thought it was that. It’s the poses that make me think it’s likely a different face.

The software you describe will use a vast database of nude photos to extrapolate what the person would look like if they were nude. And, of course, it’s been trained to make the results more and more realistic.

But it keeps them in the same position. That’s the whole point.

BTW on that, altering a real minor’s picture to create porn has been illegal for a long time, and “simulation indistinguishable from a real photograph” was added by the PROTECT Act in 2003.

And the courts will have to rule definitively, but I would imagine that “using a real minor’s picture to train an AI model to create porn” would count as “altering a real minor’s picture to create porn”, so far as the law is concerned.

You could go about it in a variety of ways. There are apps and bots that just “remove” the clothing from a photo while leaving the pose/background the same. This is the stuff schools are dealing with right now: take someone’s homecoming dance photo off her Instagram, run it through, and pass it around.

There are faceswap apps and bots that will put Face A onto Body B in a photo/video. This would be used for “vulgar poses,” deepfake video porn, etc.

Then you could split the difference and use AI to generate a generic nude person matching the body type of the victim and swap the face into it as part of the generation. This would probably require you to be running your own software versus a phone app or Discord/Telegram bot but maybe I’m underestimating.

There are apps/websites where you can start with a fully-trained AI model and then train it on a few extra pics of a subject, producing a small add-on (called a “LoRA”) that teaches the model how to depict that subject. In the AI images thread, at least one poster has used a LoRA of himself to create pictures of himself posing with various celebrities. That’s harmless, of course, but it’d be the same process to take a few pictures of a classmate and use them for the LoRA.
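For the curious, the generation side of that workflow looks roughly like this. This is a minimal sketch assuming the Hugging Face diffusers library; the LoRA filename, the “sks” trigger token, and the prompt are hypothetical placeholders, and the training step that produces the LoRA file in the first place is omitted:

```python
# Sketch: applying an already-trained LoRA to a base diffusion model.
# Assumes the Hugging Face `diffusers` library; the weights file and
# the "sks" trigger token are hypothetical examples.
import torch
from diffusers import StableDiffusionPipeline

# Load a fully-trained base model.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# A LoRA is a small file of low-rank weight adjustments, trained on a
# handful of photos of one subject. Loading it teaches the base model
# that subject without retraining the whole model.
pipe.load_lora_weights("./loras", weight_name="my_face_lora.safetensors")

# The trigger word is whatever token the LoRA was trained to associate
# with its subject ("sks" is a common convention).
image = pipe("photo of sks person shaking hands with a movie star").images[0]
image.save("out.png")
```

The point is how little is needed: the base model already knows how to render people in general, and the LoRA just bolts on one specific face.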

I think I disagree, but with no firm foundation or real vehemence. Call it a hunch, not a conclusion.

So far the courts have been pretty resistant to the idea that the output of an AI bears any direct causal connection to its training data. Accepting that logic would basically require the AI folks to pay everyone for all the stuff they now scrape for free off the internet. That would be bad for commerce.

If training data for task [whatever] is not an “input” that leads to an “output,” then no minor’s picture was altered on the way to creating a tool that can alter minors’ pictures at will, despite the fact that perhaps tens of thousands of minors’ pictures were fed in.

Now, taking this trained AI and feeding it a picture of Jane Doe, age 12, of Smallsville, USA, with the intent of making porn of her? Totally illegal by the plain language.

Although federal law prohibits realistic depictions of underage nudity/acts, that’s not necessarily the case on the state level. For example, from one article about eighth graders getting expelled:

The Beverly Hills Police Department and the Los Angeles County district attorney’s office are still investigating the incident, but no arrests have been made or charges brought. California’s laws against possessing child pornography and sharing nonconsensual nude pictures do not specifically apply to AI-generated images, which legal experts say would pose a problem for prosecutors.

And, of course, none of that is helpful if you are 18+ and your classmates, coworkers, spurned ex, or some dude who found you on Instagram start making and/or distributing images. Or, to take it back to the top, if you’re a celebrity.

Yeah, that’s what I meant. If you just ask the AI for a picture of a 12-year-old, then the resulting picture is going to be a composite of many, many 12-year-olds, in a way that you can’t even pin down an individual source for any given feature. But if you specifically feed it pictures of Jane Doe, and then ask it for a picture of Jane Doe, and as a result get something that looks like it’s actually a photograph of Jane Doe, then that’s something else.

I’m actually a bit surprised about that, because California has pretty strong laws protecting a person’s right to their likeness, and those laws apply regardless of the source of the depiction. I’d have thought that there’d be something the courts could use there.

That’s a really good article from WIRED that aceplace57 posted above, and the most persuasive part is that it’s over 20 years old, describing a phenomenon that had obviously already been going on for a long time by then. It’s unimaginable that tremendous advances in photo manipulation haven’t been made almost daily since 2003, advances available to the casual hobbyist. From a technical viewpoint, without any interest in producing porn or soft porn, I could see how someone interested in testing the limits of what’s possible might be intrigued into trying his hand at matching Celeb Alister’s face with Pornoc Lip’s body, if only as a way to test his own abilities.

It’s only gotten worse, with a twist.

Now the fake bodies have gotten ridiculous: just gigantic breasts and bathing suits that are little more than strings. But the really silly thing I’ve noticed on this newer batch of vulgar pix is that Jennifer Lopez’s head, or Jennifer Aniston’s, or Jennifer Whoever’s, is disproportionately large for “her” body. It’s almost like they’re advertising “FAKE” in red letters superimposed on the pix. I’m not quite sure if this is just sloppy photo work or for some reason deliberate. If it were sloppiness, you’d think the head would sometimes be too small or sometimes just right, but no: it’s always just a little bit too big for the body, though they’ve always got the skin tones right, and the shadows, lighting source, etc.

The same fake comments appear below (“I love you, Jenny” or “Your gawjus!!”), which are too stupid (as if she’s really the one posting these pix) to be plausible. Seems like an awful lot of trouble to go through just for a few clicks.

And no, ignoring the pix doesn’t seem to reduce the frequency with which they appear on my internet feeds. It seems to be an entire industry that I can’t figure out the purpose behind.

Maybe I’ll find out if the trends continue: the heads growing ever larger, the bikini strings smaller, the breasts more and more pumped up.

An oversized head makes it easier to completely cover up the head of the real body model. Though I’d have expected that to have gotten much less common lately, with more sophisticated image manipulation tools.

I remember friends telling me that if it looks unreasonable, it’s almost certainly fake.

An A-list personality might have topless photos in their past. Anything explicit is probably fake.

I agree. The level of AI fakes today is scary. People are making videos of well-known figures saying the most outrageous stuff, and it’s all AI-generated.

The huge boobs make it easier to spot fakes. It’s pretty obvious whether an actress has that body type.

Maybe they are. Maybe that’s the joke.

Rule 34 answers the OP’s question.

Rule #34: If it exists, there is porn of it. No exceptions.

The website is very old. I think it goes back to 2005. It’s very, very HC. Too much for me.