Vulgar pix of celebs? Why?

Lately my Twitter feed (and other feeds) has been besieged by semi-nude photos of celebrities, which is sort of the norm for 2024, but certain of these questionably tasteful pictures push the definition of “questionably tasteful” over the cliff. Specifically, the poses taken by these celebs call into question whether the pix are real, or photoshopped, or altered in some other major way (the celeb’s head superimposed onto another woman’s body, for example?). Mature actresses, women of considerable dignity in mid- or late-career, in their 30s and 40s, are shown doing more than displaying a bit of cleavage or upper thigh. And I have to ask why a movie star would agree to pose in a skimpy bikini with her legs splayed wide open.

I’m not talking about what used to go by the name of “wardrobe malfunctions,” or shots by paparazzi catching a star awkwardly getting out of a car, or crossing her legs on a movie set and inadvertently revealing something she shouldn’t, but rather a deliberately posed pic from a planned shoot, with the celeb staring into the camera, smiling, hair and make-up just so, but positioning her body in a rather vulgar way.

Have I simply gotten prudish in my old age? Or are these pix the result of current, undetectable manipulation of photographs? I look at them and ask myself, “What in the world is the advantage to this celebrity’s career in agreeing to pose in so vulgar and lewd a manner? What possible motivation would she have to tell the photographer anything other than ‘No, I’m not posing like that’?”

I can’t say why they’re showing up for you now, but what you describe sounds like a type of manipulated photo that’s been common for a long time: you take a real photo of someone in that position, but swap out the face or head. Possibly even a bit more.

Automatic face swap tech has been around for a while, and it’s getting better. There are also deepfakes, which use AI to do face replacements on videos.
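If you’re curious what the “automatic” part actually involves, here’s a minimal sketch of the old-school, pre-AI version using OpenCV: detect a face box in each photo, resize the source face into the target’s face box, and blend the seam so lighting and skin tone roughly match. The file names and the whole single-face, frontal-pose setup are just assumptions for illustration; real tools add landmark alignment and, these days, trained models on top.

```python
# Crude face-swap sketch: paste the face from one photo onto another.
# Assumes two local files, source.jpg (face to copy) and target.jpg (body/pose),
# each containing exactly one clearly visible, roughly frontal face.
import cv2
import numpy as np

def crude_face_swap(source_path: str, target_path: str, out_path: str) -> None:
    src = cv2.imread(source_path)
    dst = cv2.imread(target_path)

    # OpenCV ships a pre-trained frontal-face Haar cascade detector.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    def first_face(img):
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            raise ValueError("no face found")
        return faces[0]  # (x, y, w, h)

    sx, sy, sw, sh = first_face(src)
    tx, ty, tw, th = first_face(dst)

    # Cut out the source face and resize it to fit the target's face box.
    face = src[sy:sy + sh, sx:sx + sw]
    face = cv2.resize(face, (tw, th))

    # Poisson blending hides the seam so lighting and skin tone roughly match.
    mask = 255 * np.ones(face.shape, face.dtype)
    center = (tx + tw // 2, ty + th // 2)
    blended = cv2.seamlessClone(face, dst, mask, center, cv2.NORMAL_CLONE)

    cv2.imwrite(out_path, blended)

# crude_face_swap("source.jpg", "target.jpg", "swapped.jpg")
```

The results of that crude approach are usually easy to spot; the newer AI-based face swappers and deepfake tools are what make the fakes convincing.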

I can’t think of any new tech that would make these easier to create. Sure, AI that creates images from scratch does exist, but I think the results are still usually detectable, and the public tools at least tend to try to stop you from using real people’s likenesses.

But I’m not up on the latest in that area.

I don’t use Twitter/X, but have occasionally seen such images elsewhere as apparent clickbait. My assumption was that they are all (or nearly all) fakes. But I’ve never clicked on them or tried to analyze them.

On edit - ninjaed.

I assume those are fakes. Occasionally I’ve seen old photos of actresses in various states of undress from when they were younger, but mostly these were posed or were shots taken during filming of movies. It’s not something all that recent; it’s been part of internet culture ever since Al Gore invented it.

I don’t think there’s been much advancement in the capabilities of top-end face-swap technology in the past decade or two. If a movie studio wanted to face-swap a star’s head onto a body double for a publicity shot, for instance, they’d be able to do it well enough that nobody short of a forensic expert could tell the difference. Where there has been significant advancement of late is that the software that can do that, and do that good a job of it, has become more accessible. What would have taken the resources of a movie studio twenty years ago can now be done in a few minutes by an amateur with a decent home PC and a copy of Photoshop. So now, if there’s anyone, anywhere, who wants a picture of Bea Arthur posing spread-eagle in half of a bikini, it’s going to happen.

Faking video is still harder than that, and while it can be done well by very dedicated amateurs, the amateurs who do it well tend to quickly get offered jobs doing it professionally for the studios. But that’s going to get easier with time, too.

The why is that many people are fascinated by photos like this. So stuff can be sold to them using these images. There’s no stopping it.

AI makes face swapping trivial, not only with excellent quality but with far greater ease of use than someone doing a careful cut-and-paste job in Photoshop.

(It’s also possible, of course, that said photos are real. I haven’t seen the pics you’re asking about, haven’t noticed them in my internet travels, and am not asking for examples.)

I don’t get “vulgar celebrity pix” on Twitter or Facebook feeds for whatever reason.

There’s still a fair amount of clickbait along the lines of “Everyone wanted to date this celebrity in the '80s, but boy, she sure looks like shit now”. Apparently there are oodles of people aching to cluck over what Victoria Principal has turned into at age 75.

Right, but that’s not new. That’s what I was referring to with “automatic face swap.” It’s been around for quite a while now. Though I’m sure new AI generation stuff makes them better.

I’m just not aware of anything new enough that the OP would suddenly be seeing them.

Along with “Amazing! Beautiful people and sex actually existed 100 years ago, as these vintage photographs prove!”.

I’d guess that the sheer number has increased due to the expansion and growing ease of use of the tech. I was reading somewhere that the number of deepfake videos on the internet (not the same thing, but same ballpark) has been doubling every six months.
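Just as a back-of-the-envelope illustration of why “doubling every six months” adds up so fast (the half-remembered doubling figure is the only input here, not any sourced count):

```python
# What "doubling every six months" compounds to over time.
for years in (1, 2, 5, 10):
    factor = 2 ** (years * 2)  # two doublings per year
    print(f"after {years:2d} year(s): x{factor:,}")
# after  1 year(s): x4
# after  2 year(s): x16
# after  5 year(s): x1,024
# after 10 year(s): x1,048,576
```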

Or the OP has just triggered something in the algorithm that is serving these up regularly. And I don’t mean “hur hur, he actually likes them and looks for that stuff,” just that the social media algorithms get hung up on stuff sometimes, maybe because you stopped scrolling and spent a moment looking at it (if only to wonder wtf), or something else. Facebook is currently convinced that I’m super interested in vintage Beetle Bailey comic strips, and that is not a thing I’ve been late-night web searching for.

The thread I cite below is ostensibly about something totally different from this OP, but quickly veers into discussing all the ways you can inadvertently trigger a self-reinforcing spiral of stuff you don’t want to see and some ways to push back at that, slowly whittling away at the flood of unwanted stuff.

@GailForce may find it informative, if only for background:

Actually, there have been incredible advances in the last two decades, mostly in the last decade, and in fact even more in just the last couple of years. Matching lighting and skin tone, and changing facial expressions, are now built-in options inside Photoshop.

A lot of the “vulgar pix” on Facebook AI groups that allow stuff like that are of Greta Thunberg for some reason.

Right, that was my point. Photoshop isn’t top-end. Twenty years ago, someone with the resources of a movie studio could do all of that, too. What’s different is that now all of that is available to hobbyists.

There used to be a poster on Usenet called the Fake Detective. He spent several years tracking celebrity fakes, and he often found the original suggestive photo used to create a fake. Members of Usenet groups helped track down the source photos.

He had a website; I don’t know if it’s still up. He burned out and stopped researching over 20 years ago. I remember he posted about his frustration at the sheer volume of manipulation.

Magazines began professionally altering photos for their covers: lower or raise a neckline, add or remove jewelry; almost anything on a magazine cover could be changed. There was a big controversy when a major magazine altered OJ’s cover photo to make him appear darker and more sinister.

An article from 2003

Either
(A) There is a large overlap between the set of people who feel Ms Thunberg needs to be put in her place and that of people who feel this is how you put a female in her place; or
(B) There is an unexpectedly high number of people turned on by being shamed about their climate footprint.

Knowing the internet it’s probably both. :roll_eyes:

With a heavier, much heavier, weight on option A.

Ever wish you could “Unthink” something?

EDIT: Ah! Beer! I’m on it!

I’m guessing that a large fraction of Thunberg pics of this sort are from Andrew Tate or his followers, as revenge for the fight that Tate moronically picked with her and epically lost.