Are you a pedophile? Then you'll enjoy Facebook's advertising engine!

News story:

Short version: When you advertise with Meta services (Facebook, Instagram, etc.), you summarize your product and you profile the audience who should see your ads. Say you offer leotards for young ballet students; you can narrowly target the moms who are likely to buy the dance outfits for their daughters. But Meta also offers advanced behavior-based targeting beyond your manual specification: people who have searched for and engaged with this type of product will also get the ad, which means your marketing delivery is automatically widened to audiences you hadn’t considered, thereby making your campaign more cost-effective.
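To make the two layers concrete, here’s a minimal sketch of what such a campaign spec might look like. The field names are illustrative only, loosely modeled on generic ad-platform configs; this is not Meta’s actual Marketing API:

```python
# A sketch of the two targeting layers described above. Field names are
# illustrative, loosely modeled on generic ad-platform configs; this is
# NOT Meta's actual Marketing API.

campaign = {
    "product": "girls' ballet leotards",
    "creative": "child_model_in_leotard.jpg",  # hypothetical asset name
    # Layer 1: the audience the advertiser explicitly requests.
    "manual_targeting": {
        "genders": ["female"],
        "age_range": (25, 45),
        "interests": ["ballet", "dance classes", "parenting"],
    },
    # Layer 2: behavior-based expansion. When enabled, delivery widens to
    # anyone whose engagement history resembles past engagers with
    # similar creatives, beyond the manual spec above.
    "audience_expansion": True,
}
```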

As a result, Meta’s algorithms and automation will look at your advertising, recognize that it includes pictures of young girls in revealing clothing, and cheerfully serve your ads to sex offenders and pedophiles.

It sounds outrageous, but the linked article is about journalists testing the allegations raised by advertisers and confirming the behavior. When called on it, Meta deleted a handful of offending profiles, provided blandly dismissive platitudes about the quality of its services, and refused to engage further on what its systems are actually doing in the real world.

My wife and I had a long discussion about this, including a philosophical debate on the definition of “evil.” She hesitates to say this is evil, because she thinks of evil as overt maliciousness, a deliberate desire to cause harm; she acknowledges this is extreme negligence, prioritizing profit and efficiency over any consideration of the harm that results, but because that harm is not the conscious objective, it falls short of being evil. By contrast, my view is influenced by the “banality of evil” concept. I don’t think movie-style mustache-twirling villainy is all that common; rather, to me, if you know your actions are creating harm inadvertently, and you choose not to change (or to care), that qualifies as evil.

Either way: Fuck Facebook, fuck Meta, fuck Mark Zuckerberg. What a truly gross and repellent company he’s built.

I’m not clear what the actual harm is, to the advertisers, the models, or to any of their customers. What are the pedophiles and sex offenders doing with these ads? Do the ads give them easier access to victims?

Is it a reasonable expectation that ads put out in public will never be seen by people you don’t want to see them?

It titillates them? Encourages them somehow?

I really do not know. There are loads of children in advertising; nothing new there, and the linked article doesn’t show the ad in question (maybe I missed it). Hard to know what the fuss is about.

(please do not take any of this as a defense of pedophiles)

And you do realize that everyone who just did a search to find out what this is all about is now tagged by the same algorithm… :sweat_smile:

But ISTM that the fact that such a picture exists to begin with, and that some perv wants to look at it, is not Meta’s fault. Unless we want Meta to flat out ban posting such pictures, period; or to render them unsearchable by the algorithm; or to ban and report to authorities anyone who searches for, clicks on, or saves such a pic “too often”.

Meta couldn’t possibly ban (or make unsearchable) any image that might be titillating to some pervert somewhere.

But, given the resources available to the company, it’s not unreasonable to expect that they initiate processes that automatically and immediately flag and delete any pervy reactions to those images, with appropriate action taken against the commenters. And that it should happen without advertisers or users (or journalists) having to call them on it.
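To be concrete, something like this toy pass is all I have in mind. The classifier, threshold, and escalation rule are pure assumptions for illustration, not anything Meta actually runs:

```python
from dataclasses import dataclass

# Toy flag-and-delete pass. The classifier, threshold, and escalation
# rule are assumptions for illustration, not Meta's actual pipeline.

FLAG_THRESHOLD = 0.8  # assumed confidence cutoff

@dataclass
class Commenter:
    name: str
    strikes: int = 0
    banned: bool = False

@dataclass
class Comment:
    author: Commenter
    text: str
    deleted: bool = False

def toy_classifier(text: str) -> float:
    """Stand-in for a real model scoring how likely a comment is to be
    sexualizing the child in an ad (0.0 to 1.0)."""
    red_flags = ("meet her", "talk to her privately", "buy her underwear")
    return 0.95 if any(p in text.lower() for p in red_flags) else 0.05

def moderate(comments: list[Comment]) -> None:
    for c in comments:
        if toy_classifier(c.text) >= FLAG_THRESHOLD:
            c.deleted = True           # immediate removal
            c.author.strikes += 1      # action against the commenter
            if c.author.strikes >= 3:  # assumed escalation rule
                c.author.banned = True
```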

There’s also no excuse for “blandly dismissive platitudes about the quality of its services” by any public-serving organization.

Are you a pedophile?

Hell, I don’t even like kids.

Heck, I felt uneasy just opening the thread after reading the headline.

To be fair to the OP (who is perfectly capable of speaking for himself, of course), the issue is actively serving these images, in the form of ads, to a range of people to whom the advertisers object, not just the fact that images exist on Facebook. I don’t think that’s a serious problem, myself, but it makes more sense as something to talk about.

Can you react to ads on Facebook? I never see ads there.

From the story linked in the OP:

Maybe they can tweak their algorithms a bit…

No, I think it’s a ridiculous expectation. That’s why I’m against child beauty pageants and any other activity that can make children a higher-profile target for pedophiles. I am, however, quite in favor of throwing a pedophile who tries to have sex with a 5-year-old into a dungeon so deep that he will never see the light of day again.

Yeah, I’m with the OP’s wife in that Meta’s failing is being fecklessly reactive (and apparently half-heartedly so, but I don’t have the primary source in front of me). Though it’s not a matter of “cheerfully serving your ads to offenders”, it is apparently a case of caring only about the usual “maximizing engagement” parameters w/o looking further at what the advertiser or the reader may really want.

As best I can figure, one clear harm is that Meta is taking the money advertisers pay it and spending it showing the advertisements to likely non-purchasers. Whether the recipient audience is sketchy or not, if they’re not going to shell out cash then the algorithm is failing in its primary duty, and companies who want to promote their product should make note of that.
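Some made-up numbers show how that compounds; every figure below is invented for illustration:

```python
# Back-of-the-envelope arithmetic on the wasted spend. Every number
# here is invented for illustration.

budget = 1000.00    # dollars spent on the campaign
cpm = 10.00         # quoted cost per 1,000 impressions
impressions = budget / cpm * 1000     # 100,000 impressions bought

off_target = 0.60   # assumed share delivered outside the requested
                    # audience by the expansion layer
useful = impressions * (1 - off_target)     # 40,000 useful impressions
effective_cpm = budget / useful * 1000      # cost per 1,000 useful views

print(f"quoted CPM ${cpm:.2f} -> effective CPM ${effective_cpm:.2f}")
# quoted CPM $10.00 -> effective CPM $25.00: the advertiser pays 2.5x
# the quoted rate for the audience they actually asked for.
```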

Otherwise, bad people are going to find questionable and criminal images on their own. Actively showing them similar material probably doesn’t move the needle much on anything. Real-world statistics have shown correlations between access to porn and reduced sexual assault rates. Plausibly, if you can give someone something in a relatively non-harmful way that satisfies some part of their desire and reduces their inclination to practice the actively harmful variant, then that may be a social “good”.

To be sure, we might theorize that more images of scantily clad children make the situation worse, and that the mitigating effect described above won’t apply here. The porn/sexual assault correlation may simply be that (as an example) financial prosperity tends toward more liberal attitudes and lower stress in the populace. So we might see a net positive if the impact of reduced stress is very large, even though the porn itself might prove to be a negative if you could isolate its effect.

It’s hard to know with confidence what the effect is, but we would probably want to look at the research a bit more, and see what the experimentalists and number crunchers have found in this territory, before complaining about the end result of the current version of the algorithm.

I can be on board with the idea of expecting Meta to do something about this (though what ‘this’ is and what to do about it has yet to be well defined).

However, I also think that if we are truly outraged that pedophiles are engaging with marketing that includes images of children, one solution is to use fewer children in advertisements.

The expectation that “I should be able to advertise my product without receiving any unwanted engagement from certain defined classes of people” seems unrealistic, and feels like a goal that gets farther and farther out of reach each day as we integrate sales, marketing, and advertising into almost every aspect of every American’s daily life.

That’s not what this is about. If something is on the internet, it will be found by people you may or may not want to find it, and it will be enjoyed in ways you did not expect and cannot control.

It is about this:

And then, more specifically, the fact that Meta doesn’t care when the issue is brought to their attention.

For anyone who’s still perplexed, here is the step-by-step.

  1. You are a business owner with a product used by young girls. You want to advertise online. You spend money taking photographs and designing media for your campaign. Good marketing practice tells you a photograph of a child happily wearing or using your product is more effective than a bland picture of the product sitting by itself on a white table, so you spend that money. If you’re a larger company with a budget, you hire models; if you’re a small business owner, you ask family and friends if you can borrow their kids, or you use your own.

  2. You give the media to Meta’s advertising engine and you say, “show these ads to the mothers of young girls, that’s my target market.”

  3. In response to your campaign, you start getting messages from men who want to know if they can speak to the child in the ad, or purchase the child’s underwear, or flat out ask whether it’s possible to meet with the child privately.

  4. You investigate, and you discover that Meta ignored your narrow targeting preference and specifically delivered your ads to these men. Meta’s profiling engine knows these men have a focused interest in young girls, built from their history of searching for and engaging with ads and images of young girls. Based on this, the system “decides” these men would be a good target market, and serves your advertising to them on top of what you requested (see the toy simulation after this list).

  5. Being a normal human being, you are creeped out by the idea that your ads, your pictures of model children (and perhaps your own children), are being delivered directly to a population of perverts. Not that these perverts exist and are finding your ads using their own effort — but that the advertising engine you hired knows who the pedophiles are and, as they browse pages that include Meta’s ad services, Meta puts your pictures in front of them specifically.

  6. Quite reasonably, you ask Meta to stop doing this, to stop sending your pictures directly to these lowlifes and degenerates. Your products are online, the perverts can go find them if they want, but for the love of Christ, stop serving them up on a silver platter.

  7. Meta shrugs and brushes you off, because they don’t give a shit. Whether or not this effect could have been predicted as an emergent property of the algorithm is irrelevant. Now that they know about it, Meta’s response is to ignore you.
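Here is that toy simulation of steps 2 and 4. Every name and number is invented; it only illustrates the mechanism of engagement-driven expansion folding the wrong audience back in on top of a narrow request, not Meta’s actual system:

```python
from dataclasses import dataclass

# Toy simulation of steps 2 and 4. All names and numbers are invented;
# this illustrates the mechanism, not Meta's actual system.

@dataclass
class User:
    gender: str
    child_imagery_affinity: float  # profiled engagement history, 0.0-1.0

def advertiser_filter(u: User) -> bool:
    """Step 2: 'show these ads to the mothers of young girls.'"""
    return u.gender == "female"

def predicted_ctr(u: User) -> float:
    """The platform's engagement model: for these creatives, users who
    seek out images of children score the highest."""
    return 0.02 + 0.5 * u.child_imagery_affinity

def delivery_audience(users: list[User]) -> list[User]:
    requested = [u for u in users if advertiser_filter(u)]
    # Step 4: behavior-based expansion, applied on top of the request.
    expanded = [u for u in users if predicted_ctr(u) > 0.25]
    return requested + [u for u in expanded if u not in requested]

users = [
    User("female", 0.01),  # the mom the advertiser asked for
    User("male", 0.90),    # the profile the advertiser never asked for
]
print(len(delivery_audience(users)))  # 2: both receive the ad; the
# second user only because the model predicts he will click
```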

Is this clear now?

Edit to add: If you really want some tangible harm, if for some reason the idea that Meta’s system is gift-wrapping your child media and handing it to pedophiles doesn’t bother you, then how about the money you wasted on your ad campaign? It now needs to be thrown away and replaced with new photographs and ad designs that are measurably less effective, simply because Meta can’t be bothered to adjust their systems to stop giving perverted weirdos a leg up on their media collection efforts.

The harm to the advertisers is that the ads are being sent to the wrong people: the pedophiles aren’t going to be buying the clothes, yet they get the bulk of the ads. These are online ads: they’re not served to everyone, only to those the system thinks will be interested.

The direct harm to the models is all the creepy comments they get. The indirect harm is the same as with child porn: their content gets shared amongst pedophiles, and encourages more of the same.

Granted, that last part would in theory be mitigated by the advertisers holding such imagery back. But in practice, if the algorithm decides these are the images that get more views, then it will favor the ones that appeal more to the pedophiles, and encourage more of them to be created.

Excellent breakdown.

I wonder, though … it’s plausible that some men (fathers or other relatives of young girls, or friends of families with young girls, for instance) might also be legitimate targets for these ads. The advertisers would benefit from ads being served up to them. Is Meta’s algorithm sophisticated enough to distinguish men with legitimate interest from men who are simply pervs?

Per the linked article, these businesses did previously include men in their advertising audience, but due to the pervert problem they’re now attempting to exclude men entirely. Yes, fathers etc. could legitimately be buyers, but the tradeoff isn’t worth it. But now Meta is ignoring the “women only!” rule the businesses are attempting to impose in their own defense, which makes you wonder why they offer a profiling service at all.

I think the problem is, their algorithm is already “doing this”, and the sellers want them to actually stop “doing this”.

The sellers paid to target an audience of mostly women, because they determined that this is the group that mostly buys their product. Meta, completely on their own, determined that men are far more likely to click on these images (for indeterminate reasons), and so decided to also target those men. This had the result of flooding the sellers’ sites with people who were, at the least, less likely to buy the product and, at worst, actively seeking opportunities to sexually gratify themselves using kids.

So I don’t think it’s a stretch to ask Meta to stop doing that. Just send the ads to women, like they were hired to do in the first place.

And Meta is going to say “we’re going to do what’s best for our engagement statistics. Those clicks are gold to us, regardless of what you think.”

Meta’s incentives only coincidentally align with the advertisers’.

And that’s why they should get sued. Just because it’s better for your business doesn’t mean you get to do whatever you want.

If I hire a guy to cut down one tree on my property, that doesn’t give him license to cut down all my trees “because he can make money selling the lumber”.