Face detection and recognition algorithms are quite specific to facial features. The ones I’m familiar with would not work with breasts; perhaps there are others that would, but generally speaking face detection is a very specific task, while naked image detection is a very broad and vague one.
There is direct evidence linked upthread that Facebook uses humans to detect objectionable images, and there are a number of commercial services that provide that kind of moderation, with APIs for integrating it into online services. There is zero evidence that Facebook is using software detection, and I don’t believe there are any commercial services or software that would do the job.
(There are some companies selling porn detection software, but the ones I’m aware of work by looking for copies of known images - they’re designed to catch people distributing images commonly in circulation, not detect images containing boobs).
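To illustrate the "copies of known images" approach: one common way to do it is a perceptual hash, where an image is reduced to a small fingerprint that survives resizing and recompression, and uploads are compared against a database of fingerprints of known images. This is only a toy sketch of that idea (the 8×8 "average hash" variant, with synthetic pixel data and an invented database); commercial systems use far more robust hashing, but the principle is the same — and note it can only catch images already in the database, not novel ones.

```python
# Sketch of known-image matching via an "average hash":
# downscale to an 8x8 grayscale grid, threshold each pixel against
# the mean, and compare fingerprints by Hamming distance.

def average_hash(pixels):
    """pixels: 64 grayscale values (an 8x8 downscaled image)."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Toy "images": 64-value grayscale grids (synthetic, for illustration).
known = [i for i in range(0, 256, 4)]              # a known image
recompressed = [min(255, p + 3) for p in known]    # same image, re-encoded
unrelated = [255 - p for p in known]               # a different image

db = {average_hash(known): "known_image_001"}

def lookup(pixels, max_distance=5):
    h = average_hash(pixels)
    for stored, name in db.items():
        if hamming(h, stored) <= max_distance:
            return name
    return None

print(lookup(recompressed))  # matches despite small pixel changes
print(lookup(unrelated))     # no match: prints None
```

The key property is that small distortions (recompression, slight brightness shifts) leave the fingerprint nearly unchanged, while a genuinely different image is far away in Hamming distance.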
The only system of which I am aware that flags risque pics is in use by Google, and actually uses their text searching expertise–the image is classified by the words surrounding it.
Facebook’s site search engine is still pretty poor, so I doubt they use that. But, if they did, it would definitely be moderated by a human. Unlike Google, you can’t just flip an option to be able to see the false positives.
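The text-surrounding-the-image approach described above can be caricatured in a few lines. This is a deliberately naive keyword score, not anything Google actually ships (they presumably use trained classifiers over many page features); the word list and threshold are invented for illustration.

```python
# Toy sketch of classifying an image by the words around it.
# A real system would use a trained text classifier; this keyword
# score only demonstrates the idea. RISQUE_TERMS is illustrative.

RISQUE_TERMS = {"nude", "topless", "xxx", "adult", "explicit"}

def risque_score(surrounding_text):
    """Fraction of words near the image that look risque."""
    words = [w.strip(".,!?").lower() for w in surrounding_text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in RISQUE_TERMS)
    return hits / len(words)

def flag_image(surrounding_text, threshold=0.1):
    return risque_score(surrounding_text) >= threshold

print(flag_image("family bath time photo album"))          # False
print(flag_image("explicit adult xxx content, nude pics")) # True
```

Even this toy version makes the false-positive problem obvious: innocuous pages can trip keyword matching, which is exactly why the results would need human moderation as described above.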
And who takes a bath with an overly made-up face and poses for a picture, and it’s “just a G-rated personal photo”? Were they purposely trying to trick a filter?
There’s the woman and then there’s the group that re-posted it to Facebook. As I said above, the woman has a “personal photos” page with far more risqué pictures on it. There was nothing particularly staged about it.
But someone noticed it, it got passed around, and then it was uploaded to Facebook by people who apparently collect such odd photos.
There are sites like Awkward Family Photos that are filled with collections of pics that the people who made them didn’t think were weird at the time but others decided are worth bringing to the attention of the Internet.
Another interesting point that this case and similar ones reveal:
We know porn when we see it (let’s say): if I want to jack off to it, or it has naked wimmen or twinks, or any other definition. (Note that these definitions are all over the place: I might want to jack off to pictures of Interstate highways, but this kind of thinking I’ll ignore.)
As the poster of the mascara trio–the poster in the photo, not the GQ poster–reveals, or, as far as I can see, Miss Bathtub Elbowtits does to more selective minds–“incorrect,” illusory porn images are equally pornographic, given purely visual stimulation. They are porn images.
Can you imagine being the person whose job this is? I hope it comes with brain bleach, 'cause I’m sure many of the images they have to screen are *not* gotchas.