AI technology could be used to determine people's sexual orientation (Economist article)

I’m highly skeptical that this machine could actually do it, but what do we know.

Do they know it's basing its predictions on facial structure? Or are they just feeding it pictures of faces and orientations, and having it learn to predict based on whatever pattern it detects? And how accurate is it? Just better than random, or highly accurate? Is it more accurate than a human who is good at it?

Machine learning AI could be used to probabilistically determine an awful lot of stuff about people. If analysing someone’s purchases, posts or website visits can be used to assess what kind of ads should be directed to them, then given enough interest, enough data, and a smart enough AI, it can be used to infer pretty much anything.

It’s not clear exactly what they’re talking about, but there are video analysis techniques that can see very subtle changes in people’s faces like flushing and pulse rate.

See this video

It wouldn’t surprise me at all if that could be used to very accurately judge sexual orientation in response to stimuli. The people whose pulse quickens when someone attractive is nearby are sexually attracted to that person’s gender. That’s a subconscious effect.
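For what it's worth, the pulse-from-video idea (sometimes called remote photoplethysmography) isn't magic: the green channel of facial skin brightens and dims slightly with each heartbeat, and you can pull the rate out with a Fourier transform. Here's a toy sketch on synthetic data; none of the numbers come from the article, and a real system would need face detection and motion compensation on top of this:

```python
# Toy sketch of remote photoplethysmography (rPPG): estimate pulse rate
# from the average green-channel brightness of a face region over frames.
# The "green" signal below is synthetic, standing in for real video data.
import numpy as np

def estimate_pulse_bpm(green_means, fps):
    """Estimate heart rate (beats/min) from per-frame mean green values."""
    signal = green_means - np.mean(green_means)      # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to plausible human heart rates: 0.7-4 Hz (42-240 bpm)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Synthetic test: a 72 bpm (1.2 Hz) pulse buried in noise, 30 fps, 10 s
fps, duration = 30, 10
t = np.arange(fps * duration) / fps
rng = np.random.default_rng(0)
green = 0.5 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.2, t.size)
print(round(estimate_pulse_bpm(green, fps)))  # → 72
```

The signal in real footage is tiny, which is why these systems average over a whole skin region and many frames before transforming.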

Eye-tracking is another technique that could very easily be used.

So, an *actual* gaydar?

Incidentally, due to the ambiguity of the word “determine”, I thought this thread would be about something else entirely.

Maybe they can cross that AI with this one and come up with…well, something.

The argument that AI can spot patterns we would never ordinarily see because of its ability to examine huge datasets is certainly true, but the reliability of the correlation between such patterns in people’s faces and specific attributes is extremely questionable, IMHO. Some things, sure – how many of us have not said “I could tell just by looking at him that he was an asshole” and been pretty consistently accurate? But I think we’re picking up on a lot of other cues besides just facial expression. Which of course an AI could do, too. But sexual orientation? Sounds like woo to me, and I’ll go out on a limb here and suggest that the researchers’ statement is self-promotional sensationalism more than science.

IBM’s experience with AI backs you up. An April IEEE Spectrum article explores the many failures and few successes of applying Watson to real world problems as opposed to games. It turns out human beings still can’t be duplicated. From the article:

Outside of corporate headquarters, however, IBM has discovered that its powerful technology is no match for the messy reality of today’s health care system. And in trying to apply Watson to cancer treatment, one of medicine’s biggest challenges, IBM encountered a fundamental mismatch between the way machines learn and the way doctors work…Looking beyond images, however, even today’s best AI struggles to make sense of complex medical information. And encoding a human doctor’s expertise in software turns out to be a very tricky proposition…

I could easily believe it if it generated stimuli and gauged human responses to them (like showing a picture of an attractive man and an attractive woman side-by-side, and tracking eye movements to see which one the subject looked at more). But just from candid photographs is harder to believe. And you’ve also got problems in training the databases: How do you label the training samples, and how confident can you be in the labeling of them? People’s self-identifications of their orientation aren’t always accurate.
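The side-by-side experiment boils down to a very simple measurement: of all the gaze samples the tracker records, what fraction land on each image? A hypothetical sketch (the gaze coordinates and screen size are made up for illustration):

```python
# Hypothetical dwell-time comparison for a side-by-side stimulus:
# count what fraction of gaze samples fall on each half of the screen.
def dwell_fractions(gaze_x, screen_width):
    """Return (left, right) fractions of gaze samples by screen half."""
    left = sum(1 for x in gaze_x if x < screen_width / 2)
    total = len(gaze_x)
    return left / total, (total - left) / total

# Made-up gaze trace (x coordinates in pixels): 8 of 10 samples
# land on the left half of a 1920-pixel-wide screen
samples = [120, 200, 310, 150, 980, 400, 220, 1500, 90, 300]
left_frac, right_frac = dwell_fractions(samples, screen_width=1920)
print(left_frac, right_frac)  # → 0.8 0.2
```

Of course, the hard part isn't the arithmetic, it's everything around it: calibrating the tracker, controlling for which image is simply more visually salient, and, as noted above, trusting the labels you train against.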