Heard about Carruthers? Caught shagging a sheep!
My god … a female sheep, I trust?
Of course! There’s nothing queer about Carruthers!

###

From my childhood. Where does this silly stuff reside over the years, waiting for an opportunity to make itself known again? In this case, the opportunity was a recent report in The Economist that AI can recognize gayness or straightness from photographs much better than actual people can.

Briefly, Stanford researchers Michal Kosinski and Yilun Wang narrowed down some 300,000 photos of about 75,000 men and women found on an open-access dating website (they don’t say which one) to 35,000 photos of nearly 15,000 individuals with male, female, gay and straight evenly represented. Using facial recognition software (“VGG-Face”) to analyze this mess of data, they came up with an algorithm that correctly identified who was gay or straight 81 percent of the time for men, 71 percent for women, from a single photo. (Shown five photos of an individual, the computer could do even better: 91 percent/83 percent for men/women.)
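For the technically inclined: as I understand it, the recipe boils down to turning each face photo into a long list of numbers (the VGG-Face "embedding") and then training a perfectly ordinary classifier to separate the two labels. Here's a toy sketch of that second step only; the embeddings below are random stand-ins rather than real VGG-Face output, and the data sizes, classifier, and scoring are my own assumptions, not anything taken from the paper.

```python
# Toy sketch: train a simple classifier on face "embeddings".
# The real study fed VGG-Face features into its model; here synthetic
# vectors stand in for those features, and the labels are generated
# from a hidden direction in the data so there is something to learn.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

n_photos, n_features = 5000, 128                 # stand-in sizes, not the study's
X = rng.normal(size=(n_photos, n_features))      # pretend face embeddings
hidden_direction = rng.normal(size=n_features)   # signal the labels follow
noise = rng.normal(scale=5.0, size=n_photos)
y = (X @ hidden_direction + noise) > 0           # noisy binary labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]

# ROC AUC: one common way to score such a classifier -- how often it
# ranks a randomly chosen positive example above a negative one.
print(f"AUC: {roc_auc_score(y_test, scores):.2f}")
```

Averaging the classifier's scores over several photos of the same person is, presumably, how the five-photo numbers improve on the single-photo ones.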

Which is pretty impressive. My gay friends may argue the point, but on average people correctly identify sexual orientation only about 60 percent of the time. The software relies on subtle differences in features such as forehead and jaw size to make its predictions, which lends support to the idea that we're gay or straight in the womb.

(Incidentally, this particular study was limited to just gay or straight, skipping the rest of the LGBTQIA etc. variants, and didn’t include African Americans.)

Taken at face value (the report was peer-reviewed for publication in the Journal of Personality and Social Psychology), this is scary stuff. Plenty of regimes the world over still criminalize homosexuality, while plenty of gay folks in less repressive countries opt to stay in the closet. ("Is my husband gay?" outranks "Is my husband cheating?" in Google searches.)

And, of course, this can only be the start of a slippery slope. Once individuals with less sense of ethical responsibility than academic researchers get hold of this, who knows where it'll end up? Lead researcher Kosinski recognizes the possibility of future abuse of predictive face-recognition systems, telling The Economist, "I hope that someone will go and fail to replicate this study … I would be the happiest person in the world if I was wrong."

Me, I’m with Carruthers — nothing queer about me (I can’t say sheep turn me on, either). But — but — suppose I learned that actually, this 75-year-old red-blooded hetero male has been living in self-deception, and that — according to the best and brightest computer algorithms — I’m a poofter? Oh, one gets over it, right? Meanwhile, I worry that computers will go way beyond straight/gay. Once they’ve figured out which way we’ll vote by checking our eyeballs, we won’t even have to bother to vote; it’ll be done automatically. Will my intended spouse be faithful? Ask his or her computer. And someone has already linked this research to the 2002 movie Minority Report, inspired by a Philip K. Dick story. In that future, you prevent violent crime simply by predicting in advance who’s going to murder whom and having the “Precrime” police unit step in before the deadly event. (So much for free will!)

As I say, this story portends a slippery slope, not one that I plan on sliding down any time soon. Not even with lube.