Professors create 'gaydar' machine with AI
Stanford University professors use AI to identify sexual orientation. Source: Shutterstock.

Just as society is moving towards viewing sexual orientation as a fluid spectrum, two Stanford University researchers have found face recognition technology can categorize people as ‘gay’ or ‘straight’ based on a single picture.

Psychologist Michal Kosinski and his colleague Yilun Wang, at the Stanford Graduate School of Business, found that artificial intelligence can reportedly differentiate sexual orientation with 81 percent accuracy from a single photograph.

The study found that when pictures of two white men from dating sites, one gay and one straight, are compared, an existing algorithm can identify which is which with a high rate of success.

Accuracy increases to 91 percent for white men and 83 percent for white women when five pictures of each person are analysed.

An existing facial recognition model, called VGG Face, converts each photo into a numerical representation of its features. That data is then fed into a logistic regression model, which looks for correlations between features in the photos and the person’s stated sexual orientation.
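
The sketch below illustrates that two-stage structure, pretrained face features followed by logistic regression, under loose assumptions: the get_face_embedding() helper is a hypothetical stand-in for the VGG-Face extractor, and the data is synthetic. It is not the study’s actual code.

```python
# Minimal sketch of the pipeline described above: a pretrained face model
# turns each photo into a feature vector, and a logistic regression looks
# for correlations between those features and self-reported labels.
# NOTE: get_face_embedding() is a hypothetical placeholder, not VGG-Face.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def get_face_embedding(photo_path: str) -> np.ndarray:
    """Placeholder: in practice a pretrained network (e.g. VGG-Face)
    would map a face photo to a fixed-length feature vector."""
    rng = np.random.default_rng(abs(hash(photo_path)) % (2**32))
    return rng.normal(size=4096)  # VGG-Face-style 4096-dim descriptor

# Toy data: photo paths paired with self-reported binary labels.
photos = [f"photo_{i}.jpg" for i in range(200)]
labels = np.array([i % 2 for i in range(200)])

# Stage 1: extract a feature vector per photo.
X = np.stack([get_face_embedding(p) for p in photos])

# Stage 2: fit a logistic regression on the features.
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.25)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

On toy data like this the accuracy hovers around chance; the point is only the shape of the pipeline, not the result.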

It is uncertain exactly which factors the AI program relies on, whether individual attributes such as facial features or something as incidental as image quality.

However, The New York Times reported that Kosinski and Wang claimed the algorithm was picking up on fixed facial features and grooming choices.

Although the researchers say the findings confirm the hypothesis that AI can identify sexual orientation, critics have questioned the ethical soundness and scope of the study.

LGBT+ rights campaign group GLAAD released a statement about the study: “This research isn’t science or news, but it’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of color, transgender people, older individuals, and other LGBTQ people who don’t want to post photos on dating sites.”

The researchers purposely chose not to analyse pictures of people whose gender did not align with their sex, non-white people, or natural photos not uploaded to dating sites.

The study also considered only two sexual orientations, straight or gay, thereby excluding the myriad of people who identify somewhere between or beyond those two labels.

Additionally, being able to obtain personal or sensitive information by simply running a picture through a program may have dangerous implications for society.

With 72 countries still criminalizing homosexual relations, the program may be used to prosecute those who are not straight.

https://twitter.com/joshraclaw/status/917891601770450944

But, as LGBTQ Nation noted: “[Kosinski and Wang] used publicly available information, common software, and techniques that people working in the field are already familiar with.

“They invented nothing new, they just set out to show what things that are already out there can do.

“So instead of giving brutal regimes tools to attack gay and lesbian people, they’re giving us a heads up that these tools are out there.”
