Study participants rely on gendered facial cues and stereotypes about gay people when making their judgments, and they reliably misjudge the sexual orientation of people who run counter to those stereotypes. Building and publicizing this kind of software is therefore itself controversial, given concerns that it could encourage harmful applications. Later studies found that gaydar judgments made from the face alone were also accurate at rates greater than chance. Viewers rated targets' sexual orientation on the same scale the targets had used, and the researchers found a significant correlation between where people placed themselves on the scale and where viewers perceived them to be. Contrary to the hype surrounding the study, the results suggest that walking style and body shape do not give away sexual orientation. Rule argued that it was still important to develop and test this technology.
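The correlation the researchers report between self-reported and perceived ratings is typically a Pearson correlation. As a minimal sketch of that computation (the rating values below are hypothetical, not data from the studies):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical ratings on a 1-7 scale: how targets rated themselves
# versus how viewers perceived them.
self_reported = [1, 2, 2, 4, 5, 6, 7, 3]
perceived = [1, 1, 3, 4, 4, 6, 6, 2]

r = pearson_r(self_reported, perceived)  # close to 1 means strong agreement
```

A value of r near zero would mean perceivers' ratings carry no information about self-placement; a significantly positive r is what the studies describe.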
While the findings have clear limits when it comes to gender and sexuality (people of color were not included in the study, and there was no consideration of transgender or bisexual people), the implications for artificial intelligence (AI) are vast and alarming. Detailed acoustic analyses have identified a number of vocal cues that listeners use, one of which is the way gay and straight men pronounce "s" sounds.