Advances in AI are used to spot signs of sexuality | The Economist

Advances in AI are used to spot signs of sexuality; staff; In The Economist; 2017-09-09.
Teaser: Machines that read faces are coming

tl;dr → The machines have gaydar now; people don’t (cf. Studies. That. Show).
Filed under: Not. Juvenile. At. All. <giggle>Interesting, if true.</giggle>

Original Sources
Michal Kosinski, Yilun Wang (Stanford University); Deep neural networks are more accurate than humans at detecting sexual orientation from facial images; self-published; Center for Open Science; 2017-09? ←zn79k; forthcoming (maybe) in the Journal of Personality and Social Psychology; Authors’ Notes, last updated 2017-09-10 (when viewed on 2017-09-10) ←notes.

Abstract:

We show that faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain. We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation. Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 71% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person. Facial features employed by the classifier included both fixed (e.g., nose shape) and transient facial features (e.g., grooming style). Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles. Prediction models aimed at gender alone allowed for detecting gay males with 57% accuracy and gay females with 58% accuracy. Those findings advance our understanding of the origins of sexual orientation and the limits of human perception. Additionally, given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women.
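
The jump from 81% to 91% (men) and 71% to 83% (women) comes from pooling several photos of the same person. The abstract does not say how the per-image scores are combined, so the sketch below assumes the simplest rule, averaging the per-image probabilities before thresholding; the numbers are placeholders, not the paper’s.

import numpy as np

def pool_predictions(image_probs, threshold=0.5):
    # image_probs: per-image classifier probabilities for one person.
    # Averaging them is an assumed pooling rule, not necessarily the paper's.
    return float(np.mean(image_probs)) >= threshold

# Five noisy per-image scores pooled into one steadier call (placeholder values).
print(pool_predictions(np.array([0.62, 0.55, 0.71, 0.48, 0.66])))  # True (mean ≈ 0.60)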

Mentions

  • Physiognomy
  • Deep Neural Network (DNN)
  • Face++
  • Trained on 130,741 images of 36,630 men and 170,360 images of 38,593 women, downloaded from <where?>a popular American dating website</where?>
  • VGG-Face, a deep neural network for face recognition, used here to extract facial features (Parkhi, Vedaldi, & Zisserman, 2015).
  • <quote>[They] used a simple prediction model, logistic regression, combined with a standard
    dimensionality-reduction approach: singular value decomposition (SVD). SVD is similar to
    principal component analysis (PCA), a dimensionality-reduction approach widely used by social
    scientists. The models were trained separately for each gender</quote>; a sketch of this
    pipeline follows below.
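
The quoted pipeline is simple enough to sketch. Below is a minimal, hypothetical reconstruction in Python with scikit-learn: it assumes the images have already been run through a VGG-Face-style network to produce fixed-length embeddings, and the feature matrices, labels, and hyperparameters are placeholders rather than the paper’s data or settings.

import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def build_classifier(n_components=500):
    # Dimensionality reduction via SVD, then plain logistic regression,
    # mirroring the quoted description; n_components is illustrative only.
    return make_pipeline(
        TruncatedSVD(n_components=n_components, random_state=0),
        LogisticRegression(max_iter=1000),
    )

# Placeholder data standing in for per-gender VGG-Face-style embeddings (4096-d)
# and 0/1 orientation labels; random noise, so expect chance-level scores.
rng = np.random.default_rng(0)
X_men, y_men = rng.normal(size=(2000, 4096)), rng.integers(0, 2, size=2000)
X_women, y_women = rng.normal(size=(2000, 4096)), rng.integers(0, 2, size=2000)

# "The models were trained separately for each gender."
for label, X, y in [("men", X_men, y_men), ("women", X_women, y_women)]:
    auc = cross_val_score(build_classifier(), X, y, cv=5, scoring="roc_auc").mean()
    print(f"{label}: cross-validated AUC ≈ {auc:.2f}")

Using make_pipeline keeps the SVD fit inside each cross-validation fold, so the reduction never sees held-out data; that is a design choice of this sketch, not a detail quoted from the paper.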
