AI can predict your political beliefs using facial recognition
24 Apr 2024


Recent studies have underscored the potential privacy risks associated with facial recognition technologies.

A report in the journal American Psychologist reveals that artificial intelligence (AI) can accurately determine an individual's political beliefs from images of their neutral, expressionless faces.

The accuracy of this prediction is comparable to how well job interviews predict job performance, or how strongly alcohol consumption is linked to aggressive behavior.


How was the research conducted?
Steps


The research was led by Michal Kosinski, a faculty member specializing in organizational behavior at Stanford University's Graduate School of Business.

Kosinski explained that 591 participants first completed a political orientation survey, after which the AI system generated a numerical representation, or "fingerprint," of each participant's face.

These fingerprints were then cross-referenced with a database of the survey responses to predict each participant's political ideology.
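Purely as an illustration of that data structure (the field names, scale, and descriptor length below are assumptions, not the study's actual format), the pairing can be pictured as a table keyed by participant:

```python
# Illustrative only: one record per participant linking the survey answer to
# the face "fingerprint"; field names and sizes are assumed, not the study's.
from dataclasses import dataclass
import numpy as np

@dataclass
class ParticipantRecord:
    participant_id: str
    political_orientation: float   # survey score (assumed: higher = more conservative)
    face_fingerprint: np.ndarray   # numerical descriptor of the participant's face

rng = np.random.default_rng(0)
database = {
    f"p{i:03d}": ParticipantRecord(
        participant_id=f"p{i:03d}",
        political_orientation=float(rng.normal()),   # placeholder survey score
        face_fingerprint=rng.normal(size=512),       # placeholder descriptor
    )
    for i in range(591)                              # 591 participants, as in the study
}
```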


Kosinski emphasizes importance of protecting personal data
Concerns


Kosinski expressed concerns about the extent of personal information people inadvertently reveal by posting their photos online.

He stated, "I think that people don't realize how much they expose by simply putting a picture out there."

He emphasized the importance of protecting personal data such as sexual orientation, political beliefs, and religious views.


Requirements for the study
Process


The study required participants to prepare in a specific way so that their facial images would be clear. They wore black T-shirts, removed jewelry and cosmetics, shaved off any facial hair, and tied back their hair.

The researchers ran these images through the facial recognition algorithm VGGFace2 to extract unique face descriptors that remain consistent across different pictures of the same person.

These descriptors were compared with those in a database to find matches.
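The study's own code is not reproduced here; the sketch below shows a comparable descriptor-extraction step using the open-source facenet-pytorch package, whose InceptionResnetV1 network is pretrained on the VGGFace2 dataset. It is a stand-in under those assumptions, not the authors' actual pipeline.

```python
# A rough sketch of the descriptor-extraction step, assuming the open-source
# facenet-pytorch package (pip install facenet-pytorch); its InceptionResnetV1
# network is pretrained on the VGGFace2 dataset and stands in here for the
# study's actual pipeline.
import torch
from PIL import Image
from facenet_pytorch import MTCNN, InceptionResnetV1

mtcnn = MTCNN(image_size=160)                              # detects and crops the face
resnet = InceptionResnetV1(pretrained="vggface2").eval()   # produces a 512-number descriptor

def face_descriptor(image_path: str) -> torch.Tensor:
    """Return a 512-dimensional descriptor for the face in the image."""
    img = Image.open(image_path).convert("RGB")
    face = mtcnn(img)                                      # cropped, normalized face tensor
    if face is None:
        raise ValueError(f"No face detected in {image_path}")
    with torch.no_grad():
        return resnet(face.unsqueeze(0)).squeeze(0)

# Descriptors of different photos of the same person land close together,
# so matching against a database reduces to a nearest-neighbour search
# (e.g. by cosine similarity between descriptors).
```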


Study warns that biometric surveillance technologies are more threatening
Predictions


The team used linear regression to map the facial descriptors onto a political orientation scale, then used that mapping to predict the political orientation of faces the model had not seen before.
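A minimal sketch of this prediction step, assuming the descriptors and survey scores are already available as arrays; scikit-learn's cross-validated linear regression stands in for the authors' exact setup, and the data below is a random placeholder:

```python
# A minimal sketch of the prediction step: linear regression from face
# descriptors to a political orientation score, evaluated on held-out faces.
# X and y below are random placeholders; in the study they would be the
# VGGFace2 descriptors and the survey-based orientation scores.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(591, 512))   # placeholder descriptors (one row per participant)
y = rng.normal(size=591)          # placeholder political orientation scores

# Each participant's score is predicted by a model trained on the others,
# so every prediction is made for a face the model has not seen.
y_pred = cross_val_predict(LinearRegression(), X, y, cv=10)

r, _ = pearsonr(y, y_pred)        # agreement between predicted and reported scores
print(f"Cross-validated correlation: r = {r:.2f}")
```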

The results indicated that individuals with conservative views often had larger lower faces.

These findings highlight the need for academics, policymakers, and the public to address the potential privacy risks posed by facial recognition technology.

The study warned that "widespread biometric surveillance technologies are more threatening than previously thought."
