Recent advances in facial recognition technology have some researchers worried that privacy, among other things, could come under serious threat.

Alarm bells went off after a recent study indicated that artificial intelligence can predict a person’s political orientation from images of their expressionless face.

“We demonstrate that political orientation can be predicted from neutral facial images by both humans and algorithms, even when factors like age, gender, and ethnicity are accounted for,” the study, published in the journal American Psychologist, stated.

“This indicates a connection between political leanings and inherent facial characteristics, which are largely beyond an individual’s control. Our findings underscore the urgency for scholars, the public, and policymakers to recognize and address the potential risks of facial recognition technology to personal privacy,” the authors added, warning that the “widespread use of facial recognition technology poses serious challenges to privacy and civil liberties.”

“I think that people don’t realize how much they expose by simply putting a picture out there,” the study’s lead author Michal Kosinski told Fox News Digital.

In the study, facial images of 591 participants were captured after they had filled out a political orientation questionnaire. Each face was then reduced to a numerical “fingerprint,” which, according to Kosinski, AI compared against a database of questionnaire responses in order to predict the subjects’ views.
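Neither the article nor the study excerpt includes code, so the sketch below is a hypothetical Python reconstruction of the pipeline Kosinski describes: reduce each face to a numerical descriptor, then learn a mapping from descriptors to questionnaire-derived labels. The embedding model (facenet-pytorch’s InceptionResnetV1, a network trained on the VGGFace2 dataset), the logistic-regression classifier, and all file names and labels are assumptions, not the study’s actual code or data.

```python
# Hypothetical sketch of the pipeline described in the article:
# face image -> numerical "fingerprint" (embedding) -> predicted
# political orientation label from the questionnaire.
# Assumes the facenet-pytorch and scikit-learn packages; the study's
# own models, preprocessing, and data are not public here.
import numpy as np
import torch
from PIL import Image
from facenet_pytorch import MTCNN, InceptionResnetV1
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Face detector plus a ResNet trained on the VGGFace2 dataset.
mtcnn = MTCNN(image_size=160)
resnet = InceptionResnetV1(pretrained="vggface2").eval()

def face_fingerprint(path: str) -> np.ndarray:
    """Crop the face and return its 512-dimensional descriptor."""
    face = mtcnn(Image.open(path).convert("RGB"))  # aligned face tensor
    with torch.no_grad():
        return resnet(face.unsqueeze(0)).squeeze(0).numpy()

# Hypothetical file paths and questionnaire-derived labels for the
# 591 participants (0 = liberal, 1 = conservative).
image_paths = [f"participant_{i:03d}.jpg" for i in range(591)]
labels = np.loadtxt("questionnaire_labels.txt")

X = np.stack([face_fingerprint(p) for p in image_paths])
clf = LogisticRegression(max_iter=1000)

# Cross-validation scores each participant with a model that never saw
# their own face, mirroring the study's out-of-sample prediction claim.
print(cross_val_score(clf, X, labels, cv=10).mean())
```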

“We know that people’s sexual orientation, political orientation, religious views should be protected. It used to be different. In the past, you could enter anybody’s Facebook account and see, for example, their political views, the likes, the pages they follow. But many years ago, Facebook closed this because it was clear for policymakers and Facebook and journalists that it is just not acceptable. It’s too dangerous,” he said.

Kosinski, an associate professor of organizational behavior at Stanford University’s Graduate School of Business, noted that a Facebook user’s profile photo can still be publicly visible.

“But you can still go to Facebook and see anybody’s picture. This person never met you, they never allowed you to look at a picture, they would never share their political orientation … and yet, Facebook shows you their picture, and what our study shows is that this is essentially to some extent the equivalent to just telling you what their political orientation is,” he said.

The study followed a controlled procedure to photograph the participants:

Participants wore a black T-shirt adjusted using binder clips to cover their clothes. They removed all jewelry and—if necessary—shaved facial hair. Face wipes were used to remove cosmetics until no residues were detected on a fresh wipe. Their hair was pulled back using hair ties, hair pins, and a headband while taking care to avoid flyaway hairs. Participants sat up straight with their lower back pressed against the chair’s back, their upper back off the chair, feet flat on the floor, and hands on the lap. We used a neutral background.

The final images were analyzed by the facial recognition algorithm VGGFace2 to ascertain “face descriptors, or a numerical vector that is both unique to that individual and consistent across their different images.”
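To illustrate what “unique to that individual and consistent across their different images” means in practice, the snippet below compares descriptors by cosine similarity: two photos of the same person should score much closer to 1.0 than photos of two different people. The file names are hypothetical, and `face_fingerprint` is the embedding helper assumed in the sketch above, not a function from the study.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face descriptors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical image files; reuses face_fingerprint() from the earlier sketch.
same_person = cosine_similarity(face_fingerprint("alice_1.jpg"),
                                face_fingerprint("alice_2.jpg"))
different_people = cosine_similarity(face_fingerprint("alice_1.jpg"),
                                     face_fingerprint("bob_1.jpg"))

# For a well-trained model, same_person is typically far higher than
# different_people: that stability is what makes the descriptor a "fingerprint."
print(f"same person: {same_person:.3f}, different people: {different_people:.3f}")
```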

The study authors warned that “widespread biometric surveillance technologies are more threatening than previously thought,” adding that “Our results, suggesting that stable facial features convey a substantial amount of the signal, imply that individuals have less control over their privacy.”

According to Kosinski, algorithms “can be very easily applied to millions of people very quickly and cheaply,” and he told Fox News Digital that the study is “more of a warning tale” about the current technology “that is in your phone and is very widely used everywhere.”

The study’s authors noted that even though the algorithm won’t allow “conclusively determining one’s political views,” even “moderately accurate algorithms can have a tremendous impact when applied to large populations in high-stakes contexts.”
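That “moderate accuracy, large population” point is easy to make concrete with back-of-the-envelope arithmetic. The population and accuracy figures below are purely illustrative assumptions, not the study’s reported results: even a classifier only somewhat better than a coin flip, applied to millions of profile photos, produces a very large absolute number of correct labels.

```python
# Illustrative only: these figures are assumptions, not the study's results.
population = 10_000_000   # profile photos scanned
accuracy = 0.60           # hypothetical per-person accuracy (chance = 0.50)

correct = int(population * accuracy)
excess_over_chance = int(population * (accuracy - 0.50))

print(f"{correct:,} people labeled correctly")
print(f"{excess_over_chance:,} more correct labels than random guessing")
```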

“Scholars, the public, and policymakers should take notice and consider tightening policies regulating the recording and processing of facial images,” they concluded.

Frieda Powers
