LGBTQ Groups Condemn “Dangerous And Flawed” Facial Recognition Intended To Predict Your Sexuality

By: James Felton/IFL Science

LGBTQ groups have condemned as “dangerous” an algorithm developed by Stanford University to predict whether you are gay or straight based on your face.

Stanford claims the tech, which uses facial recognition, can distinguish between gay and straight men 81% of the time, and between gay and straight women 74% of the time. Several prominent LGBTQ groups have issued a joint statement calling the research “dangerous and flawed” as well as “junk science”.

The main concern is that the technology could be used to cause “harm to LGBTQ people around the world”; the groups also point to problems with the quality of the research itself.

The Stanford researchers gathered 35,000 photos that had been publicly posted on a US dating website and analyzed them using a “deep neural network”, software that learns to pick out visual features. The algorithm was fed the self-reported orientation of the people in the photographs and asked to predict sexuality from the photos alone.
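In broad terms, this is supervised image classification: a model is trained on photos labeled with each person’s self-reported orientation, then asked to label photos it has not seen. Below is a minimal, hypothetical sketch of that general technique in Python, fine-tuning a pretrained ResNet from torchvision. It is not the authors’ actual model (the paper extracted features with a face-recognition network and fed them to a simple classifier), and the faces/train folder layout is an assumption for illustration.

```python
# Hypothetical sketch of supervised classification on labeled face photos.
# NOT the Stanford authors' model; paths and data layout are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard ImageNet preprocessing for a pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumed folder layout: faces/train/<label>/<image>.jpg, one folder per
# self-reported label. ImageFolder maps folder names to class indices.
train_set = datasets.ImageFolder("faces/train", transform=preprocess)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Swap the classifier head of a pretrained ResNet for a 2-way output.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one pass over the training photos
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```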

The research, published in the Journal of Personality and Social Psychology, found that the AI was better than humans at recognizing whether someone was heterosexual. Human judges, the researchers found, could identify orientation around 54% of the time for women and 61% of the time for men.

The authors say that when the algorithm is given five photos of the same person to review, the accuracy goes up to 83% for women and 91% for men.
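The paper does not publish its aggregation code, but one standard way extra photos raise accuracy is to average the model’s per-photo probabilities for a person before deciding, smoothing out the noise in any single image. A hedged sketch, reusing the hypothetical model above:

```python
# One common aggregation scheme (assumed here, not taken from the paper):
# average class probabilities across all photos of one person, then pick
# the most likely class. `photos` is a stacked tensor of preprocessed
# images for a single person, e.g. shape (5, 3, 224, 224) for five photos.
import torch

@torch.no_grad()
def predict_person(model, photos):
    model.eval()
    probs = torch.softmax(model(photos), dim=1)  # (n_photos, 2)
    return probs.mean(dim=0).argmax().item()     # class with highest avg prob
```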

However, LGBTQ groups and other commentators have said that the research is flawed and has the potential to be horribly misused.

“At a time where minority groups are being targeted, these reckless findings could serve as a weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous,” Jim Halloran, GLAAD’s Chief Digital Officer, said in a statement.

The groups are concerned that such technology, whether it’s accurate or not, could be used by brutal regimes to persecute gay people or people they suspect of being gay.

The researchers responded with a statement of their own: “GLAAD and HRC representatives’ knee-jerk dismissal of the scientific findings puts at risk the very people for whom their organizations strive to advocate.”

Ashland Johnson, HRC director of public education and research, said: “Stanford should distance itself from such junk science rather than lending its name and credibility to research that is dangerously flawed and leaves the world – and in this case, millions of people’s lives – worse and less safe than before.”

However, the researchers said they “put much effort into ascertaining that our data was as valid as possible, and there are no reasons to believe that there are gross inaccuracies.”

In their joint statement, GLAAD and the Human Rights Campaign noted that, among other problems, the research only looked at white people.

“Technology cannot identify someone’s sexual orientation. What their technology can recognize is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar. Those two findings should not be conflated.”

The groups urged news organizations to point out the flaws in the study when reporting on it, after the research spread across social media over the weekend.

“This research isn’t science or news,” Jim Halloran said. “It’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of color, transgender people, older individuals, and other LGBTQ people who don’t want to post photos on dating sites.”

Other problems listed by GLAAD and the Human Rights Campaign were that the study assumed there were only two orientations, ignoring bisexual individuals; that it looked only at white people of a certain age; and that it relied on photographs from dating sites and reviewed only superficial characteristics.

“It is not surprising that gay people (out, white, similar age) who choose to go on dating sites post photos of themselves with similar expressions and hairstyles.”

The researchers involved in the study acknowledged that such technology could be misused.

“Given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women.”

They then added: “Let’s be clear: Our findings could be wrong. In fact, despite evidence to the contrary, we hope that we are wrong. However, scientific findings can only be debunked by scientific data and replication, not by well-meaning lawyers and communication officers lacking scientific training.”

GLAAD and HRC say they raised these concerns in a call with Stanford several months ago, and that none of the flaws they flagged were addressed.

