
Creepy new technology could put rainbow communities at risk

3:56 pm on 29 November 2017

Not like for like. Photo: Flickr/Carrie Gong

Two recent international studies reveal frightening possibilities that could put LGBTQIA+ people at risk, rainbow groups say.

The studies themselves may also be perpetuating harmful stereotypes.

In September, Stanford University researchers announced they had developed a facial-recognition algorithm that could correctly distinguish between gay and straight men 81 percent of the time, and between gay and straight women 74 percent of the time.

Earlier this month, the iPhone X was released featuring new facial recognition technology.

“We show that faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain,” the Stanford researchers said.
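In broad terms, systems like this feed numerical "embeddings" of face photos into a simple statistical classifier. The sketch below is purely illustrative and is not the Stanford model: it uses scikit-learn, and random vectors stand in for the embeddings a pretrained face-recognition network would produce.

```python
# Purely illustrative sketch -- NOT the Stanford researchers' code.
# Random vectors stand in for the face embeddings a pretrained
# face-recognition network would produce; the "signal" is synthetic
# so the demo runs end to end.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

X = rng.normal(size=(1000, 128))   # stand-in face embeddings
w = rng.normal(size=128)           # hidden synthetic signal
y = (X @ w + rng.normal(scale=4.0, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```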

A few days ago, researchers at Columbia Business School in New York published a study that said someone’s sexual orientation could be identified from a handful of Facebook likes.

For instance, the researchers said liking pages related to Lady Gaga, True Blood and Harry Potter could imply someone is gay. Information about what people click is increasingly being used by advertisers.
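Models of this kind typically treat each page-like as a binary feature and fit a simple classifier over them. Here is a minimal, hypothetical sketch using scikit-learn; the users, pages and labels are invented purely for illustration and are not the Columbia study's data or model.

```python
# Minimal sketch of a likes-based classifier -- illustrative only,
# not the Columbia study's model. Users, pages and labels are made up.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Each user is a set of liked pages, encoded as binary features.
users = [
    {"Lady Gaga": 1, "True Blood": 1, "Harry Potter": 1},
    {"Lady Gaga": 1, "Harry Potter": 1},
    {"Top Gear": 1, "ESPN": 1},
    {"ESPN": 1, "True Blood": 1},
]
labels = [1, 1, 0, 0]  # hypothetical target attribute

vec = DictVectorizer()
X = vec.fit_transform(users)

clf = LogisticRegression().fit(X, labels)

# Which likes push the prediction hardest?
for page, coef in zip(vec.get_feature_names_out(), clf.coef_[0]):
    print(f"{page}: {coef:+.2f}")
```

The per-page coefficients are what make the "cloaking" idea discussed below workable: they show which individual likes push the model's inference hardest.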

This is an invasion of privacy and could put people at risk, said Tabby Besley, founder of the national charity InsideOUT.

“Let’s say a homophobic parent comes up behind their teenager, who is scrolling through Facebook, and sees an LGBTQIA+ ad on their newsfeed,” she said.

“The next thing is that child is kicked out of home because their parents assumed or found out they're gay - it's not an unlikely story.”

Besley said technology can be an incredibly powerful tool for LGBTQIA+ communities.

“We're more likely to access support online than offline and with so many people in our community experiencing mental health challenges as a result of discrimination, a lot could be done with technology to improve our mental health,” she said.

“However, as demonstrated in these studies, technology can also put people at risk of further discrimination and harm. That risk can't be taken lightly.”

The researchers suggest a tool be added to Facebook that allows people to “cloak” their behaviour - in other words, hide what they like from advertisers.
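One way to picture cloaking in code: greedily hide the likes that contribute most to the unwanted inference until the model's confidence falls below a threshold. This reuses the invented toy setup above; the greedy strategy and the 0.5 threshold are assumptions for illustration, not the study's actual mechanism.

```python
# Sketch of "cloaking" -- hide the likes that most drive an unwanted
# inference until the model's confidence drops below a threshold.
# Illustrative only; same invented toy setup as the previous sketch.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

users = [
    {"Lady Gaga": 1, "True Blood": 1, "Harry Potter": 1},
    {"Lady Gaga": 1, "Harry Potter": 1},
    {"Top Gear": 1, "ESPN": 1},
    {"ESPN": 1, "True Blood": 1},
]
labels = [1, 1, 0, 0]

vec = DictVectorizer()
clf = LogisticRegression().fit(vec.fit_transform(users), labels)

def cloak(user_likes, threshold=0.5):
    """Greedily hide the highest-impact likes until the predicted
    probability of the sensitive attribute falls below `threshold`."""
    likes = dict(user_likes)
    coefs = dict(zip(vec.get_feature_names_out(), clf.coef_[0]))
    # Most revealing likes first: largest positive coefficient.
    for page in sorted(likes, key=lambda p: coefs.get(p, 0.0), reverse=True):
        if clf.predict_proba(vec.transform([likes]))[0, 1] < threshold:
            break
        likes.pop(page)  # this like stays hidden from advertisers

    return likes

print("likes left visible:", cloak({"Lady Gaga": 1, "True Blood": 1,
                                    "Harry Potter": 1}))
```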

The study on facial recognition, for which researchers analysed more than 35,000 facial images posted on a dating website, also deeply worries Besley.

Apple introduces Face ID. Photo: Screenshot: Apple/CNET

“Technology like this, regardless of how accurate or scientific it is, has the potential to be used in really harmful ways against LGBTQIA+ people - particularly in countries where being gay is still illegal or heavily discriminated against - software like this could be used in an incredibly dangerous and life-threatening way,” she said.

“Even in New Zealand, you can imagine how, if people had access to something like this, they could out others against their will, or use it to discriminate when employing staff or even target hate crimes.”

The Stanford University researchers said the purpose of devising the facial recognition algorithm was to “expose a threat to the privacy and safety of gay men and women”.

They said their findings showed that gay men and women often have “gender-atypical” features: gay men appear more feminine, often with narrower jaws, longer noses and larger foreheads, while gay women often have larger jaws and smaller foreheads.

The study said the algorithm's lower success rate for women suggests female sexual orientation is more fluid.

After being criticised by the Human Rights Campaign, which expressed concerns similar to Besley's, co-author Michal Kosinski told The Guardian that technology similar to his already exists.

“One of my obligations as a scientist is that if I know something that can potentially protect people from falling prey to such risks, I should publish it,” he said.

He has not released his algorithm or corresponding program to the public.

Nevertheless, a spokesperson for the organisation Rainbow Youth said such technology is very concerning.

“It follows stereotypes about our community that I think are unhelpful - such as that gay men are more feminine or that fluidity is more common in women,” she said.

“The reality of our community is much more complex than these stereotypes reflect … relating biological determinants of sexuality to outcomes of studies like this is irresponsible and could have harmful impacts on queer and gender diverse people.”

Besley said the study feeds into “assumptions about gay people always looking a particular way - which we know are not true”.

Both Rainbow Youth and Besley were also concerned about other aspects of both studies.

Tabby Besley, 3rd from left, founded InsideOUT five years ago. Photo: Supplied

“They completely erase huge groups of people from the rainbow community, including people of colour, bisexual people, transgender people and anyone over 40, making them inherently inaccurate,” said Besley.

Cyber security organisation Netsafe said the studies raise questions about the ethics of experimentation.

“It’s not only around why are we doing it and what the value for society may be, but also what the unintended consequences are,” said Netsafe’s director of technology and partnerships, Sean Lyons.

He said there is potential for people to misuse this type of information to harm others, intentionally or not.

Regarding the information that websites such as Facebook collect, he said people should have the right to remain private.

“Before we share information about ourselves ... we need to consider that we are giving away insights about ourselves and our lives. Single pieces of information on their own may not say much about us, but when you start to look at all of the information we share as a whole, you start to see the bigger picture.”