Eticas

Is AI for or against the LGBTQ+ community?

In 2011, an Android app offered a test of twenty stereotype-based questions through which parents could reportedly find out the sexual orientation of their children.

There were questions such as “Does he like to dress well and pay close attention to his outfits and brands?”, “Does he love football?”, “Is he a fan of diva singers?” or “Does it take him a long time to do his hair?”.

If your child was classified as straight, the app would say: “You do not have to worry, your son is not gay. So there are chances for you to be a grandmother, with all the joys it brings.” But if the app determined that your son was gay, the message was: “No need to look the other way! He is gay! ACCEPT IT!”.

Besides the simplistic, sexist nature of this idea, and the intrusion into privacy that it represents, what if the child is bisexual, transgender, non-binary or gender fluid? We don’t know, as these people are not taken into account (once again).

Later, in 2017, researchers at Stanford University developed an AI tool which they claimed could identify a person’s sexual orientation with very high accuracy from just a single photo. Yes, that means that some people could be outed by a machine against their will.

Stanford researchers Michal Kosinski and Yilun Wang published a study titled “Deep neural networks are more accurate than humans at detecting sexual orientation from facial images”. They used deep neural networks to extract features from thousands of facial images and trained a classifier on those features. Their system “correctly” distinguished between gay and heterosexual men in 81% of cases, and in 74% of cases for women, from a single image (rising to 91% and 83% respectively when given five images per person), while human judges achieved much lower accuracy: 61% for men and 54% for women. The study’s findings exposed this kind of AI-driven phrenology as a threat to the privacy and safety of the LGBTQ+ community around the world, with a real risk of persecution, especially in authoritarian regimes such as Uzbekistan, where authorities subjected at least six men to forced anal examinations between 2017 and 2021 in order to prosecute them for consensual same-sex relations.
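
To make the pattern described above concrete: at its core, such a pipeline turns each photo into a numeric feature vector with a pretrained deep network and then fits an ordinary binary classifier on those vectors. The sketch below is a hypothetical, generic illustration of that “embeddings plus classifier” setup using synthetic random data in place of real face embeddings and labels; it is not the Stanford authors’ code or model, and on random data the classifier is (rightly) no better than chance.

```python
# Generic "embeddings + classifier" sketch with synthetic data.
# NOT the Stanford study's code: the random vectors below merely stand in
# for face embeddings produced by a pretrained deep network.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Pretend these are 512-dimensional face embeddings for 2,000 people,
# each with a binary label (in the real study, labels came from dating profiles).
X = rng.normal(size=(2000, 512))
y = rng.integers(0, 2, size=2000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# An ordinary logistic regression on top of the embeddings.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# On purely random data this stays around 0.5, i.e. chance level.
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```

The point of showing this is that nothing in such a pipeline is exotic: the ethical questions raised in this article apply to any team with labelled photos and off-the-shelf tools.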

In fact, according to the Human Dignity Trust, 71 countries criminalise private, consensual, same-sex sexual activity; 43 countries criminalise private, consensual sexual activity between women under laws that explicitly target them; 15 countries criminalise the gender identity and/or expression of transgender people; and in 11 countries the death penalty is imposed, or is at least a possibility, for private, consensual same-sex sexual activity.

Have you ever thought about the consequences that these kinds of systems may have for all of these groups? Yes, we are talking about people being murdered just because of their sexual orientation or gender identity. And this could also happen if those authorities decided to buy data from the apps that this community uses.

And in places where having a non-straight sexual orientation is not criminalised, what happens if homophobic companies start to screen out candidates using these systems? This is just one example of how the private sector might use these sorts of tools.

Beyond that, what if we used the great potential of this technology to help vulnerable groups instead of perpetuating discrimination and creating a potential weapon against their safety?

We truly believe there is a strong need for ethics and moral reflection from the moment the idea of creating these kinds of systems appears. “Is it useful?”, “Could it discriminate against certain groups?”, “Would it help or threaten the most vulnerable?” and “Could someone use it against them?” are some of the questions developers should keep in mind when proposing, designing and developing such a system, so that the negative impact of its outputs is minimised.