AI facial recognition technology could have devastating consequences for trans folks


A sense of security on the internet as a queer person has always been hard to find. And with a rapidly changing digital world, things seem to only be getting more complex. 

To me, the rise of AI (artificial intelligence) seemed to come out of nowhere – I knew nothing about the subject, and then my feed was flooded with chatbot screencaps, deepfake drama, and generated music videos.

Of course, all this raises the question – what exactly is AI? It has made leaps and bounds in recent times, but we’ve also been using simplified versions of it for many years.

To put it simply, artificial intelligence is ‘intelligence’ exhibited by machines – the capacity to mirror human abilities such as learning, reading, writing, creating, and analyzing.

Of course, this very definition comes into question when you consider that AI is trained on pre-existing datasets – anything the machine ‘creates’ is regurgitated from past knowledge. For example, if you ask the popular tool ChatGPT to write you an original story, it will do its best – but publishing the result would be unwise, since it is likely to mirror previously published, copyrighted material.

In some cases, AI can be helpful. It’s already disrupting the workforce, as employers use it to produce content more quickly (though often at lower quality) than a person could. AI is also used in medical research and other STEM fields, where it can drastically reduce the time it takes a person to process large amounts of data.

Sadly, the traits that make AI so helpful in some circumstances also make it hugely detrimental in others. When I first discovered AI image-processing tools and image transcribers, I put in a few images of myself out of curiosity. I mostly got accurate results – descriptors like ‘young man with pink hair,’ ‘male in mid-twenties smiling slightly,’ and so on.
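
If you’re curious what an ‘image transcriber’ looks like in practice, here is a minimal sketch using the open-source Hugging Face transformers library. The model name and file path are assumptions for illustration, not the specific tool described above.

```python
# A minimal image-captioning ('image transcriber') sketch using the
# open-source Hugging Face transformers library. The model name and the
# file path are illustrative assumptions.
from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")
result = captioner("selfie.jpg")  # hypothetical local image file
print(result[0]["generated_text"])  # e.g. a short description of the photo
```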

However, when asking AI to generate an image of a trans person, the results get frighteningly offensive.

Sourojit Ghosh, a researcher in human-centered systems design, finds worrying issues in the way Stable Diffusion, an image generator, conceptualizes personhood. “It considers nonbinary people the least person-like, or farthest away from its definition of ‘person.’”

Ghosh noted that when Stable Diffusion was asked to produce images of people with prompts including ‘nonbinary’, the program grew confused and often produced a monstrous amalgamation of human faces that resembled no real person. Similarly, prompts for ‘trans man’ or ‘trans woman’ may confuse the model, yield hypersexualized images despite a neutral prompt, or result in a cringe-worthy, embarrassingly stereotypical version of what a trans person might look like.
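
For context, here is roughly how such a prompting experiment can be reproduced with the open-source diffusers library. The model checkpoint and prompt below are assumptions for illustration – Ghosh’s actual experimental setup may have differed.

```python
# A minimal sketch of prompting Stable Diffusion via the open-source
# Hugging Face diffusers library. The checkpoint named here is an
# illustrative assumption, not necessarily what the researchers used.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# The prompt is deliberately neutral; the concern described above is that
# identity terms alone can push the output toward distorted or
# stereotyped imagery.
image = pipe("a portrait photograph of a nonbinary person").images[0]
image.save("portrait.png")
```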

What’s more concerning to many transgender and/or nonbinary individuals is the growing push for AI facial recognition software. Facial analysis software is increasingly used for marketing and security purposes; as if the idea of a camera scanning your face for marketing statistics weren’t concerning enough, what this technology means for transgender individuals is unclear.

“We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders,” said lead author Morgan Klaus Scheuerman, a PhD student in the Information Science department at the University of Colorado. “While there are many different types of people out there, these systems have an extremely limited view of what gender looks like.” In practice, this means the software may mislabel transgender individuals often enough to skew whatever results are built on top of it.

Researchers collected 2,450 images of faces from Instagram. The pictures were then divided into seven groups of 350 images (#woman, #man, #transwoman, #transman, #agender, #genderqueer, #nonbinary) and subsequently analyzed by four of the largest providers of facial analysis services (IBM, Amazon, Microsoft, and Clarifai).
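
To make that setup concrete, here is a rough Python sketch of how one such service (Amazon Rekognition, queried via the boto3 library) could be run over images organized by hashtag group. The folder layout and tallying are hypothetical stand-ins, not the researchers’ actual pipeline.

```python
# A rough sketch of querying a commercial facial analysis service and
# tallying its gender labels per hashtag group. The images/<group>/ folder
# layout is a hypothetical stand-in, not the study's actual pipeline.
import os
from collections import Counter

import boto3

GROUPS = ["woman", "man", "transwoman", "transman",
          "agender", "genderqueer", "nonbinary"]

rekognition = boto3.client("rekognition")

def classify_gender(path):
    """Send one image to Amazon Rekognition and return its gender label."""
    with open(path, "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # "ALL" includes the Gender attribute
        )
    faces = response["FaceDetails"]
    if not faces:
        return None  # no face detected
    # The service only ever answers "Male" or "Female": exactly the binary
    # limitation the study set out to quantify.
    return faces[0]["Gender"]["Value"]

results = {group: Counter() for group in GROUPS}
for group in GROUPS:
    folder = os.path.join("images", group)
    for name in os.listdir(folder):
        results[group][classify_gender(os.path.join(folder, name))] += 1

for group, counts in results.items():
    print(f"#{group}: {dict(counts)}")
```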

Cisgender men and women both had high accuracy rates – 98.3% of cis women and 97.6% of cis men were identified in alignment with their gender.

But outside of cis people, the system began hitting more snags. Trans men were misidentified as women up to 38% of the time; those who identified as agender, genderqueer, or nonbinary – indicating that they identify as neither male nor female – were mischaracterized 100% of the time.

“These systems run the risk of reinforcing stereotypes of what you should look like if you want to be recognized as a man or a woman. And that impacts everyone,” said Scheuerman.

As facial analysis software becomes increasingly common, it’s unclear how transgender individuals will be affected. I’m a worrier by nature, but my mind can’t help racing through worst-case scenarios. Could I fail a security screening for not matching the gender on my ID card?

It’s already well documented that trans people face more scrutiny at security checkpoints and may be subject to pat-downs if something about their physique seems ‘off’. One notable instance is the assault of Megan Beezley, a trans woman who was inappropriately patted down and asked to show additional identification by the TSA at Raleigh-Durham International Airport. (This story hit close to home for me: I grew up in Durham, and part of my push to move out of state was the rising incidence of hate crimes against queer people.)

If this software becomes publicly available, will anyone and everyone be able to find past, pre-transition pictures of me by entering an image of me now? It seems that the future of going ‘stealth’ – passing in your day-to-day life as your gender identity, without disclosing that you transitioned – is at risk, as you might have to justify your facial features to anyone who runs them through this software.

It’s clear that these programs need further work before being deployed on the public – after all, a system that can’t correctly identify an entire group of people is, to say the least, flawed.

