
I’m a trans woman. Google Photos doesn’t know how to categorize me

“Is this you?” Platforms like Google Photos, Apple Photos, and Facebook are forcing trans people to interact with—and classify—photos of our past selves.

[Photo: Afif Kusuma/Unsplash]

There are two people on my phone screen. At least, I think there are two people; Google isn’t quite so sure. And neither is any other system in the world. Both of them have the same Social Security number, the same home address, the same parents. But not a single person would say they look the same. At most, they would see siblings or perhaps cousins: related, certainly, but clearly of different genders.


This is the problem: I’m a transgender woman and I took both of these photos of myself, one before I transitioned and one after. The world is full of traps like this for me, whether it’s the bouncer who looks at my driver’s license and demands a second ID before letting me into the bar, or the unchangeable email address that uses an old name. Trans people are constantly having to reckon with the fact that the world has no clear idea of who we are; either we’re the same as we used to be, and thus are called the wrong name or gender at every turn, or we’re different, a stranger to our friends and a threat to airport security. There’s no way to win.

Digital systems have made this much worse. For computers and databases, the world exists in a binary. Either two things are the same, or they’re different—without caveats, without a middle ground. But as a trans person, my sense of being is often conditional. How I answer the question, “Is this you?” depends on who’s asking.

Being trans online

Some trans people erase whole sections of their lives from the servers of Facebook and Google just so they won’t be confronted with another machine-driven attempt at nostalgia, whether it’s a Facebook memory or a stray photo turning up in a Google Photos search, dredging up negative memories and discomfort with their past selves. Jennifer Moore, a trans woman I spoke to, has untagged herself from her old photos on Facebook.

“There’s plenty of times that I’ll [look up a photo of] myself, but I don’t want it to be automatic,” Moore says. “I certainly don’t want somebody who has my name to search for me and find old pictures of me.”


Moore was presented with the same prompt from Google Photos as I was, and we both had the same reaction: confused paralysis. “I was getting the question for like a year before I ever decided what to do about it,” Moore says.


Eventually, social media convinced her to make a decision: She taught the machine-learning algorithm in Google Photos that her past self was a different person, keeping those old photos accessible but forcing the software to draw a line between pre- and post-transition. She even named the new person “deadname,” the term trans people use for the name we leave behind after transition.

“There was some before and after transition meme that was going around on Twitter and I wanted to participate,” Moore says. “Having Google know who I am made that a lot easier, but I didn’t want to put my name on those old photos.”

These groupings are only visible to individual users, and if you turn the feature off, all of the groups and labels will be deleted. Google also recently introduced manual face tagging, which will let users tag photos of themselves at different stages of transition as different people, and let the algorithm manage the rest. “The face grouping feature is intended to make it easy to manage, label, and find photos of people and pets in ways that are relevant to you. When this feature is turned on, you may occasionally see prompts asking for feedback to help further customize and improve your groups,” said a Google spokesperson in a statement.

Not an immutable object

Some software doesn’t even give you the chance to have an existential crisis; it simply makes the decision for you. And that decision is often the wrong one. Cayce Fisher, a trans woman who uses an iPhone, recalls how her phone decided to group all of her selfies together, choosing the oldest one, taken years before she transitioned, as the icon to represent the album as a whole.

“It contains wedding photos right next to workplace photos, next to really sad depressing selfies I took before I transitioned, next to photos I took this morning,” Fisher says. “I think and feel differently about all those photos.”

In the end, Fisher was able to change the cover photo for the album, but the Photos app continued grouping all of her past images together. And that grouping just didn’t square with how she sees herself.



“It implicitly says this person from 2002 is the same as this person from 2019. There’s no understanding that people grow and change,” Fisher says. “The person is an immutable object.”

The only way out would be to remove the grouping entirely and spend the time re-categorizing pre- and post-transition photos by hand, or to delete the old photos altogether, an option Fisher was reluctant to choose.

In my case, just like in hers, I don’t want to pretend my past doesn’t exist. I’m a radically different person than the me of 2013, but those memories are mine too, and it’s not always fair to ask me to give them up for the sake of poorly designed software. I had nearly three decades of life before I transitioned, so even the act of telling an app that I’m a different person now for purely utilitarian reasons feels like a betrayal of those memories, like I’m trying to act as if they never happened.

An anomaly in the machine

The heart of Moore’s paralysis and Fisher’s frustration with decisions like these can be found in the simple fact that any answer we give a computer is one-dimensional and any response the computer gives is inscrutable. That’s simply part of their design, according to Penelope Phippen, a trans woman who’s been published in software industry journals for her work on machine learning algorithms.

“The design of modern machine learning systems is such that it is very hard for the people building them to say why they are providing the answers they are providing,” Phippen says. “They’re almost impossible to reason about unless you have a degree in higher math, so we just don’t.”

Without understanding how a machine makes its decisions, we’re left with contradictions. Where we see a slow progression from one self to another, the machine forces a choice between two categories: same or different. And while that works for the majority of cisgender people, who grow up, get older, and make piecemeal changes to their appearance, transgender people remain an anomaly.



“The systems are very much designed with this cisnormative view,” Phippen says. “You move through the world with this one experience, without these huge, significant shifts.”

User-facing machine learning systems like this rely on two models, Phippen says. First, a global model that lives on Google’s servers is trained on hundreds of thousands, if not millions, of images; it does a first pass at categorizing each face before anything reaches your phone. On your phone, a second model is trained by you, via those prompts asking whether two photos show the same person. Those results are usually stored only locally and are rarely incorporated into the broader set of training data.

In short, it’s not likely to confound the global algorithm much if you tell Google that your pre- and post-transition selves are different people; they may look different locally, but if the first model sees you as the same person, it’s likely to continue doing so. In the end, the agency we have over these algorithms exists on our phones and nowhere else. Tagging myself as a different person before and after my transition isn’t likely to make that choice automatic for the next trans person who comes along and doesn’t want to see their past surfaced by a machine.
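To make that split concrete, here is a minimal, hypothetical sketch of how such a two-layer setup might be structured. None of the names here (`global_embedding`, `OnDeviceGrouper`) come from Google’s actual code; the fake embedding function simply stands in for a server-side face-recognition network. What it illustrates is that a user’s “same person?” answers live only in the local layer, while the global model’s view of a face is untouched.

```python
import numpy as np

def global_embedding(photo: np.ndarray) -> np.ndarray:
    """Stand-in for the server-side model: maps a photo to a 128-dim face vector."""
    # Deterministic fake embedding so the sketch runs without a real network.
    rng = np.random.default_rng(abs(hash(photo.tobytes())) % (2**32))
    return rng.normal(size=128)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class OnDeviceGrouper:
    """Groups faces locally; the user's own answers override the global score."""

    def __init__(self, threshold: float = 0.6):
        self.threshold = threshold
        # Answers to "Is this you?" prompts are stored only on the device.
        self.feedback = {}

    def record_answer(self, id_a: str, id_b: str, same_person: bool) -> None:
        self.feedback[frozenset((id_a, id_b))] = same_person

    def same_group(self, id_a: str, emb_a, id_b: str, emb_b) -> bool:
        key = frozenset((id_a, id_b))
        if key in self.feedback:
            return self.feedback[key]  # the local answer wins on this phone
        # Otherwise fall back to the global model's similarity score.
        return cosine_similarity(emb_a, emb_b) > self.threshold

# Telling the grouper two photos show "different people" splits them locally,
# but the embeddings produced by the global model are unchanged.
grouper = OnDeviceGrouper()
pre, post = np.zeros((8, 8)), np.ones((8, 8))  # placeholder "photos"
emb_pre, emb_post = global_embedding(pre), global_embedding(post)
grouper.record_answer("photo_2013", "photo_2019", same_person=False)
print(grouper.same_group("photo_2013", emb_pre, "photo_2019", emb_post))  # False
```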

Agency over the algorithm

In the end, the only way to teach a system like this to reliably treat trans people with respect is to teach it to identify a trans person in the first place, a prospect fraught with hard moral choices.

“The same data set that could be used to build a system to prevent showing trans folks photos from before they started transition could be trivially used and weaponized by an authoritarian state to identify trans people from street cameras,” Phippen says.


With this dystopian future in mind, coupled with the fact that federal agencies like ICE already use facial recognition technology for immigration enforcement, do we even want machine learning to piece together a coherent identity from both pre- and post-transition images? At what point in transition does a photo become acceptable to show to trans users? Even Phippen finds this a difficult question to answer.


“I haven’t resolved all of this for myself, how I want to deal with my past identity and how it relates to my current one,” Phippen says. “For the most part, I’m not forced to do that in my day-to-day life.”

But regardless of any personal decisions, that world may be on its way faster than anyone anticipates. Machine learning systems are rapidly developing and may soon be capable of identifying trans users. Researchers have already compiled data sets that include images of trans people over time, and an algorithm trained on this data could be used to identify whether or not the photos on a particular person’s Google account belong to someone who has transitioned.

The reality of life for any trans person rarely involves regularly reckoning with our past selves (most of us go out of our way to avoid confrontations like that), let alone with the implications of cisnormative software. And yet, on that little phone screen, there are two thumbnails, asking me to make a choice.

In the end, I click the button that indicates there are two different people on my screen.

Do I wish Google would do this for me without asking? Probably not. With trans people facing daily harassment simply for existing as ourselves, the stakes seem too high to risk teaching these systems how to recognize us, even when the intent is benevolent.

Personally, I’m happy to tell Google some little white lies so that I don’t have to be bombarded with old photos of myself, or even disable its facial recognition completely. This feels like a small price to pay for a world where I remain in charge of the algorithm, without handing control over who I am to corporations and their faceless machines.


Cara Esten Hustle is a writer and software developer in Oakland, California.
