Tenants in the Atlantic Plaza Towers apartment complex in New York’s Brownsville neighborhood were fighting to prevent their landlord, Nelson Management Group, from installing facial recognition technology to open the front door to their buildings, calling it an intrusion on their privacy. This week, they succeeded—the group reversed the decision.
More than 300 residents filed a complaint with the state to block the application in January, voicing fears that the proposed installation was a result of the pressures of gentrification in Brooklyn, with landlords hoping to attract higher-income, white tenants to the majority-black buildings, which contain affordable housing units.
The Atlantic Plaza Towers tenants’ success could have wider implications for landlords across the city experimenting with facial recognition technology in their buildings. San Francisco became the first U.S. city to ban government agencies and police from using facial recognition technology in May, but much less is known about the use of the technology in the private sector, as companies are not required to comply with freedom of information laws.
Tranae Moran, 27, a third-generation resident of the complex, and Fabian Rogers, 24, who has lived there since he was 10, shared their personal decisions to fight against facial recognition in their apartment buildings during a series of interviews, which have been edited for space and clarity.
Tranae Moran: We live in two different 24-story buildings. We are a pretty tightly knit community. There are families who live in the buildings that have been there for generations. My great-grandmother was the first one in the building, and then my grandmother, my mom, and me.
We have security guards there 24 hours a day. We have superintendents, some of whom live on the premises. We have a key-fob entry system to get through every gate and every door inside of the building. There are cameras everywhere—you cannot escape them. There is not a piece of the property that is not under surveillance. Cameras [face] both ways in our hallways. It pretty much feels like a juvenile detention center.
Fabian Rogers: Last October, we were notified about a facial recognition technology being installed in our housing complex by the Division of Housing and Community Renewal [a New York State government agency]. We never received any notification from building management at all—they had to submit an application through DHCR, and if DHCR hadn’t notified us, they would have gone ahead and installed it. We get notices about other things from management every day. If there’s construction happening or the water line is going to turn off for some reason, we get a letter from them, but they didn’t send one this time.
TM: When I saw the DHCR form about surveillance technology, I was confused. Like, why do we need this? I already feel like I live in Fort Knox. [Management already has] my personal data. [They] know everything about us. We’ve been here for years. [They] are watching our every single move. And on top of that, [they] want to get my biometric data. I don’t know what you’re gonna do with it. Management didn’t even ask us if we wanted [the technology].
FR: When I heard the news, all I thought about was articles I read about data collection. How people’s personal information was collected on apps like FaceApp.
TM: When we were notified, we had like three weeks’ notice to say that we didn’t want it. And we all found out about it at different times—our mailboxes were screwed up during this period because our lobby was being redone. There was a high chance that some mail, including that survey, didn’t even make it to tenants. It felt like the information was deployed in a way that they didn’t want us to be able to talk to our neighbors about it. There was not enough time for us to inform ourselves of what this technology means for us. We all had to fill out this form and send it in. It wasn’t straightforward at all: The last page had a yes-or-no question that was written in a way where you would check yes thinking that you were saying no to the technology. A lot of people filled it out wrong.
FR: We ended up knocking on every door, and we organized with each other. We had a general-attendance tenant meeting that was probably one of our biggest, because everyone was confused. That’s when we were like, okay, we have to organize, and do it fast because we only have about three weeks. Brooklyn Legal Services attended the meeting and came on board to help us in November. We appointed floor captains who would just get the information to everyone. We started knocking on everybody’s door and making sure that they filled it out.
TM: There are 718 units in the complex, and in the end over 400 people voiced their opposition.
FR: Afterwards, we went to DHCR as a group—BLS came with us—to physically hand in our opposition to the facial recognition system, along with a petition and all the surveys we had collected. There was some media coverage. [DHCR employees] were confused. They didn’t even want to let us into the building. Honestly, they were not very pleased with our progress.
TM: A lot of DHCR members weren’t even aware of what was going on or of the technology they were trying to install. Our landlord was taking advantage of gray areas.
FR: BLS collaborated with us to figure out next steps. They helped us with our media strategy, which was great because landlords had been taking advantage of the fact that people did not have any knowledge of this. Through the media [attention] we basically forced Nelson [Management Group] to respond in public. That’s when they finally took us seriously and felt we should have a sit-down together. It was all on their terms. They didn’t want to sit with the whole tenant body—Nelson Management asked a few of us to meet with the owner, his legal representatives, and his security representative.
TM: They didn’t ask us about our thoughts or field questions—that wasn’t the point of the meeting. It was basically a pitch from Nelson: They were telling us to feel grateful for the fact that this technology was being installed because according to them it was 100% foolproof, even though they had no evidence. It was like dirt thrown in our face. We had talked with tech experts over the past year, and they can’t even give a guarantee of 100% efficacy for facial recognition technology. They didn’t give us specifics on how the technology works, how the information is saved, how it’s accessed. Nelson [Management] couldn’t even explain to us what the installation process would look like.
They basically told us that it wasn’t a security-based solution, it was a profit-based solution. He couldn’t promise better security. He wanted to reel in new tenants that could afford the market-rate apartments after they were renovated and flipped.
FR: Nelson has been known to start renovating vacant apartments and boost up their rent price. It feels like there is some stigma because we are in affordable housing and we’re paying rent-stabilized rates—like they are giving us a hard time because we can’t pay the full price.
TM: On my floor there were empty apartments that were renovated that nobody lived in. We felt like [Nelson Management] was warehousing these apartments until new security measures were put in place that [they] could use as a marketing tool. The technology can also be used to evict current tenants faster. It can catch tenants doing things they aren’t supposed to be doing according to their lease—small things, like if you’re letting someone stay in your home while you’re away. We’re constantly being watched by our security for little offenses. If my son rides his scooter in the hallway, I will get a letter from management under my door two days later.
FR: People don’t understand how intrusive this type of technology is. They think it’s just like Snapchat. But it’s like . . . it’s my home.
TM: Facial recognition technology has a higher error rate when it is trying to scan black or brown people. The software is biased, and I don’t see it working in our community at all. I also don’t want to be a lab rat for [Nelson Management]—I don’t want to be in one of the first buildings where they are testing their systems so that they can deploy it into more buildings.
FR: We have had meetings with local government officials including [New York] assemblywoman Latrice Walker. Congresswoman Yvette Clarke introduced a bill about it, the “No Biometric Barriers to Housing Act.” It hasn’t passed yet. The bill would prohibit the use of biometric recognition technology in certain federally assisted dwelling units. It’s a good start but nowhere near what is needed to protect the city’s citizens. The bill has a lifespan of just one year if it passes. That’s not good enough. The fact that this only covers federally funded dwelling units is not good enough either. We want everyone in the city protected from the use of biometric collecting systems. This is happening in a lot of places—we heard of some buildings in the Bronx and Queens, mostly minority, low-income communities. No government officials, no city agencies, no one knew about it but the landlords.
TM: If local governments take hasty action without talking to people affected, it actually hurts us, because the media will cover the bill, and then the issue will get less coverage after it’s passed, and we still won’t be protected.
Landlords, building owners, development companies, and investors are just doing what they want. But I am a third-generation Brooklynite, I am from here, I can’t leave. I don’t want this technology in my home.
As far as I’m concerned, we still have work to do.