The Digital Panopticon

[music]
Brigid Bergin: It's the Brian Lehrer Show on WNYC. Good morning again, everyone. I'm Brigid Bergin, politics reporter for WNYC and Gothamist, filling in for Brian today. Now we go to Brian's weekly series with The Greene Space called Punishment and Profit. It's all about the business side of the prison industry. The question guiding these segments is: who profits when people get put away? Every Tuesday evening through May 4th, The Greene Space, in partnership with the advocacy group Worth Rises, is holding a virtual panel discussion about one aspect of the business side of the prison industry.
Brian previews those weekly discussions here on the show. Last week, Brian talked about companies that provide the physical stuff to prisons, like furniture and restraint equipment. Today, we're going to talk about something less tangible but equally essential to the lifeblood of mass incarceration. That's data and surveillance technology. Things like facial recognition, fingerprinting, and software that law enforcement uses to set bail or identify supposed gang affiliations. This kind of technological surveillance is a billion-dollar industry, and it's only growing.
With me now are Bianca Tylek, Executive Director of the advocacy organization Worth Rises, and Albert Fox Cahn, the Founder and Executive Director of the Surveillance Technology Oversight Project at the Urban Justice Center, a New York-based civil rights and privacy group. He's also a fellow at the Engelberg Center for Innovation Law and Policy at NYU School of Law. Bianca, Albert, welcome back to WNYC.
Albert Fox Cahn: Thank you so much for having us.
Bianca Tylek: Hi, how are you? Very nice to be with you this morning.
Brigid: There are two broad types of surveillance technologies we'll be talking about in this segment: those used by the police to track and arrest people, and those used by jails and prisons to keep track of people once they're inside the system. How would you say these two categories are used together, Bianca?
Bianca: Sure, absolutely. I think one in many ways feeds into the other, and often they're actually the same technology. Take, for example, risk assessment tools. They're often provided by corporations, sometimes foundations, and they have these algorithms built into them. They tell courts whether or not someone should be released on bail, for example, while they wait pretrial. They claim to measure the risk of flight, but all they really do is measure the risk that somebody has had contact with policing, which is highly racialized.
Those same exact tools that are used to determine whether somebody should be eligible for bail are also used inside of prisons and jails to determine whether somebody is at high risk of violence and should be in a maximum-security facility. That determines where somebody might do their time, which also determines what programs they might have available to them. It's actually the very same tools that are used across these different aspects of the system.
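[Editor's note: To make the mechanism Bianca describes concrete, here is a deliberately toy sketch of a proxy-based risk score. It is not any vendor's actual model; the features, weights, and zip codes are invented for illustration. The point it demonstrates is hers: inputs like arrest counts and residence measure exposure to policing rather than individual conduct.]

```python
# Toy risk score (illustrative only -- no real tool's features or weights).
# "Neutral" inputs like prior arrests and zip code act as proxies for how
# heavily policed someone's life has been, not for how they behaved.

HIGH_POLICING_ZIPS = {"10454", "10455"}  # hypothetical heavily policed areas

def toy_risk_score(prior_arrests: int, zip_code: str, age: int) -> float:
    score = 0.0
    score += 1.5 * prior_arrests            # counts arrests, not convictions
    if zip_code in HIGH_POLICING_ZIPS:      # residence as a proxy
        score += 2.0
    if age < 25:
        score += 1.0
    return score

# Two people, identical conduct, different exposure to policing:
print(toy_risk_score(prior_arrests=3, zip_code="10454", age=22))  # 7.5
print(toy_risk_score(prior_arrests=0, zip_code="10028", age=22))  # 1.0
```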
Brigid: Albert, your advocacy group has been working to make the general public more familiar with the kinds of surveillance police departments use. Why has that been such a challenge, especially in New York City?
Albert: Well, in addition to being one of the most expansive and invasive surveillance states in the country, New York is also one of the most opaque. The NYPD has systematically thwarted public oversight for decades. They've done this in a number of ways. They've fought to create a number of exceptions to freedom of information law, trying to push back against the normal tools that journalists, advocates, and everyday New Yorkers use to understand how our government operates.
Then, we've also seen the NYPD using millions of dollars in private funding through its supposedly charitable foundation, and through federal funding, circumventing the normal appropriations process at the City Council. We worked to pass a law here in New York called the POST Act last year, but there is still absolutely no accountability for the tools the NYPD purchased with non-city funds, even if they were being used to track and/or arrest New Yorkers.
Brigid: We're going to talk a little bit more about some of that City Council legislation later in this segment, but, Bianca, police departments across the country would say these tools are objective and don't have the same biases as, say, people. I know your group has written a lot about the algorithmic biases built into these tools. Can you explain to us a little bit more how that works, maybe using facial recognition as an example?
Bianca: Sure. Take the example I was using before: risk assessment tools. Risk assessment tools came up almost a century ago. When they first came into use, and they were meant to measure, as I said, risk of reoffending and things like that, they actually used race as one of the inputs. Over the years, that obviously became something that was frowned upon. Now we use proxies for race, things like the zip codes that people live in, or the number of times somebody has been arrested, not convicted, which means arrested by police for anything.
We use many of these aspects of criminal history and location essentially as proxies for race, and we end up with those same exact outcomes in policing. You see it in gang databases, where in New York roughly 95% of the people listed are Black and brown, and inclusion is completely based on things, again, like your neighborhood, who you hang out with, or some very, very obscure data points.
In terms of facial recognition, which is a technology now being introduced by some of our major tech companies, you're seeing some really, really stark differences in how these technologies interpret Black and brown folks as opposed to white folks. In particular, when it comes to white men, you have rates of accurate facial recognition that are north of 90%, 95%. When it comes to women of color, that figure is just 65%. What we have seen time and time again are facial recognition systems incorrectly identifying Black and brown people, misidentifications that have actually led to arrests.
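[Editor's note: A rough back-of-envelope calculation, using the accuracy figures Bianca cites and an assumed batch size of our choosing, shows why that gap matters at scale.]

```python
# Back-of-envelope: apply the cited accuracy rates to a hypothetical
# batch of 10,000 searches. The batch size is assumed for illustration.
searches = 10_000
error_rate_white_men = 1 - 0.95        # ~5% error, per the figures above
error_rate_women_of_color = 1 - 0.65   # ~35% error

print(round(searches * error_rate_white_men))       # 500 misidentifications
print(round(searches * error_rate_women_of_color))  # 3500 misidentifications
```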
Brigid: I want to talk about a high-profile case of the NYPD’s use of that technology over the summer, when the police showed up at the apartment of a Black Lives Matter protester seeking his arrest. They used helicopters, riot gear, and police dogs, and his alleged offense, we later found out, was using a bullhorn too close to a police officer’s ear. Albert, can you talk about how facial recognition software was used in this case, and more broadly by the department in general? I think it was one of my Gothamist colleagues who actually figured out this technology was used in this case.
Albert: That was actually George Joseph at Gothamist/WNYC, who noticed in the bystander photos a printout from the NYPD's Facial Identification Section that one of the officers who was storming this entire block was holding. The activist here, Dwreck, also known as Derrick Ingram, is an amazing political leader and activist here in New York who works with Warriors in the Garden. He was accused of pointing a bullhorn at a police officer. That was the only accusation.
Officers then, after allegedly using facial recognition to track him down, and also using social media as they increasingly do, deployed SWAT teams, helicopters, all of these different officers. I think it's really crucial here to remember that when we talk about the risk of error with facial recognition and these other surveillance systems, we're not just talking about the risk of a wrongful arrest or even a false conviction. We're talking about the risk of all the forms of police violence that come when communities of color interact with the NYPD. We're talking about the risk of a SWAT team at your door, or even a knee to the neck.
Surveillance is often the entry point to interactions with a criminal justice system that is built on systemic violence against BIPOC New Yorkers. I think that risk, that really visceral harm, has to be very central to this conversation. Yes, this technology promotes bias; yes, this technology is invasive and erodes our privacy; but it's also really putting people in harm's way.
Brigid: Listeners, do you have a question about how police departments or law enforcement agencies use surveillance technology like facial recognition or risk assessment software? Maybe you want to know how and when police are collecting your data, call in at 646-435-7280. Or maybe you're a police officer or you work in corrections, how do you use technological surveillance or risk assessment? Does it make your job easier, safer, or do you have your own concerns?
The number again: 646-435-7280. As those calls come in, Bianca, facial recognition may be something new, but fingerprinting is certainly decades old. Last year, the New York City Council urged the NYPD to expunge many New Yorkers from the city's fingerprinting database. They agreed, but the database has only grown since then. Why should anyone object to giving officers their fingerprint data when the police and much of the public see it as an important tool in their work?
Bianca: For many of the same reasons I just mentioned. Keeping people in these data systems makes them more likely to be approached by cops even when they have nothing to do with an incident of crime. All of a sudden, there's a crime that happens, let's say in a CVS, and you've been in that CVS and your fingerprints are in the system; it makes you susceptible to that violence Albert was just talking about.
There are actually even more systems being created now. There's voice printing, which many people may not know about. People inside prisons and jails, especially in New York, are being required to give a voiceprint in order to use the phone to call their loved ones. Those voiceprints are being turned into databases and sold to other law enforcement agencies, and when people leave prison or jail, they're not removed or scrubbed from that voiceprint database.
Not to mention that the accuracy of voice recognition technology is questionable at best. And to take it one step further, into the conversation about protesters in particular being targets: these voiceprint systems inside prisons and jails don't just track the voices of people who are incarcerated, they actually also track the voices of the people they are calling.
If you are a person on the outside who is speaking to somebody inside a prison or jail, and to more than one person at the same facility, you will be tracked and flagged for additional surveillance. That means that as an advocate, as a social service provider, or as a family member with more than one loved one inside, you are more likely to be surveilled than other folks. That ends up targeting folks who are providers, who are protesters, who are organizers. That's been happening around facial recognition too, the labeling of people as "Black extremists" and all these things, time and time and time again, to quell protest.
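[Editor's note: The flagging rule Bianca describes can be stated in a few lines. This sketch is our paraphrase of the behavior as described on air, not vendor code; the data format and numbers are invented.]

```python
# Sketch of the cross-contact flag described above: an outside number
# that appears on calls with more than one incarcerated person at the
# same facility gets marked for additional surveillance.
from collections import defaultdict

def flag_outside_numbers(call_records):
    """call_records: iterable of (outside_number, facility, inside_id) tuples."""
    contacts = defaultdict(set)
    for outside_number, facility, inside_id in call_records:
        contacts[(outside_number, facility)].add(inside_id)
    return {number for (number, _), ids in contacts.items() if len(ids) > 1}

calls = [
    ("555-0101", "Facility A", "person_1"),
    ("555-0101", "Facility A", "person_2"),  # same caller, two contacts
    ("555-0199", "Facility A", "person_3"),
]
print(flag_outside_numbers(calls))  # {'555-0101'}
```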
Albert: If I could just quickly add, Brigid: in New York, we claim to be a sanctuary city, we claim to protect undocumented New Yorkers from ICE, but the truth is that New York City's fingerprint database is shared in real time with the New York State Police, which in turn is shared with the FBI database, which is completely accessible to ICE. We pay lip service to protecting undocumented communities, but the reality is that fingerprint data can be used by ICE to fuel their deportation efforts here in New York.
So much of the data that we're talking about, it's not just abused by the NYPD and other city agencies, but all of the federal agencies that they then share that data with. Agencies that even under the Biden administration are still deporting New Yorkers every single day.
Brigid Bergin: Albert, you brought up the POST Act earlier, can you talk a little bit more about what the POST Act requires from the NYPD?
Albert: Sure. The POST Act is part of a growing national movement to no longer allow the police to control these questions about surveillance. The POST Act was just a very minimal first step. It said that the NYPD had to tell us what systems it was using, because some of these very invasive, biased systems had been used for a decade before anyone actually knew they were being placed in our communities. The POST Act said, "You have to actually tell us what systems you use, how you're using them, how that information is being stored, and, like with the fingerprint database, who you are sharing it with."
There were a few systems that we learned about for the first time because of the POST Act, like the fact that the NYPD has tools to track cryptocurrency and identify its owner, and that it has systems used to set up fake social media profiles so officers can friend people and see posts that would otherwise not be public. But the NYPD also systematically tried to fight this law. They gave us a lot of policies that were riddled with errors and filled with boilerplate language. We saw more than 8,000 people submit comments in response, saying that these policies were deeply broken, that they were not actually complying with the law.
Brigid: The social media example feels so potent because I think we can all wrap our heads around what a fake social media account, whether it's on Facebook or Twitter, would look like. Can you talk through a little bit of an example of how they've used that?
Albert: Well, with the social media accounts, since the successful rollback of much of stop-and-frisk, we've seen the NYPD trying to replace the abuses of analog stop-and-frisk with these digital dragnets. One of the biggest ones is the gang database, which, as Bianca mentioned, grew to over 42,000 New Yorkers, over 98.5% of whom were non-white. This was basically a racialized dragnet.
One of the ways that they've been filling that gang database is by using these tools to create fake profiles and friend people; then, instead of needing to get a warrant or a subpoena to track what people are posting, they can just look at the friend feed. And because of the software they're getting, one officer can conceivably create dozens, hundreds, maybe even thousands of fake profiles and use them to track countless New Yorkers, overwhelmingly BIPOC New Yorkers. We've seen the power and the danger of these fake accounts in national elections to spread disinformation, but they're also a powerful tool to take away even the minimal protections that a private social media post has.
Brigid: If you're just joining us, you're listening to the Brian Lehrer Show. I'm Brigid Bergin, a reporter in the WNYC and Gothamist newsroom. I'm talking to Bianca Tylek, Worth Rises' Executive Director, and Albert Fox Cahn, Executive Director of the Surveillance Technology Oversight Project, about the data and information systems that track individuals in the criminal legal system. We're going to go to the phones. Let's talk to Alan in Queens. Alan, welcome to WNYC.
Alan: Thank you so much for taking my call. Thanks for the great program. Just a question. What's being done to pressure the tech companies themselves, even though they go through the nonprofit organizations connected to the NYPD? Is there anything being done to pressure the tech companies directly? Thank you.
Brigid Bergin: Thanks, Alan. Albert, do you want to start with that?
Albert: Yes, we've seen amazing campaigns to push back against the companies that are fueling mass surveillance. The Athena coalition has done a tremendous amount of work to push back against Amazon's facial recognition infrastructure and the Ring camera system. We've seen activist groups do shareholder resolutions, and we've seen a lot of campaigns to boycott these companies, to say that you shouldn't be giving them your dollars every day if they're empowering this technology that's transforming our city and putting our neighbors at risk.
Part of the issue though is that it's hard to put pressure on a lot of these firms because a lot of the key players in mass surveillance like Amazon are monopolies. There isn't a lot of competition, and so while that consumer-facing work is important, while a lot of this effort to put pressure on companies directly is important, we also need much stronger laws, including a categorical ban on surveillance tools like facial recognition.
Brigid Bergin: Albert, as I hear you describe that, you've said Amazon several times, and it sounds like Microsoft is a big player with their Domain Awareness System. We're not talking about obscure companies here, we're talking about major Silicon Valley players that are building this technology.
Albert: It's both the biggest of the big tech players and also startups. One of the most invasive companies that we've come across in the last couple of years is Clearview AI, a relatively small firm that scraped billions of photos of nearly every single American and sold that as a facial recognition platform to police departments. Any NYPD officer could just sign up for it with an NYPD email account.
They ran more than 11,000 searches on it, and we don't know if that was part of their day job or if they were doing it to track down someone they spotted at the local bar. It really is the Wild West out there. Yes, a lot of the giant tech companies are profiting off of it, but so many other firms are seeing this as a way to make a lot of money.
Brigid: Bianca--
Bianca: If I could add.
Brigid: Go ahead.
Bianca: If I could add to that: the company that Albert just mentioned, Clearview, there was just a lawsuit filed against them last week by the firm Just Futures Law, and its executive director will actually be on our program tomorrow evening, discussing both that lawsuit and the broader use of facial recognition and biometric data as part of the mass deportation system.
Then there's others. In particular, in the immigration system, there is quite an incredible amount of work that has been done to target the corporations specifically, Amazon, Microsoft, as you mentioned, Google, and others, by Mijente and other organizations that came together to create the No Tech for ICE platform, which we definitely encourage people to take a look at.
There is quite a bit of work being done on the corporate side of things, and as Albert mentioned, it covers everything from small or newer-stage companies to very large, major conglomerate corporations that are getting into this data business.
Brigid: Bianca, I wanted to ask you more broadly about the private tech companies that provide the surveillance systems used inside prisons. You described the use of voiceprints as one of the things that people living in those facilities are dealing with, but can you tell us a little bit more about what it's like to live in one of these high-tech prisons?
Bianca: Yes, absolutely. I want to say I can speak only to a certain degree; I have never lived inside of a high-tech facility-
Brigid: Absolutely.
Bianca: -but from what we have heard from people, the voice printing that's happening now is quite invasive. As we said, you cannot use the phone if you do not give a voiceprint to one of these corporations. There are now new corporations popping up just to surveil calls, and they use trigger words to essentially help them move through mass amounts of calls faster, doing live surveillance. One thing that a company called LEO Technologies, which does this type of voice recognition, will actually say is that their "system speaks inmate."
Brigid: Say that one more time. Their system speaks?
Bianca: Inmate.
Brigid: Okay.
Bianca: That how they describe it on their platform, on their website. Which just creates a lot of questions exactly. Like, "What does that even mean?" That's already starting to sound like a lot of things. That sounds pretty terrible. There's all of that, but there's even other things inside of prisons and jails. Now people are given in some facilities some wristband that they have to wear, that has a actual marker in it. When they walk past certain places in the prison is constantly tracking their movements and every last little thing. You really start feeling like we just have trackers of every single type on people in every moment.
Brigid: I appreciate your description, Bianca, but I also just want to acknowledge your comment. You are obviously describing this as an advocate, and someone who works on behalf of understanding what is happening in these prisons, and so I appreciate you noting that with your answer. Albert, I want to talk about another piece of relevant news this month. Police Commissioner Shea announced that the NYPD would be expanding its ShotSpotter program to monitor large swaths of the city. That's the technology used to detect gunshots. Why is it controversial?
Albert: Well, I think we have to start off with that description. It claims to monitor gunshots, but the truth is it monitors neighborhoods. These are highly sensitive directional microphones that are placed in low-income communities and communities of color to listen for what they think sounds like gunshots. The problem is there's no real evidence that the technology works. This is part of the pattern we see with surveillance salesmanship: you get these claims that, "Oh, the artificial intelligence does X or does Y," but when you look for the actual third-party-verified evidence that these claims are reality, there isn't a lot there.
What we have seen with ShotSpotter is plenty of cases where they got it wrong, where they couldn't tell the difference between a firework or a car backfire and a gunshot. What that translates into is a lot of people being stopped. It's one of the leading justifications for Black and brown New Yorkers being stopped by the NYPD under the new stop and frisk guidelines. Also, at the same time, these microphones are so sensitive that they can listen to what people are saying in their own apartments if the window is open. They can hear what people are saying on the street.
The company has also been accused of fabricating evidence. In one case up in Rochester, a man was charged with shooting at police officers. He had been shot by police officers, that much they knew; he was in the hospital, and they claimed, "Well, he shot first." There wasn't any evidence that he had actually shot at them. What this man claims in his lawsuit is that ShotSpotter manufactured a gunshot report, saying they had heard four gunshots instead of three, to give the police evidence that he had shot before they fired three bullets at him. He's now suing them, after being acquitted of the charges in his criminal case.
I have to ask: if this technology isn't proven to work, if there are allegations that it gets it wrong all the time, if there are claims that they're manufacturing evidence, then why is Commissioner Shea investing so much money to blanket New York neighborhoods with these microphones? This, to me, is just a lose-lose in every possible way.
Brigid: Albert, thank you for that. Bianca, as advocates call on governments to defund police departments, some worry that less money for police departments means more layoffs and more reliance on technology rather than human beings. With all we've laid out here, is that a concern to you?
Bianca: Sure, I think it's always a concern when we put advocacy demands in the hands of law enforcement and allow them to define how they're going to happen. The reality is that when we say defund the police, yes, we mean fewer police on the streets, but we also mean less surveillance. Swapping one for the other is not the solution we would like to see, and that's what makes it so important for legislators and elected officials to not simply take the high-level comments and demands made by advocates, but to actually involve advocates and the community in the process of developing the solution, so that the police who are walking around the streets are not replaced by some form of AI surveillance.
It is just remarkably important that the process of finding solutions to de-police our communities is done in concert with communities and advocates, and that electeds, as I said, legislators, and police don't find a new way to police us.
Brigid: We're going to try--
Albert: If I could--
Brigid: Albert, I'm going to cut you off because I want to try to get one more caller in here before we have to wrap up. I want to go to Derek on the Upper West Side. Derek, welcome to WNYC. What did you want to add to this?
Derek: Thanks for taking my call. One of your guests mentioned that this LEO Systems claims on their website that they speak "inmate." I'm guessing that was referring to something called the IMEI, that's Ivan, Mary, Echo, Ivan, which is a unique identifying address for all cellular devices, so that any cell phone, smartwatch, anything that communicates in cellular land has a unique identifying number. That's how people and devices are tracked.
I think that's what LEO Systems is trying to say: that they can break down the codes and track devices. That's what the police often use. I just want to mention, I used to be a teacher in the South Bronx. Back in the '90s, the NYPD used to go around and collect all the yearbooks from the Black and Puerto Rican high schools and use the photos in the yearbooks in their photo arrays for criminal identification, without the students' knowledge, of course. I think the City Council stopped it right around the time that Rudy Crew became Chancellor.
Brigid: That is so disturbing. Derek, thank you. Let me turn to Bianca because I think Derek was making a point to something that you had explained. Was Derek correct in his understanding of how this company was marketing the technology?
Bianca: No, actually, LEO Technologies has a product called Verus. Verus, specifically, is real-time intelligence that listens in on just the wall phones in a prison or jail. It has nothing to do with separate devices, just wall phones. It uses keywords and phrases to listen [inaudible 00:30:44] conversation.
Brigid: You said it spoke "inmate", and you--
Bianca: Yes. That's their language.
Brigid: I-N-M-A-T-E. "Inmate," that's what you were saying their technology speaks. Which, as you raised, has many implications.
Bianca: Correct. [crosstalk] That's not okay. Right, exactly.
Brigid: Right. [chuckles] Okay. We still appreciate, Derek, your contribution and your perspective, particularly having been a New York City teacher. We're going to have to leave it there for today with Bianca Tylek, Executive Director of Worth Rises, and Albert Fox Cahn, Executive Director of the Surveillance Technology Oversight Project, who joined us to talk about the data and information systems that track individuals. Thank you both so much for being here.
Albert: Thank you so much for having us.
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.