Melissa Harris-Perry: Hi. I'm Melissa Harris-Perry and this is The Takeaway. Today started like most workdays.
Siri: 5:30 AM.
Melissa Harris-Perry: The alarm.
Siri: 6:00 AM.
Melissa Harris-Perry: A calendar reminder, it's spirit week, second grade, so I got to make sure Anna wears her orange t-shirt.
Siri: 7:00 AM.
Melissa Harris-Perry: Posted a little red heart like on Twitter for the daily sunrise photo tweeted by our Takeaway director Jay Cowit.
Siri: 8:00 AM.
Melissa Harris-Perry: The Apple Watch has an attitude and tells me to check my rings because normally I'm further along by now. Leave me alone, Siri.
Siri: Okay, Melissa, whatever you say. 9:00 AM.
Melissa Harris-Perry: I place my lunch order for delivery on DoorDash. It's not even midday, and I've already told my phone tons of information about my schedule, my work, my location, my family, my food preferences, and my health practices. Harmless enough if these events were hastily scribbled notes in a personal journal, but these data are far more public. Each of our daily smartphone interactions is a cyber breadcrumb in a personal digital trail. A trail hungrily pursued by advertisers pushing their products into our carefully crafted algorithms.
It's not just about products. It's also potentially about punishment. Law enforcement already uses digital footprints in criminal cases. For example, when the Department of Justice was looking to identify people involved in the January 6 Capitol attack, it subpoenaed cellphone data. What if your cell phone data shows you searched for a reproductive health care clinic, or that you looked up how to order abortion medication online? In March, Missouri considered a law to make it illegal to "aid or abet" out-of-state abortion.
It's not difficult to imagine using location services on a smartphone to track and prove the whereabouts of a woman seeking an illegal abortion across state lines. In the wake of the recently leaked draft opinion suggesting that the Supreme Court is ready to overturn Roe v. Wade, privacy experts are expressing concerns that if abortion or access to reproductive care becomes widely criminalized, law enforcement officials and prosecutors could use smartphone data as a surveillance tool.
I'm joined now by Cynthia Conti-Cook, civil rights attorney and current technology fellow on the Ford Foundation's Gender, Racial, and Ethnic Justice team. Welcome to The Takeaway, Cynthia.
Cynthia Conti-Cook: Thank you. Happy to be speaking with you.
Melissa Harris-Perry: I'm also joined by Yveka Pierre, senior litigation counsel at If/When/How, a national network of legal professionals focused on reproductive justice. Welcome to The Takeaway, Yveka.
Yveka Pierre: Hey, y'all! How's everybody doing?
Melissa Harris-Perry: It's a tough one. Yveka actually let me start with you on this. I'm wondering, as we're looking at this potential future, is this something that's only in the future or have we seen reproductive choices, or reproductive experiences already being criminalized?
Yveka Pierre: Yes. Yes is the quick, short answer, and yes is the longer answer. As long as folks have been able to get pregnant, people have sought ways to not be pregnant. In the United States, though, abortion is not something that has always been criminalized. Even after Roe, we have seen folks be criminalized for seeking to end a pregnancy, whether that's seeking to end the pregnancy on their own, a process called self-managed abortion, or seeking to end a pregnancy through assistance by others.
In an in-depth project that If/When/How has been doing to analyze and collect information on the criminalization of self-managed abortion, we can find, from the year 2000 to the present day, more than 60 cases where someone has been prosecuted for their self-managed abortion or for assisting someone in doing so.
Melissa Harris-Perry: All right. Cynthia, let me come to you on this. We know that there's already some level of criminalization around choices and experiences of termination but has technology been part of any of these cases? Are we just being alarmist about how our smartphone data could be used?
Cynthia Conti-Cook: Not at all. It has absolutely already been used in prosecutions. When we say that data will increase and escalate the criminalization of abortion, let's take that apart: it means two things. There's the increase from surveillance, and there's an increase from prosecution. What we're speaking about at the moment is an increase in prosecution, because that digital evidence is being included in the prosecution to, for example, illustrate, as the prosecutors see it, the intent of someone.
Already there have been examples of digital evidence. It's not what we might be thinking of, perhaps speculating about our menstrual apps and that data. It's much more common to see the text messages to your friends, the websites that you visited, or the email receipts that you get when you buy abortion pills online. It's the very commonplace type of information that everyone is creating, and that everyone is having extracted in their criminal case if they're already in the midst of an investigation and prosecution.
Melissa Harris-Perry: Why wouldn't you be protected under a fifth amendment right not to incriminate yourself by your own text messages here?
Cynthia Conti-Cook: The Fifth Amendment, unfortunately, does not extend to your digital devices or your digital information.
Melissa Harris-Perry: Just as I was like-- but is it as simple as that, that we simply aren't covered by that constitutionally protected right against self-incrimination if we self-incriminate digitally?
Cynthia Conti-Cook: The Fourth Amendment should be the protection that we have over our digital devices. The Fifth Amendment, in my opinion, should also include digital devices, but so far, courts have not exactly gone there. The Fourth Amendment protection from search and seizure has not been extended very generously by courts to digital information. In fact, the Supreme Court only included cell site location data as something that could be protected in 2018, in a case called Carpenter.
The reason that a lot of this protection hasn't come into play is a doctrine called the third-party doctrine. As long as we're sharing that data with the companies, the courts have assumed that we should not have a privacy interest in that data.
Melissa Harris-Perry: All right. Yveka, I'm wondering, in our conversations at this point about the potential end of Roe v. Wade protections, we've talked a lot about how that is likely to have a disparate impact on people capable of pregnancy, depending on poverty status, race, and obviously geography and which state you live in. Is that also true here in the context of digital surveillance? Are we expecting that there would be some categories of people who would be closer to surveillance and criminalization than others?
Yveka Pierre: I think that's absolutely right. When we're looking at the criminal legal system as a whole, we can see that the strong arm of justice, the hammer of justice, does not fall upon everyone equally. What we are seeing, and what we will likely see, is that folks who are blocked from being able to access a clinical abortion if they would like to do so, folks who aren't able to make the full decisions that they want to about their reproductive lives, are going to be the folks that are more likely to turn to self-managed abortion, or be pushed toward self-managed abortion, I should say.
Those are the folks that are also more likely to be criminalized. They are folks that live in communities that are already policed. They are folks that are already heavily surveilled, the same groups of folks who, when they come to a healthcare professional, are less likely to be believed about things like pain, less likely to be believed about their own bodies. Those are the same people who are more likely to be targeted by police, and more likely to be targeted by prosecutions, when it's time to make a decision about whether this is a tragedy or whether this is a crime.
Prosecutors are more likely to want to go after those folks, and we should name things in the room, right? These are people who are poor, people who are Black and brown, folks who live in communities that don't have a ton of access to healthcare. Those are the folks that are likely to be prosecuted. Also, when we're looking at the full span of how digital evidence gets used in the criminal legal system, oftentimes it's people who don't know their rights and what they can say yes or no to. I'm sure Cynthia is going to be able to talk about this in more depth, but oftentimes law enforcement, because they're able to not tell the truth to the people they're investigating, will just say, "I need to have your phone. Let me see what's in your phone. What's your password?" Folks just hand that information over because they don't know they're allowed to say no.
Also, quite frankly, even if some people say no, that no is not always respected. Then they can have access to your phone, and they can go through your phone and pull your text messages, and pull your searches, through whatever extraction program that police department is currently using.
Melissa Harris-Perry: Cynthia, as Yveka was talking about knowing your rights, and even being positioned, structurally or in an identity sense, to assert those rights, help us understand: what are some of the legal rights? And what are some of the technological pathways that folks can start taking to try to protect themselves?
Cynthia Conti-Cook: Sure. It's really important for everyone to know that when you are in an interaction, not only with police officers, but with social workers, with caseworkers, with probation officers, with parole officers, a request for a digital device can be refused, unless you're on probation or parole and they have already told you that your digital devices are part of what they will routinely surveil, and you've had to agree to that in writing, which sometimes happens.
In any situation where law enforcement, also including immigration, is demanding access to your digital device, if you can say no, you should say no, and hand your device over to an attorney or someone else who is involved in your representation in that case. Then the attorney can take it to the court to determine the scope of the allowed access to your digital devices. Don't hand your phone over is step one.
Step two is get really comfortable using encrypted communication. Get all of your contacts imported to your encrypted communication apps now, so that you can very casually use them just to say, "Hi," just to say, "How are you?" That way, when you are in a crisis moment, or if you're helping someone who's in a crisis moment, you don't have to worry about downloading something new and figuring it out. Same thing for anti-fingerprinting browsers: get really comfortable using them, so that what websites you go to and what you search for aren't being tracked every time you switch websites.
Do that now. Again, use it to look up what shampoo you want to buy or whatever, so that you are well versed in how that browser works. Then, when you're in a moment of panic, you're not just defaulting to these apps that are vacuuming up a huge amount of our information and then trading it.
Melissa Harris-Perry: Cynthia Conti-Cook is an award-winning civil rights attorney and current technology fellow on the Ford Foundation's Gender, Racial, and Ethnic Justice team. Yveka Pierre is senior litigation counsel at If/When/How, a national network of legal professionals focused on reproductive justice. Thank you both for joining us today.
Cynthia Conti-Cook: Thanks for having me.
Yveka Pierre: Thank you.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.