Roe v. Wade is gone, but this isn’t 1973. In some ways, it’s worse.
When the Supreme Court ruled last week that the Constitution does not protect the right to abortion, abortion immediately became illegal in several states with “trigger laws” designed to take effect upon such a ruling. It’s about to become illegal in even more states where previously passed laws restricting abortion had been blocked by federal courts.
Many people are on the brink of losing access to safe, legal abortions, and those who access or support abortion will face consequences in some states ranging from civil suits to arrest. These are grim times for abortion access.
And the forecast is even more grim as we now live in an era of unprecedented digital surveillance. I’ve spent most of my career helping protect activists and journalists in authoritarian countries, where it’s often wise to think several steps ahead about your digital privacy and security practices. Now we need to bring this mindset back within our own borders for those who provide abortion services and those who seek abortion.
The first step is operational security. Abortion providers, employees and volunteers of abortion support networks, and abortion seekers must take immediate steps to thoroughly separate their work and health care from the rest of their digital lives. That means using aliases, using separate phones and email accounts, downloading a privacy-protecting browser, and being very careful about installing applications on personal phones.
For people who are pregnant, it is important to start by understanding the existing threats. People who have already been prosecuted for their pregnancy outcomes were monitored and turned in by trusted people, including doctors. The corroborating evidence included Google search histories, texts, and emails. It’s time to consider using Tor Browser for searches related to pregnancy or abortion, using end-to-end encrypted messaging services with disappearing messages enabled for communication, and being very selective about whom they entrust with information about their pregnancy.
It’s also important to look to the future and rethink the wealth of data we create about ourselves every day – data that can now be used against us. People of childbearing potential should reconsider their use of menstrual tracking apps, which collect data that could be used against them if they are suspected of ending a pregnancy. They can use an encrypted period-tracking app like Euki, which stores all user information locally on the device – but be aware that if the phone itself is seized, whoever holds it can still read the information stored on it. People who could become pregnant should also carefully review the privacy settings of the services they continue to use and disable location services for apps that don’t absolutely need them.
But the biggest responsibility now rests with the tech industry. Governments and private actors know that intermediaries and apps often collect piles of data about their users. If you build it, they will come – so don’t build it, don’t keep it, dismantle what you can, and keep what remains safe.
Businesses need to think about ways to enable anonymous access to their services. They need to stop tracking user behavior, or at least require users to opt in first. They should strengthen data-retention policies so that data is deleted regularly; avoid logging IP addresses, or, if they must log them for anti-abuse or statistics, do so in separate files that can be aggregated and deleted on a regular schedule. They should reject user-hostile measures such as browser fingerprinting. Data must be encrypted in transit, and end-to-end message encryption must be enabled by default. They must be willing to stand up for their users when someone comes asking for data, and at the very least make sure that users are notified when their data is requested.
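To make one of these practices concrete: truncating IP addresses before they ever reach a log file means there is nothing individually identifying to hand over later. The sketch below is a minimal illustration in Python, not a prescription from the author – the `anonymize_ip` helper and the specific prefix lengths are assumptions, chosen to mirror the common practice of zeroing the last IPv4 octet and keeping only an IPv6 routing prefix.

```python
import ipaddress

def anonymize_ip(addr: str) -> str:
    """Truncate an IP address so a log entry no longer points to one user.

    IPv4: zero the final octet (203.0.113.42 -> 203.0.113.0).
    IPv6: keep only the first 48 bits (a typical site-level prefix).
    """
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    # strict=False lets us build the enclosing network from a host address.
    net = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(net.network_address)

# The truncated form is what would be written to the access log.
print(anonymize_ip("203.0.113.42"))          # -> 203.0.113.0
print(anonymize_ip("2001:db8:abcd:1234::7")) # -> 2001:db8:abcd::
```

The point of the design is that the identifying bits are discarded at write time, so no later deletion policy, subpoena response, or breach can recover them.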
There is no time to lose. If there’s one thing I’ve learned from working with vulnerable populations in authoritarian countries for a decade and a half, it’s that when things start to go wrong, they get worse very quickly. If tech companies don’t want their data turned into a dragnet against those seeking abortions and those providing abortion services, they need to take these concrete steps now.
Leaving frightened people to their own devices to determine their digital security in a world where it is difficult to understand what data they create and who has access to it is not an option. Technology companies are in a unique position to understand those data flows and change the default settings to protect the privacy rights of this new vulnerable group of users.
The Supreme Court on Friday rolled back rights by half a century, but now is not the time to shrug and say it is too late and nothing can be done. Now is the time to ask tough questions at work. You hold a world of data in your hands, and you are about to be asked to turn it into an instrument of repression. Do not do it.
While others are working to restore rights so heartlessly taken away, good data practices can help tech companies avoid being on the wrong side of history.
Eva Galperin is the director of cybersecurity at the Electronic Frontier Foundation.