CN: Fascism, US Politics
I just got home from the Election Tech Debrief, meeting with 300 other folks who worked on the intersection of Democratic politics and tech this last election cycle. I will have several newsletters coming out of things I learned and thoughts spurred by fruitful conversations. This is the first of these.
At the conference, I remembered what it was like working in tech under the first Trump administration. Somehow, I had forgotten what it was like. How much energy it took. How much time it took. How much brainspace I used on it. How could I have forgotten? And yet, I had.
I want to share a few lessons (and memories), both for folks who've come to tech since the Trump years, and those of us who were there but have forgotten.
I have no hesitation saying that it's going to be different this time around. Last time, Trump's administration was a directionless chaos machine, pushed and prodded by various ultra right wing forces here and there. This time, they have a clear, targeted and dangerous agenda and a plan to implement it.
Also, data is in a whole different sphere than it was before. While “big data” was certainly a thing in 2017, the levels and uses of tracking and detailed data have grown far beyond anything in use back then.
However, many of the lessons are still applicable.
1. Don't write the list

Last time, we had a huge focus on the “Muslim ban”. Given that the travel ban against folks from many Muslim-majority countries was one of the first major actions after Trump took office, it was huge on our radar. There was a real fear of Trump using the major tech companies to write a list (or database) of Muslims or of immigrants for deportation. As tech employees, we called on our leadership to sign pledges not to make that list. We signed pledges not to do it ourselves.
In reality, most likely, the data to make that list already exists. However, it's still relevant. If you are asked to make a list of people, users or accounts that meet certain demographic criteria, push back. Why? How will this be used? What are the possible malicious repercussions for this?
2. Examine the dangers of every piece of data you store

I don't see folks having this conversation anymore, but in the early Trump years, we talked a lot about the repercussions of the data about people you store. Adding a column for “religion” or “country of origin” may seem innocuous, but it can cause great harm. Even a column for “pronouns” or “gender identity” could be used for harm. At the same time, there are times when there really is a use to store this kind of data.
The key is to reflect on whether this data is actually relevant and important, and who might be hurt by it. Is there a way to reduce or mitigate the harm? Do you really need to store this data? If it isn't needed to serve valid user needs, can you adjust your software so it doesn't need this information at all?
I've been doing this to an extreme, with some of my work in the reproductive justice movement. When building software for patients who may be seeking an abortion from states where abortion is illegal, we are extremely careful in what data, if any, we store. We don't store anything we don't have to, we delete data as soon as possible, and we think about what data might be incidentally stored that we aren't thinking about.
What data about people are you collecting? Where are you storing it? How are you storing it? How long are you storing it? How is it vulnerable not only to hackers, but also to lawyers and subpoenas? How would data be traced back to people?
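One low-tech safeguard that makes “we delete data as soon as possible” real is a scheduled retention job that hard-deletes anything past its window, so deletion doesn't depend on anyone remembering to do it. This is a minimal sketch, assuming a hypothetical `messages` table with a `created_at` timestamp; the table name and the 15-day window are illustrative, not from any real project.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 15  # illustrative window, not a recommendation


def purge_expired(conn: sqlite3.Connection) -> int:
    """Hard-delete rows older than the retention window.

    Returns the number of rows removed. ISO-8601 UTC timestamps
    compare correctly as strings, so the SQL comparison works.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    cur = conn.execute(
        "DELETE FROM messages WHERE created_at < ?",
        (cutoff.isoformat(),),
    )
    conn.commit()
    return cur.rowcount


# Demo with an in-memory database and a hypothetical schema:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (body TEXT, created_at TEXT)")
old = (datetime.now(timezone.utc) - timedelta(days=30)).isoformat()
new = datetime.now(timezone.utc).isoformat()
conn.execute("INSERT INTO messages VALUES ('old', ?), ('new', ?)", (old, new))
removed = purge_expired(conn)  # removes only the 30-day-old row
```

A job like this run on a schedule (cron, a worker queue, whatever you already have) turns your retention policy from a promise into a mechanism.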
We're all going to need to move back to this type of reflection.
3. Think about your subprocessors

When thinking about protecting the data of people seeking abortion from states where abortion is illegal, we spend a significant portion of our time thinking about our subprocessors or vendors. If you only store data encrypted, but you send that data for processing in a service that doesn't store it encrypted, then you aren't safeguarding the data.
If you delete your data every 15 days, but your subprocessors don't, then you aren't deleting the data. Don't forget about your backups. Backups are often a weak link in security.
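One pattern that reduces how much you have to trust a vendor in the first place — a sketch of a general technique, not a description of any project's actual stack — is to pseudonymize identifiers before they ever leave your systems, so the subprocessor only holds keyed hashes it cannot reverse without your secret. The payload shape and key handling below are hypothetical.

```python
import hmac
import hashlib
import secrets

# In practice this key lives in a secrets manager, never in code,
# and rotating it unlinks all previously issued pseudonyms.
PSEUDONYM_KEY = secrets.token_bytes(32)


def pseudonymize(identifier: str, key: bytes = PSEUDONYM_KEY) -> str:
    """Replace an identifier with a keyed hash (HMAC-SHA256).

    The same person always maps to the same pseudonym, so the
    vendor's processing still works, but without the key the
    vendor cannot link the hash back to a real identifier.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


# Hypothetical payload sent to an analytics subprocessor:
record = {"email": "patient@example.com", "event": "appointment_booked"}
vendor_payload = {
    "user": pseudonymize(record["email"]),  # vendor never sees the email
    "event": record["event"],
}
```

A plain unkeyed hash isn't enough here — anyone can hash a guessed email and check for a match — which is why the sketch uses an HMAC with a secret key you never share with the vendor.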
4. Push the folks in power to hold the line

One of the things we spent a lot of energy on (which I had entirely forgotten, or perhaps blocked out of memory) was pressuring our C-suite. We were building open-source infrastructure tools, and we had the federal government as our clients. We pressured our executives to take a stance on issues like not aiding the government with making a Muslim ban. We pushed them, and asked hard questions in all-hands meetings about which federal agencies we were contracting with. How exactly was it that we were contracting with the Department of Homeland Security, but not Customs and Border Protection? Were we sure they weren't using our software? How was the Army using our software? How could we make sure that our open-source software wasn't used for evil?
At the end of the day, there were not always answers. I'm not sure there is a way to make sure that OSS isn't used for evil. I don't feel fully confident that CBP never used our platform and the software we built. However, it's important to push folks in power to have the conversations. I'm proud that we kept pushing the conversation.
At the same time, it was exhausting. It was morally exhausting to feel like we had to be the finger in the dike, holding back our software from aiding fascism. Those were dark, hard days, but we need to find ways to fight these fights sustainably. While there is a place for opting out fully from software with these ambiguities, I do think there's also a place for moral techies pushing back against executives who might be happy to capitulate and aid fascists.
5. Bureaucratic delays and refusals to comply make a difference

Someday, you may be asked to do things you think are not right, not moral. Maybe your boss comes to you sheepishly, asking if you can build a feature. Or maybe it's blatant, with no discussion of the potential harm a feature could cause.
Stand up. Say no. I would like to say I clearly remember every detail of the time I said no, but to be honest, I don't. I think my adrenaline was high enough that I didn't fully record the memories. I couldn't tell you, today, what the specific issue was. But I won't forget the day and a half a coworker and I spent trying to figure out how to implement this feature request ethically. Maybe there was a way around that nagging worry we felt. Ultimately, we concluded that there was no way to do this that we felt morally okay with. It was scary to push back and say “No. It's the wrong thing to do, and we won't build it”. But I slept much better at night, knowing I had said no.
Fascism runs on the compliance of folks who disagree but don't resist. Delay. Gum up the works. Say no.
This is not going to be a fun ride. People will die because of this administration (people already have). It's important not to be so close to the news all the time that it overwhelms you into paralysis. But also, it's important to act. Fascism wins if we cooperate and do nothing.
P.S. If you enjoyed this newsletter, please pass it on to a friend! And if you have thoughts, or your own lessons on resisting fascism from the first Trump term, I'd love if you shared them.
If a friend forwarded you this email, you can subscribe here
If this post sparked some thoughts for you, I'd love if you reply and start a conversation with me about them.
b'vracha, Caroline Taymor