The rights of the abuser vs. the rights of the abused within the tech we build
When we don't explicitly call out differences among our users, whose goals are we prioritizing? Are we inadvertently reproducing existing forms of oppression?
A concept I’m going to explore in more detail in my book is the idea that when we design a piece of tech, we are always prioritizing certain types of users, though we rarely realize we’re doing it. I’m not talking about doing design research so that we can design the best product for the people who will use it, or about prioritizing personas or real users when we make design decisions, but about something at a larger and more nebulous scale. This prioritization is easy to see in social media, where the rights of those harassing others through the platform are prioritized as “free speech” over the rights of those on the receiving end to not be harassed. Put another way, social media platforms prioritize the rights of abusers over the rights of the abused.
Elizabeth Yardley is a criminologist who researches gendered violence and homicide and has a lot of smart things to say about this. Last year she wrote an article in the journal Violence Against Women detailing the rise of technology-facilitated domestic abuse and describing how people researching and writing about this problem need to view it through the lens of patriarchal legacies of misogyny and sexism. If we don’t, she argues, we will continue to “prioritize abusers’ freedom to do harm over rights of survivors to be protected from harm.”
In a blog post expanding on this idea, Yardley writes:
The playing field is anything but level, the odds are stacked against female victims of men’s violence. Neoliberalism and the so-called ‘freedoms’ it advocates are not equally accessible to all. It is inherently patriarchal, built on the same misogynistic ideologies and values that have been with us for centuries. It masquerades as a force for ‘freedoms’ but it is a wolf in sheep’s clothing. In neoliberal political-economy, abuser’s freedoms to do harm outweigh women’s rights to be protected from harm.
Nowhere is this more relevant than when we look at the big tech firms whose products are re-purposed for abuse. There is the assumption that the owners and users of online accounts are one and the same, that mutuality and trust exist between people sharing a residence, that end users don’t intend to use their tech to do harm and that the main ‘risks’ around technology come from outside of the home from hackers, fraudsters and identity thieves. The fact that women are in more danger from those inside than outside of their homes has not registered with big tech.
In addition, social media platforms refuse to remove vile and defamatory content posted by abusers because to do so would infringe their ‘freedom of speech’. Companies selling spyware products used in domestic abuse market them as ‘legitimate’ through claiming they are for child safety or employee monitoring. Some are so blatant as to explicitly target abusive men, using lines like ‘Catch your cheating wife’ in their publicity materials.
The assumptions that users sharing a residence have trusting relationships, and that users don’t want to use our tech for harm, are key factors in how designers inadvertently prioritize the abuser’s goals over the victim’s. We make these sorts of assumptions all the time, but they’re so deeply baked into our psyches that we usually have no idea we’re making them. This sort of thing goes beyond the push to stop assuming our users are white or male or cisgender - it gets into thinking about what types of relationships our users have and whether, when presented with an opportunity to exert power and control over someone else through a piece of technology, a user will seize that opportunity and do harm (the answer to that is always yes). There are so many assumptions that need to be dismantled in order to truly design for safety that they’re getting their own chapter in my book. But the assumption that we’re not prioritizing the goals of abusers is a big one. Because very often we are. And we’re doing it because we’re not taking the time and energy to really think through this side of the tech we build.
We have these assumptions in tech and design because they’re deeply embedded in our society and culture, and reflected in so many laws, agencies, and realities - just look at the tiny percentage of rapists who see jail time, the enormous number of women abused by male partners who never seek legal help, and the survivors from both groups who are badly mistreated by police, lawyers, judges, and wider society. The ideas that domestic abuse isn’t that big of a problem, that when it does happen it’s a “private family matter,” or that women who are raped somehow brought about their own rape because they were drinking, or out at night, or trusted the wrong person, remain pervasive in our media, news, and laws, though feminists have made serious strides combatting these narratives. But the embedded nature of these ideas means that when we don’t think about them explicitly in our work, we end up inadvertently reproducing them. There truly is no such thing as “neutral” when it comes to our work - we are always prioritizing some group, whether we know it and acknowledge it or not.
Sasha Costanza-Chock is a designer, activist, and the author of Design Justice, which is simply an incredible book; if you’re interested in themes of design, privilege, and power, you should order it today. In Design Justice, Costanza-Chock explores the concept of universal design (the idea that the things we design must be accessible to the widest set of people):
UD [universal design] discourse emphasizes that we should try to design for everybody and that by including those who are often excluded from design considerations, we can make objects, places, and systems that ultimately function better for all people. Disability justice shares that goal, but also acknowledges both that some people are always advantaged and others disadvantaged by any given design, and that this distribution is influenced by intersecting structures of race, class, gender, and disability. Instead of masking this reality, design justice practitioners seek to make it explicit: we prioritize design work that shifts advantages to those who are currently disadvantaged within the matrix of domination.
The idea of ending the practice of masking the realities of our users and pretending we can design something that sort of works well for everyone, and instead explicitly stating the differences among our users and then prioritizing the most vulnerable, is something that exerts a powerful pull in my work designing for safety. It’s something I’ve been trying to articulate for over two years but didn’t have the proper language for until I read this book.
Designing for safety means recognizing that the status quo is stacked in favor of abusers and against survivors/victims, and then explicitly prioritizing the rights of our users to not be abused (or equipping them to quickly understand the abuse and put an end to it). This is the opposite of how tech is designed now. For example, the affordances of a Nest thermostat prioritize the goals of those who would use it for abuse (messing with the temperature, turning off the heat in the middle of a freezing night, etc.) over the goals of survivors to not be abused and to understand and end abuse quickly. In this case, a history log of activity would achieve the goal of the survivor being able to understand that the abuse is happening, and a UI that prioritized removing another user would help her end the abuse.
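To make the “history log of activity” idea concrete, here’s a minimal sketch in TypeScript. Everything in it is hypothetical: the ActivityLogEntry type, the recordChange and formatHistory functions, and the field names are mine for illustration, not anything from Nest’s actual product or API. The part that matters is that every change is attributed to the account that made it, so a survivor reading the log can see exactly who turned off the heat, and when.

```typescript
// Hypothetical sketch of a per-device activity log. None of these names
// come from Nest's actual API; they're illustrative only.
interface ActivityLogEntry {
  timestamp: Date;   // when the change happened
  userId: string;    // which account made the change
  action: "set_temperature" | "turn_off_heat" | "remove_user";
  detail: string;    // e.g. "Set to 55°F at 2:00am"
}

const activityLog: ActivityLogEntry[] = [];

// Record every change along with the account that made it.
function recordChange(
  userId: string,
  action: ActivityLogEntry["action"],
  detail: string
): void {
  activityLog.push({ timestamp: new Date(), userId, action, detail });
}

// Render a plain, human-readable history so anyone in the home can see
// who changed what, and when.
function formatHistory(): string[] {
  return activityLog.map(
    (entry) => `${entry.timestamp.toLocaleString()}: ${entry.userId} ${entry.detail}`
  );
}
```

The attribution is the design choice that matters here: a log that only said “temperature changed,” without naming the account that changed it, wouldn’t get the survivor to that understanding.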
A second example is the “share my location” feature in Google Maps, which abusers can easily enable for stalking purposes by accessing their victim’s phone (the idea that passcodes keep unwanted people out of phones is another assumption addressed in my book). A monthly summary of who the user is sharing their location with is sent to their email, but that gives an abuser a long while to secretly monitor someone’s location, again prioritizing the abuser’s goals and rights over the goals and rights of the person being stalked. In this case, daily reminders, or even a persistent design such as the blue bar that appears at the top of iPhones when Google Maps is in use, would prioritize the goals of the survivor to not be abused by helping her quickly understand that the “share my location” feature is being used for stalking.
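To illustrate why the cadence of those reminders matters, here’s another small hypothetical sketch in TypeScript; again, the LocationShare type and dueForReminder function are mine, not anything from Google Maps. The only point is the window of time involved: a monthly summary can leave someone silently tracked for weeks, while a daily reminder shrinks that window to a single day.

```typescript
// Hypothetical sketch; these names are illustrative, not Google Maps' API.
interface LocationShare {
  sharedWith: string;          // the account receiving the location
  startedAt: Date;             // when sharing was turned on
  lastReminderAt: Date | null; // when the phone's owner was last notified
}

const ONE_DAY_MS = 24 * 60 * 60 * 1000;

// Decide whether it's time to remind the phone's owner that their
// location is being shared, given a reminder cadence.
function dueForReminder(share: LocationShare, cadenceMs: number, now: Date): boolean {
  const last = share.lastReminderAt ?? share.startedAt;
  return now.getTime() - last.getTime() >= cadenceMs;
}

// Example: sharing was secretly enabled three days ago.
const share: LocationShare = {
  sharedWith: "someone@example.com",
  startedAt: new Date(Date.now() - 3 * ONE_DAY_MS),
  lastReminderAt: null,
};

dueForReminder(share, 30 * ONE_DAY_MS, new Date()); // false: a monthly summary stays silent
dueForReminder(share, ONE_DAY_MS, new Date());      // true: a daily reminder surfaces it today
```

A persistent indicator goes further still: there’s no window at all, because the sharing is visible for the entire time it’s happening.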
In order to shift our tech towards both justice and safety, we need to make the differences among our users explicit and then choose to prioritize the groups that are the most disadvantaged, oppressed, and vulnerable. We can’t pretend that anything we design and build can ever be truly neutral; if we claim something to be neutral, we’re really just saying that we don’t understand how different groups are advantaged and disadvantaged and how we’re reproducing those differences in our work. Instead, we need to call out these differences and use our power as the creators of tech to explicitly prioritize the rights, needs, and goals of our most disadvantaged and vulnerable users.
A few other updates - I know it’s been a while since my last newsletter, and I’m assuming that “we’re still in the midst of a deadly pandemic” is reason enough for the delay and no one will judge me too harshly for it. I’ve also been in the thick of editing my book, which is shaping up to be something I’m incredibly proud of. But wow, is it a lot of work on top of my regular job. Throw in my third round of IVF being the longest and roughest yet in terms of side effects and recovery time, and things like this newsletter have fallen by the wayside. But - I’m really excited about the state of the book, which I feel confident is going to be a very useful guide to understanding and combatting tech-facilitated abuse.
If you know someone who’d like this sort of thing in their inbox, forward it their way. You can subscribe here. You can follow me on Twitter here. If you want to support my work, you can become a Patron or hire 8th Light to build your custom software (and pre-order my book when the time comes, but that won’t be for a while yet!).