
How a new generation is combatting digital surveillance

Originally published: Boston Review on June 2, 2022 by Nate File (Posted Jun 07, 2022)

We’ve generally come to accept that our devices are listening to our conversations, that our personal data is being tracked and sold, and that law enforcement tracks and stores images of our faces. But is there anything to be done about our dwindling digital privacy? While there is a growing community of people committed to protecting our privacy online, a new lab at Princeton University pays particular attention to the challenges and threats that Black and other marginalized people face under our digital surveillance state.

Kenia Hale, a predoctoral fellow, and Payton Croskey, a rising senior, are members of Princeton’s Ida B. Wells Just Data Lab, where they research ways people of color can resist this surge of digital surveillance. The lab, created and led by sociologist Ruha Benjamin, “brings together students, educators, activists and artists to develop a critical and creative approach to data conception, production and circulation,” and aims to “rethink and retool the relationship between stories and statistics, power and technology, justice and data.”

Hale and Croskey led a project this spring called “Liberatory Technology and Digital Marronage.” Their team researched various technologies that reject and subvert the growing prevalence of digital surveillance, as well as technologies creating refuges from mainstream digital society. They compiled their findings into a zine and online repository, and are developing an app, Our Space, built on that work.

Boston Review Black Voices in the Public Sphere Fellow Nate File spoke with Hale and Croskey about their research, the possibilities of technologies that respond to the needs of marginalized people, and the red flag of any digital technology claiming to be neutral.


Nate File: How did you define “technology” for your research? It seems like that word can mean anything from a computer to something like a gardening tool.

Payton Croskey: We wanted to challenge our own understanding of what technology is, so we broadly defined it as any tool used to accomplish a task. And then by putting the word “liberatory” ahead of it, the technology also had to support the increased freedom and well-being of marginalized people, especially Black people. And so, we created subcategories for liberatory technology directed at surveillance through facial recognition, voice assistants, biometric tags, and social media. We also looked at digital healing spaces that work to escape from it all. And we really wanted to make sure we focused on technology created by Black, brown, and Indigenous people.

Kenia Hale: Collectively, our society has shifted our definitions of technology away from some of the more “low-tech” forms, like gardening tools. But we really wanted to embrace those, too, because those are typically the forms of technology that most marginalized groups tend to use. For example, braiding hair and quilting are forms of technology that were used by enslaved people. The things in our Western world that we might otherwise brush off, we wanted to make sure that we were counting them, because that also opens up our perspective beyond generalized standards of what is “correct” and what isn’t.

NF: Were you concerned that promoting strategies to evade surveillance technologies might ultimately help those developing them?

PC: We definitely considered that, and you certainly don’t want to help surveillance systems improve by publicizing the ways to get around them. But, interestingly enough, many of the tools that we ended up studying tended to be forms of protest, or something that the creators wanted to be publicized so they could put a spotlight onto the corporations gathering data on users.

And a lot of these technologies were tools that the creators wanted the public to use. We found some examples of makeup that people would put on their face to hide from facial recognition technology, for example. Even though these systems will likely update their software and no longer be susceptible to such an attack, these creators are trying to increase awareness of how dangerous these systems are. They want to get conversations started so that long-term solutions can be found. Some weeks, we just could not find very many examples of liberatory tech against certain kinds of surveillance, even though we’re sure that it exists, too. It just might not have been publicized as much. But then, that just led us to brainstorm: how might we subvert these systems? Or do we think that technology is the answer in this case? And sometimes the answer was, No, we really just need to abolish these systems.

NF: What drew each of you to this concept of digital marronage and liberatory technology?

KH: I’m very much interested in the way that communities organize themselves, both online and in person, especially from an environmental justice lens. Last semester, I read “A Totally Different Form of Living: On the Legacies of Displacement and Marronage as Black Ecologies” by Justin Hosbey and JT Roane. The paper describes marronage as the practice in which people escaped slavery to form autonomous societies, where they used nature as an unruly space to organize and “freedom dream” together. And the article investigates how many of those geographies where marronage occurred are under ecological threat. Reading that made me want to understand marronage better and think more broadly about how communities of resistance organize themselves in person and online.

PC: I read Stefano Harney and Fred Moten’s The Undercommons: Fugitive Planning & Black Study, which is a series of essays exploring the Black radical tradition and how one can be in, but not of, the university. Then, through my own research, I developed my own term based off of Moten and Harney’s work, the “Augmented Undercommons,” which I’ve defined as a digital location for people to practice radical study and the creation of liberatory technology, parallel to the destructive one that we currently exist in. And I use that term to help us think through how we can be in, but not of, the technological and surveillance state. I was interested in finding ways that the average person can fight back against digital surveillance systems. And I wanted to find out how we can hide from them. How can we create new, less dangerous technologies?

And then Professor Benjamin introduced us to each other. She encouraged us to explore our interests within the lab, and has really shaped how we think about technology and the liberatory systems we seek to create. We’re extremely grateful to her.

KH: Absolutely. And so we came upon this idea of liberatory technology and digital marronage together. But I also wanted to uplift another text that Payton and I both were referencing, which is Robin D. G. Kelley’s “Black Study, Black Struggle.” One of the things he describes there, similar to The Undercommons, is how to exist within the university but not become a part of it; to amass the resources you can from that power structure, and then redistribute them to your communities. For us, that was basically a model of what marronage means in the twenty-first century. The whole point of this project was to research and collect information on these different liberatory technologies and then distribute that knowledge to activists and communities outside of Princeton University.

NF: Which technologies that you came across over the course of the project were particularly interesting to you?

PC: I’m excited by Breonna’s Garden, which is a new augmented-reality app that serves as a space for communal healing. It was created in partnership with Breonna Taylor’s family, as a way to continue her legacy of caring for people—she was an EMT and was becoming a nurse. So, to use the app, you find a safe space in your home. And looking through the camera, you see flowers growing from the floors. And as you walk around the flowers, you hear messages left by other people from around the world who are grieving the loss of Taylor, or another loved one. Taylor acts as a sort of guiding angel through the app, where she is helping people grieve and recover, just as she was in life. We often think of digital spaces as social media, where there’s a lot of toxicity, trolls, and other harmful things. But this is a digital safe space.

KH: I’ll mention two. One of them is this app called Kinfolk, which troubles the typical idea of monuments. While there is a wider debate around taking down monuments right now, this app basically asserts that there are so many missing monuments of the Black radical activists from history. As you look through the camera, the app erects a monument of a figure like Toussaint Louverture or Frederick Douglass and explains the things they did. The app is specifically meant for youth as a way to give them access to that education that they otherwise may not get, in a way that’s also accessible, because kids are always on their phones.

The other is this new social media platform called Somewhere Good. The creators wanted to make “digital gardens,” similar to Breonna’s Garden, that are alternatives to social media, where people are collectively tending to each other, like we would a garden. Gardens are also places where communities can meet each other and work through problems together. The creators considered what it would mean to create a social media platform with those values literally encoded into it. While other social media platforms are highly individualized, Somewhere Good is all about connecting through group settings and collectivity.

PC: This theme of gardens is catching on, and we seem to be going back to our roots; going back to nature, and getting away from all of this—batteries, high-tech solutions—and seeing how we can use the systems of care that nurtured us throughout history. That includes things like nature, gardens, soil, and growing our own things. It makes sense also because, like gardens, these solutions are messy. They take time. And you can’t tell a plant to grow faster; it won’t do it. It will go at its own pace.

NF: I assume that people who develop systems of digital surveillance would say that the issues we experience with them—like facial recognition technology disproportionately misidentifying Black and dark-skinned people—happen because the technology is new. And to eliminate problems like racial disparities, we need to invest in it more, not less. How would you respond to that?

PC: Laws and technology will never get better unless people do. We encode our values, beliefs, and biases into our technology. That is almost inevitable. We have yet to see a way of creating technology that is actually neutral and unbiased. The problems with technology exist because it is built within a history of enslavement and exploitation of marginalized communities. The technology is new, but the problems with the technology are not new. We’re just reforming them and reshaping them in new packaging. New software might make the technology run a bit faster, but it’s also making the discrimination run a bit faster. To fix the technology, we have to fix the people who are creating it.

KH: I think about people like Elon Musk and how he has had interest in buying and controlling Twitter, one of the biggest social media companies in the world. Meanwhile, his company Tesla is currently dealing with a lawsuit on behalf of thousands of Black employees who allege that the company has fostered a racist, white supremacist culture. It’s hard to disconnect those things. And that’s why we call technology a tool: it depends who is using it. Technology is not going to be the solution in and of itself, but it can be something that we then use toward marronage and liberation.

PC: As we thought through creating our own interventions, we considered that if it’s inevitable that we’re going to encode values into these programs, we need to be intentional about which values.

KH: Any technology that claims to be neutral, that’s a red flag. An article we read noted how facial recognition technology is being used to identify January 6 rioters; the article questioned whether that was a good idea. You would think you would want to identify these people, but at the same time you have to consider whether you want facial recognition technology to be used at all. The more that harmful technologies are used, the more they become available for people to abuse them. You know, the police might say, because of an emergency like January 6, we need to use more facial recognition technology on everyone. And once that happens, it’s not like the police would necessarily say, OK, we’ve now identified all these people, we’re not going to use that information anymore.

NF: Right, that data doesn’t just get deleted. And the rules on how they can use that data aren’t very clear.

PC: These small things can turn into something much larger and more concerning. You help train these systems to identify your face better when you use Face ID to open your phone, for example, or when your phone identifies and sorts photos of friends and family and makes it really easy to post about them. Those uses seem pretty harmless. But in reality, the software is being trained and getting better at identifying you. That means the software is also getting better at spotting someone who is protesting the police, for example. And now, because of these everyday uses with our phones, you are getting knocks at your door.

KH: I remember when I found out that those captcha prompts, where you identify all the pictures of a car to prove that you’re not a robot, are also being used to train AI to recognize things like a car or a bike or a bridge. And it’s like, what? I had no idea that my identifying a bike is now helping AI identify a bike. So many of these companies just don’t tell us what is happening with our data.

NF: Let’s talk about the app you’re developing, Our Space. It’s similar to some of the other apps you’ve mentioned, such as Breonna’s Garden, in that it creates a sense of digital marronage, of escape and freedom. Why did you choose to go in that direction?

KH: Often, you’ll see mental health apps designed to help an individual. They will prompt, “How was your day?” or track how your mood has changed throughout the week. But we wanted to resist this idea that your healing has to only be on you, an individual living under the weight of capitalism. Our Space is meant to be a digital healing space that allows you to be in community with people. And we wanted to lean into the need for interconnectedness to heal. On Our Space, you can check in with people in your network to see how they’re doing while simultaneously letting them know how you are feeling.

As we were generating ideas, we talked about how when you’re not doing well, it can be hard to figure out what you want to do to help yourself. So Our Space also lets you plan those helpful things, so that if you do feel anxious or overwhelmed, you and your network know what helps. Like, I like to watch this show, eat this kind of food, do this activity when I am stressed. And you can alert your friends that you’re feeling anxious, and then they can know to bring this snack to you so you can cool down, for example.

PC: We also wanted to make sure that we honored the idea that low-tech solutions are valuable, and we tried to infuse both high- and low-tech ideas into this. So yes, the app gives you a digital space where all of these different things are lined up and you have a way to engage with others. But it also makes space for more low-tech tools like snacks and alerting your friends that you just need company right now.

NF: Has your research, and working on this app, changed your day-to-day? For example, are you off social media now?

PC: It’s frustrating, because after doing all this research, I want nothing more than to get rid of all of it. Just throw my computer and TV out the window, get rid of everything. But it’s also very difficult to go completely dark. I’m still a college student and I have assignments to turn in. I want to see what my friends are posting on social media. I do want to use the Internet to engage with some of the people who aren’t physically close. But I don’t think it’s all or nothing. That’s why we focused so much on the simple ways that we can change some of our habits so that I can still be in control of some of my data and the way my information is being used. I don’t use my fingerprint or facial recognition to open my phone. I don’t have Alexa.

I think we’re still trying to figure out what is the right balance of having control over my information, but also not feeling like the weight is entirely on me. There’s so much pressure on individuals to protect themselves, or to protect their communities from being surveilled. But at the end of the day, large corporations and police departments are responsible for these issues, and yet I am the one tasked with doing the work to make things better.

NF: What you’re describing sounds very similar to the challenge of an individual trying to combat climate change. It’s great if I recycle and use paper straws, but my small habits are up against massive industries that are driving the problem on an enormous scale.

KH: You can feel a sort of despair, because these companies and these technologies are all-encompassing, and it can be very hard to resist them. But in the face of that, it’s comforting to think about how our communities have always resisted and found ways through similar struggles. Voice assistants, for example, are ripe for surveillance, but they also make life easier for people with disabilities, and that might be a reason to advocate for their use. At the same time, people with disabilities have existed and thrived way before voice assistants. Thinking about how our lives have been before these technologies existed can be eye-opening.

No app or technology is going to heal the systems that create so much anguish, particularly for communities of color. But until those systems are eliminated or change, we can work together as a collective, digital maroon society and figure out how to take care of each other.


Payton Croskey is a member of Princeton University’s class of 2023. She is a visual artist and tech justice scholar writing, coding, and designing a liberatory future for all who refuse to submit to technology’s watchful eye. Pursuing a degree in African American Studies and a minor in Computer Science, she seeks to uncover and develop ways of designing digital systems that protect and empower targeted communities. Payton is the creative content director for the Ida B. Wells Just Data Lab, where she extends her studies by collaborating with community organizations to address the creation, reproduction, and lack of data in underrepresented communities.

Kenia Hale graduated from Yale University with a bachelor’s degree in Computing and the Arts with an Architecture concentration. Kenia is interested in environmental justice, racial justice, and the implications of big tech and surveillance on communities of color, especially across the Midwest. As a CITP Emerging Scholar, Kenia is interested in either continuing her senior thesis research or researching Black Digital Ecologies.

Nate File is a Black Voices in the Public Sphere fellow at Boston Review, where he writes about Black liberation and modern forms of marronage. He holds an MFA in journalism from NYU, where he studied long-form reporting and writing. Previously, he has written for Philadelphia Magazine and Bedford & Bowery.