Ruha Benjamin

Podcast Description

“When I think about the role of technical systems…the inequity becomes even more encoded, subtle and it may seem like it’s operating where race is not explicit…The designers are not taking explicit note of race and yet racial disparities are being reproduced.”

Ruha Benjamin is Associate Professor of African American Studies at Princeton University, founder of the JUST DATA Lab, and author of Race After Technology: Abolitionist Tools for the New Jim Code (2019) among many other publications. Her work investigates the social dimensions of science, medicine, and technology with a focus on the relationship between innovation and inequity, health and justice, knowledge and power. Professor Benjamin is the recipient of numerous awards and fellowships including from the American Council of Learned Societies, National Science Foundation, Institute for Advanced Study, and the President’s Award for Distinguished Teaching at Princeton. For more info visit  



Kim Crayton: Hello everyone, and welcome to today’s episode of the #CauseAScene podcast. I have been very patient in waiting on this guest to find some space in her schedule. I have looked at my DMs, and I actually reached out to her on May 29th. So I’m very happy that her schedule has opened up and allowed us to have this very important conversation. So I want to introduce everyone to Ruha Benjamin. Ruha, would you please introduce yourself to the audience?

Dr. Ruha Benjamin: Absolutely, and it’s really a thrill to be on the show; to be in conversation with you on this fantastic show. So, my name is Ruha Benjamin, and I’m trained as a sociologist, I am a professor of African American studies at Princeton University and I study the social dimensions of science, technology, and medicine with a focus on the relationship between innovation and inequity.

Kim: Whew! OK. So, as I always start, why is it important to cause a scene, and how are you causing a scene?


Dr. Benjamin: We inherit a bunch of scripts, a bunch of roles, structures that shape our day-to-day lives, that shape our opportunities, and I think it’s important to disrupt those patterns that we’ve inherited. And so, when I first started college, I was a drama major, and quickly turned to sociology, because for me sociology was about the drama of life. And in part, it’s about really questioning the various positions that we take for granted—the natural order of things—and so disrupting all that’s taken for granted about the social patterns and hierarchies and categories that we’ve inherited is really important for me as a social scientist.

And through my teaching, through my public engagement, just through living my life, I really encourage the people around me to do the same, because it’s not something that can change through just individual effort, but through organizing and collective movement building, in terms of changing the status quo.

Kim: OK, so I… [sighs] people, people, people. I have highlighted the hell outta just the introduction of the book. So the book, Ruha wrote is “Race After Technology” and boy oh boy—well first of all, there’s a beautiful Black person on this cover, and I was like, “Ahhhh yesss. [Dr. Benjamin laughs] Yesss.”


And I’m just gonna read you some things that I—I mean, again, this is just the beginning of the introduction, not even the entire introduction, this is just the beginning—I’m gonna read you some things that I highlighted, and it goes back to what you just said: the natural order. Because we’ve started collectively questioning what the natural order is—or what we’ve been told the natural order is—and we’re finding out that it’s not so natural after all. [Laughs] So I’m gonna do this like I do the book club everybody, if you wanna follow along and find it, it’s on page four, the first thing I’m—the first quote is:

[Reads from “Race After Technology”]

But the presumed blandness of white American culture is a critical part of our national narrative, scholars describe the power of this plainness as the “invisible center”…

And that’s in quotes.

…against which everything else is compared, and as “the norm” against which everything else is measured.

God damn.

Dr. Benjamin: Mhm. [Laughs]


Kim: Whewwwwww! Man, these were just come-to-Jesus moments for me.

[Continues reading]

To be unmarked by race allows you to reap the benefits but escape the responsibility of your role in an unjust system.

[Exhaling] OK, on page 5:

They calculated that the racial gap was equivalent to eight years of relevant work experience, which white applicants did not actually have, and the gap persisted across occupations, industries, employer size, even when employers included the equal opportunity clause in their ads.

And then I want you to… this—mmm—the “New Jim Code.” Did you—is this a term that you created?

Dr. Benjamin: Yeah, exactly.

Kim: OK.

Dr. Benjamin: It’s a riff off of Michelle Alexander’s well-known concept.


Kim: Exactly. So lemme read you the definition, everyone, of what the “New Jim Code” is and we’ll get into this discussion.

[Continues reading]

The New Jim Code: the employment of new technologies, that reflect…

I need you to hear this, people, because I talk about constantly that tech is not neutral, nor is it apolitical. So:

The New Jim Code: the employment of new technologies that reflect and reproduce existing inequalities but that are promoted and perceived as more objective and progressive than the discriminatory systems of previous eras.

Dr. Benjamin: Mhm.

Kim:  OK, so… mmm. That just… when I was reading that it was so much confirmation, and I’m loving all the learning—I tell people, I do this podcast because I’m learning. It’s helping me understand what it really means to be antiracist and what it really means—and why we’re having so many problems in tech. And the work that’s really gonna have to happen to stop causing harm.

Dr. Benjamin: Absolutely.


Kim: So, if you can give me a background—I know you’re a sociologist—how did you get to this? How did you… this thing here? [Laughs]

Dr. Benjamin: Yeah. So, I’ve really—again, we’ve gotta go back to Spelman College, where I went to undergrad, a historically Black women’s school. This is a context in which you have people who share similar demographics that draw us there, but one of the things you realize right off the bat is all the fault lines under that category “Black women”: all of the regional differences, differences of class, nationality, sex and gender, et cetera.

And so, part of it is thinking of how difference is produced, and for my senior thesis I looked at the role of medicine in naturalizing differences, in making us fail to question these differences because of the authority of science and medicine. So I did a thesis there looking at obstetrics, gynecology, reproductive justice. I followed Black midwives in the South who continue to practice.

And so that got me interested in the authority of science and medicine, and how Black people have always questioned it, both epistemologically and through our practices. The fact that you still have Black midwives practicing in contexts in which their expertise is not valued, or in some places criminalized. So I wanted to juxtapose the great authority and investment of power in this conventional site of medicine against the more subversive forms of Black midwifery that I had the pleasure of learning from.


And so that got me started along these lines, and I take the same questions that I started with at Spelman to my work as a graduate student, then looking at the life sciences, stem cell research. My first book [“People’s Science: Bodies and Rights on the Stem Cell Frontier,” 2013] was on the forms of exclusion and inclusion there in California with the big stem cell grant that was implemented.

And so, again, it’s thinking about who’s included and excluded; whose world views, whose visions of a good life, are embedded in a particular scientific discipline or policy. And those questions are relevant to every field. So now, when we come to the data sciences, to computation, machine learning, AI [artificial intelligence], those underlying questions—which are questions about the social inputs into technology, not simply the outputs; not simply the impact of technology—are relevant. A few years ago, I kept seeing these headlines and hot takes about so-called racist robots.

And what was interesting to me was the novelty with which it was being framed, the kind of feigned shock and awe that, “Wow, all these technical systems could be biased,” but anybody who has studied the history of racism—as many scholars and activists have done—would tell you that a computer system, like other kinds of systems—whether the legal system, educational system, healthcare system—reflects the values, assumptions, biases, desires, interests of the people who produced it. And so, the fact that we put science and technology in a bubble, as if it’s asocial, apolitical; as if there aren’t human beings behind the screens actually designing these things, to me indicates a kind of—it’s not even ignorance, it’s a kind of feigned shock that human beings are really behind the screen, you know. And an unwillingness to look squarely at the social processes that produce these fields.


And so, I wanted to put my own training in conversation with these developments, rather than take the framing that I saw in these headlines about racist robots—I wanted to put it in conversation with Critical Race Studies that’s been looking at how racism is productive. It produces things, it constructs worlds, it helps to create value for some even as many people suffer from it.

So that’s what this book is trying to do, it’s trying to bring together Critical Race thinking and put it in conversation with science and technology studies in order to not necessarily—and like you mentioned in your intro that it confirmed for you many things,  I think many people are already talking about many of the insights in the book, so for me it was about pulling together the conversations that I saw out there, rather than pretending like I was coming up with something wholly new. It was providing a language and pulling together these strands that I saw dispersed throughout public discourse so we can have it all in one place, and use it—the book itself—as a tool to advance a social justice agenda vis-a-vis science and technology.


Kim: Whew, OK. Man, alright. So, two things popped out for me: one is about your graduate study work, the reproductive health piece. But I wanted to make sure I hit on… [sighs] the feigned, “Oh, humans?” thing, because this is the issue that we keep bumpin’ up against: we are quick in technology to try to extrapolate out the human in everything. It’s all automated. Or automate things as quickly as we can, you know? So, get the humans out of it, get the humans out of it, because they’re the bias. And I just… [sighs, then laughs]

I’m just like, “Oh lord, we have to really stop with the quant stuff here.” Qualitative research, qualitative perspective has just as much value, and helps to explain; they both work hand in hand. I see that all the time; I have some clients and I had them doin’ some research on creatin’ a job description for a COO—a Chief Operating Officer—because that to me, outside of the CEO, is the most important job in the company, but that’s not what tech believes.

Because most of these companies don’t have a business; what they have is a scaled product or service. They don’t have any processes, nothing in place that can be replicated, that they can even actually measure quantitatively or qualitatively; is it causing harm? What’s the… none of this stuff. They just keep running ’round—and it was interestin’ that when he was doing the research he kept comin’ across these articles, and we were talking about it: “Yeah, this is a role that you start later once you’re makin’ X amount of dollars, and you want to automate it out as quickly as possible.”

I’m like, “The only human role that’s in this, the one role that is about touching people and impacting people, you want to automate out.” It’s quite interesting for me.


Dr. Benjamin: I wanna… that reminds me of a phrase that one of my colleagues uses to describe technology—now this is Donna Haraway—she calls technology a “frozen moment.” And so, rather than actually excising the human out of it, what we do is solidify a set of human decisions, assumptions, values at the point of creating a technology, which then continue to persist. It freezes those decisions in time and codes them, and then it has an allure of objectivity and neutrality, but really what it does is make the human decisions at that point in time more rigid, right? It encodes them in a particular system, and we see how this completely backfires on us.

One example that comes to mind recently is in the state of Michigan when the governor introduced—Governor Snyder at the time—introduced an automated decision system to flag people who were being suspected of employment fraud, and it flagged over 40,000 residents in Michigan. Only years later do we find out that it was wrong in over 90% of the cases. Meanwhile, people went bankrupt, people lost their homes, committed suicide, divorced, were payin’ back the state money that they didn’t owe. And the state coffers grew and grew and grew, and meanwhile peoples’ lives—this automated system wreaked havoc on people’s lives.

But again, the human oversight—there was no real human oversight. And whatever the human decisions were at the beginning, faulty as they were, were encoded and presumed to be efficient and objective and fair, and they weren’t.  And so, again, it underscores the point that you’re making about the fallacy of presuming that we ever really get the human out of it. We just solidify and make our bad decisions more rigid.


Kim: And you speak to that, ’cause you can look at social media platforms right now, where the beginning idea was “move fast, break things.” People keep saying, “Oh well, that philosophy has changed.” No. That philosophy is ingrained in this community. Period. I don’t care if Facebook changes—they can write it on the wall that says it’s been changed—that is the psyche of this community. As well as this “free speech” stuff.

And we see that with privileged individuals: because they don’t have the experience of being on the negative end of what’s considered free speech, to them all speech is equal. And so once you ingrain, once you code that into a system—and I’m so happy you said that—now that’s coded into the system, so no matter how many marginalized people, vulnerable people are telling you, “I’m being hurt,” because it doesn’t match what you’ve encoded, you argue with them. Their lived experiences are not as important as what you’ve encoded in this thing. And it’s quite interesting.

So you just hit the—again, I love these conversations—because you just connected a dot for me of what that is called, ’cause I didn’t know that had a—you had a friend who has a name for it.

Dr. Benjamin: Yeah!

Kim: Because that’s exactly what it is: it’s decisions that were made ten, fifteen years ago…

Dr. Benjamin: Frozen. Frozen in the system.


Kim: …when we weren’t even at the table to say, “Hey, this shit’s not gonna work.” And now that we’re sayin’, “It’s not gonna work,” you’re sayin’, “Well that’s what the system says,” but the system is wrong. [Laughs]

Dr. Benjamin: Mhm. Yes.

Kim: And I wanted to talk to you about the reproductive thing, because my audience is largely white people. And I try at every moment—when I bring someone on and realize they have a specific expertise—to have that conversation. Because this reproductive health thing is quite interesting to me. Because I’ve been tryin’ to make folx understand that white supremacy is the parasite that’s now eatin’ on its host.

So with all these anti-abortion laws that are being passed—and you have it on both ends, now that I think about it, with the stuff you were studying as an undergraduate—you can speak to this specifically. These laws, young ladies, young white ladies specifically, are for you. They don’t care about our babies, because if they did, our babies and mothers wouldn’t be dying and wouldn’t have the deplorable health care and mortality rates that we have. What they’re doing is tryin’ to force you white women to have babies. Can you speak to that?


Dr. Benjamin: Yeah. I think part of it, again, is even defining what we have in mind when we say “reproductive rights” or “reproductive justice.” Because when you just look at it at the point in time of choosing to carry a baby to term or not, that narrow circumscription of what the problem is leaves out so many other processes and practices that actually should be enfolded in the idea of reproductive justice. And this is the contribution of Black feminists in particular: to say that our education policy is a part of reproductive justice. Health care access well before you get pregnant is part of reproductive justice. Our zoning laws, our investment in job opportunities, everything along the life spectrum is part of reproductive justice. So don’t talk to us and pretend like you care about the health of our babies when every other indicator, every other element of our lives, has blocked opportunities.



Dr. Benjamin: …has blocked opportunities; lack of investment. And so, it really reveals the lie embedded in the discourse of “pro-life,” when no other policies in society—policies that would actually engender a good life for people—actually support that vision. And so, for Black feminists, it’s always been a matter of saying that reproductive rights have to encompass the whole lifespan. And increasingly what we find is that people who work in maternal and child health are adopting this lifespan model, because they understand that how a woman is treated, what opportunities she has before she ever gets pregnant, actually impacts the health of a baby.

So we know even from a biological model that that is true, but from a sociological model, it has to be that when we’re giving lip service to anything related to reproductive health, we have to talk about all of the policies in the society: our tax policy, our water policy, our education policy. And unless you’re consistent, and you’re actually making life-affirming investments throughout our social structure, just focusing on the point of pregnancy reveals the lie embedded in the entire discourse around pro-life.

Kim: Mhm. Thank you so much. I love brilliant Black women. Thank you. And I’m gonna keep bringin’ brilliant Black women on here, because I also recognize that many—because of white supremacy—they don’t see us as anything but this… they don’t see us as women; they don’t see us as smart; they don’t see us as anything. So I’m gonna keep highlighting these stories, these women, this knowledge, because I continue to say Black women are the moral compass of this country. And without us, we would be in such… woo! Y’all think y’all got it bad now? And we’re doin’ it because we know that the shit rolls downhill; if we don’t do this work, we will be the first ones impacted. So we do this work and you benefit as a by-product of our work. And I’m gonna continue to bring these beautiful Black women on this show.


So I wanted to talk to you about two things. One of the things is—and again, just in your introduction, people—I’m not even askin’ you—you can read the whole book; you’re gonna learn a lot just in this freakin’ introduction.

And I might make this actually one of the books—so I started a “How to Be an Antiracist” podcast that I do on Sundays. And so, we’re going through Dr. Kendi’s book. The next book we’re gonna be goin’ through is Nell Painter’s “The History of White People.”

Dr. Benjamin: That’s a tome. [Both laugh]

Kim: So I’m thinkin’ about adding this, because I will have no white authors in this, ’cause we’ve done enough of that. We need some perspective from some Black people, and this really speaks to—because my area is tech—so this really speaks to that.

So I really want you to talk about two things. You’re sayin’ that your idea or theory of the New Jim Code came from someone else, so I want you to talk about how that evolved. And also, something very important—and you talk about this, again, in the introduction—the difference between equality and equity.


Dr. Benjamin: Mhm. Sure. So, to the first question about how the concept evolved: I was definitely influenced by the work of Michelle Alexander [“The New Jim Crow: Mass Incarceration in the Age of Colorblindness,” 2010] and the body of work around critical race studies, because one of the core issues that we wrestle with in this body of scholarship is how our society on one level seems to be changing, but there are underlying things that stay the same; underlying dynamics of domination, oppression that continue to persist. So how do we theorize the nature of change and the consistency of various hierarchies and oppressive structures?

And so of course, Michelle Alexander theorizes that through the idea of the New Jim Crow, where she’s looking at the relationship between slavery, Jim Crow, ghettoization, mass incarceration, to show how racial domination persists in a new system, and how it builds on these prior legacies. And so for me, the added value of the New Jim Code is to think about the role of technology in maintaining systems of power and domination, which is not necessarily her focus in that work.

And so, when I think about the role of technical systems, the role is two-fold. One, is that the inequity becomes even more encoded—subtle—and it may seem like it’s operating where race is not explicit. And I can give you some examples of that in very specific algorithms in which the designers are not taking explicit note of race and yet, racial disparities are being reproduced. So it’s that encoding of inequality that doesn’t rely on explicit or intentional racism.


And then the other aspect of the New Jim Code that I see as distinct is that the presumed objectivity of technology allows that domination to hide in plain sight. It allows it to penetrate every area of our life, but because people are unwilling or unable to question it, because it falls within the realm of technology, it becomes actually even more dangerous. So rather than being able to point to a racist judge, or a racist cop, or a racist doctor who is perpetuating racism, when it’s mediated through a technical system, where the output is giving more opportunity, or more life chances, to one group over another, you’re unable to question it. One, you don’t know how it was designed; you can’t penetrate it at a very basic technical level. But there’s also a kind of epistemic cloak, in which people don’t feel like you can question it because it’s technology. What I describe as an “allure of objectivity” that surrounds it.

And so, to me it makes it even more important for us to have a language to name it, and to begin to question it. And so, that’s what the New Jim Code aims to do: to give us a language to be able to call it out when we see it. And in that way, it’s a concept that’s in conversation with a number of other colleagues who are also writing in the last few years in order to name this new problem space. So, we think about my colleague Safiya Noble’s “Algorithms of Oppression: How Search Engines Reinforce Racism” [2018]; we have “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor” [2018] by Virginia Eubanks.

Kim: She’s been on this show.


Dr. Benjamin: Oh, wonderful. We have “Weapons of Math Destruction” [2016] by Cathy O’Neil; a number of other people.

Kim: She’s been on the show. [Laughs]

Dr. Benjamin: Wonderful! So, like all of us are in conversation and solidarity, we’re trying to build up a grammar to be able to name the various manifestations of this problem space because the fact is, if you don’t have a word for it, if you don’t have a language to name your reality, that means you’re still affected by it, but you can’t call it out.

Kim: Mhm.

Dr. Benjamin: And thereby, you can’t intervene in it. So, language and the power of words is the first step to be able to then take the subsequent steps to actually build a movement in order to counter this thing that you haven’t been able to name, and so, this concept in this book is one more grammar, one more element to this larger vocabulary that’s trying to build up a movement around tech justice.

Kim: Oh, my word. OK, we gonna get to the equity and equality, but you just hit on it, because as an educator, I start every conference, everything, with defining terms. Words have meaning; you just can’t change the meaning of things. And then we have to understand that words do evolve. And so it’s a thing that—people act like it’s a black and white issue: “Well, you said racism is this definition based on that, so you can’t change that.” Or, “‘They’ is used for this; it’s a plural, you can’t use it for a singular person.” I’m like, “OK, language evolves.” That’s what that is. And yet, even in that, words mean somethin’. You can’t just take the word “cat” and point to a stop sign and say, “That’s what we’re gonna call that now.” That’s not how that works.


Dr. Benjamin: Yeah, exactly. So, I think that fluidity of language, and the meaning of language is important.

Kim: And I love—and so, this popped in my head—I’m loving that I’ve had three of the four of you on my show. [Laughs]

Dr. Benjamin: Yes.

Kim: I’ve been tryna get Noble on… [mock frustration] oh, I’ve been tryin’.

Dr. Benjamin: Yeah.

Kim: Yeah, so I might need a little help with tryin’ to get her on here.

Dr. Benjamin: Sure. Sure.

Kim: But it’s interesting that it’s women. Are there any men doing this work? [Laughs]

Dr. Benjamin: Yeah, there are a few that I can point to off the top of my head. There’s a book that’s just coming out this month, so you should definitely have NYU professor Charlton McIlwain on here. His book is called “Black Software” [“Black Software: The Internet & Racial Justice: From the Afronet to Black Lives Matter,” 2020].

Kim: Oh yes, that’s on my Amazon wish list, mhm.


Dr. Benjamin: Perfect. So definitely invite him on. And then a brother named Professor Andre Brock, down at Georgia Tech, and his book will be out in a couple of months, and it’s called “Distributed Blackness.” [“Distributed Blackness: African American Cyber Cultures,” 2020]

Kim: OK.

Dr. Benjamin: He’s one of the OGs in terms of this field of race critical code studies.  He’s been writing in this area for a long time, mostly in article form, and so his book is gonna bring together a lot of those strands. And I know that people are going to really gain a lot of insight from that work. And so, yeah, there’s some brothas in the mix. [Laughs]

Kim: Cool, cool, cool. Alright, so I have those three books tagged—or three people tagged. I have Michelle, I have Andre, and what did I just do…

Dr. Benjamin: And Charlton.

Kim: And Charlton, yes. OK. Alright, so thank you for these. Yeah, someone I just interviewed mentioned “Black Software.” Oh, “How Not to Be Stupid about Race” [Correction: “How to Be Less Stupid About Race,” 2018], she mentioned “Black Software” to me.

Dr. Benjamin: Yep, Crystal Fleming.


Kim: Yes, mhm.  So… whoo! OK, now can we talk about the difference—again, now, words matter, because I tell people that I’m not interested in equality. I’m not.

Dr. Benjamin: Yeah.

Kim:  There’s no way we can be equal. [Laughs]

Dr. Benjamin: Yeah.

Kim: I say this—I’ve been sayin’ this for over a year in my talks: it would take a white man gettin’ narcolepsy and sleepin’ for years while I get every advantage, and even when he wakes up and simply crawls, because he gets the benefit of the doubt and all this other stuff, he’s still gonna end up ahead of me. So I’m not looking for equality. I’m looking for equity. [Chuckles] So can you talk about what those things are, ’cause you mention those in the book? ’Cause that’s a big thing that people, I find, screw up and make huge mistakes in when they’re trying to create inclusive environments or inclusive events, and that kind of thing.

Dr. Benjamin: Yeah. So a really straightforward way of thinking about it is that equality is very literally making two things the same, so it relies on a foundation of sameness. And it’s looking at those two things as if, again, they’re on an even playing field, as if they share the same histories; when we’re talkin’ about two groups, as if they experienced life in the same way. And so it’s just a matter of making sure that we even out the opportunities, the resources, et cetera.


Equity, by contrast, really grows out of an analysis of this history and the differences that surround people’s experiences. And so rather than trying to make things the same, it’s trying to address the actual needs that grow out of this very unjust history. And so, the goal is not to make everyone the same, but to actually address the needs at any given time.

And so for me, equity is important, but I would also say anytime we’re talking about equity, we also need to add into that an analysis of justice and really think about how power operates; and power is not simply a top-down phenomenon. It’s not just something that happens through one or two gatekeepers or overseers or people in power; power is also diffuse; it’s horizontal. We exercise power in the way that we move through the world. And so even though you might be, on the one hand, the target of an unjust system, you might also exercise forms of injustice against other people, within your group or without. So it’s also owning our own complicity in how we help to maintain forms of privilege and power.

And so really this is a conversation that doesn’t leave anyone out. None of us are on the side of the innocent or on the side of a pure victim in these systems. We have to think about how we continue to—even if it’s at the level of just internalizing the logics of domination. Like if you think about colorism in the Black community, it’s not as if there’s a white man installed in every house telling us to value lighter skin over darker skin; we have internalized these logics and then we perpetuate them. It takes the form of, say, a grandmother telling her grandkids not to spend so much time in the sun in case they start lookin’ too dark, or the way that we police one another’s hair, or all of the ways in which we internalize the logics of white supremacy and then perpetuate them.


And so that means that we all have a role in trying to combat both the system of meaning and the practices that go along with it, and so rather than spend too much time trying to only direct our energies at a boogeyman out there, I think it’s important for all of us to take into account our own role in maintaining these systems of meaning and power and try to intervene where we can in our own mind and in our own relationships, in our own workplaces, in our own families; we have a lot of work to do. And for me, that’s where I like to direct my own focus, rather than only looking to white people to change, because to the extent that I have internalized the logics of white supremacy, it is inside of me and I have to actually take that into account.

Kim: [Sighs] OK. So again, I’m getting so much confirmation. So the four guiding principles of #CauseAScene are, one: “Tech is not neutral”; two: “Strategy without intention is chaos” [correction: “Intention without strategy is chaos”]; three: “Lack of inclusion is a risk management issue”; and four: we must “Prioritize the most vulnerable.” And that to me is how I get to the equity question, because when I define privilege, it’s simply—people get all [grumbles]—simply who has access, and who gets to leverage that access? So depending on what situations we’re in, our access to privilege or our levels of privilege change. So—and this is this internal thing you were speaking of—if we come into a space and we evaluate who in this space is the most vulnerable, and make sure that they feel safe and protected, everybody else is gonna feel safe and protected.

And it also—and then, that’s the first part; part A—the part B is where I say all whiteness is racist by design. That is just what it is. People of color have to deal with the model minority myth, which is anti-Blackness. And then Black people specifically have to deal with our own internalized white supremacy and anti-Blackness. So it leaves everybody in here with the responsibility to do something when we cause harm—intentionally or not—to recognize it, to own it, to figure out how we won’t do that again, to make amends, to rebuild trust. It’s incumbent upon all of us. So just hearin’ you say that, I’m like, “OK, yeah.”


Dr. Benjamin: Absolutely. Resonates. Exactly.

Kim: Yes. It totally resonates. This is the work I’m tryna get people to understand. People when they realize, when my followers, when the listeners realize that, start really to understand what it means when I say that whiteness is racist by design, they—I wrote an article on the five stages of grief. And it’s about white supremacy or antiracist—I can’t even think of the name right then [“Dismantling White Supremacy And The 5 Stages of Grief”]—but it’s the five stages, you know? And it’s like so many white people or so many people who realize they’re complicit, they get stuck in anger or guilt. You gotta move past that. You gotta figure out how you move past it, ’cause this is an ongoing cycle; I tell people all the time that I’m doin’ this work because I have the skills to do this work as an educator, so I am educating the oppressor while I’m also processing my own oppression.

That is something people have done. So I’m gon’ make mistakes because I’m dealing with my own [chuckles] stuff here. I’m learning things, and I might have a… oh, and it goes back to—bringing ‘er back around, right?—it’s not solidified, as you said, it’s in that technology. I may have an understanding of something in January, but by October, it’s developed, and that needs to come with me. That needs to come forward. And that’s how our technologies should be behaving. They should, as we learn, our technology should be learning with us. It should be shifting and changing with us.


Dr. Benjamin: Yeah, and so many of the principles that you just outlined, I find when we apply it to technology, like even the point about prioritizing the most vulnerable; there are ways to apply that in the process of tech design. I’m thinking about my colleague at MIT, Sasha Costanza-Chock—that’s someone else who you should have on the show, by the way—who outlines the design justice principles on the website, I believe, and goes through how tech designers should be thinking about their relationship with communities. For example, how do you actually operationalize and partner with communities and find technical solutions that support the vision that people have for their own communities, rather than coming in to fix something with some application or software system?

And so, for example, one of the principles has to do with the needs of the communities always going before the needs of the designer, or if something is working in the community already, you don’t necessarily need a technical fix for it. Not everything needs a technical solution; just because you can create something doesn’t mean you should create something.

Kim: [Claps] Oh, my lord, have mercy. [Laughs]

Dr. Benjamin: And so when you really are putting first and foremost a humanistic vision of design, then in some cases you might have to put the brakes on whatever you’re thinking to create technologically. And so I would encourage listeners to check out design justice, check out the principles, and actually look at the projects that are growing out of that.




Kim: It’s—

Dr. Benjamin: There we go.

Kim: And I actually believe I have her coming on the show, now that I look at—I need to see her Twitter because I think I’ve already reached out to her.

Dr. Benjamin: Awesome. Awesome. Yeah.

Kim: Yeah. So, thank you for that one. OK. So you see this—well, first of all, I wanted to tell you—one thing that when you were talking about the sameness, what it brings to mind is why I’m not—OK, I’m not gonna speak for Black feminism, because I don’t see myself as a feminist, because I don’t see myself included in feminism—is the fact that sameness—and I talk about that all the time—feminism as it is commonly used or enacted is, “Let’s only focus on the things we have the same,” [chuckles] which is gender.

Well, gender in most situations is not my biggest concern. It is my race. And to say that this is something I cannot talk about, or that including this is against the thing that we’re doing, is problematic for me. Because if we all—women of color, whatever—come in, and transgender women or non-binary individuals come in, and all we focus on is gender, then only the people with that thing will be elevated. And this is why I tell people all the time in tech: white women are not diversity. Can we stop that? ’Cause that’s this thing that we’re seeing. It’s an all-white panel and they’ll have one or two white women. It’s like, “That’s not a diverse panel.” You know? And they think they’ve done something.


But I just wanted to bring that up, ’cause that’s that—again, I wasn’t really talkin’ to you; talkin’ to the audience so they understand the examples of this. Every time you ask an individual to silence or cloak those things that make them different and only focus on what makes them the same, then you’re using your power structure for you and not for the collective.

Dr. Benjamin: Absolutely. And that’s why Black women, Black feminists, have been able to actually theorize in a much more complex fashion and build movements in a much more complex fashion than both white women as feminists, and Black men, as Black nationalists, because Black nationalism in many ways asks Black women to ignore the gender dynamics of our experience in the name of sameness. And so Black women have sat at the crossroads of this and have said, “We’re not gonna give up our racial experience for white feminists, and we’re not going to give up our gendered experience for Black nationalists.”

And we have not allowed either group to own the idea about racism or feminism, and so that’s what Black feminism gives voice to this complexity by saying, “You don’t get to have the monopoly over feminism and you don’t get to have the monopoly over addressing white supremacy.” It’s through this intersectional lens.

And one very concrete example of how this has taken shape through recent research is—coming out of MIT, again—Joy Buolamwini and Timnit Gebru’s work looking at facial recognition systems and how not only have they been—most major facial recognition systems—have been very poor at detecting darker skin, but for Black women in particular, the combination of race and gender actually produces the poorest results in terms of detection through facial recognition. So this is an intersectional analysis that actually brings to light a complex reality that Black women have in the larger social system, but now we see it reflected through computer systems, and so it places a mirror onto this particular positionality that we inhabit to draw a light onto how these systems are not only gendered, but they’re raced. And when you bring the two together, our experience offers a really unique light onto what has to change to make this more equitable.


Kim: And the thing that is scary to me is how we scale that. It’s not only that you try to extrapolate the human out of that, but the levels at which these harmful, biased, racist algorithms—or whatever they are—are scaled. And then it’s… I keep tyin’ it back to when you don’t have the language, because it’s… I can be honest, until I really started studyin’ these things, I have these “Aha!”—this is why I’m sayin’ “process,” ’cause I have to keep continuing to have these “Aha!” moments like, “Oh, that’s what that was. That wasn’t me!” Because we internalize it, or the systems say it’s our failing. And it’s like, “Wait a minute. That was not me. That was not my fault.”

So you have to go back—if people are going back and evaluating all of the things that they thought that they were both smart, and they had… and you know, “I pulled myself up by the bootstraps.” And you understand that if you’re white, no, you really didn’t. You’re probably ’bout as average as anybody else. And it’s gonna take you to shift your psyche. I understand. That’s hard work. Most people don’t wanna do that. But that’s the work it’s gonna take for us to get through this, and we—as you said—we all need to be doin’ this work, because I can’t continue to move forward with this movement if I’m not doin’ this work, because this work helps me to shift my understanding, not only for the work, but for myself, because it’s healing. There’s so much trauma that we’re all carrying around.

Dr. Benjamin: Absolutely. And I just love your vision of continuous work and learning; that there’s no point in which any of us have arrived at final enlightenment where we have it all figured out, where we have all the answers. In fact, that is the lie embedded in so much technology, is that it can predict the future infallibly. And so to counter that, it’s not simply that we want to make those predictions more accurate; we actually want to question the very process of prediction, this very notion of omniscience and objectivity, and to actually say rather than simply trying to tweak it, to make it better at prediction, let’s question that actual process. And why are we trying to colonize the future before we get there? Who does that benefit? Who does that displace?


And so, rather than try to actually come at it with a sense that we have it all figured out, how about we actually try to operationalize something like technological humility, where our posture towards technology and towards conversations about tech justice comes from this learning mode that you’re describing, where we come at it as students, where we’re trying to learn something together and figure it out?

And of course, many of the tech giants, many of those who monopolize power, are certainly the opposite of humility. They exercise a kind of hubris that then materializes in the digital and technological and material infrastructure. But to counter that, I don’t think that we want to mirror, we don’t want to come at tech justice with a hubris that knows it all from the point of view of our own values; we actually want to embody and enact a different ethic, a different set of values that is not simply trying to be the new people in power, but to question the very structures of power and rather than try to mimic those that currently monopolize and wield this power, we wanna do something different.

Kim: It reminds me of… OK, so there’s a—I’m tryna find my exact thing. Hmm. It just reminds me of why we in this space find ourselves in trouble so often, [laughs] is because we think that the smartest person in the room decides on where that line is. And I’m now—and this is why I get a lot of pushback—because I’m saying, “Who said that was the smartest person in the room?” How did they—who determined that they were the smartest person in the room? Well, like these terms of “nice” and “fair”; people in power get to determine what those terms actually mean. They’re very subjective. They’re not—what I consider fair, it might not be what somebody else considers fair.


And so it’s until we are able to have various voices in the room, various lived experiences in the room, having those conversations, challenging each other in ways that create something that we couldn’t have created by ourselves, but also—put a period on that—and also understanding that that will change; the more information we get, it changes.

And we see that with just history. People thought the world was flat. It is not flat. If we had solidified that, we would be in trouble. You know, it’s like in every other place in the world, learning and growing is the norm or the accepted, except in tech. It is this… and people get, “Oh, you’re just,”—I got called a cancer on the community the other day.

Dr. Benjamin: Wow. Wow.

Kim: Yeah, yeah, yeah. Because I’m just done. I’m done. These people puttin’ these folx on these high pedestals, you can’t challenge—they say summin stupid on Twitter—you can’t challenge them. Then they get these folx to come, and it’s like, “Your argument is not sound.” There is no sound argument in that. So you can do whatever, but you’ve again, underestimated me because I’m a Black woman, so you didn’t expect me to come with knowledge and resources. I’m a Black woman. We document every damn thing. We’ve had to; we have a level of [laughs] workin’ around this world so that we can even be in the room with you, that you don’t even understand what we put behind the scenes—had to do behind the scenes because of those two, those intersectional things.


And, and I just, and people are just getting really—I’ve been doin’ this for a year and a half, and people are really now seeing that I’m not going anywhere. And that I’m turnin’ it up. And so when I bring individuals like you on, you can’t say—because unfortunately, my lived experience is not enough; I have to bring on proof—so bringin’ on people like yourself, who are experts in these fields, to say, “Hmm, didn’t I tell y’all that? OK. All right. So you hear it, right? All right. So…”

Because we have to keep doing it because when we challenge these individuals, they fall back on their feelings. Your feelings are not equal to actual harm. Your feelings are no longer our responsibility. That’s what therapy is for. And if you can’t handle your feelings and being around people, then you don’t need to be around people, and you need to not be speakin’ at events, you need not to be on Twitter, you need ta—all these things—because now I’m putting this responsibility back on you. Because we’ve always had to manage our feelin’s. And I bring this up every time: how difficult is it for you to write an email?

Dr. Benjamin: Ha! [Both laugh] Oh my goodness. Well, yeah, it used to be much harder. Now I’m in that mode of short and sweet, but definitely in terms of managing tone, if that’s what you’re implying, let’s say it’s too much energy.


Kim: Yes. Too much. And I bring this up all the time, because again, it’s not just me; it’s the same thing as of when—and I can’t think of her name right now; she wrote the article about white women’s tears, and she’s in Australia—and she said that she did not expect so many Black women in the United States to vibe with that. But it was for the first time she saw that it wasn’t just an Australia issue. It was women of color all over the world who have been experiencing white women’s tears. And so when I bring this conversation, when I bring this example up as the email, there is not one Black woman yet who said, “Yeah, no, I ain’t got this issue,” especially in a professional environment.

Dr. Benjamin: Absolutely. Now, if you take it to the point where we’re actually trying to create smart technologies that can detect emotion, knowing the way that emotion is racialized and gendered, how one person’s loud voice or hands moving is coded as aggressive and angry, imagine now we’re building technologies to detect emotion. Now whose sight, whose vision, whose understanding of emotion is being encoded into these technologies? And then you have a—yesterday, I was in conversation with a Latino brother—and he was talking about robots that are supposed to be assigned to follow parolees around after they are let out of jail, and using AI technologies to detect if they’re doing something they’re not supposed to. And he was talkin’ about how, if he walked up to someone and started talking loudly and moving his arms in whatever way that he’s comfortable, how that could easily be identified as aggressive or violent, and how that would have serious repercussions.

So in part, what you’re describing as something that seems simple in the context of email, it becomes much more consequential the more that we’re trying to build intelligent machines, and that intelligence grows out of the minds of a small subset of humanity who don’t even recognize their own assumptions when it comes to affective intelligence, of looking at the world through their own emotional lens, and so this is something that’s gonna grow in importance, and we’re gonna have to really continue to question and talk about because the range of consequences is going to become even greater in the next few years.


Kim: Yeah, because we were talking about—like you said, just talkin’ about emotions—how many Black women feel safe enough to show up as their authentic selves at work? You don’t know our range of emotions ’cause we can’t be angry. We can’t be sad. We can’t be any… We can only be that, “Hi! How you doin’? Da da da da da.” That’s all you’ll accept from us. And so your assumption is that’s who we are, and anything else is contrary; that means something’s wrong. As opposed to we have access to a full set of emotions like everybody else, it’s just not safe for us to express those in white folx’ company.

Dr. Benjamin: Yes, absolutely.

Kim: Woo! All right. This has been grand. I love when I talk to researchers. [Laughs]

Dr. Benjamin: Yay!

Kim: I love when I talk to researchers. That it brings out the nerd in me. [Laughs] And what final words would you like to leave the audience with?

Dr. Benjamin: I think the main thing I want us to think about is to think about our own sphere of influence, whether we’re a stay at home parent, whether we’re a CEO, whether we are an artist, or activist, we all have a sphere of influence—one is not greater than the other—and let’s think how to practice our commitment to antiracism, anti-sexism, economic justice, all of these grand values that we give lip service to, we have to find ways to actually enact it in whatever small or large contexts that we happen to be in, rather than waiting for some savior to float down and guide us in the way, looking for a single leader or a single movement, we need to actually take it upon ourselves and own our own power, rather than only thinking that some people have it and some people don’t. We all exercise influence and power in some way, so let’s figure out how to actually direct that towards the values that we hold up.


Kim: I wanted to end by sayin’, I loved how you described your book as not new information, but a collection of information, because that’s to me what #CauseAScene is about. It is about we get there together or we don’t get there at all. And so what I’m tryna do is create a movement of people from various aspects of tech, who have various levels of expertise, to start seeing, start questioning and learning so that we can all have a larger impact on who gets the platform, whose voices are heard, who gets to decide what emotions are, all of those things. And it’s not just who’s historically been there.

Dr. Benjamin: Absolutely.

Kim: So thank you so much. This has been well worth the wait.

Dr. Benjamin: Awesome. Thank you so much for having me and your lineup before me and after me is wonderful, so I can’t wait to continue listening. Thank you so much for invitin’ me on.

Kim: Thank you. Have a great day.



All music for the #causeascene podcast is composed and produced by Chaos, Chao Pack. Listen on SoundCloud.
