Shireen Mitchell

Shireen is an Internet pioneer and serial founder who was born and raised in the projects of New York City, playing video games before they could be played on televisions and designing BBS boards and gopher sites prior to the Web going world wide.

All music for the #causeascene podcast is composed and produced by Chaos, Chao Pack. Listen on SoundCloud.

Transcription

00:30

Kim Crayton: Hello everyone, and welcome to today’s episode of #CauseAScene podcast. I have an interesting, and I say interesting person as my guest today because I’m gonna be honest, I don’t know what the hell this woman does. I just know that she is everywhere talking about everything that deals with privilege, whiteness—yeah, all the things. So I will let Shireen introduce herself.

Shireen Mitchell: Absolutely. Thank you for having me, I really appreciate it. So my name is Shireen Mitchell. I am a woman who formed the very first in existence nonprofit organization to get girls and women of color into tech in 1999.

Kim: Oh, shit. OK. [Laughs]

Shireen:  Back when everyone told me I was crazy and that that shouldn’t even be a conversation to be had. And yes, and because of that time period, also, I have spent decades dealing with not just white guys in the industry but also white women in the industry in reference to the issues that face women and girls of color in tech and the specific issues of that.

01:45

Fast forward, with all my life in politics, dealing with and still doing legislation and the like, I’m now focused on stopping violence against women with a particular focus on women and girls of color as—as always—because the unique issues that they face in this space are very rarely discussed or dealt with, and, you know, we see that—that’s why I’m everywhere—because we see that everywhere in social media. And the hard part is constantly that some of our biggest—you know, when we did our data; we’re still collecting data and doing things in terms of that with tech companies and media companies and the like—but when it comes down to it, it’s the same problem that I had in the industry years ago with the tech industry, with the issues with trying to get girls of color into tech: the top three barriers are white men, white women, and then men from our own ethnic background.

Kim: Whew. OK. Oh, good lord. Oh, we got a lot to unpack here.

Shireen: Mhm.

02:59

Kim: I always start with—let’s just go on and let’s just roll this dice, OK? So why is it important to cause a scene and then—lord have mercy—how are you causing a scene? [Both laugh]

Shireen: So, you know, straight out of the gate causing a scene is the only way to make the change happen that we need. So if I didn’t form Digital Sisters in 1999 I would have no freakin’ idea how bad it was. When I did that, I did that at a time that it wasn’t heard of. And if I didn’t do it, if I hadn’t caused that scene and created that organization, we wouldn’t be even having half the conversations we’re having right now about it.

So that’s the reason; and how I’m doing it, I mean, I’m doing it from all walks of life. I’m going into tech companies. I am speaking about it publicly. I am working with people who want to try to get the changes in their companies. I am causing a scene with the stories that are being told, I am causing a scene with even fighting with legislators and the like about Facebook and the crap that happened with the Russian hacking, that I saw way back when, and nobody wanted to listen to us because we were the targets.

04:28

And this is the hard part about what we’re talking about now. I mean, I’m trying to cause a bigger scene about the fact that the targets—and the reason why everyone sort of missed what was happening is that we were the targets, Black women and Black people. And I say Black to be very specific about that, because what people don’t realize is that those 3500 ads that everyone was talking about were done by the Russian IRA [Internet Research Agency], and now they’re trying to fix political ads by forcing people to do all this other crazy shit. The reason why they still can’t fix it, and the reason why they’re making the wrong decisions about fixing it, is because we were 90% of the targets. And since they don’t wanna freakin’ talk to us, they can’t solve the goddamn problem.

Kim: Oh my god. [Shouts] You are speaking to the fucking choir!

Shireen: I know!

Kim: It is like you have—I don’t—their blind spot when it comes to listenin’ to our voices, I would never have thought… now politics, fuck ’em. They are bought and sold by the dozen. That’s why I try to stay away from them. But business owners, business leaders, I never would have thought that their commitment to dismissing the voices of Black people—and I’m gonna be specific here too—of Black people, so, so outweighs their desire to make money.

05:49

I just don’t fucking get—that’s what really blew my mind, because—and I’m gonna, yes Stack Overflow, I’m talking about your ass again because you keep doing it. So you will—the leaders of that platform will—amplify the voices of white women who are talking about how toxic it is, and I never use the word toxic. You will try to get a puff piece, some PR thing, talking about how great your leadership and your organization are, and I’m like, I call bullshit on that. How do you have a product that’s 92% white males between the ages of 18 and 34, that’s exclusionary to so many people, but yet you’re a great employer?

There’s no way in hell I could believe that, except to those white men. And I look at your board and I see—and I look at your leadership, and yes, I’ve seen you’ve added a Black guy, and you’ve—hold on, I got it right up here, ’cause it’s changed since the last time I went—so they have one, two—one white chick, she’s doing this Deputy Chief of Staff. So I’m looking at her, and I’m gonna tell you I’m not really convinced that she’s gonna be—because she looks kind of young, and that’s just my assumption—that she’s gonna be able to do anything with improving inclusion and diversity. If she’s running HR, that’s a problem.

Then you have the Black guy, Jerry Raphael, who is VP of Finance, and then you keep going, you keep going, you keep going, you keep going—oh, and then you have an advisor, and she used to be the marketing person. And then… that be about it, damn it! [Shireen laughs]

07:35

And so what they’ll do is, they’ll write these blog posts and put all these other people’s voices in, but yet I’ve seen that they’ve used some of the things that I’ve told them about in the stuff without citing me. And I’ll give one example, ’cause yes, Stack Overflow, they were calling, they said they were having a mentoring that didn’t—a mentoring experiment. Well, after the data person explained to me what that was, I said on Twitter, and you, anybody can go find it, I have a concern that that’s actually not a mentoring program, but actually a coaching program.

And then one of the researchers actually sent me the document, sent me the paper, and it was, in fact, a coaching program. And so when this one individual—can’t think of his name right now, it’s on Medium—spoke about because he was saying how the culture, they recognize that it’s not inclusive, whatever—they’ve now changed the term from mentoring experiment to coaching experiment. Did anybody say, “Hey, thanks Kim for that”? Nooo. [Shireen laughs]

No, and the reason—and what they don’t get is the more they ignore me, it gives me more opportunity to talk about this. Had they addressed this earlier, I wouldn’t have a platform to talk about this on.

Shireen: Right.

08:56

Kim: And this is what they keep—and I don’t understand why you keep hiring people who, again, like I said—this young lady looks like she could be in her early thirties; I would doubt it, but she could be in her early thirties. How are you going to hire for creating an environment, help create an environment for people? And you—how much experience could you have in doing this?

Shireen: I have no idea, I would have to look at her background, but this is the same, similar problem that happened with Uber, right? That HR person just left, and the same problem—it happened to be a woman, happened to be a white woman—and as much as Uber has had multiple issues with sexism and sexual assault, they very rarely talked about the race part. And it was quite clear, based on what was said, that she had a race problem and mistreated people, and she was in charge, and she was in HR!

Kim: Exactly, she’s the Head of HR. And then another article came out about an executive that just left because of the same, for discrimination—hold on, let me…

Shireen: You’re talking about Boz [Bozoma Saint John, former Chief Brand Officer at Uber]. They were basically saying that Boz left because she [the Head of HR] was the one giving Boz a hard time.

Kim: No, no, not her. Give me a second.

Shireen: Oh, OK.

Kim: Keep talking. It is… Yeah, they just lost another guy for—an executive, let me go on my Trello—for, for hold on, let me get the exact… Where is it? You can talk while I’m looking.

10:36

Shireen: Yeah, so, Boz went there from, I think, Apple to try to help them with their marketing. She went right after all the crazy stuff had happened. And, you know, my heart went out to her for her even going in there to attempt it. But I really didn’t like—it’s like you know when there’s a toxic environment. It’s so clear in so many different ways, and to go in there or to put yourself in that space, trying to be the face of diversity and helping them with their marketing, which is what she went in there to go do, it was clear that wasn’t gonna work and it didn’t work, and I felt bad that it didn’t work for her.

And then you then you see after she leaves, right, the HR person is gone, and she was also the person who was giving her a hard time. Who’s the person who should be the one working on diversity in the organization!

Kim: And what happens is, when these things happen, then they—because they bring a Black person or person of color into these roles, they don’t have any autonomy, they don’t have any resources, they don’t have any authority—and when they fail, it becomes, “Oh, we tried”. No, no.

So the Uber guy is there, Barney Harford? It says he was hired last year to fix problems at Uber, the ride-hailing company; instead he created new ones… [laughs]. Yeah. So he—what did he do? So he just left this past week as well. On a conference call this spring with colleagues, he—it was the Chief Operating Officer.

12:01

A new ad—oh, yeah, yeah, yeah, it was an ad that had a mixed-race couple, and he doubted, he questioned that, and that was one of the things that he had done. So he comes in and—so we’re talking about HR people, and we’re talking about people who are Head of Operations, being wholly not connected, not sensitive to these issues, having no perspective. You are hiring the wrong people for these roles.

Shireen: And the thing is, is that it’s not so much that they’re hiring the wrong people, the problem is their internal policies and workings, and all they’re doing is hiring people to come into what is already a cultural problem in the organization, and it’s either you completely shift the culture or you become a part of it. And that’s why everyone’s talking about Sheryl Sandberg at Facebook and things like that. She’s a part of that culture. She’s been a part of that culture. She helped build that culture. So she’s not fixing anything. She is just…

Kim: “Lean In” only works for a certain people. It does not work for us. [Laughs]

Shireen: Right? Yeah. You know, this young woman Minda [Harts, author of “The Memo: What Women of Color Need to Know to Secure a Seat at the Table”] is doing a book now about “Lean In” for women of color, because “Lean In”, you know, what Sheryl was talking about was not focused on women of color though, it was focused on a particular group of women.

13:27

And Sheryl even—I was frustrated with her, even like, before. I’ve had conversations with her and other people have had, and she just—it was more talk, and it was more the same old, just like the other guys’ network. And she was able to navigate it, and she thought that that was gonna work for everybody. But if you’re not a woman who happens to be a white woman that they can identify and be comfortable with, that’s very hard to do. And on average, even If we had the same behavior that she had, they would see us as a problem instead of a colleague.

Kim: Oh, most definitely. We could do anything, and everything is angry. I don’t care what we do, it’s always interpreted as angry, intimidating. And this isn’t, this is one of the things that I talk about often ’cause my tagline is “we got to get comfortable with being uncomfortable.” You cannot change anything in comfort. That is just not gonna happen. These organizations, communities, and events are so exclusive that it’s gonna take radical changes to shift to something that’s inclusive and that requires us not to—I’m sorry—to give a fuck about white people’s feelings. I’m sorry. You’re gonna be uncomfortable, get used to it. And this is what I’m talkin’ about…

15:01

Shireen: Yeah. Listen, we have to go through this existence of being uncomfortable dealing with you, and having to navigate all these different nuances, but you wanna be comfortable while we exist…

Kim: All the time.

Shireen: …all the time.

Kim: And this is where I go back to, it’s unbelievable to me. I would not under—I would not have believed it if I had not been here how they would rather be comfortable and exclusionary than make more money. I just, it just blows my mind, because when you’re talking about a knowledge-based economy, people are still acting like we have been making widgets. We’re not making widgets anymore. We’re creating knowledge, and you cannot create knowledge that impacts a global community from a white perspective, it just does not happen, and you remain profitable, it just doesn’t happen.

And this is why they don’t see these things when they’re coming. Like you said before, they didn’t see the whole Russia ad thing because they don’t have the perspectives.

Shireen: No. No, they don’t. They don’t have the nuance or the perspectives of it. And those ads, it was 3500 ads, and we’re still coding them because we’re trying to code them specifically to tell that story. So whether there was an ad targeting Black people, whether it was targeting Black women specifically, what some of the topics were, a whole slew of it was about Black identity.

16:30

And since they don’t understand Black identity, there was no way that anybody in those organizations was ever gonna catch what was happening, because Black identity is something they avoid at all costs.

Kim: Yeah, and that’s—now that reminds me, ’cause you and I had a recent Twitter engagement about the fact that when Twitter closed those five million accounts, their stock price went down. And that right there—people who’re watching this should understand that those things are kind of counterintuitive. They are not aligned. Twitter and Facebook and such are not incentivized to protect the very people who they say these things are in place for, because when your only business model is growth at all costs…

Shireen: Yes.

Kim: …when your stockholders see that you’ve closed five million to 50 million accounts, even though you have to come back now and say, “Oh, these accounts were—some of these things weren’t even valid,” now you got to explain to them why they need to put their money back.

Shireen: Yeah. Not only that, but they have to explain to the users why their user count went down, right? Because all of this is about amplification, particularly on Twitter, and so if they don’t have the following to amplify their stories, retweets, and engagement, then Twitter becomes a non-effective tool as well.

18:03

Kim: Yeah, oh yeah. Yeah, yeah.

Shireen: And so, it’s part of the reason that they never took these steps before. Because for them, the more users, the more engagement—whether it was safe engagement, healthy engagement or not—was irrelevant to them. They just wanted the engagement at all costs. And the tech industry in particular has had, and we’ll see how it happens, but it has had the same model everyone has in America about free speech: counter speech with more speech. So when the amplification of the more bad speech happens, then what do you do? What do you have? Especially when that amplification now is automated. It’s not even humans any more.

Kim: Yes. Yes, yes.

Shireen: And that’s what that userbase issue is, when many of those accounts are bots. Here in DC we have this term that we use, and it’s not a common term in most places, but all of this concept of disinformation, the ways in which the ads were being used in social media and Facebook, all of this is called computational propaganda. That’s the word that we use, and it looks different in different ways and in different capacities.

19:26

But one of the key aspects of the computational propaganda is the amplification and the automated engagement. That’s why now, in Buffer, you can’t use all your Twitter accounts at the same time to send the same message by just clicking each one. It’s for the same reason, right? It was a great thing to have when you were someone who’s managing multiple accounts as a human; but that same concept can be used to amplify bad messages, which is what we just went through.

Kim: And I’m gonna use a technical term: that feature has—has caused all this, ’cause I was using Buffer and Crowdfire at the time, and you could see the shifts. But it was interesting because no one explained why they were changing it.

Shireen: Yeah. They never said a word.

20:26

Kim: Yeah. They never said this is—now that would have been beneficial for people like us to understand why—and you were probably more tapped in than I was—why these changes. ‘Cause now what you’re doing is making my life harder, and so you actually lost money because now I don’t need you. If I gotta do this individually, what’s the point of me having your service?

Shireen: Exactly. Exactly.

Kim: Again, when I’m so—oh my god, this circles right back to—you’d rather not be informed and lose money. It’s so interesting.

Shireen: Yeah. I mean, Facebook is going through that now with the ads, which is really silly. I just saw this woman on Twitter—I retweeted her—she’s with a nonprofit organization that does news, and they were trying to—and this has happened to many of them, Reveal and others—every time they try to push a story by paying for it with ads, now they have to claim to have a political affiliation in order to boost their stories. And it’s crazy because they are news…

Kim: Yeah, they shouldn’t have a political affiliation.

Shireen: …because they shouldn’t. Yeah, they shouldn’t, they should not have a political affiliation at all and should not have to choose that option! But the reason that that is in place is because of what I just talked about, those 3500 ads that were even—let me just be honest with you—even those 3500 ads would still be missed by this political advertisement, because they were Black identity focused.

22:06

Kim: Yeah, yeah.

Shireen: So that’s not getting caught in the political news framework.

Kim: And you, I remember, I think I just started following you when Twitter changed their algorithm, and you started speaking about, “All this is gonna do is highlight and make targets of people like us.” So explain that to people, explain what you meant by that.

Shireen: Because there have been several things Twitter has done since they figured out this issue. And then let me go all the way back to the beginning, which is before the election. I have two things to say about Twitter; one, just so that we’re clear, a Black man wrote the code for Twitter.

Kim: Oh, I didn’t know that.

Shireen: The white boy stole it, and they still haven’t been able to operate Twitter effectively, so he’s gonna eventually tell his story. He said he’s ready to start telling his story.

Kim: Whaaat?!

Shireen: So let’s just go, let’s go alllll the way back to the beginning.

Kim: You gotta give me a breath on that one? What?!

Shireen: Yes. Yes. A Black man.

Kim: Ah, fuck. So this is the Matrix all over again. OK. All right. Go ahead, keep goin’, go ahead.

Shireen: So, you steal a product from someone who’s the originator of it, who had a certain thought process about it, and then you try to make it into something other than that because you don’t freakin’ know what to do with it. Because you didn’t create the code and you have no understanding of the code.

23:29

Kim: And you don’t understand the intention behind or any of it, or his strategy or any of that. You just saw something to monetize.

Shireen: You have it in your head to some extent because you think you know, but in execution you failed because you are not the person who had the original idea, and that’s literally what’s happened with Twitter. That’s literally what’s happened with Twitter.

Kim: Damn.

Shireen: And so going from that time period, ’cause I was around during those early days and we were fine during those early days, right? But we were all early adopters and all the tech people were having their engagement. And then Oprah came on, and then all of a sudden the weather changed and everybody got all bent out of shape, and then it was like a hashtag for a while before or after Oprah. You know, it was like that nonsense, because everybody came on after that point, and that, of course, changed the nature of the product. And now, if you don’t have a vision for the product, the people who’re using it will change it to their vision…

Kim: Thank you, Stack Overflow! Oh, my god, I’ma, I gotta, hmm—because their argument continues to be, “Well, this is not how we intended when we created…” It does not matter what your intended creation was, you didn’t have a strategy to stop this shit from happening, and this is where we are.

Shireen: Yes. Yes. Yes. Yes, yes, yes. And if you have a model of, like Facebook’s model, which they just decided to change it because of Russia, you know, move fast and break things—well, you broke it.

Kim: You broke it. [Laughs]

Shireen: You broke democracy. You broke the system. And here you are, with a broken system trying to repair it; and you don’t know how to repair it, cause you’re still not looking at what the original problem was!

25:04

Kim: And you still have the same people who broke the system trying to solve the problem.

Shireen: Exactly.

Kim: OK, so get back to the Twitter story.

Shireen: Yeah, so what happened after Oprah was, everyone got on and then it was being used in different ways. Now of course, they will say the same thing: “We didn’t know it was being used in ways that we weren’t intending, good and bad.” So you get the days of the innovation and the outreach that happens with Black Lives Matter, because now people are able to galvanize and use that platform as a way to not only create community but be able to make statements about what’s happening in your community that other people have never seen before, right?

Because you have this public platform now that is open to all. And even if you don’t have a Twitter account, you can still go search Twitter and see what’s happening and keep up with what’s going on. Unlike Facebook, of course, which you have to be logged in for. And you have to be friends of friends or, you know, your network is a little smaller. I always compare them when I talk about them, about the ways in which we in tech talk about different aspects as closed network and open network. And to me, Twitter is the open network, and Facebook is the closed network, it’s this walled garden.

26:25

Kim: OK, I’m gonna stop you right there, ’cause I want you to hold your space, but I wanna bring this, ’cause it’s just the thought that just hit me. And that’s why those ads were so effective on Facebook, because that’s a closed network, and it’s an affinity network because these are my friends.

Shireen: Yes. Yes, that’s how come, that’s why it works.

Kim: Woww. Oh my god. Oh, OK. Shit! That’s some propaganda for your ass. Go ahead.

Shireen: They used it. And that’s the reason why Facebook can’t fix this, because they use the platform the way it was…

Kim: Designed.

Shireen: …designed. [Laughs]

Kim: Yep, yep, yep. Damn!

Shireen: They just did it for bad, right? So…

Kim: Oh my god, yes.

Shireen: They used it the way it was supposed to be used. Now Facebook is trying to figure out how to fix that, when they gave all the keys away; it’s like you close the barn door after all the horses are out.

Kim: Yeah. Yeah. Yeah. And it goes back to the researcher, who they tryna scapegoat for this Cambridge Analytica thing when he was like, “I’m just one of thousands who was doing this.”

Shireen: Yes. Yes.

Kim: We just assumed no one would care about their data bein’ gleaned. But everybody was gettin’ it.

Shireen: Everybody was getting it for years. Now, of course, they closed that loop now, but the data is out. The algo that was built off of that data mining is built. It’s out there. It is created. It can be recreated and used in different other fashions. So again, closing the barn door after the horse is out [makes a “pew” sound]. Like any data breach, it’s over. Your information is out there, it’s done. There’s no way to go back and say, “Give me my data back.”

28:03

Kim: Yep, yep. And that’s the thing, when Equifax—and I’m like, “Dude. Come on, please. The shit’s already out there. Let’s…” OK. All right. So get back to your Twitter questions. Oh, my god. This is so deep. Thank you. I’m so happy I got you on. [Shireen laughs] You’re the first person to bring this historical perspective.

Shireen: Yeah, Yeah, ’cause most people don’t know. And I’ll tell you, I have gone back and forth about saying it, not saying it, keeping the history intact. But there aren’t that many people like me who have a full historical perspective of the whole industry, because I started my Web firm in 1997. I formed that organization two years later. I have seen the tech booms and busts.

Kim: Yes.

Shireen: I have. I’ve been in and out of the industry. I’ve gone back and forth in my, like, “Am I just gonna walk away from this, and what am I gonna do now?” moments.

But anyway, let me get back to Twitter. So as we start to see that level of amplification, particularly with Black Lives Matter—that would be a great segue into how it can then be used in the bad way. So the same way that that organizing was happening with Black Lives Matter with real people, you start to see these accounts show up, these user accounts show up where people are basically hiding their identity, and they’re doing the complete opposite, right? They’re doing the anti Black Lives Matter, right? All Lives Matter, right?

29:24

Now we got the neo-Nazis, and the Unite the Right, and all this stuff happening—which was a minor group, by the way, and that’s the whole key to this. They were a small group, and they were individuals, but they got to amplify: as one person, at least 500 accounts. If you’re one person looking like you’re 500 people, then the messages that are being sent, despite being false, and fake, and unreal, people start to believe.

Kim: Yeah, because now it’s not one nut job. It’s five very credible sources. I mean, 500.

Shireen: So you get a troll farm of thousands of people who were managing 500 each, and then you amplify the false negative stories. And you also use those accounts to shut down or try to silence others.

Kim: Yeah, by reporting them, based on…

Shireen: By reporting them, mass reports, mass reports to now silence real people.

Kim: Yes, yes, yes.

Shireen: So if the case is speech and more speech is the best answer to the problem, then what happens when that speech is deadly? What happens when that speech incites violence? Then what are you doin’?

Kim: And then I wanna stop you a moment there, because I wanted to bring this up, but I’m glad you brought it up again. I don’t understand why people confuse what the First Amendment right to speech is about. It is about the federal government. It is not about private organizations…

Shireen: Or people.

30:59

Kim: …and people. Exactly. So you don’t get to say any fuckin’ thing without consequences…

Shireen: No.

Kim: …except for with the fed—well hell, it’s still consequences if you, even with the federal government, you can’t make a bomb threat or something like that and not have to deal with the consequences.

Shireen: You can’t scream fire in a crowded theater.

Kim: Exactly.

Shireen: You know, it’s all those things. But when it comes to this issue, because this issue is predicated on the axis of race, then we want to change the rules around a little bit.

Kim: Yes, yes, because now amplifying—now we’re, first of all, we’re forcing a message that whiteness never wanted to have a conversation about. And so having people, whoever and—OK, I know I’m all over the place because I love history and this is just blowing my fuckin’ mind here—because even with your 500 bots, what’s coming to my mind is, “…and still we rise.” You cannot shut us up at this point.

Shireen: No. No. No.

Kim: And that is what’s fuckin’ with them. Because…

Shireen: They were expecting us to go away and be quiet.

Kim: Yes! And this is where I’m telling—like I said, white liberalism is the biggest barrier I have to inclusion and diversity. And this is why we need to—white liberals need to be truly informed about these issues, and come together in ways like you just—because you just gave us a great example: the 1 to 500. These one-off gunshot things aren’t working.

32:33

Shireen: No.

Kim: We need to come in together, and as masses, and we don’t have to do it—as we just spoke of—we don’t have to do it with bots because we’re not going away.

Shireen: No. No. We’re still, we’re actually… [pauses] Let me let me say something else here, and I’m going back to politics on this one.

Kim: Yes, please.

Shireen: OK, so let me finish my thought on that, and I will come back to it, which—OK, so when the 500 bots happened, they knew, Twitter knew at that point, right? They could see the patterns, they could see the problem. But by this time, everything is now politicized and political, so—and this is the key piece to this—as these bots and people were interacting and trying to silence, particularly people on the predication of race, Twitter, of course, didn’t do things unless it was a big deal, right?

So shutting down Leslie Jones for no freakin’ reason, right? Because she’s famous, Twitter decided to engage. The first time they started to engage in this problem was around 2009, I believe—and I may be off about this, I forget what year this [was], I think it’s 2009—when Robin—we’ll just have to look this up—when Robin Williams committed suicide and they went after his daughter. That was the first, yeah, that was the first time Twitter started—hold on. [Pauses]

34:05

No, sorry. It’s not 2009, because 2009 seems so early—’cause that was like Obama in that time period, so I was like, that can’t be it. So no, he died in 2014. And 2014 is when the bots stuff started. And so the first target—public target, media target—was Robin Williams’ daughter.

Kim: About what?

Shireen: They basically blamed her for him dying, for committing suicide. It was her fault. So she got off [Twitter]. But that was 2014, and that’s the first time you start to see this organized—really, like, organized—personal attacks, like these one-offs. That’s also the year Gamergate starts; all of it’s connected.

Kim: OK, explain Gamergate.

Shireen: Gamergate is basically an organized faction pushing back against the gaming industry dealing with racism—with, you know, race and gender, sexism. It started, actually—another historical point—because of a jilted boyfriend who wrote a post about his girlfriend and basically insinuated that she slept around to get good ratings on her games.

35:31

Kim: I think I just saw—there’s been an update on that recently.

Shireen: Like what?

Kim: Like somebody’s been—I don’t know what’s happened ’cause I don’t know what Gamergate was—but that’s been popping up in my Twitter, showing his old messages and—oh, man…

Shireen: I have to go look at it, if anything new has happened—but that is literally how Gamergate started. If you think about it, it’s like, that’s a very personal thing and then a whole mob got created, and then that whole mob started going after everybody of diversity in the gaming industry.

Kim: Yeah, yeah.

Shireen: And they were like, “Not my shield” and all this other kind of stuff, basically saying it wasn’t about diversity, it was about ethics. It was not about ethics in games. It was about the diversity issue in gaming, and they were the last stand, trying to fight back against the diversity push in the gaming industry, right? ’Cause now the games belong to the boys, and how dare women come in trying to create games, and how dare people of color come in trying to create games that look like or represent them? But they can have their fantasy games that represent themselves, and we can’t. We have to have their models.

Kim: Again—I’m going to keep going back to this because everything you’re saying goes back to your hatred—and that’s what we’re gonna call it—your hatred of brown and Black people outweighs your desire to make money, and that says a lot in a capitalist society. That really says a lot in a capitalist society.

37:07

Shireen: They want us to purchase these things, but they want us to purchase them in the ideas and models of them, but not us. And they still can’t wrap their head around that. And that’s a big deal. That’s a big deal. And now they’ll say, you know, they would never want to not increase their customer base. Right?

But just like—this whole, why we get into these stupid arguments about—and this goes back to comic books—if the movie doesn’t have the exact model of the comic book, then everybody’s freaking out. They go after the Asian girl [Kelly Marie Tran], right? They went after Jar Jar Binks; he [the actor, Ahmed Best] was 25 years old, and he was told that he had destroyed Star Wars. And he was a young Black man who almost committed suicide over that. It’s ridiculous, because that level of vitriol for someone playing a role…

Kim: Yes. Not even real life. It’s not even frickin’ real life.

Shireen: Makes no sense—it’s fiction, and you’re still acting like this. Games are fantasy, and so you get to have your fantasy, but I can’t get to have mine? It makes no sense. But that’s exactly what the behavior is, and so on Twitter, it was the same thing.

38:28

So once it started being used against the Black community—Black Lives Matter and all of this—that we were aggravating a whole industry, a community, trying to get something to change for us—all of a sudden, the pushback is All Lives Matter, which is why on Facebook, if someone wrote Black Lives Matter on their board, it would get erased. So our free speech doesn’t matter, right? But everybody else’s does. So when it got to the head of being a problem and they realized it was a problem on Twitter, they waited until after the election to get rid of those accounts, the bots, that were breakin’ terms of service. Because…

Kim: Yeah. They made a conscious effort based off politics.

Shireen: They made a—effort off of politics! They could have fixed it before, but they didn’t wanna have the perception of bias.

Kim: Yes, yes. And again, this is not bias, this is your product, you get to say, “And it’s a free product to users!”

Shireen: Exactly. So you know, no one has ownership—and that’s the funny part, is like we don’t even have ownership to keep our accounts on Twitter if Twitter doesn’t want us to have our account, right? So it makes no sense that this is even the argument we need to have, but the argument is still the case we just saw, which is that fewer users using the platform messes with their valuation.

40:05

Kim: Yes, exactly.

Shireen: So they weren’t going to do it initially, not because of our safety, our democracy, but because it was about their valuation, their bottom line.

Kim: Exactly. There weren’t no incentives for them to do it.

Shireen: No, not until it got bad, right? Not until something bad happened.

Kim: And now, but then—and this is where I continue, I’ve been saying this a lot lately, is inclusion, lack of inclusion is a risk management issue. It’s not all these people talking about the right thing to do. Screw right thing to do. Screw your politics. Screw your morals. This is now a risk management issue for these companies. And if you don’t see this as a business leader, you are in a world of trouble ’cause what’s gonna happen next? I hope you have great lawyers because litigation is coming.

Shireen: Yeah, I mean, they’re dancing around now from the political legislative side with the hearings between the UK and the US—all of these companies really—’cause after what happened with Russia, you got Google, you got Tumblr, you got Facebook, you got Twitter, you got YouTube. I mean, they’ve all been used and they’re all implicated in these things, so they have to do something about it. They have to—you know, those family groups on Facebook—I mean on YouTube—are having problems now because they changed the algorithms or they deleted certain videos ’cause certain words were in them.

41:39

All of a sudden it matters what the words are, and so it’s just really interesting to see the shift, but the shift is still predicated on the same problem, and it’s a race problem, and they’re calling it a free speech problem, but it’s not even that, because the truth is hate speech is not free speech either. So this dance that they were trying to do is predicated on the problems that we just talked about.

Kim: So what…

Shireen: But let me get back…

Kim: Yeah, so I was gonna say, what did the new algorithm, what did they do?

Shireen: So the new algorithms—there are two things about the new algos, right? Some of the accounts, people are identifying—for example, I identify as being queer, as an example. “Queer”—the algo is banning the content ’cause the person said “queer.”

Kim: Why? That’s how you identify as a person.

Shireen: Exactly. But it may be—the algo doesn’t know, right? And they have been coding the algos for some of these things, and one of them is something like “queer,” calling someone queer or someone identifying as queer. So you’re seeing content being grabbed by the algos without any context.

43:01

Kim: Because algos don’t have the ability to decide…

Shireen: Mm-mm. As a whole. There’s a book Safiya [Umoja Noble] has called “Algorithms of Oppression”; I mean, it literally is about that. Same thing with Facebook—they had their product manager person come—I can’t remember his name right now; I was at a conference where we were basically working on fixing Facebook with a group of really smart people—he came to that conference trying to get us to understand what they were trying to change, and this conference was basically a left-leaning conference that focused on civic technology, which is basically using these platforms for political purposes, which is one of the other parts of my life that I deal with quite often.

And the Facebook guy comes in and he’s talking about all the things that they’re doing as they’re trying to evolve as to what the next steps are, another example, and so he puts up on the screen one of the things that they’re weighing, and the—’cause he had three of those screens, but the one that struck me the most, that was—I was flabbergasted—they put up safety versus censorship.

Kim: What? There is—how is there a versus?

Shireen: How is that a versus? Why are those two things even being put together? So the thought of that in the first place is like, “Why are those two things supposed to be in some kind of balance? That makes no sense.” So I was like, “OK, let me see where he goes…”

44:34

Kim: Where he goes with this.

Shireen: So then he asked the audience, “We have algos now—if I tell you this—we have algos now that can rip down hate speech before anybody sees it. Before anybody sees it. How many people in the room think that’s a great idea and want us to do that?” So of course, the majority of people in the room raised their hands; there were people who had issues with this ’cause, you know, there were free speech people in there.

So then his second question is, “What if I tell you that if we did that, one political party would have their content pulled down more than the other one?” Huh? It’s fuckin’ hate speech. How is that..? OK, so I kinda let that one go.

Kim: So now you made this a political issue.

Shireen: So he throws in the political part—and we’re at a political conference, so, so, so… And so then he goes to the third part, where he’s basically asking people how many changed their view when they found out that one political party’s content would be pulled down more than the other. And people stopped voting; they were just like, “What? What’s happening here?” And so the free speech people were mostly putting their hands up, because—you could obviously tell by the question who the free speech people were. And then I was tweeting, and I was like, “First of all, how is that a causality reason to not take down hate speech?”

Kim: Yes, Yes.

[Interlude]

46:58

Shireen: If a political party is having more hate speech than the other, then their content needs to fucking come down! How are we debating the reality of that? And so I’m pretty sure I know which party you’re talking about, but let’s be honest here! So now you can’t fix, this is supposed to be your fix!

Kim: But now you’re saying you can’t fix that.

Shireen: You can’t fix it!

Kim: So this is a barrier to that, because, my god…

Shireen: Because you decided to put safety up against censorship. Safety! People in Myanmar were killed in a genocidal, organized fashion through Facebook! And Facebook knew about it! They forced the organizations to sign NDAs so that they wouldn’t talk about it, but they broke it after Facebook was talking about, “Oh, we came up with these fantastic algos to fix hate speech.” They did not. They were working with those people and didn’t even give them credit.

Kim: And I—hmmmm, oh my god…

Shireen: And here they are talking about solving something between—a balance between censorship and safety. I’m sorry. First and foremost, safety is the more important, and second of all, those two things should not be together.

Kim: They should not be juxtaposed at all.

Shireen: At all.

Kim: At all. It shouldn’t be a choice between safety and there—it’s like, it’s juxta—oh, my lord.

48:31

Shireen: It makes no sense. But if these are the tactics that you’re going through and thought processes you’re going through to solve this problem that we’re having, well guess who’s still gonna be victimized?

Kim: Yes, yesss.

Shireen: Because you’re gonna weigh the value of our safety over certain other people.

Kim: Oh, my word.

Shireen: Whose voices—out of censorship—are more important than our lived lives.

Kim: This is…

Shireen: So Twitter’s thinking like this, and Facebook is thinking like this.

Kim: I mean, I’m gonna be honest. [Sighs] My stomach—I knew bringin’ you on this show was gonna be—’cause I watched the [sighs again]—I don’t even have words right now. Because, I have words right now because this is viscerally affecting me, because I am a history person and understanding the history of things changes people’s perspective, and that’s what they don’t understand.

49:56

And this is why I’m not surprised by the situation we’re in right now. If anybody looks at any history, you can see it. This is how every war—and people want to focus on World War II; it’s not just World War II, go back in history. This is how it all changes, this is how it all starts; and so your perspective, particularly when it comes to tech’s role—and I’m gonna say this again—tech’s role, technology’s role, in creating the situation that we’re in has to be addressed, because currently most people think all hate speech just amplified with Trump, and I’ve been saying no, it has not!

Shireen: No. It has not.

Kim: It has come out of the woodwork. They feel safer because of this administration, but Black people been telling you about this shit forever. Y’all keep talking about, “Well, didn’t we take care of this during the Civil Rights movement?” Oh my god, no, we did not.

Shireen: Black women saw the patterns earlier on. And here’s a couple of key things, and this is about the historical framework of this. These are things I’m still writing down and trying to document ’cause there aren’t that many people like me around who can connect these dots.

51:24

I had this situation with this woman who had written a book about cyber attacks and all this other kind of stuff, and she talked about Gamergate. I’m at this event and she’s like, “Gamergate is the first organized faction of these trolls.” And I said, “No, it’s not. What’re you talking about?” And then she was like, “Well, I’m just talking about in that time period.” It’s still not the first. I mean, we can agree about what happened, because we were agreein’ it was over a jilted boyfriend. But she was like, “This is the first time it was organized this way.” No, it was not. And so I told her, I said, “Well, they finally came after me in 2015. It was bad, and of course, I’m still on their radar, I’m on their list.”

Kim: Who is “they”? Stop, who is “they”?

Shireen: Oh, the Gamergaters. The Gamergaters came after me in 2015, but that’s a specific group of those trolls. There were others, and it was organized in many ways before—in 2013, when several women started calling it out with the hashtag #YourSlipIsShowing, because what was happening was there were fake profile accounts on Twitter posing as Black women, agitating people. That’s where they started. That’s 20-freakin’-13.

By the time it turned over and got into Gamergate and everything else, that’s 2014. That’s almost a freaking year later. So if you see on my [Twitter] I have that pinned moment, it says, “The hacking in 2016 would not have happened if they freaking listened to Black women,” ’cause we called it out. The system, the organized fashion of it—2013. And before Gamergate there was Donglegate.

53:13

Kim: The fucking what? [Laughs]

Shireen: See? And nobody knows, right, because the reason why nobody knows is because the target of that one is a Black woman.

Kim: Of course, yeah.

Shireen: The target of that one—that was organized trolling—the dude who was being called out at a tech conference—again, a tech conference—about his sexism was called out by a Black woman who felt uncomfortable about what he was doin’, and she decided to do a post and a tweet about it. And they went after her. He got his identity covered up. He’s still freaking anonymous. He’s still in the freaking tech industry, and nobody knows his name, but her name pops up? It’s a whoooooole mess of a problem. She’s been named; he is walking around with no consequences. OK? She’s the target. She was the target.

And again, this woman who wrote this book, I was like “So you know, you got Gamergate in there, where’s Donglegate?” She didn’t know about Donglegate. I said, “What’s different about Gamergate is the media paid attention to it.” What’s different about Gamergate and the difference between Gamergate and Donglegate is the Black woman who was targeted was vulnerable.

54:39

Her livelihood was at stake. Her life was at stake and she just wanted it to all go away. So she did no media. She hid from the media. She felt that the media was mischaracterizing what was happening to her. And, you know, I did my best to support her. I even tried to get her to write that up. And I mean, we did try to at least get her to tell parts of her story.

And she did tell one person that is sort of written because she got manipulated; there’s a book out there—which I will trash—called, “So You’ve Been Publicly Shamed” [by Jon Ronson]. That book is written for white people who have been publicly shamed and had their livelihood or their life turned upside down for doing some really racist stuff.

Kim: Mmhm. Like we see every day in some fuckin’ video.

Shireen: Right! But he wrote that book in support of them—that their lives should not have been turned upside down. In that book, he goes after the Donglegate girl. He actually said the behavior he abhorred the most was hers. The Black woman, who was targeted for calling out what? Sexism, in the tech industry, and she’s abhorred the most?!

56:11

Kim: Forget the sexism, a Black woman callin’ out whiteness, that was, that’s it right there.

Shireen: Of course. Of course.

Kim: And that’s what I was gonna say with the Gamergate—and not to minimize that young woman’s issue—but would it have gotten media attention had she not been a white woman?

Shireen: I don’t think so. Not the way it went, and not the way that the level of the vitriol continues, because it continued to target mostly women, and even when they came after me, they tried to erase me from the storyline. The media—if you go out there and look at what happened at South by Southwest, and everything that went down when the Gamergaters were coming after them, I’m on the axis. I am the axis of that.

I was the only woman of color. They literally removed my panel after agreein’ to have us have it—after we expressed to them by email all the attacks that we were going through, personally and professionally—and then they turn around and give my attackers, the people who were coming after us, a platform. They gave them a panel, after the deadline, after they were attacking us, and after everything else.

57:28

And when I said no, I’m not going to be—first of all, my panel, just so we’re clear about what my panel was about when I submitted it: it was talking about all that harassment, talking about the axis of the online harassment, with a focus on women of color. And they decided, since Gamergate was coming after me, that they wanted to put my panel in the gaming segment of the conference. Now you know they’re coming after me.

Kim: Yeah, why would they set you up as a target?

Shireen: Why would you? Why would you do that?

Kim: Yeah, but that’s putting a target on your back.

Shireen: For sport. For sport. There’s one article out there about how South by Southwest used Gamergate as a spectator sport. And my name is in that one story. All the other stories, no one talks about me. They talk about everybody else but me. Same problem.

Kim: OK, OK. So now again, so OK. And the reason this historical perspective is helping me, because I only transitioned into tech in 2014. I’m, this is all new to me, all this historical stuff. But it now helps me understand that the Stack Overflow ignorance is not new.

Shireen: No. They all have the same behavior. And when they try to fix the problem, in my opinion, what I’m seeing, is they’re mimicking each other tryna fix it.

Kim: Yeah. Yeah. Or they’re trading people. They’re, I mean, they’re trading strategies, people, and—yeah, OK.

58:59

Shireen: Yeah, and let’s just be honest, most of those Gamergaters, doin’ all their hacking and stuff, work in the freakin’ tech industry.

Kim: Yeah, mhm.

Shireen: They’re going to work every day and then doing this doxxing and harassing and hacking at night. That’s what they’re doing. And they have access to that content.

Kim: And that’s the same thing with—when you talk about organizations now using Stack Overflow and GitHub scores for hiring purposes. That’s already exclusionary, because what you have is a whole bunch of people who have the time to sit on their ass and gamify, play this game called Stack Overflow, and other people who use Stack Overflow only as a resource, a reference tool—they don’t engage, so they don’t have scores.

Shireen: Mhm.

Kim: So you use that as a way of saying, “Well, they’re not here, they’re not…” No, they don’t feel safe. So again—oh my god, you just—this—ahh. So again, but what you just said about the censorship versus safety? That’s what Stack Overflow is for me. You’re saying, “Oh, please, we need more marginalized people on the platform,” but you’re putting censorship over their personal safety.

Shireen: Mhm. Mhm.

Kim: And wondering why it’s not—oh my, oh. Wow. This is interesting.

Shireen: And it’s mostly an American thing too, you have to think about that. That’s why the stuff that Facebook was doing for Germany was not being applied for us. Because we have this strong free speech conversation.

1:00:41

Kim: OK, so tell me about, what’s the difference, because I don’t know anything about this.

Shireen: So now they’re all being forced—I think you’ve seen or heard about the GDPR stuff—so now, instead of doing these separate things for different countries, everyone’s required to do GDPR across the board, which is really challenging and very difficult, and I don’t see how they’re ever going to really get the platforms to those levels. But that’s what the hearings and stuff have been about between the UK and the United States.

Kim: So break down what that GDPR thing is, ’cause I just got a whole bunch of emails and I…

Shireen: Yeah, everybody got those like, “Your privacy is important to us—is it OK for us to still keep sending you stuff?” Yeah. I mean, that was like a whole series of emails that came from everybody at that time about…

Kim: Accounts that you forgot you even had.

Shireen: …you had. Yeah, yeah, yeah, yeah. But that’s what it is, because now all of us are sort of being required to operate under GDPR, and GDPR is just a different way in which to look at privacy and data. I mean, that’s the overall framework, and the regulations are much stricter.

1:01:58

And the legislation for those policy changes comes mostly from the UK [and the EU]. And they have their own agency that basically tracks and keeps up to date with this kind of stuff, and they’ll take these companies to court over it. Unfortunately, the United States is very loose and wiggly with these tech companies, and that’s part of our problem. So I’ll just say that—like, from the legislative side—there are some things coming down the pipe, but overall, the ways in which our data and our privacy and our safety are being protected are just nowhere in the range of regulation close to what the GDPR would be.

Kim: And the thing that I’ve specifically noticed, like during the hearings, when you’re talking about legislation, the people who write in the legislation have no fucking clue about this.

Shireen: They have no clue.

Kim: They have no clue.

Shireen: The UK, by the way, has a better sense—if you watch the hearings in the UK and then watch the hearings in the United States, they’re like night and freakin’ day.

Kim: Yeah, night and day, mhm.

Shireen: They’re night and day. There’re a multitude of reasons—I mean, there’s bias, ’cause now they’re talking about conservative content being censored more than everybody else’s, and that freakin’ isn’t true.

1:03:31

But the same problem exists, that if your content is hate speech, then your content should not be present. I don’t care who you are and what political party you’re part of. But this is also why Facebook is trying to do this political ad nonsense, because they got duped. They got manipulated, and so they’re trying to come up with their own solutions, and some of it is just—it’s theater. It’s theater.

People are still—now you can see your ad content, and I have this young man that I have been working with, and he’s trying to figure out what the hell’s happening, ’cause we’re still trying to code the ads, and I’m still grabbing people’s stories about, you know, how they’re being banned, what’s getting them banned. And most of it is predicated on race.

So this one dude who happens to be a Jewish guy, will call out white people and he’ll do, “White people are trash,” or, “Whiteness is a problem,” and he’ll do all that stuff, and he’ll get banned for 30 days. Somebody reports him, and he’ll get banned. Well, now the algos have him, right? Because he’s been banned so often. So he finally went into his ad demographic data and found that he was being listed as a Black man. [Kim gasps; Shireen laughs]

1:04:55

Kim: What the fuck?!

Shireen: So, as with everything he had done before, he went through the whole process of trying to get his stuff back on, and they gave him the 30-day bans again. The third time he went to do it, he had taken that ad data off. So he went back and did his appeal to have his account back up, and all of a sudden—all of a sudden—the 30 days never happened. He was right back on. And then he does another post—’cause what he would do is, he would get banned for something about white people, and he’d get his account back, and the next post is—fucking white people. [Laughs] He’d get banned immediately!

Kim: Yeah. Yeah.

Shireen: And then he went into that data, he found out—yeah. You have to think about this. If the ad data is saying that he’s Black and he’s talking about white people…

Kim: That—to them that’s hate speech.

Shireen: It’s hate speech.

Kim: Mhm.

Shireen: But the n-word can last forever.

Kim: Yes.

Shireen: The amount of people who have complained about that stuff. The amount of groups that were predicated on that nonsense, right? Now they’re playing games with it, and this is basically whack-a-mole, these things still exist, these groups still exist, but then they’ll name it something else, like the one that my friends were in for Black Panther.

1:06:32

You know, all the white people upset about Black Panther—so they started a Facebook group about Black Panther, and it was predicated on the same ideals, predicated on politics. Same things that made it look like they were pro-Black when they were actually trying to destroy Black Panther.

Kim: Mm. OK, so—oh my god, you have just, my mind is completely—I need to process all of this.

Shireen: Yeah. I mean, and that’s the thing, that’s like, all of this is happening. And I know it because I’ve been monitoring and collecting these stories.

Kim: Yesss. And I know of small parts of it because I watch your stuff, which I will retweet because it makes me think about something, and then I’ll quote-tweet with a comment because it relates to something specifically I’m talking about. But there has been some—I mean, just the fact that you—one of the hearings that you gave a link to, I was watching the hearings live when they were in the UK.

Shireen: Mhm.

Kim: OK, so—ohh my god. So for me, you have become—well, I didn’t realize it, but you’re like this unofficial news source? [Shireen laughs]

1:07:52

And I’m trying to figure out… [pauses] where to go from that; I’m just trying to figure out how do we amplify that? And what can brown and Black people do to protect themselves in this space so we can—? Because I know I’ve just recently—#CauseAScene is new. It’s only been official since International Women’s Day.

And I am increasingly becoming more… as I become more aware, speaking more, and specifically to my followers—who are privileged and white—about how you’re not doing enough, that you’re not protecting me and others like me enough. And I understand that at some point, this, what you’ve been explaining, will happen to me. It’s just probably a matter of time.

Shireen: Mhm.

Kim: So how does someone like me move forward? How—is there a strategy? Is there a playbook we need to create? Is there…?

Shireen: Yeah, that’s a good question. I have my own, I guess. You know, that’s a really, really good question, and I’ve been asked that before and I think about—I go back and forth about sharing, not sharing, how much I share, who I tell, how public I say some of these things.

1:09:18

But the challenge that I have been faced with, and I have been struck by it, is… [pauses] there is a strategic pattern in the ways in which we’re being targeted, right? That part is clear and very true, and so I can look and see the pattern of the attacks. I can look and see the words that are being used. Right now, you’re hearing, “Diversity of thought over diversity.” You’re hearing people say, “Political affiliations need to be a protected class.” You’re seeing people who…

Kim: Oh yeah, OSCON. Yeah, uh-huh.

Shireen: The OSCON shit was crazy. I won’t even go into that. But…

Kim: Well, it wasn’t just OSCON, it was a rally. So all of them.

Shireen: Yeah. Yeah, ’cause it was all the conferences. And I’m glad they fixed it, but it should not have been something that needed to be dealt with. But I went after Tim [O’Reilly] again myself, ’cause I was like, “Tim, we did this crap in 2009. There is no reason we’re having this complete turnaround right now, except for the fact that the reason is the pushback on diversity, ’cause that’s really what this is about.”

But—where was I goin’? Oh, the patterns of the targets; the ways in which the language is being changed, which is really important, because they’re trying to soften what the harms are against our safety with the ways in which the words are being used. So even this whole political affiliation stuff—it’s just another way to say, “I get to say the n-word to you, because politically that’s what I believe,”…

Kim: Yes.

Shireen: …which is bull, right?

1:11:02

Kim: Yeah.

Shireen: But that’s the other piece of this ’cause we know that that’s the line that this is on. We also know, because of what I told you and all the research that I have and things that we’ve done, that we’re—Black people—are the number one target, especially in the United States.

The other pieces, I go back and forth about, because I think everyone should do a better job protecting their data themselves, and the ways in which they engage people. But there’s something I can keenly tell you—I mean this sincerely—these people are punks. [Laughs]

Kim: Oh, yeah. Yeah. I’ve—yeah, mhm.

Shireen: The other part about these people being punks—which is really funny to me, because you could see how they try to huff and puff like they big and bad, but they’re not, they’re punks. They’re scared, they’re scared—let’s call it out—they’re scared of us joining forces against them.

Kim: That’s exactly it. Exactly.

Shireen: So let’s just make sure that we got that part clear; and we do need to have people paying a little bit more attention to stuff like that, so that they can help join forces with us instead of joining the other side thinking that they’re helping us, ’cause nine times out of 10—especially white liberals—they’re causing more harm.

1:12:10

Kim: Causing harm. Yes.

Shireen: So I just want to make sure that that’s clear. The other—and I’ll give an example of a conversation that just happened, we just went through this yesterday or day before—the other piece is about preparing, right? My struggle with it is that I do things all the time and I’ve been doing these things all the time, so a lot of what I do in my own protection—protecting myself, protecting people I care about—is natural. I do it as it’s become…

Kim: Second nature.

Shireen: It’s second nature. It’s a pattern of my existence; it’s how I start my day, every day: checking to see which accounts someone has attempted to hack into, which websites, which, you know, who—and I know now, that’s how I found out—I know now where the breaches are, where the tech boys are that are causing some of these problems.

So I know those answers, so I know how to navigate myself and avoid some of the problems for me. But it’s not widely known, right? It’s not something that everybody knows how to do. And even when white boys start to face it, they’re shocked by it. They’re shocked by it, they… it’s—I said on Twitter, once you see the code, you can’t unsee it, you know? It’s…

1:13:31

Kim: And this is why, when people are like, “Why don’t you block?” Because whiteness has been comfortable for too damn long.

Shireen: And that’s the other thing about Twitter, right? Here is the solution: mute, block. That doesn’t stop them from talking about me, and if someone’s talking about me and I don’t see them threatening me, how the hell am I gonna protect myself?

Kim: Yes. Exactly. So…

Shireen: I’m Black, people, you wanna argue with me all day long, feeel free.

Kim: Yes, exactly.

Shireen: But I’ll GIF you all day long if you want to get…

Kim: Yes, yes!

Shireen: Like this one guy was going crazy, he was like, “You and your digital blackface, that’s racist.” He kept sending me a racist GIF, and I was doing a Black girl face, I’m like, “It’s not racist. You know why it’s not racist? ‘Cause it’s my goddamn face and it’s actually how I’m looking at you rrright now.” [Laughs] You know?

Kim: Yes. And you’re talking about the punk thing. It’s so funny because all it takes is a little bit of time and me quote retweeting and not answering their fuckin’ questions. Oh, my god, just apply a little bit of pressure and it comes right on out.

1:14:32

Oh, they get so sensitive. [In a whiny voice] “Why aren’t you…? You’re not answering…” Like, ohh my god…

Shireen: So part of the resiliency that you end up having in this space, and that you’ll have to have in this space, is that you have to be able to maintain that. Now, you know, some things like, I try my best not to engage trolls, I do my best not to engage trolls, but when I do see something—like one of my friends and I went after that chick, you know, Permit Patty, she deleted her Twitter account.

1:15:05

Well, somebody got her account, and I watched it, I watched it real-time. Someone got an account, started pretending to be her and then started trying to counter the narrative, right? So I watched it in real time. He literally started from a made-up name of hers—like a twisted version, like her last name was first and then her first name—and then as we kept talking to him, it kept progressing, and then it switched: he eventually was able to secure her original account.

So now people were engaging, thinking it was her, right? But me and a friend of mine were engaged; we knew it was probably a fake account. We kept saying it, even in the tweets—it was probably a fake account. But the reason that we engaged that person—not because we believed it was her defending herself—it was because the manner in which the language was being used was the language that other people were using to defend her.

So we were like, “We’re not letting this go.” And that person was like, “Anyone who thought this account was real is stupid.” No, idiot. You were the stupid one, thinking that you were doing something. And his answer was something like, “I did this as an experiment about fake news.”

1:16:33

This is not fake news. And my friend is a media person too. And the reason this is not fake news is because Permit Patty exists. She ain’t fake. And everything you said is everything she said. So how is that fake?! We were just like, “We figured out you were fake. We’re just letting you keep saying this bullshit!”

Kim: Yeah. Exactly. Exactly. And that’s, yep, exactly. And it’s like, “No, like, it could be about…” Mmhm, it could be about, you’re absolutely right; but it’s saying a whole bunch of shit that other folx are sayin’, that you need to be exposed to this.

Shireen: Exactly. Exactly. And we just basically, we weren’t letting this go. Weren’t letting this go, because we were not letting this narrative in this public space of all—this ain’t Facebook. This ain’t, this is not a walled garden, this is a public space that other people are gonna see and we’re nipping that mess in the bud, because the problem is we let that go. The reason that we’re in the problem that we’re in now is because we kept letting that go. The reason that people feel so upset about their voices being taken is because they’re OK with silencing us. They’re not OK with being silenced.

Kim: With being silenced. Yes! [Claps] And that is my thing. I’m just like, if I could shut this—and then it’s like, people like, “Kim!”, but, baby, I am watching something on Netflix while I’m doing this, this is not stressing me out…

Shireen: …stressing me out.

Kim: ‘Cause these—I don’t know these motherfuckers. They don’t know me, and as long as you’re not talking about physical threats, you could say whateeever you want.

1:18:04

Shireen: Whatever you want.

Kim: Whatever you want to, because I am not invested in…

Shireen: But the minute you start trying to threaten me, then we got a different conversation; and I had this pile-on that was happening, and it was over Joy Reid, ’cause Joy Reid’s account—her old blog, I think you’ve seen it in the news, and I’m writing something on this too—but Joy… [Pauses as though reading a message] My friend is—never mind.

But the Joy Reid situation is this—and this is another thing, it’s like media people don’t know tech…

Kim: No.

Shireen: And so when you have people trying to do a technical analysis on something, particularly what happened with Joy, was that her old blog got hacked, and so they changed the content of the old blog, made her look more homophobic and transphobic than she was. And then there was some issue about the 9/11 conspiracy theory, nonsense or whatever.

So you had journalists jumping on her. But the reason that they were going after her was the same reason about the bots situation. The bots were escalating the situation. She had already been hacked, and they kept re-sharing that content. Why does this happen to Black women all the time? Because we are the easiest victims, and most people won’t believe anything we say.

1:19:27

So Joy, being the journalist that she was, and being the journalist she’s been, covering real news with real facts and shutting down fake stuff—she will shut down fake stuff in a hot second. She is deadly, so they were going after her to keep her voice from continuing, and so that’s when the whole blog thing happened.

So she had an old blog that evidently got hacked, and she made—and here’s an example—she made a single mistake. She deleted it. The whole blog. She should have never—that’s an example of what not to do. No one should do that. You know why? Because there’s no content now to go back to, and now there’s no evidence to verify you’ve been hacked.

Kim: And that’s another reason I don’t block people, because then there’s one side of a conversation and…

Shireen: Exactly.

Kim: …you can’t get to it.

Shireen: We can’t get to it. So everybody is jumping down her throat, saying that she was hacked and she probably wasn’t—and that’s not true. She was hacked. Then the Internet Archives jumps in—and I know the Internet Archives well. I know those folx, and W3C, and all the stuff in between, ’cause there’s a whole bunch of white boys; and the reason that I know the Internet Archives very well is because I have been designing sites since 19-freaking-97.

So I know when the Archives is messing up. I know when the Archives have messed up my stuff. I know when I’ve gone back to the Archives to find my stuff and can’t find it. So I have a looong history with Internet Archives.

1:21:02

So when they were like, “Well, you know, it’s not us,” and people are taking screenshots of the Internet Archive of Joy’s blog that say robots.txt, right? So now the tech people tryna jump down my throat ’cause I called out the fact that the Internet Archives don’t know what they’re talking about.

And so they’re like, “What are you talking about?” And I said, “Because you can see the robots.txt on the Internet Arch[ive], ’cause you’re sitting here with the screenshots. Y’all are arguin’ with me and y’all got the screenshots of her evidence!” Because, even if her thing was hacked, the Internet Archives is only keeping this page that’s sitting there, right, with the link. So if the content is now missing and all you see is robots.txt and not the content, then the Internet Archives doesn’t archive everything the way you need it!

Kim: Yeah, it’s not, it’s not…

Shireen: It can be manipulated.
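[Editor’s note: the mechanism Shireen is describing is the Robots Exclusion Protocol. For years the Wayback Machine honored a site’s *current* robots.txt retroactively, so a later blanket Disallow—added by the owner or by whoever controls the domain—could hide already-archived pages behind a block notice. A minimal sketch using Python’s standard-library robots.txt parser; the blog URL and the rules here are made up for illustration.]

```python
# Sketch: how a robots.txt change flips what a compliant crawler may show.
# "ia_archiver" is the crawler token the Internet Archive historically honored.
from urllib.robotparser import RobotFileParser

# Suppose the site once allowed crawling...
open_rules = RobotFileParser()
open_rules.parse(["User-agent: *", "Allow: /"])

# ...and later someone publishes a blanket Disallow at the same domain.
closed_rules = RobotFileParser()
closed_rules.parse(["User-agent: *", "Disallow: /"])

url = "http://example-blog.invalid/2006/some-entry.html"
print(open_rules.can_fetch("ia_archiver", url))    # True: page may be shown
print(closed_rules.can_fetch("ia_archiver", url))  # False: only the
                                                   # robots.txt block remains
```

[The takeaway matches her point: whoever controls robots.txt at a domain—legitimately or not—controls what a robots-respecting archive will display, which is why a screenshot showing only a robots.txt notice isn’t evidence about the original content.]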

Kim: Yeah, yeah, yeah.

Shireen: OK? So don’t play games with me about something I’ve been doing since ninety-freakin’-seven! And I had to go after one of my media friends. He was freakin’ one of the White House reporters and he was going after—I said, “Stop going after her, this is bull.” And he was like, “But de de de, and these people, and I trust these people”, I was like, “Dude. 1997. I’ve worked with the Internet Archives. You can go check what I’m saying.”

1:22:25

So he says, “Oh, yeah, you’re right,” he’s like, “I know tech too, y’know, I know.” He’s one of the ones who—we’re in the DC tech space, so that’s how I know him—so he is one of the ones who could get it. So, we had that conversation, and when we got to the end, he was like, “Oh, yeah, you’re probably right. So I’ll let this go for right now,” and I was like, “But the problem is, right, no one else is gonna do that.”

Kim: Exactly, you had his ear, and you had to prove it before…

Shireen: And I still had to prove it to somebody who knows me, right?

Kim: Yeah, yeah.

Shireen: So I told him, he says, “But the hacking, the hacking….” I said, “It was hacked. It was. Her dumb ass deleted it. She should have left it because she could have went back and then changed it herself. That’s what you don’t understand!” [Coughs]

So anyway, then they started coming after me, right? Which I didn’t mind, people from the UK, from all over the place, trying to discredit Joy and trying to discredit me. And they were like, “Well, you go on that show with Joy so we can laugh at you.” I was like, “Let me on that freakin’ show, ’cause I got stuff to say.” So I did my little GIF of like, “Yeah, she takes me, I will get there,” and so then they kept coming, right? Because they weren’t able to get past where I was, and that was like, “The evidence is here. Y’all are doing it. Y’all are giving it to me. I don’t even have to verify it.”

Kim: Oh, my god—and that’s my whole thing. Another thing…

Shireen: Yeah.

Kim: It’s free marketing.

Shireen: Yeah.

Kim: The more I—the more you show your ass, I get clients, I get all kinds of stuff.

Shireen: Exactly. It’s like…

1:24:00

Kim: Because people are like, “Oh my god, I didn’t know this.” Yep. You needed to see it.

Shireen: Exactly, because you need to see what’s happening. And so then one of them made a gun threat.

Kim: Ohhh.

Shireen: And I was like, I could engage all day long in any kind of way, and I’m going to, but as I was doing that, that gun threat tweet? I reported that damn thing.

Kim: So who did you report it to?

Shireen: Twitter.

Kim: OK.

Shireen: And so as the conversation—’cause this person was the worst of all of them—so at some point, that person starts to disappear, right? So, Twitter banned them. So, one of their friends was like, “That’s not fair, that’s false reporting.” I was like, “False reporting? It’s a gun threat. Here’s the tweet.”

Kim: Yeah.

Shireen: So, y’all wanna talk about tech? Let’s talk about tech. And I’ll argue with you from here to Doomsday. But I’m not sitting here allowing a gun threat against me to go unchallenged.

Kim: Ain’t nobody trying to be a martyr for this shit.

Shireen: I was like, “Sorr-y that you want to stop engaging me now, ’cause you’re afraid of false reporting from me? Feel free. Don’t threaten me. We can have many arguments, but do not threaten me.” And so they, of course, stopped, right? Because at that point, nobody wanted to talk to me ’cause they were now afraid of getting banned.

Kim: Yes. Yes.

Shireen: But that’s [the] exact thing. I was like, “You’re more concerned about false reporting than the fact that I was threatened.”

Kim: Yeah.

Shireen: There was a gun threat, you know? Like that’s some bull.

Kim: That should have been like, oh my god.

1:25:30

Shireen: But that’s the thinking, right? Because that’s silencing them.

Kim: Yes, yes. And that comes back to the whole, anything we could do that shuts them down even for a moment is a victory.

Shireen: It’s like—it’s a shocker to them, mostly.

Kim: Yep. Because they’ve never been challenged in these ways…

Shireen: Yes.

Kim: …and they’ve never had consequences to behavior.

Shireen: Exactly.

Kim: Yeah, yeah.

Shireen: Because most of their behavior goes unchecked, and the minute they get checked, that’s when they start acting a fool.

Kim: Mhmmm. Yeah, oh yeah, it really shows up. It really shows up.

Shireen: They really start… [coughs]. And that’s key to this conversation, this is key to these next levels, because once you start getting people to see the code of what is really happening, what the tactics are, how they’re engaging with you specifically, and why they’re engaging you in this way; the full circle is still predicated on race and gender, and it’s really interesting to me when I watch people who don’t want to get to the bottom of that actual problem, the systems that are in place.

I had several people from my old days in tech, they were trying to talk about Kylie Jenner, and one, you know, Brittany Packnett did this whole thread trying to call out what the whole problem was, which was about cultural appropriation and theft of products.

1:26:58

So—VCs, tech VCs, are arguing over the business strategy thing here. So I’m just like, you know, “Guys…” like, I jumped in, Arlan [Hamilton] jumped in, and she says, “Do you understand what Black women are saying, right?” ‘Cause that was the other piece, was like, “Do you get what Black women are actually saying about this? This isn’t about Kylie being self-made. This isn’t about Kylie’s business acumen. This is about IP [Intellectual Property] theft, and any of you VCs would know,” like, “You would not like this.”

So we’re having a good conversation, this is not one of those debated nonsense ones, but we were just like—so a couple of guys jumped in, and I was shocked that they were on our side. You know? I was like, “Whoa! Whoa, white dudes, VCers!” You know? So then I just said, “Look, I don’t care about people having to work hard. I don’t care about—I want people to get credit for their work. But I do care about theft of product and someone making money from them.”

And I said, “You know, in the tech industry, there’s a lot of stuff out there, there’s a lot of things people have tried to steal from me and still do, all the time, and think they can get away with it. As we go back to the issue about Twitter, because if we are Black, and we’re the ones creating, you can pretend like we still don’t create anything by basically taking it from us, and somebody will give you money for it.”

1:28:31

Kim: And this is no different when they stole our music and, and…

Shireen: Yeah, exactly, it’s the same thing. And so I’ve gone in and out of the industry because of this, because a lot of that is based on that feeling. It’s like, every time I create something, nobody wants to fund what I create…

Kim: Oh my god, yes. Ughh.

Shireen: …but someone can steal it, and then all of a sudden it’s the best thing on the planet. And so the Kylie Jenner thing was a great example of that. And so we got towards the end of that, where people start—like the bell starts to go off, right? And then with Arlan—there was a problem, I felt there was a problem [with] one of the analogies, and I was just like, I mean, let me not get mad, let me just sort of get her to think through what she’s saying. She’s making an analogy that’s not the same, right? It’s nowhere close to the analysis, and I don’t think she did it out of harm; I just think that in her head that was the best way she could phrase what she was understanding.

So we get toward the bottom, and Arlan says, you know, someone says, “I’m jumping in, I’m viewing, I’m watching, I’m trying to understand, is there something I can see to kind of get at the stolen appropriation stuff? ‘Cause I’m trying to wrap my head around what you guys are saying, right?” So that’s a fair question. So she said, “Go ahead and Google,” she did what I would normally do, “Go Google Brittany.” And so I took it upon myself at that point—’cause I felt like these are people genuinely trying to understand—so I went and got Brittany’s thread, and I said, “Here is her thread. You should read it.”

1:30:15

The guys went through the entire thread, got to the bottom, and they were like, “Whoa, not only is this a good thread, but that last piece, that piece by the Daily Beast, is damning. Like now, there’s no question about this part.” And then I didn’t have to say anything anymore, right? Because Brittany had already done the thread, there was no reason for me to jump on and repeat any of that. They got to read it for themselves, which is the problem with that: they have to see it for themselves to even understand our relationship.

And then when they got down to that last piece, they were like, “Wow.” And then they started retweeting the piece, and CC-ing Brittany, and then Brittany was like, “Everyone started threatening me, started coming after me, ’cause they were trying to protect Kylie, and there’s receipts. There’s receipts,” ’cause people weren’t believing her, and she’s like, “There’s plenty of freakin’ receipts.” And so it was that hard part, right?

But these were people I knew. So somebody jumps in and they go—somebody I have known for quite some time—they go, “This is the way Twitter used to be. This is like 2010 Twitter, when we were all good people, we’re having conversations about a difficult subject and we could get through it,” right?

1:31:25

Kim: Mhm.

Shireen: And I was like—I didn’t say anything, I just kind of let them go at that point because I knew them. And she came back and she said—I jumped in; normally I wouldn’t, but I jumped in because I knew in this moment I could, right? And so it ended up being a healthy conversation.

But it was another—to me, and Arlan is doing amazing work on getting women of color and LGBTQ people funded in the tech industry, so she’s doing this amazing work—so the turn on that was some of those VCs are funding her, and funding that work. So it was like, it could have been a deadly conversation, you know? But it was that moment of, this is people who are trying, and this is what this teamwork should look like.

Kim: And that, and you just—and we’ll end on this, because this is why when people say, “Why don’t you get burnt out?” and all this, it’s because it’s a strategy and my strategy is just what you said. I’m not—my—when I speak, it’s not to these random people on Twitter. It is to the individuals who’ve chosen to follow me, and it’s the people that—those are the people who need to see what’s happening, because they’ve elected to follow me. I didn’t follow you.

Shireen: Right, mhm.

Kim: I didn’t ask, ’cause I’m not trying to collect followers. I don’t need a whole bunch of people comin’ out, no allies, none of that. But what I do is, because you elected to follow me, when I find that I have the bandwidth to do so, I want to…

Shireen: Exactly.

1:33:02

Kim: …I want to provide as much valuable content for you as possible so that you could—excuse me [sneezes]—understand and see what this stuff is, because it comes in so many cloak—different, it is—oh my god, it is so many—comes in so many disguises.

But just listenin’ to you, and giving this historical perspective, I can tell you, you’ve done a great thing for me in this moment, and I wanna really thank you. Thank you. Thank you. Because, I knew that the Stack Overflow thing—because people are like, “Why?” Because it’s important to me. And because Stack Overflow is the biggest platform that developers are getting content from and it’s a harmful place for many individuals.

And the fact that you are citing—and having been in this industry for so much longer than I have—are citing the same experiences, just confirms to me that I’m not being paranoid, sensitive and other things. I have outlined for Stack Overflow steps they can take that will make me feel that they’re going in the right direction, and they have yet to engage with me.

I am not throwing out shade. I’m not threatening. I’m not doing any of those things. What I’ve come to them with, is as a business person who consults on these issues, who writes about these issues, who knows these issues intimately, and you continue to ignore me, but yet speak to other people who don’t have my same hue?

1:34:42

Shireen: Yeah.

Kim: If I didn’t get anything else—and I got so much from this conversation—it was that it confirmed that as a Black woman, I don’t care how much you talk about your desire to have inclusion and diversity, that is not your intention.

Shireen: Yes. You have—the intent matters, and then it’s the actions; and honestly, the tech industry has failed in this, and as they have been talking more publicly about it, they have not been acting.

Kim: Yes, it’s marketing and PR at this point.

Shireen: PR.

Kim: Yep, but they’ve shot themselves in the foot, because before, if they weren’t speaking about it, no one could hold them accountable.

Shireen: Right.

Kim: But now that you’re saying what you’re doing and you’re not doing it? Now, as you say, where’re the receipts?

Shireen: Exactly, exactly. I mean, Facebook just—I’m still writing all this stuff up—but Facebook just released their diversity data on Thursday…

Kim: Yess. Good lord.

Shireen: …and you can see. It’s obvious. And now…

Kim: And now they blame…

Shireen: And now the story…

Kim: …they blame a Black lady for the problem!

Shireen: I know, I know. Isn’t that somethin’ else? We know why. 

1:36:01

Kim: Oh my god.

Shireen: Yeah, I mean, but it’s the same, it’s still the same problem. But the difference now, in doing this, is you can start to have a better conversation: that it’s not just women; Black women, Latina women are not being afforded the same opportunities, even when they create stuff.

Kim: Yes.

Shireen: OK? Even when they create stuff, even when it’s their idea, even when they have the years of expertise, we are still and will forever be questioned about who we are, and it’s more about who we are, our—bodies that we’re in, than it is our intellect or ability to do the work, or our intents.

Kim: Yeah. And I used to put—I mean, and I knew that subconsciously, because actually my third slide when I was speaking would be my name and the title, my contact information, and then my credentials. And then at JSConf EU I told them—and I was very specific—I told them I was no longer putting that in, because I’m sick of trying to prove myself to whiteness. I’m done.

Either you take me for what I have and what—I have content. I don’t have as much content as your ass does [Shireen laughs], but I have content out there that you can go to if you have a question about my qualifications. And it also speaks to—and I wanna end on this, because this is where #CauseAScene got started: on International Women’s Day of this year, my talk was “Why aren’t all women making gains in tech?”, and that speaks specifically to that—white women are not diversity.

1:37:43

And they’re making gains in tech because they also benefit from the unearned privilege of whiteness, unearned benefits of whiteness and privilege. And they are often barriers for our success.

Shireen: Yes, they’re—I told you, the top thing in our research for harassment is white men, white women are second, and then men from our own ethnic group. So when I break those things down like that, people are shocked, ’cause the data that I collect is only on women of color; only on women of color, because the narrative of that story is still missing.

That’s why the diversity for them in the industry has not increased. It’s why we don’t have—and everyone’s like, “Well, you shouldn’t just be hired because of your gender or your race,” and it’s like, you know that’s not what we’re doing, and you know that’s not what we’re talking about. So let’s get past that. I have been doing this work since I was 10 years old. There’s a lot of decades behind that right now, you know? So this…

Kim: And that’s one of the reasons the #CauseAScene conference—it used to be that it was for underrepresented and marginalized. Well, everything I did was for underrepresented and marginalized, but the more I learned about whiteness and white women in this space, I’ve removed underrepresented, because that’s the group they fall in; very few of them fall in the marginalized, unless they’re in the LGBTQ community or have a disability.

1:39:13

And I’ll focus only on amplifying the messages and stories and providing a platform for marginalized individuals, because we’re the people who are, as you said, we’re questioned, we’re shut out, and we need safe spaces to tell our stories to this tech industry, who needs us again—and I’m literally gonna end on this—to make you money. I just don’t get it. I don’t get that your hatred of us outweighs your desire to make money.

Shireen: Yeah.

Kim: So thank you so much. This has been an amazing call, talk, and I know it went over, but—and guys, I sent her a chat while we were talking that if she had the time, I wanted to make sure we thoroughly talked about this issue, ’cause this is the first guest who I’ve had on who really talked about the historical perspective, and who’s a Black woman who can talk about it from that angle. So thank you so much for being on the show and taking the extra time to educate me; if nobody else got anything, I am well—my cup runneth over, so thank you so much.

Shireen: You’re welcome. And thank you for having me.

Kim: And have a great day.

Shireen: You too.