Kate Klonick

Podcast Description

“Everyone wanted diversity for the board, but it was really unclear as to how on Earth you have diversity over an entire globe with only 40 people on the entire board.”

Kate Klonick is an Assistant Professor of Law at St. John’s University Law School and an Affiliate Fellow at the Information Society Project at Yale Law School. Her current research focuses on the development of Facebook’s new Oversight Board — an independent body that will hear appeals from Facebook users and advise the platform on its online speech policies. Thanks to individual grants from the Knight Foundation, Charles Koch Institute, and MacArthur Foundation, she has amassed more than 100 hours of interviews and embedded research with the Governance Team at Facebook that is creating the Board. The results of this research will be published in a law review article in the Yale Law Journal in 2020, but will be available in draft form online in late 2019.

Kate holds a JD from Georgetown University Law Center, where she was a Senior Editor at The Georgetown Law Journal and the Founding Editor of The Georgetown Law Journal Online; and a PhD from Yale Law School, where she studied under Jack Balkin, Tom Tyler, and Josh Knobe. Between law school and her time at Yale, she clerked for the Hon. Richard C. Wesley of the Second Circuit and the Hon. Eric N. Vitaliano of the Eastern District of New York.

She has a background in cognitive psychology, which she applies to the study of emerging issues in law and technology. Specifically, this has included research and work on the Internet’s effect on freedom of expression and private platform governance. Kate also writes and works on issues related to online shaming, artificial intelligence, robotics, content moderation, algorithms, privacy, and intellectual property.

Her work on these topics has appeared in the Harvard Law Review, the Southern California Law Review, the Maryland Law Review, The New Yorker, The New York Times, The Atlantic, Slate, Lawfare, Vox, The Guardian, and numerous other publications.

Kim Crayton: Hello everyone, and welcome to today’s episode of the #CauseAScene Podcast. Today, my guest is Kate Klonick. Kate, could you please introduce yourself to the audience?

Kate Klonick: Yeah, I’m a professor of law at St. John’s University, and I write primarily about how tech and the law intersect, and specifically about private platform governance and how sites like Facebook and Twitter and Google, YouTube influence our ability to speak online with their private governance policies.

Kim: OK, so I always start this show with two questions. Why is it important to cause a scene? And how are you causing a scene?


Kate: Yeah, well it’s always important to cause a scene, otherwise—especially if you’re coming from outside one of these very large companies—I don’t think that people know to do anything differently unless somebody outside causes a scene. And so that’s kind of my argument for it. I think that otherwise you just continue with unjust policies, the status quo, and very biased, non-neutral baselines for a lot of policies, and never have them be any better.

I guess the way that I’m causing a scene is, I try to explain and do arbitrage between what a lot of these platforms are doing and other people who want to change what the policies are. Which is basically just to say that I think that people outside these companies have a lot of really correct instincts, and I am one of them who wants to change how things are done. But I think that it’s hard to convince people to change until you can speak in their language and understand how things are exactly working.

So I try to document exactly what’s actually happening at these companies and translate it, then with the effect—hopefully—of having better policies and changes that could be put forward, instead of just mindless legislation where no one wants any of the effects that end up happening after you have outrage legislation or something like that. So that’s how I guess I’m causing a scene.


Kim: Alright, so everyone, I came across Kate… I’m sure someone—’cause I wasn’t following her—shared it and I saw it in my timeline. On September 17th, she tweeted:

“Today @facebook is releasing the charter for its Oversight Board; the “Supreme Court”-like body that has been in formation for the last 9+ months. Under independent grants & no NDA, I’ve been embedded w/the FB Team that’s been working on this and documenting the process. 1/2

My research will be summarized in a forthcoming article @YaleLJournal [Yale Law Journal]; but I’m also available to talk about my observations now. DM for more info.

The more transparency and push back on this Board, the better it is for all of us. 2/2″

So everybody knows that when someone tells me their DMs are open, even when they’re not, [Kate laughs] I actively try to reach out, so I was intrigued. I actually retweeted this, and I was intrigued. So before we get into any other questions I have, could you give us an overview of what all of this is?


Kate: Yeah, totally. So this is in keeping with my work. I was reporting—I had a law review article in 2017 that was from—I didn’t have any access inside any of the companies; I spoke with people who had left the company, and people who would speak to me anonymously who worked for the companies. I wrote a paper that was basically describing how they do content moderation. Do you want me to describe what content moderation is?

Kim: Yes. Please.

Kate: OK, sure. So, content moderation is basically the fancy term for what Facebook or Twitter or whatever decide to take down or keep up. The rules that they run and enforce, internally at their site, to basically moderate content—to moderate what content you can see and what content you can post.

And so I did this very large descriptive project on how they did that in 2017. Shortly after that, in November of 2018, Mark Zuckerberg announced that they were going to be starting a user accountability appeals board. That was something that people who have done content moderation work for the last 15 years, and academics and advocates and a lot of people who wanted to see more transparent policies from these platforms, had advocated for, for a long, long time.

And so it was like, “Well, I’m gonna write about this,” and so they ended up inviting me to be embedded, but I only would do it if I could have absolutely no NDA and I could tape everything. That was to protect my independence and all of the claims that I was going to get from being somehow favoring Facebook or something, but I really just wanted to see what the heck was going on.


So they started to build this out over the course of one year—that was the timeline that Mark laid out—the course of one year, for this appeals board to globally review content. So literally, Kim, if you have a photo that they take down, you can go through the internal Facebook appeals process, and then the next couple of months, what will happen is if you think that your photo should be up according to the Facebook rules, you can then appeal it all the way up to this oversight board that will look at your content and is outside of Facebook, is independently funded, is a group of people who are not appointed by Facebook directly etcetera, etcetera, and will review your content and issue a decision that the entire world can basically see so that we can start having some type of transparency on this black box of what they’re taking down and keeping up.

So that’s what I’ve been looking at over the past nine months, and really just trying to document what’s happening. That’s why I was like, “My DMs are open,” because I also think that it’s a little weird that I was one of the only people who got let in, and I don’t know if it’s because people didn’t know this was happening and there’s just—no one is as big a dork as me, and so they weren’t bothering Facebook for this or what, but I do think that it’s my job to just tell everyone exactly what I saw.


Kim: OK, so you hit on a lot of questions that I had, so thank you. It always happens this way. I wrote some notes, and I’m trying to be more diligent about doing this podcast, ’cause I’m really a fly by the seat of my pants kind of person. But because I didn’t know you, I wanted to do some research, other than I was interested in the content, because I’ve spoke a lot about content management on Twitter, Facebook and how these companies, because they lack various perspectives, have caused harm, although people in the communities who are not inside these walls have been telling them that there’s harm being caused. So you hit on that. So one one of the things that—and you mentioned about how you had your DMs… so, oh yeah, you just hit on a lot of things ’cause—I’m just gonna just read you some of the questions I wrote down.

Kate: Please.

Kim: I was looking at the oversight board because it says, “What type of board members? Trustees? What’s the diversity? Are they for marginalized groups? Sounds pretty privileged to me. How did you get embedded? Who were you embedded [with]?”

Then I wanted to bring up—because you said it’s independent money—how is it independent money if Facebook is funding it? Based on this oversight chart that I see, it says, “Relationship with Facebook: the trust will receive funding from Facebook and consult on future trustee selections.” Then I saw in the thread of this that you were getting some challenges based on that article that you wrote, which is entitled “Inside the Team at Facebook That Dealt With the Christchurch Shooting,” and so you were getting some pushback on the fact that you wrote this article and people thought it was positive. And so they challenged you on conflict of interest and bias related to the article.


So there’s a lot of places for us to go and that’s why I said this hour will go by faster than you think, but I want to say thank you for being open to comin’ on the show without even knowin’ who I was and what this was all about. That says something, ’cause most people would not—particularly if they know my reputation—do not just willingly come on the show without having researched me because I am pretty hard nosed. Because that’s what I do. I cause a scene. So…

Kate: I love that. That’s fine. [Laughs]

Kim: So let’s just answer some of these questions. I get your areas of expertise and where you do research. I really like the fact that you were like, “I was the only person there and why was I the only person there? Is it because I’m a nerd?” And it also goes to the oversight board. Who’s on this board? What’s the diversity? Because it’s not gonna be helpful if this board is made up of other privileged individuals—like most boards of companies like this—who are these high level, very privileged, very wealthy individuals. So let’s talk about that first before we get into the challenges of what people have for you personally.

Kate: Yeah, totally. OK, so to the question about the board. One of the things, so for the first six months, January to June, Facebook did this—the team, I should say. I would really try—it’s a hard habit to break—but one of the things that I’ve tried to do is stop seeing Facebook as one entity. [Laughs] It’s this huge, huge, tens of thousands of people company.

Kim: It’s IBM. It’s just as… yep.


Kate: Yes, 100%. And I think the people that know better—like I, and you, and other people—still say “Facebook this,” “Facebook that,” but this was one specific, very small, 10 person team that was called the Governance Team that was supposed to be creating this board. They did a six month, I would call it a listening tour, another word would be a consulting tour, where they basically just talked to anyone that wanted to talk to them, and also tried to identify people that they thought would have really good insight and invite them to a series of six larger workshops that were held in Singapore, Nairobi, New Delhi, Mexico City, New York, and Berlin.

The workshops were about 40 to 50 people, and they were people from the general hemisphere / area—that geographical area. They were two-day consultations, basically, and people sat at workshops and went through simulations and gave their feedback on what they thought the board should or should not have.

Interestingly, I was kind of impressed by the thoughtfulness of this; I wouldn’t have thought of this. They started them in the global South specifically because they knew that their bias—being in Menlo Park—was going to be to a Western / Western European or American cultural baseline. So they wanted the first people that they heard from—at least this is what I was told—to be from the global South so they could have that as the baseline of what the board should be. For whatever that’s worth, I thought it was kind of an interesting way to think about it.

The one thing I will say—so I’m getting off track about the board itself—the one thing I will say is that everyone wanted diversity for the board, but it was really unclear as to how on earth you have diversity over an entire globe with only 40 people on the board.



Kate: …with only 40 people on the board. And 40 was basically the number that was picked because after that, it just gets to start being an unwieldy group to manage logistically. It might grow—who knows?—but for right now, I guess that was the number. And if you pick one person from the United States—if you picked you, or me, or Lindsey Graham—we would be different types of Americans with different types of views. And I wouldn’t say that any of us are representative of America, per se. So that seemed like you couldn’t just pick a random person.

But generally, I would say that when people talked about diversity, the thing they were most concerned with—when I say people, I would say the people giving feedback to the governance team—people were concerned with marginalized, vulnerable populations not having a voice.

They have not announced who any of the board members are going to be yet. And unfortunately, I do think that there is going to be some amount of privilege because they decided to go with a “qualifications-based” approach, which is having a background in decision making that’s judicial-like or legal, and being able to neutrally apply rules. But that being said, I think that within that, they’re trying to find as much diversity as possible. But again, I have no idea. I don’t think very many people know—even at the team—who they’re really formally looking at yet for the board.

Kim: OK, so. I’m gonna sound off on some things here.

Kate: No, please.


Kim: Yes. So, first of all, the term “American,” it’s so interesting that the US is the default for America, when there is North, South and Central America. And that is problematic. And so the fact that they consider the South, OK, that’s great. Global diversity—this is one of the reasons why I developed the four guiding principles for #CauseAScene and how we will move forward and how I will engage and have conversations. So the four guiding principles—let me restate them. One is, “Tech is not neutral.” Two is, “Lack of inclusion is a risk management issue.” Three is, “We must prioritize the most vulnerable.” And four is, “Intention without strategy is chaos.”

So when I’m thinking of… so I’ve been doing a lot of talks lately and a lot of workshops lately, because we are inviting more marginalized individuals into spaces who have always been isolated for their own protection and for their own ways of being successful. How do you bring these marginalized groups together so that we can be cohesive and it not become a thing of we’re jockeying for “I need this,” “We need that,” but everybody’s safe and we’re all movin’ in the same direction.

And so, the fact that… lemme state this other thing: one of the things I talk to my consultant clients about a lot—or even when I’m speaking—we need to stop acting as if we’re in an industrial age, where we’re building widgets; we’re in a knowledge age, which is an information economy, a knowledge economy, and we need to be hiring—to me, diversity is about the variety, so it’s about how you hire; and inclusion is about people’s experience, so it’s about retention. We need to stop thinking about assimilation, which is, you come into my organization and you fit the organization; and start thinking about accommodation, which is, every time someone new comes into the space, the space changes.


So I have some serious concerns that they have these arbitrary measures of who gets on this board based on judicial blah, blah, blah, because the people who are directly impacted don’t have that experience, those experiences. This right there tells me that it’s still very elitist and very privileged and very concerning for me.

Kate: Yeah, but, sorry to interrupt, but isn’t any judicial system like that? I mean, well, maybe that’s your point, too.

Kim: Exactly. [Kate laughs] And that’s my point. Why aren’t the people who are closest to the problem—the ones who are most impacted—being used for their lived experience as experts? Why is it the theory people who get to tell this story, who get to make these rules, instead of the people who have the practical lived experiences?

Kate: Yeah, I mean, I think that’s…

Kim: And this is something—when I’m talkin’ to my clients and they don’t have job descriptions or people’re saying, “Oh, a COO is not necessary,” well, a business is processes, procedures and policies. If you don’t have that, all you have is a product or service at scale. You don’t have a business, you have an organization. So even with my clients, when we were doing a COO job description, they had all these qualifiers that they needed to have; “They need to understand the SaaS market,” “They need to understand…” I’m like, “Hold up. They’re coming with their expertise in policies, procedures, and processes. They can learn the business. Why are they required to know so much more than anybody else, when you have no idea how to put in place what they put in place?”


So for me, a perfect—an ideal match, I’m not gonna say perfect—would be a CEO and a COO who work hand in hand, and the CEO trains the COO about the business, and the COO trains the CEO about how to—not only put processes and policies and procedures in place—but how do you measure them? What does the data tell you? Because that’s what I’m comin’ from.

So I told them, what is the difference between—this is a small team—what is the difference between… you’re saying all these requirements; I would rather have a manager who was at one of the most popular nightclubs in the city who is turnin’ over liquor, and has to be able to know how to order, what nights to order, all these other things. That tells me a person who understands processes, procedures and policies—they can learn the business.

Kate: Yeah.

Kim: So this is my point. Why do we need judicial folx there—exclusively—when you have people with lived experience who can learn? Who could read? To understand about governance.

Kate: No, of course. I want to say also that was an example I used. I’m not sure that that’s… that’s not everyone that’s gonna be on the board. But it…


Kim: That’s fine. So I’m gonna push back on that, because I don’t care. I don’t care. To me, that’s just another sign of how the tech space ignores the people who are most harmed and continues to put people who are farthest away from the harm in charge of making the decisions about what happens. That goes against the principle of prioritizing the most vulnerable, and this is why I say this. Because if I’m in a room, if I’m in a room wi’ white women, I am the most vulnerable in that room. And so my needs need to be prioritized. But if I’m in a room with trans Black women, their needs need to be prioritized because they are the most vulnerable, and once the most vulnerable feel safe and protected, everybody else does. So that to me answers your question of how you get global diversity.

Kate: Yeah, but how do you get global diversity, because what you just said was that there’s some type of hierarchy of vulnerability. And I guess I would—not disagreeing with that—I would just say when you put that on a global scale, how do you decide who… so like, are the Rohingya, by default—in Myanmar—the most vulnerable population in the world because they’re being targeted for genocide? And so how do you end up… I guess what I’m trying to say is, I’m not disagreeing with what you’re saying about making people safe, but I’m also trying to be like, how do you possibly… there’re so many levels in which…

Kim: OK, so let me answer. Lemme answer that, because it also speaks to another part you talked about. How did you get into this? ‘Cause you said they were talkin’ to anyone who wanted to talk to them. I didn’t hear about this. I would have loved to talk to them. Where was this advertised? Lemme give you an example. This is the analogy that I often use. So Stack Overflow. You heard of Stack Overflow?

Kate: Of course, yes.


Kim: OK, so Stack Overflow and I had battled a lot in 2018 after that survey came out, because they positioned it as if it was the global developer community, when in fact, it was the 95% of white men between the ages of 18 and 34 who felt safe enough to participate actively on that platform, who actually even knew about the survey, so they engage more actively than anybody else.

So I challenged that. I’m like, “No, that’s not the global developer community. You need to say that that’s the community that engages with you actively.” And so when asked why the numbers for people of color, LGBTQ, were so low, the response was a whole bunch of causation, but you couldn’t prove to me causation. It was, “We reached out to 50 organizations.” Hell, I live in [redacted]. There are 50+ groups of marginalized people just in this city. So 50 is not enough. And I’m sayin’ this because I see it in organizations all the time. “Well, we did this.” Well, it did not work. Now, what else are you gonna do?

Kate: Yeah.

Kim: So, for me, global diversity is about—there is no formula, there’s no A plus B equals C—it is about… and that’s why I have the four guiding principles. Because to me put together, they paint a more complete picture.

Kate: Also, I’m gonna say that I love your four guiding principles. [Laughs] I think that they’re completely correct.

Kim: Well, I had to develop them because I recognized that the work that I was doing—the work that we’re doin’ as a community—we were causing harm, even in our intention to not cause harm.

Kate: Yeah, it’s so hard.


Kim: Harm is gonna happen. What we need to do is minimize it. Bias is gonna be there. We’re human. What we need to do is minimize it. Conflicts of interest are gonna be there. What we need to do is minimize it and be transparent. So I guess that speaks to me, why I challenge all this, because tell me, transparently, Facebook, where are all these individuals [that] said anybody who wants to talk to them, where’d they come from?

Kate: Well, I will just say that I don’t think I would have come across it if I wasn’t so involved in this—just to be totally fair—but they did have a global portal that they opened, and they posted about it in their newsroom, and they put it out a lot for about three months, I think, that invited anyone to write in and give thoughts, and asked them to go through a flow surveying them about what they thought the most important parts of the oversight board were, and that was open to the entire public. About 1,200 people filled out that part of the survey.

Kim: OK.

Kate: But again, I’m with you; I think that type of thing, for better or for worse… that wasn’t… I didn’t see… I don’t know if I would have seen that…

Kim: That shoulda been an ad. If you wanna advertise for me on Facebook, that should have been an ad on everybody’s timeline. If I don’t go to your newsroom or whatever, why would I find that?

Kate: Nope. Totally.


Kim: This is the problem. Facebook uses—it has—this is where it gets on my nerves, because this is not something they would even have to make up. They make plenty of money off advertising. They have crap in my feed that I don’t even care about. If this was something that they really cared—to me—about getting the diversity of perspectives and that global diversity, it should have been in everybody’s timeline.

Kate: Yeah, that’s a really great point. I think that’s a wonderful suggestion. I didn’t start shadowing them or being embedded with them until late May, and so by that point, the portal had been up for a while, but I completely agree. And also I was really trying to the best of my ability to not make suggestions to them because I really felt like I was supposed to be more of an observer and so—for what that’s worth—but I agree with you that putting it in an ad in everybody’s timeline would have been a much better way to generate attention for it and get feedback. I mean, it also could have ended up becoming like a Boaty McBoatface incident where you have eight million—you know, you have all of these people writing crazy stuff in.

Kim: And? And you have content moderators. You have a system for it. So none of that would have mattered to me. I mean this platform, this is not the Ace Hardware on the corner [Kate laughs] where—if I’m in Washington state on some street in Seattle—that I’m expectin’ to get feedback from someone from Atlanta. No, that’s not what this is. This is a global organization with tentacles and money everywhere. And so the fact that that did not happen, particularly—or at least, you have so much data on us, you could at least put it in the timelines of the people who you consider most marginalized and most vulnerable.

Kate: Mm-hmm. Totally.


Kim: You do all kinds of stuff with data and comin’ up with algorithms; if you didn’t even want to do it for everybody, you could have picked, just clicked, checked that box, and hit send or enter and put it in the timelines of the people who you consider—who have been identified as—the most vulnerable. The people who have sent you the most reports on being harassed, on being silenced. Those are the people you should have reached out to, and that would have been global.

Kate: Yeah. No, I think that that’s completely—I’m not disagreeing with you. I think that’s actually really… yeah. I think that that’s a brilliant idea. Everything you’re saying resonates with me.

Kim: And these are the conversations I have every day because people—and this is not about you, this is about the audience—because people, you don’t have my perspective, so you wouldn’t see how to do this. You don’t have my background, so you wouldn’t see how to do this. I’m an educator by background—and not only am I an educator, but I’m special needs certified. So all my students have to come with me. I have to figure out how to get all of us there at the end of the year. But we going to get there differently.

Kate: No, I think that that’s completely—that’s exactly right. But I do think, really more to your point, what you’re bringing up is a very actionable suggestion frankly. So I think that it’s a very scalable…

Kim: Exactly. Exactly. I’m not askin’ people…

Kate: This is a very good, a really good suggestion.


Kim: Exactly. I’m not asking someone to go build something. You have this already. This is about how your platform functions. Use the stuff for something other than extrapolating data and making money, use it for this. Just flip a switch and give it to another team. Here is the platform. Now make this thing happen.

Kate: Yep. No, I think that that’s a really great suggestion. I think there’s—you know, it is hard. You are not everyone. You have this background in tech. You have this background as an educator. You have this background speaking for marginalized voices, and causing a scene, [laughs] which is, I think, wonderful. But everyone thinks that they have an opinion on Facebook, to be fair. I’m not saying you’re one of the people—your qualifications are pretty unique—but everyone has… and so some of the feedback is more useful than others. And people…

Kim: And I’m gonna stop you, because that is why—they created that situation for themselves, so deal with it. Because, as you say in your article, tech—Facebook and other companies—have not always done enough to address hate. Many early policies took the absolutist view of the importance of free speech.

Kate: 100%. Yep.


Kim: So I have no sympathy for them. No empathy at all. You created this when people were tellin’ you, at the beginning, you did not have enough voices at the table. So if you’re getting stuff that’s not of value, deal with it, figure out how to vet it and move on. Because that, at scale, is gonna tell you how to fix this stuff. See, this one problem that you just said will help them come up with strategies and solutions to move forward.

Kate: No, 100%. I will say this, which you might find interesting because I found it super interesting: some of the feedback that I heard that they had gotten from the populations in the Global South—I was anticipating that they would be more pro speech. Basically because I thought I saw some amount of autocratic governments, and speaking out against autocratic governments, and there being a value there. So I thought that some places in the Global South would be more free speech–balanced than maybe Europe, where you’re having a lot of restrictions now on speech.

What instead seemed to be the case—and of course it was not, as we just finished discussing, a perfect group—but in a lot of the surveys and a lot of the conversations with people from the Global South—and, of course, the Global South is not one place—what was interesting was that where people seemed to have a weak rule of law, they were much more concerned about safety than they were about speech. Specifically because one person deciding to go have a mob shaming, or a pile-on online, or passing around messages, was so much more—they didn’t have a strong police force or a sense of basically being able to protect themselves in other ways. That was a moment in which I flipped my bias about what I thought was happening and what I thought people would be needing. And that was, I don’t know if that resonates with you, but I thought it was a super interesting finding.


Kim: Well, it resonates with me in a different way because I don’t find it interesting.

Kate: You don’t find it surprising at all? Yeah.

Kim: I don’t find it surprising at all. No. Because I’ve started adding in my speaking moving forward, that I need people to understand I have thought about and have put in the contract and in the code of conduct, things about weapons at conferences. Because my personal safety is—the more my platform grows, the more attention gets to me. And I try to stay in tech and there’s some people who really don’t like me in tech. And I don’t engage actively with white supremacists and Nazis and the alt-right, because I’m a Black woman from the south. I know where my safety lies.

And so I am not surprised by this at all, because—and again, when I talk about prioritizing the most vulnerable—I’ll be speaking in Germany next year, and I’ve asked that they have an awareness team and da da da, but where in your code of conduct do you address weapons? Because if you buy a ticket, you can just walk into a conference. They don’t have—I have not been to one that has metal detectors.

Also, have you… because almost every conference I speak at, almost every one, I’d say 95% of the time I am—well, first of all, my third slide is always a content warning: I’m here to make white people uncomfortable. So I tell you that up front. And almost every conference I speak at, I am reported for a code of conduct violation. And when they ask the individual what the issue was, it is, “She made me uncomfortable,” “That was inappropriate,” “She hates white people,” blah, blah, blah. And what I don’t need—and this is one reason I will not speak in an open carry state in the US until I know that the organizers have thought about my safety—is some white person getting their feelings hurt and, not thinking, not having impulse control, starting to shoot. I’m just not doin’ that.


And I also recognize that I have a privilege and the platform to demand this so that other people who are more vulnerable than me can have it.

Kate: Yeah.

Kim: So again, we’re talking about bringing more marginalized individuals in the community. I’m a Black woman. Trans individuals are targeted all the time. Black trans women are being killed in this country and around the world in numbers that totally outweigh how many trans Black and brown individuals there are. So, yeah, I get it. I totally get it. And these are the questions that people in tech don’t ask—particularly white people in tech—because it’s not your lived experience. You don’t walk around everyday worried about if you’re gonna get home, if this cop…

Kate: I mean, I am a woman. I have some… I do…

Kim: Nope, no, no. I’ma stop you there. Stop you there. I’ma stop you there.

Kate: OK.

Kim: Because that is—yep—white women are not diversity. So, yes you have…

Kate: Oh, I didn’t mean diversity. I just meant I’ve worried about my safety before. [Chuckles]


Kim: Yes, exactly. But—and then, I’m gonna put a period on that and not say “but.” And the reason I challenged that is because white women are often put up in a role as if, if we cater to white women’s needs, then everybody else is taken care of, and that’s where white feminism comes in. Well, my gender is not the thing that makes me a target. It’s my race. And for some people, it’s their religion. And so we need to have more of an open conversation about these things. So, yeah, I’m not surprised at all that they—because at this point for me, the free speech debate is gone. I don’t see how you gonna put that genie back in the bottle.

So, at this point, and this is how I find it very privileged for people who are on Twitter—the people love it in tech, these are people who can actually go build their own Twitter—will say, “Oh, this is just so toxic. Everybody needs to leave.” Or even Facebook; “Everybody just needs to leave that.” No, I’m not—Facebook is where my friends and family are. Twitter is where my community is. What I’ve learned, as a Black woman on both of these platforms—so on Facebook, I have no connection to any of my family and friends. I had my mom and everybody to make sure they disconnected me.

Nope, you don’t have to know who I am. We all have different last names, so, no, you don’t get to target me through my family. And on Twitter, this is where I find my community—particularly Black women—where I feel safe. So I’m not goin’ anywhere. We’ve just learned to deal. We’ve learned to—because we’ve had to in many situations—we’ve learned to adapt. So don’t go playin’ around with the like button, Twitter—which you always just doin’ stuff that doesn’t even matter—stop screwin’ that up, because likes shows me—just like I found you—it helps me find the people I need to connect with and know where I’m safe. That’s what likes are for me, and retweets are for me.


Now for somebody else who doesn’t have this experience, they don’t understand that. What I need you to do is, not take down my damn Periscope video because it’s called “White men in tech ain’t shit,” but when you listen to it, it is talkin’ about the fact that they’re not usin’ their privilege to the best of their abilities, and then people get upset and they take that down. Whereas we have people in public who have blue checks who can say whatever the hell they want.

Kate: Yep. Yeah, everything that you said. I think that there’s a huge problem, a huge problem with not being able to leverage voices of anyone who’s in a marginalized community that is trying to speak out. And this is the “kill all white men” problem, this is the whatever problem. And I think that your points are extremely… are totally valid. And I think that this is part of the problem with tech in general—and the problem of content moderation—is that it’s hard to understand when things are being said in protest, versus when they’re actual threats, versus when they’re whatever. Everyone is losing something by not hearing, I guess. Ultimately, you could load a headline, “Kill all men,” and it’s polemic. You’re not literally calling to kill all men. It’s meant to draw attention to something as part of a greater argument, like, as you said, your video of “White men in tech ain’t shit,” or whatever.

And so the hard and fast rules that they have created just seem to miss really important moments of protest and conversation like this, where people are speaking back in really interesting ways and educating people like me or others about these issues and about where they’re coming from and about the lack of neutrality in the platform, and about kind of everything else. So I’m agreeing with you completely; I just think this—like everything you’re talking about—is part of the problem. I don’t think this was going to get better before the oversight board. And I hope—it was just a black box before the oversight board, and now it’s maybe a little less of a black box—and that’s the only thing I can hope for. That there’s now a little bit of a way in, a little chink in the armor that they’re opening up. But, you know…




Kim: And, you see, my thing is, fuck the black box. We need to remove the black box, because the black box is what’s harming the people that you’re saying you’re trying to protect. If I can’t see into the—this is the very reason I check my followers. First of all, I spent all day one day getting rid of bots and everything. And now I check my followers on Twitter for, if you have a private account, I don’t care why you have it—you might have it for your own protection or whatever—I will not allow you to follow me because I can’t see what you have. So there’s no transparency there.

And I know there are people who don’t follow me who, people tell me, screenshot something I said. I can’t control that. There’s certain things I can’t control, but what I can control are people who are inside a black box followin’ me, and I don’t know what you’re sayin’, so I don’t know if you’re makin’ threats, all that kind of thing. And so it’s the black box that I have a problem with—it’s the black box that causes harm to people like me, and I need to get rid of the black box.

So I commend you for the work that you did—and this again is not about you, this is directly to Facebook—I’m not impressed. I’m not impressed until I see some people who don’t have that level of whatever they considered expertise and their lived—see this is another thing—how is my degree more valid than my lived experience? And this is how we propagate white supremacy in this country and around the world. Because if I have a degree in something—you have a degree in yours, right? You’re automatically considered the expert. I’m tellin’ you, I’ve lived this, but then I have to bring you proof. I have to do all these other things. And that is a problem with—again, we’re not in an industrial economy anymore. This is a knowledge based economy. And so my knowledge of this experience should be seen to be as valuable if not more valuable. So…


Kate: I’m just gonna cut in really quick, ’cause I dislike my degrees. And I…

Kim: You said, you dislike? [Laughs]

Kate: Yes, I dislike that I had to get a degree…

Kim: Oh, you’re not the only one. Oh you’re not the only one. Oh my god, yes, I would not be taken seriously if I didn’t have mine. [Laughs]

Kate: Right. So I really deeply, this is probably not a great thing for me to say, but I had a miserable time—this will shock you—at Yale Law School. It was a very difficult environment that I did not feel was particularly hospitable to women and particularly hospitable to queer people, and so I had a very hard time. And I was a person who didn’t speak in the way that all of my cohort spoke, or my professors, and so I felt constantly judged and in conflict. So it looks like I have all of this, “Oh, she went to this fancy place.” The only thing that I can do is take whatever—and I think it’s invalid—whatever invalid cred [Kim Laughs] that degree gives me and try to go forward and do valid things with it, despite the degree.

Kim: Yes, I totally agree. Yes.

Kate: And so that’s how—in my mind, I’m like, “You all let me into your little cabal and I have this rubber stamp,” and great, now really fancy people at country clubs are impressed by me. But there is still—and this is something I am sure you agree with—they’re the people of power and whether I like it or not, reaching them is still important. And trying to convince them of some of these arguments is still super important.


Kim: OK, so that’s the whole reason for #CauseAScene. It is to train white people to have conversations with white people, ’cause they’re not gonna listen to me. [Laughs]

Kate: Right. Well, congratulations, I didn’t know I was doing your god’s work already. [Both laugh]

Kim: Yes! I mean, that’s it! I need white people to not just keep fuckin’ up in what they’re doing and saying, so I’m like, “OK, let me show you what to do. When you get point A, this is what you need to say. When you… this is what you need to say.” So if you follow the hashtag, or you follow me, you will see how I will quote tweet someone and the people in the community know how to respond.

Kate: Yep, yep.

Kim: It’s their job to respond. And I totally agree with you. Until last year, I was getting a Doctorate in Business Administration specializing in technology entrepreneurship. And then I was like, “Why am I still going into debt for people who couldn’t give a damn about my degree when I have people who barely have a high school diploma who I’m workin’ with? No, I’m done. I’m tired. I refuse to prove myself through this anymore.” [Laughs] So yes, I totally get that. In our last minutes I really want you to talk about some of the pushback about the article.

Kate: Oh sure. It’s actually perfectly related to what we’re talking about right now. [Laughs]

Kim: Yeah, so talk about that because you opened yourself up and you engaged. So…


Kate: Well, I feel like I promised to engage, so if I had ignored the women who were coming at me for that… you know, Twitter’s this… I don’t have any problem with kind of—I couldn’t really grasp what they were… So, just to give some context, their pushback was that I had written a New Yorker article that was about the takedown of the…

Kim: Lemme—I’ll read it. Hold on.

Kate: Yeah, sure.

Kim: So one person wrote—so they shared your article—and it says, “So when you wrote this New Yorker article on what a terrific job Facebook team did in moderating content, you were embedded with the Facebook team?”

Kate: And that wasn’t true. [Laughs]

Kim: Yeah, exactly. “Isn’t this a major conflict of interest readers should have known about?” And then someone else came… no, it was just her, I guess. Yeah. OK. Go ahead.

Kate: Yeah. No, no, no. There is somebody else later who basically was like, “Even if you weren’t directly embedded, the fact that you six weeks later got embedded with Facebook should have…” I mean the allegation was that…


Kim: OK, So it says “just to understand, you say you wrote the article before you finalized any plans to be embedded with Facebook. But surely that means that you wrote it and published it while you were in talks with Facebook about joining them.” Yeah.

Kate: So I wasn’t in talks yet, but I get that they can think that, but there were a couple of things that were annoying about that. The New Yorker article was really meant to be super descriptive of how Facebook worked to take down the Christchurch video and how hard it was to take down live video. Period. Full stop. That was just supposed to—from a tech perspective, a product perspective—show exactly what happened. And basically, in particular, show how these sites like 8chan worked actively to thwart all of the people that were trying to take down the video at Facebook and YouTube, by messing with hash technology and PhotoDNA technology.

And so that was supposed to be the thing. I don’t think—this is a big pushback, ’cause I’m like, “This wasn’t a pro-Facebook article. This was a ‘nobody wants terrorists to be able to leverage live video to do propaganda on these major platforms’ article. And so it’s in everyone’s best interest to take these types of things down. And when they finally were able to get the video under control, that’s a good thing for humanity.” And so that was my takeaway. And so these two women both viewed it as a Facebook-favorable post, which they’re welcome to do. But the other thing was that it was not in any way directly related. It was totally separate.


Kim: Yeah. ‘Cause I read the article based on that and I can tell you when I read the article—and I will be including the article in your show notes—it was a tour of how they handled an incident. That’s what I…

Kate: Thank you. Yeah, that’s exactly how I felt about it. I was just trying to tell people what happened. Or how it worked. It wasn’t supposed to be favorable.

Kim: Yes exactly. Just how the process worked.

Kate: Yeah, it was really meant to be… I—and in fact, one of the things that I’m… people can’t know this because the piece hasn’t come out, but the Yale Law Journal piece is—literally a third of it is critique of everything that Facebook has done. So this is one of the reasons that I’m talking about this and I responded to their critiques on Twitter was because I felt like one of the things that I promised was that I would respond.

And that gave it oxygen, right? And then people saw it and I got pushback and blah, blah, blah. But I could have just ignored it and nothing would have happened. But I felt like it was important to be like, “No, I didn’t do this. There wasn’t anything that happened like that,” but then there just wasn’t enough space for it. And I thought it was actually a pretty weird allegation that made it sound so much more nefarious than it was. I was like, “Wow, you’re making this sound like there was some quid pro quo that just didn’t even exist.” So that was just, that was all.


Kim: But I’m glad it came out, just because other people may have had the same thought and didn’t address it or bring it up. I love when I get called out or challenged, ’cause people say, like, “Oh, you’re doin’ this for attention.” [Kate Laughs] Why the hell would a Black woman do this work for attention? I don’t know why you think this is funsies for me. [Kim laughs]

Kate: [Laughs] I was just gonna say—it was a long day. That day was a really long day. I got some weird DMs. Yours was one of the first, and the nicest honestly. So, yup.

Kim: And I do this every day, though. I do this every single day. So why would you think that—’cause I tell people all the time: I am educating the oppressor while I’m oppressed. Do you know what a mind fuck that is for me sometimes when I have to think about, “OK, hold on. Let’s stop. ‘Cause this person has said summin’ really stupid, and that triggered somethin’ in me as a Black woman, so how do I process that and still move forward?” Sometimes I just go the fuck off. That’s just it. And sometimes I’m able to pull back because that is just the nature of the work that we’re doing.

This work—and this is the one thing I wanna say before I ask you for your final questions—people need to understand that wading into this work, if your content—I feel sorry for content moderators. Oh, my lord, I feel… I could not willingly get paid for PTSD like that. I just could not do it. But people who are actively fighting globally on behalf of inclusion and diversity. I tell people I’m not an inclusion and diversity specialist, I’m a business strategist, but we can’t get past this! [Chuckles] Every business I go to, when I’m talking about the information economy, a knowledge based economy, you will never be successful if you can’t figure out the diversity part, which is the recruitment, and the inclusion part, which is the retention.


And so we’re not doing this for funsies. We’re doing this because—and also, I don’t speak for other marginalized groups—I speak on behalf of them. Everybody has a voice, but I’ve set my life up so that I don’t get retaliated—if I get retaliated against, you’re not takin’ my money. I don’t work for you. This is the very reason I have community sponsors for the podcast and not corporate sponsors, ’cause you cannot tell me I can’t do that episode or whatever. We don’t have that. My community sponsors me. I have members of the community who once a month give me $100. It’s like a PBS membership.

Kate: That’s great.

Kim: Yeah, yeah, they sponsor the podcast.

Kate: I mean, that’s similar to how I try to do these grants. I also crowdsourced the grants, so it couldn’t even be one source—they’re from very different types of organizations—but yeah, the same type of thing.

Kim: Yeah. You have to have… and so this work is not—I can tell you, it’s kind of weird, it seems oxymoronic—I wouldn’t want to be doing anything else. I love this. I love to have the voice that I have. It is so freakin’ freeing. Yet it comes with a cost of being a person from a marginalized group. It comes with a huge cost. And that’s why intention without strategy is chaos.


I learned very early on—and I see it in my peers—if you don’t have a strategy for doing this work, you will burn out, and worse: you will be depressed and suicidal and all these other things that I can’t afford to be. And so the fact that you took it from an academic perspective, got your grants, knew where you were going, had a strategy for goin’ in and what you’re coming out of, that helps us. It seems to other people that we’re distant, and we’re doin’ this for clout or whatever. No, you have to do this work this way. [Laughs]

Kate: Yeah, It makes me feel really good to hear you say that, ’cause everything you just said resonates. And one of the reasons I wanted to talk to you and that I did—I just don’t think that you solve any of these miscommunications or these… there’s so much allegation of bias that goes around tech just in terms of who’s in whose pocket and everything else, and I feel like unless you sit down and talk to people and realize the things that they’re really trying to do—and I really appreciate your questions, which give me a chance to explain who I am and what I’m trying to do. So, I… no. But everything you’re saying is correct. And I also think this is a great purpose for a podcast and just, really, really fun.

Kim: Well, thank you. And your final words. What would you like to share? What would you like to leave the audience with?


Kate: Oh, I just think that—we’ve talked about a whole bunch of stuff today—but I think that it would be that I’m really interested personally as a researcher, but then I hope Facebook will be open to continuing to hear really loudly the marginalized voices. That as they build out the oversight board—and from every walk of life, not just the United States, but from all over the place. And I hope that there’s—you’re so well-named, #CauseAScene—but you kind of have to sometimes to get them to listen. I just hope more people do that and I hope that I can help make sure that those voices get heard with the privilege that I have of having an inside perspective there.

Kim: And that’s one of the reasons one of the guiding principles is “Lack of inclusion is a risk management issue,” because people’ll talk about, “Oh these are things that are nice to…” you know, “People should wanna…” No, no, uh-uh. These are businesses. So we need to talk about risk management. And if you’re not minimizing harm, you’re complicit in harming me and I have a problem with that.

Thank you so much, Kate, for coming on the show. It has been a pleasure. I learned a lot, and I really wish you success and keep me posted on when the article comes out and any other feedback or updates that you may receive. So I would like to stay looped into what is going on with this governing board.

Kate: Yeah, totally. And I would love to keep the conversation going. This has been brilliant, so thank you.

Kim: All right, Thank you. Have a great day.

Kate: Bye.



All music for the #causeascene podcast is composed and produced by Chaos, Chao Pack. Listen on SoundCloud.
