Nandini Jammi

Podcast Description

“What I’ve realized is that after three years we have made absolutely no dent on this issue at all. What’s happened instead is that these companies are relying on our free labor to flag this up for them and they’re not doing anything.”

Nandini Jammi is co-founder at Sleeping Giants, the campaign to make hate and bigotry unprofitable. She has been behind some of the most high-profile social media campaigns in the U.S. since 2016.

Sleeping Giants began as an anonymous effort to alert advertisers that their brands were appearing on the propaganda site Breitbart News.

Since then, she has led campaigns that have convinced advertisers to flee Fox News’ The O’Reilly Factor, Tucker Carlson Tonight and The Ingraham Angle. The campaign is also behind the effort to deplatform Alex Jones’ Infowars as well as influential white nationalist figures.

In 2019, she made international headlines for successfully calling on PayPal to drop the KKK as a customer.

Through her work, Sleeping Giants has ushered in a new era of brand safety and corporate accountability. 

As a marketing consultant-turned-brand safety advocate, Nandini is now working to help marketing and tech audiences safely navigate this volatile and unpredictable new world of consumer activism.

She has been named by Business Insider as one of 23 industry leaders “fixing” digital advertising, recognized as a Digiday Changemaker in 2017, and named one of 30 women shaping the B2B marketing space.

Sleeping Giants is a Cannes Gold Lion 🥇 and Webby Award-winning campaign.

Transcription

00:30 

Kim Crayton: Hello everyone, and welcome to today’s episode of the #CauseAScene podcast. I grabbed. I snatched. I commandeered. This guest popped up in my feed, went immediately to DMs. We were like, “You know what? I need to start 2020 with this conversation.” So everybody welcome Nandini Jammi. You said I could call you Dini?

Nandini Jammi: Yes, you can call me Dini. Hi.

Kim: Dini, hi. Introduce yourself to everyone.

Nandini: OK. Well, I’m Nandini. You can call me Dini. I’m cofounder of Sleeping Giants, the campaign dedicated to making bigotry and hate speech unprofitable online and in the media.

Kim: Yes. And it’s so funny because I did not know that you were a part of that. There’s the face of it, the gentleman Mark or Mike or…

Nandini: Matt.

01:23

Kim: Matt. Yes, and I didn’t know it was anybody else, so we definitely want to get into that because there’s a person in the #CauseAScene community right now who has realized that that work is what he’s really good at. And he is like de-platforming so many racist, bigot-y people. So I really want to talk about that because people need to understand that this work needs a lot of people doing a lot of different work. And it doesn’t have to look the same; we’re just going in the same direction. So, we always start with the same two questions. Why is it important to cause a scene? And how are you causing a scene?

Nandini: Why is it important to cause a scene? I think that it’s important to cause a scene because once you find something that is wrong in the world or in your work or your industry, it’s your responsibility to fix it. It’s just as simple as that. It’s see something, say something. And in my case, I saw something and I realized that I was in a unique position to be able to fix it—or to start fixing it. So I just did what I could do. And I’ve been doing that ever since.

And I will add that I have been working in tech for my entire career. I’ve been a product marketer, a content marketer. I was a head of growth for a while. And it was actually at my last job at a product management software company, a startup, where I was given a £3000 budget to—it was a British company, so £3000 budget—to run ads for the first time. And I had never done it. The company had never done it. I didn’t have anyone to teach me anything about ads, so I just went into Google AdWords.

03:10

And I’m a copywriter by trade, so I wrote my own ad copy, I wrote my own landing pages and then I created an audience and all that stuff, and then I turned on these ads. And when I looked to see what the results were, when I looked to see where my ads were being placed, I was—3000 seemed to be a lot of money for me back then—so I was very careful about where this money was going, and how it was being spent, and where my ads were appearing. And I saw that it was appearing on some weird sites that normal people don’t visit. Websites that are just not—it’s just like, why would any human being visit that website? And so I had an inkling that something was up with Google Ads.

But I was a total novice. I didn’t know this world at all. I only knew what other people were talking about online and what people were sharing on Twitter, and just the guides and stuff that I read online. So I didn’t say anything. I saw something, but I didn’t say anything.

But a couple months later, the elections happened, and after the elections, I decided to visit this website that I had been hearing so much about, Breitbart.com. And I immediately saw ads for Old Navy and Banana Republic and Target and stuff. They must be targeting me and my shopping, right? And I immediately was like, [chuckles] “Those guys don’t know they’re on there. They’ve turned on their Google ads and their Facebook ads just like I did, and they’re not watching to see where those ads went.” So that was an aha moment for me.

05:00

And that was when I was like—I remember actually having this moment where I was sitting exactly where I am right now, a couple years ago—and I was like, “Oh, my god, I think I stumbled upon something really big.” And do you know what I did? I wrote a Medium post. [Kim laughs] I wrote a Medium post. That was the only thing I knew how to do.

Kim: Oh shit. [laughs]

Nandini: So I wrote this post and it was called, “PPC Marketers”—I forgot what it was titled—but something like, “Don’t Ask for Permission, Just Blacklist Breitbart Now.” And I just wrote it all out. I was like, “Guys, I think I figured it out. All we have to do is go into our backend and blacklist these ads. And it’s important because the guy who runs this website is about to become a strategic advisor, a senior advisor in the White House, Steve Bannon.”

And I was like, “This is the only way we can stick it to them. We have all the power. All we have to do is blacklist it. All of us, together. And don’t ask for permission. Don’t wait to get permission. Why bother? No one’s going to get mad at you for blacklisting this racist website. Just do it.” [Laughs] So I was trying to start a little insurrection.

06:15

Kim: And no one was—and as you said—no one was gonna check anyway. So if you did it, no one was gonna know. Yeah.

Nandini: Yeah. Exactly. [Laughs] So I, like, just sat there and waited for it to go viral. [Laughs] I had a couple influential friends push it out. And I was like, “All right, you guys, [laughs] let’s do this.” And it did not go viral, like 20 people maybe saw it. One of them was Matt. Matt reached out to me. At the time, it was just a Twitter account called Sleeping Giants. He was like, “Oh, cool article. You should come join us. We’re doing the same things.”

So this guy had already been—and he’s on the other side of the country. Well, I was living in Germany back then, but, yeah, he was on the other side of the country. He had done the same thing: he had gone on this website, he’d seen this issue, and he’d already started taking these screenshots and notifying advertisers, but not under his own name, because he was worried for his safety, of course.

So, yeah, and this was nuts, it happened all within the span of… actually a day. It was less than a day. I published it and he found me the next day; he had started it the week before.

07:26

Kim: So he only started his the week before?

Nandini: Yeah.

Kim: So it was perfect timing. That synergy of right place, right time. Definitely.

Nandini: Yeah. Yeah, it was amazing. The timing was crazy. And turns out he’s a copywriter as well. I’m a copywriter. You know, we had, I think, the same vision. Well, we didn’t have a vision back then. [Laughs] At the time, it was just like, “Let’s just work on this every day.” I think we both had the appetite to take on this workload. But it was also like—this thing moved so quickly. I mean, so we met at the end of November, and about two weeks later we were making our first headlines when we got our first big win; it was Kellogg’s.

Kellogg’s came out and said, “No, we did not intend to advertise on Breitbart. We will be blacklisting it. Plus, we’re gonna go back and audit all of our ads.” So then Breitbart got [laughs] really mad, and they launched a counter boycott called #BoycottKelloggs. They started—they made all these crazy pop-ups. They’re like, “Sign the petition. Kellogg’s doesn’t care about Breitbart readers,” or, I don’t know, something nuts. And Breitbart was still powerful back then, and anything they did would garner press attention. So outlets started writing about them, and, you know, we were part of that story. So that’s how things started to snowball.

08:54

Kim: Wow! And see that speaks to—you and I were talking before we came online—it’s about finding that thing that… there’s so much out here that we could be doing to stop normalizing hate, and bigotry, and oppression, and racism and all these things that everybody… The reason I do #CauseAScene is ’cause I set my life up to say what the fuck I want and not have—you can’t take a job away from me. I still have consulting clients. So I can say things that people who are afraid they might lose their job can’t.

I’m not—please don’t go do what I do if your life, if your finances depend on it. Don’t go do what I do, because I recognize that I have sacrificed a lot of income to do this. [Laughs] I mean, this is cobbling together a business where I could have more easily kept my mouth shut or—well, that wasn’t gonna happen—but kept my voice a little more quiet and gone into doing business strategy with companies. I could have done that and be consulting and be making way shit more money than I am. But I took this hit—no, I’m not even gonna call it a hit—I took this path because I knew that other people couldn’t.

So I took this path so that other people wouldn’t have to do this; that they could stand up in whatever way they need to stand up to advocate for themselves and people who are more vulnerable than themselves. And yet it’s like everybody can do it. Like you just explained, you saw something, you were like, “Oh shit, this is wrong.” You created something you thought was gonna go viral. It didn’t, but it connected you to who you needed to connect with to make something go viral. [Laughs]

10:41

Nandini: Yeah. Perfectly said. Perfectly said. And you’re right. I think it’s very brave what you’ve done as well, going out on your own. The same goes for me. At the time, I was working for a company full time. But I left after about a year, and I’ve been freelancing on my own since then. And it’s been… [laughs] it’s been pretty funny because almost everyone knows who we are in tech. And I think that it’s been intriguing for them to talk to me.

But when it comes to—I did for a while, actually, earlier this year I was looking into potentially getting back into full-time work—and I think they were interested in talking to me, but I think they were also a little bit freaked out about hiring me. So the same things that have made me good at what I do are also a little bit of a liability. And I realized that I am a liability for a lot of companies because of the fact that I’m outspoken on social media. But that’s OK, because there’s things that I’m good at that are monetizable. And of course, people are definitely still interested in working with me on a consulting basis.

Kim: Exactly. And the fact that we have this privilege enables us to—’cause I’m able to now go into companies ’cause I only talk to business leaders. And I don’t—the Kellogg’s and all, that’s too big for me to grasp my hands on—there’s so many small- to medium-sized companies in tech that are havin’ a huge impact on our community, and I can directly get to the leadership who is controlling and guiding the culture. If I can get my hands on them and have these conversations, I can move—we can scale this. And I’m getting more and more of that.

12:44

And so for me, first of all, it’s funny how people think—like you just said, when you started, it was no strategy—when I started this, there wasn’t a strategy; I was just pissed. I was just like, “You know what? I’m so pissed off. People just causing harm, don’t care.” My friend and I were havin’ a conversation, I was just, “I just wanna be disruptive in 2018,” and we’re like, “Yeah, cause a scene.” And I was like, “Oh my god, #CauseAScene.” That’s how that started.

Nandini: Love it.

Kim: But since then, because I’m a business strategist, I have a strategy now. I know exactly what I’m doing. To do this work without a strategy is—as one of the #CauseAScene core Guiding Principles says: intention without strategy is chaos, which causes harm. I love how people—and I wonder if you have the same thing—who don’t understand that I’m a business strategist, so what I’m doing—Twitter is just a place for me to amplify the message, educate people who are trying to figure this out, but that’s not where my business is.

And I get these individuals who—and I want to go back to the fact that you say you’ve been in tech for so long—because there are people who’ve been in tech for so long who don’t have that social justice—as people call it social justice, I don’t even consider it… it just makes common fuckin’ sense to me; it’s not [laughs] even social justice, it’s how we should be doing business in a knowledge economy—who have these huge platforms, and then take to having these opinions and they fuck it all up.

14:16

And I really appreciate the fact that you said that you’ve been in tech your whole career, because people want to make it seem like if you’re not coding, you’re not of value. And your voice isn’t heard. But the stuff that we’re doing impacts the code, impacts how the products and services are used and who they’re used by and all this other stuff. So I know I’ve said a lot, but it’s just interestin’ to me how people just assume—’cause I’ll get all the time when I say something, I’ll get these people like, “Is she even in tech?” Who the fuck? What? What? [Laughs] If all you can do is code, are you gonna be in tech for long? You know, it’s like… [laughs]

Nandini: No, those are all—that all rings true to me. I have been lucky in my career so far to have worked with very conscientious startup folx, so I feel pretty lucky on my end that they’ve always been really supportive of me and they’ve understood my value. So again, I started out just writing, as a content marketer, and then I started to move into more strategic roles. I was also lucky, by the way, to have started working with a customer support help desk, and because I was the content marketer, my job was to learn about customer support and then write about it.

Kim: If you’re a developer and you have not spent time at the customer help desk, you don’t know what your product is.

15:47

Nandini: 100%. Yeah. So I think that was the most valuable place to start my education, both in tech and in customer support. And then my next job was selling product management software, and a lot of the work was learning about how product management works so I could write about it and develop content that people find through SEO [Search Engine Optimization]. And a lot of people actually mistook me for a product manager. I could never actually do the job, but I can write about it, I could talk about it.

So both of those things were really valuable for me when I started to work on Sleeping Giants, because I understood how things work within a company, how things flow from the customer support side and the social media crisis side up to the C-level and the product management folx. So I was able to use that insight to inform our little strategies.

Kim: Yeah, and for me, I tell people all the time I would not be in this position if I had not—I hated being a high school teacher, I hated it—but if I did not have that experience of being an educator and developing classroom management skills, I wouldn’t be able to do this—I have such a varied, eclectic background that uniquely fits this space. I consider us unicorns. It’s the folx like us, with these varied backgrounds, who are able to put together this puzzle that no one else understands. Because, like my family, they’re like, “What the fuck are you doing?” [Both laugh] They don’t get it. [Laughs] They don’t get it, but they’re like, “Hell, you seem happy, so keep doin’ it,” kind of thing.

17:36

And this also goes back to when we talk about inclusion and diversity. People want to say I’m an inclusion and diversity specialist. I am not; I am a business strategist. We just can’t get anywhere because there is no inclusion and diversity in the space, so we have to deal with the kindergarten stuff first. But this speaks to that. This is why it’s important to have diversity at the table. Because you bring in all these different perspectives from people who can help you make informed decisions about the things that we’re doing. If everybody is comin’ from the same space, there’s so many blind spots. There’s so many things that we don’t see, that we don’t even know that we don’t even know. [Laughs]

Nandini: Oh my god, yeah, 100%. That’s the reason that I go after so many little things—seemingly little things—and I just… I hit it. I hit it for months.

Kim: Give me an example of what you’re talkin’ about.

Nandini: So this summer, I was writing a talk for Turing Fest—which is in Scotland—and it was about Sleeping Giants, of course. And one of the things I wanted to highlight was the way that payment processors are enabling hate groups and hate speech. And I was like, “You know, I’m gonna look for a fresh example,” so I Googled the KKK. [Kim laughs] And, I don’t know, it was like KKK dot com, I don’t remember the URL anymore. But it was the official KKK website, and they were using PayPal to collect donations.

19:01

Kim: [Gasps] But yet PayPal can’t figure out a way to enable trans individuals not to use their dead names anymore.

Nandini: Oh, don’t even get me started. Yeah. So here’s the thing: I have a history with PayPal. I’ve been talking to them for months because I had been collecting various groups and asking them to take a look and review it. Now, for the longest time—or no, actually, back in 2017 after the Charlottesville rally, PayPal released a statement about how they were remaining vigilant in the aftermath of Charlottesville. They said, “We just want to say that we’re taking this really seriously, and we will not be… we’re remaining vigilant on intolerance, racial hatred, harassment,” and so on. And they specifically called out groups like, “We will not be working with groups like the KKK.” So two years later… [laughs]

Kim: [High-pitched voice] Wow, you’re working with groups like the KKK!

Nandini: Like the literal KKK. And so I think I’ve lost all perspective on this stuff at this point ’cause I’m just constantly tweeting stuff out. So I was like, “Oh, cool. PayPal’s working with the KKK.” And I just took a screenshot, and I just moved on, [laughs] ’cause there’s so many other groups. And a Canadian group picked it up, a Twitter account picked it up, and they asked their followers to write into PayPal, and a week later—and I had just given my talk, so I was really tired—the next day I have a BBC reporter in my DMs asking me for an interview about that.

21:06

And I was like, “Wow, is that newsworthy? [Laughs] Oh, I guess it is pretty newsworthy.” So it was massively embarrassing for PayPal. It was a news item that did take off. Fox News ended up writing about it. Side note: in their original article—you know how they have to put media on every blog?—they put a video of some previous segment about how tech companies are censoring conservative groups. So I called them out on that, too. [Kim laughs] So I guess the KKK is just a conservative group now. Apparently.

22:26

Kim: Is that what it is?

Nandini: So that happened—and by the way, Fox did update their media after that. I think they didn’t even think it through. You know, just wondering what leftist did this to them, and… wild. But, that turnaround was one week. OK?

Then there’s another guy that I’d been working on for a while: Stefan Molyneux; he’s a white nationalist, as well. And I had been flagging him up with the appropriate person at PayPal since April of 2019, and I had been—I was nice enough to include evidence and a little dossier of information about the kind of rhetoric that he used, what kind of business he’s built and the type of content he’s putting online.

And they hadn’t—and I was following up every month or so—and it was not until November that they banned him. And I think it was at the point where I had been—I try to be nice about it. I don’t want to put them on blast on social media for no reason. But Stefan Molyneux was clearly emboldened. He was starting to share images of people in the media with Jewish stars on all the Jewish people. And it was just getting really, really bad. It was getting really dangerous.

23:50

Kim: And this is what… [exasperated sigh] This is what these platforms don’t understand—I take that back; it’s not that they don’t understand—it doesn’t affect their business model; as I say, lack of inclusion is a risk management issue, and until it becomes a real risk management issue, they don’t deal with it. Because what happens is—and this is why I have hard lines, period: whiteness by design is racist and cannot be trusted by default, without consistent antiracist behavior. And I treat whiteness as a construct, the same way I treat Blackness, since we’re talkin’ about a group; I’m not talkin’ about individual white people, I’m talking about the system of white supremacy.

And I do that because if I give—again classroom management—if I give that disruptive student one out, my whole fuckin’ classroom is a mess. They will not only disrupt the classroom, but those individuals who are relying on me to keep them safe no longer feel safe. And I have to prioritize them. So, no, I am not going to give whoever the asshole of the day is the opportunity to vent, to do whatever they want to because, oh, you just felt like it that day. No, no, no, no, no. Got a hard line. Because your comfort is not important to me.

And so this is what we get into with the Twitters and the Facebooks and, “Oh, all speech is equal.” No. Can we stop talkin’ about fuckin’ equality? It is not about equality, it is about equity. And it’s about elevating the people’s voices who have not had a voice. Whiteness has always had a voice, so it is not about equal. It is about—for every one voice of whiteness I need 10 to 20 to 50 voices of people of color, people in marginalized communities, people with disabilities for that to even scale out to be something of equal value. And it still isn’t. [Laughs]

25:53

Nandini: And the fact that that message hasn’t really resonated with tech leaders yet—they haven’t been able to figure out how to bring diverse voices into their companies—is playing out now with the work that I do. Because when I send them a guy like Stefan and I say, “Please review this,” I can’t speak for what’s going on internally, but I will make the assumption that they don’t have people of color or marginalized people at the table making those decisions. What I think happened, because all I have is two data points—well, I have more than that, actually; I’ve been working on PayPal for a while—is that the KKK was gone within the week and Stefan Molyneux was a seven-month investigation. And why is that? I know for a fact that in their Charlottesville statement they specifically called out the KKK. But what do they do with a guy who calls himself a philosopher and couches all of his racist ideology in euphemism?

Kim: And debate?

Nandini: And debate. Are we infringing on this man’s freedom of speech?

Kim: Yes, they have no fuckin’ clue how to handle—and this is how it’s slipping; that’s that inch that they take and they turn it into a yard. ’Cause the KKK has a definite hood thing. And this is one thing that I often do on my timeline and people see it: I will tweet something and say, “This is what white supremacy looks like,” because it’s not all swastikas. It’s not all white hoods. It is all these little systems and things that are in place that make people think, “Oh, it’s the failure of the individual,” when in fact, no, it’s a failure of a system.

27:45

Nandini: Exactly. Yeah.

Kim: And you’re right. If no one’s at the table challenging that for them, they don’t see it. Then it becomes, “Oh, I need all these 3000 data points to prove,” when you and I can walk in and say, “Oh, shit, that’s hate speech right there.” [Laughs]

Nandini: Yeah, it’s infuriating. And three years ago, when I first became aware of this problem, I had a little patience. I wouldn’t say that I was angry at that time. But now we’ve been working on this for three years, and as we mentioned before, this does come at a personal cost, it comes at a financial cost. When I started my freelance business, I was running two things. But at this point, you know, as of right now, just a couple months ago, I publicly announced that I’m not copywriting anymore for clients so I could work on this issue full time. Because I’ve realized that after three years, we have made absolutely no dent on this issue. At all. And what’s happening instead is that these companies are relying on our free labor to flag this up for them, and they’re not doing anything.

28:50

Kim: Yes. And do nothing else. And do nothing else. They’re like, “Ah, check that box.” But it’s the same thing as, “Oh, we brought in Black and disabled individuals. They just don’t stay. Huh. I don’t know why.” Because you didn’t do shit to change the culture. You did absolutely nothing. You expected them to come in and be… nothing happens—OK, so this is what I tell people. We are no longer in an industrial age. We keep acting like we’re making widgets, where you brought someone in, it didn’t matter what damn background they had, because you’re gonna give them a manual. They’re going to read the manual. Because every widget needs to look like everybody else’s widget because they need to fit in these certain things.

We are in an information knowledge economy where you need to get out of my head what I know. And that is what helps you differentiate, and innovate, and be competitive in the 21st century. I’m not creating widgets, and if I don’t feel safe, I’ma sit there. I’m gonna say nothing; this building could be burning the fuck down and I’m not going to say a word because you didn’t create safety for me. And then you want to say, “Well, they left.” Exactly. They knew the building was about to burn. And they left. And then you wake up one, two weeks later, they’re gone; it’s like, “Why is the company not running?” “Oh, Sharon left.” Did you not realize that Sharon left? [laughs]

Nandini: No. I think that’s a really good metaphor. We’ve all seen it over the past year or so, right? Especially with these employees that are speaking up at GitHub and Chef and…

Kim: Google.

30:19

Nandini: Google, Wayfair. I mean, these are really, really amazing examples. And I think what’s really necessary now is that we create a set or some kind of a standard for these employees or some kind of a way to give them the leverage to go to their management and say, “This is not good for our business. It’s not good for our business to be advertising on Tucker Carlson or the Daily Caller. It’s not good for our business to end up on racist websites.” Even if it doesn’t really fit your…

Kim: And see, the thing is—again, going back to that metaphor or the analogy of the industrial age—that shit worked. See, this is also where the mindset hasn’t changed. That shit worked back when the local hardware store only had to cater to people in the community. You could all be KKK. You could all be whatever because no one gave a shit. That was the corner hardware store. We are creating products and services that have a global impact. We can no longer do that because the racist global hardware store owner is causing a problem in Central America, is causing problems in… [chuckles] you know, we can’t do that anymore.

31:38

Nandini: You know, it’s so funny you say that because right before I got on this call with you—I’ve been tweeting furiously at Stripe, the payment processor, for almost a year. And one thing that just really, really pisses me off is Stripe holds actual real-life events. And they have a code of conduct at these events. They say you’re not allowed to harass, bully, intimidate people, not just in general, but of course, on the basis of their race, their religion, identity, whatever.

And so today I was asking Stripe, I said, “OK. So what about your customer the neo-Nazi group, The Third Path? Which is an active Nazi group in Germany. Are they allowed to come to your event? Or are they not allowed to come to your event but they’re only allowed to be your customer as long as they’re 5000 miles away? So they’re allowed to make the community unsafe around where they live, but you’re not gonna let them into your doors, under your roof.”

Kim: And you’re askin’ those questions that is stumping the hell out of ’em ’cause they’ve never had anyone asking those questions of them before.

Nandini: Yeah. ’Cause they’re willing to outsource that externality, right? [Frustrated laugh] They’re willing to let anyone—nobody else matters.

32:59

Kim: Yes. Yes! It’s the same thing you see with Facebook and their content management. Those people aren’t employees of Facebook. But the PTSD and the trauma these individuals are experiencing havin’ ta look at that shit every day, all day long, is detrimental to their mental health, their family’s health, their community’s health. But yeah, we’re gonna outsource that out to somebody. And so they don’t get the benefits, they don’t get any of the stuff that Facebook regular employees get.

It’s strategic risk management. That’s what it is. So what you just mentioned with Stripe and what I just said, they’ve thought about this shit. Because if they hadn’t, these would be employees. If they hadn’t, these would be individuals they would not put on their platforms. But they’ve strategically said, “If we keep this in-house, we are personally responsible for it. If we outsource it, we’re not directly responsible.” And what we need to expect them to understand is, yes, you are.

Nandini: I mean, I think it’s just a matter of shining some light on it, actually.

Kim: Mhm. I don’t think many in the community understand how this shit is done, how they’re passin’ it off to other people.

34:10

Nandini: No. I mean, I think Facebook employees are now starting to become aware. I think Stripe employees have no idea. Stripe, I think, invests quite a lot in their diversity and inclusion initiatives. And I think they’re doing what they can—you know, I don’t know how well it’s going—to make their company feel like an inviting, welcoming place for marginalized people. I don’t think those people are aware.

And again, fine, if you want to talk about free speech or even terms of service, acceptable use policies; Stripe’s acceptable use policy is one of the most ridiculous things I’ve seen. If you wanna use Stripe and you’re a psychic, you can’t. You can’t use Stripe if you’re a psychic; they don’t work with psychic services. That’s a specific thing in their terms of service. But if you are an anti-Semite and you have a podcast to be anti-Semitic, to distribute hate speech, you can do it. Because Stripe’s acceptable use policy only kicks in when you start calling for violence.

Kim: So what’s with the psychics? [Laughs]

Nandini: So I think the psychics comes from charge-backs, like high charge-back rates. [Laughs]

35:31

Kim: Ah, gotcha, gotcha, yeah. “You told me some shit. That shit wasn’t true, so…” [laughs]

Nandini: Yeah, exactly. “I want a refund, please.”

Kim: OK, so for them, they could talk about violence, they can… but as long as—OK, hold on. So they could talk about violence, but unless they, being the customer of Stripe, actively participate in violence—not the fact that they incite violence.

Nandini: Inciting violence is the line.

Kim: Is the problem.

Nandini: So you can say all sorts of horrible things and just stop short of saying, “But, you should be killed.”

Kim: But the people who are listening to all of that up until that line are actively goin’ out and committing violence.

Nandini: Yeah. All you have to do is put a little disclaimer that says, “We do not condone violence.” I’m pretty sure the KKK has that, too, “We do not condone violence.”

36:26

Kim: Yeah, yeah. Wow.

Nandini: And that’s all you gotta do. Or satire; you know, “It was just a joke. We didn’t mean it.”

Kim: Oh, we hear that a lot lately, don’t we? Everything’s so fuckin’ funny. [Laughs]

Nandini: Yeah, the oldest trick in the book. And I think that’s so simple to you and me, to anyone with any common sense, but for tech companies, because they’re looking for some kind of a global system—there was an article on this, I think in “The Verge,” that talked about Facebook’s attempt to create a global set of standards on what hate speech is, on what harassment is, on what is allowed to be said and not said—that it’s actively hindering their efforts to act quickly on actual hate speech, because they want to apply that one standard. You can’t apply that same standard from the United States.

Kim: Globally.

Nandini: Exactly.

37:20

Kim: Yes. And that speaks to this new committee, this new board, that they’re creating as well, that they’re tryin’ to find this perfect equation of having marginalized people. “But who do we get? Da-da-da-da. How many do we have?” [Laughs] I’m just like, “Oh my god.” This comes from a place of—and people think I’m being conspiratorial—but this all comes back to white supremacy, because white supremacy is never investigated, never evaluated. So it is a one plus one equals two equation. For us, we know that one plus one, depending on who the fuck is saying it, could be anything because there is no black and white. This is not binary. [Laughs]

Nandini: Yeah, exactly. And when it comes to misinformation, I think it’s even more dangerous. And that’s because misinformation is designed to make people angry—or disinformation, I should say; misinformation is not always on purpose—but disinformation is designed to make people feel angry, and to feel emotional. And the people who bear the brunt of this, the target—or not the target but the source of this disinformation—is usually about marginalized people.

[Interlude]

40:27

Nandini: …the source of this disinformation is usually about marginalized people. And when you see the kind of headlines that they put on Breitbart, it was designed to harm us.

Kim: Yeah. Exactly. Oh, you’re just getting to it. So it’s like Breitbart has a strategy, they know exactly… why can’t the fuck these other people figure out—and this is one of the reasons why in 2020 I have my team working on building #CauseAScene Alliance, which is the antiracist tech agenda. Because we need to get offline. We need to have a safe space where we can come together, have conversations without being interrupted by assholes, and create strategy, because this other side has had years.

This is why, when you have these hate—just like you said, they go to the tried and true: “I was just joking,” “Not all white people.” This is where, no matter where they are in the world, they say the same shit. It’s like they have the same book that they’re readin’ out of. We need the same thing. We’re behind them. We’re behind the curve, but it’s going to require us to fundamentally get in a space where we’re having these conversations, being willing to get uncomfortable, being willing to prioritize the most vulnerable, and listen to their lived experiences, and create strategies that counter this stuff.

41:53

Right now, everybody’s like just goin’ off. It’s like buckshot. It’s just goin’ everywhere. We need a lasered approach. And then for me, I believe that’s when you can hit Stripe. Because it’s just you and Matt. If we had a group of 1000 people sendin’ the same shit to Stripe and saying, “Hey, if you don’t change this within 24 hours or whatever, we’re makin’ this public.” We need a community effort.

Because again, right now, they’re doing the risk management equation on “Hmm, based on legal, if something happens, can we make a payout with that?” I mean, they’re doin’ a whole bunch of equations about this. They’re not innocent in this. So it’s like, “If something happens do we have a risk management plan in place? OK, so what is step one? OK. And we make an apology. OK, now, if it escalates to this, this is how much money we’re willin’ to pay for this, and da-da-da-da.” And then they walk away. And it happens over and over—we see this with Facebook, every single time. They do something, he says something. It’s bullshit. They walk away.

Nandini: Yeah, exactly.

Kim: They pay some money, they walk away. [Chuckles]

43:06

Nandini: Yeah. It’s been tough to find that leverage because with PayPal, I had this public statement that they made saying, “We’re actively working to stamp this stuff out.” So I could just point to that statement and say, “OK. Do something right away please.” With Stripe, they don’t have that. And they haven’t been put in a place where they’re pressured to change their acceptable use policy.

Kim: And you know what? And that’s fine for me, because with the community and with leverage, you know what? You can continue to house them, but you’re gonna make a public statement that you’re OK with that. See, that’s how they’re jumpin’ the line. They get to hide behind these things and not make anything public. So they really don’t have to—you can have whatever customers you want, this is your business; but as another customer, I can make a decision that I don’t wanna work with you because of the people you’re workin’ with. And this is where the information asymmetry comes in because we as customers don’t realize that that’s their customer. And I would rather not give my percentage of whatever to you for supporting that.

Nandini: Same. I mean, [laughs] I moved to checks and bank transfers.

Kim: Yeah. Wow. Wow.

44:20

Nandini: I mean, I can’t even… the depth of this issue is unbelievable. If you take the two examples we’ve been talking about, Facebook and Stripe, did you know that Stripe is Facebook’s payment processor? It’s integrated into all their payments.

So, I think it was earlier this year that someone tipped us off and said that the United Constitutional Patriots, the vigilante group, had been kicked off of PayPal. They had been kicked off of Cash App and they had started to fundraise on Facebook using Facebook fundraisers. [Laughs] They said that they were going camping and they needed some money for supplies. So I did the same thing that I tend to do: as a first attempt, I reached out privately to the appropriate person at Facebook and emailed her and asked her to look into this. She got back to me super quickly and said, “I’m looking into it, thank you so much.”

And within like an hour or two the fundraiser was taken down. By the way, when the tipster had gone through regular channels, it hadn’t been taken down.

Kim: Exactly. Exactly.

Nandini: Because it was overridden. But what happened next was, a couple hours later that evening the same tipster got back to us and said, “Yeah, the guy has started a new fundraiser on the page. He’s also started a fundraiser on his personal Facebook profile.” So whatever she did earlier that day didn’t fix the problem.

46:00

Kim: No, exactly. That’s the whole thing. They don’t fix the problem. They pull the plug on this one thing and just let everything else—and it’s, again, the risk management: “She’s complaining about this thing. She doesn’t know about this other thing. Maybe we don’t even know about this other thing.” They don’t put flags on this account so that if this person does something, it immediately gets escalated to see what this person is doing.

Nandini: Yeah. I mean, can you imagine? I couldn’t even believe it. So I upped the risk for her. We published her email address on Facebook, and we had our entire community write to her. And one thing that we do that I think is different from other groups is, I feel very comfortable contacting the VP of marketing and asking them about their ad buy—or their TV ad buy, whether it’s on TV or online—I feel comfortable going to the individual that’s responsible because I want them to know their name is on this. I also want them to know that they are empowered to do something. They’re in a much better position to address the issue than we are on the outside.

So it is a little bit risky and I don’t like to publish people’s email addresses for no reason. And one thing that we do from our side is we absolutely encourage our community to be polite, but assertive. Ask for what you want. Ask for what we want, and don’t be rude to the person. I think we’re contributing to an awakening among employees themselves to say, “Wow, I hadn’t thought about this. It never entered my consciousness. Or maybe I thought it was an edge issue, but it’s actually a massive problem.”

47:44

Kim: Oh my god. Don’t talk about edge cases. I’m so sick of people saying edge cases. It’s an edge because you’re not in the fuckin’ community. [Frustrated laugh] It’s not edge to us. It is the case. Before we run out of time, I wanted you to talk about this tweet you did on December 14th about the CEO of Patreon.

Nandini: Yeah, sure. So back in 2017, the CEO of Patreon banned a white nationalist named Lauren Southern and her organization, Defend Europe, from using Patreon services. They had rented a boat, and they were using it to block actual NGO rescue ships from rescuing refugees on the waters. And what Jack did at the time was he banned her. And I think that that would have just been that if it wasn’t for Lauren Southern and her team being super loud and obnoxious and probably harassing him endlessly.

So what he did was he published a video where he explained exactly what his review method was, exactly what his process was, that made him decide to ban her. So he introduced this idea, this concept, called manifest observable behavior. And manifest observable behavior is a review method that is based entirely on what a video or an audio recording has captured, or statements that they have written or made in the past. So anything that you can see, tangible things.

49:18

And he said, “Well, I reviewed Lauren’s own evidence. She had videotaped the whole thing, reported the whole thing and put it up on YouTube. And Lauren says that she was just there as a journalist. She was just there to record the experience, apparently. But I can see on video that she was on the boat. She was directing the boat operator to get in front of the ship. She stated a clear intention on selfie mode, saying, ‘I intend to block these rescue attempts.’ And, not only that, but she came back and she started fundraising to buy a boat. And they successfully fundraised. She made statements to the press saying, ‘If the politicians won’t stop the refugees, we will. Our boat will do it. That’s what we’re buying the boat for.'”

So he said, “Well, based on this overwhelming evidence, I can pretty confidently say that this is against the section of our policy that prevents us from working with users that are causing loss of life or intend to cause harm or loss of life.”

What I liked about his approach was that a lot of tech CEOs, they don’t want to get into emotions, or ethics, or morals, or feelings, or any of that stuff. So while what I described to you is a really disgusting thing to do, right? What she was doing was she was going to kill refugees basically, or have them killed on the waters by preventing these rescues. What Jack did kind of took the emotion out of it. He just used his acceptable use policy and just pointed to that and said, “You know, this is against our policy.” And I think that to me was an “Aha!” moment.

51:17

So I don’t know why when he published this video in 2017, he got no attention for it. But I kept thinking about it over the years, and I don’t know, just this weekend I decided to write about it. Because I think it’s the most advanced technique or method or thinking in the tech industry about how to moderate bad behavior and bad content.

So where I think a lot of tech CEOs get tripped up is in this feeling, emotion, ethical, gray area. What Jack did was he took that gray area out. He had a black and white list of things that are acceptable on the platform and what isn’t acceptable on the platform. And so he no longer has to worry about turning this into an emotional decision or a decision based on people’s feelings. He can just say, “That’s just not allowed on our platform.”

Kim: And I liked that approach because it’s basically mixed methods. He used quantitative, plus there was qualitative data, her stuff, that he was able to make—although it was not, as you say, “emotional,” the fact that he used her lived experience as evidence of how he made his decision is [clap] perfect mixed methods. Is [clap] perfect of [clap] how a [clap] way to [clap] do this. Use their [clap] own words and [clap] actions [clap] against [clap] them. I mean, they’re there. They have no problem with publicizing this shit. And it goes back to when you’re saying that it took PayPal seven months to get this individual off. Had they used—what’s the name of that approach again?

Nandini: Manifest observable behavior.

53:05

Kim: Yes. Had they done that… If you make that a blanket across everything, that kind of takes some of that “ehh” out of it. [Laughs] They could have done the same thing with the Ku Klux Klan. They have a history of that manifest observable behavior. But with this Stefan—I think that’s his name—that you mentioned, had they used the same litmus test, he wouldn’t have been on there for seven months.

Nandini: Exactly. It’s coded into their culture. It’s coded into their policy. And it’s not just words, like harassment and bullying, but a specific understanding of what that is, a specific definition. Because you don’t wanna get into a place—I understand this—if you’re a tech CEO once you make the decision to ban or prevent someone from using your services, it’s kind of like writing a law.

So what he’s done is he’s made it possible for his trust and safety team to operate and make these decisions without him having to jump in every time this happens. And I think that’s what must have happened internally at PayPal. There was no way for them to make this decision. There was no way for them to evaluate this and say, “I feel confident to ban this individual from using our services.”

54:21

Kim: And I can’t think of the law right now, and it’s slipping me and I’ll have to look it up again, [Kim is referring to Section 230 of the Communications Decency Act] but it’s interesting how you preface it, that by doing this, they will make it quote unquote “a law.” Because they’re operating on a law from the federal government that does not hold them accountable for information that is posted on their platforms. So that’s the loophole they’re gettin’ out of. They’re taking full advantage of the benefit of being exempt from the content that people post on their thing. But yet they don’t want—so why afford us that safety? You know? “Well, who are these people?” [Laughs]

Nandini: Yeah, exactly. So, it’s just one way of thinking about content moderation. And I’m sorry to say that I haven’t seen anything as thoughtful from anybody else in the tech world. This is something that everyone in the industry is grappling with. But no one has taken the time to come up with an answer for it. Patreon, I don’t think, is perfect. I think there are a lot of other issues with the company that I’ve heard about over the years. But they’re one step closer to figuring this out than everybody else.

Kim: And that’s it. And that’s what I was about to say. It’s, at this point, it’s not—and this is what I tell people: we’re all gonna fuck this up. We’re tryna create an experience that was never meant to exist. None of us know what we’re doing. But we have to take steps to move forward, make mistakes, evaluate those mistakes, learn from our mistakes, and move forward again. And that’s why I found that thread that you wrote so enlightening, ’cause I had not heard of a—that had something. I was like, “Oh, somebody has some shit? OK.”

56:06

Nandini: Yeah, isn’t that incredible?

Kim: It’s new to me. And then the fact that you said it’s two years old. It’s like, what?

Nandini: Yeah. I mean I think that back in 2017 we were just so busy. I think one thing that we could do better is praise people a little bit more when they do something right. So we’ve just—we missed the boat on that one back in 2017.

Kim: Well, you’re not the only one, ’cause I have never heard of this. But I could tell you in the US 2017 was right after the election, so people were just like, “Oh, shit.” [Laughs] They were definitely dealing with some other stuff.

Nandini: Yeah. We were dealing with some other stuff.

Kim: All right, what would you like to say in your final moments on the show?

Nandini: What can I say? I hope that… from everything that I’ve learned this year, I think that next year is going to be a bombshell year.

Kim: I agree. I agree. I believe we’re at a tipping point.

57:02

Nandini: Yeah, it is a tipping point. I think companies are still grappling with their role in misinformation, either accidentally funding it or, on the other side, accidentally enabling it. And I know that from the work that I’ve been doing, now that I’m doing it full time, I’ve been connected with a lot of really great, smart people who are thinking about this issue in a lot of different ways.

And now that we have formed a loose coalition, I think that we’re going to be really successful in bringing this issue to the forefront in the next year and it’s gonna be really important that we act fast, that we put a shit-ton of pressure on everybody involved, because it’s an election year and we can’t afford to lose our democracy again to misinformation and hate.

So, if you have any capacity to help, you can help just with a simple retweet. Or just by sharing it in your LinkedIn community or your Facebook community, wherever your professional friends are.

58:20

Kim: Yeah. I tell people there’s always a space for all of us, particularly privileged people; I need you guys to shut the fuck up and amplify the people who are doing the work.

Nandini: Yeah, a simple share can do so much.

Kim: And, all caps: PAY US.

Nandini: Oh, yeah, I would want to be paid. [Laughs]

Kim: Well, thank you so much for being on the show. This has been great. I just knew that this would be a great conversation, an informative conversation, to have to start the beginning of the year. So thank you for takin’ the time.

Nandini: Thanks so much for having me.

Kim: Have a wonderful day.
