Cathy O’Neil

Podcast Description

My argument in my Shame book is that it’s like a sort of fortress…of denial that white people have like surrounded themselves with because they are living in cognitive dissonance. And that’s what shame does. Shame, when it’s real and it hits, hurts so badly that you’re like, how can I square this with being a good person…And you have a choice at that moment, and like some white people have been like, “Oh, let’s march against police brutality.” And some white people are like, “Let’s just pretend that it’s not happening.”

Cathy O’Neil earned a Ph.D. in math from Harvard, was a postdoc in the MIT math department, and was a professor at Barnard College, where she published a number of research papers in arithmetic algebraic geometry. She then switched over to the private sector, working as a quant for the hedge fund D.E. Shaw in the middle of the credit crisis, and then for RiskMetrics, a risk software company that assesses risk for the holdings of hedge funds and banks. She left finance in 2011 and started working as a data scientist in the New York start-up scene, building models that predicted people’s purchases and clicks. She wrote Doing Data Science in 2013 and launched the Lede Program in Data Journalism at Columbia in 2014. She is a regular contributor to Bloomberg View and wrote the book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. She recently founded ORCAA, an algorithmic auditing company.

Additional Resources

Transcription

00:30

Kim Crayton: Hello everyone, and welcome to today’s episode of the #CauseAScene podcast. My guest today is Cathy O’Neil. I know many of you in the tech space know about her, and she keeps popping up in these documentaries we’re seeing about AI here lately. So I’ll let her introduce—pronouns are she/her—so I’ll let her introduce herself.

Cathy O’Neil: Well, hello. I’m very glad to be here. My background is mathematics, and then I became a quant in a hedge fund during the credit crisis, then I became a data scientist—and also I joined Occupy, which really was my education, to be honest—and I saw sort of all the things that went wrong in finance going wrong with data science and AI. But instead of screwing up the markets, it was screwing up humans, along the same lines you’d always see.

And I was myself a data scientist deciding who got opportunities and who was denied opportunities, and I realized pretty quickly that it was like money, race, gender, same old stuff, and that nobody was really acknowledging it. So I started studying that as a problem in itself, and I wrote a book called “Weapons of Math Destruction”. And I started a company called ORCAA, which audits algorithms for things like this, for bias along those lines. And, yeah, I’m just basically a professional loudmouth now.

02:00

Kim: We need more of those! [Laughs] So, we always start this conversation with the same two questions: why is it important to cause a scene? And how are you specifically causing a scene?

Cathy: Well, I mean, causing a scene is the only way to make fundamental change. I’m writing a book now actually about shame, called “The Shame Machine,” and, you know, shame has a bad rap, and often deservedly so, but I’ve been thinking about it for four years now, and I’m like, there are some moments when it’s the only tool. When the laws themselves are unjust, or the powers in charge are unaccountable and don’t follow the laws, then shame is it. That’s what you got. And that’s how I interpret causing a scene: holding people to account, in a sort of pushy, divisive, in-your-face way, which is literally the only way it works. So that’s why it’s abstractly very important to cause a scene.

What I’m doing to cause a scene personally, besides just my sort of general loudmouthedness… well, I guess the way we connected, Kim, is that you saw the tweet I did, which was challenging some people to a public debate.

Kim: [Laughs] Yes.

03:22

Cathy: I love to do that, you know; I’ll tell you a little more about the context, which is that one topic of my book was predictive policing algorithms and how they propagate racist police practices, in particular broken windows policies that we claim we’re not following any more in places like New York City and Chicago and Los Angeles. But it’s like the “scientific policing”—I put that in air quotes, just in case… this is a podcast, not a video?

Kim: Yes, this is a podcast, yes. So thank you for explaining! [Laughs]

Cathy: So-called “scientific policing” was supposed to sort of like say, “Oh, we’re going to take the bias out of it. We’re gonna follow the data.” But what we’re really doing is we’re mistaking crimes for arrests, and they’re not the same thing. And there’s a lot of missing data when it comes to crime—most crimes don’t have data attached to them at all—but there’s way more missing white crime than Black crime, for example. So when you do this “data driven policing,”—again in quotes—what you’re really doing is predicting the police. And they’re actually pretty predictable. But people think that what they’re doing is predicting crime, and that’s not what’s happening. So yet again, it’s sort of like an artifact of recent racist history, but we’re propagating it. We’re pretending that we’re breaking free of our bias, but in fact we’re just turning it into another thing, and we’re calling it science.
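The “predicting the police” point can be made concrete with a toy simulation. Everything below is hypothetical and for illustration only—it is not any real system, and the numbers are made up: two neighborhoods with identical underlying crime rates but unequal patrol coverage, where a model “trained” on arrest counts simply re-learns the patrol pattern.

```python
# Toy feedback loop (hypothetical numbers, not any real system):
# two neighborhoods with the SAME underlying crime rate, but unequal
# historical patrol coverage. The "model" reallocates patrols in
# proportion to past arrests, which is the target variable problem
# in miniature: arrests stand in for crime.

true_crime_rate = {"A": 0.10, "B": 0.10}   # identical actual crime
patrol_share = {"A": 0.80, "B": 0.20}      # where police already patrol
population = 10_000

for step in range(5):
    # Arrests only happen where police are looking.
    arrests = {hood: int(population * true_crime_rate[hood] * patrol_share[hood])
               for hood in true_crime_rate}
    # The "prediction": next period's patrols follow past arrests.
    total = sum(arrests.values())
    patrol_share = {hood: count / total for hood, count in arrests.items()}
    print(step, arrests, patrol_share)

# Neighborhood A "generates" four times the arrests of B forever, so the
# model keeps sending four times the patrols there -- the historical bias
# is locked in as a self-confirming prediction, with no crime difference.
```

The allocation rule here is deliberately simple, but the structure is the same for fancier models: because the training signal is arrests rather than crime, the prediction recovers policing behavior, not offending.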

04:50

Anyway, so that was in my book, and then a couple of years went by, and the data science and mathematics communities started thinking along these lines more generally, and there was an open letter, which I signed—I did not write it, but I was very, very happy to see it being written, and I signed it—basically saying mathematicians should stop participating in predictive policing algorithms, because it’s just bullshit. Am I allowed to swear, Kim?

Kim: Yo, girl, this is an adult conversation. [Laughs]

Cathy: [Laughs] That’s good news.

Kim: Yes, I’ll just set the pattern. Fuckety fuck, fuck, fuck!

Cathy: OK, good. Yeah.

Kim: Mhm. [Laughs]

Cathy: OK. So I was super glad to see the activists of the math community waking up to this and—just to back up a little bit in the story, I hope I’m not taking too much time with this?

Kim: Oh no, no. No, this is your floor.

Cathy: OK.

Kim: I only centre white folx for a reason. So go ahead, take this—because I don’t do it often—so take this opportunity.

05:48

Cathy: [Laughs] I appreciate it. I went to this math conference—the Joint Mathematics Meetings—a year and a half ago, maybe two years, I can’t even remember. And I met this guy who runs this place at Brown University. He’s an old friend of mine, actually, from the math community. And he had hosted a conference put on essentially by PredPol, which is the most mathematically sophisticated predictive policing shop, out of Los Angeles—and it’s used by the Los Angeles Police Department. And of course, I had spoken to folx, organizers in Los Angeles. I don’t know if you’ve talked to these guys, the Stop LAPD Spying Coalition? Really fantastic people.

Kim: Mm-mm. Thank you. If you can make that introduction, that would be great.

Cathy: Anyway, I talked to them about how to organize around stopping this. And I talked to them about the pseudo math, the pseudoscience around it, along these lines. But anyway, the point is that this mathematical research institute hosted at Brown University had let this conference go on—it was a two-day conference that was basically an advertisement for PredPol. And I said, “Why did you do that?” Like, “Man, that was a mistake. It’s not real math. It gives way too much advertising and free marketing to something that’s really not right. And I think you should make up for it somehow. And here’s what I think you should do: you should gather a conference of activists against PredPol, against mathematical predictive policing.”

Anyway, he didn’t buy it. [Laughs] And you know, whatever. I didn’t think one conversation was going to change the course of that. But then this letter was written and I signed it. And the conversation was—you know, we caused a scene basically, right?—because the conversation was elevated, so it was the entire mathematical community saying, “Yes, no, yes, no.” Like, “We should… Mathematicians have something to offer.” Like, “No, I think you should just stay out of it, because it’s never gonna work.” I’m definitely in that latter category.

08:14

But it’s a continuing conversation: there has been another letter arguing for the participation of mathematics and mathematicians in this world, in this universe. And so when that second letter came out, against the first letter, that’s when I was like, I raised my hand, “I will publicly debate anybody who wrote that second letter or who agrees with that second letter, and I’ll do it at that same place, at Brown.”

Kim: Mhm, mhm.

Cathy: “I’ll do it there, or wherever, you know,” like, “Let’s do this!”

Kim: Has anybody taken you up on it?

Cathy: You know, I’ve had a couple nibbles, but no actual…

Kim: Of course!

Cathy: [laughs] …no actual dates. [Kim laughs] Yeah, I think it’s very weak. I think their arguments are very weak. And it stems from this, Kim, it stems from the fact that mathematicians really don’t understand the context of this data. They really…

09:04

Kim: OK stop! Stop, stop, stop, stop! You got me excited. You got me excited, because I was writing some notes here, and you just got there. Because this is the difference between quant and qual, and why we need both. [Laughs] This is why mixed methods… if someone’s comin’ to me with just quantitative data, I don’t want to hear shit—particularly from somebody from a marginalized group—I don’t wanna hear shit about your data if you cannot tell me any qualitative, anything that explains where the data came from. Who asked the questions? Who decided the questions? All of these things. And this is where tech screws up so much, because it wants to be apolitical and it wants to be neutral when there is no such fuckin’ thing.

Cathy: I could not have said it better.

Kim: I just fuckin’ tweeted at Uncle Bob’s punk ass again, [Cathy laughs] because he did a tweet: “Politics is ruining everything.” Bitch! My life is political! So you actually just publicly said that everybody—women, people in marginalized communities, Black people, Indian people, people in the LGBTQA community—we are the problem because our lives are political. Fuck you, Bob.

Cathy: I have to confess. I don’t know who Uncle Bob is.

Kim: Oh, baby, don’t worry about it. [Cathy laughs] Don’t worry about it. He’s the asshole who wrote “Clean Code.” He’s some antiquated white supremacist piece of shit that is still making money for O’Reilly, so they keep him on. He’s still making money for Pearson, so they keep him on. People refuse to have him speak at conferences because he’s toxic. The community wants him gone, but because he can turn a dime, he gets to say whatever the fuck he wants to.

10:57

Cathy: Got it. Yeah, I hate that. I mean, I was at TED, so don’t get me started. Talk about assholes who don’t get kicked out even though they’re toxic. I was there when Robin Hanson was speaking. Do you know who that guy is?

Kim: No. Now you’re making me Google. [Laughs]

Cathy: You don’t even want to.

Kim: Robin Hanson?

Cathy: Yeah, it’s really… it’s a dark, dark rabbit hole that you’re about to dive into.

Kim: OK, there you go. Thank you. That’s my life, OK go ahead. [Laughs]

Cathy: Oh, yeah, it’s dark.

Kim: And so going back to… so the things I wrote down are “shame,” I really want to talk about shame, because you spun that in such a great way, because what white guilt is, is shame. We saw that after George Floyd.

Cathy: I have a chapter called “White shame.”

11:45

Kim: Yeah, that’s why Black folx was making money after George Floyd. It was all about white shame, white guilt. And then I want to talk about… oh, you just hit me, because I’ve been diving deep into this, because I am a certified special needs teacher—I was certified; I didn’t renew my certification since I’m no longer teaching. But I taught K through 12 as a certified special needs teacher, and I’ve been recently diagnosed with ADHD. So it’s interesting, because being an educator, I had these IEPs [Individualized Education Programs]. You have a white boy and a Black boy, both exhibiting the same behaviors. The white boy gets the ADHD designation, which means he’s on a track to help him modify his behavior so that he can stay on the academic track. The Black boy gets EBD, which is Emotional Behavior Disorder, which means basically he’s no longer on the educational track; it is all about monitoring his behavior. Which—this is the qualitative data, so…

Cathy: Oh 100%.

Kim: So when you have the white kid who can still take gifted classes or AP classes or whatever, and you have the EBD child where all you care about is making sure he’s not being disruptive in the classroom, then you talk about how that’s a determinant for how many prisons and shit they’re building. I don’t want to talk about—like you said, quote unquote “crime data”—when that’s not what the hell that is. And you’re not tellin’ the story of how these communities got where they are.

13:29

Cathy: 100% and that… yeah, it’s fundamentally the story behind this particular letter and series of letters, and public debate suggestion that I’m involved in. But it is, I would agree with you. And your background is economics, right?

Kim: You know, it is, because that’s what I made… and see, I’m glad you brought that up. Because I never—I had a hard time calling myself an economist, because I don’t have what the traditional, what people expect… but I’ve earned, I’ve worked for my knowledge. And it took me… and that goes back to the whole thing: these white dudes don’t have any problem calling themselves whatever the hell they want to. And I had to sit back and say—it was really a mind-fuck for me to put that in my bio. I’m finishing up my doctorate in business administration, focusing on technology entrepreneurship. And it’s hard. I mean, it is those things that I know I need, because people are going to come at me with “you’re not an economist.” [Laughs]

Cathy: It’s a very credentialed world. Yeah, I agree with you. But all I meant to say was that the study, it’s all about social contexts. And data around that.

Kim: Yes! And that’s what I want to bring to economists. That’s what I want to bring to this economics question, this economics thing. Because people aren’t talkin’ about… yeah, it’s always this quantitative thing. So I really, how I really started realizing that I was an economist is because I started speaking loudly about, “There’s nothing fundamentally wrong with capitalism.” I have nothing fundamentally wrong with the idea of private ownership in business. I have a fundamental problem that our economics is rooted in white supremacy. That’s what I have a problem with. And so, it is about how do we redefine capitalism without white supremacy? [Laughs]

15:18

Cathy: Right, and that’s one of the reasons it’s so hard to compare the United States economy to like a Northern European, very very homogeneously white economy like Sweden or something. I was talking to my cousin who does health—she’s an economist, and she studies health systems of different countries—and she shows me a graph where it’s like the United States is by far the most expensive and by far the least efficient health system of any rich country. And I’m like, “Well why can’t we just do it like socialized medicine, like Sweden?” And her answer was like, “Oh, because we’re too racist.” Like…

Kim: I was about to say! I was just about to say! Because the reason they don’t have those problems—and it’s creepin’ in now because they’re getting more refugees and immigrants in those countries—when you have a homogeneous community, your racism is hidden. It is not…

Cathy: I agree with you. I’m not saying they’re not racist. I’m saying that their non-white community is small enough so that they don’t begrudge universal health care because of that, you know what I mean?

Kim: But that’s what I’m sayin’: it’s shifting though. I have a follower who is in Sweden, and he talks about how, now that more immigrants are coming, they’re cutting off their access to those universal, those socialist things. So, yes, they’re finding ways…

Cathy: So they’re finding ways to make it not quite universal?

16:52

Kim: Yes! Exactly. And that’s the thing when you have homogeneous all-white, you’ve never had to test your systems.

Cathy: Look, I didn’t mean to say that they’re not racist.

Kim: Oh no no no, I’m not debating you. I’m just sayin’.

Cathy: The white supremacy is so deep in our country that people will be like, “I’d rather we all get sick and stay sick and lose and go bankrupt because we’re sick, than let any Black person get…”

Kim: And that’s what Jonathan Metzl wrote about in “Dying of Whiteness.” [Laughs] He interviewed…

Cathy: I haven’t read that.

Kim: Yeah, he interviewed—because in certain states, they didn’t extend Medicare [Medicaid]—and he went to folx in these states and said, “Hey, if we could extend it, but that meant Black people would get it,” these people were saying, “No, we don’t want it. We don’t want it.”

Cathy: That’s exactly what I’m saying.

Kim: And that is my thing about whiteness. This is what it is. The parasite is now eating its host. I say it all the time.

Cathy: I totally agree. I totally agree.

Kim: The whiteness might not be the first to be impacted, but white supremacy is designed only for chaos and destruction and at some point, white people will be impacted, and that’s what’s happening right now.

18:02

Cathy: My argument in my shame book, which I’m not allowed to talk about because it isn’t out yet, but I’ll say just between you and me, [both laugh] is that it’s like a sort of fortress of denial that white people have surrounded themselves with because they’re living in cognitive dissonance. And that’s what shame does. Shame, when it’s real and it hits, it hurts so badly that you’re like, “How can I square this with being a good person?” And you have a choice at that moment, and some white people have been like, “Oh, let’s march against police brutality.” And some white people are like, “Let’s just pretend that it’s not happening.”

[Interlude]

19:21

Cathy: And some white people are like, “Let’s just pretend that it’s not happening.” And I’m not saying it’s that simple, because…

Kim: Or it’s people like Uncle Bob, who go even worse and blame, and say, “It’s nothing wrong. It’s them. They deserve everything they’re getting. Nothing about the system.”

Cathy: There’s so many flavors of both of those choices; I didn’t mean to… and there’s a lot of people in my friend group—and this is a confession—who are like, “I will march against white supremacist cops, but I will never let zoning change in my community.”

Kim: [Laughs] Exactly! ‘Cause I wanna…

Cathy: Because I want my kids to get into Harvard, for god’s sake!

Kim: But that’s what whiteness is. Whiteness is about the individual. It is about how it impacts me. Everybody else has had to—because of the systems, institutions and policies—have had to learn how to be communities. Whiteness has no idea about communal anything. And we saw that with the…

Cathy: You know, there’s a little bit of a community spirit in white supremacy. That can’t be denied.

Kim: Oh yeah, yeah. But there’s no allegiance there because they would throw each other under the bus.

20:47

Cathy: Interestingly, there is a lot of community spirit in these crazy, crazy little worlds. These little shamey, shame-based worlds that I’ve been researching. Like incels. You know, incels are like the most toxic possible community, and yet they are a community in this really weird, weird way. And people get stuck in them for the sense of community because, you know, we just decide [inaudible].

Kim: It’s the same thing with gangs. That’s why, when I was—one of my first jobs right out of high school was mentoring in Chicago—I understood gangs. When I was teaching, the students people thought were the worst always came to me, because I understood they were seeking belonging.

Cathy: Brotherhood.

Kim: And so I get the incel, I get all of that, and yet outside of that, if they are left to their own devices, they will throw each other under the bus. We saw that with the pandemic. “Oh well, let the old people go back to work.” It’s no way in hell that a Black person is going to say, “Let Gramma go back to work.” [Cathy laughs] No, no! [Laughs]

Cathy: No question about it, it is much more individualistic. And you know what’s crazy to me after that election? All those people in North Dakota, they don’t—they’re dying like flies in North Dakota; my aunt lives there, so I worry about her daily—and they don’t blame Trump. They don’t blame Trump. [Kim scoffs] They don’t. They blame each person who gets sick, for getting sick and dying. It’s insane. They just, they’re so individualistic. So I’m agreeing with you.

22:21

Kim: Yes. Oh no no no, I get that, and so you telling me only confirms what I’m saying. Whiteness has no humanity, and it does not benefit white people to cling to this thing. It is like we get there together or not at all. We got to all get there.

Cathy: I’m with you.

Kim: And so, you’re my first mathematician. And that excites me for a number of reasons, and one is that math frightens so many people that it is still in the domain of white dudes, because everybody else is so frightened. So I can imagine it is not diverse at all.

Cathy: You know, another chapter of my—I’m sorry for talking so much about my shame book. [Laughs]

Kim: Oh no no, please. [Both laugh]

Cathy: Another chapter of my shame book is a story about being fat-shamed as a girl. And honestly, I ascribe that, being thrown out of the cool kids’ world because I was a fat little girl, that’s the only reason I was studying math because I was like, “Well, I’m already a loser. I might as well.” [Kim laughs] You know what I mean?

Kim: [Laughs] Might as well benefit from it!

23:33

Cathy: It really was a benefit. It freed me in some sense. Sometimes—and that’s something that we’ve seen a lot—once you are out of the norm group, you’re just out, you’re out. Then you do have more freedom. You could be shameless.

Kim: That is why I get so pissed off with gatekeepers in tech, because it could afford so many of us oddballs an opportunity—not only for our own well-being and financial and all that, but for our communities. And we still have these gatekeepers that have these arbitrary, nonsensical rules and regulations that they couldn’t even meet if they were…

Cathy: Give me an example. I’m not sure what you’re talking about.

Kim: Well, so… OK, lemme think right quick. So, one of the things is when we’re talkin’ about, “Why don’t we have a more diverse tech community?” Everybody wants to scream, “It’s the pipeline, it’s the pipeline.” Which puts the onus and the responsibility and the problem on people who can’t get in. So again, that’s a quantitative thing. But no one wants to talk about why there’s a pipeline problem, because there is no pipeline problem. The problem is that the hoops these individuals have to jump through discourage them, stop them right at the doors. Or they get in and the environments are so toxic that they cannot stay. But no one wants to talk about that.

25:09

Cathy: I’ll add two points to that, Kim. I totally agree. The first one is that the algorithms being built to decide which applicants to offer an interview are based on historical data. And so if you have Black people applying for tech jobs who got the job but then came in, and it was an awful environment, and they left soon? Well, it will look to the algorithm like, “Oh, Black people aren’t doing well at our company. Let’s not hire more Black people.” So those algorithms tend to embed the nasty experience, but it’s identified as a failure on the part of the employee.
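A sketch of that feedback, with entirely made-up data: if Black hires left early because the environment was toxic, a screening rule trained on retention encodes the environment’s failure as the applicants’.

```python
# Hypothetical illustration: a screening rule trained on historical
# "stayed at least two years" outcomes. Everyone below was equally
# qualified; the retention gap reflects a toxic environment, but the
# rule attributes it to the group of applicants.

history = [
    # (group, stayed_two_years)
    ("white", True), ("white", True), ("white", True), ("white", False),
    ("Black", False), ("Black", False), ("Black", True), ("Black", False),
]

def retention_rate(group):
    outcomes = [stayed for g, stayed in history if g == group]
    return sum(outcomes) / len(outcomes)

# Naive rule: only interview applicants from groups whose historical
# retention clears a threshold.
threshold = 0.5
decisions = {group: ("interview" if retention_rate(group) >= threshold
                     else "screen out")
             for group in ("white", "Black")}
print(decisions)  # the environment's failure, scored as the applicants'
```

The same structure appears in any model whose label (retention, performance rating, promotion) was itself produced by a biased environment: the model cannot distinguish “this person failed” from “this workplace failed this person.”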

Kim: Exactly, exactly yes.

Cathy: So that’s the first comment, and the second comment I’ll make, which is something I worry about on a daily basis, is all these matchmaking algorithms and platforms that are popping up on the Internet: LinkedIn…

Kim: I hate LinkedIn.

Cathy: …monster.com, all the job-seeking, job-matching things; I don’t think any of those matchmaking algorithms have been audited, and there’s no reason to think they are offering STEM job opportunities to qualified Black people. It is just… I doubt they are.

Kim: I agree. I agree.

26:09

Cathy: I found a LinkedIn thing—I’m so suspicious of LinkedIn—and they are by far the biggest, with the highest reputation. So they’re my target. I’m causing a scene there, if you want to help.

Kim: Good, because I don’t like LinkedIn.

Cathy: Yeah, fuck LinkedIn.

Kim: Exactly.

Cathy: They had this blog post—I think it was like 2012; it was so long ago—about how, like, oh, they’re making sure that their recommendation engine isn’t as sexist as it used to be. So there was a blog post where they acknowledged that they have this kind of power, and this power is non-objective, and the algorithms can be sexist. And then I think they squashed that blog post and they never talked about anything like that again. And I’m like, “Oh, what happened to that whole thing?”

Kim: Exactly. Where’d that conversation go? [Laughs]

27:10

Cathy: Excuse me. I’m an algorithmic auditor. I would like to know what you guys are working on in this realm for race, for gender, for age, for disability status. It’s crazy. And think about all the people that have lost their jobs.

Kim: And the fact that they can just, people can just put stuff out there and just take it away, is the same thing that happens on Twitter. When somebody says something, they get their shit handed to them for saying something racist, whatever, and then they delete or close their account and they come back like nothing ever happened. [Laughs]

Cathy: That’s right.

Kim: Tell me, what actually is an algorithmic audit?

Cathy: Well, to be—again, don’t tell anyone this—but [Kim laughs] it doesn’t have much to do with the algorithm itself. It’s a bureaucratic—imagine you have a bureaucratic process and you don’t really know how it works. But you do know that all these people enter here, and they’re being scored, and you see what their score is. So you don’t really understand how the scoring works, but you do see the people walking in the door and you see their scores, and you’re wondering, “Is this fair?” But nobody can explain the actual process, so what you’re going to do is devise a bunch of questions that you want answered. And then once you have a question, you’re like, “What’s the quantitative test I can force this black box process through so that I can see what the answer is?”

28:22

And the questions are things like, “Is this racist?” And then the test would be like, “Well, let’s put some qualified Black folx into this box and see how many of them get scored well, compared to if we put qualified white folx in.” That’s the kind of test. But different processes engender different questions, and each question engenders one or two or maybe a dozen tests, because it’s actually not that well defined to say what it means for an algorithm to be racist. There’s lots of different ways algorithms could be racist, so you don’t wanna narrow yourself too much. The way I think about it is: think of every single possible way an algorithm could be racist, and test them all. That’s the kind of thing I get paid to do. Most people who build algorithms just build something and deploy it. They never really know how it works.
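A minimal sketch of that kind of black-box test. To be clear, ORCAA’s actual methodology isn’t described in detail here, and the `black_box` scorer below is invented purely so the test has something to surface: feed matched qualified candidates from two groups through the opaque scorer and compare pass rates, flagging a gap with the common “four-fifths” heuristic.

```python
import random

def black_box(candidate):
    # Stand-in for the opaque scoring process under audit. For the demo
    # it secretly penalizes group "B" -- exactly what the audit should
    # surface without ever looking inside.
    penalty = 15 if candidate["group"] == "B" else 0
    return candidate["years_experience"] * 10 - penalty

def audit_pass_rates(score, n=1000, cutoff=50, seed=0):
    """Send matched qualified candidates from each group through the
    black box and record who clears the cutoff."""
    rng = random.Random(seed)
    rates = {}
    for group in ("A", "B"):
        passed = sum(
            score({"group": group, "years_experience": rng.uniform(4, 8)}) >= cutoff
            for _ in range(n)
        )
        rates[group] = passed / n
    return rates

rates = audit_pass_rates(black_box)
# "Four-fifths" heuristic: flag if one group's pass rate falls below
# 80% of the other's. This is only ONE of many possible racism tests,
# matching the point that the question engenders many tests.
flagged = min(rates.values()) < 0.8 * max(rates.values())
print(rates, "flagged:", flagged)
```

Note that the test never inspects the scoring logic: it treats the process as a sealed box and asks only whether identically distributed qualified candidates come out with different outcomes.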

Kim: Yes. One of the things I have talked about is, you know, Twitter has polls. And I did a poll right before the election, maybe a month before the election, because I was feeling a little angsty, I could see what was goin’ on. And I know that I spent more time developing that fucking question, that one question to ensure that I was getting the data that I was looking for, than most people do with a whole damn algorithm.

Cathy: Yeah, you’re probably right.

Kim: And I’m just, and that just blows my mind, because there’s so much that we don’t know that we don’t know, and if we don’t have the people at the table to inform those things, we are causing harm.

30:15

Cathy: I couldn’t agree more, and one of the things that I just feel like—I just wanna swing back to that original conversation about predictive policing. I just wanna give an example of what you just said, that we’re actively causing harm when we don’t know enough context, because we’re not just accidentally not knowing enough context. There is actual weaponized language in this world. Like when we say “crime data.” We just discussed that. It’s not crime data. That’s arrest data. It’s racist—it’s evidence of racist policing is what it is. That’s what it is, right?

Kim: Exactly.

Cathy: Another thing that I think data scientists—largely white, definitely well off, not living in neighborhoods that are over-policed—don’t understand is this sort of false dichotomy of violent versus nonviolent crimes. This categorization of what does a violent crime mean? A violent crime could just be resisting arrest. As soon as it’s resisting arrest, as I understand it, it’s considered a violent crime. And so that’s another category where the naming is weaponized, because we’re led to believe that, “Well, we might not wanna criminalize poverty, but we have to be able to do this with violent crimes.” And it’s like, “Well, that sounds right, but if you actually dig down and look at the context of what a violent crime is, especially if resisting arrest is slapped onto anyone who asks a question, then that’s completely inappropriate.” In other words, I’m saying that you actually have to be kind of a detective to know the actual context sometimes.

31:56

Kim: And that’s why I often say, “words matter.” Words matter. We can’t just throw—this is one thing I challenge tech on all the time: people who can code are “technical” and everybody else is “nontechnical.” That’s bullshit. They are technical in a technology, whereas other people are technical in other areas, other domains.

Cathy: Exactly.

Kim: And so you could see that when you’re talking about algorithmic issues. Think about bringing somebody into the hiring process: because we already favor “technical,” which is coding, they’re gonna get more money, get more benefits, whatever. But the people who actually make the company run, who make the code that they’re creating actually have any value? Because they were considered “non-technical,” they’re shit.

Cathy: Yeah, I couldn’t agree more. One of the things I do in my algorithmic auditing framework is I insist that there’s a conversation with all the stakeholders. And it’s a non-technical conversation. The technical shit can happen at the end. The first question is: who are the stakeholders? What are their concerns? For whom might this fail? And how do we balance the ethical concerns of this stakeholder group against that stakeholder group? This stakeholder group is going to care about false positives. That stakeholder group is going to care about false negatives. We are either implicitly or explicitly balancing those concerns, and usually it’s like we’re just ignoring these people. And so we have to have that conversation and we have to expose our values, quite explicitly. And then once we’ve done that, the data scientists do the technical work of translating that into code, but we can’t expect a data scientist to be able to be an ethicist and a historian, an advocate for the communities that are affected by this. They can’t do all that, and they won’t do all that. They’ll just throw something at the wall and then call it perfect.
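The false-positive versus false-negative balancing described above can be shown with a tiny made-up score set: no threshold zeroes out both error types at once, so picking one is a value judgment about which stakeholders absorb which harm.

```python
# Made-up scores for illustration: label 1 = "actually qualified",
# label 0 = "actually not". The model's score is all a decision
# rule gets to see.
cases = [(1, 0.9), (1, 0.7), (1, 0.55), (1, 0.4),
         (0, 0.6), (0, 0.45), (0, 0.3), (0, 0.1)]

def error_rates(threshold):
    false_pos = sum(1 for label, score in cases
                    if label == 0 and score >= threshold)
    false_neg = sum(1 for label, score in cases
                    if label == 1 and score < threshold)
    return false_pos, false_neg

for threshold in (0.35, 0.5, 0.65):
    fp, fn = error_rates(threshold)
    print(f"threshold={threshold}: {fp} false positives, {fn} false negatives")

# Moving the threshold only shifts harm between stakeholder groups:
# the wrongly admitted (false positives) versus the wrongly rejected
# (false negatives). Which tradeoff is "right" is an ethical choice
# the stakeholder conversation has to make explicit, not a technical
# detail the data scientist can settle alone.
```

This is the implicit-versus-explicit balancing in miniature: shipping any threshold means choosing a point on this tradeoff, whether or not the stakeholders were ever consulted.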

34:04

Kim: Mhm. You’re speakin’ to my soul right now because you just… because I love these conversations. We have to have these conversations. I say it often. To get to the other side of good, to great, we have to go through the bullshit. We have to go through. We can’t stop. We have to stop avoiding it. We have to stop dismissin’ it. We have to stop. We have to go through it.

Cathy: I’ve often said—I gave a bunch of talks about my book, which was chock full of examples of terrible, racist, whatever algorithms—that these algorithms were put in place to avoid a difficult conversation.

Kim: Yes. It’s the same reason we automate everything as soon as possible, ’cause we don’t want to deal with humans.

Cathy: We don’t wanna deal with it. We have a “teacher assessment algorithm,” in quotes. Have you ever met five pedagogy experts who agree on what it means for a teacher to be a good teacher?

Kim: Hell no.

Cathy: No. You cannot have—that is not something we agree on. So why do we think…

35:09

Kim: For some of them it is—for me a good teacher was classroom management, ’cause if you can’t run the classroom, you can’t do anything. Other people thought a good teacher was, they can put some shit on the board, you know, that kinda…

Cathy: And of course, then there’s the harder-to-measure stuff, like, they inspire.

Kim: Exactly. That’s why I said, the students that teachers did not like, loved me! [Laughs]

Cathy: Yes. Right, and that’s actually the most important thing, is that a truly good educational experience has all kinds of different kinds of teachers. But the point is that in the book there’s this ridiculous value-added model for teachers that got a bunch of teachers fired or denied tenure.

Kim: Oh yes, mhm. I know what you’re talkin’ about, yes.

Cathy: It was literally a random number generator, Kim. A random number generator. And I just think about it, and I’m like, “Who decided that was a good idea?” And I think the real answer is they were like, “This is gonna be a great tool against teachers unions.”

Kim: Well, one of the things—and this is why I left education—because it’s broken and there’s no incentive to fix it, because too many people are profitin’ off it being broken.

Cathy: Say more about that.

36:04

Kim: So I worked in a county that had Title 1 money, which means they were below the poverty level. First of all, the fact that schools in the same community—in the same county or city—are funded by a tax base that’s not evenly spread throughout the same county is problematic. So how we fund schools is problematic.

Cathy: It absolutely is, I couldn’t agree more.

Kim: But you would have tech people, particularly testing people come in, and because each school gets their own Title 1 money, right? So the principal is responsible for this money. This principal knows nothing about this technology you’re talkin’ about, so you’ve just spun a whole sales pitch on them. They get this—every school I know has a closet full of shit they done bought at the school level and when they got to the district level, the district people are like, “We can’t support that.” So you have a closet full of custom shit that you can do nothing with because it’s not compatible with the systems that the school uses.

Cathy: I see. It’s just like such a waste.

Kim: Exactly. Exactly.

Cathy: So you’re saying that is incentivized?

37:27

Kim: That is very much incentivized. Why would anybody, why would Pearson, why would all these testing… why would they change? Why would they… Again, we go back to how capitalism is set up now, it is set up to cause harm, to exploit, to take advantage of the oppressed. There is no incentive to fix this. No Child Left Behind was great in theory, but—and this is one of the problems I have with quantitative data that isn’t rooted in qualitative data—inside your academic office it makes sense. In the real world, it does not pan out that way. [Laughs]

Cathy: Absolutely. And the value-added model for teachers was a direct product of that teacher accountability part of No Child Left Behind.

Kim: Oh, I remember that. I remember people were getting fired left and right. But then you get pissed and you want to send folx to jail when they cheat on the test.

Cathy: I know, it’s like you set up a system where you’re like, “Here’s how you don’t get fired. Oh, we’re gonna put you in jail now.” It’s so insane.

Kim: You set up a system that teachers are required to meet a certain threshold. They understand that based on their school, they’re not gettin’ the money, they’re not gettin’ the resources, their students just aren’t there. So hell yeah, we gonna get together and fuck up these damn—and raise some shit, because I ain’t trying to lose my job.

38:56

Cathy: Yes, agreed. I just brought that up as an example of us trying to avoid difficult conversations. We’re not ready to talk about what it really means to be a good teacher, and what it really means to close the achievement gap, which was the whole point of that, because guess what? The achievement gap is growing because inequality is growing.

Kim: Thank you! [Laughs]

Cathy: Yeah, if we can address that…

Kim: And it also—but you don’t want to, because the way it is, you can blame the victims and not the systems.

Cathy: Absolutely.

Kim: Because when you’re forced to blame the systems, then now you take responsibility for that, because you benefit from the system, institution, and policy. But when you can put it on the individual, who are never individuals in any other time except to be blamed. [Both laugh]

Cathy: That’s so well said.

Kim: Then it’s not us. It’s the same thing with the pipeline. It’s not us. “We don’t have a toxic culture. We’re welcoming!” You don’t—that’s what I tell people—you don’t get to say that your environment is inclusive. It’s the people who are in it that get to say if they feel included or not.

40:01

Cathy: That’s a good point too.

Kim: And that’s what so many… just because you put “inclusive” in your damn PR shit, that means absolutely nuthin’ to me.

Cathy: I call it punching down in order to not be punched up.

Kim: So, before—on our last time together—’cause you’re in two different movies. You’re in the shitshow of “The Social Dilemma,” you had a small little thing, as well as—Joy [Buolamwini] was in that movie also, right?

Cathy: I didn’t watch it. I was like…

Kim: Yeah, OK. So I think it was Joy, it was another—it was one Black woman.

Cathy: Oh no, that was Rashida [Richardson].

Kim: OK, Rashida. She had less time than you.

Cathy: Yeah, that’s ridiculous. I actually complained to them. I was like, “All these white people in this movie.”

40:47

Kim: It was not only just all these white people, but it’s all these unremarkable, mediocre white dudes who’s fucked shit up.

Cathy: Who fucked shit up.

Kim: And are now profiting from how to fix it.

Cathy: Exactly. And by the way, it’s just like, are you kidding me? The narrative is you brought it up with your colleagues—you realized that this shit is bad for the world—and you were surprised when they’re like, “Oh?” [Laughs] Like dude, have you ever met your colleagues, or yourself? You guys aren’t going to change the system. You’re benefiting from it.

[Interlude]

42:53

Cathy: You guys aren’t going to change the system. You’re benefiting from it.

Kim: Yeah, and that’s why—goin’ back to what I just said about the education system—people are benefiting from it. So now this dude gets to have a whole movie centered around him, gets to be on the speaking circuit as an expert. Of what?! We’ve been telling you for years! This is my whole point. I never had a problem with the ethos that everybody has adopted—the Facebook ethos: “move fast, break things.” I have a problem with “Move fast, break things, move fast, break things, move fast, break things,” without havin’ to stop: “What did we break? How did we break it? Who did we harm? How do we make amends? What did we learn?” We don’t ever do that work.

Cathy: “For whom did this fail?”

Kim: Yes. And if those people had done that work, then I would say, “You know what? Yeah, you might be an expert,” but you don’t get the fuck shit up and then become an expert fixin’ it.

Cathy: Yeah, I’m with you.

Kim: And so then you were in “Coded Bias,” which hopefully as a community we’ll watch, because that’s comin’ out on Wednesday.

43:01

Cathy: And that’s gonna be great. I’m really into that movie. I love Shalini [Kantayya]. Joy’s a good friend. There’s a lot of my good friends in it.

Kim: I’ve been trying to get to Joy for a while. Oh, my god. [Laughs]

Cathy: She’s great. She’s fantastic. She’s so busy.

Kim: I know. I know. [Laughs]

Cathy: But she’s—lucky for me—my neighbor here in Cambridge, so I got to see her this weekend. But yeah, that’s gonna be a real different movie. And I hope you can…

Kim: Talk about what’s the difference?

Cathy: Well, first of all, the narrative is I realized this is a problem, and I audited it, and I held powerful people to account, and I’m going to Congress, and I’m changing the law. It is—Joy is such an amazing story. Her story is so fantastic. And Shalini really, really saw it before it happened. I mean, it was fantastic. And she did a lot; she took a lot of film and it was not obvious to me as a non-filmmaker—I’m not an artist whatsoever—how it was gonna work out. I was like, “Wow, Shalini is filling a lot of time with me, and Joy,” and actually, my friend Keelie [spelling uncertain] is—I believe—in “Coded Bias.” Her teachers were subject to that value-added model for teachers, and I believe she got into the final cut.

45:17

And I didn’t even… I was like, “I wonder what this movie is gonna be about ultimately?” And then when I was lucky enough to get to see Joy speak to Congress at the committee hearing, there Shalini was, following her around. And I was like, “Now I’m starting to see how this might work.” And again, I haven’t seen “Coded Bias” either. I’m really bad at watching movies, especially ones I’m in. So I assume it’s all about Joy. She’s on every one of the PR posters. So I’m imagining what the story’s about, but it’s just much more about justice. It’s much more about punching up. And it’s a standard, beautiful story about seeking and obtaining—not a comprehensive concept of justice—but incremental progress, which is normally how it works.

Kim: And that is—I’m glad you mentioned that—because harm reduction is about incremental progress. We cannot—that’s what pisses me off about all these people, wannabe revolutionaries—you’re gonna flip the table on somebody more vulnerable than you. Have you thought about that? What are you doing…

Cathy: That’s why it has to be punching up, Kim. It can’t be punching down.

Kim: It’s like, what? Please stop talkin’ about flipping tables. [Laughs] Because the most vulnerable are always gonna be harmed in that if they’re not a part of the ones who flippin’ the tables. I don’t need privileged folx flippin’ tables. What I need y’all to do is leverage your networks, your resources, your money, your voices; amplify and lend those to other people who have the lived experience—which is qualitative work—to do this.

47:05

Cathy: Well, you know Joy’s original story about how she was trying to use facial recognition as part of her master’s thesis work at MIT.

Kim: And she did a Ted talk, right? That’s the first time I saw [her], yeah.

Cathy: And it didn’t recognize her face.

Kim: She put a white thing, a white mask on.

Cathy: White mask on, and then it did recognize her face, as a human. And it’s so profound. And it’s also really true. And it’s a lived experience, and so her work stems out of that, and it’s a very beautiful story to me. I’m glad that Shalini captured it.

Kim: Mmm. Well, I’m looking forward to watchin’ it, and discussin’ it. Shalini’s gonna be on the show, so discussin’ it…

Cathy: Oh, fantastic! She’s a lot of fun.

Kim: …yeah, discussing that with her. Because we…

Cathy: I think I’m gonna be on some panel on Wednesday.

Kim: Yes, you are. [Laughs]

Cathy: OK.

Kim: I’ve signed up for that. [Laughs] Yeah, and we need more diverse voices telling these stories.

Cathy: Yeah. Can I make a little side note / complaint, though?

Kim: Of course. Yes.

48:15

Cathy: [Laughs] I’m just looking at the people who are slated to be hired by the Biden administration…

Kim: Oh god.

Cathy: …around tech, you know, advisory stuff. And I’m like, “oh my god.”

Kim: It’s a bunch of clueless white people? White dudes? [Cathy sighs] Is it the same—are we gonna have the same vacant faces we saw at the hearings? [Laughs]

Cathy: It’s not great. It’s not great. I think there’s like a Facebook legal counsel, actually someone from Facebook.

Kim: So you’re lettin’ the fox in the henhouse.

Cathy: I just feel like, what are we, just hire lobbyists? Is that what’s going on?

Kim: Yeah, see…

Cathy: I mean the truth is, lobbyists do know a lot, but they’re usually not…

Kim: From this community.

Cathy: …on the side of justice.

48:57

Kim: Mhm. They’re not getting impacted by… I strongly believe, I firmly believe there are two people in tech that I could just do without. One is Elon Musk, but he’s just a dolt to me. He’s just like, dude, they call you genius? You had apartheid at your back. You should be… your parents… whatever, dude. Whatever. Who I think is the most dangerous person in tech is Mark Zuckerberg. I really do. He just does not… oh, he just doesn’t… oh, mm. To me it is not even harmful. He is dangerous.

Cathy: You know what? He is awful. He’s awful. And he’s in my top two. But Elon Musk doesn’t even make my top two because he’s so self destructive.

Kim: Yeah, that’s what… [laughs] Who’s your other?

Cathy: For me the other one, though—and I can’t decide whether it’s above or below Mark Zuckerberg on the ranked list—is Sheryl Sandberg.

Kim: Oh, yeah! Yeah, yeah, yeah, yeah, yeah. OK so, yes. So I put her [claps] up there because she’s definitely doing the white woman thing of enabling white supremacy.

Cathy: She really is. And she is covered by her, like, “Lean In” pseudo-shit.

Kim: Oh! Bullshit!

Cathy: That “feminist philosophy.” [Kim laughs] Which is, by the way, in my shame book as like, “Are you fucking kidding me?” [Kim laughs] Like, “Are you fucking kidding me?” Talk about blaming the victim. Talk about blaming the victim, like, “Oh, if you didn’t do well in your corporate setting, maybe you didn’t suck enough dick.” [Laughs]

50:29

Kim: Yes! Yes! Thank you. You used the dick word, there you go. [Laughs] Exactly!

Cathy: That came out a little crass, I should have rephrased.

Kim: No, don’t worry about it. That’s what people do when they uncomfortable.

Cathy: I hate her. I hate her, and I think she is smarter than Mark Zuckerberg. That’s why…

Kim: Oh she’s an enabler. She’s doing what—white women were designed to maintain and breed white supremacy. That is her role. Yes, that is her role. And she does it well, and she gaslights the hell out of other folx.

Cathy: Exactly. And you know, I’m not a fan of tearing down women, but I’m a fan of tearing down her because I do feel like she hides behind her gender, and she’s impenetrable. So the only people that can really attack her are women.

Kim: Yes.

Cathy: I’m just, I’m her enemy.

Kim: Yes. Oh, I can’t believe we have the same top two. [Both laugh] That should say something. That should say something. Because we come from very different lived experiences.

Cathy: Yes. Yes, but we see the same facts.

51:35

Kim: And that is why we need other people in the room. So people think it’s all about—when you bring other people in the room—it’s always gonna be division. No, we agree on some things. We can get to a place where it’s like, from my life experience, from your lived experience, oh we’re seeing some same things. And it’s…

Cathy: Yeah. Yeah, agreed.

Kim: This has been… Oh, yeah, and it pisses me off when you have these people who are never impacted, or seldom impacted, by harm say, “Well, why don’t you just delete Facebook and Twitter?” That’s how… they’ve done one thing correct, and that is create community. That’s the only way I get to communicate with family members I don’t see.

Cathy: Right. It’s not just that though, Kim. And I agree that there’s that. The fact that all of our aunts and uncles and moms and brothers and sisters are on Facebook means that it is an important source of community. But it’s also like Twitter is vile, but it is a conversation.

Kim: And Black women have figured out how to amplify, work… it is a shitshow, but we figured out how to make it work for us, and to tell me to walk away from my community. No!

Cathy: Yeah, you’re asking people to take themselves out of an important conversation, and that’s just not acceptable.

52:58

Kim: And that’s my livelihood. I get clients. I get support. I get financially supported from Twitter. I’m not gonna cut that. I am an entrepreneur. You’re outta your mind.

Cathy: Yeah, exactly. I agree with you completely. And I do also agree that Black women on Twitter are a force to be reckoned with.

Kim: Oh we rock it.

Cathy: And by the way, I wanted to write an essay about Stacey Abrams today, but I’m like, “Everyone else just wrote an essay about Stacey Abrams.”

Kim: You do not know how proud I am of this fat Black woman…

Cathy: Me too.

Kim: …because everybody counted her out. Everybody counted her out. And everybody, you know, “She’s not qualified to be…” She is differently qualified, but if you go by what the fucking constitution [says], she is of age, she’s a US citizen, and what we got in there right now, are you really comparin’ her to that?

Cathy: [Laughs] That’s a really good point.

Kim: Are you saying…

Cathy: Everyone is qualified. My fucking dog is qualified compared to that.

54:01

Kim: Thank you. Thank you. And so the fact—and here’s another thing. It’s fuckin’ with people that they actually have to give her credit for what she’s done.

Cathy: I know. I know. It is really hard for some people to deal with it. But that’s one of the reasons I wanted to write that essay. The other thing I wanted to say today…

Kim: Well, I’m going to say before you say there are never going to be enough essays about her. G’on ahead and write that essay. [Laughs]

Cathy: OK thank you, I will. But the other thing I wanted to say is it was the most fantastic, ironic backlash to the intended voter suppression, of like, “Oh, don’t count the mail-in ballots till afterwards.” What it did was, it made it so much more obvious than it already was—which was already obvious—that it is Black voters that are turning these elections. When I think back to 2016 and the voter suppression ads that the Trump campaign ran on Facebook that basically told Black people, “Don’t bother voting, ’cause your vote isn’t gonna count.” This is the opposite. It’s like, “Your votes made this happen.” And that’s just the most wonderful message.

Kim: Every city. Detroit, Philly, Atlanta, the Navajo Nation. I mean, all—turned Arizona, and Milwaukee—all of these disenfranchised people who you consider the bottom of your shoe, all of this anti-Blackness rolls up and said, “No bitches, we got this.” And think about if we had not, because 70 million voted for that asshole. So think about if Black folx had not come up, we would have him.

55:52

Cathy: Exactly. It really is the difference. And it is always going to be true that this is the story of the 2020 election. You know what I mean? There will never be another successful attempt to tell people, “Your vote doesn’t count.” It’s like, “Oh, yeah? What happened in 2020?” It’s a wonderful but a totally unintended side effect.

Kim: Oh definitely. Just like they didn’t realize by—or he didn’t realize—by demonizing mail-in votes that people were gonna [scoffs] mail in votes! [Laugh] And that meant that his folx who actually listened to him—because we don’t listen to him—actually did not do that. [Laughs]

Cathy: It was actually kind of predictable, but at the same time, I didn’t see how sweet it was gonna turn out.

Kim: Oh yeah that was a sweet feeling, wunnit? [Laughs]

Cathy: It was so sweet! Like, “Oh. OK.”

Kim: Yeah, yeah. And I still—and the thing is that I keep coming back to—I still see folx struggling with saying, “Black people, we owe you for this.” And yet—or, they’ll say that, but yet, you know, “Y’all saved us,” but let us try to have a conversation about reparations. “Oh, hell, no. Don’t, everybody, what preparations? What are you talking about? You don’t deserve reparations. [Laughs] That’s in the past, what are you talking about?”

Cathy: We’re still not there. It’s really, white shame is the real thing. It really is.

57:15

Kim: Yep. And that’s what I tell people: I’m no longer responsible for your fuckin’ feelings. You need to go get you some—manage yo feelings—get you some counseling, do whatever the hell you need to, because the most marginalized, the oppressed are no longer… Fuck civility. We no longer care about white feelin’s.

Cathy: Oh, I have a whole section of my book about civility is just, it just means like white people or people in power don’t like to hear things that they’re doing wrong. Like, get over it. When it’s like, we’re oppressing poor people, that never happens, right? It’s just civility is called for when powerful people don’t like feeling uncomfortable.

Kim: Well, my… This is how I say it: civility is optional for white people. It is the expected behavior of everyone else because it forces us to manage us so whiteness can do whatever it wants to.

Cathy: Exactly, exactly.

Kim: It’s just a way of getting us to, “You shouldn’t say it like that.” And it’s definitely in my timeline; I tell people all the time, the Black community definitely has to deal with our own internalized white supremacy and anti-Blackness, because of all of this stuff, that we have to be this way, we can’t do that way, and we look down on it. It’s all in service to white supremacy. It is never in service to us.

Cathy: Yeah, I believe that.

58:39

Kim: So what would you like to say in your final moments on the show?

Cathy: Well just: really fantastic to meet you. And I agree that you are an economist. You think about incentives and you think about quantitative, and it’s an important perspective, and I’m really glad you’re doing this. Thank you for inviting me.

Kim: Thank you so much for having me. When I put my name out there as the Antiracist Economist, I was like, “You know what, we’re gonna do it.” I actually put it in my bio, Future Winner of the Nobel Prize in Economics.

Cathy: [Laughs] That’s awesome!

Kim: My puppy’s name is Nobel Prize in Economics; we call her “Bel.” [Laughs]

Cathy: That is cute. My puppy’s name is Cooper. No relation. Cooper. [Laughs] Just a standard puppy name.

Kim: That’s a cutie! [Laughs] Well, thank you so much. This has been amazing. And I’ll see you on Wednesday.

Cathy: All right, great. Nice to meet you.

Kim: All right, thank you.



All music for the #causeascene podcast is composed and produced by Chaos, Chao Pack.
