Ayodele Odubela

Podcast Description

It’s funny because the title of my upcoming book is called Uncovering Bias in ML and I feel like it’s…it’s not that it doesn’t go deep. I feel like that’s the first layer, right? Yes, it talks about discovering that this bias is here. But I’m like: I make it clear from the first chapter… Yeah, I’m not here just to tell you this exists. I’m here to tell you that it is your responsibility and job to change it. Or you are continuing to reinforce the status quo. You’re complicit in the white supremacy. You’re complicit in the racism, the homophobia. Every aspect of that you are complicit in actively reinforcing if you do NOT take these steps. And I think that’s the thing. People are like: “Ethics is like, oh, it’s just a couple steps at the end”. I’m like…we have to destroy the entire workflow every company is using to build AI products, or it’s not gonna change. I’m not gonna keep putting a Band-Aid on it. I’m not gonna keep talking about Band-Aids.

Ayodele Odubela is a Data Scientist working on driver risk mitigation at SambaSafety in Denver, CO. She earned her Master’s degree in Data Science after transitioning to tech from social media marketing. She’s created algorithms that predict consumer segment movement, goals in hockey, and the location of firearms using radio frequency sensors. Ayodele is passionate about using tech to improve the lives of marginalized people.

Additional Resources

Transcription

00:30

Kim Crayton: Hello everyone, and welcome to today’s episode of the podcast, #CauseAScene podcast. I know I’m gonna fuck this name up just because I’m fucking thinking about it. I apologize in advance, but don’t help me; I’m gonna phonetically: Ay-o-dele Oh-du-bela.

Ayodele Odubela: You got it perfect. You didn’t even fuck it up.

Kim: Oh shit!

Ayodele: Yes. Take the win. Take the win.

Kim: Girl, you gotta. As a Black woman, we gotta take ’em where the fuck they come.

Ayodele: Everywhere we can. Like really.

Kim: Exactly. Pronouns: she / her / hers. Girl! Baby! Introduce yo’self and then tell us…

Ayodele: Hello!

Kim: …’cause we got a lot to talk about this day.

Ayodele: Yes we do. We really do.

Kim: And then tell us why is it important to cause a scene? And how are you causing a scene? And we’ll just get right into it.

01:16

Ayodele: Yeah. Yeah. So my name is Ayodele, like you said. I’m a data scientist. I consider myself more of an AI ethicist because—and I don’t even like the word “ethics,” but [laughs]—I care about creating fair algorithms because we have suffered under crap algorithms that disproportionately affect us and ruin our lives since like day one. I started my career like in marketing and I started working more with analytics and then went to school for data science. And it wasn’t until after graduation; I attended the Data for Black Lives Conference on like a student scholarship, like blown out, paid for my hotel. And that was when I realized, “Oh, wait, this whole data thing that you’ve been learning, all of this machine learning stuff, really hurts Black people,” and nobody in grad school mentioned it, nobody in a lot of industry roles mentioned it.

So I think it’s important to cause a scene because, like, this affects me. This affects my personal life. This affects multiple aspects of our livelihoods. And on top of that, I mean, I’m causing a scene because I’m writing a book. I’m talking about these issues and delving deep into especially the social implications of technology and not just focusing on, “Here is a technical solution. Here’s some code.” Really going in on like… and even, you know, talking with my publishers. This is not just a technical book. If we don’t sit here and decolonize this; if we don’t sit here and talk about white supremacy; if we don’t peel back the layers and actually get to the truth; a lot of what data ethics does is it kind of glazes over and lets practitioners feel cushy and like, [raises pitch mockingly] “Yeah, we’re doing it the ethical way. And people are just biased. What are we supposed to do?”

Kim: Whoa girl, you just hit on something!

Ayodele: No, I’m going into we’re straight up not creating this technology. Because it’s biased, because the data is bad, does not mean we just get to gloss over and do it anyway. So…

03:10

Kim: OK, so you’ve just fucked me all up, because I’m just gonna tell you right here: you said you don’t like being called a data ethicist, you don’t like using that, but you just hit me on—ooh, to slam me in my face—was the very reason I don’t like terms like “nice” and “fair” and “good.” Who the fuck gets to define what ethics are?

Ayodele: Yep.

Kim: And you just hit that!

Ayodele: The same people who are perpetuating… yup.

Kim: Exactly! And so then they get to tap themselves on the back because they’ve given themselves these titles as if they’re ethical and their decisions are ethical by default because they give themselves these titles. But when white supremacy is the default, all you can do is perpetuate white supremacy in your systems, in your data.

Ayodele: With the same tools, with the same methodologies, to only… I think that’s been my biggest critique too, like you mentioned, these groups “for good technology” and all these “for good” initiatives; they really just go lightly and tread lightly over, [raises pitch mockingly] “You know, there’s some biases, there’s some bad things on the planet; like criminal data, it’s got some problems. But you know, let’s do it anyway.” And I’m like, “No, we have to destroy this at the fucking beginning.” If we’re not going to literally destroy how we’ve built our academic and our technological—basically tech companies—in every aspect of the faux meritocracy, the racism that goes on in these upper—white—middle class communities of engineers; if we’re not going to destroy that systemic racism, we’re not going to actually change anything. We’re just gonna be putting a Band-Aid…

04:52

Kim: Girl, we’re not even… we puttin’ a Band-Aid on a damn bullet wound. And it’s so interesting; I’m glad you bring up because people are asking me now, “Kim, do you know any antiracist companies?” No, I do! No! Then don’t fuckin’… y’all just started usin’ the goddamn word a few months ago.

Ayodele: Zero. Zero. Like the only antiracist companies are companies who are founded, typically by marginalized people, who instil in every act—facet of their organization—that they do not tolerate racism; that they do not perpetuate it. And even then, they’re still typically at the wills of investors.

Kim: Oh, most definitely. I was gonna say, they don’t scale. They’re not—those aren’t the companies that are scaling. Those aren’t the companies that’s getting funded. Those aren’t the companies… yes, so I mean so they might be making their… First of all, you hit on it: it’s a marginalized team or person. It’s not some white person who’s doing this.

Ayodele: And usually with less access to the same resource pool.

Kim: Exactly. And no funding. So no, I don’t see it. Where the fuck—where do you think I’m supposed to see it?

Ayodele: It doesn’t exist.

Kim: So I’m building them. I’m building ’em. I’m just like, “You know what? Let’s build this crap.”

Ayodele: I’m with you.

06:05

Kim: I do not trust white folx to do this work. Y’all care about ya feelings too much. Just like you just said, you put a—to me, that “ethicist” title’s just like “ally” title—I don’t need none of that mess, because your consistent demonstrated behavior is not what… is not… is not showing…

Ayodele: It doesn’t support that.

Kim: …all of this antiracist, anti-bias—and I love that you just hit on this, ’cause you just you just took me to, this is why I get so—and I’m gonna talk about Coinbase and Brian Armstrong…

Ayodele: Yup. We’re goin’ in.

Kim: Girl he… until his shit dragged.

Ayodele: We’re goin’ in.

Kim: Exactly. He does not realize that his moment of trying to be woke has set a legacy for his company for years to come, because now he’s a case study of this bullshit, because there is no apolitical perspective, period. And you just said that in that brief introduction. There is no apolitical!

Ayodele: There is none. There is none. And I keep posting this quote because—and I hate… [sighs] slight tangent; I hate this, but we have to sometime leverage the work of white folx who are doin’ good, you know, in order to get the same message across because it gets listened to. I keep quotin’ white people talkin’ about this, like data is political; if you’re a data scientist, your job is political. Stop pretending like…

Kim: Oh, I use white…

Ayodele: I’ma quote you all day long, because…

Kim: Baby, I use white adjacency a lot. [Laughs]

07:33

Ayodele: Yeah. Yes. And I’ll tap into it. Link the paper, go read the paper, go read “Data Science as Political Action,” and go believe it. Go read it.

Kim: And unfortunately, that’s a very slippery slope, because then you have a Robin DiAngelo, who does not do antiracism work at all, being touted and put up as the antiracist expert, and her work is about bias, which I have problems with.

Ayodele: Mhm. It’s not the same thing. We can dig into bias, but even that like—and it’s funny because the title of my upcoming book is called “Uncovering Bias in ML”—and I feel like it’s—it’s not that it doesn’t go deep—I feel like that’s the first layer, right? I feel like, for tech folx, we are not as…

Kim: Oh girl, if you put racism in your title, these motherfuckers like, “Oh, that’s not for me. I’m not racist.”

Ayodele: “Oh, I can’t buy that. That’s not a technical book.”

Kim: “That’s not for me. I can’t watch that on—I can’t read that on the train.”

Ayodele: Exactly. Literally. Yes it talks about discovering that this bias is here. But I’m like, I make it clear from the first chapter, yeah, I’m not here just to tell you this exists; I’m here to tell you that it is your responsibility and job to change it, or you are continuing to reinforce…

Kim: You’re complicit.

Ayodele: You’re complicit in the white supremacy; you’re complicit in the racism, the homophobia; every aspect of that you are complicit and actively reinforcing if you do not take these steps. And I think that’s the thing; people are like, “Ethics is, oh, it’s just a couple steps at the end.” And I’m like, we have to destroy the entire workflow every company’s using to build AI products or it’s not gonna change. I’m not gonna keep putting a Band-aid on it. I’m not going to keep talkin’ about Band-aids.

Kim: And you’re hitting on just the whole… it all has to go.

Ayodele: If we don’t throw it out, I’m sorry. I didn’t create it; don’t be mad at me. I didn’t create the system, but I’m gonna tell you what we have to do.

09:30

Kim: Oh, let’s use this. Their logic makes that—just like you just said, we didn’t create it—so they’re like, “Oh, but you know, that was…” OK, fine. None of us who exist right now created it, right? But would we use a car to feed a baby? No. So I don’t understand why this is just, like these hoops that they are cool—it’s like, your argument makes absolutely no sense. Your analogies, your metaphors, whatever you wanna call them, they make absolutely no sense when you put it… OK. Cars have killed people, right? Yeah, uh-huh, I mean, people drunk… I mean, even a car, somebody forget to put it in park, it rolls down a hill…

Ayodele: It’s not hard at all.

Kim: Exactly. And yet we don’t say, because cars were never designed to feed babies, and so we don’t extend that to…

Ayodele: We don’t use that kind of logic for anything else. And I think that’s the thing, too, is that like you mentioned, it’s kind of that patting themselves on the back like, “Oh, well,”—I think even with the Twitter cropping algorithm—”Well, we tested it for fairness.” No, no, no, no, no. I want to see y’all’s documentation. You better pull it up. Pull it up!

Kim: But again, I don’t even get to that point. Who the fuck gets to define fair? So when you get there, I don’t want to see your data; I don’t want to see shit. Because I’m cutting it off right there. Show me the pictures. Show me the lineup of the folx who said it was fair.

Ayodele: Exactly that.

Kim: And that right there is why this work is now—you need to be looking at it as a risk management issue, and increasingly, a crisis management issue.

11:12

Ayodele: Yes. Literally incident response, like monitoring algorithms. I’m like… so it’s shocking to me—and I’ve watched, since I started writing this book, I’ve done a ridiculous amount of research; read hundreds of papers, watched hundreds of people talk about this—even from leading organizations like Google, they are trying to set a precedent, but they’ve used their own user survey data to come up with numbers. It’s like yeah, 90% of companies don’t test for fairness at all. These companies don’t have teams to manage incident response.

Kim: This is why I get so flustered, flummoxed, at our industry, because we only want quantitative data, which is a point, which is a point.

Ayodele: One point.

Kim: And I need to understand the qualitative—where’d that data come from? What question did you ask to get that data? Who was in the room? What happened? I need all of that before I can even start having a conversation about what quote unquote “fair” is.

Ayodele: Yeah, I think that’s exactly the fact. And then we don’t—tech has always had this problem, and from someone who doesn’t have a straight technical background, it’s more apparent—there is such a reluctance to work with social scientists. I’m like, “Y’all, this is too late.” We’re in 2020, where this has arrested people, this has killed people, this has harmed people in multiple ways. I’m not going to sit here and placate your boo hoo feelings. If you’re not onboard with fixing it, find another industry. Find another thing to do or build that ain’t gon’ hurt people. Peace. Because if you’re going to be working on AI technology, it’s not just to be like, “OK, so I saw the bias in the data, and then I tested for fairness, but my company wanted something to release anyway, so I built it. But I feel bad.” I’m not gonna placate your fucking feelings anymore, like, “It’s OK.” It’s not OK.

13:02

Kim: And then you get on Twitter and you talk about, “Oh, da-da-da…”

Ayodele: “I saw this talk that made me feel really guilty.” First of all, data professionals, the vast majority of data professionals, have enough privilege to turn this work down. Have enough privilege to walk the fuck away.

Kim: ‘Cause folx are collecting and storing data and don’t know what the fuck to do with it anyway, so yes. So, yeah, you can get a job. Yeah, yeah, yeah.

Ayodele: Yeah. You can go get another job. You can tell your… you can risk—for most white men—you could risk your own fucking reputation and be like, “Yeah, I don’t think we should do this, and I am not going to build this.” Guess what? People like me get fired on the spot. But at the same time, the burden is on my back to tell the fuckin’ truth because everyone else who wants to work in data ethics wants to be like, “Guys, it’s OK. I know you want to balance the company ROI because you put a lot of money into AI. You know, we can just test and then release it anyway.”

I’m like, “No.” If you’re not onboard for just blowin’ the whole thing up, then stop pretending that you give a shit about accountability, stop pretending that your company is gonna be accountable. Being accountable is not just like, “OK, we saw data bias. We tried to fix it.” You need to be fuckin’ responsible for when this shit hurts people. You need to pay people when it fucks up their lives. You need to have recourse and remediability.

14:20

Kim: That’s the part that’s coming. That’s the part that’s coming, and that’s why they’re fucking scared…

Ayodele: It’s gonna hit ’em quick.

Kim: And they scared about section 230, because that is what’s been holding them together and letting them just slide by. But even if that isn’t dismantled, what’s gonna happen is the crisis—it’s gonna be litigious, and that’s where the shit’s gonna hit the fan. Because your AI has killed someone. That’s where it’s going. That’s where…

Ayodele: Is at that point. And I think that we are finally getting there policy wise, and even if it’s not just—I don’t even really give a shit about US government policy—the court of public opinion is holding these companies accountable. We’re seeing more and more of it.

Kim: And they’re gonna be so far behind the eight ball because they didn’t plan. Canaries have been dyin’ in the coal mines, and y’all kept ignoring it, because white supremacy allowed you to ignore it and let you be the expert, and you’re not gonna know where to—and I’m just gonna sit back and hike my prices up ’cause, like I saw after George Floyd, white guilt pays well.

Ayodele: Yes it does. Yes it does. I’m like trust, one of the—it’s funny—the big aspect of my book, the probably most controversial aspect—and the whole time I’ve been writin’ this, I’m like, I don’t know, it’s difficult. You know this work is difficult. In some moments, it’s like, man, this is joy. I get to tell the truth. I get to actually say what these people need to hear. And the other part I’m like, maybe I should get security, ’cause I’m gonna get some…

Kim: Girl! Girl!

Ayodele: I’m talkin’ about it’s acceptable to white supremacy that y’all—everyone in these AI communities are OK.

16:01

Kim: OK, so I’m gonna stop you right there. Have you watched my video about how to protect Black women online that I created?

Ayodele: Oh no, I have not seen it.

Kim: Baby you need to watch it because it talks about how you need to put your stuff in order so that you can’t get doxxed and all this other bullshit.

Ayodele: Box your shit in; fuckin’ WhoisGuard. Literally like…

Kim: And that’s what white folx don’t get; just, I mean, I wanna cut you off, because they don’t get this. You just said what little it would take for them to stand up, and look at us havin’ to protect our physical lives just to open our fuckin’ mouths, ’cause we do the work that they refuse to do.

Ayodele: Exactly. And then they can go collect a check that’s higher because of who they are.

Kim: Oh, I already know that I done burned so many bridges it’s ridiculous. If I don’t have my own business, what the fuck am I gonna do? Because I don’t know many people gonna hire my ass. I’m just gonna be… [laughs]

Ayodele: They don’t want to hear the truth. They don’t wanna pay money for the truth. They want yes men and yes women. Despite what they say, they don’t want to hear someone be like, “So let’s blow it up and start from the beginning.”

[Interlude]

18:01

Ayodele: “…let’s blow it up and start from the beginning.”

Kim: OK, so we’re here. Just today, I knew we were having this conversation, and I wanna thank you—let’s talk about this right quick—because I wanna thank you; I had to reschedule this a few times because this work…

Ayodele: No problem.

Kim: No, I want to talk about that, because as a Black woman, this work got on my nerves at certain days, and I just did not have the energy to do these interviews. I don’t think people understand what this work takes from us. And so… no, no, hold on, ’cause I want you to tell the… so I knew you were coming, but then you posted something today.

Ayodele: Yes.

Kim: That’s what I want you to talk about, because this right here is… yeah, so go ahead.

Ayodele: [Sighs] This is rough. This is rough. So, I’ve been doing a couple talks recently, preparing for one that’s next week, and this article came out days ago—maybe like 5, 6 days ago—about a study they did on Boston liver disease patients. So the hard thing for me—I kind of ignored this article; I had a million things going on at the time—I went back and read it, and I was like, “Wow, this hits home.” So, my father has liver disease. He had a stroke in March of last year, and has been on the transplant list, he’s been going to dialysis consistently for over a year. It’s not that he lives in Boston or anything, but I’m like, he’s a Black man in America; of course he doesn’t have the same access to healthcare as many other people, like my colleagues do, my colleagues’ families.

So not only that, going and diving deep into this study, they basically looked at, I think it’s E… there’s a metric that’s for measuring your liver functionality. And for Black patients, in 2009, these researchers went in and manually race-corrected one of these metrics because they had so few Black patients, right? So they wanted to make their data basically easier to deal with. But it’s one of the few algorithms in healthcare that specifically takes in racial data. So if we think about that, first, it’s just fucked up to begin with, right? Second of all this race correction added about 16% to this measure for every single Black patient—and Black patients only; so this is not including Latino patients, not including Asian patients, not including white patients…

20:31

Kim: OK, I wanna stop you right there. What do you mean, race-corrected? Explain that.

Ayodele: Mhm. So this measure—I forget what it’s called; it’s like E-something—but it’s a measure of your liver function. So it goes from—and I’m kinda making this up, because I don’t know the full range—I’m imagining from 0 to 100. So the lower the number, the more at risk you are for going into liver failure; it just says your liver is doing worse. And so that’s an aspect that this algorithm takes into consideration when telling doctors, “Hey, you should recommend transplants for some people. You should recommend other kinds of liver therapy for some people.”

So for Black patients, they were like, “OK, looking at the proportions of the data,” they’re like, “Oh, we don’t really have a lot of Black patients in this group.” But looking at the scores of the Black patients and the scores of all the other groups: white, Asian, Latino, they were like, “Oh, you know what would make sense?” And I can’t explain why they made this decision, but for some reason, they were like, “It’s going to make dealing with this really imbalanced data easier to just add a unit, essentially, to every Black patient’s measure for this thing.” Their EKG or something like that.

Kim: So basically, if there is—on the spreadsheet—if there’s a Black person, we’re gonna add 20 points or whatever that was?

Ayodele: Yes. Yes.

Kim: Just by default, because we recognize that there’s something going on that we don’t have any Black people, but there is no explanation in the arbitrariness of whatever they decided, whatever the exponential thing they… OK, go ‘head.

22:12

Ayodele: Correct. I couldn’t find any reason for that. And so for every patient…

Kim: OK, so I’ma stop you. I’m sorry, ’cause now my brain just… so, instead of going back and figuring out why Black people were missing, they just decided to arbitrarily add something to it without knowing why the Black people were missing?

Ayodele: Exactly that. And I think—and I can only extrapolate—they probably went to a hospital and was like, “Give us all your liver patients. OK, we have…”—because of, most likely, the region or the demographics of that hospital—they’re like, “OK, maybe we have 70% white and smaller percentages for everyone else, and very low percentage for Black patients.” So that’s what I’m imagining is probably the situation. And, like you said, instead of saying, “OK, let’s go to hospitals that serve more Black patients who have liver disease and other chronic illnesses to retrieve that and also compare that…” I can’t say why that choice was made, but instead it was, “Let’s artificially bump your numbers.”
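
[For illustration: a minimal sketch of the kind of flat, race-based “correction” being described here. The 0–100 score range, the roughly 16% bump, and the function name are assumptions drawn from this conversation, not the actual clinical formula.]

```python
# Illustrative only: a flat "race correction" applied to a lab measure.
# Assumes a 0-100 liver-function score where lower means sicker, and a
# roughly 16% bump applied to Black patients only, per the discussion above.

def corrected_score(raw_score: float, race: str) -> float:
    """Return the score the downstream recommendation algorithm actually sees."""
    if race == "Black":
        return raw_score * 1.16  # inflate the score for Black patients only
    return raw_score

# The same raw measurement now reads as "healthier" for a Black patient,
# pushing them further away from a transplant recommendation.
print(corrected_score(30.0, "Black"))  # about 34.8
print(corrected_score(30.0, "white"))  # 30.0
```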

Kim: Yeah, ’cause I’m tryna figure out why just the Black patients and not the Latinx patients? You know, it’s like what—again, and so I’m teasing this out, audience, so you can understand, even from a research science point of view, we make decisions that make no sense, that are not rooted in anything that is easily found, explained, and yet those are the decisions that we… you know, “Let’s run with it.”

Ayodele: Exactly. And I’ll tell you the truth; this happens one billion times over in industry. Even just as someone working in data to build models on a daily basis, the human decisions are not just in every step, they are so minuscule and yet can have a drastic impact. Not including one feature in a model can have a drastic impact on the model. I think that’s part of why too, I honestly believe we need to have either an oath or a certification similar to the way doctors do that can be revoked. Talking about specific people working with data and building models that impact people because we play around with life and death.

Kim: Yes, yes, yes.

Ayodele: We don’t want to admit it, nobody wants to… we play around with life and death like it’s nothing and like it’s a hot new…

24:48

Kim: And we play around with Black lives and death, definitely as if—not like it’s nothing, but it is nothing.

Ayodele: It is nothing.

Kim: OK, so go back to your story. Oh my lord. I just have to… this is so many layers. OK, so they added this arbitrary number, numeric thing to it, and then?

Ayodele: And then this study basically went back and looked at the outcomes, right? So the Black patients were 50% less likely to get recommended for a liver transplant, and they went and looked and said, “OK, if we look at this one blood measure that Black patients had artificially inflated; if we took that out and just took the measure out for each patient, recalculated the entire algorithm, there would have been 64 patients that were Black that would have been recommended for a kidney transplant.”
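
[For illustration: a hedged sketch of the counterfactual audit being described, recomputing recommendations with the race adjustment removed and counting the Black patients who would newly qualify. The threshold, the 1.16 factor, the field names, and the example patients are made up for the sketch.]

```python
# Illustrative counterfactual audit: which Black patients would have been
# recommended for a transplant if the race adjustment were removed?
# Threshold, adjustment factor, and patient records are assumptions.

TRANSPLANT_THRESHOLD = 35.0  # assume: recommend a transplant when the score falls below this

patients = [
    {"id": 1, "race": "Black", "raw_score": 33.0},
    {"id": 2, "race": "Black", "raw_score": 40.0},
    {"id": 3, "race": "white", "raw_score": 33.0},
]

def recommended(score: float) -> bool:
    return score < TRANSPLANT_THRESHOLD

newly_recommended = [
    p["id"]
    for p in patients
    if p["race"] == "Black"
    and not recommended(p["raw_score"] * 1.16)  # outcome with the race adjustment
    and recommended(p["raw_score"])              # outcome with it removed
]
print(newly_recommended)  # [1]: a raw 33 qualifies, but the inflated ~38.3 reads as "too healthy"
```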

Kim: Mmm.

Ayodele: The one thing the article didn’t mention, and I wish it would have, and I think this is one of the most critical problems when we do talk data ethics and then fixing this discrimination problem: tell me how many of those patients died.

Kim: Mhm! That’s what, yup!

Ayodele: Gimme the outcome. Because I think about this. This could be my dad; he’s still on the liver transplant list. And the article mentioned only three hospitals… so after the study was done, they found three hospitals have changed their metric and are not using this racially corrected measure for Black patients. Three. In the country. So everybody else at a hospital in the country, if you get your blood tests and you have liver disease and you’re Black, your numbers will be artificially inflated, and this study concluded a year ago. So I’m like, yeah, what are the chances that any hospitals outside of these three have changed this? What are the chances that Black patients are going to have better outcomes because of this study?

And while this study shines the light on it, they’re saying, “OK, well, you know, medical boards need to change their guidelines in order for widespread adoption, to remove this thing.” I’m like, “So we’re back at where we’ve always been.” This dumb thing has impacted us, hurt us, potentially killed people, held us back from resources and access to life-critical health care. OK, the study about it comes out, it gets a little bit of news, it’s really popular for a while; do the hospitals actually end up changing it? How many hospitals we have in this entire country?

Kim: Yeah, and it goes back to when you were talking about accountability.

27:37

Ayodele: Yep. If you’re not going to sit here and do something beyond shining the light, there will be no change. And that’s where I’m at the point where it’s like this is personal to me. You know, these are our lives at stake and knowing that—I think that’s why this work is hard is because knowing that most people don’t give a shit. Like most of my colleagues don’t really give a shit. Most of the people, like even the people who follow me online, yeah, you care enough to see it in your timeline, but do you really give a shit? Does it actually impact your life? Is it really something that is life or death for you or people you know? If the answer is no, then you are less likely to hold these folx accountable and be like, “Pay ’em out.”

Kim: And it is so interesting to me because this is when I talk about that white supremacy is rooted in every system, institution, and policy that we have. This… so they saw a discrepancy, and because the systems were set up the way they are, to make this blind, “Let’s just fix that. Let’s just… don’t ask any…” And this is why people love quantitative data and not qualitative. Because when you’re doing quantitative data only, you don’t have—outside of that screenshot, that moment—you don’t have to ask questions about why.

Ayodele: Yup. And I think that is… you hit the nail on the head with that. And I have been really pushing for two things specifically in tech recently. One is to stop this scale business and to try and think that everything is supposed to scale.

Kim: Whoa! Just because we can does not mean we should. Oh, lord!

Ayodele: Really! Have we not… we talk about this. How many of y’all tweet this randomly? “Just because we can doesn’t mean we should.” But then you go and do it at your job.

29:32

Kim: Oh, my god. This is my whole point of like, I don’t have a problem with “Move fast, break things.” I have a problem with “Move fast, break things, move fast, break things, move fast, break things,” and we never stop to figure out what we broke, how we broke it, who we harmed, who we need to make amends to, what do we need to fix? We never do that work.

Ayodele: No, never.

Kim: But you know why? But you know why? Because mediocre, unremarkable white dudes don’t understand that work. That work does not amplify them. That work does not center them. So why the hell do they care?

Ayodele: Exactly. There’s no impetus.

Kim: That’s what they consider the outliers. “That’s the outlier.” No it is not the fucking outlier! [Laughs]

Ayodele: Most of the world is brown. I don’t like—just because y’all live in the Bay Area—I’m sorry, most of the world is brown. And if you are trying to [sing-song] “make the world better” or [sing-song] “make solutions for the whole world,” then you need to center…

Kim: But you know what? You just hit it. But you can’t because anti-Blackness is the most ubiquitous and adopted thing in the world, globally. So now you hit up on it. So most of the world is Black and brown, but we got a strategy, we got a white supremacist strategy for that, and that’s anti-Blackness.

30:44

Ayodele: Yes, and that’s the easiest way to ignore completely, devalue the work, and I think that’s the thing. We haven’t been just starting to do this work. This is not new. This is not… I think, because white supremacy is in every system, not only is our work devalued, there’s less exposure, there are few people willing to stick their necks out and be actual sponsors and advocates. Trust, I still have a lot of respect for my publishing company for taking a book that I’m like, “Look, I’m gonna be honest,” and I just told ’em, I was like, “This is going to be a technical book, but you are going to have to publish something that’s gonna criticize white supremacy; that it’s gonna talk about Western centering; it’s gonna talk about decolonization; are you ready?”

Kim: Girl, I’ve given up on the fact of being able to get a book from a mainstream publisher. I don’t even see that happenin’ for me. [Laughs] I don’t even see that happenin’ for me.

Ayodele: No. And I think that’s the thing, is that we know that, because of all of those systemic factors and reasons, we are yelling for them to just hear a whisper.

Kim: Oh my god, let’s stop right there, ’cause you just gave me a visual! We are yelling so that they can get a whisper. [Laughs] And not only do we have to yell, but we have to yell, yell with civility. [Laughs]

Ayodele: Yes, yell with civility. Like six big buff security dudes around us like, “Hey guys, we can’t do this anymore.” That’s why this work is hard, but I think if it didn’t—and I hate to say this—if it weren’t my dad’s life, if it weren’t my life, if it weren’t how I interact with technology, if it weren’t so deeply rooted and close to me; I see this every day—we saw the Twitter cropping algorithm bias get discovered through a post about Zoom racism. And I’m like, “Can I go a day? Can I have a day?” If I had [unintelligible], maybe if I wouldn’t be so angry. [Laughs]

32:54

Kim: Oh! Oh my god, girl! When I take a week off… I used to never do that because I was like… but I realized there’s always gonna be shit, so I just need to…

Ayodele: There’ll be some new…

Kim: …I could come back—like the Coinbase bullshit happened when I was on break; I took a break. And I was like, and I was—girl!—I was holdin’ it. I was holdin’ it. [Laughs] I was like, “I refuse to break my vacation for this!”

Ayodele: And that’s real. And I think that’s the thing; I’m like, we have to protect our mental health and sanity in ways that y’all don’t. Y’all can go, [mockingly] “Oh yeah, I did this talk about ethics and I went home and didn’t think about it the rest of the day.” I’m thinking about like, how’s my dad gonna do tomorrow?

Kim: How many times did I reschedule wit yo ass? Two or three? [Laughs]

Ayodele: And for the same reasons. Because—I hate to say this—I’ll see something, I’ll have a conversation with somebody about this; I realized it’s not pushing the needle forward at all, and I’m like, “OK, I just need some rest.” [Sighs]

Kim: [Laughs] Yes. I need a gummy, and a chill. [Laughs]

Ayodele: Yup. Yes. Yes. Thank god I live in Colorado because I can do one of those. [Laughter]

Kim: Be quiet! [Laughs]

Ayodele: I would not be able to survive right now in other places.

Kim: Girl, I’ve gotten through this pandemic because folx keep sendin’ me gummies. [Laughs]

Ayodele: Yup. You have to do what you have to do.

34:11

Kim: Because I’m just like this is beyond wild, this is beyond… you could discount me if it was just me. You could discount it if it was just you. I’ve had I don’t know how many people come on this damn podcast who are beating the same damn drum every week.

Ayodele: Every… and it’s frustrating because—and I’ll be real, I’ve taken some moments where I’m like, “Man, I could just not talk as much about this and just live my technical life.” And I’m like, “I don’t have the privilege to do that without a conscience.” Like without saying I’m betraying my people. That puts me in the Candace Owens page. I did not… I’m not gonna go try and fix it on the inside.

Kim: I did not sign up for the civil rights movement. What the fuck did I… I entered tech! I did not sign up to be a civil rights leader. That is not what the fuck I signed up for! But here I am. You know, like, damn!

Ayodele: We have to do what we have to do. I truly feel, if I don’t do this work and I know what I know…

Kim: Oh, I couldn’t live with myself.

Ayodele: …to be in—and I don’t say this to make myself feel special, special, smooth, like bullshit—but to be in the 1% of Black women who have this deep knowledge in AI, I could never live with myself knowing what I know and to not every day be talkin’ about this.

Kim: Black women are the moral compass of this country and this world.

Ayodele: Yeah, and I think it’s… we kind of have to be. I look around rooms; not a single person is ever going to speak up, nor fight, nor—not just educate, but truly challenge the imperatives and the incentives. Not just OK, “I’m gonna come in your organization and give you some data modeling tips and provide algorithm monitoring.” Nah, we gotta talk about the incentives. Are you incentivizing your data team to create products and just push BS products out there with quarterly rewards? With team vacations?

Kim: Girl, you’re thinking of a car salesman. It’s a fuckin’ car salesman mentality.

36:21

Ayodele: Yes! It really is. And I think that’s the thing is that it’s not just, OK, fix our data workflows. That’s a huge part of it. The other part is, what are the incentives that drive your organization to do this in the first place? ‘Cause I’ll be real for a minute…

Kim: Oh, Facebook. Facebook, god damn!

Ayodele: But from an industry company perspective, you’re thinking, “OK, let me look at the bottom line. Each year we’re putting how much money into AI; if they’re not pushing out products that then increase our revenue somehow, increase our operations—or decrease our operations cost so we make more profit, then why are we spending money and paying y’all?”

Kim: Because that, when we have a shareholder-only perspective, that’s problematic. We need to be thinkin’ about stakeholders: who works for you, who partners with you, who buys from you; and who invests in you last, because when those other three are taken care of, your investment will be taken care of. And this is why I am experimenting with—first of all, I believe, I have no problem with capitalism as it’s defined, about small business. My problem with capitalism, and every system that we have—and I say this with socialism, communism—it’s all rooted in white supremacy. That is my problem with it. So my thing is, how do we… what business models, what organizational structures do we have that exist?

Right now, we currently have benefit corporations and L3Cs. I’m sure that we can create others, but that we have to get past legislators or whatever. But we need to create models that disincentivize this bullshit, that make something other than shareholder value matter. Because right now, again goin’ back to quantitative data, the only datapoint these people give a fuck about is the stock market. And we know…

Ayodele: That’s it.

Kim: …that there are other things about the economy that we need to be payin’ attention to. [Laughs]

38:18

Ayodele: Yup. But I think it is that laser focus, just like on the stock market, on that bottom line, what that profit is, whatever your SaaS company metric is. If that’s the way we approach it, there is no fixing it within that system.

Kim: No! We need a totally different shit. We need to change. We cannot… people have tried this shit. There is no conscious capitalism; there is no Black capitalism; there’s no Latinx capitalism; there is nothing until we talk about—and this is what my book is gonna be about—redefining capitalism without white supremacy, the economics of being antiracist.

Ayodele: Yes. Oh, my god, I am so ready. I am so ready, because that’s the conversation I think that’s at the root of this. Because we tend to talk about these in silos.

Kim: And we talk around this shit. Yes.

Ayodele: Yup. How it relates to AI in tech. What I could go do, and not really make any impact on the world, is say, “Oh, hire me to do some consulting. I’ll set up some architecture, add some documentation, and some accountability here.” But I’m not being honest to go put that system in a company that’s still revolving in the same kind of incentive-based system, the same kind of organization that, like you said, cares about basically profits; stake—and not stakeholders like you and me, but their investors—it’s not gonna change anything.

Kim: And that’s why I’m pushing this new… I wanna have conversations about profit without oppression.

Ayodele: Yes!

Kim: Because I don’t need to know your political views, I don’t need to know your religious… you want to profit without oppressing or exploiting other people, if that’s the case, let’s build that over here. Because I’m sick—so you hit the nail on the head for me. This last break I took, I recognized, and I’m sick of doing the explanation. I’m sick of—I’ve been doing this for at least three years. If you haven’t figured—and I’ve created enough content, y’all can find that. Find it. What I need to do now is create alternatives because when you talk about what needs to scale? That needs to scale.

40:27

Ayodele: Yes. And I think we have to focus on that because—and in the similar way, companies keep reaching out like, “OK, you talk about this. You showed us that this is a problem. What do we do?” And they’re like, “Give me a checklist,” and I’m like, “No.”

Kim: Yes. Oh, my god!

Ayodele: This ain’t a checklist you can just go and put in your company.

Kim: But that’s that binary thinking, though.

Ayodele: Yes! Yes.

Kim: And I tell people: white folx, y’all are binary. Right, wrong, good, bad. Well, we live in the grey. You need to learn how to live in the grey.

Ayodele: Yes, be comfortable with every answer I have for your company being, “It depends.”

Kim: Yes! [Laughs]

Ayodele: People don’t like it, but I’m like, “I need more information to go on to then say to you…” And even the same thing for fairness, and I go into this in my book, what is fair for an individual is not fair for a group, right? So…

Kim: And what’s fair for a cis, hetero, able-bodied, Christian person is not the same that’s fair—if we’re gonna use the word “fair”—for a marginalized person, so stop. That’s why I… fuck equality; we need equity. I don’t give a damn about equality.

Ayodele: Thank you. Thank you. Oh my god, even in the talk I just recorded, I’m like, “I’m not talking about equality.” I’m not trying to—if we’re trying to build equality, then why are we putting effort into something that, basically—at least for data and AI technologies—just drags the past forward instead of actually changing it to deal with the oppression that we’ve already created?
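
[For illustration: one hedged sketch of the point above that what is fair for an individual is not automatically fair for a group. A group-level check such as demographic parity can pass while near-identical individuals get opposite outcomes; the groups, scores, and threshold below are assumptions for the sketch, not a recommended metric.]

```python
# Illustrative only: a group-fairness check and an individual-fairness check
# can disagree on the same set of decisions. All data here is made up.

decisions = [
    {"group": "A", "score": 0.52, "approved": True},
    {"group": "A", "score": 0.49, "approved": False},
    {"group": "B", "score": 0.51, "approved": True},
    {"group": "B", "score": 0.48, "approved": False},
]

def approval_rate(group: str) -> float:
    members = [d for d in decisions if d["group"] == group]
    return sum(d["approved"] for d in members) / len(members)

# Group check: demographic parity difference is zero, so the groups "look fair."
print(abs(approval_rate("A") - approval_rate("B")))  # 0.0

# Individual check: two people with nearly identical scores (0.49 vs 0.51)
# land on opposite sides of the decision, which an individual-fairness lens flags.
```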

[Interlude]

42:43

Kim: OK, I’m gonna stop you there. I’m gonna stop you there because I’m going to challenge that. How can you get equality from a system that relies on oppression and exploitation?

Ayodele: You can’t.

Kim: It is designed for chaos and destruction. There is no equality in there. There is no space for it. It is dark, it is damaged, it is… yuck! [Laughs]

Ayodele: And I think that’s… oh, you know what? That is really the core of the issue. It is that we understand looking at historical data is incredibly biased and systemically fucked up, right? But white people look at it and they’re like, “OK, but that’s what we did then. So if we just push that forward, it’s the same. Everybody’s getting the same amount that they’ve been getting.” And if looking at that ground truth as—they see this as neutral, where we see it as bad.

Kim: And we know there’s no such… but again, we’re not at the same place because we understand that there is no such thing as neutral. So this is the whole point. This is… see this is why there can be no equality because we don’t come from the same—we’re not at the same starting place. I don’t care, if you give me $100 and you give a cis hetero able-bodied Christian dude the same $100, it does not spend the same. [Laughs]

Ayodele: No. It does not. And I find that really educating technical communities about the ranges of privilege and oppression, and I think the hard part about that… so I like to, in my talks, show the little scale that talks about all of the privileged groups and all of the historically oppressed groups—so talking about non-European, non-English speaking, illegal immigrant, all of those factors—and make a point to say, “You have to understand, there are so many aspects of oppression, and the thing is, the way they interact with each other, amplify each other, cancel each other out? Those are the things we can’t just measure.”

Kim: It’s like side effects of medication. [Laughs]

44:58

Ayodele: Yes. And all they want is like, “OK, give me a fucking calculation and a formula that says OK, Asian plus gay plus this part of the world…” I’m like, “There is none ’cause it doesn’t exist. You cannot quantify it.” So, with that in mind, how do we build systems that then find these relationships—and like you said, not equality—based on these relationships are able to allocate resources that amend for the historical marginalization?

Kim: OK, so I’ma stop you right there because this is… what you said sounds like the holy grail. It sounds like perfection. Why it will not exist in this current system and why we have to tear the system down is because, again, I talk about this often: brown and Black communities come from a collective. We are community-driven. White folx are individual-driven. We don’t have the…

Ayodele: Same mentality.

Kim: …are not aligned. It is not aligned. So where we see obvious, “If this is hurting one person, then the potential for hurting another is there.” Where they see, “Oh, this is what this is, one person. That’s an outlier. But the majority of people won’t get harmed.”

Ayodele: Yes!

Kim: That is totally different perspectives. It’s the same thing as when I was sayin’—and I didn’t understand this when I was talkin’ about it—but how the US has done some fucked up shit around the world; absolutely fucked up shit. And some chickens comin’ home to roost on that. But when I look at how we participate in war, when you have US soldiers—male, female, non-binary, whatever; US soldiers, I’m puttin’ everybody in—who this is a job for them, and they have signed a contract for a number of years, and they intend to come back to their families and they want to get paid. It is a job. This is just the thing that pays their health care and all their… On the opposite side, you have ideological folx who are blowing themselves up. That does not—you’re not fighting the same war. You’re not fighting the same war.

47:20

Ayodele: Wow, you really put it in perspective because it’s the exact… it’s the perfect metaphor for that. We are fighting for the greater good for everyone in our community. It is not the same perspective as most technologists who’re like, “I go to work. They told me to build something and I engineered it. So why do I have to be responsible?” 

Kim: Because—and also because—the thing I engineer does not negatively affect my outside work. You know, my off time. Then I’m good.

Ayodele: I find it hard to communicate that because it’s very frustrating and I’m overwhelmed with the thoughts about this all the time. [Laugh] There is no weekend from thinking about accountability because I’m like, hmm…

Kim: That’s why you gotta get high. [Laughs]

Ayodele: That’s why you get high. [Laugh] You have to sometimes forget this is happening. If I know that it’s so overwhelming, and it’s hard when I love the network thing that Twitter has brought me, just like the fact that Twitter…

Kim: Oh… girl, Twitter is a Black woman’s safe haven. We done figured this shit out.

Ayodele: We found each other.

Kim: We done found community, like what? What? What? What? You… what?

Ayodele: Yes, exactly.

Kim: I ain’t going nowhere. I ain’t going nowhere. Twitter need to fix they shit. I’m not going nowhere. [Laughs]

48:45

Ayodele: I’m with you. I found some of the most amazing community in that. But at the same time, I’m like, I very rarely get a respite from—and because I see so much great work from Black women—I don’t get a rest from this because we’re always talking about it. We’re always calling shit out. I saw a recent article—I think it’s a PhD student in Pittsburgh—and Pittsburgh holds a small place in my heart; I went to my undergrad there.

However, even when I was in school—and this is related to exactly what the researcher was digging into—Pittsburgh is an incredibly segregated city, and being a privileged Black college student in Pittsburgh—because while I may not have racial privilege, I had the educational privilege and the sheltering of my university, all of the medical care and all the BS and housing, right? I go to other neighborhoods, and I’m like, “Oh man, my friends are freaking out about being here.” I’m like, “Oh, it’s fine. I’ve lived in downtown Dallas; this isn’t a big deal to me,” you know? But she’s digging into why Pittsburgh has so many drastically different healthcare disparities because of its… both geography and access to food. Because that thing is real.

Kim: And if people don’t think food is political? Fuck y’all. All you vegans…

Ayodele: Food is fucking political.

Kim: …all you people… this is what… girl, they get on my nerves.

Ayodele: Whole ‘nother onion to like… you know. But it absolutely is. And I love it, but I read this article over the weekend, and I was like, “God damn, I went to this school this whole time,” and I hate to say it, but being young and naive like, [mockingly] “Oh, everything is great. There’s no problem here.” I was able to, you know…

Kim: Put the blinders on. Yes.

50:38

Ayodele: Understand why people get comfortable like that—and in a way, I feel guilty—but I think it allows me to in some ways relate and maybe get to those people more and be like, “Yeah, you don’t see it. That don’t mean it’s not happening.”

Kim: I’m gonna let you off the hook, ’cause you weren’t supposed to see it. Because we were supposed to assimilate. You weren’t supposed to see that.

Ayodele: Yeah, I was supposed to just go to the fancy shmancy…

Kim: And if you did see it, you were supposed to blame them for their—not the systems, institutions and policies, but the individuals.

Ayodele: And you know what sucks is, you’re right. I was a girl in my group of 10 white friends goin’ to buy a fuckin’ $13 salad on campus because they were like, “We love this salad.” I’m like, “Can we just go to Popeyes? Whatever, whatever.” But you’re right, there’s still an outside pressure to assimilate and be like the people to survive.

Kim: And that’s what I say, we have to deal with, everybody has to deal—everybody—has to deal with their own internalized white supremacy and anti-Blackness.

Ayodele: And you know what? I’ll be real. It took me a long time to really deal with it. And not in that I had the aspect of “Oh, you know, oppression is people’s… is someone’s own fault.” But I kind of accepted it like, OK, our world is shit. And I was like, “Oh wait, I could do some’n ’bout it.” [Laughs]

Kim: To me, that was… although this work is very overwhelming, the thing that keeps me going and where I don’t get burned out, is because white supremacy has left enough crumbs for me to see that this shit was not inevitable. This did not have to happen. This shit was a design. Oh, fuck! This was a strategy! When I got to this was a strategy, that was, “Oh, I can create a strategy to get rid of this shit and do something else.” That is what they don’t realize.

And so that’s what Coinbase—goin’ back to his ass—don’t realize that you, my sir, has set yourself up to be fucked in the first two years. So his follow up was, “Only 5% of the company or whatever took the payout.” And what you’re seeing is the most privileged people were able to take the payout. And this is my whole point: just because the Black folx that you thought was gonna go didn’t go, that just because they’re silent does not mean that they accept this bullshit. They’re in a fucking pandemic; the ability of Black folx to get rehired, particularly if they left out of protest, are far smaller than if a white person said, “I just had it. They just were against my values, and blah blah blah blah blah.”

53:09

Ayodele: And from the white mentality, they’d be like, “Oh, you’ll be able to get a job anywhere.” At the heart of every Black person working in a majority white space, we know if we choose to leave based off of our values, we are gonna be asked the next… do you think his employees don’t…

Kim: Why you leave your job? [Laughs]

Ayodele: …know, if they’re Black and their end date at Coinbase says anytime between June and October, you don’t think every single… I don’t want to say white male person in HR, but…

Kim: Oh no, white women too, because they uphold white supremacy.

Ayodele: Yeah, everyone in HR who upholds white supremacy and some who are marginalized who still uphold white supremacy…

Kim: And they gotta deal with their own internalized… yup.

Ayodele: …will see this, and that is an actual impact on their next job. It is not just, “Oh, you can write me a check and I’m good.”

Kim: No.

Ayodele: It goes back to where you’re coming from. We’re not coming from the same place in life.

Kim: Yes!

Ayodele: There is no safety net for me. I had to be successful.

54:08

Kim: And then… exactly—girl! This white guilt right now? I knew they were gonna lose interest, and I told all my Black friends: take this money now, put that shit away, put it away because they ain’t gonna be givin’ this shit for long ’cause they gonna realize, “Well, Black lives matter, but they don’t matter enough for me to keep paying for this shit.”

Ayodele: Yes! They’re like, “they want another check? They want another… like, ugh.”

Kim: “I gotta do this ongoing? Oh, no, I don’t think I can do it ongoing. This is hitting into my sushi money. What the hell?”

Ayodele: It is still in their brains: charity. Until it’s not charity.

Kim: But you know what? But that allows them to—spot the pattern—to be the hero or the victim and never the villain.

Ayodele: Yup. Yes! And they said, “Well, I did it because it felt like charity, but then I had to consistently do it, and then it felt like a habit. And then I don’t get rewarded for my habits.”

Kim: Exactly. But also, it’s not even that it’s charity; they’re doing it because—it wasn’t because of charity for us. It was because for the first time, they couldn’t be distracted from a man getting killed in front of them. This was all white guilt. This had shit to do with us. And I tell it, I just say this had nothing to do with—the 1400 people who paid to come to my event the first time had nothing to do with me. And that’s why afterwards, the other two had 100—both had less than 200 people each—because those were the people who actually want to do the work.

Ayodele: Yes. Yes. And I think that’s it, we do mention doing the work, but sometimes we got to go back and be like, “Are you ready to dismantle the systems that benefit you? To completely go back and re-imagine a new way of functioning in new systems? Because we’re not gonna keep Band-aiding this.”

56:01

Kim: OK, but you know what? I’m gonna… that last part they can see. They can see, “Oh, let’s re-imagine.” That first part of how I have to decouple my whiteness from this? Many ain’t havin’. And that’s the problem.

Ayodele: Yep. “What do you mean? I have to lose my rights.” People be like—I’ll say on the Internet, I’m like, “Yeah, if you’re a white dude, you can give up your seat at the conference.” And they’re like, “I don’t get to talk at a conference? My voice doesn’t get to be heard for the first time?” I’m like, “Aww.”

Kim: Personally, I don’t wanna hear shit else a white dude got to say, so you can get…

Ayodele: Me neither. It’ll be the same shit that ten of you have previously said.

Kim: Exactly. Talkin’ about some technology; that’s all y’all wanna talk about. If y’all talkin’ about somethin’ social issues is right at the damn… it’s the iceberg, right at the top, surface bullshit.

Ayodele: Yup. I can count on my hands, out of all of the research I’ve read—specifically on algorithmic oppression, on accountability, researching data bias—I could probably list by name 10 to 20 white researchers who are actually doing the work. And it’s not to say, “Oh, the rest are speaking fluff,” but the rest are not taking it as seriously as it is.

Kim: Well, that’s great. I mean, it’s the same thing as my problem with Robin DiAngelo’s book and Kendi’s book [“How to be an Antiracist”]. It’s great, it’s great when you talkin’ about academics, but when you put it out in the real world, that shit don’t fly.

Ayodele: It’s not the same. It is not. It’s really not.

57:33

Kim: So what would you like to say in the closing moments on the show?

Ayodele: Ooh, closing moments. Just I think we have to not just change our mindsets on this, but truly understand that everything algorithmically impacts our real lives. It’s not just Black lives, and I think that, yes, we are—I am focused on Black lives—however, it’s easy for people to ignore the small percentage of everybody else that’s impacted by this, but we shouldn’t ignore them regardless. We—if we’re going to be doing work that drastically impacts people, creating algorithms that decide if you get a job, that decide if you’re good enough to get into college—we have to introspect the systems in which all of these things operate and adjust everything we do to create equity based off of that. So what’s my last words? I just, you know, it’s about dismantling systems and not just providing shallow solutions, and we gotta understand: push back. If someone’s like, “Give me a checklist, give me one or two things to do.” I’m like, “There’s actionable steps, but there is no one size fits all. It’s just not going to be like that.”

Kim: And if you’re looking for it, then you should look into yourself because you’re the problem.

Ayodele: Yes, exactly that.

Kim: Thank you so much for being on the show. This hour has flown by. This has been amazing.

Ayodele: It has. Oh, this helped me. This was my therapy this week.

Kim: Yes, yes, yes, I hope you… and this is what this is. When I talk to Black women, it’s just like breathing. [Sighs contentedly]

Ayodele: Yes. I’m like, “Oh, it’s not just a singular burden anymore.” [Laughs]

Kim: It’s shared for this moment, yes.

Ayodele: Yes. Thank you so much for having me. I admire your work so much, you have no idea.

Kim: Thank you and have a wonderful day.
