Episode 32 Transcript

Safiya Noble: I have found in my own community, as a, you know, Black woman living in Los Angeles watching so many different kinds of predictive technologies, predictive policing, experimented upon my own community, that I think we have to abolish many of these technologies, and we actually have to understand to what degree are digital technologies implicated in some of the most horrific forms of violence and inhumanity. 

Noshir Contractor: Welcome to this episode of Untangling the Web, a podcast of the Web Science Trust. I am Noshir Contractor and I will be your host today. On this podcast we bring in thought leaders to explore how the web is shaping society, and how society in turn is shaping the web. My guest today is Dr. Safiya Noble, an associate professor of Gender Studies and African American Studies at the University of California, Los Angeles. She is the co-founder and faculty director of the UCLA Center for Critical Internet Inquiry, an interdisciplinary research center focused on the intersection of human rights, social justice, democracy, and technology. She is the author of the best-selling book Algorithms of Oppression: How Search Engines Reinforce Racism, and she was the recipient of a 2021 MacArthur Foundation fellowship. She is also developing nonprofit community work to foster civil and human rights, the expansion of democracy, and intersectional racial justice at the Equity Engine. Welcome, Safiya.

Safiya Noble: Hi. First of all, I just want to say thanks so much for having me on the podcast today. This is such a thrill and such an honor to be in conversation with you. 

Noshir Contractor: Thanks for joining us here today. Safiya, your book Algorithms of Oppression focuses on search engines. And you talk about your experience using these search engines and recognizing very quickly that they are not quite as objective and neutral as one might like to believe they are.

Safiya Noble: A decade ago, I was thinking about large-scale digital media platforms. I had kind of spent my whole first career in advertising and marketing. And as I was leaving the ad industry and going back to graduate school, it was so interesting to me the way that people were talking about search engines at the university. I was at the University of Illinois at Urbana-Champaign, in the information school there. Lycos and Google and Yahoo, you know, these new technologies were amazing in the way that they were indexing the web. But I also had just left this experience of trying to game these systems for my clients when I was in advertising. So really, I understood them as media systems. We were buying ads and trying to optimize. We were hiring, like, programmers to come into the agency and help us get this General Motors ad up on the first listing, things like that. So it was interesting to come into the academy and study search engines in particular, because I was just so fascinated by them. They were kind of banal and non-interesting compared to the social networking sites that were coming into vogue. And it was in this inquiry that I started doing searches and looking to see what kinds of results we get. And one day I just kind of stumbled upon searching for Black girls. You know, I’m a Black woman. My daughter at the time was a tween. I realized that when you search on Black girls, you were met with almost exclusively pornography. And I was thinking about, like, what does it mean that you don’t have to add the word “sex,” you don’t have to add the word “porn,” but that phrase, Black girls, is itself synonymous with hypersexualization. This was kind of like a sexism 101, racism 101. And that really was the thread that I started pulling on that led to the book Algorithms of Oppression.

Noshir Contractor: Are you suggesting that the reason the search engines were privileging these kinds of search results is because that actually reflected something that was happening in society already and it was simply being amplified here? Or was this some other kind of manipulation that resulted in those?

Safiya Noble: That’s the question, right? I mean, I think that the prevailing logic at the time, 10 years ago, was that whatever we found in technology was purely a reflection of society, right, that the tech itself, those mechanisms were neutral, and that if you found these aberrations, it was because people were misusing the technology and that’s what was becoming reflected back. And I felt like that was an insufficient answer. Because I knew from my previous career, we had spent time trying to figure out how to manipulate these technologies. So I knew that if they were gamable, that that was a design choice or a set of kind of business imperatives.

Noshir Contractor: Sometimes referred to innocuously as search engine optimization, which was a term that was used at the time.

Safiya Noble: It was interesting, because search engine optimization was kind of a nascent industry. And then, you know, our beloved colleague Siva Vaidhyanathan wrote this incredible book called The Googlization of Everything (And Why We Should Worry). I felt like, this is the jumping-off point, this book is actually the place from which I can now go even more specific about how this skews in these kinds of historically racist and sexist ways toward vulnerable people, toward communities of color, toward women and girls, under the guise of being neutral, normal, and apolitical.

Noshir Contractor: During the COVID crisis, we saw that algorithms were also playing a sinister role in the propagation of information, not just in giving us search results. Talk about some of the work that has concerned you regarding how algorithms were employed by university hospitals to guide vaccine distribution.

Safiya Noble: This was one of the most egregious examples, I think, of the kind of distorted logics that are embedded into a lot of different kinds of software that we use every day and really don’t think twice about. So you know, there was this story that went viral pretty quickly about how the vaccine during COVID-19 would be distributed. And of course, we had so many frontline workers who desperately needed that. And Stanford University Hospital, in the heart of Silicon Valley, right, deploys an algorithmic decision-making tool to determine who should get the vaccine first. And the algorithm suggests a group of people who happen to be retired doctors, who are at home, who are mostly protected. We have to look at and understand the data and the logics that are imbued into the different kinds of systems that we are making ubiquitous, because they will inevitably have huge consequences.

Noshir Contractor: Concurrent with that, we also saw an infodemic. You’ve talked about the role of algorithms and propaganda in escalating violence against Asian Americans.

Safiya Noble: One of the things we want to remember is this dynamic interplay between social media and search. During his administration, Trump was one of the greatest propagators of racist propaganda against Asians and Asian Americans, invoking sarcasm and hostility toward our communities and suggesting that Asian Americans and South Asians, and really Asians throughout the diaspora, were responsible for the coronavirus, right. And this, of course, we know, also elicited incredible violence. If you come across something like racist propaganda against Asian and Asian American communities in social media, on Facebook, you might turn to a search engine to query whether it’s true. And this of course becomes extremely dangerous, because we know that search engines are also likely to be flooded with disinformation and propaganda. Using something like Google as a fact checker for propaganda, and then having that propaganda be made visible to you, only confirms the dangerous kinds of ideas that you might be experiencing or exposed to.

Noshir Contractor: So there is almost a symbiotic relationship between what propagates in social media and what then shows up in your search results. One example that you’ve talked about is how Dylann Roof was influenced by reading white nationalist websites before massacring nine African American churchgoers.

Safiya Noble: Yes, that’s right. What we know is that the most titillating and often egregious material on the web, which includes racist and sexist propaganda against religious minorities and sexual minorities, is actually very, very engaging. It’s fashioned many times like clickbait, so that it looks like it could be kind of true. And this is one of the main arguments that I really try to make in my work: it is not just a matter of the fact that people are clicking on it, because guess what, people who are against that kind of material are also clicking on it, trying to understand what in the world is going on. But every one of those clicks really translates to profits. It will, in fact, contribute quite handsomely to the bottom line for many of these companies.

Noshir Contractor: Which takes you back to your initial profession, in the ad business.

Safiya Noble: Right. Publicly traded companies are required to maximize shareholder investment and profit at all costs, so there is no social responsibility, there is no social justice in the frameworks of Wall Street. I think we’re seeing now a decade of the results of that kind of misplaced value in our society.

Noshir Contractor: A lot of the tech industry engages in some pretty brazen experiments, where they would try out a particular kind of manipulation on users for a day. And you’ve compared that to something that would not be allowed in industries such as the pharmaceutical industry or the tobacco industry.

Safiya Noble: The way in which Silicon corridors around the world are able to articulate their work is shrouded in math, science, engineering. We have to be careful about how we deploy even words and frameworks like science, which get used as a shield sometimes, right, for some of the most egregious kinds of products and services. I heard the investigative journalists who broke the story about COMPAS, the recidivism prediction software, right, that is profoundly racist, predicting that African Americans who were arrested would go back to jail or stay in prison at, I think it was like, four or five times the rate of white people. They had all these boxes of paper documents to prove how the harm was happening. And the programmers didn’t want to hear it. And I remember sitting on this panel, in fact, it was at Stanford, with these journalists. I thought to myself, you know, we would never let three guys rent some space in a strip mall, cook up some drugs, and roll them out in pharmacies, right? And then when people die or people are harmed, say, “Hey, it’s just chemistry. Chemistry can’t be dangerous,” right? Like, we would never do that. So why is it that in the tech industry we allow this kind of deep belief in the neutrality of these technologies, without realizing that so many people have personally been harmed? And I think that we have to look at other industries, even like the era of big cotton, you know, during the transatlantic slave trade. We had many, many arguments during that era where people said, you know, “We can’t do away with the institution of slavery and the enslavement of African people and Indigenous people, because the American economies are built on it; it’s impossible, our economy would collapse.” And that is actually the same kind of discourse that we use today. And I think there’s a lot to learn from these other historical moments, to figure out how we will shift the paradigm as we did for those other industries.

Noshir Contractor: That invokes the abolitionist movement as a historical precedent as well.

Safiya Noble: Absolutely. When I set out a decade ago to work in this area and to think about these things, I didn’t think of myself at the time as being, like, an abolitionist. I was just curious in doing this inquiry, and I knew that there was something unjust happening, and I wanted to sort it out. And now I can truly say that there are so many technologies deployed with no oversight, with no regulatory framework. I have found in my own community, as a, you know, Black woman living in Los Angeles, watching so many different kinds of predictive technologies, predictive policing, experimented upon my own community, that I think we have to abolish many of these technologies. And we actually have to understand to what degree digital technologies are implicated in some of the most horrific forms of violence and inhumanity. Doing this work for 10 years has definitely moved me to the place of thinking of myself as kind of an abolitionist, in the sense that, like during the transatlantic slave trade and during the institution of slavery, it really was a small handful of people who were persistent about the moral and ethical failings of the economic and kind of religious and political system that was holding up such an inhumane set of business practices and social practices and political practices for centuries. And I think those of us who are trying to point to these dangerous moves will probably be described as some type of abolitionists in this sector, trying to raise attention to the costs, the long-term costs.

Noshir Contractor: How does one bring about a more concerted way of addressing all of these injustices?

Safiya Noble: We need structural changes, we need different laws, we need different policies. The law is not the only thing, but it certainly is very important. What I worry about with predictive analytics is that so much information is collected on people that is then used to determine whether they will have an opportunity, but also to foreclose opportunities. And of course, you know, we have to think about each of us. Imagine the worst moments of our lives. And I think, what if that moment is the snapshot in time collected about me, and that determines, it’s a no-go for Safiya Noble, right? And of course, it also forecloses any possibility of redemption, of forgiveness, of empathy, of learning, of change. And I think we don’t want to live in a society without those qualities. Predictive analytics really forecloses the opportunity for being known, for changing, and for having a high quality of life. Cathy O’Neil says in her book Weapons of Math Destruction that predictive analytics make things much better for people who are already doing great, and much worse for people who are already not doing well. And I think we want to take that warning seriously.

Noshir Contractor: Technology has a history of widening the knowledge gap and the information gap in society, so this is a natural progression of what has preceded it. One of the ways you have proposed addressing the issues you just talked about is an awareness campaign and digital amnesty legislation to combat the harms perpetuated by algorithmic bias.

Safiya Noble: When I was a graduate student, I was thinking about what happens when you are ensnared in a search engine, for example, and you can’t fight your way out of it, right, your name is destroyed. And of course, we have legislation in the EU, like the right to be forgotten, that helps address this, right. We don’t have this yet in the United States. Every engagement we have on the web is bought, sold, and traded by thousands of companies. So how do we withdraw? How do we pull ourselves out of these systems? How do we create amnesty out of these situations? I’ve been trying to talk to lawmakers in California about what it would mean for us to pass our own kind of localized versions of the GDPR. I love this article written by Jean-François Blanchette and his collaborators about the social value of forgetting, like why we seal juvenile records, so that those mistakes don’t follow you into the future. So how do we grapple with that now in this global web? Could we imagine and reimagine the way in which we appear? I once heard the director of the FBI at a conference, and he said, “As far as the government is concerned, who you are is your digital profile.” Can you imagine? I mean, people like us who study the worst parts of the web are on the internet looking at terrible things all the time. Judging by all the things we’re doing on the internet, nothing could be further from the truth about who I am as a person.

Noshir Contractor: That’s a really important point, because we celebrate the fact that now we can store and put into memory everything. But you’re pointing out the perverse aspects of keeping that memory always available to everyone. And you’ve worked with engineers, executives, artists, and policymakers to think through these broader ramifications of how technology is built, how it gets deployed, and, most importantly, how it gets used in unfair ways.

Safiya Noble: I think one of the most impactful organizations that I’ve been able to be a part of and on the board of is the Cyber Civil Rights Initiative, which is really the organization that has helped develop and pass all of the non-consensual pornography, or revenge porn, laws that we have in the United States. I think that’s a place that actively engages with trust and safety staff in large tech companies to try and help them understand truly the cost of their algorithmic distortions of people, many times young women. I have many relationships in Silicon Valley. The benefit of having had my whole first career in corporate America, for 15 years before becoming an academic, is that I really understand that when you work in a large global company, you’re one person, sometimes like on the Titanic as it’s going down, and you can’t actually stop it yourself. You know, you’re trying to figure out how to leverage and work across all kinds of teams and all kinds of people. I don’t only stay in academic and, kind of, activist spaces, or community spaces; I also go into these corporate spaces, and I try to talk about the work and give them the examples and challenge them to think differently. And I do think that now, if I look out at engineering programs, you see schools slowly changing, and that’s because industry is also saying, maybe we need people who have some background and experience in the social sciences and humanities, too.

Noshir Contractor: But you are saying that in general, you find many of them to be receptive to these ideas and willing to be educated about these issues?

Safiya Noble: Absolutely. I think there are a lot of people who do not want to look back and feel that they are on the wrong side of history, that they didn’t ask tough enough questions, that they took for granted the wrong assumptions. I can tell you in the classroom, for sure, as I train engineering and computer science students who take my classes, 10 years ago, they were hostile to the idea that their work had any political value. And now, traditionally aged undergraduates are completely clear when they enter the field, and they’re here to change it.

Noshir Contractor: So in addition to the work you do with corporate America, as well as activism, you’ve also rubbed shoulders with celebrities. Meghan Markle has cited Algorithms of Oppression as key to understanding the online vitriol that was spewed about her.

Safiya Noble: I got this email that said, you know, “Please save the date to meet with the Duke and Duchess of Sussex.” And I thought it was like a scam email. Okay, long story short, it was real. They had been given my book by my former dean from USC. Meghan, I think, really saw an explanation for the incredible racist and sexist vitriol that she’s experienced on the internet. You know, the way that I could articulate how Black women and women of color become fodder for destruction, almost like a sport on the internet, was something she had really experienced herself. And here was someone explaining that this is actually a business imperative for these companies. And so they have given, you know, resources to the UCLA Center for Critical Internet Inquiry. Bot Sentinel just, you know, issued a couple of really important reports that showed how just a few dozen high-impact accounts on social media were coordinated to basically try to destroy their family, destroy them. But I will tell you, Meghan and Harry understand that a girl living in Iowa, a teenager living in Oakland, people who are vulnerable and don’t have the platform and resources they have, could never fight off these kinds of internet attacks and cyberbullying and trolling. And I think that is why they want to put their time and efforts and support around people who get that, who care about that, and who are also working on that, too.

Noshir Contractor: Wonderful. So as we wrap things up here, I want you to look ahead. I want you to tell us a little bit about what plans you have as part of working as a MacArthur Genius, and also the launch of your new nonprofit Equity Engine.

Safiya Noble: I’m still pinching myself. I’m very grateful. And there are many Black women and women of color who, with just a little bit of support and scaffolding, could continue to have a very big impact. I mean, I look around and I see, whether it’s Stacey Abrams or, you know, a whole host of Black women, too many to name, and women of color, who are holding up families, villages, neighborhoods, the country, democracy, and who are really under-resourced in doing that work. And so I’m really trying to use this opportunity to build Equity Engine. I’m hoping that people will just give resources and their networks and their power to women of color, because the one thing that women of color have is an incredible sense of justice. And our work has been on the frontlines of expanding human and civil rights around the world. And we are also the least resourced in doing that. And so the Equity Engine is really a place for people to help us hold Black women and women of color up and build their power, in many of the ways that this MacArthur Fellowship is helping me do.

Noshir Contractor: I love the term Equity Engine as well; I think it’s a very apt name for your vision. Thank you again so much, Safiya, for speaking with us. You brought us lots of really interesting insights and awareness about some of the ways in which we need to be much more skeptical about the web in general, and about algorithms, and to do something about it to make a difference.

Safiya Noble: Yes. Well, thank you. It’s such an honor. I have followed you my whole career, and I just am so honored that I’ve done something right in life to get to be in conversation with you, so thank you so much for this opportunity.

Noshir Contractor: Untangling the Web is a production of the Web Science Trust. This episode was edited by Susanna Kemp. I am Noshir Contractor. You can find out more about our conversation today in the show notes. Thanks for listening.