Episode 31: Vint Cerf on Launching the Internet on Earth – Then in Space

 

Our guest for this episode is Vint Cerf, who is considered to be one of the fathers of the internet. Vint is the co-designer of the TCP/IP protocols and currently serves as Google’s Vice President and Chief Internet Evangelist – we’ll talk in this episode about how that title came to be. Vint has served in executive positions at places like the Internet Society and the Defense Advanced Research Projects Agency (DARPA) and serves in advisory capacities at NIST and NASA. 

In this conversation, Vint talks about how the TCP/IP protocols (which provide internet-connected devices with a way to communicate with one another) came to be and his dedication to spreading the “internet religion” and making information available to all. He focuses much of the conversation on how we can expand the internet in various ways, by allocating more bit space for networks, improving its accessibility, and developing an interplanetary internet. 

Click here for this episode’s transcript, and here for this episode’s show notes.

 

Episode 30: David Lazer on Using the Web to Study the Web

 

Our guest for this episode is David Lazer, a Professor of Political Science and Computer and Information Science at Northeastern University. David is among the leading scholars in the world on misinformation, and he has also researched how we can use the web as a tool to improve our political system. He co-wrote the book Politics with the People: Building Directly Representative Democracy, which was published in 2018 by Cambridge University Press. 

In this episode, David talks about the potential for members of Congress to meet online with voters. He also discusses an online platform he helped to design called Volunteer Science, which houses a large pool of remote volunteers and lowers the startup costs of running experiments for researchers. Finally, he talks about his research on social media and big tech’s algorithms and misinformation on the web – and a recent grant from the National Science Foundation that will fund some of this work. 

Click here for this episode’s transcript, and here for this episode’s show notes.

 

Episode 29 Show Notes

If you enjoyed this episode and want to learn more, here are some materials to check out:

Siva’s Website

Siva’s Articles

Siva’s Podcast: Democracy in Danger

Siva’s Books:

  • Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity (2001)
  • The Anarchist in the Library (2004)
  • Rewiring the Nation: The Place of Technology in American Studies (2006)
  • The Googlization of Everything (and Why We Should Worry) (2011)
  • Intellectual Property: A Very Short Introduction (2010)
  • Antisocial Media: How Facebook Disconnects Us and Undermines Democracy (2018)

Episode 29: Siva Vaidhyanathan on the Operating System of Our Lives

 

Our guest for this episode is Siva Vaidhyanathan, a media studies professor at the University of Virginia. Siva is a regular columnist for The Guardian as well as the author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy (Oxford University Press, 2018) and The Googlization of Everything (and Why We Should Worry) (University of California Press, 2011), among other books. He focuses on how big tech companies – especially Google and Facebook – are permeating our lives.

In this conversation, Siva talks about the creation of Google Books and why he thinks Google was the wrong choice to be a platform that houses the world’s online library. He also talks about how authoritarian rulers have used Facebook to win elections and ties this fact into a discussion of the big tech companies’ race to become “the operating system of our lives” – and to manage everything from our houses to our minds. 

Click here for this episode’s transcript, and here for this episode’s show notes.

Episode 28: Children and the Digital Future with Sonia Livingstone

 

Our guest for this episode is Sonia Livingstone, a professor of social psychology at the London School of Economics and Political Science. Sonia’s research focuses on children and young people’s media literacy and rights in the digital environment. She recently co-authored (with Alicia Blum-Ross) the book Parenting for a Digital Future: How Hopes and Fears about Technology Shape Children’s Lives, published by Oxford University Press.

In this episode, Sonia suggests we examine children’s media use in more expansive ways, thinking beyond how much time children spend online and also considering how exactly they’re engaging with screens. She also emphasizes that technology inequalities merit more attention and discusses children’s rights and agency within the digital space. 

Click here for this episode’s transcript, and here for this episode’s show notes.

Episode 28 Show Notes

If you enjoyed this episode and want to learn more, here are some materials to check out:

Sonia’s websites:

London School of Economics Bio

The Digital Futures Commission

Some of Sonia’s articles

Sonia’s Ted Talk on Parenting in the Digital Age

Some of Sonia’s Recent Books:

Some of Sonia’s Recent Podcast Appearances

  • Parenting for a Digital Future interview by the New Books Network
  • A Conversation with Sonia Livingstone on the Mediation of Everything on Deuzevlog

Sonia’s Social Media:

Episode 34 Transcript

Brewster Kahle: We’re now going backwards and digitizing books, music, video. And we really want an open library system as opposed to a commercial answer to the whole thing. There’s so much wonderful things that are just not being read because they’re not that available, and people are going to read whatever it is they can get their hands on. Misinformation can be rife and just published out the wazoo. 

Noshir Contractor: Welcome to this episode of Untangling the Web, a podcast of the Web Science Trust. I am Noshir Contractor and I will be your host today. On this podcast, we bring in thought leaders to explore how the web is shaping society, and how society in turn is shaping the web. My guest today is Brewster Kahle, who you just heard talking about his vision to create an all-encompassing online archive.

Brewster has spent his career intent on a singular focus: providing universal access to all knowledge. In 1989, he created the internet’s first publishing system, called Wide Area Information Server (WAIS for short), which he later sold to America Online. In 1996, he co-founded two sites to help catalog the web: Alexa Internet, which he sold to Amazon, and the Internet Archive. The Internet Archive is one of the largest libraries in the world and now preserves 99 unique petabytes of data – books, web pages, music, television, and software of our cultural heritage. In 2001, Brewster implemented the Wayback Machine, which allows public access to the World Wide Web archive that the Internet Archive has been gathering since 1996. Brewster was elected a member of the National Academy of Engineering in 2010. He’s also a member of the Internet Hall of Fame, a fellow of the American Academy of Arts and Sciences, and serves on several boards. Welcome, Brewster.

Brewster Kahle: Thank you. It’s great to be here.

Noshir Contractor: This podcast, of course, is Untangling the Web. But when I think of you and everything you have been doing in your career, I think of you as somebody who’s contributed to helping us rewind the web rather than just untangle it. And in that spirit of the Wayback Machine, I want you to take us back to 1992, when you first came up with the idea of WAIS. Tell us what prompted that. And it’s important to note that in many ways, WAIS was a precursor to the World Wide Web.

Brewster Kahle: Absolutely. The idea of the internet, the opportunity, was to build the library, well, of everything. Could you take the published works of humankind and make them available to anybody – not just anybody, but any computer? Could we go and mush people, networks, and computers together? This was sort of the dream back in 1980: to try to figure out, how do we go and build this? First, we needed to go and build computers that actually could handle this. And Danny Hillis at MIT, whom I worked with, had a great idea called the Connection Machine: making a supercomputer out of lots of little computers. And so I helped build that to go and try to make it so we could go and handle building the library of everything. And then I built WAIS, did that in 1989, and then made it publicly available for free on the internet in 1992, as you point out. The idea was to try to get publishing going so that you don’t just have one big database of everything; you wanted people to be able to have their own information on lots of different servers, a decentralized system. And that was the idea of WAIS. WAIS was kind of the search thing at that time.

Noshir Contractor: So as you looked at it, the library that you were building was a distributed search and document retrieval system where documents could be distributed all over the internet. And what you were providing was an indexing system, in the parlance of library talk, and you were trying to see how one could search for these documents anywhere on the web and then how one could retrieve them. At the same time that you were thinking about WAIS and how it fed into the World Wide Web, you also were thinking about a different product called Alexa Internet. 

Brewster Kahle: So WAIS helped get people online and made everyone able to become a publisher. And could you even, you know, control the distribution of your works? Could you even charge for it? We made the first subscription-based service on the internet. We made the first ad-based system on the early web to try to help make that all work. But once we got kind of the commercial side going by ‘94, ‘95, then the idea was, we could turn to build the library. So Alexa Internet and the Internet Archive started on the same day in 1996. And one was a for-profit, and one was a nonprofit. And the for-profit, Alexa Internet, was to catalog the web. So we could start crawling the whole World Wide Web and trying to find related links. The thought was that the search engines were going to give up steam, that the keywords weren’t going to be enough to get you the right document out of billions. Well, I was kind of wrong, because Google has done such a fabulous job. But we do really need some of these other things like related links, like if you’re looking at a webpage, tell me, is this crap or is this good? What else have people said about it? How long has it been there? If I’m looking for other things like it or maybe other points of view, what can I go see? Maybe it’s actually now, when we have disinformation being broadcast so widely on the internet, that this technology Alexa Internet was really designed for is important. And the idea was also to go and leverage the link structure of the web and the usage trails of the web. And the idea then, also: for-profits don’t last that long. So we said, okay, let’s go and build a contract into the soul of Alexa Internet that all the data that was collected would be put into this new nonprofit called the Internet Archive. So every day since 1996, it’s been donating data to the Internet Archive.

Noshir Contractor: If I recall correctly, they announced that they’re shutting down Alexa in May of this year.

Brewster Kahle: Oh, so sad. Yes. But it was a good 25-year run, which is a lot longer than most commercial tech organizations last – but nonprofits tend to last much longer.

Noshir Contractor: One of the ways in which I first encountered Alexa was basically as a way of understanding web traffic. A lot of people who do web science research would use Alexa data, you can know a little bit about the status of a website, how well trafficked it is. And so that’s one piece of metadata that you might consider when you’re looking at a website. But as you point out, Alexa was also archiving the web. And when you say it was doing that every single day, help me understand. Does it take a snapshot of the internet every single day? Does it sample it and say, I’m going to do every part of the internet once every week or month? How does that work?

Brewster Kahle: Let’s take the Internet Archive, this post-Alexa Internet. What we do is, we have many different crawlers that basically go through the web – each one has a different mandate. There are about 3,000 crawlers that run on any particular day. There are about 900 organizations now – libraries, archives, museums – that work with the Internet Archive, where they go and state particular mandates to these crawlers: they say they want this particular subject area, they want this particular language, they want this particular whole country domain. They want it this deep, they want it this often. And a total of over a billion URLs every day get archived by the Internet Archive, to just try to keep up with what’s going on out there. Then we index it to make it available in lots of different ways, including the Wayback Machine.
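The mandates Brewster describes – subject, language, country domain, depth, frequency – can be pictured as a small record per partner organization. This is an illustrative sketch only; the field names are hypothetical and not the Internet Archive’s actual schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CrawlMandate:
    """One partner organization's standing instructions to a crawler.

    Field names are hypothetical; they mirror the dimensions Brewster
    lists: subject area, language, country domain, depth, frequency.
    """
    organization: str
    subject: Optional[str] = None
    language: Optional[str] = None
    country_domain: Optional[str] = None  # e.g. a whole country's TLD
    max_depth: int = 3                    # how many links deep to follow
    frequency_days: int = 7               # how often to re-crawl

# A library asking for a monthly, five-level-deep crawl of one domain:
mandate = CrawlMandate(
    organization="Example National Library",
    country_domain=".example",
    max_depth=5,
    frequency_days=30,
)
```

Each of the roughly 3,000 daily crawlers would then work from a set of such records, which is how 900 organizations can steer a shared crawling fleet.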

Noshir Contractor: So tell us a little bit about how the Wayback Machine sits in some ways on top of the Internet Archive.

Brewster Kahle: We found that the average life of a webpage is about 100 days before it’s either changed or deleted. So we basically needed to try to keep up with that and then make all the out-of-print web pages available to people. So the way that the Wayback Machine works is completely simple. Fundamentally, it’s a line in a file for every URL we have. And it’s sorted based on the URL and the date. And every time somebody wants to look up a URL, we go and binary search this, well, multi-terabyte file to be able to find the most relevant page for that user. Every GIF, every JPG, every JavaScript file is indexed in this way. And by running it on a parallel computer, much like the Connection Machine, we’re able to go and pull these out at thousands of times per second, for the millions of users that use the Internet Archive’s resources every day.
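The lookup Brewster sketches – a sorted index of (URL, date) entries, binary-searched per request – can be illustrated in a few lines. This is a toy model of the idea, not the Wayback Machine’s actual code:

```python
import bisect

# Toy index: (url, timestamp) pairs kept sorted, one entry per capture.
# The real index is a multi-terabyte sorted file; the search idea
# (binary search by URL, then pick the nearest date) is the same.
index = sorted([
    ("example.com/", "19961101000000"),
    ("example.com/", "20010315120000"),
    ("example.com/", "20200101080000"),
    ("example.org/", "20050607090000"),
])

def closest_capture(url, wanted_ts):
    """Binary-search the sorted index for the capture of `url`
    whose timestamp is nearest the requested one."""
    lo = bisect.bisect_left(index, (url, ""))
    hi = bisect.bisect_left(index, (url + "\x00", ""))  # just past this URL's run
    captures = index[lo:hi]
    if not captures:
        return None
    return min(captures, key=lambda c: abs(int(c[1]) - int(wanted_ts)))

# Asking for example.com as of January 2000 returns the 2001 capture,
# the nearest snapshot on either side of the requested date.
```

Because the file is sorted, each lookup costs only O(log n) comparisons, which is what makes thousands of lookups per second feasible even over a multi-terabyte index.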

Noshir Contractor: As you know, there have been several movements around the world, especially from the European Union, to legalize the right to be forgotten. And I imagine that the archive might make it difficult for people to have the right to be forgotten. What are you doing in the archive in terms of addressing this issue?

Brewster Kahle: Oh, yeah, a lot of the web was not really meant to be publicly available always. And so we take requests from people to remove things from the Wayback Machine, and those come in all the time from users, and you can write to info@archive.org and, you know, say what URLs or domain name, and then you have to try to prove that you own that so you can’t delete microsoft.com or something like that, and then it’s removed. And that seems to work pretty well. 

Noshir Contractor: I was struck by a comment that you wrote: for the cost of 60 miles of highway, we can have a 10-million-book digital library available to a generation that is growing up reading on screen. 

Brewster Kahle: You know, being brought up during the tail end of the hippie generation, right, so the utopian “let’s build a better world,” I took that all very seriously and being a technologist tried to figure out what could we do. We thought, let’s start with what became the World Wide Web. But then also, let’s do television, radio. So we’re trying to get good at those. But we’re now going backwards and digitizing books, music, video. And we really want an open library system as opposed to a commercial answer to the whole thing. There’s so much wonderful things that are just not being read, because they’re not that available, and people are going to read whatever it is they can get their hands on. And this next generation is going to learn from whatever they can get. And it’s, a lot of it’s crap. Misinformation can be rife and just published out the wazoo by anybody with some budget, because a lot of the good materials are locked up behind paywalls, are still in print, or just, they haven’t really moved into the bigger picture of the opportunity of the internet. And so we’re gonna want to put the best we have to offer within the hands of our children.

Noshir Contractor: So it sounds like while you began by trying to create an archive of the internet, you’re now moving more towards creating an archive on the internet.

Brewster Kahle: It’s a good point. Absolutely. We’ve got maybe five or six million books that have been digitized. And we’re starting to do periodicals. First going and digitizing these for the blind and dyslexic. Then we make it somewhat available, you know, to, for instance, machine learning researchers, but also through borrowing, interlibrary loan, controlled digital lending, those sorts of things. You shouldn’t have to be at Yale to be able to see some of these good works.

Noshir Contractor: At the end of the day, even what is digitized is being supported on some material resource, whether it’s a disk drive or something else. I recall reading that you were inspired by the Global Seed Vault idea of trying to keep one physical copy of perhaps every book. Now maybe it’s not a physical copy as in papyrus or paper, but a digital storage record. And talk a little bit about the fragility of all of these different media that we have, starting with paper, but including many of the servers that you have and how often you have to be careful to make sure that those servers don’t get obsolete or die.

Brewster Kahle: I mean, it’s such a problem. You see these beautiful pieces of papyrus from 5000 years ago, it’s great. But the lifespan of our media seems to be getting shorter and shorter. So some of these new technologies like microfilm and microfiche, it was reported that they could last 500 years. And so we’re starting to collect the microfilm and microfiche, not only to preserve the microfilm and microfiche, but to then also digitize it. So we’re moving forward, but we’re always keeping the physical materials. The Internet Archive works with other libraries that have these large physical archives to keep these.

Noshir Contractor: You have also argued that the value of digital archives is not just to historians, but also in helping resolve common infrastructure complaints about the internet, such as adding reliability when a 404 Document Not Found error comes up. Tell us a little bit more about what you see as the value in that space.

Brewster Kahle: Yeah, at least let’s fix some of the bugs on the web. The 404 Document Not Found error is just bad engineering. We made a little extension that you can add to your browser such that if any of a number of errors come up, then we’ll probe the Internet Archive’s Wayback Machine and see if it’s got it. I think, also, the big opportunity is thinking at scale. My friend Jesse Ausubel put it this way: humanity got a long way with a microscope; what we need now is a macroscope, an ability to step back and understand the bigger trends. There’s a great interface on top of the Television Archive that’s just the transcripts, built by a fellow named Kalev, who made GDELT, and you can go and do queries to find out terms – how much were they on one cable channel versus another over time – and you can start to see biases in these bubbles by stepping back and getting a bigger picture of what’s going on. I think that’s absolutely critical. People are very good at getting excited about some tweet or blog post or Facebook something or other, some cable news, dramatic whatever. And it’s difficult to put it in context. If there were one wish I’d have for the next 10 years of web science and the like, it’s: let’s build context into our online experience. So that’s not necessarily fact checking. It’s, what were the debates around it? What’s the information around it? It’s all the sorts of things that scientists in academic publishing used to know about before the paywalls sort of took over. This whole approach, I think we need to bring that to a much broader population.
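The fallback the extension performs – on a broken link, ask the Wayback Machine whether it holds a copy – can be reproduced against the Archive’s public availability endpoint (archive.org/wayback/available). The response shape shown below follows that endpoint’s documentation at the time of writing; treat the exact fields as an assumption rather than a stable contract:

```python
import json
import urllib.parse
import urllib.request

def closest_snapshot(api_response):
    """Pull the closest-snapshot URL out of an availability-API response,
    or None when no snapshot exists."""
    closest = api_response.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest else None

def wayback_fallback(dead_url):
    """On a broken link, ask the Wayback Machine for its nearest copy."""
    query = urllib.parse.urlencode({"url": dead_url})
    with urllib.request.urlopen(
        "https://archive.org/wayback/available?" + query, timeout=10
    ) as resp:
        return closest_snapshot(json.load(resp))

# Offline example of the documented response shape:
sample = {"archived_snapshots": {"closest": {
    "url": "https://web.archive.org/web/20010315120000/http://example.com/",
    "timestamp": "20010315120000",
    "available": True,
}}}
```

A browser extension would call something like `wayback_fallback` whenever it intercepts a 404 and redirect the user to the returned web.archive.org URL when one exists.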

Noshir Contractor: In many ways, what you’re talking about is a much more nuanced version of what we sometimes call metadata, that is, data about the data in this particular case.

Brewster Kahle: So Bill Dunn was one of my mentors. He did the electronic side of Dow Jones. He was the first purchaser of a Connection Machine to go and do full text search. And he had this saying back in the mid-80s that the metadata is more important than the data itself. And that’s what Google leveraged with their anchor text and PageRank. It’s what Alexa did by looking at user trails to be able to find: people that like this webpage, what web pages did they like even more? The importance of the internet is not the computers at the edges, it’s that we’re all connected.

Noshir Contractor: So you mentioned GDELT as being an example of a repository that has become incredibly helpful to scientists, including web scientists, to study ways in which information is flowing and how it impacts public opinion and so on. Given that you have built this incredible Internet Archive, and given that there are so many people who are using it to help understand society today and how it has been in the past, can you share with us some of the most interesting insights that you have learned from what you or others have found by using the Internet Archive as a way of studying society? 

Brewster Kahle: Oh, I wish I had more time to go and study society. Mostly I’m just a librarian building the darn thing. We went and studied all of the political ads in the United States to understand the wash of money that’s going over the media system based on Citizens United and other decisions by the United States Supreme Court that allow corporations to pay for politicians. And it’s fascinating to see how much money and just the barrage of ads that you would get if you were in a battleground state. I mean, you couldn’t flip channels fast enough to not be seeing an ad at all times. So there are these things you can kind of see by stepping back. Let’s see. The World Wide Web made it so that you could take unpopular websites and they could become popular, which is a really good sign for going and having an ecosystem that’s alive. When you get too much either regulation or too many monopolies going and controlling things, a lot of that will slow down and stop. And I’m very excited about the decentralized web technologies. Let’s see another round or two of these to go and put people back in charge of some of these technologies rather than just these very large corporations that have started to take over whole media types. Let’s build open systems that lots of people can play in. I like games with many winners. 

Noshir Contractor: It sounds a bit like history repeating itself, because when the World Wide Web and WAIS and other technologies were spawning, it was also in response to the corporations of the time – think of companies like AT&T, for example. The mantra was decentralize. Now we are saying we want a new wave of decentralized technologies. So was there a cycle where this decentralization gave way to a certain level of centralized authority that we now need to renew our efforts at decentralizing the web? 

Brewster Kahle: We certainly need to renew our efforts. But the centralization never needed to come. If you actually had government antitrust law that was actually, you know, used as much as it was before 1980, when things started to collapse in terms of antitrust, then I think we would have had an ecosystem without having to go through revolutions. And I’m hoping that we invent something better.

Noshir Contractor: Speaking of inventing something better, I was fascinated by a recent blog posting by you titled “Imagining the Internet: Explaining our Digital Transition.” My understanding here is that you have talked about the different metaphors that we have used to talk about the internet from the time it began. Tell us a little bit about what those metaphors are and how you see us trying to imagine the internet of the future.

Brewster Kahle: If we’re trying to, you know, look forward, I find looking backwards and seeing the trajectory we’re on to try to understand where we were going might be useful. So what I did is I went and tried to look at what were the metaphors that people had for the internet and tried to track that change over time, if you will. So the first one, I would say, would be the library. But then it moved and it started to become other things, like, just portrayals of a raw network. I would say then cyberspace was a term that people used. So it was far away. Then it started coming home towards being a frontier – the Electronic Frontier Foundation. It was a wild west and it had to be navigated. Then there was the information superhighway. We moved to surfing. So now, it’s not just some place out there. Now you can experience it, you can ride on it, you can use it. Then I would say the next one was the Facebook, right? The idea of the Borg. Your cell phone was glued to your face. So now, where does it go from here? I would say the thing we’re wrestling with around now is algorithms. If that’s where we are, then what happens next? And I would say machines are starting to not need us anymore. [The machines sort of detach.] In The Matrix in ‘99, Agent Smith has this terrific rant about people being a disease.

Noshir Contractor: That does paint a somewhat dark picture of where we are headed. 

Brewster Kahle: I mean, you can’t see a movie these days without it being frickin’ dystopian. People are anxious. They are really worried about what’s going on. They are not feeling in control. I would like to make it so people have a feeling that they’ve got some level of control of what it is they’re reading, what it is they’re writing, where it’s going, their privacy, their sense of self, their friends. And we have done almost everything we can to strip that away from them. I do like the Alan Kay line: “Don’t predict the future, go and invent it.” We as technologists should do a better job than just go to, “Hey, let’s go make a ton of money and be like a rich internet mogul.” Let’s leave a better environment for people to be the most they can be, where they can be creative and feel safe and achieve and build and grow. That’s what our technologies and our internet should be for.

Noshir Contractor: And that’s a wonderful place to end this conversation. A very upbeat note, very inspiring. Brewster, thank you so much again for joining us and for all the work that you’ve done in helping us to understand the archive of the internet and, as I said, to rewind the web with the Wayback Machine. I would certainly recommend folks take a look at the blog entry that Brewster has just been summarizing at brewster.kahle.org, as well as play with the Wayback Machine if you haven’t. It’s a lot of fun and somewhat embarrassing to go back and see what kind of websites we created back in the ‘90s and also in the early part of the century. So thank you again, Brewster, so much for joining us today. 

Brewster Kahle: Thank you very much, Noshir. 

Noshir Contractor: Untangling the Web is a production of the Web Science Trust. This episode was edited by Susanna Kemp. I am Noshir Contractor. You can find out more about our conversation, whether you are listening to us today or via the Wayback Machine decades from now, in the show notes. Thanks for listening.

 

Episode 33 Transcript

Howard Rheingold: I ended up creating a course called “Social Media Issues” around my book Net Smart. My answer to “Is this any good for us?” has been: it depends on what people know – that it’s no longer a matter of hardware or software or regulation or policy. It has to do with who knows how to use this medium well. And I felt that if you mastered these five fundamental literacies or fluencies, you would do better. 

Noshir Contractor: Welcome to this episode of Untangling the Web, a podcast of the Web Science Trust. I am Noshir Contractor and I will be your host today. On this podcast we bring in thought leaders to explore how the web is shaping society, and how society in turn is shaping the web. My guest today is Howard Rheingold, who you just heard talking about how we can use media responsibly.  

Howard is an American critic, writer, and teacher. He specializes in the cultural, social, and political implications of modern communication media, such as the internet, mobile telephony, and virtual communities. In the mid-80s, he worked on and wrote about the earliest personal computers at Xerox Palo Alto Research Center, or Xerox PARC for short. He was also one of the early users of the Whole Earth ‘Lectronic Link, or The WELL, an influential early online community. And in 1994, he was hired as the founding executive director of HotWired. He is the author of several books, including The Virtual Community, Smart Mobs: The Next Social Revolution, and Net Smart: How to Thrive Online. Welcome, Howard. 

Howard Rheingold: Good to be here. 

Noshir Contractor: Howard, you were one of the folks who was there, even before the birth of the web, and certainly at the birth of the personal computer and the very first online communities. Take us back to what things were like at Xerox PARC, where so many important things were invented that helped shape the web that was yet to come.

Howard Rheingold: I found my way to Xerox PARC because I heard that you could edit writing on a television-like screen with a computer. And I had been a freelance writer for 10 years at that point, and my technology was a correcting electric typewriter, which meant that you could white out the last line that you wrote. And people who lived through that era know that you marked up your pages, and sometimes you literally cut and paste them, and then at a certain point, you had to retype them, which is really a pain. If you’re going to write a book of 400 pages, you probably retyped 3000 pages. I found an article in the 1977 Scientific American titled “Microelectronics and the Personal Computer” by Alan Kay, and it had images of what he called a Dynabook of the future, pretty much an iPad. I called and asked if there were any writing jobs that they needed at PARC. Eventually, I got the job of roaming around and finding interesting people and writing about them, and then the Xerox PR department would place it in magazines. I drove half an hour from my home in San Francisco every day so that I could work on their computer there. 

When ARPA decided they only wanted to do defense-related research, all of these smart young researchers came to Xerox, because they hired Bob Taylor and they gave him $100 million in 10 years before he had to produce anything, and so he got all of these superstars, really superstars, in one place. I mean, they ended up creating not only the visual interface for the personal computer we know today, but also the laser printer and the local area network. My research tools were a typewriter, a telephone, and a library card. I was interested in extending those capabilities. And then I met Doug Engelbart. Engelbart was talking about using the computer to extend human cognitive capabilities – augmentation, he called it. I’m interested in the intersection of technology and the mind. 

It occurred to me in 1983, I think it was, Time magazine made the personal computer the person of the year, and I thought, boy, there’s a much bigger story here to be told. So I wrote a book called Tools for Thought. And I wrote a chapter about what was happening online. The internet didn’t exist yet. The ARPANET did. And I got a modem, and I plugged my telephone and my computer into it. And that’s when I discovered the WELL, which had been started by the Whole Earth people. The WELL was like three dollars an hour. And I got totally sucked up into that. And I tell the story in my book, my wife became concerned that I was spending so much time having fun online. And I wrote an article for the Whole Earth Review in 1986 or 1987 on virtual communities. And I wrote that because so many people had been saying, or implying to me, that there’s something pathological about communicating with people you don’t already know through computer networks. And I had seen that all the things that happen in a real face-to-face community, you know, people meet and fall in love and get married, people get divorced, there were funerals and parties, and we passed a hat when people were having hard times, we sat by people’s bedsides when they were dying. So that’s why I wrote about virtual community. I discovered that this diverse group of people I could connect with through a computer, not because we knew each other, but because we had similar interests, could really serve as an online think tank for me and help amplify my ability to learn about the things that I was writing about. So I became interested professionally with this as a tool, but also as a writer, I became interested in where is this all leading? What is this all doing to us as individuals and as communities and societies? 

And because I wrote enthusiastically back then, a lot of people since then have set me up as kind of a straw man utopian. But in fact, if you read the last chapter of my book The Virtual Community, it’s called Disinformocracy. I’ve been writing about what might go wrong as well for a long time. And I think it’s important to have a nuanced view of technology, that it’s okay for someone who’s critical to also be enthusiastic. Another thing I’ve always been interested in is how can you look at the signals that we see today and make some kind of extrapolations about the future? So, back in The WELL, a few hundred of us were very enthusiastic; we thought that this was going to be a big deal someday. And back then it was like words on the screen. But we knew that someday there would be the processing power and the bandwidth for us to have audio and video and graphics. And so I’ve just had a fortunate position in time and space being in the San Francisco Bay Area in the 1970s and 1980s to be a participant observer.

Noshir Contractor: And you were not alone. There were so many other people. I mean, you had this fascination for using the tools for your own trade, but then also using that same curiosity to project further not just how it was going to help you at that point in time, but how these tools, the computers, the mouse, what was happening at Xerox PARC and The WELL, had the potential to transform society. Several people were in a similar situation to you. What do you think motivated you to say, no, I really see something special happening here, and I’m going to write about it, whether it’s to evangelize, or as you point out also, point out cautionary aspects about it. 

Howard Rheingold: I thought, here is something very important happening. People recognized something important was happening. Steve Jobs and Bill Gates, they knew what Xerox PARC was doing. They adopted it. I thought it was very important because this was our consciousness and our capacity to think and communicate meeting our ability to build technologies that are very powerful. And you know, one thing that I think we the human race noticed from the nuclear physicists and the bomb was that the human ability to create powerful technologies seems to be racing ahead of our ability to know what to do with them morally and ethically. And so it struck me that big changes were going to come. When I was writing The Virtual Community, I found a graduate student at UCLA, his name was Marc Smith, a sociologist. He was studying Usenet, and I asked him, why do people give information away to other people that they don’t really know? He said, “knowledge capital, social capital, and communion.” That was a great lens for looking at things. 

So fast forward to 1999, 2000. I’m in Tokyo. I noticed that people are walking around looking at their telephones. They’re not listening to them, they’re looking at them. A couple of weeks later, I happened to be in Helsinki, other side of the world, and I noticed some teenagers looking at their phones and showing their phones to each other. What was going on here? Those were signals. 1999, the World Trade Organization meeting in Seattle was disrupted by protesters who used the internet to coordinate. In the Philippines, Joseph Estrada, the president, was deposed after mass demonstrations were organized spontaneously using SMS, which wasn’t happening in the U.S. in 2000. It really took off after the iPhone in 2007. So I asked Marc, what’s going on here? And he said, it looks like the merger of the telephone and the internet was lowering the barriers for collective action. So, you know, like any good freelance writer, I went and did some research. Trying to find social scientists to help me understand what the signals meant was really part of this process of looking at the future. 

I was saying, the computer, the telephone and the network are merging into a new medium. We didn’t really have a name for it yet. Well, now we call it the smartphone. But that ability, I thought, would signal another kind of phase change in the world in which people were able to organize collective action in the physical world through their connection online. I guess you would say that the insurrection of January 6th was an example of that, as well. So again, throughout this process, the question of “Is this stuff any good for us?” kept arising. I started teaching, I guess about 2005, because I saw college students were using these media. But at the universities, you couldn’t take a course on “What does it mean?” anywhere. They invited me to teach this course on digital journalism at Stanford. I noticed that there were very few teachers using forums and wikis and blogs. Because I was teaching about social media, you know, it only made sense that we use that social media in the process of doing. I ended up creating a course called Social Media Issues around my book Net Smart. My answer to “Is this any good for us?” has been, it depends on what people know, that it’s no longer a matter of hardware or software or regulation or policy, it has to do with who knows how to use this medium well. And I felt that if you mastered these five fundamental literacies or fluencies, that you would do better.

Noshir Contractor: And when you said how to use social media well, you parsed that into how to use social media intelligently, humanely, and above all, mindfully.

Howard Rheingold: Yeah. So what are these five essential literacies? Attention, crap detection, participation, collaboration, and network awareness. I start with attention. The bad news is that the business model of the web has to do with attracting and engaging and maintaining your attention so that they can sell you things. And the people who are engineering these apps are very good at doing that, and we’re all suckers. The good news is that there’s ample evidence both from millennia-old contemplative traditions and from neuroscience that you can begin to understand how to deploy your attention more productively, something called metacognition. So one of the things I taught my students was, you know, becoming aware of where you put your attention is important. 

So that was the first chapter, but then I told the story of my daughter when she was in middle school, this was before Google, but she was using search engines. She was beginning to put queries in to do her homework. And I sat her down and showed her a website called martinlutherking.org. I think that they’ve changed their identity. But it’s actually run by white nationalists. And I showed her how to find that out. You can go to the library and get out a book, and that book was edited, it was published, it was purchased for your library, it was assigned by your teacher. Each of those was a kind of gatekeeper to guarantee that what you’re reading is accurate. You can now ask any question anywhere, anytime and get a million answers in a second. But it’s now up to you to determine which of those are accurate information, because a lot of them are wrong. So crap detection comes from Hemingway, who said every journalist should have a good internal crap detector. 

And then the next one was participation. And we really wouldn’t be having this conversation about the web if it wasn’t for participation. It was created by millions of people who put up websites and put up links to other websites. From the Google twins to Mark Zuckerberg, people invent things in their dorm rooms, and it changes the world. And part of that is the miracle of the architecture of the internet. You don’t have to get permission to start a new search engine or social network, as long as it operates according to the technical protocols of the internet. You just need people to come to your website. 

So when I wrote Smart Mobs, I became interested in dynamics of collective action. How humans cooperate and what the barriers to cooperation are is probably at the root of our most significant global problems, from climate change to nuclear weapons to interstate conflict to land management. Elinor Ostrom won her Nobel Prize because she came up with design principles that if a group that was managing a scarce resource used these design principles, they would succeed. 

Noshir Contractor: You were amongst the first who introduced or at least popularized the term collective intelligence: when all of us can be smarter than any of us. Today, there’s a lot more interest in collective intelligence. There are conferences on the topic, centers around the world studying it. But again here, there was a signal that you picked up on before others.

Howard Rheingold: It was pretty obvious even back in The WELL. You got a group of people together, you could solve problems together online. Going back to Engelbart. Engelbart was not primarily interested in hardware and software. He was interested in – and he used these words – “increasing the collective intelligence of organizations,” collective IQ, he called it. 

Noshir Contractor: Net Smart talked about five fundamental digital literacies. And we talked about attention, crap detection, participation, collective intelligence. Can you talk a little bit about the fifth one – the network smarts?

Howard Rheingold: Although we’re used to the term “social network” in response to Facebook, social networks are something that precede technology by a long ways. The way I would describe it is, well, your family, your friends, your teachers, your neighbors, those are your community. The person you buy coffee from, the stranger you see when you’re walking your dog, the people you communicate with online, those are your network. They don’t all know each other. In a community, people know each other. Way back when Marc Smith told me about knowledge capital, social capital, and communion, one of the things that I taught my students was how social capital is cultivated and harvested online. The traditional definition of social capital is the ability of groups of people to get things done together outside of formal mechanisms like laws, governments, corporations, and contracts. If you are a farmer and you have good relationships with your neighbors and you break your leg, your neighbors will come in and help you with your harvest. Well, there’s a lot of social capital to be had online if you know what you’re doing. I learned this way back in The WELL. I learned, if somebody has a question and I have the answer, even if I don’t know that person, doesn’t cost me anything to give them the answer. Well, if you get several hundred people together who have different kinds of expertise and they all do that, suddenly, everybody is empowered. But you know what, people aren’t gonna give you answers unless you give answers yourself. I think anybody who is in a support group online knows about that. 

Noshir Contractor: You can’t go there and simply want to take things from other people and not also then contribute to the public good.

Howard Rheingold: Oh, that’s right. When you study human cooperation, what’s called altruistic punishment is a big part of that. It’s not just laws that enable people to live together, it’s norms. Why do you get angry when someone cuts ahead of you in line? It’s because they’re breaking the norm, and you feel that you need to enforce that. 

Noshir Contractor: Yeah. So in closing, then Howard, I want to go back to something that you’ve done so well over the last several decades, and that is detect signals and use those to project what’s coming down the pike. What are the signals that you’re detecting today that might tell us about what is going to happen in the next couple of decades?

Howard Rheingold: I think the most important one is the disintegration of consensus about what’s real and what’s not. Misinformation seems to travel much faster than corrections. The anti-vax movement worldwide is a good example. You know, the Enlightenment came along and said, well, let’s not have theological arguments about what causes disease, let’s use microscopes and see if we can discover the physical causes of it, so relying on science and coming to some kind of consensus about what we all agree is real. That seems to be very much in question now. That’s a very troubling signal to me. 

Noshir Contractor: Does this also have implications for another term that you spent a lot of time thinking and writing about: virtual reality or augmented reality? And what is real or not real in that context?

Howard Rheingold: I spent some time in Second Life, which is not immersive, but a kind of metaverse. There were people doing very interesting things, but it was not the next big thing. I don’t think people are gonna want to have avatar meetings and buy avatar groceries and socialize to the degree that the Metaverse vision from Facebook is promulgating. I just don’t think it’s going to appeal to everybody that way. I also think that there’s some problems. In Second Life, there were what were called griefers. You would be having a seminar and a bunch of flying penises would disrupt it. I think we’re going to see that kind of disruption in the Metaverse, and we’ve seen that Facebook has been unable to moderate even in its two-dimensional form. What would be really interesting in something like that would be a molecular biologist taking you through a walkthrough of a ribosome, an archaeologist taking you on a walkthrough of the pyramids to do things in three dimensions that you can’t do any other way. And I know that they’re using it for things like protein folding these days. And I think that being able to navigate and manipulate a three-dimensional world has research and educational implications that really have not been tapped.

Noshir Contractor: It’s been a real delight, Howard, hearing from you as somebody who was witness to the birth of many of these technologies. You have done a great job of envisioning so many of the phenomena that we have been experiencing, and perhaps if we had paid more attention to you when you first raised these issues, we might not have found ourselves in some of the predicaments that we do today. I also obviously want to thank you for all your work as an educator, helping to make the next generation more network smart than we were. So thanks again for joining me today, Howard. It’s been a real pleasure.

Howard Rheingold: Mine too. 

Noshir Contractor: Untangling the Web is a production of the Web Science Trust. This episode was edited by Susanna Kemp. I am Noshir Contractor. You can find out more about our conversation today in the show notes. Thanks for listening.

 

Episode 32 Transcript

Safiya Noble: I have found in my own community, as a, you know, Black woman living in Los Angeles watching so many different kinds of predictive technologies, predictive policing, experimented upon my own community, that I think we have to abolish many of these technologies, and we actually have to understand to what degree are digital technologies implicated in some of the most horrific forms of violence and inhumanity. 

Noshir Contractor: Welcome to this episode of Untangling the Web, a podcast of the Web Science Trust. I am Noshir Contractor and I will be your host today. On this podcast we bring in thought leaders to explore how the web is shaping society, and how society in turn is shaping the web. My guest today is Dr. Safiya Noble, an associate professor of Gender Studies and African American Studies at the University of California, Los Angeles. She is the co-founder and faculty director of the UCLA Center for Critical Internet Inquiry, an interdisciplinary research center focused on the intersection of human rights, social justice, democracy and technology. She is the author of the best-selling book Algorithms of Oppression: How Search Engines Reinforce Racism. Safiya was also the recipient of a 2021 MacArthur Foundation fellowship. Her nonprofit community work to foster civil and human rights, the expansion of democracy, and intersectional racial justice is developing at the Equity Engine. Welcome, Safiya.

Safiya Noble: Hi. First of all, I just want to say thanks so much for having me on the podcast today. This is such a thrill and such an honor to be in conversation with you. 

Noshir Contractor: Thanks for joining us here today. Safiya, your book about the algorithms of oppression focuses on search engines. And you talk about your experience using these search engines and recognizing very quickly that they are not quite as objective and neutral as one might like to believe they are. 

Safiya Noble: A decade ago, I was thinking about large scale digital media platforms. And I had kind of spent my whole first career in advertising and marketing. And as I was leaving the ad industry and going back to graduate school, it was so interesting to me the way that people were talking about search engines at the university. So I was at the University of Illinois at Urbana-Champaign in the information school there. Lycos, and Google and Yahoo, you know, these new technologies were amazing the way that they were indexing the web. But I also had just left this experience of trying to game these systems for my clients when I was in advertising. So really, I understood them as media systems. We were buying ads and trying to optimize. We were hiring, like, programmers to come into the agency and, like, help us get this General Motors ad up on the first listing, things like that. So it was interesting to kind of come into the academy and study search engines in particular, because I was just so fascinated by them. They were just kind of so banal and non-interesting compared to social networking sites that were coming into vogue. And it was kind of in this inquiry that I started doing searches and looking to see what kind of results we get. And one day I just kind of stumbled upon searching for Black girls. You know, I’m a Black woman. My daughter at the time was a tween. I realized that when you search on Black girls, you were met with almost exclusively pornography. And I was thinking about like, what does it mean that you don’t have to add the word “sex,” you don’t have to add the word “porn,” but Black girls themselves, that phrase is synonymous with hyper sexualization. This was kind of like a sexism 101, racism 101. And that really was the thread that I started pulling on that led to the book Algorithms of Oppression.

Noshir Contractor: Are you suggesting that the reason the search engines were privileging these kinds of search results is because that actually reflected something that was happening in society already and it was simply being amplified here? Or was this some other kind of manipulation that resulted in those?

Safiya Noble: That’s the question, right? I mean, I think that the prevailing logic at the time, 10 years ago, was that whatever we found in technology was purely a reflection of society, right, that the tech itself, those mechanisms were neutral, and that if you found these aberrations, it was because people were misusing the technology and that’s what was becoming reflected back. And I felt like that was an insufficient answer. Because I knew from my previous career, we had spent time trying to figure out how to manipulate these technologies. So I knew that if they were gamable, that that was a design choice or a set of kind of business imperatives.

Noshir Contractor: Sometimes referred to innocuously as search engine optimization, which was a term that was used at the time.

Safiya Noble: It was interesting, because search engine optimization was kind of a nascent industry. And then, you know, our beloved colleague Siva Vaidhyanathan wrote this incredible book called The Googlization of Everything (And Why We Should Worry). I felt like this book was the jumping-off point, the place from which I could now go even more specific about how this skews in these kind of historically racist and sexist ways toward vulnerable people, toward communities of color, toward women and girls under the guise of being neutral, normal, and apolitical.

Noshir Contractor: During the COVID crisis, we see algorithms now are also playing a sinister role in the propagation of information, not just in terms of giving us search results. Talk about some of the work that you have been concerned about with regards to how algorithms were employed by university hospitals in terms of guiding vaccine distribution.

Safiya Noble: This was one of the most egregious, I think, examples of the kind of distorted logic that is embedded into a lot of different kinds of software that we use every day and really don’t think twice about. So you know, there was this story that kind of went viral pretty quickly about how the vaccine during COVID-19 would be distributed. And of course, we had so many frontline workers who desperately needed that. And Stanford University Hospital, in the heart of Silicon Valley right, you have the hospital that deploys an algorithmic decision making tool to determine who should get the vaccine first. The algorithm suggested a group of people who happened to be retired doctors, who were at home, who were mostly protected. We have to look and understand the data, the logics that are imbued into the different kinds of systems that we are making ubiquitous, because they will inevitably have huge consequences. 

Noshir Contractor: Concurrent to that we also saw an infodemic. You’ve talked about the role of the algorithms and the propaganda in terms of escalating violence against Asian Americans.

Safiya Noble: One of the things we want to remember is this dynamic interplay between social media and search. During the Trump administration, he was one of the greatest propagators of racist propaganda against Asians and Asian Americans by invoking sarcasm and hostility toward our communities, and in suggesting that Asian Americans and South Asians and really Asians throughout the diaspora were responsible for Coronavirus, right. And so this, of course, we know, also elicited incredible violence. If you come across something like racist propaganda against Asian and Asian American communities in social media, on Facebook, you might go and turn to a search engine to query whether it’s true. And this of course becomes extremely dangerous, because we know that search engines also are likely to be flooded with disinformation and propaganda too. Using something like Google as a fact checker for propaganda, and then having that propaganda be made visible to you, only confirms the dangerous kinds of ideas that you might be experiencing or exposed to.

Noshir Contractor: So there is almost a symbiotic relationship between what propagates in social media and then what shows up on your search results. One example that you’ve talked about is how Dylann Roof was influenced by reading white nationalist websites before massacring nine African American churchgoers.

Safiya Noble: Yes, that’s right. What we know is that the most titillating and often egregious – which includes racist and sexist propaganda against religious minorities and sexual minorities – this kind of material on the web is actually very, very engaging. It’s fashioned many times like clickbait, so that it looks like it could be kind of true. And this is one of the main arguments that I really try to make in my work, which is that it is not just a matter of the fact that people are clicking on it, because guess what, people who are against that kind of material are also clicking on it, trying to understand what in the world is going on. But every one of those clicks really translates to profits. It will, in fact, contribute quite handsomely to the bottom line for many of these companies.

Noshir Contractor: Which takes you back to your initial profession, in the ad business.

Safiya Noble: Right. Publicly traded companies are required to maximize shareholder investment and profit at all costs, so there is no social responsibility, there is no social justice in the frameworks of Wall Street. I think we’re seeing now a decade of the results of that kind of misplaced value in our society.

Noshir Contractor: A lot of the tech industry engages in some pretty brazen experiments where they try to engage in some kind of an intervention where they would experiment with a particular kind of manipulation for a day. And you’ve compared that to something that would not be allowed in industries such as the pharmaceutical industry or the tobacco industry. 

Safiya Noble: The way in which Silicon corridors around the world are able to articulate their work is shrouded in math, science, engineering. We have to be careful about how we deploy even words and frameworks like science, that get used as a shield sometimes right for some of the most egregious kinds of products and services. I heard the investigative journalists who broke the story about COMPAS, the recidivism prediction software, right, that is profoundly racist, predicting that African Americans who were arrested would go back to jail or stay in prison at, I think it was, four or five times the rate of white people. They had all these boxes of paper documents to prove how the harm was happening. And the programmers, they didn’t want to hear it. And I remember sitting on this panel, in fact, it was at Stanford, with these journalists. I thought to myself, you know, we would never let three guys rent some space in a strip mall like those guys, and cook up some drugs and roll it out in pharmacies, right and then when people die or people are harmed, we say, “Hey, it’s just chemistry. Chemistry can’t be dangerous,” right? Like, we would never do that. So why is it that in the tech industry, we allow this kind of deep belief in the neutrality of these technologies without realizing that so many people have personally been harmed. And I think that we have to look at other industries, even like the era of big cotton, you know, during the transatlantic slave trade. We had many, many arguments during that era, where people said, you know, “We can’t do away with the institution of slavery and the enslavement of African people and Indigenous people, because the American economies are built on it, it’s impossible, our economy would collapse.” And that is actually the same kind of discourse that we use today. And I think there’s a lot to learn from these other historical moments, and figure out how we will shift the paradigm as we did for those other industries.

Noshir Contractor: That includes the abolitionist movement as well as a historical precedent.

Safiya Noble: Absolutely. When I set out a decade ago to work in this area, and to think about these things, I didn’t think of myself at the time as being, like, an abolitionist. I thought I was curious in doing this inquiry, and I knew that there was something unjust happening, and I wanted to sort it out. And now I can truly say that there are so many technologies that are deployed that are made with no oversight, with no regulatory framework. I have found in my own community, as a, you know, Black woman living in Los Angeles watching so many different kinds of predictive technologies, predictive policing, experimented upon my own community, that I think we have to abolish many of these technologies. And we actually have to understand to what degree are digital technologies implicated in some of the most horrific forms of violence and inhumanity. Doing this work for 10 years has definitely moved me to the place of thinking of myself as kind of an abolitionist in the sense that, like during the transatlantic slave trade and during the institution of slavery, it really was a small handful of people that were persistent about the moral and ethical failings of the economic and kind of religious political system that was holding up such an inhumane set of business practices and social practices and political practices for centuries. And I think it will be those of us who are trying to point to these dangerous moves, that probably we will be articulated as some type of abolitionists in this sector, trying to raise attention to the costs, the long term costs.

Noshir Contractor: How does one bring a more concerted way of addressing all of these injustices?

Safiya Noble: We need the structural changes, we need different laws, we need different policies. The law, it’s not the only thing. But it certainly is very important. What I worry about with predictive analytics is that so much information is collected on people that then is used to determine whether they will have an opportunity, but also to foreclose opportunities. And of course, you know, we have to think about each of us. Imagine our worst moments of our lives. And I think, what if that moment is the snapshot in time, collected about me and that determines, it’s a no-go for Safiya Noble, right? And of course, it also forecloses any possibility of redemption, of forgiveness, of empathy, of learning, of change. And I think we don’t want to live in a society without those qualities. And predictive analytics really forecloses the opportunity for being known, for changing, and for having a high quality of life. Cathy O’Neil says in her book Weapons of Math Destruction that predictive analytics make things much better for people who are already doing great, and much worse for people who are already not doing well. And I think we want to heed that warning seriously.

Noshir Contractor: Technology has had a history of widening the knowledge gap and the information gap in society. And so this is a natural progression of what has preceded it. One of the things you have also discussed as a way of addressing some of the issues you just talked about was proposing an awareness campaign and digital amnesty legislation to combat the harms perpetuated by algorithmic bias. 

Safiya Noble: When I was a graduate student, I was thinking about what happens when you are ensnared in a search engine, for example, and you can’t fight your way out of it, right, your name is destroyed. And of course, we have legislation in the EU like the right to be forgotten, that helps address this, right. We don’t have this yet in the United States. Every engagement we have on the web is bought, sold, traded by thousands of companies. So how do we withdraw? How do we pull ourselves out of these systems? How do we create amnesty out of these situations? I’ve been trying to talk to lawmakers in California about what would it mean when we do pass our own kind of localized versions of GDPR? I love this article written by Jean-François Blanchette and his collaborators about the social value of forgetting, like why we seal juvenile records, so that those mistakes don’t follow you into the future. So how do we grapple with that now in this global web? Could we imagine and reimagine the way in which we appear? I once heard the director of the FBI, and he was at a conference, and he said, “As far as the government is concerned, who you are is your digital profile.” Can you imagine? I mean, people like us who study the worst parts of the web, who are on the internet looking at terrible things all the time. All the things we’re doing on the internet, nothing could be further from the truth about who I am as a person. 

Noshir Contractor: That’s a really important point, because we celebrate the fact that now we can store and put into memory everything. But you’re pointing out the perverse aspects of keeping that memory always available to everyone. And you have worked with engineers, executives, artists, and policymakers to think through these broader ramifications of how technology is built, how it gets deployed, and, most importantly, how it gets used in unfair ways.

Safiya Noble: I think one of the most impactful organizations that I’ve been able to be a part of and on the board of is the Cyber Civil Rights Initiative, which is really the organization that has helped develop and pass all of the non-consensual pornography or revenge porn laws that we have in the United States. I think that’s a place that actively engages with trust and safety staff in large tech companies to try and help them understand truly the cost of their algorithmic distortions of people, many times young women. I have many relationships in Silicon Valley. The benefit of having had my whole first career in corporate America for 15 years before being an academic is I really understand, when you work in a large global company, you’re one person like sometimes on the Titanic, and it’s going down, and you can’t actually stop it yourself. You know, you’re trying to figure out how to leverage and work across all kinds of teams and all kinds of people. I don’t only stay in academic and, kind of, activist spaces, or community spaces, I also go into these corporate spaces, and I try to talk about the work and give them the examples and challenge them to think differently. And I do think that now, if I look out at engineering programs, you see slowly schools changing, and that means that that’s because industry is also saying, maybe we need people who have some background and experience in social sciences and humanities, too.

Noshir Contractor: But you are saying that in general, you find many of them to be receptive to these ideas and willing to be educated about these issues?

Safiya Noble: Absolutely. I think there are a lot of people who do not want to look back and feel that they are on the wrong side of history, that they didn’t ask tough enough questions, that they took for granted the wrong assumptions. I can tell you in the classroom, for sure, as I train engineering and computer science students who take my classes, 10 years ago, they were hostile to the idea that their work had any political value. And now, traditionally aged undergraduates are completely clear when they enter the field, and they’re here to change it.

Noshir Contractor:  So in addition to the work you do with corporate America, as well as activism, you’ve also rubbed shoulders with celebrities. Meghan Markle has cited Algorithms of Oppression as key to understanding the online vitriol that was spewed about her. 

Safiya Noble: I got this email that said, you know, “Please save the date to meet with the Duke and Duchess of Sussex.” And I thought it was like a scam email. Okay, long story short, it was real. They had been given my book by my former dean from USC. Meghan, I think she really saw an explanation for the incredible racist and sexist vitriol that she’s experienced on the internet. You know, the way that I could articulate how Black women and women of color become fodder for destruction, almost like as a sport on the internet, I think she really, she had experienced herself. And here was someone explaining that this is actually a business imperative for companies. And so they have given, you know, resources to the UCLA Center for Critical Internet Inquiry. Bot Sentinel, just, you know, issued a couple of really important reports that showed how just a few dozen high impact accounts on social media were coordinated to basically try to destroy their family, destroy them. But I will tell you, Meghan and Harry, they understand that a girl living in Iowa, a teenager living in Oakland, that people who are vulnerable that don’t have the platform and resources they have, could never fight off these kinds of internet attacks and cyber bullying and trolling. And I think that is why they want to put their time and efforts and support around people who get that, care about that, and are also working on that, too.

Noshir Contractor: Wonderful. So as we wrap things up here, I want you to look ahead. I want you to tell us a little bit about what plans you have as part of working as a MacArthur Genius, and also the launch of your new nonprofit Equity Engine.

Safiya Noble: I’m still pinching myself. I’m very grateful. And there are many Black women and women of color, who with just a little bit of support and scaffolding could continue to have very big impact. I mean, I look around, and I see, whether it’s Stacey Abrams, or you know, a whole host of Black women, too many to name, and women of color, who are holding up families, villages, neighborhoods, the country, democracy, and who are really under-resourced in doing that work. And so I’m really trying to use this opportunity to build Equity Engine. I’m hoping that people will just give resources and their networks and their power to women of color, because the one thing that women of color have is an incredible sense of justice. And our work has been on the frontlines of expanding human and civil rights around the world. And we are also the least resourced in doing that. And so the Equity Engine is really a place for people to help us hold Black women and women of color up and build their power, in many of the ways that this MacArthur Fellowship is helping me do.

Noshir Contractor: I love the term Equity Engine as well, I think it’s a very apt name for what your vision is. Thank you again so much, Safiya, for speaking with us. You brought us lots of really interesting insights and awareness about some of the ways in which we need to be much more skeptical about the web in general and about the algorithms and do something about it to make a difference.

Safiya Noble: Yes. Well, thank you. It’s such an honor. I have followed you my whole career, and I just am so honored that I’ve done something right in life to get to be in conversation with you, so thank you so much for this opportunity.

Noshir Contractor: Untangling the Web is a production of the Web Science Trust. This episode was edited by Susanna Kemp. I am Noshir Contractor. You can find out more about our conversation today in the show notes. Thanks for listening.