Lauren Schiller hosted Dr. Safiya Noble on her podcast “Inflection Point” to discuss how the algorithms inside the search engines we use in our everyday lives skew the results that particular searches return. Dr. Noble argues that we have come to view search engines the same way we view libraries: as collections of knowledge and information that are always readily available to us. She counters, however, by describing these search engines, namely Google, as little more than a medium for advertisers to plug their products and services. It is with this understanding that Dr. Noble explains how capitalism has intruded on this online space and completely altered the perception of Google and other search engines. Oppression is an ongoing battle, and these algorithms are doing the public a disservice by skewing the results.
Early in the conversation between Dr. Noble and Lauren, the scholar wastes no time identifying where the issues are. She describes how all search engines, not just Google, are less “the new public library” and more a vehicle for large companies to advertise. It does not end there, however. Dr. Noble reveals to listeners that these large corporations are in a constant bidding war for the rights to “words”: she describes a system in which companies compete with one another in an auction-like environment to get certain words attached to their products and services. Big business and online ads are as ingrained in our society as fried foods or the spectacle that is Super Bowl Sunday. Even with that sentiment, it is certainly unsettling to me that just about every aspect of our lives is shaped by this search for profits and by the synergy between our ever more important online footprint and big business.
The first example Dr. Noble uses to illustrate how these algorithms oppress and marginalize people comes from her findings when googling “beauty.” She explains how the results for searches about white and black people were vastly different. For “white girls,” she mentions that the top results were a movie and products, such as dresses, that were essentially cast as being “white.” One interesting point she makes in this podcast is that most white people in America do not see themselves as such, choosing instead to label themselves simply as “American.” It raises the question: how much does that have to do with these algorithms? I would not be surprised if the algorithms had a hand in skewing perceptions like these. It can take on the feel of minorities being viewed as something other than American, and that kind of isolation is something that should be guarded against at all times.
Moving into the next segment of the conversation, Dr. Noble speaks on the effect search engines are having on how we learn and “know” information. An interesting window into this idea comes when Dr. Noble turns to Dylann Roof, the infamous perpetrator of the heinous shooting at a Southern church widely attended by African Americans. Dr. Noble discusses how, in looking into the circumstances surrounding this tragedy, she found that Roof had been led to a host of websites where white supremacist ideals lived online, ready to indoctrinate anyone whose value system might align with them. Within the context of the conversation, she notes that we simply do not know whether Roof was purposefully seeking out these kinds of pages, or whether a combination of these oppressive algorithms and the personalized algorithms on his own devices and accounts helped him find web pages that spew this type of hateful rhetoric. Education, viewed through this lens, gets even trickier to navigate because of the ready availability of limitless information. Dr. Noble questions whether this is a good way for students to learn. Personally, I do not think having access to an endless base of knowledge is the worst thing. However, I will concede that moderation is key in everything we do. These search engines are now a major part of our experience of civilization, so any moderation or weaning is easier said than done.
One piece I found to be a great complement to the podcast was the TED talk by Joy Buolamwini, titled “How I’m Fighting Bias in Algorithms.” In her presentation, she speaks on themes similar to Dr. Noble’s, but she takes it a step further by bringing in her own experiences with these kinds of bias, namely with facial recognition technology. In her example, an app of some sort would create masks for the user. Being an African American woman herself, she noticed right away that the app had difficulty recognizing her face and therefore could not create a mask for her as it had for so many others. She then relays the answer she was given: the test group was not diverse enough. I find that even more alarming than the tangible results. I say this out of frustration that those developing the technology lacked the foresight and understanding to see there would be issues because their test group was not diverse enough. It is truly disheartening.
Lastly, I want to briefly touch on the interview we read between Dr. Noble and Sarah Roberts. In this discussion, Dr. Noble talks about wanting to create a non-commercial search engine, one that would theoretically compile results based on the popularity and relevance of whatever it is you were searching for. This is the ideal counter to the Google model, which might as well act as the NASDAQ or the NYSE. Later in the interview, Sarah Roberts says something that resonated with me maybe more than anything else in these interviews and discussions. She mentions how we have always imagined the internet as a living democracy that we are all a part of. In reality, she contends, the internet is nothing more than a series of private islands, each operating for its own gain, no matter the cost.
I really enjoyed having the opportunity to write my reaction paper on a topic such as this. As we become more connected on the web, large corporations are making it their business to profit from that connection, no matter the method. In their eyes, having these algorithms profile and marginalize people is an accepted side effect of their pursuit of profits. That is why it is important that we have people like Dr. Noble and Sarah Roberts to “sound the alarm” on the inner workings of search engines and how they truly produce the results we see. While I like to believe the internet and the hyperconnectivity we have come to know and expect are far more positive than negative, they do not come without the potential exploitation of the general public.
Buolamwini, Joy. “How I’m Fighting Bias in Algorithms.” TED, www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms?language=en#t-511870.
“Engine Failure: Safiya Umoja Noble and Sarah T. Roberts on the Problems of Platform Capitalism.” Logic Magazine, 24 July 2019, logicmag.io/justice/safiya-umoja-noble-and-sarah-t-roberts/.
“‘Algorithms of Oppression’ – Dr. Safiya Noble on the Embedded Misogyny and Racism of Search Engines.” Inflection Point with Lauren Schiller, 7 Nov. 2018, player.fm/series/inflection-point-with-lauren-schiller-1468054/algorithms-of-oppression-dr-safiya-noble-on-the-embedded-misogyny-and-racism-of-search-engines.