This video presents a popular analogy for being anesthetized by external influences to the point where we lose oxygen or, in Froggie’s case, become dinner (not for me). My condolences to Kermit; you are still my #1. It shows us that we need to pay attention. We cannot be passive consumers of media or become victims of algorithmization. Algorithms continue to feed us the same diet and restrict our ability to take intellectual risks. My title is replete with cynicism, but there is hope. We, human beings, are the most valuable resource on the planet, and it is up to us to (a) first become aware of the problems of filter bubbles and algorithmization and (b) take steps toward democratizing our world views as responsible netizens.
I will discuss the article and Tufekci’s TED talk in my blog, but I wanted to provide a snapshot of what filter bubbles are and how they can isolate us:
Considering Pariser’s “The Filter Bubble”
In his 2011 book, “The Filter Bubble: What the Internet Is Hiding From You,” Eli Pariser exposes the hidden algorithms that can severely narrow our world views. He focuses on something that impacts our lives: the news media. Of course, we are not naive people; we realize that the news is a profit-making machine, and this isn’t inherently bad (we’ll get to the bad things shortly).
In the not-so-distant past, despite the liberal and conservative bents of newspapers, journalists were driven to do serious reporting, and people who had complaints could send papers critical commentaries. Granted, those wouldn’t all necessarily be incorporated into the papers’ next editions; the point here is that (most) print news media didn’t pander to audiences, trying to turn them into sycophants. Journalists actually got their hands dirty and presented facts. Of course, is there bias in those representations of facts? Yes! Bias is baked into the human experience. It is up to us to “filter” it out as best as we can.
However, the Internet changed the news business forever. Here is an analysis that Pariser astutely offers:
The Internet had delivered a number of mortal blows to the newspaper business model, any one of which might be fatal. Craigslist made classified advertisements free, and $18M in revenue went poof. Nor was online advertising picking up the slack. An advertising pioneer once famously said, ‘Half of the money I spend on advertising is wasted – I just don’t know which half.’ But the Internet turned that logic on its head – with click-through rates and other metrics, businesses suddenly knew exactly which half of their money went to waste. And when ads didn’t work as well as the industry had promised, advertising budgets were cut accordingly. Meanwhile, bloggers and freelance journalists started to package and produce news content for free, which pressured the papers to do the same online. (48, emphasis added)
Pariser causes me to think of a comment Denzel Washington once made: we live in such a fast-paced society that news isn’t valued for “truth,” but for who got it out first on the Internet (despite the lack of verifiable facts). This led me to think long and hard.
The author also astutely explains that there has been a radical change in who produces news and how they do it. He tells us that news is packaged in brightly colored wrappers and we consume it like gum – until it becomes stale (my analogy). Niche markets, like sports, develop for news consumption. Who needs to see a whole newspaper anymore? Just skip to the content that you want. Right? Of course right. (Read: Yenta’s voice in Fiddler on the Roof.)
I don’t see anything inherently bad in this. I myself go to the Arts and Styles sections of the news because that is where my interests lie. I then have to go to multiple websites to get my hard news. However, as a nostalgist, I do miss the days when I could get my hands dirty on newsprint and anticipate getting to the book reviews by reading the hard news first. Enough of that… we are in a brave new world. You can still buy a newspaper subscription, although it isn’t really economically feasible anymore.
When I got my iPad, I became critically aware that it was feeding me much of the same content, based on the articles I was reading. It made my brain itch. But we have to realize that technology often drives the attention economy. The author explains that a new constellation is emerging, and I thoroughly agree (51-52). It is becoming much cheaper to produce and distribute media of all kinds, and, importantly, because we are deluged with choices, we become overwhelmed and rely more heavily on “human and software creators to determine what news we should consume” (51).
With the consolidation of so many companies (Google married YouTube – it was an expensive wedding), we are seeing a definitive concept at work: media convergence, a topic that arose during my presentation. Let’s pin it down with a good definition:
Media convergence refers to the merging of previously distinct media technologies and platforms through digitization and computer networking. This is also known as technological convergence. Media convergence is also a business strategy whereby communications companies integrate their ownership of different media properties. This is also called media consolidation, media concentration or economic convergence. (Source: https://www.thecanadianencyclopedia.ca/en/article/media-convergence)
I posit that if we let it, convergence can cause us to be fed on a diet that does not create much variety (more on that later).
As the author explains, traditional newspapers are losing money. Anyone with an Internet connection and a laptop (or even a phone) can be a news purveyor. And that is excellent in many regards. Take the example of the Arab Spring in 2011. However, there’s a dark side: news consumers think they are making informed decisions when they read articles from niche purveyors who know their tastes and are master curators.
The author also expresses a phenomenon that has already occurred and she discusses it in the context of Google News:
…You will see only stories that are locally and personally relevant to you, based on the interests you’ve demonstrated through Google and what articles you clicked on in the past…. ‘Most people will have personalized news-reading experiences on mobile-type devices that will largely replace their traditional reading of newspapers…and that news consumption will be very personal, very targeted…’ (62, quoting Google’s CEO)
That future is here, thanks to smartphones, iPads, etc. So how is this specialization possible? Algorhythms. I purposely misspell the word because AI is tracking the beats of our lives – what we like, what we don’t, etc. – and feeding it back to us, based simply on the news we click. As I mentioned in my prior blog on “Filtered Reality,” this freaks me out. I don’t like it. I want to know the “truth” as much as possible; give me raw footage. Stop the spin-doctoring.
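To make the mechanism concrete, here is a deliberately simplified, hypothetical sketch in Python (not any real platform’s code, and the article titles are invented) of how a click-based recommender narrows a feed:

```python
from collections import Counter

def recommend(articles, click_history, k=3):
    """Rank articles by how often the reader clicked each topic before.

    articles: list of (title, topic) tuples
    click_history: list of topic names the reader previously clicked
    """
    topic_weight = Counter(click_history)  # unseen topics count as 0
    # Previously clicked topics float to the top; everything else sinks.
    ranked = sorted(articles, key=lambda a: topic_weight[a[1]], reverse=True)
    return ranked[:k]

articles = [
    ("Local election results", "politics"),
    ("Playoff recap", "sports"),
    ("Gallery opening", "arts"),
    ("Transfer rumors", "sports"),
]

# A reader who mostly clicked sports gets... mostly sports.
feed = recommend(articles, ["sports", "sports", "arts"])
print([title for title, _ in feed])
```

Notice the feedback loop: the more the reader clicks one topic, the less of anything else the ranking surfaces, which is exactly how a filter bubble hardens over time.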
But the author offers a solution: calling out the citizen in people and reminding them of their stake in the nation’s business (58). That is what we have to do in order to become better-informed, healthier netizens. We can’t keep pandering to our prior choices. Who wants a nation of sycophants? I don’t. Neither does the author. And as we have also learned, we can create new platforms where we up our game and try to be as authentic as possible. That is a great aspect of the Internet!
Tufekci’s TED Talk: “We’re Building a Dystopia Just to Make People Click on Ads”
Now, I’m a cynic; however, this TED talk stopped me from blinking and caused me to consider the dark underbelly of algorithmization and its concomitant negative effects (hence my posting of Rihanna’s song “Disturbia”). Big data and machine learning are so targeted that they can identify people who have made manic posts and possibly suffer from bipolar disorder, then push Las Vegas ads at them so they spend during their manic episodes (sad).
I always thought of big data as consolidating our preferences and feeding us a consistent diet based on what we have liked before. I didn’t consider that big data is enormous and reaches its little spidery, finger-like webs into our heads. Machines learn from our behavior, they are programmed with exactly what we do, and they can run calculations ad infinitum! It’s like Big Brother on steroids.
The notion of getting lost in a YouTube rabbit hole is familiar to me. But offering people an ever more shocking diet based on what they’ve clicked on is unfair. Is YouTube the new shock jock? Based on Tufekci’s astute analyses, I think it is. It goes beyond shock-jocking, though.
What is very interesting is that machines can be coded with bias that skews toward racism. If you go to my Twitter @medeathewriter, you will see that machines have failed to recognize passport photos of African Americans, and the same thing has happened with hand dispensers! I just learned this TODAY from a random interaction on Twitter.
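The mechanism behind failures like these is usually not hateful code but skewed training data: if a detector is calibrated only on one group, it simply never learns what the other group looks like. A deliberately simplified Python sketch (all numbers hypothetical, not how any real device works):

```python
def fit_threshold(samples):
    # Calibrate the detection threshold from the training samples alone:
    # anything a bit below the weakest training reading still counts.
    return min(samples) * 0.9

# Training readings drawn only from one group (e.g., one skin tone
# under the sensor) -- the other group was never sampled.
training_readings = [0.80, 0.85, 0.90]
threshold = fit_threshold(training_readings)  # 0.72

def detects(reading):
    """Return True if the sensor reading clears the learned threshold."""
    return reading >= threshold

print(detects(0.82))  # reading like the training data: detected
print(detects(0.40))  # reading unlike the training data: missed entirely
```

Nothing in the code mentions race, yet the device systematically fails one group, because the bias lives in what the training data left out, not in any explicit rule.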
What I like about Tufekci is that she doesn’t offer us just dystopia/disturbia. She encourages us to become educated about the fact that Facebook is a form of persuasion architecture. The lack of transparency behind the algorithms is a structural challenge of machine learning, which indiscriminately collects data about us. However, we can work to build systems that include more creativity, diversity, and alternative choices, i.e., build AI that supports us in goals bounded by ethics and morality. Will those codes have to be negotiated? Yes, but I remain hopeful.
It is important to note that things are quite different in the EU, especially with the GDPR (General Data Protection Regulation), which places a focus on democratizing technology as it relates to people. Tufekci even mentioned how things were different in Europe in the assigned video. The following gives us a good snapshot of algorithmization and why bias is a danger to fight against. Check out the embedded video: https://www.cnbc.com/2018/12/14/ai-bias-how-to-fight-prejudice-in-artificial-intelligence.html
Takeaway lesson: we’ve got minds, souls, and social awareness, so let’s not be froggies who get cooked. We are much smarter than that. Right? Of course right!