1 of 15

Filter Bubbles

by Ashley Hoffmann

for a UCLA GSEIS undergraduate course

2 of 15

The illusion of free will on the Internet

  • As we read in the “World Brain: The Idea of a Permanent World Encyclopedia” article, the World Wide Web is supposed to offer perfect access to information, a great equalizer that makes education available to all.
  • Most users assume the information they receive is unbiased. After all, what you see usually agrees with your worldview. You think you choose your content, and it’s from these trustworthy big companies like Google!
  • However, self-selection of content and modern personalization lead to a problem: the Web is more limiting than you’d think.
  • In other words, don’t trust Google for a hot second

3 of 15

So what exactly are filter bubbles?

  • There is a filter bubble around each individual World Wide Web user.
  • The term was coined by Eli Pariser (co-founder of Upworthy) in 2011.
  • A TED talk by Pariser himself explains it really well: https://www.youtube.com/watch?v=B8ofWFx525s&feature=youtu.be&t=1m10s
  • It starts with Mark Zuckerberg’s quote: “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.”
  • When people call your Facebook page an “echo chamber,” they’re referring to the same concept: your posts are only being seen by like-minded friends.

4 of 15

5 of 15

Some historical context

  • What makes this a new problem?
  • In the past, there were fewer sources of information: newspapers, radio, and a handful of television stations. Pretty much everyone received the same information. Everyone read the local newspapers; everyone watched I Love Lucy.
  • Personalization technology didn’t exist yet. Newspapers couldn’t print and distribute different advertisements for each reader’s demographic.

6 of 15

Why am I talking about this?

  • Overall, filter bubbles are built into the structure of the World Wide Web; they shape how every user receives information. That’s hard not to acknowledge.
  • It’s like David Foster Wallace’s “This Is Water,” which begins:

There are these two young fish swimming along, and they happen to meet an older fish swimming the other way, who nods at them and says, “Morning, boys, how's the water?” And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes, “What the hell is water?”

  • Studying digital cultures and societies is studying bubbles. Groups you interact with can expose you to new ideas (bring diverse perspectives into your bubble) or reinforce beliefs (keeping new ideas out of your bubble).

7 of 15

The most famous (and recent) example

  • The filter bubble phenomenon is widely thought to have contributed to the outcome of the 2016 US presidential election.
  • Many people suggest that only getting news from Facebook limits us, since the people who surround us are usually like ourselves.
    • Our Facebook feeds are not a diverse section of the population.
    • If we think Hillary is certainly going to win the presidency, we get comfortable.
    • That same comfort might extend to being unaware of other voters’ grievances.
  • That’s part of why Hillary supporters were so shocked by Trump’s win.
  • The idea of filter bubbles received wide publicity after the election.

8 of 15

Facebook’s effect on the 2016 election serves as a good model for consequences

  • Users go to Facebook to see what’s new (e.g. cats and political news)
  • Because of self-selection, a user’s Facebook friends are probably already people like them (e.g. liberals).
  • Facebook tracks the user’s activity, noting what they click on, and infers what they might be more interested in (e.g. liberal-leaning articles).
  • Facebook personalizes their feed to show them more of what they like (e.g. posts saying “I’m With Her” and CNN links).
  • The user enjoys their time, stays longer, and comes away affirming that their perspective is accurate (e.g. feeling overconfidence that Hillary will win).
  • Journalists are people too; they feel that same comfort and share it in their coverage.
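The feedback loop described above can be sketched as a toy simulation (all names, topics, and probabilities here are hypothetical, just to make the loop concrete): a ranker shows the posts a user has clicked on most, the user clicks posts matching their own leaning more often, and the feed narrows until one viewpoint dominates.

```python
import random

random.seed(42)  # fixed seed so the toy run is repeatable

TOPICS = ["liberal", "conservative", "cats"]

def rank_feed(topics, click_counts):
    """Show the most-clicked topics first (a crude engagement ranker)."""
    return sorted(topics, key=lambda t: click_counts[t], reverse=True)

def simulate(user_leaning, rounds=50, feed_size=2):
    """Run the click -> track -> personalize loop for a number of rounds."""
    click_counts = {t: 1 for t in TOPICS}  # neutral starting prior
    for _ in range(rounds):
        # The feed only has room for the top-ranked topics...
        feed = rank_feed(TOPICS, click_counts)[:feed_size]
        for topic in feed:
            # ...and the user clicks own-leaning posts far more often.
            p_click = 0.8 if topic == user_leaning else 0.1
            if random.random() < p_click:
                click_counts[topic] += 1
    return click_counts

counts = simulate("liberal")
print(counts)  # "liberal" accumulates by far the most clicks: a bubble forms
```

Note the self-reinforcing part: a topic that falls out of the top-ranked feed can never be clicked again, so it never recovers — which is exactly the “shielded from the person dying in Africa” effect from Pariser’s example.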

9 of 15

So to clarify, the problem with filter bubbles

  • In one word: limitation
  • There’s nothing to challenge you. It’s ignorance.
  • You don’t know what is happening outside your bubble. Scary.
  • Algorithmic gatekeepers may not have the same morality as humans. Maybe you should care more about that person in Africa than the squirrel on your lawn. At the least, you should be aware of the person in Africa.
  • At the very very least, you should be aware that you are being shielded from the person dying in Africa.

10 of 15

A brief relevant tangent

Neil Postman, author of “Amusing Ourselves to Death,” in the ‘80s wrote:

What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture.

11 of 15

If filter bubbles are so bad, why do they exist?

  • FIRST: Cynically, one view is that it all starts with money. Advertisers pay websites for visibility to potential customers.
  • To get more views, companies want you to stay on their websites.
    • This refers to the attention economy. You only have so many hours in the day, and companies want you to look at *their* content.
  • If the webpage agrees with your worldview, you’re happier and stay longer.
  • If the ads are personalized too, the targeted customer is more likely to buy, and advertisers make more money. The website, in turn, can make more money from the advertisers.
  • SECOND: Information overload -- you have to sort it somehow! (more ->)

12 of 15

Information overload, the partner in crime

  • I think of personalization as a reaction to information overload.
  • Everyone creates content all the time, consciously and unconsciously.
    • There’s a great article called “In The Future We Will Photograph Everything and Look at Nothing”; it captures how we produce far more than we could ever consume.
  • There’s just too much stuff, and no one has time to read all of it, so we end up with this crutch.
  • In theory, personalization should be the solution: each person receives what’s best for them. The real problem is how we personalize today, based on what we like rather than what we need to know. ...but what do we need to know? It’s philosophy.

13 of 15

Holy $%*t! What can we do??!

Pariser came up with three overarching solutions:

  1. For individuals: Diversify the content you consume as much as you can. He thinks Twitter is best for this, since you can consciously select sources.
  2. For companies: Be more straightforward about your policies! Tell users when their pages are personalized and give them the option to stay anonymous. Ask for consent; don’t sell their data behind their backs.
    1. (That’s a whole long conversation of its own: data brokers.)
  3. For government: Revamp laws regarding the use of private information. Catch up with the times; create laws that protect tech users.

14 of 15

So yes, become a lawyer, politician, or tech mogul, fix the system. But also, suggestions for what you can do today:

  • You really have to seek out diverse views. Follow FOX News. Twitter is probably best for this, or maybe Instagram.
  • Interact with the World Wide Web in different ways. Use the Tor Browser, Firefox, or DuckDuckGo, or at least incognito/private browsing.
  • The browser extension PolitEcho analyzes your Facebook friends to show you where they fall on the political spectrum. Become aware of how liberal UCLA is.
  • This doesn’t mean abandon the Internet! I’m not totally crazy. The Web still allows us to reach new ideas without physically traveling, and that’s special and awesome. Collaboration can solve today’s problems.

15 of 15

Credit: http://www.poorlydrawnlines.com/comic/knowledge/