The manner in which searches are conducted on the internet is changing along with the shift from Web 2.0 to Web 3.0. Search engines have traditionally returned the same results to every user in a non-biased way.[1]

Filter bubbles, for better or worse, give users an experience of the web that is tailored to their interests, exposing them only to what they already agree with and filtering out the rest.

Personalized Searching

Search engines like Google and Bing are increasingly filtering users' search results based on their past search history. In the past, every user who searched for the term “mouse” would see the same results. In Web 3.0, perhaps better referred to as the semantic web, a more personalized search would show a biologist results about the rodent, whereas a computer scientist would get results about the computer accessory.[1]
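As a rough illustration of the mechanism (not any real engine's algorithm), history-based personalization can be sketched as re-ranking a shared result set against terms drawn from each user's past queries; all names and data below are made up:

```python
# Toy sketch of history-based personalization: the same results are
# re-ranked differently for two users. Purely illustrative; real engines
# use far richer signals (clicks, location, machine-learned models).

def personalize(results, history):
    """Order results by word overlap with the user's past queries."""
    history_terms = {t for query in history for t in query.lower().split()}

    def score(result):
        return len(history_terms & set(result.lower().split()))

    return sorted(results, key=score, reverse=True)

results = [
    "mouse rodent biology habitat",
    "mouse computer accessory usb wireless",
]

# The biologist and the computer scientist issue the same query ("mouse")
# but see the results in a different order.
biologist = personalize(results, ["rodent habitat", "field biology"])
programmer = personalize(results, ["usb keyboard", "wireless computer setup"])
```

Even this toy version shows the trade-off discussed below: the ranking helps each user, but neither user ever sees why the other ordering exists.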

Dou et al. argue that not all queries need personalization. “…for the query ‘Google’…almost all of the users are consistently selecting results to redirect to Google’s homepage, and therefore none of the personalized strategies could provide significant benefits to users.”[1] The benefits and drawbacks of personalized searching will vary from user to user, but there ought to be clarity for the user about when it is helpful and when it is a hindrance, in order to streamline searching.


Perhaps the most obvious benefit of more personalized searches is that users will spend less time filtering through less relevant content on their own, letting the search engine's algorithms do the work. Search engines currently match only the keywords entered in the search, devoid of context, and these searches return huge amounts of information. Despite this, people generally do not search very efficiently and end up relying on the information that was easily located, as opposed to the most appropriate.[2] Searching in the semantic web will return information more relevant to the topic being searched and make more sense of the vast amount of information currently on the web.


The biggest problem facing these personalized searches is that people are no longer in charge of the information they receive;[3] that power now belongs to the search engines' algorithms. Searches will return results that should be more relevant to the user, but this will cause a great deal of information to go undiscovered. The search engines are not trying to censor results, but rather to discard the information the user would have ignored on their own.

Google vs. Bing

Google has algorithms that automatically filter out, according to them, “…a narrow class of search queries related to pornography, violence, hate speech, and copyright infringement.”[4]

Diakopoulos's results show that Google’s API was much stricter than Bing’s when blocking sex-related words. “Conspicuously, Bing does block query suggestions for ‘homosexual,’ raising the question: Is there such a thing as a gay-friendly search engine? In response, a Microsoft spokesperson commented that, ‘Sometimes seemingly benign queries can lead to adult content,’ and consequently are filtered from autosuggest. By that logic, it would seem that ‘homosexual’ merely leads to ‘too much’ adult content, causing the algorithm to flag and filter it.”[4]


Example of Word Blocking[4]
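The blocking described above can be sketched as a simple blocklist filter over autosuggest candidates. This is a hypothetical example, not Google's or Bing's actual logic, but the blunt substring matching shown here also illustrates how seemingly benign queries can get swept up:

```python
# Toy sketch of autosuggest word blocking: any suggestion containing a
# blocklisted term is dropped wholesale. Placeholder terms are used here;
# real engines maintain large, undisclosed blocklists.

BLOCKED_TERMS = {"blockedword"}

def filter_suggestions(suggestions, blocked=BLOCKED_TERMS):
    """Drop any suggestion containing a blocked term as a substring."""
    return [s for s in suggestions
            if not any(term in s.lower() for term in blocked)]

suggestions = [
    "weather today",
    "blockedword history",     # dropped: contains a blocked term
    "news about blockedword",  # dropped, even if the query itself is benign
]
print(filter_suggestions(suggestions))  # only "weather today" survives
```

Because the match is by substring rather than by intent, an entire query is filtered if any part of it trips the blocklist, which is consistent with Microsoft's explanation that benign queries "can lead to adult content."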

Though there are many cases where this filtering will go unnoticed and benefit the user, there are often times when a filtered search is not desired. Additionally, it can be unsettling for a user to realize that their search results are being filtered and that there is information they are missing.

Filter Bubbles

One of the dangers of a personalized search is the creation of a filter bubble, in which the user experiences a web that is tailored to their interests, exposing them only to what they already agree with and filtering out the rest. This will cause people with different interests and beliefs to experience different versions of the web, and become insulated from information that opposes their world view. It is proposed that this will cause greater divides between people with differing beliefs.[5]

Three Dynamics

  1. Users are alone in their bubbles. While there are others with similar interests, no one else shares your exact bubble. Therefore, it is pulling us apart.[5]
  2. The bubble is invisible. When a person goes to a liberal or conservative news source, they’re aware of it (hopefully). “But Google’s agenda is opaque. Google doesn’t tell you who it thinks you are or why it’s showing you the results you’re seeing. You don’t know if its assumptions about you are right or wrong—and you might not even know it’s making assumptions about you in the first place.”[5]
  3. You don’t have a choice about being in a filter bubble. Once again, users typically choose to go to biased news sources, but in a filtered web, the content comes to you.[5]

Dangers of Filter Bubbles

Eli Pariser: Beware online "filter bubbles" (video)

Perhaps the biggest danger of this personalized search is not the fact that these filter bubbles are created, but rather that people seem to quite like this new version of the web.[6] Participants in that study claimed they would have ignored the filtered-out results anyway, since the algorithm is based on their own search history rather than on some other person enforcing an agenda. Though people do attempt to break out of these filter bubbles, research has shown that they mostly seek out opposing opinions in order to find flaws in them,[6] reinforcing what they already believe rather than entertaining or trying to understand a different point of view. Though they are exposing themselves to information that would normally be filtered out of their search, they approach it with the preconceived belief that it is incorrect and that their own view of the world is the correct one, actively widening the very ideological gap these searches are feared to create.

As Pariser first notes, then speculates: “Left to their own devices, personalization filters serve up a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown...,”[5] and, “It’s not just serendipity that’s at risk. By definition, a world constructed from the familiar is a world in which there’s nothing to learn.”[5]


  1. Dou, Z., Song, R., & Wen, J. R. (2007, May). A large-scale evaluation and analysis of personalized search strategies. In Proceedings of the 16th International Conference on World Wide Web (pp. 581-590). ACM.
  2. McClure, R. (2011). WritingResearchWriting: The semantic web and the future of the research project. Computers and Composition, 28(4), 315-326.
  3. Martin, J. (2012). The complete guide to filtered search results. Tech Advisor. Retrieved from
  4. Diakopoulos, N. (2013). Sex, violence, and autocomplete algorithms. Slate. Retrieved from
  5. Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. New York: Penguin Press.
  6. Liao, Q. V., & Fu, W. (2013). Beyond the filter bubble: Interactive effects of perceived threat and topic involvement on selective exposure to information. Unpublished raw data, Paris, France. Retrieved from