The biggest problem facing personalized search is that users are no longer in control of the information they receive[1]; that power now belongs to the search engines' algorithms. Searches return results that should be more relevant to the user, but this causes a great deal of information to go undiscovered. The algorithms are not trying to censor results so much as discard the information the user would have ignored on their own.

Google vs. Bing

Google has algorithms that automatically filter out what it describes as “…a narrow class of search queries related to pornography, violence, hate speech, and copyright infringement.”[2]

Diakopoulos’ results show that Google’s API was much stricter than Bing’s in blocking sex-related words. “Conspicuously, Bing does block query suggestions for ‘homosexual,’ raising the question: Is there such a thing as a gay-friendly search engine? In response, a Microsoft spokesperson commented that, ‘Sometimes seemingly benign queries can lead to adult content,’ and consequently are filtered from autosuggest. By that logic, it would seem that ‘homosexual’ merely leads to ‘too much’ adult content, causing the algorithm to flag and filter it.”[2]
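The over-blocking described above can be approximated with a simple word-level blocklist over autosuggest candidates. The sketch below is purely illustrative — neither Google's nor Bing's actual logic, and the term list and function names are hypothetical — but it shows how a blunt keyword filter ends up dropping benign queries along with the targeted ones:

```python
# Hypothetical sketch of word-level autosuggest filtering.
# A candidate suggestion is dropped if any of its words appears
# on the blocklist, regardless of the query's actual intent.

BLOCKED_TERMS = {"homosexual"}  # example term from the study quoted above

def filter_suggestions(candidates, blocked=BLOCKED_TERMS):
    """Return only the suggestions that contain no blocked word."""
    def allowed(suggestion):
        words = suggestion.lower().split()
        return not any(term in words for term in blocked)
    return [s for s in candidates if allowed(s)]

candidates = ["homosexual rights", "homework help", "home repair"]
print(filter_suggestions(candidates))
# → ['homework help', 'home repair']
```

Because the filter matches on the word alone, a benign query such as “homosexual rights” is suppressed along with adult-content queries — exactly the kind of collateral blocking the Slate piece criticizes.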


Example of Word Blocking[2]

Though this filtering often goes unnoticed and benefits the user, there are times when a filtered search is not desired. Additionally, it can be unsettling for a user to realize that their search results are being filtered and that there is information they are missing.


  1. Martin, J. (2012). The complete guide to filtered search results. Tech Advisor.
  2. Diakopoulos, N. (2013). Sex, Violence, and Autocomplete Algorithms. Slate.