This response will explain the historical development of search engines, considering the transformative social, political, economic and cultural effects they have had. Halavais (2013, p. 5) defines search engines as “information retrieval” systems that provide “keyword searches of distributed digital text”. I will divide the discussion into three parts.
- Part one will explain the genesis of search engines, considering the way they have evolved as a component of information management in the media and communications landscape.
- Part two will focus on Google as the largest and most dominant player in the field. Here I will explain some of the social, political, economic and cultural impacts of Google’s monopoly of power.
- In the final section, I will consider the positive and negative implications of search engines for different social groups. Although search engines allow users to navigate information more efficiently, their targeted nature poses a threat to individual privacy and can create an echo chamber effect. Moreover, search engines lack transparency and are rarely held accountable, which is problematic for the wider public. Search engines are neither unbiased nor objective, and have been shown to unfairly discriminate against some individuals and groups. Furthermore, corporations have developed techniques to maximise revenue through search engines.
Part 1: Genesis of search engines and role in historical trends in information management
Search engines emerged as the proliferation of information on the web meant there was “more information than could easily be browsed” (Halavais, 2013, p. 13). This has been important because search engines effectively organise, collect and retrieve information. Consequently, they have been incorporated into the “entire media ecosystem”, navigating data outside the Internet that would otherwise be “unmanageable, unstructured and heterogenous” (Halavais, 2013).
Emails and documents provide an example of how search engines have made it easier for users to navigate data.
Search engines have evolved over time, incorporating techniques used long before the emergence of the internet. They have been partly modelled on “computing systems in libraries” as well as the document-filing practices that followed the industrial revolution (Halavais, 2013, p. 12). During the 1990s, search engines including Lycos and WebCrawler ranked results by the number of times keywords appeared on a page (Hazan, 2013, p. 792). The advent of Google radically transformed search, as algorithms were developed to rank results by “relevance”, quantified by popularity and the number of links pointing to a page (Hazan, 2013, p. 792). By the early 2000s, Google had indexed upwards of a billion web pages. More broadly, the “search box” had become a “fixture on most sites of any size”, expected to behave in recognisable ways (Nielsen, 2005).
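The link-based ranking principle Hazan describes can be illustrated with a simplified sketch of Google’s PageRank idea: a page’s score depends on how many pages link to it, weighted by the scores of those linking pages. The sketch below is illustrative only; the toy link graph, damping factor and iteration count are hypothetical choices, not drawn from the sources cited above.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively estimate PageRank scores.

    links maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        # every page keeps a small base score (the "random jump" term)
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # a page shares its score equally among the pages it links to
                share = rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += damping * share
            else:
                # a page with no outgoing links shares its score with everyone
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical three-page web: A links to B and C, B links to C, C links to A.
toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(toy_web)
# "C" is linked to by both A and B, so it receives the highest score
```

The toy example makes the contrast with 1990s keyword counting concrete: C ranks first not because of anything on the page itself, but because more (and better-ranked) pages point to it.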
Search engines are programmed to manage information for specific purposes. Some, such as Google Search, are designed for general-purpose queries. More recently, specialised search engines have been developed to index only a portion of the web. This is referred to as “vertical search”, as these engines are tailored to index pages falling under a particular “knowledge domain” (Halavais, 2013, p. 7). For instance, AustLII is a legal search tool for finding cases and legal materials in Australasia. Another example is Google Scholar.
Part 2: Google domination
Google has been, and continues to be, the most frequently used search engine. This has given Google a monopoly over the search world. Google has even created its own “niche sites” and acquired sites exhibiting “potential” in order to maintain control and “integrate their services” (Halavais, 2013, p. 8). This has raised competition concerns, as Google’s “dominance in core search” has been used to “gain market share in vertical search” as well (Hazan, 2013, p. 790). Google has also been accused of exploiting its power to “disproportionately direct” users to its own content (Hazan, 2013, p. 792). For instance, a webpage’s rank in a Google search is closely aligned with “that page’s web traffic” (Hazan, 2013, p. 794).
Google’s page ranking is influenced by a number of factors, including “cultural norms, commercial interests and political interests” (Rollman, 2018). Hence, Google is neither unbiased nor objective. For instance, ranking may be influenced by censorship or by the personal biases of algorithmic designers. Indeed, Google is in a position to rank sites “based on its own commercial interests” (Hazan, 2013, p. 795). Google will boost its own “proprietary services” over a more relevant result (Hazan, 2013, p. 795). For instance, searching “email” or “maps” will bring up Gmail or Google Maps first. This is especially problematic in a digital age in which knowledge is increasingly shared online. Hence, the “human and technical biases” manifested in Google’s algorithms significantly influence society, culture, politics and economics (Bozdag, 2013, p. 209).
Part 3: Impact, the good and the bad
The personalised nature of search engines raises concerns for individual privacy. Search engines “extract patterns” of behaviour, recording personal data to make search results more relevant and personal (Halavais, 2013, p. 9). This makes it easier for users to find relevant information, and for “information intermediaries” like Google to manage the abundance of information (Bozdag, 2013, p. 209). However, it poses a threat to privacy, since personal data is harvested to achieve it. For instance, an employer is unlikely to favour an applicant if informed of her searches about pregnancy.
The targeted nature of search engines can create an echo chamber effect. Search engines form opinions of users and “customize results” accordingly, thereby influencing our “tastes and preferences” and often perpetuating preconceived beliefs (Pariser, 2011). As knowledge of politics, economics and culture is shared online, search engines effectively influence all aspects of society, assuming the role of “gatekeepers” (Bozdag, 2013, p. 209). For instance, a user who typically consumes information produced by liberal news and political groups will see results that perpetuate these worldviews at the top of their search results. This can be problematic, as it may limit their access to alternative sources and perspectives.
Search engines are controlled by a few and are subject to little accountability or public oversight (Hinman, 2008, p. 74). This is because individuals often have a “blind faith in search engines to deliver trustworthy results” (Jiang, 2014, p. 213). Users may thus be misled by search engines, witnessing how beneficial they can be in the short term while failing to recognise the negative implications. This is problematic because it allows search engines and large corporations to capitalise on this power at the expense of individuals, with no consequences.
Search engines do not exist in a vacuum, and often perpetuate prevailing power structures in society. For instance, search engines are predominantly constructed by white men disposed to design software to suit those like them (Rollman, 2018, p. 4). They have also been shown to “systematically and unfairly discriminate against certain individuals or groups” over others (Bozdag, 2013, p. 210). One such bias is popularity bias, evident in PageRank algorithms: the most ‘popular’ sites generally appear at the top of the results page (Introna & Nissenbaum, 2000; Hindman et al., 2003). This is particularly problematic because individuals rarely search beyond the first page, assuming the most relevant pages are at the top (Hazan, 2013, p. 794). Noble (2018) argues search engines are harmful, using the example of a search for ‘black girls’, which was found to consistently return pornographic sites. This demonstrates that search engines are not neutral or objective; rather, they manifest the worldviews of their creators. Hence, marginalised groups in society are often negatively impacted by search engines.
Corporations have developed techniques to maximise revenue through search engines. ‘Clickbait’ refers to headlines designed to compel users to click through to see the entire story (Crookes, 2014). Search engine optimisation (SEO) refers to a set of strategies used to boost a website’s visibility within search results (Wynne, 2012). Corporations will, for example, include common search phrases and terms in a page’s metadata to improve its ranking. Clickbait and SEO are both used to exploit people’s limited attention online. Analytics services, including Google Analytics, provide companies with feedback regarding the “source, location” and “impact” of the content they put out, enabling them to capitalise on this information (Dwyer & Martin, 2017, p. 1086). SEO also allows those with money to “rig searches to keep their sites at the top” (Rollman, 2018, p. 4).
This response has explained the historical development of search engines, examining their transformative social, political, economic and cultural effects. Part one considered the genesis of search engines. Part two examined Google, the most dominant search engine, and the impact of its monopoly. Part three identified the positive and negative impacts of search engines. Search engines allow users to navigate information efficiently. However, their personalised nature poses a threat to individual privacy and can create an echo chamber effect. Moreover, they are subjected to little public scrutiny and hence lack accountability. Search engines are also influenced by human and technical biases, resulting in unfair discrimination against some individuals and groups. Furthermore, corporations have developed techniques to maximise revenue through search engines.
Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15(3), 209-227. doi:10.1007/s10676-013-9321-6
Crookes, D. (2014). Clickbait. Web User, 354, 36-47. Retrieved from EBSCOhost.
Debuchakrabarty. (Photographer). (2008, March 11). Google site search box [digital image]. Retrieved from https://www.flickr.com/photos/dchucks/2326959178/
Devers, C. (Photographer). (2010, April 18). “Tighhter than a …” (in re: http://twitter.com/fervidmuse/status/12360667993) [digital image]. Retrieved from https://www.flickr.com/photos/cdevers/4529270843/
Dwyer, T. & Martin, F. (2017). Sharing news online: Social media news analytics and their implications for media pluralism policies. Digital Journalism, 5(8), 1080-1100. doi:10.1080/21670811.2017.1338527
Fry, W. (Photographer). (2014, March 28). Search box on photopage [digital image]. Retrieved from https://www.flickr.com/photos/saintseminole/13468877273/
Halavais, A. (2013). The engines. In Search engine society (pp. 5-31). Cambridge, UK: Polity.
Hazan, J. (2013). Stop being evil: A proposal for unbiased Google search. Michigan Law Review, 111(5), 789-820. Retrieved from https://www.jstor.org/stable/23812653
Heidenreich, N. (Photographer). (2007, February 11). Who is Yves Voggenauer [digital image]. Retrieved from https://www.flickr.com/photos/schoschie/386497642/
Hinman, L. (2008). Searching ethics: The role of search engines in the construction and distribution of knowledge. In Spink, A., & Zimmer, M. (Eds.), Web search: Multidisciplinary perspectives (pp. 67-76). Berlin: Springer.
Introna, L. & Nissenbaum, H. (2000). ‘The public good vision of the internet and the politics of search engines’. In Rogers, R. (ed). Preferred Placement – Knowledge Politics on the Web (pp. 25-47). Maastricht: Jan van Eyck Akademy.
Jiang, M. (2014). The business and politics of search engines: A comparative study of Baidu and Google’s search results of internet events in China. New Media & Society, 16(2), 212-233. doi:10.1177/1461444813481196
Mager, A. (2012). Algorithmic ideology. Information, Communication & Society, 15(5), 769-787. doi:10.1080/1369118X.2012.676056
Nielsen, J. (2005). Mental models for search are getting firmer. Alertbox. Retrieved from www.useit.com/alertbox/20050509.html
Noble, S. (2018). Algorithms of oppression: How search engines reinforce racism. New York: New York University Press.
Pacheco-Vega, R. (Photographer). (2016, June 15). Google scholar search [digital image]. Retrieved from https://www.flickr.com/photos/rolexpv/27666173766/
Pan, B., Hembrooke, H., Joachims, T., Lorigo, L., Gay, G., & Granka, L. (2007). In Google we trust: Users’ decisions on rank, position, and relevance. Journal of Computer-Mediated Communication, 12(3). Retrieved from https://doi.org/10.1111/j.1083-6101.2007.00351.x
Pariser, E. (2011). The filter bubble: What the internet is hiding from you. London: Viking.
Rollman, H. (2018, Jan 30). Don’t google it! How search engines reinforce racism. PopMatters. Retrieved from ProQuest.
TED. [Username]. (2015, December 7). The moral bias behind your search results | Andreas Ekström [Video file]. Retrieved from https://www.youtube.com/watch?v=_vBggxCNNno
Wynne, P. (2012). Pimp my site: Your DIY guide to SEO, search marketing, social media and online PR. Chichester, UK: Capstone.