
Social effects

Increasingly, there is a feeling that everything that matters is on the web and should be accessible through search [1]. As L. M. Hinman puts it, “Esse est indicato in Google” (to be is to be indexed on Google). As he also notes, “citizens in a democracy cannot make informed decisions without access to accurate information” [2, 3]. If democracy rests on free access to undistorted information, then search engines directly affect how democratic our countries are. Their role as gatekeepers of knowledge is in direct conflict with their nature as private companies dependent on advertising for income. For the sake of a free society, therefore, we must demand accountability from search engines and transparency in how their algorithms work [2].

Creation of filter bubbles

Systems that recommend content based on user profiles, including search engines, can insulate users from exposure to differing views. By feeding users content they already like, such systems create self-reinforcing biases and “filter bubbles” [2, 4]. These bubbles, formed when newly acquired knowledge is filtered through past interests and activities [5], cement biases into what feel like solid foundations of knowledge. This is particularly dangerous for young and impressionable minds. Open discussion with peers and teachers, and collaborative learning activities, should therefore be promoted in the classroom, as the sketch below suggests why.
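The mechanism can be made concrete with a minimal Python sketch of a hypothetical recommender that always serves the topic a user has clicked most often. The topic names, click probability and loop length are invented for illustration and do not describe any real engine.

import random

TOPICS = ["science", "sports", "politics", "arts"]

def recommend(click_counts):
    # Exploit only: always serve the historically most-clicked topic,
    # never exploring the alternatives.
    return max(TOPICS, key=lambda t: click_counts[t])

click_counts = {t: 1 for t in TOPICS}   # start from uniform interest
for _ in range(50):
    topic = recommend(click_counts)
    if random.random() < 0.9:           # users mostly click what is shown
        click_counts[topic] += 1

print(click_counts)  # one topic dominates: the bubble has closed

After a few iterations the user only ever sees one topic, not because their interests narrowed, but because the system never offered anything else.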

Feedback loops

Search engines, like other recommendation systems, predict what will be of interest to the user. When the user clicks on a recommended result, the search engine takes the click as positive feedback, and that feedback affects which links are displayed in the future. But if a user clicked on the first link displayed, was it because they found it relevant, or simply because it was the first result and thus the easiest to choose?

Implicit feedback is tricky to interpret, and when predictions are built on misinterpreted feedback, their effects are even trickier to predict. When certain results are repeatedly shown, and are the only thing the user gets to see, the system can even end up changing what the user likes and dislikes: a self-fulfilling prediction.
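This position bias can be sketched with a toy model. The two documents, their relevance values and the examination probabilities below are invented for illustration; the point is only that a click signal contaminated by rank position can entrench an arbitrary early ordering.

import random

docs = {"A": 0.5, "B": 0.5}          # equal true relevance
scores = {"A": 0.51, "B": 0.50}      # A starts fractionally ahead

def examine_prob(rank):
    # Users almost always look at the first result, rarely the second.
    return 1.0 if rank == 0 else 0.3

for _ in range(1000):
    ranking = sorted(docs, key=lambda d: scores[d], reverse=True)
    for rank, d in enumerate(ranking):
        # A click requires the user to examine the result and find it
        # relevant; the engine logs every click as evidence of relevance.
        if random.random() < examine_prob(rank) and random.random() < docs[d]:
            scores[d] += 0.01

print(sorted(scores.items(), key=lambda kv: -kv[1]))
# A ends far ahead of B although both are equally relevant:
# the ranking manufactured the evidence that justifies the ranking.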

In the United States, a predictive policing system was launched that highlighted high-crime areas of a certain city, and more police officers were deployed to those areas. Since these officers knew the areas were flagged as high risk, they were more vigilant, and stopped, searched or arrested more people than usual. The arrests thus appeared to validate the prediction, even where the prediction was biased in the first place. Worse, the arrests became data for future predictions about the same areas and areas similar to them, compounding the bias over time [5].
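A toy simulation of that loop might look as follows. The area names, rates and starting counts are hypothetical, and the model is deliberately crude: both areas have identical true crime rates, yet a small initial skew in recorded crime steers patrols, and patrols generate the arrests that feed the next prediction.

import random

true_rate = {"north": 0.10, "south": 0.10}   # identical underlying crime
recorded = {"north": 12, "south": 10}        # a small initial skew

for year in range(20):
    # Deploy 100 patrols in proportion to recorded (not true) crime.
    total = sum(recorded.values())
    patrols = {a: 100 * recorded[a] / total for a in recorded}
    for area in recorded:
        # Arrests scale with patrol presence times the true rate.
        arrests = sum(random.random() < true_rate[area]
                      for _ in range(int(patrols[area])))
        recorded[area] += arrests            # arrests become training data

print(recorded)
# The gap in recorded crime between the two areas widens year after year,
# "validating" a prediction that was skewed from the start.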

We use prediction systems in order to act on the predictions. But acting on biased predictions affects future outcomes, the people involved, and ultimately society itself. “As a side-effect of fulfilling its purpose of retrieving relevant information, a search engine will necessarily change the very thing that it aims to measure, sort and rank. Similarly, most machine-learning systems will affect the phenomena that they predict” [5].

Fake news, extreme content and censorship

Fake news, false stories presented as news, is increasingly prevalent in online forums, social media sites and blogs, all available to students through search. Small, focused groups of people can drive up ratings for specific videos and websites carrying extreme content, increasing the content’s popularity and appearance of authenticity and thereby gaming the ranking algorithms [4]. Yet, as of now, search-engine companies have adopted no clear and explicit policy to control fake news [2].

On the other hand, search engines systematically exclude certain sites, and certain types of sites, in favour of others [6]. They censor content from some authors despite not being asked to do so by the public. Search engines should therefore be used with awareness, discretion and discrimination.


1 Hillis, K., Petit, M. and Jarrett, K., Google and the Culture of Search, Routledge Taylor and Francis, 2013.

2 Tavani, H. and Zimmer, M., Search Engines and Ethics, The Stanford Encyclopedia of Philosophy (Fall 2020 Edition), Edward N. Zalta (ed.).

3 Hinman, L. M., Esse Est Indicato in Google: Ethical and Political Issues in Search Engines, International Review of Information Ethics, 3: 19–25, 2005.

4 Milano, S., Taddeo, M. and Floridi, L., Recommender systems and their ethical challenges, AI & Society, 35: 957–967, 2020.

5 Barocas, S., Hardt, M. and Narayanan, A., Fairness and Machine Learning: Limitations and Opportunities, MIT Press, 2023.

6 Introna, L. and Nissenbaum, H., Shaping the Web: Why The Politics of Search Engines Matters, The Information Society, 16(3): 169–185, 2000.

Licence


AI for Teachers: an Open Textbook Copyright © 2024 by Colin de la Higuera and Jotsna Iyer is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
