
Filter bubbles and echo chambers

The Internet has made an impressive amount of information available to every connected individual. Via social networks and forums, it has also made it possible to converse and debate simultaneously with large numbers of people, regardless of geographical distance. It seemed as though the ingredients (information and debate) were in place for the Internet to revitalize our democracies. However, the web seems to have contributed more to the emergence of "credulous democracies"[1] than to "knowledge societies"[2], and social networks seem to promote the polarization of opinions rather than foster a better understanding of divergent views.

Two phenomena resulting from the way the Internet functions are often accused of contributing to this situation: "filter bubbles" and "echo chambers". In this article, we introduce these two notions, often mentioned in the media but rarely well understood, as well as some of the criticisms that have been levelled at them.

[1] Bronner, G. (2013). La démocratie des crédules. Presses universitaires de France.

[2] Bindé, J., Demarais, L. & Plouin, J. (2005). Vers les sociétés du savoir. Rapport mondial de l’UNESCO / Collection Ouvrages de référence de l’UNESCO.

Aurélien Brest
Research Director in Cognitive Psychology at the Fondation Descartes

I – Filter Bubbles

A "filter bubble" refers to the ways in which information is filtered before reaching an Internet user. According to Internet expert Eli Pariser, filter bubbles result from the personalization of online content, and are believed to intellectually isolate Internet users and to diminish the diversity of the information to which they are exposed (Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You, Penguin Press, 2011) 1. For example, on Facebook, an individual with a strong interest in cats will be exposed to a large amount of cat-related information on their news feed. This is a result of the algorithms employed by digital platforms, which determine the interests of their users by studying information related to their online behavior.

At first, the information collected about Internet users is quite basic: age, gender, contacts added. But new web-based media have integrated several mechanisms that allow websites to collect much more precise information. These include "share", "like", and "subscribe" buttons, which inform algorithms about users' online behavior. Facebook, for example, introduced its "like" button in 2009. A week later, 50,000 websites had integrated this button into their architecture (Kris Olin, Facebook Advertising Guide, 2010). The presence of this button on countless websites allows Facebook to collect a great deal of information about its users. For example, by clicking the "like" button under a cat-related product, the user sends Facebook an indication of their tastes and preferences. In return, Facebook will display more cat-related content.

New media algorithms are thus able to recommend increasingly personalized content to users of digital platforms. And the more an Internet user visits these websites, the more personalized the information will be and, therefore, the more likely it will be to interest them. Moreover, personalization algorithms are crucial to the functioning of these platforms, as they are the foundation of their business models. Indeed, it is thanks to these algorithms that platforms can offer companies the opportunity to advertise their products in a targeted manner. Advertising is the main source of revenue for digital social networks 2, and companies are indeed willing to invest significant amounts of money to ensure effective advertising targeting 3.
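To make the mechanism described above concrete, here is a deliberately simplified sketch of how "like"-based personalization can work: each "like" adds the item's topics to a user profile, and candidate posts are then ranked by how well they match that profile. The data and function names are hypothetical illustrations, not any platform's actual algorithm, which is vastly more complex.

```python
from collections import Counter

def build_profile(liked_items):
    """Aggregate the topic tags of items a user has 'liked' into an interest profile."""
    profile = Counter()
    for item in liked_items:
        profile.update(item["tags"])
    return profile

def rank_feed(candidates, profile):
    """Order candidate posts by how strongly their tags match the user's profile."""
    def score(item):
        return sum(profile[tag] for tag in item["tags"])
    return sorted(candidates, key=score, reverse=True)

# Hypothetical data: a user who has mostly liked cat-related content.
likes = [{"tags": ["cats", "pets"]}, {"tags": ["cats", "humor"]}]
feed = [
    {"id": "politics-1", "tags": ["politics"]},
    {"id": "cats-1", "tags": ["cats"]},
]

ranked = rank_feed(feed, build_profile(likes))
# The cat post ranks first: every "like" reinforces the profile,
# so matching content is shown more, eliciting more "likes" in turn.
```

Note the feedback loop this toy model already exhibits: the more the user interacts with cat content, the higher cat content scores, and the less likely divergent content is to surface. This self-reinforcing dynamic is precisely what the filter-bubble critique targets.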

Harmful consequences for users?

According to Eli Pariser, the personalization of information on the Internet harms Internet users, since they are no longer confronted with information that could broaden their interests or challenge their beliefs and opinions. In other words, Internet users gradually find themselves trapped in a filter bubble that screens out any divergent information and impoverishes their curiosity.

Of course, even before the advent of the Internet, there already existed media outlets that offered very specialized information, or that held strong editorial lines. But these outlets were upfront about it. Facebook, YouTube (which belongs to Google) and other similar platforms, on the other hand, do not present themselves as specialized media, or even as media at all, but simply as platforms that host content. In other words, the specialization and orientation of the information displayed occurs partly without the user's knowledge. Eli Pariser illustrates this point by comparing his Google search results for the word "Egypt" with those of one of his friends. For Pariser, Google displays results about politics, whereas his friend is shown results related to tourism. In other words, the algorithm shapes and displays two different worldviews, based on the inferences it draws from the personal information collected.

Pariser believes that these personalization filters are partly responsible for the particularly strong political polarization in the United States between liberals and conservatives. Individuals, by being repeatedly confronted with highly personalized political information, end up sharing next to nothing with their political opponents. This in turn undermines any possibility of establishing a public space in the political sense of the term — that is, a space for deliberation and debate, whose proper functioning presupposes minimal agreement on basic facts and information.

Filter bubbles: a contentious issue

While the personalization of information is a specificity of web-based media, some researchers dispute the existence, or the true scope, of the filter bubbles that result from this personalization. Richard Fletcher, a researcher at the Reuters Institute and co-author of the annual report on digital information 4, qualifies the harmful effects that Eli Pariser attributes to digital platform algorithms and challenges the idea that political polarization in the United States is linked to the personalization of information 5. His argument is based on a series of academic papers he has written on the issue.

His reasoning is as follows: if personalization algorithms impoverish our informational horizon, this means (1) that we are confronted with less diverse information on digital social networks than offline and (2) that this impoverishment is taking place without our knowledge. Here is what Fletcher observed from the Reuters Institute's annual Digital News Report database:

  1. Individuals who use social networks without intending to seek news are in fact exposed to a greater diversity of information sources than individuals who do not use social networks at all. Contrary to the filter bubble theory, social networks, far from confining individuals to a limited number of information sources (related to their preferences), actually broaden individuals' sources of information 6.
  2. Individuals who use search engines to seek information tend to be exposed to greater diversity in the political orientation of the information offered than individuals who do not use search engines 7.

In contrast to Pariser's understanding, Fletcher therefore believes that social networks increase the visibility of news for individuals (and there are many of them) who are not particularly interested in it. These individuals are exposed to news regardless, because social network algorithms give visibility to information that they would not otherwise have sought.

Fletcher also warns us that, without the personalization of information via algorithms, we would inform ourselves rather unadventurously. In other words, we would tend to select, out of habit, only a handful of sources of information (one or two TV channels, a single newspaper, etc.). Digital social networks and Internet search engines are, in fact, opportunities for us to be exposed to sources of information that we would never have spontaneously consulted.

II – Echo chambers

Fletcher and Pariser both observe that the United States suffers from strong political polarization: individuals on either side of the political spectrum today seem irreconcilable. But, if we follow Fletcher's reasoning, this polarization would not result from American citizens being trapped in filter bubbles, but rather from the contrary. Indeed, his studies suggest that the more we are confronted with information that contradicts our beliefs, the more inclined we are to strengthen our initial position. This is a psychological reaction that has also been documented by other researchers 8.

However, for several other analysts, the radicalization of opinions on the Internet is largely due to so-called echo chambers. On the Internet, individuals seem to preferentially interact with people who are interested in the same topics and who share similar opinions. This leads to the formation of virtual communities in which people share and receive information that is focused on their interests and that is aligned with their beliefs. These communities are referred to as echo chambers because the voice of each member essentially echoes that of every other member. Functioning as mirrors and amplifiers of individuals’ worldviews, echo chambers would seem to be fertile grounds for radicalization.

An emblematic case that highlights the tendency of echo chambers to lead to radicalization is that of the "Incels" community. This community arose from the meeting of men who struggled to find a romantic partner and who gathered online to discuss their emotional woes. Soon, some members of the community began to express increasingly misogynistic views. The Incels then followed a classic pattern of radicalization: the emergence of leaders, the increasingly virulent assertion of group identity, the gradual exclusion of other groups, and isolation 9. In the end, a full-fledged anti-women political theory was developed within this community. A few years after their emergence, several Incels committed attacks against women, accusing them of being the direct cause of all their misfortunes 10.

Many analysts are working to identify echo chambers on the Internet and to understand how they work. Here is an illustration:

This map displays the relationships between American media outlets on the social network Twitter. The larger the dot representing a given media outlet, the more that outlet is cited by other media outlets on Twitter. Media outlets far from the center of the map are primarily cited by other media with a political orientation close to their own. Conversely, the closer a media outlet is to the center of the map, the more it is cited by media with diverse political leanings. This map shows that American media outlets are politically polarized. On the left, the most popular media (Huffington Post, New York Times, Washington Post) are cited by media rather close to the American democratic-liberal political sphere (the political affiliation of Barack Obama and Joe Biden). On the right, the most popular media (Fox News, Breitbart) are cited first and foremost by media close to the Republican-conservative tendency (the political line of Donald Trump).

If media outlets that are politically close frequently cite each other, and broadcast the same news and analysis, there is a risk that their audience will develop a very partial and partisan view of the news. This is what some analysts fear: that the media sphere in the United States is dividing into two gigantic echo chambers, structurally representing the two principal political tendencies in the country. Since the supporters of each of these tendencies are possibly inclined to inform themselves via the echo chamber corresponding to their political affiliation, this could result in an extremely polarized political spectrum, in which each side no longer communicates with the other 11.

But again, can we consider that the formation of these echo chambers is due to the architecture of social networks? An ambiguity in this analysis, rightly raised by Axel Bruns 12, lies in the confusion between the phenomenon of echo chambers and that of filter bubbles. Contrary to what the title of the article from which the map above is taken suggests, the polarization of the media sphere in the United States does not illustrate a filter bubble, because the tendency of media outlets to quote or follow one another is not a consequence of Twitter’s algorithms.

The main argument made by Axel Bruns and Richard Fletcher is that the Internet does not foster informational or ideological isolation. On the contrary: many debates take place on social networks between opposing sides on various issues, troll campaigns are organized to weaken opponents, very real counter-demonstrations are held to protest against the opposing side, and so on. In other words, if the polarization of individuals on social networks is indeed intensifying, it would not occur through the isolation of either side in a filter bubble or an echo chamber, but rather through intense virtual confrontations.

For analysts who question the usefulness of the concepts of filter bubbles and echo chambers, the most important tasks are to understand what individuals do with information that is not in line with their opinions, and to find a way to transform social networks into a public space dedicated to deliberation rather than confrontation.

  1. See also: Pariser, Eli. "Beware Online Filter Bubbles." TED Talk. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=fr
  2. Facebook, for instance, raked in close to $15 billion in advertising revenue in Q1 2019. Ad revenue thus represents more than 99% of the company's total revenue: https://s21.q4cdn.com/399680738/files/doc_financials/2019/Q1/Q1-19-Press-Release.pdf
  3. To learn more about the business models of digital platforms, see: Citton, Y. (Ed.) (2014). L'économie de l'attention : nouvel horizon du capitalisme ? La Découverte.
  4. Newman, N., Fletcher, R., Kalogeropoulos, A., & Nielsen, R. K. (2019). Reuters Institute Digital News Report 2019. Reuters Institute for the Study of Journalism. https://ora.ox.ac.uk/objects/uuid:18c8f2eb-f616-481a-9dff-2a479b2801d0/download_file?file_format=pdf&safe_filename=reuters_institute_digital_news_report_2019.pdf&type_of_work=Report
  5. On this topic, see Axel Bruns's text: "It's Not the Technology, Stupid: How the 'Echo Chamber' and 'Filter Bubble' Metaphors Have Failed Us." International Association for Media and Communication Research, 2019-07-07.
  6. Fletcher, R., & Nielsen, R. K. (2018). Are people incidentally exposed to news on social media? A comparative analysis. New Media & Society, 20(7), 2450-2468.
  7. Fletcher, R., & Nielsen, R. K. (2018). Automated serendipity: The effect of using search engines on news repertoire balance and diversity. Digital Journalism, 6(8), 976-989.
  8. In particular, see: Tucker, J. A., Guess, A., Barberá, P., Vaccari, C., Siegel, A., Sanovich, S., ... & Nyhan, B. (2018). Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature. March 19, 2018. http://dx.doi.org/10.2139/ssrn.3144139
  9. On this issue, see: Hogg, M. A., & Adelman, J. (2013). Uncertainty-Identity Theory: Extreme Groups, Radical Behavior, and Authoritarian Leadership. Journal of Social Issues, 69(3), 436-454. doi:10.1111/josi.12023
  10. For more information on this community, see: https://fr.wikipedia.org/wiki/Incel
  11. This theory has been analyzed in depth by a team of researchers from Harvard University: Faris, R. M., Roberts, H., Etling, B., Bourassa, N., Zuckerman, E., & Benkler, Y. (2017). Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election. Berkman Klein Center for Internet & Society Research Paper. https://dash.harvard.edu/bitstream/handle/1/33759251/2017-08_electionReport_0.pdf?sequence=9&isAllowed=y
  12. Bruns, A. (2019). Filter bubble. Internet Policy Review, 8(4).