Filter bubble


A filter bubble is a state of intellectual isolation that can arise when personalized searches, recommendation systems, and algorithmic curation selectively present information to each user. The results are based on information about the user, such as their location, past click behavior, and search history. As a result, users are increasingly exposed to information that reinforces their existing beliefs and are separated from content that challenges those beliefs, effectively enclosing each individual in a cultural or ideological bubble and producing a narrower, more customized view of the world. The choices made by these algorithms are not always transparent. Prime examples include Google Personalized Search results and Facebook's personalized news stream.
However, there are conflicting reports about the extent to which personalized filtering happens, and whether such activity is beneficial or harmful, with various studies producing inconclusive results.
The term filter bubble was coined by internet activist Eli Pariser circa 2010. In his influential book of the same name, The Filter Bubble, Pariser predicted that individualized personalization by algorithmic filtering would lead to intellectual isolation and social fragmentation. According to Pariser, the bubble effect may have negative implications for civic discourse, though contrasting views regard the effect as minimal and addressable. In his words, "users get less exposure to conflicting viewpoints and are isolated intellectually in their informational bubble." He related an example in which one user searched Google for "BP" and got investment news about BP, while another searcher got information about the Deepwater Horizon oil spill, noting that the two search results pages were "strikingly different" despite the use of the same keywords. The results of the 2016 U.S. presidential election were associated with the influence of social media platforms such as Twitter and Facebook, which in turn called into question the effects of the filter bubble phenomenon on user exposure to fake news and echo chambers. This spurred new interest in the term, with many concerned that the phenomenon may harm democracy and well-being by making the effects of misinformation worse.

Concept

Pariser defined his concept of a filter bubble in more formal terms as "that personal ecosystem of information that's been catered by these algorithms." An internet user's past browsing and search history is built up over time as they indicate interest in topics by "clicking links, viewing friends, putting movies in queue, reading news stories," and so forth. An internet firm then uses this information to target advertising to the user or to make certain types of information appear more prominently in search results pages.
This process is not random; according to Pariser, it operates in three steps: "First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune in to get the fit just right. Your identity shapes your media." Pariser also reports:
Analyzing the link-click data captured by site traffic measurements shows that filter bubbles can be collective or individual.
As of 2011, one engineer had told Pariser that Google looked at 57 different pieces of data to personally tailor a user's search results, including non-cookie data such as the type of computer being used and the user's physical location.
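The kind of signal-driven personalization described above can be illustrated with a minimal sketch. The profile fields, scoring rule, and example data below are hypothetical and do not represent Google's actual ranking system; they only show how signals such as location, device type, and past clicks could re-rank the same set of results differently for different users.

    from dataclasses import dataclass, field

    @dataclass
    class UserProfile:
        # Hypothetical signals, loosely inspired by the "57 signals" idea:
        # location, device type, and a history of clicked topics.
        location: str
        device: str
        clicked_topics: dict = field(default_factory=dict)  # topic -> click count

    def personalization_score(result_topic, profile):
        # Score a result higher the more often this user clicked that topic before.
        return profile.clicked_topics.get(result_topic, 0)

    def personalize(results, profile):
        # Re-rank (title, topic) pairs by the user's inferred interests.
        ranked = sorted(results, key=lambda r: personalization_score(r[1], profile),
                        reverse=True)
        return [title for title, _ in ranked]

    # Two users issue the same query ("BP") but see different orderings.
    results = [("BP quarterly earnings beat analyst estimates", "finance"),
               ("Deepwater Horizon oil spill investigation continues", "environment")]

    investor = UserProfile("New York", "desktop", {"finance": 12})
    activist = UserProfile("Portland", "mobile", {"environment": 9})

    print(personalize(results, investor))   # finance story ranked first
    print(personalize(results, activist))   # environment story ranked first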
Pariser's idea of the filter bubble was popularized after his TED talk in May 2011, in which he gave examples of how filter bubbles work and where they can be seen. In a test seeking to demonstrate the filter bubble effect, Pariser asked several friends to search for the word "Egypt" on Google and send him the results. Comparing two of the friends' first pages of results, he found that while there was overlap between them on topics such as news and travel, one friend's results prominently included links to information on the then-ongoing Egyptian revolution of 2011, while the other friend's first page of results did not include such links.
In The Filter Bubble, Pariser warns that a potential downside of filtered searching is that it "closes us off to new ideas, subjects, and important information" and "creates the impression that our narrow self-interest is all that exists." In his view, filter bubbles are potentially harmful to both individuals and society. He criticized Google and Facebook for offering users "too much candy and not enough carrots" and warned that "invisible algorithmic editing of the web" may limit our exposure to new information and narrow our outlook. According to Pariser, the detrimental effects of filter bubbles include harm to society at large, in the sense that they have the possibility of "undermining civic discourse" and making people more vulnerable to "propaganda and manipulation."
Many people are unaware that filter bubbles even exist. An article in The Guardian noted that "more than 60% of Facebook users are entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appeared in their news feed." In brief, Facebook decides what appears in a user's news feed through an algorithm that takes into account "how you have interacted with similar posts in the past."
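A highly simplified sketch of this kind of engagement-based feed ranking is given below. The scoring rule, which weights candidate posts by how often the user has engaged with similar topics before, is an assumption made for illustration and is not Facebook's actual news feed algorithm.

    from collections import Counter

    def rank_feed(posts, interaction_history):
        # posts: list of (post_id, topic) tuples
        # interaction_history: Counter mapping topic -> number of past likes/clicks
        # Posts on topics the user engaged with before float to the top.
        return sorted(posts, key=lambda post: interaction_history[post[1]],
                      reverse=True)

    history = Counter({"politics_left": 40, "sports": 5, "politics_right": 1})
    candidates = [("p1", "politics_right"), ("p2", "sports"), ("p3", "politics_left")]

    # Over time, this feedback loop narrows what the user is shown.
    print(rank_feed(candidates, history))
    # [('p3', 'politics_left'), ('p2', 'sports'), ('p1', 'politics_right')]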

Filter bubbles of groups

Filter bubbles can affect entire groups, not just individuals. When discussed at the group level, the phenomenon has been called the splinternet or cyberbalkanization, which occurs when the internet becomes divided into sub-groups of like-minded people who become insulated within their own online communities and fail to get exposure to different views. This concern dates back to the early days of the publicly accessible internet, with the term "cyberbalkanization" being coined in 1996. Other terms have been used to describe this phenomenon, including "ideological frames" and "the figurative sphere surrounding you as you search the internet."
The concept of a filter bubble has been extended into other areas to describe societies that self-segregate not only according to political views but also according to economic, social, and cultural situations. That bubbling results in a loss of the broader community and creates the sense that, for example, children do not belong at social events unless those events were especially planned to be appealing to children and unappealing to adults without children.
Barack Obama's farewell address identified a concept similar to filter bubbles as a "threat to democracy," i.e., the "retreat into our own bubbles, ... especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions ... And increasingly, we become so secure in our bubbles that we start accepting only information, whether it's true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there."

Comparison with echo chambers

Both "echo chambers" and "filter bubbles" describe situations where individuals are exposed to a narrow range of opinions and perspectives that reinforce their existing beliefs and biases, but there are some subtle differences between the two, especially in practices surrounding social media.
Specific to news media, an echo chamber is a metaphorical description of a situation in which beliefs are amplified or reinforced by communication and repetition inside a closed system. Based on the sociological concept of selective exposure theory, the term is a metaphor based on the acoustic echo chamber, where sounds reverberate in a hollow enclosure. With regard to social media, this sort of situation feeds into explicit mechanisms of self-selected personalization, which describes all processes in which users of a given platform can actively opt in and out of information consumption, such as a user's ability to follow other users or select into groups.
In an echo chamber, people are able to seek out information that reinforces their existing views, potentially as an unconscious exercise of confirmation bias. This sort of feedback regulation may increase political and social polarization and extremism, and it can lead users to aggregate into homophilic clusters within social networks, which contributes to group polarization. "Echo chambers" reinforce an individual's beliefs without factual support. Individuals are surrounded by others who acknowledge and follow the same viewpoints, but they also possess the agency to break out of their echo chambers.
Filter bubbles, on the other hand, are implicit mechanisms of pre-selected personalization, where a user's media consumption is shaped by personalized algorithms; the content a user sees is filtered through an AI-driven algorithm that reinforces their existing beliefs and preferences, potentially excluding contrary or diverse perspectives. In this case, users have a more passive role and are perceived as victims of a technology that automatically limits their exposure to information that would challenge their world view. Some researchers argue, however, that because users still play an active role in selectively curating their own newsfeeds and information sources through their interactions with search engines and social media networks, they directly assist in the filtering process carried out by AI-driven algorithms, effectively engaging in self-segregating filter bubbles.
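The distinction between the two mechanisms can be made concrete with a small sketch. The functions, the followed-sources list, and the predicted-engagement scores below are hypothetical: the first path filters only on sources the user explicitly chose to follow (self-selected personalization), while the second silently drops items an algorithm predicts the user will not engage with (pre-selected personalization).

    def self_selected_feed(posts, followed_sources):
        # Echo-chamber-style filtering: the user explicitly chose whom to follow.
        return [p for p in posts if p["source"] in followed_sources]

    def pre_selected_feed(posts, predicted_engagement, threshold=0.5):
        # Filter-bubble-style filtering: an algorithm predicts what the user
        # will engage with and silently drops low-scoring items.
        return [p for p in posts if predicted_engagement[p["id"]] >= threshold]

    posts = [{"id": "a", "source": "outlet_left"},
             {"id": "b", "source": "outlet_right"},
             {"id": "c", "source": "outlet_center"}]

    # Explicit choice by the user (self-selected personalization).
    print(self_selected_feed(posts, {"outlet_left"}))

    # Implicit choice by the platform (pre-selected personalization).
    print(pre_selected_feed(posts, {"a": 0.9, "b": 0.1, "c": 0.6}))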
Despite their differences, the two terms are often used hand-in-hand in both academic and platform studies. It is often hard to distinguish between the two concepts in social network studies, owing to limited access to the filtering algorithms that might otherwise allow researchers to compare and contrast the agency involved in each. This type of research will only become more difficult to conduct, as many social media networks have begun to limit the API access needed for academic research.