Tech News: The distorted reality of Filter Bubbles and Echo Chambers

A filter bubble, therefore, often results in users having significantly less contact with opposing perspectives and contradicting viewpoints, says Professor Louis Fourie. Photo: Courtney Africa/African News Agency (ANA)

Published Oct 16, 2020

By Prof Louis C H Fourie

JOHANNESBURG - About a decade ago, the term “filter bubble” was coined by the Internet activist Eli Pariser to describe the state of “intellectual isolation” that arises when a search engine algorithm selectively personalises web searches, basing the information a user sees on that user’s profile: preferences, location, past click behaviour and search history.

A filter bubble, therefore, often results in users having significantly less contact with opposing perspectives and contradicting viewpoints, causing web users to become intellectually isolated in their own cultural and ideological bubbles. Filter bubbles distort our thinking and our understanding of the world, and hinder our ability to make balanced decisions.

Personalised information and news

Typical examples of filter bubbles include Google’s Personalised Search, Google News, and Facebook’s personalised news stream. But the phenomenon is not limited to these large technology companies. With the help of artificial intelligence (AI), any newspaper company can print a unique copy of the newspaper for each of its subscribers based on their digital profile and preferences. Many websites offer personalised content selections based on a person’s browsing history, age, gender, location, and other personal data.

AI- and algorithm-driven news, web searches and websites ensure that a person sees only “relevant” results. No two people see the same results when they do a search, nor do they see the same news in their news-curation or online newspaper apps. A simple Google search of the same word or phrase by different people can return vastly different results, depending on the profiles and histories of the users. According to Eli Pariser, the computer screen becomes a one-way mirror that reflects the interests of its user, while the algorithmic observers avariciously watch what the user clicks.
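
To make the mechanism concrete, the following minimal Python sketch (an illustration only, not any real search engine’s algorithm) ranks the same set of results differently for two hypothetical users, based purely on the overlap between each result’s topics and the user’s interest profile:

    # Illustrative personalised ranking (hypothetical data and profiles;
    # real search engines use far richer signals and models).
    RESULTS = [
        {"title": "Climate policy debate heats up", "topics": {"politics", "climate"}},
        {"title": "New GPU benchmarks released", "topics": {"technology", "hardware"}},
        {"title": "Local football derby preview", "topics": {"sport"}},
    ]

    def personalised_ranking(results, profile):
        """Rank results by overlap between their topics and the user's interests."""
        return sorted(results,
                      key=lambda r: len(r["topics"] & profile),
                      reverse=True)

    alice = {"politics", "climate"}      # hypothetical user profiles
    bob = {"technology", "hardware"}

    for name, profile in (("alice", alice), ("bob", bob)):
        top = personalised_ranking(RESULTS, profile)[0]
        print(name, "sees first:", top["title"])

The same query thus yields a different “top result” for each user, which is precisely how a filter bubble begins to form.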

Personal ecosystems and filter bubbles

A personalised experience may sound like a win-win proposal. Instead of wasting precious time scrolling through irrelevant news feeds or web pages, or adjusting their search queries, users can simply let the AI algorithm take care of the task for them and present only “relevant” results within a personal ecosystem of information. Unfortunately, these algorithms create personal “filter bubbles” in which people are surrounded by an overload of what they prefer to see, and very little of what they should see.

Just search for the term “depression” on Dictionary.com and the site will place about 223 tracking cookies and beacons on your computer so that other websites can target you with advertising for antidepressant products and information. Or search one of the popular e-commerce sites for a specific product and you will be bombarded with that same product for months to come.
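
As a simplified illustration of how such tracking works, the following Python sketch (using the Flask web framework, with a hypothetical endpoint and cookie name; this is not Dictionary.com’s or any ad network’s actual code) shows how a third-party tracker can assign a persistent identifier to a browser through an embedded web beacon:

    # Hypothetical third-party tracking beacon (simplified sketch).
    # Requires Flask: pip install flask
    import uuid
    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/beacon.gif")  # an invisible "web beacon" embedded in web pages
    def beacon():
        resp = make_response(b"", 200)  # image body omitted for brevity
        # Reuse the visitor's ID if the cookie already exists; otherwise mint one.
        visitor_id = request.cookies.get("tracker_id") or uuid.uuid4().hex
        # A long-lived cookie lets the tracker recognise this browser on every
        # site that embeds the beacon, building a cross-site interest profile.
        resp.set_cookie("tracker_id", visitor_id, max_age=60 * 60 * 24 * 365)
        return resp

Every site that embeds the same beacon reports back to the same tracker, which is how a single search can follow a user around the web for months.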

Impact of filter bubbles

The problem is that filter bubbles create “echo chambers” (situations in which beliefs are reinforced by repetition inside a closed system), resulting in the assumption that everyone thinks the same and that alternative perspectives do not exist. Eventually a new “reality” without any cognitive dissonance is created, and people no longer even realise that what they see is being filtered.

Fake news and filtering are, however, only part of the problem. There is a hidden and more dangerous one. According to some, this isolation of individuals and lack of exposure to contradicting views has led to today’s deep-seated biases, the polarisation of our societies, a lack of tolerance for opposing views, and a general vulnerability to, and trust in, fake news. We have all seen the intensification of polarisation, intolerance and violence in recent times during elections, political rivalries, and sectarian protests and violence in South Africa and all over the world.

Numerous examples exist, for instance:

  • The role of the infamous Cambridge Analytica and its use of ads and fake news to prey on the fears of the public and, in some measure, influence the outcomes of the 2016 presidential election in the USA, the 2016 Brexit vote to leave the European Union, and elections in India, Mexico, Malaysia, Kenya, Malta, South Africa, and about 200 other elections around the world.
  • The use of Facebook and “patriotic trolling” (targeted harassment and propaganda) as a weapon by the Philippine government and President Rodrigo Duterte to destroy their fiercest critics and opponents.
  • The use of Facebook for ethnic cleansing through a systematic fake-news campaign, waged under false names and accounts by senior military leaders, targeting Myanmar’s mostly Rohingya Muslim minority. Facebook and the pages and blogs created for the campaign became the main distribution channels for lurid photos, false news, and inflammatory posts. Several human rights groups attributed the numerous murders and rapes, and the largest forced human migration in recent history, involving more than 700 000 people, to the anti-Rohingya propaganda. The disruptive disinformation campaigns on Facebook by Myanmar’s military are among the first examples of an authoritarian government using a social network against its own citizens. Facebook had previously been used by Russia and Iran to spread false and inflammatory messages to influence people in other countries, while domestic groups in the USA adopted similar tactics prior to elections.

It is not difficult to see why the phenomenon of filter bubbles and echo chambers has caused widespread concern. In our current volatile, uncertain, complex and ambiguous (VUCA) world, we need understanding, clarity and adaptability. True democracy requires citizens to be able to understand and accommodate the viewpoints of others, but many people enjoy the comfort and security of their own bubble. Democracy necessitates reliance on shared facts, yet we live in parallel, separate universes. However convenient personalisation may be, it promotes auto-propaganda and indoctrinates us with our own ideas. As Nicholas Carr once said: “We become, neurologically, what we think.”

The extent of the influence of filter bubbles is particularly dramatic on the Internet. Research has illustrated how the filtering of information and biased search rankings can favour particular political candidates. Even if only a third of the 2.7 billion Facebook users, or a small percentage of the estimated 300 million Google News readers, regard these platforms as their main news source, it is cause for serious concern. Even more so because not even search engine optimisation (SEO) experts know exactly how search rankings are determined. Neither do we know what information search engines and social platforms collect to build our digital profiles.

Privacy and ethical controversies over the use of AI algorithms to filter online content, such as the Cambridge Analytica and Facebook scandals, have raised awareness of the issue and led many technology companies to alter their practices. Governments are also increasingly formulating regulations to exercise more control over the collection and mining of user data with AI algorithms by these large technology companies.

But as long as access to these platforms is free and they are financed by digital ads, which are driven by engagement and sharing rather than the actual correctness or value of the news, governmental regulations will never be enough. Users will have to do their part by using incognito browsing, deleting search histories, deleting or blocking browser cookies, and using ad-blocking browser extensions.

But even more importantly, users will have to take responsibility for reading news from a variety of sources to form a balanced view. A company by the name of Nobias has created a free Chrome browser extension that reports on the political bias of online articles and also gives an indication of the credibility of the content users consume, as well as of outbound links. When a user browses newsfeeds such as Google News or Facebook, or the pages of popular news sites, Nobias puts coloured pawprints on the page to indicate the political slant (liberal, centre, or conservative) of each article. Nobias can also provide a detailed summary of the credibility and bias of the news the user regularly consumes.

It is interesting that Nobias uses machine learning, the very same technology used to reinforce bias and create filter bubbles. Many experts are of the opinion that only AI and machine learning can protect us from the detrimental effects of AI.
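
As an illustration of the underlying technique (a standard text-classification recipe in Python with scikit-learn; Nobias’s actual model and training data are not public), a crude political-slant classifier can be trained on labelled example texts:

    # Toy political-bias classifier (hypothetical, hand-labelled headlines;
    # a real system would need large, carefully curated training data).
    # Requires scikit-learn: pip install scikit-learn
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = [
        "Government must expand social welfare programmes",
        "Workers deserve stronger unions and higher minimum wages",
        "Committee publishes neutral summary of the budget figures",
        "Parliament passes routine appropriations bill",
        "Tax cuts and deregulation will unleash economic growth",
        "Traditional values and free markets built this nation",
    ]
    labels = ["liberal", "liberal", "centre", "centre",
              "conservative", "conservative"]

    # TF-IDF features plus logistic regression: a common baseline for text.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(texts, labels)

    print(model.predict(["New bill raises the minimum wage for workers"]))

The prediction here is only as good as the tiny toy data; the point is the mechanism, not the accuracy.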

But it remains the responsibility of every person to recognise his or her own secure filter bubble, surrounded by social media feeds of their choice and people of the same opinion. Once this is realised, they must ensure that they form a balanced view of every story. We need to listen much more widely, learn to tolerate contradicting views, and try to find common ground, since our democracy and future depend on it. Unfortunately, AI cannot make these personal choices for us – well, not yet!

Prof Louis C H Fourie is a Futurist and Technology Strategist.

BUSINESS REPORT
