Post No.: 0103
The ‘echo chamber effect’ and ‘filter bubbles’ are troubling features of the modern world of news and social media. Echo chambers arise when people self-select the circles of people and sources of information they follow; filter bubbles arise when people utilise services whose algorithms curate what information is delivered to them based on their own preferences and/or historical activity. Both tend to steer us towards groups, news outlets and social media sources that merely confirm our existing biases and worldviews rather than challenge them, so existing beliefs become further entrenched and even more extreme. All of this is far easier to do now with the tools offered by social media platforms, along with the sheer amount of choice people have over their information sources online.
Self-selecting one’s own circles and sources of information is nothing new, but modern social media technologies offer greater filtering options than ever before – a greater ability to choose and discriminate whom to follow and pay attention to, and whom to ignore. These filters and algorithms – or simply the conscious or subconscious act of clicking only on the story links one wants to read while dismissing the rest – expose people to little more than what they already want to see.
So people who use news filters or search engines to look for particular news angles (where, of course, they phrase their own search terms and click only on the links that interest them or rank highest) don’t tend to get exposed to the full spectrum or cross-section of views available. For instance, if one holds a particular political bias, one will tend to gravitate towards reading only the papers or sources that lean in the same political direction – an example of ‘confirmation bias’.
Social media platforms can connect people for good ends, but they can also connect people in ways that help misinformation spread faster and deeper than ever before. The Internet and social media are tools that are supposed to open our eyes wider to the world, yet they can instead enclose us even tighter within our own little circles. After all, we only search for what we want to search for and only follow or join the groups we want to follow or join – and if those are other Republican, other Democrat or other pro-anorexia sources, forums and like-minded people, for instance, then it’ll only end up reinforcing our limited worldviews and making them even more extreme, black-or-white or one-sided as we gravitate towards and stick with those sources and groups. And if an algorithm once again decides that ‘you might also like’ a similar and related video or article, then we’re hardly going to get exposed to variety and diversity.
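To see why this ‘you might also like’ loop narrows what we’re shown even without any malicious intent, here’s a deliberately simplified toy sketch – an illustrative assumption on my part, not any real platform’s algorithm. Each story has a ‘leaning’ from -1 to +1, and the feed simply recommends the stories closest to the average leaning of the user’s recent clicks:

```python
import random

def recommend(items, recent_clicks, k=5):
    """Return the k items whose leaning is closest to the user's recent average."""
    centre = sum(recent_clicks) / len(recent_clicks)
    return sorted(items, key=lambda leaning: abs(leaning - centre))[:k]

def simulate(rounds=30, seed=1):
    rng = random.Random(seed)
    items = [rng.uniform(-1, 1) for _ in range(500)]  # available stories
    history = [rng.uniform(-1, 1) for _ in range(5)]  # diverse early clicks
    for _ in range(rounds):
        feed = recommend(items, history[-5:])         # feed reflects past clicks...
        history.append(rng.choice(feed))              # ...and clicks come from the feed
    early_spread = max(history[:5]) - min(history[:5])
    late_spread = max(history[-5:]) - min(history[-5:])
    return early_spread, late_spread

early, late = simulate()
print(f"spread of first 5 clicks: {early:.2f}; spread of last 5 clicks: {late:.2f}")
```

Running this, the spread of leanings in the user’s later clicks is far narrower than in their early ones – each click feeds the next round of recommendations, so a simple ‘most similar first’ rule collapses variety all by itself.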
Echo chambers have facilitated the rise of extremist views and groups because it’s now far easier to find like-minded people from around the world, whilst we censor, instantly dismiss out-of-hand or simply ignore anyone who disagrees with us – by not searching for them or their sort of information, and by not following or joining their groups to listen to what they may have to say. Attention is finite – more attention spent in one place means less attention can or will be spent elsewhere.
People tend to use their echo chambers as a substitute for a hard, critical education or a personally effortful analysis of the subjects and issues they form or reinforce their opinions on. People tend to follow the herd, and are guided by their own herds/echo chambers. This natural tendency is troublesome in a modern world where technology greatly assists in creating and sustaining these echo chambers. Now almost any niche view or conspiracy theory imaginable can have a website, social media page or forum that allows like-minded people from across the world to group together online and reinforce each other’s narrow views – which is why the world can seem ever more divided despite these social networking technologies, and why extreme-view groups have been able to gather strength much more easily than ever before.
Connecting with a particular group doesn’t necessarily mean, but often does mean, creating divisions between your group and other groups. It’s like how strongly supporting a particular sports team makes one tighter with fellow supporters of the same team, but can also make one more hostile, or at least distant, towards supporters of opposing teams. Hence joining particular groups, circles or networks can be great for expanding one’s life and world – but one must not shut out other networks as a consequence, otherwise it’ll end up contracting one’s life and world.
Views made public are also harder to u-turn or reverse on for fear of a perceived loss of credibility – and of course our views posted on social media are generally public. People don’t like to admit they were ever wrong! Especially about something they’ve emotionally invested a lot into believing. So this perceived loss of credibility, were we to change our publicly-voiced stances, is another mechanism that makes us stubbornly double down to try to confirm that our beliefs are true and robust – and our own herds/echo chambers will back us up too. Yet if we could just pull our heads out of our own echo chambers and look around a bit more, we’d realise that in the overall picture we could be wrong. Woof.
We can now virtually always find whatever we want to find on the web (or, since anyone is free to publish, we can be the first to create it ourselves), and we can more easily find other like-minded people in the world, no matter how sparsely spread out or relatively uncommon they are, to reinforce our beliefs as well as shape the beliefs of others. Just a few people sparsely spread across the globe who hold a very niche belief can all potentially find each other online, form a group and reinforce each other’s views, then collectively spread those views online to make their group seem larger than it really is (one person can create and run multiple websites, and anonymity, false aliases or Internet bots can mean that one person, or even automated software, can masquerade as multiple individuals). This can attract more people into the same beliefs via a network effect – seemingly popular views tend to become more popular just by virtue of already seeming popular, especially within one’s peer group.
We’re more likely to believe a news story when it’s been passed on by a trusted person (e.g. a friend), regardless of the story’s original source. These wonderful technologies do bring people together – but they tend to be already like-minded people, whose potentially extreme or niche views no longer seem that extreme or niche to each other, because these people realise they’re not quite as alone as they thought (for better or worse, depending on the belief). Extreme or misinformed-view groups may no longer feel they’re in the overwhelming minority because all they hang around with most of the time are other like-minded people mirroring the same views, while collectively shutting out dissenting, opposing or alternative views. So echo chambers can make a group of people feel like they hold ‘common sense’, majority views when they could actually be in the minority (not that a truth or well-informed opinion is determined by whether it’s held by the majority or the minority). Via social media, every belief, no matter how niche or dangerous, can have its own dedicated ‘church and congregation’ nowadays.
Having such an ability to easily choose whom we wish to listen to and hang around with is a double-edged sword, and it’s having negative consequences in politics, health advice and potentially any area of life one can think of. A self-directed, search-engine education can be a dangerous one too, because we self-select which search terms to use, and then which webpages, videos, etc. to click on and read or view, in accordance with our own preconceptions and pre-existing biases or desires. In contrast, in a formal educational setting with a reputable educational institution – when you’re forced to read something you wouldn’t have chosen to read yourself, that’s when you’re most likely to learn something genuinely new, such as an opposing perspective. And when you must write a dissertation or essay about a complex issue, you’re more likely to receive a higher grade if you explore all sides of a story in a critical manner before coming to a conclusion.
Because of confirmation bias, and these echo chambers that facilitate the confirming of one’s pre-existing biases – instead of the Internet aligning everyone towards the truth and the facts now that information is more freely available and democratic, it has in many areas polarised everyone into their own camps and reinforced those polarisations. The democracy of information also means the democracy of misinformation. Misinformation fed into people’s minds also tends to be resistant to correction and is difficult to ignore, at least unconsciously (which is why trying to influence a jury from the outside is a serious act that’s in contempt of court).
To be more open-minded, well-informed and rounded citizens, we need to welcome the people and things that show us what we’re uncomfortable with – things that challenge our worldviews, alternative perspectives, and things that are important to us even though we’re not interested in them. We must pay attention to them despite so many other things trying to draw our attention away, and always apply critical thinking when assessing any piece of information, whichever side it comes from.
Time and attention are limited in our fluffy lives, so no one can ever see or read everything that’s on the web – all anyone ever gets to see or read is potentially only a tiny fraction of all the information and arguments out there. So in order to figure out the real facts and the most well-informed opinions to hold, we must cast our nets wider and not just listen to what comes from within our own ‘churches and congregations’ (echo chambers and filter bubbles). Misconceptions just grow and spread via social contagion when the misinformed outnumber the informed, and when people apply the fallacious heuristic of ‘lots of people (whom I choose to hang around with and listen to) believe in it, so it must be true’.
Thus regarding our own echo chambers – we should pay attention to what other people and groups are interested in too, and we should not ignore views or information that may make us feel uncomfortable. Easier said than done though, especially if we’ve publicly, consistently and strongly planted our flags on particular stances, and our own identities, social belongings, reputations and other interests are now intimately tied to them.
But we’ve got to be brave, yet ever humble, to question our own beliefs – and if we can do that then we can potentially grow intellectually and even gain respect for it. We also all need to urgently learn to become better news consumers (for more about that, please read Post No.: 0094).
Woof! We won’t ever stop just following whom we want to follow or just clicking on what we want to see or read, but we can remember that there’s far more out there to know than what we choose to want to know. So if something that challenges your views comes to you – it’s a good idea to give it a chance and listen before judging it.