Post No.: 0749
Furrywisepuppy says:
Although non-private social media channels are open to anyone who wishes to follow them (apart from where government censorship or state-wide firewalls block access in certain countries), this doesn’t mean that people online are constantly exposed to a broad range of diverse views, or that we get the wisdom of a diverse crowd. On the contrary, it has allowed ever more narrow-interest echo chambers to form. People can now – however niche or extreme their views – find other likeminded people from across the world, no matter how sparsely distributed they are globally. And as we hang around our self-selected echo chambers, we can start to believe that our views are far less niche, extreme or controversial than they really are. Sometimes this is okay (e.g. drag queens or cosplayers finding each other) but some groups share harmful messages or propaganda (e.g. pro-ana or extremist political or religious groups).
Social media divides, even polarises, people and reinforces extreme views – as much as it connects people and exposes them to diverse views. The ‘Splinternet’ – the globally divided Internet that results from divergent national, political, religious and/or commercial interests, or simply from who can afford certain technologies – supercharges this echo chamber effect.
Those who know the least often bark out the loudest, firmest and most immoderate opinions too. We know this because the more we learn about political issues, the more we understand the different viewpoints and nuances that moderate any extreme stances.
Yet, via social media, even the most uninformed individual can now easily air and spread their views globally – often leading to the blind leading the blind. And when we’re intellectually blind, we cannot see that we’re intellectually blind. We then reinforce each other’s views when we group together in our chat rooms, leading our tribes to further believe that we correctly grasp the full picture because plenty of other individuals seemingly agree with us (the ad populum fallacy and communal reinforcement) – similar to how a congregation of followers of a particular religious sect will stick with their own church, synagogue, mosque or equivalent and reinforce each other’s beliefs from within, whilst passively or actively filtering out any opposing views coming from outside of their circle.
Filter bubbles – stemming from our own biased searches for information and from how algorithms selectively guess what information we’d like to see, based on what we’ve searched for before, as well as our location and other personal data – shape our own individual realities. Echo chambers also make us feel validated, accepted, and like we fit in and can be loved, rather than uniquely dim for believing in something that no one else does. Similarity therefore attracts. And blanking, belittling (frequently via fallacious argumentation) or banishing dissenting voices from our groups just reinforces this echo chamber effect.
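To make that mechanism concrete, here’s a minimal sketch in Python of how a hypothetical feed-ranking function might amplify a filter bubble – all the names, weights and data here are illustrative assumptions, not any real platform’s code:

```python
# Illustrative sketch of how personalised ranking can create a filter bubble.
# All weights, names and data are hypothetical assumptions for demonstration;
# real platforms' ranking systems are far more complex and are not public.

from dataclasses import dataclass

@dataclass
class Item:
    title: str
    topic: str

def engagement_score(item: Item, search_history: list[str], location: str) -> float:
    """Guess how likely the user is to engage, based on past behaviour."""
    score = 0.0
    if item.topic in search_history:
        score += 2.0  # boost topics the user has searched for before
    if location in item.title:
        score += 1.0  # boost locally relevant items
    return score

def rank_feed(items: list[Item], search_history: list[str], location: str) -> list[Item]:
    # Sorting purely by predicted engagement means the user keeps seeing
    # more of what they already engaged with - the filter bubble effect.
    return sorted(items, key=lambda i: engagement_score(i, search_history, location), reverse=True)

feed = [Item("Local politics rally", "politics"), Item("New physics result", "science")]
print([i.title for i in rank_feed(feed, search_history=["politics"], location="Local")])
```

Note how nothing in this sketch ever deliberately surfaces unfamiliar topics – any diversity the user sees is purely incidental.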
Likewise, protests or rallies seldom achieve their aims for change, at least on their own – but they act like mass networking events for surrounding oneself with likeminded people, thus enabling those within them to insulate themselves even further from external views. It’s also been a problem when terror suspects who’ve been detained together – perhaps ironically for a de-radicalisation programme – end up being exposed to each other’s radical views and ideas, which fortifies those views further!
This is how over-generalised fears, misconceptions, conspiracy theories, urban myths, conjectures that contradict the wider evidence, grooming and the radicalisation of beliefs can develop.
When we’re under-educated, we won’t realise how under-educated we are because we won’t know what we don’t know. And a little bit of knowledge – or certainly any amount of misinformation – can be a hazardous thing. We won’t recognise that we lack sufficient understanding unless and until we open ourselves up to being taught by outsiders or independents in order to discover what we’re missing – perhaps by taking an academic course on a particular subject. When we think we already know enough to cement our political allegiance with one side (and political beliefs are generally more to do with tribalism than reasoning), we’ll assume that any views that contradict our own must simply be wrong and not worth truly listening to, never mind ever accepting. And the perverse result is that we’ll likely never think we’re under-educated.
For instance, we might strongly believe (maybe because it’s intuitive to our own limited experiences) that if someone else is obese then it’s entirely their own fault – whilst a more extensive understanding of genetic and environmental influences will inform us that the issue isn’t as simple as that. We might also be motivated to believe in complete personal responsibility because our own political biases lead us to argue against the justification for external regulations.
Due to cognitive ease, gossip and oversimplified social media news from the sources we follow, or that are recommended to us, are easier to learn from, and therefore tend to be more readily trusted than the tough technical material of the type that may be taught in academic courses. But achieving a decent grade in a formal education from an internationally reputable institution, on a subject we wish to express knowledgeable views about (and essentially teach to others), is more likely to lead to a fuller and better comprehension of that subject than an informal education based on a self-directed curriculum and our own chosen echo chambers. Academic courses force us to critique and check our proper fluffy comprehension of subjects too. When self-selecting what to learn, we may find it more interesting to learn about the things we want to learn about, i.e. the things that make us feel good by telling us that we’re already correct – when it’s more enlightening to learn about the things that don’t. Woof!
Meanwhile, anyone without relevant qualifications can style themselves as an ‘expert’ on social media; and if, as a social media consumer, one isn’t educated enough on the relevant subject either, then one won’t always be able to tell the difference between who’s giving sound advice and who’s merely sounding credible (e.g. there are those who confuse or conflate ‘introversion’ with ‘introspection’, ‘psychotic’ with ‘psychopathic’, ‘schizophrenia’ with ‘dissociative identity disorder’, and even ‘paediatricians’ with ‘paedophiles’!).
And those with the largest followings tend to employ emotional appeals like anger and hate, or plausible-sounding pseudoscience that plays on people’s hopes and fears (and maybe you must buy whatever they’re promoting to fulfil or allay them respectively), instead of plain factual appeals or nuanced arguments. Things like acupuncture and cannabidiol come along, and there’ll be some proper scientific research showing they’re beneficial for specific ailments (e.g. pain relief and epilepsy respectively), but then marketers jump on the trend by claiming they’re some kind of panacea, without the evidence to support those claims.
Now we’re taught to appeal to emotions because this works to persuade. Yet is this like luring children with candy simply because it empirically works?(!) It works, but we wish it didn’t, because we’d rather everyone were more dispassionately critical – questioning, say, whether one person getting killed in a politically-motivated, and thus terrorist, attack should be considered worse than three people getting intentionally murdered in a ‘regular’ triple homicide. Terrorist attacks are emotionally salient, but perhaps they shouldn’t be considered far more newsworthy, and far more worth investing limited public funds to minimise, than the regular homicides caused by firearms that objectively claim far more lives annually. Greater gun controls in a country like the USA would save more lives than merely focusing on anti-terrorism policies.
So social media has connected the world, but not entirely in the way that had been hoped – it has connected individuals with similar views together into their own groups, which has driven even groups with only slightly divergent views further apart. News and information can now be shared and spread much faster and wider, but so can lies and abuse. Misinformation about a pandemic can even spread faster than the actual virus it concerns(!)
Truth is usually the first casualty because plenty of individuals (not just politicians, businesspeople or so-called ‘elites’) have their own agendas, which they can now serve rapidly and directly to news consumers via platforms like Twitter (e.g. lies, propaganda and conspiracy theories disseminated during election campaigns). These social media platforms facilitate the filter bubbles and echo chambers, so we may only get to hear likeminded views and barely hear alternative perspectives. It’s an age of disinformation as well as information.
The thoughts and ideas that should most be heard often never resonate beyond their origins. Meanwhile, more dangerous ideas frequently propagate like a plague – including disinformation, conspiracy theories and scams.
Content creators with extremist beliefs who attract tons of viewers, and thus make tons of money, are essentially getting rewarded for believing in and spreading their extremist beliefs on the platforms they’re on – hence it’s selfishly rational for them to carry on believing and doing what’s making them wealthy. The viewership appears to validate their ideologies too. The problem is that the platform gets rewarded by the viewership too, hence it faces a conflict of interest between maximising its own profits and maximising its ethics; unless enough shareholders care to put ethics before profits, or external regulations change the incentives by introducing penalties for certain behaviours. (Extremist content and grooming are even attempted in online videogame chats. Custom levels or maps in Minecraft and Roblox are sometimes even created to act out violent extremist fantasies. Both the creators and platforms can again make money from these activities.)
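As a back-of-the-envelope illustration of that shared incentive, here’s a tiny sketch using entirely hypothetical figures (real ad rates and creator/platform revenue splits vary by platform and aren’t represented here):

```python
# Back-of-the-envelope sketch with entirely hypothetical figures -
# real ad rates and creator/platform revenue splits vary by platform.
views = 1_000_000            # assumed monthly views of a controversial channel
revenue_per_1k_views = 5.00  # assumed ad revenue per 1,000 views (USD)
creator_share = 0.55         # assumed creator's cut of the ad revenue

total_ad_revenue = views / 1000 * revenue_per_1k_views   # $5,000
creator_income = total_ad_revenue * creator_share        # $2,750
platform_income = total_ad_revenue - creator_income      # $2,250

print(f"Creator earns ${creator_income:,.2f}; platform earns ${platform_income:,.2f}")
# Both parties profit from the very same views, whatever the content's ethics -
# hence the conflict of interest when deciding whether to de-monetise it.
```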
Meta is arguably a giant media company because many people take their news from Facebook. But no one necessarily needs to follow journalistic standards to publish news there (nor on the web generally), thus tons of influential misinformation abounds on social media in all languages.
So journalism has been democratised, but most of the social media content that users treat as their news source nowadays isn’t created by trained journalists who follow journalistic codes or standards, like fact-checking information before publishing it. Advertising money has shifted from traditional news channels to a small, concentrated handful of powerful, global tech corporations like Google and Meta that don’t even pay for content (which is one excuse at least some of them give for not interfering with what their users publish on their platforms, even if someone is spouting falsehoods – they believe they’re passive platforms rather than content editors). Domestic laws are perhaps beginning to shift on this issue by getting these corporations to pay news publishers for displaying their content, as has happened in Australia since 2021.
Much news content isn’t directly paid for anymore, hence quality journalism has been a victim. The way to make money with social media news is via views, likes and shares, to ultimately attract the advertising revenue. This has incentivised the proliferation of clickbait, even from professional journalists, as they try to adapt to this environment. Getting something to go viral competes with conveying pure truthfulness and informativeness (see Post No.: 0583).
Reacting to other people’s content is popular content too! This is neither new nor a judgement but an observation. Yet we must watch out because many viral hit videos are totally staged, including the targets of pranks who were in on it and just acting. Many videos of dumb people, real or staged, make us the real dumb ones for watching, sharing and making them popular – it’s the survival of the dumbest! A lot of content is artificially scripted precisely in order to get views and reactions, which prompts people to share it. Many things are done nowadays just to attract influencers to post photos of them, like the way food is served in some restaurants. Highly partisan views, fake news, conspiracy theories and trolling have also often become the news itself, which distracts us from the real or other news.
We therefore need to protect the profession of journalism.
Woof!