Post No.: 0008
Placing our eyes onto something is no guarantee of seeing it, and hearing something is no guarantee of listening to it, for instance. And never mind that we may look but not see when we’re not paying attention – even when we do pay attention to what we’re seeing, we only see clearly what falls within our foveae (the central regions of our retinas that pick up fine detail), with the rest of the picture somewhat assumed. We also have a blind spot where the optic nerve meets each retina, yet we don’t perceive these blind spots at all because the brain makes assumptions to fill these gaps in – so seamlessly that we don’t even realise the gaps in our vision are there. Even within this field of vision, we can only see a tiny fraction of the total electromagnetic spectrum – and according to current best theories, even the entire electromagnetic spectrum can only reveal about 5% of what this universe is made of (the rest being what are provisionally called ‘dark energy’ and ‘dark matter’)!
Plus we’re only seeing this universe, and possibly missing an interwoven matrix of multiverses out there, maybe?! We also really only see 2D images that land upside-down on our retinas – it’s not our eyes but our own brains, our own interpretations of the collected signals, that do most of the seeing or ‘seeing’. Quantum mechanics even suggests that there’s no such thing as objective reality – that two different observers could legitimately experience conflicting realities. Reality, or the world outside our brains (and perhaps within them too), may not be as it seems to us.
So seeing is barely knowledge at all. If you think that merely seeing something with your own naked eyes makes you fully know it, then you barely know anything at all – we know that we don’t know very much about this universe, never mind what we don’t know we don’t know. Anyone who thinks they know it all is hence sadly far off the mark. All our senses combined can only sense a tiny fraction of the total information out there, including what’s immediately around us – we don’t even have a conscious sense for magnetism, for instance (never mind other information or senses we cannot even speculate about because we have no conception of their existence).
And despite all this – the scant information we do receive about the world around us ends up being filtered and interpreted through the biased lens of our own brain’s existing ‘internal model’ or sensitisation. For instance, even though aroma molecules are still physically present in the air and unchanged, we can become ‘nose blind’ when desensitised to a smell – leading people to claim, with bias, that other people stink but they themselves don’t. And it’s not fair that it tends to be the family dog who gets blamed for these bad smells too :(. We may perceive that a well-lit room is brighter than outside on a dull day, but an objective light meter may prove otherwise. Or the music we don’t want to hear seems loud while the music we do want to hear doesn’t – leading to arguments between a person who says it’s too loud and a person who says it isn’t! Woof!
So your fuzzy brain sculpts your reality subjectively out of the already-narrow trickle of data it can gather through your limited senses, and from that mere trickle it constructs what seems like a full story about the outside world. Now we cannot appeal to popular perception for the ‘correct version of reality’ either, because every animal species perceives the world outside its own brain differently due to differences in its senses (like different machine-learning robot designs fitted with different sensors) – and humans are not the most numerous animals in the world anyway. Not that all humans agree with each other about what they receive and perceive either, e.g. synaesthetes (those who experience cross-talk between sensory areas, such as sounds with tastes) or sufferers of schizophrenia or psychosis (those who cannot compartmentalise and separate internal imaginations or dreams from external sensory information). Synaesthesia clearly highlights that there’s no ‘one size fits all’ interpretation of reality, even between humans. We all likely possess some genes correlated with schizophrenia too – it just depends on how many, in combination with our idiosyncratic environmental factors.
And we don’t truly know anything beyond our own interpretations and qualia, e.g. what exactly does pain feel like to someone else – how does the word ‘excruciating’ actually convert one-to-one into an actual feeling? What if one person’s experience of 510nm-wavelength light, their ‘green’, consistently looks different from another person’s? They’d never know that they’re seeing different qualia of colour inside each other’s brains, even though they might still hold the same (learnt) associations for the wavelengths of light they receive (e.g. 510nm means ‘go’). There are objective air compression waves of sound and wavelengths of light, but what we experience as the qualia of ‘the note middle C’ or ‘the colour green’ are only our own brain’s interpretations of these waves. Analogously, maybe it’s like how the exact same series of 0s and 1s, low and high voltages, can be interpreted to mean one thing by one (part of a) digital computer software program and something else by another (part of the) program?
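To make that analogy a little more concrete, here’s a rough sketch in Python (the hexadecimal constant is just an illustrative choice): the very same 32-bit pattern can be read as a whole number by one routine and as a fraction by another, and neither reading is the ‘objectively correct’ one – the meaning lives in the interpreter, not in the bits.

```python
import struct

# One fixed 32-bit pattern (an arbitrary example value for illustration).
bits = struct.pack(">I", 0x40490FDB)

# Interpretation one: read the four bytes as an unsigned integer.
as_int = struct.unpack(">I", bits)[0]

# Interpretation two: read the exact same four bytes as an IEEE-754 float.
as_float = struct.unpack(">f", bits)[0]

print(as_int)    # 1078530011
print(as_float)  # ~3.1415927 (this bit pattern happens to encode pi as a float)
```

Same ‘signal’, two entirely different experiences of it – a bit like two brains receiving the same wavelengths of light.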
A smell that is ‘disgusting’ to you isn’t objectively disgusting – it’s only disgusting to you due to the balance between human genetic evolution (your particular genes), your own personal past experiences (e.g. the smell of strong cheese, which some people eventually learn to like if they like the taste of such cheeses or otherwise come to associate the smell with a positive experience), and the current context (e.g. a cheesy smell is delicious on cheese but horrible on feet!).
These qualia are not objectively ‘there’ in the outside world – they’re all only inside our own heads. The same shade of tile can look light grey or dark grey depending on the context we perceive it in – the exact same wavelengths of light are hitting your own retinas (so you’re comparing yourself with yourself from one moment to the next, never mind comparing your perceptions with somebody else’s), yet we don’t see objectively and absolutely but subjectively and relatively. And this isn’t a matter of some people being physically colour blind (the colour-receiving cones in the eyes being missing or faulty). Did you see a black-and-blue or a gold-and-white dress? Did you hear “laurel” or “yanny”? It just depends on your own eyes or ears in combination with your own brain’s internal model and current state.
No species – and therefore no individual – experiences an objective version of reality, only a version created from the bits of information their particular limited senses have evolved to receive and the particular physical neural structure of their brains. So how can we be absolutely sure we’re all experiencing the same version of reality (e.g. that the colour red looks the same to all of us, or that a particular sound means pleasure rather than danger)? It’s possible that every single individual brain has a unique internal model telling at least a slightly different narrative.
The upshot is that we should all learn to be more tolerant of different and diverse views, and not be naïve in thinking that our own experiences and perceptions are the only ones possible – never mind thinking that they’re objective and indisputable.
Woof. My furry hope is that a reduction in self-righteous arrogance can produce a more empathic and peaceful world!