
Post No.: 0480tech

 

Furrywisepuppy says:

 

We can have different kinds of relationships with tech – they can be in us (e.g. nanotechnology capsules), between us (e.g. phones when making calls), added to us (e.g. prosthetic limbs), be like us (e.g. droids and AI in the physical or a virtual world) or be about us (e.g. apps that gather our health data). There’s tech merged with us and tech merged with the environment, such as the immersion or augmentation afforded by extended reality (XR) – and maybe something else will be dreamt up in science fiction before becoming reality one day.

 

Technologies are part of our interfaces with the world and shape our relations with the world – they are mediators or liaisons that let us experience the world and be present in it in specific ways. And especially as the boundary between artificial machine and us gradually blurs – how far can it go? And how far should we allow it to go?

 

Tech shapes how we as beings are in our world – it shapes our actions and practices, or the forms of engagement and involvement with the world. It also shapes how the world is there for us – it shapes our perceptions and experiences, or our meanings and views. We shape technologies that shape the world and our place in it, and technologies also shape the world that shapes us. Bicycles and trains, for instance, liberated people so that they could work in, and find mates from, places that weren’t just within a few miles of home – so they changed how people worked and found love. Video conferencing tech and dating apps have again changed how people work and find love.

 

Cars have liberated us, yet have at the same time changed our world, cultures and lives with the road networks and town planning needed for them to function (with pedestrians and cyclists often placed second to cars). They’ve also introduced the need for new laws, created new pollution and congestion problems, changed our perception of what’s ‘far away’, and play a role as status symbols for social comparisons between neighbours.

 

Tech has ultimately shaped how we get food, date, study, work, play, sleep (e.g. artificial lighting), travel, interact with others, etc. This happens not only at the individual level but at the social level too (e.g. an interactive whiteboard affects the learning and teaching experience of pupils and teachers, and also affects the role of pupils as active participators rather than mere audience members). Technologies are therefore hardly passive background things – they influence our lives and behaviours enormously. When viewed as just objects in our environments and as mere means to an end, technologies are just neutral, dead tools – but when viewed as instrumental mediators of our perceptions and behaviours, they’re far from neutral. After all, technology is just a subset or part of our environments, and it’s our genes in conjunction with our environments that sculpt how we ultimately are.

 

There’s no state of pure freedom from influences – our freedom is and was always dependent on all kinds of influences we didn’t choose, or at least didn’t unilaterally choose, from our upbringing, surrounding cultures and indeed technological environments. Nobody’s attitudes, beliefs, desires or realities developed in an independent vacuum. Tech is so fundamental in modern life in particular that we shouldn’t pretend we are entirely independent of it. Indeed, modern tech is the difference between Palaeolithic humans and present-day humans, for the species’ genetics have barely changed yet people’s lives are vastly different today compared to then.

 

So behaviour-shaping influences have always inevitably been there, and once we acknowledge them, we take on the moral responsibility to design technologies in more desirable ways rather than irresponsibly or blindly accepting whatever we make or are given. We should see it as a democratic right to get involved in uncovering and guiding the ethical issues related to tech.

 

Tech even shapes our moral decisions, either directly via our actions or indirectly via our perceptions and interpretations of the contexts in which we decide our actions. For instance, in a world without rubber rings next to riverbanks, it’s relatively less immoral to fail to act to help someone who’s struggling in the water. And wearing a facemask when unwell (whether due to having COVID-19 or not) might, from now on, become a far more commonplace responsible thing to do across the world.

 

Sonograms make parents responsible for their child’s congenital abnormalities – before this, if a child were to be born with a birth defect then it perhaps would’ve been considered ‘fate’, but now parents can choose to abort once they’ve seen the pictures and other test results. Parents can opt out of taking a scan, but they can no longer opt out of the decision to opt in or out of taking a scan, nor out of the consequences of that decision either way – hence they cannot escape this responsibility anymore.

 

Contraceptive pills helped liberalise attitudes towards casual sex. Intelligent speed adaptation features can influence drivers’ behaviours in cars. Robots used in teaching or healthcare mediate how we teach or give care. Mobile phones shape how we experience each other and have made us reachable to others 24/7. Our smartphones organise our attention, usually onto themselves rather than onto the people physically around us, as we peruse our ‘social’ media(!)

 

Tech can alter our perceptions of morality because a drone strike assassination directed from a joystick and screen situated in a little hut half the world away feels less disgusting or brutal than an eyeball-to-eyeball cleaver-hacking assassination. Online social media technologies can dehumanise people, hence death or rape threats are easier for trolls to make online than offline. Will being able to resurrect animals from just their genome record (Jurassic Park-style), or really convincing robotic animals, make us care less about conserving the existing or real ones? Will pollination tech mean that insect pollinators can be forgotten about? When medical diagnostic machine-learning technologies get involved in ethical decisions, this poses the question of whether ethics is something that only humans can do. When deep brain implants interfere with our moods and behaviours, they will challenge our ideas about autonomy. Such implants are used to treat certain neurological conditions, but it’s not difficult to see how they could be abused.

 

Intended or not, for better or worse, technological advances even shape our metaphysical or religious beliefs and attitudes. For instance, fMRI scanners showed us how the mind – the source of our thoughts, attention and behaviours – is a function of the physical brain rather than some incorporeal soul. Sonograms have influenced attitudes about abortion, which isn’t allowed in many religions. IVF has shifted the boundary between the given and the made, or fate and choice. Technology advances the tools we use to conduct scientific experiments (e.g. lenses for telescopes, electron microscopes), and these have changed our view of our place in the universe and what it is made of. So tech can even challenge our philosophical concepts and theories. Woof.

 

Tech is involved in almost every dimension of our societies and existences nowadays. It ranks things (e.g. webpages), recommends things (e.g. what to watch or buy next), makes rapid investment decisions, increasingly collects our private data, helps diagnose medical problems, transports us, provides us with company, etc. – hence ethical questions are intrinsically linked with new and pervasive technologies. A technology can be good in some ways and bad in others (e.g. search engines can expand our intellects, yet can also severely weaken them, for a hell of a lot of rubbish can easily be found on the web too and it’ll never die out there – hence we can innovate clever ways to make populations dumber or more divided over time!) Various gadgets make our lives easier yet simultaneously complicate them. New tech tends to give new tools for criminals, cheats and fraudsters to exploit. We also tend to trust computers over humans, thinking they’re unbiased and objective (they follow their code to the letter) – but they can be hacked into or be plain buggy, or, for AIs, be badly trained.

 

Do we still control the tech we use, or does it in a sense control us now? Have we already become too dependent on it and thus essentially enslaved to it? Have people become merely ‘human resources’ like any other resource (collecting personal and usage data is regarded as like striking oil nowadays), or become cogs in a big machine in an ecosystem that’s run by just a handful of giant, multinational tech companies?

 

Tech mediates our knowledge of the world (e.g. how and where we get our news from), the legal questions we ask (e.g. before we had flying machines, aviation regulations weren’t necessary) and the ethical questions we ask (e.g. is gender reassignment acceptable just because it’s possible?), and our answers to them. Technologies even challenge the distinction between what’s natural and what’s artificial. Rather than being merely external to humans, they help to shape what it means to be human, especially for transhumanists.

 

So this profound impact of technologies charges users, distributors, manufacturers, designers, lawmakers, policymakers and other stakeholders with the responsibility to get actively engaged in shaping this impact rather than sleepwalking into problems, especially because modern tech can have such power, reach and influence on societies globally. Well if you want to be a megalomaniac or Big Brother, you’ll need the appropriate tech to make your ambitions possible(!) Technologies are used to gain power and dominate, whether over our bodies or minds. They divide societies between those who can afford a technology and those who cannot. Even if something is for the greater good and we want to take better care of each other, such as using viral infection contact tracing apps, we need to care about things like how far such a surveillance tool should be allowed to go. Tech might not be the answer to every problem because nothing comes for free. What’s better off privately or publicly owned? There’s technocracy versus democracy. Technology is always political. Tech and ethics are intertwined.

 

Some of us ascribe deep meaning and attachment to personal objects like fluffy teddy bears or favourite squeaky toys. Some go further and believe in animism – that potentially all non-human things, or even phenomena, intrinsically possess souls and thus agency. Critics, however, fear that ascribing moral agency to non-human entities would obviate human responsibility – including blaming autonomous vehicles for traffic accidents rather than their drivers/users, other road users, vehicle manufacturers or those who plan and maintain the roads. They perceive a clear distinction between ‘objects’ that don’t have agency and ‘subjects’ that do.

 

But perhaps that’s not the point – for even if technologies aren’t moral agents, they mediate our moral choices and actions. It’s like it’s the combination of the gun and the human that causes gun crimes. Perhaps many crimes wouldn’t even be attempted if it weren’t for guns (e.g. holding up a bank with a knife is much harder than with a gun). Ethical decisions and actions aren’t taken within a vacuum but within a context in which technologies inevitably play a role. Human agency is therefore not totally free but fettered, in part by the tech that’s in the environment. Similar to how we cannot not communicate – for even when silent we’re communicating something – whatever’s in our environment cannot not influence us either, even if it’s just sitting there inanimately, never mind animatedly. This doesn’t obviate human responsibilities but makes everyone realise that the way we design, implement and use technologies needs to be thoughtful. Designers make ethical decisions through their designs. Sellers, lawmakers, voters, etc. are all decision makers and stakeholders too, although not every stakeholder is alike or holds as direct a say.

 

Woof. Tech can always be used for good or bad. It is the role of (a well-informed) society at large to decide how to use any technology – it is not the role of their inventors, marketeers or scientists to decide such things alone.

 
