
Post No.: 0929

 

Furrywisepuppy says:

 

Some businesses collect huge amounts of our personal data. They apparently do so to give us a more customised and personally relevant experience (e.g. better-targeted adverts). But some private corporations collect as much data as they can from their users even when they don’t have an immediate use for it (data hoarding). It’s all written in their lengthy terms and conditions, though – if you can read and understand them – so we’re ostensibly making a fully informed choice. But we may be opting into things we’re not aware of. This is where regulations that simplify the terms, or that more clearly summarise the main and crucial points, can help consumers – because their T&Cs only care about and protect the firm’s interests, not their customers’ interests. Don’t use their services if you disagree with their terms – but their services might be hard to avoid if you want to live a modern life.

 

The issues concern consent, trust and transparency, amongst others.

 

Under EU General Data Protection Regulation (GDPR) rules – consent must be freely given, specific, informed and unambiguous before businesses (or even simple non-monetised blogs) can use your personal or behavioural data for marketing purposes. Users need an easy way to check what they’ve apparently agreed to, an easy way to see what data a business currently holds about them, and a way to have this data deleted on request. Such regulations don’t stifle choice yet aim to protect consumers by making them more aware of what information will be collected and how it’ll be used.
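
To make those obligations a bit more concrete – here’s a minimal Python sketch of what a consent ledger could look like. Every class and method name here is hypothetical (not from any real compliance library), and a genuinely compliant system would need far more, such as audit trails and secure storage:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                          # specific purpose, e.g. 'email marketing'
    granted_at: datetime                  # when the user opted in
    withdrawn_at: Optional[datetime] = None


class ConsentLedger:
    """Hypothetical ledger: tracks what each user agreed to, and supports erasure."""

    def __init__(self) -> None:
        self._records: list[ConsentRecord] = []

    def grant(self, user_id: str, purpose: str) -> None:
        # Consent must be tied to a specific, named purpose.
        self._records.append(
            ConsentRecord(user_id, purpose, datetime.now(timezone.utc)))

    def withdraw(self, user_id: str, purpose: str) -> None:
        for r in self._records:
            if (r.user_id == user_id and r.purpose == purpose
                    and r.withdrawn_at is None):
                r.withdrawn_at = datetime.now(timezone.utc)

    def active_consents(self, user_id: str) -> list[str]:
        # An easy way to check what a user has (apparently) agreed to.
        return [r.purpose for r in self._records
                if r.user_id == user_id and r.withdrawn_at is None]

    def erase_user(self, user_id: str) -> None:
        # Delete everything held about this user, on request.
        self._records = [r for r in self._records if r.user_id != user_id]
```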

 

Notwithstanding, the vast majority of people don’t even attempt to read the T&Cs or privacy policies before clicking ‘agree’ anyway! They’ll just follow the crowd by assuming that a service must be fine if thousands of other users are using it. Ignorance maintains bliss. How many of us carefully read every cookie consent notice before clicking ‘accept’ on a website?!

 

Laws that make ‘opted out’ the default (as opposed to ‘opted in’) do help, but sometimes you cannot really refuse an option if you want to use a service at all. And if you use an app that logically must have access to your fluffy contacts or images, you must usually give developers access to these sets of data wholesale, i.e. not just the specific contacts or images needed to make the app function. With just one consent request, they could potentially do pretty much whatever they want with your data too.

 

For the ‘free’ service they give you, you trade them your personal data, which they’ll exploit to time and target adverts at you (e.g. to persuade you to buy stuff or alter your political beliefs), sell to other businesses, or use to conduct experiments and influence your behaviours (e.g. to learn how to better keep you hooked on their apps and ultimately maximise their profits). Users’ instincts are exploited via deliberate habit-forming design choices – read Post No.: 0398. And just like junk food retailers don’t want you thinking about the calories – these apps don’t want you thinking about the data you’re surrendering but about the instant gratification of using their product. You could also say that we surrender personal information to governments in return for the perks of being a citizen of a country (e.g. to access benefits). So the question is – are you okay with this trade? You may well be fine with it.

 

So such ‘free’ social media services aren’t really free – we trade our data for the service, and are put in demographic bins for advertisers to target us. We also pay with our time and attention whenever we watch these adverts.

 

Store loyalty card schemes are also about trading your data for discounts. And although in this context customers presumably already like and use the store and would like discounts on the things sold there – many of these firms also trade your data with third parties for extra profit.

 

Data apparently isn’t something that can be legally owned under intellectual property law. We can bundle data into things like copyrights or database rights, which do carry IP rights, though. Data protection laws can allow us to use data in certain ways and give individuals rights over that data – so it has value; yet we cannot own data as such.

 

Regarding using big data analysis techniques to predict health epidemics – these can inform a national health service to stock up on medicines in the right places at the right times. But health insurance companies could also use such data to determine high-risk areas and accordingly charge those who most need health insurance higher premiums. Insurance relies on the pooling of individual risks that can only be known in aggregate. Actuaries try to predict an individual’s risk based on the historical data of those they deem most similar to them. Will this therefore be based on illegitimate discrimination – as if all homosexual individuals behave the same way, for instance – or is this just efficient market dynamics (more perfect price discrimination)?
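
To make the pooling point concrete – here’s a toy Python sketch with entirely invented figures, showing how splitting one big risk pool into finer segments makes premiums diverge, so that those who most need the cover end up paying the most:

```python
# Toy illustration with invented numbers: expected annual claim cost per person.
# One big pool spreads risk across everyone; finer segmentation shifts the
# cost onto the high-risk group (more perfect price discrimination).

population = [
    # (segment, number of people, expected claims per person per year in £)
    ("low-risk area",  9000, 100),
    ("high-risk area", 1000, 900),
]

# Single pool: everyone pays the same premium.
total_claims = sum(n * cost for _, n, cost in population)
total_people = sum(n for _, n, _ in population)
print(f"Pooled premium: £{total_claims / total_people:.2f}")      # £180.00

# Segmented pools: each group's premium tracks its own expected claims.
for segment, n, cost in population:
    print(f"{segment}: £{(n * cost) / n:.2f}")                    # £100 vs £900
```

With one big pool, everyone here pays £180 a year; segment the pool and the high-risk group’s premium rises ninefold – the same total claims, but a very different distribution of the burden.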

 

Algorithms can possibly predict us better than we can predict ourselves. Better predictions of people’s lives will reduce decision risk, but a life insurance provider may know so much about you that it knows when it’s best – from its perspective – to offer you insurance (when you’re likely to live for many more years) and when not to (when you’re not likely to live for much longer), based on when it predicts you’ll die. That’s all to its advantage and not yours. It’s like how flood insurance is excluded from standard home insurance policies if you do live in a flood-risk area, and vice-versa! You’ll either be paying for something you likely won’t need, or be unable to purchase something you likely will need.

 

We’re generally so highly predictable that companies can reasonably confidently infer and predict our personalities and preferences – like our personality type, political views, religion, ethnicity, age, gender, sexuality and attitudes to drugs – just from the content of our posts and our likes, even when we don’t explicitly reveal these details about ourselves on social media. A retailer’s algorithms can easily predict if you’re expecting a baby based on your recent purchases, for example, and thus present more baby-related adverts to you.
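
A toy sketch of the kind of inference involved – assuming NumPy and scikit-learn are available, and using an entirely made-up ‘likes’ matrix – where a simple logistic regression learns an unstated binary trait purely from liking behaviour:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented data: rows are users, columns are pages they could 'like' (1 = liked).
n_users, n_pages = 1000, 50
likes = rng.integers(0, 2, size=(n_users, n_pages))

# Pretend some undisclosed binary trait happens to correlate with a few pages.
trait = (likes[:, [3, 17, 42]].sum(axis=1) >= 2).astype(int)

# The platform never needs users to state the trait - it learns it from behaviour.
model = LogisticRegression(max_iter=1000).fit(likes[:800], trait[:800])
accuracy = model.score(likes[800:], trait[800:])
print(f"Predicted an unstated trait with {accuracy:.0%} accuracy on held-out users")
```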

 

Advertisers can use browser cookies to track our activities across the web. Assumptions like guessing that people with ethnic names belong to certain ethnic groups just need to work most of the time – they don’t have to get every single inference correct. Adverts don’t expect a 100% conversion rate, and voting contexts especially only need an overall result.
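
A minimal sketch of that cookie mechanism, assuming Flask – an advertiser’s ‘tracking pixel’ endpoint, embedded as an invisible image on many different sites, which assigns each browser an ID cookie and logs which pages that ID turns up on (all names and routes here are illustrative):

```python
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)

# A 1x1 transparent GIF - the 'invisible' image embedded on publishers' pages.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

@app.route("/pixel.gif")
def pixel():
    # Reuse the browser's existing ID cookie, or mint a fresh one.
    uid = request.cookies.get("uid") or uuid.uuid4().hex

    # The Referer header reveals which page embedded the pixel - the same
    # uid seen across many different sites gradually builds a browsing profile.
    print(f"user {uid} visited {request.headers.get('Referer')}")

    resp = make_response(PIXEL)
    resp.mimetype = "image/gif"
    # Third-party cookies must be SameSite=None and Secure in modern browsers.
    resp.set_cookie("uid", uid, max_age=60 * 60 * 24 * 365,
                    samesite="None", secure=True)
    return resp
```

Several browsers now restrict third-party cookies like this, but the same profiling principle carries over to newer techniques such as browser fingerprinting.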

 

Freedom of thought, privacy and autonomy are fundamental basic rights – yet you can see that in some situations it’d be great to know as much as one can about people. And this isn’t necessarily about profiting from them either (e.g. determining whether a migrant is a genuine asylum seeker).

 

Should firms like Alphabet, Apple, Meta and Microsoft allow governments or unaffiliated individuals to use their (our) data, like location and biometric data, in order to analyse and predict epidemic trends, assist in countering crime or terrorism, more efficiently allocate public resources, improve town planning, assist in emergency responses and other issues for the public social good? If lots of people in a geographic area are suddenly searching online for flu remedies then it suggests a trend of influenza in that area for example.
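
As a toy illustration of that flu-search idea – with entirely made-up counts – a region gets flagged when today’s searches for flu remedies sit well above its own recent baseline:

```python
from statistics import mean, stdev

# Invented daily counts of searches for flu remedies, by region.
searches = {
    "Region A": [52, 48, 50, 47, 51, 49, 50, 53, 49, 124],  # sudden spike today
    "Region B": [30, 33, 29, 31, 32, 30, 28, 31, 33, 34],   # steady
}

for region, counts in searches.items():
    baseline, today = counts[:-1], counts[-1]
    # Flag anything more than three standard deviations above the baseline mean.
    threshold = mean(baseline) + 3 * stdev(baseline)
    if today > threshold:
        print(f"{region}: {today} searches vs ~{mean(baseline):.0f}/day "
              f"- possible outbreak signal")
```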

 

Sometimes there’s a kind of NIMBYism though – we don’t mind other citizens sharing their data to assist the greater good but we’d rather our own personal data be kept to ourselves. We also want our own deceits to remain hidden but other people’s deceits to be exposed! People can want their own lives to be private yet want all celebrity gossip to be publicly revealed, hence why paparazzi shots and gossip magazines are in such demand. (News International even used phone hacking methods to capture gossip on celebrities and on relatives of deceased people – which caused the victims to wrongly blame their friends and family for leaking information to the press.) We want others to be transparent but us to keep some secrets.

 

Precisely how the credit agencies calculate your credit score lacks transparency, yet our lives are at their mercy. It’s not objective either – hence different agencies can give you different scores. And when the processes behind such scores are opaque, there’s little avenue for fairness or due process, and this can segregate people.

 

You could imagine a world in which we each have a ‘consumer score’ and ‘health score’ as well as a ‘citizen score’, and these have a real bearing on what options are available to us – a kind of capitalist apartheid, i.e. everyone is rated and served according to how much profit a corporation estimates it’ll capture from each individual, and therefore the rich will get the best options and the poor will get the worst, which keeps the rich rich and the poor poor. At airports, there could one day be different queues for ‘trusted travellers’ versus ‘less-trusted travellers’; or you mightn’t even be able to walk into a Bugatti dealership for a look if the scanners at the entrance read that you cannot afford one. This kind of world can be served by organisations acquiring ever finer-grained personal data on every single one of us.

 

But again, the problem with credit scoring, criminal profiling and the like is that they apply general information to try to predict specific individual outcomes. And maybe such scores could be manipulated on the inside if they’re not transparent? Perhaps identity thieves could steal someone else’s identity and scores? Credit reporting company Equifax, which holds information on customers like social security numbers, dates of birth, addresses and driver’s licence numbers, experienced a data breach in 2017 that put ~150 million customers at risk of identity theft!

 

China’s Social Credit System is frequently misreported in Western media as being akin to a Black Mirror episode. Nevertheless, the idea alone – of utilising operant conditioning to get people to behave as desired – appears scary for cultures that value their individual liberty more than their collective cooperation. However, that’s essentially just like how social media platforms reward people for staying on their platforms! In the West, we get rated for our taxi rides, and have credit scores and criminal records too.

 

Is making reputations transpicuous to tackle corruption and promote social order a bad thing? Well, it depends on what a particular government thinks that looks like – it might look like oppression to us. Some private corporations have considered ideas like microchipping employees to monitor their activities while at work too.

 

Do/will governments and corporations holding so much private data about us offer us convenience or present a dystopia? Our DNA data could one day be used against us. Mistakes in or deliberate tampering with our personal data, like credit data, can wreck our lives. And we may be more inclined to trust in a computer because we assume machines can only be objective.

 

We worry about our private furry data being wrongly collected and stored, misused, stolen or sold on without our permission by organisations.

 

We worry about technologies being abused by governments, corporations or really anyone (e.g. tiny cameras, tracking devices or ‘stalkerware’ planted by ordinary citizens) for their own self-interests. (And is it okay for manufacturers to ‘plausibly deny’ all responsibility for the behaviours of their users by simply stating disclaimers that say ‘we don’t approve of such a use’ while knowing full well that this is how some people will abuse their products?)

 

Well, we may object to government-imposed ID cards that hold our personal information, or loathe CCTV cameras everywhere invading our privacy – yet happily and freely submit our personal data and images to social networking platforms every day!

 

Woof. There’ll be further posts on this topic – but if you have anything to add on this subject of your data, you can reply to the tweet linked to this post.

 
