Have you ever thought twice about posting something on social media for fear of it reflecting badly on you in the future? What we do online is forever out there for others to see and judge.

In a world of big data and online surveillance, our actions and behaviours are increasingly monitored, measured, interpreted and rated. Could this be leading to a society where we are afraid to take risks or speak our minds?

Technology critic Tijmen Schep has described this effect as ‘social cooling’, a nod to the chilling effect that constant scrutiny can have on society. Here’s how it works.

First, your data is collected and stored. We’re not just talking about personal details here, but all the things you have liked, searched for, bought or commented on – your entire digital footprint.

Your data is then bought by data brokers, who use algorithms based on societal trends to predict thousands of additional details about you, your life and your habits.

This new data forms your ‘digital reputation’, which can be used against you. A crude example of this is how recruiters can scan your social media history to make assumptions about your lifestyle and personality – too many party pics and you might miss out on that dream job.

Take this a step further, and before you know it, algorithms are using your digital reputation to inform potential employers about your personality flaws and the state of your mental health. The problem is, we just don’t know how the data we produce today could be used against us in the future.
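
To make the idea concrete, here is a deliberately crude sketch of the kind of inference involved. Every signal, rule and confidence value below is invented for illustration – real data brokers work with far larger models and datasets – but the principle of mapping observed behaviour to guessed traits is the same.

    # A toy digital footprint. All signals here are hypothetical.
    FOOTPRINT = {
        "likes": ["craft beer", "poker", "late-night memes"],
        "purchases": ["energy drinks", "party supplies"],
        "searches": ["how to call in sick"],
    }

    # Hypothetical rules mapping an observed signal to an inferred trait
    # and a made-up confidence value.
    INFERENCE_RULES = {
        "craft beer": ("drinks alcohol", 0.8),
        "poker": ("risk taker", 0.6),
        "party supplies": ("frequent partier", 0.7),
        "how to call in sick": ("low job commitment", 0.5),
    }

    def infer_traits(footprint):
        """Turn raw behavioural signals into guessed traits."""
        traits = {}
        for signals in footprint.values():
            for signal in signals:
                if signal in INFERENCE_RULES:
                    trait, confidence = INFERENCE_RULES[signal]
                    traits[trait] = max(traits.get(trait, 0.0), confidence)
        return traits

    print(infer_traits(FOOTPRINT))
    # {'drinks alcohol': 0.8, 'risk taker': 0.6,
    #  'frequent partier': 0.7, 'low job commitment': 0.5}

Notice what comes out the other end: a single search term has become a labelled personality trait with a confidence score attached – guesswork dressed up as data.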

As a result, people are changing their behaviour to improve their digital reputations and gain better access to opportunities. While protecting your digital reputation can mean a better job or a bigger loan, it comes with a heavy price.

Conform – or suffer the consequences

Social cooling is already underway. From avoiding webpages that might reflect badly on us to thinking twice before uploading silly holiday photos, we all practise online self-censorship for fear of negative consequences.

This has been taken a step further in China, where the government is trialling a system that uses citizens’ digital reputations to award a social credit score, representing how well they have behaved in the eyes of the state. If you buy the ‘wrong’ things or act in the ‘wrong’ way, the state marks you down. Those with a low score are punished with restricted opportunities, creating a powerful tool of mass coercion.

In a world where everything is online and interconnected, social credit systems can be used to grant or deny access to public spaces, transport, products and services, forcing citizens to conform or be shut out of full participation in society. And this can all happen without any resistance – because resistance would damage your social credit rating.

What’s your rating?

These days, it is possible to rate pretty much any product, service or experience. Sites such as TripAdvisor, Yelp and Glassdoor allow us to rate businesses and even individual professionals on a one-to-five scale.

This has given everyday consumers a voice like never before, as well as access to a wealth of feedback and reviews, allowing us to cut through the marketing noise and listen to real opinions instead. In the case of Glassdoor, employees now have the chance to rate their employer, resetting the power balance between the two.

But the proliferation of such metrics, and the importance we place on them, can have unwanted consequences.

The pressure to score well can drive those being rated to do whatever it takes to climb the rankings. This can mean gaming the system, whether by paying people to write fake reviews or by posting negative reviews of competitors.

Even genuine customer ratings rarely give an accurate picture of a product or service, because most people only write a review after an exceptionally good or bad experience.
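
A small simulation makes this selection bias visible. The model below is entirely hypothetical – true satisfaction is drawn from a bell curve, and only customers with extreme experiences leave a review:

    import random

    random.seed(42)

    # Hypothetical model: true satisfaction follows a bell curve around
    # 7 out of 10, but only customers with extreme experiences (below 4
    # or above 9) bother to leave a review.
    customers = [random.gauss(7.0, 1.5) for _ in range(10_000)]
    reviews = [s for s in customers if s < 4.0 or s > 9.0]

    true_avg = sum(customers) / len(customers)
    review_avg = sum(reviews) / len(reviews)

    print(f"True average satisfaction: {true_avg:.2f}")
    print(f"Average among reviewers:   {review_avg:.2f}")
    print(f"Customers who reviewed:    {len(reviews) / len(customers):.0%}")

In this parametrisation, far more delighted customers than disgruntled ones cross the threshold to review, so the average among reviewers lands well above the true average. Shift the thresholds and the bias changes direction, but it rarely disappears.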

Unforeseen consequences

In some cases, there are inherent problems with the way ratings are awarded. For example, if we rate doctors on mortality rates, a doctor who takes on terminally ill patients or those with a slim chance of survival – the very people who most need a good doctor – will score lower than one who only accepts patients likely to live. As a result, doctors will avoid treating the patients who need them most, for fear of tarnishing their reputation.
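
The arithmetic is worth spelling out. In the hypothetical scenario below, two equally skilled doctors face identical survival odds for each type of patient; the only difference is the mix of cases they accept:

    # Two equally skilled doctors: each saves 30% of high-risk patients
    # and 99% of low-risk patients. Only their caseloads differ.
    # All figures are invented for illustration.
    SURVIVAL = {"high_risk": 0.30, "low_risk": 0.99}

    caseloads = {
        "Dr. Takes-Hard-Cases": {"high_risk": 90, "low_risk": 10},
        "Dr. Cherry-Picks": {"high_risk": 10, "low_risk": 90},
    }

    for name, cases in caseloads.items():
        total = sum(cases.values())
        survivors = sum(n * SURVIVAL[risk] for risk, n in cases.items())
        print(f"{name}: mortality rate {1 - survivors / total:.0%}")

    # Dr. Takes-Hard-Cases: mortality rate 63%
    # Dr. Cherry-Picks: mortality rate 8%

Identical skill, wildly different scores: the metric measures caseload, not competence – and the incentive it creates points straight away from the sickest patients.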

Some employers now allow their staff to rate each other, creating a cutthroat environment where people are thrown under the bus by their peers, or rate others favourably to gain favour in return. The subjective nature of such a system is deeply problematic, as human judgements are clouded by emotion rather than logic.

The controversial app Peeple has taken this a step further, allowing us to post reviews of ordinary people – our friends, neighbours, colleagues or even our partners. Although the app has been toned down following a backlash – you can now only rate those who have given their permission – it hints at the potential of a ratings-based society, where all of our interactions are carefully managed to avoid negative reviews.

The ratings effect, or something like it, is also creeping into our private lives via social media. An obsession with ‘likes’, ‘views’ and ‘followers’ alters people’s behaviour until the approval of others becomes the true motivation behind the things they do – and the things they do become merely a vehicle for gaining that approval. This pretence creates a virtual world of fake experiences, neatly packaged and presented via Instagram posts, YouTube videos and tweets.

In all of these examples, we become slaves to the metrics. The fear of being rated badly drives us to do things we wouldn’t naturally do, often with negative or unforeseen consequences.

As Jerry Muller wrote in his book The Tyranny of Metrics:

“There are things that can be measured. There are things that are worth measuring. But what can be measured is not always what is worth measuring; what gets measured may have no relationship to what we really want to know. The costs of measuring may be greater than the benefits. The things that get measured may draw effort away from the things we really care about. And measurement may provide us with distorted knowledge—knowledge that seems solid but is actually deceptive.”

What of our freedom and rights?

The rise of big data and digital surveillance raises some huge questions about freedom. While we still have the right to upload pictures of a drunken night out, the repercussions of doing so increasingly make us think twice. This represents a very subtle and insidious form of control.

We run the risk of dehumanising our lives, sterilising our thoughts, activities, relationships, workplaces and societies for the sake of a better digital reputation or rating. The endgame would be a world devoid of individualism and character, where nobody dares do anything outside the norm.

Tijmen Schep states that privacy is the right to make mistakes. And to make mistakes is to be human. In a world where our every mistake can be recorded and used against us, are we truly free to be human?

Lessons for the workplace

The big data revolution has swept the world of work, transforming the way businesses make decisions, and providing levels of insight that would have been unimaginable just a few years ago. But we should wield its power with care, to avoid a world where we trust blind algorithms to make decisions that affect real human lives.

We risk creating a work environment of fear and distrust – a Big Brother scenario for the office, where everything an employee does is monitored, recorded and used to make decisions about their employment. Likewise, ratings systems can be highly problematic in the workplace, reducing real people to numeric scores and causing us to act unnaturally in pursuit of higher ratings.

Big data has the potential to make our work lives better and more productive, but only if it is used in a human-centric way. The insights it offers should inform human decisions, not make them.