Decision bias in HR - are you really making fair decisions?
What is decision bias?
Decision bias sits at the intersection of economics and psychology. Behavioural economics studies how humans make decisions in reality, rather than assuming, as traditional economic theory does, that we decide completely rationally based on self-interest. It also considers how circumstances influence decisions and create irrationality.
Biases have a way of creeping into decision making. As an HR professional you evaluate people, products and services on a regular basis, but how confident are you that you’re fair and unbiased in your judgements and decisions? Many biases are completely unconscious and operate as a kind of intuition, a 'just knowing'.
Humans have two thought systems
Research by Daniel Kahneman shows we have two thought systems: one fast, instinctive and emotional; the other slower, more deliberate and logical. Often when we make decisions, the fast intuitive system jumps to conclusions or takes shortcuts that the rational system doesn't question. This can lead to biases.
Which decision biases affect HR?
Many biases have been identified, but here are a few that particularly affect HR.
Anchoring or first reference point: The tendency to use reference points that lead us astray. For example, if a candidate asks for a £100,000 salary, well above what you planned to pay, and then comes down to £80,000, still above your budget, this feels less of a problem than if they had asked for £80,000 in the first place. The initial price or offer has ramifications right through the process of coming to an agreement: the mind keeps referring back to that initial number or proposal. That doesn't mean the best tactic is to make an outrageous offer, although in reality that is often what happens; think of initial car prices or house sales. There is also evidence in salary negotiations that when the initial anchor figure is set high, the final negotiated amount will be higher. This is a reason to open negotiations yourself rather than waiting for the employee to name a range, because then you set the anchor.
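The anchoring effect above can be sketched as a toy model in which the agreed figure drifts towards the opening number. This is purely illustrative: the function name and the `anchor_weight` parameter are my own assumptions, not values from the research cited here.

```python
def negotiated_salary(anchor, target, anchor_weight=0.3):
    """Toy model of anchoring: the agreed figure is pulled some way
    from the employer's target towards the opening anchor.

    anchor_weight is a made-up illustrative parameter, not an
    empirically estimated value.
    """
    return anchor_weight * anchor + (1 - anchor_weight) * target

# Same £70,000 target, two different opening asks
modest_ask = negotiated_salary(anchor=80_000, target=70_000)
extreme_ask = negotiated_salary(anchor=100_000, target=70_000)
print(modest_ask, extreme_ask)  # the higher anchor pulls the outcome up
```

However crude, the sketch captures the finding from the salary-negotiation research: a higher opening anchor shifts the final figure upwards, even when the employer's target is unchanged.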
Status quo: Favouring alternatives that perpetuate the existing approach. This includes resistance to adopting new ways of working, technology solutions and new processes. This bias often appears in two scenarios: when we are invested in the current approach (see also the Ikea effect below) and when a change would involve a loss. We see this in our work applying neuroscience to leadership development. HR leaders know their current approach has not changed behaviour, and they believe the evidence that the science-based approach will work, but changing direction would require them to move away from what is known and familiar, and this is harder than staying put, even when the familiar is not working well. People tend to take the easiest path.
Confirming evidence: Seeking information that supports your existing view. For example, assuming someone on the talent radar is there for life and dismissing data that contradicts the assessment; favouring the first results of an employee relations investigation and finding it hard to accept new information; or struggling to take on board data that doesn't fit your existing views on what creates success in business. Consider the recent book by Adam Grant, 'Give and Take: A Revolutionary Approach to Success', which suggests helping others is the route to success at work. This is counterintuitive - surely selfish people succeed? People are very skilled at maintaining a sense of certainty and at favouring data that fits what we already know. Our brain is wired to do this, so we work hard at explaining why our existing view of the world is right. Countering this bias requires you to really challenge your assumptions, keep an open mind, and, if you find yourself resisting new information, ask why.
Loss aversion: This term was coined by Amos Tversky and Daniel Kahneman and refers to the tendency to strongly prefer avoiding losses over acquiring gains. It explains why, when we have invested in something, whether emotionally, financially or in time, we find it difficult to make decisions that would mean losing that investment by stopping. A good example is tolerating poor performance: the loss of our investment in the person makes us less likely to decide to deal with the performance issue.
Endowment: Preferring an idea or project because it is yours or you are identified with it, otherwise known as 'not invented here'. Endowment creates an inflated sense of the thing's value; think of a place on the management committee or on the leadership programme. This is exacerbated by the Ikea effect: you have put great effort into making something - Ikea furniture, your performance management process, the new service centre - and so you are reluctant to give it up and you value it above its objective worth.
A word about intuition
Intuition is not technically a decision bias, but it is important to understand because it creates many of the biases. There is evidence that intuition is the brain putting together knowledge and experience in new ways - neurons making a jump to create a new pathway, if you like. Experienced HR professionals can add value through these leaps.
But just as intuition became an acceptable way to make decisions in business, the science is pointing to its faults. Because the brain likes certainty, an intuitive feeling is rewarding: it creates a sense of certainty and makes us feel good. Robert Burton suggests this certainty bias fools us into setting too much store by intuition; we need to verify intuition with data.
For example, when I worked in the City, we had a top-rated strategist at the bank who advised investment managers on where to place money. They hung on his every word! He confessed to me that all his ideas were intuitive, and that he then went to find the data to back them up. This is exactly what Burton is advising.
HR professionals often know intuitively that a policy isn't working, that a leader has favourites, or that an employee is not really performing, and sometimes our reputation alone is enough for the leadership of the business to listen. However, if we believe the science, this is a dangerous approach. For me it also speaks of a lack of rigour. Your intuition may be right, but what does the data say?
So how do you use these ideas?
Well, I find forewarned is forearmed. You can also take an issue and test your decision against the potential traps above, and ask someone to challenge your thinking to ensure you have identified any bias. A similar approach is to ask colleagues to take the role of key stakeholders and, from that perspective, ask challenging questions. This usually flushes out bias.
Finally, be aware that we all make these biased decisions, and they are not always bad. The Ikea effect works in the company's favour: it creates commitment to tough projects and bonds people to a common purpose.
References
Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
Kahneman, D. (2003). Maps of Bounded Rationality: Psychology for Behavioral Economics. The American Economic Review, 93(5), 1449-1475.
Kenrick, D. T., Griskevicius, V., Sundie, J. M., Li, N. P., Li, Y. J., & Neuberg, S. L. (2009). Deep Rationality: The Evolutionary Economics of Decision Making. Social Cognition, 27(5), 764-785.
Ariely, D. (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. HarperCollins.
Kahneman, D., & Tversky, A. (1984). Choices, Values, and Frames. American Psychologist, 39(4), 341-350.
Thorsteinson, T. J. Initiating Salary Discussions With an Extreme Request: Anchoring Effects on Initial Salary Offers.