
Information Processing Biases: By Etai Biran


Last week we discussed cognitive failures that occur in the “information selection stage” of the decision making process.

In this week’s post I would like to discuss biases that occur in the information processing stage. This is the stage of the decision making process where most biases tend to occur: our minds take shortcuts to make processing easier, and because our brain’s cognitive activity is more intense here than in the other stages, our judgment is more likely to be biased.

Representativeness Bias

When people need to make a judgment about someone or something, they look for traits that correspond with stereotypes they are familiar with. When people rely on representativeness to make judgments, they are likely to judge wrongly, because the fact that something is more representative does not necessarily make it more likely. We take this shortcut because it offers a quick and familiar reference point for our judgment calls: we refer to what we know, even though we cannot be sure that our reference point is accurate.

For example, executives may disregard an investment opportunity in emerging markets because of the common belief that these markets are primitive and underdeveloped. In this case, the representativeness bias prevails and overrides other rational reasons why it may be worthwhile to invest in emerging markets (potential growth in the area, lack of competition, lower operating costs etc.). In another example, the representativeness bias may explain why investors tend to invest in state bonds. State bonds are often mistakenly regarded as a “sure investment”: for most people the decision to invest in them is shaped by the common belief that state bonds never fail and always guarantee a return. This was proven wrong by Greece’s default in 2012.

To test the representativeness bias, Tversky and Kahneman ran an experiment (the well-known “Linda problem”) in which participants were given the following description of a woman:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in anti-nuclear demonstrations.

After reading the text, the participants were asked to rank the following statements by their probability from 1-8 (1 being most probable and 8 being least probable):

  1. Linda is an elementary school teacher
  2. Linda works in a book store and takes yoga classes
  3. Linda is active in the feminist movement (F)
  4. Linda is a psychiatric social worker
  5. Linda is a member of the League of Women Voters
  6. Linda is a bank teller (T)
  7. Linda is an insurance sales person
  8. Linda is a bank teller and active in the feminist movement (T+F)

The description of Linda was constructed to be representative of an active feminist (F) and unrepresentative of a bank teller (T), and the results indeed confirm this (F > T). However, 85% of the participants ranked the conjunction option 8 as more probable than the less representative option 6, producing the ordering F > T+F > T. Yet the probability of two conditions (T+F) occurring together is necessarily lower than the probability of just one (T) occurring: it is more likely that Linda is “just” a bank teller than that she is both a bank teller and a feminist. The bias occurs because the conjunction option (T+F) seems more “representative” of Linda, even though it is statistically less probable.
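To see why the ranking is a fallacy in purely arithmetic terms, here is a minimal sketch. The numbers are hypothetical and chosen only for illustration; the point is that whatever values you pick, the conjunction P(T and F) can never exceed P(T).

```python
# Illustrative sketch of the conjunction rule behind the Linda problem.
# The probabilities below are hypothetical, used only to make the point.

p_teller = 0.05                   # hypothetical P(T): Linda is a bank teller
p_feminist_given_teller = 0.30    # hypothetical P(F | T)

# P(T and F) = P(T) * P(F | T)
p_teller_and_feminist = p_teller * p_feminist_given_teller

print(f"P(T)       = {p_teller:.3f}")
print(f"P(T and F) = {p_teller_and_feminist:.3f}")

# Since P(F | T) is at most 1, the conjunction can never be more probable than T alone.
assert p_teller_and_feminist <= p_teller
```

However “representative” the conjunction feels, the inequality always holds, which is exactly what most participants overlooked.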

In some cases the representativeness bias can offer a good initial evaluation (for example determining that F > T), focusing us on our better options. However, it can also lead to serious errors and bad judgment (for example, in the past people believed that diseases were caused by evil spirits and therefore rejected medical help, which resulted in many unnecessary deaths).

Conservatism and Herding Biases

The conservatism bias describes behavior where people are too slow (too conservative) in adjusting their beliefs in response to new information. An example of this might be managers who initially underreact to news about their firm, so that prices and other indicators come to reflect the new information only gradually. Conservatism is usually a result of the brain falling back on information it is already familiar with and comfortable processing. New information requires adjustment and processing, so people tend to “stick” with what they already know.

Decision makers biased by conservatism may lean towards avoiding new information that requires adaptation. The discomfort of dealing with new information causes conservative decision makers to incorporate only familiar knowledge into their judgment and to disregard new information that might prove important and essential.
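Conservatism is sometimes pictured as updating beliefs only part of the way toward what the new evidence actually warrants. The sketch below is illustrative only: the prior, the likelihoods and the damping weight are all hypothetical, not taken from any study cited in this post.

```python
# A minimal sketch of conservatism as "too little" belief updating.
# All numbers are hypothetical; only the pattern matters.

def bayesian_posterior(prior, likelihood_if_true, likelihood_if_false):
    """Standard Bayes rule for a binary hypothesis."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

def conservative_posterior(prior, likelihood_if_true, likelihood_if_false, weight=0.4):
    """Move only a fraction `weight` of the way from the prior to the Bayesian posterior."""
    full_update = bayesian_posterior(prior, likelihood_if_true, likelihood_if_false)
    return prior + weight * (full_update - prior)

prior = 0.5  # hypothetical prior belief that a piece of news is good for the firm
print(f"Bayesian posterior:     {bayesian_posterior(prior, 0.8, 0.2):.2f}")      # 0.80
print(f"Conservative posterior: {conservative_posterior(prior, 0.8, 0.2):.2f}")  # 0.62
```

The conservative decision maker ends up somewhere between the old belief and where the evidence points, which is why new information is reflected only gradually.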

The herding bias, on the other hand, occurs when individuals think that a new trend is so popular that they simply have to be part of it – hence they follow the herd. This bias results from the comfort of being part of the crowd (and the feeling of security that comes with it). People tend to feel that being part of the herd is less risky and more comfortable than taking a position on their own. However, the popularity of a particular trend is no guarantee that following it is a good decision.

An example of the herding bias can be seen in the initial public offering of Facebook. Prior to its listing, many investors were desperate not to miss out on buying Facebook shares. This created a trend and a herding effect in which people bought shares simply because it was the popular thing to do. However, buyers did not really know (or try to find out) much about Facebook’s inflated valuation and what it would mean for their newly purchased shares. In the year following the IPO, investors saw the value of those shares fall well below the offering price.

Next week I will further discuss additional biases that occur in the information processing stage.

For suggested readings related to this post please contact me at: etbiran@gmail.com



2 Comments


Howard
Jun 14, 2015 19:55

Dear Etai,

I have been following your writings and I really enjoy them.
One question – what actions do you recommend taking in order to minimize such biased behavior before it occurs? I am aware we cannot eliminate such behavior altogether, but from your experience, what can we do to overcome these brain failures?

Thank you!

Etai
Jul 10, 2015 2:09

Hi Howard,

That’s the million dollar question. Although there is no “magic pill” we can take to cure us of irrational behavior, there are ways to overcome these traps.

The first step would be to acknowledge that we have a problem. Understanding that our brains are wired in a way that causes cognitive failures helps an individual undergo an “unfreezing” process. The unfreezing process is one where we force ourselves to let go of our old habits, making room for new ones. With the knowledge and understanding of biases along the decision making process, we can train our brain to signal us when we are likely to fall prey to a bias trap. With this, we can each develop our own “rescue” mechanism to help save us from biased behavior. The second step would then be “refreezing” our brains with these new habits.

Nonetheless, we need to undergo constant unfreezing and refreezing over time. Reducing irrational behaviour is a life-long, ongoing process. That said, there is no greater lesson than learning from our mistakes.

More on practical solutions to overcome biased behavior in my upcoming posts. Stay tuned…
