
Cognitive bias

When it comes to assessing risk, humans often fail to make rational decisions
because our brains take mental shortcuts that prevent us from making the correct
choice. Since the 1960s, behavioural scientists and psychologists have been
researching these failings, and have identified and labelled dozens of them.
Here are some that can cause havoc when it comes to assessing risks in business.

[Infographic key: Social | Financial | Failure to estimate | Short-termism]

ORIGIN
The notion of cognitive biases was first introduced by psychologists Amos Tversky and Daniel
Kahneman in the early 1970s. Their research paper, ‘Judgment Under Uncertainty: Heuristics and
Biases’, published in the journal Science, has provided the basis of almost all current theories of
decision-making and heuristics. Professor Kahneman was awarded a Nobel Prize in 2002 after further
developing the ideas and applying them to economics.

BELIEF BIAS
Basing the strength of an argument on the believability or plausibility of the conclusion.
“I didn’t quite follow your argument but the conclusion seems about right”

ANCHORING EFFECT
Relying too much on the initial piece of information offered when making decisions.
“The first test seemed OK. Do we need to look any more?”

CONFIRMATION BIAS
Focusing on information that only confirms existing preconceptions.
“We did loads of simulations. Most of them showed there’s no problem”

BLIND SPOT BIAS
Viewing oneself as less biased than others.
“Let’s ignore Sarah’s views on this one. She’s biased”

COURTESY BIAS
Giving an opinion/conclusion that is viewed as more socially acceptable so as to avoid causing offence/controversy.
“The last time we discussed this the meeting lasted for hours. Let’s move on”

AVAILABILITY HEURISTIC
Overestimating the importance and likelihood of events given the greater availability of information.
“I saw something very similar to this on LinkedIn. We need to take it seriously”

CLUSTERING ILLUSION
Erroneously overestimating the importance of small clusters or patterns in large data.
“This is the second week in a row that this has happened. There must be a problem”

ENDOWMENT EFFECT
Valuing something more highly simply because you already own it.
“I know it will cost a fortune to fix but it cost us £15,000. We can’t just throw it away”

BANDWAGON EFFECT
Believing something because many other people believe it.
“The whole department knows there’s no problem here”

STEREOTYPING
Assuming a person has characteristics because they are a member of a group.
“Dave from tech is worried - but frankly the tech team are always pessimists”

RISK COMPENSATION
Taking bigger risks when perceived safety increases; being more careful when perceived risk increases.
“Now we’ve got the new equipment we can cut the time spent on maintenance”

STATUS QUO BIAS
Preferring the current state of affairs over change.
“If it ain’t broke - don’t fix it”

OSTRICH EFFECT
Avoiding negative financial information by pretending it doesn’t exist.
“Looks like we’ve run out of time to discuss this”

REACTIVE DEVALUATION
Devaluing an idea because it originated from an adversary or opponent.
“Our competitors are only doing well because their products are cheap”

POST-PURCHASE RATIONALISATION
Persuading yourself after the fact that a purchase or decision was good value.
“We made a good call on that one”

ILLUSION OF VALIDITY
Overestimating the accuracy of one’s own judgments and predictions.
“This worked fine in the factory in Korea, it should work fine here”

GAMBLER’S FALLACY
Believing that future probabilities are altered by past events, when in fact they are unchanged.
“The conveyor belt broke three times last month. It’s pretty unlikely it’ll happen again”
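
The flaw in that last quote is easy to check numerically. Below is a minimal Python sketch, assuming (purely for illustration) that breakdowns are independent events with a fixed 10% monthly probability; the scenario and figures are invented, not taken from the infographic. Under that assumption, a run of past failures does not change the chance of the next one.

    import random

    P_FAIL = 0.10       # assumed, illustrative monthly failure probability
    TRIALS = 1_000_000

    random.seed(1)

    bad_runs = 0
    failures_after_bad_run = 0
    for _ in range(TRIALS):
        past = [random.random() < P_FAIL for _ in range(3)]
        future = random.random() < P_FAIL
        if all(past):                     # a history of three failures in a row
            bad_runs += 1
            failures_after_bad_run += future

    # Converges to P_FAIL (~0.10): past failures make the next one neither
    # more nor less likely when the events are independent.
    print(failures_after_bad_run / bad_runs)

If the failures were not independent (a worn part, say), repeated breakdowns would be evidence of an underlying problem rather than a fluke; the fallacy lies in treating independent events as if they were self-correcting.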

HYPERBOLIC DISCOUNTING
Preferring a smaller, sooner payoff over a larger, later reward.
“Let’s just get the deal done ASAP”
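
One common way to put numbers on this bias is the hyperbolic discount function V = A / (1 + kD), where A is the delayed amount, D the delay, and k a fitted impatience parameter. The sketch below uses invented figures (a £100,000 deal now versus £150,000 in 90 days, and a hypothetical k) purely to show how a sufficiently impatient decision-maker values the smaller, sooner payoff more highly.

    def discounted_value(amount: float, delay_days: float, k: float) -> float:
        """Subjective present value under hyperbolic discounting: A / (1 + k*D)."""
        return amount / (1 + k * delay_days)

    K = 0.05  # hypothetical impatience parameter, chosen only for illustration

    deal_now = discounted_value(100_000, delay_days=0, k=K)     # 100,000.0
    deal_later = discounted_value(150_000, delay_days=90, k=K)  # ~27,272.7

    # "Let's just get the deal done ASAP": the smaller immediate payoff
    # feels worth more than a 50% larger payoff three months away.
    print(deal_now > deal_later)  # True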
