People generally like to believe they are rational (Greenberg, 2015). Unfortunately, this isn't usually the case (Tversky & Kahneman, 1974). People very easily fall prey to thinking biases, flaws in thinking that stop them from making a purely rational judgement (whether always making a rational judgement is a good thing is a discussion for another time). Take the availability bias: you judge the likelihood of an event, or the frequency of a class, by how easily you can recall an example of it (Tversky & Kahneman, 1974). So after seeing a shark attack in the news, people judge the probability of a shark attack to be much higher than it actually is, because they can easily call an example to mind.
But what are some of the factors that protect you against falling for these thinking biases? You would think that the smarter someone is, the less likely they are to be affected by them. However, the evidence paints a different picture.
The first bias we are going to look at is the "myside bias": the predisposition to "evaluate evidence, generate evidence, and test hypotheses in a manner biased toward their own prior beliefs" (Stanovich, West, & Toplak, 2013). The ability to view an argument from both sides and decouple your prior opinions from the situation is seen as a crucial skill for being rational (Baron, 1991; Baron, 2000). Interestingly, multiple experiments have shown that susceptibility to myside bias is independent of cognitive ability (Stanovich & West, 2007; Stanovich & West, 2008a; Stanovich, West, & Toplak, 2013): no matter how smart you are, you are just as likely to evaluate something purely from your own perspective unless you are explicitly told to do otherwise.
Not only is there evidence that the myside bias is uncorrelated with intelligence; there is further evidence that a whole host of thinking biases are unrelated to intelligence (Stanovich & West, 2008b), including the anchoring effect, framing effects, and the sunk-cost effect. Teovanovic, Knezevic, & Stankov (2015) lend further support to the idea that intelligence doesn't protect you from these biases: in their study, intelligence was only weakly correlated, or not correlated at all, with avoiding them.
It has even been shown that the more intelligent someone is, the more likely they are to feel that others are more biased than they are, and that they themselves are more rational by comparison (West, Meserve, & Stanovich, 2012). This is called the "bias blind spot": people are blind to their own biases. Another study (Scopelliti et al., 2015) found that susceptibility to the bias blind spot is largely independent of intelligence, cognitive ability, and cognitive reflection.
However, it's not a completely level playing field. On some tests where people might fall prey to thinking biases, such as the Wason selection task (Wason, 1966), a logic puzzle in which participants must choose which cards to turn over to test a conditional rule, intelligence was correlated with success: the more intelligent a participant was, the more likely they were to get it right (Stanovich & West, 1998).
You would think being an expert in a field would also help you resist biases, but for the hindsight bias, the tendency to see an event as having been predictable after it has already happened, expertise appears not to matter; experts are just as likely to fall for it as anyone else (Guilbault et al., 2004).
Some have argued that these biases aren't actually biases at all (Gigerenzer, 1991), or that they are just performance errors, "mere mistakes", rather than systematic irrationality (Stein, 1996). However, these views have been argued against by Kahneman & Tversky (1996) and Stanovich & West (2000) respectively. Stanovich and West ran a series of experiments testing some of the most famous biases and found that performance errors accounted for very little of the variation in answers; computational limitations (limits on people's cognitive capacity, rather than random slips) accounted for most of the cases where people fell for biases.
So it seems that being intelligent or an expert doesn’t always protect you against cognitive biases (and can even make you less aware of your own shortcomings). But what can? I’ll be exploring the techniques to protect yourself from biases in my next blog post.
References:
Baron, J. (1991). Beliefs about thinking. In Voss, J.; Perkins, D.; & Segal, J. (Eds.), Informal reasoning and education (pp. 169-186). Hillsdale, NJ: Lawrence Erlbaum Associates Inc.
Baron, J. (2000). Thinking and Deciding (3rd ed.). Cambridge, UK: Cambridge University Press.
Gigerenzer, G. (1991). How to Make Cognitive Illusions Disappear: Beyond “Heuristics and Biases”. European Review of Social Psychology, 2, 83-115.
Greenberg, S. (2015). How rational do people think they are, and do they care one way or another? [Online] Available from: http://www.clearerthinking.org/#!How-rational-do-people-think-they-are-and-do-they-care-one-way-or-another/c1toj/5516a8030cf220353060d241 [Accessed: 21st July 2015].
Guilbault, R.L.; Bryant, F.B.; Brockway, J.H.; & Posavac, E.J. (2004). A Meta-Analysis of Research on Hindsight Bias. Basic and Applied Social Psychology, 26 (2&3), 113-117.
Kahneman, D. & Tversky, A. (1984). Choices, Values, and Frames. American Psychologist, 39 (4), 341-350.
Kahneman, D. & Tversky, A. (1996). On the Reality of Cognitive Illusions. Psychological Review, 103 (3), 582-591.
Scopelliti, I.; Morewedge, C.K.; McCormick, E.; Min, H.L.; Lebrecht, S.; & Kassam, K.S. (2015). Bias Blind Spot: Structure, Measurement, and Consequences. Management Science, 61 (10), 2468-2486.
Stanovich, K.E. & West, R.F. (1998). Individual Differences in Rational Thought. Journal of Experimental Psychology: General, 127 (2), 161-188.
Stanovich, K.E. & West, R.F. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23, 645-726.
Stanovich, K.E. & West, R.F. (2007). Natural Myside Bias is Independent of Cognitive Ability. Thinking and Reasoning, 13 (3), 225-247.
Stanovich, K.E. & West, R.F. (2008a). On the failure of cognitive ability to predict myside and one-sided thinking biases. Thinking and Reasoning, 14 (2), 129-167.
Stanovich, K.E. & West, R.F. (2008b). On the Relative Independence of Thinking Biases and Cognitive Ability. Journal of Personality and Social Psychology, 94 (4), 672-695.
Stanovich, K.E.; West, R.F.; & Toplak, M.E. (2013). Myside Bias, Rational Thinking, and Intelligence. Current Directions in Psychological Science, 22 (4), 259-264.
Staw, B.M. (1976). Knee-Deep in the Big Muddy: A Study of Escalating Commitment to a Chosen Course of Action. Organizational Behavior and Human Performance, 16, 27-44.
Stein, E. (1996). Without Good Reason: The Rationality Debate in Philosophy and Cognitive Science. Oxford, UK: Oxford University Press.
Teovanovic, P.; Knezevic, G.; & Stankov, L. (2015). Individual differences in cognitive biases: Evidence against one-factor theory of rationality. Intelligence, 50, 75-86.
Tversky, A. & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185 (4157), 1124-1131.
Wason, P.C. (1966). Reasoning. In B. Foss (Ed.), New Horizons in Psychology (pp. 135-151). Harmondsworth, England: Penguin.
West, R.F.; Meserve, R.J.; & Stanovich, K.E. (2012). Cognitive Sophistication Does Not Attenuate the Bias Blind Spot. Journal of Personality and Social Psychology, 103 (3), 506-519.
Image credit: http://www.anthraxinvestigation.com