The Danger of Cognitive Bias

We haven't really argued for it on this blog - rather, we have assumed it - but I think most readers would agree that one's beliefs should be rational: there should be some reason for believing them. To put it another way, there seems to be an unspoken agreement that there is an imperative to form beliefs at least in part on the basis of rationality, and that if one's beliefs are shown to be irrational, they should be discarded.[1]

If you have read any of Descartes' work, you have to recognize that he was a genius, and largely responsible for kick-starting modern philosophy. He had some great insights. But there is one matter in which he was completely wrong: he thought our rational faculties never fail us - that it is always something else that fails us. And of course this simply isn't the case. Our rational faculties do fail us. On the whole they are fairly reliable, but they are not infallible.

One of the chief dangers to our beliefs is that our cognitive faculties are subject to error. In theology, sin's negative effects on the mind and intellect are known as the noetic effects of sin - a fascinating subject, if you'd like to pursue it.

At any rate, I'd like to focus on one way in which our intellect can fail us: cognitive bias. This is a danger to anyone currently breathing. It isn't that only atheists are in danger of it, or only Christians, or only agnostics, or Buddhists, or humanists, butchers, bakers, candlestick makers. You get the point.

Cognitive bias is a tendency or predisposition toward certain lines of thinking that do not align with rationality. Rationality draws a straight line, while cognitive biases fly off to the four corners of the earth. The real difficulty is that one might not be aware of a cognitive bias: it seems like one is thinking rationally, when in fact one is in error.

I mentioned one cognitive bias in passing in a previous post on Alvin Plantinga's work. The basic idea is that if one has a strongly held belief, and someone provides a powerful argument against it, rather than accepting that the belief is false, one may actually entrench the belief further in the face of the evidence (apparently this is known as the backfire effect). One may even abandon some other belief in order to hold on to the one that has come under attack.

I don't really have any specific way of guarding against cognitive bias myself, other than trying to think carefully and spending time in honest introspection. But it may at least be useful to know what some common cognitive biases are; that may make it a bit easier to guard against them. Here are just a few that are relevant to discussions of philosophy, apologetics, and argumentation.

The one bandied about the most these days is confirmation bias. One falls prey to confirmation bias by selecting evidence that confirms one's preconceived notions and rejecting all other evidence. For example, have you ever heard someone complain that it is always they who get stopped at the red light while everyone else makes it through the intersection in the nick of time? Over the course of a driving career, that is almost certainly not true. That person is simply ignoring all the times they made it through the light (probably because their mind isn't on the light when they can just cruise through the intersection). In the case of a belief system, one may look for evidence for one's own beliefs while subconsciously rejecting any opposing evidence.
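If you'll forgive a brief detour into arithmetic, here is a quick simulation of that red-light complaint. Everything in it is invented for illustration - the 40% chance of catching a red light and the recall rates are assumptions, not data - but it shows how remembering nearly every frustrating stop while forgetting most of the easy passes skews the impression:

```python
import random

random.seed(42)  # reproducible run of made-up numbers

TRIALS = 10_000          # intersections over a driving career
P_RED = 0.4              # assumed chance of catching the red
P_RECALL_RED = 0.9       # frustrating stops stick in memory
P_RECALL_GREEN = 0.1     # cruising through barely registers

remembered_red = remembered_green = 0
for _ in range(TRIALS):
    stopped = random.random() < P_RED
    if stopped and random.random() < P_RECALL_RED:
        remembered_red += 1
    elif not stopped and random.random() < P_RECALL_GREEN:
        remembered_green += 1

total = remembered_red + remembered_green
print(f"Actual red-light rate:     {P_RED:.0%}")
print(f"Remembered red-light rate: {remembered_red / total:.0%}")
# With these invented numbers, roughly 86% of remembered
# intersections are red lights, even though only 40% were.
```

The particular numbers don't matter; the asymmetry in what gets remembered does.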

Here's a biggie: the belief bias. With belief bias, one is predisposed against an argument not on the basis of its logic but on the basis of the believability of its conclusion. Think of the Pevensie siblings in The Lion, the Witch and the Wardrobe - especially Peter and Susan. Lucy is incredibly excited to tell them that she has found another world in the wardrobe. This announcement is met with utter skepticism. However, the professor calls them on their skepticism: Lucy's excitement seems genuine, and she is the most honest of the siblings, so, logically, she is probably telling the truth. But what she is claiming is plainly ridiculous! And yet it was true.[2]

Another is the focusing effect. Ever seen a sports pundit chalk up the success or failure of a recently completed season to just one factor? It happens all the time. And they are wrong all the time. The success of a team over the course of a season - or even a single game - depends on a myriad of factors. But these pundits are certain it was all due to the one thing they identified.

One more. In general, we have a tendency to reject information or an argument from an adversary simply because it is coming from our adversary. It's a sort of psychological ad hominem: that argument is bad because it came from my adversary. Someone who falls for this is suffering from the scourge of reactive devaluation.

In closing, it might go without saying, but I am no psychologist. Nor am I a philosopher (many biases were actually first identified by philosophers). Yet I think it is important that we think carefully and honestly about important issues, and having at least some idea of how we may be biased can help guard against these tendencies. That, and wisdom.


Notes:
1. This can of course be taken too far. There are some beliefs that are not based on rational argument but nonetheless seem completely reasonable. If you see a tree, you believe a tree is there not by some deductive argument but because your perception of the tree's existence is basic. The belief that the tree is there is what Plantinga would call a properly basic belief.
2. Granted, this is an example from a fictional work. A more pressing example might be the arguments for the resurrection. Many are rejected outright, simply on the grounds that the idea of a man rising from the dead is ridiculous.