Confirmation Bias – Not Just for Pseudoscientists
March 10, 2007 · Posted by Johan in Evolutionary Psychology, Learning, Neuroscience, Psychology Freakshow.
This is an excerpt from a lecture that James Randi gave at Princeton a few years back. For fans, the full lecture is available here. There is nothing new here if you are already familiar with the basic premises of homeopathy, but Randi does tell the story quite well. See also this amusing series of clips from a website advocating homeopathy.
See, at the Phineas Gage Fan Club we cover both sides of every story. Fair and balanced, just like Fox News. Ahem.
James Randi is perhaps most famous among academics for Project Alpha, an elaborate hoax that exposed the gullibility of paranormal researchers. Essentially, Randi had two of his magician friends present themselves as psychics to the high-profile Phillips group. The magicians used simple sleight-of-hand tricks to convince the scientists that they had paranormal abilities. Randi did eventually end the game, but not until the two magicians had become quite famous in psychic circles. Embarrassingly for the researchers, Randi could show that he had in fact contacted them before the experiments began with a list of crucial factors that needed to be controlled to prevent cheating. The magicians had been able to exploit those very gaps, since the researchers had neglected to take Randi’s points into consideration.
Most disturbingly, the field of “parapsychology” is enjoying some recognition, even today. Recently, The Psychologist, the official magazine of the British Psychological Society, ran a cover story on parapsychology, written from a completely uncritical standpoint.
The real finding to come out of all these experiments into psi and telepathy is the sheer strength of confirmation bias. Whether it concerns belief in homeopathy, psychoanalysis, or the paranormal, it’s striking how successfully people manage to select and interpret information to make it consistent with their own views.
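To make that selective weighting concrete, here is a toy simulation (my own sketch, not drawn from Randi or any of the studies mentioned): two Bayesian agents watch the same run of genuinely fair coin flips, but one of them discounts every flip that speaks against its pet hypothesis. The hypothesis, the discount parameter `w`, and all the names are invented for illustration.

```python
import math
import random

random.seed(0)

P_PET = 0.6   # P(heads) if the pet hypothesis "the coin favours heads" were true
P_FAIR = 0.5  # P(heads) under the truth: the coin is fair

def update(log_odds, heads, w=1.0):
    """One Bayesian update (in log-odds) of the pet hypothesis.

    With w = 1 this is a textbook update; with w < 1 the evidence
    from disconfirming flips is shrunk toward zero, a crude model
    of explaining away unwelcome data.
    """
    l_pet = P_PET if heads else 1 - P_PET
    l_fair = P_FAIR if heads else 1 - P_FAIR
    llr = math.log(l_pet / l_fair)   # log-likelihood ratio of this flip
    if llr < 0:                      # flip speaks against the pet hypothesis...
        llr *= w                     # ...so the biased agent discounts it
    return log_odds + llr

def prob(log_odds):
    """Convert log-odds back to a probability, numerically stable."""
    if log_odds >= 0:
        return 1 / (1 + math.exp(-log_odds))
    e = math.exp(log_odds)
    return e / (1 + e)

honest = biased = 0.0                 # both agents start at P = 0.5
for _ in range(2000):
    heads = random.random() < P_FAIR  # the coin really is fair
    honest = update(honest, heads, w=1.0)
    biased = update(biased, heads, w=0.3)

print(f"honest agent: P(coin favours heads) = {prob(honest):.3f}")
print(f"biased agent: P(coin favours heads) = {prob(biased):.3f}")
```

The honest agent converges on the truth, while the biased agent, seeing exactly the same flips, tends to grow ever more certain of the false hypothesis – a cartoon version of how selective weighting of evidence lets a belief survive any amount of data.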
It’s easy to look down on this as a supposedly objective scientist. But the reality is that we probably all fall victim to confirmation bias to some extent; it’s just less noticeable when what you believe also happens to be consistent with the existing evidence. When the evidence changes, surprisingly many people stick to their beliefs rather than to the evidence. For me, the strongest example of this is perhaps Skinner’s last paper (Skinner, 1990). Finished the evening before his death, and long after the cognitive revolution, the paper still argues against the usefulness of positing mental constructs between stimulus and response. Skinner goes as far as suggesting that as we learn more about the workings of the brain, positing such constructs only becomes less useful as a way of understanding behaviour – something that sounds rather outlandish now, in the age of fMRI, MEG, EEG and other acronyms. The strict behaviourism Skinner advocated had run into trouble more than 20 years earlier, yet Skinner, and many with him, clung to their paradigm.
This shouldn’t be interpreted as a criticism of Skinner. When the next shift comes around in psychology, say, if Buss somehow succeeds in making evolutionary psychology the next big thing (cf. Buss, 1995), I’m sure plenty of cognitive psychologists will cling to their beliefs, start their own journals, and host their own conferences. It’s tempting to assume that intelligent people, especially psychologists who know all about cognitive biases, should be immune to them.
This, if anything, is a dangerous belief, because in reality it actually makes you more vulnerable to biases. The most biased people are those who believe they are “fair and balanced.”
Buss, D.M. (1995). Evolutionary Psychology: A New Paradigm for Psychological Science. Psychological Inquiry, 6, 1-30.
Skinner, B.F. (1990). Can Psychology Be a Science of Mind? American Psychologist, 45, 1206-1210.