The brain likes to take the road it knows best
Familiarity with the tried and tested can prevent us from seeking out more efficient solutions
It is possible to achieve checkmate in a well-known five-step manoeuvre, but also using a much less familiar three-move sequence.
Try the following problem. You have three containers that can hold 21, 127 and three units of water respectively. How would you measure out 100 units of water by transferring water between the containers? You can fill and empty each container as often as you like, but you must fill each to its limit.
The solution I’m sure you found was to fill the 127-unit container and then to pour 21 units from it into the 21-unit container, leaving 106 units. Now fill and empty the three-unit container from the 106 units twice, leaving exactly 100 units in the 127-unit container. Three steps are required to solve this problem.
Now do the following problem. You have three containers that hold 23, 49 and three units of water respectively. Measure out 20 units as before using these three containers. Do this problem before proceeding further. Stick with me now.
Was your solution to fill the 49-unit container, then empty 23 units into the 23-unit container, leaving 26 units, and then to fill and empty the three-unit container twice from the 26 units, leaving 20 units in the 49-unit container? Or was your solution simpler: fill the 23-unit container and then use it to fill the three-unit container, leaving 20 units in the 23-unit container? In other words, this problem can be solved in a roundabout three-step way or in a simpler two-step way.
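The arithmetic behind the two recipes is easy to check. A minimal sketch (the function names are mine, not part of the puzzle):

```python
def three_step(a: int, b: int, c: int) -> int:
    """The drilled recipe: fill the largest container (b), pour off the
    middle container (a) once, then the small container (c) twice."""
    return b - a - 2 * c

def two_step(a: int, c: int) -> int:
    """The simpler recipe: fill the middle container (a) and pour off
    the small container (c) once."""
    return a - c

# First problem: only the three-step recipe reaches 100 units.
print(three_step(21, 127, 3))  # 127 - 21 - 3 - 3 = 100
print(two_step(21, 3))         # 21 - 3 = 18, so the shortcut fails here

# Second problem: both recipes reach 20 units.
print(three_step(23, 49, 3))   # 49 - 23 - 3 - 3 = 20
print(two_step(23, 3))         # 23 - 3 = 20
```

Because the first problem yields only to the three-step pattern, solving it first primes you to reuse that pattern on the second problem, even though the two-step shortcut is available there.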
Many of you will have solved the problem using the three-step method because your mind was pre-conditioned to it by the first problem. And if I had first drilled you through several problems that could only be solved in three steps before introducing a problem with alternative three- and two-step solutions, almost everyone would have chosen the less efficient three-step solution.
This is an example of the Einstellung effect – the brain’s persistent tendency to stick with a familiar solution to a problem and to ignore alternatives. It is described by Merim Bilalić and Peter McLeod in Scientific American, May 2014.
The Einstellung effect has been studied in detail. In one scenario expert chess players were presented with specific arrangements of pieces on a virtual board and asked to achieve checkmate in as few moves as possible. It is possible to achieve this in a well-known five-step manoeuvre called “smothered mate”, but it is also possible to achieve checkmate using a much less familiar three-move sequence.
Almost all the players failed to find the shorter procedure. When questioned later, the players who used the “smothered mate” procedure reported that they had searched for simpler solutions but failed to find them.
However, eye movement experiments, using infrared cameras, carried out by Bilalić and McLeod, showed that chess players who reported searching for simpler solutions never actually examined the board for such options, although they later believed they had done so. Familiarity with the tried and tested solution prevents the brain from seeking out a better solution.
The Einstellung effect is the basis for many cognitive biases, including the well-known confirmation bias: the persistent tendency of people to seek evidence that confirms their ideas and to ignore anything that contradicts them.
The authors describe examples of this bias recounted by Stephen Jay Gould in his book The Mismeasure of Man. For example, it was long thought that human intelligence correlated with brain size, but when French neurologist Paul Broca (1824-1880) found that German brains are larger than French brains, he explained away this fact by pointing out that German bodies were also larger, and so, proportionately, there was no real difference in brain size and hence no difference in intelligence. Of course, the real motivation for his explanation was that he couldn’t stomach the notion that Germans were smarter than the French.
Confirmation bias is also rife in areas such as medical diagnosis and jury deliberations. Radiologists often fixate on the first abnormality they see when examining an X-ray and fail to notice other signs that should be obvious. If only these other signs are present, however, radiologists see them right away. Studies have also shown that juries begin to decide on innocence or guilt long before all the evidence is presented.
The only way to counter confirmation bias is to be aware of its influence and to resist it robustly.
William Reville is an emeritus professor of biochemistry at UCC. http://understandingscience.ucc.ie