During World War II, the Royal Air Force became increasingly alarmed at how many aeroplanes they were losing to enemy anti-aircraft fire.
To address this problem, they planned to increase the amount of bulletproof armour carried on the planes. But because armour plating is heavy (and weight is always at a premium with aeroplanes), knowing where to put that extra armour became a pressing question.
The engineering solution seemed straightforward: study where the planes were most damaged and add armour to cover those areas. By studying enough planes, clear patterns of damage could be identified. This approach made intuitive sense, but as the mathematician Abraham Wald pointed out, it was also seriously flawed.
The problem with the engineers’ thinking was that it was based on studying the aeroplanes that made it back to base. By definition, these were planes that had been shot and yet still managed to fly home.
What their study showed, then, was exactly where the RAF didn’t need to add any more armour: a plane could be hit in those spots and still make it home. The planes hit elsewhere were the ones that never returned. The proper solution, Wald pointed out, was to add armour only to those areas where the returning aeroplanes had no damage.
Wald’s counter-intuitive reasoning holds an important lesson for all of us. Whenever we make a decision, we need to guard against what psychologists call “confirmation bias”. David McRaney describes confirmation bias as “a filter through which you see a reality that matches your expectations”. In other words, it’s the tendency to look for confirmation of our pre-existing ideas while ignoring any evidence that might disprove them.