Our ability to predict the future is hindered, of course, by the nature of the world we’re trying to predict. Too many variables interact in too many unknown ways, producing outcomes too numerous and varied for us to meaningfully measure, process, and interpret. Our ability is also hindered, as Kahneman and Tversky have shown, by the nature of the human brain. Too often we fall prey to the machine’s quirks. We apply the ‘representativeness heuristic,’ forgetting that the odds that something that looks like a duck, walks like a duck, and quacks like a duck is actually a duck depend heavily on how many ducks actually inhabit the terrain we’re in. Or we apply the ‘availability heuristic,’ believing that what we can easily imagine — what comes easily to mind — is most common, or most likely to happen.
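The duck example is, at bottom, a base-rate calculation, and Bayes’ rule makes the point exact. A minimal sketch, with made-up numbers purely for illustration:

```python
def posterior_duck(p_evidence_if_duck, p_evidence_if_not, base_rate):
    """P(duck | looks, walks, and quacks like a duck), via Bayes' rule.

    All three arguments are hypothetical probabilities chosen for
    illustration; nothing here comes from real data.
    """
    true_positive = p_evidence_if_duck * base_rate
    false_positive = p_evidence_if_not * (1 - base_rate)
    return true_positive / (true_positive + false_positive)

# Identical evidence, different terrain: only the base rate changes.
pond = posterior_duck(0.9, 0.05, base_rate=0.5)      # ducks common
desert = posterior_duck(0.9, 0.05, base_rate=0.001)  # ducks rare

print(round(pond, 3))    # → 0.947
print(round(desert, 3))  # → 0.018
```

Same quacking, same waddling: near-certainty by the pond, long odds in the desert. Neglecting that denominator is exactly the error the heuristic invites.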
Yet our poor predictive prowess never seems to reduce our confidence. I challenge you to find one pundit, talking head, expert, or think tank in the last ten years who has said of the future, sincerely and without qualification, “I don’t know.” We tend to be as confident about prediction as we are bad at it. Psychological research has shown time and again that confidence in a prediction is often quite unrelated to the odds of its being accurate, just as the confidence with which you remember or report something is unrelated to whether the thing you remember actually happened. When a relationship between prediction and result is found, it tends to run in the direction of overconfidence. For example, as Daniel Gilbert has shown, people routinely overestimate how much particular events will affect their future prospects. Most of the things we think will make us very happy tend to make us only slightly happier, if at all. Things we lament as unspeakable catastrophes end up harming us less than expected, or not at all.