Further Information About Calibrated Probability Assessment

The major result of calibration studies is that people are almost always significantly overconfident (Wilson 1994, p. 11).

Correcting for one's own overconfidence is apparently difficult. Assessors who are given feedback about their calibration show only modest improvements.

One apparently effective way of improving your calibration is to explicitly stop and consider reasons why you might be wrong (Koriat 1980; Plous 1993); doing this helps reduce overconfidence. Another strategy is to take calibration tests like the one found on this website, look at the feedback, and try to adjust accordingly, since such feedback has been shown to improve calibration somewhat (Wilson 1994).
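
To make the kind of feedback such tests provide concrete, here is a minimal sketch in Python (using made-up answer data, not results from this site) that groups answers by stated confidence and compares each group's stated confidence with its observed accuracy:

```python
from collections import defaultdict

# Hypothetical calibration-test results: (stated confidence, answered correctly?).
answers = [
    (0.6, True), (0.6, False), (0.7, True), (0.7, True),
    (0.8, True), (0.8, False), (0.9, True), (0.9, True),
    (0.9, False), (1.0, True),
]

# Group outcomes by the confidence level the assessor stated.
by_confidence = defaultdict(list)
for confidence, correct in answers:
    by_confidence[confidence].append(correct)

# A well-calibrated assessor's 80%-confident answers are right about 80% of
# the time; for an overconfident assessor, observed accuracy falls below the
# stated confidence.
for confidence in sorted(by_confidence):
    outcomes = by_confidence[confidence]
    accuracy = sum(outcomes) / len(outcomes)
    print(f"stated {confidence:.0%}: correct {accuracy:.0%} ({len(outcomes)} answers)")
```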

However, it is questionable whether this will carry over into more natural decision-making environments, as calibration is quite context-dependent. For example, calibration depends on question difficulty: people tend to be underconfident on easy problems and overconfident on hard problems (this is called the hard-easy effect) (Wilson 1994).
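
To illustrate how the hard-easy effect would show up in one's own test results, here is a sketch (again with hypothetical data) that splits questions by difficulty and computes the gap between mean stated confidence and actual accuracy in each group:

```python
# Hypothetical results tagged with question difficulty:
# (difficulty, stated confidence, answered correctly?).
results = [
    ("easy", 0.7, True), ("easy", 0.7, True), ("easy", 0.8, True),
    ("easy", 0.8, True), ("easy", 0.9, True), ("easy", 0.6, False),
    ("hard", 0.9, False), ("hard", 0.8, False), ("hard", 0.9, True),
    ("hard", 0.7, False), ("hard", 0.8, True), ("hard", 0.9, False),
]

for difficulty in ("easy", "hard"):
    group = [(c, ok) for d, c, ok in results if d == difficulty]
    mean_confidence = sum(c for c, _ in group) / len(group)
    accuracy = sum(ok for _, ok in group) / len(group)
    # A positive gap indicates overconfidence, a negative gap underconfidence;
    # the hard-easy effect predicts a larger positive gap on hard questions.
    print(f"{difficulty}: confidence {mean_confidence:.0%}, "
          f"accuracy {accuracy:.0%}, gap {mean_confidence - accuracy:+.0%}")
```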

It is also possible to perform calibration tests, similar to the ones available on this site, for probability distributions over continuous variables. Research shows that people tend to be significantly overconfident when giving probability distributions, meaning that their probability distributions are too tight (Kahneman and Tversky 1982, p. 324).
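
One common way to test calibration on continuous quantities is to elicit a 90% credible interval for each quantity and check how often the true value falls inside it: well-calibrated 90% intervals should contain the truth about 90% of the time, while intervals that are too tight will capture it far less often. A minimal sketch, with hypothetical intervals:

```python
# Hypothetical elicitation: for each quantity, the assessor's 90% credible
# interval (low, high) and the true value.
intervals = [
    (5000, 7000, 6650),   # (low, high, true value)
    (300, 500, 431),
    (10, 30, 88),
    (1900, 1950, 1969),
    (40, 60, 102),
]

hits = sum(low <= true <= high for low, high, true in intervals)
coverage = hits / len(intervals)

# If the intervals are too tight (overconfidence), coverage falls well
# below the stated 90% level.
print(f"90% intervals contained the true value {coverage:.0%} of the time")
```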

Research shows that in some contexts it is possible to be very well calibrated. For example, Wilson (1994) states: "[W]eather forecasters are almost perfectly calibrated. Their success can be attributed to several advantages they have when making their assessments: their task is repetitious, there is excellent supporting information, feedback is provided, and rewards are given for good performance." However, experts are also often overconfident when they do not have similar advantages.

Sources
S. Lichtenstein, B. Fischhoff, and L. D. Phillips, "Calibration of Probabilities: The State of the Art to 1980," in Judgment under Uncertainty: Heuristics and Biases, ed. D. Kahneman, P. Slovic, and A. Tversky (Cambridge University Press, 1982)
Alyson Wilson, "Cognitive Factors Affecting Subjective Probability Assessment," 1994