Accurate probability judgments are crucial for good decision making. One property of good probability judgments is that they are "well calibrated," meaning that the proportion of events judged to have a certain probability (say 60%) that actually occur is equal to that probability. For example, 60% of events judged to have a 60% chance of occurring should actually end up happening.

Unfortunately, research shows that people are usually overconfident in their judgments: they assign probabilities to events that are larger than the proportion of those events which actually end up happening (more information).

One method for evaluating your calibration is to answer a list of trivia questions, stating a confidence for each answer. When you finish the set of questions, you construct a "calibration curve," which plots your confidence in your answers against the fraction of questions answered correctly (example).
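The construction of such a curve can be sketched as follows. This is an illustrative sketch, not the site's actual code; the function name and data format are assumptions. Each quiz result is a (confidence, was_correct) pair, and answers are grouped by stated confidence:

```python
# Hypothetical sketch of building a calibration curve from quiz results.
# Each result is a (confidence, was_correct) pair; names are illustrative.
from collections import defaultdict

def calibration_curve(results):
    """Group answers by stated confidence and return, for each
    confidence level, the fraction actually answered correctly."""
    buckets = defaultdict(list)
    for confidence, correct in results:
        buckets[confidence].append(correct)
    return {
        conf: sum(answers) / len(answers)
        for conf, answers in sorted(buckets.items())
    }

# Example: four answers at 60% confidence (two right),
# two answers at 90% confidence (both right).
results = [(0.6, True), (0.6, False), (0.6, True), (0.6, False),
           (0.9, True), (0.9, True)]
print(calibration_curve(results))  # {0.6: 0.5, 0.9: 1.0}
```

For a perfectly calibrated judge, each key in the resulting dictionary would equal its value; points falling below the diagonal indicate overconfidence.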

The purpose of this site is to let you take such a calibration test and generate a calibration curve from your answers. This will help you understand the limitations of your own judgment and, with practice, make more calibrated assessments, though this is not easy.

To take a calibration test, select a question list. If this is your first time, use the default list: