Wednesday, May 18, 2011

INSEAD Working Paper 2011/61/DS

Subjective probabilistic judgments are inevitable in many real-life domains. A common way to obtain such judgments is to assess fractiles or confidence intervals. However, such judgments tend to be systematically overconfident: for example, 90% confidence intervals for future uncertain quantities (e.g., future stock prices) typically capture only 50-60% of the actual realizations. Furthermore, it has proved particularly difficult to de-bias forecasts and improve the calibration of expressed subjective uncertainty. This paper proposes a simple process that systematically leads to wider assessed confidence intervals than is normally the case, thus potentially improving calibration and reducing overconfidence. Using a series of lab and field experiments with professionals forecasting in their domains of expertise, we show that unpacking the distal future into intermediate, more proximal futures has a substantial effect on subjective forecasts. For example, simply making it salient that between now and three months from now lie one month from now and two months from now increases the uncertainty assessors express in their three-month forecasts, which helps mitigate the overconfidence in those forecasts. We refer to this phenomenon as the time unpacking effect and find that it is robust to different elicitation formats. We also discuss possible explanations for the time unpacking effect and propose directions for future research.
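The calibration claim above (90% intervals capturing only 50-60% of realizations) rests on a simple hit-rate computation: count how often the realized value falls inside the assessed interval. A minimal sketch of that scoring, with hypothetical interval and realization data not drawn from the paper:

```python
def coverage_rate(intervals, realizations):
    """Fraction of realized values that fall inside the assessed intervals."""
    hits = sum(lo <= x <= hi for (lo, hi), x in zip(intervals, realizations))
    return hits / len(realizations)

# Hypothetical example: five 90% confidence intervals for a future stock
# price, of which only three capture the realized price. The empirical
# hit rate of 0.6 falls well short of the stated 0.9 -- overconfidence.
intervals = [(95, 105), (90, 100), (100, 110), (98, 102), (85, 95)]
realized = [101, 108, 95, 99, 90]
print(coverage_rate(intervals, realized))  # 0.6
```

A well-calibrated assessor's 90% intervals would yield a coverage rate near 0.9 over many forecasts; the paper's interventions aim to widen intervals so that measured coverage moves toward the stated level.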