Watch this space for recent news of interest to Researchers:

SAS grants announced for 2011
(29 November, 2010)
The famous M-competition data is now available in Excel format (10 March, 2008)

 Sources of Knowledge

 The State of Knowledge in 2001 (summarized as principles)

A description of the forecasting principles, along with supporting evidence, as they appeared in Principles of Forecasting: A Handbook for Researchers and Practitioners.

 What is Known to Date (summarized as principles)

The current state of forecasting knowledge, including changes and additions to principles as new evidence has emerged, is incorporated into the Forecasting Audit.

 Research Needs

  • Research Needs for Forecasting (PDF) describes principles of forecasting that are in need of research. Of the 139 principles, 33 are based on common sense and thus do not need research. Of the remaining 106, 41 principles have only a weak need for research, typically because adequate research has already been done; 42 are in moderate need of research; and, of key importance, 23 are in strong need of research. For a summary of prior research on these principles, see Principles of Forecasting (2001).

  • Systematic Approach to Research on Time Series - proposes a common set of features for time series to aid in identifying research conditions.

 Research Funding Sources

 New Forecasting Principles

  • A new forecasting principle has been proposed by Magne Jørgensen for the estimation of prediction intervals. This principle stands in marked contrast to current practice in many fields. Comments are solicited: in particular, is there additional research to add, and are there other conditions to consider?

14.14 Ask for a judgmental likelihood that a forecast will fall within a pre-defined minimum-maximum interval (not by asking people to set upper and lower confidence levels).

Traditionally, people are asked to provide minimum-maximum intervals to indicate the uncertainty of their estimates. This traditional request leads to over-optimistic views about the level of uncertainty. Jørgensen (2004) proposed instead that a person other than the estimator identify the minimum and maximum values, and that the expert then assess how likely it is that the actual value will fall inside the interval. Evidence was obtained from a previously reported experiment and from field studies in two software companies, where information was obtained from 47 projects applying the traditional framing and 23 projects applying the pre-defined framing. The latter led to better calibrated (less overconfident) prediction intervals. Arguments, replications, and supporting studies can be found in Teigen and Jørgensen (2005), Jørgensen, Teigen & Moløkken-Østvold (2004), Jørgensen and Teigen (2002), and Winman, Hansson & Juslin (2004).


Jørgensen, Magne (2004), "Increasing Realism in Effort Estimation Uncertainty Assessments: It Matters How You Ask," IEEE Transactions on Software Engineering, 30(4), 209-217 - full text.

Jørgensen, Magne & K. H. Teigen (2002), "Uncertainty Intervals versus Interval Uncertainty: An Alternative Method for Eliciting Effort Prediction Intervals in Software Development Projects," in Proceedings of the International Conference on Project Management (ProMAC), Singapore, pages 343-352 - full text.

Jørgensen, Magne, K. H. Teigen & K. J. Moløkken-Østvold (2004), "Better sure than safe? Overconfidence in judgment based software development effort prediction intervals," Journal of Systems and Software, 70(1-2):79-93 - full text

Teigen, K. H. & M. Jørgensen (2005), "When 90% confidence intervals are only 50% certain: On the credibility of credible intervals," Applied Cognitive Psychology, 19, 455-475 - full text.

Winman, A., P. Hansson & P. Juslin (2004), "Subjective probability intervals: how to reduce overconfidence by interval evaluation," Journal of Experimental Psychology: Learning, Memory and Cognition, 30(6), 1167-1175.
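The calibration issue behind principle 14.14 can be illustrated with a short sketch. The intervals and actual values below are hypothetical, not from any of the studies cited; the point is only that a set of nominally "90% confident" intervals is overconfident when the proportion of actual values falling inside them (the hit rate) is well below 90%.

```python
# Minimal sketch (hypothetical data): checking the calibration of
# elicited prediction intervals. Intervals are well calibrated when
# the hit rate matches the stated confidence level.

def hit_rate(intervals, actuals):
    """Fraction of actual values that fall inside their [low, high] interval."""
    hits = sum(low <= actual <= high
               for (low, high), actual in zip(intervals, actuals))
    return hits / len(actuals)

# Hypothetical project effort estimates (work-hours), elicited as
# "90% confident" minimum-maximum intervals that are too narrow.
intervals = [(80, 120), (40, 60), (100, 140), (55, 70), (200, 260)]
actuals   = [135, 52, 160, 90, 240]

print(hit_rate(intervals, actuals))  # 0.4 -- far below the stated 90%
```

Under the pre-defined framing, the same check applies: the expert's stated likelihood for the given interval should, over many projects, match the observed hit rate.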

 Papers with New Evidence on Principles

This section lists papers that contain new evidence relevant to forecasting principles.
[To list your paper here, see instructions.]

  • Wright, M. and MacRae, M. (2007). "Bias and variability in purchase intention scales", Journal of the Academy of Marketing Science, 35, 617-624. - Full Text
  • J. Scott Armstrong, Kesten C. Green, Randall J. Jones, and Malcolm Wright (2008). "Predicting Elections From Politicians’ Faces" - Full Text

  • Green, K. C. and Armstrong, J. S. (2007). "The Ombudsman: Value of Expertise for Forecasting Decisions in Conflicts", Interfaces, 37, 287-299. [Appears with an introduction by Paul Goodwin (pp. 285-286) and commentary by Shelly A. Kirkpatrick, Jonathon J. Koehler, and Philip E. Tetlock.] Description

  • In a paper titled "The Impact of Institutional Change on Forecast Accuracy: A Case Study of Budget Forecasting in Washington State" (Full Text) (expanded summary), Elaine Deschamps (2004) explored the relationship between organizational change and forecast accuracy by analyzing the budget forecasting process in the state of Washington, where an independent forecasting agency was established. Principles were tested on 180 budget forecasts produced before and after the agency's creation. Deschamps found that forecast error decreased by 22% (from a MAPE of 6.8% to 5.3%) after the agency was established.

  • Goodwin, P. (2002) "Integrating management judgment with statistical methods to improve short-term forecasts," Omega, 30, 127-135. Description
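The error reduction reported in the Deschamps study above follows from simple arithmetic. The `mape` helper and its sample values below are hypothetical illustrations; only the 6.8% and 5.3% MAPE figures come from the study.

```python
# Minimal sketch: MAPE (mean absolute percentage error) and the
# relative error reduction reported in Deschamps (2004).

def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(a - f) / abs(a)
                     for a, f in zip(actuals, forecasts)) / len(actuals)

# Hypothetical illustration of the metric itself:
# errors of 10% and 10% give a MAPE of 10%.
print(mape([100, 200], [110, 180]))  # 10.0

# The reported improvement: (6.8 - 5.3) / 6.8 ~= 22%.
before, after = 6.8, 5.3
reduction = 100 * (before - after) / before
print(round(reduction))  # 22
```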

 Working Papers with Evidence on Principles

Working papers are posted on this site, first, to establish a claim and, second, to obtain peer review. Please contact the authors directly with suggestions, such as informing them of a relevant paper that was overlooked. Contact J. Scott Armstrong if you would like your working paper to be considered.

  • J. Scott Armstrong and Kesten C. Green (2011), "Demand Forecasting: Evidence-based Methods," Working Paper Full Text
  • Alfred G. Cuzán, J. Scott Armstrong, and Randall J. Jones, "Combining Methods to Forecast the 2004 Presidential Election: The Pollyvote" - Full Text
  • Kesten C. Green (2003), "Forecasting Decisions in Conflicts: Analogy, Game Theory, Unaided Judgement, and Simulation compared," PhD Thesis at Victoria University of Wellington. - Full Text 
  • Thomas H. Tessier, "Conditional Lodging Sales Forecasts through 1980," MBA Thesis at the Wharton School, March 1974. - Full Text



 Data Used in Published Papers


  • Weatherhead II data (MS Excel, 196 KB), collected in 1995 by Monica Adya for her thesis Critical Issues in the Extension of Rule-Based Forecasting Systems at the Weatherhead School of Management at Case Western Reserve. It includes 489 annual time series from U.S. Statistical Abstracts, ValueLine, the INFORUM website, and the Worldwatch Institute. For details, contact Monica Adya.

 Peer Review

Seeking feedback on a working paper? One objective of this site is to enable rapid and open peer review of work on forecasting principles. Submit your working paper to receive comments.

IIF Discussion Group: a listserv that allows researchers to get advice from members of the International Institute of Forecasters.

 Researchers Who Do Consulting

If you also do consulting and are a member of the International Institute of Forecasters, you can add yourself to the list of forecasting consultants (which appears on the Practitioners page).

 Requests for Proposals

Funders can list RFPs here.