The past 25 years have seen phenomenal growth of interest in judgemental approaches to forecasting and a significant change of attitude on the part of researchers to the role of judgement. While previously judgement was thought to be the enemy of accuracy, today judgement is recognised as an indispensable component of forecasting and much research attention has been directed at understanding and improving its use. Human judgement can be demonstrated to provide a significant benefit to forecasting accuracy, but it can also be subject to many biases. Much of the research has been directed at understanding and managing these strengths and weaknesses. The explosion of research interest in this area can be gauged from the fact that over 200 studies are referenced in this review.
Scenario planning can be a useful and attractive tool in strategic management. In a rapidly changing environment it can avoid the pitfalls of more traditional methods. Moreover, it provides a means of addressing uncertainty without recourse to the use of subjective probabilities, which can suffer from serious cognitive biases. However, one underdeveloped element of scenario planning is the evaluation of alternative strategies across the range of scenarios. If this is carried out informally then inferior strategies may be selected, while those formal evaluation procedures that have been suggested in relation to scenario planning are unlikely to be practical in most contexts. This paper demonstrates how decision analysis can be used to structure the strategy evaluation process in a way which avoids the problems associated with earlier proposals. The method is flexible, versatile and transparent and leads to a clear and documented rationale for the selection of a particular strategy.
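A structured strategy-against-scenario evaluation of the kind described above can be sketched with a simple additive model: each strategy is scored on a common scale under each scenario, and the scores are aggregated with importance weights rather than subjective probabilities. The strategy names, scenario names, scores and weights below are illustrative assumptions, not data from the paper.

```python
# Hypothetical strategy-against-scenario score matrix (0-100 scale).
# Rows are candidate strategies; columns are scenarios.
scores = {
    "Expand":      {"Boom": 90, "Stagnation": 40, "Collapse": 10},
    "Diversify":   {"Boom": 70, "Stagnation": 60, "Collapse": 50},
    "Consolidate": {"Boom": 40, "Stagnation": 70, "Collapse": 80},
}

# Importance weights for each scenario (not probabilities: scenario
# planning deliberately avoids probability assessment). Equal weights
# here express equal emphasis on performing well in every scenario.
weights = {"Boom": 1 / 3, "Stagnation": 1 / 3, "Collapse": 1 / 3}

def aggregate(strategy_scores, weights):
    """Weighted additive score of one strategy across all scenarios."""
    return sum(weights[s] * strategy_scores[s] for s in weights)

for name, row in scores.items():
    print(name, round(aggregate(row, weights), 1))
```

In this illustrative matrix the robust strategies (Diversify, Consolidate) outrank the one that excels only in a single scenario (Expand), which is exactly the kind of documented, transparent rationale the decision-analytic structuring is intended to produce.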
Decision makers and forecasters often receive advice from different sources, including human experts and statistical methods. This research examines, in the context of stock price forecasting, how the apparent source of the advice affects the attention that is paid to it when the mode of delivery of the advice is identical for both sources. In Study 1, two groups of participants were given the same advised point and interval forecasts. One group was told that these were the advice of a human expert and the other that they were generated by a statistical forecasting method. The participants were then asked to adjust forecasts they had previously made in light of this advice. While in both cases the advice led to improved point forecast accuracy and better calibration of the prediction intervals, the advice which apparently emanated from a statistical method was discounted much more severely. In Study 2, participants were provided with advice from two sources. When the participants were told that both sources were human experts, or that both were statistical methods, the apparently statistical advice had the same influence on the adjusted estimates as the advice that appeared to come from a human expert. However, when the apparent sources of advice were different, much greater attention was paid to the advice that apparently came from a human expert. Theories of advice utilization are used to identify why the advice of a human expert is likely to be preferred to advice from a statistical method.
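The degree of discounting described above is commonly quantified in the advice-taking literature by the "weight of advice" (WOA) statistic: the fraction of the distance from the initial estimate to the advised value that the adjusted estimate actually travels. The abstract does not state which measure the paper uses, so this is a hedged illustration of the standard metric, not the paper's own analysis.

```python
def weight_of_advice(initial, advice, final):
    """Weight of advice (WOA): 0 means the advice was ignored,
    1 means it was fully adopted, values in between indicate
    partial discounting. Undefined when advice == initial."""
    if advice == initial:
        raise ValueError("WOA is undefined when the advice equals the initial estimate")
    return (final - initial) / (advice - initial)

# Illustrative numbers: a forecaster who initially predicts 100,
# receives advice of 120, and adjusts to 105 has moved a quarter
# of the way toward the advice.
woa = weight_of_advice(initial=100, advice=120, final=105)  # 0.25
```

Severe discounting of statistical advice, as found in Study 1, would show up as a markedly lower average WOA for the group told the advice was machine-generated.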
Publisher's copyright statement / Use policy: The full-text may be used and/or reproduced, and given to third parties in any format or medium, without prior permission or charge, for personal research or study, educational, or not-for-profit purposes provided that:
• a full bibliographic reference is made to the original source
• a link is made to the metadata record in DRO
• the full-text is not changed in any way
The full-text must not be sold in any format or medium without the formal permission of the copyright holders. Please consult the full DRO policy for further details.

In this paper we review and analyse scenario planning as an aid to anticipation of the future under conditions of low predictability. We examine how successful the method is in mitigating issues to do with inappropriate framing, cognitive and motivational bias, and inappropriate attributions of causality. Although we demonstrate that the scenario methods contain weaknesses, we identify a potential for improvement. Four general principles that should help to enhance the role of scenario planning when predictability is low are discussed: (i) challenging mental frames, (ii) understanding human motivations, (iii) augmenting scenario planning through adopting the approach of crisis management, and (iv) assessing the flexibility, diversity, and insurability of strategic options in a structured option-against-scenario evaluation.
The limits of forecasting methods in anticipating rare events

In this paper we review methods that aim to aid the anticipation of rare, high-impact events. We evaluate these methods according to their ability to yield well-calibrated probabilities or point forecasts for such events. We first identify six factors that can lead to poor calibration and then examine how successful the methods are in mitigating these factors. We demonstrate that all the extant forecasting methods, including the use of expert judgment, statistical forecasting, Delphi and prediction markets, contain fundamental weaknesses. We contrast these methods with a non-forecasting method that is intended to aid planning for the future: scenario planning. We conclude that all the methods are problematic for aiding the anticipation of rare events and that the only remedies are either (i) to provide protection for the organization against the occurrence of negatively-valenced events whilst allowing the organization to benefit from the occurrence of positively-valenced events, or (ii) to provide conditions to challenge one's own thinking, and hence improve anticipation. We outline how components of devil's advocacy and dialectical inquiry can be combined with Delphi and scenario planning to enhance anticipation of rare events.
The use of surrogate weights based on rankings has been proposed as a method for avoiding difficulties associated with the elicitation of weights in multi-attribute decision analysis. When the simple multi-attribute rating technique using swings (SMARTS) method is being employed, it has been suggested that rank order centroid (ROC) weights are the best surrogate weights to use. This study shows that ROC weights are appropriate to use as a substitute for original weights that are constrained to sum to a fixed total (usually 1 or 100), as used in the point allocation method. If, however, the original weights are determined without any initial restrictions, as in the direct rating method, and are then normalized, which is the common procedure in SMARTS analysis, then the ROC weights do not provide the best approximations to the original weights. This paper shows how to obtain rank order distribution (ROD) weights that provide a better approximation than the ROC approach to unrestricted original weights. The paper also shows that, as the number of attributes in a decision problem increases, the ROD weights approximate to the more easily calculated rank sum weights.
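The two closed-form surrogate schemes named above are easy to compute from the attribute rank alone. For n attributes ranked 1 (most important) to n, the ROC weight of rank i is (1/n)·Σ_{j=i..n} 1/j, and the rank sum weight is 2(n+1−i)/(n(n+1)). A minimal sketch of both follows; ROD weights have no comparably simple closed form (they are derived from the distribution of normalized unrestricted weights), so they are not reproduced here.

```python
def roc_weights(n):
    """Rank order centroid (ROC) weights for n attributes ranked 1..n.
    Weight of rank i is the average of 1/j over j = i..n; weights sum to 1."""
    return [sum(1.0 / j for j in range(i, n + 1)) / n for i in range(1, n + 1)]

def rank_sum_weights(n):
    """Rank sum weights: linearly decreasing in rank, normalized to sum to 1."""
    return [2.0 * (n + 1 - i) / (n * (n + 1)) for i in range(1, n + 1)]

# For three attributes: ROC gives [11/18, 5/18, 2/18],
# rank sum gives the flatter [1/2, 1/3, 1/6].
```

Both schemes yield positive, decreasing weights summing to 1, so either can stand in for elicited weights once only the ranking is known; the paper's point is about which scheme best matches the way the original weights were elicited.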
Accurate forecasts are crucial to successful planning in many organizations and, in 2001, forty international experts published a set of principles to guide best practice in forecasting. Some of the principles relate to the use of management judgment. Almost all organizations use judgment at some stage in their forecasting process, but do they do so effectively? While judgment can lead to significant improvements in forecasting accuracy, it can also suffer from biases and inconsistency. The principles therefore