Weather forecasting, mixed with business forecasting, can make for remarkable success. That is the case Christopher Dyke makes in this article. The technique is called Model Output Statistics (MOS), and it involves tracking how forecast models perform "compared to real results and then accounting for this deviation in future model runs (forecasts)."

There are two techniques for implementing MOS. The first involves computing a historical average for demand. The second is more complex. Its first step is to run correlations on variables that might influence realized demand; this is referred to as a screening regression, and more detail is provided in Bryan (1944). Once this step is completed, select the top variables that correlate with demand. These variables are then used in a multiple regression analysis, and the results of that analysis serve as an adjustment factor for the raw model output.

The benefits of the more complex option are flexibility, versatility, and more insight into "uncharacteristic changes in demand." Dyke is quick to point out the negatives of the second technique, too: it requires constant updating and more time to use.

Regardless of which technique you use, having Model Output Statistics in place can help prevent outlier results from damaging business operations. Supply chains that implement MOS can become more proactive, adjusting to upcoming challenges before those challenges force the change.
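To make the two techniques concrete, here is a minimal Python sketch. Everything in it is a hypothetical illustration, not material from Dyke's article: the demand figures are invented, the candidate driver names (`promo_spend`, `price_index`, `weekday`) are assumptions, and the first technique is read here as a historical average-error (bias) correction, with the second regressing the forecast error on the two most correlated screened variables.

```python
import math

# --- Technique 1: historical-average correction (assumed reading) ----------
# Track the average gap between raw forecasts and realized demand, then
# shift future raw forecasts by that average error.
raw  = [100.0, 110.0, 95.0, 105.0, 120.0, 90.0]   # hypothetical model output
real = [108.0, 118.0, 99.0, 115.0, 131.0, 94.0]   # hypothetical realized demand
errors = [r - f for f, r in zip(raw, real)]
bias = sum(errors) / len(errors)

def adjust_simple(raw_value):
    """Raw model output shifted by the historical average error."""
    return raw_value + bias

# --- Technique 2: screening regression, then multiple regression -----------
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def fit_ols(rows, y):
    """Least squares via the normal equations (X'X)b = X'y, with intercept."""
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    A = [[sum(X[i][a] * X[i][b] for i in range(len(X))) for b in range(k)]
         for a in range(k)]
    v = [sum(X[i][a] * y[i] for i in range(len(X))) for a in range(k)]
    for col in range(k):                      # Gaussian elimination w/ pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    b = [0.0] * k
    for r in range(k - 1, -1, -1):            # back-substitution
        b[r] = (v[r] - sum(A[r][c] * b[c] for c in range(r + 1, k))) / A[r][r]
    return b

# Hypothetical candidate drivers of demand.
candidates = {
    "promo_spend": [5.0, 6.0, 2.0, 7.0, 9.0, 1.0],
    "price_index": [1.0, 1.1, 0.9, 1.0, 1.2, 0.95],
    "weekday":     [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
}

# Screening regression: rank drivers by |correlation| with the forecast error,
# then keep the top ones (top two here, an arbitrary cutoff).
ranked = sorted(candidates,
                key=lambda name: abs(pearson(candidates[name], errors)),
                reverse=True)
selected = ranked[:2]

# Multiple regression of the forecast error on the selected drivers.
rows = list(zip(*(candidates[name] for name in selected)))
coef = fit_ols(rows, errors)

def adjust_mos(raw_value, driver_values):
    """Raw model output plus the regression's predicted error (the
    adjustment factor), given driver values ordered as in `selected`."""
    pred_err = coef[0] + sum(b * x for b, x in zip(coef[1:], driver_values))
    return raw_value + pred_err
```

In this sketch the "adjustment factor" is simply the regression's predicted forecast error for a given set of driver values; the regression would need to be refit as new demand observations arrive, which is the constant-updating cost Dyke mentions.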