Next we consider superforecasting and its use by a compliance function. Imagine that, as a Chief Compliance Officer (CCO), you could create a team that might dramatically improve your company’s compliance and risk forecasting ability, but to do so you would be required to expose just how unreliable the professional corporate forecasters have been. Could you do so and, more importantly, would you do so? Forecasting, in its most general form, is a predictive capability that organizations have always relied upon. However, the new “superforecasting” movement, led by Philip E. Tetlock and others, has been gaining strength as a way to improve that capability.
The concepts around superforecasting came of age after the intelligence failures leading up to the Iraq War. This led to the founding of the Good Judgment Project, which had as a key component a multi-year predictive tournament: a series of gaming exercises pitting amateurs against professional intelligence analysts. The results of the Good Judgment Project were presented in a recent Harvard Business Review (HBR) article by Tetlock and Paul J. H. Schoemaker, entitled “Superforecasting: How to Upgrade Your Company’s Judgment”. The authors had three general observations. First, “talented generalists can outperform specialists in making forecasts.” Second, “carefully crafted training can enhance predictive acumen.” Third, “well-run teams can outperform individuals.”
To move to superforecasting, the authors laid out four precepts. The first is to find the sweet spot, which is somewhere between predictions that are “entirely straight-forward or seemingly impossible.” They note the sweet spot “that companies should focus on is forecasts for which some data, logic, and analysis can be used but seasoned judgment and careful questioning also play key roles. Predicting the commercial potential of drugs in clinical trials requires scientific expertise as well as business judgment.” I find the same to be true in compliance where “Assessors of acquisition candidates draw on formal scoring models, but they must also gauge intangibles such as cultural fit, the chemistry among leaders, and the likelihood that anticipated synergies will actually materialize.”
Next is to train for good judgment. This requires employees to learn the basics of such techniques as probability concepts, the precise definition of what is to be predicted, and the use of numerical probabilities. As cognitive biases are widely known to skew judgment, companies need to raise awareness of the issue. Finally, training in the psychology behind such biases can improve accuracy within specific predictive domains.
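One way to make an understanding of numerical probabilities concrete is a calibration check: group a forecaster’s past predictions by the probability they assigned and compare each group against how often the event actually occurred. The sketch below is illustrative only (the data, bucket width, and function name are hypothetical, not from the article):

```python
from collections import defaultdict

def calibration_table(forecasts, outcomes, bucket_width=0.2):
    """Group forecasts into probability buckets and report the observed
    frequency of the event in each bucket. A well-calibrated forecaster
    sees events they call '70% likely' happen about 70% of the time."""
    buckets = defaultdict(list)
    for p, hit in zip(forecasts, outcomes):
        # Index of the bucket this probability falls into (1.0 maps to the top bucket)
        idx = min(int(p / bucket_width), int(1 / bucket_width) - 1)
        buckets[idx].append(hit)
    table = {}
    for idx, hits in sorted(buckets.items()):
        lo, hi = idx * bucket_width, (idx + 1) * bucket_width
        table[f"{lo:.1f}-{hi:.1f}"] = sum(hits) / len(hits)
    return table

# Hypothetical forecast probabilities and after-the-fact outcomes (1 = event occurred)
probs = [0.9, 0.85, 0.1, 0.15, 0.5, 0.55]
hits = [1, 1, 0, 0, 1, 0]
print(calibration_table(probs, hits))
```

A large gap between a bucket’s label and its observed frequency (say, “90% likely” events occurring only half the time) is exactly the kind of feedback this training is meant to surface.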
Next is to build the right kind of teams. The initial thing to realize is the importance of the team’s composition. The authors found that the best forecasters were “cautious, humble, open-minded, analytical - and good with numbers. In assembling teams, companies should look for natural forecasters who show an alertness to bias, a knack for sound reasoning, and a respect for data.” Equally critical is that the “forecasting teams be intellectually diverse. At least one member should have domain expertise (a finance professional on a budget forecasting team, for example), but nonexperts are essential too - particularly ones who won’t shy away from challenging the presumed experts. Don’t underestimate these generalists.” Clearly your compliance superforecasting team should draw from the diversity within your organization, not only in discipline but in temperament as well.
After the composition is considered, the authors move to “diverging, evaluating and converging.” The authors suggest “a successful team needs to manage three phases well: a diverging phase, in which the issue, assumptions, and approaches to finding an answer are explored from multiple angles; an evaluating phase, which includes time for productive disagreement; and a converging phase, when the team settles on a prediction. In each of these phases, learning and progress are fastest when questions are focused and feedback is frequent.”
The final component of composition is trust, as there must be trust among your team members to facilitate good outcomes. This means that if the superforecasters demonstrate the errors or miscalculations of others in the group, not only will they be protected by senior management, but their work will be defended as well. The authors note, “Few things chill a forecasting team faster than a sense that its conclusions could threaten the team itself.”
You then have to “track performance and give feedback,” as the authors believe it is essential to track prediction outcomes and provide timely feedback to improve forecasting going forward. This also has the added benefit of providing an audit trail so that a company can learn from both its good and bad predictions. This leads to the authors’ next insight, which is critical to the process.
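Tracking performance requires an agreed scoring rule. The Brier score, which Tetlock’s forecasting tournaments used for binary predictions, is simply the mean squared gap between each stated probability and what actually happened: 0.0 is perfect, and always guessing 50% scores 0.25. A minimal sketch with hypothetical compliance forecasts (the figures below are invented for illustration):

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities (0-1) and
    realized outcomes (1 = event occurred, 0 = it did not).
    Lower is better; 0.0 means every forecast was perfectly confident
    and correct."""
    if len(forecasts) != len(outcomes):
        raise ValueError("each forecast needs a recorded outcome")
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical team predictions, e.g. "probability this third-party
# transaction raises a compliance issue within a year", scored after the fact
team_forecasts = [0.8, 0.3, 0.9, 0.1]
actual_outcomes = [1, 0, 1, 0]
print(brier_score(team_forecasts, actual_outcomes))
```

Logging each forecast with its score over time gives the compliance function the audit trail the authors describe, and a simple number on which to give timely feedback.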
Such a feedback loop in the compliance sphere could lead to some of the following questions being posed: What information might others have that you don’t that might affect the compliance risk? What cognitive traps might skew your judgment on this transaction or risk? Why do you believe the company can safely navigate this compliance risk?
Answers to these and other questions can provide insight into not only specific predictions but also the process by which a team moved forward, so that it can be replicated in the future through an audit trail. [Think: Document, Document, Document.] Also, “Well-run audits can reveal post facto whether forecasters coalesced around a bad anchor, framed the problem poorly, overlooked an important insight, or failed to engage (or even muzzled) team members with dissenting views. Likewise, they can highlight the process steps that led to good forecasts and thereby provide other teams with best practices for improving predictions.”
Like any innovation, this requires a commitment from senior management to move forward. There must be data available internally and research conducted externally, with auditable trails covering judgments, underlying assumptions and data sources. The keys to success include frequent, precise predictions and measuring the accuracy of those predictions against real-world events. Such an exercise might well be exactly what a compliance function should undertake going forward. It might give the company enough information to make a seemingly risky business move when the prediction shows the risk is lower than the ‘experts’ said. Yet the authors end on this note, “But companies will capture this advantage only if respected leaders champion the effort, by broadcasting an openness to trial and error, a willingness to ruffle feathers, and a readiness to expose “what we know that ain’t so” in order to hone the firm’s predictive edge.”
Three Key Takeaways
This month’s podcast series is sponsored by Oversight Systems, Inc. Oversight’s automated transaction monitoring solution, Insights on Demand for FCPA, operationalizes your compliance program. For more information, go to OversightSystems.com.