By Regina Joseph " Senior US intelligence leaders are starting to doubt whether ‘experts’ are the best forecasters of emergi...
By Regina Joseph
"Senior US
intelligence leaders are starting to doubt whether ‘experts’ are the best
forecasters of emerging risks. Regina Joseph, however, has other culprits in
mind. Familiar cultural and bureaucratic obstacles may be more to blame for the
foresight training and analysis problems intelligence agencies face today."
The aim of
intelligence analysis is straightforward enough: to foresee emerging threats to
the extent that one can prepare sufficiently in advance to either prevent or at
least mitigate them. Research lies at the core of this enterprise in
forecasting risk, whether via classified or unclassified data. But at a time when open source information is expanding exponentially, and uncertainty along with it, how such foresight is conducted, and by whom, has become a key concern.
In 2011, the
US government took a bold step in attempting to address those concerns. The
Intelligence Advanced Research Projects Activity (or IARPA, a division of the Office of the Director of National Intelligence) invested in a four-year
exploration of the underpinnings of better foresight analysis. Known as the
Aggregative Contingent Estimation (ACE) program, the initiative used a
tournament originally consisting of five teams of forecasters to determine
which individuals were most adept at forecasting future geopolitical outcomes and which traits shaped the best performers. Investigators and observers alike were
surprised by the tournament’s results.
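The program's name points to its technical core: aggregating many individual contingent estimates into a single crowd forecast. As a rough illustration, and not IARPA's or any team's actual method, the Python sketch below averages forecasters' probabilities and then "extremizes" the result, a step explored in aggregation research of this kind because simple averaging tends to produce underconfident crowd forecasts; the exponent used here is an arbitrary illustrative value.
```python
# Minimal sketch of aggregating contingent estimates into one crowd
# forecast. Illustrative only; not IARPA's or any team's actual method.

def aggregate(probabilities, exponent=2.5):
    """Average individual probabilities, then extremize the mean.

    Extremizing pushes the average away from 0.5, compensating for the
    underconfidence of simple averages. The exponent is an arbitrary
    illustrative choice, not a published parameter.
    """
    mean = sum(probabilities) / len(probabilities)
    a = mean ** exponent
    b = (1 - mean) ** exponent
    return a / (a + b)

# Five forecasters estimate the probability of a geopolitical event.
crowd = [0.60, 0.70, 0.65, 0.75, 0.55]
print(f"simple mean: {sum(crowd) / len(crowd):.2f}")  # 0.65
print(f"extremized:  {aggregate(crowd):.2f}")         # ~0.82
```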
Echoes from
the last century
While the
harnessing and development of expertise has become the default sine qua non of
what is considered to be good analysis, there was a time when this was not
necessarily supported by the entire US intelligence community. During the 1980s
and 1990s, key analysts argued against the use of experts as the sole guide to
good analytical ability. In his lauded collection of essays and articles, Psychology of Intelligence Analysis, CIA analyst Richards J. Heuer, Jr. suggested that expertise alone is simply not enough. Influenced by Nobel laureate Daniel Kahneman’s and Amos Tversky’s investigations into human psychology and decision-making (summarized in Kahneman’s modern classic, “Thinking, Fast and Slow”), Heuer argued that a disciplined critical-thinking process was a better catalyst of analytical performance.
Heuer’s focus on cognitive style, a departure from academic and government thinking at the time, established a few clear boundaries. He believed, for instance, that training analysts to recognize their own cognitive biases had no positive impact on the accuracy of their foresight. Heuer also argued
against placing an emphasis on analytical accountability. As he saw it,
measuring analytical precision was a foolhardy pursuit due to the fog of
uncertainty surrounding geopolitical affairs. His larger message instead
emphasized the constant challenging of analytical assumptions—the “mental models” analysts developed (Heuer 1999)—by updating and refining them through the use of alternative points of view.
Unfortunately,
Heuer’s advice to avoid the trap of relying upon “more and better information”
as an analytical solution was not always heeded by the intelligence community.
Indeed, in his foreword to Heuer’s book, fellow CIA analyst Douglas MacEachin
highlights the default tendency to throw more expertise at a problem when
analysis goes wrong, as well as “the ideological and bureaucratic imperatives”
that get in the way of more effective analytical techniques.
Fast forward
to today, and the echoes of Heuer’s work reverberate as strongly in the
practical realm as in the theoretical. In early 2015, the US intelligence
community communicated its awareness that it could no longer conduct business
as usual. At the beginning of March, CIA Director John Brennan announced a
major restructuring of the Agency. A greater emphasis is to be placed on
enhancing the skills of CIA staff by breaking down the walls between
“operations” and “analysis”.
Prior to
Brennan’s announcement, Jane Harman, CEO of the Woodrow Wilson International
Center for Scholars and a former ranking Democrat on the US House Intelligence
Committee, echoed the need for a “disruptive upgrade” and “smarter spying”. In several articles, she also highlighted the importance
of developing a cyber-ready corps with open source media (especially social
media) expertise. Adding to the burgeoning chorus of demands for improved
intelligence analytical skills is the 9/11 Review Commission’s most recent
report to FBI Director James B. Comey. In keeping with comments made by Brennan and Harman, it also urges the bureau to accelerate its efforts to adapt its intelligence cadre to the increasing pace of threats.
Lessons to be
learned
Indeed, Heuer’s
important message now requires further review, given the amount of empirical
data that has been gathered by researchers in recent years. This data has, in
turn, been the key output of the winner of IARPA’s ACE tournament, the Good
Judgment Project. Led by principal investigators Philip Tetlock and Barbara
Mellers at the University of Pennsylvania, and Don Moore at the University of California, Berkeley, the Project took cues from Tetlock’s own work, Expert Political Judgment: How Good Is It? How Can We Know? Like Heuer, Tetlock rejected the notion that experts make the best forecasters. However, unlike Heuer, Tetlock based his assessment on a clinical evaluation of the accuracy of thousands of forecasts made by experts and generalists alike. Tetlock’s
examination suggested that open-minded generalists could be far better than
experts in analyzing and predicting outcomes.
As recognized
specialists in the fields of political judgment and decision-making, the Good
Judgment team focused on—among other issues—the psychological qualities of
perception and cognition apparent in the top analysts. Aided only by open
source data, Good Judgment forecasters were able to surpass the forecasting
accuracy of intelligence community analysts with access to classified
information, thus substantiating Tetlock’s theory regarding the predominance
of select generalists over experts.
But perhaps
most germane to both public and private sector needs in light of mounting
uncertainty was the determination that forecasting accuracy is a trainable
skill. While the data collected by the Good Judgment Project supports Heuer’s
primary assertion that the constant challenging of beliefs is an integral
aspect of more accurate analysis, it also refutes his aversion to cognitive
de-biasing and analytical accountability. The core results from the Good Judgment Project, summarized in its prescription to train, team and track, demonstrate the statistically significant gains in forecasting accuracy achieved by teaching people to recognize and limit their cognitive biases; to become aware of the gap between what they think they know and what they actually know; to think in an actively open-minded manner; and to apply probabilities in a granular way when assessing outcomes.
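A worked example helps make "granular" concrete. The Brier score, a standard accuracy measure in forecasting tournaments, is the squared error between a stated probability and the eventual outcome. The sketch below, using purely hypothetical forecasts and outcomes, shows how a well-calibrated forecaster who distinguishes 90% from 70% edges out one who rounds both to the same coarse bucket.
```python
# Brier score: squared error between a probability forecast and the
# outcome (1 if the event occurred, 0 if not). Lower is better.
# All forecasts and outcomes below are hypothetical.

def brier(prob, outcome):
    return (prob - outcome) ** 2

# (granular forecast, coarse forecast, outcome) per question.
questions = [
    (0.90, 0.75, 1),  # near-certain event that did occur
    (0.70, 0.75, 1),  # likely event that did occur
    (0.30, 0.25, 0),  # unlikely event that did not occur
    (0.10, 0.25, 0),  # very unlikely event that did not occur
]

granular = sum(brier(g, o) for g, _, o in questions) / len(questions)
coarse = sum(brier(c, o) for _, c, o in questions) / len(questions)
print(f"granular forecaster: {granular:.3f}")  # 0.050
print(f"coarse forecaster:   {coarse:.3f}")    # 0.062
```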
However, the
gains made from training analysts in cognitive technique are not enough. The
Good Judgment Project urges keeping analysts accountable by tracking their
performance over time and teaming the best forecasters with each other to
compound the effects of greater accuracy. Taken together, the results of the Good Judgment Project go beyond the contrarian insights of Heuer’s work and point the way forward in measurably improving analytical technique and preparing a new generation of analysts, whether in business, the intelligence community or other government departments.
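A minimal sketch of those "track" and "team" steps might look like the following, with hypothetical analyst names and scores rather than the Project's actual pipeline: keep a running record of each analyst's Brier scores on resolved questions, then pool the most accurate into a team.
```python
# Track each analyst's accuracy over time, then team top performers.
# Names and scores are hypothetical; this is not the Project's pipeline.
from statistics import mean

# Running record: analyst -> Brier scores on resolved questions.
track_record = {
    "analyst_a": [0.04, 0.09, 0.02, 0.06],
    "analyst_b": [0.25, 0.16, 0.30, 0.22],
    "analyst_c": [0.05, 0.03, 0.08, 0.04],
    "analyst_d": [0.12, 0.20, 0.15, 0.18],
}

# Rank by mean Brier score; lower means more accurate.
ranked = sorted(track_record, key=lambda name: mean(track_record[name]))

# "Team" step: pool the best so their accuracy can compound.
elite_team = ranked[:2]
print("elite team:", elite_team)  # ['analyst_c', 'analyst_a']
```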
Barriers…still
The final
outcome of the ACE Program poses an obvious question: how could the success of
the Good Judgment Project be factored into the modernization and improvement of
21st century intelligence analysis? With great difficulty is the short answer.
Irrespective of the CIA’s recent announcement, the transition to new analytical
standards will require time and patience, as well as new training and
development programs. Analysts are rarely required to step back from their
niche remits to take the broader and actively open-minded views that are
foundational to better accuracy. Consequently, intelligence operatives are not
taught how to develop this attribute. Moreover, stovepiping and the
compartmentalization of analysis—impediments to the kind of crowdsourced
teamwork from which Good Judgment’s most accurate forecasters benefited—won’t
be easily reduced, whether in government or business circles. And let’s not
forget that the politics of hierarchy also plays a role. When experts within
bureaucracies serve vested interests, their replacement by process happens
fitfully, if at all.
Entrenched
cultures aren’t the only barriers that need breaking down in order to reform
analytical structures. The vast stores of open source information that analysts
must sift through will require better aggregation and filtering if this
revolution in analytical affairs is to succeed. While machine learning,
artificial intelligence and natural language processing methods are quickly
ramping up to semantically trawl the zettabytes’ worth of data the Internet
contains, they have yet to be incorporated into a usable and widely distributed
tool for analysts. Over time, as dependency on algorithms in forecasting risk
increases, accountability will dictate putting human analysts at the front and
center as a counterbalance to the limitations of computation alone.
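To make the aggregation-and-filtering need concrete, even a toy triage layer can rank incoming open source items by topical relevance; a production tool would of course rely on the machine learning and NLP methods described above. The keyword list and feed items below are invented for illustration.
```python
# Toy triage of an open source feed: rank items by risk-related keywords.
# A real tool would use ML/NLP methods; terms and feed are invented.

RISK_TERMS = {"sanctions", "mobilization", "coup", "default", "outbreak"}

def relevance(doc):
    """Count distinct risk-related terms appearing in a document."""
    return len(set(doc.lower().split()) & RISK_TERMS)

feed = [
    "Local festival draws record crowds this weekend",
    "Ministry denies coup rumors amid troop mobilization near border",
    "Creditors warn of sovereign default if sanctions tighten",
]

# Surface the highest-signal items first for the analyst.
for doc in sorted(feed, key=relevance, reverse=True):
    print(relevance(doc), doc)
```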
The first quarter of 2015 could come to represent a watershed: the moment the intelligence community acknowledged the need for a sea change in analytical approaches and a swift transition. But while the recognition that change must come has arrived, the execution of that transition is far less assured.
About the Author:
Regina Joseph
is the founder of Sibylink, an international consultancy based in The Hague and
New York devoted to providing strategic foresight on global issues through
futures forecasting and analytical training. A Good Judgment Superforecaster,
she is also the co-founder of Super-Powered, which produces analytical media.
She is a faculty member at New York University’s Center for Global
Affairs, where she will be launching a Futures Lab in Fall 2015.
Her websites can be found at http://www.sibylink.com and http://www.super-powered.com; on YouTube at Super-Powered THE SHOW!; on LinkedIn at https://www.linkedin.com/pub/regina-joseph/1/b21/780; and on Twitter at @Superforecastr.