The Role of Bias in Intelligence Analysis



The Role of Organizational and Intelligence Discipline Bias in Intelligence Analysis, and Structured Analytic Methods to Overcome This

Jeremy Levin; Student ID #3049427
INTL 506: Analytics II
7/30/2011


    In the wake of the 11 September 2001 terrorist attacks against the World Trade Center

    and Pentagon--commonly referred to as 9/11--and the faulty intelligence that sent the United

    States (US) to war in Iraq, intelligence failure has become one of the most discussed national

    security topics in both academia and national security circles. From online blogs devoted to

    exposing intelligence dysfunction to the 9/11 Commission Report, every aspect of the

    intelligence cycle has been picked apart, and failure attributed to every aspect. A common

thread among much of this criticism is intelligence stovepiping: the retention of intelligence within an organization rather than its dissemination to the wider intelligence community (IC). Most pundits assert this stovepiping resulted either from security concerns--fear that those outside the originating organization would leak or spill the intelligence--or from the desire to use the withheld reporting to make their organizations' analysis more valuable to national decision

    makers. Whatever the reason, intelligence organizations were not sharing information, resulting

    in several failures to piece together complete intelligence pictures for national leaders.

    In 2005, leadership of the intelligence community was taken away from the Director of

    Central Intelligence--the position that headed the community since 1946, predating the Central

    Intelligence Agency's (CIA) creation--and given to the newly-created Director of National

    Intelligence (DNI), in large part in an attempt to eradicate the stovepiping that had plagued the

    US' intelligence efforts for decades.

    Despite this mandate and the subsequent creation of numerous joint intelligence groups to

    facilitate intelligence sharing, stovepiping continues to plague the intelligence community,

    suggesting the root of the problem lies deeper than habit or neglect. This paper proposes the

    stovepiping blamed for much of the intelligence community's dysfunction and several

    intelligence failures results from a more fundamental problem--bias. This bias primarily takes


one of two forms: organizational bias--favoring one's own organization's reporting or discounting the reporting of other organizations--and intelligence discipline bias--favoring or discounting a particular type of intelligence, usually based on its method of collection.

Assuming the bulk of authors and experts are correct that stovepiping was largely responsible for the flawed intelligence that left the US vulnerable to terrorist attack, and that stovepiping remains an ongoing issue, it logically follows that the US continues to be vulnerable to

    attack due, in part, to ongoing, underlying bias that contributes to stovepiping within the

    intelligence community. This paper will explore the manifestations of this bias toward reporting

    from different intelligence collection organizations, and toward reporting from different types of

    intelligence collectors. It will then explore whether and which structured analytic techniques

    could effectively mitigate the identified biases among intelligence analysts, managers, and

    consumers.

    Literature Review:

    While not specifically detailing intelligence stovepiping, in his book, "Bureaucracy,"

    James Q. Wilson described organizational turf protectionism as a key factor in bureaucratic

    operations. Wilson asserts agencies and organizations attach high priority to their decision

    making autonomy, or turf, and strongly resist and resent attempts to infringe on this turf. Wilson

    further states "struggles over autonomy are especially visible when the organizations involved

    have similar tasks." While Wilson was referring specifically to the armed forces, this sentiment

    is applicable to intelligence stakeholders as well. (Wilson 1989)

    In his book, "Getting Agencies to Work Together," Eugene Bardach asserts agencies

    often resist collaboration for fear it will blur agency missions and political accountabilities.

Bardach further states agencies may fear obsolescence, potentially resulting in an "ecosystem


    effect" in which policy aims to protect individual species--in this case, individual agencies and

    organizations--rather than the ecosystem writ large--in this case, the IC. Bardach identifies the

    same issues regarding autonomy Wilson put forth, stating agencies may reject collaboration to

    preserve their decision making autonomy and minimize potential threats from necessary

    relationships. (Bardach 1998)

In August 2007, then-Director of National Intelligence Mike McConnell published his recommendations for intelligence reform in "Overhauling Intelligence," a piece in Foreign Affairs. Among these recommendations, he identified several

    problem areas within the intelligence community; stovepiping intelligence was one of these

    areas. McConnell asserts that in addition to a divide between foreign intelligence and law

    enforcement that remained in place as intelligence sharing otherwise expanded post-9/11, the IC

    component organizations' unique mandates and narrowly focused missions inhibited IC unity.

    McConnell specified that few analysts knew their counterparts in other agencies, and there were

    few mechanisms in place to facilitate, let alone ensure collaboration and analytic exchange.

    (McConnell 2007)

    A 2008 Office of the Director of National Intelligence (ODNI) study into IC management

    challenges, conducted by the Office of the Inspector General, identified IC information sharing

    as a key challenge. The study found that most analysts relied on personal relationships with

    counterparts for information sharing, as agencies responsible for intelligence collection

    continued to limit access to data and products to the wider community. Part of the problem,

    according to the study, was that information technology systems were largely disconnected and

    incompatible, and interoperability between networks was lacking. However, the study claimed

    turf battles and agency protectionism continued to be a problem, despite efforts to improve and


increase collaboration and sharing, because there were few, if any, consequences for failure to

    collaborate. (Office of the Director of National Intelligence, Office of the Inspector General

    2008)

Kenneth Lieberthal's "The U.S. Intelligence Community and Foreign Policy," written for the Brookings Institution, aims to identify problem areas within the IC and recommend

    changes to make the IC more relevant and responsive to policy and decision makers' needs. The

    study identifies numerous problem areas, from overemphasis on a single intelligence product line

to the recruitment of poor quality analysts; most relevant for our purposes is Lieberthal's section on the overemphasis of classified reporting and de-emphasis of open source reporting that provides greater context to intelligence products. Lieberthal also asserts a "culture of insularity and

    secrecy" works to the detriment of both the IC and its products by limiting dissemination of

intelligence traffic due to variations in security screenings, and laments that participants in

    National Intelligence Estimates (NIEs) represented their agencies rather than themselves, and

    therefore perpetuated inter-agency rivalries and partisan manipulation. (Lieberthal 2009)

    Mark Lowenthal details IC interrelationships in his book "Intelligence: From Secrets to

    Policy." Lowenthal contends many of the developments within the IC since 2001 have actually

    increased rather than eradicated interagency rivalries, most prominently between foreign and

    domestic intelligence and between civilian, military, and law enforcement intelligence--

especially between the CIA, DoD, and FBI. Lowenthal also identifies the potential for "footnote wars," as he describes them, in which producers of finished intelligence products attempt to maintain separate points of view on intelligence topics. Further, Lowenthal describes stovepiping as a result of agencies attempting to highlight their individual relevance to ensure


    continued funding, and asserts one way intelligence analysts attempt to make their products stand

    out is to emphasize the unique nature of their sources. (Lowenthal 2009)

    Dr. Rob Johnston's "Analytic Culture in the US Intelligence Community," written for the

    CIA's Center for the Study of Intelligence, is a comprehensive look at how the IC conducts

    analysis, and many of the problems entailed in this analysis. Johnston identifies several factors

    that contribute to intelligence stovepiping and bias, including the conflict between secrecy and

    efficacy, agency reluctance to reverse or alter corporate judgments, and confirmation bias in

    agency selection and weighing of data according to classification. Johnston cites interviews with

    senior analysts and intelligence managers, in which common themes include bias in favor of

    classified reporting over open-source intelligence, trust in technical collection over other

    intelligence collection, and agency preference for intelligence collected by that agency. Johnston

    further contends agency managers believe personnel, training, and readiness should be tailored

    specifically for their own organizations, which inhibits interaction with other agencies. Johnston

    also asserts that even when placed in a joint environment, analysts will revert to previous

    isolationist patterns if not given specific joint processes, principles, and operational structures.

    (Johnston 2005)

    Steven W. Peterson's "US Intelligence Support to Decision Making," written for the

    Weatherhead Center for International Affairs at Harvard University, details the primary

    intelligence failures leading to both 9/11 and the 2003 invasion of Iraq. Peterson reiterated the

    9/11 Commission's determination that intelligence stovepiping and agency parochialism as a

    matter of policy and practice were key contributors to these failures. Peterson also assigns fault

    to decision makers' expectation that intelligence can provide certainty, and asserts this


expectation causes bias in favor of technical collection and reporting on adversary actions, and against collection and reporting on adversary intent. (Peterson 2009)

    Gregory Treverton's essay "Intelligence Analysis: Between 'Politicization' and

    Irrelevance" addresses both decision maker and agency bias in intelligence analysis. Treverton

    details several facets of politicization, including senior policy officials' direct pressure and

    agency "house line" assessment. Treverton describes the same desire for certainty Peterson

    detailed, but asserts the less technically fact-driven analysis on intent and adversary mindset is

    still sought after and of value in shaping policy makers' strategic goals. Treverton implies

    decision makers discount non-technical intelligence on intent and mindset when addressing

    tactical situations. (George and Bruce 2008)

    Frank Cilluffo, Ronald Marks, and George Salmoiraghi's study titled "The Use and

    Limits of U.S. Intelligence" published in the Washington Quarterly's Winter 2002 edition

    attributed many of the US' HUMINT collection difficulties to a long-running political shift away

    from HUMINT collection in the US intelligence community. The authors argue US political

    structures degraded HUMINT collection due to the perception HUMINT was more "dirty" than

    technological collection methods, such as SIGINT, and attempts to make HUMINT "cleaner"--

    such as directives to ensure HUMINT assets were "boy scouts," as the authors quote a

    clandestine service officer--further ensured HUMINT collection would not adequately address

    US intelligence needs. (Cilluffo, Marks and Salmoiraghi 2002)

    The existing literature on the subject clearly indicates stovepiping is an ongoing issue

    within the intelligence community, and suggests this stovepiping results at least in part from bias

    on the part of policy and decision makers, intelligence managers, and intelligence analysts alike.


    Discussion:

    In discussing bias that results in intelligence stovepiping, we must differentiate between

    unconscious bias--that which affects the perception of intelligence reporting or analysis writ

    large, without a specific goal or aim--and deliberate bias--selecting reporting for use in analysis

and assessment to support specific goals, aims, or agendas. We must further specify how

    different types of bias affect participants in the intelligence production process.

    Unconscious Bias:

    Desire for Certainty

    Johnston, Treverton, and Peterson all describe a preference for technical collection over

    non-technical collection. Johnston and Treverton directly attribute this to the desire for certainty,

    and the belief that technical collection and analysis based on technical collection can provide

    more certain assessments. This appears to be driven by policy makers and intelligence managers

    more than analysts, and often results in increased funding for organizations and agencies

    supporting technical intelligence collection at the expense of those supporting non-technical

    collection. This leadership expectation does impact analysis, however, in that analysts will use a

    disproportionate amount of technically collected intelligence in finished intelligence products to

ensure these products appeal to, and are read by, their customers.

    Preference for Classified Reporting

    Lieberthal and Johnston identify the preference to use classified reporting rather than

    open source information in intelligence analysis, and anecdotal evidence from more than fifteen

years in the IC suggests this bias is prevalent among intelligence analysts and managers. In de-emphasizing openly available information--especially the public statements and concerns of adversary or target nations' leaders--intelligence analysts lose sight of the context and social drivers


    impacting their analysis and assessments. Additionally, classified intelligence collection

    depends on collection assets being available and placed where they are able to collect

    information, and being focused on targets that ensure all relevant information is collected.

    Classified collection platforms that are not perfectly placed and targeted will result in incomplete

    or incorrect intelligence collection, and over-reliance on this classified but incomplete reporting

    will invariably lead to intelligence failure.

    Preference for One's Own Agency's Reporting

    Lowenthal and Johnston both identify analyst preference to use unique sources of

    information or reporting originating from their own agencies. Once again, anecdotal evidence

    suggests this bias is not only prevalent, but pandemic among intelligence analysts and managers.

    National decision makers require full-spectrum assessments to identify what adversaries intend,

of what they are capable, the state of their preparation, etc., and no single intelligence discipline--

    such as HUMINT, SIGINT, MASINT, IMINT, and so forth--is capable of providing such full-

    spectrum assessment. Ostensibly "all-source" analysis that emphasizes one discipline or

    collection agency over others is disingenuous, misleading to national decision makers, and

    contributes to intelligence failure.

    Perception of "Clean" and "Dirty" Intelligence

Cilluffo et al. identify the perception among decision makers that technical collection is "cleaner" or more gentlemanly than clandestine HUMINT collection via spies. While this bias appears primarily among policy and decision makers, it can prompt these leaders to discount assessments based on HUMINT, or affect budget and resource allocation, subsequently impacting both collection and analysis.


    Deliberate Bias:

    Ensuring Relevance and Resources

    Bardach, Lieberthal, and Lowenthal identify intelligence managers' selection and use of

    intelligence reporting for analysis to ensure decision makers perceive their organizations as

    relevant and necessary for national security. This results in incomplete and skewed intelligence

    pictures provided to senior decision makers, which increases the likelihood of poor decisions that

    leave us vulnerable to intelligence failure and attack. As with the unconscious bias in favor of

    one's own agency's reporting, selecting reporting for ostensibly "all-source" national security

    analysis and assessment to support an organizational agenda is disingenuous and damaging.

    Protecting Autonomy

    Wilson and Bardach identify the overarching desire of any and every bureaucratic

organization to seek and protect its decision making autonomy. One way to do this is to ensure the organization's customers--in this case, national level decision and policy makers--rely on the products the organization provides, in the manner it provides them, with the understanding that pressure from or forced collaboration with external stakeholders will degrade that product. Additionally, the ODNI's determination that the systems and networks amongst the IC are unable to communicate and lack interoperability likely stems from this same protectionism: maintaining separate and distinct means of communication translates to less

    external influence.

    Structured Analysis:

    Advances in message handling systems--such as the Multi-Media Messaging (M3) used

    by many Department of Defense components--have greatly improved cross-organizational

    reporting to facilitate its availability to all analysts. However, the problem of bias in the


    selection and use of reporting in finished intelligence products continues. The top-down

    approach to eliminating stovepiping and bias in intelligence analysis and assessment undertaken

in the wake of 9/11 and the Iraq conflict has so far failed. This is unsurprising, as policy and decision

    makers are unlikely to acknowledge their biases as such, and intelligence managers are unlikely

    to cease their deliberately applied bias.

    This leaves it up to the intelligence analyst to remove bias. Several structured analytic

    techniques have been developed with the goal of removing bias from intelligence analysis. In

    the initial phase of research, gathering information, and winnowing available reporting for use in

    an intelligence product, analysts should evaluate the available sources of information to

    determine reliability, credibility, the quality of information, and its relevance to the issue at hand.

    Beyond this, using a structured technique to objectively compare reporting to hypotheses further

minimizes the effect of bias in analysis and assessment.

    In the selection and evaluation stage, the Defense Intelligence Agency's "A Tradecraft

    Primer: Basic Structured Analytic Techniques" would prove of immense value. Detailed in this

    primer are methods to perform relevance checks, source checks, and quality of information

    checks--all necessary steps when producing finished intelligence. (Defense Intelligence Agency

    2009)

    Relevance Check

    This check enables analysts to winnow available reporting to that which directly

    addresses the issue at hand, mitigates bias when reviewing information and reporting, and

    increases analyst confidence that all facets of the issue have been thoroughly analyzed. While

time-consuming when there is a large body of reporting on the subject, relevance checks ensure each

    piece of data assessed relates to the central issues or alternative possibilities being analyzed. Of


    note, the relevance check should not be used to gather information and data that supports a

    particular hypothesis or opinion--it should determine reporting related to the central topic, which

    will then help form hypotheses.

    Source Check

    This is conducted as part of the first full review of the relevant reporting, and is designed

    to mitigate bias for or against a given source. To perform a source check, a series of questions

    must be asked when reading each report; for example, when evaluating a HUMINT source, the

    analyst should analyze the source information and context, then ask whether the source's stated

    placement and access would grant him/her this information, and whether the source appears

    biased or is approaching the information from a particular point of view. For ELINT or

    MASINT, the analyst should ask about the capabilities and limitations of the collection platform

    compared to the data collected, the frequency and duration of collection, the platform's coverage,

    etc. The answers to the questions asked while performing a source check will allow the analyst

    to both objectively weigh reporting from multiple sources, and to accurately convey confidence

    in these sources to intelligence managers and decision makers.

    Quality of Information Check

    This check enables analysts to evaluate the completeness and validity of available

    information. Analysts should question the reports' actual information for completeness, signs of

    bias, signs of deception or intention to influence, whether the information is corroborated by

    separate sources or intelligence platforms, and whether the information is consistent with

    previous information--and if not, whether the information is anomalous, and therefore likely

    incorrect, or signifies a paradigm shift that alters or cancels previous assessments.
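
As a rough illustration only--not terminology from the DIA primer, and with class and field names invented for the sketch--the relevance, source, and quality of information checks described above could be recorded as a simple per-report screening record:

```python
from dataclasses import dataclass

# Illustrative screening record for the three checks described above.
# Class and field names are assumptions for this sketch, not DIA
# primer terminology.
@dataclass
class ReportScreening:
    report_id: str
    relevant_to_question: bool    # relevance check
    source_has_access: bool       # source check: placement and access
    source_bias_noted: bool       # source check: known slant or agenda
    corroborated: bool            # quality check: independent confirmation
    consistent_with_priors: bool  # quality check: fits previous reporting

    def passes_screening(self) -> bool:
        # A report enters the analytic pool only if it is relevant and its
        # source plausibly had access; noted bias, corroboration, and
        # consistency are recorded so they can be weighed later, rather
        # than used to discard reporting outright.
        return self.relevant_to_question and self.source_has_access

report = ReportScreening("HUMINT-0001", True, True, False, True, True)
print(report.passes_screening())  # True
```

Recording the answers this way, rather than screening reports mentally, leaves an auditable trail an analyst can show to managers and consumers when conveying confidence in sources.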

    Structured Analysis


    Possibly one of the most well known and best regarded analytic techniques is Richards

    Heuer's analysis of competing hypotheses (ACH), intended to minimize bias and objectively

    weigh reporting against mutually exclusive hypotheses. The eight-step procedure helps analysts

    ensure and display thoroughness, and can be used to demonstrate the objective relevance of

    reporting collected from varying platforms to intelligence managers and decision makers. (Heuer

    1999)

    Once the relevance, source, and quality of information checks have been completed,

    analysts can use any of several techniques to determine mutually exclusive hypotheses on a

    particular intelligence issue or question. ACH compares reporting against these hypotheses in a

    matrix designed to determine consistency or inconsistency with each hypothesis. Such a matrix

    could look like this:


Question: Will Iran retaliate against Israel for Israel's perceived involvement in the 23 July killing of a nuclear scientist in Tehran?

Hypotheses:
H1: Iran will not retaliate
H2: Iran will retaliate only with deniable covert/clandestine/terrorist operations
H3: Iran will retaliate with overt military strikes against Israel

Key: - = Inconsistent; + = Consistent; N/A = Neither consistent nor inconsistent

Evidence                                                            H1    H2    H3
E1: CIA report of Iranian intent to conduct retaliatory
    covert/clandestine attacks against Israel                      -     +     N/A
E2: Israeli liaison report of Iranian intent to conduct
    ballistic missile launches against Israel                      -     N/A   +
E3: Iranian public statements vowing retaliation for the
    killing                                                        -     +     +
E4: NSA report of Iranian military leaders traveling to
    Lebanon for a meeting with Hizballah                           +     +     +
E5: CIA report stating Iranian military leaders believe they
    are unprepared to repel an Israeli air strike                  +     +     -
E6: DIA report stating Iranian Islamic Revolutionary Guard
    Corps personnel were photographing the Israeli embassy
    in Turkey                                                      N/A   +     N/A


    Once completed, the analyst can use the matrix results to support a most likely

    hypothesis, as well as assign relative likelihoods to alternate hypotheses. For our purposes,

    utilizing all available, relevant reporting in such a matrix effectively removes unconscious bias in

    source selection and utilization, and can be used to demonstrate each intelligence collection

    platform and organization's relevance to intelligence managers and decision makers.
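
The scoring step of this process can be sketched in Python. This is an illustration, not part of Heuer's text: following ACH, each hypothesis is scored by how much evidence is inconsistent with it, and the hypothesis with the fewest inconsistencies is the least-disconfirmed candidate. The ratings reproduce the example matrix above, with the evidence descriptions abbreviated.

```python
# Sketch of ACH scoring: hypotheses are ranked by how much evidence is
# INCONSISTENT with them, not by how much is consistent with them.
# Ratings reproduce the example matrix: '-' inconsistent, '+' consistent,
# 'N/A' neither consistent nor inconsistent.
matrix = {
    "E1: CIA report of intent to conduct covert attacks":      {"H1": "-",   "H2": "+",   "H3": "N/A"},
    "E2: Liaison report of intent to launch ballistic missiles": {"H1": "-", "H2": "N/A", "H3": "+"},
    "E3: Iranian public statements vowing retaliation":        {"H1": "-",   "H2": "+",   "H3": "+"},
    "E4: NSA report of military leaders meeting Hizballah":    {"H1": "+",   "H2": "+",   "H3": "+"},
    "E5: CIA report that leaders feel unprepared for strikes": {"H1": "+",   "H2": "+",   "H3": "-"},
    "E6: DIA report of IRGC photographing embassy in Turkey":  {"H1": "N/A", "H2": "+",   "H3": "N/A"},
}

def inconsistency_scores(matrix):
    """Count '-' ratings per hypothesis; lower means less disconfirmed."""
    scores = {}
    for ratings in matrix.values():
        for hypothesis, rating in ratings.items():
            scores[hypothesis] = scores.get(hypothesis, 0) + (rating == "-")
    return scores

scores = inconsistency_scores(matrix)
print(scores)   # {'H1': 3, 'H2': 0, 'H3': 1}
likely = min(scores, key=scores.get)
print(likely)   # H2 -- fewest inconsistencies, the most likely hypothesis
```

In this example the count confirms what the matrix suggests visually: H2 (deniable covert retaliation) has no inconsistent evidence, while H1 is contradicted three times, so H2 emerges as the most likely hypothesis.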

    Conclusion:

Structured analytic techniques can help intelligence analysts avoid the unconscious bias associated with stovepiping in the creation of finished intelligence products, and can help analysts demonstrate, without bias, the relevance of all organizations' reporting to intelligence consumers. While these techniques will not directly remove or decrease deliberate or unconscious bias in decision makers and intelligence managers, consistent, transparently advertised use of these techniques can make such biases more difficult to justify and maintain.


    Works Cited

Bardach, Eugene. Getting Agencies to Work Together. Washington D.C.: Brookings Institution Press, 1998.

Cilluffo, Frank J., Ronald A. Marks, and George C. Salmoiraghi. "The Use and Limits of U.S. Intelligence." The Washington Quarterly, 2002: 61-74.

Defense Intelligence Agency. A Tradecraft Primer: Basic Structured Analytic Techniques. Defense Intelligence Agency, 2009.

George, Roger Z, and James B Bruce. Analyzing Intelligence. Washington D.C.: Georgetown University Press, 2008.

Heuer, Richards J. Psychology of Intelligence Analysis. Washington D.C.: Center for the Study of Intelligence, 1999.

Johnston, Rob. Analytic Culture in the US Intelligence Community. Academic Study, Washington D.C.: Center for the Study of Intelligence, 2005.

Lieberthal, Kenneth. The U.S. Intelligence Community and Foreign Policy. Academic Study, Washington D.C.: The Brookings Institution, 2009.

Lowenthal, Mark M. Intelligence: From Secrets to Policy. Washington D.C.: CQ Press, 2009.

McConnell, Mike. "Overhauling Intelligence." Foreign Affairs 86, no. 4 (2007): 49-58.

Office of the Director of National Intelligence, Office of the Inspector General. Critical Intelligence Community Management Challenges. Washington D.C.: ODNI, 2008.

Peterson, Steven W. US Intelligence Support to Decision Making. Research Paper, Cambridge: Weatherhead Center for International Affairs, 2009.

Wilson, James Q. Bureaucracy. Basic Books, Inc, 1989.