User talk:Hcberkowitz/Analysis

From Wikipedia, the free encyclopedia

Intelligence analysis is the process of producing formal descriptions of situations and entities of strategic importance. Although its practice is found in its purest form inside intelligence agencies, such as the Central Intelligence Agency (CIA) in the United States or the Secret Intelligence Service (SIS, MI6) in the UK, its methods are also applicable in fields such as business intelligence or competitive intelligence.

Intelligence analysis is a way of reducing the ambiguity of highly ambiguous situations, where the ambiguity is often deliberately created by highly intelligent people with mindsets very different from the analyst's. Albert Einstein said "God is subtle, but he is not malicious," but the opponents of intelligence agencies may be monsters in human form, and of demonic cunning. A set of problem-solving talents is therefore essential for analysts. Since the other side may be hiding its intentions, the analyst must be tolerant of ambiguity, of false leads, and of partial information far more fragmentary than that facing the experimental scientist. According to Dick Heuer [1], "The analyst then follows a problem as additional increments of evidence are received and the picture gradually clarifies--as happened with test subjects in the experiment demonstrating that initial exposure to blurred stimuli interferes with accurate perception even after more and better information becomes available...the experiment suggests that an analyst who starts observing a potential problem situation at an early and unclear stage is at a disadvantage as compared with others, such as policymakers, whose first exposure may come at a later stage when more and better information is available.

"The receipt of information in small increments over time also facilitates assimilation of this information into the analyst's existing views. No one item of information may be sufficient to prompt the analyst to change a previous view. The cumulative message inherent in many pieces of information may be significant but is attenuated when this information is not examined as a whole. The Intelligence Community's review of its performance before the 1973 Arab-Israeli War noted [in the only declassified paragraph](Heuer 1999-2).

The problem of incremental analysis--especially as it applies to the current intelligence process--was also at work in the period preceding hostilities. Analysts, according to their own accounts, were often proceeding on the basis of the day's take, hastily comparing it with material received the previous day. They then produced in 'assembly line fashion' items which may have reflected perceptive intuition but which [did not] accrue from a systematic consideration of an accumulated body of integrated evidence.

Writers on analysis [2] [3] have suggested reasons why analysts come to incorrect conclusions by falling into cognitive traps. While avoiding the trap of deferring decisions in the hope of more information, analysts also need to recognize that they can always learn more about the opponent.

This article considers some of the ways in which intelligence analysts produce generally successful analyses. No intelligence analyst is perfect, but the best ones learn from their own mistakes and successes, as well as from the mistakes and experiences of others. Just as generals do not succeed without good privates, even fully ethical and effective top authorities need an environment that encourages the junior analyst, and encourages growth in what is a profession, but sometimes seemingly a black art, where intuition must be cherished.

Probability can be seductive when used as anything but a tool. Many analysts prefer the middle-of-the-road explanation, rejecting high- or low-probability explanations. Analysts may apply their own standard of proportionality to the opponent's risk acceptance, rejecting the possibility that the opponent may take an extreme risk to achieve what the analyst regards as a minor gain. Above all, the analyst must avoid the cognitive trap of projecting what she or he wants the opponent to think, and of using available information to justify that conclusion.

Be Bold and Honest[edit]

Wikipedia's motto of "be bold" applies to intelligence analysis. While a good analyst must be able to consider alternative viewpoints thoughtfully, an analyst must be willing to stand by his or her position. This is especially important in specialized areas, where the analyst may be the only one who reads every field report and every technical observation on a subject. In another similarity to Wikipedia, (Watanabe) advises not to take the editing process too seriously: "[When] the changes do alter the meaning, however, do not be afraid to speak up and contest the changes."

"Believe in your own professional judgments. Always be willing to listen to alternative conclusions or other points of view, but stand your ground if you really believe the intelligence supports a certain conclusion. Just because someone is your boss, is a higher grade, or has been around longer than you does not mean he or she knows more about your account than you do. You are the one who reads the traffic every day and who studies the issue.

At the same time, Watanabe observes, "It is better to be mistaken than wrong...." Not willing to be wrong is also a disease of the highest policymaker levels, and why there needs to be a delicately balanced relationship, built of trust, between a policymaker and his closest intelligence advisors. He was once "responsible for evaluating foreign export control systems to determine if they could protect sensitive Western technology. I was convinced that one of the countries I was studying was not able to protect sensitive technologies... and I had written my intelligence assessments accordingly.... [After direct observation in the country] I was surprised to find that it was far more secure than I had believed, and I reversed my earlier assessments of its unreliability. Had I stuck to my original analysis, I would have been wrong."

"Being an intelligence analyst is not a popularity contest.... But your job is to pursue the truth. I recall a colleague who forwarded an analysis that called into question the wisdom behind several new US weapon systems. This analysis caused criticism of the CIA, of his office, and of himself. He stood his ground, however; the Agency supported him, and eventually he was proved right. He did not make a lot of friends, but he did his job.[4]

Intelligence analysts are expected to give policymakers both support and reality checks [5]. The most effective products address several common elements:

  • Opportunities and dangers for interests of the analyst's country, especially unexpected developments that may require a reaction.
  • Motives, objectives, strengths, and vulnerabilities of adversaries, allies, and other actors.
  • Direct and indirect sources of friendly parties' leverage on foreign players and issues.
  • Tactical alternatives for advancing stated national policy goals.

Reality checking is not to be underestimated. In WWII, the Allies launched an air offensive against a target system that they really did not understand: the V-1 cruise missile. Their rationale for attacking it was that if the enemy apparently valued it, it must be worth attacking [6]. This may have been rational when aircraft and pilots were plentiful, but it might not be warranted today, at least until analysts have had a chance to verify that the target system is not a decoy. If the system is real, it might be warranted to defer attack until a massive one can be delivered.

Getting Started[edit]

Many new analysts find that getting started is the hardest part of their job. Experienced analysts recommend starting with understanding the consumers, their needs, and how one's work complements them.

Recognize one's own styles of analysis, and those of analysts that one respects, and go with natural inclinations that prove useful. Sometimes, it is useful to begin with a very rough outline of the product. Other analysts find it useful to spread out hard copy documents, or at least computer files, and start trying to see patterns. When doing this with computers, a basic knowledge of Web page design is a start; HTML can be a very productive way to cross-reference.

It is well to be able to write a few sentences about what the consumers want, even if those desires are phrased more emotionally than rationally.

Agreement on Content[edit]

In a good working relationship, the analytic process is interactive with the customer. For example, the first IMINT of Soviet missiles during the Cuban Missile Crisis was verified and quickly taken to the President and Secretary of Defense. The highest level of authority immediately requested more detail, but also wanted a perspective on the Soviet strategy, which was not available from photography. As the White House requested more CIA and Navy support for photography, it simultaneously searched for HUMINT and SIGINT from Cuba, as well as diplomatic HUMINT. Until John F. Kennedy was briefed by excellent briefers, he probably did not understand the capabilities of IMINT [7].

Frequently, the intelligence service will organize the production process and its output to mirror the customer organization. Government production by the single-source intelligence agencies is largely organized geographically or topically, to meet the needs of all-source country, region, or topic analysts in the finished-intelligence producing agencies.

In terms of intended use by the customer, both business and government producers may generate intelligence to be applied in the current, estimative, operational, research, science and technology, or warning context. Serendipity plays a role here, because the collected and analyzed information may meet any or all of these criteria.

A good example is warning intelligence. Military and political analysts are always alert for target indications that an emergency, such as outbreak of war, or a political coup, is imminent. Standing procedures dictate that routine operations switch to warning mode in this case, so that time-sensitive intelligence on the situation can be issued to all relevant customers.

Orienting oneself to the Consumers[edit]

Experienced analysts recommend seeing oneself as a specialist on a team, with 5-10 key players. Learn something about each of them, both in terms of how they express themselves, and how you can reinforce their strengths and support their weaknesses. The analyst must constantly ask himself, "what do they want/need to know? How do they prefer to have it presented? Are they still trying to select the best course of action, or have they committed and now need to know the obstacles and vulnerabilities on their chosen path?"

Others on the team may know how to handle the likely challenges. The analyst's contribution is in recognizing the unlikely, or providing connections that are not obvious. Consumers must get information in a timely manner, not after they have committed to a decision they might not have made had even rough information been available sooner.

Sometimes, when the producer is struggling with how to meet the needs of both internal and external customers, the solution is to create two different types of products, one for each type of customer. An internal product might contain detail of sources, collection methods, and analytic techniques, while an external product is more like journalism. Remember that journalists always address:

  • Who
  • What
  • When
  • Where
  • Why

"How" is often relevant to journalists, but, in intelligence, may wander into that delicate area of sources and methods, appropriate only for internal audiences. The external consumer needs to know more of potential actions. Actions exist in three phases:

  1. The decision to act
  2. The action
  3. Disengagement from the action [8]

Internal products contain details about the sources and methods used to generate the intelligence, while external products emphasize actionable target information. Similarly, the producer adjusts the product content and tone to the customer’s level of expertise.

Orienting Yourself to Peers[edit]

Even in professional sports, where there are strict anti-fraternization rules on the playing field, players often have deep friendships with counterparts on opposing teams. They might have been on a college team together, or are simply aware that the team they oppose today might be the team to which they might be traded tomorrow. If a technique is personal, rather than a proprietary idea of a coach, one professional might be quite willing to show a nominal opponent how he does some maneuver.

"If you are examining a problem and there is no intelligence available, or the available intelligence is insufficient, be aggressive in pursuing collection and in energizing collectors. ... As an analyst, you have the advantage of knowing both what the consumer needs to know (sometimes better than the consumer knows himself) and which collectors can obtain the needed intelligence.

"Aggressively pursue collection of information you need. In the Intelligence Community, we have the unique ability to bring substantial collection resources to bear in order to collect information on important issues. An analyst needs to understand the general capabilities and limitations of collection systems. ...If the analyst is in a technical discipline, the analyst might have an insight about a collection system that the operators have not considered. (Watanabe) observes "If you are not frequently tasking collectors and giving them feedback on their reporting, you are failing to do an important part of your job."

Your peers, both consumers and analysts, also have a psychological context. Johnston [9] suggests the three major components of that context are:

  1. socialization
  2. degree of risk taking or risk aversion
  3. organizational-historical context

(Devlin 2005) observes that while traditional logical work does not consider socialization, work on extending logic into the real world of intelligence requires it. "The first thing to note, and this is crucial, is that the process by which an agent attaches meaning to a symbol always takes place in a context, indeed generally several contexts, and is always dependent on those contexts. An analytic study of the way that people interpret symbols comes down to an investigation of the mechanism captured by the diagram:

[agent] + [symbol] + [context] + ... + [context] => [interpretation]
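Devlin's diagram can be illustrated with a minimal sketch, in which the same symbol yields different interpretations depending on the stack of contexts in force. The agent, symbol, and readings below are all hypothetical, chosen only to show the mechanism:

```python
# Minimal sketch of context-dependent interpretation (hypothetical data):
# the same symbol maps to different meanings under different contexts.

def interpret(agent, symbol, contexts):
    """Resolve a symbol by searching the agent's context-specific
    lexicons, innermost (most recently entered) context first."""
    for context in reversed(contexts):
        lexicon = agent.get(context, {})
        if symbol in lexicon:
            return lexicon[symbol]
    return "unresolved"

# A hypothetical analyst whose reading of "exercise" shifts with context.
analyst = {
    "peacetime": {"exercise": "routine training"},
    "crisis":    {"exercise": "possible cover for mobilization"},
}

print(interpret(analyst, "exercise", ["peacetime"]))
print(interpret(analyst, "exercise", ["peacetime", "crisis"]))
```

The point of the sketch is that interpretation is never a property of the symbol alone; remove or reorder the contexts and the meaning changes.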

Things that are true about contexts include:

  1. Contexts are pervasive
  2. Contexts are primary
  3. Contexts perpetuate
  4. Contexts proliferate
  5. Contexts are potentially pernicious

The discipline of critical discourse analysis can help organize the context. Michael Crichton [10], in giving examples of physicians communicating with other physicians, points out that laymen have trouble following such discourses not only because specialized vocabulary is in use, but because the discourse takes place in an extremely high context. One physician may ask a question about some diagnostic test, and the other will respond with a result from an apparently unrelated test. The shared context was that the first test looked for evidence of a specific disease, while the answer cited a test result that ruled out the disease. The disease itself was never named but, in the trained context, was perfectly obvious to the participants in the discourse.

Intelligence analysis is also extremely high context. Whether the subject is political behavior or weapons capabilities, the analysts and consumers share a great deal of context. Intelligence consumers express great frustration with generic papers that waste their time by giving them context they already have internalized.

Organizing What You Have[edit]

Collection processes provide analysts with assorted bits of information, some critical and some irrelevant, some true and some false (with many shades in between), and some requiring further preprocessing before they can be used in analysis. Collation is the process of organizing raw data: interpolating known data, evaluating the value of data, and putting working hypotheses in place.

The simplest approaches often are an excellent start. With due regard for protecting documents and information, a great deal can be done with pieces of paper, a whiteboard, a table, and perhaps a corkboard. Maps often are vital adjuncts, maps that can be written upon.

There are automated equivalents of all of these functions, and each analyst will have a personal balance between manual and machine-assisted methods. Unquestionably, when quantitative methods such as modeling and simulation are appropriate, the analyst will want computer assistance, and possibly consultation from experts in methodology. When combining maps and imagery, especially different kinds of imagery, a geographic information system is usually needed to normalize coordinate systems, scale and magnification, and the ability to suppress certain details and add others.

When trying to understand individual motivation, consider tools such as the Myers-Briggs Type Indicator (MBTI) and its variants. With due regard to the danger of oversimplification, one may map personalities not against a simple left-right political spectrum, but perhaps with the dimensions of economic freedom versus economic planning, and individual rights versus societal duties.

Outlining, possibly in a word processing program, or using visualization tools such as mind maps can give structure, as can file folders and index cards. Data bases, with statistical techniques such as correlation, factor analysis, and time series analysis can give insight.
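As a minimal illustration of one such statistical technique, the sketch below computes a Pearson correlation between two hypothetical weekly activity counts; the series and their meanings are invented for illustration, and real analytic tooling would of course go much further:

```python
# Sketch: Pearson correlation between two hypothetical weekly event counts,
# e.g. intercepted transmissions vs. observed vehicle movements.

from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

transmissions = [3, 7, 2, 9, 4, 8]   # hypothetical counts per week
movements     = [2, 6, 1, 8, 3, 7]   # the same pattern, offset by one

r = pearson(transmissions, movements)
print(round(r, 3))  # → 1.0: the two series move in lockstep
```

A coefficient near +1 or -1 flags a relationship worth a hypothesis; it does not, by itself, establish causality, which is exactly where the analyst's judgment comes in.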

Mind-map showing a wide range of nonhierarchical relationships

Some analysts speak of a Zen-like state in which they allow the data to "speak" to them. Others may meditate, or even seek insight in dreams, hoping for an insight such as that given to August Kekulé in a daydream that resolved one of the fundamental structural problems of organic chemistry.

(Krizan) took criteria from [11]. Regardless of its form or setting, an effective collation method will have the following attributes:

  1. Be impersonal. It should not depend on the memory of one analyst; another person knowledgeable in the subject should be able to carry out the operation.
  2. Not become the “master” of the analyst or an end in itself.
  3. Be free of bias in integrating the information.
  4. Be receptive to new data without extensive alteration of the collating criterion.

Semantic maps are related to mind maps, but are more amenable to computer discovery of relationships.

Semantic network; compare its formalism to that of a mind map
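Because a semantic network is explicit subject-relation-object structure rather than free-form drawing, a computer can traverse it. A minimal sketch, with entirely hypothetical entities and relations:

```python
# Sketch of a semantic network as subject-relation-object triples
# (all entities hypothetical), with a simple neighborhood query.

triples = [
    ("unit-12",   "subordinate-to", "brigade-3"),
    ("brigade-3", "garrisoned-at",  "base-north"),
    ("unit-12",   "equipped-with",  "sam-system"),
]

def related(entity, triples):
    """Return every (relation, other-entity) pair touching the entity,
    whether it appears as subject or object."""
    out = []
    for s, r, o in triples:
        if s == entity:
            out.append((r, o))
        elif o == entity:
            out.append((r, s))
    return out

print(related("unit-12", triples))
```

This machine-traversable form is what makes relationship discovery (for instance, chaining from a unit to its garrison through its parent brigade) amenable to automation in a way a hand-drawn mind map is not.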

The more interactive the relationship between producer and consumer becomes, the more important these tools become [12]:

  • Collaboration tools. These include all media: voice, video, instant messaging, electronic whiteboards, and shared document markup
  • Databases. Not only will these need to be interoperable, they need to reflect different models, when appropriate, such as the semantic web. There may no longer be a clear line between databases and web applications.
  • Analytic tools. These will cover a wide range of pattern recognition and knowledge organization

The Nature of Analysis[edit]

An analysis is not a nicely arranged scrapbook of raw data. It should have a summary of the key characteristics of the topic, followed by the key variables and choices. Increasingly deep analysis can explain the internal dynamics of the matter being studied, and eventually lead to prediction, known as estimation.

The purpose of intelligence analysis is to reveal to a specific decisionmaker the underlying significance of selected target information. Analysts should begin with confirmed facts, apply expert knowledge to produce plausible but less certain findings, and even forecast, when the forecast is appropriately qualified. Analysts should not, however, engage in fortunetelling that has no basis in fact.

Food chain in intelligence analysis: the bigger the fish, the more unlikely it is

The mnemonic “Four Fs Minus One” may serve as a reminder of how to apply this criterion. Whenever the intelligence information allows, and the customer’s validated needs demand it, the intelligence analyst will extend the thought process as far along the Food Chain as possible, to the third “F” but not beyond to the fourth.

Types of Reasoning[edit]

Objectivity is the intelligence analyst’s primary asset in creating intelligence that meets the Four Fs Minus One criterion. To produce intelligence objectively, the analyst must employ a process tailored to the nature of the problem. Four basic types of reasoning apply to intelligence analysis: induction, deduction, abduction and the scientific method.

Induction: Seeking Causality[edit]

The induction process is one of discovering relationships among the phenomena under study. It may come from human pattern recognition ability, looking at a seemingly random set of events, perhaps writing them on cards and shuffling them until a pattern emerges.

An analyst might notice that when Country X's command post with call sign ABC sent out a message on frequency 1 between Thursday and Saturday, an air unit will move to a training range within one week. The acknowledgement will take one day, so the analyst should recommend intensified COMINT monitoring of the appropriate frequencies between Friday and Sunday. Another kind of causality could come from interviews, in which soldiers might describe the things that warn them of an impending attack, or how the ground might look when an improvised explosive device has been emplaced.

While induction, for human beings, usually does not operate at a fully rational level, do not discount the potential role of software that uses statistical or logical techniques for finding patterns. Induction is subtly different from intuition: there usually is a pattern that induction recognizes, and this pattern may be applicable to other situations.
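The kind of pattern described above, where a message on a given frequency is followed within a week by a unit movement, is exactly what simple counting software can surface. The event log below is entirely hypothetical, a sketch of the technique rather than any real system:

```python
# Sketch of machine-assisted induction over a hypothetical event log:
# count how often a candidate antecedent event is followed by the
# consequent event within a given window of days.

events = [  # (day, event) pairs, days numbered from an arbitrary start
    (1,  "msg-freq1"), (4,  "unit-moves"),
    (10, "msg-freq1"), (15, "unit-moves"),
    (20, "other"),
    (30, "msg-freq1"), (33, "unit-moves"),
]

def followed_within(events, antecedent, consequent, window):
    """Return (hits, total): of `total` antecedent occurrences, how many
    were followed by the consequent within `window` days."""
    hits = total = 0
    for day, ev in events:
        if ev == antecedent:
            total += 1
            if any(e == consequent and day < d <= day + window
                   for d, e in events):
                hits += 1
    return hits, total

hits, total = followed_within(events, "msg-freq1", "unit-moves", 7)
print(f"{hits}/{total} messages followed by a move within a week")
```

A 3-of-3 hit rate over so few events proves nothing by itself, but it is the kind of candidate regularity an analyst would then test deductively against more data.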

Deduction: applying the General[edit]

Deduction is the classic process of reasoning from the general to the specific, a process made memorable by Sherlock Holmes: "How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth?" Deduction can be used to validate a hypothesis by working from premises to conclusion.

The pattern of air maneuvers described above may be a general pattern, or it may be purely General X's personal command style. Analysts need to look at variables, such as personalities, to learn whether a pattern is truly general doctrine, or simply idiosyncratic.
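Holmes's dictum of eliminating the impossible can be sketched as constraint filtering over a set of hypotheses. The hypotheses, their requirements, and the established facts below are all invented for illustration:

```python
# Sketch: deduction as elimination -- strike every hypothesis that
# contradicts an established fact, and see what remains.

hypotheses = {
    "training exercise": {"requires_range": True,  "requires_fuel_surge": False},
    "unit rotation":     {"requires_range": False, "requires_fuel_surge": False},
    "offensive buildup": {"requires_range": False, "requires_fuel_surge": True},
}

# Established facts (hypothetical): no training-range activity observed,
# but a fuel consumption surge was confirmed.
facts = {"requires_range": False, "requires_fuel_surge": True}

surviving = [
    name for name, needs in hypotheses.items()
    if all(needs[k] == v for k, v in facts.items())
]
print(surviving)
```

Note that the filter is only as good as its premises: if "offensive buildup" did not really require a fuel surge, the elimination would be unsound, which is why analysts must test whether a premise is general doctrine or merely one commander's habit.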

Not all intelligence officers regard this as a desirable approach. At his confirmation hearing for CIA Director, Gen. Michael V. Hayden said he believes that intelligence analysis should be done by "induction," under which "all the data" are gathered and general conclusions determined, rather than by "deduction," under which one has a conclusion and seeks out the data that support it [13].

Trained Intuition[edit]

Analysts need to harness trained intuition: the recognition that one has come to a spontaneous insight. The steps leading there may not be apparent, although it is well to validate the intuition with the facts and tools that are available.

Polish cryptanalysts first read German Enigma ciphers in 1932, although the commercial version may have been broken by the British cryptanalyst Dillwyn Knox in the 1920s. Poland gave critical information to the French and British in 1939, and British production cryptanalysis was well underway in 1940. The Enigma, with German military enhancements, was quite powerful for a mechanical encryption device, and it might not have been broken as easily had the Germans been more careful about operating procedures. Throughout the war, Germany introduced enhancements, but never realized the British were reading the traffic almost as fast as the Germans themselves.

US cryptanalysts had broken several Japanese diplomatic ciphers, but, without ever seeing the Purple machine until after the war, they deduced the logic. Purple was actually mechanically simpler than Enigma, but the US Army team struggled with a mechanical reproduction until Leo Rosen had the unexplained insight that the critical building block in the Purple machine was a telephone-type stepping switch rather than the rotor used in Enigma and in more advanced US and UK machines. Rosen, Frank Rowlett, and others of the team recognized Rosen's insight as based on nothing but a communication engineer's intuition.

Experienced analysts, and sometimes less experienced ones, will have an intuition about some improbable event in a target country, and will collect more data, and perhaps send out collection requests within his or her authority. These intuitions are useful just often enough that wise managers of analysts, unless the situation is absolutely critical, allow them a certain amount of freedom to explore.

Scientific Method[edit]

Astronomers and nuclear physicists, at opposite ends of the continuum from macroscopic to microscopic, share a method: they cannot directly access the phenomena of interest, so they infer behavior consistent with a hypothesis by measuring phenomena that can be measured and that the hypothesis suggests will be affected by the mechanism of interest. Other scientists may be able to set up direct experiments, as in chemistry or biology. If the experimental results match the expected outcome, the hypothesis is validated; if not, the analyst must develop a new hypothesis and appropriate experimental methods.

In intelligence analysis, the analyst rarely has direct access to the observable subject, but gathers information indirectly. From these gathered data, the analyst may proceed with the scientific method by generating tentative explanations for a subject event or phenomenon. Next, each hypothesis is examined for plausibility and compared against newly acquired information, in a continual process toward reaching a conclusion. Often the intelligence analyst tests several hypotheses at the same time, whereas the scientist usually focuses on one at a time. Furthermore, intelligence analysts cannot usually experiment directly upon the subject matter as in science, but must generate fictional scenarios and rigorously test them through methods of analysis suggested below.
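The parallel testing of several hypotheses as evidence arrives can be sketched as an iterative Bayesian update. The hypotheses, priors, and likelihoods below are purely illustrative numbers, not a claim about any real assessment:

```python
# Sketch: weighing several hypotheses at once as evidence arrives,
# via Bayes' rule. All priors and likelihoods are illustrative.

def update(priors, likelihoods):
    """One Bayesian update: P(H|E) is proportional to P(E|H) * P(H),
    renormalized so the beliefs sum to 1."""
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

beliefs = {"exercise": 0.5, "buildup": 0.3, "deception": 0.2}

# New evidence: reserve units called up. How likely is that observation
# under each hypothesis? (Illustrative likelihoods.)
beliefs = update(beliefs, {"exercise": 0.2, "buildup": 0.7, "deception": 0.5})

best = max(beliefs, key=beliefs.get)
print(best, round(beliefs[best], 2))
```

Each new report triggers another `update` call, so no hypothesis is tested in isolation; a hypothesis is dropped only when the accumulated evidence, not a single item, has driven its belief toward zero.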

Methods of Analysis[edit]

As opposed to types of reasoning, which are ways the analyst drafts the product, the following methods are ways of validating the analyst's results of reasoning.

Opportunity Analysis[edit]

Opportunity analysis identifies for policy officials opportunities or vulnerabilities that the customer’s organization can exploit to advance a policy, as well as dangers that could undermine a policy. Lawyers apply the test cui bono (who benefits?) in a rather similar way.

To make the best use of opportunity analysis, there needs to be a set of objectives for one's own country, preferably with some flexibility to them. The next step is to examine personalities and groups in that target country to see if there are any with a commonality of interest. Even though the different sides might want the same thing, it is entirely possible that one or the other might have deal-breaking conditions. If that is the case, then ways to smooth that conflict need to be identified, or no more work should be spent on that alternative.

Conversely, if there are elements that would be utterly opposed to the objectives of one's side, ways of neutralizing those elements need to be explored. They may have vulnerabilities that could render them impotent, or there may be a reward, not a shared opportunity, that would make them cooperate.

Linchpin Analysis[edit]

Linchpin analysis proceeds from information that is certain, or has a high probability of being certain. In mathematics and physics, a similar problem formulation, which constrains the solution by certain known or impossible conditions, is the boundary value problem.

By starting from knowns (and impossibilities), the analyst has a powerful technique for showing consumers, peers, and managers that a problem has been both thoroughly studied and constrained to reality [14]. Linchpin analysis was introduced to the CIA by Deputy Director for Intelligence (1993-1996) Doug MacEachin, as one of the "muscular" terms he pressed as an alternative to academic language, which was unpopular with many analysts. He substituted "linchpins" for the hypotheses driving key variables, and required that the linchpins be explicit, so that policymakers could be aware of both coverage and any changes in assumptions.

This method is an "anchoring tool" that seeks to reduce the hazard of self-inflicted intelligence error as well as policymaker misinterpretation. It forces use of the checkpoints listed below, to be used when drafting reports:

  1. Identify the main uncertain factors or key variables judged likely to drive the outcome of the issue, forcing systematic attention to the range of and relationships among factors at play.
  2. Determine the linchpin premises or working assumptions about the drivers. This encourages testing of the key subordinate judgments that hold the estimative conclusion together.
  3. Marshal findings and reasoning in defense of the linchpins, as the premises that warrant the conclusion are subject to debate as well as error.
  4. Address the circumstances under which unexpected developments could occur. What indicators or patterns of development could emerge to signal that the linchpins were unreliable? And what triggers or dramatic internal and external events could reverse the expected momentum?


Analysis of Competing Hypotheses[edit]

Dick Heuer spent years in the CIA's Directorate of Operations (DO) and Directorate of Intelligence (DI), and worked on the methodology of analysis both in his later years there and after retirement. Some of his key conclusions, drawn from both experience and an academic background in philosophy, include:

  1. The mind is poorly "wired" to deal effectively with both inherent uncertainty (the natural fog surrounding complex, indeterminate intelligence issues) and induced uncertainty (the man-made fog fabricated by denial and deception operations).
  2. Even increased awareness of cognitive and other "unmotivated" biases, such as the tendency to see information confirming an already-held judgment more vividly than one sees "disconfirming" information, does little by itself to help analysts deal effectively with uncertainty.
  3. Tools and techniques that gear the analyst's mind to apply higher levels of critical thinking can substantially improve analysis on complex issues on which information is incomplete, ambiguous, and often deliberately distorted. Key examples of such intellectual devices include techniques for structuring information, challenging assumptions, and exploring alternative interpretations.

In 1980, he wrote an article, "Perception: Why Can't We See What Is There to Be Seen?", which suggests to (Davis 1999) that Heuer's ideas were compatible with linchpin analysis. Given the difficulties inherent in the human processing of complex information, a prudent management system should:

  1. Encourage products that (a) clearly delineate their assumptions and chains of inference and (b) specify the degree and source of the uncertainty involved in the conclusions.
  2. Emphasize procedures that expose and elaborate alternative points of view--analytic debates, devil's advocates, interdisciplinary brainstorming, competitive analysis, intra-office peer review of production, and elicitation of outside expertise.

According to Heuer, analysts construct a reality based on objective information, filtered through complex mental processes that determine which information is attended to, how it is organized, and the meaning attributed to it. What people perceive, how readily they perceive it, and how they process this information after receiving it are all strongly influenced by past experience, education, cultural values, role requirements, and organizational norms, as well as by the specifics of the information received. To understand how the analysis results, one must use good mental models to create the work, and understand those models when evaluating it. Analysts need to be comfortable with continual challenge and refinement. To return to linchpin analysis, the boundary conditions give places to challenge and test, reducing ambiguity.

More challenge, according to Heuer, is more important than more information. He wanted better analysis applied to less information, rather than the reverse. Given the immense volumes of information that modern collection systems produce, the mind is the limiting factor. Mirror-imaging is one of Heuer's favorite examples of a cognitive trap, in which the analyst substitutes his own mindset for that of the target. "To see the options faced by foreign leaders as these leaders see them," according to Heuer, "one must understand [the foreign leaders'] values and assumptions and even their misperceptions and misunderstandings. ... Too frequently, foreign behavior appears 'irrational' or 'not in their own best interest.'" Projecting American values created models that were inappropriate for the foreign leader, be that Saddam Hussein, Nelson Mandela, or Margaret Thatcher.

Heuer's answer was to make the challenge of "Analysis of Competing Hypotheses" (ACH) the core of analysis. In ACH, hypotheses about the foreign leader's assumptions compete against one another, which reduces mirror-imaging even if it does not produce the precise answer. The best use of information, in this context, is to challenge the assumption the analyst likes best.

One of the key motivations for ACH, according to Heuer, is to avoid rejecting deception out of hand, because the situation looks straightforward. Heuer observed that good deception looks real. "Rejecting a plausible but unproven hypothesis too early tends to bias the subsequent analysis, because one does not then look for the evidence that might support it. The possibility of deception should not be rejected until it is disproved or, at least, until a systematic search for evidence has been made and none has been found."

The steps in ACH are[15]:

  1. Identify the possible hypotheses to be considered. Use a group of analysts with different perspectives to brainstorm the possibilities.
  2. Make a list of significant evidence and arguments for and against each hypothesis.
  3. Prepare a matrix with hypotheses across the top and evidence down the side. Analyze the "diagnosticity" of the evidence and arguments--that is, identify which items are most helpful in judging the relative likelihood of the hypotheses.
  4. Refine the matrix. Reconsider the hypotheses and delete evidence and arguments that have no diagnostic value.
  5. Draw tentative conclusions about the relative likelihood of each hypothesis. Proceed by trying to disprove the hypotheses rather than prove them.
  6. Analyze how sensitive your conclusion is to a few critical items of evidence. Consider the consequences for your analysis if that evidence were wrong, misleading, or subject to a different interpretation.
  7. Report conclusions. Discuss the relative likelihood of all the hypotheses, not just the most likely one.
  8. Identify milestones for future observation that may indicate events are taking a different course than expected.
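The matrix logic of steps 3 through 5 can be sketched in a few lines of code. This is a minimal illustration only: the hypotheses, evidence items, and consistency ratings below are invented for the example, and real ACH tools weight and source-qualify evidence rather than merely counting inconsistencies.

```python
# Sketch of an ACH matrix (steps 3-5): hypotheses across the top,
# evidence down the side, rated for consistency with each hypothesis.
# Ratings: "C" = consistent, "I" = inconsistent, "N" = neutral.
# All entries here are hypothetical, for illustration only.
matrix = {
    "E1: troop movements observed": {"H1: exercise": "C", "H2: attack": "C"},
    "E2: reserves not mobilized":   {"H1: exercise": "C", "H2: attack": "I"},
    "E3: leadership denies intent": {"H1: exercise": "N", "H2: attack": "N"},
}

def diagnosticity(ratings):
    """Evidence is diagnostic only if it rates the hypotheses differently."""
    return len(set(ratings.values())) > 1

# Step 4: refine the matrix by dropping evidence with no diagnostic value
# (items that rate every hypothesis alike, like E1 and E3 above).
refined = {e: r for e, r in matrix.items() if diagnosticity(r)}

# Step 5: rank hypotheses by how much evidence is INCONSISTENT with them.
# Following Heuer's emphasis on disproof, the hypothesis with the least
# inconsistent evidence is tentatively the most likely.
inconsistency = {}
for ratings in refined.values():
    for h, score in ratings.items():
        inconsistency[h] = inconsistency.get(h, 0) + (score == "I")

for h in sorted(inconsistency, key=inconsistency.get):
    print(h, "- inconsistent items:", inconsistency[h])
```

Note that the ranking seeks disconfirmation, not confirmation: evidence consistent with several hypotheses (E1 above) says nothing about which is true, which is exactly the "diagnosticity" point of step 3.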

Keith Devlin has been researching the use of mathematics and formal logic in implementing Heuer's ACH paradigm.[16]

Analogy[edit]

If one side has a goal that simply has not been of significant concern to the other, there may be things, not threatening to one's own side, that the other perceives as valuable and that could be traded. Such a trade would be an example of an analogy for policies that are not critical for either side.

Analogy is common in technical analysis, but engineering characteristics seeming alike do not necessarily mean that the other side has the same employment doctrine for an otherwise similar thing. Sometimes, the analogy was valid for a time, such as the MiG-25 aircraft being designed as a Soviet counter to the perceived threat of the high-altitude, supersonic B-70 bomber. The Soviets could have cancelled the MiG-25 program when the US changed doctrines to low altitude penetration and cancelled the B-70 program, but they continued building the MiG-25.

One of the Soviet variants was a high-speed, high-altitude reconnaissance aircraft (MiG-25R), which, for a time, was thought comparable to the US SR-71. Several additional points of data, however, showed that the analogy between the SR-71 and the MiG-25R did not hold. HUMINT revealed that a single Mach 3.2 flight of the MiG wrecked its engines beyond hope of repair, and the cost of replacement was prohibitive unless there was no other way to get the information. The SR-71, however, could make repeated flights with the same engines. The dissimilarity in engine life was not only expensive, but meant that the MiG-25R could operate only from bases with the capability to change engines.

The US had applied "reverse engineering" to the MiG, essentially asking "if we had an aircraft with such capabilities, what would we do with it?" In the fighter-interceptor role, however, the US gives the pilot considerable flexibility in tactics, where the Soviets had a doctrine of tight ground control. The aircraft was too inflexible for American fighter tactics, but made sense for the Soviets as an interceptor that could make one pass at a penetrating bomber, using an extremely powerful radar to burn through jamming for final targeting.

Many of these assumptions fell apart after Viktor Belenko flew his MiG-25 to the West, where TECHINT analysts could examine the aircraft, and doctrinal specialists could interview Belenko [17].

Cognitive Traps[edit]

A host of traps lie between an analyst and clear thinking. They may be facets of the analyst's own personality, or of the analyst's organizational culture. They may involve difficulty in understanding the foreign mindset, when that perspective is significantly different.

Oversimplifying into individual or group comfort zones[edit]

It is a human tendency to stay within one's "comfort zone", where established patterns avoid harm. Evoked-set reasoning is the tendency to have a dominant threat in one's thinking, and, when new information arrives, bend, fold, file or otherwise modify the new data until it fits with the common thread. Another aspect of the same tendency is to see superficial similarities to historical events, when serious analysis would show they are quite different.

There can be a desire to finish an unpleasant task, for individual reasons, or perhaps to avoid being perceived as slowing the group. In that case, prematurely formed views are those that will "get along" with the common belief system, without being tested to see if they also "get along" with an accurate interpretation of the situation. Analysts may be concerned about being perceived as less than expert, by the group or even in their self-image, and so be unwilling to examine variants of what has been, is, will be, or should be. When challenged, the analyst may regard a legitimate question as a personal attack, rather than looking beyond ego to the merits of the question.

Target fixation, in which a pilot, so intent on delivering a bomb to a target, forgets to fly the airplane and crashes into it, reflects a more basic human tendency than many realize. Analysts may fixate on one hypothesis, looking only at evidence consistent with a pre-formed view and ignoring other relevant views. A desire for rapid closure is another form of idea fixation.

Analogies can be extremely useful, but, when an analogy is forced and full of assumptions of cultural or contextual equivalence, it becomes yet another cognitive trap, especially when the analyst is unaware of differences between his own context and that of others. It can be hard to admit that one does not know something. It is even harder to deal with a situation when one is unaware of lacking critical knowledge. Ignorance can be a lack of study, an inability to mesh new facts with old, or a simple denial of conflicting facts.

Organizational Culture[edit]

Even though an individual may be a creative thinker, the analyst's organization may not support productivity. Managers especially concerned with appearances are often at the root of suppressing creative conflict and staying with stereotypes. Stereotyping can relate to stovepiping, where a group, especially one invested in a collection technology, may ignore valid information from other sources (functional specialization). It was a Soviet tendency to value HUMINT from espionage beyond all other sources, such that OSINT had to develop outside the state intelligence organization, in the USA Institute (later USA-Canada Institute) of the Soviet Academy of Sciences.[18]

Another problem of specialization may result from security compartmentation. If an analytic team has unique access to one source, it may overemphasize that source's significance. This can be a serious problem with long-term HUMINT relationships, where the partners develop personal bonds.

Just as analysts can reject evidence that contradicts prior judgments, the same phenomenon can capture groups. There is a very delicate balance between thoughtful use of deliberately contrarian "red teams" and politicized use of ideologues who want support for only one policy. The latter has recently been called stovepiping, not of intelligence collection disciplines but of information flow.

The Other Culture[edit]

There are many levels at which one can misunderstand another culture, be it of an organization or a country. One key trap is the rational-actor hypothesis, which ascribes "rational" behavior to the other side, but with the definition of rationality coming from one's own culture. The social anthropologist Edward T. Hall illustrated one such conflict [19] with an example from the American Southwest. Drivers from the "Anglo" culture became infuriated when "Hispanic" traffic police would cite them for going one mile per hour above the speed limit, but then saw "Hispanic" judges dismiss the charges. "Hispanic" drivers were convinced that the "Anglo" judges were unfair because they would not dismiss charges due to circumstances.

Both cultures were rational, but in different ways, pertaining to the enforcement of laws and the adjudication of charges. Both believed that one of the two had to be flexible and the other formal. In Anglo culture, police had discretion about speeding tickets, while the court was expected to stay within the letter of the law. In Hispanic culture, the police were expected to be strict, but the courts would balance the situation. There was a fundamental lack of empathy: each side was ethnocentric and assumed the other culture was its mirror image. Denial of rationality, in the traffic example, happened in both cultures, yet each culture was acting rationally within its own value set.

In a subsequent interview, Hall spoke widely about intercultural communication [20]. He summed up years of study in the simple statement "I spent years trying to figure out how to select people to go overseas. This is the secret. You have to know how to make a friend. And that is it!"

To make a friend, one has to understand the culture of the potential friend, one's own culture, and how things rational in one may not translate to the other. Key questions are

  • "What is culture?"
  • "How can an individual be unique within a culture?"

Hall said "If we can get away from theoretical paradigms and focus more on what is really going on with people, we will be doing well. I have two models that I used originally. One is the linguistics model, that is, descriptive linguistics. And the other one is animal behavior. Both involve paying close attention to what is happening right under our nose. There is no way to get answers unless you immerse yourself in a situation and pay close attention. From this, the validity and integrity of patterns is experienced. In other words, the pattern can live and become a part of you.

"The main thing that marks my methodology is that I really do use myself as a control. I pay very close attention to myself, my feelings because then I have a base. And it is not intellectual."

Proportionality bias assumes that small things are small in every culture. Things have different priorities in different cultures. In Western, especially Northern European, culture, time schedules are very important, and being late can be a major discourtesy. Waiting one's turn is a cultural norm in some societies but not in others, confusing Westerners who see failure to stand in line as rudeness. "Honor killing" seems bizarre to some cultures, but is taken very seriously in others.

Even within a culture, individuals are just that. Presumption of Unitary Action by Organizations is another trap. In Japanese culture, the lines of authority are very clear, but the senior individual will also seek consensus. American negotiators may push for quick decisions, when the Japanese need to build consensus first -- and once it exists, they may execute faster than Americans.

Producing the Analysis[edit]

Simply identifying an issue is something the team players can do without the analyst. The analyst must convert issues into questions, and then answer the questions. Those questions may serve nicely as section headings in a report. Every section should contribute to making decisions. If a section neither provides context nor answers, it should come out of the report. Just as analysts need to try to understand the thinking of the adversary, analysts need to know the thinking of their customers and allies.

Three key features of the intelligence product are:

  • timeliness
  • scope
  • periodicity.

Timeliness includes not only the amount of time required to deliver the product, but also the usefulness of the product to the customer at a given moment. Scope involves the level of detail or comprehensiveness of the material contained in the product. Periodicity describes the schedule of product initiation and generation.

Packaging[edit]

Government intelligence products are typically packaged as highly structured written and oral presentations, including electrical messages, hardcopy reports, and briefings. Many organizations also generate video intelligence products, especially in the form of live daily “newscasts,” or canned documentary presentations.

LTG William "Gus" Pagonis, who commanded logistic support during Desert Shield and Desert Storm, made a practice, throughout his career, of seeking efficiencies. In his autobiography [21], he relates realizing that he was spending a great deal of time correcting minor errors in graphics for a fairly routine procedure. He told his graphic artist simply to draw the topic roughly on brown butcher paper, and to let him know if anyone complained. Subsequently, Pagonis used rough drawings whenever they did the job.

Analysts should understand the relationship between the analyst's and the consumer's organization. There may be times when, while the ultimate consumer and originating analyst simply want to pass information, a manager in either chain of command insists on a polished format. If there is no way to avoid that, it may be best to allow time for it rather than waste time fighting it.

Analysts should also be aware of any staff through which the material will have to pass before reaching the end consumer. If staff may return a report unless it meets some artificial criterion, a wise analyst picks battles: is it appropriate to call on one's own chain of command to force the report through, or merely to comply with the artificialities and move on?

Customer Feedback and Production Evaluation[edit]

The production phase of the intelligence process does not end with delivering the product to the customer. Rather, it continues in the same manner in which it began: with interaction between producer and customer. For the product to be useful, the analyst and policymaker need to hear feedback from one another, and they refine both analysis and requirements.

Feedback procedures between producers and customers should include key questions, such as: Is the product usable? Is it timely? Was it in fact used? Did the product meet expectations? If not, why not? What next? The answers to these questions will lead to refined production, greater use of intelligence by decisionmakers, and further feedback sessions. Thus, production of intelligence actually generates more requirements in this iterative process.

The Opposition Doesn't Want You to Succeed[edit]

Adversaries do not want to be analyzed correctly by foreign intelligence services, but their own cultural characteristics may keep them from understanding your approach, or cause them to be blind to their own vulnerabilities. They may, however, fall into the cognitive traps the analyst recognizes and avoids, which can be an immense advantage.

The Other Side is Different[edit]

The analyst's country (or organization) is not identical to that of the opponent. One error is to mirror-image the opposition, assuming it will act as your country and culture would under the same circumstances. Mirror-imaging can also take place among one's own analysts, who commit to a common set of assumptions rather than challenging them. Another country may have quite different cultural frameworks that affect the targets of intelligence, positively or negatively. In WWII, the Japanese seemed to believe that their language was so complex that even if their cryptosystems were broken, outsiders could not truly understand the content. That was not strictly true, but it was sufficiently true that there were cases in which the intended recipients did not clearly understand the writer's intent.

Mirror imaging, or assuming the other side thinks as you do, is one of the great hazards. Again looking at policymakers rather than analysts, this was a huge problem during Vietnam, with the President and Secretary of Defense making assumptions that Ho Chi Minh would react to situations in the same way that would Lyndon Baines Johnson, or, the epitome of Western rationality, Robert S. McNamara. In like manner, completely aside from US political manipulations of intelligence, there was a serious misapprehension that Saddam Hussein would view the situation vis-a-vis Kuwait as the State Department and White House viewed it.

The Other Side is Complex[edit]

Opposing countries are not monolithic, even within their governments. There can be bureaucratic competition that becomes associated with different ideas. Some dictators, such as Hitler and Stalin, were known for creating internal dissension, so only the leader was in complete control.

Opponents are not always rational, or they may be more risk-taking than one's own country would be. Risk-taking to give the illusion of a WMD threat appears to have been one of Saddam Hussein's doctrines. Opponents are unlikely either to act in the best case for your side, or to take the worst-case approach to which you are most vulnerable. There is sometimes an internal danger, among analysts, of assuming the opponent is all-wise and will always know all of your side's weaknesses.

The Other Side is Active[edit]

Analysts need to form hypotheses, but the analysts need to be open to data that either confirms or disproves a hypothesis, rather than searching for evidence that supports only one theory. They must remember that the enemy may be deliberately deceiving them with information that seems plausible to the enemy. Bacon observed "the most successful deception stories were apparently as reasonable as the truth. Allied strategic deception, as well as Soviet deception in support of the operations at Stalingrad, Kursk, and the 1944 summer offensive, all exploited German leadership’s preexisting beliefs and were, therefore, incredibly effective." Stories that Hitler believed implausible were not accepted. Western deception staffs mixed "ambiguity-type" and "misleading-type" deceptions, the former intended simply to confuse analysts and the latter to make one false alternative especially likely.

Of all modern militaries, the Russians treat strategic deception, or, in their term, maskirovka, which goes beyond the English word to include deception, operational security, and concealment, as an integral part of all planning, in which the highest levels of command are involved. Bacon wrote "The battle of Kursk was also an example of effective Soviet maskirovka. While the Germans were preparing for their Kursk offensive, the Soviets created a story that they intended to conduct only defensive operations at Kursk. The reality was the Soviets planned a large counteroffensive at Kursk once they blunted the German attack. .... German intelligence for the Russian Front assumed the Soviets would conduct only 'local' attacks around Kursk to 'gain a better jumping off place for the winter offensive.'" The counterattack by the Steppe Front stunned the Germans.[22]

The opponent may try to overload the analytical capability.[23] This is a warning to those preparing the intelligence budget, and to those agencies where the fast track to promotion is in collection: one's own side may produce so much raw data that the analyst is overwhelmed, even without enemy assistance.

Intelligence is Useful only when Disseminated[edit]

"Ambassador Robert D. Blackwill ... seized the attention of the class of some 30 [intelligence community managers] by asserting that as a policy official he never read DI analytic papers. Why? "Because they were nonadhesive." As Blackwill explained, they were written by people who did not know what he was trying to do and, so, could not help him get it done:

"When I was working at State on European affairs, for example, on certain issues I was the Secretary of State. DI analysts did not know that--that I was one of a handful of key decisionmakers on some very important matters....

"More charitably, he now characterizes his early periods of service at the NSC Staff and in State Department bureaus as ones of "mutual ignorance": DI analysts did not have the foggiest notion of what I did; and I did not have a clue as to what they could or should do.[24]

Blackwill explained how he used his time efficiently, which rarely involved reading general CIA reports. "I read a lot. Much of it was press. You have to know how issues are coming across politically to get your job done. Also, cables from overseas for preparing agendas for meetings and sending and receiving messages from my counterparts in foreign governments. Countless versions of policy drafts from those competing for the President's blessing. And dozens of phone calls. Many are a waste of time but have to be answered, again, for policy and political reasons.

"One more minute, please, on what I did not find useful. This is important. My job description called for me to help prepare the President for making policy decisions, including at meetings with foreign counterparts and other officials.... Do you think that after I have spent long weeks shaping the agenda, I have to be told a day or two before the German foreign minister visits Washington why he is coming?"

Peer Review[edit]

If the information is not disseminated, it is useless, and dissemination includes people who need it within the intelligence agency. "Coordination with peers is necessary... If you think you are right, and the coordinator disagrees, let the assessment reflect that difference of opinion and use a footnote if necessary. But never water down your assessment to a lowest common denominator just to obtain coordination. When everyone agrees on an issue, something probably is wrong. As an example, following the collapse of the Soviet Union, there was an almost unanimous belief that large numbers of Russian ballistic missile specialists would flood into the Third World and aid missile programs in other states (the so-called brain drain). ... As it turned out, there was no [expected] mass departure of Russian missile specialists, but Russian expertise was supplied to other states in ways that had been ignored due to the overemphasis on the brain drain."

End Users[edit]

The end user wants information that he needs to know, but, unless he happens to have an interest in a particular subject, does not care how much you know about the topic. William Donovan, the head of the WWII OSS, began to get FDR's ear because he gave vividly illustrated, well-organized briefings that would be common today, but were unprecedented in WWII. Today, there is a danger of becoming too entranced with the presentation and less with its subject. There is also a delicate dance between emphasizing the subjects that interest high officials, and what they want to hear declared true about them, rather than presenting what the analysts believe is essential.

"Most consumers do not care how attractive a report looks or whether the format is correct. I have lost count of the number of times consumers have told me they do not care if an assessment has a CIA seal on it, if it is in the proper format, or even if it has draft stamped all over it; they just want the assessment in their hands as soon as possible, at least in time to help make a decision." Unfortunately, a number of mid-level managers get overly worried about form, and wise top-level intelligence officials make sure that does not happen.

References[edit]

  1. ^ Heuer, Richards J. Jr. (1999). "Psychology of Intelligence Analysis. Chapter 2. Perception: Why Can't We See What Is There To Be Seen?". History Staff, Center for the Study of Intelligence, Central Intelligence Agency. Heuer 1999-2. Retrieved 2007-10-29.
  2. ^ North Atlantic Treaty Organization (November 2001). "NATO Open Source Intelligence Handbook" (PDF). Retrieved 2007-10-23.
  3. ^ Krizan, Lisa (June 1999). "Intelligence Essentials for Everyone". Joint Military Intelligence College. Retrieved 2007-10-23.
  4. ^ Watanabe, Frank (1997). "Fifteen Axioms for Intelligence Analysts: How To Succeed in the DI [Directorate of Intelligence]". Studies in Intelligence. Watanabe 1997. Retrieved 2007-10-23.
  5. ^ Central Intelligence Agency, Directorate of Intelligence (March 1995). "Note 1: Addressing US Interests in DI Assessments". A Compendium of Analytic Tradecraft Notes. Retrieved 2007-10-25.
  6. ^ Kalisch, Robert B. (July–August 1971). "Air Force Technical Intelligence". Air University Review. Retrieved 2007-10-27.
  7. ^ "The Kennedy Tapes: Inside the White House During the Cuban Missile Crisis" (May 1996). Retrieved 2007-10-23.
  8. ^ Ikle, Fred (2005). Every War Must End. Columbia University Press.
  9. ^ Johnston, Rob (2005). "Analytic Culture in the US Intelligence Community: An Ethnographic Study". Center for the Study of Intelligence, Central Intelligence Agency. Retrieved 2007-10-29.
  10. ^ Crichton, Michael (1970). Five Patients. Ballantine Books. ISBN 0345354648.
  11. ^ Mathams, R.H. (1995). "The Intelligence Analyst's Notebook". Joint Military Intelligence Training Center.
  12. ^ National Intelligence Production Board (2001). "Strategic Investment Plan for Intelligence Community Analysis". NIPB-2001. Retrieved 2007-10-28.
  13. ^ "Nominee Has Ability To Bear Bad News: Some Senators Unsure He Will Use It With Bush". Washington Post. May 19, 2006. Retrieved 2007-10-28.
  14. ^ Davis, Jack (1999). "Improving Intelligence Analysis at CIA: Dick Heuer's Contribution to Intelligence Analysis". Psychology of Intelligence Analysis, Center for the Study of Intelligence, Central Intelligence Agency. Davis 1999. Retrieved 2007-10-27.
  15. ^ Heuer, Richards J. Jr. (1999). "Psychology of Intelligence Analysis. Chapter 8: Analysis of Competing Hypotheses". History Staff, Center for the Study of Intelligence, Central Intelligence Agency. Heuer 1999-8. Retrieved 2007-10-28.
  16. ^ Devlin, Keith (July 15, 2005). "Confronting context effects in intelligence analysis: How can mathematics help?" (PDF). Devlin 2005. Retrieved 2007-10-29.
  17. ^ Barron, John (1983). MiG Pilot: The Final Escape of Lt. Belenko. ISBN 0380538687.
  18. ^ "The Amerikanisti". Time. July 24, 1972. Retrieved 2007-10-28.
  19. ^ Hall, Edward T. (1973). The Silent Language. Anchor. ISBN 0385055498.
  20. ^ Sorrells, Kathryn (Summer 1998). "Gifts of Wisdom: An Interview with Dr. Edward T. Hall". The Edge: The E-Journal of Intercultural Relations. Retrieved 2007-10-28.
  21. ^ Pagonis, William (1994). Moving Mountains: Lessons in Leadership and Logistics from the Gulf War. Harvard Business School Press. ISBN 0875845088.
  22. ^ Bacon, Donald J. (December 1998). "Second World War Deception: Lessons Learned for Today's Joint Planner, Wright Flyer Paper No. 5" (PDF). (US) Air Command and Staff College. Retrieved 2007-10-24.
  23. ^ Luttwak, Edward (1997). Coup D'Etat: A Practical Handbook. Harvard University Press.
  24. ^ Davis, Jack (1995). "A Policymaker's Perspective On Intelligence Analysis". Studies in Intelligence. Retrieved 2007-10-28.

Category:Intelligence analysis Category:Military Category:Military intelligence

tr:İstihbarat Çarkı