Chapter One: Making Sense of the US Intelligence Community

A Complex Adaptive System

The Intelligence Community is an exemplar, even if not a healthy one, of a truly complex adaptive system.

With its fifteen diverse agencies and its wide range of functional responsibilities, the Intelligence Community presents a very complicated set of organizational arrangements. Traditional organizational analysis or systems engineering methods do not suffice to explain its workings because it far more resembles a living ecology with a complex web of many interacting entities, dynamic relationships, non-linear feedback loops (often only partially recognized), and specific functional niches that reflect momentarily successful adaptations to the environment.[1] These complex interrelationships among its components create dynamic adaptations to changing conditions and pressures and make the Intelligence Community especially difficult to understand.[2] In fact, it is an exemplar, even if not a healthy one, of a truly complex adaptive system.
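Footnote 1’s distinction between linear and non-linear feedback can be made concrete with a small illustration. The sketch below is purely hypothetical: the update rules are invented for illustration and model nothing about the Intelligence Community. It simply shows, in Python, that a linear feedback loop responds proportionally (two nearly identical stimuli end in essentially the same state), while a non-linear loop can produce wildly disproportionate outcomes from the same small difference.

# Hypothetical illustration only: invented feedback rules, not a model of any real organization.

def linear(x: float) -> float:
    """Proportional (damped) feedback: the response scales with the stimulus."""
    return 0.5 * x + 1.0

def nonlinear(x: float) -> float:
    """Logistic-map feedback: small differences in the stimulus can produce
    a non-proportional, very different long-run outcome."""
    return 3.9 * x * (1.0 - x)

def run(update, x0: float, steps: int = 30) -> float:
    """Apply a feedback rule repeatedly and return the final state."""
    x = x0
    for _ in range(steps):
        x = update(x)
    return x

for x0 in (0.200, 0.201):  # two nearly identical starting stimuli
    print(f"start={x0:.3f}  linear={run(linear, x0):.3f}  non-linear={run(nonlinear, x0):.3f}")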

During the Cold War, proportionately greater resources, a larger cadre of experienced analysts devoted to a simpler and relatively static priority target, and a broad array of established sources disguised many of the Intelligence Community’s dysfunctional aspects and growing internal problems. The community’s loosely federated structure and complicated, if not Byzantine, processes had previously appeared tolerable, even if not fully successful, because making changes appeared to present a greater risk.[3] In the face of a drastically changed security environment, however, it is exactly this combination of complexity and opaqueness that has masked the increasingly dysfunctional misalignment of “dinosaur” analytic processes and methodologies from earlier recognition by both analysts and consumers of intelligence, much less by outsiders.[4]

Even for insiders, the workings of the Intelligence Community are difficult to understand because, as a rule, its members are not deeply self-reflective about its practices and processes. For outsiders, however, these difficulties are magnified by the community’s compartmentation, security restrictions, and intrinsic opaqueness. That is why applying traditional organizational analysis that concentrates on structure is doomed to failure; understanding these complex adaptive systems requires more synthesis than traditional “reductionist” analysis.[5] In this case, moreover, it is a complex adaptive system that, insulated by security barriers, has managed to ignore and—probably because of its centralized direction, however imperfect—suppress important external signs of change and to amplify self-protective internal signals, which often reflect strongly ingrained cultural preferences.

The results of the Intelligence Community’s failure to recognize the increasing dysfunction were both paradoxical and unfortunate. They were paradoxical because—although it has been accused of not adapting to dramatically changed conditions—the community adapted all too well. And they were unfortunate because the pressures to which it did adapt flowed from misperceptions inside and outside the Intelligence Community engendered by the collapse of the Soviet Union: that there would be no significant challenges to American interests; that the end of the Cold War reduced the need for a “national security state”; that there should be a substantial “peace dividend,” a large part of which would be paid by the Intelligence Community. The community’s adaptive processes did accommodate these changes internally—especially the need to “survive” the huge budget cuts and to become relevant to the articulated needs of the paying customers.

It is important not only to locate the level at which obvious symptoms occur, but also the level at which problems can be solved.

However, these internal pressures outweighed the huge new challenges emerging in the external security environment. Responding to these challenges would demand new expertise and a new knowledge base, along with appropriate methods, tools, and perspectives—all of which required more resources, focused leadership, and strong commitment, none of which was forthcoming. As a result, the community fostered a series of processes that were increasingly maladapted to needs emerging in the new geostrategic environment. By responding to the wrong signals, it created Perrow’s “error-inducing systems.”[6]

 

 

Relating Structure and Process

Unfortunately, most Intelligence Community reform proposals concentrate on changes in structure and in directive and managerial authorities. Analytic problems, however, actually take place not just at the level of the community as a whole, but at four distinct levels, as well as in the complex interrelationships, both vertical and horizontal, among them.[7] Thus, it is important not only to locate the level at which the obvious symptoms appear, but also the level at which the problem can be solved. In this way, the root causes of failure can be identified and appropriate and effective corrective measures taken.

The National Security Community. The relevant entities include the National Security Council (NSC), the Office of the Director of National Intelligence (ODNI), and the national policymaking and operational elements in the Department of State and the Department of Defense.[8] Among the failures at this level can be misdirected priorities and misallocation of resources; poor communication and coordination; and inconsistent apportionment of authority, responsibility, and capability among the main entities. Such failures flow downward and can easily percolate throughout the subordinate organizations.

For the Intelligence Community, a particular problem at this level may involve its relationships with top-level users, especially managing their expectations. On the one hand, for example, the Intelligence Community often demonstrates an inability or unwillingness to say “no” to consumer requests, which leads to additional priority taskings without concomitant resources or relief from other ongoing activities. Similarly, the Intelligence Community often conveys an illusion of omniscience that fails to make clear its state of knowledge on an issue, the underlying quality of the intelligence, or the degree of uncertainty—all of which can leave the Intelligence Community seemingly responsible for decisions taken on the basis of “bad intelligence.”

The Intelligence Community. This level currently includes the fifteen component intelligence agencies. Failures at this level can include misdirected priorities and budgetary allocations within the Intelligence Community; lack of effective procedures and oversight of them among component agencies; poor communication and coordination among agencies; a lack of enforceable quality-control processes; toleration of substandard performance by individual agencies; poor communitywide technical standards and infrastructure that hinder information sharing; and poor management and oversight of security procedures that impede effective performance. Errors at this level also encompass failures by groups or individuals to make critical decisions, to exercise appropriate authority, or to take responsibility for gross errors that should be worthy of sanction or dismissal.[9]

The Individual Analytic Units and Organizations. It is essential to appreciate the importance of particular analytic environments within specific sub-organizations—an office within the CIA’s Directorate of Intelligence, for example. It is these entities, rather than the organization as a whole, that create the work processes and practices that form the immediate cultural matrix for an analyst’s behavior.[10] Failures at this level can include dysfunctional organizational processes, practices, and cultures that inhibit effective analysis by individuals and sub-units; management attitudes and directives that stress parochial agency objectives; toleration of poor performance; excessive compartmentation and special security procedures that erect barriers to effective execution; poor prioritization and assignment of workflow; inability to create and protect “slack” and conceptual space for intellectual discovery; ineffective recruitment and training; maintaining stand-alone information and analysis infrastructures, including ineffective support for individual analysts; poor direction and management of the analytic process; and, simply, ineffective management of the analytic cadre. This is probably the most important level for creating consistently high-quality analysis because of its impact on the analytic environment, on the selection of methods and processes, and on the work life of individual analysts. Errors at this level are perhaps the most pernicious, however, and they have been widespread and persistent.

Individual Analysts. Failures at this level can include poor performance due to lack of ability, lack of domain knowledge, lack of process expertise, poor social network contacts, or ineffective training; pressures to favor product over knowledge; lack of time; being too busy and too focused to maintain peripheral vision and curiosity, even on high priority work; failure to cooperate and collaborate with others; lack of suitable tools and support; misguided incentives and rewards; and an organizational culture and work practices that tolerate second-rate analysis.

To illustrate the impact of this multi-level hierarchy and underscore the importance of correctly identifying the locations of causative factors in analytic errors, consider the case of an analyst who fails to interpret correctly the evidence pertinent to a task and draws a wrong conclusion. At first glance, the obvious approach should be to focus corrective actions on the analyst: what caused the failure, and what are the appropriate remedies? Simple incompetence, a rush to complete the assignment, a lack of domain knowledge needed to recognize critical linkages, or a failure to employ appropriate methods could all be causative factors. At this level, the obvious remedies to these problems are better screening, training, and mentoring.

It could be, however, that the problem lies with the analytic unit, its work processes, and its management: the tasking was high priority, and this analyst, whose expertise is on another subject, was the only one available; appropriate tools and methods were not provided; training in relevant domain knowledge or on effective new tools had been postponed due to production pressures; or, given the production cycle, the analyst lacked sufficient time to search for all the relevant evidence. The problem could reside even farther up the hierarchy, among the agencies of the Intelligence Community: key data from another agency was not made available, due to compartmentation restrictions or because incompatible information infrastructures prevented the analyst from easily searching another agency’s holdings. Finally, the failure could actually reside at the topmost level, with community management: this account was given such low priority that no collection resources had been assigned to gather information or to provide consistent analytic coverage or, because of the thinness of the evidence base, the inability to answer the question was not made clear to the requester at the start.

However, it is exactly here that the “5 Whys Approach” of the Quality Movement proves its value.[11] Applying this approach, which features a progressively deeper, recursive search, forces the investigator to trace a causative factor to its source.[12] Assume that, in this example, it is a lack of domain knowledge.

Why was an analyst not fully knowledgeable in the domain working that account?

She was covering for the lead analyst, who is away on temporary duty (TDY).

Why did the analytic manager assign that analyst to the task?

She was the only one available.

Why was the analyst not fully knowledgeable on her backup account?

She is an apprentice analyst with only a short time on the account and inadequate mentoring. Her training had been postponed due to scheduling. She didn’t have time to be curious and follow the information scent. She could not access the lead analyst’s “shoebox.”[13]

Why couldn’t she access the shoebox of the lead analyst?

It is his personal collection of tentative hypotheses and uncorrelated data kept as a personal Word file and is not in an accessible database. The shoebox is actually a pile of paper put in temporary storage when the lead analyst went on TDY.

Why is the lead analyst unwilling to share his shoebox?

Why is there no accessible collaborative system for sharing shoeboxes?

The questions would continue through as many rounds as the questioner needed to satisfy himself that he had found the root cause.
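For readers who think in procedural terms, the recursive character of this search can be sketched in a few lines of Python. The cause map below is hypothetical, merely paraphrasing the example above; it is not drawn from any actual investigation or Intelligence Community system.

# Illustrative sketch of the "5 Whys" recursive search; the cause map is a
# hypothetical paraphrase of the example in the text.

CAUSES = {
    "analyst lacked domain knowledge on the account":
        "she was covering for the lead analyst, who was away on TDY",
    "she was covering for the lead analyst, who was away on TDY":
        "she was the only analyst available for a high-priority tasking",
    "she was the only analyst available for a high-priority tasking":
        "her training and mentoring on the backup account had been postponed",
    "her training and mentoring on the backup account had been postponed":
        "she could not access the lead analyst's shoebox of working notes",
    "she could not access the lead analyst's shoebox of working notes":
        "there is no accessible collaborative system for sharing shoeboxes",
}

def five_whys(symptom: str, cause_map: dict, max_rounds: int = 5) -> list:
    """Trace a symptom through successive 'why?' answers until no deeper
    cause is recorded or the round limit is reached."""
    chain = [symptom]
    while len(chain) <= max_rounds and chain[-1] in cause_map:
        chain.append(cause_map[chain[-1]])
    return chain

for depth, step in enumerate(five_whys("analyst lacked domain knowledge on the account", CAUSES)):
    label = "Symptom" if depth == 0 else f"Why #{depth}"
    print(f"{label}: {step}")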

Although the previously cited reports on intelligence failures usually point to organizational stove-piping and technical shortcomings as the most important contributors to failures in collaboration, the sources of such failure are actually more widespread and complex—and more frequently reflect shortcomings in work practices and processes, organizational culture, and social networks.[14] In addition, the proposed solutions that focus on structures and authorities disregard the critical interrelationship between structure and processes and ignore as well the impact of organizational culture on institutional effectiveness. As Stephen Marrin, among others, has noted:

Structure and process must work together in a complementary fashion, and structural changes alone without corresponding changes to existing processes would simplify the workings of the Intelligence Community in some ways, but cause greater complexity in others.[15]

The significant structural reforms legislated in 2004 will also entail substantial short-term costs in effectiveness as new organizational arrangements are implemented, processes are developed, and outmoded roles and systems are replaced. The really difficult task will be to redesign the processes so that they are consistent with, and complementary to, the structural changes being made.

 

The Analysis Phase-Space

Incorrect diagnoses of the causes of analytic failures probably arise from not recognizing the variety and complexity of the roles, missions, and tasks that confront analysts.

At a basic level, incorrect diagnoses of the causes of analytic failures probably arise from not recognizing the variety and complexity of the roles, missions, and tasks that confront analysts. This diversity results in a complex phase-space, illustrated below, that contains a significant number of discrete analytic regions. These certainly cannot be treated as though their perspectives and needs were homogeneous or even similar. The tasks required of a signals intelligence analyst attempting to locate a terrorist’s cell-phone call are fundamentally different from those of an all-source analyst drafting an NIE on Chinese strategic nuclear doctrine. Therefore, because intelligence collection and analysis are not based either on a suite of all-purpose tools or on fungible human expertise that can be instantly swiveled to focus effectively on a different set of problems, this phase-space also implies the need for a similar diversity of analytic processes, methods, knowledge bases, and expertise.

See Graph: A Large and Diverse Intelligence “Phase-Space”

 

Differentiating Intelligence Roles

Moreover, given this diverse phase-space, conflating three distinct roles played by all-source intelligence adds to the underlying confusion over intelligence missions and functions, the priorities among them, their requirements, and the capabilities needed to make each effective. The traditional assumption that there were only two sets of intelligence consumers, each with distinct mission needs, often led to contraposing support to military operations, which was assumed to be tactical in focus, and national user support, which was assumed to demand deep analysis. In reality, meeting the disparate needs of the users intelligence must serve requires recognizing three distinct roles for all-source intelligence.[16] Two of them, Support to Military Operations (SMO) and Support to Policy Operations (SPO), focus primarily on issues needing immediate information capabilities to assist decisionmaking on current operations. Although SMO and SPO issues are of interest to both national and departmental users, the third role, Warning and Estimative Intelligence (WEI), largely emphasizes issues that are almost exclusively the province of national users and usually take place over longer time horizons.[17]

In all cases, however, although it still uses the term “support,” the Intelligence Community must move beyond the notion that it is segregated from the rest of the national security community and that it merely provides apolitical information to decisionmakers. Intelligence has now become an integral element of both the policy and military operational processes; and the success or failure of its judgments can have the most significant consequences in both domains.[18] Increasingly integrated military operations, in which intelligence directly drives operations, and command centers in which intelligence personnel are fully integrated, are tangible evidence of such changes. As a result, it is important that intelligence appreciate not only the centrality of its role, but also the increased obligations and responsibilities that such a role brings.[19]

Support to Military Operations (SMO): This traditional intelligence role has usually focused on assisting current military operations. Much of this information concerns current numbers, locations, and activities of hostile units, and other information addresses significant elements of the physical environment in which military forces are operating.[20] Other military users need quite specific current data on subtle technical characteristics of adversarial equipment and forces to serve, for example, as targeting signatures or to support electronic warfare (EW) activities. Regardless of type, intelligence supporting operating forces demands extraordinary accuracy, precision, and timeliness to ensure that it is immediately “actionable” under conditions that are highly stressful and potentially lethal.[21]

Increasingly, however, military operators have other operational intelligence needs, such as support for information operations and for security and stabilization in Iraq. To prosecute these missions successfully, the military now also needs far more cultural awareness and timely, accurate information on adversary thinking, motivations, and intentions.

Support to Policy Operations (SPO): Making explicit that this is a distinct role emphasizes the importance of intelligence to daily policymaking across the entire spectrum of national security concerns; it is the “national user” cognate of SMO. SPO provides policymakers and senior officials (importantly including senior civilian defense officials, combatant commanders, and other military officers) with indispensable situational awareness, including important background information, to assist them in executing and overseeing ongoing policy activities and in planning and framing policy initiatives. Because it is just as intensely focused on providing actionable information, SPO is as heavily oriented as SMO toward current intelligence and reporting. However, SPO differs from SMO somewhat in content and priorities in that it has always included a greater proportion of less quantifiable, softer information, such as political and economic trends in major countries and groups and assessments of foreign leaders and their intentions.

See Graph: Three Distinctive Needs for Analytic Support

Warning and Estimative Intelligence (WEI): Mary McCarthy, a former National Intelligence Officer (NIO) for Warning, commented on the recommendations of a DCI-chartered study conducted in 1992:

A warning process . . . allows decisionmakers to think through responses they might be obliged to make in haste.

According to that ten-member panel of highly respected intelligence and policy veterans, providing policymakers with persuasive and timely intelligence warning is the most important service the Intelligence Community can perform for the security of the United States.[22]

McCarthy defines warning as “a process of communicating judgments about threats to US security or policy interests to decision-makers.”[23] Thus, warning provides vital support to “national users” in their principal strategic missions—understanding the complex geostrategic environment, fostering vision of objectives, assessing alternatives and determining strategy, and protecting against consequential surprise; most importantly, when done properly, warning is forward looking and anticipatory.[24]

Warning is sometimes thought to be merely alerting decisionmakers to immediately threatening activities, but, in reality, it is a far more complex function and actually addresses two very different kinds of problems. One type of warning is concerned with monitoring activities previously recognized as potentially dangerous, such as a hostile missile launch, and cueing appropriate responses. The second type is a discovery function that assists decisionmakers in identifying those situations and activities whose consequences could have significant (and usually adverse) effects—and which may not necessarily be obvious. When performed effectively, a warning process provides decisionmakers with an anticipatory sensitization that allows them to think through, in a disciplined way, the responses they might someday be obliged to make in haste. Assessments and estimates, on the other hand, also are usually forward looking, but they are designed to be informative rather than part of a process closely tied to triggering contingent responses.

Further complicating the matter is that both types of warning also operate over three different horizons. Strategic warning has always been understood as looking out toward the distant future; it is intended to recognize that a possible threat may be looming—even if it is not imminent—and to provide time to take appropriate preparatory actions, including policies and actions that might prevent the threat from eventuating.[25] Operational warning also looks out in order to identify the characteristics of the threat (the likely and particular methods of attack), so that offsetting contingency plans and actions can be prepared. From this detailed understanding of enemy intentions, capabilities, and concepts, operational warning also serves to identify indicators that an attack is in preparation. Finally, tactical warning is the immediate alerting function that a specific (with respect to time, place, or character) hostile activity has begun or is about to begin.

An important but often overlooked element of warning over all three horizons is the key role played by negative evidence, which can help confirm that potentially threatening activities are not occurring and prevent costly and potentially consequential responses from being taken or scarce resources from being squandered.[26] During the confrontation between the United States and the Soviet Union, and, in particular, during periods of high tension between them, one of the most important functions of warning was to inform the leaders that, “Today is not the day.”

Both the warning and estimative functions are designed to focus more on informing decisionmaking with context and long-term implications than on supporting ongoing activities. The preparation of assessments and estimates, as well as development of warning indicators, has more to do with analysis and judgment than with reporting; it demands deep expertise as well as an ability to place knowledge of the subject in broad context. These important functions serve the entire national security community.[27]

Although warning is often misconstrued as a current intelligence problem, even tactical warning of specific targets, times, and means must build on this deeper foundation of pre-emptive analysis of threats and responses if it is to be effective. During the Cold War, recognizing that we were engaged in a long-term competition, we were prepared to adjust our intelligence priorities so that analysts could provide assessments of future capabilities and indications of intentions, even though the day-to-day threats were most grave. As was the case in facing the Soviet Union, there may well be tensions today in choosing between serving SMO and SPO, on the one hand, and assuring adequate resources for WEI, on the other hand, as continued access to information needed for an understanding of enemy intentions and capabilities could be sacrificed to meet the needs for immediately actionable intelligence.

Today’s decisionmakers have many more sources of information than did their predecessors ...; in turn, the Intelligence Community holds far less of a monopoly over information about foreign events and technology developments.

Although warning and estimative intelligence may be seen as the core missions of strategic intelligence, they are also less tied to the details of ongoing operations in which the formal relationships between policymakers and intelligence provide a unique advantage and leverage for intelligence insights. Today’s decisionmakers have many more sources of information than did their predecessors when the Intelligence Community was created; in turn, the Intelligence Community holds far less of a monopoly over information about foreign events and technology developments. Moreover, as one senior intelligence official noted, policymakers see themselves and their staffs as being as substantively knowledgeable on issues of interest as the Intelligence Community and as capable of serving as their own intelligence analysts.[28] As a result, users increasingly see themselves as participants in the process of making judgments.[29] An experienced national-level user wrote recently,

Today, the analyst no longer sets the pace of the information flow. The sources of information now available to the policy-level consumer…are far, far greater than a quarter century ago. It is almost a given that today’s policy-level consumer of intelligence is well informed in his or her area of interest and not dependent on an intelligence analyst for a continuing stream of routine, updating information.[30]

 

Implications of Differentiating Roles

Careful differentiation among the three intelligence roles discloses dimensions that are both analytically important and more meaningful for contemplating intelligence reform than the usually misleading bimodal distinctions between national vs. military users or tactical vs. strategic objectives. What truly distinguishes these intelligence roles is their perspective and emphasis—a significant distinction that has been lost in recent arguments over intelligence reform.

To begin with a particularly important point, a tactical or a strategic focus does not necessarily distinguish military from civilian users.[31] Moreover, the less quantifiable and, therefore, softer information and analysis on individuals, decisionmaking, and social dynamics that used to be produced primarily for national users is now increasingly demanded to support military operations at the tactical level. Such information is inherently more judgmental and inferential—and, therefore, less precise—than analysis of physical or technical characteristics in orders-of-battle (OOBs) and tables of organization and equipment (TOEs). It is less amenable to counting or to the gathering of external physical signatures by technical collection systems; it is more dependent on language skills, deep expertise on the region and cultures, and knowledge of the personalities.[32] It is also harder to validate or prove than estimates of technical factors. Such capabilities go beyond “reporting” that used to be the core of current intelligence.

However, both SMO and SPO are, by nature, mission- or task-oriented and tightly focused on the problem at hand; and this narrowed focus has significant time and perceptual implications for analysts and the intelligence sources supporting them.[33] Given the stress, time pressures, and immediate—as opposed to potential—stakes attendant on current operations, human decisionmakers try to concentrate only on the immediate situation and the information relevant to it, while actively screening out other inputs. This is the intelligence analogue of human “foveal vision,” which offers the highest visual resolution but also a very narrow field-of-view.[34]

In contrast, warning and estimative intelligence are the analogues of human “peripheral vision,” in which there is low resolution but a wide field-of-view. Peripheral vision is very sensitive to cues of dynamic change, which trigger anticipatory responses. Although warning is concerned with activating the response cycle, and estimative intelligence is intended to create a frame of reference for the decisionmaker, both are intended, through preconditioning and anticipatory consideration, to enable a more appropriately and contextually sensitized response on the part of users.[35]

Another important implication of the differing emphasis on decisions with a long-term view and those requiring prompt action—the classic distinction between strategic and tactical—concerns the nature of the advantage to be gained from the information, and, therefore, how it is exploited. In recent years, the tasks of intelligence, and its successes and failures, have focused on providing immediately actionable (in this sense, tactical) intelligence to users—information that can provide a rapid or near-instantaneous advantage, whether for interdicting hostile military forces, preventing terrorist incidents, or supporting diplomatic initiatives. Emphasizing current intelligence for actionable exploitation may have created an unintended mind-set that undervalues the immense importance of knowing and understanding the adversary’s intentions throughout the course of the confrontation, even at the cost of foregoing exploitation of these sources for temporary advantage on the battlefield or in the diplomatic conference room.[36] This stress on current intelligence also influences the priorities among the types and attributes of information we collect, the nature of the collection and processing systems, the analytical methods we use, the stresses we place on analysts, and the metrics by which we assess the performance of intelligence.[37]

When one tries to assess the adequacy of Intelligence Community performance … or prescribe changes … the appropriate answers will almost certainly differ greatly from one role to the other.

There is yet another important distinction between these roles. By looking out to the future, WEI is basically a surprise-preventing function intended to heighten a policymaker’s ability to visualize the consequences of anticipated and unanticipated events and to prepare for them mentally; it is not designed to be “evidence-based truth-telling” that will stand up in court. In addition, as we better appreciate the implications of emergence and the emergent behaviors of complex adaptive systems, we need to place greater emphasis on anticipation while recognizing that precise prediction or forecasting is even harder than previously understood. Appreciating the differences in perspective created by these roles is very important because failing to make clear distinctions between them may aggravate a major problem before the Intelligence Community: the disconnect between the emphasis on current reporting or providing situational awareness, which must be evidence-based, and the policymaker's need for anticipatory judgments, which, by nature, trade the confidence derived from hard evidence for an intuitive, or gestalt, understanding of the whole derived from inference.[38] It is unlikely that analysts will have firm evidence to offer the policymaker to support alternative interpretations of the future, and they will need to rely on inference and informed speculation that is persuasive to decisionmakers.[39] In particular, as one experienced intelligence analyst noted,

“Getting inside the other guy’s head” can only be conjectural because, in most cases, even the “other guy” doesn’t know exactly why he’s doing what he’s doing.[40]

Even if there are predictive judgments to be made in both SMO and SPO,[41] they tend to have short time horizons and reasonably short inferential chains; as the predictive time-constants are short, observation of adversary actions can serve to validate or disprove these judgments and thereby improve confidence in them and the analyst’s judgment.

Those providing SPO, in particular, must continually walk a fine line between serving the policymakers’ needs for relevant, focused, direct support and maintaining objectivity in providing the evidence and analysis. Staying close to the evidence assists the analyst in walking this line. At the same time, the author of this monograph noted a clear consensus among senior intelligence officers at a recent non-attribution conference that analysts can best serve policymakers by offering them thoughtful and thought-provoking views that challenge their assumptions. It must be recognized, however, that helping to alter policymakers’ assumptions is intruding directly into the policymaking process and, thereby, crossing the boundary that Sherman Kent tried to establish. As policymakers demand judgments on actions and consequences farther in the future (moving the intelligence role from SPO to WEI), not only will the intrinsic uncertainties increase, but so will the potential for tension between policymaker and analyst over the objectivity (and validity) of the judgments and for conflict among differing judgments.[42]

Another of these distinctions affects intelligence requirements and planning. Unlike SMO and SPO, where the users can clearly identify their areas of interest, priority issues, and information needs, the Intelligence Community must look beyond its users’ perceptual horizons if it is to perform warning and estimative functions effectively. Almost by definition, with anticipatory intelligence, policymakers will be unable to tell the community where to look. Unfortunately, although the Intelligence Community must recognize that attempting to divine requirements for warning and other anticipatory intelligence from the users is not likely to be fruitful, it also must appreciate that it alone will bear the blame for failing to warn against the inevitable surprises arising from outside the fields-of-view of users. This demands, in turn, that the Intelligence Community have some discretion and flexibility to allocate resources in areas not currently considered to be priority targets: listening too closely to the customers, and looking only where directed, guarantees future strategic warning failures.

It is absolutely essential that the Intelligence Community and those who depend on it understand the principal distinctions between these two functions. In their conclusions about the nature of the Intelligence Community’s problems, the extraordinary differences between the report of the 9/11 Commission and that of the SSCI on Iraqi WMD reveal the dangers of conflating the two distinct functions or ignoring the differences between the three roles. When one tries to assess the adequacy of Intelligence Community performance across these domains, identify shortfalls, or prescribe changes—whether in business practices, tools, or organizational arrangements—the appropriate answers will almost certainly differ greatly from one role to the other.

 

Footnotes:

[1]A feedback loop, in systems analysis, is a relationship in which information about the response of the system to stimuli is used to modify the input signal (see “Feedback,” Principia Cybernetica Web). A non-linear loop is one that creates non-proportional responses to stimuli.

[2]See Peter M. Senge, The Fifth Discipline: The Art & Practice of the Learning Organization. Senge is the founder of the Organizational Learning Laboratory at MIT.

[3]The pressures of the Manichean confrontation with the Soviet Union tempered enthusiasm for drastic and disruptive changes. These might have improved effectiveness, but they would also have provoked bureaucratic and congressional battles over power and jurisdiction.

[4]After all, the dinosaurs were superbly adapted to their environment; even if they perceived the signals of change, they became extinct because they could not adapt to unfamiliar environmental conditions.

[5]An appreciation of the distinction between a complicated system and one that is complex and adaptive is important for accurate diagnosis and effective solutions. A hallmark of complex adaptive systems is that they produce “emergent behavior,” which cannot be predicted by analysis of their component elements or structure.

[6]See Perrow, Normal Accidents.

[7]The briefing on “Analytic Pathologies” graphically illustrates the multi-level interplay of these problems. See Appendix A for a summary.

[8]At this level, for the Intelligence Community, it is the ODNI and the Intelligence Community elements that are responsible for critical functions—collection, analysis, special activities, and community management—that interact directly with senior principals. With a DNI and an ODNI organization in place, these relationships are likely to become even more complicated.

[9]See Statement by Admiral David Jeremiah (USN, ret.), Press Conference, CIA Headquarters, 2 Jun 1998, for a suggestion that failures by senior managers to make key decisions had been an important factor in the CIA’s failure to warn of an impending Indian nuclear test. (The subject was the “Jeremiah Report” on the 1998 Indian nuclear test.)

[10]See Karl E. Weick, Sensemaking in Organizations.

[11]The Quality Movement took root in the United States during the 1980s and 1990s, when US auto manufacturers were challenged by the emergence of higher quality Japanese automobiles made by automakers who had adopted the principles of the American quality experts W. Edwards Deming and Joseph Juran. The principles provide a systematic set of processes and metrics for improving the quality of manufacturing processes.

[12]A recursive search is one in which successive searches build on the results of earlier searches to refine the answers returned. (See National Institute of Standards and Technology, Dictionary of Algorithms and Data Structures.)

[13]Although the term is seldom used today, many analysts once referred to the personal files in which they stored such items as the results of research as “shoeboxes.” It is used here to emphasize the particularity of the methods employed by analysts.

[14]Technical systems and infrastructures enabling collaboration are important, but they are only a small part of the solution to fostering effective collaboration. For more on this topic, see discussion beginning on page 57.

[15]Stephen Marrin, in a review of William E. Odom, “Fixing Intelligence: For a More Secure America,” Political Science Quarterly, 119, no. 2 (Summer 2004): 363.

[16]It is important to recognize that these regions have fuzzy boundaries, overlap to some degree, and are not totally distinct.

[17]The intelligence role that often leads to confusion over appropriate categorization is warning, and especially the tactical warning component. Because warning is intimately connected to a decision on a responsive action, it is sometimes mistakenly considered to be a decision-support activity; in reality, it is more appropriately seen as a part of the informative function that assists policymakers in thinking about issues before they occur, helping to create coherent, contextualized reference frames. Moreover, because tactical warning is tactical, it is often forgotten that it is of principal concern to high-level strategic users because it almost always involves activities that could have the most serious political and strategic consequences. Thus, these three roles cover two distinct functions: SMO and SPO emphasize situational awareness and immediate decision support, while WEI focuses on anticipation of future circumstances.

[18]I am grateful to Dr. Russell Swenson of the Joint Military Intelligence College for persuading me to sharpen this point. See Russell G. Swenson, with Susana C. Lemozy, “Politicization and Persuasion: Balancing Evolution and Devolution in Strategic Intelligence,” unpublished manuscript. When the CIA was created, expectations about intelligence capabilities and its role were significantly different than they are today. At the policy level as well, there is now an expectation that intelligence will be available to guide policy creation and inform course changes if necessary.

[19]A valuable guide to appropriate comportment in these circumstances is Herbert Goldhamer’s The Adviser.

[20]The US Army, which has extensive doctrine on operations, calls this intelligence preparation of the battlefield (IPB). This includes specific information on mission, enemy, terrain, troops available, and time (METT-T).

[21]A critical example is the need for technical details, so that enemy weapons, such as improvised explosive devices (IEDs), can be countered.

[22]Mary McCarthy, “The National Warning System: Striving for an Elusive Goal,” Defense Intelligence Journal 3 (1994): 5. Warning is considered the classic “strategic intelligence” role and was the principal reason for the creation of the CIA.

[23]Ibid.

[24]There have always been terminological problems associated with the word “strategic.” During the Cold War, users of the word often conflated level of analysis (global and synoptic), time horizon (forward-looking), and magnitude of the stakes (very large), with instrumentality (nuclear) and distance (intercontinental).

[25]To some degree, these terms have always been confusing because they described two very different types of problems. Strategic, operational, and tactical warning related to surprise nuclear attack were very patterned, but focused on two distinct problems: surprise attack executed by known forces and surprises that were truly unanticipated.

[26]Perhaps the scarcest resource is a senior decisionmaker’s attention, which can easily be wasted.

[27]As an indication of the long time horizon involved in this function, both civilian and military defense officials need to look well into the future to develop strategy, plan forces, support research and development, and acquire systems.

[28]See Charles E. Allen, “Intelligence: Cult, Craft, or Business?” in “Comments of the Associate Director of Central Intelligence for Collection” at a Public Seminar on Intelligence at Harvard University, spring 2000. See http://pirp.harvard.edu/pdf-blurb.asp?id+518: 15. Henry Kissinger may be the most obvious example of this tendency, but it has continued since the Nixon administration and has come to include a far greater proportion of policy officials, especially as the sources of information on foreign developments have expanded dramatically and become available in near real-time. See Henry A. Kissinger, “America’s Assignment,” Newsweek, November 8, 2004: 38–43.

[29]As one senior intelligence official remarked in a private meeting, “We are all in the business of making judgments; but too many in the Intelligence Community continue to believe that they are instead providing crystalline analyses.”

[30]Clift, “Intelligence in the Internet Era,” Studies in Intelligence 47, no. 3 (2003).

[31]This distinction might have been clearer before civilian leaders began taking detailed interest in overseeing tactical operations—as began during the Vietnam War with presidential interest in the selection of bombing targets. With respect to the Soviet Union during the Cold War, national and military users had clear areas of primary interest; national users were focused on intelligence illuminating key political, economic, technological, and social factors affecting national power and intentions, while military users were more focused on the likely force capabilities and doctrines of potential adversaries. After the Cold War, improving technical capabilities and the emergence of the “strategic corporal” combined to increase the interest of civilian policymakers in overseeing tactical operations.

[32]Several senior military participants at the Charlottesville conference highlighted these demands. See Charlottesville Conference Report, 2–5.

[33]There is a very large body of literature on the physical and psycho-perceptual effects on human judgment and decisionmaking under stress that is relevant to these distinctions.

[34]The fovea, a small pit at the back of the retina, forms the point of sharpest vision. The intensely narrow concentration of foveal vision is recognized as being a major contributor to “change blindness.”

[35]One type of warning function amenable to focused monitoring involves potential surprise from a recognized adversary undertaking a feared but pre-identified activity (such as a Warsaw Pact invasion across the Inner-German border). The other warning function serves to guard against truly unexpected or unforeseen events. In both cases, they are designed to encourage thinking about, and contingency planning for, “surprises” before they occur.

[36]The widely repeated—but apocryphal—story that perhaps best exemplifies this understanding of “strategic intelligence” is that of Churchill’s allowing Coventry to be bombed in order to safeguard the long-term informational advantages gained from Allied code-breaking achievements against the Axis. The immeasurable importance of such intelligence in the successful Allied efforts to interdict Rommel’s supply lines during the North African campaign and in winning the crucial Battle of the Atlantic testifies to these other equities with possibly higher priority.

[37]Barger, 26. Although her specific comment refers to the impact of precision (timeliness and resolution, for example) on the quality and quantity of intelligence, her larger point is that functional needs stemming from roles and missions drive what the Intelligence Community provides its users and how it does so.

[38]Gestalt, a German word meaning “form” or “shape,” is used in psychology to connote holistic understanding of the entirety of a phenomenon. This follows Kendall’s approach of “creating pictures,” as noted by Jack Davis in “The Kent-Kendall Debate of 1949,” Studies in Intelligence 35, no. 2 (1991).

[39]This is a comment to the author by a senior intelligence officer who has served in both roles.

[40]Private communication to the author from John Bodnar, 3 November 2004.

[41]The military, in particular, is increasingly emphasizing “predictive battlespace intelligence” as a central component of “information superiority.” It is usually, however, a different kind of prediction than that required to support SPO.

[42]As the founder of Air Force and Joint Staff Studies and Analyses, Lt. Gen. Glenn Kent (USAF, ret.), paraphrasing Shakespeare, once warned analysts, “Neither a prostitute nor a proselytizer be.”


