A Complex Adaptive System
The Intelligence Community is an exemplar, even if not a healthy one, of a truly complex adaptive system.
With its fifteen diverse agencies and its wide range of functional responsibilities, the Intelligence Community presents a very complicated set of organizational arrangements. Traditional organizational analysis and systems engineering methods do not suffice to explain its workings, because it far more resembles a living ecology: a complex web of many interacting entities, dynamic relationships, non-linear feedback loops (often only partially recognized), and specific functional niches that reflect momentarily successful adaptations to the environment. These complex interrelationships among its components create dynamic adaptations to changing conditions and pressures and make the Intelligence Community especially difficult to understand. In fact, it is an exemplar, even if not a healthy one, of a truly complex adaptive system.
During the Cold War, proportionately more resources supporting a larger cadre of experienced analysts devoted to a simpler and relatively static priority target, as well as a broad array of established sources, disguised many of the Intelligence Community’s dysfunctional aspects and growing internal problems. The community’s loosely federated structure and complicated, if not Byzantine, processes had previously appeared tolerable, even if not fully successful, because making changes appeared to present a greater risk. In the face of a drastically changed security environment, however, it is exactly this combination of complexity and opaqueness that has hidden the increasingly dysfunctional misalignment of “dinosaur” analytic processes and methodologies from earlier recognition by analysts and consumers of intelligence alike, much less by outsiders.
Even for insiders, the workings of the Intelligence Community are difficult to understand because, as a rule, its members are not deeply self-reflective about its practices and processes. For outsiders, however, these difficulties are magnified by the community’s compartmentation, security restrictions, and intrinsic opaqueness. That is why applying traditional organizational analysis that concentrates on structure is doomed to failure; understanding these complex adaptive systems requires more synthesis than traditional “reductionist” analysis. In this case, moreover, it is a complex adaptive system that, insulated by security barriers, has managed to ignore and—probably because of its centralized direction, however imperfect—suppress important external signs of change and to amplify self-protective internal signals, which often reflect strongly ingrained cultural preferences.
The results of the Intelligence Community’s failure to recognize the increasing dysfunction were both paradoxical and unfortunate. They were paradoxical because—although it has been accused of not adapting to dramatically changed conditions—the community adapted all too well. And they were unfortunate because the pressures to which it did adapt flowed from misperceptions inside and outside the Intelligence Community engendered by the collapse of the Soviet Union: that there would be no significant challenges to American interests; that the end of the Cold War reduced the need for a “national security state”; that there should be a substantial “peace dividend,” a large part of which would be paid by the Intelligence Community. The community’s adaptive processes did accommodate these changes internally—especially the need to “survive” the huge budget cuts and to become relevant to the articulated needs of the paying customers.
It is important not only to locate the level at which obvious symptoms occur, but also the level at which problems can be solved.
However, these internal pressures outweighed the huge new challenges emerging in the external security environment. Responding to those challenges would have demanded new expertise and a new knowledge base, along with appropriate methods, tools, and perspectives, all of which required more resources, focused leadership, and strong commitment, and none of which was forthcoming. As a result, the community fostered a series of processes that were increasingly maladapted to needs emerging in the new geostrategic environment. By responding to the wrong signals, it created Perrow’s “error-inducing systems.”
Relating Structure and Process
Unfortunately, most Intelligence Community reform proposals concentrate on changes in structure and in directive and managerial authorities. Analytic problems, however, actually take place not just at the level of the community as a whole, but at four distinct levels, as well as in the complex interrelationships, both vertical and horizontal, among them. Thus, it is important not only to locate the level at which the obvious symptoms appear, but also the level at which the problem can be solved. In this way, the root causes of failure can be identified and appropriate and effective corrective measures taken.
The National Security Community. The relevant entities include the National Security Council (NSC), the Office of the Director of National Intelligence (ODNI), and the national policymaking and operational elements in the Department of State and the Department of Defense. Among the failures at this level can be misdirected priorities and misallocation of resources; poor communication and coordination; and inconsistent apportionment of authority, responsibility, and capability among the main entities. Such failures flow downward and can easily percolate throughout the subordinate organizations.
For the Intelligence Community, a particular problem at this level may involve its relationships with top-level users, especially managing their expectations. On the one hand, for example, the Intelligence Community often demonstrates an inability or unwillingness to say “no” to consumer requests, which leads to additional priority taskings without concomitant resources or relief from other ongoing activities. Similarly, the Intelligence Community often conveys an illusion of omniscience that fails to make clear its state of knowledge on an issue, the underlying quality of the intelligence, or the degree of uncertainty—all of which can leave the Intelligence Community seemingly responsible for decisions taken on the basis of “bad intelligence.”
The Intelligence Community. This level currently includes the fifteen component intelligence agencies. Failures at this level can include misdirected priorities and budgetary allocations within the Intelligence Community; lack of effective procedures and oversight of them among component agencies; poor communication and coordination among agencies; a lack of enforceable quality-control processes; toleration of substandard performance by individual agencies; poor communitywide technical standards and infrastructure that hinder information sharing; and poor management and oversight of security procedures that impede effective performance. Errors at this level also encompass failures by groups or individuals to make critical decisions, to exercise appropriate authority, or to take responsibility for gross errors that should be worthy of sanction or dismissal.
The Individual Analytic Units and Organizations. It is essential to appreciate the importance of particular analytic environments within specific sub-organizations—an office within the CIA’s Directorate of Intelligence, for example. It is these entities, rather than the organization as a whole, that create the work processes and practices that form the immediate cultural matrix for an analyst’s behavior. Failures at this level can include dysfunctional organizational processes, practices, and cultures that inhibit effective analysis by individuals and sub-units; management attitudes and directives that stress parochial agency objectives; toleration of poor performance; excessive compartmentation and special security procedures that erect barriers to effective execution; poor prioritization and assignment of workflow; inability to create and protect “slack” and conceptual space for intellectual discovery; ineffective recruitment and training; maintaining stand-alone information and analysis infrastructures, including ineffective support for individual analysts; poor direction and management of the analytic process; and, simply, ineffective management of the analytic cadre. This is probably the most important level for creating consistently high-quality analysis because of its impact on the analytic environment, on the selection of methods and processes, and on the work life of individual analysts. Errors at this level are perhaps the most pernicious, however, and they have been widespread and persistent.
Individual Analysts. Failures at this level can include poor performance due to lack of ability, lack of domain knowledge, lack of process expertise, poor social network contacts, or ineffective training; pressures to favor product over knowledge; lack of time; being too busy and too focused to maintain peripheral vision and curiosity, even on high priority work; failure to cooperate and collaborate with others; lack of suitable tools and support; misguided incentives and rewards; and an organizational culture and work practices that tolerate second-rate analysis.
To illustrate the impact of this multi-level hierarchy and to underscore the importance of correctly identifying the locations of causative factors in analytic errors, consider the case of an analyst who fails to interpret correctly the evidence pertinent to a task and draws a wrong conclusion. At first glance, the obvious approach would be to focus corrective actions on the analyst: what caused the failure, and what are the appropriate remedies? Simple incompetence, a rush to complete the assignment, a lack of domain knowledge needed to recognize critical linkages, or a failure to employ appropriate methods could all be causative factors. At this level, the obvious remedies are better screening, training, and mentoring.
It could be, however, that the problem lies with the analytic unit, its work processes, and its management: the tasking was high priority, and this analyst, whose expertise is on another subject, was the only one available; appropriate tools and methods were not provided; training in relevant domain knowledge or on effective new tools had been postponed due to production pressures; or, given the production cycle, the analyst lacked sufficient time to search for all the relevant evidence. The problem could reside even farther up the hierarchy, among the agencies of the Intelligence Community: key data from another agency was not made available, due to compartmentation restrictions or because incompatible information infrastructures prevented the analyst from easily searching another agency’s holdings. Finally, the failure could actually reside at the topmost level, with community management: this account was given such low priority that no collection resources had been assigned to gather information or to provide consistent analytic coverage or, because of the thinness of the evidence base, the inability to answer the question was not made clear to the requester at the start.
However, it is exactly here that the “5 Whys Approach” of the Quality Movement proves its value. Applying this approach, which features a progressively deeper, recursive search, forces the investigator to trace a causative factor to its source. Assume that, in this example, it is a lack of domain knowledge.
Why was an analyst not fully knowledgeable in the domain working that account?
She was covering for the lead analyst, who is away on temporary duty (TDY).
Why did the analytic manager assign that analyst to the task?
She was the only one available.
Why was the analyst not fully knowledgeable on her backup account?
She is an apprentice analyst with only a short time on the account and inadequate mentoring. Her training had been postponed due to scheduling. She didn’t have time to be curious and follow the information scent. She could not access the lead analyst’s “shoebox.”
Why couldn’t she access the shoebox of the lead analyst?
It is his personal collection of tentative hypotheses and uncorrelated data kept as a personal Word file and is not in an accessible database. The shoebox is actually a pile of paper put in temporary storage when the lead analyst went on TDY.
Why is the lead analyst unwilling to share his shoebox?
Why is there no accessible collaborative system for sharing shoeboxes?
The questions would continue through as many rounds as the questioner needed to satisfy himself that he had found the root cause.
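The recursive search described above lends itself to a simple illustration. The sketch below is purely illustrative and is not part of the original text: the causal links in `CAUSAL_CHAIN` are a hypothetical paraphrase of the shoebox example, and `trace_root_cause` is an assumed helper that simply follows each recorded “why” to its answer until no deeper cause is found.

```python
# Illustrative sketch of the "5 Whys" root-cause search (hypothetical data).
# Each key is a symptom or cause; each value is the answer to "why?".
CAUSAL_CHAIN = {
    "wrong conclusion drawn": "analyst lacked domain knowledge",
    "analyst lacked domain knowledge": "backup analyst assigned to unfamiliar account",
    "backup analyst assigned to unfamiliar account": "lead analyst away on TDY",
    "lead analyst away on TDY": "lead analyst's shoebox was inaccessible",
    "lead analyst's shoebox was inaccessible": "no collaborative system for sharing shoeboxes",
}

def trace_root_cause(symptom, chain, max_whys=5):
    """Ask 'why?' up to max_whys times, following the chain of recorded causes.

    Returns the deepest cause reached; a cause with no recorded deeper
    cause is treated as the (provisional) root cause.
    """
    cause = symptom
    for _ in range(max_whys):
        deeper = chain.get(cause)
        if deeper is None:
            break  # no recorded deeper cause; stop here
        cause = deeper
    return cause

# Tracing the example's chain of five "whys" ends at the systemic root cause:
print(trace_root_cause("wrong conclusion drawn", CAUSAL_CHAIN))
# -> no collaborative system for sharing shoeboxes
```

The point the sketch makes concrete is that the search terminates not at the individual analyst (the first answer) but at a systemic shortcoming several levels up, which is exactly where the corrective measure belongs.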
Although the previously cited reports on intelligence failures usually point to organizational stove-piping and technical shortcomings as the most important contributors to failures in collaboration, the sources of such failure are actually more widespread and complex—and more frequently reflect shortcomings in work practices and processes, organizational culture, and social networks. In addition, the proposed solutions that focus on structures and authorities disregard the critical interrelationship between structure and processes and ignore as well the importance of organizational culture on institutional effectiveness. As Stephen Marrin, among others, has noted:
Structure and process must work together in a complementary fashion, and structural changes alone without corresponding changes to existing processes would simplify the workings of the Intelligence Community in some ways, but cause greater complexity in others.
The significant structural reforms legislated in 2004 will also entail substantial short-term transition costs to effectiveness as new organizational arrangements are implemented, processes are developed, and outmoded roles and systems are replaced. The really difficult task will be to redesign the processes, so that they are consistent and complementary to the structural changes that are being made.
The Analysis Phase-Space
Incorrect diagnoses of the causes of analytic failures probably arise from not recognizing the variety and complexity of the roles, missions, and tasks that confront analysts.
At a basic level, incorrect diagnoses of the causes of analytic failures probably arise from not recognizing the variety and complexity of the roles, missions, and tasks that confront analysts. This diversity results in a complex phase-space, illustrated below, that contains a significant number of discrete analytic regions. These certainly cannot be treated as though their perspectives and needs were homogeneous or even similar. The tasks required of a signals intelligence analyst attempting to locate a terrorist’s cell-phone call are fundamentally different from those of an all-source analyst drafting an NIE on Chinese strategic nuclear doctrine. Therefore, because intelligence collection and analysis are not based either on a suite of all-purpose tools or on fungible human expertise that can be instantly swiveled to focus effectively on a different set of problems, this phase-space also implies the need for a similar diversity of analytic processes, methods, knowledge bases, and expertise.
See Graph: A Large and Diverse Intelligence “Phase-Space” [PDF 52.8KB*]
Differentiating Intelligence Roles
Moreover, given this diverse phase-space, conflating three distinct roles played by all-source intelligence adds to the underlying confusion over intelligence missions and functions, the priorities among them, their requirements, and the capabilities needed to make each effective. The traditional assumption that there were only two sets of intelligence consumers, each with distinct mission needs, often led to contraposing support to military operations, which was assumed to be tactical in focus, and national user support, which was assumed to demand deep analysis. In reality, meeting the disparate needs of the users intelligence must serve requires recognizing three distinct roles for all-source intelligence. Two of them, Support to Military Operations (SMO) and Support to Policy Operations (SPO), focus primarily on issues needing immediate information capabilities to assist decisionmaking on current operations. Although SMO and SPO issues are of interest to both national and departmental users, the third role, Warning and Estimative Intelligence (WEI), largely emphasizes issues that are almost exclusively the province of national users and usually take place over longer time horizons.
In all cases, however, although it still uses the term “support,” the Intelligence Community must move beyond the notion that it is segregated from the rest of the national security community and that it merely provides apolitical information to decisionmakers. Intelligence has now become an integral element of both the policy and military operational processes; and the success or failure of its judgments can have the most significant consequences in both domains. Increasingly integrated military operations, in which intelligence directly drives operations, and command centers in which intelligence personnel are fully integrated, are tangible evidence of such changes. As a result, it is important that intelligence appreciate not only the centrality of its role, but also the increased obligations and responsibilities that such a role brings.
Support to Military Operations (SMO): This traditional intelligence role has usually focused on assisting current military operations. Much of this information concerns the current numbers, locations, and activities of hostile units; other information addresses significant elements of the physical environment in which military forces are operating. Some military users also need quite specific current data on subtle technical characteristics of adversary equipment and forces to serve, for example, as targeting signatures or to support electronic warfare (EW) activities. Regardless of type, intelligence supporting operating forces demands extraordinary accuracy, precision, and timeliness to ensure that it is immediately “actionable” under conditions that are highly stressful and potentially lethal.
Increasingly, however, military operators have other operational intelligence needs, such as support for information operations and for security and stabilization in Iraq. To prosecute these missions successfully, the military now also needs far more cultural awareness and timely, accurate information on adversary thinking, motivations, and intentions.
Support to Policy Operations (SPO): Making explicit that this is a distinct role emphasizes the importance of intelligence to daily policymaking across the entire spectrum of national security concerns; it is the “national user” cognate of SMO. SPO provides policymakers and senior officials (importantly including senior civilian defense officials, combatant commanders, and other military officers) with indispensable situational awareness, including important background information, to assist them in executing and overseeing ongoing policy activities and in planning and framing policy initiatives. Because SPO is as intensely focused on providing actionable information, it is as heavily oriented as SMO toward current intelligence and reporting. However, SPO differs somewhat from SMO in content and priorities in that it has always included a greater proportion of less quantifiable, softer information, such as political and economic trends in major countries and groups and assessments of foreign leaders and their intentions.
See Graph: Three Distinctive Needs for Analytic Support [PDF 48.6KB*]
Warning and Estimative Intelligence (WEI): Mary McCarthy, a former National Intelligence Officer (NIO) for Warning, commented on the recommendations of a DCI-chartered study conducted in 1992:
A warning process . . . allows decisionmakers to think through responses they might be obliged to make in haste.
According to that ten-member panel of highly respected intelligence and policy veterans, providing policymakers with persuasive and timely intelligence warning is the most important service the Intelligence Community can perform for the security of the United States.
McCarthy defines warning as “a process of communicating judgments about threats to US security or policy interests to decision-makers.” Thus, warning provides vital support to “national users” in their principal strategic missions—understanding the complex geostrategic environment, fostering vision of objectives, assessing alternatives and determining strategy, and protecting against consequential surprise; most importantly, when done properly, warning is forward looking and anticipatory.
Warning is sometimes thought to be merely alerting decisionmakers to immediately threatening activities, but, in reality, it is a far more complex function and actually addresses two very different kinds of problems. One type of warning is concerned with monitoring activities previously recognized as potentially dangerous, such as a hostile missile launch, and cueing appropriate responses. The second type is a discovery function that assists decisionmakers in identifying those situations and activities whose consequences could have significant (and usually adverse) effects—and which may not necessarily be obvious. When performed effectively, a warning process provides decisionmakers with an anticipatory sensitization that allows them to think through, in a disciplined way, the responses they might someday be obliged to make in haste. Assessments and estimates, on the other hand, also are usually forward looking, but they are designed to be informative rather than part of a process closely tied to triggering contingent responses.
Further complicating the matter is that both types of warning also operate over three different horizons. Strategic warning has always been understood as looking out toward the distant future; it is intended to recognize that a possible threat may be looming—even if it is not imminent—and to provide time to take appropriate preparatory actions, including policies and actions that might prevent the threat from eventuating. Operational warning also looks out in order to identify the characteristics of the threat (the likely and particular methods of attack), so that offsetting contingency plans and actions can be prepared. From this detailed understanding of enemy intentions, capabilities, and concepts, operational warning also serves to identify indicators that an attack is in preparation. Finally, tactical warning is the immediate alerting function that a specific (with respect to time, place, or character) hostile activity has begun or is about to begin.
An important but often overlooked element of warning over all three horizons is the key role played by negative evidence, which can help confirm that potentially threatening activities are not occurring and prevent costly and potentially consequential responses from being taken or scarce resources from being squandered. During the confrontation between the United States and the Soviet Union, and, in particular, during periods of high tension between them, one of the most important functions of warning was to inform the leaders that, “Today is not the day.”
Both the warning and estimative functions are designed to focus more on informing decisionmaking with context and long-term implications than with supporting ongoing activities. The preparation of assessments and estimates, as well as development of warning indicators, has more to do with analysis and judgment than with reporting; it demands deep expertise as well as an ability to place knowledge of the subject in broad context. These important functions serve the entire national security community.
Although warning is often misconstrued as a current intelligence problem, even tactical warning of specific targets, times, and means must build on this deeper foundation of pre-emptive analysis of threats and responses if it is to be effective. During the Cold War, recognizing that we were engaged in a long-term competition, we were prepared to adjust our intelligence priorities so that analysts could provide assessments of future capabilities and indications of intentions, even though the day-to-day threats were most grave. As was the case in facing the Soviet Union, there may well be tensions today in choosing between serving SMO and SPO, on the one hand, and assuring adequate resources for WEI, on the other hand, as continued access to information needed for an understanding of enemy intentions and capabilities could be sacrificed by meeting the needs for immediately actionable intelligence.
Today’s decisionmakers have many more sources of information than did their predecessors ...; in turn, the Intelligence Community holds far less of a monopoly over information about foreign events and technology developments.
Although warning and estimative intelligence may be seen as the core missions of strategic intelligence, they are also less tied to the details of ongoing operations in which the formal relationships between policymakers and intelligence provide a unique advantage and leverage for intelligence insights. Today’s decisionmakers have many more sources of information than did their predecessors when the Intelligence Community was created; in turn, the Intelligence Community holds far less of a monopoly over information about foreign events and technology developments. Moreover, as one senior intelligence official noted, policymakers see themselves and their staffs as being as substantively knowledgeable on issues of interest as the Intelligence Community, and as capable of serving as their own intelligence analysts. As a result, users increasingly see themselves as participants in the process of forming judgments. An experienced national-level user wrote recently,
Today, the analyst no longer sets the pace of the information flow. The sources of information now available to the policy-level consumer…are far, far greater than a quarter century ago. It is almost a given that today’s policy-level consumer of intelligence is well informed in his or her area of interest and not dependent on an intelligence analyst for a continuing stream of routine, updating information.
Implications of Differentiating Roles
Careful differentiation among the three intelligence roles discloses dimensions that are both analytically important and more meaningful for contemplating intelligence reform than the usually misleading bimodal distinctions between national vs. military users or tactical vs. strategic objectives. What truly distinguishes these intelligence roles is their perspective and emphasis—a significant distinction that has been lost in recent arguments over intelligence reform.
To begin with a particularly important point, a tactical or a strategic focus does not necessarily distinguish military from civilian users. Moreover, the less quantifiable and, therefore, softer information and analysis on individuals, decisionmaking, and social dynamics that used to be produced primarily for national users is now increasingly demanded to support military operations at the tactical level. Such information is inherently more judgmental and inferential—and, therefore, less precise—than analysis of physical or technical characteristics in orders-of-battle (OOBs) and tables of organization and equipment (TOEs). It is less amenable to counting or to the gathering of external physical signatures by technical collection systems; it is more dependent on language skills, deep expertise on the region and cultures, and knowledge of the personalities. It is also harder to validate or prove than estimates of technical factors. Such capabilities go beyond “reporting” that used to be the core of current intelligence.
However, both SMO and SPO are, by nature, mission- or task-oriented and tightly focused on the problem at hand; and this narrowed focus has significant time and perceptual implications for analysts and the intelligence sources supporting them. Given the stress, time pressures, and immediate—as opposed to potential—stakes attendant on current operations, human decisionmakers try to concentrate only on the immediate situation and the information relevant to it, while actively screening out other inputs. This is the intelligence analogue of human “foveal vision,” which offers the highest visual resolution but also a very narrow field-of-view.
In contrast, warning and estimative intelligence are the analogues of human “peripheral vision,” in which there is low resolution but a wide field-of-view. Peripheral vision is very sensitive to cues of dynamic change, which trigger anticipatory responses. Although warning is concerned with activating the response cycle, and estimative intelligence is intended to create a frame of reference for the decisionmaker, both are intended, through preconditioning and anticipatory consideration, to enable a more appropriately and contextually sensitized response on the part of users.
Another important implication of the differing emphasis on decisions with a long-term view and those requiring prompt action—the classic distinction between strategic and tactical—concerns the nature of the advantage to be gained from the information, and, therefore, how it is exploited. In recent years, the tasks of intelligence, and its successes and failures, have focused on providing immediately actionable (in this sense, tactical) intelligence to users—information that can provide a rapid or near-instantaneous advantage, whether for interdicting hostile military forces, preventing terrorist incidents, or supporting diplomatic initiatives. Emphasizing current intelligence for actionable exploitation may have created an unintended mind-set that undervalues the immense importance of knowing and understanding the adversary’s intentions throughout the course of the confrontation, even at the cost of foregoing exploitation of these sources for temporary advantage on the battlefield or in the diplomatic conference room. This stress on current intelligence also influences the priorities among the types and attributes of information we collect, the nature of the collection and processing systems, the analytical methods we use, the stresses we place on analysts, and the metrics by which we assess the performance of intelligence.
When one tries to assess the adequacy of Intelligence Community performance … or prescribe changes … the appropriate answers will almost certainly differ greatly from one role to the other.
There is yet another important distinction between these roles. By looking out to the future, WEI is basically a surprise-preventing function intended to heighten a policymaker’s ability to visualize the consequences of anticipated and unanticipated events and to prepare for them mentally; it is not designed to be “evidence-based truth-telling” that will stand up in court. In addition, as we better appreciate the implications of emergence and the emergent behaviors of complex adaptive systems, we need to place greater emphasis on anticipation while recognizing that precise prediction or forecasting is even harder than previously understood. Appreciating the differences in perspective created by these roles is very important because failing to make clear distinctions among them may aggravate a major problem facing the Intelligence Community: the disconnect between the emphasis on current reporting or providing situational awareness, which must be evidence-based, and the policymaker’s need for anticipatory judgments, which, by nature, trade the confidence derived from hard evidence for an intuitive, or gestalt, understanding of the whole derived from inference. It is unlikely that analysts will have firm evidence to offer the policymaker to support alternative interpretations of the future; they will need to rely instead on inference and informed speculation that is persuasive to decisionmakers. In particular, as one experienced intelligence analyst noted,
“Getting inside the other guy’s head” can only be conjectural because, in most cases, even the “other guy” doesn’t know exactly why he’s doing what he’s doing.
Predictive judgments must be made in both SMO and SPO, but they tend to have short time horizons and reasonably short inferential chains. Because the predictive time-constants are short, observation of adversary actions can serve to validate or disprove these judgments and thereby improve confidence both in them and in the analyst’s judgment.
Those providing SPO, in particular, must continually walk a fine line between serving the policymakers’ needs for relevant, focused, direct support and maintaining objectivity in providing the evidence and analysis. Staying close to the evidence assists the analyst in walking this line. At the same time, the author of this monograph noted a clear consensus among senior intelligence officers at a recent non-attribution conference that analysts can best serve policymakers by offering them thoughtful and thought-provoking views that challenge their assumptions. It must be recognized, however, that helping to alter policymakers’ assumptions is intruding directly into the policymaking process and, thereby, crossing the boundary that Sherman Kent tried to establish. As policymakers demand judgments on actions and consequences farther in the future (moving the intelligence role from SPO to WEI), not only will the intrinsic uncertainties increase, but so will the potential for tensions between policymaker and analyst over the objectivity (and validity) of the judgments and for conflicts among differing judgments.
Another of these distinctions affects intelligence requirements and planning. Unlike SMO and SPO, where the users can clearly identify their areas of interest, priority issues, and information needs, the Intelligence Community must look beyond its users’ perceptual horizons if it is to perform warning and estimative functions effectively. Almost by definition, with anticipatory intelligence, policymakers will be unable to tell the community where to look. Unfortunately, although the Intelligence Community must recognize that attempting to divine requirements for warning and other anticipatory intelligence from the users is not likely to be fruitful, it also must appreciate that it alone will bear the blame for failing to warn against the inevitable surprises arising from outside the fields-of-view of users. This demands, in turn, that the Intelligence Community have some discretion and flexibility to allocate resources in areas not currently considered to be priority targets: listening too closely to the customers, and looking only where directed, guarantees future strategic warning failures.
It is absolutely essential that the Intelligence Community and those who depend on it understand the principal distinctions between these two functions. In their conclusions about the nature of the Intelligence Community’s problems, the extraordinary differences between the report of the 9/11 Commission and that of the SSCI on Iraqi WMD reveal the dangers of conflating the two distinct functions or ignoring the differences among the three roles. When one tries to assess the adequacy of Intelligence Community performance across these domains, identify shortfalls, or prescribe changes—whether in business practices, tools, or organizational arrangements—the appropriate answers will almost certainly differ greatly from one role to the other.