Scientific knowledge, like language, is intrinsically the common property of a group or else nothing at all. To understand it we shall need to know the special characteristics of the groups that create and use it.
The more we learn about the world, and the deeper our learning, the more conscious, specific, and articulate will be our knowledge of what we do not know.
The purpose of this research was to identify and describe conditions and variables that negatively affect intelligence analysis, to develop relevant and testable theory based on these findings, and to identify areas in which strategies to improve performance may be effective. Although there has recently been a great deal of concern that intelligence error and failure rates are inordinately high, in all likelihood, these rates are similar to those of other complex socio-cognitive domains, such as analysis of financial markets. The significant differences are that other complex domains employ systematic performance improvement infrastructures and that the consequences of intelligence error and failure are disproportionately high in comparison with other domains.
It is evident from the literature that intelligence organizations recognize the need to improve their performance and that it is possible to make the domain of intelligence analysis into a coherent scientific discipline. The first step in this transition is to identify and describe performance gaps. Once gaps have been identified, it will be possible to introduce performance improvement methods systematically and to measure the effectiveness of the results. This work is intended to further research toward creating intelligence organizations that are more effective.
The Problem of Bias
Although a researcher might pretend to be neutral and unbiased in presenting his findings and conclusions, personal biases can creep into a finished product. The methods ethnographers employ to collect raw data and the use of interpretational analysis to extract meaning and generate theory virtually guarantee it. In my view, one should be candid about this possibility. I noted in Chapter One that ethnographers bring a certain amount of experiential baggage to their work, myself included. At this point, before discussing analytical difficulties and problems I identified during my research, I want to make the readers aware of an additional personal bias that has developed from observing the Intelligence Community.
During my research, I developed a great deal of empathy for individual analysts and the problems they face in trying to perform their jobs. The reason for this is straightforward and something every anthropologist recognizes. It is part of the process that anthropologists reach a point where they can modify their own identity in order to gain insight into a different culture. The risk is that empathy and identity modification will induce the researcher to “go native” and produce bias in his findings.
Although I may empathize with analysts personally, it is critical for theory development to avoid parroting the views, kudos, or complaints of individual analysts, who may or may not be dissatisfied with their unique professional experience. In order to counteract the empathy bias, I employed multiple data collection techniques and then used those data to refute or confirm each categorical finding. Triangulation is not an infallible system, however, and the reader is advised to approach these findings with both a critical eye and the foreknowledge that this researcher has a number of personal and professional biases.
Finding: Secrecy Versus Efficacy
Secrecy and efficacy conflict. Secrecy interferes with analytic effectiveness by limiting access to information and sources that may be necessary for accurate or predictive analysis. Conversely, openness interferes with security by degrading the value of information resources and by revealing specific sources and methods.
Perfect secrecy would ultimately be unproductive, because it would restrict information to one mind or to a very small group of minds. Limiting available resources in this way would produce organizational failure in competition with resources available to a large and diverse group of adversaries. Perfect openness would also lead to organizational failure, because, with full access to all information, there would never be an instance of advantage for any one group over any other group. In addition, perfect openness would result in adversaries being aware they are under observation and could lead them to alter their behavior to deceive the observer if they so desired.
Between these two extremes, there is some notional point where secrecy and openness converge to create an optimal performance tradeoff. My perception is that, within the Intelligence Community, more organizational emphasis is placed on secrecy than on effectiveness. It is important, in my view, that there be a voice in favor of openness to counterbalance the many voices whose sole or primary responsibility is the advocacy and maintenance of secrecy. I believe this secrecy-efficacy conflict can be stated as a theory, along the following lines.
The more open the system (zero on the secrecy scale representing perfect information access and sharing), the more access an analyst has to all sources of information within the Intelligence Community regarding an adversary. In addition, this openness encourages interorganizational communication, interaction, and sharing of information among analysts and increases the likelihood that an analyst will be more efficient and therefore more effective or accurate in his or her assessment of a situation.
Conversely, counterintelligence is negatively affected by zero-level secrecy and perfect openness. The less open or more compartmentalized the system, the more efficient and effective are counterintelligence activities. Notionally, the two curves would meet somewhere in the tradeoff between efficiency and secrecy. Where they meet would depend on program goals and a clear definition of starting points and end-states.
A notional set of curves illustrates the tradeoff between system efficiency and system secrecy and the effect that the tradeoff has on performance effectiveness, both positive and negative. In this case, the starting and ending points of effectiveness for analysis and for counterintelligence are arbitrary and could be positioned anywhere along a continuum between zero and ten. In this theory, analytic efficiency and effectiveness are purely functions of system openness and do not take into account analytic methods or personnel.
This theory will require additional refinement, and it may or may not be represented by a tradeoff curve like the one proposed here. The theory will also require numerous controlled quantitative experiments to test its explanatory power.
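As a step toward the refinement and testing the theory calls for, the notional tradeoff can be restated as a small executable sketch. The linear curve shapes, the zero-to-ten scales, and the function names below are my own illustrative assumptions, not findings from this study:

```python
# Toy model of the secrecy-efficacy tradeoff. The linear curve shapes
# and zero-to-ten scales are illustrative assumptions only.

def analytic_effectiveness(secrecy):
    """Analytic effectiveness: highest at full openness (secrecy = 0)."""
    return 10.0 - secrecy

def ci_effectiveness(secrecy):
    """Counterintelligence effectiveness: highest at full secrecy."""
    return secrecy

# Scan the secrecy scale in 0.1 steps for the notional crossover point,
# where the two curves meet.
levels = [s / 10 for s in range(101)]  # 0.0, 0.1, ..., 10.0
crossover = min(
    levels,
    key=lambda s: abs(analytic_effectiveness(s) - ci_effectiveness(s)),
)
# For these symmetric linear curves the crossover falls at the midpoint;
# different program goals or curve shapes would move it.
```

Writing the theory this way forces its hidden parameters—curve shape, scale, and starting points—into the open, which is precisely what controlled quantitative experiments would require.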
Finding: Time Constraints
The work itself is a 24-hour-a-day job, but it never seems like I have any time to actually analyze anything when I’m at my desk. I spend most of my time reading daily traffic, answering e-mail, coordinating papers with everybody, and writing. Mostly I read and write, but when the workday is over, I go home and think. It isn’t like I can turn off my brain. So, I guess I do most of my real analysis on my own time.
The majority of the analysts interviewed indicated that time was one of their greatest constraints at work. This comment triangulated with the findings from direct and participant observation. In addition, analysts indicated that there has been a communitywide shift toward focusing on short-term issues or problem solving, thereby addressing the immediate needs of intelligence consumers. This shift in product focus, coupled with a growth in available all-source raw intelligence, has resulted in a change in the pace of analytic production. In order to generate the daily products, analysts have had to change the way they go about doing their work.
I haven’t been doing this very long, but I wish I had been a journalism major instead of poli-sci. The pace is excruciating.
I don’t get much sleep. It’s like cramming for finals, except we do it every day.
Everything I do is reactive. I don’t have time to work my subject. We’re not pro-active here.
I’m so busy putting out today’s fires, I don’t have any time to think about what kind of catastrophe is in store for me a month from now.
About 15 years ago, I used to have 60 percent of my time available for long-term products. Now, it’s between 20 and 25 percent.
I probably have about 30 percent of my time for self-initiated products.
You know, someday somebody is bound to notice that velocity isn’t a substitute for quality. We’ve gotten rid of the real analytic products that we used to make, and now we just report on current events.
Not all analysts indicated that time constraints and information load had a negative effect on their performance. A minority indicated that there was sufficient time to perform analytic duties and prepare analytic products.
This is a tactical shop. It’s all we do. Current reporting is our job.
I work a slow desk. I have plenty of time for self-initiated products—maybe 60 percent or more.
I multitask pretty well. I don’t really experience a time-crunch.
Maybe I just process better than other people, but I don’t really feel pressed for time. Besides, I’d rather be at a hot desk than at a cold desk.
Analytic supervisors were more evenly mixed in their opinions about time constraints. A slight majority of the managers interviewed said time constraints had negative effects on the work environment, work processes, and the morale of their staff. A majority of them also put analytic time constraints in a larger context of policy making. They indicated that the decision-cycle of policymakers was 24 hours a day and that their responsibility was to support that decision cycle with current intelligence.
In discussing their perceptions of consumer demand, the managers’ views of the nature of those demands were mixed.
I want my analysts to produce long-term products. I want them thinking through their subjects. The decision makers want well-thought-out products, not just daily briefs.
Our customers want current production. They never complain about the daily products and, frankly, I doubt they have time to read the longer stuff.
My consumers like the bigger pieces. They like having the context and broader picture. They don’t want to be spoon fed.
I’ve never had a customer tell me they want more to read.
Our customers want to avoid surprise. As long as we keep them from being surprised, I don’t care if we do daily or long-term production. I don’t think they care either.
Finding: Focus on Current Production
The present daily production cycle and the focus on current intelligence also affect group interactions and the analytic process.
It doesn’t matter if I’m writing a piece myself or if I’m coordinating a piece with some group. We don’t sit around and test hypotheses, because we’re too busy writing. We’ve got serious deadlines here.
If, by group analysis, you mean the senior expert in the room tells everybody what he thinks, and then we generally agree so that we can get back to our own deadlines, then, sure, there’s a group process.
We used to have groups that did current reporting and different groups that did longer term products. We still have some of that, but it is very limited. I couldn’t say what happened exactly, but we’re all doing current production now.
The Analytic Process
People seem to have confused writing with analyzing. They figure that if you just go through the mechanics of writing something, then you must have analyzed it. I don’t know about everybody else, but it doesn’t work that way for me. I need time to think through the problem.
Our products have become so specific, so tactical even, that our thinking has become tactical. We’re losing our strategic edge, because we’re so focused on today’s issues.
Alternative analysis is a nice concept, but I don’t have the time to do it. I’ve got to keep up with the daily traffic.
I use several analytic techniques that are relatively fast. Scenario development, red teams, competing hypotheses, they’re all too time consuming.
We’ve got Bayesian tools, simulations, all kinds of advanced methods, but when am I supposed to do any of that? It takes all my time to keep up with the daily reporting as it is.
I don’t have time to worry about formal analytic methods. I’ve got my own system. It’s more intuitive and a lot faster.
Finding: Rewards and Incentives
The shift in the analytic production cycle is not only reflected in the products and processes but also in the way analysts perceive the system by which intelligence organizations reward and promote employees. Employees see their opportunities for promotion as being tied directly to the number of daily products they generate and the amount of social capital or direct consumer influence they amass, most often when their work is recognized by senior policymakers.
In any given week, I could devote about 20 percent of my time to longer think pieces, but why should I? You can write all the think pieces you want, but, if you don’t write for the daily briefs, you aren’t going to move into management. These days the only thing that matters is getting to the customers.
If I write a 12-page self-directed piece that goes out as a community product, and somebody else writes one paragraph with two bullet points that goes into a daily brief, the guy who got in the daily brief is going to get the recognition. Why waste my time with the big products?
It isn’t really official policy, but the reality is that sheer production equals promotion. People talk about quality, but, in the end, the only measurable thing is quantity.
Our group has a “team award” of 5,000 bucks. Last year, they gave it to the one guy who published the most. I’m not sure how that one guy won a “team award,” but there you go.
Technically, I have four bosses. The only thing that seems to keep them all happy is volume. It’s like piece work.
Quality? How do you measure quality? Quantity—now that’s something you can count.
Promotion is based on production—pure and simple.
In sum, aside from specific tactical groups, staff positions that generate limited social capital, and individual cognitive differences, there is a majority sentiment among the analysts interviewed that the combination of a shorter production cycle, information load, a shift in product focus, and organizational norms regarding promotion has had an impact on analytic work and intelligence analysis itself.
Finding: “Tradecraft” Versus Scientific Methodology
Human beings do not live in the objective world alone, nor alone in the world of social activity as ordinarily understood, but are very much at the mercy of the particular language which has become the medium of expression for their society…The fact of the matter is that the “real world” is to a large extent unconsciously built upon the language habits of the group…We see and hear and otherwise experience very largely as we do because the language habits of our community predispose certain choices of interpretation.
The Intelligence Community, in its culture and mythos and in its literature, tends to focus on intelligence operations rather than on intelligence analysis. Open literature about the community certainly does so. Along with time constraints and the analytic production cycle, the private and public focus on operations has had an effect on intelligence analysts and analytic methodology. The principal effect is the spread of the concept of “tradecraft” within the analytic community.
Community members quite often used the word “tradecraft” to describe intelligence analysis during the interviews, observations, training programs, workshops, and actual analytic tasks that I performed for this study. Analysts, managers, instructors, and academic researchers employed the word “tradecraft” as a catchall for the often-idiosyncratic methods and techniques required to perform analysis. Although the intelligence literature often refers to tradecraft, the works tend to be a collection of suggestions and tips for writing and communicating with co-workers, supervisors, and consumers instead of focusing on a thorough examination of the analytic process and techniques.
The notion that intelligence operations involve tradecraft, which I define as practiced skill in a trade or art, may be appropriate, but the analytic community’s adoption of the concept to describe analysis and analytic methods is not. The obvious logical flaw with adopting the idea of tradecraft as a standard of practice for analytic methodology is that, ultimately, analysis is neither craft nor art. Analysis, I contend, is part of a scientific process. This is an important distinction, for language is a key variable in anthropology and often reveals a great deal about the cognition and culture of a community of interest.
The adoption by members of the analytic community of an inappropriate term for the processes and methods employed in their professional lives obfuscates and complicates the reality of their work. The adoption of the word “tradecraft” demonstrates the analytic community’s need to create a professional identity separate and unique from other disciplines but tied directly to the perceived prestige and cachet of intelligence operations. Adopting “tradecraft” as a term of reference for explaining work practices and as a professional identity marker may seem trivial. Yet the term, and its effect on the community, has unanticipated consequences.
Tradecraft purposefully implies a mysterious process learned only by the initiated and acquired only through the elaborate rituals of professional indoctrination. It also implies that the methods and techniques of analysis are informal, idiosyncratic, unverifiable, and perhaps even unexplainable. “Good” methods are simply those that survive, and then are passed on by “good” analysts to novice analysts. Unfortunately, “good” in both instances is not an objective measure. That is, there is no formal system for measuring and tracking the validity or reliability of analytic methods, because they are both perceived and employed within the context of idiosyncratic tradecraft. When asked to describe the analytic process, analysts responded in a variety of ways.
First, I figure out what I know and what I don’t know about some situation. Then, I look for information to fill the gap.
I have a model of the situation in my head. Whenever something new comes in, I see if it fits with the model. If it does, I add it to the model; if it doesn’t, I try to figure out why.
I’ve found a system that lets me keep up. I just look for anomalies. When I see any novel data, then I worry.
I’m always looking for anything strange or out of place. Then, I source it to see if it is meaningful.
The current data ought to fit a certain pattern. If it doesn’t, I know something is wrong.
First, I print the daily traffic I’m concerned with; then I lay out all of the relevant stuff in front of me on my desk or the floor; then I start looking for threads.
I’m looking for links and patterns. Once I figure out the pattern, I can figure out where to look next.
I use patterns. If things start happening, out of the ordinary things, I pay attention to them.
I try to build patterns out of the data. It helps me predict what will happen next.
I come up with a few scenarios and see what the evidence supports.
I look for data that are diagnostic: some piece of evidence that rules out certain possibilities.
I try to weigh the evidence to see which scenario it supports.
Although anomaly-detection, pattern-recognition, and weighing data may appear to be idiosyncratic tradecraft based on individual expertise and cognitive skills, these methods can be formalized and replicated if the operating parameters, variables, and rules of evidence are made explicit. This is to say that intelligence analysis can be reconstructed in the context of a scientific method, which is merely an articulated, formal process by which scientists, collectively and over time, endeavor to put together a reliable, consistent, and nonarbitrary representation of some phenomena. Broadly, the steps include:
observation and description of phenomena;
formulation of hypotheses to explain phenomena;
testing of hypotheses by independent experts;
refutation or confirmation of hypotheses.
These steps do not suggest that any specific scientific methodology arrives at what is ultimately the truth, but rather that scientific methods are formal processes used to describe phenomena, make predictions, and determine which hypothesis best explains those phenomena. The principal value of any type of methodological formalism is that it allows other researchers to test the validity and reliability of the findings of any other researcher by making explicit, and therefore replicable, the means by which anyone reaches a specific conclusion.
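To make concrete what such formalism might look like for analysis, here is a minimal sketch in the spirit of a structured competing-hypotheses technique. The scenario, evidence items, and scoring rule are all invented for illustration; the point is only that an explicit, recorded matrix lets another analyst rerun and audit the same judgment:

```python
# Hypothetical evidence-versus-hypothesis matrix. Every rating here is
# invented; in practice each cell would be a sourced, recorded judgment
# that other analysts could inspect and challenge.
evidence_matrix = {
    "troop movement near border":      {"exercise": True,  "invasion": True},
    "reservists not mobilized":        {"exercise": True,  "invasion": False},
    "field hospitals deployed":        {"exercise": False, "invasion": True},
    "ammunition stockpiles dispersed": {"exercise": False, "invasion": True},
}

def inconsistency_score(hypothesis):
    """Count evidence items inconsistent with a hypothesis. Following the
    refutation step above, the hypothesis with the fewest inconsistencies
    survives testing best."""
    return sum(
        1 for ratings in evidence_matrix.values() if not ratings[hypothesis]
    )

# Rank hypotheses from least to most refuted by the recorded evidence.
ranked = sorted(["exercise", "invasion"], key=inconsistency_score)
```

Because the matrix and the scoring rule are written down rather than held in one analyst's head, the steps of observation, hypothesis formulation, and refutation become replicable instead of idiosyncratic.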
The idea that intelligence analysis is a collection of scientific methods encounters some resistance in the Intelligence Community. The interview data analyzed in this study highlight many subtle—and not so subtle—prejudices that analysis is not a science. That is, it is an art or craft in which one can attain skill but not a formal discipline with tested and validated methodology.
What we do is more art and experience than anything else.
Science is too formal. We can’t actually run experiments here.
How would you actually test a hypothesis in intelligence?
Science is what you do in a lab.
We’re not scientists; we’re analysts. We don’t generate the data.
We don’t worry too much about theory; we worry about the facts.
In my discipline, I might be a scientist, but, in intelligence, I am a practitioner.
I use science for my area, but I don’t think intelligence analysis is science.
As long as intelligence analysis continues to be tradecraft, it will remain a mystery. The quality of any tradecraft depends on the innate cognitive capabilities of the individual and the good fortune one has in finding a mentor who has discovered, through many years of trial and error, unique methods that seem to be effective. This process of trial and error is, in general, similar to any scientific process, except that the lessons learned in tradecraft, unlike those of other disciplines, often occur without being captured, tested, or validated.
In an oral tradition, individual tradecraft methods are passed on by means of apprenticeship. The consequence for any culture tied to an oral tradition is the loss of important knowledge that occurs with the loss of practitioners. In organizations, the retirement of experts and innovators leads to the loss of that expertise and innovation, unless there is some formal written and educational system to keep that knowledge alive.
The data collected through both interviews and observation indicated that there were, in fact, general methods that could be formalized and that this process would then lead to the development of intelligence analysis as a scientific discipline. The principal difficulty lies not in developing the methods themselves, but in articulating those methods for the purpose of testing and validating them and then testing their effectiveness throughout the community. In the long view, developing the science of intelligence analysis is easy; what is difficult is changing the perception of the analytic practitioners and managers and, in turn, modifying the culture of tradecraft.
Finding: Confirmation Bias, Norms, and Taboos
Organization is key, because it sets up relationships among people through allocation and control of resources and rewards. It draws on tactical power to monopolize or parcel out liens and claims, to channel action into certain pathways while interdicting the flow of action into others. Some things become possible and likely; others are rendered unlikely.
Eric Wolf 
Time constraints affect both the general analytic production cycle and analytic methodology by contributing to and exacerbating cognitive biases. Although there are any number of cognitive biases to which the human mind is susceptible, one in particular became evident during the triangulation phase and interpretive analysis of the interview and observation data of this study. The cognitive bias identified most often was confirmation bias, which is the tendency of individuals to select evidence that supports rather than refutes a given hypothesis.
Although the psychological mechanism by which confirmation bias occurs is still debated, confirmatory behavior is a consistent finding throughout the experimental psychology and cognitive science literature. Rather than focusing on mechanism and nomenclature, this work uses the term “confirmation bias” simply to describe confirmatory behavior. This behavior was described by participants during the interviews and observed during direct and participant observations throughout the fieldwork.
Analysts were asked to describe the work processes they employed to answer questions, solve problems, describe and explain phenomena, make forecasts, and develop intelligence products. The process they described began with an examination of previous analytic products developed by their organization in order to establish a baseline from which they could build their own analysis.
When a request comes in from a consumer to answer some question, the first thing I do is to read up on the analytic line.
The first thing I do is check the previous publications, and then I sort through the current traffic.
I’ve looked at our previous products, and I’ve got a good idea of the pattern; so, when I sort through the traffic, I know what I’m trying to find.
I try to keep up with all the products that come out of our area, so I know where to start my piece.
A literature search is often the first step in any research endeavor. The utility of this practice is not merely to define and understand the current state of research in the field but also to determine major controversies and divergences of opinion. Trying to discern controversies and divergence in intelligence products is often difficult, because some of them—national intelligence estimates (NIE), in particular—are specifically designed to produce a corporate consensus for an audience of high-level policymakers.
These products can and do include divergent opinions, in the form of footnotes, but these tend to indicate inter-, rather than intra-, organizational differences. Dissenting footnotes are products of the coordination process, the result of an inability on the part of one or several community organizations to convince the others of a particular point of view. Not surprisingly, the least probable opinion is often the hardest to defend, whereas the most probable opinion is the easiest to support.
The literature search approach may promote a logical consistency among analytic products, but it has the unintended consequence of imposing on the analyst using it a preexisting mental model of the phenomena in question. The existing analytic products describe, implicitly or explicitly, a set of working hypotheses that an analyst may wish to reflect in his or her own work. Of course, these existing hypotheses are rarely tested each time they are incorporated into new products. What tends to occur is that the analyst looks for current data that confirm the existing organizational opinion or the opinion that seems most probable and, consequently, is easiest to support. As this strategy is also the most time-efficient technique, it reduces the time constraints associated with the daily production cycle.
This tendency to search for confirmatory data is not necessarily a conscious choice; rather, it is the result of accepting an existing set of hypotheses, developing a mental model based on previous corporate products, and then trying to augment that model with current data in order to support the existing hypotheses. Although motivational and heuristic factors and a tendency toward “groupthink” might contribute to confirmatory behavior in intelligence analysis, my observations and interviews during this study suggest that the predominant influence is selectivity bias in order to maintain a corporate judgment.
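The cost of this selectivity can be illustrated with a toy Bayesian model (my own construction, not part of the study's data). Two analysts see the same evidence stream, but one silently discards every item that refutes the favored hypothesis:

```python
import math

def posterior(prior, evidence, ignore_refuting=False):
    """Update P(H) over an evidence stream. Supporting items are assumed
    to occur with probability 0.6 under H and 0.4 under not-H; refuting
    items the reverse. The likelihoods are illustrative only."""
    log_odds = math.log(prior / (1.0 - prior))
    for item in evidence:
        if item == "support":
            log_odds += math.log(0.6 / 0.4)
        elif not ignore_refuting:  # a biased analyst skips these updates
            log_odds += math.log(0.4 / 0.6)
    odds = math.exp(log_odds)
    return odds / (1.0 + odds)

# A stream in which the favored hypothesis is actually false:
# refuting items outnumber supporting ones 60 to 40.
stream = ["support"] * 40 + ["refute"] * 60

unbiased = posterior(0.5, stream)                          # near zero
confirming = posterior(0.5, stream, ignore_refuting=True)  # near one
```

Even when refuting evidence dominates the stream, the analyst who updates only on confirming items ends up nearly certain of a false hypothesis, while the analyst who weighs everything correctly abandons it.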
The maintenance of a corporate judgment is a pervasive and often-unstated norm in the Intelligence Community, and the taboo against changing the corporate product line contributes to confirmation biases. Once any intelligence agency has given its official opinion to policymakers, there exists a taboo about reversing or significantly changing the official or corporate position to avoid the loss of status, trust, or respect. Often, policymakers perceive a change in judgment as though the original opinion was wrong, and, although unstated, there are significant internal and external social pressures and consequences associated with being perceived as incorrect.
An analyst can change an opinion based on new information or by revisiting old information with a new hypothesis; in so doing, however, he or she perceives a loss of trust and respect among those with whom the original judgment was shared. Along with this perceived loss of trust, the analyst senses a loss of social capital, or power, within his or her group.
It is even more difficult for an intelligence agency to change its official position once it has made its judgments known to those outside of the organization. There is a sense that changing the official product line will be seen outside of its context—the acquisition of new information, for instance—and that it will be perceived by the policymakers as an example of incompetence or, at least, of poor performance on the part of the intelligence agency.
This perception then carries with it the threat of a loss in status, funding, and access to policymakers, all of which would have a detrimental effect on the ability of the intelligence agency to perform its functions. In short, it serves the interest of the intelligence agency to be perceived as decisive instead of academic and contradictory, and that message is transmitted to the analysts. In response to the organizational norm, the analyst is inclined to work the product line rather than change it.
Our products are company products, not individual products. When you publish something here, it’s the official voice. It’s important for us to speak with one voice.
It doesn’t do us any good if people think we can’t make up our mind.
Access matters; if people think you don’t know what you’re talking about, then they stop seeing you.
We already briefed one thing. I can’t go in there and change it now. We’ll look like idiots.
When I was new, I wrote a piece that disagreed with our line. Let’s just say, I’m more careful about that now.
Another organizational norm that contributes to confirmation bias in the Intelligence Community is the selection and weighing of data according to classification. Secrets carry the imprimatur of the organization and, in turn, have more face validity than information collected through open sources.
Most analysts indicated that they considered “secret” data collected by covert means to be more important or meaningful than “open” or unclassified data. Analysts said that they rely on open sources to help fill in missing pieces of their mental models but that they test the model’s validity with secret information. Choosing to rely on classified data as more meaningful to problem solving and as a tool for testing the validity of their hypotheses serves to exacerbate the confirmation bias.
I’m an all-source analyst, so I use whatever I can get my hands on; but, if the traffic comes from operations, I tend to pay more attention to it than to information in the open literature.
There is something special about the word “secret” in my business. It says that it must be important because people had to go and get it rather than its just showing up in the news. We tend to weigh classified material as more important than other sources.
We get all kinds of sourced material, but I think I trust technical collection more than the other INTs.
I try to use everything we get, but, if we are jammed, I rely on sources we collect.
Our value-added is classified sourcing. Everybody has access to the Web and CNN.
All our customers are analysts these days. What we bring to the party is information no one else has.
We’re in the business of secrets. If you see that stamped on something, it must be there for a reason.
The overreliance on classified information for hypothesis testing creates a situation in which the data are screened and sorted by the organization before they are selected and tested by the analysts. Classified information comes from very specific types of technical and human sources, and it is filtered through very specific reporting channels. It also tends toward homogeneity because of those source types and reporting mechanisms. Because it is generated and packaged in specific formats using specific processes, classified information lacks the diversity inherent in open information, and this contributes to confirmation bias.
In sum, operating under difficult time constraints, trying to make new work accord with previous products, trying to maintain the prestige and power of the organization, and assigning greater weight to secret information than to open information have a cumulative effect, and the analyst often finds himself or herself trying to produce daily products using the most time-efficient strategies available instead of generating or testing hypotheses by way of refutation.
The persistence of the notion of tradecraft, coupled with organizational norms, promotes the use of disjointed analytic strategies by separating intelligence analysts from other scientific disciplines. These conditions have had an effect on the self-concept of analysts and have molded the way analysts perceive their own identity.
Finding: Analytic Identity
The self is something which has a development; it is not initially there, at birth, but arises in the process of social experience and activity, that is, develops in the given individual as a result of his relations to that process as a whole and to other individuals within that process.
Asked to define their profession, the majority of analysts described the process of analysis rather than the actual profession. The question, “What is an intelligence analyst?” resulted most often in a description of the work day and of the production cycle of analytic products and very seldom in an explanation of analytic methodology or a definition of an analyst outside of some specific context. With very few exceptions, analysts did not describe intelligence analysis as its own discipline with its own identity, epistemology, and research tradition.
This is not necessarily uncommon. When physicians are asked to describe their profession, they tend to respond with a specific subdiscipline: “I’m a cardio-thoracic surgeon,” for example. When asked for a more general description, however, they tend to respond, “I’m a doctor” or “I’m a physician.” That is, in selective, insular professional cultures, practitioners are able to define their role in both a specific and general fashion. Intelligence analysts had difficulty defining their professional identity in a general way and often relied on specific context to explain what it is that they do and, by extension, who they are.
The perception of individual analysts regarding their professional identity was associated most often with their organization’s function or with their own educational background and not with intelligence analysis as its own unique discipline.
I work counternarcotics.
I work counterterrorism.
I’m a military analyst.
I’m a leadership analyst.
I’m an economist.
I’m a political scientist.
In addition to these categories, many analysts described their professional identity in terms of intelligence collection methods or categories.
I do all-source analysis.
I’m a SIGINT analyst.
I’m an IMINT analyst.
I’m a technical analyst.
The shift in focus to daily analytic products, the changes in the production cycle, and a heterogeneously defined professional discipline have had an additional effect on the professional identity of analysts within the Intelligence Community. Analysts often commented that they perceived their job and their daily work routine as more akin to reporting than to analysis.
Basically, on a day-to-day basis, it’s like working at CNN, only we’re CNN with secrets. Actually, it’s more like CNN’s Headline News.
Imagine USA Today with spies—bullet points, short paragraphs, the occasional picture. You know, short and simple.
I think of myself as a writer for the most important newspaper in the world.
Many analysts expressed dissatisfaction with the shift in work processes from long-term forecasting toward current reporting and with the subsequent shift in their own professional identity within the Intelligence Community. The current sentiment about identity was often contrasted with an idealized past that was described as being freer of current production practices and products.
About 15 years ago, I would have described myself as a scholar. Now, I’m a reporter. I’ve got 15 people trying to change my work into bullet points. Presumably, nobody has time to read anymore.
When I joined, it seemed that the word “analyst” was shorthand for “problem solver.” Now, it’s shorthand for “reporter.”
I’m proud of where I work. I’m proud of the job that we do. But, it is hard to take pride in one paragraph. I have to look at the big picture, or I would get discouraged.
I spend most of my waking hours doing this, but I still can’t really say what an analyst is.
I’m not a reporter, and I’m not an academic. I’m somewhere in between.
The heterogeneous descriptions and definitions of intelligence analysis as a professional discipline were consistent findings during this study, indicating the need for a clear articulation and dissemination of the identity and epistemology of intelligence analysis. A clearly defined professional identity would help to promote group cohesion, establish interagency ties and relationships, and reduce intra- and interagency communication barriers by establishing a professional class throughout the Intelligence Community. At an individual level, a clearly defined professional identity helps to reduce job dissatisfaction and anxiety by giving larger meaning to an individual’s daily actions.
Finding: Analytic Training
When I started, there wasn’t much training available. There were a few advanced courses, but, for the most part, it was on the job.
A professional identity is a disciplinary norm in other domains that are as cognitively demanding as intelligence analysis, such as medicine, aeronautics, and jurisprudence. These domains practice a general system of professional enculturation that progresses from a basic education program to specialized training. Such training programs help to differentiate communities of practitioners from the general public, create specific and unique professional identities, and develop basic communication and task-specific skills. They also help the profession to continue to advance through formal research efforts.
This is not the case within the Intelligence Community as a whole. Generally, the intelligence agencies that do provide basic and advanced training do so independently of other intelligence organizations. A number of intelligence agencies do not provide basic analytic training at all or have only recently begun to do so, relying instead on on-the-job experiences and informal mentoring.
We haven’t had a culture of training analysts here in the past. It’s only in the last year or so that we’ve started to change that.
When I started here, analysts were considered administrative personnel. We didn’t have a training program. I think they just started one this year.
My background was technical analysis, and we had a lot of operational training where I used to work. But now that I’m doing more strategic analysis, I’ve had to make it up as I go along.
We have a basic training program, but it is different from the other agencies. Our mission is different. The problem is that we talk past each other all the time.
When I got hired, I had an advanced degree. People assumed that, if I had a Masters, I could just figure out what I was supposed to do.
The focus of training within the community varies widely and is shaped by the mission of the agency, whether technical, tactical, or operational. Many agencies spend a considerable amount of time teaching new analysts how to prepare briefings, write papers, and perform administrative functions unique to their agency. This is logical from the perspective of agency managers, who naturally believe that investments made in personnel, training, and readiness ought to be tailored specifically for their own organizations.
The problem with an agency-centric view is that, without a general communitywide training program for intelligence analysts, agencies and their analysts have difficulty finding, communicating, and interacting with one another. Analysts often said they were disinclined to draw on resources outside of their own agency, indicating that either they do not know whom to contact or their experience in the past has been influenced by a strict organizational focus.
The media keep talking about intelligence failures and communication breakdowns in the Intelligence Community. What do they expect? We don’t even speak the same language.
It’s taken me 15 years to build my own network. If I didn’t have my own contacts, I wouldn’t know who to call.
I don’t bother going outside. Our focus is different here.
We have official channels, but it only really works if you trust the person on the other end of the phone. That’s hard to do if you don’t know them.
Without an inclusive communitywide basic training program, differentiation between the intelligence analysis discipline, as a whole, and other fields of study is unlikely. A community of practitioners will have difficulty interacting with one another, communicating between and within organizations, and establishing a professional identity, which is a key ingredient in the development of a professional discipline.
 Philosopher of science Thomas Kuhn described the now-common concept of paradigm shifts in scientific revolutions. He posited that paradigm shifts are tied to cultural and social constructionist models, such as Vygotsky’s (See footnote 22 in Chapter Three). Thomas Kuhn, The Structure of Scientific Revolutions.
 Karl Popper was one of the 20th century’s pre-eminent philosophers of science. Karl Popper, Conjectures and Refutations: The Growth of Scientific Knowledge.
 Performance gaps are the difference or distance between ideal (perfect) organizational performance and actual organizational performance. In this case, ideal performance includes complete data sets, reportorial accuracy, and the ability to avoid strategic, operational, and tactical surprise.
 Throughout the project, my data collection method consisted of written field notes. Anthropologists traditionally include specific detail from participant input or direct observation. Usually, this is in the form of precise descriptions of the actual behavior of participants and transcripts of their verbal interactions. It is also standard practice in field work to capture these data, and the data from the interviews and focus groups, on audio- or videotape. These practices were not followed in this particular case for two reasons: first, the nature of my work was not to document actual practices and procedures; rather, it was to derive categories of variables and individual variables in order to create a taxonomy, and to use the prototype taxonomy to structure the interactions; second, the nature of intelligence work and the environment in which it occurs, as well as its professional practitioners, require that certain data be restricted.
 This has been demonstrated in the psychological literature and is referred to as the Hawthorne Effect. Derived from research that began with an experimental program at Western Electric’s Hawthorne Works conducted between 1927 and 1930, the Hawthorne Theory, broadly interpreted, states that the behavior of subjects changes when they are aware of being observed. See Fritz J. Roethlisberger and William J. Dickson, Management and the Worker; Elton Mayo, The Human Problems of an Industrial Civilization.
 I would like to credit and thank Matthew Johnson at the Institute for Defense Analyses for his help in formulating this theory.
 Social capital refers to the set of norms, social networks, and organizations through which people gain access to power, resources, and reciprocity and through which decisionmaking and policy creation occur. In other words, whom you know is just as important as what you know. Pierre Bourdieu, “The Forms of Capital”; Robert Putnam, “The Prosperous Community” and Bowling Alone. See also the empirical work on social capital summarized in Tine Feldman and Susan Assaf, Social Capital: Conceptual Frameworks and Empirical Evidence.
 Edward Sapir is best known for the Sapir-Whorf hypothesis, which asserts linguistic/cognitive relativity (language and thought are inseparable; therefore, different languages mean different ways of thinking). Edward Sapir, Language.
 The literature on this subject is extensive. For a representative list, see the appendix.
 A corollary to these methods can be found in the practice of radiologists. See Chapter Five for more on expertise.
 Rather than engage in the longstanding and ongoing debate in the academic community about what is and what is not science or a scientific method, suffice it to say that any scientific method needs to be explicit, replicable, and refutable. The literature surrounding this debate is voluminous. The philosophy of science, logic, language, and epistemology has taken this debate in a number of directions. There is, however, a general theme that replication is a key ingredient to any scientific method.
 See section on Endangered Languages in Barbara Grimes, ed., Ethnologue. 14th ed.
 Eric Wolf was an anthropologist who focused on power, social structures, and the third world. His work on power and the lives of peasants is considered a modern anthropological classic. Eric Wolf, Pathways of Power.
 There is a fair amount of disagreement in the psychological literature regarding the mechanism by which an individual displays confirmatory behavior. Some researchers attribute it to motivational factors, for example, a desire to maintain respect within a group. Other researchers attribute it to selectivity factors, an unconscious cognitive selection of data that confirms the current status quo. Some researchers attribute it to social factors, a subspecies of groupthink (see Irving Janis, Groupthink). Still others ascribe it to a misapplication of heuristics, whereby an individual learns a set of rules that solves one problem and then begins using that same set of rules to try to solve other types of problems. Although the literature is extensive, Karl Popper’s The Logic of Scientific Discovery provides a foundation for understanding the issue. Jonathan Evans’ Bias in Human Reasoning: Causes and Consequences is still a useful and concise summary of the research related to confirmation bias.
 Reciprocity in this case has to do with information, judgment, and trust. The classic anthropological text on social reciprocity and trust within and between groups is Marcel Mauss’s The Gift. Originally published in 1950 and based in part on the work of his uncle and mentor, Emile Durkheim, Mauss’s work (Essai sur le Don in its French version) lays the foundation for his contention that reciprocity is the key to understanding the modern concept of social capital.
 In research methodology, face validity is the concept that a measurement instrument appears or seems to measure what it is actually intended to measure and requires no theoretical supporting material. In contrast, content validity depends on the content of the domain and established theories to determine its measures of validity. See David Brinberg and Joseph McGrath, Validity and the Research Process; Edward Carmines and Richard Zeller, Reliability and Validity Assessment; Jerome Kirk and Marc Miller, Reliability and Validity in Qualitative Research; Mark Litwin, “How to measure survey reliability and validity”; William Trochim, The Research Methods Knowledge Base.
 George Mead was an American pragmatist philosopher and social psychologist, who, with John Dewey, made the University of Chicago the home of pragmatist philosophy and the “Chicago School” of sociology at the end of the 19th century. George Mead, Mind, Self, and Society.
 Philip Cushman, Constructing the Self, Constructing America; Anthony Giddens, Modernity and Self-Identity; John P. Hewitt, Self and Society; Lewis P. Hinchman and Sandra K. Hinchman, Memory, Identity, Community; Carl Jung, The Undiscovered Self; George Levine, ed., Constructions of the Self.
 Enculturation is the process or mechanism by which a culture is instilled in a human being from birth until death. In this instance, professional enculturation refers to the acquisition of a professional identity through specific cultural rituals and practices, as displayed, for example, by practitioners who have graduated from medical school, law school, and basic military training.
 See footnote 7 in the Introduction for several recent cross-agency training initiatives.
 Stephen Marrin, CIA’s Kent School: A Step in the Right Direction and “Improving CIA Analysis by Overcoming Institutional Obstacles.”