
Teaching Intelligence Analysts in the UK

What Analysts Need to Understand:
The King’s Intelligence Studies Program

Michael S. Goodman and David Omand

Most of all, the program offers a containing space in which analysts from every part of the community can explore with each other the interplay of ideas about their profession.

Origins

In April 2007, a British newspaper, the Mail on Sunday, ran a story headlined “Can Sherlock Holmes restore the reputation of our bungling spies?” The report observed, “Spies and Whitehall officials are being given a crash course in Sherlock Holmes’ deduction techniques to prevent a repeat of the intelligence failures in the run-up to the Iraq war.” Although not quite accurate, this was the first public mention of an innovative course created in the aftermath of Lord Butler’s report on intelligence and Iraqi WMD.[i]

In this article we shall outline some of the conclusions we have drawn from the first four courses that we have run over the past two years. What do experienced analysts—those with five to 10 years on the job—need to know? Or, rather, what do analysts need to understand?

We are not concerned here with the acquisition of subject knowledge or the honing of techniques of analysis. Such teaching is best delivered in a secure environment with the classified databases and tools to which analysts would have access in their work. Exposure to an academic environment, such as the Department of War Studies at King’s College London, can add several elements that may be harder to provide within the government system: close access to academic disciplines, such as military history, intelligence history, international relations, social sciences and so on; an introduction to the relevant literature; and exposure to a variety of critical views, including the unorthodox. But most of all, it offers a containing space in which analysts from every part of the community can explore with each other the interplay of ideas about their profession.

We have earlier written that “intelligence is not a new phenomenon, the academic study of intelligence is.” That article went on to describe how “intelligence” as an academic discipline is studied and taught in the United Kingdom.[ii] It is worth briefly reiterating some of its findings as they pertain to the training of government intelligence officers. The CIA had recognized as early as 1960 how beneficial it would be to use universities as a means of intelligence training.[iii] Put simply, it was felt that practitioners whose understanding of intelligence was enhanced would be able to work more effectively.[iv] Such a course would be led by someone with “extensive and well-rounded intelligence experience” and as a whole would “apply the teachings of many academic disciplines.”[v]

Lord Butler, in his 2004 Review of Intelligence on Weapons of Mass Destruction, called for an increase in the number of British intelligence analysts and suggested forming a specialization of analysis with scope for advancement within it across the entire British intelligence community. It fell to David Omand, then UK intelligence and security coordinator, to start to turn the report into action. He chaired a high-level implementation group with the chairman of the Joint Intelligence Committee, the heads of the UK intelligence agencies and the permanent heads of the government departments most concerned. It was recognized that:

  • The high level of secrecy that is inevitable within an intelligence community means that training has to be largely in-house, but that, in turn, makes it more important to provide opportunities for analysts to meet and develop a wider professional outlook.
  • Care is needed that analysts do not come to see themselves as a professional “closed shop,” which might make it harder for the intelligence agencies to rotate their intelligence officers between operational, analytical and managerial duties. Such rotation brings the experience of their service to bear during tours of duty in the analytical environment, for example when seconded to the Cabinet Office Assessments Staff or to the Joint Terrorism Analysis Centre (JTAC).
  • The label “analyst” should be interpreted widely to include researchers who regularly use secret intelligence, for example in the Foreign Office or in the Serious and Organised Crime Agency (SOCA), and not just be confined to “all-source analysts.”

A professional head of intelligence analysis (PHIA), working within the Cabinet Office’s Intelligence and Security Secretariat, was subsequently appointed to promote the idea of greater professionalism in analysis and to help generate this sense of profession, albeit a virtual one. One of her early initiatives was to commission us at King’s College London to develop a course for experienced analysts.

With a small staff, the PHIA’s main tasks are to provide advice in the security, defense and foreign affairs fields on gaps and duplication in analyst capabilities, on recruitment of analysts, their career structures and interchange opportunities; to advise on analytical methodology across the intelligence community; and to develop more substantial training on a cross-government basis for all analysts working in these fields. The overall aim of these tasks is to enhance the analytic capability of the United Kingdom’s intelligence community to enable it to work together more effectively and provide the highest quality intelligence to ministers and policy makers.


 

Approach

The aim set for the course can be summarized as promoting multidisciplinary understanding of the concepts, issues and debates regarding intelligence. Analysts will thus become more aware of issues around the meaning, value, nature and proper use of intelligence, and more confident in their own discussions of these topics. Fostering that sense of being part of a single UK intelligence community, and of belonging to the virtual profession of analysts, represents a key underlying motivation for the course.

To achieve this aim we offer the analysts encouragement to look at their profession from four points of view, based on Stafford Thomas’s pioneering fourfold typology of intelligence studies:[vi]

  • The functional approach: studying an intelligence cycle appropriate for the needs of a 21st century national security strategy, looking at the development of intelligence activities, processes, and technologies. The choice of analytic methodology is examined, drawing on the experience of other professions grappling with problems of knowledge.
  • The historical/biographical approach: studying the historical experiences of the use of intelligence, good and bad; examples examined have included the controversy over Iraqi WMD, the Falklands War, and UK counterintelligence against the Soviet Union.
  • The structural approach: studying the institutional development of the UK intelligence community, especially the Joint Intelligence Committee and the more recent JTAC. We look in particular at how the UK intelligence community has adapted to an era of avowal, greater openness, and judicial and parliamentary oversight.
  • The political approach: looking at the part that pre-emptive intelligence now plays in operational decision-making in counterterrorism and other areas. This provides the opportunity to sensitize the analysts to the institutional dynamics of analytical organizations and the obvious pathologies that can occur in the relationship between the intelligence community and its customers. The ethics of intelligence gathering, sharing and public use are examined in the context of current counterterrorism strategies.

These four ways of looking at the subject are interwoven through the classes, each two hours long and typically comprising a mixture of lecture and discussion. Learning in the 10-week course is assessed by means of a 4,000-word essay, marked to King’s College London MA marking criteria. For this, participants are explicitly required not to rely on practical experience but to draw on the wide intelligence studies literature. In their essay they will normally choose the one approach with which they have come to feel most comfortable. Those who take and pass the course are awarded a number of credits, which they can then use toward one of the nine MA degrees offered by the Department of War Studies, or indeed any other MA offered within King’s College London; in effect this is a means of encouraging thinking about broader personal and professional development.


 

What do the sessions cover?

1) The functional approach

Starting with the functional approach, the emphasis is on developing an awareness—a self-consciousness—of the mental processes that we all employ when we do what we call “analysis.” There is much we can learn here from other professions and from recognizing the differences between them. We draw attention to the relevant methods of analysis employed by journalists, physicians, historians, paleontologists, detectives, mathematicians, and physical and social scientists. Each group has something of methodological value to offer to the debate in terms of what makes for reliable evidence, how to judge between competing theories, what makes theories useful, and how uncertainty is dealt with.

One unusual example is paleontology, an academic discipline that has had to develop a methodology and tools for assessing fragmentary, often incomplete evidence on an internationally collaborative basis and for drawing general conclusions from it. For instance, from the example of modern human origins (MHO) comes discussion of paradigm shifts and competing hypotheses and how best to select between them when direct experimentation is not possible. One intelligence tool that we explore is the Heuer model,[vii] as developed by the UK Defence Intelligence Staff (DIS), which provides a structured way for analysts to relate competing hypotheses to their essential assumptions. The need for care over deception, in the form of examples such as the Piltdown fraud, can also be introduced here.

From the mathematicians comes the Bayesian approach, where we emphasize the way that new information can be reliably and consistently incorporated to revise an estimate of the believability of a hypothesis. Heuristics, such as those of the mathematician George Polya, are introduced, including his advice to draw diagrams, to try to recognize when similar problems have been solved in the past, and, if a problem is too hard to solve, to attempt a related but much simpler version.
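The Bayesian revision mentioned above can be expressed in a few lines. The sketch below is ours, not course material, and the prior and likelihoods are invented purely for illustration: a hypothesis held with prior belief 0.3, revised after evidence that is 3.5 times as likely if the hypothesis is true as if it is false.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Revise belief in hypothesis H after observing evidence E.

    prior           -- P(H) before the evidence
    p_e_given_h     -- P(E | H)
    p_e_given_not_h -- P(E | not H)
    Returns P(H | E) via Bayes's theorem.
    """
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Illustrative numbers only: a report 3.5 times likelier under H.
belief = bayes_update(0.30, p_e_given_h=0.7, p_e_given_not_h=0.2)
print(f"revised belief: {belief:.2f}")  # → revised belief: 0.60
```

The same function can be applied repeatedly as successive reports arrive, which is precisely the “reliable and consistent incorporation” of new information the course emphasizes.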

At the same time the fuzzy logic school provides the analysts with cautionary lessons concerning the less than Cartesian categories of the typical real-world problem. A general issue young analysts invariably raise at this point is how far such theoretical examinations of decisionmaking can have application to their real-world problems. The example below, an apparently simple practical problem that just might be posed to an analyst supporting an arms control inspection regime, illustrates the point:

You are an imagery analyst looking for an unlawful biological warfare trailer. You think it could be hidden in one of three equally likely locations, A, B or C. You pick one, say site C, and start to prep the arms control inspectors for a snap inspection. The host country then unexpectedly throws open one of the other sites, site A, to journalists so it is obviously not there. You have the chance to change your advice to the inspection team and tell them now to go to site B or stick with your original choice, C. Should you change, or stick to C?

When posed this question, analysts immediately split into two camps. The minority quickly spots the underlying structure of what in North America is known as the “Monty Hall problem,” from the name of the game show host.[viii] As a problem in probability it is straightforward, if paradoxical. The majority of analysts, who have not come across the problem, refuse to believe the result when they first encounter it but can be persuaded to follow the probabilistic reasoning, as set out in Figure 1.
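For readers who share the majority’s disbelief, a short Monte Carlo simulation can bear the reasoning out. The sketch is our own, and it encodes the strict assumptions on which the textbook answer depends: the host knows where the trailer is, knows the analyst’s pick, and always opens an empty, unchosen site.

```python
import random

def trial(switch):
    """One round: trailer hidden uniformly among sites A, B, C."""
    sites = ["A", "B", "C"]
    trailer = random.choice(sites)
    pick = "C"  # the analyst's initial choice, as in the scenario
    # Host opens an unchosen, empty site (strict assumption: he
    # knows both the trailer's location and the analyst's pick).
    opened = next(s for s in sites if s != pick and s != trailer)
    if switch:
        pick = next(s for s in sites if s != pick and s != opened)
    return pick == trailer

random.seed(1)
n = 100_000
stick = sum(trial(False) for _ in range(n)) / n   # ~1/3
switch = sum(trial(True) for _ in range(n)) / n   # ~2/3
print(f"stick {stick:.3f}  switch {switch:.3f}")
```

Change any of the encoded assumptions — for instance, let the host open a random site regardless of what it contains — and the 2/3 advantage of switching evaporates, which is exactly the teaching point developed below.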

That, however, is the start of the teaching point. The analysis of the probabilities in the graphic depends upon a set of strict assumptions that are not explicit in the question. For the intelligence analyst little, if anything, should be taken for granted, especially statements from the opponent. What unlocks a proper analysis of the problem for the analyst is understanding where implicit assumptions are being made about the reporting being received. For example, do we assume that the opponent knows which initial site was picked (the question does not say so)? If not, the solution is quite different. Would it be safe not to assume he knows, given the history of arms inspection regimes? Is the opponent engaged in deception, using the media as a shield? Can it be safely assumed that the opponent who threw open the site was privy to the secret of where the bio-trailer was actually located? And so on.

In the end, the problem reduces to a number of alternative hypotheses resting on a number of different assumptions, and the analyst can use the Heuer table approach to rank these. Our calculations show that the problem is asymmetric: the wise analyst will advise switching, on the grounds that under some assumptions switching improves the chances of being right, while under the others it makes them no worse.
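The mechanic of such a table is easy to sketch. The hypotheses, evidence items and scores below are entirely hypothetical, invented for illustration; the point is only the Heuer-style tally, in which each piece of evidence is scored as consistent (+1), inconsistent (-1) or neutral (0) with each hypothesis, and hypotheses are then ranked by how little evidence contradicts them.

```python
# Hypothetical evidence matrix for the inspection scenario.
evidence = {
    "site A opened to journalists": {"trailer at B": 0,  "trailer at C": 0,  "deception in play": +1},
    "no emissions detected at C":   {"trailer at B": +1, "trailer at C": -1, "deception in play": 0},
    "unusual truck traffic near B": {"trailer at B": +1, "trailer at C": 0,  "deception in play": +1},
}

def rank(evidence):
    """Rank hypotheses by count of inconsistent evidence, fewest first.

    Heuer stresses inconsistency: a hypothesis is weakened by the
    evidence it cannot explain, not strengthened by tallying support.
    """
    hyps = next(iter(evidence.values())).keys()
    scores = {h: sum(1 for row in evidence.values() if row[h] < 0)
              for h in hyps}
    return sorted(scores.items(), key=lambda kv: kv[1])

for hyp, inconsistencies in rank(evidence):
    print(f"{hyp}: {inconsistencies} inconsistent item(s)")
```

A real Heuer table would also record the diagnosticity of each item and force the analyst to state the assumptions behind each score; this fragment shows only the ranking step.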

One of the objectives of taking the analysts through such exercises is to emphasize that prediction may not match reality because the model of human motivation being used to interpret the intelligence has built-in inappropriate assumptions. This lesson about the nature of explanation is important for analysts to understand. The point has been well made by a leading quantum physicist, drawing on an example originally attributed to Bertrand Russell in his philosophy lectures, and we adapt it to the intelligence world. Imagine a chicken farm where the chickens spy on the farmer and intercept a message that he is ordering much more chicken food. The JIC of chickens meets. Is their key judgment that at last peaceful coexistence has come and the farmer is going to feed them properly in the future? Or is it that they are all doomed, since they are about to be fattened for the kill? It is the same raw reporting, but different implicit assumptions about human behavior.

The fact that the same observational evidence can be extrapolated to give two diametrically opposite predictions according to which explanation one adopts, and cannot justify either of them, is not some accidental limitation of the farmyard environment: it is true of all observational evidence under all circumstances.[ix]

Or, to put it another way, as the Nobel prize–winner Paul Dirac said of the early Bohr model of the hydrogen atom, it is possible to get the right answer for the wrong reason.

We have found that many young analysts implicitly carry in their heads what might be described as an inductivist model of their work, involving experience of being able to generalize from patterns or from changes to recognized patterns. They need to be reminded of

the asymmetry between experimental refutation and experimental confirmation. Whereas an incorrect prediction automatically renders the underlying explanation unsatisfactory, a correct prediction says nothing at all about the underlying explanation. Shoddy explanations that yield correct predictions are two a penny, as UFO enthusiasts, conspiracy theorists and pseudo-scientists of every variety should (but never do) bear in mind.[x]

We emphasize too the risk of overinterpreting evidence and contriving ever more complex explanations to fit available data. As the late Professor R. V. Jones, the father of scientific intelligence, put it in a dictum he called Crabtree’s bludgeon:

No set of mutually consistent observations can exist for which some human intellect cannot conceive a coherent explanation.[xi]

Discussion with analysts usually leads to their volunteering examples from their experience of the human tendency to try to explain away apparently contradictory evidence that might confound the favorite explanation of the moment. A temptation we have all noticed is likely to be unconsciously stronger if that explanation is known to be favoured by the senior customer, or if deciding upon it has been particularly stressful for the organization, in which case a form of cognitive dissonance may effectively blank out discussion of alternative explanations.

A well-documented case that illustrates the pitfalls here, which we give the analysts to examine, is the 1982 “yellow rain” allegation of Soviet BCW agent use in Laos and Cambodia.[xii] In that case there were good reasons for initially giving credence to the reports, but as contrary evidence began to emerge it was explained away by ever more complex explanations. Thus, for example, when the alleged agent particle size proved smaller than might have been expected, that was taken to show just how fiendishly clever the enemy was, because smaller particles could be ingested more quickly through the lungs as well as absorbed through the skin. In the end, analysis by labs such as the UK’s Porton Down showed no trace of BCW agent, and the organic substance found was probably pollen from clouds of defecating wild bees—as perhaps the analysts might have found out if experts on the fauna of the region had been consulted initially, another useful learning point. There may well have been covert activity going on in the region, but this was not the way to go about uncovering it.

We introduce the students gently to postmodern critiques of international relations and the role of intelligence—the only session that we might describe as turbulent since our experience is that most analysts are impatient with modern structuralist thinking. However, it is important for analysts to realize how the language they habitually use, such as intelligence collection, production, analysis, assessment and so-called finished product (and the meaning that different generations of customers may ascribe to words such as probably) are categories that can shape and constrain thinking.

In discussion with analysts we have found our own thinking about the “intelligence cycle” being reshaped. The depiction of the intelligence cycle in Figure 2 uses “access” to cover all three types of information that can be turned into intelligence: traditional secret sources, open sources (including nonintelligence government information, such as diplomatic reporting) and the third, increasingly important, category of private information covered by data protection legislation (such as financial, credit, travel, passport, biometrics and communications records).

We have found that the analysts respond readily to the term “access,” which deliberately conjures up the image of the analyst and the collector working together and the development of a new skill set of mission management to connect them. We only have time in the course for the merest glimpse of the technological possibilities the future will bring here for their work, for example in data mining and pattern recognition software.

Our description of the cycle uses “elucidation” to describe the ways in which usable intelligence can be created by shedding new light on what is going on in theaters of interest, providing a crucial element to situational awareness and providing surer explanations of what has been experienced from which more reliable predictions can be generated.

As Winston Churchill put it: “The further back you look the further ahead you can see.” Certainly the traditional evidence-based inferential work is still there, as it was during the Cold War, but so is seeing inside the head of the enemy. The term “dissemination” is used to convey the sowing of seeds in the minds of other analysts as well as customers, and to a much wider group of potential users, including local police officers or operators of the critical national infrastructure, interested in data streams, pictures, maps and video as well as written reports of the traditional kind.

From these discussions we have the impression that analysts are being pulled in two different directions. On the one hand, the center of gravity of UK intelligence work has shifted to “action-on” intelligence, to use the old SIGINT expression. That brings a very close interaction with the user operating in real time or near real time, a feature of both support for military operations and support for what in UK parlance we might call the civil authority, including law enforcement over terrorism, narcotics, proliferation and serious criminality.

On the other hand, high-level analysis has become more demanding, with military involvement in Iraq and in Afghanistan, where strategic judgments depend crucially on deep knowledge of language, customs, history, religion, tribal relationships, personalities and topography, placing exceptional demands on the analyst. The future will hold many calls for such deep analysis of global phenomena, such as resource shortages and the security impact of climate change, posing real challenges for the next generation of young analysts.


 

2) The historical/biographical approach

Under the heading of the historical approach, the analysts have been able to hear Professor Sir Lawrence Freedman analyzing the dynamic interaction between UK and Argentine intelligence in the run-up to the invasion of the Falkland Islands and showing how perceptions of the moves made by one side affected the other.[xiii] For example, Argentine intelligence incorrectly assumed that a nuclear attack submarine was leaving Gibraltar for the South Atlantic. The UK government was not unhappy to have such a deterrent message understood, but the Joint Intelligence Committee failed to assess that the Argentine junta would, as a result, actually accelerate its plans for invasion before supposed British reinforcements arrived. Such dynamic situations are much the hardest that the intelligence analyst ever has to face. Another important lesson is that dictators may not react the way democracies would.

Different lessons about the use of intelligence have been provided by Gill Bennett, until recently chief historian of the Foreign Office, with her analysis of the meticulous intelligence case built up against Soviet espionage that allowed the UK to expel 105 Soviet officials in 1971 (Operation Foot), a blow from which their effort against the UK never recovered.[xiv] She contrasts that with the hasty and botched action in 1927 against ARCOS, the Soviet trade society that had been fomenting industrial subversion. In attempting to defend his action, Prime Minister Stanley Baldwin revealed to Parliament the contents of an intercepted Soviet telegram with the obvious result that readability of Soviet diplomatic cyphers was promptly lost.


 

3) The structural or institutional approach

It would be fair to say that we have found the analysts less knowledgeable than they need to be about the history of the wider intelligence community outside their own employing agency or organization. In particular, the history of the UK’s Joint Intelligence Committee has many lessons for the analyst in understanding the developing relationship with the policy customer.

Examples abound of JIC key judgments that illustrate predictive intelligence at its worst and best. At its worst, we examine the conclusions of the recently declassified Nicoll Report, which provide the basis for a rich discussion of mirror imaging, perseveration, transferred judgment, etc., all made worse by groupthink.[xv] At its best (leaving aside the double negative, which would be disapproved of today), we have the following historical key judgment based on fresh HUMINT in 1939:

Apparently the reason which was supposed to have led Herr Hitler and his advisers to come to this decision was that they felt the rearmament of the democratic powers was proceeding at such a pace that Germany’s relative strength would inevitably decline. This was therefore the moment to strike…by reason of [intelligence reports] which show which way the wind was blowing, it is unfortunately no longer possible to assume that there is no likelihood of Germany “coming West” in 1939.[xvi]

And this judgment from 1956 on Suez:

Should Western military action be insufficient to ensure early and decisive victory, the international consequences both in the Arab States and elsewhere might give rise to extreme embarrassment and cannot be foreseen.

This shows a nice delicacy about reaching a judgment, not about the enemy but about your own government’s proposed actions.[xvii]

One of the course sessions that has been the most popular has been that dealing with the history of avowal and oversight. We examine how the use of pre-emptive intelligence in countering terrorism has brought greater public awareness and, at times, criticism of intelligence work. We engage the analysts in a vigorous debate about the ethics of intelligence, one of the most appreciated sessions on the course, given sensitivities over the uses that may be made of their intelligence to guide military or police action.

On a lighter note, we have devoted one session in each course to examining how the serious media now operate. Students have been fascinated to talk to the foreign editor of a leading journal and to a leading BBC correspondent to learn first hand about how the process of serious reporting is managed, open and private sources handled, and editorial discretion exercised, since in journalism, as in intelligence analysis, to edit is to choose. Writing accurately and clearly, to a tight deadline, is a skill that both professions have to exercise.

Our media representatives readily concede, however, that there is one big difference. As the Economist put it many years ago on the retirement of Sir Kenneth Strong as the director general of defence intelligence:

Modern intelligence has to do with the painstaking collection and analysis of fact, the exercise of judgment, and clear and quick presentation. It is not simply what serious journalists would always produce if they had time: it is something more rigorous, continuous, and above all operational—that is to say, related to something that someone wants to do or may be forced to do.[xviii]

4) The political approach

Under this heading, the course examines the analyst/customer (variously called the producer/consumer) relationship. Two models are compared at the outset of the course, broadly the school associated in the literature with Bob Gates’s time as DCI and that espoused decades earlier by Sherman Kent. Most of the analysts feel comfortable adapting their approach to circumstances. We discuss times when the former approach is more appropriate, for example in strategic assessment of issues of peace and war (Iraq), and times when a very close mutual understanding is needed (uncovering terrorist networks).

We have many more publicly documented case studies of problems in intelligence assessment to draw on than there are documented successes. The Butler inquiry has provided useful case histories, including A.Q. Khan and Libya, to balance its strictures about intelligence on Iraqi WMD. In the course we do, however, look in detail at the now reasonably well documented controversy over pre-war associations between al-Qa’ida and Iraq and, in particular, the case of Curveball and the Iraqi BW trailers.

We encourage the analysts to distinguish between intelligence “gaps” and intelligence “failures.” Certainly, as far as domestic counterterrorism is concerned, they need to accept that the former will always exist—the analysts are, we find, very balanced in their views about the acceptable limits of surveillance. To be classed as a failure, there has to be a reasonable expectation that the analysts could have had access to actionable intelligence that would have provided timely warning were it not for some negligence, including that resulting from over-stretch, inadequate training, personal dereliction of duty, institutional rivalries and so on. The analyst needs to be alert to the first warning signs of incipient failure conditions.

In looking at the relationship with the user, the writings of Professor R.V. Jones provide examples during WW II when he resorted to advocacy rather than presenting facts neutrally, fearing important warnings were not being heeded. Who could blame an analyst for advocacy, faced with, say, a General Percival in Singapore refusing to accept the reality of the impending Japanese invasion, or a secretary of defense, as Robert McNamara admits in his own memoir, resisting appreciation of the true state of affairs developing in Vietnam?

But the analysts are quick to recognize this must never, ever, become the slanting of intelligence. And the analyst must encourage the customer to recognize that what the analyst is painting is an impressionist portrait, without the complete detail that you would find in a photograph. So what is included as the essential highlights and what is left out as distracting detail is a matter of analytical judgment. Customer and analyst alike need to be conscious of this.

We look, therefore, in a final session at institutional dynamics as they might apply to teams of analysts and their interactions with users. What modes of behavior are likely to encourage innovation and creativity (or not)? How much latitude should the dissenting analyst expect, and what safety valves exist, such as the use of the intelligence counsellor, an independent senior retired figure who can be consulted in confidence over professional issues of conscience? What are the first symptoms of groupthink and blame culture? We find that most of the answers here come from the analysts with little or no prompting from the tutors, demonstrating that recent experiences have had their impact on the intelligence community.


 

Conclusion

To conclude, as a result of having worked with four iterations of the course, we think we now have a better understanding of what, beyond the professional tools of the trade, it would be helpful for the up-and-coming analyst to understand better. Much of this understanding revolves around self-knowledge and the development of sound instincts of curiosity.

The first permanent secretary that David Omand ever met was in the Ministry of Defence in London over 35 years ago. He sat in a large, elegant Whitehall office, inquired kindly about how this new recruit was settling in, and then said, “You may wonder what a permanent secretary does all day. Let me tell you.” He went on, “I sit behind my desk and I transfer papers from my in-tray to my out-tray. And, as I lift them, I sniff them, and 35 years in Whitehall has given me the ability to tell when advice going through to the minister is soundly based and well timed, and it has also given me the nose to detect a wrong’un.”

This encounter was, of course, before the advent of managerialism in the British public service. But his words were good advice in relation to developing strong professional instincts. Perhaps, for he was a highly educated man, he had in mind Wittgenstein’s account of a visit to a tailor, when the experienced customer who knows his own mind came to indicate his choice from an endless number of patterns of suiting—almost beyond words of explanation—no, this is slightly too dark, this is slightly too loud, this is just right.[xix] The experienced mind is demonstrated by the way choice and selection is indicated.

Much of the early career may necessarily be spent in acquiring mastery of the necessary technical skills of the analytic trade, processing raw intelligence, in searching through imagery or communications patterns and collating data of every kind. For experienced analysts, however, what will make the difference are the instincts—which we believe can be developed—that can be brought to bear to generate hypotheses worth testing on the evidence base. It may rest on the ability to get into the mind of the adversary, to understand the responses of a foreign culture, to sense when new thinking is needed, and—in the words of that permanent secretary—to spot a wrong’un. It will rest also on deep understanding of the world inhabited by the users of their intelligence, to understand what intelligence they need to do their job better, and also to sense what they do not yet know that they need to know, and that the intelligence community might be able to provide if appropriately tasked.

To conclude with the words of Richards Heuer, which might have been written for the course at King’s College London:

Intelligence analysts should be self-conscious about their reasoning processes. They should think about how they make judgments and reach conclusions, not just about the judgments and conclusions themselves.[xx]


 


Footnotes

[i]HC 898. The Lord Butler of Brockwell, Review of Intelligence on Weapons of Mass Destruction (London: The Stationery Office, 2004).

[ii]M.S. Goodman, “Studying and Teaching About Intelligence: The Approach in the United Kingdom,” Studies in Intelligence 50, no. 2 (2006): 57–65.

[iii]P.J. Dorondo, “For College Courses in Intelligence,” Studies in Intelligence 4, no. 3 (1960).

[iv]S.T. Thomas, “Assessing Current Intelligence Studies,” International Journal of Intelligence and Counterintelligence 2, no. 2 (1988): 239.

[v]Dorondo, A15–A16.

[vi]Thomas, 239.

[vii]R. Heuer, The Psychology of Intelligence Analysis (CIA: Center for the Study of Intelligence, 1999).

[viii]An entertaining simulation can be found at math.ucsd.edu/~crypto/Monty/monty.html

[ix]D. Deutsch, The Fabric of Reality (London: Allen Lane, 1997).

[x]Ibid.

[xi]R.V. Jones. Reflections on Intelligence (London: Mandarin, 1989).

[xii]With acknowledgments to Professors Meselson and Perry Robinson, who generously allowed us to draw on their work on this subject as part of the Harvard Sussex program.

[xiii]L. Freedman, The Official History of the Falklands Campaign (London: Routledge, 2005).

[xiv]Documents on British Policy Overseas. Series III: Volume I – Britain and the Soviet Union, 1968-1972. (London: The Stationery Office, 1997).

[xv]M.S. Goodman, “The Dog That Didn’t Bark: The Joint Intelligence Committee and the Warning of Aggression,” Cold War History 7, no 4 (November 2007): 529–51.

[xvi]Cited in W. Wark, The Ultimate Enemy: British Intelligence and Nazi Germany, 1933–1939 (Ithaca, NY: Cornell University Press, 1985).

[xvii]Cited in P. Cradock, Know Your Enemy: How the Joint Intelligence Committee Saw the World. (London: John Murray, 2002).

[xviii]The Economist, 1 October 1966: 20.

[xix]L. Wittgenstein, Lectures and Conversations (Oxford, Basil Blackwell, 1966), 7.

[xx]Heuer, ch. 4.

All statements of fact, opinion, or analysis expressed in this article are those of the author. Nothing in the article should be construed as asserting or implying US government endorsement of an article’s factual statements and interpretations.

Posted: Jan 07, 2009 01:27 PM