Commentary

Thinking About Rethinking: Examples of Reform in Other Professions

William Nolte

The Intelligence Reform and Terrorism Prevention Act of 2004 provided a very broad definition of intelligence. In that provision the United States may find an opportunity for something more important and lasting than organizational reform.

One of the major judgments of the 9/11 Commission was that among the failures contributing to the disasters of September 2001 was a “failure of imagination,” one that involved intelligence as well as other elements of America’s national security structure. Subsequent efforts to reform the Intelligence Community have been intended, at least in part, to deal with this failure. Prominent among these efforts has been the Intelligence Reform and Terrorism Prevention Act of 2004 that created the Director of National Intelligence and, not incidentally, provided a very broad definition of intelligence. It is in that latter, relatively unnoticed, provision that the United States may find an opportunity for something more important, more effective, and more lasting than structural or organizational reform. That provision may provide a significant opportunity to rethink intelligence: what it is, what we want its instrumental role in American society to be, and how we as citizens want it to operate within the broader framework of American laws and values.[i]

Too often in Washington, reform means “let’s fix the wiring diagram,” hoping that enhanced function and performance will follow form. It is at least possible that the opposite is true: something resembling the Bauhaus precept of form following function (and in this case purpose) may lead to a better outcome. Any such effort must begin with a fundamental rethinking of intelligence.

Such a process need not entail the wholesale abandonment of everything we have heretofore known or thought about intelligence. Some functions and even some organizations will surely survive a fundamental rethinking, but the survivors should benefit from the outcome of that process, not from presumptions that bar serious review and renewal.

The late historian Carroll Quigley, long the scourge of first-year students at the Georgetown School of Foreign Service, argued that societies establish armies, economies, justice systems, and a host of other bodies as instruments to achieve societal goals.[ii] In this view, the initial focus of an organization is outside the organization, on the societal objective for which it was established. Of necessity, some amount of time, effort, and resources must be directed within the organization, at its staffing, structure, and resources.

Over time, Quigley argued, the effort expended on this internal, institutional focus grows, ultimately competing with the effort expended on the organization’s instrumental purpose. The instrument thus tends to become a vested interest, allowing institutional survival to compete with societal needs as the organization establishes its priorities and deploys its assets. (Nietzsche described a similar phenomenon when he noted that the greatest error in human effort came when we forgot what it is we originally intended to do.)

This is, of course, an old story, and history is littered with organizations that once dominated their environment but which eventually succumbed, either to competitors within that environment or to an environment so radically transformed that the organization could not operate within it effectively. Some of us are still old enough to remember when the building rising above Grand Central Terminal in New York City bore the name Pan Am rather than Met Life, or when US Steel was a symbol of American industrial might.

This is not simply a phenomenon of the private sector. The Federal Bureau of Investigation, now struggling to redefine itself and in many respects contending with its own traditions and legacies, was in the early part of the 20th century a showplace for innovation in many areas of law enforcement, especially in its application of science and technology.

 

The United States Army

Bureaucracies and corporations age, but they can also renew themselves. That reality should serve as encouragement to the men and women now attempting to renew US intelligence. The United States has rarely witnessed, for example, a starker case of institutional exhaustion than that experienced by the United States Army by 1975. A decade and a half later, however, the army demonstrated what a focused, courageous, and honest process of self-examination and self-renewal could produce. One aspect of the army’s renewal was a willingness to think hard about itself, to dedicate resources to the effort, and to create safe havens where rethinking could occur without interference from those who would have argued that fundamental rethinking was unnecessary or disruptive.[iii] The army’s renewal effort produced, beyond improved institutional performance, a literature of that renewal. It is on such literature, across a range of institutions, that the rest of this article will focus.

The army after 1975 and the military services in general have a professional advantage over their civilian colleagues in the intelligence profession. Scholars of professionalism have long noted that the hallmarks of a profession include such characteristics as a defined (and presumably lengthy) process of professional education, including continuing education after admission to the profession; a strong fiduciary sense and a code of conduct or ethics; and, as a result of the other characteristics, a strong sense of identity.[iv]

To continue with the army as an example, the stereotype of the US army between the world wars is of an impoverished institution in which officers languished in grade for a decade or more, equipment aged and became obsolete, and soldiers drilled in one sleepy, irrelevant garrison or another. Edward Coffman, in his wonderful The Regulars, paints a different picture, of an institution materially and financially strapped, to be sure, but intellectually rich and focused on what it could be and how it could function when called upon to defend the nation. Indeed, one could spend a great many years as a captain or major in the army of the 1920s and 1930s, but one could also spend a great deal of time in school, at the Army War College, at one of the branch schools, or at the Command and General Staff College.

Mammals, when confronted with a freezing environment, concentrate oxygen in the brain, even at the expense of the limbs. Stupid or short-sighted bureaucracies react to freezes of a different sort by withdrawing budgetary oxygen from things like training and strategic studies to preserve day-to-day operations. The army leadership of the interwar period resisted this tendency, and the result was a marvelous cadre of mid-grade officers ready for rapid promotion after Pearl Harbor. The intelligence agencies should take note of this example.

 

In the Private Sector

As noted above, the phenomenon of institutionalization takes place in the private sector as well as the public sector, and not just in steel or other manufacturing industries. In part because of the pace of environmental change surrounding it, entertainment is a private sector industry constantly reinventing and rethinking itself. One of the problems in the shift from instruments to institutions is that environmental change can invalidate expertise. A generation (or more than one, depending on the pace of change) that comes to lead because it captures the flow of the environment finds itself, over time, trying to retain its positions of leadership by defending its expertise against a newer generation that argues that what was once new and innovative has become retrograde.

The leadership that assumed its position based on its mastery of the earlier environmental novelty finds itself clinging to power by hoping for an environmental reprieve or simply by trying to discredit the insurgents. The American movie industry from the 1920s to the 1940s was a global phenomenon of wealth, corporate power, and glamour, a powerful combination. MGM used to boast it had “more stars than the heavens.”

Barely a decade later, the studio giants were gasping for life. By the 1960s, many of their fabled back lots were subdivisions, and by the 1990s, Discovery Communications could describe itself as a movie studio without the back lots and other front end investments of an earlier generation.[v]

Today, the question for Discovery is whether it is now the old-line corporation defending its turf against insurgents. (MySpace.com is experiencing this phenomenon within an even more abbreviated cycle.) Moore’s Law may not yet apply to all corporate settings, but the half-life of success does seem to be shrinking.

 

Baseball

Perhaps no industry in American life has been, over time, surer of its purpose and its rules than baseball. Except for free agency and the opening of the game to minorities, few American traditions have survived for so long with, or so it seemed, so little change. The 90-foot base paths and the 60-foot, 6-inch pitching distance, probably determined more by happenstance than plan, seem eternal. A sharply hit ball to the shortstop by a fast runner produces an out by one step. The same ball hit by most catchers produces an out by two steps. True in 1940, almost certainly true in 2040.

But the free agency of players did create a fundamental change in the way teams acquired and retained players, and the assumption was that over time rich teams (those that could purchase players developed by poorer teams) would accumulate a stranglehold on talent. Michael Lewis’s Moneyball, subtitled “The Art of Winning an Unfair Game,” describes how several teams, starting with the Oakland Athletics, upended this assumption. In perhaps the most conservative of sports, the Oakland leadership, confronted with a market that could never allow it to compete with rich teams in New York, Chicago, or other major cities, took advantage of information technology and a willingness to rethink everything “everyone” knew about baseball.

The data they used was available to all their competitors, but their competitors neither used nor saw the data the way Oakland’s planners did. For 100 years, for example, baseball insiders knew that advancing a runner from first base to second by stealing a base was an advantage in scoring more runs. In the unfortunate event the runner was less than swift, sacrificing the runner (i.e., intentionally making an out to advance the runner) was the wise move. Why? In part because John McGraw did it that way in 1903, and, therefore, everyone knew that was “the way we’ve always done it.”

Oakland General Manager Billy Beane, with the advantage of technology that permitted his staff to research every game, every at bat, every attempted stolen base in history, ran the data and discovered a simple reality: the way they’d always done it was wrong. Advancing a runner from first to second by giving up an out reduced a team’s scoring chances, and the risk of being caught stealing (and thus expending an out) outweighed the gain of successfully stealing the base. In short, the most important asset a baseball team has is that it gets to keep trying to score until it commits, in most instances, 27 outs.
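The arithmetic behind that conclusion can be illustrated with a minimal sketch. The run-expectancy figures below are rough, illustrative approximations, not numbers drawn from Lewis’s book or from Oakland’s actual analysis; they simply show why spending an out to advance a runner, or attempting a steal with a modest success rate, tends to reduce expected scoring.

```python
# Illustrative run-expectancy sketch (values are approximations, not Oakland's data).
# Each entry: expected runs scored in the remainder of an inning for a base-out state.
EXPECTED_RUNS = {
    ("runner on 1st", 0): 0.85,
    ("runner on 2nd", 0): 1.10,
    ("runner on 2nd", 1): 0.66,
    ("bases empty",   1): 0.25,
}

def sacrifice_value():
    """Change in expected runs from bunting a runner from 1st to 2nd at the cost of an out."""
    before = EXPECTED_RUNS[("runner on 1st", 0)]
    after = EXPECTED_RUNS[("runner on 2nd", 1)]
    return after - before  # negative under these values: the sacrifice costs runs on average

def steal_breakeven():
    """Success rate at which an attempted steal of 2nd neither gains nor loses expected runs."""
    stay = EXPECTED_RUNS[("runner on 1st", 0)]
    success = EXPECTED_RUNS[("runner on 2nd", 0)]
    failure = EXPECTED_RUNS[("bases empty", 1)]
    # Solve p * success + (1 - p) * failure = stay for p.
    return (stay - failure) / (success - failure)

if __name__ == "__main__":
    print(f"Sacrifice bunt changes expected runs by {sacrifice_value():+.2f}")
    print(f"A steal of 2nd must succeed {steal_breakeven():.0%} of the time to break even")
```

Under these assumed values, the sacrifice costs roughly a fifth of a run, and a stolen base attempt must succeed about seven times in ten just to break even, which is the general shape of the arithmetic Lewis describes.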

 

Discovering the New

Sometimes, the lesson suggests, rethinking means discovering the new: new technology, new tools, new information. In many cases, however, and one suspects this is especially true in data-rich and information-rich environments, the data or knowledge is already available. But it needs to be used, reused, or rethought. In the intelligence case, for example, we have “known” for half a century that most—85 percent? 90 percent?—of the information available to decision makers comes from open sources.

Think of that: perhaps 90 percent of the information available to solve a problem comes from a source that occupies what percentage of the Intelligence Community’s time and attention? Certainly not 90 percent. Nor 80 percent. Nor, one suspects, 10 percent. Now the DNI has declared that open source will be the “source of first resort,” an encouraging (and correct) decision. All that’s left is to convince several large, complex, heavily capitalized secrets industries to abandon or at least alter “the way they’ve always done it.”

The better integration of open source information and expertise (expertise representing perhaps the greater part of both the problem and the opportunity), information sharing, and a fundamental review of security practices represent an iron triangle of intelligence reform and reconceptualization. Success in any demands success in all three. Failure in any reduces or perhaps eliminates any chance of success in the other two. It is difficult to imagine that even the talented, dedicated men and women of the US intelligence services can succeed in such a difficult task without embedding into their professional practice and culture the concept of ongoing, fundamental, scrupulously rigorous rethinking of who they are and what they do.

Let me draw several concluding thoughts. First, rethinking only happens when every option is on the table. When Douglas MacGregor suggested that the division was perhaps not the organizational principle for the 21st century army, he stepped hard on sacred ground. In this respect, he followed an important tradition of, among others, Billy Mitchell. History tells us the Mitchells of the world are often wrong but—and here’s the important point—not completely so.[vi]  Air power never replaced armies and navies, but the discussion engendered by Mitchell was an important one.

Most of the effort at intelligence reform since 2001 and the WMD controversy has been about better integrating the pre-existing intelligence agencies. Perhaps it’s worth suggesting that the intelligence agencies—and even the concept of an Intelligence Community—as we’ve known them deserve fundamental review. In a world, for example, where pandemic disease may be as great a national security issue as terrorism, aren’t the Centers for Disease Control important “intelligence” instruments, as the term is understood in the Intelligence Reform and Terrorism Prevention Act?

If so, we could, as one option, add the flag of the Centers for Disease Control and Prevention (CDC) to the other agency flags and seals that mark membership in the community, and get their people top secret clearances. And build enhanced security systems around their buildings and their computers. And make it difficult for their experts to interact with experts from other centers of expertise. But why would we want to do that? A better approach would be to realize that in the 21st century, intelligence will be primarily about information, and less about secrets. At that point, we could work on better integrating CDC (and state and local officials, and the private sector) into a trusted security network that is truly national and not just an instrument of one part of the federal government.

Perhaps “rethinking” intelligence means asking whether the better integration of the Intelligence Community is or should be an interim step. Perhaps the longer term question is whether the metaphor of an Intelligence Community needs to be rethought, in favor of something broader and more in keeping with today’s realities, such as a national security information network.

The second conclusion must be an express preference for instrumental thinking over institutional thinking. This is absolutely critical (and horribly difficult) in a time of environmental volatility. In the late 1930s, the chief of cavalry in the US army, MG John Herr, wrote the chief of staff recommending a significant increase in the number of horse cavalry regiments. He noted that the expansion of the battlefield had created a problem: it was impossible to increase the stamina of the horse in proportion to the growth of the battlefield. Herr’s recommendation was a system (called porteeing) in which the horses would be brought near the battlefield in trailers, where they would match up with troopers conveyed to the scene in trucks. At that point, the troops would mount and charge. It is difficult to imagine that Kasserine Pass could have proved worse, but porteeing might have made that possible.

The point is that MG Herr was carrying out his orders, which were to make the cavalry relevant and effective in a future war. His plan was the best he could do within those terms of reference, narrowly conceived. The danger is that institutions will almost always see the future narrowly conceived, that is, assuming the future of the institution. One of Herr’s protégés, LTC George Patton, saw the problem differently, that is to say, in terms of how to make the army, not the cavalry, effective and relevant. He soon transferred to the new armor branch, to his own benefit and that of the nation.[vii]

Intelligence requires similar courage and clarity. The question cannot be how to fix CIA or NSA or any of the others. The question is what constitutes intelligence in the 21st century and what instruments are needed to conduct intelligence. Addressing the intelligence portion of the national “failure of imagination” identified by the 9/11 commission requires an instrumental answer, not an institutional one.

Third, keep in mind that metaphors can be useful and important; they are rarely real. That is to say, most metaphors represent only a fragmentary view of a larger reality. The Intelligence Community is one example of a metaphor gone rigid. So is the intelligence production cycle, a monument to 19th and 20th century industrial concepts, focused on a sequential production line from needs to output and back again.

Does anyone think information works this way in the 21st century? Why shouldn’t collectors deal directly with end users? Do I really submit my information needs to Google, then let someone process them, manipulate them, and assign them to someone for delivery? The dominant metaphor for the early 21st century information environment is either neural or cellular, and any structure attempting to react to that environment through sequential, industrial processes is doomed. Even more dangerous, such a structure is protected from the fate of Pan American, TWA, Montgomery Ward, and other failed former industry leaders only by the guarantee of an annual congressional appropriation. It will survive institutionally, but it will not achieve success as an instrument of public policy.

Fourth, intelligence must be open—more open, perhaps—to lessons from other situations, other professions, and other industries. Roughly speaking, American intelligence is in its third generation (the first two being the Second World War and the Cold War, with the period before 1941 serving as something of a pre-history). This relatively limited past is further limited by insufficient attention to that past. The result is that US intelligence has tended to operate in a “constant present tense,” with inadequate investment in strategic looks to the future or in lessons from the past. Within this narrow framework, a preoccupation with “the way we’ve always done it” has been inevitable, even if some practice has in reality been in place for only 10 or 20 years, a virtual historical nanosecond.

The idea of a central intelligence agency was not discovered on a stone tablet. It was worked out within a bureaucratic and political context, and then it evolved further over time. NSA and NGA have their origins in differing (but analogous) forms of communication, information, and information formatting. But changes in the information environment should at least permit inquiry into whether the differences require separate institutions.

This is not to suggest an outcome. It is to suggest that US intelligence has much to learn from, and much to be encouraged by, a deeper understanding of its development over time. The challenges are formidable, but they are not necessarily more daunting than those previous generations faced. More established professions—including law, medicine, and the military—have confronted more generations and more evolutions than intelligence, and there are important lessons to be learned from their experiences.

In the current climate, the financial services industry and the information technology industry seem to share many of the concerns of the intelligence services, among them information sharing, including how to provide information to some while simultaneously denying it to others. That is, after all, the crux of the security dilemma.

To some degree, this means shedding a bit of the exceptionalism that has developed around intelligence over the last half century. “But we’re unique,” is something anyone who has worked in congressional affairs for any intelligence agency has heard over the years as they try to answer the question “Why do we have to tell them so much?”

Leaving aside the thought that the law, James Madison, and now decades of practice require it, the reality is that the Department of Agriculture is also unique: the country has only one such department. And NIH, NASA, and many other agencies deal in highly technical data. Add to these considerations the role of federal agencies outside the intelligence community, state and local government, the private sector, and the academic community in providing the information and expertise on which US security in the 21st century will depend, and an earlier sense of exceptionalism needs to be at least tempered.

Intelligence has much to protect from outside scrutiny. But it also has much to learn from professionals in public health, medicine, and other professions. Several years ago, Steven Levitt, in his entertaining and provocative Freakonomics, drew some explicitly impressionistic conclusions on a vast number of issues, including the decline of crime in the United States through the 1990s.[viii]  Franklin Zimring, in The Great American Crime Decline, took great exception to Levitt’s conclusions, amassing an impressive amount of data in the effort.

The point is not to choose between Levitt and Zimring, but to note that two or more decades of research in criminology have given scholars and law enforcement officials enormous amounts of data on which to base training, education, and operational decisions involving the nation’s 18,000 or so law enforcement agencies. It is that data and the investment in study and rethinking that have taken law enforcement from a relatively low-prestige, hands-on profession to one in which research and innovation are highly regarded. It is not coincidental that American law enforcement, through such concepts as community-based policing and now intelligence-based policing, has become noted as a world leader in theory, doctrine, and practice.

Mature professions consider introspection and renewal to be critical to professionalism. The models and literature available to intelligence professionals as they rethink their future are almost unlimited.[ix]  The only limits in fact are the limits of the intelligence imagination, which should, within law and an internal sense of ethics, be virtually unlimited.


 


Footnotes


[i]Readers may note in this title an allusion to “Thinking about Thinking,” the title of Richards J. Heuer’s first chapter in Psychology of Intelligence Analysis (Washington, DC: Center for the Study of Intelligence, 1999). It is meant as a small tribute to what continues to be an essential work in the literature of professional intelligence analysis.

[ii]Carroll Quigley, The Evolution of Civilizations (New York: MacMillan, 1961).

[iii]Among the products of such a haven, the military services have used centers at the service and national war colleges, and sabbaticals for serving officers at outside think tanks. Douglas MacGregor’s Breaking the Phalanx (1997) is but one example of the provocative work produced by this extraordinarily wise practice in intellectual investment. For an even more radical “insider” view of the future of war, see Rupert Smith, The Utility of Force (2007). Any study by a retired senior military professional beginning “War no longer exists” is worth at least a second glance.

[iv]A. M. Carr-Saunders and P. A. Wilson, The Professions (Oxford, UK: Oxford University Press, 1933).

[v]Steve Twomey, “Network’s Roots May Help Town Bloom,” Washington Post, 14 February 2000.

[vi]Jackie Fisher, the father of the all-big-gun battleship, was the visionary who dominated naval warfare for half a century. It is worth noting, however, that his other great vision, the battle cruiser, was a disaster of enormous proportions, as demonstrated both at Jutland and in the short exchange between the Bismarck and HMS Hood decades later.

[vii]See Coffman, The Regulars (Cambridge: Harvard University Press, 2004), 270, 388. I often have to remind students, when they begin smiling at the Herr story, that this was a capable and competent officer doing the best he could in a hopeless conceptual framework. It is, I must admit, hard to avoid a bit of a smile when recounting that his final suggestion for reforming the cavalry was to restore the saber as the regulation side weapon for officers.

[viii]Freakonomics: A Rogue Economist Explores the Hidden Side of Everything (New York: William Morrow, 2005). Franklin R. Zimring, The Great American Crime Decline (Oxford, UK: Oxford University Press, 2006).

[ix]I have said little about the medical profession, which Stephen Marrin and Jonathan Clemente have discussed in “Improving Intelligence Analysis by Looking at the Medical Profession,” International Journal of Intelligence and Counterintelligence 18, 4. Works like Jerome Groopman’s How Doctors Think (New York: Houghton Mifflin, 2007) are worth examination because they encourage physicians to achieve a more effective balance between conceptual and technical tools in their professional practice.


 

