Effectiveness and Implications for the Intelligence Community
J. D. Fletcher
The Intelligence Community has begun to invest substantial resources in the training and education of its analysts. With the exception of a few advanced courses available through distance learning networks, this instruction is delivered using a conventional classroom model. This model possesses a number of inherent inefficiencies, including inconsistent instruction, strict ties to time and place of instruction, large student-to-instructor ratios, and limited active participation by students due to class size and scheduling.
Research suggests that significant improvements can be achieved through the use of computer-based instructional technology. According to these studies, this technology can increase instructional effectiveness and reduce time needed to learn. It can achieve these efficiencies, moreover, while both lowering the cost of instruction and increasing its availability. This chapter summarizes evidence on the promise of instructional technology for intelligence analysis training.
The argument for the use of instructional technology usually begins with a comparative examination of the effectiveness of classroom instruction and individual tutoring. For instance, the graph below illustrates the combined findings of three dissertation studies that compared one-on-one tutoring with one-on-many classroom instruction.
It is not surprising that such comparisons would show that tutored students learned more than those taught in classrooms. What is surprising is the magnitude of the difference. Overall, as the figure shows, it was two standard deviations. This finding means, for example, that with instructional time held fairly constant, one-on-one tutoring raised the performance of 50th percentile students to that of 98th percentile students. These and similar empirical findings suggest that differences between one-on-one tutoring and typical classroom instruction are not only likely but also very large.
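The percentile arithmetic behind the two-sigma claim can be checked under a normal model of achievement. The sketch below uses only Python's standard library; `percentile_after_gain` is an illustrative name, not a function from the studies cited.

```python
from statistics import NormalDist

def percentile_after_gain(start_percentile: float, effect_size: float) -> float:
    """Shift a student at a given percentile upward by `effect_size`
    standard deviations and return the new percentile, assuming
    normally distributed achievement."""
    z = NormalDist().inv_cdf(start_percentile / 100.0)
    return NormalDist().cdf(z + effect_size) * 100.0

# A 50th-percentile student plus a two-standard-deviation tutoring effect:
print(round(percentile_after_gain(50, 2.0)))  # -> 98
```

The same function reproduces any of the effect-size-to-percentile conversions used throughout this chapter.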
Why then do we not provide these benefits to all students? The answer is straightforward and obvious. With the exception of a few critical skills, such as aircraft piloting and surgery, we cannot afford it. One-on-one tutoring has been described as an educational imperative and an economic impossibility.
The success of one-on-one tutoring may be explained by two factors. First, measured in terms of questions asked and answered, tutors and their students engage in many more instructional interactions per unit of time than is possible in a classroom. Second, one-on-one tutoring can overcome the substantial spread of ability, measured by the time needed to reach minimal proficiency, that is found in practically every classroom. Tutoring reduces time-to-learn by adapting each interaction to the needs of each student. Less time is spent on material the student has already learned, and more time is spent on material remaining to be mastered.
To investigate the intensity of instructional interactions, Art Graesser and Natalie Person compared questioning and answering in classrooms with those in tutorial settings. They found that classroom groups of students ask about three questions an hour and that any single student in a classroom asks about 0.11 questions per hour. In contrast, they found that students in individual tutorial sessions asked 20–30 questions an hour and were required to answer 117–146 questions per hour. Reviews of the intensity of interaction that occurs in technology-based instruction have found even more active student response levels.
Differences in the time needed by individuals in any classroom to meet instructional objectives are also substantial. Studies on this issue have reported ratios varying from 1:3 to 1:7 in the times the fastest learners need to learn compared to the times needed by the slowest learners. Although these differences may be due initially to ability, these studies suggest that such ability is quickly overtaken by prior knowledge of the subject matter. This effect is particularly evident in instruction for post-secondary-school students, because prior knowledge rapidly increases with age and experience. Technology-based instruction has long been recognized for its ability to adjust the pace of instruction to individual needs, advancing through instructional material as quickly or as slowly as required. The overall result has been substantial savings in the time required to meet given instructional objectives.
It should be emphasized that these benefits are not achieved at the expense of instructional quality. Research has found that many instructional technologies have a positive impact on learning across a wide variety of student populations, settings, and instructional subject matters.
This research suggests that technology-based instruction results in substantial savings of time and money. Studies have shown that the times saved average about 30 percent, as seen in the table below. The reduction in overhead expenses averages 20–30 percent. Research has shown that the cost ratios (calculated as the ratio of experimental intervention costs over the costs of a control group) for interactive multimedia technology (computer-based instruction with enhanced audio, graphics, and/or video; CD-ROM and DVD-based instruction; interactive video, etc.) favor it over conventional instruction along with time savings of about 31 percent. Simulation of such systems as helicopters, tanks, and command-control systems for training combat skills has also proven to be cost-effective. The operational costs for simulation are, on average, 10 percent of the costs of using the actual systems to train.
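The cost-ratio measure used in these studies is straightforward; the following is a minimal illustration with hypothetical figures, not values drawn from the cited research.

```python
def cost_ratio(intervention_cost: float, control_cost: float) -> float:
    """Cost ratio as defined in the text: experimental intervention costs
    over control-group costs. Values below 1.0 favor the intervention."""
    return intervention_cost / control_cost

# Hypothetical course budgets: a ratio of 0.7 means the technology-based
# course cost 70 percent of its conventional counterpart.
print(cost_ratio(70_000, 100_000))  # -> 0.7
```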
Meta-analysis Demonstrates the Effectiveness of Instructional Technology
Researchers often use a meta-analytic approach to review and synthesize quantitative research studies on a variety of issues, including instructional effectiveness. This method involves a three-step process, which begins with the collection of studies relevant to the issue using clearly defined procedures that can be replicated. Next, a quantitative measure, “effect size,” is used to tabulate the outcomes of all the collected studies, including those with results that are not statistically significant. Finally, statistical procedures are used to synthesize the quantitative measures and describe the findings of the analysis. Meta-analysis appears to be especially suited for synthesizing the results of instructional research, and it has been widely used for this purpose since its introduction in 1976.
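The effect-size tabulation and synthesis steps can be sketched in a few lines. This is a simplified illustration with hypothetical study values; `cohens_d` and `mean_effect` are illustrative names (meta-analysts typically use more refined weighting, e.g., by inverse variance).

```python
from math import sqrt

def cohens_d(mean_exp, mean_ctl, sd_exp, sd_ctl, n_exp, n_ctl):
    """Standardized mean difference between an experimental group and a
    control group, divided by the pooled standard deviation."""
    pooled_sd = sqrt(((n_exp - 1) * sd_exp ** 2 + (n_ctl - 1) * sd_ctl ** 2)
                     / (n_exp + n_ctl - 2))
    return (mean_exp - mean_ctl) / pooled_sd

def mean_effect(studies):
    """Synthesize (effect_size, sample_size) pairs into a single
    sample-size-weighted mean effect size."""
    total_n = sum(n for _, n in studies)
    return sum(d * n for d, n in studies) / total_n

# Hypothetical studies: one effect computed from raw statistics,
# two reported directly.
d1 = cohens_d(80, 70, 10, 10, 30, 30)          # -> 1.0
print(round(mean_effect([(d1, 60), (0.5, 100), (0.3, 50)]), 2))
```

Note that every collected study contributes to the synthesis, including those whose individual results were not statistically significant.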
Meta-analysis is still being developed as a technique, and some matters concerning its use, notably the “file-drawer” problem and calculation of effect size, remain unsettled. Chapter Twelve presents a more detailed explanation of these considerations. Briefly, however, meta-analytic reviews of instructional technology effectiveness have found substantial results favoring its use over traditional technologies of classroom instruction.
Overall, effect sizes for post-secondary school instruction average about 0.42, which is roughly equivalent to raising the achievement of 50th percentile students to that of 66th percentile students. Reviews of more elaborate forms of instructional technology, such as those using applied artificial intelligence techniques, have found effect sizes in excess of 1.0, roughly equivalent to raising the achievement of 50th percentile students to that of 84th percentile students. It seems reasonable to conclude that the reduced costs and reduced time to learn obtained in applications of instructional technology are not achieved at the expense of instructional effectiveness.
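These percentile equivalences follow directly from the normal model and can be verified with Python's standard library:

```python
from statistics import NormalDist

# Percentile reached by a 50th-percentile student whose score rises by
# d standard deviations, assuming normally distributed achievement.
for d in (0.42, 1.0):
    print(d, round(NormalDist().cdf(d) * 100))  # 0.42 -> 66, 1.0 -> 84
```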
Encouraging as these favorable results are, our ability to apply instructional technology efficiently may be in its infancy. Findings thus far have been based on instructional applications intended to teach facts (e.g., What is the capital of Brazil? What is the Spanish word for chapel? Who was the first director of the Central Intelligence Agency?) concepts (e.g., What is a mass spectrometer used for? What is the difference between micro- and macro-economics? When must you use a torque wrench?), and procedures (e.g., How do you record a movie from television? How do you prepare a purchase requisition? How do you calibrate a radar repeater?). All intelligence analysts must possess a repertoire of facts, concepts, and procedures to perform their craft, and instructional technology holds great promise for increasing both the efficiency with which they might develop this repertoire and their access to instructional resources for doing so.
However, the capabilities analysts may seek through instruction are likely to include more abstract, or “higher,” cognitive processes. For instance, in addition to learning a procedure, analysts may need the capability to recognize the procedure’s applicability in unfamiliar situations, modify it as needed, and use it to develop new approaches and procedures. Early on, Bloom discussed learning objectives as a hierarchy beginning with knowledge at the most rudimentary level and ascending through comprehension, application, analysis, and synthesis to evaluation. Bloom’s is not the only such hierarchy to emerge from research on instructional design, but it seems to be the best known, and it describes as well as any the various levels of knowledge, skill, and ability to which learners may aspire.
Current Research on Higher Cognitive Abilities
Analysts have begun to discuss development of the higher cognitive abilities needed to deal with unanticipated and novel challenges. Components of such “cognitive readiness” may include:
Situation awareness—the ability to comprehend the relevant aspects of a situation and use this understanding to choose reasonable courses of action. Practice and feedback in complex, simulated environments have been shown to improve situation awareness.
Memory—the ability to recall and/or recognize patterns in a situation that lead to likely solutions. It may be supported by two underlying theoretical mechanisms: encoding specificity, which stresses the importance of responding to relevant external and internal perceptual cues, and transfer-appropriate processing, which stresses the actions performed during encoding and retrieval. Some instructional techniques, such as overlearning, have been shown to enhance long-term retention.
Transfer—the ability to apply what is learned in one context to a different context. It can be perceived either as the ability to select and apply procedural knowledge gained in one context to another (“low road” transfer) or as the ability to apply the principles abstracted from a set of contexts to another (“high road” transfer). Extensive practice, with feedback, will enhance the former. Instruction in developing mental abstractions will enhance the latter.
Metacognition—the executive functions of thought, more specifically, those needed to monitor, assess, and regulate one’s own cognitive processes. Meta-cognitive skills can be enhanced by exercises designed to increase awareness of self-regulatory processes.
Pattern Recognition—the ability to distinguish the familiar from the unfamiliar. It may be accomplished by “template matching,” which involves comparing retained images with incoming sensory impressions, or by “feature comparison,” which involves comparing distinctive features of a structure held in memory with incoming sensory impressions and generalizing from them. Pattern recognition can be taught through a combination of extensive practice, with feedback, and instruction in forming abstractions.
Automaticity—processes that require only limited conscious attention. Automaticity can be taught by providing extensive practice, with feedback.
Problem Solving—the ability to analyze a situation and identify a goal or goals that flow from it, identify tasks and subtasks leading to the goal, develop a plan to achieve them, and apply the resources needed to carry out the plan. Practice, with feedback, and overlearning can enhance problem-solving ability in many tasks. Techniques for problem solving can be successfully taught, as can the knowledge base needed to implement them.
Decisionmaking—a component of problem solving, but the emphasis in decisionmaking is on recognizing learned patterns, reviewing courses of action, assessing their impact, selecting one, and allocating resources to it. Instruction in assessing courses of action has been shown to improve decisionmaking, but some aspects of successful decisionmaking are more likely to be inborn than trained.
Mental Flexibility and Creativity—the ability to generate and modify courses of action rapidly in response to changing circumstances. It includes the ability to devise plans and actions that differ from and improve upon “school solutions.” Capabilities that widen the range of options can be taught, but higher levels of creativity are more likely to be inborn than trained.
The above review suggests, first, that the creative processes needed by analysts can, to some extent, be broken down into components, and second, that these components can, again to some extent, be taught. Instructional technology can now substantially aid analysts in acquiring the facts, concepts, and procedures needed to perform their craft. However, it must become increasingly “intelligent” if it is to compress the years of experience analysts now need to become proficient and help them more rapidly acquire the advanced cognitive capabilities—those higher in Bloom’s hierarchy—that they also need. To do this successfully, instruction must be tailored to the specific background, abilities, goals, and interests of the individual student or user. Instructional technology must provide what has been called “articulate expertise.” Not only must it supply helpful and relevant guidance in these more advanced levels of knowledge, skills, and abilities, it must do so in a way that learners and users with varying levels of knowledge and skill can understand.
At this point, it may be worth reviewing the capabilities provided by “non-intelligent” instructional technology since the 1950s. It has been able to:
accommodate the rate of progress of individual students, allowing as much or as little time as each needs to reach instructional objectives;
tailor both the content and the sequence of instructional content to each student’s needs;
make the instruction easy or difficult, specific or abstract, applied or theoretical as necessary;
adjust to students’ most efficient learning styles (collaborative or individual, verbal or visual, etc.).
Intelligent tutoring systems are a different matter. They require quite specific capabilities, first targeted in the 1960s. Two key requirements are that intelligent tutoring systems must:
allow either the system or the student to ask open-ended questions and initiate instructional, “mixed-initiative” dialogue as needed or desired;
generate instructional material and interactions on demand instead of requiring developers to foresee and store all the materials and interactions needed to meet all possible eventualities.
Mixed-initiative dialogue requires a language for information retrieval, tools to assist decisionmaking, and instruction that is shared by both the system and the student/user. The system must have the capability (referred to as “generative capability”) to devise, on demand, interactions with students that do not rely on predicted and prestored formats. This capability involves more than generating problems tailored to each student’s needs. It must also provide the interactions and presentations that simulate one-on-one tutorial instruction, including coaching, hints, and critiques of completed solutions.
Cost containment is one motivation for wanting to generate responses to all possible student states and actions instead of attempting to anticipate and store them. Another arises from basic research on human learning, memory, perception, and cognition. As Neisser and others documented, during the 1960s and 1970s the emphasis in basic research on human behavior shifted from the strict logical positivism of behavioral psychology, which focused on directly observable actions, to the internal cognitive processes that are needed to explain empirically observed behavioral phenomena and are assumed to mediate and enable human learning.
The hallmark of this approach is the view that seeing, hearing, and remembering are all acts of construction, making more or less use of the limited stimulus information provided by our perceptual capabilities. Constructivist approaches are the subject of much current and relevant discussion in instructional research circles, but they are firmly grounded in the foundations of scientific psychology. For instance, in 1890, William James stated his General Law of Perception: “Whilst part of what we perceive comes through our senses from the object before us, another part (and it may be the larger part) always comes out of our mind.”
In this sense, the generative capability sought by intelligent instructional systems is not merely something nice to have. It is essential if we are to advance beyond the constraints of the prescribed, prebranched, programmed learning and ad hoc principles commonly used to design technology-based instruction. The long-term vision is that training, education, and performance improvement will take the form of human-computer conversations.
There has been progress toward this end. This conversational capability has been realized in systems that can discuss issues with students using a formal language, such as computer programming or propositional calculus. More recent research suggests that significantly improved natural-language dialogue capabilities can be achieved by instructional technology. Such an interactive, generative capability is needed if we are to deal successfully with the extent, variety, and mutability of human cognition. Much can now be accomplished by instructional technology, but much more can be expected.
The research discussed above suggests that instructional technology can:
reduce costs of instruction;
increase the accessibility of instruction;
increase instructional effectiveness for analysts;
reduce the time analysts need to learn facts, concepts, and procedures;
track progress and ensure that all learners achieve instructional targets;
provide opportunities for helping analysts to compress experience and achieve the higher cognitive levels of mastery demanded by their craft.
In addition, the findings suggest a rule of “thirds.” This rule posits that the present state-of-the-art in instructional technologies can reduce the cost of instruction by about a third and either increase achievement by about a third or decrease time to reach instructional objectives by a third. Eventually, instructional technology should provide a conversation between the analyst and the technology that will tailor instruction in real time and on demand to the particular knowledge, skills, abilities, interests, goals, and needs of each individual. This capability, now available in rudimentary forms, can be expected to improve and develop with time. Even in its current state of development, however, instructional technology deserves serious attention within the Intelligence Community.
 Dr. J. D. Fletcher is a research staff member at the Institute for Defense Analyses, where he specializes in issues of manpower, personnel, and training. He holds graduate degrees in computer science and educational psychology from Stanford University.
 Because instructional technology makes few distinctions between formal education and professional training, the term “instruction” will be used for both in this chapter.
 Benjamin S. Bloom, “The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring.” The dissertation studies were performed under Bloom’s direction.
 M. Scriven, “Problems and Prospects for Individualization.”
 Art Graesser and Natalie Person, “Question-Asking During Tutoring.”
 J. D. Fletcher, Technology, the Columbus Effect, and the Third Revolution in Learning.
 Sigmund Tobias, “When Do Instructional Methods Make a Difference?”
 J. D. Fletcher, “Evidence for Learning From Technology-Assisted Instruction.”
 Ken Spencer, “Modes, Media and Methods: The Search for Educational Effectiveness.”
Jesse Orlansky and Joseph String, Cost-Effectiveness of Computer-Based Instruction in Military Training; H. Solomon, Economic Issues in Cost-Effectiveness Analyses of Military Skill Training; James Kulik, “Meta-Analytic Studies of Findings on Computer-Based Instruction”; Rob Johnston, “The Effectiveness of Instructional Technology”; Ruth Phelps et al., “Effectiveness and Costs of Distance Education Using Computer-Mediated Communication”; J. D. Fletcher, Effectiveness and Cost of Interactive Videodisc Instruction in Defense Training and Education.
 J. D. Fletcher, “Computer-Based Instruction: Costs and Effectiveness.”
Jesse Orlansky et al., The Cost and Effectiveness of the Multi-Service Distributed Training Testbed (MDT2) for Training Close Air Support.
 Jesse Orlansky et al., The Value of Simulation for Training.
 Gene Glass, “Primary, Secondary, and Meta-Analysis of Research.”
 Chen-Lin Kulik, James Kulik, and Barbara Shwalb, “Effectiveness of Computer-Based Adult Education: A Meta-Analysis”; Chen-Lin Kulik and James Kulik, “Effectiveness of Computer-Based Education in Colleges”; Rob Johnston and J. D. Fletcher, A Meta-Analysis of the Effectiveness of Computer-Based Training for Military Instruction; J. D. Fletcher, “Evidence for Learning from Technology-Assisted Instruction.”
 Sherrie P. Gott, R. S. Kane, and Alan Lesgold, Tutoring for Transfer of Technical Competence.
 Benjamin S. Bloom, Taxonomy of Educational Objectives.
 J. E. Morrison, and J. D. Fletcher, Cognitive Readiness.
 M. R. Endsley, “Design and Evaluation for Situation Awareness Enhancement.”
 E. Tulving and D. M. Thomson, “Encoding Specificity and Retrieval Processes in Episodic Memory.”
 C. D. Morris, J. D. Bransford, and J. J. Franks, “Level of Processing Versus Transfer-Appropriate Processing.”
 That is, the repeated use of specific problem-solving methods beyond the point of initial mastery.
 R. A. Wisher, M. A. Sabol, and J. A. Ellis, Staying Sharp: Retention of Military Knowledge and Skills.
 G. Salomon and D. N. Perkins, “Rocky Roads to Transfer: Rethinking Mechanisms of a Neglected Phenomenon.”
 J. H. Flavell, “Metacognitive Aspects of Problem Solving.”
 D. J. Hacker, Metacognition: Definitions and Empirical Foundations [On-line Report].
 M. H. Ashcraft, Fundamentals of Cognition.
 R. M. Shiffrin and W. Schneider, “Controlled and Automatic Human Information Processing: II. Perceptual Learning.”
 J. R. Hayes, The Complete Problem Solver.
 P. Slovic, S. Lichtenstein, and B. Fischhoff, “Decision-making.”
 D. Klahr and H. A. Simon, “What Have Psychologists (and Others) Discovered About the Process of Scientific Discovery?”
 E. Galanter, Automatic Teaching; R. C. Atkinson and H. A. Wilson, Computer-Assisted Instruction; P. Suppes and M. Morningstar, Computer-assisted Instruction at Stanford 1966-68; J. D. Fletcher and M. R. Rockway, “Computer-based Training in the Military.”
 J. S. Brown, R. R. Burton, and J. DeKleer, “Pedagogical, Natural Language and Knowledge Engineering in SOPHIE I, II, and III.”
 J. R. Carbonell, “AI in CAI: An Artificial Intelligence Approach to Computer-Assisted Instruction”; J. D. Fletcher and M. R. Rockway.
 U. Neisser, Cognitive Psychology.
 For example, T. M. Duffy, and D. H. Jonassen, Constructivism and the Technology of Instruction; S. Tobias and L. T. Frase, “Educational psychology and training.”
 William James, Principles of Psychology: Volume I.
 For example, BIP and EXCHECK, respectively. For the first, see A. Barr, M. Beard, and R. C. Atkinson, “A rationale and description of a CAI Program to teach the BASIC Programming Language”; for the second, see P. Suppes and M. Morningstar.
 A. C. Graesser, M. A. Gernsbacher, and S. Goldman, Handbook of Discourse Processes.