REPORT ON THE 26 JUNE 1985 CONFERENCE OF THE EVALUATION PANEL ON CIA ANALYSIS TRAINING

Document Type: 
Collection: 
Document Number (FOIA) /ESDN (CREST): 
CIA-RDP87-00956R000100050004-9
Release Decision: 
RIPPUB
Original Classification: 
K
Document Page Count: 
17
Document Creation Date: 
December 22, 2016
Document Release Date: 
October 19, 2010
Sequence Number: 
4
Case Number: 
Publication Date: 
October 28, 1985
Content Type: 
MEMO
File: 
CIA-RDP87-00956R000100050004-9.pdf (643.99 KB)
Body: 
MEMORANDUM FOR: Director of Training and Education

FROM:

SUBJECT: Report on the 26 June 1985 Conference of the Evaluation Panel on CIA Analysis Training

All members of the Evaluation Panel have approved the attached report on the 26 June 1985 Conference, including Helene Boatner and Robert Dorn, who did not attend the Conference.

10 September 1985

MEMORANDUM FOR: Director of Training and Education

FROM: Evaluation Panel on CIA Analysis Training

SUBJECT: Report on the 26 June Conference

1. The representatives of the Directorate of Intelligence and the non-CIA participants in the 26 June conference agreed to constitute a continuing Curriculum Evaluation Panel on CIA Analysis Training.

2. The Evaluation Panel strongly endorsed the quality and utility of the three courses under review: the New Analyst Course (NAC), the Seminar on Intelligence Analysis (SIA), and the Seminar on Intelligence Successes and Failures (ISF).

3. The principal recommendations by the Panel for increasing the impact of the courses on the Directorate were:

a. That the NAC introduce a unit on Concept Papers, 1/ include more research-oriented materials and tasks in its exercises, and increase the time devoted to computer skills.

b. That SIA experiment with clusters of students from three or four offices or divisions.

c. That ISF accept less-experienced analysts on the recommendation of their division chiefs.

4. The next meeting of the Panel is scheduled for 10 January 1986. Because of the concerns expressed by Panel members, the meeting will concentrate on the training of branch chiefs, especially the Supervision of Analysis Seminar.

The participants at the 26 June conference agreed to constitute a continuing Curriculum Evaluation Panel on CIA Analysis Training. The Deputy Director for Intelligence selected the following members for the Panel:

Richard Kerr, Associate Deputy Director.
Helene Boatner, Director, Office of Management, Planning and Services.*
Director, Office of Global Issues.
Chief, National Issues Group, Office of Soviet Analysis.
John Helgerson, Director, Office of African and Latin American Analysis.

The Director of Training, in conjunction with the Deputy Director for Intelligence, selected the non-CIA members of the Panel:

Dr. Richard Betts, Brookings Institution.
Professor Robert Jervis, Columbia University.
Robert Dorn, Center for Creative Leadership.*

The following CIA officers have also agreed to serve as members of the Panel:

Associate Director for Curriculum, Office of Training.
Senior Training Officer, Directorate of Intelligence.

An officer of the Office of Training will serve as Executive Secretary to the Panel and as Conference Coordinator.

As indicated in the background paper for the conference (at annex to this report), the Director of Training and Education has established the Evaluation Panel to strengthen the curriculum of courses on analysis training offered on behalf of the Directorate of Intelligence.
In opening the conference, he stated that his goal was to solicit the advice of Directorate managers and outside experts for making an already strong program the best of its kind in the country.

*Did not attend the 26 June conference, but have approved this report and agreed to serve on the Panel.

The Director of Training and the Evaluation Panel agree that achievement of this goal will require regular review of the curriculum to see that the courses, individually and collectively, in fact meet the priority needs of the Directorate in the area of analysis training. The effectiveness of the program also depends on the availability of high-quality and well-supported course directors, who command the respect not only of the participating students but also of the Directorate managers who sponsor them.

The first substantive discussion addressed the definition of the special requirements of, and recurring barriers to, effective analysis that should undergird the curriculum. The Evaluation Panel endorsed the list elaborated in the conference paper (pages 3-5): 1) Policy Relevance; 2) Ambiguous Information; 3) Effective Use of Assumptions; 4) Over- and Under-Confidence; 5) Clarifying Levels of Confidence; and 6) Alternate Analysis. The DI Panel members, however, produced a list of the attributes of an effective analyst which they believe also require priority attention in the curriculum:

1. Effective command of "tools," especially writing, briefing, and computer skills.

2. Readiness to perform the full range of roles of an intelligence analyst (e.g., developing data bases as well as publishing current intelligence). The standard presented to analysts should be "Perform or Perish," not "Publish or Perish."

3. Greater understanding of the substantive review process, and command of the skills to facilitate it (e.g., eliciting and giving feedback).

4. Ability to use Concept Papers effectively, to clarify the purposes and audience of assessments, and thus to speed the review process.

5. Effective relationships with counterparts throughout the Agency and Intelligence Community.

The first two items on the list fall almost exclusively in the province of the New Analyst Course. But the list in general will serve as another standard for testing the scope and emphasis of the syllabi of all analysis training courses. For example, the Seminar on Intelligence Analysis can address facilitation of the review process in its coverage of utility analysis. And the Seminar on Intelligence Successes and Failures can emphasize Concept Papers in its coverage of policy relevance.

The Panel members present at the conference strongly endorsed the quality of the courses under review and their utility for the Directorate of Intelligence. The DI managers, while they tabled most of the specific criticisms, were also the most outspoken in expressing appreciation of the program. Dick Betts and Bob Jervis, the non-CIA panelists who attended the conference, were more tentative in both their criticism and their praise -- reflecting their initial limited exposure to both the purposes and performance of the courses.
However, they both brought to bear their broad experience on the general subject of effective intelligence analysis, and thereby helped to define some of the inherent tradeoffs confronting the program (e.g., between emphasis on basic skills and on creative analysis).

NEW ANALYST COURSE (NAC)

The NAC evoked the most attention at the conference -- testimony to its importance to DI managers during a period of unprecedented influx of new analysts. The Panel strongly endorsed the value of the NAC for the students and for the Directorate. The Panel agreed that the course's length should not be extended beyond its present six weeks, and that it should remain a "survival course," concentrating on the skills and values new analysts need to survive their three-year period of probation. Bob Jervis noted, however, that the "model" of a single full-time and relatively short course sets the limits of its value, within which only marginal improvements can be sought. The DI has rejected two alternate models which would provide different values: a series of part-time courses tailored to the needs of individual recruits, and a prolonged program (such as Career Training) which would provide more time for covering these wide-ranging needs.

The following specific recommendations for the NAC were raised, mostly by the DI Panelists.

1. Concept Papers. Too many draft assessments still show a diffuseness of purpose that encumbers the review process. New analysts must tailor their assessments to a specific audience and purpose. Concept Papers constitute essentially a contract between analysts and managers, and the NAC should introduce a unit on the value of and formats for Concept Papers. The member who spoke most forcefully to the need for such a unit kindly volunteered to present it at the next running of the course.

2. Computer Skills. ADP competency will be essential to the future functioning of the Directorate, and an expansion of the present two and one-half days of instruction will probably be needed as this increased dependency develops. In fact, the NAC now includes five days on ADP.

3. The NAC should increase its attention both to research and to non-political analysis. The unit on source familiarization, for example, can address data bases as well as time-sensitive traffic. And the writing drills can include an article for the weekly economic serial. An exercise on conventional military analysis has already been added.

4. In this context of broadening the definition of what is important to the Directorate, and therefore career-enhancing for new analysts, the Panel recommended that the standard of "Publish or Perish" be replaced with "Perform or Perish." Toward this end, new analysts should be instructed that they are expected to develop competency in all aspects of intelligence analysis: e.g., the development of data bases as well as policy-relevant current analysis.

SEMINAR ON INTELLIGENCE ANALYSIS (SIA)

The Panel also strongly endorsed SIA. Both the student participants and the sponsoring office managers see the course as providing substantial and distinctive values in the form of greater capability to apply discipline (structure) and creativity to analytical assignments. The main concern was how to cope with the analysts' perception of the lack of positive organizational incentives for applying the techniques learned in the course.
No one could come up with a credible definition of the source of the resistance -- although it was suggested it might be the residual influence of long-retired "city-room" supervisors. In a post-conference discussion of the problem of resistance to change, Panel member Bob Dorn recommended that division chiefs be tasked to provide the needed incentives, since they can afford to take more risks than branch chiefs.

The Panel also recommended increasing the post-course impact by structuring each class with clusters of four or five students from three or four divisions or offices. This arrangement will be attempted in the near future -- in order to provide some commonality in substantive specialties (not available when the students represent all the DI offices) and also some diversity in organizational dynamics (not available when the class represents a single office).

The Panel agreed that SIA should continue the recent trend of accepting analysts with three to five years of experience (rather than the old norm of seven or more years). One of the DI Panel members indicated that SIA should nonetheless still be seen as a course for a limited number of analysts. The non-CIA members, in contrast, thought that all or most analysts should be exposed to the values of the course.

SEMINAR ON INTELLIGENCE SUCCESSES AND FAILURES (ISF)

The Evaluation Panel also strongly endorsed ISF as presently constituted, in terms of the beneficial impact on the students and on the Directorate. One of the DI members suggested that junior analysts, on the recommendation of their division chiefs, be accepted for the Seminar.

The next meeting of the Evaluation Panel is tentatively scheduled for 10 January 1986. During the 26 June conference, the development and training of branch chiefs was repeatedly raised as a priority concern in the Directorate. Because of a relative dearth of senior analysts, new branch chiefs are being selected with less experience than in the past. Moreover, there are fewer experienced analysts in their units to share the responsibility for supervision and on-the-job training. Consequently, the January meeting of the Evaluation Panel will address the Supervision of Analysis Seminar -- a two-week course for new branch chiefs -- as well as other OTE supervisory courses. At the same time, the Panel could discuss what other services the Office of Training can provide to address the general problem of relatively inexperienced and overtaxed supervisors. Perhaps arrangements can be made for course directors and DI annuitants on contract to OTE to visit the branches periodically to assist with on-the-job training (circuit-riding instructors).
INSTRUMENT:
Work Style Preference
Thinking Styles
Myers-Briggs
FIRO-B
ALLPORT/VERNON/LINDZEY
GUILFORD/ZIMMERMAN
COOPERATIVE/COMPETITIVE RELATIONSHIPS (TJOSVOLD/ANDREWS)
Turn Around Simulation
XY Exercise

COURSE:
Advanced Intelligence Seminar
Midcareer Course
Advanced Intelligence Seminar
Introduction to Intelligence Assistance
Experienced Intelligence Assistants Course
Intelligence Analysis Course
New Analyst Course
Seminar on Intelligence Analysis
Supervision of Analysis Course
Executive Leadership Forum
ELF Courses
ELF Courses
ELF Courses

COURSE: Leadership Styles & Behavior; Management Development Course; Program on Creative Management

INSTRUMENT -- DESCRIPTION
Styles of Leadership Survey (Hall & Williams) -- Measures one's managerial style according to the Managerial Grid.
Management of Motives Index (Hall) -- Assesses assumptions and practices which characterize the manager's attempts to motivate others.
Conflict Management Survey (Hall) -- Surveys one's characteristic reactions to and handling of conflicts between oneself and others.
Looking Glass, Inc. Skills Assessment (LGI) Form -- Assesses managerial skills of participants. Being discontinued 1 Jan 86.
Health Risk Appraisal -- Predicts medical areas of future concern.
Situational Leadership Questionnaire -- Measures style of management.
Kirton Adaption-Innovation -- Measures differences in how the participant defines and solves problems because of a preference for an adaptive or innovative approach to new information and change.

COURSE: Supervisory Counseling Course

INSTRUMENT -- DESCRIPTION
FIRO-B -- Explores the typical way the participant interacts with people.
California Psychological Inventory -- Psychometric instrument with 18 scales measuring psychological well-being and behavior.
3-S Questionnaire -- Measures a person's preference for structure in the work environment.
Myers-Briggs Type Indicator -- Indicates how the participant prefers to look at things and go about deciding things.
Managerial Job Satisfaction Questionnaire -- Measures level of participant satisfaction with management and compares responses with data base responses.
Strong-Campbell Interest Inventory -- Helps people make occupational decisions by identifying patterns in likes and dislikes and comparing patterns with those of people in a wide range of occupations.
Leadership Decision Styles Survey -- Indicates leadership decision style preference in 16 situations.
Leadership Style Indicator (LSI) -- Assesses managerial skills of participants.
Conflict Management Instrument -- Assesses the overall conflict management style (competing vs. collaborating, etc.).

SUBJECT: Instruments Used in Administrative Systems Branch Courses

COURSE -- INSTRUMENT -- DESCRIPTION

Career Development -- Strong Campbell Interest Inventory -- Helps the individual understand work interests and shows some kinds of work in which they might be comfortable.

Women in the Work Force -- Personal Profile System (1) -- Identifies work behavioral style. Helps the individual understand self and others in the work environment.

Effective Development Course -- Myers-Briggs Type Indicator -- Shows how individuals look at things and make decisions.
Individuals learn their preference and the preferences of others, which helps them understand how people relate to each other.

Values Analysis Profile (Massey) -- Helps the individual identify value systems, which gives insight on how they approach decisions and choices.

Management Skills for Secretaries and Administrative Assistants (MSSAA) -- Strength Deployment Inventory (2) -- Assesses the strengths used in relating to others under two conditions -- when everything is going well in relationships and when faced with conflict and opposition.

Job Descriptive Index (3) -- Indicates how the individual currently feels about different aspects of their job.

Myers-Briggs Type Indicator (3) -- (See above.)

Strong Campbell Interest Inventory (3) -- (See above.)

Fundamental Interpersonal Relations Orientation-Behavior (FIRO-B) (3) -- Looks at interpersonal behavior.

(1) Used once so far by one contractor.
(2) Used regularly by one contractor.
(3) Used in CCL version of MSSAA.

24 October 1985

MEMORANDUM FOR: Chairman, Curriculum Committee
Chief, Career Trainee Division

SUBJECT: "Instruments" Used in CTD Courses

1. Myers-Briggs is "used" in the Interpersonal Skills and Orientation to the DI segments of the Career Trainee Development Course. The CTs have already taken the test as part of their EOD processing but have not received feedback (although one of the PSD psychologists does go over this test with the people he interviews). The test itself is discussed in the Interpersonal Skills segment, and feedback is given to help the CTs understand themselves better and thereby facilitate their interaction in groups. The Myers-Briggs instrument also is touched on lightly in the DI portion of the CTDC in an effort to provide information on how analysts perceive other analysts in the workplace.

2. NUSITE will be done for the first time in the Administrative Trainee Course on 1 November 1985 by Helen. This course is for CTs going to the Administration.

ADMINISTRATIVE - INTERNAL USE ONLY

OFFICE OF TRAINING AND EDUCATION INSTRUCTION

OTE INSTRUCTION 81-12
5 October 1981

COURSE REPORTS

1. The purpose of this Instruction is to provide guidance for preparing and submitting Office of Training and Education course reports to the Director of Training and Education. 1/ OTR Instructions TRI 7-4, dated 20 June 1977, and TRI 7-3/T8, dated 4 April 1978, are rescinded.

2. Annual course reports are to be prepared on Office of Training and Education courses. The due date for annual course reports is 15 January. New Office of Training and Education courses will have End-of-Course Reports prepared for each of the first three runnings; these are due within 10 working days after completion of the course. Thereafter, course reports will be submitted annually. Courses or workshops conducted on a special or ad hoc basis will also require an End-of-Course Report at the completion of each running. Deviations from this schedule of reporting will be established on a specific case basis by the Director of Training and Education and the school chief involved.
3. In preparing course reports, course directors should provide a reasonably comprehensive accounting of successes and failures in meeting course objectives in a brief report (no more than three pages in length). The course schedule, containing scope notes and guest speakers, will be included as an attachment. To establish uniformity in course report content, course directors should use the following general outline:

Introduction -- This paragraph includes an identification of the course: whether it was a regular or a special running, when and where the course was run, and how many participants attended. Detailed statistics on course costs and student grades, ages, Agency service, home directorate, etc., will be included only if the information adds significantly to an understanding of the course.

Summary -- This paragraph includes highlights of successes and failures in meeting course objectives, significant participant or speaker presentations, and concerns relating to events and trends affecting the Agency.

Evaluation -- This paragraph includes an overview of participants' evaluation of the course and whether course objectives were of value, etc.

Recommendations -- This paragraph includes recommendations of the course director on the course (improving course objectives, parts to change, parts to keep the same, etc.).

Course reports will be disseminated as follows:

--The original copy will be forwarded to the Director and Deputy Director of Training and Education via the Executive Officer and the appropriate school chief. The original copy will be returned to the course director for retention.

--A copy will be forwarded to the Plans Group; Chief, Administration Division; Chief, Training Support Division; and Chief, Central Registrations Branch. This copy will be retained by Registrations for six months and then destroyed.

--Copies, with course schedules, will be forwarded to the Senior Training Officers for the National Foreign Assessment Center, the Directorate of Operations, the Director of Central Intelligence Area, and the Directorate of Science and Technology for all Office of Training and Education courses in which their employees are enrolled. These copies should accompany the original course report to the Director of Training and Education. The Senior Training Officer for the Directorate of Administration will be provided copies of course reports only upon his request.

--Copies of course reports may be provided to other Office of Training and Education school chiefs and instructors for information purposes.

Additional dissemination outside the Office of Training and Education of course reports, or of information contained therein, will be at the discretion of the Director and Deputy Director of Training and Education.

1/ The Language School will be exempt from the general guidance outlined in this Instruction, but it will continue issuing its own course reports.

Director of Training and Education