DOCUMENT RESUME

ED 226 713                                                    IR 010 594

AUTHOR       Ricard, G. L., Ed.; And Others
TITLE        Workshop on Instructional Features and Instructor/Operator Station Design for Training Systems.
INSTITUTION  Naval Training Equipment Center, Orlando, Fla. Human Factors Lab.
REPORT NO    NAVTRAEQUIPCEN-IH-341
PUB DATE     Oct 82
NOTE         182p.
PUB TYPE     Collected Works - Conference Proceedings (021) -- Guides - Non-Classroom Use (055)
EDRS PRICE   MF01/PC08 Plus Postage.
DESCRIPTORS  Classroom Design; Computer Oriented Programs; Facility Guidelines; *Human Factors Engineering; *Man Machine Systems; *Military Training; Postsecondary Education; *Simulation; Systems Development; Teacher Role; *Training Methods; Work Environment
IDENTIFIERS  *Computer Simulation; Interactive Systems
ABSTRACT
These 19 papers review current research and development work related to the operation of the instructor's station of training systems, with emphasis on developing functional station specifications applicable to a variety of simulation-based training situations. Topics include (1) instructional features; (2) instructor/operator station research and development; (3) operational problems in instructor operation station designs; (4) instructional support features in flying training simulators; (5) the instructor as a user in automated training systems; (6) instructor roles and optimization of the instructor station interface; (7) front-end analysis leading to VTX instructor operating stations functional specifications; (8) conceptual design of an instructional support system for fighter and trainer aircraft flight simulators; (9) the OAS/CMC PTT instructor station; (10) training effectiveness evaluations of simulator instructional support systems; (11) computer-aided design systems in workstation design; (12) instructor/operator station design concepts; (13) modular control of simulators; (14) simplified control of the simulator instructional system; (15) the instructional subsystem; (16) procedures monitoring and scoring in a simulator; (17) student self-training capability in flight simulators; (18) generic simulator instructor training; and (19) application of artificial intelligence and voice recognition/synthesis to naval training. (LMM)
**********************************************************************
Reproductions supplied by EDRS are the best that can be made from the original document.
**********************************************************************
U.S. DEPARTMENT OF EDUCATION
NATIONAL INSTITUTE OF EDUCATION
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it. Minor changes have been made to improve reproduction quality.

Points of view or opinions stated in this document do not necessarily represent official NIE position or policy.
WORKSHOP ON INSTRUCTIONAL FEATURES AND
INSTRUCTOR/OPERATOR STATION DESIGN FOR
TRAINING SYSTEMS

Edited by

G. L. Ricard
T. N. Crosby
E. Y. Lambert

Naval Training Equipment Center
Orlando, Florida 32813

October 1982

Technical Report: NAVTRAEQUIPCEN IH-341
PREFACE

On the 10th and 11th of August 1982, the Naval Training Equipment Center was the host organization for a Workshop on Instructional Features and Instructor/Operator Station Design for Training Systems, sponsored by the Chief of Naval Research. The goal of the workshop was to review current research and development work related to the operation of the instructor's station of training systems, particularly with an eye for developing functional specifications for these stations. Historically, much of the research and development work has been sponsored by the Naval Air Systems Command and the Air Force, but we feel that many of the developments are applicable to a variety of simulation-based training situations.
Several people deserve thanks for their help in making the workshop possible. CDR P. H. Curran of the Office of Naval Technology cheerfully encouraged us and provided support, and locally at the Human Factors Laboratory of the Naval Training Equipment Center, Catherine Bottelman and Hilda Worsham both performed the many chores necessary during the workshop and later edited the papers and prepared the report for printing. We thank them all.
The Editors
TABLE OF CONTENTS

                                                                        PAGE

COMMENTS ON INSTRUCTIONAL FEATURES
    Dr. Gilbert L. Ricard                                                  5

INSTRUCTOR/OPERATOR STATION RESEARCH AND DEVELOPMENT AT AFHRL/OT
    Dr. Harold D. Warner                                                   9

OPERATIONAL PROBLEMS IN INSTRUCTOR OPERATION STATION DESIGNS
    Dr. John P. Charles                                                   13

THE REAL WORLD AND INSTRUCTIONAL SUPPORT FEATURES IN FLYING TRAINING SIMULATORS
    Mr. Clarence A. Semple and Mr. Barton K. Cross II                     23

THEY CAN MAKE OR BREAK YOU: CONSIDERING THE INSTRUCTOR AS A USER IN AUTOMATED TRAINING SYSTEMS
    Dr. Robert Halley                                                     33

INSTRUCTOR ROLES: OPTIMIZATION OF THE INSTRUCTOR STATION INTERFACE
    Dr. B. E. Mulligan and Lt Thomas N. Crosby                            39

FRONT-END ANALYSIS LEADING TO VTX INSTRUCTOR OPERATING STATIONS FUNCTIONAL SPECIFICATIONS
    Jeffrey N. Runches                                                    53

CONCEPTUAL DESIGN OF AN INSTRUCTIONAL SUPPORT SYSTEM FOR FIGHTER AND TRAINER AIRCRAFT FLIGHT SIMULATORS
    Mr. Vernon E. Carter                                                  61

A DESIGN WITH THE INSTRUCTOR IN MIND: THE OAS/CMC PTT INSTRUCTOR STATION
    Major Joseph C. Stein and Captain Michael E. Shannon                  79

TRAINING EFFECTIVENESS EVALUATION OF SIMULATOR INSTRUCTIONAL SUPPORT SYSTEMS
    Dr. Steve R. Osborne and Dr. George W. Menzer                         91

USE OF COMPUTER-AIDED DESIGN SYSTEMS IN WORKSTATION DESIGN
    Dr. Gerhard Eichweber                                                 95

INSTRUCTOR/OPERATOR STATION DESIGN CONCEPTS
    Dr. William M. Hinton and Mr. Walter M. Komanski                      99

MODULAR CONTROL OF SIMULATORS
    Mr. Steve Seidensticker and Ms. Catherine Meyn                       107

SIMPLIFIED CONTROL OF THE SIMULATOR INSTRUCTIONAL SYSTEM
    Mr. Robert A. Gamache                                                115

THE INSTRUCTIONAL SUBSYSTEM, A MAJOR COMPONENT OF THE TRAINING DEVICE
    Dr. Thomas J. Hammell                                                127

PROCEDURES MONITORING AND SCORING IN A SIMULATOR
    Mr. Steve Seidensticker and Mr. Terry Kryway                         149

THE CASE FOR A STUDENT SELF-TRAINING CAPABILITY IN FLIGHT SIMULATORS
    Mr. Joseph L. Dickman                                                155

GENERIC SIMULATOR INSTRUCTOR TRAINING
    Dr. George W. Menzer and Dr. Steve R. Osborne                        165

APPLICATION OF ARTIFICIAL INTELLIGENCE AND VOICE RECOGNITION/SYNTHESIS TO NAVAL TRAINING
    Mr. John A. Modrick, Mr. Jan D. Wald and Mr. John Brock              169
COMMENTS ON INSTRUCTIONAL FEATURES

GILBERT L. RICARD
Naval Training Equipment Center
During the past decade we have seen dramatic changes of the role instructors play in device-based training and of the training tasks to which simulation has been applied. The main change we have seen has been the extension of synthetic training to tasks which simulation previously could not support. Much of this has been due to the development of software and hardware to create visual displays, the application of new means of interacting with computer-based trainers, and the proliferation of small computer systems with large amounts of memory. Of course, some of this change has been the result of an increased reliance on simulation for training--both by military training commands and civilian organizations--but most has resulted from the greatly increased capabilities of the devices themselves. All change seems to have its good and bad aspects, and while the extension of simulator-based training to complex tasks is most welcome, our notions of how instructional personnel should interact with these new trainers have not kept pace with hardware development. Three trends have contributed to this problem:
1. Device-based flight training has developed from the simulations of takeoff and landing of a decade ago to almost all aspects of the tactical use of aircraft. Largely, this result has come from the development of simulations of sensors--radar, sonar, forward-looking infrared, low-light-level TV, and computer-generated visual scenes--which allow all members of a crew to interact in the same training scenario. Thus, individual procedural training has expanded to tactical team training where more than one trainee's actions must be monitored by instructional personnel. As the number of trainees participating in a training event increases, so also does the number of instructors needed. The resulting requirement for the design of training equipment has been to reduce the workload and support the communication of these instructors. Partly this is to make their jobs easier (both to do and to learn), and partly this is to enable them to act more effectively as instructors.
2. The instructor/operator station is the interface instructional personnel have to a training device and the services it provides. The past decade has seen a good deal of change in these consoles. The repeater instruments and knobs and dials once common have been replaced by cathode ray tubes, alphanumeric and multifunction keyboards, joysticks, and the like. Today training tasks are selected by entering data in response to choices presented in menu form on programmable displays, and light pens are used to select many of the options. On the horizon, we see flat panels with touch-sensitive surfaces replacing much of the input/output hardware in current use. Such systems present in successive displays the information and choices that previously were spread over console surfaces. These systems pose the design problem of recasting the rules of the spatial grouping of information and choices into successive temporal groupings. In the future we expect that a single surface will serve to present information and record inputs, and system designers shall have to be both selective and efficient in their choice of what to display at a given time and what information to require from instructors. Rules, or at least guidelines, will have to be developed for moving about in menu-driven displays, for recovering from incorrect inputs, and for keeping track of choices that have been made. This problem of engineering the flow of events which present information to instructors and allow them to exert control over a device seems to be at the heart of deficiencies in today's complex trainers. In future systems, probably nowhere will the conflicting requirements of allowing for flexible usage and of providing for standardized instruction need to be more delicately balanced than at the operator's interface to the device.
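The menu-navigation guidelines called for here (moving about the menus, recovering from incorrect inputs, and keeping track of choices made) can be made concrete with a small sketch. The code below is illustrative only, not part of the original report, and every class, label, and action name in it is hypothetical:

```python
# Illustrative sketch, not from the report: a menu hierarchy with a
# history stack ("back" recovers from a wrong selection) and an
# append-only log of the choices made, for later review or debrief.

class MenuPage:
    def __init__(self, title, options):
        self.title = title
        self.options = options  # label -> MenuPage (submenu) or str (action)

class MenuSession:
    def __init__(self, root):
        self.current = root
        self.history = []  # pages left behind, so "back" can retrace them
        self.choices = []  # every input the instructor has made, in order

    def select(self, label):
        """Follow a menu choice; returns an action name for leaf entries."""
        target = self.current.options[label]
        self.choices.append(label)
        if isinstance(target, MenuPage):
            self.history.append(self.current)
            self.current = target
            return None
        return target

    def back(self):
        """Recover from an incorrect input by returning to the prior page."""
        if self.history:
            self.choices.append("(back)")
            self.current = self.history.pop()

# Example: an "Environment" submenu with one leaf action.
weather = MenuPage("Weather", {"Set ceiling": "set_ceiling"})
root = MenuPage("Main", {"Environment": weather})
session = MenuSession(root)
session.select("Environment")
action = session.select("Set ceiling")  # returns "set_ceiling"
```

The design choice worth noting is that the choice log is append-only: backing out of a menu is itself recorded, so the temporal sequence of instructor inputs survives for debrief even after errors are undone.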
3. The past decade has seen great increases of the processing power and storage capability of computer systems--all at ever falling prices. It has become common to see training devices with several minicomputers networked to process an appropriately parsed problem, and we expect that the future will only see more of this as microcomputers are devoted to smaller and smaller elements of the information processing and display tasks. This is true now for many of the calculations involved in a simulation and will be true also for the services a device offers its users. From an instructional point of view, these developments allow possibilities previously considered impracticable--where the computer system of a trainer not only supports the simulation of a training task but also stores and manipulates data useful for instructional control. Today's computers are not only data processors, but machines to store and operate upon data bases as well as engines that can manipulate symbolic relations and make inferences. These new capabilities, emerging as software with supporting hardware, will allow training systems undreamed of a decade ago. The problem for design then is to define how such services should operate and how to fit them into operational training devices.
These three trends have resulted in the requirement to define better the instructional role a training device's computer capability should fulfill and to define functionally the services a training device should support. Research and development efforts have stressed the observation that devices should provide not only a simulation of a task but should also be equipment to support the training of that task. Much of the research work has examined instructional features that have been incorporated into current devices or has suggested new features. Presently instructional features fall into two classes: those designed to support real-time instruction using a device and those designed to provide off-line services for instructional or administrative personnel. Many of the real-time training features are old friends such as the ability to freeze a task, set environmental conditions, move vehicles, and the like. Research efforts have produced others such as models to determine the actions of targets or the output of voice synthesizers, computer measurement of a trainee's progress, and schemes to automate feedback for training. Features designed for off-line support tend to be relatively new because the storage required for them has only recently become available. Here we have seen developments in the keeping of students' records, in providing briefing and debriefing information, in the incorporation of software to tutor instructors on the use of a device, and in programs to define and create new training exercises. While neither of these lists is inclusive, they do provide a flavor of the developments we expect to see in the training devices of the future.
Every so often it is prudent to review the ways we do things and the technology we employ to do them. From problems uncovered in trainers currently operational, it is clear that now is the time to examine the functional design of the instructor/operator stations of training equipment. We need to be able to take advantage of new technology by knowing how to design it well so that it can support training easily. The goal of this workshop then is to provide information for the development of new systems as well as to present an overview of the knowledge currently available.
A large and growing literature exists on many of the topics related to the design of instructor/operator stations. These include topics such as instructional systems development, workplace design, and the utilization of training devices as well as discussions of the design of man-computer dialogues and descriptions of display and control equipment for man-machine interfaces. Only a small portion of this writing is devoted to the design and operation of training equipment, and for the most part it exists as a technical literature which may not be widely distributed. Because of this, a bibliography follows that lists many of the reports discussing instructional features and console operation, particularly for training applications.
BIBLIOGRAPHY
Boff, K. R., and E. A. Martin, Aircrew Information Requirements in Simulator Display Design: The Integrated Cuing Requirement Study. In Proceedings of the 2nd Interservice/Industry Training Equipment Conference and Exhibition, November 1980, pp. 355-359.

Bell, J. D., Closed-Loop Training Systems Through the Application of Instructional Features. In Proceedings of the 2nd Interservice/Industry Training Equipment Conference and Exhibition, November 1980, pp. 180-188.

Charles, J. P., Instructor Pilot's Role in Simulator Training (Phase II). Technical Report NAVTRAEQUIPCEN 76-C-0034-1, August 1977.

Charles, J. P., Instructor Pilot's Role in Simulator Training (Phase III). Technical Report NAVTRAEQUIPCEN 76-C-0034-2, June 1978.

Charles, J. P., Device 2F119 (EA-6B WST) Instructor Console Review. Technical Report NAVTRAEQUIPCEN 81-M-1083-1, In Press.

Charles, J. P., Device 2F112 (F-14A WST) Instructor Console Review. Technical Report NAVTRAEQUIPCEN 81-M-1 1-1, In Press.

Charles, J. P., G. Willard, and G. Healey, Instructor Pilot's Role in Simulator Training. Technical Report NAVTRAEQUIPCEN 75-C-0093-1, March 1975.

Cotton, J. C., and M. E. McCauley, Voice Technology Design Guides for Navy Training Systems. Technical Report NAVTRAEQUIPCEN 80-C-0057-1, In Press.

Dickman, J. L., A Comparative Analysis of Three Approaches to the Man-Machine Interface Problem at the Flight Simulator Instructor/Operator Station. In Proceedings of the 3rd Interservice/Industry Training Equipment Conference and Exhibition, November 1981, pp. 158-174.

Elworth, C., Instructor/Operator Display Evaluation Methods. Technical Report AFHRL-TR-79-41, March 1981.

Gregory, R. L., The Display Problem of Flight Simulation. In 3rd Flight Simulation Symposium (Theory and Practice in Flight Simulation), April 1976.

Hammell, T. J., and T. N. Crosby, Development of SMARTTS Training Technology. In Proceedings of the 2nd Interservice/Industry Training Equipment Conference and Exhibition, November 1980, pp. 269-276.

Hinton, Jr., W. M., and W. M. Komanski, Instructor/Operator Station Design Study. Technical Report NAVTRAEQUIPCEN 80-D-0009-1, April 1982.

Hughes, R. G., Enabling Features Versus Instructional Features in Flying Training Simulation. In Proceedings of the 11th NTEC/Industry Conference, Technical Report NAVTRAEQUIPCEN IH-306, November 1978, pp. 137-144.

Hughes, R. G., Advanced Training Features: Bridging the Gap Between In-Flight and Simulator-Based Models of Flying Training. Technical Report AFHRL-TR-78-96, March 1979.

Hughes, R. G., S. T. Hannan, and W. E. Jones, Application of Flight Simulator Record/Replay Feature. Technical Report AFHRL-TR-79-52, 1979.

Hughes, R. G., G. Lintern, D. C. Wightman, R. B. Brooks, and J. Singleton, Application of Simulator Freeze to Carrier Glideslope Tracking Instruction. Technical Report NAVTRAEQUIPCEN 78-C-0060-9/AFHRL-TR-82-3, July 1982.

Isley, R. N., and E. J. Miller, The Role of Automated Training in Future Army Flight Simulators. Technical Report PM TRADE FR-ED-76-27, October 1976.

Mitchell, D. R., Trainee Monitoring, Performance Measuring, Briefing, and Debriefing. In Proceedings of the 1st Interservice/Industry Training Equipment Conference, Technical Report NAVTRAEQUIPCEN IH-316, November 1979, pp. 117-122.

Murphy, G. L., Advancements in Instructor Station Design. In Naval Training Device Center 25th Anniversary Commemorative Technical Journal, Naval Training Equipment Center, Orlando, Florida, November 1971, pp. 175-182.
Pohlman, L. D., R. N. Isley, and P. W. Caro, A Mechanism for Communicating Simulator Instructional Feature Requirements. In Proceedings of the 1st Interservice/Industry Training Equipment Conference, Technical Report NAVTRAEQUIPCEN, November 1979, pp. 133-144.
Semple, C. A., J. C. Cotton, and D. J. Sullivan, Aircrew Training Devices: Instructional Support Features. Technical Report AFHRL-TR-80-58, January 1981.

Semple, C. A., D. Vreuls, J. C. Cotton, D. R. Durfee, T. Hooks, and E. A. Butler, Functional Design of an Automated Support System for Operational Flight Trainers. Technical Report NAVTRAEQUIPCEN 76-C-0096-1, January 1979.

Smode, A. F., Human Factors Inputs to the Training Device Design Process. Technical Report NAVTRADEVCEN 69-C-0298-1, September 1971.

Smode, A. F., Training Device Design: Human Factors Requirements in the Technical Approach. Technical Report NAVTRAEQUIPCEN 71-C-0013-1, August 1973.

Smode, A. F., Recent Developments in Instructor Station Design and Utilization for Flight Simulators. Human Factors, 1974, 16, 1-18.
INSTRUCTOR/OPERATOR STATION RESEARCH AND DEVELOPMENT AT AFHRL/OT

Dr. Harold D. Warner
University of Dayton Research Institute
INTRODUCTION
The Operations Training Division of the Air Force Human Resources Laboratory (AFHRL/OT) has as its mission the improvement of Air Force combat effectiveness through training-related science and technology. Located at Williams Air Force Base, Arizona, AFHRL/OT performs this mission through two primary functions: behavioral R&D to solve flying training problems through improved technology, and engineering R&D to develop training devices as vehicles for training R&D. The personnel who support these two areas form a diverse, multidisciplinary team of specialists ranging from research psychologists, research instructor pilots, and human factors specialists to aerospace engineers, mathematicians, and computer programmers. AFHRL/OT operates and maintains two of the nation's most advanced simulation facilities for training and R&D, the Advanced Simulator for Pilot Training (ASPT) at Williams Air Force Base, and the Simulator for Air-to-Air Combat (SAAC) at Luke Air Force Base. At present, the ASPT is configured with an F-16 and an A-10 cockpit, and the SAAC provides two F-4 cockpits for simulated air-to-air combat.
AFHRL/OT R&D efforts have been accomplished in a variety of areas including simulator training effectiveness, training methods, pilot performance measurement, and pilot perception and cognition. In the area of simulator training effectiveness, efforts have been directed toward the development and assessment of visual display systems and display generation techniques, including helmet-mounted visual displays and computer-generated imagery; the evaluation of alternative motion cueing systems such as the G-seat, G-suit, and cockpit motion platform; and the determination of visual display requirements including field-of-view, color, resolution, aerial perspective, and terrain cues. As for the area of training methods, efforts have addressed, for example, whole vs. part-task training using full-mission flight simulators and desk-top, special-function trainers. The area of pilot performance measurement has also been strongly supported and includes experiments on measures of basic flying and combat skills, pilot workload, and stress. In the area of pilot perception and cognition, a variety of pilot factors have been examined such as decision making, tactical planning, and visual cue utilization.
Because the efficacy of aircrew training is dependent not only on the training capabilities of flight simulators but also on the quality of flight instruction, AFHRL/OT has applied considerable resources to the design and evaluation of instructor/operator station (IOS) controls, displays, and workstation layouts as well as the evaluation of various IOS configurations. The specific efforts that were accomplished by AFHRL/OT, and the activities in progress, are as follows.
AFHRL/OT IOS R&D ACCOMPLISHMENTS
1. A-10 Flight Simulator IOS. An IOS was designed for the Air Force A-10 Operational Flight Trainer. This effort involved a determination of the IOS controls and displays required for instructing simulated A-10 flight and the design of an A-10 IOS consistent with these requirements. In this R&D effort, a group of A-10 instructor pilots (IPs) was observed during the conduct of flight simulator training in the Advanced Simulator for Pilot Training (ASPT), and each IP was administered a questionnaire when the training was concluded. The questionnaire was designed to assess what controls and displays were used to instruct the student pilots and what should be added to the IOS to facilitate flight instruction. Both the observational and questionnaire data were analyzed to determine the control and display requirements for an A-10 simulator IOS. An IOS was subsequently designed which incorporated these controls and displays. In general, the IOS consists of seven Cathode Ray Tubes (CRTs), A-10 repeater instruments, a variety of controls and indicators, and a CRT hardcopy unit. Two of the CRTs are video monitors which would be used to display the student pilot's out-the-window visual scene, and one CRT is a closed-circuit television monitor that would enable the IP to monitor the in-cockpit activities of the student pilot. The remaining CRTs would provide informational feedback about the student's flight performance. A variety of displays were developed for these CRTs and the function and operation of each display were determined. The IOS also included A-10 repeater instruments which were arranged the same as they are in the actual aircraft. Additionally, a number of controls were incorporated in the IOS for the executive and administrative control of the training process. The specific configuration of these controls was provided and the function, operation, and layout of the controls were specified. This IOS design effort was fully described by Gray, Chun, Warner, and Eubanks (1981).
2. A-10 Training Features Evaluation. The utility of various instructional features of the ASPT IOS was evaluated in the context of A-10 flight training. Six of the features available on the ASPT IOS were evaluated. They were: problem and parameter freeze, rapid initialization, automatic demonstration, automated performance feedback, self-confrontation, and task difficulty. In the study, IPs were required to rate the utility of each instructional feature, and a systematic observational procedure was used to determine the frequency with which each feature was used by the IPs. The data analysis indicated that performance feedback and initialization were rated highest and were used most frequently. Problem and parameter freeze was also used regularly. Many of the features, such as automated performance feedback, task difficulty, preprogrammed demonstrations, and self-confrontation, were rarely used. Some of the features, such as freeze and initialization, were not used in an instructional manner, but more for training management purposes. That is, they tended to be used to terminate a training exercise and to provide a transition to the next exercise, rather than for providing opportunities to give the student pilots performance feedback and the appropriate remediation. This study is documented in the report by Gray, Chun, Warner, and Eubanks (1981).
3. F-15 Simulator IOS Operational Test and Evaluation. An operational test and evaluation of an F-15 flight trainer's IOS was conducted. Measurements of the IOS controls and displays, workstation dimensions, and workplace environment were compared with the published military standards for equipment design. These measurements included CRT display character size, pushbutton control size, lighting levels, ambient temperature, noise levels, writing surface depth and height, and reach distances. In addition, a number of F-15 IOS users were interviewed to determine operational deficiencies, and recommendations were provided for correcting these deficiencies. This evaluation was accomplished in 1977 for the Air Force Tactical Air Warfare Command (TAWC).
4. F-5E Instrument Flight Simulator Operational Test and Evaluation. AFHRL/OT supported an operational test and evaluation of the F-5E Instrument Flight Simulator. This effort involved the development of the test procedures and data collection materials as well as assistance in the collection and analysis of the test data. The data collection materials consisted of rating forms which were administered to F-5E IPs to determine the fidelity and training capability of the simulator cockpit and the operability and instructional capability of the IOS. The data collection effort involved: (1) briefing the pilots and IPs on the missions to be performed; (2) distributing the rating forms; (3) monitoring the activities of the IPs at the IOS and the pilots in the cockpit; and (4) recording their comments concerning the simulator. The evaluation was managed by the Air Force Test and Evaluation Center (AFTEC).
5. A-10 Operational Flight Trainer Operational Test and Evaluation. The operational test and evaluation of the A-10 Operational Flight Trainer was also supported by AFHRL/OT. The support involved assistance in the development of data collection materials and collection of test data. The data collection materials consisted of rating forms that were administered to the test IPs to assess the fidelity and training capabilities of the A-10 simulator cockpit and the instructional capability of the A-10 IOS. This evaluation was also managed by AFTEC.
6. SAAC IOS Control Panel Design. The control/indicator panel for the Aerial Combat Engagement Display (ACED) of the Simulator for Air-to-Air Combat (SAAC) was redesigned. The ACED is essentially a CRT which is capable of displaying several pages. These pages present a view of the gaming area and the various aircraft in the area, a pilot's view from inside the cockpit of the visual scene, and performance scoring data. Controls and indicators are used to display different pages and to change range scales. In the modified panel design, the ACED controls/indicators that were originally contained in two separate panels were consolidated into a single panel. The controls/indicators were arranged in functional groups. The panel design has been implemented on the SAAC.
AFHRL/OT IOS R&D IN PROGRESS
1. IOS Design Guide. A guide for flight simulator IOS design is currently under development by AFHRL/OT. The guide will be made up of three sections, or volumes. The first volume will include detailed descriptions of advanced flight simulator IOS configurations. The second will contain a compilation of the appropriate human engineering design criteria. The third will provide an IOS design for fighter/attack flight simulators based on the data in the first two volumes. At present, several simulation training facilities have been visited and descriptions of the IOSs have been prepared. The facilities that were visited include: (1) F-15 Instrument Flight Simulator; (2) A-10 Operational Flight Trainer; (3) F-14 Weapons System Trainer; (4) Tactical Aircrew Combat Training System (TACTS); and (5) EA-6B Weapons System Trainer. During these visits the operation of several other simulation systems was observed, namely, the F-14 Operational Flight Trainer, F-14 Mission Trainer, TA-7J Operational Flight Trainer, A-6E Weapons System Trainer, and A-6E Night Carrier Landing Trainer. Tactics information will not be included in the report because of its classified nature. Descriptions of the IOS for the F-16, F-18, and F-4 trainers will also be incorporated in the report. The second volume will contain relevant human engineering design criteria, standards, and specifications obtained from a compilation of existing data and from studies conducted at AFHRL/OT. Examples of the areas that will be addressed in the IOS design guide include control/display design and layout, workstation configuration, workplace environment, anthropometry, and operator seating. Volume three will provide a conceptual IOS design specifically for fighter/attack simulator applications. It will contain detailed design specifications, and the function and operation of the IOS controls and displays will be explained. When the three-volume report is completed, a sequel will be prepared for tanker/transport/bomber simulators.
2. IOS Interactive CRT Controls Evaluation. A study has been initiated in which alternative control devices for IOS interactive CRT displays will be evaluated. The controls are: (1) touch panel, (2) light pen, (3) numeric keyboard, and (4) voice actuation. These controls will be compared in relation to three different CRT-presented performance tasks. In one task, the subjects will be asked to load various weapons from a menu onto the pylons of a stylized aircraft. The procedure is to select an appropriate pylon number with the controls, then select and enter the weapon number identifier. For the second task, the subjects will be required to move a cursor from a specific location to another location using the controls, which is analogous to initializing a simulated aircraft at a new location on a CRT-presented IOS navigation map. The third task will require the subjects to enter numeric data to establish the simulator parameters such as altitude, airspeed, heading, radio frequencies, latitude, longitude, and glide slope. The subject sample will consist of IPs who will be tested in a repeated-measures experimental design. Both response times and errors will be recorded to provide the measures of task performance. To date, the CRT displays (performance tasks) are nearly complete and the controls are operational. The test will be conducted on a special-function, part-task trainer which is comprised of a Chromatics video display terminal and a North Star minicomputer.
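As a sketch of how the response-time and error data from such a repeated-measures design might be tabulated per control and task, consider the following. This code is illustrative only and is not from the study; the trial records, field names, and sample values are all hypothetical.

```python
# Hypothetical sketch of tallying repeated-measures trial data: each
# trial records the control device, the task, the response time in
# seconds, and whether the input was erroneous.
from collections import defaultdict

def summarize(trials):
    """Return mean response time and error rate per (control, task) cell."""
    cells = defaultdict(lambda: {"n": 0, "rt_sum": 0.0, "errors": 0})
    for control, task, rt, error in trials:
        cell = cells[(control, task)]
        cell["n"] += 1
        cell["rt_sum"] += rt
        cell["errors"] += int(error)
    return {key: {"mean_rt": c["rt_sum"] / c["n"],
                  "error_rate": c["errors"] / c["n"]}
            for key, c in cells.items()}

# Invented example trials for one of the three tasks.
trials = [
    ("touch panel", "weapon load", 2.1, False),
    ("touch panel", "weapon load", 2.5, True),
    ("light pen",   "weapon load", 3.0, False),
]
stats = summarize(trials)
```

Keeping raw trials and deriving cell summaries afterward mirrors the repeated-measures analysis the study describes: each IP contributes trials to every (control, task) cell.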
3. ASPT IOS Energy Maneuverability Displays. Energy Maneuverability Displays (EMDs) were developed for the IOS of the Advanced Simulator for Pilot Training (ASPT), one for use in conjunction with F-16 flight training and the other for A-10 training. These displays are presented on an IOS vector graphics CRT, and they show the current energy state of the simulated aircraft. Since aircraft maneuverability is a function of available energy, these displays will enable the simulator IP to monitor the maneuvering capability of the aircraft. With this information, the IPs can instruct their student pilots on how to increase energy to maximize aircraft performance without exceeding the structural limits of the aircraft. An experiment will be conducted to assess the utility of these displays for flight instruction at the ASPT IOS. A group of F-16 and A-10 IPs will be asked to monitor the corresponding F-16 or A-10 EMD during a series of preprogrammed flight maneuvers in the ASPT. Following the demonstration, the IPs will be asked to rate the instructional utility of the EMD, and their comments will be solicited via a questionnaire. The EMDs implemented in the proposed study were adapted from the EMDs used on the Navy's Tactical Aircrew Combat Training System (TACTS), which were originally developed by Pruitt (1979).
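The "energy state" such a display presents is, in conventional energy-maneuverability terms, the aircraft's specific energy (energy height). The report does not give the EMD's exact computation; the sketch below shows only the textbook quantity such a display would plot:

```python
G = 9.81  # gravitational acceleration, m/s^2

def specific_energy(altitude_m: float, true_airspeed_mps: float) -> float:
    """Energy height E_s = h + V^2/(2g): potential plus kinetic energy
    per unit aircraft weight, expressed in meters."""
    return altitude_m + true_airspeed_mps ** 2 / (2 * G)

# 5000 m at 200 m/s comes to roughly 7039 m of energy height
print(round(specific_energy(5000.0, 200.0), 1))
```

Because altitude and airspeed trade against each other in this single number, an IP watching it can see at a glance how much maneuvering margin the student has in hand.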
4. IOS Tactics Display Development. R&D has been initiated to develop an IOS display for air-to-surface tactics training. A conceptual model of the display has been designed which will be implemented and evaluated on the ASPT. The display provides the essential aircraft flight parameters and tactical information such as altitude, airspeed, and G-load, as well as an overhead view of the tactical area with fixed range rings to show the range and bearing of ground radar and surface-to-air missile launches. The simulated aircraft is positioned in the center of the display, and the terrain moves in accordance with the speed and heading of the aircraft. In the evaluation, a group of IPs will be asked to monitor the display during a series of preprogrammed tactical missions in which radar will lock on and missiles will be fired. The IPs will be surveyed to determine the utility of the display for instructing/monitoring pilots in simulated tactical training scenarios.
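The range and bearing the display derives for each threat follow from simple aircraft-centered geometry. A sketch under a flat-earth assumption; coordinates and units are illustrative, not taken from the ASPT implementation:

```python
import math

def range_and_bearing(own_xy, threat_xy):
    """Ground range and true bearing (degrees, 0 = north, clockwise)
    from ownship to a threat; x is east, y is north, flat-earth model."""
    dx = threat_xy[0] - own_xy[0]
    dy = threat_xy[1] - own_xy[1]
    rng = math.hypot(dx, dy)
    brg = math.degrees(math.atan2(dx, dy)) % 360.0
    return rng, brg

rng, brg = range_and_bearing((0.0, 0.0), (3.0, 4.0))
print(round(rng, 2), round(brg, 1))  # prints: 5.0 36.9
```

Keeping ownship at the display center means only the threat offsets need updating as the terrain scrolls with aircraft speed and heading.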
5. IOS Integrated Visual Display Development. An experimental effort is underway to develop an IOS integrated visual display for use by IPs during basic (T-37) simulator flight training. The purpose of the display is to provide all the critical flight instrument data on a single CRT, similar in form to a head-up display (HUD). In this effort, a questionnaire was developed and administered to 25 T-37 IPs to determine the display requirements for instructing/monitoring basic training missions including contact flight, aerobatics, formation flight, instrument flight, and navigation. The questionnaire data are currently being analyzed to identify the specific flight information required for each of the missions. The appropriate display symbology will be selected to depict this information, and a display will be designed containing the symbology.
6. IOS Instructional Features Requirements. A survey was recently initiated to identify the utility of advanced instructional features in aircrew training devices. The survey will be administered to IPs at various flight simulation facilities. The IPs will be asked to rate how frequently the IOS advanced instructional features are used in flight training, how easy each feature is to use, and the training value and training potential of the features. A total of 7 features will be examined, including Freeze, Record/Playback, Reset, Demonstration, Crash Override, Automated Performance Feedback, and Automated Adaptive Training. The survey will be conducted at A-10, F-15, and F-4 simulator training sites. The results of the survey will be used as a guide for specifying the required instructional features in future flight simulator IOS designs.
CONCLUSIONS
AFHRL/OT has provided scientific and engineering support to a variety of flight simulator IOS design and evaluation efforts. Many of these efforts have been in support of outside Air Force agencies and have involved the assessment of IOS instructional utility and the development of generalized guidelines for IOS design. Other efforts have been accomplished strictly in-house to solve specific IOS design problems. It is anticipated that AFHRL/OT will provide continued, if not expanded, IOS R&D support. Greater emphasis will be placed on the use of desk-top, special-function trainers in the conduct of future IOS design studies. These devices will be used to investigate IOS control and display designs as building blocks for the development of full-mission flight trainers.
REFERENCES

Gray, T.N., Chun, E.K., Warner, N.D., and Eubanks, J.L. Advanced Flight Simulator: Utilization in A-10 Conversion and Air-to-Surface Attack Training. AFHRL-TR-80-20, AD A094 608. Williams AFB, AZ: Operations Training Division, Air Force Human Resources Laboratory, January 1981.

Pruitt, V.R. Energy Management Training Aid for the Navy's Air Combat Maneuvering Range (ACMR). Report No. N00123-78-C-1371. St. Louis, MO: McDonnell Aircraft Company, McDonnell Douglas Corporation, April 1979.
OPERATIONAL PROBLEMS IN INSTRUCTOR/OPERATOR STATION DESIGNS

DR. JOHN P. CHARLES
Icon, Incorporated
INTRODUCTION
In the past, training devices, as with many other systems, were all too often "provided" to the user and left for him to use as best he could. The user had been involved neither in the specification nor in the design of the device. As a result, the operator interface or instructor operating station (IOS) was generally designed as a simulation control unit rather than as a training control unit. Fortunately, the early devices simulated relatively simple systems and were seldom used for full system or crew training. Thus, the instructor, with the help of a technician, could generally "operate" the device.
Two factors have significantly changed this picture. First, the application of advanced digital technology to training devices has expanded their capabilities enormously. Existing weapon system trainers (WST) can provide multiple targets in an electronic and missile warfare environment with visual and motion simulation. Freeze, dynamic replay, reset, and demonstration functions are generally available. Graphics displays provide the instructor with "pilot view" displays and plots. Hundreds of pages of alphanumeric data on simulation options are provided.
The second factor was the increased emphasis on the use of simulation for training which occurred in the 1970s. Rising costs of weapon system ownership and operation led to a renewed application of simulation to crew training.
Thus, with the requirement for expanded use of simulation and with the technology available to implement the requirement, a wide variety of trainers were developed and installed. Each major weapon system training site now has at least a WST, and in addition, most have part-task trainers (PTT) to support crew position training.
Unfortunately, research and development effort in IOS design did not follow the pace set by simulation technology. Relatively simple consoles, designed primarily to control the simulation, had proven acceptable for the early devices, even though not optimum. In the absence of any more refined requirements, the consoles for the new sophisticated trainers have followed much the same approach. Unfortunately, the consequences have produced serious operability problems as well as sub-optimum training support. While the devices can come close to duplicating the weapon system characteristics, they cannot necessarily be operated to train effectively using that capability.
The Naval Training Equipment Center undertook a series of studies in the 1970s to look at instructor console problems. One of these studies (Charles, 1977) looked at the role of the instructor pilot (IP) in simulator training. The study, which surveyed all of the aviation readiness training squadrons and the related simulator installations, found that severe operability problems existed and that major changes had occurred in the operation and use of the devices. Of importance was the expressed concern that the next generation of trainers, which were then in design and development, would prove inoperable by instructor personnel if the simulation control orientation of the IOS design continued. The extensive use of cathode ray tube (CRT) displays would compound the problem.
IOS PROBLEM AREAS
When operability problems did arise in some of the newer WSTs, the NAVTRAEQUIPCEN undertook a review of the IOSs with the goal of identifying both the causal factors and feasible solutions. Two WST IOSs have been reviewed: the EA-6B WST Device 2F119 and the F-14A WST Device 2F112. In general, the problems which were identified are similar in both devices. They reflect two basic development problems: the lack of human factors engineering effort during design and
development, and the failure to define the operational environment and requirements for the device in sufficient detail.
While the human engineering problems are significant and interact with the operational problems to compound the operability problem, the operational problems are of prime importance. The best human-engineered IOS cannot solve the user or operational problems which exist. Conversely, however, the most user-responsive system will also be ineffective if inoperable.
Although steps have been and are being taken to solve some of the operability problems, the training or utilization problems in general remain. These problems are centered about four areas:

a. definition of the user
b. definition of the use
c. definition of the training functions
d. definition of the operators
While the four are clearly interrelated, each has a unique impact on IOS design requirements since they identify the simulation requirements, the trainer and training functional requirements, and the operator/instructor interface requirements.
USER DEFINITION
The user refers to the units which actually utilize the trainer in their training program. In the Navy, the user and the custodian and maintainer are not the same activity. Most of the WSTs are supported by Fleet Aviation Specialized Operational Training Group (FASOTRAGRU) detachments.
The trainers are maintained and readied for training as scheduled by FASOTRAGRU personnel. They also provide technical support to the operation of the device. The primary users are the Fleet Readiness Squadrons (FRS) and the fleet squadrons. Their requirements, although similar in terms of the weapon system and its operation, differ in terms of the training program content and implementation.
The FRSs conduct several levels of syllabi designed to familiarize the replacement aircrews with the weapon system and its operational missions. The majority of replacement aircrew are recent graduates of the Naval Air Training Command and are designated Naval Aviators and Naval Flight Officers. They can be considered the bachelors degree of the aviation community and have the basic professional qualifications. The FRS training programs are designed to produce aircrews who are qualified to operate a specific weapon system and to join a fleet squadron for additional training prior to deploying. Although many of the training programs have been task analyzed and behavioral and training objectives identified, the thrust of the typical FRS training program is not criterion-based, nor is training to proficiency the primary objective. The FRS has been likened to a graduate school. As in academic graduate schools, while grades are utilized, they are considered more for information than for a pass-fail function. The replacement aircrew are qualified aviators and NFOs, and they are considered to be capable of transitioning to the new system in the allotted training time. When unsatisfactory performance does occur, it is initially handled with the usual options of counseling, extra time, and tutoring.
In summary, the FRS training program, given the limited training time, limited flight time, and limited simulator time, is organized to give the replacement aircrew the maximum training and experience possible while closely monitoring performance for safety and correct operating procedures.
Fleet squadron training is similarly conceived and, except for NATOPS and instrument checks which have associated performance criteria, the rest of the training is aimed at imparting as much experience and knowledge across the mission requirements as possible prior to deployment. Readiness criteria have been developed but are, in general, more experience oriented than performance oriented.
These training programs depart from the classic training approach, which is organized around a highly structured syllabus, testing, and fixed criteria.
In summary, the discussion of the user programs has emphasized the instructor-oriented nature of the programs, since it serves to frame some of the operational problems in existing IOS designs.
USE DEFINITION
The FRS use of the WST varies as a function of the phase of training. The FRS syllabus begins with systems familiarization and progresses through basic flight characteristics to weapon system mission and tactics training. Procedures and position training are conducted on PTTs if they are available. If not, the WST is used to support all phases of training including:

system familiarization
basic flight
operation and basic tactics
advanced tactics
special events
Procedures and basic flight training, when conducted on the WST, generally utilize a manual training mode in which the instructor can control the initial conditions and the evolution of the subsequent events. The instructors consider it essential to control the onset and removal of malfunctions and emergencies. They also find it easier and more realistic to simulate the various controller functions in a manual mode.
Systems operations and basic tactics training is generally conducted in a semi-manual mode if possible. Pre-programmed initial condition sets are used for establishing the target(s) and environment as well as the flight starting point. The instructor then assumes control and adjusts the scenario to meet aircrew performance and training needs.
Advanced tactics and battle problem training events are more fully programmed because of the almost impossible task of trying to control manually the targets and the many variables involved. Some instructor interaction is, however, required to reorganize or reorient the problem based on aircrew actions.
Special events such as spin, high angle of attack, and missile defense training are generally conducted in the manual mode since they involve "free flight" by the student.
Finally, although not yet implemented, programmed and standard mission events are needed to permit crew and unit evaluations. Simulation provides a feasible means of accomplishing such operations since realistic targets and weapons (and weapons effects) are generally unavailable for "shoot-offs".
Fleet squadron use is similar to FRS use except that familiarization training (procedures and basic flight) is not part of the training program. Detailed training event guides and grade sheets are generally not utilized.
In summary, the use of the WSTs in FRS and fleet squadron training includes:

manual modes for familiarization training (if conducted on the device) as well as for special "experiential" events such as spins and missile defense

semi-manual modes for systems and basic tactics training

programmed or formulated modes for advanced tactics and battle or war problems
TRAINING FUNCTIONS DEFINITION
Effective training requires a series of tasks ranging from instructor preparation for the event to record keeping. The set of tasks used to analyze the IOS designs included the following:

a. Prepare - review event, aircrew files, simulator status; get forms, guides, manuals

b. Brief - review event, objectives and procedures with aircrew and instructor staff

c. Initialize - configure consoles and cockpit; select and implement initial conditions or programmed mission

d. Train - instruct; control simulation; monitor performance

e. Evaluate - evaluate aircrew proficiency; diagnose performance problems

f. Debrief - review event results with aircrew and instructors

g. Manage data - update aircrew, staff and simulator files

h. Develop events - create/program/modify events, displays, missions, simulation data

i. Train instructor - train in simulator operation and use
The training device should provide support to each of the functions, especially the brief, initialize, train, evaluate, debrief, and develop functions.
DEFINITION OF OPERATOR
The traditional operators of training devices have consisted of the technician simulator operator (SO) provided by the FASOTRAGRU Detachment and the instructors provided by the squadrons. The SO is typically a technician who is learning console operation prior to being assigned to a maintenance crew. Thus, in general, the SO is not an expert in simulator operation, although such assistance is generally available on call. The SO is available to assist the instructors in "button smashing".
The instructors from the FRS will have completed the instructor training syllabus, which generally includes a short course on simulator use and some on-the-job experience. Of more importance is the fact that the instructor will also be instructing on other trainers as well as in flight. Therefore, with rare exception, he is not dedicated to WST instructing. During his tour of duty, he will also be assigned other squadron duties, and his ground instructing time will taper off as a function of time and duties in the squadron.
The fleet squadron instructors are not trained in simulator operation (except for any prior FRS experience) and cannot be expected to become qualified in operating the trainer.
SUMMARY
The user data which should constrain the IOS design involves the specification of the:

a. user - primarily FRS and fleet squadrons

b. use - primarily system operation and tactics (except when supporting familiarization training)

c. functions - all training tasks

d. operator - the technician operator and the squadron instructors. The operator is generally new to the device, the FRS instructor has limited training in operating the device, and the fleet squadron instructor will have no training in operating the device.
OPERATIONAL PROBLEMS
A variety of problems occur in operating typical WSTs, many of which seriously impact training effectiveness. Some of these problems, which result from failure to consider the operational environment in terms of the user, the use, the training function support, and the operators, are outlined below. The problems will be grouped in terms of IOS configuration and layout problems, display and control problems, operating problems, and training function problems.
IOS LAYOUT PROBLEMS
From the review of IOS installations, it is clear that insufficient attention is given to console location and layout. In general, it appeared that the typical IOS is located in a main traffic flow pattern and becomes a meeting place for personnel in the area. While observation of ongoing training can contribute to learning, a ready-room environment is not conducive to either the training or to beneficial observation. Figure 1 depicts the location and layout of the 2F119 IOS. All traffic to and from the cockpit must pass through the area. In addition to the congestion problem, the aircrew can generally get a glimpse of the upcoming simulation being initialized. Light spills onto the displays when doors are opened. Noise interferes with inter-instructor communications.
ARRANGEMENT PROBLEMS
Most IOS consoles provide a station for each of the instructors as foreseen by the specification and the design. This rarely reflects the use and user needs. One of the major problems occurs in the battle or war problem event, especially for the fleet squadrons. As can be seen in Figure 1, only three instructor stations exist (and none for the SO!). The battle problem instructor, who is responsible for the overall evolution of the event and for evaluation and critiquing of crew performance, has no effective station from which to operate. The IP mans the flight station; the ECMO instructor, the tactics station. In practice, the battle problem instructor mans the ECMO 1 station, where he is forced to share the displays of the other stations, manipulate the station control between the flight and tactics modes of operation, and perform the ECMO 1 instructor functions. The problems are similar for the 2F112, as can be seen in Figure 2. Here, the battle problem instructor is forced to sit behind the instructor stations and function with a clipboard and whatever display he can read from this position.
DISPLAY PROBLEMS
The CRT display provided the designers the opportunity for displaying literally any simulation data available. In many of the WSTs, the opportunity was utilized. The volume of pages required the use of index pages. Table 1 summarizes the display options available on the 2F119 WST flight station displays.

The quantity of data far exceeds the capacity of any IP to access and utilize effectively during training and still be able to monitor and evaluate replacement pilot performance. The options cannot be accessed by an instructor who is not trained and proficient or experienced in the flight station operation. A similar set of displays exists at the tactics station.
Figure 1. Device 2F119 IOS Arrangement
TABLE 1. 2F119 FLIGHT DISPLAY OPTIONS

Title                          Pages
Display Index                      1
Initial Conditions Index           1
Initial Conditions                10
Pilot Instrument Monitor           1
Pilot Console Monitor              1
ECMO 1 Monitor                     1
Procedures Monitor Index           2
Procedures Monitor                99
Malfunctions Index                 1
Malfunctions & Input Codes       132
Function                          10
Parameter Recording                1
Cross Country                      1
Hostile Environment                1
Terminal Area                      1
ACLS                               1
GCA/CCA                            1
CEM Index                          1
CEM (Alphanumeric)          variable
CEM (Graphics)                     1
CEM Summary                        1
Demo Index                         1
Demos                             10
DRED                               1
Visual Status Monitor              3
Graphics Test Display              1
Memory Monitor                     3
System Time                        5

TOTAL                            167
Figure 2. Device 2F112 IOS Arrangement
A similar display approach was implemented on the 2F112.
CRT cockpit monitor displays on the typical console pose serious problems since the data is not displayed in a manner readily interpreted by the instructor, who is also instructing in flight. The displays do not typically parallel either the arrangement or the format used in the aircraft. Figure 3 depicts a typical page used in WSTs and is from the 2F119. As can be seen, the instructor cannot glance at the display and ascertain switch settings without first locating the control by reading the list and then reading the control setting.
CONTROL PROBLEMS
The typical instructor station has two CRT displays along with some cockpit repeater displays. The CRTs are operated by selecting the CRT (and sometimes the specific area on the CRT) to be utilized, then selecting the display mode to be accessed, and finally paging to the data required. The sequence of steps, which involves both switch and light pen operations, for example, is time consuming and requires the instructor's concentrated attention. Errors, which often occur, typically require repeating the entire sequence of steps.
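The access sequence described above can be modeled as a three-step procedure in which any error forces a restart, which is what makes it so costly in instructor attention. A hypothetical sketch; the step names are invented for illustration:

```python
STEPS = ["select_crt", "select_display_mode", "page_to_data"]

def access_attempts(actions):
    """Count full restarts of the select/mode/page sequence.
    `actions` is an ordered list of (step_name, succeeded) pairs."""
    attempts, idx = 1, 0
    for step, ok in actions:
        assert step == STEPS[idx], "steps must be taken in order"
        if ok:
            idx += 1
            if idx == len(STEPS):
                return attempts            # desired data page reached
        else:
            attempts += 1                  # any error: start over from step one
            idx = 0
    return attempts

# one slip while paging forces the whole sequence to be repeated
print(access_attempts([
    ("select_crt", True), ("select_display_mode", True), ("page_to_data", False),
    ("select_crt", True), ("select_display_mode", True), ("page_to_data", True),
]))  # prints: 2
```

A design that let the instructor recover from the failed step alone, rather than resetting to the beginning, would remove most of this overhead.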
Light pens are widely used for simulator control. Although light pens are becoming more reliable, they are still unacceptable for time-sensitive inputs such as malfunction insertion, weapons launches, and training control functions. In the past, poor light pen reliability has severely handicapped console operations.
The typical joystick display-control dynamics involves a step function which renders it unusable for CRT operations. Manual control or flying of targets using the joysticks has proven equally impossible because of the lack of sufficient flight information and because of joystick control axes coupling.

Figure 3. Typical CRT Monitor Display
Cockpit configuration and initial condition control mismatches generally require extensive manipulation of IC procedures to resolve the problem.
Communication options controls on most WST IOSs have proven so time consuming to use and so error producing that the options are not utilized. The control panel is left in "override".
OPERATING PROBLEMS
Manual modes of trainer operation have generally proven difficult to utilize even though required for many events. The exception is the IC which involves starting with the cockpit preflight procedure at the take-off end of the runway. However, even for this IC, the instructor, unless he accesses relevant data pages, is unsure as to fuel state, stores configuration, weather, etc. The major problem is one of knowing on what page and in which display mode the relevant data is available and, when accessed, how to edit it and what the impact will be on inter-related parameters. The task is beyond the "novice" user.
Programming or formulating methods require extensive training and recent experience to utilize. This requirement cannot be met by the typical instructor. The procedures on most WSTs do not provide an "interface" between the instructor and the mission programmer.
Target creation is normally accomplished in simulation parameters rather than in user terminology. Thus, targets are created in terms of a "small fighter" with an "IR" missile and a spot jammer, for example, rather than in terms of a Badger or Backfire or Muffcob. The result is that the instructor does not generally know what the displayed target represents or how to evaluate the aircrew's tactics relative to the target.
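The remedy implied here is a lookup that accepts the instructor's threat terminology and emits the simulation's native parameters. A hypothetical sketch; the platform names and the parameter vocabulary come from the text above, but every attribute value below is an invented placeholder, not real threat data:

```python
# Attribute values are invented placeholders for illustration only.
THREAT_CATALOG = {
    "Badger":   {"airframe": "large bomber",  "missile": "radar", "jammer": "barrage"},
    "Backfire": {"airframe": "large bomber",  "missile": "radar", "jammer": "spot"},
    "Muffcob":  {"airframe": "small fighter", "missile": "IR",    "jammer": "spot"},
}

def create_target(platform_name: str) -> dict:
    """Translate instructor terminology into simulation parameters,
    keeping the user-facing name for the IOS display."""
    params = dict(THREAT_CATALOG[platform_name])
    params["label"] = platform_name
    return params

print(create_target("Muffcob")["airframe"])  # prints: small fighter
```

Because the catalog keeps the user-facing label attached to the simulation parameters, the IOS can display "Muffcob" to the instructor while the simulation still receives the parameters it needs.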
TRAINING FUNCTION PROBLEMS
Ideally, a training device should be able to support each of the training tasks involved in the training mission. Few do. To the extent that they do not, additional workload is placed on the instructor or SO, and additional training may be required. Table 2 is a summary of three WSTs' support to the training functions outlined earlier.
TABLE 2. TRAINING FUNCTION SUPPORT

Function          2F111   2F112   2F119
Prepare           none    none    none
Brief             none    none    none
Initialize        yes     yes     yes
Train             yes     yes     yes
Evaluate          none    none    some
Debrief           none    none    none
Manage Data       none    none    none
Develop events    yes     yes     yes
Train instruc.    none    none    none
As can be seen, the typical WSTs are not designed to support training functions. As has been discussed in the previous sections, even those tasks in Table 2 which are recorded as "yes" present marginal support to the function. Yet, each of the devices has data stored within the system which would be useful for the instructor in:

reviewing the scenario and options prior to briefing the student

reviewing the weapon system operating procedures and simulation options

briefing the aircrew on event procedures, scenario and objectives

briefing the instructor staff on the scenario and training procedures

debriefing the aircrew on the results of the training

debriefing the instructors on the problems and changes required

updating training records
Communications simulation is performed manually in most trainers and can require up to two instructors full time for some battle problems. Simulation of controllers and providing relevant background communications, especially for sequential battle problems, is almost impossible.
None of the WSTs reviewed provide the capability of either briefing or debriefing the aircrew (or instructor staff) without utilizing training time and the instructor console. None of the WSTs provide the additional displays and controls required or the interface required.
Hard copy output on most WSTs is typically so slow and involved that it is not used. Yet, all instructors agree that hard copy is desirable, especially if no debriefing displays are available.
Malfunction options far exceed a usable set on most WSTs (typically in the hundreds). In addition, information on the simulation characteristics of the malfunctions is not provided. As a result, the instructor has no information on the different cockpit indications and relevant procedures, for example, for different engine fires or flight control system failures.
SUMMARY
Recent reviews of the IOSs of several WSTs have documented a variety of operational problems. In general, they result both from the failure to consider user requirements and from the failure to design the IOS to basic human factors engineering criteria.
REFERENCES
1. Charles, John P. "Instructor Pilot's Role in Simulation Training (Phase II)." NAVTRAEQUIPCEN 76-C-0034-1. Naval Training Equipment Center, Orlando, FL, August 1977.

2. Charles, John P. "Device 2F119 (EA-6B WST) Instructor Console Review." NAVTRAEQUIPCEN 1083-1. Naval Training Equipment Center, Orlando, FL (in printing).
THE REAL WORLD AND INSTRUCTIONAL SUPPORT FEATURES IN FLYING TRAINING SIMULATORS

Clarence A. Semple and Barton K. Cross, II
Canyon Research Group, Inc.
741 Lakefield Rd., Suite B
Westlake Village, CA 91361
ABSTRACT
The real world is defined as the operational training environment. Instructional support features (ISF) are simulator hardware and software capabilities that allow instructors to manipulate, supplement, and otherwise control the student's learning experiences with the intent of promoting the rate at which skills are learned and maximizing the levels of skill achieved. This paper summarizes research findings, observations, and experiences dealing with the design and use of the following ISFs: automated demonstration; record and replay; programmable and manual malfunction control; automated cueing and coaching; automated controllers; computer controlled adversaries; and quantitative performance measurement. Definitions of the features are given, followed by comments on their design and use. A final section deals with persistent needs for improved instructor training in the use of simulation in general and in ISFs in particular.
BACKGROUND
The increasing complexity of modern weapon systems, combined with their ever increasing performance capabilities, has created a situation in which the performance capabilities of crewmembers are becoming commensurately more critical. As crew performance capabilities become more critical, the training of those capabilities becomes an increasingly difficult problem in that less performance variance can be tolerated.
Training using operational equipment is becoming increasingly difficult. Acquisition, operation, and maintenance costs are all high. Skilled maintenance and instructional personnel often are scarce. Finally, environmental constraints constantly encroach upon real world training areas and impact how they can be used.
The use of aircrew training devices (ATDs), while not as glamorous as inflight training, has several inherent advantages. ATDs still are relatively inexpensive to procure, operate, and maintain, at least in comparison with actual equipment. Consequences of operator error or inability to appropriately respond to normal and emergency requirements are significantly less in a simulator. Control over training events is vastly superior to that available in the air. Simulators can be used to create low probability conditions at will for training purposes, training events easily can be repeated until necessary operator skill levels are achieved, and their utilization rates are significantly greater than actual equipment.
The question now has become how to maximize the training utility of ATDs rather than whether to use them in training. With the proper design, application and use of instructional support features (ISFs) in training devices, the utility and advantages of training devices over actual equipment can become even greater. The training simulator can become an even more integral part of the real-world operational training environment.
INSTRUCTIONAL SUPPORT FEATURES
ISFs are features of training simulators that are specifically designed to facilitate the instructional process. ISFs are the hardware and software capabilities that allow instructors to manipulate, supplement and otherwise control the student's learning experiences with the intent of promoting the rate at which skills are learned and maximizing the levels of skills achieved. They are designed to allow control of instructionally related variables such as rate of content presentation, amount of content presented in a single block, amount and distribution of practice, types of tasks practiced, knowledge of results, and measures of performance. ISFs are the capabilities that transform a simulator from simply a practice device into a flexible element of the total training system. Common instructional support features include freeze, record and replay, and programmable initializing conditions.
Early ISF implementation efforts were frequently viewed as compromising the hard won fidelity of simulators. While it certainly is possible to implement ISFs in ways that will compromise the fidelity of simulation, lower levels of fidelity are not the necessary consequence of introducing ISFs into a training simulator. The features can be implemented in ways that will leave the fidelity of the simulator intact and at the same time enhance the training effectiveness and utility of the device. The commonly assumed justification for high levels of fidelity is to enhance training effectiveness. Instructional support features are capable of still further enhancing training effectiveness and efficiency. Thus, instructional support features can be regarded as "fidelity plus." This statement assumes that the implementation of the support features is accomplished in a professional manner that does not compromise the fidelity of the simulation itself (e.g., video display terminals used for instructional feedback are not placed in the middle of an otherwise accurate instrument panel).
This paper does not attempt a discussion of each ISF available with current technology. Rather, several were selected on the basis of difficulties that seem to persist in their design and use. The content of this paper is based in part upon the rather meager experimental research data that exist on instructional support features, and in part upon observations of training applications of various instructional support features. The following features are addressed:

Automated demonstrations;
Record and replay;
Programmable and manual malfunction control;
Automated cueing and coaching;
Automated controllers;
Computer controlled adversaries; and
Quantitative performance measurement.
Definitions of the features are given,
followed by comments on their design and use.
A final section deals with persistent needs
for improved instructor training in the use
of simulation in general and ISFs in partic-
ular.
Automated Demonstrations
Automated demonstrations (auto-demo) permit the standardized presentation of a mission segment or entire simulated flight. All simulator systems, including the visual scene (if present), motion system, primary flight controls and displays, crew communications, and sensor displays respond as though skilled aircrews were at the controls.

Auto-demo requires substantial two-way communication between ATD controlling software and hardware, and crew station controls and displays. Given this requirement, auto-demo is found only in more sophisticated computer controlled ATDs. Technically, auto-demo can be constructed for any segment of flight (e.g., takeoff, fly straight and level, perform aerobatic or air combat maneuvers, deliver weapons, fly a standard approach, and land) without the student or instructor operating any primary controls.
Two instructional values commonly are assumed for auto-demo. First, the feature provides a performance model that the student can observe, analyze, pattern his own behavior after, and use as a reference for self-evaluation in subsequent trials. The student's workload is considerably reduced during auto-demo, providing him with a better opportunity to observe relationships among cues and system responses. Similarly, instructor workload is considerably reduced, allowing a better opportunity for instructional interaction with the student.
A second value is that automated demonstrations may be the only way to show the student what is expected of him in ATDs that exactly reproduce crewstation physical configuration. When the crewstation is either single seat (e.g., F-15) or incorporates controls and displays necessary to execute manual demonstrations at only one crew position (e.g., F-4), there is no opportunity for an instructor to enter the crewstation and do a hands-on demonstration. Therefore, the only remaining avenue for a demonstration is by means of auto-demo.
In an examination of auto-demo utilization, Semple, Cotton and Sullivan (1981) concluded: "The training value of an automated demonstration will be greatest when the cues, responses and task performance requirements being demonstrated are new to the student (i.e., they are not highly familiar to him)."
Navy research with the A-7 Night Carrier Landing Trainer (NCLT) has shown that auto-demo has a "significant" instructional value (Brictson and Burger, 1976). However, there were no data quantifying the specific contribution that the feature made to overall training effectiveness or efficiency. This application does represent a case, however, where an instructor-flown demonstration was not possible because the ATD is a single seat configuration. No other method for demonstration was possible.
The only known experimental study to address the auto-demonstration capability was reported by Hughes, Hannan and Jones (1979). This study attempted to address the instructional impact of both the auto-demo and record and replay instructional features. Subjects were divided into three groups, with each group receiving the same basic instruction followed by either extra practice, recorded replays in lieu of one practice trial in each block of trials, or an auto-demo in lieu of one practice trial in each block of trials. The results indicated that extra practice was most instructionally beneficial, followed by record and replay and auto-demo in that order. The generalizability of these conclusions must be questioned, however, in that the basic instruction for all three groups incorporated auto-demo, and automated instruction was used in lieu of instructors. Given a real-world instructional setting, these results might not hold up.
There are two basic methods for creating auto-demos. One is to use an ATD's record and replay capability. This method requires that a proficient aircrew member fly the necessary profiles in the desired manner using desired techniques. This method has the advantage that highly proficient crewmembers usually are available and the disadvantage that it is difficult, even for highly proficient aircrew members, to fly profiles with the perfection often judged necessary for a performance model for the student. Also, the lack of well defined criteria for the largest percentage of flying tasks makes it difficult for instructors to agree on what constitutes an acceptable demonstration.
The second method for creating an auto-demo is to develop customized software for each demonstration. This approach has the advantage that mathematically perfect demonstrations can be created. It has the distinct disadvantage, however, of requiring computer software specialists to create new demonstrations or to modify initial demonstrations. Both manufacturer and user personnel view the direct software approach as the more time consuming and costly of the two methods available.
The need for "perfect" demonstrations must be questioned for either method of development. Demonstration of "typical" performance may be more appropriate in that no one flies "perfect" maneuvers in the real world. Appropriate tactics, techniques and perhaps even typical errors and performance problems should be emphasized, rather than precise flight path control.
Record and Replay
Record and replay is the capability of simulators to record relevant system parameters and then use these data to recreate the student's performance. Typically, the last five continuous minutes of performance are recorded. After freezing the ATD, the instructor selects the point in time during the last five consecutive minutes that he wants replay to be initiated. All events of consequence are reproduced during replay, including the visual scene (if present), motion system operation, primary flight control and display movements, and appropriate sensor display content. The ATD performs as though the student was again flying the replayed segment exactly as it had been flown before.
Experience has shown that record and replay is used more often in the training of tasks that are new to the student and are relatively complex (for him). In general, the use of record and replay has centered on undergraduate pilot training and on the training of new (to the pilot) and relatively complex advanced flying skills, including night carrier landing and air combat maneuvering.
The primary instructional benefit attributed to record and replay is that it provides both students and instructors with an objective recreation of the student's performance that can be examined for problems, errors and their causes. In short, the student is provided with objective knowledge of results and the instructor with concrete evidence from which to suggest areas or techniques for the improvement of the student's performance. Replay seems to be most useful when the student is not aware that he has made an error or is uncertain of the precise cause of his error and resulting performance problem. This situation is most apt to arise when complex tasks are being initially learned. Following interviews with instructor pilots throughout the military, Semple et al. (1981) concluded: "The training value of record and replay will be greatest when the cues, responses and task performance requirements being learned are new to the student."
Record and replay is currently used almost exclusively for pilot and/or copilot training. Replay could be used, however, in a crew training environment as a means to evaluate and improve crew interaction. Voice replay also would be required in this application since some of the better indicators of crew interaction are crew communications. This potential value for replay depends heavily on how well individual crewmember responsibilities are defined.
The record and replay feature requires substantial two-way communication between ATD controlling software and hardware, and crew station displays and controls. Thus, the feature is found only in more sophisticated, computer controlled ATDs. ATD hardware and software must be configured to induce changes in the operation of simulated subsystems without intervention by the student or the instructor.

Several factors should be considered when implementing a record and replay feature. For replay, a total of five minutes appears to be more than adequate. In most uses, 1.5 to 2.0 minutes probably would be sufficient. The amount of replay time to be incorporated into an ATD should be determined on a case by case basis. An alternative to specific time increments also should be given consideration. A reasonable alternative would be to provide the instructor with a cueing control. Replay then could be started at a predetermined time prior to when the cueing control was activated (e.g., 20 seconds, or whatever would be reasonable for the particular training application). This would allow instructors to relate replay to training events rather than clock time. Voice replay should receive serious consideration for training tasks where voice communication is a significant element of total task performance. Finally, there may be training and user acceptance values in being able to deactivate replay part way through, thereby allowing the student to assume manual control using a "fly out" capability.
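The event-referenced replay scheme described above (start replay a fixed interval before the instructor's cue, rather than at a clock time) can be sketched as a rolling buffer of recorded state frames. The frame rate, buffer length and cue offset below are illustrative assumptions, not values from any fielded ATD.

```python
from collections import deque

FRAME_RATE_HZ = 30          # assumed simulation frame rate
BUFFER_SECONDS = 120        # 1.5 to 2.0 minutes, per the text's suggestion
CUE_OFFSET_SECONDS = 20     # replay starts this far before the cue

class ReplayBuffer:
    """Rolling record of recent simulator state for instructor replay."""

    def __init__(self):
        # deque with maxlen silently discards the oldest frames
        self.frames = deque(maxlen=FRAME_RATE_HZ * BUFFER_SECONDS)

    def record(self, frame):
        """Append one frame of state (controls, displays, voice, etc.)."""
        self.frames.append(frame)

    def replay_from_cue(self):
        """Return frames starting CUE_OFFSET_SECONDS before 'now', so
        replay is tied to a training event rather than clock time."""
        start = max(0, len(self.frames) - FRAME_RATE_HZ * CUE_OFFSET_SECONDS)
        return list(self.frames)[start:]

buf = ReplayBuffer()
for t in range(FRAME_RATE_HZ * 60):          # record one minute of flight
    buf.record({"t": t, "altitude": 15000})
segment = buf.replay_from_cue()
print(len(segment) == FRAME_RATE_HZ * CUE_OFFSET_SECONDS)  # True
```

A "fly out" capability would simply stop iterating over the returned segment and hand the live simulation back to the student.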
Malfunction Control
Procedures trainers, part task trainers, operational flight trainers and full mission simulators generally include capabilities to simulate a variety of malfunctions. It is widely accepted that ATDs provide a safe and controlled environment for training responses to malfunctions and emergencies. In such settings, responses to single and multiple malfunctions can be trained and practiced either in isolation or in mission contexts. The number of malfunctions that instructors can present to a student typically ranges from 60 to several hundred. The number typically used ranges from 20 to 50.

The most common method of inserting and removing simulated malfunctions is manual operation of controls by the instructor. Types of controls now in use include dedicated pushbuttons, programmable pushbuttons, alphanumeric keyboards, and touch panels.
Another alternative is automated malfunction insertion. One assumed value of automatic malfunction insertion and removal is that it unburdens the instructor from more routine tasks, freeing his attention and time for more important instructional activities. A second value involves student self practice. Some training managements encourage students to practice whenever ATD time is available, even if instructors cannot be present. In these cases, automatic malfunction insertion could be used to structure training sessions.
Little creative thought has been applied to meaningful methods for automatically inserting malfunctions. Time into the mission is the most common method and has proved unworkable because mission time often does not correlate with mission events. For example, ATD clock time into a mission typically does not take freeze into account. If the ATD is frozen so the instructor can work with the student, the malfunction clock keeps running. As a result, a new malfunction can be inserted at a very inappropriate time in the mission. The acceptability of this circumstance is low, both instructionally and in terms of user acceptance.
Semple, Vreuls, Cotton, Durfee, Hooks and Butler (1979) developed a functional specification for a simulator instructor's console which, among other things, incorporated automated malfunction insertion and removal. In the initial concept, the instructor could select a malfunction and select from a list of initiating conditions for that malfunction. The malfunction would be inserted when the initiating conditions were met and would be removed following the completion of correct student responses. In an experimental prototype of the system, automatic malfunction insertion and removal were incorporated, but instructors did not have a choice of initiating conditions. During a preliminary test of the system, instructors did not respond favorably to the automated malfunction scheme (Semple, 1982). Their strongest complaint was lack of flexibility. However, instructor training in system utilization was minimal, and it is not known whether their comments would hold if instructor training in the feature had been more rigorous.
There are four general issues which should be considered when implementing malfunction insertion options in an ATD. One is the amount of malfunction cue recognition which should be trained in versus out of the ATD. Using valuable training devices to train content which would be equally appropriate for other media is inefficient training design. Ease and convenience of use by instructors and operators is equally important. If a feature is difficult or troublesome to use, experience shows that it is likely to go unused (Semple et al., 1981). Memory aids should be designed into the system to remind instructors which malfunctions are engaged and which are available. Finally, with automated malfunction insertion, programming must allow for meaningful failures relative to the mission rather than to an arbitrary time-line, and instructors must be able to override automated insertions and removals.
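The condition-triggered scheme described in this section (insert a malfunction when mission conditions are met, remove it after correct student responses, hold everything while the ATD is frozen, and allow instructor override) can be sketched as follows. All names, trigger conditions and state keys are hypothetical illustrations, not part of any cited specification.

```python
class AutoMalfunction:
    """Insert a malfunction on mission conditions, not clock time."""

    def __init__(self, name, trigger, corrected):
        self.name = name
        self.trigger = trigger        # state -> bool: initiating conditions
        self.corrected = corrected    # state -> bool: correct response done
        self.active = False
        self.overridden = False       # instructor override, per the text

    def update(self, state):
        """Call once per simulation tick with current mission state."""
        if self.overridden or state.get("frozen"):
            return self.active        # freeze holds the malfunction clock
        if not self.active and self.trigger(state):
            self.active = True        # initiating conditions were met
        elif self.active and self.corrected(state):
            self.active = False       # remove after correct response
        return self.active

mal = AutoMalfunction(
    "generator failure",
    trigger=lambda s: s["gear"] == "up" and s["altitude"] > 1000,
    corrected=lambda s: s.get("gen_reset", False),
)
print(mal.update({"frozen": False, "gear": "up", "altitude": 1500}))  # True
print(mal.update({"frozen": True}))                                   # True
print(mal.update({"frozen": False, "gear": "up", "altitude": 1500,
                  "gen_reset": True}))                                # False
```

Note that, unlike the clock-time method criticized above, nothing fires while the ATD is frozen, and the instructor can disable the malfunction entirely by setting the override flag.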
Automated Cueing and Coaching
Cueing messages alert students, e.g., "Check altitude." Coaching messages instruct, e.g., "Return to flight level 150." Automated cueing and coaching are not common in present ATDs. Their assumed instructional values center on the promptness and accuracy of guidance and feedback information provided to the student, and the unburdening of instructors through automation. Such systems also can provide feedback and guidance to students when a qualified instructor may not be present, as in "extra time" practice. They require good automated performance measurement to determine what messages should be given to the student and the timing of their delivery. Automated cueing and coaching systems can be disruptive if messages occur too frequently, which suggests the need to be able to deactivate the system, and further suggests the desirability of decision logics designed to keep the frequency of cueing and coaching messages within acceptable bounds in relation to student skill levels.
A programmed mission scenario typically is required so that desired performance is defined clearly. A quantitative performance measurement (QPM) system and additional decision logics also are required for determining message content and timing. A QPM capability is needed to sense when student performance is less than what is required for the task he is practicing. When differences are found, system logics are needed to identify the appropriate message content. Typically, a cueing message would be transmitted first and performance monitoring would continue. If the performance deficiency was not corrected, the appropriate coaching message would be transmitted. If the deficiency continued, either coaching messages could be continued or the instructor alerted so that he could intervene.
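The cue-then-coach-then-alert escalation just described is a small state machine. A minimal sketch, with message wording borrowed from the examples above and the escalation thresholds assumed rather than prescribed:

```python
def monitor(deviations):
    """Escalate feedback as an out-of-tolerance condition persists:
    first a cueing message, then coaching, then alert the instructor.
    `deviations` is a per-check sequence of booleans, True meaning
    the student is out of tolerance on that check."""
    actions = []
    strikes = 0
    for out_of_tolerance in deviations:
        if not out_of_tolerance:
            strikes = 0               # deficiency corrected; reset
            actions.append(None)
        else:
            strikes += 1
            if strikes == 1:
                actions.append("CUE: check altitude")
            elif strikes == 2:
                actions.append("COACH: return to flight level 150")
            else:
                actions.append("ALERT INSTRUCTOR")
    return actions

print(monitor([False, True, True, True, False]))
# [None, 'CUE: check altitude', 'COACH: return to flight level 150',
#  'ALERT INSTRUCTOR', None]
```

Widening the tolerance that produces the `deviations` input, as suggested later in this section, is the natural way to keep message frequency acceptable for early-training students.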
Three technologies are available for creating the messages to be transmitted to the student: audio tapes; digitally stored speech; and computer generated speech. Computer generated speech is relatively new but is readily understandable. Digitally stored speech is even more "human." Taped messages often involve prolonged search times and mechanical unreliabilities. Digitally stored or computer generated speech technologies are well suited to automated cueing and coaching message delivery because the messages involved usually are brief, and computer memory requirements are well within reason. Also, changes in message content are easily accomplished.
The issue of which messages should be built into an automated cueing and coaching system must be addressed on a case by case basis. The analysis should begin by identifying typical performance problems that go unattended by the student. Commonalities among the performance problems (and associated cueing and coaching messages) then should be identified so that the smallest possible set of cueing and coaching messages can be identified. Draft messages then should be developed and reviewed for clarity and brevity. Finally, the automated cueing and coaching system should be tried out in a representative target operational training setting before its design is finalized.
Present applications of automated cueing and coaching center on basic flight and navigation task training. In the future, it may be possible to incorporate these capabilities into procedures task training. In one application, use of the feature varied considerably from instructor to instructor, which was to be expected because of lack of instructor training on potential values and limits. Some instructors indicated that the feature was used "quite often" by students who came in to practice on their own, but they could not quantify the amount of use.
It is likely that the use of looser performance tolerances is desirable early in training to trigger cueing or coaching messages. This could serve to reduce the number of messages transmitted to the student during early training when his abilities to perform may be significantly less than at the conclusion of training, and when distractions may be of negative training value. However, there presently are no guidelines for determining these tolerances. Further research is needed.
Automated Controllers
To control means to regulate or direct. Instructors often play the roles of air traffic controllers or tactical controllers, or they control the actions of simulated airborne threats. The automated controller instructional support feature is designed to assist ATD instructors in providing the controller function.
Automated controller systems incorporate models of the specific operational situations that they control and require automated performance measurement capabilities to relate actual student performance to desired performance. Controller messages are then appropriate to both the original incoming message and the operational situation. The combination of automated speech understanding, situation recognition and computer generated speech is becoming a powerful instructional tool for automated controller applications. A typical example involving both speech understanding and speech synthesis might be:
Incoming communication, from pilot:
"Approach Control - X RAY 1 turning to final"

Situation:
Aircraft X RAY 1 is turning onto the final ILS approach at the correct altitude. Environmental conditions are those selected for the exercise.

Outgoing communication, from automated controller:
"X RAY 1 you are cleared to land. Wind now 150 at 20 gusting 27"
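The exchange above amounts to a rule that maps a recognized pilot phrase plus the current situation model to a synthesized reply. A minimal sketch of that mapping, in which the phrase matching, situation keys and tolerance are invented for illustration (real systems of this era used IWR hardware and dedicated situation models):

```python
def controller_reply(recognized, situation):
    """Map a recognized pilot phrase plus the situation model to a
    controller message. Returns None when the situation does not match,
    so spurious clearances are not issued (a failure mode discussed
    later in this section)."""
    if "turning to final" in recognized:
        on_profile = (situation["phase"] == "final"
                      and abs(situation["altitude"]
                              - situation["glideslope_alt"]) < 200)
        if on_profile:
            wind = situation["wind"]
            return (f"{situation['callsign']} you are cleared to land. "
                    f"Wind now {wind['dir']} at {wind['speed']} "
                    f"gusting {wind['gust']}")
    return None   # unrecognized phrase or off-profile: stay silent

sit = {"callsign": "X RAY 1", "phase": "final", "altitude": 1500,
       "glideslope_alt": 1450,
       "wind": {"dir": 150, "speed": 20, "gust": 27}}
print(controller_reply("Approach Control X RAY 1 turning to final", sit))
# X RAY 1 you are cleared to land. Wind now 150 at 20 gusting 27
```

The returned string would be handed to a speech synthesizer; returning None when the situation model does not match is one way to avoid the spurious-command problem described below.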
Synthetic voice-based controllers are expected to be used increasingly for many ATD voice applications with highly structured vocabularies.
Potential training benefits stemming from automated controllers lie in four areas: 1) unburdening instructors and/or ATD operators from playing controller roles; 2) increasing the timeliness and correctness of controller messages and feedback; 3) unburdening instructors from the measurement of verbal task performance (and associated record keeping); and 4) providing a new medium through which students and ATDs can interact in a highly natural manner. As an example of the fourth point, it is technically possible for the student to ask the ATD, "How did I do on that bomb run?" If the system has the necessary performance models and performance measurement capabilities, it could respond "Very well," and provide a detailed performance diagnosis if desired. Automated voice technology also opens opportunities for very precise, automated student coaching and cueing.
Current training systems which incorporate computer speech understanding are limited to individual word recognition (IWR) technology to interpret individual words or very short phrases. This technology requires very precise, stylized speech by the human and requires each student to repeat each phrase or word up to 10 times to "train" the computer to understand what was said.
In a recent prototype training system evaluation, recognition rates of 50 to 97% were found, with an average of 86% (McCauley and Semple, 1980). This is far below the 95 to 99% recognition rates possible under ideal conditions (Lea, 1980). Connected speech recognition technology has recently surfaced, allowing people to speak more naturally, without the stylization constraints required by IWR. Also, connected speech systems seem to be easier to "train."
With respect to computer generated speech, present technology is quite adequate for creating words and sentences that can be understood by the human. Work continues on ways to make the computer generated speech sound more natural. Finally, much of the technology needed to create the mathematical models and performance assessment capabilities required by automated controllers also exists. However, it still is the case that all such models require experimental testing and fine tuning.
The design of automated controller models involves two principal considerations. First, the technology of computer speech understanding is improving very rapidly. Computer speech technology developments in the last seven years have emphasized "applications," rather than the development of basic principles of speech understanding and synthesis. This has resulted in certain system inadequacies at this time, but dramatic improvements are currently under development and will become available in the near future (Cotton and McCauley, 1982).
A second and important consideration is the design of the operational performance model that drives the controller. Early controller models, for example, were derived from "text book" procedures for performing the maneuvers that were being controlled. In the operational world, pilots seldom fly profiles strictly according to procedures. Human controllers are aware of this and respond accordingly. For example, a pilot may choose to turn to intercept his final approach course at a distance from touchdown and at an altitude that an automated controller has not been programmed to recognize as the initial point for a final approach to landing. Two things can result. One is that the controller model may issue spurious commands because it has not correctly recognized initial conditions for the start of the approach. Second, the automated performance measurement system that provides information to the controller model also may be "fooled" because of a departure from the procedure it has been programmed to accept as baseline performance. This can provide the controller model with inaccurate information and can result in a poorer score on the approach than actually was earned.
An automated ground controlled approach (GCA) controller has been successfully applied to an F-4E simulation as an integral part of the Automatic Flight Training System (AFTS) (Swink, Smith, Butler, Futas and Ongford, 1975). This instructional application of automated speech technology marked the beginning of a new era of automated controllers for ATDs. While the F-4E AFTS GCA controller required a resident general purpose computer and a disk memory system to provide a limited repertoire of GCA oriented words and phrases, modern microelectronics technology now offers similar capability on 2 to 4 chips with repertoires of up to 200 words. The AFTS technology was, however, found sufficient for its purpose and subsequently employed in A-7 Air National Guard and Greek Air Force training.
There are many prototype systems either under development or in testing which incorporate automated controllers, including the Navy's F-14A operational flight trainer (Semple et al., 1979), the Precision Approach Radar Training System (PARTS) (McCauley and Semple, 1980) and an Air Intercept Controller (AIC) training system (McCauley, Root and Muckler, 1982).
The use of controller models in aircrew training is relatively new. The technology and the "lessons learned" data bases require expansion. It is strongly recommended, therefore, that all automated controller models be evaluated and refined during the development process to ensure that the models function accurately before their design is frozen.
Computer Controlled Adversaries
Computer controlled adversaries frequently are referred to as "iron pilots." They are computer models that control the actions of simulated adversary aircraft. Iron pilots have been used in visually equipped air combat simulators such as the Air Force SAAC, Northrop Corporation's LAS/WAVS, and NASA's Differential Maneuvering Simulators. Properly designed, they can provide realistic adversary maneuvering while unburdening the instructor from controlling the adversary. When combined with an automated performance measurement capability, summary information can be generated describing engagement final outcome, offensive/defensive times for each aircraft, time in the gun envelope, time in missile envelopes and similar performance information.
Three primary instructional values are associated with computer controlled adversaries. One is the repeatability of the behavior of the adversary, which is consistent and predictable (by instructors) during training and provides a more consistent baseline against which to evaluate student performance and diagnose learning problems. A second value is the unburdening of the instructor during training so he can concentrate on student performance and provide more meaningful and timely guidance and feedback. A third value is the lessening of specialized instructor skills that must be developed to continuously control a simulated adversary aircraft from a remote console.
Iron pilots with selectable levels of pilot skill hold considerable potential for air combat training. Easier adversaries could be used earlier in training. As the student's skill levels increase, more difficult adversary reactions could be selected. The progressive approach to adversary capabilities could hasten the learning process. This was the rationale behind the "normal" and "difficult" autopilot adversaries developed for the Northrop LAS/WAVS simulation. Instructors in LAS/WAVS used both difficulty levels for the training of transitioning students (Spring, 1976; Payne, et al., 1976).
Experience gained with iron pilots in the SAAC device and the Northrop LAS/WAVS suggests three relevant design considerations. The first is that adversary actions controlled by iron pilot models must be realistic. Original iron pilots were "too good" and consequently were unrealistic. They operated on perfect information, and their decisions were made almost instantly. This made them virtually unbeatable. They had little training value as a result. When iron pilots are designed to provide realistic maneuvering, they are well received by instructors and are used extensively. The second consideration is that iron pilots with selectable "skill levels" should be developed so that adversary responses can coincide with student pilot skill levels during training. The third consideration is to incorporate fundamental or basic adversary maneuvering capabilities (such as simple turning maneuvers) for use early in basic air combat training.
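One way to realize selectable skill levels, consistent with the "too good" critique above, is to degrade the adversary's information and delay its decisions rather than change the aircraft model. The skill settings and parameter names below are invented for illustration and are not the values used in SAAC or LAS/WAVS.

```python
import random

# Illustrative skill settings: lower skill = noisier sensing, slower reactions
SKILL = {
    "normal":    {"reaction_s": 1.5, "sensor_noise_deg": 5.0},
    "difficult": {"reaction_s": 0.5, "sensor_noise_deg": 1.0},
}

def adversary_heading(true_bearing_deg, skill, rng=random.Random(0)):
    """Command a turn toward a *noisy* estimate of the student's bearing,
    applied after a reaction delay, so the iron pilot is beatable instead
    of operating on perfect, instant information."""
    cfg = SKILL[skill]
    noise = rng.gauss(0.0, cfg["sensor_noise_deg"])
    return (true_bearing_deg + noise) % 360, cfg["reaction_s"]

heading, delay = adversary_heading(90.0, "normal")
print(round(delay, 1))  # 1.5
```

Switching the `skill` argument as the student progresses gives the graduated "normal"/"difficult" behavior described above without touching the underlying flight model.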
Quantitative Performance Measurement
Quantitative performance measurement (QPM) for training is the computer-based monitoring, recording, processing and displaying of objective, quantitative information for describing and diagnosing student performance. QPM systems have been fairly common in research simulators for over ten years, but research systems are not well suited for operational training. They have been tailored for research use and frequently produce voluminous performance data that require subsequent statistical processing. A QPM system designed for use in training must perform all statistical and other processing of performance data in real or near-real time so that students and instructors are provided with useful, concise and timely performance feedback information.
Practically all quantitative measurement capabilities in existing ATDs, or ATDs soon to be delivered, are best described as performance monitoring and data recording capabilities. They allow instructors to select tolerance bands (e.g., +/- 100 feet) around various performance parameters (e.g., altitude). The system then monitors for cases that exceed tolerance band values, and records and/or reports out-of-tolerance conditions. Such limited performance monitoring capabilities have been applied effectively to drive automated cueing and coaching systems where individual parameter variations have been assumed to have instructional meaning. However, such capabilities have found little acceptance for performance evaluation and learning problem diagnosis in day-to-day training. In other words, such systems are not used by instructors. The volume of data produced by such systems often is overwhelming and is difficult to integrate and interpret. Also, instructors almost never are trained to use individual parameter variation data meaningfully for training.
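The tolerance-band monitoring described above reduces to a simple comparison; the sketch below (parameter names, sample values, and the function itself are invented for illustration, not from any fielded ATD) records the out-of-tolerance cases the system would report:

```python
# Hypothetical sketch of tolerance-band monitoring: the instructor
# selects a band (e.g., +/- 100 ft around a commanded altitude), and
# the system records any excursions outside the band.

def monitor_parameter(samples, target, tolerance):
    """Return the (time, value) pairs falling outside target +/- tolerance."""
    out_of_tolerance = []
    for t, value in samples:
        if abs(value - target) > tolerance:
            out_of_tolerance.append((t, value))
    return out_of_tolerance

# Altitude samples as (seconds, feet); commanded altitude 5000 ft.
samples = [(0, 5000), (1, 5060), (2, 5140), (3, 4890), (4, 4980)]
violations = monitor_parameter(samples, target=5000, tolerance=100)
# violations -> [(2, 5140), (3, 4890)]
```

As the text notes, this is monitoring and recording, not diagnosis: the output is a list of excursions, and interpreting what they mean instructionally is still left to the instructor.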
QPM research has shown that some measures contribute more to the total description of student performance than do others. Research also has shown that individual measures may not be useful for discriminating between "good" and "poor" performance, but properly weighted and combined are quite useful for discriminating performance differences (Waag, Eddowes, Fuller and Fuller, 1975). The extent to which various measures must be weighted and/or combined remains a research issue, but the need to do so has been demonstrated for basic instrument flight maneuver training (Vreuls, Wooldridge, Obermayer, Johnson, and Goldstein, 1976) and air combat maneuvering training (Kelly, Wooldridge, Hennessy, Vreuls, Barnebey and Cotton, 1979).
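The weighted combination the research points to can be sketched as follows; the measure names, the 0-to-1 normalization, and the weights are hypothetical stand-ins for illustration, not values from the cited studies:

```python
# Illustrative sketch: individual measures may not separate "good"
# from "poor" performance, but a weighted composite can. All names
# and numbers here are invented.

def composite_score(measures, weights):
    """Weighted sum of normalized performance measures (each on a 0..1 scale)."""
    return sum(weights[name] * value for name, value in measures.items())

weights = {"glideslope_tracking": 0.5,
           "centerline_tracking": 0.3,
           "aoa_control": 0.2}

student = {"glideslope_tracking": 0.9,
           "centerline_tracking": 0.6,
           "aoa_control": 0.8}

score = composite_score(student, weights)
# 0.45 + 0.18 + 0.16, i.e. approximately 0.79
```

The open research issue the text identifies is precisely how the `weights` table should be derived; the arithmetic of combining them is the trivial part.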
Quantitative measurement of performance of procedural sequences is a relatively new technology. A number of newly acquired ATDs will incorporate procedure monitoring capabilities (e.g., F-16, A-10, F-5, B-52 and C-130). These systems will display the sequences in which procedures are performed. It will be the instructor's responsibility to determine whether or not the performance is acceptable.
Under Navy sponsorship, blends of "manual" and "quantitative/automated" performance measurement capabilities were incorporated into a recent experimental prototype instructional support system (Semple et al., 1979). Among other factors, procedural performance was displayed on a video display terminal. An ideal sequence of procedural steps was displayed, irrelevant procedures were separately displayed as they occurred, and the clock times at which all procedures actually were performed were displayed beside the steps. In a rather limited initial trial use, instructor pilots found this display valuable, although actual event times were not considered necessary (Semple, 1982).
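The display described above implies a comparison between an ideal step sequence and the steps the student actually performed, with irrelevant actions set aside. A minimal sketch, with invented step names and no claim to match the prototype's software, might look like:

```python
# Hypothetical sketch of procedure-sequence comparison: annotate each
# ideal step with the clock time at which it was performed, and flag
# irrelevant (or repeated) actions separately.

IDEAL = ["throttle_idle", "fuel_shutoff", "fire_extinguisher"]

def compare_procedures(ideal, performed):
    """performed: list of (time, step). Returns (annotated ideal, extras)."""
    times = {}
    extras = []
    for t, step in performed:
        if step in ideal and step not in times:
            times[step] = t          # first time the ideal step was done
        else:
            extras.append((t, step)) # irrelevant or repeated action
    annotated = [(step, times.get(step)) for step in ideal]
    return annotated, extras

performed = [(12.0, "throttle_idle"), (13.5, "radio_call"),
             (15.0, "fuel_shutoff"), (18.2, "fire_extinguisher")]
annotated, extras = compare_procedures(IDEAL, performed)
# annotated pairs each ideal step with its clock time;
# extras -> [(13.5, "radio_call")]
```

Note that, as in the trial described, the times are an annotation for the instructor; judging acceptability remains a human decision.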
In the same system, procedural performance scores were derived and displayed. The scores were based on algorithms developed by a group of highly qualified instructor pilots, following much heated debate. In practice, the weighted scores were judged invalid by instructor pilots. The lesson seems to be that valid quantitative performance measures (individually) and scores (collections of weighted measures) must be derived through statistical analyses, at least for flying training.
Taken together, flying training QPM capabilities which emphasize either quantities of individual measures or analytically-derived weightings of several measures will be of little practical value until both instructor training and measurement methodologies improve with respect to quantitative measure indices.
Guidelines for the design of practical, valid and acceptable QPM systems are not yet at hand. Human performance is complex, and one human's evaluation of another human's performance is more complex. Computer-based systems for assessing and diagnosing human performance are beginning to evolve, but operational applications of true QPM systems basically are non-existent in flying training. Further research is required.
INSTRUCTOR TRAINING
Virtually all instructors who train other pilots using aircrew training devices, other than basic procedures trainers, are rated airmen. Typically they are motivated and dedicated personnel who are highly competent at performing the tasks they are teaching others to do. They may have been assigned their instructional duties, or they may have volunteered for the job.
Instruction (in ATDs and/or aircraft) may be their primary job assignment, or it may be an assignment dominated by collateral duties. Specific training for their instructional assignment may have been systematic and rigorous, but more likely it was not.
The best of training equipment, by itself, will not produce operationally ready aircrews. The equipment must be used effectively and efficiently to achieve this goal. Obviously, instructor training should be central to the effective and efficient use of ATDs. This comment continues to have considerable face validity even though there is no empirical evidence indicating that rated personnel are required for flight-oriented training, or that instructor training in instructional principles, ATD use, or instructional feature utilization actually has any benefit (Caro, Shelnutt and Spears, 1981). The fact of the matter is, these issues never have been systematically examined.
It seems self-evident that ATD instructors are central to the proper use of training devices. Part of the instructor's job is knowing how to operate ATDs. A second part is knowing how to use such devices and their capabilities effectively. Achieving the second goal requires knowledge of device capabilities along with principles of instruction. Present, typical military instructor training provides neither with certainty.
There are exceptions. However, typical military pilot instructor training programs focus on how to perform the tasks to be trained, safety, and training-related administrative matters. On average, only about three hours of formal instruction deals with how to be a teacher. The operation and use of ATDs typically is left to informal on-the-job training. Overall, there is little instructional quality control, except for standardization and evaluation functions, which may or may not focus on instructional processes and products.
There is no question that the military pilot training system, including simulation training, works. The issue really is one of efficiency and effectiveness: could more be done, and could it be done with more efficient use of resources? Simulation plays an important role in pilot training, and this role likely will grow. As it grows, instructor selection and training will be keys to enhanced productivity. Perhaps it is time that the principles of instructional system development are applied to the tasks of managing and conducting training, as well as to the tasks to be trained.
REFERENCES
Brictson, C.A. and Burger, W.J. Transfer of Training Effectiveness: A-7E Night Carrier Landing Trainer (NCLT) Device 2F103. Technical Report NAVTRAEQUIPCEN 74-C-0079-1. Naval Training Equipment Center, Orlando, FL, August 1976.
Caro, P.W., Shelnutt, J.B. and Spears, W.D. Aircrew Training Devices: Utilization. Technical Report AFHRL-TR-80-36. Air Force Human Resources Laboratory, Logistics and Technical Training Division, Wright-Patterson AFB, OH, January 1981.
Cotton, J.C. and McCauley, M.E. Voice Technology Design Guides for Navy Training Systems. Technical Report NAVTRAEQUIPCEN 80-C-0057-1. Naval Training Equipment Center, Orlando, FL, 1982 (Draft Report).
Hughes, R.G., Hannan, S.T. and Jones, W.E. Application of Flight Simulator Record/Replay Feature. Technical Report AFHRL-TR-79-52. Air Force Human Resources Laboratory, Flying Training Division, Williams AFB, AZ, 1979.
Kelly, M., Wooldridge, L., Hennessy, R., Vreuls, D., Barnebey, S. and Cotton, J. Air Combat Maneuvering Performance Measurement. Technical Report AFHRL-TR-79-3. Air Force Human Resources Laboratory, Flying Training Division, Williams AFB, AZ, 1979.
Lea, W.A. (Editor) Trends in Speech Recognition. Englewood Cliffs: Prentice-Hall, 1980.
McCauley, M.E. and Semple, C.A. Precision Approach Radar Training System (PARTS) Training Effectiveness Evaluation. Technical Report NAVTRAEQUIPCEN 79-C-0042-1. Naval Training Equipment Center, Orlando, FL, August 1980.
McCauley, M.E., Root, R.W. and Muckler, F.A. Training System Evaluation of the Prototype Air Controller Training System for Air Intercept Controllers (PACTS AIC). Technical Report NAVTRAEQUIPCEN 81-C-0055-1. Naval Training Equipment Center, Orlando, FL, 1982 (Draft Report).
Payne, T.A., Hirsch, F.L., Semple, C.A., Farmer, J.R., Spring, W.G., Sanders, M.S., Wimer, C.A., Carter, V.E. and Hu, A. Experiments to Evaluate Advanced Flight Simulation in Air Combat Pilot Training: Volume 1. Transfer of Learning Experiment. Northrop Corporation, Hawthorne, CA, March 1976.
Semple, C.A. Instructor Pilot Evaluation of the Automated Instructional Support Station (ISS). Technical Report NAVTRAEQUIPCEN 4-C-0084-1. Naval Training Equipment Center, Orlando, FL, March 1982.
Semple, C.A., Cotton, J.C. and Sullivan, D.J. Aircrew Training Devices: Instructional Support Features. Technical Report AFHRL-TR-80-58. Air Force Human Resources Laboratory, Logistics and Technical Training Division, Wright-Patterson AFB, OH, January 1981.
Semple, C.A., Vreuls, D., Cotton, J.C., Durfee, D.R., Hooks, J.T. and Butler, E.A. Functional Design of an Automated Instructional Support System for Operational Flight Trainers. Technical Report NAVTRAEQUIPCEN 76-C-0096-1. Naval Training Equipment Center, Orlando, FL, January 1979.
Spring, W.G. Experiments to Evaluate Advanced Flight Simulation in Air Combat Pilot Training. Volume I. Simulation Development and Preparation. Technical Report NOR 76-52. Northrop Corporation, Aircraft Division, Hawthorne, CA, March 1976.
Swink, J., Smith, J., Butler, E., Futas, G. and Langford, H. Enhancement of Automated Adaptive Flight Training System (AFTS) for F-4E Weapon Training Set (WSTS): Training Specification. Technical Report NAVTRAEQUIPCEN 75-C-0017-1. Naval Training Equipment Center, Orlando, FL, September 1975.
Vreuls, D., Wooldridge, A.L., Obermayer, R.W., Johnson, R.M., Goldstein, I. and Norman, D.A. Development and Evaluation of Trainee Performance Measures in an Automated Instrument Flight Maneuvers Trainer. Technical Report NAVTRAEQUIPCEN 74-C-0063-1. Naval Training Equipment Center, Orlando, FL, May 1976.
Waag, W.L., Eddowes, E.E., Fuller, J.H. and Fuller, R.R. ASUPT Automated Objective Performance Measurement System. Technical Report AFHRL-TR-75-3. Air Force Human Resources Laboratory, Flying Training Division, Williams AFB, AZ, 1975.
THEY CAN MAKE OR BREAK YOU: CONSIDERING THE INSTRUCTOR AS A USER IN AUTOMATED TRAINING SYSTEMS
Military systems are becoming more advanced and complex. This results in a requirement for increased training time by users. Concurrently, the growing manpower shortage is causing a decrease in the number of experienced people available to provide instruction and training on any new system. Thus, the military is facing a need to provide more training using fewer instructor resources.
One solution has been the development of automated training systems to help shoulder the training burdens. These training systems, more than just simulators or part-task trainers, present design problems which operational systems don't have. Designers have to produce systems which will train the required operational skills and also provide training-related functions for the students and instructors. These include such capabilities as on-system instruction, performance measurement, preprogrammed scenarios, automated feedback, and training management functions.
Training system design is a fairly young art, and the design artists are slowly learning how to do what they intend. In the past, the system designers have tended to concentrate primarily on hardware and software considerations. More recently, there has been an increasing realization that courseware considerations (e.g., what is being trained, who is being trained, etc.) should also be a driving force in training system design. The latest revelation in system design is "peopleware": attention to the people factors in the interface between man and machine.
Unfortunately, for many designers the term "users" only means students. Training systems have many other sets of users. The users include instructors, operators, administrators, and maintenance personnel. Each of these sets of people will use the system, and their needs should be considered in the system design. Of this additional set of users, instructors have the most influence on system implementation and success. As a result, it is very important to design the system to be instructor friendly.
Dr. Robert Halley
Logicon, Inc.

There are a large number of factors to consider in making automated training systems instructor friendly. It is important to note that instructors, often for good reason, tend to resist a change from their traditional approaches to computer-based training. First, they see that they will have to learn a new job, when they already know the old one very well. Second, they often perceive this change as a demotion from an important instructional position to an assignment as a computer system lackey. Third, there is a common popular fear about having to deal with things computerish. Computers are typically regarded as being mysterious, expensive, and very breakable. This keeps people from just stepping in and using them. Fourth, the instructors have real concerns about whether the new system will be a training improvement. They are often concerned about both instructional effectiveness and possible dehumanization of the training environment.
This instructor resistance can be expressed in a variety of ways, anywhere from an outright refusal to use the system to a subtle lack of faith in the system which gets transmitted to the student. All these types of resistance can lead to the same outcome. No matter how good a system is, if the instructors don't like and trust it, they can completely destroy its effectiveness. A personal example dates back to college, when a new videotape-based curriculum was being used to replace normal lecturing in an oral communications course. The instructor wasn't convinced about the new approach and introduced the materials by saying "All right, you guys'll have to watch these dumb tapes. Then we'll get to the important stuff." Not many people watched the tapes very carefully.
Implementing instructor friendliness in a system means anticipating the instructors' needs and making them as easy as practical to accomplish. It is not enough to put together a system which can do many and magic things and not also make it easy to use. For example, many systems have extensive error checking to help identify when errors occur, but the system reports them as a "TM error 106" or a "100 413.218 error", leaving the user to first go look up in an error table precisely what happened and then to figure out what caused the error. A more user-friendly approach would be to have the system report something like "That statement is incorrectly formatted. Please correct the statement and re-enter it."
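The contrast can be implemented as a simple lookup from error codes to plain-language messages; the codes and wording below are illustrative inventions, not taken from any actual training system:

```python
# Hypothetical sketch: translate terse error codes into messages that
# say what happened and what the user should do next.

ERROR_MESSAGES = {
    106: "That statement is incorrectly formatted. "
         "Please correct the statement and re-enter it.",
    211: "That entry is not in the current exercise database. "
         "Please check the spelling and re-enter it.",
}

def report_error(code):
    # Fall back to a generic but still human-readable message.
    return ERROR_MESSAGES.get(
        code, "An unexpected error (%d) occurred. Please notify support." % code)

print(report_error(106))
```

The design point is that the translation table lives in the system, not in a paper error manual the instructor must keep at hand.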
Practically, you need to get the instructors involved as early in the system design process as you can. Instructors can provide important input in identifying which tasks they will use the system to accomplish and what ways those tasks can be done simply and effectively. This information can be used in designing the system-instructor interface and in preparing the instructor training materials. In one Logicon system design, instructor pilots were utilized in designing the system's information-presentation approach. The result was a very concise set of CRT displays which have been well received as comprehensive and easy to use.
Instructional design is typically accomplished by contractors or military curriculum designers working outside the actual instructional context. They often fail to recognize that the instructors have important things to contribute concerning the curriculum and about the instructional approaches and methodologies that are feasible and practical for the student population. It is valuable to get instructor input and then make sure you point out their contribution in the system materials. This will add credibility to the system and give subsequent instructors some pride of ownership.
Training systems are often implemented at user commands with only a minimum of accompanying documentation. The documentation is sometimes supplemented by a minimum of user training, and then the instructors are left on their own to figure out the rest. The typical result is that the system is exercised in only the simplest modes.
It is important to integrate instructor training into the system for both initial and continuing use. This important tool can affect the instructor's willingness to use the system. It is important to ensure the instructors' knowledge and attitudes about the system. Important training topics include (1) the continuing importance of the instructor's roles in system success, (2) the reasons that the system is being used to replace the old methodologies, (3) how the system is used in this instructional situation, and (4) how to use the system to maximum instructional advantage. Using the system to provide this training can also show the instructors that the system can train effectively.
This training must provide the rationale for the development of the system and present the system's approach to the instructional task. The system should also detail the important decision-making roles that the instructor will be asked to fill and also explain the breadth of the system's capabilities for instructor options. Practice should be provided in utilizing all the options and capabilities.
The initial instructor training can be provided through a thorough instructor tutorial which covers all the topics listed above in an instructional package. This should be supported by a system HELP capacity which provides access to individual topics from within the tutorial. This would allow a presentation of any area in which the instructor needs a review.
A pragmatic example of applying these design principles is the Logicon-developed Instructor Support System (ISS). Designed specifically as a tool to assist instructors with trainer-associated tasks, in its initial application ISS has been attached to an existing flight simulator. ISS is being used to replace an instructor station that is so difficult to use that the instructor must spend most of his energy interacting with the instructor station, leaving very little time or energy for the task of instructing.
The ISS provides many functions to support the instructors' tasks. For example, the instructors can choose from three different ISS training modes: specialized task training (STT), instructor select (ISEL), and canned. Each of these training modes allows the instructor to hand-tailor the student's simulator experience during each exercise. The ISS training curriculum is comprised of a set of training modules such as "afterburner take-off" or "Miramar TACAN 1 approach" or "left engine fire." Each module is separate and includes software identifying what behaviors are to be measured, how success is to be graded, what checklists and procedures are involved, and what marks the beginning and end of the task. In each training mode these modules are utilized differently.
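The module description suggests a self-contained record per task. The sketch below is a hypothetical rendering of such a structure (the field names, values, and scheduling helper are invented for illustration, not the actual ISS software; the module names come from the text):

```python
# Hypothetical sketch of self-contained training modules: each names
# the behaviors to measure, the grading criteria, the checklists
# involved, and the events marking task start and end.

TRAINING_MODULES = {
    "afterburner_takeoff": {
        "measured_behaviors": ["airspeed", "pitch_attitude", "gear_retraction"],
        "grading_criteria": {"airspeed_rms_error_max": 10.0},  # knots
        "checklists": ["takeoff_checklist"],
        "start_event": "brakes_released",
        "end_event": "gear_up_and_locked",
    },
    "left_engine_fire": {
        "measured_behaviors": ["fire_handle_pulled", "throttle_idle"],
        "grading_criteria": {"max_response_time": 15.0},       # seconds
        "checklists": ["engine_fire_checklist"],
        "start_event": "fire_light_on",
        "end_event": "fire_extinguished",
    },
}

def schedule_stt(module_name, repetitions):
    """STT-style scheduling: repeat one module, e.g. several landings in a row."""
    return [TRAINING_MODULES[module_name]] * repetitions
```

Keeping each module self-contained is what lets the three modes reuse the same records: STT repeats one, ISEL strings several together, and canned supplies a presequenced set.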
STT mode allows the instructor to schedule one or more training modules which feature practice on a specific skill. This allows repeated executions of a task such as landings without having to practice take-offs and other tasks associated with a normal mission. Thus, if the instructor felt the student needed work on landings, several landings in a row could be scheduled. After each landing the computer would automatically set up the next one until the session was finished.
ISEL mode requires the instructor to assemble a complete "chocks-to-chocks" mission. In either ISEL or STT the instructor can choose malfunctions to be inserted into the practice.
The canned mode provides a predesigned practice session. In this mode the training modules have already been selected and sequenced, and the instructor need only select the overall package. Since canned mode practice sessions are generally designed to follow the training syllabus, the canned mode helps to ease the instructors' training burden. Canned mode exercises are also used in check rides and "graduation" exercises to provide a consistent measurement environment.
In addition to assisting the instructor with his task by providing practice exercise control, ISS provides other instructor support features. These include student monitoring, performance measurement, grading, instructional records keeping, and student performance debriefing. The student monitoring function allows the instructor to watch a display of what the student is doing while it is being done. Important display information, including aircraft configurations (e.g., flaps up, hook down), aircraft parameters (e.g., speed, altitude), a geographic plot and historic trail of aircraft position, and diagnostic messages as problems occur, is clearly presented on two display devices, rather than spread over a ten-foot long instructor console.
The diagnostic messages are a very important feature which appears, thus far, to be unique to ISS. Although some training systems have diagnostic messages which are displayed at the end of a practice session, ISS provides a real-time diagnostics display at a very detailed level. This allows the instructor to know immediately the precise cause of a student problem. Figure 1 below shows an example of ISS diagnostics.
The performance measurement function keeps track of student behaviors and compares them to expected behaviors. The grading function uses these performance measures to provide a suggested grade for the student practice session on any of a number of skills. Since there is often some instructor resistance to having the grading function taken over by a computer, ISS has been carefully designed so as to easily let the instructor review all the scoring criteria and how the student's grade has been derived. Figure 2 on the following page shows a sample of the ISS "grade sheet."
FIGURE 1. ISS DIAGNOSTICS DISPLAY. [Figure: real-time diagnostic messages such as "POOR NAV CONTROL", "NAV AID IMPROPERLY SET -- UHF FREQUENCY = 261.6", and "POOR ALTITUDE CONTROL", above touch-menu controls including REMOVE MALFUNCTION, SELECT ENVIRONMENT OPTIONS, RE-POSITION OPTIONS, SELECT CHECKLIST, FREEZE, HDCOPY, and VOICE.]
FIGURE 2. ISS GRADE SHEET DISPLAY. [Figure: grade criteria table for a task module, listing each measure (on glideslope, slightly off glideslope, well off glideslope, on centerline, slightly off centerline, well off centerline, AOA deviations (RMS), glideslope snapshot, centerline snapshot, angle of attack snapshot) with its nominal values, grade bands, measured value, derived grade, and weight, together with the final grade.]
The instructor can review the session and quickly and simply change any of the grades. The scoring criteria have also been designed so that they can be easily revised if the standards are incorrect or are changed.
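A reviewable grading scheme of this kind can be sketched as banded criteria that stay visible and editable rather than buried in code; the bands, measures, and averaging rule below are invented for illustration and are not the actual ISS algorithms:

```python
# Hypothetical sketch of reviewable grading: each measure (here,
# percent of time on a parameter) maps through explicit grade bands,
# and the suggested grade is derived from the per-measure grades.

GRADE_BANDS = {
    # measure: list of (lower bound on percent time, grade), best first
    "on_glideslope": [(90.0, 4), (80.0, 3), (70.0, 2), (0.0, 1)],
    "on_centerline": [(90.0, 4), (80.0, 3), (70.0, 2), (0.0, 1)],
}

def grade_measure(name, value):
    for lower_bound, grade in GRADE_BANDS[name]:
        if value >= lower_bound:
            return grade
    return 0

def suggested_grade(measures):
    """Average of per-measure grades; the instructor may override it."""
    grades = [grade_measure(name, v) for name, v in measures.items()]
    return sum(grades) / len(grades)

result = suggested_grade({"on_glideslope": 85.0, "on_centerline": 72.0})
# 85 percent -> grade 3, 72 percent -> grade 2, so result is 2.5
```

Because the bands sit in a plain table, the instructor can inspect exactly how a grade was derived and the standards can be revised without touching the grading logic, which is the acceptance point the text emphasizes.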
The ISS instructional records keeping function stores the performance measurements, results, and grading for immediate CRT access as well as for printing hard copy records. The student debriefing function is an extension of the records keeping capability. The student's last entire practice session is recorded, and the individual tasks can be played back at either normal or fast (4 times) speeds, or can be frozen at any point for the student and instructor to review. All of the aircraft data and student performance data are replicated during the debrief playback.
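Speed-scaled playback of a recorded session reduces to rescaling recorded timestamps; a minimal sketch (the data fields and the generator are hypothetical, not the ISS implementation) follows:

```python
# Hypothetical sketch of debrief playback: replay recorded frames at
# normal (1x) or fast (4x) speed by compressing their timestamps.

def playback(recording, speed=1.0):
    """Yield (playback_time, frame); timestamps are divided by `speed`."""
    for t, frame in recording:
        yield t / speed, frame

recording = [(0.0, {"alt": 5000}), (4.0, {"alt": 5100}), (8.0, {"alt": 5150})]
fast = list(playback(recording, speed=4.0))
# An 8-second segment is reviewed in 2 seconds of playback time.
```

A freeze then amounts to simply pausing consumption of the stream at the current frame, with all recorded aircraft and performance data still on display.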
ISS playback is different from that found in most similar systems. In newer training systems, for example, the playback mode recreates the cockpit activities such as moving the stick and the pedals. This information is not very valuable instructionally. In ISS the diagnostics messages, aircraft parameter displays, and a historic radar and communications trail are provided for review.
The ISS provides multiple functions to assist the instructor with his job. Because of careful use of instructor input and a continuing consideration of the instructor as user during the design process, the ISS provides a very instructor friendly interface. The instructions for system use are defined in terminology normally associated with the aviation environment. A series of decision menus are readily accessible through touch-pad controls on the lower display. In contrast to many new systems (e.g., 2F112, 2F119) where scenario generation is very difficult, these menus provide the instructor with complete exercise authoring and practice control capabilities through a simple series of screen touch interactions.
The ISS encompasses an extensive instructor tutorial introducing the system and the system's functions as well as giving the instructor an extensive series of practice exercises. The tutorial has been subdivided into a set of minitutorials which are accessible at all times through a HELP function.
There are at least two more lessons to be learned from the ISS experience. First, it is not enough to have the ISS features. The features must be implemented well, or they can even make the system less usable. For example, most new systems provide information on CRTs and provide system control through light pens or function keys. The concept is good, but the tendency is to provide the system access and system information without much thought to human factors. Thus, the screens are crowded and hard to read, and the menus are hard to use.
Second, the ISS is not yet all happiness and light. It, too, still has some problems in the area of user friendliness. Design of good practice sessions requires that the instructors have done some prior planning and know what they are going to use the session to accomplish. Otherwise the training session can turn out somewhat haphazard and not meet its training objectives. These requirements for planning and forethought can make the ISS a little scary to the user. Requiring this preplanning may, in the end, be beneficial to the process, but it is not entirely user friendly. An improvement on this approach might be to have the system be "smart" enough to help the instructor develop the process by having guidelines and rules built in.
Any good training system design will attempt to design the training system to be sufficiently capable and flexible to ensure that the machines serve the users rather than the other way around. The users should not have to accommodate the hardware and software by learning little tricks and changing their behaviors to meet the machine's needs. However, the choice of any approach will constrain the ultimate flexibility of the design and its ability to meet all of the users' needs. If, for example, you put the "ON" key on the left, people who are used to having the "ON" key on the right, or who have never had to turn anything "on", will have to learn to adapt to the machine. Money, personnel, and time constraints tend to conspire to make your design choices for you, but it is important to remember to build in as much user consideration as you can and to learn from your mistakes. ISS, in its next incarnations, is going to provide a mode where it acts just like the instructor console it is replacing. This way the instructors will have to learn fewer new skills, but will still have the power of the ISS at their disposal when they want it.
It is crucial to remember that instructors are a vital instructional resource. Instructors can help a training system provide effective and efficient training, or they can severely limit the system's usefulness simply by how they respond to the system. Instructor friendliness, then, becomes a major consideration in training system design. Two important initial steps in building an instructor friendly system are getting to know the instructors' needs and getting the instructors involved in the development process. A third very important step is using the training system to train and convince the instructors about the system's usefulness. A fourth important design consideration is building the system to serve the user, rather than vice versa. Lastly, one must remember that "user friendly" includes all the users, not just the students.
INSTRUCTOR ROLES: OPTIMIZATION OF THE INSTRUCTOR STATION INTERFACE
DR. B. E. MULLIGAN
University of Georgia
and
LT. THOMAS N. CROSBY
Naval Air Systems Command
INTRODUCTION
Use of the computer in its various manifestations as a medium for instruction has contributed much to our understanding of the instructional process. Impact of this powerful medium on instructor roles, however, has received little theoretical attention. It is the purpose of this paper to examine the impact of computer-based instruction on traditional instructor roles and functions in order to achieve a better understanding of the possibilities it creates for enhanced instructor-machine interfacing.
Use of instructional media of any kind tends to alter in some ways instructors' roles, at least how they are performed. Some media may alter the functions performed by instructors to such an extent that qualitative changes in roles result. It is our contention that such changes occur wherever the computer is employed as the primary medium for instruction. Wherever it is used, in whatever mix with other media, it seems that the computer uniquely alters instructor functions.
The type of computer application we shall consider consists of a network of terminals tied to a central processing unit (CPU). In its simplest form the system may consist of a single terminal devoted entirely to student use. In more elaborate form the system may contain several terminals, at least one of which would serve as the instructor's console.
In even the simplest form of the system, there would exist two-way communication between terminal and CPU. In its more elaborate form, there also may exist the capability for communication among terminals directly and via the CPU. Theoretically, the inputs to and outputs from such a network may include virtually any response a human is capable of making, so long as it can be transduced at the terminal, and any pattern of energy change a human is capable of perceiving, if it can be displayed with fidelity to the human's senses at the terminal. As futuristic as this conception may appear, such networks already form the core of multi-terminal training systems. Indeed, simplified versions may be found in your child's toy box. This is not to say that limitations on transduction, processing, and displays are not real, but rather these limitations are currently technological as opposed to conceptual. The basic idea seems clear, but its ramifications for instruction do not. It is to achieve a better understanding of this that we shall attempt to explore the conception more closely, especially as it applies to the instructor.
How much of the human instructor's traditional roles may be taken over by computer-based instructional systems? What functions will remain, or newly emerge, for human instructors to perform? These are questions especially pertinent to the design of instructor stations.
For purposes of this paper, "instructor station" is defined as one member of a network of computer terminals driven by a CPU programmed for interactive instructional delivery and two-way communication between the instructor's terminal and each student's terminal. Clearly, the role of an instructor operating such a station would differ significantly from his role in the traditional classroom even if the instructional subject matter remained the same.
Before changes in instructor roles and functions due to use of the computer as an instructional medium can be assessed, it seems necessary to first arrive at some categorization of them as they have been traditionally practiced, and to further examine the major classes of factors which determine their selection.
TRADITIONAL ROLES

Five instructor roles may be distinguished as a minimum number of categories needed to describe the major domains of instructor activity within traditional settings. These are:

1. Lecturer
2. Leader
3. Supervisor
4. Tutor
5. Aide

Instructor roles are here distinguished in terms of the emphasis each gives to the performance of seven categories of instructor functions.
TRADITIONAL FUNCTIONS

The major categories of instructor functions in traditional instructional contexts may be divided into seven categories. These are:

1. Information development
2. Information delivery
3. Student supervision
4. Student guidance
5. Student evaluation
6. Data management
7. Course management
It will be apparent in the following definitions of traditional instructor functions that all categories are not mutually exclusive of one another. Even though various of these functions are interdependent and overlap somewhat, they nevertheless represent different channels into which instructional energy is directed. Function definitions are:
1. Information development: The gathering, synthesizing, and organizing of subject matter (facts, concepts, principles, procedures, etc.) pertinent to course objectives into formats appropriate for presentation in particular instructional situations. In many industrial and armed services situations, more formal approaches to instructional systems development (ISD) would be taken, beginning with task analysis.
2. Information delivery: The presentation of subject matter necessary to achieve course objectives by whatever means are available, effective, and consistent with the instructor's role and situation.
3. Student supervision: The direction of student learning activities toward timely realization of course objectives through mandate, directive advice, personal example, assignment of study materials, stipulation of practice techniques, performance evaluation, praise, criticism, etc.
4. Student guidance: The assistance of students in discovering, orienting toward, and developing feasible approaches to realization of future goals through counseling, interpretative evaluation of aptitudes, providing information pertinent to formulation and realization of long-term objectives, etc.
5. Student evaluation: Assessments of relative performance, subject matter and/or skills mastery, and any other dimension of behavior correlated with success in an area of activity (attitudes, motivation, emotional stability, etc.) on the basis of test scores, proficiency scores, subjective ratings, physiological indices, etc.
6. Data management: The recording, categorizing, and filing of individual performance data; calculations of norms, trends, statistics, etc.; documentation of instruction-related student activities (attendance, promptness, awards, demerits, etc.); collation of summary data for, and preparation of, student progress reports; etc.
7. Course management: Controlling the conduct of a course of instruction, including all decisions regarding the subject matter, examples, demonstrations, etc., to be presented; the study materials and practice exercises to be required; the instructional media to be used; the tests and other evaluative instruments to be used; the sequencing and time allotments for instructional segments; the updating and revision of course objectives, syllabi, and instructional materials; and the allocation of instructional resources (funds, supplies, etc.) and facilities.
ROLES AND FUNCTIONS
Roles are not sharply delineated in terms of the traditional instructor functions, but rather in terms of the ways the functions are performed and the level of responsibility each role assumes for the execution of a given function. Not all of the traditional functions are present in every instructor role, e.g., aides are usually not responsible for information development and lecturers are minimally, if at all, responsible for student guidance. Even where two roles are responsible for the same type of function, they would rarely carry them out in the same manner, e.g., delivery of instructional information by lecturers and supervisors is entirely different.
Both the level of responsibility for, and the manner of performing, each of the traditional instructor functions may be viewed as outcomes of a deterministic process.
DETERMINANTS OF FUNCTIONS
There appear to be four immediate determinants of instructor functions. These are:

1. Instructional objectives
2. Instructional media
3. Delivery situations
4. Instructor/student ratios
Usually, the above conditions are established by a more remote group of determinants, including:

1. Instructional goals
2. Entry-level requirements
3. Through-put requirements
4. Exit-proficiency requirements
5. Resources and facilities
6. Time/cost constraints
While the above by no means exhausts the realm of potential determinants, immediate or remote, they seem to be the ones that are most influential in most instructional settings. Consider the following examples:
Example 1. Assume that the instructional goal of a university's department of philosophy is to offer instruction in philosophy of science at a level which could be taken by graduate students in all areas of science. Entry-level requirements in this case would be quite general, limited to students with graduate status, but stipulating no specific prerequisites. University and departmental policies would determine the through-put requirements: a minimum and maximum acceptable number of students per academic term. Exit-proficiency requirements could be set by a committee of experts consistent with the instructional goal, but usually this would be done by the instructor(s). Resources and facilities would include instructional manpower, library holdings, media, rooms, etc.
In order to satisfy the instructional goal, a high level of expertise would be required of the instructor, usually a Ph.D. with a specialty in the philosophy of science. Few universities would have more than one such individual available and it is unlikely that the university could afford to bring in additional instructors. The bottom line would be that this course of instruction would be offered by one instructor and that the delivery situation would be a classroom with media support restricted to that on hand (slide projector, blackboard, textbooks, mimeographed handouts, etc.). This would not be a serious handicap since, in this case, the instructional objectives would be conceptual rather than skill-based. At this point all major aspects of the instructor's functions have been determined.
The cost constraint has limited the number of instructors to one, and the through-put requirements, together with the available delivery situation, have set the acceptable number of students at, say, between 15 and 30. The resulting small instructor/student ratio, combined with a single instructor delivering abstract conceptual material in a classroom situation with only rudimentary media support, would mean the following:
a) The instructor's responsibility for information development would be nearly total and probably it would have to be carried out personally, the result depending heavily on the level of specialized expertise of which the instructor is capable;
b) The instructor would bear total responsibility for delivery of subject matter information in the classroom, and, due to the limited support media at his disposal and the number of students he must reach, the only mode available for information delivery would be that of the lecture;
c) The very conditions that dictate the lecture form of information delivery would pre-empt one-to-one student interaction (unless it occurred outside of class) and, therefore, would render negligible any presumption of responsibility by the instructor for the functions of student supervision or student guidance;
d) Unless the instructor were assigned an assistant to evaluate assignments, grade tests, keep track of attendance, etc., the instructor's responsibility for the functions of student evaluation and data management would be total. In the case that an assistant were available for these duties, the instructor's responsibilities could be reduced to a minimum depending upon the competence of the assistant;
e) The instructor would assume complete responsibility for developing the course syllabus, determining when and how much time will be devoted to each instructional section of the course, the nature and number of assignments and tests to be given, selection of textbooks and other reading materials, and all other decisions affecting the conduct of the course.
It is apparent that the job description which emerges from the above enumeration of functions could only be that of a lecturer. The roles of leader and tutor would entail similar levels of responsibility for information development and delivery, as well as for student evaluation and the management functions, but the levels of responsibility assumed by leaders and tutors for student supervision and guidance would be considerably greater than that afforded by the determinants.
In the example under consideration, the instructor could be a (discussion) leader if the instructor/student ratio were reduced, thus enabling a more informal mode of informational delivery. In order for the instructor to serve as a tutor (permitting a highly interactive, one-on-one mode of information delivery), the university's through-put requirements, resources, and facilities allocations, perhaps even its instructional goals, would have to have been quite different. Certainly, requirements for student supervision and guidance in association with this course of instruction would have been far more compelling than they were. The point is that, given the assumptions in this example, the only role that is feasible within this traditional instructional context is that of lecturer.
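The reasoning of Example 1 can be caricatured as a small rule table. The sketch below is our own simplification of the determinants-to-roles argument, not a procedure proposed in this paper; the thresholds and rules are invented for illustration:

```python
# Hypothetical sketch: immediate determinants (delivery situation,
# instructor/student ratio, hands-on objectives) constrain which
# instructor role is feasible. Rules and thresholds are our own
# simplification of the paper's argument.

def feasible_role(situation, ratio, hands_on):
    """Pick the single feasible role for a few simple cases."""
    if situation == "classroom":
        # Many students per instructor forces one-to-many delivery.
        return "lecturer" if ratio >= 15 else "discussion leader"
    if situation == "skills laboratory" and hands_on:
        return "supervisor"
    if situation == "individualized study-station" and ratio == 1:
        return "tutor"
    return "aide"

# Example 1: one instructor, 15-30 students, classroom, conceptual
# (not hands-on) objectives.
print(feasible_role("classroom", ratio=30, hands_on=False))  # lecturer
```

The point the table encodes is the paper's: once the determinants are fixed, the role is no longer a free choice.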
It should be noted here that a distinction can be made between two kinds of instructional leaders. The first kind is that mentioned in the example above, i.e., the discussion leader. The second kind is what we shall call the field (or team) leader. Both provide instruction for relatively small groups of students, and the levels of responsibility assumed for each of the seven instructor functions are approximately the same. (Incidentally, the range of levels of responsibility assumed by leaders for the various functions appears to be more variable than in the case of any other instructor role.) The chief differences between field and discussion leaders are that the latter serves within classroom-like situations and the former relies heavily on personal example, usually with some authority beyond that of instruction.
Example 2. Assume that the instructional goal of a major airline is to train technicians in the maintenance of a particular kind of jet engine. The entry-level requirements for students to be admitted to this program include previous training in basic electricity theory, circuit design and construction, blueprint reading, transformers and motors, electromechanical devices, microprocessor controls and operations, etc. The through-put requirements call for six students to be trained in an 11-week apprenticeship-type training program. Exit-proficiency requirements stipulate that each student must be capable of performing all normal maintenance functions, troubleshooting electrical circuits, carrying out performance tests and measurements, making minor repairs, etc. Physical resources and facilities are available at a centralized company maintenance school. The
information delivery situation will be a skills laboratory with such media as engine mock-ups, engine-circuit mock-ups with programmable malfunctions, testing instrumentation, individual student study-carrels equipped with audio-visual devices, student notebooks containing job aides, etc.

Although the costs for the skills laboratory and media are considerable, they are made feasible by concentrating the training into one well-equipped centralized facility. This also reduces the number of instructors needed to one. The instructor's qualifications include several years' experience in aircraft maintenance plus successful completion of an extensive instructor training program conducted by the manufacturer of the engine in question. It may be assumed that the company's maintenance training program is a somewhat streamlined version of the manufacturer's instructor training program. The instructional objectives would consist almost entirely of concept applications and hands-on skills.

The instructor's responsibilities for information development in this case would be negligible, but he would be almost totally responsible for information delivery, student supervision, student guidance, student evaluation, and data management. The instructor's level of responsibility for course management would be moderate since all decisions regarding sequencing of instructional units, testing, evaluation, etc. would have been made at the time of training program development. Given this array of function responsibilities and their determining conditions, the role of this instructor could only be that of a supervisor.

Although this instructor may, from time to time, carry on group discussions with his students, his role could not be that of a discussion leader because the instructional objectives dictate that he personally supervise hands-on exercises in a skills laboratory. Neither could this instructor be regarded as a tutor because he has no responsibility for information development, is minimally concerned with general theoretical knowledge, and is required to demonstrate and oversee the acquisition of skills by more than one student. Superficially, the supervisor's role appears most like that of the aide in that the latter might well work with the same number of students under much the same conditions. However, in this example, an aide would have little responsibility for information delivery and moderate to low responsibility for student supervision, guidance, and evaluation. In this situation an aide's responsibility for data management could range between high and low, and relatively little responsibility would exist for course management. Thus the configuration of responsibilities assumed for instructor functions in this example clearly limits the choice of roles to that of a supervisor.

The relationships between determinants, functions, and instructor roles are both interesting and complex, involving considerations other than just those mentioned in the above examples. One such consideration is the level and type of expertise that is generally required for each role. Since expertise is usually inversely correlated with supply in the marketplace, within limits expertise can be translated into dollar cost. However, when instructor costs are added to the total costs of instructional delivery and the cost per student is calculated, the cost of instructor expertise in some roles may be markedly diminished. While a highly qualified lecturer may be expensive relative to a supervisor, the cost per student for the lecture-type instruction usually would be less due to the larger number of students and the absence of any need for expensive media. Even so, the matter of instructor expertise seems to be more a problem of supply than cost. In fact, it appears that the higher the level of required expertise, the shorter the supply of qualified instructors.
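The cost-per-student argument can be made concrete with a small calculation. The figures below are invented purely for illustration, not drawn from the paper:

```python
# Illustrative arithmetic (hypothetical numbers): an expensive lecturer
# can still yield a lower cost per student than a cheaper supervisor
# once class size and media costs enter the calculation.

def cost_per_student(instructor_cost, media_cost, n_students):
    return (instructor_cost + media_cost) / n_students

# Lecturer: high salary, minimal media, 30 students per term.
lecturer = cost_per_student(instructor_cost=40_000, media_cost=2_000, n_students=30)
# Supervisor: lower salary, expensive skills-laboratory media, 6 students.
supervisor = cost_per_student(instructor_cost=25_000, media_cost=60_000, n_students=6)

print(round(lecturer))    # 1400
print(round(supervisor))  # 14167
```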
Instructor availability is one factor that is directly impacted by modern instructional technology, especially computer-based instruction. If the supply of instructors in any given field is reduced, or even limited, by the time it takes to train or educate those instructors, then any approach to instruction which can replace at least some instructor functions with automated functions should decrease the time required to produce the needed supply. This is not to say that the number of months or years it takes to train effective instructors is the only factor affecting their supply at any point in time. Monetary and career incentives, job satisfaction, aptitude, and previous education are among the major factors which control the number of individuals who are motivated and competent to receive instructor training in the first place. In this area also, computer-based systems should improve supply through incentives and job satisfaction. Time savings in training and increased motivation of instructors are two factors which should prove especially important to instructional systems that experience high rates of instructor turnover, such as those in the armed services.
It is not the purpose of the present paper to examine the ways in which computer-based instructional systems provide motivational incentives and increased job satisfaction for the instructors that operate them, but some of these ways will become apparent later on in this paper. Suffice it to say that the possibilities for instructor motivation inherent in the network-type computer-based system are an exciting and important frontier for future development. It is the other avenue of impact that the computer-based system has on instructor supply (indeed, on instructor effectiveness also) with which we are concerned here, i.e., the alteration of instructor functions.
Which of the traditional functions will be altered? Each of the functions will be altered in some ways, and several may be virtually eliminated. However, one further aspect of traditional instruction needs some mention before proceeding to an examination of automated instructor functions. That is the relationship between traditional instructional delivery situations and instructor roles.
TRADITIONAL SITUATIONS
The physical situations in which instruction is traditionally delivered constrain instructor functions and thus narrow the choice of roles appropriate to be performed in them. Five major classes of instructional delivery situations may be distinguished. These are:

1. Classroom
2. Skills laboratory
3. Individualized study-station
4. Simulated operational environment
5. Actual operational environment
It is probably unnecessary to enumerate the distinguishing characteristics of each of these situations since they are generally familiar. Certainly, this classification scheme could be expanded if finer distinctions were made. The scheme offered here is the minimal number of categories of situations in which instruction is traditionally delivered and which permits the distinctions we wish to make.
The degree and kind of interactions between students and instructors differ in these situations, i.e., the emphasis on functions and the way they are carried out change, as do instructor roles. Instructors simply do not lecture in situations 3, 4, and 5. Only rarely would a lecture occur in situation 2. It seems that the lecture form of instructional delivery is largely confined to situation 1. Thus it goes without saying that the lecturer is at home only in the classroom. Likewise, the discussion leader also finds his primary place there. The role of field leader, on the other hand, seems to occur with greatest frequency in actual operational environments, but this role would not be uncommon in simulated environments. Of course, the supervisor is most at home in the skills laboratory, but he may be found in situations 4 and 5. Tutors, who appear to be decreasing in their frequency of appearance, probably due to their high cost per student, carry out their functions only in individualized study-stations. By contrast, aides appear to be the most ubiquitous of instructors, occurring in all situations except the classroom.
The changes in roles with situations reflect changes in other traditional determinants as well. Instructional media, objectives, delivery techniques, and instructor/student ratios also change with situations. The covariance of these immediate determinants of instructor functions is attributable to the combined influences of the conditions referred to in the last section as remote determinants. The flow of influence between remote and immediate determinants, however, is not always unidirectional, i.e., from remote to immediate to functions and ultimately to roles. Rather, the flow of influence occasionally is two-way, in both the vertical and horizontal directions. For example, if the situation we have is a one-room schoolhouse, just money enough for one instructor who must meet state qualification standards, a blackboard, a couple of boxes of chalk, a closet full of dog-eared books, and 30 or so right-handed chairs, then the instructional goals and other remote determinants have to be modified to suit the more immediate ones, if they are considered at all. It is inescapable that, in this situation, the instructor is going to be a lecturer regardless of what behaviors are to be trained (taught).

If there is any truth to the assertion that the behaviors which can be trained in any situation depend on the behaviors which can be brought to occur there, then it must also be true that instructional delivery situations substantially influence training effectiveness since they certainly limit (if not induce) much of the behavior that does occur in them. Unfortunately, too much of what appears to occur in the schoolroom today seems to be learned, the surprise being that anything else (e.g., academics) could be. An answer to this problem that one so often hears calls for "more discipline." It would seem that this sentiment is, if nothing else, at least in the right direction, for it implies control of behavior. We suggest that it is the interactions with subject matter that most need to be controlled. It is just this kind of control that automated systems can be effective in providing.
AUTOMATED FUNCTIONS
In this section we summarize what, in general, a computer-based instructional system can provide in the place of each of the traditional instructional functions. The lists of automated functions presented here are state-of-the-art. As exciting as current capabilities are for instructional application, it should be kept in mind that considerable progress remains to be achieved in all areas of this man-machine interface (response transduction, intelligent programs, information displays). Although some of the automated functions listed below can be performed by relatively small systems, the type of system we are considering consists of a network of sophisticated terminals (both student and instructor) driven by an imaginatively programmed large-capacity CPU.
1. Development of instructional information: (a) serves as a guide to instructional development by means of programs (menus with prompts and messages) based on algorithms for each successive stage of course design; (b) facilitates writing, editing, and drawing of instructional materials through programs for word processing and graphics production; (c) permits convenient filing, cross-referencing, and combining of instructional information through programs for information management.
2. Delivery of instructional information: (a) presents course subject matter ranging from abstract concepts to factual itemizations in self-paced, mastery-based formats displayed in written, spoken, and/or graphic forms; (b) delineates relevance and applicability of subject matter by presenting contextual information and sample problem solutions, problem-solving exercises, etc.; (c) provides instructions for proceeding through programmed lessons, performing skills, correcting mistakes, or obtaining remedial information for review; (d) produces or controls simulated representations of operational devices, field or job situations, abstract processes, performance procedures, job aides, etc., enabling demonstrations, practice, rehearsal, etc., of concepts, rules, skills, procedures, attitudes, roles, team exercises, "what if" explorations, etc.; (e) delivers individual and/or group response-contingent feedback designed to aid self-diagnosis of learning progress; information may consist of prompts, questions, encouraging messages, scores, etc., in a variety of display formats that may include special auditory or visual effects; (f) provides summary feedback at the end of each instructional unit or major exercise consisting of scores, outcome statements, evaluations of relative performance, recommendations for improvement, overall course performance profiles, course grades, etc.
3. Student supervision: (a) guides students along instructional paths adjusted for level of achievement and rate of progress; (b) optimizes individualized instructional paths through frequent performance checks, variable path branching, and review sequences; (c) provides supervisory instructions and feedback (written, spoken, and/or visual; graphical or simulated visual demonstrations of "do this") contingent upon individual or group actions, with possible instructor intervention.
4. Student guidance: (a) provides recommendations for future courses of action (remedial study, more in-depth study, information sources, job possibilities, etc.) based upon performance profiles; (b) responds to student questions about their performance profiles with interpretive answers and comparative data; (c) provides job descriptions (requirements, work conditions, salary data, etc.) in areas related to the course of instruction.
5. Student evaluation: (a) determines correctness of choices, problem solutions, or actions on an item-by-item basis; (b) tests performance proficiency relative to instructional objectives, group norms, and standards; (c) diagnoses learning progress, detects learning difficulties early, prescribes remedial work, and adjusts difficulty levels/rates to match students' abilities; (d) provides overall performance profiles, course grades, percentile ranks, etc.
6. Data management: (a) accepts as inputs any properly computer-interfaced responses by students and instructors; (b) automatically records in central memory input data from all system terminals, pools group data, forms generic data bases, and carries out statistical or other processing while preserving individual student records; (c) displays selected data files, analysis results, or interpretative messages automatically, or on command, to designated terminals in written, spoken, and/or graphical formats.
7. Course management: (a) describes syllabus-controlled sequencing of, and time allocations for, successive instructional units and the timely execution of tests, evaluations, and student feedback; (b) tracks individual and group progress relative to established course milestones, performance standards, and through-put rates; (c) carries out course evaluations and recommendations for revision based on data analysis and success in attaining instructional objectives; (d) enables monitoring of individual or group performance by the instructor, who may selectively intervene or interact with individuals or groups; (e) provides prompts and messages to instructors to insure timely occurrence of non-computer instructor functions.
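Several of the automated supervision capabilities listed above (frequent performance checks, variable path branching, review sequences) amount to a simple branching rule applied after each check. The sketch below is a hypothetical illustration with invented mastery and review thresholds, not a specification from this workshop:

```python
# Illustrative sketch of automated path branching: after each
# performance check, the system advances, repeats, or branches
# back into review. Thresholds are hypothetical.

def next_unit(current_unit, score, mastery=0.8, review=0.5):
    """Choose the next instructional step from the latest check score."""
    if score >= mastery:
        return current_unit + 1          # advance along the path
    if score >= review:
        return current_unit              # repeat the unit with feedback
    return max(current_unit - 1, 0)      # branch back to a review unit

# A student's path through five performance checks:
path, unit = [], 0
for score in [0.9, 0.6, 0.4, 0.85, 0.95]:
    unit = next_unit(unit, score)
    path.append(unit)
print(path)  # [1, 1, 0, 1, 2]
```

A record of each (score, unit) pair fed to such a rule is also exactly the kind of input the data-management and course-management functions above would log and track against milestones.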
The above lists of automated functions include no mention of how they are, or might be, effected. Even though this question is certainly beyond the scope of this paper, there are at least three good reasons why it is worthwhile to consider automated functions independently of the technological means by which they can, or might be, achieved: (a) applications of technology must be justified in terms of the functions they can perform, and these functions must stand on their own merits; (b) the same functions may be accomplished in different ways depending on the requirements of particular applications; and (c) new applications of the same functions and new techniques for effecting them may be developed. It is also the case that new technologies result in the emergence of previously unanticipated functions and applications. As we shall show, application of computer technology in instruction modifies instructor roles not only by assuming and improving upon traditional instructor functions, but also by laying the foundation for the emergence of new instructor functions. But first, let us compare automated and traditional instructor functions.
FUNCTION DIFFERENCES

Perhaps the appropriate question at this point would be: what instructor functions can, and must, humans perform that computerized machines cannot? In answer, we must admit to ignorance. It is the word "must" in the question that renders its answer obscure, at least to us. Even if we could enumerate every possible human activity that might be construed to be an instructor function, we could not say on the basis of scientific evidence which of them were essential to the learning process. Perhaps none are.

It would seem safe to conclude from the last half-century's research on the learning process that the minimum necessary conditions for learning are: (a) the presentation of energy-borne information to the senses of a living organism, and (b) the consequent occurrence of some sensory effect within the organism that may be overtly manifested in its behavior. While these two conditions are necessary, even if they are met the occurrence of learning is not assured. Information enhancement, repetition of its presentation, response feedback, etc., also may be required in certain circumstances to increase the probability that learning will occur, that it will occur rapidly, and that it will endure. Whether or not such conditions are essential to the learning process in some fundamental theoretical sense (if, indeed, there is just one learning process), it seems they must be considered practically indispensable in circumstances appropriate for the accomplishment of instruction. Since instructor functions traditionally have been the means through which production and manipulation of these conditions for learning has been effected, it seems reasonable to assume that any instructor function which a computerized machine cannot be programmed to simulate effectively should be considered a "must" for the human instructor. Some insight into what these "must" functions should be may be gained by examining the differences between instructor and automated functions. Function differences are summarized below using the same numbering of categories as before.
1. Computer-based systems can facilitate the development of instructional information, but they cannot recognize instructional needs, originate program goals, or conceive of the means by which to attain them. Humans must assume virtually all responsibility for gathering and synthesizing information, evaluating what would be important and interesting, creating stimulating conceptualizations, examples, etc., of the information, and finally, originating innovative computer programs to achieve effective information delivery to, and interaction with, students. However, these indispensable instructor functions do not have to be carried out by the individuals who serve as instructors with computer-based systems. In fact, it would probably be more cost-effective for information development functions to be exclusively the jurisdiction of professional scientists, scholars, computer programmers, instructional designers, etc. The salience of this specialist approach to information development is ramified in the rapidly emerging areas of computer-based imagery, animation, simulation gaming, etc. If the possibilities for computer-based instruction are to be exploited to the fullest in the years to come, it is apparent that we shall have to rely more upon the specialist (better still, teams of specialists).
2. It is in the performance of information delivery functions that the computer-based instructional system proves its worth. The degree to which such a system can carry out the automated functions listed previously depends on the fidelity of its man-machine interfacing (sensors and manipulanda; visual, auditory, etc., display devices), its CPU capacity, and the sophistication of its software. If we wish to think very far ahead, display functions might be expanded to include computer-driven machine movements (robotics).
It is apparent that we can expect dramatic advances in all areas of computer technology in the relatively near future. However, even current capabilities permit the execution of nearly all information delivery functions for which human instructors are responsible, including some functions unachievable by any means other than the computer (e.g., interactive video displays). This is not to say that computerized machines duplicate human actions in carrying out the same functions, or that such duplication is always desirable even when it is possible. For example, it may be that some information can be transmitted to a student more effectively in a written, visual display than in a spoken, auditory display, where the latter might be the format traditionally used by human instructors. Conversely, if it were beneficial to transmit the information in a simulated human form, a speech synthesizer could be employed. Thus, in order to duplicate human functions, it may not be necessary to simulate human actions.
A review of the capabilities listed under automated functions indicates a high order of individualized and interactive information delivery by computer-based instructional systems. The question is: are there any additional delivery functions that must be performed by a human instructor-operator whose station is a terminal tied into a network of student stations? It is implicit in this question that the instructor's communication with students is limited to just those channels available in his station. These may be of two types: (a) extra-CPU, and (b) intra-CPU. By means of the extra-CPU channels, the instructor may deliver non-computer-based information (e.g., speech and other acoustic signals, visual signals and displays, etc.). These displays may be presented either independently of, or in synchrony with, CPU-produced information. Also, the instructor may selectively open channels to one or more student stations, enabling two-way video and/or audio communication. By means of the intra-CPU channels, the instructor may interact in ways and at times permitted by the program (e.g., as a team leader, as a competitor in a gaming situation, as a prompter, etc.), or he may override the program and initiate appropriate pre-program directives, messages, data listings, etc.
These are some of the delivery functions an instructor may perform from within a network station. Due to this system's large capability for performing automated functions, probably none of the possible human instructor information delivery functions can be considered a priori to be indispensable. Depending on specific training program requirements, both extra- and intra-CPU instructor functions may be considered desirable features and incorporated as part of the whole delivery system. Since none of the automated functions need be sacrificed in order to accomplish this, it seems highly likely that instructor functions of the sort indicated above would be included wherever possible. In addition to monitoring students, inclusion of these functions would enable the instructor to interact with his students on an individual or group basis, competing with or leading them, correcting or encouraging them, directing or questioning them, etc. The motivational influences of these kinds of interactions on both instructors and students should magnify the effectiveness of the entire learning experience in this situation.
The instructor's responsibilities for delivering information outside of the network station probably should be regarded as essential. They would include familiarization of students with training system operating procedures, demonstrating system operation, orienting students to training objectives and performance expectations, carrying on post-training session discussions with students, etc. These functions would probably require that the instructor serve in several traditional roles: as lecturer during initial familiarization; as supervisor during demonstration of system operations; and as discussion leader during pre- and post-training sessions. By contrast, during system training sessions the instructor's role would be characterized by a mixture of elements from the traditional roles of field (team) leader, supervisor, and tutor. An additional set of elements of a non-traditional nature may also be included, i.e., those of a competitor. The compound of these elements results in the emergence of a new role for the instructor. For want of a better name for this role, we shall call it "coach".
3. Student supervision is a category of functions which ultimately depends on the delivery of information from instructor to student. Although this information is instructional, the knowledge components of it are generally limited to system, or machine, operations conveyed through demonstrations of the skills involved in performing the operations. Supervisory information also consists of evaluative feedback from the instructor regarding both the actions made in skills performance and the outcomes or products of those actions. Characteristic of supervisory information is a strong directive component which is intended to control both actions and outcomes. The instructor's responsibility for supervisory functions depends on how closely the students' actions are to be controlled. As indicated in the list of automated supervisory functions, the computer-based system enables a high order of student direction on a response-by-response basis. The instructor is thus
relieved of most of the drudgery of student supervision, this being replaced by finely attuned, immediate, response-contingent feedback to the student. Supervisory functions remaining for instructors to perform, as in the case of information delivery functions, must be divided into station and non-station functions, the former being broken down into extra- and intra-CPU functions.
While operating from his network station, the instructor's supervisory responsibilities would require monitoring of student performance via both computer and non-computer displays. Instructor supervision via intra-CPU channels might consist of program overrides, re-initializing some portion of the program, interacting with students in ways permitted by the program (e.g., entering instructions, calling up visual displays, manipulating pointers and such on visual displays, etc.), activating pre-programmed directives, etc. By means of extra-CPU channels, the instructor might issue commands, instructions, corrections, etc., at any point independently of CPU-controlled actions. Whether or not, and the degree to which, any of these activities were considered necessary would depend on particular program requirements. The instructor's supervisory duties outside of his network station, and after initial familiarization and demonstrations, probably would be negligible, consisting mainly of informal advice and encouragement. Thus none of the instructor's supervisory functions can be considered a priori indispensable, though certainly they would be desirable in most cases, especially those involving skill performance.
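The intra-CPU supervisory actions named above (program overrides, re-initialization of a program portion, activation of pre-programmed directives) can be sketched as a small command interface from the instructor station to the training program. This is only an illustrative model; all class, method, and directive names here are hypothetical and are not drawn from any system described in this paper.

```python
# Hypothetical sketch: intra-CPU supervisory commands an instructor
# station might issue to the CPU-resident training program.

class TrainingProgram:
    def __init__(self):
        self.segment = 0
        self.log = []

    def advance(self):
        # normal CPU-driven progression through the lesson
        self.segment += 1
        self.log.append(f"run segment {self.segment}")

    # Supervisory functions available through the intra-CPU channel:
    def override(self, directive: str):
        # instructor overrides the program (e.g., a freeze)
        self.log.append(f"override: {directive}")

    def reinitialize(self, segment: int):
        # restart some portion of the program
        self.segment = segment
        self.log.append(f"re-initialized at segment {segment}")

    def activate_directive(self, directive_id: str):
        # deliver a pre-programmed directive to the student station
        self.log.append(f"directive {directive_id} sent to student station")


program = TrainingProgram()
program.advance()
program.override("freeze")         # instructor halts the exercise
program.reinitialize(0)            # restart the current program portion
program.activate_directive("D-3")  # canned message/instruction
```

Extra-CPU commands (voice corrections, for instance) would bypass this interface entirely, which is why the text treats the two channels separately.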
It is interesting to observe that, even if the instructor's supervisory functions are strong, in this type of automated instructional situation his role would not be that of the traditional supervisor. Rather, it would be that of a coach, as indicated earlier. On the other hand, if supervisory responsibilities were negligible, the instructor's role would tend toward that of the tutor (if he supplied knowledge information interactively) or that of the competitor (in a gaming simulation).
4. The advisory and counseling responsibilities usually subsumed under student guidance probably would be largely peripheral to the instructor's network station functions. The instructor's responsibility for student guidance outside of the network station, however, would include both formal and informal discussions regarding student career goals, relevance of the current training to attainment of those goals, job availability and requirements (some of this may be available at the student's network terminal), etc. Furthermore, specific advisory or counseling sessions probably would be scheduled to correspond with major course milestones. At such times students may be faced with decisions involving changes in job aspirations. The instructor's counseling responsibilities probably should be limited to providing recommendations for alternative courses of action for students to pursue, depending on their success in the present course of instruction. Formal counseling by instructors might be carried out in conjunction with, or referred entirely to, a professional counselor knowledgeable in the area of instruction. Thus, the computer-based system of instruction under consideration would seem to permit a large range of instructor commitment to student guidance and, consequently, this may not be regarded as an indispensable area of instructor functions. At least some modest level of informal counseling, however, would appear to be beneficial.
5. As indicated in the list of automated student evaluation functions, the computer-based system requires little additional instructor input in this area, if any at all. However, if the instructor's supervisory responsibilities within his network station constitute a significant element in the training program, the instructor's responsibility for some forms of evaluative feedback to students would be greater. Such feedback would be in the form of corrections, reprimands, indications of performance deficiencies, etc. Likewise, because the system can be programmed to provide the student with virtually any form of evaluative performance index, diagnoses of learning progress, overall performance profiles, etc., the instructor's responsibility for student evaluation outside of his network station would consist largely of interpretation and explanation. Of course, need for the latter would depend on the ease with which students could read and interpret their machine-generated evaluations. In this, as in any other area of computer-based instruction, programming inadequacies devolve into instructor responsibilities. In summary, the network system instructor's student evaluation functions are negligible except where supervisory responsibilities are high. In that case, the two areas of instructor functions are parallel and, to some extent, indistinguishable.
6. In the area of data management, instructor functions are eliminated. Instructors may enter their comments, evaluative scores, etc., into student records, but the instructors bear no responsibility for filing, updating, or other processing of performance data.
7. The mechanics for course management are all pre-set in the training program. Instructor responsibilities in this area consist mainly of keeping the prearranged program on schedule, ensuring smooth integration of non-program activities with the schedule. Thus, insofar as course management is concerned, the instructor's role is essentially that of an aide. He must service and otherwise work to ensure the uninterrupted continuance of the program. In a sense, the program manages itself. This is to say that all management decisions regarding scheduling, resource allocations, program requirements, contingency plans (including provisions for instructor intervention), etc., are made prior to implementation.
INSTRUCTOR STATIONS
The kernel of our analysis is that, if computer-based automated instructor functions are utilized to their fullest potential, no human instructor functions may be considered, a priori, as indispensable. The machine is much more than an extraordinary medium for the delivery of instructional information. It is also a capable instructor.
The question becomes, then, what human instructor functions may be considered desirable? A cursory enumeration of these functions was given in the preceding section. They were divided into two situational classes: functions to be performed in the instructor's network station, and functions to be performed outside of the station. Within-station functions were further divided into extra-CPU channels and intra-CPU channels. The question we now have to address is: what characteristics of network instructor stations are necessary to permit the performance of human instructor functions?
The model of the computer-based instructional system (trainer) that we are considering consists of a network of terminals tied to each other through a CPU and through direct circuits. The CPU assumes the lion's share of responsibility for presenting information, collecting performance data, data management, interactive processing of inputs and outputs, sequencing, etc. Depending on the ways it is programmed, the CPU may call for instructor actions, decisions, feedback to students, etc., thus structuring and integrating instructor functions into the complex flow of events. If the program involves a gaming simulation, the instructor may interact with students on a competitive basis as an opponent or as a team leader. Furthermore, through the CPU the instructor's station may be tied into a network of instructor stations, thereby placing at the instructor's disposal a generic data base built up through common experience with the system or originally designed for instructor training.
It should be clear that the core of this system is the CPU, not the instructor. Its programming, capacity, and speed determine what it can process (the inputs it can accept, the decisions it can make, and the displays it can present). The instructor station, as well as the student stations, are the peripheral input and output interfaces with the CPU. Excluding direct links between stations, the specifications for station input sensors and output displays must conform to CPU capabilities. At present, CPU technology appears to be more advanced than sensing and display technology. With some temerity we suggest, therefore, that anything which can be sensed in analog and converted to digital, and vice versa, is within the capabilities of existing CPU technology and thus can be incorporated into student and instructor stations. Ignoring technological limitations due to such problems as digital conversion of analog inputs and outputs, etc., let us examine some desirable input and display characteristics of instructor stations.
Interface features of the instructor station may be organized into a two-by-two contingency table. The columns of this table designate channel type and the rows designate interface type. The two channels are, as before, intra-CPU and extra-CPU. The two rows are controls and displays. Thus we have controls and displays for each channel, resulting in four categories of instructor station features.
1. Intra-CPU displays: there are two classes of displays within this category: (a) monitors and (b) read-outs. The intra-CPU monitor displays provide the instructor with duplicates of the CPU outputs to student stations. These would include both visual displays (CRT) and auditory displays (headphones). The student station being monitored would be selectable from the instructor station, as would the type of display. Since these displays duplicate those in student stations, they could be used by the instructor while participating in game simulations. The read-out displays would provide the instructor with private CPU outputs, thereby enabling him to receive information from the computer independently of that available at student stations. Status reports on all stations, data files, prompts and messages, etc., would be delivered at these displays.
2. Extra-CPU displays: these displays permit two-way communication between students and instructor. They may be audio (headphone) and/or video (television) displays. In addition to communication between students and instructor, these displays could also be used as extra-CPU monitors. In either case, student stations would be selectable by the instructor, as would open-channel communication with all stations (audio only). By means of one of these displays (probably audio), students would be able to initiate communication with the instructor.
3. Intra-CPU controls: these instructor response interfaces are of two types: (a) operator and (b) master. The operator controls duplicate those present in each student station. They permit the instructor to engage the system as a participant (competitor or leader) in game simulations, etc. The master controls enable the instructor to address the CPU independently of student stations. By means of the master control terminal, non-automated instructor functions provided for in the computer program could be carried out.

4. Extra-CPU controls: these controls include all switches, knobs, levers, etc., necessary to activate, select, operate, etc., any non-CPU devices or displays. Instructor controls for extra-CPU displays would be included in this category.
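The two-by-two organization of instructor-station interface features described above can be summarized as a simple data structure. The feature descriptions below paraphrase the four categories from the text; the data structure itself is only an illustrative sketch, not a proposed implementation.

```python
# A minimal sketch of the 2x2 contingency table of instructor-station
# features: channel type (intra-CPU / extra-CPU) crossed with interface
# type (displays / controls).

station_features = {
    ("intra-CPU", "displays"): {
        "monitor":  "duplicates of CPU outputs to a selected student station",
        "read-out": "private CPU outputs: status reports, files, prompts",
    },
    ("extra-CPU", "displays"): {
        "audio": "two-way headphone channel, open-channel capable",
        "video": "two-way television channel to selected stations",
    },
    ("intra-CPU", "controls"): {
        "operator": "duplicates of student-station controls (participation)",
        "master":   "addresses the CPU independently of student stations",
    },
    ("extra-CPU", "controls"): {
        "switches": "activate/select/operate any non-CPU device or display",
    },
}

# The four cells of the table correspond to the four numbered categories
# in the text.
categories = sorted(station_features)
```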
Of course, specific design features of instructor stations would be determined by the same sorts of determinants that were shown to influence traditional instruction. Indeed, the entire network system would be impacted by these factors. There is, however, an additional element which must be taken into account in designing the instructor station. It is the CPU: the sophistication of its programming, the capacity of its memory, and the speed of its operation.
The technology exists to provide advanced CPU hardware, as it does for controls and, to a lesser extent, for displays, but the program software has to be created for each application of this technology. We suggest that system design and software development should be parallel efforts, both directed toward the same end: an integrated instructional system. If the contributions of instructional, software, and hardware designers are weighted equally throughout the planning and development stages of computer-based instructional systems, then the instructor station concept may be fully realized. At that point we will see the new instructor roles emerge: the coach, the competitor, the team leader, etc. Whatever kind of animal we build, let us design it so that its appendages and its brain form an integrated whole. In that, the instructor will be but one functioning organ.
FRONT-END ANALYSIS LEADING TO VTX INSTRUCTOR OPERATING STATIONS
FUNCTIONAL SPECIFICATIONS
Jeffrey N. Punches
Vice President
MATHETICS, INC.
P.O. Box 26655
San Diego, CA 92126
INTRODUCTION
The emerging Navy Undergraduate Pilot (VTX) training system, designed under the guidelines of OMB Circular A-109, is a training system composed of a number of integrated elements. These elements are the aircraft, simulators, training management system, academic system, and an integrated logistics system. In the McDonnell-Douglas/British Aerospace VTX system, the simulators are in reality a family of training devices which provide a systematic, hierarchical method of training hands-on learning objectives. The purpose of this paper is to present the front-end analysis procedures which were used during design of the instructor stations for the family of VTX training devices.
PRELIMINARY ANALYSIS
The VTX training system (VTXTS) development effort centered around a cost-effective method to meet the training requirements of Navy jet pilots through the year 2010. The VTXTS is based upon the requirement to train to criterion 87 Navy-provided objectives and any developed during the ensuing ISD analysis. Each of the contractors involved in the VTX procurement was required to use Instructional Systems Development (ISD) principles and methodology in the design of the system. The ISD analysis also provided answers to problems in areas of design-to-cost, life cycle cost, and resource requirements. Through the use of ISD, a usable, integrated system was created into which the family of training devices were fully interfaced. This paper describes the "down-in-the-trenches" procedures that were used by the team of analysts who developed the functional specifications for the VTXTS simulator Instructor Operating Station. Figure 1 depicts the Mathetics analytical process which led to the functional specifications for the VTXTS Instructor Stations. The right-hand blocks depict the groups or personnel who provided data to or received data from the study. The left-hand blocks are the analytical steps used during the effort.
Figure 1. VTXTS Simulator Front-End Analysis
[Figure 1 shows the analytical flow: data collection, media selection, functional flow development, allocation of features/displays/controls, scenario development, and functional specification documentation, with the contributing and receiving groups alongside.]

Within the ISD process, considerable work was performed as part of the process of developing functional specifications for the family of simulators. The ISD steps involved were:
. Navy Jet Pilot Warfare Specialty Job Analysis
. Navy Jet Pilot Task Analysis
. Objective Hierarchy Development
. Media Selection
. Syllabus Specification
. Course Map Development
Of the above steps in the ISD process, the media selection and syllabus development inputs formed the base on which the analysis began. The media selection models suggested three types of devices: an Instrument Flight Trainer (IFT), an Operational Flight Trainer (OFT), and an Aerial Situation Trainer (AST). These devices and their associated capabilities support a hierarchical acquisition (simple progressing to complex) of flight skills. Table 1 lists the types of training devices, the key capabilities, and the associated pilot training stages.
TABLE 1

Device  Capabilities                        Training Stages

IFT     No visual                           FAM, BI,
        Platform motion                     RI, ANAV
        Flight & ground modes
        Auxiliary IOS
        CPT mode
        Flight support

OFT     Large field-of-view CGI visual      FAM, BI,
        G-suit, G-seat                      RI, ANAV,
        Interactive target                  FORM, NFORM,
                                            GUNS, WEPS,
                                            TACNAV, CQ

AST     Dynamic visual                      FORM, NFORM,
        Dome visual                         TACF, GUNS,
        G-seat/G-suit, buffet systems       WEPS, ACM
        Interactive target
        Auxiliary IOS
Building on the data produced during the ISD development process, Mathetics, Inc. performed an analysis which led to a functional specification for the instructor operating stations for these devices. The design was to be specifically oriented toward making the instructor a teacher rather than an operator.
CONSIDERATIONS
Initially the MATHETICS study team reviewed the literature on prior simulator design and applicable human factors considerations. Although many studies and authors contributed to the data base for this study, two works were particularly valuable. The first, "The Instructor Pilot's Role in Simulation Training" by Dr. John P. Charles (1978), formed the basis on which the design algorithms were formed. The second, "Training Device Design: Human Factors Requirements in the Technical Approach" by Dr. Alfred F. Smode (1972), provided a wealth of information pertaining to design considerations and presents many pertinent examples of human factors considerations for design of simulators.
The concept of a generic instructor station for all of the training devices was a desirable goal for the VTXTS for several reasons, including cost and risk reduction, commonality of hardware and software, and a reduction in instructor training requirements. The instructor consoles, although generally common, would have some differences due to the varying capabilities of the devices and, therefore, the magnitude of the requirement for instructor monitoring and instructional activities.
The study of all factors to be considered in designing a generic IOS revealed that the instructor console should have a number of computer-assistance features designed to aid the instructor. These features were:
Standardization of Training
Preprogrammed lessons which would provide a series of standardized exercises graduated in difficulty, with some capability for instructor intervention in prescribed ways to control the training process. This concept includes the possible manual override of defined mission events to accommodate students who perform below expectation for a given exercise, and for insertion of additional related events in an exercise to challenge students who exceed scenario requirements. These additions and deletions of training events are allowed only if the preprogrammed mission scenario software allows manual intervention.
Computer Assisted Measurement System
Automated evaluation and scoring with objective criteria adjusted to the stage of instruction and level of difficulty for each student. The system would provide error indications and information which are displayed at the instructor console.
Automated Monitor and Control Capability
Computer-driven, multi-format, interactive CRT displays at the IOS which present performance/error and syllabus status information in all relevant modes and in variable formats (both alphanumeric and graphic).
Records of Student Performance
Software recording of student performance for debrief. Student performance information (syllabus event, error, and summary information) would be available on demand for use during and after a training exercise (for both debrief and record-keeping purposes).
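The Computer Assisted Measurement System idea, scoring against objective criteria whose tolerances are adjusted to the stage of instruction, can be illustrated with a small sketch. The parameter, the tolerance values, and the function name below are invented for illustration and do not come from the VTXTS specification.

```python
# Illustrative sketch: automated scoring of an altitude-hold task with
# tolerances that tighten as the student progresses through the stages.

# Hypothetical altitude-hold tolerance (feet) per training stage.
TOLERANCE_BY_STAGE = {"FAM": 200.0, "BI": 100.0, "RI": 50.0}

def score_altitude_hold(samples, target_alt, stage):
    """Return (errors, fraction_within_tolerance) for display at the IOS."""
    tol = TOLERANCE_BY_STAGE[stage]
    errors = [abs(a - target_alt) for a in samples]
    ok = sum(1 for e in errors if e <= tol)
    return errors, ok / len(errors)

samples = [10050, 9980, 10120, 10060]   # recorded altitudes (ft)
errors, frac = score_altitude_hold(samples, 10000, "BI")
# With a 100 ft tolerance, 3 of 4 samples are within limits (frac = 0.75);
# both the per-sample errors and the summary fraction would be candidates
# for the error indications displayed at the instructor console.
```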
The first of these computer-assisted features became a driving factor in the IOS design. The nature of the Navy Jet Training Command requires that specific prerequisite knowledges and/or skills must be acquired prior to the follow-on event. Therefore, the training problem in the Undergraduate Pilot Training (UPT) context is best placed under software control, where the instructor can only minimally alter the training scenario. This allows for training syllabus configuration control across the entire Jet Training Command, in that the learning objectives, conditions, and standards assigned to a particular lesson cannot be modified by the instructor who thinks he has "a better way". These preprogrammed lessons are designed in such a way as to specify the applicable instructional features, controls, and strategies which may be employed to teach certain objectives, maneuvers, or flight elements. One might contend that such an approach would take the instructor "out of the loop"; however, it is believed that such an approach would actually enhance the instructor's capability to teach rather than operate the device.
Heavy programming of lessons is also necessary where automated performance measurement is to be utilized. One cannot alter the training scenario and/or lesson methodology without impacting the results from the performance measurement system. Preprogrammed lessons also create a predictably capable pilot with the skills necessary for entry into the weapons-system-oriented aircraft located at the various Fleet Readiness Squadrons (FRS). Therefore, a conscious effort has been made, in concert with ISD methodology, to design out the flexibility of past simulators. Designing out flexibility means designing in standardization. It is believed that designing out flexibility would also reduce cost and increase operating simplicity. The MATHETICS study team held that it is possible to design an IOS which has no training controls other than interactive CRTs controlled by a package of training software designed specifically to enhance the instructor's capability to train.
The last consideration of note was the concept of separating the simulation system software (aero model, motion model, etc.) from the package of training software (syllabus, scenario control, etc.) in terms of configuration control and management. Detailed, unique training software programs can be created for each lesson, containing the initial conditions for each segment, allowable instructional features and strategies, and performance measurement routines. These packages of training software would act as executive routines for the entire training device computational system, prioritizing programs, routines, and features in terms of training requirements. The training software would be under the control of a Training Support Center and would be easily updatable as part of the training system quality control effort. Instructional personnel would create the training methodologies, which would then be implemented in software. This concept would allow for training implementation of the syllabus by flight instructors trained to instruct in the visual/motor skills of flying high performance jet aircraft.
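The separation described above, simulation system software under one configuration-control authority and a per-lesson training software package under another, can be sketched as two independent configuration objects plus an executive check that grants only the features a lesson permits. Every field name, version string, and lesson identifier here is hypothetical.

```python
# Sketch of the configuration-control split: simulation software vs. the
# per-lesson training software package acting as an executive routine.

simulation_software = {            # under simulator configuration control
    "aero_model": "aero_v1",
    "motion_model": "motion_v1",
}

lesson_package = {                 # under Training Support Center control
    "lesson": "OFT-BI-04",         # hypothetical lesson identifier
    "segments": [
        {
            "initial_conditions": {"alt_ft": 10000, "ias_kt": 250},
            "allowed_features": ["freeze", "replay"],
            "measurement": "altitude_hold",
        },
    ],
}

def feature_allowed(feature, segment):
    """The executive routine grants only features the lesson permits."""
    return feature in segment["allowed_features"]

seg = lesson_package["segments"][0]
# feature_allowed("replay", seg) is True; "free_flight" would be refused,
# which is how preprogrammed lessons constrain instructor intervention.
```

Updating a lesson then means replacing only the training package; the aero and motion models are untouched, which is the point of the split.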
ANALYSIS PROCEDURES
Based upon the results of the initial data collection, the MATHETICS study team selected the Operational Flight Trainer (OFT) as the first trainer for analysis. This was done primarily because the OFT syllabus encompassed every training module within the VTXTS curriculum and varied in complexity from simple hands-on system familiarization to weapons delivery. The functional specification for the OFT could then be later adapted for use in the Instrument Flight Trainer (IFT) and the Aerial Situation Trainer (AST). The MATHETICS study team selected sixteen lessons from the OFT syllabus for full development and analysis. The lessons were written from the existing lesson specifications developed in previous project work. The sixteen lessons were selected for their instructional content, the nature and intensity of instructor involvement, and the possibility for use of training features or strategies.
Each of these sixteen lessons was then fully analyzed in order to determine the sequence and validity of the learning objectives, detail instructor and student activities, and estimate the scope and possible application of simulator learning strategies and the use of instructional features. After full lesson development, the analysts looked at each of the lessons for a number of attributes and selected six of the original sixteen lesson scenarios for further analysis. The scope of the general training requirements included:

Basic Instrument Procedures
Procedural Demonstration Requirements
Modulized Lessons
Emergency Procedures
Verbally Described Demonstrations
Self-Training Requirements
High Density Student/Instructor/System Interactions
Visual Task Training
Replay Requirements
Performance Measurement System Requirements
Backward Chaining Requirements
Intense Instructional Activity
Subjective Instructor Evaluation
The second portion of the analysis utilized a functional flow analysis procedure adapted from the work of both Smode and Charles. The functional flow analysis involved the development of "training algorithms" aided by Subject Matter Experts (SMEs). The SMEs were asked to examine each of the lessons and describe how they, as flight instructors, would best teach each of the lessons. This process yielded a set of eleven training algorithms which were entitled:
Pre-Briefing
Lesson Modification
Briefing
Initialization
Start
Demonstration
Single Task Segment
Multiple Repetition Segment (loop segment)
Post-Mission
Debrief
Post-Debrief
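The eleven training algorithms above act as primitives from which each lesson's flow is strung together, as the next paragraph describes. A minimal sketch of that idea follows; the example lesson string is invented and is not one of the six lessons the study actually analyzed.

```python
# The eleven training algorithms, treated as a fixed vocabulary of
# primitives for building lesson flows.
ALGORITHMS = [
    "Pre-Briefing", "Lesson Modification", "Briefing", "Initialization",
    "Start", "Demonstration", "Single Task Segment",
    "Multiple Repetition Segment", "Post-Mission", "Debrief", "Post-Debrief",
]

def valid_lesson_string(lesson_string):
    """A lesson flow may only be composed of the defined primitives."""
    return all(step in ALGORITHMS for step in lesson_string)

# A hypothetical lesson string of the kind an analyst might assemble
# (and an SME would then validate):
lesson = ["Pre-Briefing", "Briefing", "Initialization", "Start",
          "Demonstration", "Single Task Segment", "Post-Mission", "Debrief"]
```

Composing lessons from a closed set of building blocks is what makes the resulting strings usable as functional flow diagrams: every step has a known, analyzable shape.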
The analysts then created strings of these algorithms which corresponded to the lessons as written in the earlier development effort. The instructional methods and instructional flow for each of the six lessons were then validated by SMEs. These strings of algorithms became the functional flow diagram for each lesson. This step in the process really involves developing an overview of the instructor's role in the training system in order to consolidate thinking about design philosophy and aid in the further definition of design requirements for the instructor station.
The next step, the allocation of features, displays, and controls, began with a review of the literature to determine the types of instructor station capabilities which were currently being utilized in existing and near-term simulators. These capabilities fell into seven categories:

Instructional Features
Displays
Controls
Console Type
Instructor Activity Type
Controlling System/Software
Subsystem Involvement
A "relative time-line" allocation of simulation system capabilities was then performed, again utilizing heavy SME inputs. The "relative time-line" analysis allowed specification of activities, instructional features, controls, displays, etc. for each activity on the functional flow diagram. An alphabetical character was assigned to each of the above listed instructor/system capabilities. Each subactivity was assigned a numerical identifier. The alphanumeric identifiers were placed under each functional flow diagram in "relative time-line" fashion. This data format can be seen in Figure 2, which depicts a portion of a functional flow diagram containing a single task algorithm. This format was utilized to determine the instructional requirements for each lesson and to provide a qualitative indication of frequency of use and an indication of criticality. Additionally, the functional flow analysis permitted the analysts to "visualize" the desired sequence of instructional activities and options available to the instructor while actually involved in a training evolution.
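The alphanumeric coding just described can be illustrated with a small data-structure sketch. The letter-to-category assignments and the subactivity numbers below are hypothetical stand-ins; the paper does not publish the actual assignments.

```python
# Hypothetical letter assignments for the seven capability categories.
CATEGORIES = {
    "A": "Instructional Features",
    "B": "Displays",
    "C": "Controls",
    "D": "Console Type",
    "E": "Instructor Activity Type",
    "F": "Controlling System/Software",
    "G": "Subsystem Involvement",
}

# One activity box from a single-task algorithm; the (letter, number)
# pairs are the alphanumeric codes placed beneath it on the diagram.
activity = {
    "name": "Freeze / provide guidance",
    "codes": [("A", 1), ("B", 3), ("C", 7)],
}

def frequency_of_use(activities):
    """Tally how often each capability code appears across a lesson's
    functional flow -- the qualitative frequency-of-use indication
    the analysts derived from this data format."""
    tally = {}
    for act in activities:
        for code in act["codes"]:
            tally[code] = tally.get(code, 0) + 1
    return tally

usage = frequency_of_use([activity])
```
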
SPECIFICATION OF INSTRUCTIONAL FEATURES
AND STRATEGIES
Based upon the data that was generated during the preceding analyses, a set of instructional strategies and features for the simulator was developed. Instructional strategies are defined as simulator capabilities which are utilized as methods of presenting simulator training materials. The instructional strategies applicable to the VTXTS OFT are DEMONSTRATION, REPLAY, PREPROGRAMMED LESSONS, MANEUVER/SEGMENT RERUN, BACKWARD CHAINING and FREE FLIGHT. These strategies are defined below.
Figure 2. Functional Flow Diagram (portion of a single-task algorithm; activities such as "initialize/rerun sequence," "continue event," and "freeze/provide guidance" are shown with their alphanumeric capability codes)
DEMONSTRATION is an OFT instructional strategy that consists of a prerecorded aircraft maneuver, or series of maneuvers, that provides a model of desired performance for a flight event. REPLAY enables the instructor to replay the student's performance during any portion or all of the most recent five minutes of simulated flight. PREPROGRAMMED LESSONS comprise 98% of all training events and consist of a pre-defined sequence of maneuvers, segments of flight, instructional events and allowable instructional features and strategies which reside in the OFT training software. MANEUVER/SEGMENT RERUN is an instructional strategy which permits the instructor to return to the flight condition which existed at the beginning of a maneuver, either by flight segment designation or by entering any mission time within the five-minute replay recording. BACKWARD CHAINING is a strategy which presents a set of prerecorded points (initial conditions) which allow a student to learn the last response (last portion) in a response or maneuver chain first. Learning then proceeds "backward" up the chain until all members of that chain are acquired. FREE FLIGHT lessons, comprising approximately 2% of the scenarios, enable the instructor to construct his own flight scenario from a menu of initial conditions, malfunctions, flight segments, and instructional modules.

Instructional features are defined as simulator capabilities which enhance the training of a student by modifying the instructional scenario or strategies. The features applicable to the OFT were determined to be SELECTABLE DIFFICULTY LEVELS (SDL), DIGITAL SPEECH GENERATION, and PARAMETER FREEZE. SELECTABLE DIFFICULTY LEVELS is an OFT instructional feature that consists of a simultaneous programming of three levels of difficulty for any scenario and which is utilized to either challenge or unload a student whose performance is above or below the UPT student average, respectively. SDL is in reality a simple "poor man's" adaptive training capability. DIGITAL SPEECH GENERATION unloads the instructor of the requirement to act as a manned interactive system (GCA controller, ATIS, clearance delivery, ACLS messages, LSO control, etc.) by means of speech synthesis. PARAMETER FREEZE allows the instructor to selectively freeze a certain flight parameter (airspeed, angle-of-attack, etc.) or a portion of a flight path (glideslope, centerline) at acceptable performance measurement parameters.
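The SDL logic amounts to a three-way branch on the student's standing relative to the UPT average. The sketch below assumes hypothetical numeric scores and a tolerance band, neither of which is specified in the paper.

```python
def select_difficulty(student_score, peer_average, band=5.0):
    """Pick one of the three simultaneously programmed difficulty
    levels: challenge a student above the UPT average, unload a
    student below it, otherwise serve the standard scenario."""
    if student_score > peer_average + band:
        return "hard"      # challenge the above-average student
    if student_score < peer_average - band:
        return "easy"      # unload the below-average student
    return "standard"

# e.g. a student well above the peer average gets the harder scenario
assert select_difficulty(92.0, 80.0) == "hard"
assert select_difficulty(70.0, 80.0) == "easy"
```
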
It is felt by many simulation industry observers that information about the intended use to be made of a simulator's instructional features, if made available during the design process, could be used to design a more effective vehicle for training. The needed information must convey to the designer the prospective simulator user's concept of how the various instructional features are to be employed during simulator instruction. It was decided by the study personnel that the use of Instructional Feature Design Guides, adapted from Pohlman, Isley and Caro (1978), would best accomplish this purpose. The VTX Instructional Feature Design Guides accounted for learner characteristics and teaching methodologies appropriate to Undergraduate Pilot Training. Each of the instructional strategies and features was written up in an individual design guide which described the feature and how it would be employed by the simulator instructor to perform coaching, demonstration, feedback and instructional functions for relatively unskilled pilots.
A six-element format was utilized for the instructional feature design guides and included the following items:

    Feature Name
    Purpose and Intended Use
    Function Description
    Concurrent Events
    Feature Diagram

Figure 3 is an example of a Feature Diagram.

Figure 3. Example of a Feature Diagram (Backward Chaining)

SYSTEM DEVELOPMENT

A survey of existing and emerging simulator technology was conducted in order to incorporate ideas and technology which would improve the instructor's capability to teach. The two devices which most dramatically influenced the IOS design process were the F-14 Instructor Support Station and the F-18 OFT, Device 2F132. These two devices incorporate a number of features that were deemed appropriate as a result of the previously discussed analyses. These capabilities of interest are as follows:

    F-14 Instructor Support Station (ISS)
        Interactive CRT Touchpanels
        Preprogrammed Lessons
        Digital Speech Generation
        Performance Measurement System

    F-18 OFT
        Display System
        Interactive CRT
        Procedures Monitor
        Preprogrammed Missions

Of particular interest is the simplicity of the 2F132 IOS. A number of training devices include an IOS which is very similar in hardware configuration to the 2F132; however, they are markedly different in software design, operational capability and syllabus implementation from the F-18 OFT.
The analyses indicated that it would be desirable to include two additional subsystems in the simulator design to further enhance its capability to meet the challenges of the training system. These subsystems were an Automatic Performance Measurement System (APMS) and a Remote Briefing and Debriefing Station. The APMS would provide a computer system that would score a student's performance automatically and package those scores in a manner which would be usable to the instructor in real time and during the subsequent debrief. The APMS programs will compare the student's performance with a definitive criterion, normalize these scores in order to provide peer ratings, determine student errors, and suggest remedial strategies. The APM system is designed in such a way that performance on one leg or segment does not affect scoring on subsequent legs except that the possible use of the SDL feature may be continued.
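The APMS behavior described here, comparison against a definitive criterion followed by normalization into peer ratings, can be sketched as follows. The parameter names, tolerances, and z-score normalization are assumptions; the paper does not define the actual scoring algorithms.

```python
from statistics import mean, pstdev

def apms_score(student_params, criteria):
    """Compare measured flight parameters against a definitive
    criterion; returns the amount by which each parameter falls
    outside its tolerance (0.0 means within tolerance)."""
    errors = {}
    for name, (target, tolerance) in criteria.items():
        deviation = abs(student_params[name] - target)
        errors[name] = max(0.0, deviation - tolerance)
    return errors

def peer_rating(raw_score, peer_scores):
    """Normalize a raw score against the peer group (a z-score),
    so the instructor sees standing relative to the class."""
    sigma = pstdev(peer_scores) or 1.0
    return (raw_score - mean(peer_scores)) / sigma

# Hypothetical criteria: (target, tolerance) per parameter.
criteria = {"airspeed_kt": (250.0, 5.0), "glideslope_deg": (3.0, 0.2)}
errs = apms_score({"airspeed_kt": 258.0, "glideslope_deg": 3.1}, criteria)
# airspeed is 3 kt outside tolerance; glideslope is within tolerance
```
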
The Remote Briefing and Debriefing Console consists of an isolated station interfaced to both the simulators and the Training Management System. This console can be utilized for simulator lesson preparation, pre-mission briefing, post-mission debriefing, student data management and training management functions. The intent of the Remote Briefing and Debriefing capability is twofold: one, to allow briefs and debriefs to be held in an area more compatible to the instructional activities involved in pre- and post-mission training than the simulator itself, and two, to increase the available OFT training time by freeing the device from the briefing and debriefing activities.
The analyst team then used the functional flow diagrams to validate the IOS design, ensuring that the six lessons could be taught as envisioned. This validation process was iterative in nature, as the IOS design and the functional flow diagrams were continually changed and updated until a working design was completed. This analytical step is similar to a design "mock-up" review and ensures that the instructor can instruct from the instructor station. The final step was documentation of the actual functional specifications.
LESSONS LEARNED
The front-end analysis leading to the VTXTS IOS functional specifications was not without pitfalls. As the analysis proceeded, it was necessary to work around a number of problems. The following paragraphs describe the most important lessons learned from the Mathetics study effort.
1. Simulation engineers are hardware specification oriented. It was difficult to communicate training requirements to the engineers due to the lack of hardware definition. Feature design guides describing in detail the use of instructional features to meet the UPT training requirements largely obviated this problem.
2. The ISD base on which the IOS front-end analysis is anchored must be complete. Considerable effort must be expended on the development of learning objectives, conditions and standards prior to media selection and lesson specification. Improper media selection and/or poor lesson specifications could lead to functional specifications which do not address actual training requirements.
3. Subject Matter Expert (SME) participation is essential during lesson development and the functional flow analysis. Without user inputs, the analytical effort may end up in left field. SME input injects reality into the front-end efforts.
4. Functional flow analyses are labor intensive and must be iterative in nature. The functional flow analysis is not the area to skimp on manpower. The work must be iterative in nature, and previously developed algorithms must be updated as the diagrams are created.
5. The analytical "mock-up" is necessary to validate the functional flows. Performing a "mock-up" review with SMEs identifies weak points and allows for improvement of the specification. Without this step, the functional flow diagrams and feature allocations will never communicate the instructional requirements.
6. Front-end analysis can yield significant improvement in user/manufacturer cooperation. SME inputs before the development of the functional specification, if properly implemented, will reduce the adverse impact of Fleet Project Teams later.
7. Without front-end analysis, "gold plating" may result. This front-end analysis showed that existing or near-term technology was adequate for the VTXTS simulators and that incorporation of exotic features and capabilities was not necessary.
FINAL COMMENTS
A number of conferences have addressed the value of front-end analysis in simulator instructor station design in theoretical terms. This paper has presented an example of one approach to performing a front-end analysis leading to functional specifications that has been tried. The methodology utilized in this study is not original but is adapted from previously published work. This front-end analysis achieved a much better collaboration with project engineers than ordinarily occurs during simulator IOS design efforts and thus greatly enhanced their understanding of the UPT training requirements and how a typical scenario would be taught on the device. The Mathetics study team sees the actual development of working specifications for a simulator IOS as an iterative process which continues throughout the simulator procurement cycle. The functional specifications generated as a result of this front-end analysis were an initial training input to that iterative process. The training team must remain involved throughout the engineering design, construction and user acceptance processes in order to ensure incorporation and proper implementation of key training recommendations. Without training requirements inputs and analysis, there is a risk of user dissatisfaction with the device when placed in the actual training situation.
REFERENCES
Charles, John P. "Instructor's Role in Simulation Training", Technical Reports NAVTRAEQUIPCEN 75-C-0093-C, 76-C-0034-1, and 76-C-0034-2, Naval Training Equipment Center, Orlando, Florida, June 1978.

Mackie, R.R.; Kelley, G.R.; Moe, G.L.; Mencherikoff, M.; "Factors Leading to the Acceptance or Rejection of Training Devices", Technical Report NAVTRAEQUIPCEN 70-C-0276-1, Naval Training Equipment Center, Orlando, Florida, 1972.

Pohlman, Lawrence D.; Isley, Robert N.; Caro, Paul W.; "A Mechanism for Communicating Simulator Instructional Feature Requirements", Proceedings, Naval Training Equipment Center/Industry Conference, 1978.

Smode, Alfred F. "Training Device Design: Human Factors Requirements in the Technical Approach", Technical Report NAVTRAEQUIPCEN 71-C-0013-1, Naval Training Equipment Center, Orlando, Florida, August 1972.
FIGURE 1. GENERAL CONFIGURATION OF VTXTS SIMULATOR INSTRUCTOR CONSOLE (USED ON IFT, OFT, ACMT). Labeled components: secondary display module, HUD monitor, interactive control/display module, two-axis controller, writing surface, secondary display (normally instrument repeater display), communications jacks and controls, aircraft sound control.
is located in back of and to the left of the student's seat, and at right angles to the instrument panel. An instructor seated at the console can thus see the student and his instrument panel and/or the simulated outside visual scene by looking over his right shoulder. For an ACMT, a full console is located outside the simulator, and a miniconsole consisting of an interactive display module is located in front of a jumpseat next to the cockpit.

The configuration of the interactive module is presented in Figure 2. The switches on the interactive module consist of dedicated special function switches and the CRT touch screen, which is used in conjunction with various types of menu displays presented on the CRT. The selection of membrane-type off-screen touch panels over mechanical switches, and of the transparent membrane-type touch screen over infrared and sonar-type touch screens, was made on the basis of a detailed comparison of the ease-of-use, initial cost, reliability and maintainability of these alternatives. A caution in making these kinds of control hardware comparisons is that the cost of the decoding and interface equipment required for different types of switches and switch panels is often much greater than the cost of the switches or panels themselves, and thus must be broken out separately.
FIGURE 2. INTERACTIVE CONTROL/DISPLAY MODULE. Labeled components: emergency/status indicator control, CRT with transparent membrane-type on-screen touch panel, malfunction select, event control, numeric entry with ENTER key, maneuver control, two-axis controller.
Functional Design
The functional design of the instructor
console consists of several major features
which appear capable of dramatically simplify-
ing procedures and improving training. These
features are as follows:
1. An extension of the task module concept developed by Logicon and the Canyon
Research Group for the F-14 OFT ISS. (Semple,
et al., 1979)
2. The use of "event" modules in addition
to task modules.
3. The inclusion of explicit provisions
for conducting three basic types of simulator
training: routine; partially specialized; and
fully specialized.
4. The inclusion of explicit provisions for controlling four basic types of simulator
learning events: IP demonstration; autopilot
or canned demonstration; live run; and replay.
5. A special, minimal set of dedicated
function switches and switch logic designed to
interact with event modules, task modules and
special purpose displays in a way that permits
high4y efficient control of each type of
training and each type of learning event.
6. Low-density pictorial and tabular
special-purpose displays designed to provide
the instructor with the essential information
at each point in the training process..
The following sections describe the nature, use, and projected advantages of each of
the above instructor console design features.
Task Modules. As originally conceived by Logicon and the Canyon Research Group for the F-14 OFT ISS, a task module is a detailed prespecification of the initial conditions, performance conditions, and automated performance measurement algorithms appropriate to a given maneuver or task, which is stored in computer memory. One of the primary functions of such modules is to automate maneuver set-up. Using this approach an instructor can set up a simulator to train any given maneuver by simply identifying the name of the maneuver to the computer. The computer then sets up the simulator automatically according to the specifications in the task module. Task modules thus greatly reduce instructor workload by eliminating the need to remember or look up and then manually enter all of the information which must be specified to set up a simulator to train a specific maneuver.
The proposed approach expands the original
task module concept in two ways. First, the
task module specification is expanded to in-
clude additional information such as 1) the
method to be used to control the second air-
craft in ACM and formation training; 2) the
standard and optional instructor console displays to be shown to the instructor in each
part of each maneuver; and 3) the displays to
be recorded on videotape for later use in de-
briefing. Second, "auxiliary" task modules
are created which enable the instructor to
quickly set up standard variations on basic
maneuvers in order to tailor training to the
needs of individual students, i.e., to con-
duct partially specialized training as dis-
cussed below. This is done with a minimum
impact on computer memory requirements or software development costs by using auxiliary
modules to specify only those parameter values
which are different from those in the "core"
module for the basic maneuver. Standard var-
iations on a given maneuver can then be set up
by selecting a desired maneuver variation from
a menu of standard variations. In addition to
greatly reducing instructor workload, core and
auxiliary task modules will increase training
standardization to a level not possible with
current consoles, in that all maneuvers and
maneuver variations will be set up, displayed
to the instructor, evaluated, and recorded in
the same way for each student and instructor.
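The core/auxiliary mechanism is essentially a parameter-override merge. The sketch below uses hypothetical parameter names and values for a lazy-8 core module and a low-visibility auxiliary module; the real modules and their contents are not published in the paper.

```python
# Core module: the full prespecification for the basic maneuver
# (parameter names and values are hypothetical).
CORE_LAZY_8 = {
    "initial_altitude_ft": 10000,
    "initial_airspeed_kt": 220,
    "visibility_nm": 10,
    "turbulence": "none",
    "measurement_algorithm": "lazy8_standard",
}

# Auxiliary module: stores only the parameters that differ from core.
AUX_LAZY_8_LOW_VIS = {"visibility_nm": 3}

def set_up_maneuver(core, auxiliary=None):
    """Merge an auxiliary task module over its core module: auxiliary
    entries override, everything else is inherited, so each standard
    variation costs only a few stored parameter values."""
    spec = dict(core)
    if auxiliary:
        spec.update(auxiliary)
    return spec

spec = set_up_maneuver(CORE_LAZY_8, AUX_LAZY_8_LOW_VIS)
assert spec["visibility_nm"] == 3 and spec["initial_airspeed_kt"] == 220
```
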
The procedures for fully specializedtraining are simplified by requiring the in-
structor to manually input only those parame-
ter values which are different from those in
the core or auxiliary task module under consideration, as discussed below.
Event Modules. Event modules are an ex-
tension of the canned mission mode defined for
the F-14 OFT ISS. As defined here, an event
module is a detailed prespecification of the
lesson plan for a given training event which
is stored in computer memory. An event module
is created for each simulator event in the
flight syllabus. The event module for a given
event specifies the specific maneuvers to be
trained in the event, as well as the sequence
in which they will be trained. It also spec-
ifies the type and number of demonstrations to
be used in the training of each maneuver, as
well as the minimum, standard, and maximum
number of student practice trials to be flown.
The use of a prespecified minimum, standard,
and maximum number of trials will be further
discussed below under controls and displays.
In general the event module will automatically
set up the simulator to fly each run of each
maneuver in the sequence called for by the
lesson plan for that event. It will do this
by calling up appropriate task modules. For
example,'at the start of a training session
the event module will automatically set up the
simulator for the first run of the first maneuver. This might be, e.g., an automated demonstration of a lazy 8. After he finishes talking to the student and verifies that the student is ready, the instructor starts the run by pushing the START switch. The run will
CONCEPTUAL DESIGN OF AN INSTRUCTIONAL SUPPORT SYSTEM
FOR FIGHTER AND TRAINER AIRCRAFT FLIGHT SIMULATORS
VERNON E. CARTER
Northrop Corporation
ABSTRACT
A conceptual design for a comprehensive instructional support system (ISS) for fighter and trainer aircraft flight simulators was developed as part of an independent research project to develop and evaluate advanced pilot training techniques. The major elements of this ISS are:
1. A low-workload flight simulator instructor console based in part on the task module approach developed by Logicon and the Canyon Research Group for the F-14 Operational Flight Trainer ISS;
2. A comprehensive automated performance monitoring system based on techniques developed by Northrop, Vought, and Systems Technology, Inc. (STI);
3. A set of automated and nonautomated simulator instructional features selected on the basis of a review of the literature and recent field experience;
4. A low-cost, low-workload brief/debrief console designed to minimize the time spent in the simulator on tasks other than active instruction.
These elements are designed to work together to reduce instructor workload and increase the effectiveness and efficiency of simulator training. Although the design is oriented towards large, highly-structured training systems, many of its features appear applicable to other training environments.
INTRODUCTION
A conceptual design for a comprehensive instructional support system (ISS) for fighter and trainer aircraft flight simulators was developed as part of an independent research and development project to develop and evaluate advanced pilot training techniques. The major elements of the ISS are 1) a low-workload instructor console based in part on the task module approach developed by Logicon and the Canyon Research Group for the F-14 operational flight trainer ISS; 2) a comprehensive automated performance monitoring system based on techniques developed by Northrop for evaluating student performance in introductory air-to-air tactics, by Vought for evaluating student performance in carrier landing, and by STI for examining pilot control techniques in a variety of tasks; 3) a set of nonautomated and automated simulator instructional features selected on the basis of a review of the recent literature on instructional features, supplemented by interviews with research and user personnel; 4) a low-cost, low-workload brief/debrief console designed to reduce time spent in the simulator on tasks other than active instruction.
Working together, these elements appear capable of dramatically reducing instructor workload and significantly increasing the effectiveness and efficiency of simulator training. In general, it appears that the proposed approach should
1. reduce the instructor's perceptual and procedural workload far below that of current simulators;

2. reduce the time required to achieve a given level of student performance;

3. permit a much higher level of training standardization and control than is possible with current simulators;

4. provide a full capability for tailoring training to the needs of individual students;

5. reduce instructor training requirements; and

6. minimize the need to refer to user's manuals.
DESIGN GUIDELINES
Two basic design guidelines were established for this development. The first, a point often stressed by Mr. William Harris of the Analysis and Design Branch at NTEC, is that the design of all equipment to be used by military flight instructors should reflect the fact that they are typically undermanned and already working long hours. Their time, as well as their energy and patience, should therefore be treated as a scarce, essential resource to be conserved wherever possible.
It should be recognized, for example, that instructors do not have time to stop in the middle of a training session to try to remember a complex procedure or a set of parameter values in order to set up a given maneuver. They have even less time to look up the correct procedure or information in a user's manual. Instructor consoles and other equipment to be used by instructors should thus be as simple, self-explanatory, and easy to use as possible. Achieving such simplicity not only reduces the time normally spent in remembering, looking up and performing procedures, but also reduces
set-up errors which are time consuming and frustrating in themselves. Finally, equipment that is simple to operate should significantly reduce instructor training requirements, or at least allow more time to be spent on teaching instructors how to instruct rather than how to operate the equipment.
The problem with all this is that making operating procedures simple and self-evident for something as complex as a flight simulator typically requires the development of fairly complex software to partially automate the procedures. The development of such software is both time-consuming and expensive. Although it seems clear that simpler procedures can contribute to reduced life cycle costs in several ways, there is simply not enough data at this point to prove that the projected savings will be enough to offset the cost of developing and maintaining the additional software. A second and very important guideline established for the study was thus the working assumption that the cost of developing and maintaining the software required to simplify operator procedures will be offset in the long run by savings in investment and operating costs resulting from such benefits as reduced instructor training requirements and more effective and efficient student training. For example, the proposed instructor console design appears capable of reducing the time now spent in deciding what maneuvers to train, in obtaining needed information, and in setting up the maneuvers. These savings should translate into reduced time in the simulator for each student. This in turn should result in a reduction in the total number of simulators required, and/or a reduction in the operating and maintenance costs per simulator.
An example of the problems caused by not simplifying operator procedures is provided by some current military instructor consoles that are reported to require the instructor to perform as many as 40 procedural steps to set up a single maneuver for training. Obviously such a design will waste a great deal of student time, instructor time, and device time in each of thousands of training sessions throughout its life cycle. Lengthy set-up procedures lower simulator training efficiency by reducing the ratio of productive training time to total simulator time. They further lower efficiency and effectiveness by introducing continual delays in training which have an unavoidable effect on student and instructor concentration and motivation.
INSTRUCTOR CONSOLE
Design Approach
A high level of partial automation is used to achieve a very low level of workload for the most frequent case of routine or standardized training. A moderate level of automation is used to achieve a relatively low level of workload for the less frequent case of partially specialized training. A low level of automation is used for the relatively rare case of fully specialized training, resulting in a level of workload for this type of training which approaches that found on current consoles.
The application of automation to the simplification of console procedures is thus in accordance with a frequency-of-use approach, in that the size of the development effort to simplify procedures is proportional to how often the procedures are used. This results in a console that is "managed by exception" in that the instructor has to do very little unless he wishes to deviate from the standard "default" condition. From a different standpoint, the console design makes use of the principle of least effort in a way that should result in a much higher level of training standardization than is realized with current simulators. For example, the console design makes it easiest of all for the instructor to conduct the standard training specified in the syllabus and lesson plan; harder but still relatively convenient to conduct partially specialized training; and least convenient of all to conduct fully specialized training. Thus, the more the instructor deviates from the syllabus or lesson plan, the harder he has to work. In this way the design strongly encourages standardized training but makes it possible for the instructor to introduce specialized training whenever he decides that it is necessary.
Physical Configuration
The general physical configuration of the instructor console consists of an interactive control/display module, a secondary display module, and a two-axis controller, as shown in Figure 1. The interactive module consists of a 21" calligraphic CRT with a transparent membrane-type on-screen touch panel or "touch screen", and several off-screen membrane-type illuminated touch switches, where individual switches are created from a continuous matrix of switches by making cutouts where desired in a thick, e.g., 1/8", plastic overlay. The secondary display module consists of a second 21" calligraphic CRT plus a smaller CRT included primarily for the purpose of presenting the instructor with the view seen on and through the student's HUD.

The configuration shown in Figure 1 was designed for use with an instrument flight trainer (IFT), operational flight trainer (OFT), or air combat maneuvering trainer (ACMT). The interactive display module by itself can be used as the instructor console for a cockpit procedures trainer (CPT), where it can be swiveled to face the student for self-instruction. For an IFT or OFT, the console
stop when the instructor pushes the STOP switch or when the run time reaches some prespecified maximum stored in the task module for that maneuver. When the run stops, the event module will start setting up the simulator for the next run to be flown, typically a student practice run ("live run") for the same maneuver. After the standard number of student practice runs have been flown for the first maneuver, the event module will start setting up the simulator to fly the first run of the second maneuver. This procedure is repeated until all maneuvers have been flown. Unless he wants to deviate from the standard lesson plan, the only inputs the instructor has to make are to push POSITION SIM to initialize each run, START to start each run, and NEXT MANEUVER to proceed to the next maneuver.
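The default run sequencing an event module would command can be sketched as a generator over its stored lesson plan. The maneuver names and trial counts below are hypothetical, and only the standard (non-deviating) path is modeled.

```python
# An event module: ordered maneuvers with the type and number of runs
# to fly for each (names and counts are hypothetical).
EVENT_MODULE = [
    {"maneuver": "lazy 8", "demo_runs": 1, "practice_runs": 3},
    {"maneuver": "barrel roll", "demo_runs": 1, "practice_runs": 2},
]

def run_sequence(event_module):
    """Yield the default run order the event module would command:
    each maneuver's demonstrations, then its live practice runs,
    then on to the next maneuver; the instructor only presses
    POSITION SIM, START, and NEXT MANEUVER."""
    for entry in event_module:
        for _ in range(entry["demo_runs"]):
            yield ("demonstration", entry["maneuver"])
        for _ in range(entry["practice_runs"]):
            yield ("live run", entry["maneuver"])

runs = list(run_sequence(EVENT_MODULE))
assert runs[0] == ("demonstration", "lazy 8")
assert len(runs) == 7  # 1 demo + 3 runs for lazy 8, 1 demo + 2 runs for barrel roll
```
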
Although it may appear at first that the event modules are too restrictive, a variety of ways are provided to permit the instructor to deviate from the standard lesson plan to tailor training to the needs of the individual student. As with task modules, the difficulty of deviating from the standard lesson plan for an event is inversely proportional to the estimated frequency of the need for such deviation. Thus, modifying the number of trials within the prespecified minimum and maximum number of trials is very easy. Modifying the lesson plan to provide training on a maneuver which is not even in the lesson plan is relatively complex, although still fairly straightforward.
On the surface it may seem desirable to let a student practice each maneuver or task until he reaches the specified or customary level of performance for that point in training, regardless of the number of trials called for in the syllabus or lesson plan. In practice, however, most large-scale training programs permit only slight deviations from the planned number of trials for each maneuver. This is due to the fact that large deviations have a chain reaction effect on subsequent syllabus events. For example, excessive time spent on one maneuver will usually result in insufficient practice on one or more of the other maneuvers scheduled for the same event. The student is then faced with the problem of going to the next regularly scheduled simulator or aircraft training event without being able to perform these other maneuvers at the expected level. This is yet another example of how individualized progression, although desirable from a theoretical standpoint, creates serious problems in large-scale training systems. Recent advancements in dynamic scheduling techniques may someday provide an answer to this dilemma.
Provisions for Routine vs Specialized Training. The proposed design makes explicit provisions for three types of training: routine training, partially specialized training, and fully specialized training.
In the context of individual maneuver training, routine training consists of training the student to perform one of the standard versions of the maneuver called for in the flight syllabus. Partially specialized training in this context refers to tailoring training to the needs of the student by having the student fly a relatively common variation of one of the standard versions of the maneuver, e.g., under different visibility conditions or with a different c.g. Fully specialized training refers to training the student to fly uncommon variations on one of the standard versions of the maneuver, e.g., in turbulence with one engine out, etc.
In the context of the training of an entire event, routine training refers to conducting the training exactly as specified in a pre-established lesson plan, in terms of the maneuvers trained, the sequence of maneuvers trained, the type of runs or "learning events" used to train each maneuver, and the number of runs of each type flown. Partially specialized training here refers to making relatively slight variations in the lesson plan, e.g., varying the number of student practice runs for a particular maneuver within pre-established minimum and maximum limits. Fully specialized training here refers to making major changes in the lesson plan, such as exceeding the maximum number of runs specified for a given maneuver or introducing a maneuver which is not in the lesson plan.
As discussed above, the console is designed so that routine training, which should be completely adequate most of the time, requires an absolute minimum of instructor inputs. Partially specialized training, needed occasionally, requires a moderate number of instructor inputs. Fully specialized training, needed only on rare occasions, requires a relatively high number of manual inputs. In comparison with current consoles, instructor workload should thus be very low for routine training, higher but still relatively low for partially specialized training, and should approach that of current consoles only for the relatively infrequent case of fully specialized training.
Provisions are also made to record the number and nature of each partially or fully specialized maneuver or task used by the instructor to identify: 1) standard variations of maneuvers and tasks which are used frequently enough that they should be added to the syllabus to become part of standardized training; and 2) nonstandard variations of maneuvers and tasks which are used frequently enough that they should be added to the menus of standard variations available to the instructor for conducting partially specialized training.
Provisions for Different Run Types. The proposed console design also makes explicit provisions for four basic types of runs or "learning events" which are used in simulator training: IP demonstration runs, automated demonstration runs, student practice runs, and replay runs.
In IP demonstration runs the instructor pilot flies the simulator from the instructor console using the inside-out and outside-in graphic displays and the two-axis controller. This feature was included to permit the instructor to actively fly the simulated aircraft in order to illustrate facets of performance that are not adequately illustrated by any of the automated demonstrations. Automated demonstration runs are runs flown by an interactive autopilot or by a "canned" tape. Student practice runs ("live runs") and replay runs are self-explanatory. The use of the instructor console to set up and control each of these types of runs is described in the following sections.
Controls. Figure 2, above, illustrates the three basic types of controls provided on the instructor control/display module. These are:

1. A transparent CRT touch screen used to select items from alphanumeric or pictorial menus displayed on the CRT.
2. Dedicated function switches located both above and below the CRT. These are grouped into five subpanels corresponding to five basic functions performed by the instructor: simulator status monitoring, event control, maneuver or task control, number entry, and malfunction insertion. The switches used to perform each of these functions are discussed below.
3. A two-axis controller located to the right of the control switch slant panel. (For use with an IFT, OFT, or ACMT; not required for a CPT.) The two-axis controller can be used by the instructor a) to fly the student's aircraft during manual demonstrations of correct maneuver performance ("IP demos"); b) to manually control the simulated enemy aircraft during ACM and gunnery training; and c) to fly one of the friendly aircraft during close formation or traffic pattern training.
Five types of dedicated function indicator switches were defined for the console: 1) those used to control and monitor the status of the simulator itself; 2) those used to identify and modify the event to be trained; 3) those used to monitor and control the training of individual maneuvers and tasks; 4) those used to enter numbers into the computer (e.g., to change the value of parameters defining initial conditions, etc.); and 5) those used to insert, monitor, and remove malfunctions in the simulated aircraft. The subpanels containing each of these types of switches are shown below in Figures 3 and 4.
The Simulator Status Panel, the Numeric Entry Panel, and the Malfunction Select Panel are self-explanatory. The functions associated with the Event Control Panel and the Maneuver Control Panel are described below.
Event Control Panel. The Event Control Panel is used as follows. Pressing the STAGE switch causes a menu of the different stages of training to appear on the CRT display, e.g., Instruments, Formation, ACM, etc. Indicating a specific stage with the touch screen calls up a menu of the different training events (simulator sorties) which must be flown during that stage. Indicating a specific event on this menu causes the detailed lesson plan for that event to be displayed, showing the sequence of maneuvers and tasks to be trained, and the minimum, standard, and maximum number of trials for each maneuver or task. If the lesson plan is more than one page long, pressing a point on the bottom right (or left) corner of the touch screen will cycle forward (or backward) through the pages in the menu. Indicating a specific event also causes the event module for that event to be loaded into main memory. Pressing the START EVENT switch then calls up the task module for the first maneuver or task to be trained and starts setting up the simulator for the first trial of the first maneuver or task. (Where desirable, this could be mechanized so that an instructor could start an event from somewhere other than the beginning by simply touching the screen to indicate the first maneuver or task to be trained before pressing the START EVENT switch.)
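The stage-to-event-to-lesson-plan drill-down can be sketched as follows. The syllabus contents and the class interface are invented for illustration; only the switch names (STAGE, START EVENT) and the menu behavior come from the text.

```python
# Hypothetical sketch of the Event Control Panel drill-down
# (syllabus contents and interface invented for illustration).

SYLLABUS = {
    "Instruments": {"Event I-1": ["Steep Turns", "Unusual Attitudes"]},
    "ACM": {"Event A-3": ["Barrel Roll Attack", "High Yo-Yo"]},
}

class EventControlPanel:
    def __init__(self, syllabus):
        self.syllabus = syllabus
        self.stage = None
        self.loaded_event_module = None

    def press_stage(self):
        """STAGE switch: display the menu of training stages."""
        return sorted(self.syllabus)

    def touch_stage(self, stage):
        """Touching a stage calls up the menu of events for that stage."""
        self.stage = stage
        return sorted(self.syllabus[stage])

    def touch_event(self, event):
        """Touching an event displays its lesson plan and loads the event
        module (here, just the ordered maneuver list) into main memory."""
        self.loaded_event_module = self.syllabus[self.stage][event]
        return self.loaded_event_module

    def press_start_event(self, first_maneuver=None):
        """START EVENT: begin with the first maneuver, or with an
        instructor-selected starting maneuver."""
        plan = self.loaded_event_module
        start = first_maneuver or plan[0]
        return plan[plan.index(start):]
```

The optional `first_maneuver` argument mirrors the parenthetical suggestion above about starting an event somewhere other than the beginning.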
The LESSON PLAN switch shown in Figure 4 can be used to call up the lesson plan for review at any time during the event. The lesson plan can also be used as a menu to conduct specialized training by introducing maneuvers and tasks out of sequence, e.g., to skip a
FIGURE 3. SIMULATOR STATUS PANEL
[Figure 4 panel layouts: the Event Control Panel (STAGE MENU, EVENT MENU, LESSON PLAN, and START EVENT switches); the Maneuver Control Panel and two-axis sidestick controller (IP DEMO, AUTO DEMO, LIVE RUN, REPLAY, TIME OPTIONS, MANEUVER CONDITIONS, POSITION SIM, START, FREEZE, STOP, DEMO VOICE OFF, NEXT MANEUVER, and EXTRA TRIAL switches); and a representative Malfunction Select Panel (COM/NAV, INSTR, ENG, FUEL, ELECT, HYDRAUL, and AUX malfunction categories with an INSERT switch).]

FIGURE 4. LOWER SUBPANELS
section of the plan, or to return to a maneuver already completed. Similarly, event menus and stage menus can be used to introduce maneuvers normally practiced only in other events, or even maneuvers normally practiced only in other stages. Again, the more radical the departure from the standardized training prespecified for the event, the greater the number of procedural steps required.
Maneuver Control Panel. The Maneuver Control Panel is used as follows. The POSITION SIM, START, FREEZE, and STOP switches were carefully selected as a minimal, necessary, and sufficient set of simulator run controls on
the basis of our experience in conducting simulator training experiments over the past several years. The function of these switches is described below in Figure 9, which presents a detailed sequential description of instructor control inputs and display indications for a standard prerecorded maneuver demonstration. These controls function in exactly the same way for an instructor-flown demonstration (IP DEMO), autopilot or prerecorded demonstration (AUTO DEMO), student practice run (LIVE RUN), or replay. The functions of the NEXT MANEUVER and EXTRA TRIAL switches are described below in the discussion of the Quantitative Performance Display.
Displays. Dynamic "inside-out" and "outside-in" pictorial representations of the student's aircraft and its relation to other aircraft or ground features are now relatively common, at least for certain maneuvers. For example, many current instructor consoles can show dynamic two-dimensional pictorial representations of aircraft glideslope and localizer error. Similarly, dynamic three-dimensional perspective outside-in and inside-out pictorial representations of ACM engagements like those developed by Cubic for the Tactical Air Combat Training System/Air Combat Maneuvering Range (TACTS/ACMR) have been developed for ACM simulator instructor consoles by Northrop and other companies. (Spring, 1970) In the proposed design, dynamic inside-out and outside-in views of the maneuver being flown are combined with a number of special purpose displays to provide the instructor with the essential information required at each point in training. Figure 5 shows an outside-in display of an ACM engagement. The numbers in Figure 5 show the relative position of attacker and bogey at four different times during the engagement.
FIGURE 5. OUTSIDE-IN DISPLAY OF ACM ENGAGEMENT
As noted previously, the generalized (all except CPT) configuration of the instructor console has three CRT displays. These are:

1. A small CRT at the top of the left-hand or "secondary display" module. This is normally used as a repeater display to show the instructor the view through the HUD and the HUD symbology being seen by the student.

2. A 21" CRT on the secondary display module. During a run this is normally used to present a dynamic outside-in pictorial representation of the student's aircraft and its relation to other aircraft or features on the ground.

3. A 21" CRT on the interactive control/display module. During a run this is normally used to present a dynamic inside-out pictorial representation of the view through the windscreen of the student's aircraft. Between runs this CRT is used to present special-purpose displays which provide the instructor with a variety of additional information required to set up, conduct, monitor, and evaluate training.
A description of the primary special-purpose displays developed during the design study is presented in the following sections.

Maneuver Description Display. A sample Maneuver Description Display is shown in Figure 6. This display is automatically presented to the instructor at the beginning of training for each new maneuver or task. The display shows:

1. The type and number of demonstrations to be flown, and the minimum, standard, and maximum number of student practice or "live" runs to be flown following the demonstrations.

2. The standard or "desired" time of occurrence of critical points or "windows" which normally occur during the maneuver, on a time scale appropriate to the maneuver.

3. A detailed description of the performance standards, initial conditions, and performance conditions for the maneuver to be flown.

The purpose of the Maneuver Description Display is to refresh the instructor's memory as to a) the way in which the maneuver should be flown and b) the specific criteria which should be used to evaluate student performance.
Quantitative Performance Display. A typical Summary Quantitative Performance Display is shown in Figure 7. This display is automatically presented to the instructor at the end of each trial. The display shows:

1. An indication of the training status for the maneuver. This part of the display is unchanged from that shown in the Maneuver Description Display, except that the number of the trial just completed is brightened to show the instructor the number of trials completed and the number yet to be trained.

2. Both the actual and desired (standard) time of occurrence of each critical point or "window" for the maneuver. This display was added because of analytical studies which suggest that the times of occurrence of critical points in a maneuver are sensitive indicators of overall maneuver performance.
3. A description of the most important out-of-tolerance conditions detected at each
BARREL ROLL ATTACK     TRIALS: DEMO-1  LIVE-1 2 3 4 5  (MIN / STD / MAX)

TIME (SEC):  0   15   30   45   60   75   90   (ACTUAL / STANDARD)
WINDOWS:          RI   LR   PU   INV  END

MANEUVER: BARREL ROLL ATTACK FROM ABEAM PERCH

PERFORMANCE STANDARDS: END UP BEHIND BOGEY WITH:
  AOT = 0 ± 15°
  NOSE-TAIL = 1500 ± 200 FT
  AIRSPEED = CO-SPEED ± 15 KIAS

INITIAL CONDITIONS:
  AOT = 90°
  NOSE-TAIL = 0
  SLANT RANGE = 1.5 MI
  VERTICAL SEPARATION = +2000 FT
  AIRSPEED = 350 KIAS

PERFORMANCE CONDITIONS:
  FUEL = 2000 LBS
  BOGEY AIRSPEED = 350 KIAS

FIGURE 6. MANEUVER DESCRIPTION DISPLAY
window or in the segment or "corridor" immediately following that window. The out-of-tolerance conditions for windows and corridors are detected by the automated performance monitoring system discussed in the following section.

As illustrated in Figure 7, the Summary Quantitative Performance Display shows out-of-tolerance conditions detected for each window and succeeding segment. Each out-of-tolerance condition is indicated by a very brief English-language description of the condition, followed by the observed value of the parameter and the standard or desired value in parentheses. For example, Figure 7 shows that the pull-up occurred very late, that it was initiated when the attacker was 1/2 mile from the bogey, and that it should have been initiated when the attacker was approximately 7/8 of a mile from the bogey. The values in parentheses serve to continually remind the instructor of the quantitative performance standards established for the maneuver.
It would of course be possible to present this same display to the student by projecting it on the simulator's image presentation system or by displaying it on the student's multifunction display in the cockpit. This feature could be used as a tool to assist the instructor in providing feedback to the student. With appropriate controls added to the cockpit this could also be used to provide the simulator with a capability for self-instruction, which can be highly efficient when alternated with conventional instruction.
It should be noted that the display presents only the two or three most important out-of-tolerance conditions detected for each window and accompanying segment. This is done to prevent overloading the instructor with too much information. It is done by means of a table look-up system which identifies the most important errors by consulting a table containing the prespecified rank order of importance of each of the known errors for each window and segment for the maneuver. This table is part of the automated performance monitoring algorithm, which is in turn part of the task module for the maneuver. If desired, the instructor can obtain additional performance information for any window and segment by simply touching the name of the window. This causes a Detailed Quantitative Performance Display of all the out-of-tolerance conditions detected for that window and segment to be displayed. This detailed display for a given window and segment would probably group out-of-tolerance conditions by parameter type to make it easier for the instructor to read, e.g., under headings such as "Energy Parameters" and "Relative Geometry Parameters."
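The table look-up just described can be sketched as follows. The rank table and error names below are invented for illustration; only the two-or-three-error limit per window comes from the text.

```python
# Sketch of the rank-order table look-up that limits the summary display
# to the most important errors per window (table contents hypothetical).

# Prespecified rank order of known errors for one window of one maneuver,
# as would be stored in the task module (lower rank = more important).
RANK_TABLE = {
    "PULL UP": {"LATE PULL UP": 1, "LOW LEAD": 2, "EXCESS G": 3,
                "LOW A/S": 4, "HIGH BANK": 5},
}

def summarize(window, detected, limit=3):
    """Return the most important detected errors for a window, in the
    prespecified order of importance, truncated to `limit` entries."""
    ranks = RANK_TABLE[window]
    ordered = sorted(detected, key=lambda err: ranks.get(err, len(ranks) + 1))
    return ordered[:limit]
```

The Detailed Quantitative Performance Display would simply bypass the truncation and group the full `detected` list by parameter type.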
The time scale at the top of the display has a second, possibly very useful function.
BARREL ROLL ATTACK     TRIALS: DEMO-1  LIVE-1 2 3 4 5  (MIN / STD / MAX)

TIME (SEC):  0   15   30   45   60   75   90   (ACTUAL / STANDARD)
WINDOWS:          RI   LR   PU   INV  END

ROLL IN

LOW REVERSAL
  LATE  3/4 MI (1)

PULL UP
  VERY LATE  1/2 MI (7/8)
  LOW LEAD   10° (15°+)
  EXCESS G   5G

INVERTED
  LOW A/S    270 KIAS (300 ± 20)
  HIGH TCA   45° (30 ± 5)

END POSITION
  VERY HIGH AOT   20° (0 ± 15)
  HIGH NOSE-TAIL  2500 FT (1500 ± 200)
  LOW A/S         325 KIAS (375)

GRADE: 0 1 2 3 4

FIGURE 7. SUMMARY QUANTITATIVE PERFORMANCE DISPLAY
By pressing any two points on the scale the instructor can preset the starting and stopping points of any automated maneuver demonstration or replay. This might be done to save time or to highlight some aspect of maneuver performance. Thus, in the example shown in Figure 7, the instructor could use this technique to quickly set up the simulator to replay a just-completed live run from a point just before the low reversal to a point just after the inverted position, in order to illustrate some detail of correct or incorrect performance in that part of the maneuver.
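Mapping the two touch points to run times is straightforward. The sketch below assumes a 0-90 second scale like the one in Figure 7 and an invented screen geometry; it is illustrative only.

```python
# Sketch of presetting replay start/stop points from two touch points on
# the time scale (scale geometry assumed; 0-90 s scale as in Figure 7).

SCALE_LEN_SEC = 90.0  # time scale runs from 0 to 90 seconds

def preset_segment(x1, x2, scale_px=900):
    """Convert two touch x-coordinates on the time scale into ordered
    (start, stop) times for a demonstration or replay."""
    t1 = min(max(x1, 0), scale_px) * SCALE_LEN_SEC / scale_px
    t2 = min(max(x2, 0), scale_px) * SCALE_LEN_SEC / scale_px
    return (min(t1, t2), max(t1, t2))
```

Ordering the two times means the instructor can press the points in either order; clamping keeps touches off the ends of the scale within the run.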
A similar capability can be provided for live runs by developing additional software to enable the computer to pick off initial conditions, etc., from intermediate points in a canned demonstration of the maneuver. The ability to set arbitrary start and stop points for live runs provides an inherent capability for backward chaining and similar instructional techniques, as discussed below in the section on instructional features.
Normally, as soon as the standard number of student practice trials ("live runs") have been completed for a given maneuver, the instructor will debrief the student and then press the NEXT MANEUVER switch. This will cause the simulator computer to call up the task module for the next maneuver specified in the event module and start setting up the simulator for the first trial of the next maneuver, typically a demonstration. If the standard number of practice runs have not been completed but the instructor decides that no further practice is necessary, and if the number of trials completed is equal to or greater than the minimum shown on the display, the instructor can move directly to the next maneuver by pressing the NEXT MANEUVER switch. Similarly, if the standard number of practice runs have been completed but the instructor believes the student needs more practice, and if the number of trials flown is less than the maximum shown on the display, the instructor can set up an extra trial on the same maneuver simply by pressing the EXTRA TRIAL switch. In this way the instructor can easily vary the number of trials flown within prespecified limits to tailor training to the student's needs. He cannot exceed these limits without going through a more complex procedure. This feature is designed to make it difficult to reduce the number of trials to the extent that a student receives little or no practice on a maneuver, or to increase the number of trials to the extent that insufficient time is left for training on remaining maneuvers. Here again, the actual number of trials flown on each maneuver could be recorded to provide an empirical basis for modifying the minimum, standard, and maximum number of trials specified in the event module.

In addition to enabling the instructor to keep track of where he is in the training for each maneuver, the part of the display showing the trials completed and trials remaining is used by the instructor in connection with the NEXT MANEUVER and EXTRA TRIAL switches shown above in Figure 4.
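The limit checks behind the NEXT MANEUVER and EXTRA TRIAL switches reduce to two comparisons against the minimum and maximum stored for the maneuver. A sketch (function names invented):

```python
# Sketch of the trial-limit checks for the NEXT MANEUVER and EXTRA TRIAL
# switches (function names invented; limits come from the task module).

def can_skip_to_next(trials_flown, minimum):
    """NEXT MANEUVER is honored early only once the minimum is reached."""
    return trials_flown >= minimum

def can_add_extra_trial(trials_flown, maximum):
    """EXTRA TRIAL is honored only while below the maximum."""
    return trials_flown < maximum
```

Exceeding either limit requires the separate, deliberately more complex procedure described in the text.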
Other Displays. Several other displays were developed to reduce instructor workload for various tasks. For example, two-dimensional tabular menus were developed to enable the instructor to quickly set up the desired mode of controlling each aircraft in a three-ship formation, or in two-on-one or one-on-two ACM engagements. This type of display is illustrated in Figure 8.
AIRCRAFT    CONTROL MODE
ATTACKER    AUTO / STUDENT / IP / OTHER SIM
WINGMAN     AUTO / STUDENT / IP / OTHER SIM
BOGEY       AUTO / STUDENT / IP / OTHER SIM

FIGURE 8. MENU DISPLAY FOR SELECTING METHOD OF CONTROLLING EACH AIRCRAFT FOR ACM
Detailed Functions Analyses. Detailed functions analyses have been developed showing exactly how the Event Control Panel, Maneuver Control Panel, Maneuver Description Display, and Summary Quantitative Performance Display would be used to carry out each of the following instructional functions and subfunctions:

Identify event to be trained

Conduct standardized training
    demonstrate maneuver
    instruct live run
    replay live run

Conduct partially standardized training
    modify start-stop points
    modify number of trials per maneuver
        normal transition
        skip to next maneuver
        give extra trials on current maneuver
    introduce standard variation of maneuver

Conduct fully specialized training
    introduce nonstandard variation of maneuver
    introduce maneuver out of sequence
    introduce maneuver not contained in event module

Insert and remove malfunction

Specify method of controlling each aircraft for ACM and formation flights

A sample functions analysis for demonstrating a maneuver is shown in Figure 9. This type of analysis is an important first step in developing instructor console software in that it can fairly easily be translated into a computer flow diagram.
Automated Performance Monitoring (APM) System
As discussed in the previous section, the automated performance monitoring (APM) system generates a quantitative performance display at the end of a maneuver which summarizes the most important out-of-tolerance conditions detected during the maneuver. An example of a quantitative performance display is shown above in Figure 7. This information is presented to the instructor as an aid for evaluating and diagnosing student performance on the maneuver.
The APM system is based primarily on techniques developed by Northrop, Vought, and Systems Technology, Inc. (Carter, 1976; Carter, 1977; Sepp, 1977; Heffley et al., 1982). It also uses APM concepts developed by Appli-Mation, Inc., the Canyon Research Group, and Vreuls Research, Inc. (Semple et al., 1979; Vreuls et al., 1975).
The system is designed to detect three types of errors: flight path errors, control technique errors, and procedural (switch sequence) errors. In all cases, errors or out-of-tolerance conditions are detected by comparing the observed value of a parameter for a specific point or segment in a maneuver or task with the standard value and tolerance limits stored in computer memory. Flight path errors, e.g., "excessive airspeed at roll in," are currently being measured in this way by the Vought A-7 NCLT APM system now in operational use at NAS Lemoore. Control technique errors, e.g., "controlling attitude instead of Nz," can be measured in this way by pilot behavior identification techniques like those developed by STI, which were used in Northrop simulation studies to evaluate the effectiveness of alternate training aircraft configurations for carrier landing. Procedural errors, e.g., "arming switch thrown out of sequence," are now being detected in this manner by the EA-3 Low-Cost CPT developed by Appli-Mation, Inc., and several other systems.
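The basic comparison underlying all three error types can be sketched as a single tolerance check. The record format below imitates the Figure 7 style, but the function and its output format are otherwise assumptions.

```python
# Sketch of the core APM check: compare an observed parameter value
# against the standard value and tolerance stored for a window or segment
# (parameter names and output format illustrative).

def check_tolerance(name, observed, standard, tolerance):
    """Return an out-of-tolerance record, or None if within limits."""
    deviation = observed - standard
    if abs(deviation) <= tolerance:
        return None
    direction = "HIGH" if deviation > 0 else "LOW"
    return f"{direction} {name} {observed} ({standard} +/- {tolerance})"
```

Each window of each maneuver would apply this check to its stored list of parameters, with the resulting records then ranked and truncated for the summary display.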
The potential impact of APM techniques on the effectiveness and efficiency of simulator training has been demonstrated by studies on the inter- and intra-observer reliability of instructor judgments. (Carter, 1976; Knoop & Welde, 1973) These studies show that instructor judgments of overall maneuver performance are fairly reliable, but that instructor judgments relating to the acceptability of specific parameters at specific points or over specific segments during the maneuver are highly unreliable. Since these judgments are the basis for the instructor's diagnosis of the causes of substandard maneuver performance, it follows that much of the feedback currently given to students is in error. The effect of this on the effectiveness and efficiency of training can be understood by considering the effect of telling a student he pulled up late when he actually pulled up early, which is precisely what the data indicate is happening.

CONDUCT STANDARDIZED TRAINING

Following the identification of the event to be trained, the instructor conducts a standardized training event by using the Maneuver Control Panel and the special purpose displays as follows:

1. Demonstrates Maneuver (AUTO DEMO)
  a. Presses "POSITION SIM" switch, which causes the "POSITION SIM" light to start blinking and the computer to start setting up the simulator for the prerecorded demonstration specified for this maneuver in the task module.
  b. ("POSITION SIM" switch comes on steady and first frame of HUD, flight path, and instrument repeater dynamic displays are displayed on instructor console CRTs, indicating that simulator is ready to fly.)
  c. Comments on the demonstration to be flown and verifies that the student is ready.
  d. Presses "START" switch to start the prerecorded demonstration.
  e. Makes comments during the demonstration.
  f. Presses "FREEZE" switch to freeze demonstration in mid-air. (This freezes the visual scene, motion platform, g-seat, g-suit, instruments, etc., in the position existing at time of freeze.)
  g. Presses "START" switch to restart prerecorded demonstration from frozen position.
  h. Presses "STOP" switch, or demonstration reaches auto stop point, which:
    (1) Turns off visual scene and returns motion platform, g-seat, g-suit, instruments, etc., to a neutral position.
    (2) Freezes last frame of HUD and flight path displays.
    (3) Causes Summary Quantitative Performance Display and Grading Scale Display to appear on the interactive CRT.

FIGURE 9. SAMPLE FUNCTIONS ANALYSIS
features were classified into two broad types: nonautomated and automated.
Nonautomated Features. Figure 10 illustrates the results of an analysis performed to evaluate a variety of nonautomated instructional features identified as candidates for simulator training. In general, the features shown as rejected were rejected because of a total lack of empirical data on their training effectiveness.
Backward chaining has been found to be a highly effective simulator training technique in an experimental setting. (Hughes, 1979) It is therefore recommended for IFT, OFT, and ACMT training pending further investigations of its cost-effectiveness in military pilot training programs. The instructor console feature used to set up arbitrary start and stop points for live runs provides a basic capability for backward chaining and similar instructional techniques, as discussed above in the section on instructor console displays.
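Backward chaining orders the training sequence from the final segment backward, progressively adding earlier segments. A sketch (segment names invented):

```python
# Sketch of backward chaining: the last segment of a maneuver is trained
# first, with earlier segments progressively added (segment names invented).

def backward_chain(segments):
    """Return the training steps: the final segment alone, then the final
    two segments, and so on back to the full maneuver."""
    return [segments[start:] for start in range(len(segments) - 1, -1, -1)]
```

Each step's starting segment supplies the arbitrary start point set on the console, with the run always continuing to the end of the maneuver.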
The proposed APM system is designed to correct this problem by providing instructors with objective data which can be used to give students more accurate feedback. This same data can be recorded to provide an objective data base for improved management of student progress, improved student selection, and more accurate evaluation of total system performance.
Although the benefits of the proposed APM system are extensive, the cost of developing and perfecting an APM algorithm for a single complex maneuver can be quite high. For large-scale training systems a phased APM development may be required to spread these costs over time. In this approach algorithms are first developed only for selected, sensitive maneuvers. Algorithms for additional maneuvers can then be developed as part of an ongoing ISD effort.
INSTRUCTIONAL FEATURES
A review of current literature on instructional features served as a starting point for the identification and evaluation of candidate instructional features (e.g., Bailey & Hughes, 1980; Caro et al., 1979, 1980; Hughes, 1978, 1979; Lintern & Gopher, 1977; Lintern, 1978; Semple et al., 1980; Shaw, 1979; and Weller, 1979). Candidate instructional
Fast time has been found to be a very effective simulator training technique at NASA-Ames for preparing NASA test pilots for the severe time-compression effects encountered in actual test flights. Fast time is a technique whereby the program integration interval is greater than the elapsed real-time interval, causing frequencies and velocities to increase, i.e., things happen more quickly than in real flight. Fast time is thus a form of overtraining which could conceivably increase the training effectiveness of IFT and OFT training by better preparing students for the increased stress and workload associated with aircraft flights. Like other types of overtraining, it could also result in excessive simulator training time. While current simulator computers have an inherent capability for fast time, the cost of upgrading image generation computers, motion system servos, and other hardware to achieve the required frequency response would be very high. This technique is therefore not recommended pending a more direct demonstration of its effectiveness for military flight training.

FEATURE — DISPOSITION

1. PARAMETER FREEZE (capability for freezing one or more selected parameters, e.g., range to target) — REJECTED
2. BACKWARD CHAINING (capability for training last segment in a maneuver first, progressively adding earlier segments) — IFT, OFT AND ACMT (SEE DISCUSSION)
3. FAST TIME (capability for flying simulator at faster than real time) — REJECTED
4. SLOW TIME (capability for flying simulator at slower than real time) — REJECTED
5. NONINTERACTIVE DYNAMIC VISUAL SCENE FOR CPT (display enabling student to correlate procedures with the sight picture, e.g., for learning traffic pattern procedures) — REJECTED
6. INSTRUMENT BLANKING (capability for blanking out selected instruments for scan training) — REJECTED
7. EYE-TRACK DISPLAY (dynamic display showing instructor where student is looking at each moment) — REJECTED
8. MANUAL REPOSITIONING (capability for instructor to change attitude and altitude of student's aircraft during freeze to illustrate change in the sight picture; a type of parameter freeze) — REJECTED
9. AUGMENTED VISUAL CUES / PREDICTOR DISPLAYS (to be displayed on HUD) — OFT, ACMT

FIGURE 10. EVALUATION OF NON-AUTOMATED INSTRUCTIONAL FEATURES
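The fast-time relationship described above, an integration interval larger than the real-time frame interval, can be sketched with a toy timing loop. The numbers and model are illustrative only, not any actual simulator timing scheme.

```python
# Sketch of the fast-time technique: the model advances by an integration
# interval larger than the elapsed real-time interval per frame, so
# simulated events arrive sooner in wall-clock terms (toy model).

def simulate(duration_sim, dt_real=0.02, time_ratio=1.0):
    """Advance a trivial model through duration_sim seconds of simulated
    time; time_ratio > 1 gives fast time. Returns the wall-clock seconds
    consumed."""
    t_sim = 0.0
    t_wall = 0.0
    dt_sim = dt_real * time_ratio  # integration interval per frame
    while t_sim < duration_sim:
        t_sim += dt_sim    # model time advances faster...
        t_wall += dt_real  # ...than real time
    return t_wall
```

With a time ratio of 2, a 10-second maneuver segment completes in roughly 5 wall-clock seconds, which is why the surrounding hardware must sustain twice the frequency response.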
Landing-point predictor displays and augmented visual cues such as the "pole track" display are recommended as selectable OFT HUD displays for field and carrier landing training. The "pole track" is a display of successively shorter "poles" on each side of the glideslope, showing the correct glideslope and providing an external indication of aircraft velocity. The effectiveness of such displays for simulator training in field and carrier landings has been demonstrated in studies by Lintern (1978) and Weller (1979).
Electronic tracer bullets and funnel displays like those on the G.E. HUD are recommended as selectable ACMT HUD displays for air-to-air gunnery training. These displays have been shown to be extremely effective for teaching a student the effect of aircraft control inputs on the dynamics of the bullet stream. A continuously computed impact point, velocity vector, and bomb fall line are recommended as selectable OFT HUD displays for air-to-ground weapon delivery training. It is assumed that these HUD displays would duplicate those in the aircraft.
Automated Features. Figure 11 illustrates the results of an analysis performed to evaluate automated instructional features which were identified as candidates for an ACMT.
The capability for preprogrammed initial conditions and preprogrammed malfunctions is inherent in the task module concept. Techniques for using normal and emergency procedure task modules in conjunction with other types of task modules are described in Semple et al. (1980).
PREPROGRAMMED INITIAL CONDITIONS (TASK MODULE) -- selected for CPT, IFT, OFT, ACMT

PREPROGRAMMED MALFUNCTIONS (TASK MODULE) -- selected for CPT, IFT, OFT, ACMT

AUTOMATED GROUND CONTROLLER -- selected for IFT, OFT

PRERECORDED BOGIES AND FRIENDLY A/C -- selected for OFT, ACMT

MANEUVER-SPECIFIC AUTOPILOT BOGIES AND FRIENDLY AIRCRAFT -- selected for OFT, ACMT

GENERALIZED AUTOPILOT BOGEY -- selected for ACMT (advanced training only)

AUTOMATED PERFORMANCE MONITORING -- selected for CPT, IFT, OFT, ACMT

AUTOMATED AUDIO ALERTS DURING MANEUVER (TO INSTRUCTOR) -- selected for CPT, IFT, OFT, ACMT

AUTOMATED GRADING (AT THE INSTRUCTOR'S CONSOLE) -- selected for OFT only (recommended for field and carrier landing)

AUTOMATED FEEDBACK AFTER MANEUVER (TO STUDENT) -- selected for CPT only

AUTOMATED INSTRUCTION -- selected for CPT only

ADAPTIVE TRAINING (USE OF HUD SYMBOLOGY) -- selected for OFT only

AUTOMATED COACHING DURING MANEUVER (TO STUDENT) -- rejected

FIGURE 11. EVALUATION OF AUTOMATED INSTRUCTIONAL FEATURES
An automated ground controller similar to that developed for the F-14 OFT Instructional Support System is recommended for IFTs and OFTs for simulating the ground controller's voice instructions in ground-controlled approaches and carrier-controlled approaches. Instructor pilots consulted agreed that instructor pilots cannot accurately simulate ground controller voice instructions due to the specialized training and practice required. This feature should thus increase the training effectiveness of IFTs and OFTs while reducing instructor workload.
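As a rough illustration of what such a feature must do, the sketch below maps glideslope and course deviations to standard-phraseology advisories. It is a minimal example only: the deviation thresholds and wording are invented for illustration and are not taken from the F-14 OFT implementation.

```python
def gca_callout(glideslope_dev_ft, course_dev_deg):
    """Return a ground-controlled-approach advisory.

    glideslope_dev_ft: feet above (+) or below (-) the glideslope.
    course_dev_deg:    degrees right (+) or left (-) of final approach course.
    Thresholds (20 ft, 2 deg) are illustrative assumptions.
    """
    phrases = []
    if glideslope_dev_ft > 20:
        phrases.append("above glidepath")
    elif glideslope_dev_ft < -20:
        phrases.append("below glidepath")
    else:
        phrases.append("on glidepath")
    if course_dev_deg > 2:
        phrases.append("right of course, turn left")
    elif course_dev_deg < -2:
        phrases.append("left of course, turn right")
    else:
        phrases.append("on course")
    return ", ".join(phrases)
```

Driven each update cycle from the simulated aircraft state, the resulting strings would be routed to a voice-output device.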
Maneuver-specific "autopilot bogies" are recommended for controlling the simulated bogey and wingman in introductory ACM training and for controlling the lead aircraft, second aircraft, or third aircraft in close- and tactical-formation training. This approach was used with success in ACM simulator training experiments performed by Northrop for NADC in 1976 (Spring, 1976; Carter, 1976). The autopilot bogey is used for introductory training in classic maneuvers. Typically, it is a relatively simple algorithm which causes the bogey or friendly aircraft to react to a change in student aircraft position in the same way that the instructor would react at this stage of training. For example, a barrel roll attack bogey is programmed to tighten its turn as the student reduces angle-off tail and nose-to-tail, and to reverse when the student overshoots. Autopilot bogies increase OFT and ACMT training effectiveness in two ways: 1) they provide much more realistic training than is possible with noninteractive prerecorded bogies; and 2) they significantly reduce the need for the simulator instructor to fly the bogey aircraft in ACM and formation training, thus permitting the instructor to concentrate on observing and coaching student performance.
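A control law of this kind can indeed be only a few lines. The fragment below is a hypothetical sketch of a barrel-roll-attack bogey rule; the bank limits and the 60-degree angle-off scaling are invented for the example and do not come from the Northrop experiments.

```python
def bogey_bank_command(angle_off_deg, overshoot, max_bank=80.0):
    """Illustrative maneuver-specific bogey rule: tighten the turn
    (steeper bank) as the attacker's angle-off decreases, and reverse
    the turn direction when the attacker overshoots."""
    # Smaller angle-off -> steeper bank; 60 deg angle-off or more -> minimum turn.
    bank = max_bank * (1.0 - min(abs(angle_off_deg), 60.0) / 60.0)
    bank = max(bank, 30.0)                  # never shallower than 30 degrees
    return -bank if overshoot else bank     # sign flip reverses the turn
```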
The term "generalized autopilot bogey" is used here to refer to an autopilot bogey which is capable of flying an entire ACM engagement against a human pilot. This type of bogey is used in the Northrop Large Amplitude Simulator (LAS), the Simulator for Air-to-Air Combat (SAAC) at Luke AFB, and the 2E6 ACM simulator at NAS Oceana. A generalized autopilot bogey is recommended for ACMTs to be used in advanced training but would probably not be cost-effective for undergraduate ACM training.
As shown in Figure 11, an automated performance monitoring system is recommended for all four types of simulators. As described in the previous section, the APM system detects out-of-tolerance conditions during each trial of each maneuver or task, and displays this information to the instructor at the end of the trial. By providing more detailed and more accurate data on student performance, the APM system performs three essential functions. First, it increases the effectiveness of active instruction by providing the instructor with objective information which can be used as an aid in evaluating and diagnosing individual trials. Second, it puts the management of student progress through the syllabus on a more objective basis by providing instructors with more complete and accurate data on which to base grades. Finally, it provides a greatly increased system evaluation capability by providing operational ISD personnel with objective data on the effects of variations in training content, sequence, techniques, and equipment.
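The core of such out-of-tolerance detection can be sketched simply. The fragment below is an assumed implementation, not the APM design described here: it scans per-frame parameter samples against tolerance bands and summarizes violations for an end-of-trial display.

```python
def monitor_trial(samples, tolerances):
    """Sketch of automated performance monitoring.

    samples:    one dict of parameter values per simulation frame.
    tolerances: {parameter: (low, high)} acceptable band per parameter.
    Returns {parameter: (violation_count, worst_deviation)} for every
    parameter that left its band at least once during the trial.
    """
    summary = {}
    for sample in samples:
        for param, (low, high) in tolerances.items():
            value = sample[param]
            deviation = max(low - value, value - high)  # > 0 means out of band
            if deviation > 0:
                count, worst = summary.get(param, (0, 0))
                summary[param] = (count + 1, max(worst, deviation))
    return summary
```

At the end of the trial the summary would populate something like the Summary Quantitative Performance Display.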
Automated audio alerts are recommended for all four types of simulators to alert the instructor to certain critical student errors which might otherwise be missed. This enables the instructor to correct errors as soon as they occur to prevent formation of unsafe habits, such as pulling too many Gs. This is done by means of audio tones and/or digitally recorded voice messages.
Automated verbal coaching and cueing of the student during a maneuver is not recommended because of research evidence that such coaching is at best distracting, and at worst constitutes an irrelevant cue to correct maneuver performance which can be used consciously or unconsciously to "beat the system."
Automated grading is recommended for field and carrier landing training in OFTs because of the demonstrated agreement between A-7 NCLT Landing Signal Officer (LSO) grades and the grades assigned by the A-7 NCLT automated grading system developed by Vought.
A capability for fully automated Computer-Assisted Instruction (CAI) is recommended only for CPTs. At the completion of each procedure this system provides automatic feedback to the student by displaying the errors and the correct procedures on the instructor/student console CRT display. The system then instructs the student to reset all switches to their normal positions. Depending on performance to that point, it then instructs the student to perform the previous procedure again or to start training on a new procedure.
Adaptive training is recommended for OFTs in conjunction with an A-7 NCLT-type grading system to vary the difficulty of field or carrier landing as a function of student performance. Task difficulty is decreased by displaying a predictor display and augmented visual cues such as a "pole track" display on the student's HUD. Task difficulty is increased by removing these displays. While it is possible for the instructor to insert and delete these displays manually, the training effectiveness of these displays is greatest if they are inserted and deleted according to an optimized adaptive training algorithm (Lintern, 1978).
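One simple form such an adaptive rule could take is a hysteresis band on recent performance: remove the cues when scores are consistently high, restore them when performance drops. This is a hedged sketch with invented thresholds, not Lintern's (1978) algorithm.

```python
def adapt_cue_level(cue_on, recent_scores, lower=0.8, upper=0.9):
    """Illustrative adaptive-training rule for the augmented HUD cues.

    cue_on:        whether the pole track/predictor cues are displayed now.
    recent_scores: recent trial scores, e.g., fraction of time on glideslope.
    lower/upper:   assumed hysteresis thresholds (not from the source).
    Returns the cue state for the next trial.
    """
    avg = sum(recent_scores) / len(recent_scores)
    if cue_on and avg >= upper:
        return False          # student proficient: remove cues, raise difficulty
    if not cue_on and avg < lower:
        return True           # performance dropped: restore the cues
    return cue_on             # otherwise hold the current difficulty
```

The gap between `lower` and `upper` keeps the cues from toggling on every trial.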
Adaptive training in which task difficulty is varied by varying system dynamics (e.g., changing the stability of the simulated aircraft) has been shown to be ineffective and is not recommended (Shaw, 1979).
BRIEF/DEBRIEF CONSOLE
The brief/debrief console for the proposed ISS is essentially a CAI terminal interfaced to a videodisc system, which has an added video cassette playback capability for debriefing. It is assumed that the brief/debrief console would be interfaced to a computerized scheduling and record-keeping system.
Physical Configuration
The physical configuration of the console consists of a 21" CRT display with a transparent membrane-type touch screen and a set of dedicated function switches located on an off-screen membrane-type touch panel, where individual switches are located and grouped by means of cutouts in a thick plastic overlay. Ideally, the brief/debrief console would be identical to the CAI/videodisc terminals used for academic training, with the exception of the video cassette unit and additional function switches used only for debriefing. A detachable alphanumeric keyboard is required to permit the IP to enter nonstandard comments in the student's computer file.
Functional Design
The console is designed to be used for both simulator and aircraft flights. It supports five major instructional functions, as follows:

1. Event Planning. The console is used by instructors to determine student status; to review the student's performance in past events; to request schedule changes if necessary; and to identify minor changes in the standard lesson plan which could benefit the student.
2. CAI Briefing and Test. The console is used to give students a CAI review and test on academic and flight support materials relevant to the flight to be flown.
3. IP Briefing. The console can provide a hard copy of the CAI test results to be used by the IP as a starting point for the IP briefing which follows the CAI briefing. (A hard copy printer could be time-shared by several brief/debrief consoles.)
4. Debriefing. The console can be used for debriefing both simulator and aircraft flights, assuming that the aircraft has an onboard video recorder.
5. Data Entry. The console is used by the IP to enter grades and comments on the flight in the student's computer file.
Debriefing Displays
For simulator debriefings, a multiplexing scan converter is used to make a two-channel videotape record of what was shown during and after each maneuver on any two of the instructor console CRT displays. Thus one videotape channel will typically show a dynamic "inside-out" presentation of the pilot's view through the windscreen during the maneuver, including continuously changing digital readouts of key parameters on the edges of the display. This is followed by the Summary Quantitative Performance Display shown to the instructor at the end of the maneuver, plus any detailed quantitative performance displays requested at that time. The second channel will typically show a dynamic "outside-in" presentation of an outside observer's view of the maneuver, also including digital readouts. This is followed by a static graphic display of the desired and actual flight path flown. The instructor can switch back and forth between these displays as required during debriefing.
Debrief Procedure
For debriefing, the brief/debrief console is used by the IP and the student to review the videotape record of the aircraft or simulator flight to be debriefed. For both aircraft and simulator debriefings it is expected that only selected segments of the tape would be reviewed, since reviewing the entire tape would require as much time as the original flight. To make this process as efficient as possible, three different methods are provided to enable the IP to quickly get to a point of interest on the tape. First, he can slew directly to any electronic flags which he inserted on the tape during the event. Second, he can slew to a specific trial of a specific maneuver, using a display of the lesson plan as a menu. (This approach will be most effective for simulator debriefings, due to frequent changes in lesson plans which are dictated by events and conditions in the air.) Third, he can use fast-forward and rewind controls to slew to an approximate time in the flight, where the approximate flight time at each point on the tape will be continuously displayed on the screen. Once the approximate point is located by one of these three methods, the precise segment of the tape to be replayed will be controlled by conventional tape playback controls (fast-forward, rewind, stop, pause, and play). Detailed function analyses have been developed which describe the procedures used for slewing to a desired point on the tape with each method.
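The first two slewing methods amount to looking up tape times in indexes recorded during the event. The sketch below is a hypothetical data structure for doing so; the class and field names are invented for illustration.

```python
class TapeIndex:
    """Illustrative index for slewing a debrief tape to a point of interest.

    flags:       tape times (seconds) of instructor-inserted electronic flags.
    lesson_plan: {(maneuver, trial): tape_time} recorded as the event ran.
    """
    def __init__(self, flags, lesson_plan):
        self.flags = sorted(flags)
        self.lesson_plan = lesson_plan

    def next_flag(self, current_time):
        """Method 1: slew to the next flag after the current tape position."""
        for t in self.flags:
            if t > current_time:
                return t
        return None   # no flag remains ahead of this position

    def trial_start(self, maneuver, trial):
        """Method 2: slew to a specific trial chosen from the lesson-plan menu."""
        return self.lesson_plan.get((maneuver, trial))
```

Method 3 (approximate time via fast-forward/rewind) needs no index, only the continuously displayed tape time.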
Data Entry Procedure
The instructor grades each event using the console touch screen and an event-specific grading form displayed on the CRT. The grading form consists of a matrix of grading categories listed vertically down the left side of the display and a grading scale displayed horizontally across the top. To grade a specific category, the instructor simply presses the touch screen across from the category in the column corresponding to the desired grade. Standard comments on the student's performance are entered by pressing the touch screen next to one or more appropriate comments in a menu of standard comments. Nonstandard comments are typed in with the alphanumeric keyboard. It is assumed that all grades and comments entered by the instructor at the console would be automatically entered into a centralized computer file containing a record of the student's performance in each event.
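Decoding a touch on such a form reduces to a hit test on the category/grade matrix. The fragment below is a sketch only; the screen-geometry constants are assumptions, not the console's actual layout values.

```python
def decode_grade_touch(x, y, categories, grades,
                       x0=200, col_w=80, y0=100, row_h=40):
    """Map a touch at pixel (x, y) to a (category, grade) cell.

    Categories run down the left side (one row each); the grading scale
    runs across the top (one column each). x0/y0/col_w/row_h describe a
    hypothetical screen geometry.
    """
    col = (x - x0) // col_w
    row = (y - y0) // row_h
    if 0 <= row < len(categories) and 0 <= col < len(grades):
        return categories[row], grades[col]
    return None   # touch fell outside the grading matrix
```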
CURRENT STATUS AND FUTURE PLANS
An independent research and development project is now in progress to develop a working model of the Summary Quantitative Performance Display. Current plans are to follow up this effort by developing working models of the other instructor console features described above.
ACKNOWLEDGEMENTS
Special thanks are due to William G. Spring of Paradise Research, Inc., for assistance in developing and verifying the feasibility of many of the design features of the proposed ISS.
REFERENCES/RELATED REPORTS
Bailey, J.S., and Hughes, R.G. Applied Behavior Analysis in Flying Training Research. AFHRL-TR-79-38. Air Force Human Resources Laboratory, January 1980.

Pohlman, L.D., and Isley, R.N. Development of Simulator Instructional Feature Design Guides. Seville IR 79-12. Seville Research Corporation, October 1979.

Caro, P.W., Shellnut, J.B., and Spears, W.D. Utilization of Aircrew Training Devices. Seville TR 80-01. Seville Research Corporation, February 1980. (Final report on STRES contract performed for Air Force Human Resources Laboratory.)

Carter, V.E. "Development of Automated Performance Measures," Volume 2 of Experiments to Evaluate Advanced Flight Simulation in Air Combat Pilot Training. NOR 76-52. Northrop Corporation, March 1976. (NADC Contract N62269-74-C-0314)

Carter, V.E. "Development of Automated Performance Measures for Introductory Air Combat Maneuvers." In Proceedings of the 21st Annual Meeting of the Human Factors Society, San Francisco, CA, October 1977.

Charles, J.P., Johnson, R.M., and Swink, J.R. Automated Flight Training (AFT): Instrument Flight Maneuvers. NAVTRAEQUIPCEN 71-C-0205-1. Naval Training Equipment Center, Orlando, FL, February 1973.

Heffley, R.K., Clement, W.F., and Jewell, W.F. "Overview of Work in Progress on Nonintrusive Assessment of Pilot Workload and Pilot Dynamics." In Proceedings of the AIAA Workshop on Flight Testing to Identify Pilot Workload and Pilot Dynamics, Air Force Flight Test Center, Edwards AFB, CA, January 1982. (STI Paper No. 311, Systems Technology, Inc.)

Hughes, R.G. "Enabling Features Versus Instructional Features in Flying Training Simulation." In 11th NTEC/Industry Conference Proceedings. Naval Training Equipment Center, Orlando, FL, November 1978.

Hughes, R.G. Advanced Training Features: Bridging the Gap Between In-Flight and Simulator-Based Models of Flying Training. AFHRL-TR-76. Air Force Human Resources Laboratory, Flying Training Division, Williams AFB, AZ, March 1979.

Knoop, P.A., and Welde, W.L. Automated Pilot Performance Assessment in the T-37: A Feasibility Study. AFHRL-TR-72-6. Air Force Human Resources Laboratory, WPAFB, Ohio, April 1973.

Lintern, G., and Gopher, D. Adaptive Training of Perceptual Motor Skills: Issues, Results, and Future Directions. TR ARL-75-5/AFOSR-77-5. Aviation Research Laboratory, University of Illinois at Urbana-Champaign, January 1977.

Lintern, G. Transfer of Landing Skill After Training with Supplementary Visual Cues. TR Eng. Psy. 78-3/AFOSR-78-2. Engineering Psychology Department, University of Illinois at Urbana-Champaign, July 1978.

Semple, C.A., Vreuls, D., Cotton, J.C., Durfee, D.R., Hooks, J.T., and Butler, E.A. Functional Design of an Automated Instructional Support System for Operational Flight Trainers. NAVTRAEQUIPCEN 76-C-0096-1. U.S. Naval Training Equipment Center, Orlando, FL, January 1979.

Semple, C.A., Cotton, J.C., and Sullivan, D.J. Aircrew Training Device Instructional Support Features. CRG-TR-3041C. Canyon Research Group, Inc., July 1980. (Final report on STRES contract performed for Air Force Human Resources Laboratory.)

Sepp, G.D. Computer-Assisted Pilot Grading and Prediction System - Phase 3 Final Report: Data Analysis and Training Aid Development. Vought Report No. 2-54202/7R-51460. Vought Corporation, 1 September 1977. (NASC Contract No. N00019-73A-0010)

Spring, W.G. "Simulation Development and Preparation," Volume 4 of Experiments to Evaluate Advanced Flight Simulation in Air Combat Pilot Training. NOR 76-52. Northrop Corporation, March 1976. (NADC Contract N62269-74-C-0314)

Shaw, A.W. "Adaptive Training and In-Flight Simulation." Briefing presented at Northrop Corporation. Vought Corporation, Dallas, TX, October 1979. (Unpublished)

Vreuls, D., Obermayer, R., Goldstein, I., and Norman, D. Trainee Performance Measurement in Four Instrument Flight Maneuvers. NAVTRAEQUIPCEN 74-C-0063-1. Naval Training Equipment Center, Orlando, FL, 1975.

Weller, D.R. "Predictor Displays in Carrier Landing Training." NAVTRAEQUIPCEN. Naval Training Equipment Center, Orlando, FL, April 1979.
A DESIGN WITH THE INSTRUCTOR IN MIND: THE OAS/CMC PTT INSTRUCTOR STATION

MAJOR JOSEPH C. STEIN AND CAPTAIN MICHAEL E. SHANNON
OAS/CRUISE MISSILE TRAINING MANAGERS
4235 STRATEGIC TRAINING SQUADRON
Carswell Air Force Base, Texas 76127

INTRODUCTION
The Offensive Avionics System (OAS) PTT (see Figure 2) includes two user interfaces, one for the crewmembers and one for the instructor. The instructor can interact with the PTT in three operational modes: Pre-Run, Run, and Post-Run. The interface has been designed to facilitate moving back and forth among any of the three modes (see Figure 1). The Pre-Run mode occurs prior to the training (simulation) session and allows the instructor to select and edit a scenario for training and complete setup and initialization procedures. The Run mode is the actual training period during which the instructor monitors the crewmembers' actions and injects changes into the session. The Post-Run mode occurs after the training session, in which the instructor can initiate a limited review and analysis. All sessions must go through some minimum initialization via the Pre-Run mode of the user interface. Depending on the desired degree of interaction, initialization can vary.

After the appropriate initialization, the instructor will enter the Run mode to begin the training session. He will be provided with commands that allow him to abort the session and return to the Pre-Run mode or interrupt for a time period and then resume within the Run mode itself. Once the run is complete, the instructor enters the Post-Run mode in which he may conduct limited review and analysis or he may return to the Pre-Run mode to initiate another training session.
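The mode transitions just described form a small state machine. The sketch below paraphrases them in Python; the event names ("start", "abort", etc.) are invented labels for the transitions the text describes, not commands from the actual PTT.

```python
# Hypothetical encoding of the Pre-Run / Run / Post-Run mode transitions.
TRANSITIONS = {
    ("PRE-RUN", "start"):   "RUN",        # initialization done, begin the session
    ("RUN", "abort"):       "PRE-RUN",    # abandon the run and set up again
    ("RUN", "complete"):    "POST-RUN",   # run finished, review and analyze
    ("POST-RUN", "new"):    "PRE-RUN",    # initiate another training session
}

def next_mode(mode, event):
    """Return the next operational mode; unrecognized events leave the
    interface in its current mode (e.g., an interrupt/resume within Run)."""
    return TRANSITIONS.get((mode, event), mode)
```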
The largest area, at the top of the console, is reserved for two major functions. In the Pre-Run mode this area is used for displaying tables of values describing flight path characteristics of a selected scenario. During the Run mode, this area shall be dedicated to the real-time display and update of a set of variables denoted as real-time parameters. During Post-Run mode, this area shall also be used to reflect real-time parameter updates, but shall reference only a subset of those parameters used during the Run mode.
The Command Menu area, at the bottom left, will be used in all three modes to display the commands, in menu form, available to the instructor for selection. As the instructor transits from one menu to another, this area will display the new menu.
The area located to the right and top of the menu consists of two lines displaying faults and pilot-controlled status information. This area will be active only during Run mode and will be empty during both Pre-Run and Post-Run modes. The first line of the area, the Faults Injected line, will indicate any malfunctions or faults currently in effect. The second line shall display the current status of both the Pilot Steering mode (manual/auto) and Terrain Avoidance mode (on/off).
EXECUTIVE MENU
[ PRE-RUN ]  [ RUN ]  [ POST-RUN ]

Figure 1
Instructor Interface
COMMANDS AND DISPLAY LAYOUT

Within any given mode, the instructor has a set of commands available for carrying out his responsibilities. The command menus and other information will be presented on an alphanumeric CRT laid out as in Figure 3. Separate areas or windows will be set aside for various types of information.
In all three modes, prompts to the instructor for input and notification of errors will appear in the Prompts and Error Messages area. If the instructor inadvertently enters any form of unacceptable input, he will be notified, told the nature of the mistake, and requested to reenter the desired input. The Instructor Input line is active in all three modes and is the only line able to receive and echo input from the instructor. The bottom line of the terminal is hardware-controlled and outputs terminal status information.

Figure 2
B-52G/H OAS/CMC Part Task Trainer
The tables which can be viewed by the instructor during the Pre-Run mode are used for purposes of preview, editing, and scenario generation. The information contained in the tables describes the selected flight plan values and covers such items as destination points, air speed, missile information, etc. These entries may be altered by the instructor to develop a different flight plan and can even be "trimmed" down to create a much shorter scenario.

    PRE-RUN: SCENARIO / RUN: REAL-TIME PARAMETERS
    COMMAND MENU        FAULTS INJECTED
                        PILOT STEERING AND TA MODE STATUS
                        PROMPTS AND ERROR MESSAGES
    INSTRUCTOR INPUT
    TERMINAL STATUS

Figure 3
Instructor Console Layout

A wide variety of variables are displayed during the Run mode. These variables relate such information as aircraft status, navigational system accuracy, INS drift, crewmember input, system state values, etc. Once displayed, the values of these variables are continuously updated at 5-second intervals.

A menu of available commands will be presented in the Command Menu area. The command labels will be numbered on the display and may be selected by the instructor by typing the number of the desired command. All keyboard input will be displayed in the instructor input area of the screen. Once the number has been input, the instructor merely has to hit the CARRIAGE RETURN (CR) key to activate the selected command. If the command then requires additional input, such as a numeric value, before activation can be completed, the instructor will be given a prompt indicating what information is required and what will have to be typed in. Editing keys will be provided to allow the instructor to recover from typographical errors. Once the desired numeric value is displayed in the active field, the instructor again hits the CARRIAGE RETURN for final input into the system. This menu-select-to-activate design is aimed at minimizing the amount of typing required by the instructor. It also has the advantages of always displaying the available commands, eliminating any errors due to command-typing mistakes, and allowing the user, via the numerical select procedure, to quickly access and activate commands of interest.
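The menu-select-to-activate parsing described above can be sketched in a few lines. This is an illustrative fragment, not the PTT's actual software: a bare carriage return negates the selection, a valid menu number selects a command, and anything else is flagged for the error/prompt area.

```python
def select_command(menu, typed):
    """Parse one instructor entry against a numbered command menu.

    menu:  list of command labels as displayed (numbered 1..len(menu)).
    typed: the echoed instructor input, terminated by carriage return.
    Returns a (status, detail) pair.
    """
    entry = typed.strip()
    if entry == "":
        return ("negated", None)                 # bare CR cancels the selection
    if entry.isdigit() and 1 <= int(entry) <= len(menu):
        return ("selected", menu[int(entry) - 1])
    return ("error", "illegal input -- enter a menu number")
```

For example, typing "2" followed by carriage return against the Pre-Run menu would select EDIT/PREVIEW SCENARIO.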
SYSTEM EXECUTIVE

When the mainframe is brought up and initialized, the first thing that the instructor sees is the EXECUTIVE MENU (see Figure 4), which consists of the three simple mode selections of Pre-Run, Run, and Post-Run. In the hierarchy of menu organization, the instructor can always return to the EXECUTIVE menu to change modes at any time during a training run. From this menu, the instructor may choose to enter any of the three modes or exit the training program entirely. In the event the instructor activates the Run or Post-Run modes without having specified a working scenario, a default value will be assumed and scenario #1 will be loaded into the system as the current working scenario. Once the instructor selects any of the modes, the terminal displays the master menu for that mode. Using the commands in these master menus, the instructor can move to other subsets of commands that allow him to carry out any of the tasks required in the selected mode. At any time, the instructor can activate the RETURN TO EXECUTIVE command available in each of the master menus and return to the EXECUTIVE menu to select a new mode.
EXECUTIVE MENU

1) PRE-RUN
2) RUN
3) POST-RUN
4) EXIT

Figure 4
The Executive Menu

OPERATIONAL MODES AND FORMAT OF COMMANDS

The instructor will be provided with a set of seven pre-canned scenarios and material describing each of these, including navigational charts. The seven scenarios will be permanently numbered 1 through 7 and will be available at any time for the instructor's use. There is additional space for three more scenarios, numbered 8 through 10, which can be generated by the instructor through modification of any of the available scenarios and stored in these last three slots. Thus, there are a total of ten possible scenarios, seven permanent ones (numbers 1 through 7) and three replaceable ones (numbers 8 through 10).

Upon entering the Pre-Run mode, one of the first requirements of the instructor is to select a scenario by recalling one of the ten that are stored. Once recalled, all of the initialization values stored under that scenario's number will be loaded into the mainframe. This scenario now becomes the "working" scenario and can be used for the training session as is, or edited to create a new, modified working scenario. Note that if a working scenario is not explicitly selected, the default working scenario will be scenario #1.

The canned scenarios can be modified by the instructor. Modifications to the working scenario are divided into two categories: (1) direct alteration of specific flight leg characteristics (such as aircraft speed, altitude, wind velocity, etc.), and (2) scenario reduction. The first editing capability is accomplished by identifying specific flight legs of the working scenario and replacing current flight plan values with different values. Scenario reduction is achieved by specifying a contiguous set of flight legs that exist in the working scenario. This set becomes the new "reduced" scenario, and retains all flight characteristics as defined for the corresponding legs of the original working scenario. All editing functions which create a new scenario description cause the original working file to be replaced by the new scenario, which then becomes the current working scenario. Only the working scenario stored in the mainframe is changed by the editing functions; the scenario stored on the disk remains unchanged. When the instructor has completed editing a scenario, he may choose to save a copy of the working scenario for future retrieval. If the copy is not saved, all modification and initialization which has occurred during Pre-Run will be lost when the PTT program is exited. If no editing has occurred, it is not necessary to save the working scenario since it is merely a copy of one of the scenarios already stored.
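Scenario reduction, as described above, is a slice of the flight-leg list that preserves each retained leg's characteristics. The sketch below assumes a hypothetical scenario representation (a dict with a list of leg records); it is not the PTT's actual data format.

```python
def reduce_scenario(working, first_leg, last_leg):
    """Build a reduced scenario from a contiguous set of flight legs.

    working:   hypothetical scenario record, e.g. {"name": ..., "legs": [...]}.
    first_leg, last_leg: inclusive leg numbers, counted from 1.
    The retained legs keep all characteristics of the originals.
    """
    legs = working["legs"][first_leg - 1:last_leg]
    if not legs:
        raise ValueError("selected legs do not exist in the working scenario")
    return {"name": working["name"] + " (reduced)", "legs": legs}
```

In the PTT the result would replace the working scenario in the mainframe, leaving the disk copy untouched unless SAVE SCENARIO is used.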
PRE-RUN COMMANDS

The four commands which are available for selection during Pre-Run mode are listed in menu form in Figure 5. When activated, each command will either produce a new menu of sub-commands, or will cause a request for additional information in the form of a prompt. In the following sections each command will be addressed in detail, with the corresponding display changes and required inputs clearly outlined.

PRE-RUN MENU

1) SELECT SCENARIO
2) EDIT/PREVIEW SCENARIO
3) SAVE SCENARIO
4) RETURN TO EXECUTIVE

Figure 5
Pre-Run Menu
Activation of the SELECT SCENARIO command is required before any manipulation of any scenario other than scenario #1 may be accomplished. When the SELECT SCENARIO command has been activated (by having the instructor type 1 CR), the command menu will remain on the screen, but a prompt will appear requesting a scenario file number. The instructor must then input a number from 1 to 10 (representing the ten scenario files) followed by a CR. This will define the scenario file which will become the current working scenario. Any previous working scenario will be replaced by this selection. Should the instructor decide not to define a (new) working scenario, he may enter a CR by itself. This will be accepted as a negation of the original command, and control will resume at the command menu level. Any illegal input will be flagged as such in the error and prompt area of the console, and the instructor will be requested to provide new input.
By entering 2 CR, the instructor activates the EDIT/PREVIEW SCENARIO command, which causes a new command menu to be displayed (see Figure 6). Each command in the menu is related to the display of particular data tables or values providing information describing certain attributes of the working scenario. These tables or values can be requested for purposes of preview by the instructor, or can be edited for production of a new scenario.
EDIT/PREVIEW SCENARIO

1) FLIGHT PLAN
2) SRAM TARGET TABLE
3) DESTINATION TABLE
4) FIX-POINT TABLE
5) MISSILE STATUS
6) RETURN TO PRE-RUN

Figure 6
Edit/Preview Scenario Menu

The first command on the menu is the FLIGHT PLAN command (see Figure 6). Entering a 1 CR will cause the first page (11 lines) of the Flight Plan table to appear in the scenario display area (upper window), and the FLIGHT PLAN menu to appear in the Command Menu area. The Flight Plan table lists the destination points and flight characteristics associated with each flight leg in the working scenario. The FLIGHT PLAN menu contains a set of commands allowing editing and scrolling of the flight plan. The editing commands will explicitly call out only those columns in the flight plan which may be edited. Whenever the instructor decides to change one entry in the table, all other entries which are affected by that change will be automatically replaced and updated on the display.

The second command available in the EDIT/PREVIEW SCENARIO menu is SRAM TARGET TABLE (see Figure 6). Activated by a 2 CR keystroke input, the SRAM Target Table is displayed in the scenario display area, and the SRAM TARGET TABLE menu appears in the command menu area. All targets in the scenario are listed in the table by target number, with the flight leg destination number and target characteristics such as latitude, longitude, and elevation displayed in the corresponding row. The only two columns which are available for editing by the instructor are SAIR (Safe And In Range) ENTRY and SAIR EXIT.

The Destination Table (see Figure 6) is called up by selecting 3 CR from the EDIT/PREVIEW SCENARIO menu and presents the working scenario flight plan by listing each destination point, type of destination point, latitude, longitude, elevation, and planned time of arrival. The DESTINATION TABLE menu will also be displayed in the command menu area. No editing of specific values in the Destination Table is possible, but the instructor may create a new scenario by reducing the current working scenario by selecting a contiguous sub-portion of it. This sub-portion will then become the new working scenario.

The Fix-Point Table (4 CR) is only available for presentation; no editing may occur. This table will list all fix points which may be accessed within the limits of the working scenario. The points will be listed sequentially by number, with the corresponding latitude, longitude, and elevation of each point displayed as well. This table will usually exceed display capacity in size and will require a paging capability.
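The Flight Plan table's automatic update of affected entries is a dependent-field recomputation. The fragment below sketches one such dependency with invented field names (leg distance, speed, and elapsed time); the actual flight-plan columns and formulas are not specified in the text.

```python
def edit_leg_speed(flight_plan, leg, new_speed_kts):
    """Change one editable entry (ground speed on a leg) and automatically
    recompute an entry affected by it (the leg's elapsed time).

    flight_plan: hypothetical {leg_number: {"dist_nm", "speed_kts", "time_min"}}.
    """
    row = flight_plan[leg]
    row["speed_kts"] = new_speed_kts
    # time (min) = distance (nm) / speed (kts) * 60; rounded for display
    row["time_min"] = round(row["dist_nm"] / new_speed_kts * 60, 1)
    return row
```

On the console, both the edited column and the recomputed column would be replaced on the display in the same update.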
By entering a 5 CR, the instructor can change Missile Status. This parameter is set to either 1 (Fully Aligned) or 2 (Power Off) in the scenario. The Fully Aligned status will cause the elimination of the required warm-up time of missiles and weapons within a scenario. It will initialize the scenario with missiles fully targeted (powered, armed, and in "GO" status) and weapons fully powered. The Power Off status assumes no warm-up of missiles or weapons has occurred. Upon entering the command, no change in the menu occurs and nothing is displayed in the scenario area. Rather, a message appears in the prompt and error area indicating the current value of the missile status parameter and requesting a new value. Entering anything other than 1 CR or 2 CR will cause an error message to appear and a request for new input. Entering CR only will negate the command, giving control to the EDIT/PREVIEW SCENARIO menu.
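This prompt-and-validate pattern (current value shown, 1 or 2 accepted, bare CR negates, anything else re-prompts) recurs throughout the station design. A minimal sketch in Python; the function and parameter names are illustrative, not taken from the PTT software:

```python
def prompt_two_value(current, labels, read_input):
    """Prompt for a two-valued parameter; bare CR negates, bad input re-prompts.

    current: 1 or 2; labels: dict mapping value -> status text;
    read_input: callable returning the operator's keystrokes before the CR.
    """
    while True:
        print(f"CURRENT VALUE: {current} ({labels[current]}) -- ENTER NEW VALUE:")
        entry = read_input().strip()
        if entry == "":          # CR only: negate the command
            return current       # parameter remains unchanged
        if entry in ("1", "2"):
            return int(entry)
        print("ILLEGAL INPUT -- ENTER 1 OR 2")  # error message, then re-prompt

# Example: missile status is Fully Aligned; operator mistypes, then cancels.
labels = {1: "FULLY ALIGNED", 2: "POWER OFF"}
inputs = iter(["3", ""])                 # one bad entry, then a bare CR
result = prompt_two_value(1, labels, lambda: next(inputs))
print(result)  # 1: the command was negated, status unchanged
```

The same loop serves the Pilot Steering Mode and Terrain Avoidance commands described later, which follow an identical 1/2 protocol.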
When the instructor selects the RETURN TO PRE-RUN command (6 CR), the EDIT/PREVIEW SCENARIO command menu will clear from the display area and control will return to the PRE-RUN master command menu (see Figure 6).
If the instructor wishes to save the current working scenario and its corresponding parameters for later retrieval/usage, he must activate the SAVE SCENARIO command by entering 3 CR in the PRE-RUN master menu (see Figure 5). A prompt will appear requesting a scenario file number from 8 to 10 (representing the three replaceable files available for storage). Once selected, the old contents of the file will be destroyed and replaced by the current working scenario. If the instructor decides not to save the working scenario, he may enter a CR by itself. This will negate the command activation, and control will return to the PRE-RUN master command menu. All illegal inputs will be flagged as such to the instructor and he will be requested for new input.
Activation of the RETURN TO EXECUTIVE command will clear the PRE-RUN master command menu from the display, and control will return to the EXECUTIVE command menu (see Figure 5).
RUN MODE
The instructor activates the Run mode by entering 2 CR from the EXECUTIVE menu. Upon entering this command, the training session (simulation) starts.
Throughout the training session, a set of real-time parameters reflecting aircraft, navigational, and command information are displayed in the upper window of the console and updated every 5 sec (see Figure 7). The simulated time is displayed at the top right side. Below that is the aircraft data,
TIME 14:12:30

AIRCRAFT DATA
LAT: N 47-00.00    TH: 305    ALT: 21500    DEST: 16 828-TOT
LONG: 019-00.00    TK: 300    W/V: 300/50   TAS: 420

NAVIGATIONAL DATA
LAT: N 47-15.10    TH: 300    ALT: 22100
LONG: 020-10.00    TK: 310    W/V: 310/30
PTA: L 00:16:13

LAST COMMAND ISSUED
RN-IKB: XHAIR 10,12
N-IKB: MDFY 24
RNMP: RANGE/SCALE 75
PANEL: WCP -- LP 1,2,3 WPN PWR

FAULTS INJECTED: RNMP-FAIL  WCP-FAIL
PILOT STEERING: MANUAL    TA MODE: OFF

RUN MENU
1) FAULT INJECTION
2) AIRCRAFT MANEUVER
3) TAL RECOGNIZED
4) TERRAIN AVOIDANCE MODE
5) FREEZE
6) RESUME
7) RETURN TO EXECUTIVE

(PROMPTS AND ERROR MESSAGES)
(INSTRUCTOR INPUT)
(TERMINAL STATUS)

Figure 7
Run Display
which consists of: latitude and longitude, true heading and ground track, altitude and wind/velocity, current destination point, and true air speed. Next is the navigational data reflecting values of the prime navigational system. These parameters are: latitude and longitude, true heading and ground track, altitude and wind/velocity, and the planned time of arrival error (early or late). Navigational errors can be determined from these parameters by subtracting corresponding values of the aircraft and navigational data.
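The subtraction described above can be made concrete. A brief sketch, with hypothetical field names and values taken from the Figure 7 example (headings in degrees, altitude in feet):

```python
def nav_errors(aircraft, navigational):
    """Per-parameter navigation error: prime-nav value minus true aircraft value."""
    return {k: navigational[k] - aircraft[k]
            for k in aircraft if k in navigational}

# Values from the Figure 7 example display.
aircraft = {"TH": 305, "ALT": 21500}
navigational = {"TH": 300, "ALT": 22100}
print(nav_errors(aircraft, navigational))  # {'TH': -5, 'ALT': 600}
```

Here the prime navigational system is reading 5 degrees low in true heading and 600 feet high in altitude relative to the simulated aircraft truth data.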
The bottom portion of the window displays the last command issued by the crewmembers, separately for the three major panels: RN-IKB, N-IKB, and RNMP. The last command issued for the remaining set of panels is designated by a panel label followed by the command given.
This display of real-time parameters allows the instructor to have immediate, completely updated information as to the aircraft state and crewmember actions.
In addition to the real-time parameter display, entering the Run mode causes the RUN menu to be displayed in the command menu area. There are seven commands associated with the Run menu, and the following sections will explain each of these in detail.
RUN MODE COMMANDS
The instructor has the capacity to inject malfunctions, or faults, at any time during Run mode (see Figure 7). This allows him to cause failures to occur at critical times corresponding with crewmember activity. The fault specified will occur at the moment of selection. He also has the ability to terminate a given malfunction that is currently in progress.
Upon entering a 1 CR, the FAULT INJECTION menu will appear in the command menu area (see Figure 8). The menu provides four fault commands and the RETURN TO RUN command.
FAULT INJECTION
1) RN MANAGEMENT PANEL
2) WEAPONS CONTROL PANEL
3) DOPPLER RADAR
4) OAS BUS FAILURE
5) RETURN TO RUN

Figure 8
Fault Injection Menu
Selection of the RN MANAGEMENT PANEL command (1 CR) will cause the current status of the RN Management Panel (RNMP) to be indicated in the prompt and error area [1 (FUNCTIONING), 2 (FAILED)], along with a request for input. If the current status is FUNCTIONING and the instructor enters a 2 CR, the RNMP will immediately cease to function, and the message RNMP-FAIL will appear on the Faults Injected line of the instructor's console. If the current status is FAILED and a 1 CR is entered, the RNMP will begin to function again and the RNMP-FAIL message will be erased. If a CR is input by itself, the command will negate and the fault status will remain unchanged. An input of any value other than 1 or 2 will cause an error message and a request for new input.
Validation checking will occur by the system prior to activating any fault. Since no other fault can occur simultaneously with an OAS bus failure, the system will check the OAS bus status before initiating other failures. If the OAS bus status is FAILED and the instructor attempts to inject an RNMP failure, an error message will appear in the prompt and error area and RNMP status will remain FUNCTIONING.
Weapon Control Panel failure (see Figure 8) is activated by entering 2 CR. It operates exactly as the RN MANAGEMENT PANEL command except that the message WCP-FAIL appears in the Faults Injected area of the console.
The Doppler Radar failure (see Figure 8) is activated by entering 3 CR. It operates exactly as the RN MANAGEMENT PANEL command except that the message DOPPLER-FAIL appears in the Faults Injected area of the console.
The OAS Bus Failure (see Figure 8) differs from the other fault commands in that once this fault is injected, it must remain FAILED for the rest of the training session. Thus, there is no need to prompt for an input value for the command.
When 4 CR is entered, the system will check the status of the other three panels. If any of them are currently FAILED, an error message will appear and the command will be ignored. If all three of the other panels are currently FUNCTIONING, the system then checks the OAS Bus status. If it is currently FAILED, another error message will appear and the command will be ignored. Finally, if all three panels and the OAS Bus status are currently FUNCTIONING when the command is activated, the system changes the OAS bus status to FAILED and the message OAS BUS-FAIL appears in the Faults Injected area of the console. Once the instructor has injected this failure, he will have no further need to return to the Fault Injection menu throughout the remainder of the training session.
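The validation rules just described (panel faults refused while the OAS bus is failed, and an OAS bus failure allowed only when every panel is functioning, after which it is permanent) can be sketched as a small state machine. Class and method names here are illustrative, not from the PTT code:

```python
class FaultState:
    """Sketch of the fault-injection validation rules described in the text."""
    def __init__(self):
        self.failed = set()          # panels currently FAILED
        self.oas_bus_failed = False

    def inject_panel(self, panel):
        """Fail a panel; refused while the OAS bus is failed."""
        if self.oas_bus_failed:
            return "ERROR: OAS BUS FAILED"   # no other fault may coexist
        self.failed.add(panel)
        return f"{panel}-FAIL"

    def clear_panel(self, panel):
        self.failed.discard(panel)           # panel begins to function again

    def inject_oas_bus(self):
        """OAS bus failure: all panels must be FUNCTIONING, and it is permanent."""
        if self.failed:
            return "ERROR: PANEL CURRENTLY FAILED"
        if self.oas_bus_failed:
            return "ERROR: OAS BUS ALREADY FAILED"
        self.oas_bus_failed = True           # no way to clear for the session
        return "OAS BUS-FAIL"

s = FaultState()
print(s.inject_panel("RNMP"))    # RNMP-FAIL
print(s.inject_oas_bus())        # ERROR: PANEL CURRENTLY FAILED
s.clear_panel("RNMP")
print(s.inject_oas_bus())        # OAS BUS-FAIL
print(s.inject_panel("WCP"))     # ERROR: OAS BUS FAILED
```

Centralizing the checks in one place mirrors the paper's point that the system, not the instructor, enforces which fault combinations are legal.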
Entering 5 CR (see Figure 8) will clear the FAULT INJECTION menu from the display area and control will return to the RUN menu (see Figure 7).
During the progression of the training session, the instructor has the capability of acting as a simulated pilot in taking the aircraft through changes in speed, altitude, heading, and steering mode. He can also change the wind/velocity (direction and magnitude) and the alternate navigational heading error.
Upon entering 2 CR, the AIRCRAFT MANEUVER menu will be displayed in the command menu area (see Figure 9). This menu consists of seven commands for maneuvering the aircraft, changing conditions, or returning to the RUN menu.
AIRCRAFT MANEUVER
1) TAS
2) ALTITUDE
3) HEADING
4) PILOT STEERING MODE
5) W/V
6) ALTER NAV HEADING ERROR
7) RETURN TO RUN

Figure 9
Aircraft Maneuver Menu
When TAS is selected by the instructor with a 1 CR, a prompt will be issued for input of a new air speed value. The current air speed value is displayed continually and updated in the real-time parameter display window.

Once a new value has been successfully input, the aircraft will begin a constant acceleration/deceleration until it reaches the desired velocity. The real-time parameter will reflect this change.
As with the aircraft velocity, when the ALTITUDE command (see Figure 9) is activated by entering 2 CR, a prompt will be issued for input of a new altitude. The current altitude is another real-time parameter that is displayed in the upper window.

Once a new altitude has been successfully input, the aircraft will begin a constant 2000 ft/min climb or descent until it reaches the desired altitude. The real-time ALT parameter will reflect the updated value.

After a new altitude has been input by the instructor, the aircraft remains at that new altitude until further input from the instructor.
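The constant-rate capture behavior (a fixed 2000 ft/min climb or descent until the commanded altitude is reached, then level flight) amounts to a simple per-update step. The sketch below assumes the 5-second update interval quoted for the real-time display; the function name is illustrative:

```python
def step_altitude(current_ft, target_ft, dt_sec=5.0, rate_fpm=2000.0):
    """Advance altitude one update toward target at a constant climb/descent rate."""
    max_change = rate_fpm / 60.0 * dt_sec      # feet per update (~166.7 ft at 5 s)
    if abs(target_ft - current_ft) <= max_change:
        return target_ft                       # captured: hold the new altitude
    step = max_change if target_ft > current_ft else -max_change
    return current_ft + step

alt = 21500.0
for _ in range(4):                             # climb toward 22000 ft
    alt = step_altitude(alt, 22000.0)
print(alt)  # 22000.0 -- captured within four 5-second updates
```

The same shape of step function, with an acceleration limit instead of a climb rate, would model the constant acceleration/deceleration toward a commanded TAS.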
The instructor has the capability to input a new heading by selecting the HEADING command (3 CR) (see Figure 9). Before changing the heading, the system will check the Pilot Steering Mode. In order for the pilot (instructor) to manually change the heading, the Pilot Steering Mode must be MANUAL. If the Pilot Steering Mode is AUTO, no change in heading can be initiated by the instructor, an error message will appear, and the command will be ignored. If the Pilot Steering Mode is MANUAL, the HEADING command will be initiated.
After successfully entering a new heading, the aircraft will begin a turn of constant radius until the new heading is reached. Once attained, if this heading does not take the aircraft to the expected destination point, it will remain in effect until a new heading value is issued.
The Pilot Steering Mode can be changed (see Figure 9) by entering 4 CR. The current value of the mode [1 (AUTO), 2 (MANUAL)] will be displayed in the prompt and error area along with a prompt to enter a new value. Entering a value other than 1 or 2 will cause an error message to appear and a prompt for a new input. The current setting of the Pilot Steering Mode is also displayed on the console below the Faults Injected line. At the start of the training session, the Pilot Steering Mode will default to AUTO.
Entering 5 CR allows the instructor to update the wind/velocity (direction and magnitude). A prompt for a new value of wind/velocity will appear. The new value of W/V entered by the instructor will remain in effect until the next turn point in the scenario is reached. After the turn point, the next wind will be obtained from the (scenario) flight plan. All changes to the wind/velocity will be reflected in the real-time W/V parameter value.
The instructor can change the Alternate Navigation Heading Error (see Figure 9) by entering 6 CR. A prompt for a new value, not to exceed ±5, will appear. Any illegal input by the instructor will be flagged with an error message, and a prompt for a new input will appear.
If the Alternate Navigation system is the prime navigational model, the instructor can determine the current alternate navigation heading error by subtracting the real-time TH values for aircraft and navigational data. Upon entering a new acceptable value for the alternate navigation heading error, the alternate navigation system will replace the old error with this new value and update the TH parameter to reflect the new error. Any other navigation parameters affected by the change (such as TK) will also be updated.
If the alternate navigation system is not the prime navigational model, the instructor cannot determine the current alternate navigation heading error. He can still enter a value within the ±5 limit, and this new error will be stored in the alternate navigation system, replacing the old value. However, the aircraft and the current prime navigational model will not act upon this new value until the alternate navigation system is selected as the prime navigational model.
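The ±5 limit check and the store-but-defer behavior for the non-prime case can be sketched as follows; the state dictionary and function name are hypothetical stand-ins for the PTT's internal representation:

```python
def set_alt_nav_heading_error(state, new_error):
    """Store a new alternate-nav heading error if within the +/-5 degree limit.

    The stored error only affects the displayed TH when the alternate system
    is the prime navigational model; otherwise it waits until selection.
    """
    if abs(new_error) > 5:
        return "ERROR: VALUE EXCEEDS +/-5 -- ENTER NEW VALUE"
    state["alt_nav_error"] = new_error           # always stored
    if state["prime"] == "ALT_NAV":              # acted upon only if prime
        state["nav_TH"] = (state["aircraft_TH"] + new_error) % 360
    return "OK"

state = {"prime": "ALT_NAV", "aircraft_TH": 305, "nav_TH": 300,
         "alt_nav_error": -5}
print(set_alt_nav_heading_error(state, 7))    # rejected: exceeds the limit
print(set_alt_nav_heading_error(state, 3))    # accepted and applied
print(state["nav_TH"])                         # 308
```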
Entering 7 CR will replace the AIRCRAFT MANEUVER menu with the RUN menu (see Figure 7) and pass control on to it.
When missiles and weapons are powering up and reach the Transfer Alignment (TAL) mode (see Figure 7), a TAL REQD message will appear on the crewmember's display. In order to complete transfer alignment, the instructor must issue a TAL RECOGNIZED command. This is done by entering 3 CR from the RUN menu. Once entered, TAL is completed and the missiles enter Fine Alignment (FA) mode. If the TAL RECOGNIZED command is entered before either the TAL mode is reached or the TAL REQD message appears, it will be ignored and have no effect on the system.
The Terrain Avoidance (TA) mode affects the width of the sector that appears in the radar video display. There are two values for the mode: 1 (OFF) and 2 (ON). Upon entering 4 CR (see Figure 7), the current value of TA will be displayed along with a prompt for a new value. Entering a value other than 1 or 2 will result in an error message and a prompt for new input.
The current TA mode is also displayed on the console below the Fault Injection line, following the Pilot Steering Mode display. Any change in the TA mode will be reflected on this display. At the start of the training session, the TA mode will default to OFF.
The FREEZE command (see Figure 7) allows the instructor to halt the training session at any point without destroying the validity and consistency of the simulation. The complete session can be frozen at an instant in time, allowing the instructor to go to the crewmembers and point out a sequence of operations or recent commands that were in error or inadequate.
The instructor "freezes" the session by entering 5 CR. At that instant, all training and simulation procedures for crewmember activity will halt. The instructor may interact verbally with the crewmembers and advise them of any problems they may be experiencing.
Once the instructor has finished his verbal instruction during the "freeze", he can then resume the simulation at precisely the point at which it was stopped, and the training session can continue as planned from that point.

To continue a "frozen" session, the instructor enters 6 CR (see Figure 7). The training session resumes from the exact point at which it was frozen.
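Freeze/resume amounts to gating the simulation update loop on a flag while preserving all state, so nothing is lost or recomputed across the pause. A minimal sketch; the class name and command codes are illustrative:

```python
class Session:
    """Toy training-session clock that can be frozen and resumed losslessly."""
    def __init__(self):
        self.sim_time = 0.0
        self.frozen = False

    def command(self, entry):
        if entry == "5":          # FREEZE: halt all crewmember simulation
            self.frozen = True
        elif entry == "6":        # RESUME: continue from the exact stop point
            self.frozen = False

    def tick(self, dt=5.0):
        """One real-time update; state advances only when not frozen."""
        if not self.frozen:
            self.sim_time += dt

s = Session()
s.tick(); s.command("5"); s.tick(); s.tick()   # frozen: time holds at 5.0
s.command("6"); s.tick()                       # resumes exactly where it stopped
print(s.sim_time)  # 10.0
```

Because the freeze only suppresses the update step, every parameter keeps its frozen value, which is what preserves the validity and consistency the text emphasizes.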
POST RUN MODE
The major purpose of the Post-Run Mode (see Figure 4) is to provide a limited review of the currently defined working scenario. The instructor may specify an arbitrary point in the scenario as a start time for review. Once selected, he may begin a review of the scenario by activating the RESUME command.

When the review is in progress, the instructor may stop the review at any time. This will enable him to make any notes on the scenario without missing critical material, or allow him to define a new starting point. Thus, when used in conjunction with defining start points and resuming the review process, a particular sequence can be replayed many times.
POST RUN COMMANDS
Four commands are available during the Post-Run mode and are presented in menu form in Figure 10. Only the DEFINE START POINT command requests additional input from the instructor, and none of the commands invoke a second level of command menu.
POST-RUN MENU
1) DEFINE START POINT
2) FREEZE
3) RESUME
4) RETURN TO EXECUTIVE
Figure 10
Post-Run Master Menu
Activated by a 1 CR, the system will display the current start time (which defaults to the beginning of the flight plan). The instructor will then be prompted for a new start time and must enter a value that is between the start and end of the flight scenario. If he enters an illegal value, an error will be flagged, and he will be requested for new input. Once a new start time is successfully entered, it will become the point at which the review process will start when the RESUME command is activated.
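The DEFINE START POINT dialogue reduces to a range-checked prompt loop. A brief sketch under the assumption that times are expressed in seconds from scenario start; all names are hypothetical:

```python
def define_start_point(scenario_start, scenario_end, current, read_input):
    """Show the current start time, then accept only an in-range replacement.

    read_input: callable returning the instructor's keystrokes before the CR.
    """
    print(f"CURRENT START TIME: {current} -- ENTER NEW START TIME:")
    while True:
        entry = read_input().strip()
        try:
            value = float(entry)
        except ValueError:
            print("ILLEGAL INPUT -- ENTER NEW VALUE")
            continue
        if scenario_start <= value <= scenario_end:
            return value          # becomes the RESUME start point
        print("TIME OUT OF RANGE -- ENTER NEW VALUE")

inputs = iter(["abc", "9999", "120"])          # two illegal tries, then valid
print(define_start_point(0, 3600, 0, lambda: next(inputs)))  # 120.0
```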
The FREEZE command is only meaningful during the review process. Activated by a keystroke sequence of 2 CR, this command will stop the reviewing process.
When the RESUME command is activated, the reviewing process will be started or restarted. The scenario will begin at the currently defined start point, or freeze point if the session was frozen, and will continue along as though the aircraft was following the flight path perfectly. Radar video will correspond to the exact flight path with MFD formats available for inspection. No online interaction of crewmember commands will be available except for radar video presentation format. Once the post-run review is completed, the instructor can return to the executive menu by inputting a 4 CR command. Activation of this command will clear the POST-RUN master command menu from the display, and control will return to the EXECUTIVE command menu.
HARDWARE

The PTT has been designed in modules to facilitate modifications as the OAS itself changes and as additional OAS trainer capabilities are identified. The five modules -- independent subsystems -- of the hardware configuration are depicted in Figure 4 and described below. They illustrate the use of off-the-shelf equipment.

Figure 11
Mainframe Processing Subsystem Block Diagram
(PRIME 550 computer system with main memory, cache memory, memory management, and 128 registers; magnetic tape controller, general purpose interface, disk controller, and asynchronous multi-channel controller connecting to the display subsystem, OAS console subsystem, and instructor's console subsystem; virtual console)
The Mainframe Processing Subsystem (MPS) consists of a minicomputer with disk drive, magnetic tape drive, and system console (see Figure 11). The disk is used for real-time storage and retrieval of radar imagery data; the magnetic tape is used for transfer of software and updates of terrain data.

The MPS is the central processor for the PTT. It is responsible for logical and numerical processing plus control of all other subsystems. The PTT software runs on the MPS.

The virtual console is a CRT required for the PRIMOS Operating System. The disk drive is required for the operating system, for software and data file storage, and for real-time access to display files for synthetic radar imagery updates. The magnetic tape drive is required for loading terrain files and software updates on the operational systems in the field. These peripherals are standard PRIME equipment.
All peripheral devices on the PRIME 550 CPU (disk drive, magnetic tape drive, and virtual console) have standard interfaces supported by PRIME hardware and software.
The mainframe processing subsystem capabilities are listed below for the PRIME 550 CPU.

PRIME 550 CPU
32 Bit CPU Architecture
128 Registers
512 K Byte Error Correcting Code Main Memory (expandable to 2 M Byte)
1 K Word Cache
Single/Double Precision Floating Point
INSTRUCTOR'S CONSOLE SUBSYSTEM (ICS)

The Instructor's Console Subsystem (ICS) serves as a system monitor and control interface for the instructor. It provides for interchange of textual data with the Part Task Trainer control programs executing in the MPS. The following data are displayed at the Instructor's Console on an alphanumeric CRT display:

Flight Parameters
Mission Data
Status of OAS Emulated Subsystems
Operational Faults

Alphanumeric data and control codes are input through a keyboard.
Functions which can be exercised are:

Trainer Session Control
Alteration of Flight Parameters
Modification of OAS Emulated Subsystems Status
Fault Seeding
Recovery from Operational Faults

The ICS is a SOROC IQ 140 CRT terminal (see Figure 12).
GROWTH POTENTIAL

We realized that if we did not build a growth capability into the basic PTT design, we would start at square one again if future considerations required an expansion of PTT capabilities. Therefore, growth potential was a high priority consideration throughout the entire program. This requirement for growth potential spilled over into the design of the instructor station as well.
First, the instructor station EXECUTIVE software was designed with this growth capability in mind. For example, we realized that we would eventually have to include more malfunctions in the PTT, so the instructor software had to be able to handle these additional malfunction requirements as well. The EXECUTIVE software design therefore included a group of unused input/output flags. Thus, when an addition is made to the PTT software, the EXECUTIVE routines will not have to change. This will greatly reduce the time and cost of software development.
The system hardware also lends itself to expansion. A 30% growth capability was designed into the PTT mainframe computer. If expansion requires more than the planned pad, we can add a 50% memory capability by purchasing a $12,000 memory board.
The SOROC IQ 140 terminal (see Figure 12) also provides us with tremendous growth potential. We have not even begun to take advantage of the special functions available on the SOROC.
Finally, the instructor's command formats and displays are ideally suited to expansion. The single-keystroke command format can easily handle double-digit or even triple-digit commands. In addition, the instructor's displays can be easily changed to meet growth requirements. For example, if we were to add more faults than could fit on the Fault Injection Menu (Figure 8), we could simply scroll this menu just as the destination tables are already scrolled in the present configuration.
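Both growth claims above, that the "n CR" format extends naturally to multi-digit commands and that an overgrown menu can be paged like the destination tables, can be illustrated briefly. The fault names beyond Figure 8's five entries are hypothetical:

```python
def parse_command(entry, menu):
    """Accept any decimal digit string terminated by CR; None means 'negate'."""
    entry = entry.strip()
    if entry == "":
        return None                      # bare CR: negate the command
    if entry.isdigit() and 1 <= int(entry) <= len(menu):
        return int(entry)
    raise ValueError("ILLEGAL INPUT")

def menu_page(menu, page, lines=5):
    """Scroll a long menu one screen at a time, as the destination tables page."""
    return menu[page * lines:(page + 1) * lines]

faults = [f"FAULT {i}" for i in range(1, 13)]  # menu grown past one screen
print(parse_command("11", faults))             # 11 -- double-digit command
print(menu_page(faults, 2))                    # ['FAULT 11', 'FAULT 12']
```

Because the parser accepts any digit string rather than a single keystroke, no command-format change is needed as menus grow.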
SUMMARY
The B-52 Offensive Avionics System (OAS) Part Task Trainer (PTT) is a trainer that is designed to focus on procedures training, specifically those procedures unique to operation of the new B-52 Offensive Avionics System. With this emphasis on procedures training, the instructor plays a major role in the training of the students. The OAS PTT has a two-user interface, one for the crewmembers and one for the instructor. The instructor's interface is designed to free the instructor to interact with the students at all times. Specification requirements of the instructor station included: ease of operation, rapid scenario setup, access to students, easy student and system monitoring, and growth capability. With these requirements in mind, an instructor station and interface was designed to make the instructor's job simple and effective.
Figure 12
SOROC IQ 140 Instructor Console
TRAINING EFFECTIVENESS EVALUATIONS OF SIMULATOR INSTRUCTIONAL SUPPORT SYSTEMS

DR. STEVE R. OSBORNE AND DR. GEORGE W. NENZER
Allen Corporation of America
Substantial increases in simulator technology have dramatically increased the scope and potential of simulator training. Ultimate simulator training effectiveness, however, is not only a function of a device's capability to simulate training tasks accurately, but also of its ability to operate as an effective instructional tool. Its effectiveness as an instructional tool depends upon Instructor/Operator Station (IOS) factors such as instructor/operator workload, performance monitoring and evaluation capabilities, and training task/mission set-up. Effective design and proper utilization of the IOS can affect the potential and achieved effectiveness of simulator training.
Advances in simulator technology, and attempts to incorporate state-of-the-art instructional technology, have resulted in increasingly complex and sophisticated instructor/operator stations. Unfortunately, this application of new technology has not always led to advances in instructional effectiveness or efficiency. This shortcoming, and the potential importance of effective IOS functioning, highlight the importance of evaluating IOS design and innovations in simulator instructional support systems. However, accurately assessing the effectiveness of simulator instructional support systems poses a set of problems that often are not solved by traditional approaches to evaluating simulator training effectiveness.
This paper discusses some of the problems encountered in attempting to evaluate the effectiveness of simulator instructional support systems and provides some potential solutions to these problems. Three areas are addressed: selection of a suitable evaluation model and sensitivity of performance measures; evaluation objectives; and instructor training.
Evaluation Models
Additions or modifications to training systems usually must be justified by showing that they either increase training effectiveness and/or that they somehow reduce training costs while maintaining some given level of effectiveness. This is especially true if the addition or modification represents a prototype or proof-of-concept training device or instructional aid.
In general, evaluation models used to assess training effectiveness can be grouped
into one of two categories: analytical models and experimental models. Analytical models span a broad variety of procedures ranging from simple questionnaires and opinion surveys to well-developed, highly structured rating scale evaluations.
Analytical models usually do not involve direct measures of student or instructor performance. Instead, they rely upon estimates, judgments, or opinions that are based on direct observation or experience. Experimental models rely on direct measurement of performance. They normally involve comparing the performance of one group of students trained with the addition/modification to the performance of another group of students who are trained without the addition or modification. Under this model, differences in performance can be attributed to the influence of the instructional addition or modification, provided that other factors that influence performance are held constant for both groups of students.
A transfer of training evaluation is representative of the experimental model, in which the amount of training required to attain proficiency with the actual equipment is related to the amount of previous (simulator) training. Transfer of training studies allow simulator training effectiveness to be expressed as the amount of actual equipment training that can be saved by simulator training. Such studies also allow calculation of potential cost savings that can be achieved through simulator training.
Experimental comparisons, such as transfer of training studies, have become the standard for assessing simulator training effectiveness. There has been a natural tendency, therefore, to apply these same experimental models to assess the effectiveness of simulator instructional support systems such as instructor/operator stations, special instructional features, and simulator instructional capabilities. From a methodological standpoint, experimental models represent the preferred approach to assessing training effectiveness. However, there are at least two factors that limit the adequacy of this approach for evaluating simulator instructional support systems.
The first factor concerns the expected contribution of the instructional system addition/modification to training compared to the contribution made by the overall training system. If the relative increment in training effectiveness is small, then it may not be detected at the overall system level, i.e., in terms of student performance. For example, student performance typically is affected by multiple components of the training system such as academic, classroom, simulator, and actual equipment training. In most cases, the instructional support system of a simulator, compared to the overall training system, represents a modest influence on student performance. Such small effects require very sensitive performance measures and a level of experimental control and rigor that usually is not available for evaluations conducted in operational settings.
The second factor concerns using instructor-assigned grades as a measure of student performance. Although instructor grades frequently are used in training effectiveness evaluations, they normally are not very sensitive to actual differences in student performance. Instructor grades are used more to manage student progress through the training program than they are to measure performance. Instructor grades also tend to be norm-referenced, i.e., they are assigned relative to the average performance of other students of comparable training and experience. Although grades serve their intended purpose, they alone usually are not sufficient to detect differences in performance, especially if those differences are small.
The solution of the first problem requires that the evaluation be conducted at a level commensurate with the expected increment in training effectiveness. If the change or addition to training represents a major impact on training, then it may be sufficient to assess that change at a molar level, such as by measuring differences in student performance. On the other hand, if the change or modification represents a small impact, then it will be necessary to assess that change at a more molecular level. This may require a change in the way training effectiveness evaluations are conducted. Experimental comparisons of student performance may need to be replaced or supplemented with more analytical comparisons. For example, changes in the instructional features of a simulator, which represent a small change in the overall training system, could be evaluated by establishing a list of behavioral objectives that the new instructional capability should achieve.
The evaluation would then assess the extent to which those objectives were met. Such objectives should be a natural part of the design process itself, and they should reflect the original purpose for developing or modifying a particular instructional feature or capability. Establishing a list
of behavioral objectives would allow a clear and specific statement to be made about the expected results of a particular simulator feature and also would provide a clear basis for assessing the extent to which those results were achieved. This information could be used to evaluate the merits of a given simulator feature, regardless of how small a component that feature was relative to the overall training system. An assessment of behavioral objectives also could be viewed within the context of a broader scale evaluation of student performance.
The second problem, i.e., sensitivity of instructor grades, can be addressed in a number of ways. First, a set of objective measures of student performance could be developed. However, the time and cost requirements to develop good performance measures often exceed the resources available for training effectiveness evaluations. Second, instructors could be asked to record actual deviations in performance for a selected set of parameters judged to be especially critical to successful performance of the task being graded. This would provide some additional performance information for the purpose of the evaluation. It also would allow subsequent instructor ratings to be based, in part, on selected aspects of student performance. Third, the standard four-point grading scale used by instructors could be expanded to a seven-point scale by inserting intermediate points between each original scale point. This would increase the potential sensitivity of the middle part of the scale, which is most frequently used, but it would not significantly alter the basic format of the scale. Therefore, instructors still would have a scale they are familiar with, but one that has greater sensitivity. Finally, automated performance measurement capabilities of simulators should be utilized if they are available.
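The proposed four-to-seven point expansion inserts one new point between each adjacent pair of original points, so a scale of n points grows to 2n-1 points and an original grade g lands at position 2g-1 on the new scale. A worked sketch, with function names of our own choosing:

```python
def expand_scale(points=4):
    """Insert one intermediate point between each pair of adjacent scale points."""
    return 2 * points - 1            # a 4-point scale becomes a 7-point scale

def old_to_new(grade):
    """Map an original grade (1-4) onto the expanded 7-point scale."""
    return 2 * grade - 1

print(expand_scale(4))                           # 7
print([old_to_new(g) for g in (1, 2, 3, 4)])     # [1, 3, 5, 7]
```

Because the original grades map onto odd positions of the new scale, prior grading data remains directly comparable to grades collected after the expansion.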
Evaluation Objectives

Related to the topic of evaluation models is the issue of evaluation objectives. Although the overall goal of an evaluation may be to assess the training effectiveness or efficiency of a simulator instructional support system, more specific evaluation objectives should be formulated. Specific evaluation objectives are especially important for evaluating instructional support systems because the specific procedures for evaluating such systems, including dependent variables, are not as well defined as they are for evaluations of total simulator systems. Specific evaluation objectives can provide a conceptual framework for the development of specific evaluation procedures. Clearly stated, specific evaluation objectives also can be helpful in selecting an evaluation model and for guiding the development of measures of instructional effectiveness.
The development of evaluation objectives can take many forms; however, some criteria that objectives should satisfy are as follows. First, objectives should be based on the overall purpose of the evaluation. Objectives, therefore, should be stated so that their fulfillment will provide answers to the questions that originally generated the evaluation. Second, objectives should be operationally meaningful in the sense that their fulfillment will provide useful information. Third, objectives should be stated clearly. Finally, objectives should be testable within the operational context in which the evaluation will be conducted.
Instructor Training
Another important factor in conducting an accurate evaluation of any instructional support system involves instructor/operator training. A device must be used correctly, and with sufficient knowledge of its limitations and capabilities, to achieve an accurate assessment of its true effectiveness. Instructor training also plays a key role in user acceptance of new training systems. Unfortunately, many evaluations begin without adequate instructor/operator training.
The solution to the problem of instructor training is straightforward. Training effectiveness evaluations of simulators and their instructional support systems should not be started until the instructors/operators who are going to use them are adequately trained. The nature of such training should be threefold. First, instructor/operators need to know how to operate the simulator. Second, they need to be trained in how to effectively utilize the instructional capabilities of the simulator. Third, they need to be trained with the procedures that are going to be used during the evaluation.
Theoretically, instructor training should be a simple task to complete. However, it often poses a number of problems, including the time required to train all of the instructors, scheduling instructor training around the existing training schedule, and rotation of instructors in and out of the training command. Many of these problems could be solved by selecting a set of instructors who are not expected to transfer away from the training squadron until after the evaluation is completed. Having a smaller group of instructors participate in the evaluation would simplify the instructor training task, and would perhaps allow enough time to provide them with some individualized on-the-job tutorials. Alternatively, using a subset of available instructors will reduce the number of students that can be followed at any given time and, hence, will increase the duration of the evaluation. It also may cause some problems for scheduling simulator training events.
Summary
Conducting training effectiveness evaluations of simulator instructional support systems involves all of the problems generally associated with assessing simulator effectiveness and poses some special problems as well. This paper addressed some of these problems and discussed some potential solutions.
Listed below are some guidelines that may help in designing and conducting training effectiveness evaluations of instructional support systems. These guidelines incorporate some of the problem solutions discussed above as well as some recommendations not discussed.
- Ensure that the system to be evaluated is sufficiently developed and debugged before the evaluation begins.
- Develop specific evaluation objectives that can be addressed with the resources available to conduct the evaluation.
- Select or develop an evaluation model that will allow evaluation objectives to be satisfied.
- Ensure that the evaluation model is appropriate for the size of the training effect expected to be detected by the evaluation.
- Select or develop a set of measures that together will allow all evaluation objectives to be addressed.
- Ensure that measures of performance are sensitive enough to detect differences attributable to using the instructional system being evaluated.
- Develop a conceptual plan for relating measures of performance to evaluation objectives (this is especially important for evaluations that use analytical models).
- Ensure that instructors/operators are adequately trained to operate the simulator, and to utilize the capability of the instructional system being evaluated. Instructors/operators also should be trained with the evaluation procedures to be followed.
- Monitor daily the conduct of the evaluation and data collection procedures.
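The guideline about matching the evaluation model to the expected size of the training effect is, at bottom, a statistical power question. A minimal sketch using the standard normal-approximation sample-size formula for a two-group comparison (the function name and default values are illustrative assumptions, not from the paper):

```python
import math
from statistics import NormalDist

def students_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate number of students needed in each of two groups to
    detect a standardized effect (Cohen's d) of `effect_size` with a
    two-sided test at significance `alpha` and the given power."""
    z = NormalDist().inv_cdf
    n = 2 * ((z(1 - alpha / 2) + z(power)) / effect_size) ** 2
    return math.ceil(n)
```

Detecting a small effect (d near 0.2) requires many times more students per group than a large one (d near 0.8), which is why an evaluation designed around large expected effects can simply miss the modest gains an instructional support system typically produces.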
Following these guidelines will not guarantee a successful evaluation, and they do not address all of the issues associated with conducting a training effectiveness evaluation. However, they may provide some assistance in designing and planning an evaluation, and they may help minimize some of the problems encountered by previous evaluations of instructional support systems.
USE OF COMPUTER-AIDED DESIGN SYSTEMS IN WORKSTATION DESIGN
GERHARD EICHWEBER
Kurt Eichweber Prazisionsgeratewerk
Hamburg, West Germany
ABSTRACT
The increasing technical complexity of modern systems requires the application of new techniques in the design of the man-machine interface. Stress caused by information overload and "underload" will not be relieved by increased training or better personnel selection. Rethinking the man-machine dialogue is required. Advances in modeling methodologies offer promising possibilities for the future; 3-D Computer-Aided Design (CAD) systems offer solutions that are available today. Application of one such system, the Enhanced Interactive Design System (EIDS), is discussed with reference to its use in actual weapons system design.
INTRODUCTION
Ever more complex technological demands for ever more accurate weapons systems have made these systems so complex that the operating personnel are being pushed toward their operational performance limits. Neither prolonged training nor increased personnel selectivity can ensure the right solutions to these pressing problems. The use of human engineering principles to define the man-machine dialogue and the man-machine interface has long been stressed as the solution to the need for shorter training times and lower qualification profiles, with better operating results. However, appropriate human engineering principles have yet to be adequately applied.
Human factors engineering considers a large area that includes not only anthropometric, mechanical, and chemical (chemo-physical) stress factors but also the whole range of man-machine interactions. These factors impact the man-machine dialogue through indicators, controls, and visual and auditory information from the environment and also include other sensory data and information that are not directly related to the operator's task, such as movement, especially accelerations and vibrations, sound, noise, temperature, and the whole microclimate. Another factor in the man-machine dialogue is the proprioceptive information, i.e., the information regarding the localization of the body and its extremities in space, its movement, etc.
In today's human engineering community, our efforts must no longer be primarily oriented toward further increases in the rate of data output. The way toward higher efficiency is increasingly dependent on how well we succeed in reducing stress. Man has become the weakest link in the chain. This is mostly because engineers and system designers have always tried to solve the technical problems first. Because of their mechanistic blinders, they have hardly ever confronted man-related problems unless forced to. Those human factors that were recognized and respected were the most primitive and easiest to capture, e.g., anthropometrics (at least I have no other explanation why, for a lot of people, ergonomics is still understood as a synonym for anthropometrics). Man is much more complex than simply a set of measurements. A lot of physiological and psychological knowledge has been collected during the past 50 years, but little of this has been directly applied to workstation design. Actually, having approached the limits of man's capabilities, we largely lack the methods for applying this knowledge to the engineering of a modern workstation.
STRESS AND INFORMATION LOAD
One of the major tasks for human factors specialists is stress reduction. In order to achieve optimum results in stress reduction, we have to work with both the chemo-physical stress factors and the proprioceptive information. Only a coordinated effort in all areas can assure good results, since otherwise isolated "peaks" of stress can affect the results of all the efforts. (It is often noted that an isolated disturbance in an otherwise undisturbed environment will generally be much more noted and have more effect than if surrounded by a higher level of other disturbances.)
In order to achieve a better, more effective, and more operational interface by structuring the man-machine dialogue, we must have as a goal the reduction of stress. Stress itself is in some respects linked to information. There can be all kinds of information, not only those regarding the task itself. Stress, from this point of view, can be caused by both under-information and over-information. If one gets too little information, one instinctively starts to worry and tries to fill the gap, searching for more information. This applies not only to information needed consciously for the task, but as well to subconsciously needed information regarding the environment and oneself, e.g., proprioceptive information. Over-information as well can provoke stress and thus hinder the proper accomplishment of a task. We must therefore strive to suppress all unnecessary and superfluous information. Stress reduction by structuring the man-machine dialogue goes beyond the anthropometrical positioning of controls based on the criteria of reach and operability, and beyond positioning of indicators in order to read them better and more quickly. Structuring the man-machine interface is a very complex process, which, under given time-cost and manpower constraints, requires new, more effective tools and methods.
THE ROLE OF MODELS IN SYSTEM DESIGN
Many different tools have been developed that attempt to handle the complexity of parameter interdependence in the anthropometrical field. Man-models have been developed and successfully applied in many different areas. Techniques include bidimensional approaches, the simplest being the Dreyfuss scale, the Bosch drawing aids developed by Dr. Jenik of Darmstadt University, and other very special application-oriented approaches, like the computer-based design for car driver's workstations by Volkswagen, the Reach model for astronauts used by NASA, and more complex man-models such as SAMMIE, BOEMAN, CAR (Edwards, et al., 1976), CAPE (Bittner, 1975), COMBIMAN (Evans, 1972; Kroemer, 1973; Dillhoff, Evans, and Krause, 1974), and HOS (Strieb, 1974; Lane, Wherry, Strieb, 1977). It is tempting to consider developing even more complex computer-based man-models, where many more parameters, like comfort and maximum limits of movement and joints, forces, reaction time, manipulative precision, resonance frequencies of organs, etc., would be considered. The data from these models could be stored and automatically studied in relationship to postures within the working environment.
Such models are feasible and, given the increasing power of modern computer systems, will be developed as man reaches toward more perfectionistic levels of elaboration of such man-models. However, because of economic considerations and pending the development of such elaborate models, we must be satisfied with less perfectionistic approaches and make optimum use of available low-cost computer-aided design systems, available specialized manpower, and, last but not least, common sense.
COMPUTER-AIDED DESIGN SYSTEMS
Today three-dimensional computer-aided design systems (3-D CAD systems) are available that, for a reasonable price, can run on the medium-sized computers already available in many institutions and companies. The Enhanced Interactive Design System (EIDS), a CAD software package that we use for engineering, architectural design, and human engineering of man-machine interfaces, can run even on small 16-bit computers such as an HP1000. With such a CAD system, we have a tool at hand which can be used in developing interesting alternative approaches to workstation layout.
A tool is, of course, only worthwhile if it is both helpful and cost effective. Although initially expensive, the cost effectiveness, i.e., the return on investment, of a 3-D CAD system is favorable if you take into account that through its use we are really multiplying the output achievable from the rather limited human factors manpower available in the system design field. The rate of return on investment is very much dependent on the use one makes of the data base once it is developed. In our engineering designs, we make repeated use of the data in designing (and redesigning) the man-machine interface. Once we have established the linkages between the data, the engineers can continually redesign the "black boxes" and their contents, to optimize the system's functionality.
State-of-the-art CAD systems allow a total three-dimensional description of objects and structures. For most workstation design situations, only a true 3-D solid modeling CAD system will suffice. So-called 2½-D systems are not of much help, nor are wire-frame 3-D systems that lack the hidden line and hidden surface capabilities of solid modeling systems. The data structures of a 3-D solid modeling system are such that the topological description of the real object is stored in the computer. All imaginable views, planes, isometric, or perspective, as well as cutaways, are just "aspects" (in the Latin sense of the word). The computer does not "think" of the objects as planimetric projections. But any desired view can be obtained by choosing a viewpoint, a direction of view, and the type of view, and striking a few keys on the computer terminal's keyboard.
A further important feature is the ability to remove or suppress hidden lines and hidden surfaces. Only with such features can we evaluate visibility and parallax problems and obtain views of the objects that adequately reflect the real product or structure being designed. With such a system, we can design every item of our workstation and assemble these items and structures in many different ways, trying out alternatives and testing for reach and legibility for representative populations. Since the system is a 3-D system and since the computer's internal description is an entirely three-dimensional topological one, we are able to try different inclinations of panels, examine parallax problems from different viewpoints, etc. (Eichweber, 1981)
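The idea that any view is just an "aspect" of one stored 3-D description can be illustrated with a minimal perspective projection; this is a toy sketch of the principle, not EIDS code:

```python
def project(point, eye, focal=1.0):
    """Perspective-project a 3-D point for a viewpoint at `eye` looking
    along +z; returns image-plane (x, y), or None if the point lies
    behind the viewpoint."""
    x, y, z = (p - e for p, e in zip(point, eye))
    if z <= 0:
        return None  # behind the eye: not visible from this viewpoint
    return (focal * x / z, focal * y / z)

# The same stored point yields different "aspects" for different viewpoints.
corner = (1.0, 2.0, 4.0)
view_a = project(corner, (0.0, 0.0, 0.0))
view_b = project(corner, (0.0, 0.0, 2.0))
```

A real solid modeler adds hidden-line and hidden-surface removal on top of such a projection, which is what makes the visibility and parallax evaluations described here possible.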
But besides enabling us to control every detail of the layout and enabling us to examine different anthropometric conditions and postures for representative populations, EIDS aids us in the process of organizing the man-machine dialogue by enabling us to try out different arrangements of indicators and controls. We can design and test symbols and scales and develop new integrated solutions for the man-machine dialogue. We can determine the best organization for readability. If the indicators move, we can test different positions and combinations of the respective symbols, even in their extremes. We can "build" our workstations and examine them from any viewpoint, and test and discuss them with others. The tedious, time-consuming, and costly process of building mock-ups can in many cases be almost totally omitted. We can ensure that in most of our man-machine interfaces, little or nothing will have to be changed as the result of trials, even if we go directly to prototype construction.
For different workstations involving similar tasks, we can design functional modules, which, once they are stored in the data base, can be reused for different applications. For standard engineering analyses, we can use the geometric data in finite element stress analysis programs and for the numerical control of parts machining equipment.
Three-dimensional data bases can also be used for simulation processes. Indicator systems can, for example, be represented on display screens, linked to simulators of the respective workstation or weapons system, and then be used together with appropriate controls to simulate the response of the system to the actions of the operators. This can be helpful, for instance, in the layout and immediate testing of crew stations for airplanes, fire control systems, traffic controllers, etc.
Finally, we can use the data base to obtain the documentation drawings needed for training handbooks, service manuals, and illustrated parts lists. Perspective drawings and exploded views are readily available by-products of the same data base. If one considers the extensive amount of manpower and time it costs to obtain these with traditional drafting techniques, it is obvious that the repeated use of the same 3-D data base multiplies its cost-effectiveness.
At Eichweber, for instance, we have used EIDS for the engineering design of the newest generation of our Tactical Laser Illuminating Shot Simulators (TALISSI) systems for directly aimed weapons. All mechanical parts in the system were entered into the data base. Moving parts are simulated in their various operational positions. Perspective and exploded view drawings for documentation are also prepared with EIDS.
SUMMARY
In our work, we have found 3-D CAD systems to be a vital tool in the proper human engineering of a modern weapons system. Nevertheless, a tool is only worthwhile if it is applied well. Human engineering can only be done well if all available knowledge and criteria are applied adequately.
To achieve this goal, we use interdisciplinary teams and what we call the Value Design method (Eichweber, 1981). This method is based on Value Engineering and helps to get close to the optimum in both function and cost. (By the word "design" we mean to indicate that man and man-related functional considerations are inherent in the conceptual design.) This approach helps us to easily integrate specialists into an interdisciplinary, goal-oriented working team, with the rules of Value Engineering and Value Analysis serving as the "rules of the game."
REFERENCES
Bittner, A.C., "Computerized Accommodated Percentage Evaluation (CAPE) Model for Cockpit Analysis and Other Exclusion Studies," TP-75-49, TIP-03, Pacific Missile Test Center, Point Mugu CA, December 1975.
Dillhoff, Evans, S.M., Krause, H.E., "Incorporation of Reach, Dynamic and Operational Capabilities in an Improved COMBIMAN Man-model," UDRI-TR-74-50, University of Dayton Research Institute, Dayton OH, 1974.
Edwards, R.E., Osgood, A.G., Renshaw, K.S., Chen, H.H., "Crewstation Assessment of Reach: User's Manual," Boeing Aerospace Company, Seattle WA, April 1976.
Eichweber, G., "Chance Fuer Mittelstandische Unternehmen," der Arbeitgeber, Koeln, April 1981.
Evans, S.M., "Cockpit Design and Evaluation Using Interactive Graphics," Proceedings of NASA Conference on Applications of Graphics in Engineering, October 1972.
Kroemer, K.H.E., "COMBIMAN, COMputerized BIomechanical MAN-model," AMRL-TR-72-16, Wright-Patterson Air Force Base OH, 1973.
Lane, N.E., Wherry, R.J., Strieb, M.I., "The Human Operator Simulator: Estimation of Workload Reserve Using a Simulated Secondary Task," NATO AGARD Conference #216, Neuilly-sur-Seine, 1977.
Rohmert, W., "Determination of Stress and Strain of Air Traffic Control Officers," NATO AGARD Conference #216, Neuilly-sur-Seine, 1977.
Strieb, M.I., "The Human Operator Simulator: HOS Study Guide Overview," TR-1181, Analytics, Inc., Willow Grove PA, 1981.
INSTRUCTOR/OPERATOR STATION DESIGN CONCEPTS
DR. WILLIAM M. HINTON
and
MR. WALTER M. KOMANSKI
SPECTRUM OF AMERICA, INC.
The ever-increasing sophistication and complexity of weapon systems and the attendant rise in costs are parallel trends in the devices being procured to support weapon systems training. Compounding the situation is the explosion in simulation technology which has led to the development of sophisticated training devices. Often the devices have resulted from the technology because it is available and not as a result of an analysis defining the specific training requirement.
In the development of a training device, the first step is an analysis of the training requirements which the device will satisfy. Over the years these analyses have been non-existent or, when conducted, have taken various forms with differing levels of accuracy and thoroughness. All too often the analyses have been a cursory assessment of the existing or projected training requirements. As a result, the design and development of the devices have been left in the hands of engineering, computer, and software specialists. Their primary orientation and interest is the development of training hardware which operates in accordance with the approved specification. This does not, however, necessarily mean that the device satisfies the training requirements.
The emphasis on the engineering development of a training device has developed by default (i.e., the lack of an analysis of the training requirements). The lack of participation by human factors and training analysis personnel in providing the necessary input related to the training requirements, the instructor role, ease of operation, and in particular the design of the Instructor/Operator Station (IOS), has been a major contributor to the situation.
As a result, training devices have been delivered with IOSs which are complex and difficult to operate. For example, many IOSs have multiple data entry methods such as switches, light pens, numeric keyboards, alpha-numeric keyboards, multi-function switches, pages and pages of CRT information which must be called up before the specific item can be located, etc. Extensive formal training is required for the instructor/operator to become proficient. Daily use is required to remain familiar with the operation of the IOS. In the "real world" situation the instructor/operator may or may not have had formal training. More likely he has received an abbreviated on-the-job training program and does not use the device daily. Many instructors are, therefore, not qualified to administer training in the device. The result is that training quality, quantity, standardization, and efficiency suffer.
IOS design is not just the physical layout of a work place. It also encompasses consideration of instructional features, information management, software capabilities, and many other varied factors. To properly incorporate the many factors into an efficient, training-effective design requires a systematic analysis process. It is apparent that many IOSs currently in use have not evolved through such an analysis and design process.
The purpose of this paper is to summarize the problem areas which have resulted from analysis and design shortcomings and to present design principles and concepts to overcome these deficiencies. The focus is on operational flight trainers (OFTs). Many of the problems and recommendations, however, are applicable to a wide range of training devices.
PROBLEM AREAS
There are significant differences in the design features of existing simulator IOSs. Some work better than others; some have sophisticated features but do not work very well; others are simple and work quite well. Collectively, however, there are weaknesses which characterize OFT IOSs. These weaknesses are as follows:
- Layout, labelling, coding, etc. do not optimize device operation and minimize instructor workload.
- Steps required to access many CRT displays are time consuming and inefficient.
- Delays in display access cause inefficiencies in training problem control and monitoring.
- Data entry methods are confusing, redundant, and inefficient.
- Many features and capabilities are not needed, not used, or difficult to use.
- Instructor training does not adequately prepare instructors for their roles and responsibilities in using simulators.
- Instructor roles and responsibilities are not completely defined and implemented.
- Student training syllabi are poorly developed and used. Standardization, organization, efficiency, and quality of instruction are, therefore, impaired.
- Simulator capabilities are not keyed to the training requirements (i.e., objectives) of the training system in which it will be used. The simulators are, therefore, not properly integrated into the training system as an efficient element which fills a clearly-defined need.
- Instructor handbooks are not properly organized and formatted to provide instructor assistance in operating the device in a training situation. They are massive informational volumes, not training tools.
When taken together, the clear impression is conveyed that the total concept of design for the instructor is very "loose" (i.e., the systematic principles of man-machine interface, human factors, and training technology have not been properly applied). The net result is that planning and design for operation of the device are secondary components of the total simulator development process. Actual operation of the device and training using the device, therefore, suffer.
In many cases it is evident that a number of the features and capabilities were included, not because they were clearly required to accomplish the desired level of training, but because no one really knew whether or not they should be there; so they were put in. This tendency toward overkill has resulted from two primary causes: (1) failure to perform the proper training analysis efforts leading up to device design and (2) the explosion in potential device capabilities resulting from technological advances in the state-of-the-art. The former can be characterized as follows: "We're not completely sure what we need to teach in the device or how we should teach it, so we'll stick these characteristics in just to be sure." The latter could go something like this: "We do not know just how we'll use these instructional features, but we'll put them in anyway. They're neat." The overkill increases the requirements for software and/or hardware and escalates cost.
The easy way is to allow simulator technology to dictate and include everything that it permits, and hope every contingency is included. The rationale is sometimes expressed as "we may need it someday," "it's nice to have," or "it doesn't cost any more." This approach is the development of a substitute aircraft and not a training vehicle, with the result that, due to the complexity of the IOS, the instructor pilot has neither the skill, the time, nor the inclination to utilize the capabilities of the simulator. Thus the training capabilities of the simulator are degraded.
While the capabilities and features may be very desirable on paper, they tremendously complicate device operations and increase costs. Complications result from the amount of information which must be handled, the variety of data presented to the instructor, the complexity of the simulator control, the amount of training required for instruction, and human limitations.
ASSUMPTIONS
The data on IOS problem areas lead to a set of assumptions which provide guidelines and a framework for the design principles and concepts. The assumptions are as follows:
- A specific simulator training syllabus will be developed as part of the overall planning, design, and fabrication of a device. The syllabus will result from a systematic front-end analysis in which the simulator is treated as a component of an integrated training system. It will be approved by the User and will form the basis for implementation of training on the completed device. Admittedly this assumption does not always embody the way things are usually done. Rather it embodies the way simulator acquisitions should work and is a direction in which acquisition policies and procedures are moving. A well-designed syllabus is an important prerequisite to good simulator design.
- Training system and human engineering personnel will participate in the design of the IOS. It is further assumed that these personnel will have the skills required to develop and validate the training tasks and syllabus and to convert the validated data into an efficient IOS. The emphasis is that the IOS will be keyed to the training requirements and situation, not to technology, excess capabilities, and designer "whims."
- Instructor pilots (IPs) will continue as simulator training instructors for the foreseeable future. This assumption has both positive and negative aspects. Positively, it means that tactically, and with respect to flight skills, the instructors will be highly qualified. They will possess the ability to closely identify with student problems. Negatively, they may not be highly motivated and trained to perform the roles and responsibilities of simulator instructors. IOS design must, therefore, encompass instructor aid features which minimize instructor shortcomings and which maximize their strengths.
- Training of IPs will not improve in the future. As noted previously, most instructor training is insufficient to properly prepare them for their role in administering, monitoring, and evaluating training exercises. IOS hardware and software must, therefore, be easily interpreted and used, and must provide support to offset the effect of insufficient training.
- Simulator technology will continue to advance in rapid strides. It will, therefore, be increasingly important for analysts and designers to stay abreast of technological developments and to closely evaluate their applications to individual simulator acquisitions.
- Simulators will be used as training vehicles and not as substitutes for the aircraft. They will, therefore, contain only those attributes which contribute to training. The attributes will be determined through a systematic analysis of the training requirements and the training system.
DESIGN PRINCIPLES
The flight simulator design and operation weaknesses and the assumptions form the basis for the IOS design principles. The design principles serve two primary purposes: (1) to provide a summary of what should be done to correct the weaknesses discussed previously and to implement the assumptions, and (2) to provide a set of guidelines which will direct the development of IOS design recommendations. The design principles are as follows:
- Reduce the instructor/operator requirements for formal training by providing an instructor/operator instructional aid system at or near the IOS. This feature will not only provide training in the operation of the device but will actively assist the IP throughout the training exercise.
- Emphasize the use of automated training as the normal mode of training.
- Reduce the instructor/operator workload by automating the ancillary tasks.
- Reduce the number and type of data entry methods at the IOS.
- Use "touch" panels on CRTs and electronic "touch pads" on the IOS console as the primary data entry methods.
- Design IOS layouts to enhance operation, interpretation, sequences of actions, etc.
- Eliminate the use of multi-function controls/switches.
- Standardize the nomenclature used for controls to reflect IP terminology.
- Investigate the use of color on CRT displays to highlight important points.
As discussed above, display location is standard. There is, therefore, no concern with where a given display will appear. Also, displays which are candidates to replace a given display being presented on a CRT are selectable using the touch pads on the CRT face. In most cases changing displays will require a one-step touch. In most other cases software design should minimize the number of decision points (i.e., steps) the instructor must handle.
- Minimum steps to manipulate training exercises. Minimum steps to manipulate is closely related to minimum steps to change displays. In this case, however, rather than changing displays for monitoring purposes only, the instructor is locating the display from which he will effect training problem control (e.g., locate and select a specific emergency, locate and activate a specific navigational beacon, etc.).
- Large 'selection of programmed exer-cises. One of the'stiTngths of the proposedapproach to IOS design and operation is ahigh-quality front-end analySis. Amongother things, the analys4s yields a realistictraining device syllabus which is keyed tothe training requicements and training situ-
' ation. The progNmmed exercises designed,from the syllabus are, therefore, essentialparts of the total training systeM and shouldbe administered to, each student. The largeselection allows consideration of studentskills and progrest, a variety of equallydifficult exercises for differlent students,standardized training, programed exeiTisecapability for all training Aases, andelimination of time-consuming'set-up requiredin conventional free flight exercises. ,
- Large selection of initial condi-tions. Setting up initial conditions in thefree flight mode on many simulators is time-consuming and may be nbn-standard. A largeset of initial conditions gives instructors!,the flexibility they desire and improves use'of time'and standardization. Selection of agiven set of initial conditions wowld ke'yottier responses from the 10S. For- examplefts,_
; selecting initial conditions for an-airci.:aft'on the parking ramp may activate diSplay ofthe-normal cockpit checklist, or selectinginitial an aircraft in mar-shall may activate the carrier approachdisplay. 4
IAS teaching and prompting. The IAShas`two primary roles. The first is to
106/106
teath instl-uctors howTo use the.simulator.In this role jt is a CAI terminal used topresent information on the operation of theIOS and on the procedures for conduct)ngtraining exercises. The seconrdle is toprompt instructor'S during set-u0.and execu-tion 'of training exercises. In pis rolethe IAS is a job aid which improves instruc-tor efficiency and standardization of. in-struction.
A quote from the 2F114 (A-6 WST) in-structor handbook. is as follows: "This vol=ume,is designed to give'the instructor theflexibility necessary to provide,each studentwith the complex and repetitive trainino herequires." This emphasis on flexibility hasled to IOS designs which are needlessly com-plex and which actually hinder training.
, This is not to say that flexibilityshould be eliminated: certainly not. De-
grees of flexibility are necessary to meet .
the variety of needs in a typical FRS train-ing situation. A major thesis of this paper,however, is that too much flexibility is a,hinderance. To balance flexibel4 withability to meet realistic training require-ments, the IOS Skould be accurately program-med based,on the results of a thoroughfront-end analysis. ,The software conceptspresented above provide the guidelines forthis programming. The more structuredapproach embodied in the inalyses and result-ing software will help en71ire that simulatortraining is standardjzed yet sufficientlYflexible to meet training needs, 1-eadilymanageable, and is easy for instructprs tolearn to administero
CONCLUSION
4
- Upgrade student and instructor training and operating materials.

- Improve reliability and maintainability by use of touch controls, eliminating light pens, etc.

- Reduce cost by limiting IOS capabilities to training requirements.
Each of the IOS design principles is discussed in the following paragraphs.
INSTRUCTIONAL AID SYSTEM. To implement the principle to reduce the need for formal training of IPs, it is proposed that a computer-aided instructional (CAI) system be incorporated as an integral part of the IOS. This Instructional Aid System (IAS) will have the capability of presenting instructional programs to the IP for operation of the device; provide the necessary instructions to the IP in the set-up, operation, and conduct of instructional exercises for a selected mode of operation; and cue the IP as necessary during a training exercise.

As a pure instructional tool it will be programmed to present instruction on basic operations of the device, to include topics such as control locations and operations, display formats and interpretation, exercise control, and use of related training material (e.g., instructor guides and checklists). During set-up and conduct of exercises it will aid the instructors via prompts which guide them through the required operational steps. For example, set-up on most devices requires the instructor to make a sequence of decisions which establish the characteristics of the exercise. The IAS would step instructors through this process, thus reducing instructor training requirements and reducing error rates. During exercises the IAS would provide cues and instructions to enhance instructor performance, decision making, etc., as he monitors, controls, and evaluates.
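The set-up prompting role described here can be pictured as stepping the instructor through an ordered series of decisions, each with a fixed set of legal options. The sketch below is illustrative only; the step names, options, and `prompt_setup` function are assumptions, not part of any actual IAS specification.

```python
# Sketch of the IAS set-up prompting role: step the instructor through
# an ordered series of decisions, each with a fixed set of legal
# options. Step names and options are hypothetical examples.

SETUP_STEPS = [
    ("initial condition", ["parking ramp", "runway", "marshall"]),
    ("environment", ["day calm", "night calm", "day turbulence"]),
    ("exercise", ["normal takeoff", "engine fire on takeoff"]),
]

def prompt_setup(choose):
    """Walk the instructor through each set-up decision in order.

    `choose` is a callable (e.g. a touch-panel handler) given the step
    name and its legal options; it must return one of those options.
    """
    selections = {}
    for step, options in SETUP_STEPS:
        choice = choose(step, options)
        if choice not in options:
            raise ValueError(f"{choice!r} is not a legal option for {step}")
        selections[step] = choice
    return selections

# A scripted "instructor" that always accepts the first option:
result = prompt_setup(lambda step, options: options[0])
```

Because every decision point is enumerated in advance, the same structure that drives the prompts could also drive the CAI lessons on device operation.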
AUTOMATED OPERATIONS. The most sophisticated simulator, which will be efficient to use in achieving the training objectives and which will simplify the instructor workload, allowing him to concentrate on instruction versus operation, is the trainer which incorporates automated (programmed) training exercises. The apprehension that automated training will be highly structured and thus rigid can be eliminated by astute planning in the development of a simulator training syllabus and the specific training exercises. By using the analytical approach, the planning will consider all the variables and contingencies required for achieving the training objectives and, therefore, provide all the required flexibility in simulator training.

Use of an automated simulator instructional system, by its very nature, will promote standardization. The training exercises are the same for each trainee. The information presented to the trainee is consistent and in the correct format, and the performance measurement parameters and scoring procedures are the same for each trainee. Evaluation of the trainee performance is more objective and, therefore, more valid. Due to the simplicity of the design of the IOS, Instructor Pilot activity at the IOS is greatly reduced, allowing him to concentrate on his instructional role.

The use of automated instruction will require a reorientation in the concept of simulator training. This requires the recognition that the simulator is a training vehicle and not a poor substitute for the airplane. As a training vehicle it should be responsive only to the determined training requirements as defined by specific training syllabi.

The use of automated training exercises, specifically designed to provide training in achieving designated training objectives, will eliminate excessive trainer set-up time required in a "free flight" mode. In the "free flight" mode of operations in many devices, the instructor must determine, select, and insert every parameter, such as initial conditions, aircraft location, fly-to points, ground communications facilities, geographic displays, environmental factors, and others. A manual set-up time of 15-30 minutes is not unusual; this situation not only is non-productive time for the trainee but deprives him of scheduled training.
It is recommended that automation go one step further. There would be no "free flight" mode as currently defined. Free flight would be a cross between current free flight and automation (i.e., certain operations would be automated, and instructor prompting would be provided). For example, the instructor could not enter initial conditions on-line. Rather, he would always select from the programmed set of initial conditions. To accommodate this feature the set of initial conditions would be large and would be systematically derived from an analysis of the training requirements.

Selection of initial conditions to begin an exercise would automatically branch the IOS to pre-determined displays and instructions for instructor actions. This basic philosophy of preselecting those displays and actions which can be predicted ahead of time, based on the syllabus and type of training to be conducted, would be carried throughout the free flight mode.
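The branching from a selected initial condition to its pre-determined displays amounts to a simple lookup. In the sketch below, the parking-ramp and marshall pairings follow the examples given later in the text; the "runway" entry and the display names themselves are illustrative assumptions.

```python
# Sketch of branching from a selected initial condition to its
# pre-determined IOS displays. The mapping contents are illustrative.

DISPLAYS_BY_CONDITION = {
    "parking ramp": ["normal cockpit checklist"],
    "marshall": ["carrier approach display"],
    "runway": ["takeoff checklist"],
}

def select_initial_condition(condition):
    """Return the displays the IOS should bring up automatically."""
    try:
        return DISPLAYS_BY_CONDITION[condition]
    except KeyError:
        # No on-line entry of arbitrary conditions: only the programmed,
        # systematically derived set may be selected.
        raise ValueError(f"{condition!r} is not a programmed initial condition")
```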
IOS LAYOUT. At first sight the IOS for a sophisticated aircrew simulator may be overwhelming in its size and complexity. The numbers of switches, displays, keyboards, and other assorted input/output features may be ominous to the potential simulator instructor. Even after the instructor has become familiar with operation of the device, there may still be a feeling of being overwhelmed.

The IAS and emphasis on automation are features which will reduce the complexity of the IOS. Even with these features, however, there will still be requirements for enough controls and displays to present a confusing operating situation. In order to reduce confusion and increase the efficiency of operation, the IOS design must be developed through a systematic human factors analysis. The analysis is not just to meet the requirements of MIL-STD-1472C (Human Engineering Design Criteria for Military Systems, Equipment, Facilities). It is intended to yield the most efficient training/operating environment possible for the instructors. Principles such as placement based on frequency, criticality, and sequence of use must be used.

Characteristics of the CRT displays must be closely considered. Many of the displays used in existing devices are cluttered and difficult to interpret. One salient point is the possibility of using color to highlight selected portions of displays.
DATA ENTRY. Data entry in many devices is needlessly complicated. For example, in some devices combinations of light pens, fixed function keys, and variable function keys are required interchangeably throughout an exercise. This mixing of modes increases the time required to gain proficiency and the probability of error and confusion.

To remedy this weakness it is recommended that data entry modes be limited to a set of fixed function electronic touch pads on the IOS console and touch panels on the IOS CRTs. The touch pads available on each CRT display would be a function of the training situation as reflected on the CRT and would vary from display to display. Basically, the fixed function keys would provide simulator control. The CRT touch pads would provide training exercise control.
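One way to picture the two recommended entry paths is as a fixed set of console pads plus a per-display set of CRT touch pads. All pad and display names below are invented for illustration.

```python
# Sketch of the two recommended data-entry paths: fixed-function
# electronic touch pads on the console for simulator control, and CRT
# touch pads, varying per display, for exercise control. All pad and
# display names are hypothetical.

FIXED_FUNCTION_PADS = {"FREEZE", "RESET", "MOTION ON/OFF"}

# CRT touch pads change with the display currently presented.
CRT_PADS_BY_DISPLAY = {
    "takeoff checklist": {"NEXT ITEM", "MARK UNSAT"},
    "approach plate": {"ACTIVATE BEACON", "SELECT EMERGENCY"},
}

def legal_inputs(current_display):
    """Every entry the instructor can make with the given display shown."""
    return FIXED_FUNCTION_PADS | CRT_PADS_BY_DISPLAY.get(current_display, set())
```

Restricting entries to this union is what keeps the two modes from mixing: the console pads never change, and the CRT pads are always tied to what is on the screen.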
STANDARDIZED NOMENCLATURE. Operating inefficiencies may be caused by unclear, confusing, ambiguous, or unfamiliar terminology. To help remedy this problem, it is recommended that standardized terms be adopted and used. Since IPs are the primary operators of the simulators, the terms used should be in "pilotese". Abbreviations should be avoided when feasible. Coding should be minimized and, when used, should follow a clearly interpretable, easily remembered scheme.
SUPPORTING MATERIALS. Although the student and instructor materials are not directly involved in IOS design, they do play a direct role in that they will be used in conjunction with the IOS to carry out training exercises. They should, therefore, be designed to prepare the students and instructors to carry out their roles during training exercises.

The IAS is intended to take over some of the functions served by the instructor training and operating materials. Adjunct written materials, however, will still be required. These materials include the instructor handbook, exercise guides, and checklists. As noted previously, the quality of these materials for existing simulators is generally inadequate. Their shortcomings contribute to simulator utilization problems. For emerging systems they must be upgraded.
RELIABILITY AND MAINTAINABILITY. Reliability and maintainability will be enhanced by the application of the design principles, stated above, to the IOS, which in turn will be reflected in the total simulator. The elimination and/or reduction of switches and replacement with touch controls, and the elimination of complex keyboards and light pens, will make the IOS more reliable and require less maintenance.
COST. Reduction in cost for the IOS/simulator is a primary goal in the application of the design principles. Eliminating the overkill capabilities inherent in current simulators will result in lower procurement, operating, and maintenance costs.
DESIGN CONCEPT

The design concept is based upon consideration of four data sources:
- Instructor tasks: the roles and responsibilities of IPs in administering training exercises. For example: brief student, select mode of operation, initialize exercise, monitor trainee, evaluate performance, etc.

- Training tasks: tasks the trainee will practice in the device. These are the tasks from a typical Instructional Systems Development task listing which are selected for simulator training.

- Design principles discussed in the previous section.

- Technology resulting from an assessment of current and projected state-of-the-art.
The design concept consists of two components: hardware and software. The components interact and are dependent one upon the other. A major issue in any design involving both hardware and software is establishing the proper functional balance between the two. In pointing out the distinction between hardware and software, there is an emphasis that IOS design is much more than using good human factors principles in determining what the IOS should look like. It is a systematic process of determining the information storage, retrieval, display, and manipulation requirements and implementing these requirements in a combination of hardware and software which optimizes instructor performance. Major design concepts are discussed in the following paragraphs.
HARDWARE. The IOS hardware must be reliable, maintainable, and easy to operate, and yet must contain the components required for control, monitoring, and evaluation. Major hardware components of the proposed IOS design concept are as follows:
- The main console. The main console is compact and is designed for a single instructor/operator. All layout is consistent with good human factors design principles. Particular emphasis is on ease of use of controls, orientation of CRTs for ease of display interpretation, and a functional work surface which accommodates instructor guides, checklists, etc.

- Instructor Aid Station (IAS). The IAS is a separate small system with associated controls. It is an integral part of the IOS and serves two functions: (1) present instruction on basic trainer operation and (2) prompt and guide instructors during the course of training exercises. It may also be used to display selected cockpit instruments (e.g., multi-function display, horizontal situation indicator, etc.).

- Instructor's control panel. The instructor's control panel is the major "hard-wired" part of the IOS dedicated to simulator control. It consists primarily of fixed-function electronic touch pads.
SOFTWARE. In order to combine simplicity of hardware with complexity of weapons systems and training problems, it is essential that software be designed to enhance instructor performance. The software must be simple to manipulate, present information in easily used formats, and facilitate problem control and monitoring. Major features of the proposed software design concept to meet these goals are as follows:
- Display continuity. A given display is always presented on the same CRT. There is no switching of displays between CRTs at the instructor's discretion. When a display is selected or is called up automatically by the software, it always appears in the same place.

- Standardized displays. Each type of display has a distinct, precisely prescribed format which is always used for that type of display. Formats are developed to enhance interpretation and use of the information presented. Display highlights are emphasized by spacing, graphics, bold alphanumerics, etc. It is also recommended that color be used to highlight portions of the displays.

- Automatic presentation of displays. Maximum use is made of software selection of displays, so that minimum instructor intervention is required. During pre-programmed modes all displays are software selected. During free flight the amount of instructor display selection is minimized through keying displays to trainee tasks.
- Minimize steps to change displays. As discussed above, display location is standard. There is, therefore, no concern with where a given display will appear. Also, displays which are candidates to replace a given display which is being presented on a CRT are selectable using the touch pads on the CRT face. In most cases changing displays will require a one-step touch. In most other cases software design should minimize the number of decision points (i.e., steps) the instructor must handle.
- Minimum steps to manipulate training exercises. Minimum steps to manipulate is closely related to minimum steps to change displays. In this case, however, rather than changing displays for monitoring purposes only, the instructor is locating the display from which he will affect training problem control (e.g., locate and select a specific emergency, locate and activate a specific navigational beacon, etc.).
- Large selection of programmed exercises. One of the strengths of the proposed approach to IOS design and operation is a high-quality front-end analysis. Among other things, the analysis yields a realistic training device syllabus which is keyed to the training requirements and training situation. The programmed exercises designed from the syllabus are, therefore, essential parts of the total training system and should be administered to each student. The large selection allows consideration of student skills and progress, a variety of equally difficult exercises for different students, standardized training, programmed exercise capability for all training phases, and elimination of time-consuming set-up required in conventional free flight exercises.
- Large selection of initial conditions. Setting up initial conditions in the free flight mode on many simulators is time-consuming and may be non-standard. A large set of initial conditions gives instructors the flexibility they desire and improves use of time and standardization. Selection of a given set of initial conditions would key other responses from the IOS. For example, selecting initial conditions for an aircraft on the parking ramp may activate display of the normal cockpit checklist, or selecting initial conditions for an aircraft in marshall may activate the carrier approach display.
- IAS teaching and prompting. The IAS has two primary roles. The first is to teach instructors how to use the simulator. In this role it is a CAI terminal used to present information on the operation of the IOS and on the procedures for conducting training exercises. The second role is to prompt instructors during set-up and execution of training exercises. In this role the IAS is a job aid which improves instructor efficiency and standardization of instruction.
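The display continuity and automatic presentation features above can be sketched as a small display manager that routes every request to a display's fixed home CRT. The CRT numbers and display names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the "display continuity" rule: each display type
# is permanently bound to one CRT, so it always appears in the same
# place regardless of how it is called up. CRT numbers and display
# names are hypothetical.

CRT_ASSIGNMENT = {
    "checklist": 1,
    "approach plate": 2,
    "aircraft parameters": 3,
}

class DisplayManager:
    """Routes every display request to that display's fixed home CRT."""

    def __init__(self):
        self.showing = {}  # CRT number -> display currently presented

    def show(self, display):
        crt = CRT_ASSIGNMENT[display]  # never varies at instructor discretion
        self.showing[crt] = display
        return crt
```

Whether a display is touched up by the instructor or called up automatically by the software, `show` puts it in the same place, which is the whole point of the rule.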
CONCLUSION
A quote from the 2F114 (A-6 WST) instructor handbook is as follows: "This volume is designed to give the instructor the flexibility necessary to provide each student with the complex and repetitive training he requires." This emphasis on flexibility has led to IOS designs which are needlessly complex and which actually hinder training.
This is not to say that flexibility should be eliminated: certainly not. Degrees of flexibility are necessary to meet the variety of needs in a typical FRS training situation. A major thesis of this paper, however, is that too much flexibility is a hindrance. To balance flexibility with the ability to meet realistic training requirements, the IOS should be accurately programmed based on the results of a thorough front-end analysis. The software concepts presented above provide the guidelines for this programming. The more structured approach embodied in the analyses and resulting software will help ensure that simulator training is standardized yet sufficiently flexible to meet training needs, readily manageable, and easy for instructors to learn to administer.
General
MODULAR CONTROL OF SIMULATORS
Steve Seidensticker
Catherine Heyn

Tactical and Training Systems Division
Logicon, Inc.
San Diego, CA
ABSTRACT
One of the issues facing the designer of sophisticated simulators today is how to control their operation. The traditional approach has been to supply the instructor/operator with a series of hardware devices (push buttons, keys, switches, etc.) with which to command the machine. A recent trend has moved the functions to a CRT where commands are activated via an associated light pen, special function keys, or, in some cases, by touching the CRT screen. Although the methods of entering commands to a simulator are changing, the commands themselves are not.

This paper explores a concept in which series of lower level events or commands are grouped into a larger entity called, for the sake of this discussion, a "task module". Manipulation of these task modules permits simpler operation of a simulator, and the task modules themselves provide natural vehicles for performance measurement, scenario generation, briefing/debriefing, replay, and other associated functions. In the future this concept can provide the framework for modeling intelligent adversaries or support organizations in multi-unit simulators.
BACKGROUND
Simulators for flight training purposes have been in use for many years. As material and energy costs climb, the need for inexpensive, effective training has correspondingly grown. The simulator is a natural tool for this kind of training. The typical simulator of today is a highly mechanized collection of dials, gauges, and displays, connected to a computer, relying on limited numerical inputs by a skilled technician to drive a mockup of the equipment being simulated. A typical example is the 2F95 F-14 Operational Flight Trainer (OFT).
Typical Cockpit Simulator Characteristics/Operation
The F-14 Tomcat is a U.S. Navy high performance aircraft. To fulfill its role, the aircraft contains very sophisticated and complex avionics and weapon systems.
The 2F95 OFT is a relatively modern device consisting of a simulated F-14A cockpit (pilot only) mounted on a motion platform capable of providing pitch, roll, heave, and lateral displacement about the related axes. Visual scene simulation is provided by a single channel, field of view VITAL III-S display that provides night/dusk scenes for both land- and carrier-based operations.
The 2F95 incorporates a single XEROX Sigma 5 computer system for all simulation and interaction with the student and instructor. Operation is controlled from a remote instructor/operator station located away from, but in sight of, the simulated cockpit.

The instructor controls operation in two ways: one sets up an exercise, and one controls operation during the training session.
The set-up operation of the OFT is done by "programming" an Alphanumeric Data Display (ADD) system. This system consists of a CRT terminal and a ten-key numeric keypad. The ADD system has a total of 22 informational displays, nine of which deal with the parameters of flight (that is, carrier site, sea state, wind state, and so forth). To create an exercise, the information on these displays must be updated to reflect the intent of the current training session. By editing the information on the parameter pages, the instructor programs the malfunction and reposition options to be encountered in the training exercise.

The second method of controlling the exercise is via a special control panel, which contains hardware switches, dials, and indicators that allow direct manipulation of the simulator (fig. 1). From this panel the instructor can reposition the aircraft, insert malfunctions, adjust air turbulence and sea state, and control the motion platform.
Figure 1. OFT Control Panel

Unfortunately, this method of control results in the instructor spending more time playing with numbers than with doing the primary job, instructing a student. For example, to give a student experience in dealing with an engine fire on takeoff, the following inputs must occur.

1. First, the plane must be on the end of the runway to start takeoff. To do this, one must enter the ADD page number of the reposition display on the keypad mentioned previously.

2. Next, a code number representing the desired position is entered.

3. Then a reset button is pressed. Now the plane is on the runway.

4. To program the malfunction, a new page number must be entered to put up the malfunction display.

5. An engine fire, represented by another code number, must be added to the list of available malfunctions. Another number, associating this malfunction with a malfunction insertion push button on the control panel, must be entered.

6. When the student starts takeoff, the instructor monitors the airspeed indicator and inserts the malfunction by pushing the malfunction button at precisely the right time.

7. A fire light goes on in the cockpit.

At no time during this button pushing exercise has the instructor accomplished anything that has a direct impact on teaching the student. In the worst case, the entire instructional function can be lost in the mechanics of controlling the simulator.
Instructor Support System
To alleviate this situation and other problems associated with simulator training, the U.S. Navy through NAVTRAEQUIPCEN sponsored the development of an Instructor Support System (ISS). The purpose of this system is to take the mechanized aspects out of the training program and replace them with intelligible user interfaces that describe an exercise functionally and in terms familiar and acceptable to today's trained flight instructor.

The ISS completely replaces the 2F95 IOS with a new station consisting of two graphic CRT displays, placed one above the other, with a touch sensitive device overlaid on the lower display. All information needed to conduct a training session is presented dynamically on these displays, and all control of the problem and the simulator itself is via the selection of menu choices with the touch panel.

The subject of this paper is but one aspect of the ISS. Other important ones are dealt with in other papers presented at this workshop.
TASK MODULE CONCEPT
In a system as complex as the F-14 OFT a valid training structure is imperative. To facilitate such a structure, logic suggests the breakdown of the student pilot's tasks into functional groups. For the sake of this discussion these groups are called "task modules". A task module is a logically related series of external events and pilot actions that should result in the meeting of predefined criteria. Examples might be the takeoff from a particular airfield, a pre-start checklist, or a wing sweep malfunction. Task modules as conceived here have certain important characteristics and qualities. A brief identification follows; more detailed discussion is provided later.
1. They can be "run" or executed. That is, they have beginnings and endings in time. They can run in parallel; that is, two or more task modules can run at the same time, independent of each other.

2. They can be easily identified and manipulated. This forms the basis of high level simulator control, exercise definition, replay, and the like.

3. They provide the vehicle for associated training functions, such as performance evaluation and record keeping.
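The three characteristics above can be sketched as a small class: a named unit with a start predicate, an end predicate, and its own event record, any number of which can be updated side by side. The state-variable names ("on_runway", "alt", and so on) are hypothetical, not from the ISS.

```python
# Minimal sketch of a "task module": a named unit with a start
# predicate, an end predicate, and its own performance record.
# Several modules can be active at once, independent of each other.
# State-variable names are hypothetical.

class TaskModule:
    def __init__(self, name, starts_when, ends_when):
        self.name = name
        self.starts_when = starts_when  # predicate over simulator state
        self.ends_when = ends_when      # predicate over simulator state
        self.active = False
        self.events = []                # recorded for replay/review/grading

    def update(self, state):
        """Called once per simulation step with the current state."""
        if not self.active and self.starts_when(state):
            self.active = True          # a beginning in time
        if self.active:
            self.events.append(dict(state))
            if self.ends_when(state):
                self.active = False     # an ending in time

# Two modules that could run in parallel during the same exercise:
takeoff = TaskModule("takeoff",
                     starts_when=lambda s: s["on_runway"],
                     ends_when=lambda s: s["alt"] > 500)
engine_fire = TaskModule("engine fire",
                         starts_when=lambda s: s["fire_inserted"],
                         ends_when=lambda s: s["fire_out"])
```

The `events` list is what makes the module a natural vehicle for the associated training functions: replay, performance evaluation, and record keeping all read from it.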
TASK MODULE TYPES
In the development of this concept task modules have quite naturally fallen into three types or groups. They are normal, flight, and malfunction. Normal task modules include pre- and postflight checkouts and checklists during flight. Flight task modules encompass those tasks related to flying skills, procedures, and navigation. Malfunction task modules relate to system failures.
Normal
These are non-emergency procedures such as engine starts, checklists, etc. An example of this type is the takeoff checklist. In this case, the task module is activated by an instructor request at the touch panel. At this time, the checklist appears on the instructor's display containing the items to be accomplished (fig. 2). Additionally, the training system may monitor these procedures. The student pilot will perform such actions as are required and, when through, indicates completion verbally. To end this task module, the instructor pushes another button. The checklist disappears, and data concerning performance is recorded for later replay, review, or analysis.
Flight
Flight task modules are those concerned with the student's ability to carry out the mission, be it navigation, landing practice, or the like. This can be exemplified by the TACAN 2 approach to Miramar Naval Air Station. The pilot flies a teardrop pattern from an initial approach fix to a final approach fix (fig. 3). In this situation, the events beginning and ending the task module are entirely automatic. The instructor can devote all energy to monitoring the student's performance. The task module starts when the pilot is going in a specific direction at a specified speed and altitude. When the student has started the approach, and by inference the approach task module, monitoring begins again. This time, however, the events being monitored are those specifically related to the approach task. The student flies at certain speeds, headings, and altitudes to follow the proper approach patterns. When the student completes the approach, based on speed, altitude, and heading, the task module also ends.
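The automatic start and end events for a flight task module like this reduce to window tests on heading, speed, and altitude. All numeric values below are illustrative placeholders, not the actual TACAN 2 approach parameters.

```python
# Sketch of automatic start/end events for a flight task module: the
# approach begins when the aircraft enters one heading/speed/altitude
# window and ends when it reaches another. Numbers are placeholders.

def within(value, target, tolerance):
    return abs(value - target) <= tolerance

def approach_started(state):
    """Pilot is going in a specific direction at a specified speed/altitude."""
    return (within(state["heading"], 255, 5)
            and within(state["speed"], 250, 10)
            and within(state["altitude"], 7000, 200))

def approach_complete(state):
    """Completion judged on speed, altitude, and heading."""
    return (within(state["heading"], 75, 5)
            and within(state["speed"], 135, 10)
            and state["altitude"] < 1200)
```

Predicates of this form are exactly what a task module's start and end conditions would be bound to, so the instructor never has to trigger the module by hand.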
Malfunction
Malfunctions are activated by instructor request, again through the use of a push button. In this case, however, the button is on a touch panel and is labeled with the name of the malfunction to be inserted, for example, ENGINE FIRE. Once the instructor has pressed the button, an indicator light appears in the student cockpit and a malfunction checklist appears on the instructor's display (fig. 4). As the student follows procedures, updates appear on the checklist indicating activity in the cockpit as it occurs. If a student completes all procedures correctly, the malfunction is removed automatically, or, if the instructor has requested it, may stay in until the instructor chooses to remove it manually with another button press.

Figure 2. Takeoff Checklist

Figure 3. Tacan 2 Approach Checklist

Figure 4. Malfunction Display
APPLICATION OF THE CONCEPT
Simulator Operation
In thi examples above the relationship/between task modules and simulator operationis implied. The key is lite linking of thelarger,Units of the training operation tothe simulator directly. In the ISS thisis done via touC.h selections of dynamicmenu items put on the display. That is,the instructor is now allowed to manipulatethe problem using tools that ate naturalto the application; not those tied to thesimulitor hardware. In a sense then, thetask module structure becomes A kind ofhigh orider language, where pilot-relevantexpressions replace numerical code commands.
In contrast to the procedure outlined earlier for inducing an engine fire on takeoff, a single menu selection marked ENGINE FIRE ON TAKEOFF will accomplish the same thing.
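The mapping of a single pilot-relevant menu selection onto the underlying sequence of simulator commands can be sketched as follows. This is a hypothetical illustration, not the actual ISS software: the class names, the command strings, and the particular low-level steps behind ENGINE FIRE ON TAKEOFF are all invented for the example.

```python
# Hypothetical sketch: a task module maps one pilot-relevant menu item
# (e.g. "ENGINE FIRE ON TAKEOFF") onto the sequence of low-level simulator
# commands an instructor would otherwise have to issue by hand.

class TaskModule:
    def __init__(self, name, commands):
        self.name = name          # menu label shown to the instructor
        self.commands = commands  # ordered low-level simulator commands

    def activate(self, simulator):
        """Run every low-level command behind a single menu selection."""
        for command in self.commands:
            simulator.execute(command)

class Simulator:
    """Stand-in for the simulator host; records the commands it receives."""
    def __init__(self):
        self.log = []

    def execute(self, command):
        self.log.append(command)

# One touch selection replaces a manual sequence of hardware-level inputs.
engine_fire_on_takeoff = TaskModule(
    "ENGINE FIRE ON TAKEOFF",
    ["position: runway threshold", "config: takeoff",
     "arm malfunction: right engine fire", "trigger: at 100 kts"],
)

sim = Simulator()
engine_fire_on_takeoff.activate(sim)
print(len(sim.log))  # prints 4: the four underlying commands were issued
```

The instructor deals only with the module name; the numerical code commands stay hidden behind it.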
Exercise Definition
Exercise building is much simplified. Here the task modules can be likened to building blocks. A complete training session can be developed quickly and conveniently from a list of task modules. In the exercise set-up facility of the ISS, groups of available task modules which are related in function are presented to the instructor in logical sequence (fig. 5). The instructor then simply selects the task modules which are appropriate for the exercise. All such building can be performed at the start of an exercise, and thus the instructor does not have to work "on the fly" to make sure all exercise components are covered.
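The building-block idea can be sketched as below. The module names and groupings here are assumptions made for illustration; the real ISS exercise set-up facility is a touch-screen menu system, not this code.

```python
# A minimal sketch, assuming hypothetical module names: functionally related
# task modules are presented in logical sequence, and the instructor simply
# picks the ones appropriate for the session before it starts.

MODULE_GROUPS = [  # groups presented to the instructor in order
    ("DEPARTURE", ["NORMAL TAKEOFF", "ABORTED TAKEOFF"]),
    ("APPROACH",  ["TACAN 1 (NKX)", "TACAN 2 (NKX)"]),
    ("LANDING",   ["NORMAL LANDING", "SINGLE ENGINE LANDING"]),
]

def build_exercise(selections):
    """Assemble a session from selected modules, preserving syllabus order."""
    exercise = []
    for group, modules in MODULE_GROUPS:
        exercise += [m for m in modules if m in selections]
    return exercise

session = build_exercise({"NORMAL TAKEOFF", "TACAN 2 (NKX)",
                          "SINGLE ENGINE LANDING"})
print(session)  # modules come out in training order, assembled up front
```

Because the whole session is assembled before training begins, nothing has to be configured "on the fly" once the exercise is running.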
Performance Measurement
Performance measurement and monitoring is also simplified. Grading parameters and performance criteria may be associated with specific task modules. The computer is therefore free to monitor only those events which have direct bearing on the task module procedures being graded. Without such a mechanism to limit the processing requirements, the system designer is faced with the prospect of having to watch everything all the time. This can be an overwhelming requirement.
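A minimal sketch of this scoping idea, with parameter names invented for illustration:

```python
# Sketch (parameter names are assumptions, not ISS data): grading parameters
# are attached to task modules, so the computer monitors only the parameters
# of the currently active module instead of every simulator variable.

MODULE_PARAMETERS = {
    "SINGLE ENGINE LANDING": ["angle_of_attack", "rate_of_descent", "alignment"],
    "NORMAL TAKEOFF": ["rotation_speed", "pitch_angle"],
}

def parameters_to_monitor(active_module):
    """Return just the measurements relevant to the active task module."""
    return MODULE_PARAMETERS.get(active_module, [])

watched = parameters_to_monitor("SINGLE ENGINE LANDING")
print(len(watched))  # prints 3: three parameters, not everything all the time
```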
Figure 5. Exercise Definition Selection Page
A related benefit in this area is the self-limiting nature of the performance data generated. Diagnostic messages and grades are only displayed for task modules associated with that training session. This simplifies and reduces the amount of data the instructor needs to examine to make valid decisions regarding the student's performance and abilities.
Modeling Outside Agencies
Another area in which task modules can improve a training session is handling mathematical models of outside agencies. Usually the actions of FAA controllers and other individuals who interact with the pilot can be linked to a single task module. For example, all the activity of a GCA controller can be linked to a final approach task module. Thus, a specific function can be tied to a specific task module. This contributes to a simplification of software design. If a function changes, the related task module can be altered without interfering with programs or other task modules. Modularity is thus a natural part of the system.
An extension of this concept, which has not been incorporated into the ISS, is that task modules could be built to provide intelligent adversaries or supporting entities in multi-role simulators. For example, the skill or aggressiveness of an enemy pilot in an intercept trainer could be easily manipulated using the task module concept. In another application the imaginary crew of an ASW aircraft could be commanded to perform some complex maneuvers in support of a shipboard ASW simulator via a simple menu choice.
Debrief/Replay/Record Keeping
Debriefing the student is much enhanced by the use of task modules. The student can be shown a specific task during a replay without extraneous information unrelated to the learning experience. Since grading is linked to a task module, record keeping becomes more viable and more clear to student and instructor alike. Any task module may be replayed in any order, saving time and enhancing the student's understanding by emphasizing the specific areas that need work. This is in sharp contrast to the problems that ensue when replay is time related and the action of interest must be searched for, or worse, when no replay is available.
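The module-indexed replay idea might be sketched as follows; the recording structure shown is an assumption made for illustration, not the ISS data format.

```python
# Hedged sketch: because grading and recording are keyed to task modules, a
# debrief can replay any module in any order, instead of scrubbing through a
# time-indexed recording looking for the action of interest.

session_recording = {  # task module name -> its recorded data segment
    "ENGINE FIRE": ["frame 120", "frame 121", "frame 122"],
    "SINGLE ENGINE LANDING": ["frame 300", "frame 301"],
}

def replay(module_name):
    """Fetch one module's segment directly; no searching through the tape."""
    return session_recording.get(module_name, [])

# Replay in whatever order the debrief requires.
for name in ["SINGLE ENGINE LANDING", "ENGINE FIRE"]:
    print(name, len(replay(name)))
```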
STANDARDIZATION AND FLEXIBILITY
The concept of handling operation via high-level task modules also has some secondary benefits.
Generally, the task module also provides standardization, which allows student performance to be evaluated more consistently. A task module will always start at the same time under the same conditions. The same procedures must be followed and nothing will be overlooked.
At first glance it may appear that this approach may reduce general flexibility in simulator operation. This could be a problem if only a small number of task modules are maintained. However, to support a full syllabus such as that associated with F-14 replacement training, a large number (150 in the ISS) is required. This, plus the fact that the task modules can be linked in a variety of ways, assures adequate flexibility.
SOME CAUTIONS
In order for this concept to work effectively the task modules must be constructed very carefully. Less than a very thorough analysis of start/stop events can cause task modules to run or stop running unexpectedly. Haphazard definition of scoring criteria can have a very detrimental effect on user acceptance of the system.
The greatest problem faced by the developers of the ISS was an underestimation of the resources required to define the training tasks from which the task modules were derived, and of the time and effort required to create and check out the task modules. A means has been identified to ease the latter burden by automating some of the task module creation functions. This will enable members of the user community to create and modify task modules interactively at a computer terminal. This will not require programming skills nor a knowledge of the simulator itself, but it will require a thorough and very detailed understanding of the learning process, not a trivial task.
ABOUT THE AUTHORS
Steve Seidensticker is the technical Contract Manager for the ISS development project at Logicon. He has been involved in the project from its early stages in 1978. He holds a master's degree in Systems Management from the University of Southern California.
Catherine Heyn is a computer scientist for Logicon. She has worked extensively with the ISS project. Previously she worked with the Navy Personnel Research and Development Center designing and implementing adaptive testing and counseling programs. She holds a bachelor's degree in Computer Science and Psychology from San Diego State University.
SIMPLIFIED CONTROL OF THE SIMULATOR INSTRUCTIONAL SYSTEM
ROBERT A. GAMACHE
The Singer Company
Link Flight Simulation Division
INTRODUCTION.
Recent advances in information processing and display capabilities have permitted significant increases in both the number and the complexity of training features found in today's simulator instructional systems. These advances reflect a combination of more detailed user requirements as well as the rapidly increasing complexities of the systems and vehicles being simulated.

These overall advances in capabilities have produced a corresponding and steady increase in system operating and control requirements. Efforts to advance the state of the art have focused primarily on system capabilities and fidelity rather than instructional value, often short-changing the role and the needs of the simulator instructor. These and other factors have contributed to increasing task loading up to, and in some cases beyond, acceptable levels. This need not be the case; instructional system control and operation can be made simple and straightforward. Task loading can be dramatically reduced and instructional effectiveness increased through careful front-end activities combined with the systematic application of appropriate design principles. These principles concern the recognition of the goals of the training program and the unique capabilities and requirements of the simulator instructor in meeting these goals.
The simulator instructor, like the flight instructor, performs several key functions in the instructional process. Some of these functions can be automated while some can be facilitated through the provision and formatting of training-relevant information. In addition, information and control capabilities can be made available with minimum instructor intervention. The instructor's workload can be minimized and, equally significant, the instructor can be given ready access to the instructional capabilities he requires.
THE INSTRUCTIONAL SYSTEM.
The function of the simulator instructional system is to support the goals of the instructional program and to support the instructor as he performs the tasks required in facilitating instruction. The instructional system can be optimized through the application of two key design considerations: OPERABILITY and ACCEPTABILITY. Operability is achieved when an instructional system is as easy to use as is practical, while acceptability is achieved when the system makes optimum use of the instructor's unique talents and of the simulator's unique capabilities. While these are somewhat simplified and generic definitions of criteria which touch on virtually every aspect of system design, they do address the main concerns. For this reason they are the primary guidelines for the system design approach to be developed here.
Except for the aircraft preflight, the jobs of the simulator instructor and the flight instructor are closely analogous. They perform highly similar functions, including the arrangement of training conditions, briefing, demonstration, performance monitoring and diagnosis, modification of preplanned training exercises, coaching and guidance, performance evaluation and critique, and communications functions. In addition, of course, the flight instructor devotes a major part of his attention to acting as a safety pilot. The flight instructor performs these tasks by observing the performance of the student and the aircraft, comparing these performances with the prescribed performance standards and with his own estimate of the capabilities expected of the student at a given time. In addition, he facilitates learning by prescribing practice conditions within the limits imposed by the flight environment.

The simulator has the inherent capability of providing more task-relevant information than is available to the instructor in flight. In addition, it has the inherent capacity for providing and standardizing practice conditions as they are required for effective instruction, and it has the ability to provide information in many different forms and relationships.

AN EXAMPLE.

An example of a training exercise as it might occur in a typical simulator is used to illustrate the application of twelve basic design objectives used to achieve optimum instructional system operability and acceptability. These design objectives assume that the simulator and its instructional program have been developed in response to a set of well-defined training objectives. The definition of these objectives results in the identification of relevant practice conditions, performance measures, and performance criteria reflecting student progress in the training exercise. The design objectives are as follows:
1. Minimize the number of individual controls and functions on the instructor console

2. Automate non-instructional tasks to the greatest extent possible

3. Minimize the instructor control actions required to perform all operating and control tasks

4. Minimize requirements for changing displays during critical instructional periods; anticipate and program displays needed in each phase of the exercise

5. Provide maximum instructor flexibility in the control of the instructional process

6. Provide continuous trainee performance feedback

7. Correlate the simulator's graphic display and performance-data capabilities with trainee monitoring techniques used in the aircraft

8. Minimize requirements to enter or modify variable data through exercise pre-planning

9. Provide instructor-selectable performance data recording

10. Make maximum use of relevant advanced instructional techniques to enhance feedback, control the training setting, and simplify performance data interpretation

11. Minimize use of the instructor for exercise preparation

12. Design each feature to minimize requirements for handbook reference
Many of these guidelines appear to be contradictory. How, for example, does one design an instructional system to provide maximum instructor flexibility and control while restricting the number of controls, control actions, and modifications of variable data?

The answer lies in a thorough analysis of the training to be provided in the simulator. The training analysis must define the training objectives to be addressed, the optimum conditions necessary for effective training, the performance parameters to be monitored, and the criteria associated with acceptable performance and learning. In addition, the training analysis must anticipate the needs of both the student and the instructor in practicing and learning and in directing and instructing. The functional requirements in the training analysis are reflected in the approach illustrated in the following example as they indicate anticipations of training events and instructor involvement.

Figure 1. Training Management System Overview
The example used here assumes the existence of a system by which academic, simulator, and flight training are managed. Figure 1 is an overview of a typical training management system. Data used in scheduling training resources and aircraft and simulator maintenance, as well as data relating to syllabus preparation and student grading, are handled by the training management system. Data are entered through computer terminals in appropriate areas of the training facility.
The instructional design approach illustrated in this example is concerned with three major phases of a typical simulator training session. The first phase includes the activities of the instructor from the onset of his preparation tasks up to the point of commencing the training session. The second illustrates instructor conduct of an instructional task of moderate complexity, while the final phase covers post-exercise activities from the completion of the final training task to the completion of the debrief.
Each of these three phases begins with a discussion of instructor activities performed for a training flight in the aircraft, followed by the corresponding activities and supporting instructional system characteristics required for a simulator training session. A one-to-one correlation is used to simplify the presentation. The training tasks, task sequences, and grading methods and criteria are representative of today's instructional environment and its requirements. "Single Engine Landings" has been selected as the typical instructional task. An instructional environment relating to a mid-phase aircraft familiarization session is used to provide overall continuity.
TRAINING PREPARATION - AIRCRAFT.
Prior to commencing the briefing the instructor obtains a print-out of the flight grading form by way of the remote flight Training Management System (TMS) terminal located in the briefing area. The form provides an up-to-date list of the required training for this flight since it is
Figure 2. Representative Grading Form
an output of the TMS training selection process resulting from the request for data. The instructor then obtains the complete training record file, which includes all previously completed and annotated flight grading forms, and reviews this information to identify performance trends, weaknesses, and strengths. External factors such as weather, traffic density and restrictions, and practice landing field selection are introduced to complete the instructor's overall flight planning requirements.
The formal briefing is a fairly standard instructional event and requires no special consideration here. Typically, it includes discussions and review of the flight profile or plan, sequence of events, environmental factors, performance expectations, procedural knowledge, other trainee flight preparation requirements, and other related areas as applicable. On completion of the briefing, the instructor and trainee proceed to the flight line, review the assigned aircraft maintenance data by way of the remote aircraft maintenance TMS terminal, and then perform the aircraft preflight, concluding the preparations for flight.

TRAINING PREPARATION - SIMULATOR.

Instructor preparations for the corresponding (mid-phase familiarization) simulator session are identical to those required for the aircraft, excepting those areas that pertain solely to the actual flight (i.e., external environmental factors, air traffic instructions, aircraft preflight). The grading form is obtained by the instructor through the remote simulation TMS terminal located in the simulator briefing area. The format, identical for both flight and simulator applications, is shown in Figure 2.

As an output of the TMS, the form provides the instructor with supporting information that is not available through conventional means. Note, for example, that a performance indicator is identified for each of the listed training areas or tasks preceded by the letters P or R. This indicator represents a past performance average accumulated from all previous instruction. The letter P denotes tasks or performance areas primarily introduced and graded, while R identifies areas of difficulty where performance has been substandard and additional (beyond the norm) review and instruction is required. Tasks preceded by the letter I are to be introduced for the first time, while A identifies areas of advanced instruction to be introduced providing time and trainee performance permit.
Upon completion of the briefing, the instructor and trainee proceed to the simulator, pausing briefly at the remote simulator maintenance TMS terminal to review simulator and instructional system status. Upon arrival at the instructor console, the START function is selected on the instructional system input device, configured as shown in Figure 3.
Figure 3. Instructor Input Device
This action initializes all resources of the instructional system in preparation for the activation of the training selection process. To simplify the instructor's input procedures, all displays and repeaters are blanked and the following instruction is provided:

ENTER TRAINEE IDENTIFIER
The four digits selected on the numeric pad (Figure 3, extreme right) are repeated for verification on the CRT immediately beneath the displayed instruction. Assuming the selection is acceptable to the TMS (incorrect or unacceptable entries are ignored and specific correction instructions provided), ENTER causes the displayed data to be cleared and the following data and instruction presented:

1 FAMILIARIZATION
2 INSTRUMENTS
3 NAVIGATION
4 FORMATION
5 TACTICS
6 CARRIER QUALS
SELECT PHASE

The phase is identified through selection of the corresponding enclosed numeric, in this case 1, for the familiarization phase on the instructor's input device (Figure 3, lower left). Selection again causes all display data to be cleared and, through direct interface with TMS, initiates the training selection process. After a short pause (during which an IN-PROGRESS message is provided), the following instruction and corresponding control function are displayed:

1 INITIALIZE FOR TRAINING

Selection causes a complete training initialization to occur by positioning and configuring the simulated aircraft, selecting and depicting the appropriate visual scene content to both the instructor and trainee, and displaying the applicable monitoring and control data for the initial task at the instructor console. Upon completion of these processes, the simulator is placed in freeze and a READY message is displayed to the instructor.

In review, the simulator preparation procedures have included:

1. Selection of the START function to perform a complete instructional system reset.
2. ENTERing the trainee four-digit identifier.
3. Selecting the desired instructional phase.
4. Activating the training initialization process.

These procedures differ significantly from those used with systems that employ extensive and time-consuming selection and modification of instructional data in a special exercise preparation or planning mode. Many of these systems extend simulator turnaround times beyond acceptable limits and often require extensive instructor knowledge of complex and cumbersome data manipulation procedures.

INSTRUCTIONAL TASK - AIRCRAFT.

The flight has logically and sequentially progressed on through to the pattern entry, and the practice of previously introduced normal landings has been concluded. As an introduction to the next maneuver the instructor takes control of the aircraft and demonstrates a single-engine downwind and approach to a touch-and-go landing (simulated by retarding of the throttle to idle). Extensive verbal commentary is provided on cockpit procedures and aircraft handling techniques, including the shut-down procedures for a simulated engine fire on downwind. In addition, the instructor must initiate and respond to required voice communications with the tower and monitor the traffic pattern to insure safe aircraft separation at all times.

Upon completion of the demonstration the instructor returns the aircraft to its normal two-engine pattern configuration, and once re-established on downwind, control is returned to the trainee.

The instructor then introduces the maneuver by announcing a simulated engine fire and begins to monitor the trainee's procedural response and control of the aircraft, coaching and guiding as necessary. In this capacity the instructor's sequence of monitoring activities might typically include:

1. Procedural response to the simulated engine fire as related verbally by the trainee; the trainee reduces the throttle to idle at the appropriate time to simulate the engine shutdown

2. Response of the trainee to the aircraft flight requirements (e.g., the addition of power on the remaining engine to maintain altitude and airspeed, maintaining balanced flight, smooth basic airwork, etc.)

3. Establishing the proper downwind path for the existing wind condition and arriving at an optimum single-engine abeam distance

4. Simulated transmission to the tower requesting emergency clearance

5. Completion of landing checks and transmission to the tower requesting landing clearance; insures the gear check is verified and landing clearance is received and acknowledged

6. Basic aircraft control during the approach, focusing on angle of bank, rate of descent, angle of attack, and power setting; verifies parameters at key checkpoints

7. Final approach for proper distance at roll-out, smooth basic airwork, proper alignment, minor corrective adjustments

8. Touchdown in the first third of the runway on center line at the optimum rate of descent and angle of attack

9. Acceleration on the runway for aircraft control on center line, and smooth power application

10. Optimum pitch angle at lift-off; stable lateral control, smooth acceleration to single-engine climb speed, and a positive rate of climb throughout to pattern altitude

11. Proper angle of bank during the downwind turn to establish an optimum distance from the runway

In addition, and throughout the maneuver, instruction is provided, the traffic pattern is monitored for safe separation, all transmissions on the selected frequency are monitored, and trainee performance is noted for post-flight analysis, debriefing, and grading. The maneuver is repeated as required to achieve the desired level of skills development for this flight.
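The four-step simulator preparation sequence (START, trainee identifier entry, phase selection, training initialization) can be sketched as a simple guarded procedure. The prompts, the phase list, and the four-digit identifier rule come from the text above; the function and variable names are assumptions made for illustration.

```python
# Illustrative sketch of the preparation sequence; not the actual console code.

PHASES = ["FAMILIARIZATION", "INSTRUMENTS", "NAVIGATION",
          "FORMATION", "TACTICS", "CARRIER QUALS"]

def prepare_session(trainee_id, phase_number):
    """Run the preparation steps; reject bad entries as the console does."""
    steps = ["START"]                        # 1. full instructional-system reset
    if not (trainee_id.isdigit() and len(trainee_id) == 4):
        return steps + ["ENTER TRAINEE IDENTIFIER"]   # entry ignored, re-prompt
    steps.append("TRAINEE " + trainee_id)    # 2. identifier accepted
    if not 1 <= phase_number <= len(PHASES):
        return steps + ["SELECT PHASE"]      # re-prompt on bad phase selection
    steps.append(PHASES[phase_number - 1])   # 3. phase selected
    steps.append("INITIALIZE FOR TRAINING")  # 4. position/configure aircraft
    steps.append("READY")                    # simulator frozen, READY displayed
    return steps

print(prepare_session("1234", 1)[-1])  # prints READY
```

The point of the sketch is the small, fixed number of guarded steps: a complete reset and initialization without any free-form data manipulation by the instructor.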
INSTRUCTIONAL TASK-SIMULATOR.
The corresponding simulator training session has also logically and sequentially progressed on through the pattern entry, and the practice of previously introduced normal landings has been concluded. Upon instructor request, the instructional system is advanced to the next task area (SINGLE-ENGINE LANDINGS) as the trainee completes the crosswind turn to the downwind leg. Through this simple request (discussed later) the required instructional system features (visual repeater, CRT displays, and supporting programs) are selected and configured to monitor and control this specific training event as it evolves (in addition to a view of the visual scene of interest, the instructor is provided with two CRT displays configured as shown in Figure 4).
The primary display (on the left) provides a graphic depiction and tabular performance listing in the main areas, with supporting data and controls to the right. The secondary CRT display (on the right) provides the instructor with a view of the applicable cockpit instruments and indicators with supporting problem and task controls depicted to the right. The arrangement of the instruments and indicators corresponds to the rear aircraft cockpit configuration to support the normal in-flight scan patterns of the instructor.
Figure 4 also identifies the applicable control functions from the instructor input device (Figure 3) used to interact with each display type. Since the functional arrangements are dedicated, the requirement to assign controls to a CRT, and the accompanying input errors that often result from such an arrangement, have been eliminated.
Continuing with the task, the instructor takes control of the simulated aircraft by activating freeze (Figure 3, upper right) and alerts the trainee to an impending initialization. As an introduction to the maneuver the instructor
Figure 4. Task Display Configuration
selects function 6 by way of the input device (Figure 4) to access the prerecorded demonstration. When selected, the supporting data area of the primary display is updated as shown in Figure 5. Function 1 is selected, causing the simulated aircraft to INITIALIZE for the start of the demonstration. Upon completion, a READY indication is provided (lower area) with the system awaiting the release of freeze to initiate the playback. Function 2 is selected to activate the accompanying prerecorded verbal commentary.
Upon freeze release, the instructional system takes control of the simulated aircraft and demonstrates a prerecorded single-engine downwind approach to a touch-and-go landing (also simulated by retarding the throttle to idle). The simulated single-engine configuration is again used to maintain continuity with the previously described aircraft training flight maneuvers. Extensive prerecorded verbal commentary is provided on cockpit procedures and aircraft handling techniques, including the shutdown procedures for a single-engine fire on downwind. Communications with the tower are included in the commentary.
As the simulated aircraft passes the 90-degree position in the approach, the instructor momentarily deactivates the verbal commentary to provide additional remarks that are applicable to this specific trainee based on past performance trends. At touchdown the commentary is reactivated.
Throughout the demonstrated maneuver, the CRT displays and visual repeater provide all normal feedback to the instructor, including the performance data (discussed later) depicted in the lower main area of the primary display (Figure 4).

DEMONSTRATION - SIMULATED SINGLE ENGINE LANDING
1 INITIALIZE
2 COMMENTARY ON
3 TERMINATE AT PRESENT POSITION IN FREEZE
4 RETURN TO PREVIOUS POSITION
5 RELEASE TO TRAINEE CONTROL
6 RESET DISPLAY
READY - RELEASE FRZ

Figure 5. Demonstration Control

Once the simulated aircraft returns to the downwind leg and while still under demonstration control, a normal flight configuration is restored and the primary display is cleared of all accumulated track and performance history. At this point, the instructor alerts the trainee and returns control of the simulated aircraft to him (Figure 5, Function 5) and then resets the supporting data area to its original data configuration (Function 6). Once assured that the trainee is in comfortable control of the situation, the instructor introduces the maneuver by announcing a simulated engine fire and begins to monitor the trainee's procedural response and control of the simulated aircraft, coaching and guiding as necessary. In this capacity, the instructor's sequence of monitoring activities and supporting system control inputs might typically include:
1. Procedural response to the simulated engine fire as related verbally by the trainee through the simulator communications system. The instructor verifies that the correct throttle is reduced to idle at the appropriate time by way of the ACTION MONITOR feature (primary display) or the THROTTLE position status (secondary display). No control or selection inputs are required.
2. Response of the trainee to the immediate aircraft flight requirements (e.g., the addition of power on the remaining engine to maintain altitude and airspeed, maintaining balanced flight, smooth basic airwork, etc.) by scanning the cockpit instruments and indicators by way of the secondary display and the forward-view scene of interest in the visual repeater. No control or selection inputs are required.
3. Establishing the proper downwind path for the existing wind conditions and arriving at an optimum single-engine abeam position by observing the aircraft symbol and track history depicted relative to the optimum downwind corridor (primary display, upper area) and abeam position (key checkpoint B), and by scanning the abeam view area of interest on the visual repeater. No control or selection inputs are required.
4. Simulated trainee transmissions to the tower requesting emergency clearance with the simulated communications system. Frequency is verified on the ACTION MONITOR feature (primary display) and the instructor responds to the transmission or selects a pre-recorded message. Available messages are accessed on the PROBLEM CONTROL INDEX (secondary display, Function 11). The desired message (Figure 6) is activated upon selection of the appropriate function, in this case 12.

MESSAGES
DOWNWIND CLEARANCE
ACKNOWLEDGE EMERGENCY
TOUCH AND GO CLEARANCE
FULL STOP CLEARANCE
REQUEST GEAR CHECK
ACKNOWLEDGE GO-AROUND
COMMAND GO-AROUND
CONTINUE APPROACH
20 RESET DISPLAY

Figure 6. Message Control
5. Completion of landing checks by observing the ACTION MONITOR feature and the cockpit instruments and indicators display. Transmission to the tower requesting landing clearance is monitored on the simulator communications system, and the instructor transmits or activates landing clearance (Figure 6) and monitors for acknowledgment and gear check verification from the trainee.
6. Basic aircraft control during the approach, focusing on angle of bank, rate of descent, angle of attack, and power setting as provided by the cockpit instruments and indicators (secondary display); verifies parameters at key checkpoints by observing the aircraft track history and performance data areas of the primary display, reinforced by the varying scene-of-interest view from the visual repeater. No control or selection inputs are required.
7. Final approach for proper distance and runway alignment at roll-out by observing the aircraft symbol relative to the ideal distance (checkpoint G) and the optimum path, reinforced by the visual scene of interest. Basic airwork and minor corrective techniques are observed in the cockpit instruments and indicators display and are also reinforced by the view of the visual scene of interest; no control or selection inputs are required.
8. Touchdown on the first third of the runway as indicated on the runway depiction (primary display), on center line as viewed from the visual scene of interest, and at the optimum attitude, rate of descent, etc., as depicted in the performance area of the primary display (checkpoint TD); no control or selection inputs are required.
9. Acceleration on the runway for center-line control as monitored in the visual scene of interest and smooth power application depicted in the THROTTLES status area of the secondary display; no control or selection inputs are required.
10. Optimum lift-off attitude (performance data area, checkpoint LO), smooth acceleration to single-engine climb speed, and a positive rate of climb to level off at pattern altitude as observed on the cockpit instruments and reinforced in the visual scene of interest. The instructor transmits the downwind clearance or activates the applicable pre-recorded message.
11. Adjustments to angle of bank during the downwind turn to arrive at an optimum distance abeam the runway as depicted by the aircraft track history on the primary display, supported by the cockpit instruments and the visual scene of interest for basic aircraft control and airwork. No control or selection inputs are required.
Upon completion the primary data display provides the instructor with a concise, graphic and alphanumeric performance history of the maneuver. When combined with occasional notes on basic airwork and general performance trends recorded as required by the instructor, a comprehensive base of information is made available for post-training analysis, debrief, and performance reassessment. A sample primary data display depicting a completed touch-and-go is shown in Figure 7. (Some of the depicted data is applicable to additional instructional capabilities to be addressed shortly.)
The HARDCOPY feature (Figure 4, Function 2) provides a printed copy of the entire display as it appears on the CRT and is commanded by the instructor at the conclusion of each maneuver. The print-out (which does not affect the operability of the system) occurs when selected, thus avoiding storage problems and the delays often encountered when processing large amounts of accumulated data at the conclusion of training. Once a print-out has been commanded, the track history and performance data may be cleared (Figure 4, Functions 3 and 4) at the instructor's option in preparation for the next maneuver.
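The print-on-demand and clear-for-next-maneuver behavior described above can be sketched as a small state holder. This is a hypothetical illustration only; the class and method names are assumptions, not identifiers from the actual system.

```python
# Hypothetical sketch of the HARDCOPY / clear-history behavior: the
# instructor commands a printable snapshot of the whole display on demand,
# then may optionally clear track history and performance data for the
# next maneuver. All names are illustrative.

class PrimaryDisplay:
    def __init__(self):
        self.track_history = []     # aircraft positions at key checkpoints
        self.performance_data = []  # parameter snapshots (ALT, IAS, AOA, ...)
        self.action_monitor = []    # detected trainee actions

    def hardcopy(self):
        """Function 2: return a snapshot of the entire display as shown.
        Printing when selected avoids accumulating data until end of training."""
        return {
            "track_history": list(self.track_history),
            "performance_data": list(self.performance_data),
            "action_monitor": list(self.action_monitor),  # included in every print-out
        }

    def clear_for_next_maneuver(self):
        """Functions 3 and 4: clear track history and performance data
        at the instructor's option, once a hardcopy has been commanded."""
        self.track_history.clear()
        self.performance_data.clear()

display = PrimaryDisplay()
display.track_history.append(("checkpoint B", 1200, 160))
snapshot = display.hardcopy()
display.clear_for_next_maneuver()
```

The snapshot keeps its own copies of the lists, so clearing the display for the next maneuver does not disturb an already-commanded print-out.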
[Figure 7 shows a completed single-engine-landing primary display: elapsed time, environmental settings (temperature, turbulence, wind, gross weight, CG), and checkpoint-by-checkpoint values for heading, altitude, IAS, AOA, rate of descent, and angle of bank.]
Figure 7. Sample Primary Display Data
Note that a number of additional simulation capabilities are provided to the instructor; these offer training assistance and task control not available during aircraft instruction. The supporting data area of the secondary display provides access to a variety of these maneuver-applicable control features (Figure 4, PROBLEM CONTROL INDEX). In addition to the prerecorded messages discussed previously, environmental controls and engine failures are also available for selection.
The index includes control of environmental conditions and parameters that are required in providing the flexibility for complete instruction. Changing the wind component, for example, requires the trainee to alter his approach path over the ground, while varying the temperature affects engine performance and resultant power requirements. Increasing the turbulence level makes smooth basic airwork more difficult, while limiting visibility affects judgments which depend on the pilot's perception of the flight area. Varying aircraft weight and center of gravity can totally change handling characteristics, requiring altogether different approach techniques. Since the simulator provides the only safe environment for the development of proficiency in single-engine landings, the index also provides access to selected failures to realistically create the required conditions.
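A control index of this kind amounts to a mapping from numeric functions to environmental parameters and failures, with each selection logged for display. The sketch below is hypothetical: the function numbers, parameter names, and logging scheme are assumptions for illustration, not the paper's actual assignments.

```python
# Hypothetical PROBLEM CONTROL INDEX sketch: numeric functions map to
# environmental parameters and selected failures; applying one updates
# simulator state and records the selection for the instructor's display.
# Function numbers and parameter names are illustrative assumptions.

def make_index():
    return {
        16: "wind_component_kts",  # alters the required ground track
        17: "temperature_c",       # affects engine performance / power required
        18: "turbulence_level",    # makes smooth basic airwork harder
        19: "visibility_nm",       # affects visually based judgments
        20: "gross_weight_lb",     # changes handling characteristics
        21: "engine_failure",      # required condition for single-engine work
    }

def select(sim_state, active_log, index, function, value):
    """Apply a numeric-function selection and log it (malfunctions are
    marked with an 'M' at the point of activation on the real display)."""
    name = index[function]
    sim_state[name] = value
    active_log.append((function, name, value))
    return name

state, log = {}, []
select(state, log, make_index(), 16, 15)      # 15-knot wind component
select(state, log, make_index(), 21, "left")  # simulated left engine failure
```

Keeping the log separate from the state mirrors the described behavior: the state drives the simulation, while the log supports the continuous status display and post-training maneuver reconstruction.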
All selection and control functions are accomplished in the supporting data area, allowing the cockpit instruments and indicators to be continuously displayed for performance and status monitoring. All control actions (e.g., selection of desired malfunction, wind component, aircraft weight, etc.) are accomplished through use of the enclosed numeric functions (reference Message Activation, Figure 6). The selected conditions are displayed in the lower graphic area of the primary display as shown in Figure 7. Active malfunction titles are depicted in the lower left and the letter M is displayed in the graphic maneuver area at the point of activation. This information provides the instructor with a continuous display of simulator status during training and also facilitates post-training maneuver reconstruction and debriefing.
The INITIALIZE function, also found in the supporting data area immediately above the PROBLEM CONTROL INDEX, resets the simulated aircraft to an optimum position from which the task may be commenced. As a visual aid to the instructor, the position is identified in the graphic area (Figure 7) by depicting the corresponding function. Control and system checks are performed upon selection and appropriate cuing messages provided to insure the simulated aircraft is compatible with the demanded configuration prior to release of control to the trainee.
The ACTION MONITOR system depicted in the supporting data area of the primary display (Figure 4) is a continuously active procedure monitoring display feature that lists all trainee actions that are detectable by the computational system. The CLEAR MONITOR function provides the instructor with a means of erasing all displayed actions in preparing to monitor a specific procedure or sequence. As a part of the primary display, ACTION MONITOR data is included in all HARDCOPY print-out selections.
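The monitor-and-clear behavior just described reduces to a running list with a reset. A minimal hypothetical sketch, with illustrative names and action strings not taken from the actual system:

```python
# Hypothetical ACTION MONITOR sketch: a continuously active list of trainee
# actions detected by the computational system, plus a CLEAR MONITOR
# function so the instructor can isolate one procedure or sequence.

class ActionMonitor:
    def __init__(self):
        self.actions = []

    def detect(self, action):
        """Called by the computational system for each detectable action."""
        self.actions.append(action)

    def clear(self):
        """CLEAR MONITOR: erase displayed actions before watching a
        specific procedure or sequence."""
        self.actions = []

monitor = ActionMonitor()
monitor.detect("GEAR DOWN")
monitor.detect("FLAPS 50%")
monitor.clear()                               # start of the procedure of interest
monitor.detect("LANDING CHECKS COMPLETE")
```

Because the monitor is part of the primary display, its current contents would ride along in every hardcopy snapshot.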
Interaction with the final instructional system feature assigned for use with this task and to be addressed here is also accomplished with the primary display. Selection of the depicted function (Figure 4) reconfigures the supporting data area with typical RESET and REPLAY control features as shown in Figure 8. As a visual aid to the instructor selecting the optimum time increment, positions corresponding to the available reset points are depicted in the graphic area of the primary display on the simulated aircraft track history (Figure 7, diamond symbols).
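Selecting a reset point amounts to picking the recorded checkpoint nearest the instructor's desired time increment. The checkpoint labels and elapsed times below are assumptions for illustration; only the selection logic is the point.

```python
# Hypothetical RESET point selection sketch: reset positions correspond to
# key checkpoints along the recorded track history (shown as diamond
# symbols on the real display). Checkpoint names/times are illustrative.

track_history = [
    ("B", 45.0),    # abeam position, elapsed seconds
    ("G", 90.0),    # final-approach gate
    ("TD", 120.0),  # touchdown
    ("LO", 135.0),  # lift-off
]

def nearest_reset_point(track, desired_elapsed):
    """Pick the recorded checkpoint closest to the instructor's desired
    time increment; the simulated aircraft is reset to that position."""
    return min(track, key=lambda cp: abs(cp[1] - desired_elapsed))

point = nearest_reset_point(track_history, 100.0)  # -> ("G", 90.0)
```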
These basic and straightforward control and monitoring features offer the instructor all of the flexibility required to successfully complete the prescribed instruction with minimum distraction and maximum commonality with aircraft training procedures.
Obviously, the simulator provides much greater capability for the control of training in many significant respects than the aircraft. By the same token, however, these capabilities must be implemented in the simulator and in the instructional system with operability and acceptability as primary guidelines. These are reflected in the example used here as unique simulator capabilities are organized to support the methods
[Figure 9 depicts a session grading form (FAM Session 4, 6/10/82; planned 1:30, actual 1:28; trainees PLUMBER, U.R., ENS and SCREAMER, I.M., LT) listing graded items -- headwork, procedures, basic airwork, start malfunctions, taxi malfunctions, aborted takeoff, normal takeoff, climbout/departure, break turn stall, engine fire, complete hydraulic failure, emergency descent, VFR recovery, pattern, normal landings, single-engine landings, hydraulics-off landings, and shutdown -- each marked in grade columns AA/A/BA/U.]
Figure 9. Representative Grading Form
simulator TMS terminal and integrated with the existing data in the trainee history file. The training-selection process is initiated and the grading form for the next simulator familiarization training session is printed out. Its contents and corresponding preparation requirements are briefly reviewed with the trainee (who retains the form for subsequent reference). The instructional event is concluded by placing the completed grading form in the trainee's folder.
DRAWING THE CONCLUSIONS.

Although brief and limited to somewhat confined areas, the information presented should provide a sufficient basis to allow the main issues identified at the onset to be re-addressed.

Have the two key design considerations of operability and acceptability been satisfied? Would the design approach promote ease of use? Is it one to which instructors could relate? There is a third and equally important consideration as well: Is it realistically achievable?

How well did the design approach conform to the twelve objectives and guidelines stated at the onset? Specifically:

1. Were the number of individual controls and functions minimized?
2. Were required instructor control actions minimized?
3. Did the design minimize requirements to enter or modify variable data?
4. Were non-instructional tasks automated to the greatest extent possible?
5. Were display switching requirements minimized?
6. Was maximum instructor flexibility and control of the problem provided?
7. Did the system provide continuous feedback of trainee performance?
8. Was performance data recording instructor selectable?
9. Did graphic and performance data correlate with aircraft monitoring techniques and procedures?
10. Were applicable advanced instructional features made available?
11. Did exercise preparation minimize the use of the simulator?
12. Would it be possible for new instructors to operate this system totally without referencing a handbook?

One remaining issue needs to be resolved before any valid conclusion can be drawn. How were these guidelines and objectives selected, and do they accurately identify the critical design areas to be addressed? Since they have all (and some, repeatedly) appeared in military procurement specifications in the recent past, it is fairly safe to assume that they do.

SOME FINAL THOUGHTS.

Effective and efficient simulator instructional systems can be designed and produced today, but a few significant trends must be altered. First is the specification that calls for minimum instructor task loading, but then demands maximum system capability combined with complete and continuously available instructor control of all system variables and training features. Minimum task loading is normally stated as a design goal and maximum control as a requirement. Since the two are essentially incompatible when specified in this manner, the requirement must obviously take precedence, often resulting in systems that are characterized by unnecessary and burdensome operations. This situation is further compounded by the erroneous tendency to measure capability and training effectiveness by the amount of instructor control provided. These and other similar trends serve only to mask system capabilities and discourage effective utilization.

The answer, therefore, will not necessarily be found in more systems and more technology, but more likely from a willingness on the part of both users and designers to make the true distinction between want and need when identifying the trainee, instructor and the system requirements. This is a matter of analytical procedure tempered by self-discipline -- a process that must be totally oriented toward the specific learning objectives of each system.

THE WRITER'S CONCLUSION.

This paper has explored and expanded on some very basic design concepts emphasizing practical application and common sense. It has offered no really new or innovative technology, but simply
proposed some logical and straightforward approaches in helping to resolve some rather old, frustrating, and costly problems. It is not the answer, but if your response to some or even just a few of the areas addressed was positive, then perhaps it is a step in the right direction and merits further consideration.
THE INSTRUCTIONAL SUBSYSTEM, A MAJOR COMPONENT OF THE TRAINING DEVICE
Dr. Thomas J. Hammell
Eclectech Associates, Incorporated
North Stonington Professional Center
North Stonington, Connecticut 06359
ABSTRACT:
The instructional subsystem has traditionally been a part of every simulator/training device. However, it has often been a misnomer, consisting of little more than an operator's station for the controlling of the sophisticated simulator. The necessary technology currently exists, both in the form of hardware/software capabilities and training/instructional techniques, for the development and application of potentially high cost-effective instructional tools to enhance the instructional process. Thus, a "bonafide" instructional subsystem should be a part of every training device. This paper addresses the methodology for developing the instructional subsystem, suggests several capabilities of that subsystem, and presents an applied example of an instructional subsystem in the form of SMARTTS.
INTRODUCTION
The General Accounting Office, in their report to Congress concerning "How to Improve the Effectiveness of U.S. Forces Through Improved Weapon System Design" (1981), focused on the importance of the operator to the overall effective functioning of the weapon system, and the current situation of insufficient early planning to provide an adequate operator interface. They estimated that human errors account for at least 50 percent of the failures of major weapon systems. They further subdivided these failures into operator skill level and proficiency limitations, amongst other factors. Their findings attribute these problems to the increasing complexity of modern day systems. Two important issues are evident from this recent investigation. The most obvious is the need for improved training, as one means of improving operator proficiency and reducing operator error. This points to the need for more effective training systems, of which the instructional subsystem is one contributing aspect, albeit a major aspect. Second, the complexity of training systems is increasing in parallel with that of the operational systems, requiring better human factors design of the training system itself (i.e., instructor and trainee interfaces). Both of these issues point to the need for effective human factors design of the training system from an instructional process standpoint.
This problem of developing a more effective training system is not really new, since the training community has always been concerned with effective instruction. The predominant emphasis in the design of training systems in recent years, however, has been placed on the engineering aspects, such as concern over adequate simulation fidelity and the technology to achieve that fidelity. Although cost-effective fidelity should be a major issue, proportionate emphasis should be given to the other training-related aspects of the training device and system. In an investigation of the training effectiveness differences attributable to differing levels of fidelity of major simulation characteristics (e.g., 120 degrees versus 240 degrees visual scene horizontal field of view, for a shiphandling/shipbridge simulator), instructor differences were found to have several times the impact on the effectiveness of training in comparison with any of the simulation characteristics investigated (Hammell, Gynther, Grasso, and Gaffney, 1981). The relative importance of the instructor was not unexpected (e.g., Caro, 197), although its strength was surprising in the presence of simulation characteristics selected for their potential impact on training effectiveness. The importance of this finding is that the instructor typically embodies most of those non-simulation characteristics of the training device and training system (e.g., exercise design, monitoring of student performance, student feedback). The instructor is that "catch-all" that magically transforms the simulator into a training device. This, of course, should not be the case. The training device should be more than just a simulator; it should have capabilities specifically designed to augment the training process.
The training device is rapidly becoming a most important element of the training system, and is often the centerpiece of that training system. This paper focuses on the simulator-based training system, wherein the simulator/training device is a major element of that system. Other elements include the training objectives, the training syllabus and other training media (e.g., at-sea training). The training device, as it has often been traditionally known, is nothing more than a simulator. It merely seeks to imitate aspects of the real world, such as providing a radar display showing information similar to that which would be seen on the actual ship. The simulator is, of course, limited in that which it can reproduce faithfully. Those aspects that are simulated are presumably those deemed necessary to the conduct of an effective training process, while many others are simply not addressed by the simulator.
The training device is more than a simulator (Hammell, 1981). Whereas the simulator simply has a simulation system, the training device has both simulation and training subsystems. The simulation subsystem is that noted above. The training subsystem, on the other hand, should consist of capabilities that are designed specifically to enhance the training process via aiding the instructor, providing information to the students, and so on. The instructional subsystem (i.e., training subsystem) has traditionally been a part of every simulator/training device. However, it has often been a misnomer, consisting of little more than an operator station for the controlling of the sophisticated simulator; it has seldom really provided capabilities to support the training process. The necessary technology currently exists, both in the form of hardware/software capabilities and training/instructional techniques, for the development and application of potentially high cost-effective instructional features to enhance the instructional process. Thus, a "bonafide" instructional subsystem should be a part of every training device.
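The simulator-versus-training-device distinction drawn above can be sketched as a simple composition: a training device couples a simulation subsystem with an instructional subsystem serving the instructor, the trainee(s), and training-system management. The class and field names below are illustrative assumptions, not terms defined by the paper.

```python
# Hypothetical sketch: a simulator has only a simulation subsystem, while a
# training device pairs it with an instructional (training) subsystem.
# All names and example entries are illustrative.

from dataclasses import dataclass, field

@dataclass
class SimulationSubsystem:
    """Imitates aspects of the real world (e.g., a radar display)."""
    fidelity_features: list = field(default_factory=list)

@dataclass
class InstructionalSubsystem:
    """Capabilities designed specifically to enhance the training process."""
    instructor_support: list = field(default_factory=list)  # exercise development, monitoring/control
    trainee_support: list = field(default_factory=list)     # feedback, training-process interface
    management_support: list = field(default_factory=list)  # inter/intra-site coordination

@dataclass
class TrainingDevice:
    simulation: SimulationSubsystem
    instruction: InstructionalSubsystem

device = TrainingDevice(
    SimulationSubsystem(["radar display"]),
    InstructionalSubsystem(instructor_support=["exercise design", "performance monitoring"]),
)
```

Modeling the two subsystems as separate components makes the paper's point concrete: stripping the instructional subsystem leaves a mere simulator, not a training device.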
The instructional subsystem should be tailored based on the many considerations surrounding the particular training application and the training device. Training assistance capabilities should be provided to support the instructor, the trainee(s), and training system management (see Figure 1). The instructor support capabilities should provide tools and information for (1) development of training exercises and support material, (2) monitoring and control of the training process, and (3) assisting in achievement of an effective training process interface with the trainee(s). The trainee support capabilities are often coincident with "(3)". Management support capabilities should deal with inter- and intra-site coordination, the continual improvement in training cost/effectiveness, and so on. Although management support capabilities are a part of the instructional subsystem since they directly impact the effectiveness of training, they will be only cursorily addressed herein.
The availability of sophisticated computer-display technology and its traditional incorporation in virtually all sophisticated simulator-based training systems enables a wide variety of training assistance technology to be integrated into the simulator-based training device. The issue, then, is twofold: (1) instructional technology should be incorporated as a major part of every training device, and (2) the extent to which the instructional technology should be incorporated depends upon the particular training objectives, the capabilities feasibly available on the training device, and the many other issues and constraints surrounding the particular training situation.
Third Generation Training System
The issues currently faced in the development and integration of instructional support capabilities represent the start of the third generation training device. The first generation training device, which is typified by early efforts at simulation (e.g., Link Trainer), had as the primary concern simulation fidelity. The problem at that time could be viewed as one of simulation fidelity at almost any cost. It was simply necessary to achieve a sufficient amount of fidelity in the early simulators so as to achieve what was considered as meaningful training. The primary requisite methodologies and research information at that time concerned engineering-related issues; that is, engineering design and simulation techniques. These issues, of course, still remain and will continue as relevant issues for the development of simulators.
As adequate simulation capabilities were achieved in several areas, the second generation of training devices emerged. This generation was no longer concerned with achieving adequate simulation at any cost, since the engineering capabilities were often available. Rather, the second generation training device had as its primary issue cost effectiveness -- the cost of design with regard to the effectiveness of resultant training. Rather than maximizing simulation fidelity at any cost, the major considerations focused on the minimal level of simulation fidelity necessary to achieve requisite training effectiveness (i.e., identification of the essential features of the simulator, and their respective levels of fidelity, required to meet specific training objectives and performance standards). The variety of methodologies developed to address this issue focused on a systematic approach to the design of the training system and/or the training device. Examples include the Training Situation Analysis (TSA) and the Instructional Systems Development approach (ISD). These methodologies typically identified the specific behavioral tasks (i.e., including skills, knowledge, and training objectives) of the trainee, and then sought to identify the minimum necessary simulator/training device characteristics for their achievement. As was the case with the first generation training device issues, the second generation issue (i.e., cost effectiveness) remains today, and will continue to remain pertinent.
The training industry is currently embarking on the third generation training device. As the second generation was a further refinement of the first, so the third generation training device and issues represent a further refinement of the second and first generations. Whereas the second generation's emphasis was on identifying the minimally acceptable level of simulation fidelity and thus minimizing the cost while maintaining effectiveness, the third generation will focus on providing training enhancement features to greatly improve the effectiveness of training at relatively small additional cost. This further evolution of the training device will augment the earlier generations in substantially improving the cost effectiveness of training, by specifically designing in instructional support capabilities.
Recognition of the importance of the instructional support capabilities has been a fundamental problem, but one that is being steadily overcome in both the training and operational communities. The major technological problem is twofold: (1) the development of appropriate methodologies and resulting research information pertaining to the design of the instructional subsystem, and (2) utilization of these methodologies and information to actually design the instructional subsystem. The third generation training device does not, as yet, have appropriate methodologies adequately developed, much less an empirical information base from which to draw, to guide the design of the instructional support subsystem. This is a substantial problem currently facing the training device industry.
Methodologies for design of the instructional subsystem have not been formalized. Hence, the instructional subsystem has traditionally not been included as a part of the training device. Instructional subsystems, of course, have been developed and implemented on certain devices, although to an extremely limited extent. The instructional subsystem should consist of (1) instructor support capabilities, (2) trainee interface capabilities, and (3) training system management capabilities. The remainder of this paper will identify aspects of these capabilities, and issues to be considered in their development. Since one of the most effective means of instructing is that of presenting examples to the students, this paper presents an example of an instructional subsystem known as SMARTTS (Submarine Advanced Reactive Tactical Training System), which has been recently implemented on a Submarine Combat System Trainer (SCST).
INSTRUCTIONAL SUBSYSTEM CHARACTERISTICS AND ISSUES
The instructional subsystem, as noted above, should address the instructor, the trainee, and system management (Figure 1). Each of these is considered an important element of the instructional subsystem (Hammell and Crosby, 1980; Hammell, 1981). This paper focuses particularly on capabilities to support the instructor and trainee.
The Submarine Advanced Reactive Tactical Training System (SMARTTS) (Hammell and Crosby, 1980) is a pioneering effort in the development of instructional support capabilities, in that it is currently integrating a sophisticated instructional support subsystem with an existing submarine combat systems trainer. This project developed a substantial body of information regarding the design and application of the instructional subsystem, and may be used as a departure point for directly addressing many relevant issues. The SMARTTS project has, for example, (1) developed and applied some of the potentially useful techniques for the generation of information to design instructional support characteristics, (2) designed a variety of instructional support characteristics which are currently being implemented, and (3) is planning to embark on a test and evaluation of the instructional support subsystem and its particular set of characteristics. The problems identified, the experiences, and the techniques developed during the SMARTTS project provide considerable insight into many specific issues relevant to the instructional subsystem.
The instructional subsystem is addressed below in terms of (1) design methodology, which discusses the overall approach to the design of the instructional subsystem; and (2) training assistance technology, the specific capabilities of the instructional support subsystem. Several issues are addressed under each of these, with examples drawn from SMARTTS as appropriate. Also, concepts for future development are introduced.
Design Methodology
Highly structured methodologies are available for the selection and design of training media (e.g., Training Situation Analysis (TSA), Chenzoff, 1965; Instructional System Development (ISD), NAVEDTRA 106A). These methods, which have been demonstrated as effective, are primarily intended to address the student/trainee interface with the training media. For example, a major use of the ISD approach is to determine simulator/training device fidelity characteristics (e.g., level of visual scene fidelity). These systematic approaches relate the training device/training system characteristics to specific trainee operational tasks. This of course is necessary. However, such an approach does not directly address those training device/training system capabilities that can directly augment the training process but are not directly linked to the operational tasks (e.g., external feedback). These other training-related characteristics do not derive from the operational tasks, but rather derive from the base of generic training research and methodologies. There is obviously a need to design the training system with regard to both operational tasks and generic training methodologies. In practice, both are addressed, although the latter is not done so formally.
INSTRUCTOR INTERFACE ANALYSIS. The available methods have not adequately addressed the instructor interface (i.e., including various forms of instructor aids) to the training device/training system. Relatively little emphasis has typically been placed on the design characteristics of the training device/training system to directly assist the instructor in conducting the training process. Needed is a structured approach for the design of the instructional subsystem. The analysis should be quite similar to that of ISD, although focusing not on operational tasks, but rather on instructor tasks and training methodological principles. This type of analysis has been employed in the past, although to a very limited extent (e.g., the SMARTTS example presented later; also, Charles, 1978). An ISD process for the instructional subsystem might include the following elements:
Instructor task analysis -- a detailed analysis of the functions and tasks to be performed by the instructor in all aspects associated with his training role. This analysis should form the basis for the design of training system/training device characteristics to support the instructor in conducting an effective training process.
Analysis of instructor loading -- an estimate should be made of the load placed on the instructor in the performance of his functions and tasks. This load should address, for each task, the difficulty, time to perform, frequency, available resources, accuracy of performance required, and so on. The intention is to develop an accurate profile of the loading placed on the instructor across his functions and tasks, and his expected performance level on the basis of the training system design.
Identification of critical instructor functions and tasks -- based on the above two elements, those instructor functions and tasks that would benefit substantially from other assistance should be identified. For example, data recording is a task that a good instructor would perform frequently, so as to have good postexercise feedback information. Data recording can be extremely cumbersome and time consuming; furthermore, it is a task that can often be readily performed by the computer/training device, thus off-loading the instructor to devote more of his time to those tasks that only he can adequately perform (e.g., monitoring student communication).
Determine instructor support capabilities -- specific instructor support capabilities should be determined via a cost/effectiveness trade-off analysis based on the above elements, alternative design approaches for providing support capabilities, and their expected effectiveness in assisting the instructor and enhancing the effectiveness of the training process. A media selection approach to support the instructor functions and tasks should be performed, similar to that accomplished under ISD for the trainee.
This type of front-end analysis is devoted to the instructor, and should result in the generation of cost-effective training device/training system characteristics. Although aimed at specifically supporting the instructor, the usefulness of these characteristics is based on enhancing the training process with regard to cost-effectiveness considerations. Many of these characteristics will also directly impact the trainee interface as well, since many of the instructor tasks deal with the trainee interface.
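The loading analysis and critical-task identification steps above can be sketched as a simple scoring exercise. In this illustrative sketch the task names, the 0-to-1 rating scales, the weights, and the threshold are all invented for the example; they are not values from the SMARTTS analysis.

```python
# Hypothetical instructor-loading profile. Task names, the 0.0-1.0
# rating scales, weights, and the threshold are illustrative
# assumptions, not data from the SMARTTS front-end analysis.

def load_score(task, w_difficulty=0.4, w_time=0.4, w_frequency=0.2):
    """Combine per-task ratings into a single loading estimate."""
    return (w_difficulty * task["difficulty"]
            + w_time * task["time"]
            + w_frequency * task["frequency"])

def critical_tasks(tasks, threshold=0.6):
    """Tasks whose loading exceeds the threshold are candidates for
    off-loading to the computer/training device."""
    return sorted(t["name"] for t in tasks if load_score(t) > threshold)

tasks = [
    {"name": "data recording",   "difficulty": 0.5, "time": 0.9, "frequency": 0.9},
    {"name": "monitor scenario", "difficulty": 0.6, "time": 0.7, "frequency": 0.8},
    {"name": "pre-briefing",     "difficulty": 0.7, "time": 0.4, "frequency": 0.2},
]
```

In this toy profile, data recording and scenario monitoring score above the threshold, which mirrors the paper's own example of data recording as a task worth off-loading to the computer.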
TRAINING METHODOLOGY AIDS ANALYSIS. A similar analysis should be conducted to determine support capabilities with regard to generic training methodologies (e.g., importance of specific external feedback). This type of analysis, which would impact both instructor and trainee interfaces, would have a similar series of elements, as follows:
Training methodology characteristics -- identification of those specific training process characteristics that are relevant to the particular training situations supported by the training device/training system (e.g., immediate graphical feedback of tactical parameters). This would be similar to a task analysis, but conducted to identify the specific training methodology characteristics. The intention is to identify all those characteristics of the generic training methodologies that may be important for this particular training process.
Prioritization of training methodology characteristics -- each of the training methodology characteristics should be evaluated with regard to its potential effectiveness during the training process, for the particular training device/training system under consideration. The result would be a prioritization of those characteristics on the basis of their likely training effectiveness impact.
System/device capability analysis -- the capability of the training device/training system to support each of the training methodology characteristics should be determined. This should be done with regard to the specific aspects of the training situation and the methodology characteristics desired to achieve an effective training process. For example, if a complex tactical problem is being trained, delayed feedback regarding detailed tactical and performance parameters may be desired to enable detailed analytical investigation of parameter interrelationships; the ability of the training system/training device to provide the detailed feedback information in an appropriate form (e.g., on a large-screen graphical display) should be evaluated.
Identify instructional support capabilities -- based on the above elements and a cost/effectiveness trade-off analysis, specific support capabilities would be identified. The results could identify deficient capabilities on the existing training device/training system, or could identify/select capabilities to be developed in a new training device.
The above elements represent an outline of steps, similar to that of the ISD process, for the design of the instructional subsystem characteristics. Obviously, all of the steps have not been detailed, nor have their specific procedures been adequately developed. Rather, they are intended to indicate that a structured process similar to that of ISD should be accomplished for each of the major aspects of the instructional support subsystem. Furthermore, the instructional support subsystem includes several major parts of the training device/training system, each of which requires a distinct analysis. It is important to note that the above analyses should be adequately developed, but done so quickly with a minimum of frills. The analysis should be tailored to the available information, the sophistication of the training device/training system, and the available resources. Many of the characteristics could be evaluated from the standpoint of a shopping list of potentially useful characteristics, with the final set selected from the list.
This approach is appropriate for the design of a new training device/training system, as well as for improving an existing one. SMARTTS is an example of the latter case. The methodologies are likely to be slightly different for each case, but basically the same.
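The "shopping list" evaluation described above could be as simple as a ranked cost/effectiveness screen. In this sketch the candidate characteristics, their scores, and the budget figure are all invented for illustration; the greedy selection rule is one plausible screening method, not the trade-off procedure actually used for SMARTTS.

```python
# Illustrative cost/effectiveness screen over a "shopping list" of
# candidate instructional support characteristics. All names, scores,
# and the budget figure are hypothetical.

def select(candidates, budget):
    """Greedily pick characteristics by effectiveness-per-cost
    until the budget is exhausted."""
    chosen, remaining = [], budget
    for c in sorted(candidates, key=lambda c: c["effect"] / c["cost"],
                    reverse=True):
        if c["cost"] <= remaining:
            chosen.append(c["name"])
            remaining -= c["cost"]
    return chosen

shopping_list = [
    {"name": "automated data recording",   "effect": 8, "cost": 2},
    {"name": "graphical feedback display", "effect": 9, "cost": 5},
    {"name": "fast-time replay",           "effect": 6, "cost": 4},
]
```

With a budget of 7 units, the screen keeps the two candidates with the best effectiveness-per-cost ratio and drops the third.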
SMARTTS DESIGN. The SMARTTS instructional support subsystem design was the result of several analyses similar to the above, conducted over a period of years. The initial concepts of SMARTTS stem from an investigation of submarine tactics training (Hammell, Sroka, and Allen, 1971; Hammell, Gasteyer, and Pesch, 1973). Of the variety of recommendations that were developed regarding basic and advanced submarine officer training systems, many addressed the need for appropriate instructional support capabilities. Figure 2, taken from the latter report, lays out the fundamental instructional support capabilities that were later to be developed under SMARTTS. The concepts developed in these earlier investigations were later fully developed in specific detail on the basis of a front-end analysis conducted under the SMARTTS project. This later analysis included several of the above recommended steps, focusing on both the instructor tasks and generic training methodologies. This analysis also selected which of the capabilities recommended in the earlier studies were to be developed in the initial preprototype SMARTTS, and which will be developed in subsequent developmental versions of SMARTTS.

An overview of the instructor function and task analysis conducted for SMARTTS is presented below.
Instructor Function and Task Analysis. An overview of the analysis of instructor functions and tasks is presented in Figures 3, 4, 5, and 6. This analysis identified the major functions conducted by the submarine tactics instructor, the specific tasks he performs with regard to each function, and estimated the time loading and difficulty associated with performance of those tasks. The major instructor functions were determined as follows:

o exercise development
o monitor and control of the training process
o briefing (pre, freeze, post)
o training system management
Monitor and control of the training process is the instructor function given the greatest recognition by training device builders (Figure 4). When a training device is employed, the instructor must set up the exercise, control the device during the exercise (e.g., maneuvering targets), monitor the trainees' activities, and provide some amount of training during the exercise (e.g., guidance and feedback to the trainees). Capabilities, in one form or another, are provided on most training devices to enable set-up and
control of the exercise. Some capabilities are often provided to permit monitoring of trainee activities, and often some recording. Occasionally, capabilities are available for providing some training assistance, such as feedback. However, all too often the monitoring and control capabilities installed on a training device are designed from an engineering-control standpoint, rather than from an instructional process control viewpoint. Specific instructor tasks are included in Figure 4, along with generic training issues and considerations under each task. The multiplicity of branching paths between tasks is also indicated to some extent by the (A) and (E) symbols. The information contained in Figure 4 is a summary of the type of analysis information upon which the SMARTTS characteristics were based. For example, the instructor has a task to monitor the scenario (i.e., Task C). A consideration under that task, particularly in a complex system, is providing cues to the instructor regarding current and upcoming scenario events. Capabilities were developed in SMARTTS to provide cues and alerts to the instructor regarding various tactical actions on the part of the target, as well as when various performance indicators (e.g., probability of ownship counterdetection) went beyond preset standards. These capabilities, which are easily provided by the computer-controlled training device, reduce the instructor's scenario monitoring load, enabling him to devote a greater proportion of time to monitoring the trainees (i.e., Task H). Other examples are given later under Training Assistance Technology.
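The cue-and-alert capability described above reduces, at its core, to checking each performance indicator against a preset standard and flagging violations for the instructor. A minimal sketch follows; the indicator names, units, and threshold values are invented for illustration, not SMARTTS values.

```python
# Sketch of instructor cueing: raise an alert when a performance
# indicator goes beyond its preset standard. Indicator names and
# thresholds here are illustrative assumptions.

STANDARDS = {
    "p_counterdetection": 0.25,   # alert if probability exceeds this
    "target_range_error": 500.0,  # alert if error (yards) exceeds this
}

def check_indicators(sample):
    """Return cue messages for every indicator beyond its standard."""
    cues = []
    for name, limit in STANDARDS.items():
        value = sample.get(name)
        if value is not None and value > limit:
            cues.append(f"ALERT: {name} = {value} exceeds {limit}")
    return cues
```

In practice such a check would run each simulation frame, so that the computer, rather than the instructor, watches the scenario parameters continuously.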
Briefing of the trainees (Figure 5), in its various forms, is probably the most important function performed by the instructor. In this function the instructor has a direct interface with the trainees, and provides them with specific information to reinforce and/or modify their behavior. In a simulator-based training system, briefings may be given prior to, during a pause in, and following completion of the session on the simulator (i.e., pre, freeze, and post briefings, respectively). Although essential to the effectiveness of the training process, the emphasis placed on briefings varies considerably across training establishments. Furthermore, the capabilities provided as part of the training device/training system to assist in conducting effective briefings are often quite limited. Examples of capabilities to support briefing tasks will be provided later under Training Assistance Technology.
Figure 1. Focus of Instructional Subsystem

Figure 2. Instructional Support Capability Requirements for the 21A Series Submarine Combat System Trainers (from Hammell, Gasteyer and Pesch, 1973)

Figure 3. Exercise Development

Figure 4. Monitor and Control Training

Figure 5. Briefing (Pre, Freeze, Post)

Figure 6. Training System Management
Exercise development (Figure 3), which precedes the conduct of training sessions, consumes a substantial portion of an instructor's time. This is a complex function wherein the instructor actually develops the training program and its supporting materials. It is complex in that it often requires the creative design of the course and its training exercises, with the instructor drawing upon the state of the art in training methodology to achieve an effective training process. Whereas some capabilities, although usually quite limited, are often provided on the training device/training system to support exercise monitoring, control, and briefing, usually little if any capability is provided to support exercise development. It is the exercise development function, interestingly, that determines how the training device will be employed within the training system. Examples of system capabilities to support exercise development are presented later under Training Assistance Technology.
Training system management, similar to exercise development, is typically an overlooked function of the instructor, and one on which he spends substantial time. A wide variety of tasks, issues, and considerations may be associated with training system management. Furthermore, each of these may depend on the particular training establishment under which the training device/training system operates. Nevertheless, several training system management tasks are common across training establishments, as indicated in Figure 6. Perhaps the most important common set of tasks involves monitoring the effectiveness of training over time. For example, it is highly desirable to identify those training objectives (i.e., with associated exercises) that the typical trainees can readily perform upon entering the training program, and likewise to identify those training objectives (and exercises) with which the trainees typically have substantial difficulty, even near the end of the training program. Ideally, reduced emphasis would be placed on the former while increased emphasis would be placed on the latter training objectives in subsequent training programs. As another example, alternative training methods and training materials should be periodically evaluated to continually upgrade their quality and the effectiveness of the training process. These are major management tasks directly impacting the cost/effectiveness of training, and subsequently the operational readiness and effectiveness of weapon systems.
Each time a training exercise is run on the simulator, valuable training performance data are generated. These data could be collected over time and used as the basis for the above evaluations, and for other training system development activities. These data would be extremely useful to most levels of management for evaluation and planning purposes. Relatively little of these data are recorded and used today with regard to most training device/training systems. This is one example of a set of training device/training system capabilities that could substantially augment training system management.
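The longitudinal record described here might look like the following sketch, which tallies pass rates per training objective across simulator exercise runs so that chronically easy or difficult objectives stand out. The objective identifiers and run records are made up for illustration.

```python
from collections import defaultdict

# Sketch of training-system-management data collection: per-objective
# pass rates accumulated across simulator exercise runs. Objective
# IDs and results below are hypothetical.

def pass_rates(runs):
    """runs: iterable of (objective_id, passed) records.
    Returns the fraction of attempts passed, per objective."""
    totals = defaultdict(lambda: [0, 0])   # objective -> [passed, attempted]
    for obj, passed in runs:
        totals[obj][1] += 1
        if passed:
            totals[obj][0] += 1
    return {obj: p / n for obj, (p, n) in totals.items()}

runs = [("OBJ-1", True), ("OBJ-1", True), ("OBJ-2", False), ("OBJ-2", True)]
```

Objectives with rates near 1.0 on program entry are candidates for reduced emphasis; those still low near program end are candidates for increased emphasis, exactly the evaluation the paragraph above calls for.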
The instructor function and task analysis conducted on SMARTTS was integrated with the generic training methodology analysis, as indicated in Figures 3 through 6. This analysis resulted in the identification of issues and constraints deemed important to the conduct of an effective training process for submarine officer tactics training. These form the basis for the subsequent design of the SMARTTS preprototype system (i.e., the instructional support subsystem capabilities which make up SMARTTS). A similar process should be conducted as part of the design for each training device/training system.
The next step of the design process is the trade-off analysis to identify and design the specific capabilities to be included in the particular instructional support subsystem for the particular training device/training system. This trade-off analysis should be conducted similarly to the other cost/effectiveness analyses done under the ISD process. The results of the analysis conducted for SMARTTS, although not the analysis itself, will be summarized later in this paper under Training Assistance Technology.
Other Design Issues. The above recommended analysis has as its objective the development of the characteristics of the instructional subsystem from the standpoints of the instructor functions and the application/enhancement of effective training methodologies. Several other issues should also be considered during the design of this subsystem. Of primary importance is the flexibility of the instructional subsystem with regard to future upgrading. That is, the subsystem should be designed with evolution in mind. As the training device/training system is used, and as information is generated regarding the effectiveness of its various characteristics, the instructional subsystem should be continually improved. More effective training methodologies should be incorporated, new performance indicators and feedback displays developed, and so on. As the instructors continually increase their experience in training and with the device/system, as the operational systems continually change, and as the trainees continually improve their performance, the training device/training system should likewise change, to refocus training emphasis and to continually improve the effectiveness of training. The flexibility to permit evolution in this regard must be designed in at the onset, with a conscious effort made to continually upgrade training. The engineering characteristics of the system should be such that the major training-related characteristics can be readily modified, and new characteristics added. This impacts the design of the hardware components and the software architecture. For example, new performance indicators and feedback displays will always be required. Hence, the software architecture should be designed to permit easily adding new performance indicator algorithms, which may require access to a wide variety of scenario parameters; generic display formats should be set up which enable the construction of new displays drawing upon the variety of parameters generated and recorded, and a variety of graphical formats. Although details of system design flexibility are not discussed in this paper, they have been incorporated into the SMARTTS system. It is anticipated that as SMARTTS is used, demand for new capabilities will continually exist. The SMARTTS software architecture is designed to readily accommodate such modifications and additions as relatively minor software changes. Some of the capabilities identified under the SMARTTS front-end analysis that would further enhance this evolution/modification process have not been included in the preprototype, but are obviously considered for the further developmental units.
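One common way to realize the kind of software flexibility described above is a registry into which new performance-indicator algorithms can be dropped without touching existing code. The following is a sketch of that pattern only; the registry design and the example indicator are assumptions, not the actual SMARTTS architecture.

```python
# Sketch of a pluggable performance-indicator architecture: new
# algorithms register by name and draw on whatever scenario
# parameters they need. The indicator shown is hypothetical.

INDICATORS = {}

def indicator(name):
    """Decorator registering a new performance-indicator algorithm."""
    def register(fn):
        INDICATORS[name] = fn
        return fn
    return register

@indicator("closest_point_of_approach")
def cpa(scenario):
    # Minimum recorded target range over the scenario history.
    return min(scenario["ranges"])

def evaluate(scenario):
    """Run every registered indicator against the scenario state."""
    return {name: fn(scenario) for name, fn in INDICATORS.items()}
```

Adding a new indicator then amounts to writing one function with the decorator, the "relatively minor software change" the paragraph above anticipates.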
It is apparent, as indicated above, that the training process conducted using a particular training device/training system will change over time. Furthermore, in a similar vein, at any given point in time the complex simulator-based training system is likely to be used for a wide range of training levels and training needs. Hence, system flexibility should also address the ready tailoring of the instructional support capabilities to a particular training situation. That is, different types of support capability characteristics are required for the different levels of training likely to be encountered. Many of these can be determined in advance, while others must be tailored close to the time of initiating the training exercise. The training device/training system, and its instructional support subsystem, should be flexibly designed to permit on-the-spot tailoring of the training exercise, performance indicators used, feedback displays and their formats, and so on. For example, it may be necessary to generate or modify the next training exercise specifically on the basis of performance in the previous training exercise. Also, the performance standard on which a cue is presented to the instructor may likewise have to be modified shortly prior to running the exercise. The system architecture should be flexible enough to enable this type of tailoring with relatively little effort.
As noted in the introductory paragraph to this paper, human factoring of the instructor and trainee interfaces is as important in the training device/training system as it is in the operational weapons system. Too often this aspect of design is overlooked. Many of the instructional support capabilities provided to the instructor could simply be included under the category of "good human factors design," since their main purpose is to facilitate the instructor/system interface and improve the effectiveness of his performance. This includes, for example, the specific human/computer/training device interface for exercise set-up, control, and monitoring. An effort should be made to minimize the amount of training necessary for the instructor to be able to operate the training device. On SMARTTS, for example, the traditional alphanumeric/function-key input device was replaced with a touch-sensitive plasma display device. The plasma input device has many advantages beyond those which have been incorporated into the preprototype SMARTTS; nevertheless, those plasma-related input features incorporated into SMARTTS should provide a substantial increase in the effectiveness with which the instructor can communicate with the training device. The instructor does not have to learn, for example, a large number of input codes for effecting control of the training device; he does not have to remember the options available to him, and look up the appropriate codes, or coordinate the options from the CRT to the alphanumeric input keyboard; he does not have to input a sequence of alphanumeric commands on the keyboard. Rather, the complete set of input commands is logically laid out in a tree structure, and only those commands available to him at any point in time are displayed on the plasma entry device. These available commands are displayed in English. To enter a command, he simply points with his finger to the appropriate command. This type of interface is user friendly, and to a large extent self-teaching through use of the device itself. Careful consideration should be given to the many other aspects
of the instructor and trainee interfaces with the training device (e.g., location of instructor information, location of trainee feedback displays).
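The tree-structured, context-limited command entry described for the plasma display might be sketched as follows. The menu contents and the two-level tree are invented for illustration; the actual SMARTTS command tree is not documented here.

```python
# Sketch of a tree-structured command menu in which only the commands
# valid at the current point are offered, in plain English. The menu
# contents are illustrative, not the SMARTTS command set.

MENU = {
    "Set Up Exercise": {"Select Scenario": {}, "Set Ownship Position": {}},
    "Run Exercise": {"Freeze": {}, "Resume": {}},
}

def available(path):
    """Commands offered at the current position in the tree; the
    instructor only ever sees (and touches) these."""
    node = MENU
    for step in path:
        node = node[step]
    return sorted(node)
```

At the root only the top-level commands appear; after touching "Run Exercise," only "Freeze" and "Resume" are displayed, which is what makes the interface self-teaching.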
Training Assistance Technology
The remainder of this paper identifies aspects of training assistance technology that are important elements of the instructional support subsystem. These are presented with regard to each of the four major instructor functions identified above. Specific examples of characteristics are given, pertaining to the SMARTTS system, where applicable.
EXERCISE DEVELOPMENT CAPABILITIES (Figure 3). Exercise development may include the development of the training objectives, the course syllabus, and so on. Although these capabilities can be supported by parts of the instructional subsystem, under this function and under the training system management function, they are not directly addressed here. Rather, those capabilities more directly related to the training device activities themselves (i.e., conduct of scenario exercises) are specifically addressed herein, and summarized in Figure 3.
Instructional Support Material. A variety of material should be available to the instructor as his primary resource for developing the training device exercise scenario. Most of this material would likely be in the form of handbooks, although some could be available in a computer data base. This information should include (1) a complete set of tactical and behavioral training objectives, cross-referenced to exercises and operational situations; (2) a complete set of tactical reference information, such as Naval Weapons Publication (NWP) series documents; (3) a set of guidelines regarding training methodologies, training device operations, and training exercise development; and (4) trainee input characteristics information, including previous trainee/team performance information, and training/tactical needs. The information related to trainee input characteristics could be available in a large information base assembled from previous training exercises of that particular individual or team, together with comparative data on the population of similar trainees, and so on. The necessary data would be collected under the training system management function of the instructor.
The SMARTTS system, to support this exercise development function, has developed documents to aid the instructor in the design of exercises. SMARTTS has developed not only the standard system operating handbook, but also an instructor's handbook which addresses how to design exercises, alternative training methods that can be employed, how to conduct a training exercise, and so on. Additionally, a comprehensive set of training objectives has been developed, along with a set of training exercises cross-referenced to these objectives. A training structure has been developed across multiple levels of trainee proficiency, with diagnostic information supplied to evaluate the trainee/team input characteristics; remedial training activities have also been included to help assure that all trainees meet the minimum entry-level standards. Furthermore, training materials for the initial tactical course, including exercises, have been developed utilizing the SMARTTS capabilities. This approach not only has provided the structure and guidelines with which instructors can develop subsequent training courses and exercises, but has also provided an example in the form of the initial such course.
Exercise Development Simulation. A flexible, high-speed simulation capability is necessary to enable the instructor to review existing exercises in the library, modify an exercise, or create a new exercise. This fast-time simulation capability is necessary in support of tasks "B" through "I" under the exercise development function (Figure 3). Review, modification, or development of an exercise can be an extremely cumbersome and time-consuming process. This is particularly the case where calculation of the developing time-line scenario interaction is necessary to assure appropriate scenario events in support of the exercise objectives. Furthermore, it is well recognized that scenarios often turn out substantially different when run on the device than as they appeared on the desk-top plot sheet of the developer. The fast-time simulation should provide the capability for the instructor to:
(1) call up an existing exercise and run
it in fast time to any desired
point, retrace steps, .and so on;
(2) investigate alternative actions fromany desired time to any other
t desired time;
(3) generate and display relevant
performance indicators and situation
138 .1t.)
parameters (e.g., ta et range) atthe various times o the actualscenario and the alternatives underinvestigation;
(4) modify any of the scenarioparameters, enabling compaeison ofthe tactical parameters . andperformance indicators at any.subsequent point in time;
(5) enter new performance indicatoralgorithms to be recorded anddisplayed during or subsequent tothe exercise;
(6) insert instructor cues to beautomatically keyed at times duringthe exercise based on variousaspects of the scenario parametersand/or time;
(7) configure new feedback displayformats for trainee briefings basedon the training objectives, issuesto be focused on, or parametersgenerated during the scenario, andso on;
(8) develop new subjective performancemonitoring input categories forobservation entry by the instructorduring the actual exercise;
set up all necessary scenarioparameters for running the scenarioon the training device;
(10) permanently record the exercise inthe exercise library for laterreview-via this fast time simulationcapability, or for actual runningduring a training exercise on thetraining device.
Additional capabilities coutd -also belisted; but these present a good overviewof the type of capabilities that should beavailable to the instructor in an off-linemode of the training device to enable himto revieve- and configure scenarios. Alibrary of all available training exercisescenarios should obviously be accessible tothe instructor for this developmentprocess. This fast - time simulation .capability should be available 'at anoff-line location to the training device,permitting access and fast time simulationin parallel with the actual running of thetraining device. The terminal should haveappropriate graphical displays and an entrydevice,to facilitate instructor interaction,and'evaluation.
SMARTTS has identified the need for a fast-time simulation capability to assist the instructor in developing exercises. This full capability would require the fast-time simulation capability as well as other elements of the exercise development function. In the preprototype SMARTTS this capability was given secondary importance to the monitoring, control, and briefing capabilities. Nevertheless, a fast-time simulation capability at a remote terminal was provided on SMARTTS, partly due to a particular design of the system which has two parallel complete operating systems with associated simulation models and two complete instructor consoles (i.e., one in a classroom and the other in an attack center). The SMARTTS system permits the fast-time running (i.e., at a rate up to 16X) of any exercise scenario in the library. All of the tactically relevant variables and performance indicators can be evaluated on either of the instructor consoles. Modifications of any of the tactical parameters can be effected from these consoles, with the modified exercises permanently stored for later training use. This SMARTTS capability, during the limited time in which it has been given operational use, has been found to greatly assist the instructors in preparing and evaluating scenario exercises. Requirements had been earlier identified to more fully develop this capability in the later developmental versions of SMARTTS; initial indications to date are that these planned additions should be carried out.
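The fast-time replay idea is simple enough to sketch. The fragment below is an illustrative sketch only, not SMARTTS code; all names are hypothetical. It steps through a stored exercise scenario at an accelerated rate, up to the 16X replay rate cited above:

```python
# Illustrative sketch (hypothetical names, not the SMARTTS design):
# replaying a stored exercise scenario at an accelerated rate,
# up to the 16X fast-time rate described in the text.

def replay(scenario, rate=16):
    """Yield samples from a recorded scenario at `rate` times real time.

    `scenario` is a list of (time, state) samples recorded once per
    second; at rate R, each wall-clock second covers R scenario
    seconds, so every R-th sample is yielded.
    """
    if not 1 <= rate <= 16:
        raise ValueError("rate must be between 1X and 16X")
    for i in range(0, len(scenario), int(rate)):
        yield scenario[i]

# Example: a 60-second scenario replayed at 16X yields 4 frames.
samples = [(t, {"target_range": 10000 - 50 * t}) for t in range(60)]
replayed = list(replay(samples, rate=16))
```

A real implementation would, of course, advance the full simulation models rather than skip recorded samples; the sketch only shows the time-compression bookkeeping.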
Instructor Interface Language. The typical instructor in military training systems has a good operational background (e.g., submarine operating experience for a submarine tactical instructor), although a limited instructional and computer background, particularly with regard to operating particular training devices. Hence, it is desired to have a computer language developed to facilitate the instructor's interface with the training device. This interface language should permit the instructor to interact using somewhat standardized operational terminology, instructional terminology, and/or near-English terminology. It should be designed to accommodate those activities normally performed by the instructor when interfacing to the training device. With regard to exercise development, for example, this language should accept descriptive commands by the instructor in the form of the normal operational parameters (e.g., whereas the computer manipulates targets on an x, y, z grid, operational personnel often view the geographic situation in terms of range, bearing, and depth/elevation). Furthermore, the interface language should be structured to contain a large set of "macro functions" that readily translate the
instructor's needs into exercise,
performance indicator, cue, feedback
display, etc., modifications. For example,
the capability should be provided to enable
the instructor to readily tailor feedback
displays to particular exercises. This
macro function should enable the instructor
to rapidly provide the information
necessary to configure the display in terms
as close as possible to those operationally
used, and then enable the computer to
automatically configure the display. This
might be accomplished using a digital
interface drawing tablet, with appropriate
parameter data readily accessible, and with
standardized manipulation algorithms
available.
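The operational-to-grid translation mentioned above is straightforward to sketch. The function below is a hypothetical illustration, not part of any actual interface language; it assumes a grid convention of x east, y north, and z positive downward, and a true bearing measured in degrees from north:

```python
# Illustrative sketch (hypothetical names and conventions): converting
# the operational coordinates used by personnel (range, bearing, depth)
# into the x, y, z grid manipulated by the computer.
import math

def operational_to_grid(rng, bearing_deg, depth):
    """Convert range, true bearing (degrees from north), and depth
    to grid coordinates with x east, y north, z positive down."""
    b = math.radians(bearing_deg)
    x = rng * math.sin(b)   # east component
    y = rng * math.cos(b)   # north component
    z = depth               # depth measured positive downward
    return x, y, z

# A target at 1000 yd due east, 200 ft deep:
x, y, z = operational_to_grid(1000, 90.0, 200)
```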
SMARTTS, although not possessing capabilities to this extent, does permit instructor flexibility in setting up and tailoring trainee information displays. SMARTTS, for example, provides several generic plot formats of X versus Y with which the instructor can display any parameter (i.e., performance indicators and tactical variables) as a function of any other parameter. The instructor simply has to indicate which variables he wishes to plot and they are automatically plotted. The instructor can also set up several groups of parameters to be called up for display simultaneously by requesting the group(s) rather than each individual parameter. These capabilities are not only useful during the exercise generation function but are also intended for the briefing function. SMARTTS has identified the need for an instructor interface language for exercise development, but has not formally developed such a language as part of the preprototype system, although some capabilities of this type are included in the preprototype.
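The grouped call-up described above amounts to a small amount of bookkeeping, sketched below. This is a hypothetical illustration, not the SMARTTS interface; the parameter names are invented:

```python
# Illustrative sketch (hypothetical names): the instructor defines
# named groups of recorded parameters once, then calls up a group
# for display instead of requesting each parameter individually.

class DisplayManager:
    def __init__(self, recorded):
        self.recorded = recorded  # parameter name -> list of values
        self.groups = {}          # group name -> list of parameter names

    def define_group(self, name, params):
        self.groups[name] = list(params)

    def call_up(self, group):
        """Return the data series for every parameter in a group."""
        return {p: self.recorded[p] for p in self.groups[group]}

dm = DisplayManager({"own_speed": [5, 8, 10],
                     "p_counterdetect": [0.1, 0.3, 0.6]})
dm.define_group("approach", ["own_speed", "p_counterdetect"])
series = dm.call_up("approach")
```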
MONITOR AND CONTROL TRAINING. The instructional support subsystem should provide a predominance of its capabilities for the monitoring and control of training, and for the briefing function which is addressed later. Suggested monitoring and control capabilities are presented below, in reference to Figure 4.
Exercise Set-Up. At the point when the scenario exercise will be run on the training device during actual training, it should have undergone prior development and evaluation. Hence, typically, exercise set-up would be a matter of simply selecting the appropriate exercise from the library and initiating the problem on the training device. Some minor modification may also be desired at this time. An exercise library, therefore, should be available to the instructor from which he can select the appropriate exercise. Actual set-up on the device should be as automated as possible, so as to free the instructor to perform other duties, and to enable quick turn-around time between training exercises if desired. The exercise set-up procedure on today's training devices, even after the exercises have been fully developed, is often cumbersome and time consuming; manual entry of all the set-up parameters is often necessary at the time of initiating the exercise (e.g., in some instances this may take up to 45 minutes). This laborious entry process should be unnecessary for standard exercises; on a sophisticated training device the exercise parameters can simply be stored on a disk for automatic entry. A capability should be provided to enable the instructor to modify any of the parameters at the time of initiation, if he so desires.
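The library-based set-up with last-minute instructor overrides can be sketched in a few lines. This is a hypothetical illustration of the idea, not an actual device interface; the exercise names and parameters are invented:

```python
# Illustrative sketch (hypothetical names): stored exercise parameter
# sets are loaded from a library and entered on command, replacing
# lengthy manual entry, with optional instructor overrides applied
# at initiation time.

EXERCISE_LIBRARY = {
    "approach_01": {"target_course": 270,
                    "target_speed": 12,
                    "own_depth": 150},
}

def set_up(exercise_name, overrides=None):
    """Fetch a stored exercise and apply any last-minute modifications."""
    params = dict(EXERCISE_LIBRARY[exercise_name])  # copy; library intact
    params.update(overrides or {})
    return params

params = set_up("approach_01", overrides={"target_speed": 15})
```

Note that the stored copy is left untouched; as the text describes, a modified exercise would be saved back to the library only on explicit instructor command.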
SMARTTS provides an instructor's console
immediately adjacent to the fire control
system consoles from which the instructor
can select an exercise from the exercise
library and have it automatically entered
upon command. He can also investigate and
manipulate many of the relevant tactical variables at this time to modify the exercise. Furthermore, if the instructor so desires, a modified copy of the exercise can be automatically stored in the exercise library as a new exercise for later use. This type of capability substantially reduces the low-level, time-consuming task of set-up.
Exercise Monitor. The instructor must monitor both the scenario and the trainee activities. Relevant monitoring information should be generated by the training device, with much of the information selectable by the instructor in real-time when he requires it. It should be provided to him in a timely manner and in a clearly understood and meaningful format. Cues should be provided as appropriate to off-load an appropriate part of his monitoring function. This information should be provided to the instructor in a convenient location which enables him to perform his other duties with a minimum of travel. Both alphanumeric and graphical information should be provided, depending on the purpose of the information and the precision the instructor requires.
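The cueing notion above (the device watches selected values against predetermined limits and alerts the instructor) can be sketched simply. This is a hypothetical illustration; the indicator names and limits are invented, echoing the counterdetection example given later in the text:

```python
# Illustrative sketch (hypothetical names): compare selected
# performance indicators against predetermined limits and raise a
# cue for the instructor when a limit is exceeded.

def check_cues(indicators, limits):
    """Return a cue message for each indicator beyond its limit."""
    cues = []
    for name, value in indicators.items():
        limit = limits.get(name)
        if limit is not None and value > limit:
            cues.append(f"{name} exceeds {limit}: now {value}")
    return cues

cues = check_cues({"p_counterdetection": 0.55, "target_range": 8000},
                  {"p_counterdetection": 0.50})
```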
SMARTTS provides an instructor's console in the attack center, immediately adjacent to the fire control party and the MK117 fire control system. An identical instructor's console is located in the classroom for briefing purposes. SMARTTS automatically records a wide variety of tactical variables and performance indicators. This information is available to the instructor upon request at the instructor's console. Both graphical and alphanumeric formats are provided, depending upon the information and the instructor's desires. The display is automatically updated over time to provide the most recent information, as well as to provide historical information from the beginning of the exercise or other designated times. Cues are provided to the instructor regarding (1) performance indicator values or tactical parameter values that have gone beyond predetermined limits (e.g., probability of counterdetection going beyond 50 percent); and (2) pending action by the automatic interactive target (AIT) (see below regarding control capabilities). The display formats used to provide information to the instructor were developed with regard to the requirements of the training situation and the type of information submarine officers normally deal with (e.g., line-of-sight diagram). Hence, the instructor need not return to the program operator consoles to monitor exercise progress. Rather, he can stay with the fire control party to monitor the trainees' actions, and still have the normal scenario monitoring information available. Furthermore, SMARTTS provides the instructor with a variety of additional information concerning trainee performance and the problem status.
Data Recording. The analysis of instructor functions and tasks revealed that typical instructors spend a relatively large amount of time recording exercise data, if they provide this type of feedback to the trainees after the exercise. Furthermore, the better postscenario briefing sessions are those in which the instructor has substantial information available for presentation and discussion with the trainees regarding various aspects of the problem. Much of the information recorded by the instructor is normally generated by the training device (e.g., tactical parameters and events). The typical computer-based training device has the capability to record these data for later accessing by the instructor during the postscenario briefing. In addition, a considerable part of the instructor's task is to monitor and record actions of the trainee that could not be automatically recorded by the system (e.g., communication between team members, coordination). These types of subjective performance indicators would be relatively difficult to achieve via automated means in many training situations. Hence, the instructor should focus a substantial portion of his time on monitoring and recording these types of performance indicators for later feedback.
The training device could assist the instructor by facilitating his recording of this information via a standardized interface with the training device.

The SMARTTS system records a wide variety of tactical variables that occur during the exercise scenario. Additionally, a set of objective performance indicators are generated and also automatically recorded during the exercise scenario. Finally, a capability is provided for the instructor to enter a variety of subjective observations into the training device/computer storage facility. For example, if the instructor observes highly proficient communication between the fire control coordinator and the plot coordinator during a particular time of the scenario regarding a particular aspect of the scenario, he can easily enter this observation into the computer by designating the appropriate observation. All of the above data are readily accessible to the instructor during the exercise, and also following the exercise for discussion in the briefing session. This set of capabilities not only reduces the instructor's load, but also provides tools to the instructor for training that he heretofore did not have. For example, the instructor typically was unable to monitor changes in the fire control system solution accuracy over the course of the exercise as a function of ownship's maneuvering; this information is now automatically recorded and can be accessed by the instructor at any time, including during the postscenario briefing session.
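The two recording streams described above (automatically logged tactical data and time-tagged subjective observations designated by the instructor) might be structured as sketched below. This is a hypothetical illustration, not the SMARTTS design:

```python
# Illustrative sketch (hypothetical names): one store for
# automatically recorded tactical variables/indicators, another for
# the instructor's time-tagged subjective observations, both
# retrievable later for the briefing.

class ExerciseRecorder:
    def __init__(self):
        self.tactical = []      # (time, name, value) samples
        self.observations = []  # (time, category) instructor entries

    def record(self, t, name, value):
        self.tactical.append((t, name, value))

    def observe(self, t, category):
        """Log a predefined subjective observation at exercise time t."""
        self.observations.append((t, category))

    def at_time(self, t):
        """Retrieve everything recorded at time t, for briefing review."""
        return ([s for s in self.tactical if s[0] == t],
                [o for o in self.observations if o[0] == t])

rec = ExerciseRecorder()
rec.record(120, "solution_error", 450)
rec.observe(120, "good fire-control/plot coordination")
tactical, obs = rec.at_time(120)
```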
Scenario/Exercise Control. The instructor should be provided with a wide range of control capabilities to enable him to readily control all aspects of the scenario, and also control presentation of pertinent information to the trainees as necessary. The control capabilities should be provided in a convenient location, similar to that for the monitoring information, preferably near the trainees such that the instructor can continually monitor the trainees and the problem while entering necessary control commands. Ideally, the instructor would spend a minimum of time controlling the exercise, devoting most of his resources to other training-related activities. In this regard, scripted scenarios should be available wherein the target has predetermined actions. Additionally, other types of control for the target and other aspects of the problem would be desirable if they freed the instructor from these tasks. It should be noted, however, that the instructor should be allowed to control any of the aspects he so desires. In addition to the scenario control capabilities, the information presentation capabilities should likewise be located for appropriate viewing by the trainees, and so that the instructor can control the information presentation via convenient and effective media. The location of this information presentation (e.g., feedback) would normally be either in the operational simulator environment (e.g., attack center) and/or in a briefing room. In either case, the instructor should have complete control over the information presented, enabling him to rapidly configure the desired displays.
The instructor's console in the SMARTTS
system is located near the fire control
party in the attack center. The instructor
is presented with monitoring information on a color-graphic CRT as well as the plasma display entry device. The instructor can control many of the relevant tactical problem variables (e.g., maneuver targets) from this console, as well as control a wide variety of information to be presented to the trainees. The information presentation to the trainees is achieved in the attack center via two overhead-mounted color CRT displays. A variety of display formats can be put up independently on each of these CRTs via command from the instructor's console. The similar instructor's console located in the adjacent classroom controls the information to be displayed on a large screen display during briefing sessions.
SMARTTS has the normal scripted scenario capability, along with an automatic interactive target (AIT). The AIT capability provides a computer-controlled model of an enemy target platform. This model is based on the best available intelligence information for the particular platform. The model automatically controls the target in response to the evolving situation events in real-time. Cues are provided to the instructor prior to any action on the part of the AIT, enabling the instructor to (1) allow the AIT to carry out its planned action, (2) override the AIT's planned action by having it continue doing its current activity, (3) enter a different target course of action, or (4) take over manual control of the target. The AIT capability is intended to reduce the instructor's load in controlling the principal target, which has been observed to be considerable. The AIT, furthermore, brings in the best available intelligence information to probabilistically control the target in a realistic fashion. This capability will assist in offsetting experience differences and biases between instructors.
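The four instructor responses to an AIT cue enumerated above reduce to a small dispatch, sketched below. This is a hypothetical illustration only; the AIT model itself is far more elaborate, and the names here are invented:

```python
# Illustrative sketch (hypothetical names): resolving an AIT cue
# according to the four instructor options described in the text:
# allow the planned action, hold the current activity, enter a
# different course of action, or take manual control.

def resolve_ait_cue(planned, current, choice, new_action=None):
    """Return the target action to execute after an instructor cue."""
    if choice == "allow":
        return planned            # AIT carries out its planned action
    if choice == "hold":
        return current            # AIT continues its current activity
    if choice == "redirect":
        return new_action         # instructor-entered course of action
    if choice == "manual":
        return "manual control"   # instructor controls the target
    raise ValueError(f"unknown choice: {choice}")

action = resolve_ait_cue(planned="turn to 090",
                         current="steady on 180",
                         choice="hold")
```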
Information Presentation. Information presentation to the trainees, in the form of a preproblem briefing or postproblem feedback, is an essential part of an effective training process. Appropriate information must be generated by the training system, and presented to the trainees in an effective way to assist their assimilation. A wide variety of media is available for presenting information to the trainees. The most common is that of verbal presentation by the instructor. Unfortunately, verbal presentation is limited with regard to handling complex relationships. The use of augmenting graphical information (i.e., pictures) and examples has been shown to substantially increase the effectiveness of training (Lesgold, Pellegrino, Fokkema, and Glaser, 1978). Hence, a visual information display capability should be made available in the instructional subsystem, particularly for complex training situations. A variety of capabilities may be associated with that information display, such as a fast-time model to explore alternative actions in a given problem. Also, a variety of information should be available in the training device regarding the recently completed exercise scenario for presentation to the trainees. Ready access to, and flexible control over, this information and these capabilities should be provided to the instructor at an appropriate location. Obviously, the monitor, record, and control capabilities act together with the information display capabilities to provide appropriate information to the trainees. The information presentation capabilities are further discussed below under the briefing function, which overlaps considerably with this monitor and control function.
The monitor and control capabilities of the instructional subsystem are quite extensive on SMARTTS, as part of the submarine combat system trainer. These capabilities are generic, although their specific characteristics have been tailored to the submarine tactics training problem. The above discussion presents a very general summary overview of the particular characteristics incorporated in SMARTTS for this instructor function, and its associated tasks, issues and considerations.
BRIEFING. The briefing of trainees is a major training process function of the instructor. It obviously overlaps with the monitor and control tasks as noted above, and also overlaps with the trainee interface requirements of the instructional subsystem. It is the opinion of this author that many of the trainee interface characteristics of the instructional subsystem are the same as those required for the briefing function. The briefing function of the instructor deals with providing external information to the trainees to correlate with their actual or desired actions. This information is a primary means by which training actually occurs. For example, this information provides feedback informing the trainee of how successful his particular actions were with regard to the operational objectives. Since a major objective of training is to develop an awareness on the part of the trainee of the relationship between available information, alternative actions that can be taken, and the resultant likely situation outcomes, external information presented to the trainee is extremely important. Hence, the effectiveness with which information can be presented to and assimilated by the trainee will have a direct bearing on the effectiveness of the training process. Information presented to the trainee attempts to achieve the following:
(1) explain a particular evolving situation (e.g., events that occur during a tactical encounter),
(2) identify possible alternative situations and actions (e.g., the types of targets likely to be encountered in this area, each target's likely set of tactics, and appropriate defenses),
(3) the outcome of the trainee's particular actions during the problem (e.g., the trainee team's tactical approach on a target),
(4) the relationships between tactical variables that were relevant in the problem (e.g., the relationship between ownship speed and probability of counterdetection), and
(5) the likely impact of various alternative trainee actions on the situation outcome (e.g., the impact of different ownship speeds on achieving a successful target approach).
These categories of information might be differentially brought up prior to a real-time training exercise, during a freeze/pause in that exercise, or following the exercise.
The information categories fall into two general classes. The first is to provide information regarding a particular problem. This is typically in the form of feedback after the problem occurred. It may involve dissecting the problem to investigate the trainee's actions, why he took those actions, what options were available to him, and so on. The second class of information is somewhat independent of the particular actions on the part of the trainee. Rather, it deals with the general set of possible problems, investigating relevant issues surrounding a particular set of problems. This would normally occur in a briefing session prior to the real-time exercise on the device. It may also evolve during the postproblem briefing session. Whereas the former class of information presentation relies on the data generated and collected during the training exercise, this latter relies on a fast-time modeling capability to generate the various alternative actions, tactical parameters, and performance indicators. This briefing capability should be located where most convenient to the trainees, and conducive to the types of activities occurring during the training process.
Information displays in the SMARTTS system have been provided both in the attack center and in the classroom. These two locations recognize the different activities that may take place in both. The classroom can operate in conjunction with the attack center (e.g., monitoring the ongoing exercise in the attack center), or both can operate independently on separate problems. A variety of performance indicators are available for display to provide the trainees with information concerning the various aspects of their performance. The intention in SMARTTS has been not to develop performance indicators pertaining to good or bad performance, but rather to generate information that can be meaningful from the standpoint of informing the trainee about the scenario and his performance. Often, performance indicators change in an opposing fashion, necessitating trade-offs between them when selecting the appropriate tactical action. From a training process standpoint, the important consideration is not how well the trainee did, but rather that the trainee understand the impact of his actions and the other actions available to him. SMARTTS provides the instructor with a wide range of capabilities to access the various information that was recorded during the training problem. A variety of information display formats are available for presenting and discussing this information, as well as a range of flexibility to dissect the information and focus on particular aspects of the problem. The instructor's subjective observations, which were entered during the exercise, are also available for reference during the postproblem briefing session. Also, time-tag units are available at several trainee locations to enable the trainees to place a time-tagged indication in the scenario recording at a point when they were concerned with a particular issue and unable to bring it to the attention of the instructor. These trainee-initiated time-tags can be accessed during the postproblem briefing session for investigation.
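The trainee time-tag mechanism just described can be sketched as a simple time-ordered log. This is a hypothetical illustration, not the SMARTTS implementation; station names are invented:

```python
# Illustrative sketch (hypothetical names): a unit at each trainee
# station drops a time-stamped marker into the scenario recording,
# which the instructor later pulls up during the postproblem briefing.

class TimeTagLog:
    def __init__(self):
        self.tags = []  # (exercise_time_seconds, station) pairs

    def tag(self, t, station):
        self.tags.append((t, station))

    def for_briefing(self):
        """Return all trainee tags in time order for review."""
        return sorted(self.tags)

log = TimeTagLog()
log.tag(840, "plot coordinator")
log.tag(310, "fire control coordinator")
tags = log.for_briefing()
```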
The fast-time simulation capability, discussed above several times, is also available in SMARTTS to generate problems for presentation and discussion. The fast-time capability may be used in a preexercise briefing wherein the trainees can be given a preview of the particular exercise they will encounter, or other relevant exercises. Various performance indicators and tactical parameters can be investigated at this prebriefing time to focus on the aspects of importance during the problem. Alternative sets of ownship and/or target actions can also be addressed at this time to explore their impact on the various tactical parameters and performance indicators. This capability is also available during a problem freeze and during the postproblem briefing session. During the latter session, alternative sets of actions may be explored for the particular problem just completed in the attack center. It is important to note that models are provided in SMARTTS to generate information that would normally be generated by the trainee team (e.g., target motion analysis solution). This information, generated in fast time, is useful for comparison with the actual values generated by the trainee team with regard to the alternative actions being investigated.
It is important that an adequate human factors design be implemented for the instructor/computer interface with regard to the briefing capabilities. At this time, the instructor is standing on-stage and has to perform in a timely fashion. These briefing capabilities provide the instructor with a substantial amount of information to access and present to the trainees. However, it must be facilitated in such a way that it can be brought up flexibly and quickly, with relative ease on the part of the instructor. It must also be presented clearly to the trainees. The SMARTTS instructor's console (e.g., a color graphic CRT and a plasma display entry device), as noted above, has been designed to facilitate this interface.
Training System Management
A variety of capabilities are desirable for training system management. The most important capability with which the device automation can assist is the recording and storage of data generated during exercise scenario runs, along with the statistical analysis routines to enable investigation of these data at a later time. As noted earlier, important training system-related data are generated each time the training device is used. These data can be extremely useful in evaluating and upgrading a variety of aspects of the training process (e.g., development of exercises as noted above).

The SMARTTS front-end analysis has identified the need for many of these capabilities. The preprototype SMARTTS system has the capability to record and store long-term exercise data for later analysis. Additional analysis capabilities should become a part of the later SMARTTS developmental units.
SUMMARY
The training device is more than a simulator. It has both simulation and instructional subsystems, each specifically designed for a vastly different purpose. The instructional subsystem encompasses capabilities that directly support the training process. These capabilities focus on the instructor, the trainee, and training system management. Their incorporation in modern training devices/training systems is essential to achieve the high levels of training cost/effectiveness requisite for adequate operational effectiveness of modern weapon systems.
Although many training support capabilities should be automated, such as the computer-driven automatic interactive target and others discussed in this paper, many additional capabilities that are part of the instructional subsystem can be effectively accomplished via traditional manual means (e.g., instructor guidelines for exercise development). The training device/training system should provide a range of capabilities in the instructional subsystem selected on the basis of their cost/effectiveness impact on the training process. These capabilities, furthermore, should be flexibly tailored to each specific training situation. The training device/training system can greatly assist the instructor by enabling him access to heretofore unavailable complex information directly impacting trainee performance (e.g., probability of counterdetection). Presentation of this complex information can also be enhanced via presentation on highly effective media (e.g., large-screen graphical display). A variety of data manipulation capabilities can enable dissecting various operational problems, investigation of alternative actions, illustration of a range of examples, and so on (e.g., via fast-time computer modeling). These and many other computer-based and manual capabilities make up essential elements of the instructional subsystem.
The instructional subsystem encompasses elements beyond those discussed in this paper, such as a variety of manual media typically used in the training process. The instructor himself, for example, is probably the single most important element of the instructional subsystem and the training system (Gardenier and Hammell, 1981). Empirical research has shown that the instructor can have a substantial impact on the effectiveness of simulator-based training. The capabilities discussed in this paper provide tools for the instructor to enable him to achieve a more effective training process. Nevertheless, the fundamental characteristics of the instructor himself, independent of these tools, may have a substantial impact on the effectiveness of any training situation.
As discussed early in this paper, we are now entering the third generation of training devices, emphasizing the importance of the instructional subsystem and built-in training technology capabilities to enhance the training process. Although the emphasis given to the instructional subsystem in the past has been relatively minor, its importance is becoming recognized, with increasing emphasis given to these capabilities in the design of today's training devices/training systems.
SMARTTS is one example, and a forerunner of that trend. Another example is a set of guidelines for deck officer training systems (Gynther, Hammell, Grasso, and Pittsley, 1982) which address the use and design of the shiphandling/ship bridge simulator for training senior commercial ship deck officers. This report recognizes and places emphasis on "three major elements of the training system - the simulator design, the training program structure and the instructor qualifications" (page i). This document, which is intended to provide training system design guidance to the potential users of shiphandling/ship bridge simulator-based training devices, addresses and recommends many of the training assistance characteristics discussed in this paper, and embodied in the SMARTTS preprototype.
The instructional subsystem has been presented in this paper as one of two major parts of the training device. As such, the training-related capabilities were strongly emphasized, while the fundamental engineering aspects of this subsystem were played down. In the past, along with the lack of adequate instructional capabilities, most training devices received inadequate attention and scrutiny of their training-related capabilities. Following from the effective procedures established to monitor the design and development of the hardware and software aspects of the training device, a similar procedural policy should be established by training device procurement activities to monitor and evaluate the potential training effectiveness of the training device. These procedures should evaluate the training device from its initial conceptual design stages on through development, factory acceptance testing, and site acceptance testing. The focus of these additional evaluations should not be on the specific hardware and software engineering aspects, but rather on the potential impact of the trainers and their development characteristics on training effectiveness. The evaluation of the potential training device's characteristics should not end with the initial conceptual design; rather, it should continue to monitor the specific details of the implementation of those initial concepts. For example, a detailed human factors review of the various characteristics of the instructor interface should be accomplished at appropriate stages during the development process to verify effective design details. SMARTTS developed a training system evaluation test (TSET) to be conducted at the time of the Factory Acceptance Test for the hardware and software. The TSET addressed specific characteristics of the design as they were expected to impact the effectiveness of the training process. This type of evaluation should take place at all stages from initial conceptual design through the production run of a particular device, and continue after the device is operational to constantly evolve its capabilities and hence improve its training effectiveness.
This paper focused heavily on training system aids for assisting the instructor and providing an effective interface with the trainee. It did address, although to a limited extent, the importance of the training methodologies employed for achievement of an effective training process. The instructor, the training device, and its tools are mechanisms for implementing an effective training process. The methods they use in implementing this process are extremely important for achieving maximally cost-effective training. The training methods are, obviously, an extremely important part of the instructional subsystem. They represent the strategy that is employed to achieve the training objectives. As training system capabilities improve, the breadth and potential effectiveness of available training methodologies also increase. For example, learning by example has been shown to be an effective training method (Lesgold et al., 1978). This can be accomplished in the modern training system in a variety of ways, including several facilitated by capabilities discussed in this paper (e.g., fast-time modeling in the classroom to investigate alternative tactics). By providing appropriate instructional support capabilities on the training device, effective training methods that could not previously be used due to problem complexity or other situation limitations are now available.
The training expert must also be careful not to overlook other training methodology principles which are available but might be overshadowed by the automated capabilities provided on sophisticated training devices. For example, overlearning is a potentially effective approach for training individuals to perform under high stress conditions; taking advantage of the benefits of overlearning in such situations does not require any particular instructional subsystem capabilities (Fitts, 1965). As always, the most cost-effective training process will be achieved by taking into account all of the relevant elements that impact that process, including the simulation characteristics of the training device, the training methods employed, and the capabilities of the instructional subsystem.
REFERENCES
Lesgold, A.M., J.W. Pellegrino, S.D. Fokkema, and R. Glaser. "Cognitive Psychology and Instruction," in Proceedings of the NATO International Conference held at the Free University of Amsterdam, June 1977, sponsored by the NATO Special Program Panel on Human Factors. New York: Plenum Press, 1978.
Chenzoff, A.P., et al. "Guidelines for Training Situation Analysis (TSA)." Applied Science Associates, Incorporated, Valencia, Pennsylvania, July 1965.
Comptroller General. Report to the Congress of the United States: "Effectiveness of U.S. Forces Can Be Increased Through Improved Weapon System Design." U.S. General Accounting Office, PSAD-81-17, January 29, 1981.
Krahenbuhl, G.S., P.W. Darst, J.R. Marett, L.C. Reuther, and S.M. Constable. "Undergraduate Pilot Training: Instructor Pilot Behavior and Student Stress and Performance, Level II." Department of Health and Physical Education, Tempe, Arizona, July 1980.
Charles, J.P. "Instructor Pilot's Role in Simulator Training (Phase III)." U.S. Naval Training Equipment Center Report, NAVTRAEQUIPCEN 76-C-0034-2, June 1978.
Gardenier, J.S., and T.J. Hammell. "Shiphandling Simulator Standards and Training Technology." U.S. Coast Guard, Washington, D.C., and CAORF/Ship Analytics, North Stonington, Connecticut. Paper presented at the Second International Maritime Simulation Conference, Kings Point, New York, June 1981.
Gynther, J.W., T.J. Hammell, J.A. Grasso, and V.M. Pittsley. "Simulators for Mariner Training and Licensing: Guidelines for Deck Officer Training System." National Maritime Research Center, Kings Point, New York, January 1982.
NAVEDTRA 106A. "Interservice Procedures for Instructional Systems Development." Center for Educational Technology, Florida State University, and U.S. Army Combat Arms Training Board, Ft. Benning, Georgia, 1975.
Fitts, P.M. "Factors in Complex Skill Training," in R. Glaser (Ed.), Training Research and Education. Wiley, New York, 1965.
Hammell, T.J. "The Training Device Is More Than a Simulator." Ship Analytics, Incorporated, North Stonington, Connecticut. Paper presented at the Second International Maritime Simulation Conference, Kings Point, New York, June 1981.
Hammell, T.J., J.W. Gynther, J.A. Grasso, and M.E. Gaffney. "Simulators for Mariner Training and Licensing, Phase 2: Investigation of Simulator Characteristics for Training Senior Mariners." National Maritime Research Center, Kings Point, New York, January 1981.
Hammell, T.J., and T.N. Crosby. "Development of SMARTTS Training Technology." Eclectech Associates, North Stonington, Connecticut, and Naval Training Equipment Center, Orlando, Florida. Paper presented at the Second Interservice/Industry Training Equipment Conference, Salt Lake City, November 1980.
Caro, P.W. "Aircraft Simulators and Pilot Training," in Human Factors, 1973.
Hammell, T.J., F.P. Sroka, and F.L. Allen. "Volume I of II, Study of Training Device Needs for Meeting Basic Officer Tactics Training Requirements." Naval Training Device Center, Orlando, Florida, 1971.
Hammell, T.J., C.E. Gasteyer, and A.J. Pesch. "Volume I of II, Advanced Officer Tactics Training Device Needs and Performance Measurement Technique." General Dynamics/Electric Boat Division, Groton, Connecticut, November 1973.
PROCEDURES MONITORING AND SCORING IN A SIMULATOR

Steve Seidensticker
Terry Kryway

Tactical and Training Systems Division
Logicon, Inc.
San Diego, CA
ABSTRACT
As the quality of cockpit simulation increases and as economic pressures force more training into simulators, the number and type of training tasks handled in the simulator are also expanding. Many of these new tasks are procedures oriented. This is particularly evident in pilot replacement training in high performance aircraft. In such situations the simulator is used to expose the student to emergencies and other situations where judgement and the ability to follow certain procedures accurately and quickly are vital. To do this, rational techniques are needed to detect and assign meaning to the procedural events. This paper will relate in some depth the efforts to incorporate a comprehensive procedures monitoring and scoring facility in an Operational Flight Trainer.
BACKGROUND
The quality of flight and weapons system simulation in military simulators has advanced rapidly with the application of digital computers. This is particularly visible in the areas of visual scene and weapons system simulations. Transitional (from one aircraft to another) training curricula have taken advantage of these recent advancements, and flight trainers are now being used in scenarios where more than just flying skills and the rote responses to emergencies are being taught. Instructors are now subjecting the student to situations where preplanning, headwork, and a thorough knowledge of the aircraft and its systems are required.
General Problem
The quality of training received from these expensive systems depends a great deal on the instructor's ability to operate these trainers and on his ability to keep up with activities in the cockpit. Often the existence of such ability is an exception rather than the rule. This is not because instructors are incapable of the tasks, but because the complexities of operation require substantial training and experience in order for the instructors to become proficient. And operation of the trainer is only one of the many responsibilities of the flight instructor.
An example of such a situation stems from a recent improvement to the Navy's F-14 Operational Flight Trainer (OFT), device 2F95. The visual system has been enhanced to simulate carrier operations including catapult launches. The training syllabus has taken advantage of this, and now single engine failures are being practiced during the catapult shots. A timing problem exists in the insertion of the engine failure after the catapult stroke has started. If it is not inserted at precisely the correct moment, the simulation is not accomplished correctly and the training value is lost. Therefore the instructor spends most of his attention on the operation of the trainer and not on the student's reaction to the emergency. If the student crashes, the instructor obviously notices the result, but does not know what, if anything, the student was doing prior to getting wet.
Recent emphasis has been placed on simulator design to help the instructor with scenario generation and problem control. One of the important aspects of problem control is that of procedures monitoring and performance measurement. Though simulators in the past have attempted to provide the instructor with this type of information, in most cases these enhancements are the first to be ignored by the instructor.
Early Attempts
An example of recent performance measurement capabilities is on the existing F-14 OFT. This trainer has the capability of monitoring the performance of a trainee and issuing a hardcopy printout whenever performance limits are exceeded. The instructor may select five of ten flight parameters and set a low and high limit on these values. He may also segment the exercise by time. Then, by paging into an alphanumeric display, he may observe the trainee performance with respect to these flight parameters.
Some of the problems voiced by instructors with respect to this particular mechanism are:

The performance monitoring is restricted to flight parameters only.

The high and low limits are fixed values, and the values are often meaningless because they usually must be related to the scenario.

Indexing by time into the training session is not effective in that instructors do not keep track of a training session by time, but by training task. Time references may be meaningful if used within a particular training task but not if used with respect to the entire training session (e.g., two miles after commencing an approach is meaningful; however, 17 minutes after commencing the exercise is not).

Setting up the performance measurement is a laborious process. The instructor must use a keyboard and paging functions for data entry.

The data generated is not readily available, nor is it in a format easy to interpret and apply to the evaluation of the student's progress.
Another performance measurement feature in a recently delivered flight simulator is an overview of the cockpit activity in which the trainee's 20 most recent actions are listed on a CRT.
Interviews with and observations of users of this device indicate that the time event monitor is not very effective for monitoring specific procedures in the cockpit because:

Not all of the 20 most recent actions may be applicable to the task the instructor is monitoring, and the page tends to become cluttered with inappropriate information.

Certain procedures applicable to the task may not be represented on this page and must be followed on another area of the Instructor/Operator Station (IOS).

When the student becomes very busy in the cockpit, these action messages appear and disappear more quickly than they can be read.
ISS APPROACH
Rational techniques are needed to detect and assign meaning to the procedural events and performance measurements in a trainer. This information must be made readily available and must be presented in an easy to understand format to the user. While this may not seem to be a difficult task on the surface, it has not been built into any trainer successfully.
The remainder of this paper describes efforts by Logicon personnel to incorporate a comprehensive procedures monitoring and scoring facility in the Instructor Support System (ISS). The ISS is the product of a general effort to improve IOSs sponsored by the U.S. Naval Training Equipment Center.
Problem Management
In approaching this effort, one of the first technical issues that arose was the sheer size of the problem and the computer processing power required to handle it. The F-14 OFT has approximately 350 switches and other pilot adjustable mechanisms in the cockpit. If one attempted to monitor all of them all the time and condense, evaluate, and present derived information to the instructor, one would require more processing power than is available with any computer and more software than a small army of talented programmers could produce.
The only reasonable solution appeared to be the creation of a mechanism whereby only the subset of activity in the cockpit that was appropriate at the moment would be monitored and processed. This resulted in the establishment of "task modules".* These are operationally logical training segments such as takeoff, departure, or hydraulic malfunction. A training session/scenario is composed of a group of these task modules. Part of each task module are detectable "events". The ISS can follow the scenario by watching for these events. For example, a combination of events such as gear transition to up, flaps transition to up, speed increasing through 200 knots, heading increasing and passing 275 degrees, "starts" a task module representing a particular departure from NAS Miramar. When this task module is started, only those flight parameters and procedures applicable to this training task are monitored and processed.
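The start-condition logic described above can be sketched in present-day code. This is a minimal illustration only; the class structure and event names are invented for this sketch and are not the actual ISS implementation.

```python
# Illustrative sketch: a task module "starts" when every one of its
# defined start events has been observed. Names are hypothetical.
class TaskModule:
    def __init__(self, name, start_events):
        self.name = name
        self.start_events = set(start_events)  # events that must all occur
        self.seen = set()
        self.active = False

    def observe(self, event):
        """Record a detected cockpit event; activate once all start events are seen."""
        if event in self.start_events:
            self.seen.add(event)
        if not self.active and self.seen == self.start_events:
            self.active = True
        return self.active

# The departure example from the text:
departure = TaskModule("SAN PEDRO DEPARTURE", [
    "GEAR_UP", "FLAPS_UP", "SPEED_ABOVE_200", "HEADING_PAST_275",
])
for ev in ["GEAR_UP", "FLAPS_UP", "SPEED_ABOVE_200", "HEADING_PAST_275"]:
    departure.observe(ev)
assert departure.active
```

Once a module is active, the system need only consult that module's own event and measurement files, which is what keeps the monitoring workload bounded.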
Task modules may also be "forced" to run by direct action of the instructor. More than one task module can run at the same time, if the activities associated with them overlap. This quality was also used to monitor vital safety functions at all times the aircraft was off the ground. A task module was defined whose start condition was "weight off wheels" and stop condition was "weight on wheels". Among other items, it monitored the accelerometer and informed the instructor of any aircraft overstresses. It was nicknamed the "umbrella" and kept the ISS from looking stupid by not informing the instructor if some disaster occurred. The maximum number of concurrently running task modules in the ISS is four. This constraint was due to processing power limitations, but it did not hamper operations.

* A companion paper, also prepared for this workshop, entitled MODULAR CONTROL OF SIMULATORS explains the concept of task modules in depth. Although this paper indicates how task modules fit into procedures monitoring and scoring, the reader is encouraged to refer to the other paper for additional information.

Task Module Structure
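The concurrency rule just described (several modules running at once, with a hard ceiling of four) can be sketched as follows. The scheduler class and module names are invented for illustration; only the four-module limit and the "umbrella" idea come from the text.

```python
# Hypothetical sketch of concurrent task-module scheduling with the
# four-module limit described in the text.
MAX_CONCURRENT = 4

class Scheduler:
    def __init__(self):
        self.running = []

    def start(self, module):
        """Start a module if the concurrency limit permits; return success."""
        if len(self.running) < MAX_CONCURRENT and module not in self.running:
            self.running.append(module)
            return True
        return False  # limit reached or already running

    def stop(self, module):
        """Stop a module, e.g. on its stop condition ("weight on wheels")."""
        if module in self.running:
            self.running.remove(module)

sched = Scheduler()
sched.start("umbrella")            # started at "weight off wheels"
sched.start("san_pedro_departure")
assert sched.running == ["umbrella", "san_pedro_departure"]
```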
In order to understand in more detail how the ISS monitors procedures and performance measurement, one must examine the ISS software and task module structure. The system is driven by a data base consisting of task module data files. Each task module is defined by a set of data files which are unique to that individual task.
One of the files within a task module set defines detectable events (or actions) that are directly, functionally, and physically related to the training task. Directly related actions are those that are depicted in the NATOPS manual as part of the procedure (e.g., switch A - off, switch B - stby, lever C - up, etc.). Functionally related actions are any actions which are similar in function but are not part of the procedure (e.g., in a particular hydraulic failure, a secondary hydraulic isolate switch is part of the hydraulic system of the aircraft but does not play a part in the particular procedure). By identifying this switch, a diagnostic message can be created informing the instructor if this switch is mistakenly thrown. Physically related actions are those linked to switches that are in close proximity to the appropriate switch and may look the same (e.g., if the wrong circuit breaker was pulled in a procedure, this identification would produce a diagnostic message).
In the definition of these events the following considerations are made:

Relationship of the cockpit device value to a reference value (greater than, less than, etc.).

Stable time (to avoid triggering on transient changes).

Whether the event is external (something happening in the trainer) or internal (a combination of external events).

Whether the event is a verification of a value or a change in its state.

Whether the event has the "dropout" property (whether it is considered to be another occurrence when the reverse of the initial event happens).
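The considerations above amount to an event-definition record in the task module data files. The record sketched below is an assumption about what such a file entry might contain; all field names are invented.

```python
# Illustrative event-definition record covering the considerations
# listed above; field names are assumptions, not the ISS file format.
from dataclasses import dataclass

@dataclass
class EventDef:
    device: str          # cockpit switch or parameter monitored
    relation: str        # "gt", "lt", or "eq" against the reference value
    reference: float     # reference value for the comparison
    stable_time: float   # seconds the condition must hold (filters transients)
    external: bool       # True: raw trainer event; False: combination of events
    verification: bool   # True: verify a value; False: detect a state change
    dropout: bool        # True: the reverse transition counts as another occurrence

    def test(self, reading, held_for):
        """True when the reading satisfies the relation for the stable time."""
        ok = {"gt": reading > self.reference,
              "lt": reading < self.reference,
              "eq": reading == self.reference}[self.relation]
        return ok and held_for >= self.stable_time

# e.g. "speed increasing through 200 knots", required stable for 0.5 s:
speed_event = EventDef("airspeed", "gt", 200.0, 0.5, True, False, False)
assert speed_event.test(205.0, held_for=1.0)
assert not speed_event.test(205.0, held_for=0.1)
```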
Procedures Monitoring
A program dedicated to monitoring these events is notified when a task module becomes active and opens a file defining the appropriate events for that task module. In this way, only a select group of switches is monitored at any one time.

Whenever any of these events has been detected, this event monitoring program informs another program, dedicated to the procedures monitoring function. Like the event monitor, this program is also informed when a particular task module becomes active. It opens a file which has defined the relationship of the events.
This program evaluates actions with respect to relationships and contingencies and causes a variety of actions to be taken by the system. That is, it can evaluate a sequence of events such as the case where switches A, B, C must be thrown in sequence, and C, D, E must be thrown after the first three but not necessarily in sequence. An example of this is the engagement of nose wheel steering during an F-14 takeoff. The steering is optional prior to 15 knots, a requirement between 15 and 80 knots, and must be disengaged prior to 100 knots.

With respect to contingencies, not only must the specific action be identified, but also the circumstances under which it is to take place must be defined. That is, when an event occurs, alternate actions may be specified depending on other circumstances or contingencies. For example, during takeoff, if the aircraft heading drifts off the runway heading by a specified amount while the plane is still on the runway, an illegal action has occurred and a diagnostic message is generated. However, heading changes after the wheels are off the runway are normal and no action is taken by the system.
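The two orderings described above (a group that must occur in sequence, followed by a group that may occur in any order) can be sketched as a simple checker. The switch names here are placeholders; this is an illustration of the relationship logic, not the ISS's file-driven evaluator.

```python
# Sketch: switches A, B, C must be thrown in sequence; D and E must
# follow them but in any order. Names are hypothetical placeholders.
def check_procedure(events):
    """True if events contain A, B, C in order, then D and E in any order."""
    ordered, unordered = ["A", "B", "C"], {"D", "E"}
    idx = 0
    seen_after = set()
    for ev in events:
        if idx < len(ordered):
            if ev == ordered[idx]:
                idx += 1
            elif ev in ordered or ev in unordered:
                return False  # out-of-sequence action: a diagnostic case
        elif ev in unordered:
            seen_after.add(ev)
    return idx == len(ordered) and seen_after == unordered

assert check_procedure(["A", "B", "C", "E", "D"])
assert not check_procedure(["B", "A", "C", "D", "E"])
```

A contingency (such as the runway-heading example) would add a guard condition to each check, e.g. applying the heading-drift rule only while "weight on wheels" holds.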
Performance Measurement
One of the actions that may be taken by the system as the result of the occurrence of an event is the taking of measurements of flight parameters. For example, when the aircraft reaches a certain altitude, the system may start measuring the pilot's ability to maintain that level altitude. Or it may take a single measurement at a specific point; for example, when the aircraft reaches Ground Controlled Approach (GCA) minimums, it may measure the distance from the centerline of the runway. A performance measurement program is dedicated to taking measurements associated with a task module while it is active and, when completed, calculates a score for the task. It, like the other programs, is informed when a particular task module is active.
The measurements are each defined in one of the task module files. They fall into several general categories:

Continuous. A measurement with predefined start and stop conditions. The parameter is sampled 1 to 20 times per second.

Monitor. A continuous measurement wherein a limit is defined. The parameter is sampled during the time that it is outside the limit.

Snap Shot. A single parameter is sampled once.
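The three categories can be summarized in a few lines of code, together with the RMS transform that the scoring template (discussed later) applies to continuous measurements. This is a simplified sketch with invented function names.

```python
# Minimal sketch of the three measurement categories; function names
# are illustrative, not the ISS's own.
import math

def continuous(samples):
    """Continuous: every sample taken between the start and stop conditions."""
    return samples

def monitor(samples, low, high):
    """Monitor: keep only the samples falling outside the defined limits."""
    return [s for s in samples if s < low or s > high]

def snapshot(sample):
    """Snap shot: a single sample taken at a defined point."""
    return sample

def rms(deviations):
    """Root-mean-square transform used to reduce continuous deviations."""
    return math.sqrt(sum(d * d for d in deviations) / len(deviations))

assert monitor([95, 100, 105, 120], 90, 110) == [120]
assert abs(rms([3.0, 4.0]) - 3.5355339) < 1e-6
```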
Procedures Evaluation
Performance measurement for normal and emergency procedures, because of their nature, is handled quite differently than flight parameters. The following factors, or a subset of them, are considered:

Critical errors. Those actions which, by their omission or commission, will produce catastrophe.

Recognition reaction time. This measures student response to certain cockpit stimuli in the case of aircraft system degradation and/or failures.

Total procedure time. This measures the total time required to complete the procedure.

Percent of mandatory actions taken. These are actions that are depicted in the F-14 NATOPS manual.

Percent of optional actions. These are actions that are not mandatory but demonstrate that the student is well prepared and well in command of the situation.
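The factors above can be computed from a time-ordered log of student actions. The function below is a sketch under that assumption; the action names and the dictionary shape are invented for illustration.

```python
# Sketch of the procedure-evaluation factors, computed from a
# time-ordered log of (timestamp, action) pairs. Names are hypothetical.
def evaluate_procedure(log, start_time, stimulus_time,
                       mandatory, optional, critical):
    """Return the evaluation factors for one procedure run."""
    actions = [a for _, a in log]
    times = [t for t, _ in log]
    return {
        "critical_error": any(a in critical for a in actions),
        "reaction_time": (times[0] - stimulus_time) if times else None,
        "total_time": (times[-1] - start_time) if times else None,
        "pct_mandatory": 100.0 * len(set(actions) & mandatory) / len(mandatory),
        "pct_optional": (100.0 * len(set(actions) & optional) / len(optional)
                         if optional else 0.0),
    }

result = evaluate_procedure(
    [(2.0, "THROTTLE_IDLE"), (5.0, "FUEL_SHUTOFF")],
    start_time=0.0, stimulus_time=1.0,
    mandatory={"THROTTLE_IDLE", "FUEL_SHUTOFF"}, optional={"NOTIFY_RIO"},
    critical={"EJECT_LOW"})
assert result["pct_mandatory"] == 100.0
assert result["reaction_time"] == 1.0
```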
Scoring
The performance measurement program is informed when the task module is completed and calculates a score. It does so by comparing both the flight parameters measured and the procedures evaluated with criteria unique to the task module.

Scoring of performance by a computer is a controversial issue and is one of the first to be questioned by students and instructors alike. Developers of the ISS have attempted to defuse this problem by putting the grading criteria under control of the user community and by making visible to the instructor (at his request) the measurements and calculations used to derive the score.
The key to this is a scoring template in which are placed all measurement definitions and scoring formulas. This template is one of the files making up the task module definition and is easily established or modified using a text editor. After a task module has been executed, the system displays the template in a format similar to that in which it was created, but with the addition of the actual measurements taken and all the figures used in deriving the final score.
Figure 1 is a template for a task module which defines the SAN PEDRO DEPARTURE from NAS Miramar. The first column, under "MEASURE", lists the measurements that the student is graded on in performing this departure.

DME SNAPSHOT is the distance from a particular fix the student is to fly over. This is a single measurement.

RADIAL DEVIATION is the ability to fly a specific radial. This is a continuous measurement. RMS indicates that "root mean squared" is the basic transform used in processing the data.

AIRSPEED DEVIATION is the ability to maintain a specific airspeed. This is a continuous measurement and the RMS transform is used.
The second column, under "NOMINAL", is the reference value of the particular measurement. This can also be used to differentiate between measurements with the same name. In this example the first radial measured is the 280 degree radial and the second is the 300 degree radial. Likewise the 1, 2, 3 next to the DME SNAPSHOT measurements indicate the first, second, and third fixes in the departure. The "4.0, 3.5, 3.0, 2.5" columns are the scores on the Navy's 4.0 scale. The grade for a particular step in a task module is derived by comparing the measured value with the range of values below these columns. For example, the first measurement listed on Figure 1 is the distance at closest point of approach from the first fix the student was to fly over. The actual value measured was .79 miles, recorded under the "VALUE" column. By looking across the template opposite the first DME measurement, we see that the measured value, .79, falls between 0 and 2.5, which is under the 4.0 column. Therefore the student is assigned a 4.0 for that measurement. This is recorded under the "GRADE" column. The last column, "WT", is the weight factor given to that particular measurement. In this case it was worth 10 percent of the total grade. After all of the steps have been calculated in the method described above, they are summed to form the final grade, which in this example is 2.7.
This example does not show any procedures scoring. If it did, REACTION TIME, % MANDATORY STEPS, or the like would appear as additional measurements. That is, procedures are scored in the same manner as flight parameters. An exception is the CRITICAL ERROR. If one of these occurs, the task module is assigned the lowest possible grade regardless of other measurements.

[Figure 1, the grading-criteria template for the SAN PEDRO DEPARTURE task module, lists each measurement (DME SNAPSHOT, RADIAL DEVIATION (RMS), ALTITUDE DEVIATION (RMS), AIRSPEED DEVIATION (RMS)) with its nominal value, the 4.0/3.5/3.0/2.5 grading ranges, the measured value, the assigned grade, and its weight; the final grade is 2.7.]

Figure 1. Scoring Template
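The grade lookup, weighted sum, and critical-error override described above can be sketched as follows. The band boundaries below are placeholders except the first DME band (0 to 2.5 miles gives a 4.0, weighted 10 percent), which comes from the text; nothing here is the ISS's actual scoring formula.

```python
# Sketch of grade lookup against score bands plus a weighted final grade.
def grade_step(value, bands):
    """bands: (upper_bound, grade) pairs, lowest bound first."""
    for upper, grade in bands:
        if value <= upper:
            return grade
    return 0.0  # worse than the lowest graded band

def final_grade(steps):
    """steps: (measured_value, bands, weight) triples; weights sum to 1."""
    return round(sum(grade_step(v, b) * w for v, b, w in steps), 1)

def score_task(steps, critical_error=False):
    """A critical error forces the lowest possible grade, per the text."""
    return 0.0 if critical_error else final_grade(steps)

# The DME example from the text: 0.79 miles falls in the 0-2.5 band -> 4.0.
dme_bands = [(2.5, 4.0), (2.75, 3.5), (3.0, 3.0), (3.5, 2.5)]
assert grade_step(0.79, dme_bands) == 4.0
assert score_task([(0.79, dme_bands, 1.0)]) == 4.0
assert score_task([(0.79, dme_bands, 1.0)], critical_error=True) == 0.0
```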
Information Presentation
Although it may not seem to have direct application to procedures monitoring, the manner in which the information is presented to the user is very important to the successful application of these features. The ISS approach again centers around task modules. As part of each task module's definition, pictures are defined in the task module data files. As task modules become active and are completed, appropriate pictures are added to and removed from the user's displays. Task modules covering procedures (e.g., check lists, malfunctions) are composed of text identical to the procedures as depicted in the aircraft NATOPS manual. As each step in the procedure is accomplished, the action is noted next to the appropriate step.

Figure 2 shows two concurrently running task modules on the upper half of the display, a normal checklist on the left and a malfunction on the right. Both are procedure oriented and indicate that the steps are in progress. The section below, labeled "diagnostics", contains messages generated by the procedures monitor program. These appear at the bottom of that section and move up either when displaced by a new diagnostic message or when 30 seconds have passed. All messages eventually disappear from the top of that section. The small rectangles represent menu selections that can be picked by the user via a touch mechanism associated with the display. This is the input/control mechanism of the ISS.
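The behavior of the diagnostics section (new messages enter at the bottom, older ones scroll up when displaced, and each vanishes after its dwell time) can be sketched as a small queue. The class name, dwell time handling, and capacity parameter are assumptions made for this illustration.

```python
# Sketch of the diagnostics-area behavior described above; the class
# and its parameters are hypothetical, not the ISS display code.
from collections import deque

class DiagnosticsArea:
    def __init__(self, dwell=30.0, visible=5):
        self.dwell = dwell          # seconds a message remains on screen
        self.visible = visible      # how many messages fit in the section
        self.messages = deque()     # (expires_at, text), oldest first

    def post(self, now, text):
        """A new message appears at the bottom; excess ones are displaced."""
        self.messages.append((now + self.dwell, text))
        while len(self.messages) > self.visible:
            self.messages.popleft()  # displaced off the top

    def tick(self, now):
        """Messages whose dwell time has passed disappear from the top."""
        while self.messages and self.messages[0][0] <= now:
            self.messages.popleft()

area = DiagnosticsArea(dwell=30.0, visible=2)
area.post(0.0, "SPEED BRAKES NOT EXTENDED")
area.post(1.0, "FLAPS NOT FULL DOWN")
area.tick(30.5)  # the first message is past its 30-second dwell
assert [t for _, t in area.messages] == ["FLAPS NOT FULL DOWN"]
```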
SUMMARY
This paper has described a software/hardware system that is able to monitor procedures and score, individually and concurrently, the many training tasks in a complex training scenario. This system has approached the problems from the instructor's viewpoint and is intended to give him understandable and appropriate information on an uncluttered instructor/operator station. Using this, he will be able to better evaluate the student's performance and provide the instruction needed to improve it.
The ISS, of which this procedures monitoring and scoring mechanism is a key part, has been under development for several years and is now undergoing operational evaluation at NAS Miramar. The concepts developed in this prototype promise improved training in existing flight trainers and can serve as the basis for instructor/operator stations on new simulators.
ABOUT THE AUTHORS
Steve Seidensticker is the Technical Contract Manager for the ISS development
[Figure 2 shows two concurrently running, procedure-oriented task modules on the upper half of the display: a LANDING CHECK LIST on the left and an UNSCHEDULED WING SWEEP malfunction procedure on the right, above a DIAGNOSTICS section (e.g., "SPEED BRAKES NOT EXTENDED", "FLAPS NOT FULL DOWN") and touch-selectable menu rectangles (malfunction selections, environment options, reposition, check list, freeze, hardcopy, voice, menu).]

Figure 2. Running Task Modules
project at Logicon. He has been involved in the project from its early stages in 1978. He holds a master's degree in Systems Management from the University of Southern California.
Terry Kryway is a training system analyst at Logicon and has been responsible for the operational design of the ISS. He previously served two tours as a flight instructor in Navy Fleet Readiness Squadrons. Mr. Kryway holds a bachelor's degree in Naval Science from the Naval Postgraduate School at Monterey, California.
THE CASE FOR A STUDENT SELF-TRAINING CAPABILITY IN FLIGHT SIMULATORS

JOSEPH L. DICKMAN
Sperry Systems Management - SECOR
Fairfax, Virginia
ABSTRACT
The conventional use of instructor pilots at flight simulators contains a number of problems: a continuous training program is required because of instructor rotation to other assignments, pilots are often not interested in being simulator instructors, and they frequently require assistance from device operators. Furthermore, to obtain maximum utilization of a simulator, a large number of instructors are required. The use of student self-training programs, enabling a simulator to be operated without an instructor when required, would solve many of these problems. This paper presents the advantages of such a capability and discusses the related design considerations. It describes the required instructional programs and offers several solutions to the problem of locating simulator controls in the cockpit. In addition, it discusses the subjects of safety and realism and concludes with recommendations applicable to future specifications.
INTRODUCTION
Conventionally, flight simulators have been equipped with more or less elaborate instructor stations, located either on-board (in the cabin behind the pilot's or co-pilot's seat) or remotely (entirely removed from the cockpit). With the on-board instructor station the instructor is able directly to observe the crew members and evaluate their performance; with the remote instructor station the instructor observes the crew performance through repeater instruments and CRT displays.

In either case, the trend has been to strengthen and emphasize the role of the instructor. Instructor stations have been designed to enable the instructor to take maximum advantage of the simulator's instructional capabilities, to provide him with extensive information regarding the progress of the student's training, and at the same time to reduce his (the instructor's) workload.
Yet there are many problems inherent in this arrangement. Instructors require training; the more sophisticated the simulator, the more extensive is this requirement. Then, after a certain amount of utilization, most instructors rotate to other assignments and never return to instructing on the simulator. Or, if they go to sea duty or other forms of lengthy TDY (temporary duty), they need retraining when they return.
Many pilots do not want to be simulator instructors. If the simulator is complex, pilots find that learning and keeping current on its operating procedures detract from their concentration on keeping current on the aircraft. These procedures are perhaps not difficult to learn but apparently are impossible to retain without frequent practice. The problem is exacerbated by the fact that many instructor stations are equipped with a confusing array of switches and other controls, often poorly labeled, and offer a bewildering variety of instructional features, many rarely used. Furthermore, most pilots never accept having to operate an alphanumeric keyboard with lengthy formats for data entry.
It is possible to design instructor station controls, displays, and programs to reduce the demands on instructors. The instructor station for Device 2F132, the F/A-18 Operational Flight Trainer, is an example of such an effort.(1) However, the ultimate solution to this requirement has not yet been demonstrated.
The use of device operators is a way to solve these problems. Device operators who do not have other, conflicting duties can keep current on all of the instructor station features and procedures and can assist instructors in many ways, from serving as operator of the alphanumeric keyboard to handling all of the complex instructional features. However, device operators are usually not provided at on-board instructor stations, probably in order to minimize the number of persons in the cabin and to conserve on the size of the instructor station. Furthermore, if device operators are always required, in addition to instructors, the manpower requirements for operating the simulator become significantly increased.
Another approach is to hire former pilots, retired military or naval personnel, to be simulator instructors. Since they will presumably have no other function, they can keep as proficient on the simulator as device operators can. However, there are issues involved in this approach (the instructors' credibility with the students, the cost of the additional personnel, and the availability of qualified individuals) that may militate against widespread adoption.
Regardless of the source of the instructors, the requirement to have the simulator manned by someone other than the students is a limitation on its utilization. A modern simulator is capable of operating at least 16 hours per day. To continually provide an instructor for this level of operation for five days per week requires a force of at least six instructors if they are not active duty pilots, or approximately twice that number if they are.
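The staffing arithmetic above can be made explicit. The sketch below assumes, purely for illustration, that a dedicated instructor can cover about 14 simulator hours per week and that an active duty pilot can cover roughly half that; those weekly caps are not stated in the text, but they reproduce its figures of six and twelve instructors.

```python
import math

def instructors_needed(sim_hours_per_day, days_per_week, weekly_hours_per_instructor):
    """Smallest whole number of instructors that covers every simulator hour,
    given an assumed cap on each instructor's weekly simulator duty."""
    weekly_sim_hours = sim_hours_per_day * days_per_week
    return math.ceil(weekly_sim_hours / weekly_hours_per_instructor)

# 16 hours/day for five days/week = 80 simulator hours to staff.
full_time = instructors_needed(16, 5, 14)   # dedicated instructors (~14 h/week, assumed)
active_duty = instructors_needed(16, 5, 7)  # active duty pilots (roughly half, assumed)
```

With these assumed caps the calculation yields 6 dedicated instructors or 12 active duty pilots, consistent with the "at least six... or approximately twice that number" estimate.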
A different solution to all of these problems is to develop a concept of student self-training: not to eliminate all instructors, of course, but to provide users with the capability to conduct some training without the presence of an instructor being required.
CONCEPT
In a simulator with a student self-training capability the student should be able to conduct a complete training exercise using controls entirely in the cockpit. To this end he should be able to initialize at a desired geographical location, fly a pre-planned flight profile, introduce and remove malfunctions, operate simulator control functions such as freeze and crash override, and operate instructional features such as demonstrations and performance measuring.
The exercise should be part of an overall training syllabus that would include both aircraft missions and simulator exercises. The number of self-training exercises, as well as those with an instructor present, should be planned in advance and scheduled in the syllabus. Self-training should never be an "ad lib" use of the simulator, resorted to only when an instructor is absent.
The optimum ratio between self-training and instructor-monitored exercises in a syllabus would depend on a variety of conditions. The availability of trained instructors would be a major factor. If the number of available instructors is small and the number of students is large, self-training exercises would comprise a substantial part of the syllabus. The qualifications of the students would be another factor. A pilot who needs only refresher training could use more self-training exercises than a recent flying school graduate. In general, if instructors are used primarily for initial indoctrination and for checkrides, the ratio of self-training exercises to instructor-monitored exercises could be in the order of four or five to one.
DESIGN CONSIDERATIONS
Providing a student self-training capability will have a significant impact on simulator design--on the instructional system software and, to a lesser extent, on the student station hardware (i.e., the cockpit).
Controls
Controls in the student station will have to be considerably more elaborate than for a conventional simulator, which may have only a Freeze switch located very inconspicuously. How elaborate will depend on the number of functions provided for the student--to the same degree that the complexity of an instructor station depends on the functions provided for the instructor. If cost considerations dictate an austere approach, student station controls can be kept to a minimum--only sufficient for initialization, freeze, motion control and emergency off. Additional capabilities can be provided at increased cost, although the fact that these capabilities must be in the cockpit, not at an instructor station, will have a distinctly limiting effect.
Student station controls must have certain required characteristics. The first of these requirements is that the controls--and the instructional features that they relate to--must be understandable and easily operated by untrained individuals. In this case, "untrained" means not having attended an operator's course and having been required to read only minimum instructions on the simulator's operations. Simplicity and consistency in the design of simulator controls will be required to a degree far greater than heretofore.
A second requirement is that student station simulator controls be easily accessible. Since the student will be devoting his full attention to flying the simulator, it is imperative that the distraction of locating and operating non-aircraft-related controls be kept to a minimum.
A third requirement, which is not easily reconciled with the previous one, is that controls be located as inconspicuously as possible. On the premise that student acceptance of a simulator depends on how closely its cockpit, sound effects, and "feel" resemble the actual aircraft, it is apparent that panels, switches, and indicators that are foreign to the aircraft would be an irritant to the student. Obtrusive simulator controls in the cockpit would be as distracting as those that are hard to find.
Two simulators built by Sperry Systems Management contain controls for operation from the cockpit, although how much they will be used for aircrew self-training has apparently not been determined. The two approaches have both similarities and differences with respect to scope of capabilities and concept of operation.
One design, the CH-53 Operational Flight Trainer, the first unit of which is scheduled to be delivered to the U.S. Marine Corps in late summer 1982, contains a small control panel on each side of the cockpit, above and in front of the two side windows (see Figure 1). The panels are identical, each
Figure 1. CH-53 OFT Remote Control Panels
having push-button switches for Initial Conditions Select, Freeze, Motion Ready/On, Emergency Off, and Motion Emergency Disable. Two thumbwheel controls on each panel enable the students to select a set of initial conditions from twenty available. The control panels are not covered or otherwise concealed but are located above the pilot's and copilot's normal forward field of view.
The other approach is found in the HU-25A and HH-65A Flight Training Systems currently being built for the U.S. Coast Guard for delivery in early 1984. A 20-key keypad, provided primarily for the instructor station, which is located in the cabin, is installed with a cable that enables it to be placed on the pedestal or the floor between the pilot's and copilot's seats. The keys include Freeze, Override (for crash), Store (for flight conditions), Reset, Enter, Clear, Remove (for malfunctions), ten digits, two punctuation marks, and backspace (see Figure 2).
Figure 2. HU-25A/HH-65A FTS Keypad
In both the CH-53 and the Coast Guard trainers the students will be able to use the instructor station controls to accomplish functions that are not possible via the cockpit simulator controls. Conducting demonstrations would be an example of such a function. Also, if necessary one of the students will be able to call up displays on the instructor station CRTs and use them for self-training purposes. For example, the students will be able to inspect the aircraft track depicted on an approach display after making an approach.
The self-training capabilities of these simulators are facilitated by the fact that they have on-board instructor stations and two-man crews. The copilot can readily go to the instructor station and use the simulator controls as he desires without interfering with what the pilot is doing. If the simulator had a single-place cockpit and a remote instructor station, the student station controls for self-training would have to be more self-sufficient.
Formats
The design of the controls will be influenced considerably by whether the data entry formats are page-dependent or not. For a "worst case" situation, in which the student will have no access to the display system, the formats will have to be non-page-dependent, utilizing input codes for entering and clearing malfunctions, modifying flight parameters such as fuel load and armament, and modifying environmental parameters such as ceiling and visibility.
Since an alphanumeric keyboard in the cockpit is out of the question, all input codes will have to be usable on a keypad. This requirement suggests that the codes should be numeric, but to help the student remember at least the most-used ones the keys on the keypad could be labeled both alphabetically and numerically, like a telephone, and the codes could be alphabetical.
If alphabetical codes are used, and other alphabetical inputs such as N, E, S, and W for latitude and longitude are required, the student will need a shift key to select one of the three letters on a key and to differentiate between letters and numerals. This approach tends to complicate the operation of the keypad but the alternative, the use of all-numeric codes, is less desirable.
There is another alternative to the use of shift keys, i.e., a two-digit code, used in the FAA's Voice Response System for telephonic weather briefings, that identifies each letter on a key. For example, the letter "A" is selected by entering "2,1" (referring to the 2 key and the first letter on it). Similarly, "B" is "2,2". This method, however, will increase the time spent in entering all formats, and is considered to be acceptable only if the input codes can be kept to one or two letters in length.
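As a concrete sketch of this two-digit scheme, the mapping below uses the conventional North American telephone letter groups of the era (no Q or Z); the function names are illustrative and not part of the FAA system:

```python
# Telephone keypad letter groups (conventional North American layout, no Q or Z).
KEYPAD_LETTERS = {
    '2': 'ABC', '3': 'DEF', '4': 'GHI', '5': 'JKL',
    '6': 'MNO', '7': 'PRS', '8': 'TUV', '9': 'WXY',
}

def decode_pair(key, position):
    """Map a (key, position) pair to a letter: "2,1" -> 'A', "2,2" -> 'B'."""
    return KEYPAD_LETTERS[key][position - 1]

def decode_code(pairs):
    """Decode a short alphabetical input code given as (key, position) pairs."""
    return ''.join(decode_pair(k, p) for k, p in pairs)
```

For example, `decode_code([('3', 3), ('7', 1)])` yields the two-letter code "FP". Each letter costs two keystrokes, which is why the text limits the scheme to codes of one or two letters.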
The number of input codes required will depend on how much capability is provided to the student. It is likely, however, that the number will exceed what can be remembered by the student, particularly if he is not using the simulator frequently. Therefore, a list of input codes will need to be kept in the cockpit. This list could be part of an abbreviated checklist, similar to the Pocket Checklist used in aircraft, and could cover all of the operating procedures that the student would need to be reminded of.
Functions
In a self-training mode the student could forego many functions that are available at the instructor station in a conventional simulator. Also, many functions that are usually accomplished with controls could be performed via alphanumerical entries to the computer. These principles will have to be imaginatively applied to the design of the student station in order to keep the number of controls to a minimum. A minimum-sized control unit, whether it is a panel like the CH-53 OFT or a keypad like the Coast Guard simulators, is essential to meeting the three required design characteristics discussed previously.

The following discusses a broad range of functions typically exercised at an instructor station and describes how they could best be performed with controls in the cockpit.
Turn-On Procedures. Turn-on procedures can be considered to include powering up the system, initializing the computer and peripheral equipment, turning on the visual system if there is one, and performing a readiness test. None of these procedures requires special consideration for the self-training concept; they can all be accomplished by maintenance personnel, who presumably will always be present.
Initialization. In conventional simulators, initialization is accomplished in a variety of ways: thumbwheel controls, pushbutton switches, and alphanumeric keyboard or keypad entries are used, sometimes in combinations, to select the set desired and enter it into the computer. Keyboard entries and pushbutton switches are used to display the available sets on a CRT at the instructor station. If the instructor wishes, he can modify the parameters by making keyboard entries.
For the self-training concept, keypad entries should be used exclusively, in order to minimize the need for hardware controls. In the absence of a CRT in the cockpit, the available sets can be included in the abbreviated checklist mentioned previously.
The question arises whether the student should be allowed to make on-line modifications to an initial conditions set before it is entered, and if he is allowed to make modifications, how he will be able to record them. The simplest approach is to allow no pre-entry modifications. In this case, if the student is not satisfied with the initial conditions he can make parameter changes after the set is entered. The student will then observe the results directly on the aircraft instruments.
Motion. Because of safety considerations, the motion system presents a special problem for the self-training concept. With a six-post system, the operator, before activating the system, must have a positive indication that access ladders are stowed and maintenance platforms around the cockpit are cleared. Even smaller systems have safety problems.
One approach, for the self-training concept, would be to make the maintenance personnel responsible for operating the motion system; they could use an intercom system to communicate with the student when necessary. On the other hand, it would be almost mandatory to provide a "disable" capability in the cockpit. If a switch is provided for that purpose, it could be designed to indicate a "motion ready" condition and to provide both an "enable" and "disable" capability for the student (in some types of installation, this may not be feasible; an additional switch may be required). Additionally, the student could be required to obtain a verbal clearance from the maintenance personnel, over the intercom, before activating the system.
Normal Operating Functions. There are a number of routine operating capabilities, usually performed with push-button switches at a conventional instructor station, that must be provided in the cockpit in a way that conserves hardware controls as much as possible. The most basic of these is Freeze; dedicating a push-button switch for this function appears to be mandatory. Emergency Off is a similar requirement, although the switch should be guarded in a way that guarantees against inadvertent actuation. Crash Override can be accomplished
with a keypad entry, if it is assumed that the student will not need to activate it as an immediate response to difficulty. Slewing, which in a conventional simulator usually requires a joystick control and a pushbutton switch, can be omitted in the self-training mode. If the student wants to change the geographical position of the simulated aircraft he can enter a new latitude and longitude with a keypad entry (or, with a more sophisticated program, a radial and DME from a NAVAID). Or he can reinitialize. Although possibly not essential, Store ICs and Reset are very useful functions; they enable the instructor to store the existing flight conditions at any time and then to reset the simulated aircraft there when desired later. If these functions are provided for the self-training concept, two pushbutton switches would be needed. Finally, an intercom capability with maintenance personnel must be provided. A switch is not needed, however, if the student's microphone is continually "hot".
Instructional Features. The major instructional features found in most current simulators are demonstrations, checkrides (also known as programmed missions), playback, parameter record, and event print. Probably the one most applicable to the self-training concept is demonstrations; the possibilities for learning by observation and emulation, through this method, are almost limitless. All of the actions required to select, activate, and terminate a demonstration can be taken with keypad entries (plus use of the Freeze switch to start after the pre-demonstration initial conditions have been attained).
Checkrides are also very applicable to the self-training concept, although they will pose a problem with respect to the monitoring function. In a conventional simulator, the instructor uses the CRT displays to monitor the progress of the mission; if the student makes a gross error sufficient to confuse the computer and induce it to score a leg that is different from the one that the student is flying, the instructor can correct the situation with a Manual Advance switch. In the self-training mode it is not feasible to provide the student with this capability or to expect him to monitor the mission leg by leg. It is essential that the missions be very carefully designed so as to reduce to a minimum the need for instructor intervention; if the mission cannot be completed as intended, it will have to be aborted.
Playback (Dynamic Replay) will be of lesser usefulness without an instructor present to contribute comments. The student will know generally what his errors were; redoing the maneuver or procedure correctly will be his primary concern. If desired, however, Playback can be operated with keypad entries, assisted by the Freeze switch.
A variation of Playback called Minute Replay, or sometimes Instant Replay, enables the instructor to store a number of segments of flight history, usually of one-minute duration each, and to replay them during the critique period after the mission. This capability is not considered to be important in the self-training mode. To provide it, however, a switch will be needed to enable the student to quickly select the segment to be stored.
Parameter Record is a little-used data-gathering capability. To operate Parameter Record the instructor must select individual parameters to be monitored, specify the reference value and tolerances, and start and stop the recording. These are time-consuming functions that are not at all suitable for a student to perform.
Similarly, Event Print is not appropriate for the self-training concept. This feature prints, either on the Parameter Record printout or independently, the time of occurrence of a number of events that the instructor selects in advance. To be useful this capability must be part of a very detailed analysis and critique of the mission, which only an instructor can conduct.
Malfunction Control. As in conventional simulators, malfunctions can be entered and cleared in the self-training mode by using keypad entries. The list of programmable malfunctions can total several hundred items depending on the type of aircraft and the amount of attention to this subject desired by the user. A shorter list can be devised for the self-training mode and included in the student's abbreviated checklist. A Malfunction Override control, if desired, will require a pushbutton switch.
Critics of self-training could say that considerable training value is lost, due to the absence of surprise, if the student enters the malfunction himself. However, all training value is not lost, particularly with respect to practicing the steps in emergency procedures. Furthermore, it would be possible to enable the student to program a series of malfunctions to occur randomly during a mission, thereby assuring that he would at least not have foreknowledge of when a malfunction would occur. Also, programmed malfunction "packages", with different types of malfunctions and levels of difficulty depending on the mission and student qualifications, could be entered by the maintenance personnel.
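One way the instructional software might implement such a random malfunction series is sketched below. The malfunction codes and the five-minute buffer at each end of the mission are assumptions for illustration, not features of any simulator described here.

```python
import random

def schedule_malfunctions(codes, mission_minutes, rng=None):
    """Assign each pre-selected malfunction a random onset time and order,
    so the student has no foreknowledge of when each will occur."""
    rng = rng or random.Random()
    shuffled = list(codes)
    rng.shuffle(shuffled)  # randomize the order of occurrence as well
    # Keep onsets away from the first and last five minutes (assumed buffer).
    onsets = sorted(rng.uniform(5, mission_minutes - 5) for _ in shuffled)
    return list(zip(onsets, shuffled))

# Example: three hypothetical malfunctions over a 60-minute mission.
plan = schedule_malfunctions(["HYD PUMP", "GEN 1", "FUEL XFER"], 60)
```

A malfunction "package" of the kind the text describes could simply be a pre-stored list of codes handed to such a routine by the maintenance personnel.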
Miscellaneous. Additional functions found in conventional simulators can be accommodated in the self-training mode with keypad entries or omitted entirely in the interest of simplicity. A volume control for sound effects, for example, can be omitted (if considered necessary this function can be accomplished with a keypad entry). Controls for the voice recorder, if one is provided, can be omitted--separate control is not needed in the self-training mode. The catapult hold and fire functions, for a carrier-based aircraft, can be accomplished with keypad entries. Alternatively, catapult fire can be programmed to occur a few seconds after the pilot gives the "ready" signal by turning on his exterior lights, or in another way, after he lowers the hand-grip. Similarly, chocks, external power, starting air, rough air, and arresting gear can be provided via keypad entries. A warning horn for a malfunction of the simulated oxygen system, provided at the instructor station in conventional simulators for safety reasons, can also be installed at the student station (if, in fact, it is truly needed).
Optimum Design
In summary, it is concluded that the most desirable functions for the self-training concept could be accomplished with a keypad containing 20 keys. The following functions are considered to be the most useful: Motion Ready, Motion On, Emergency Off, Freeze, Malfunction Override, Store ICs, Reset, ten digits, Enter, Clear, punctuation, and backspace.
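The listed functions total exactly 20 keys only if Motion Ready and Motion On share a single key, as they do on the CH-53 panel described earlier; under that assumption the layout can be tabulated as a quick sanity check (the key labels are illustrative):

```python
# Hypothetical tabulation of the proposed 20-key self-training keypad.
# Assumption: Motion Ready and Motion On share one key (as on the CH-53
# panel); if they were separate keys the list would total 21.
FUNCTION_KEYS = [
    "MOTION RDY/ON", "EMERG OFF", "FRZ", "MALF OVRD", "STORE IC",
    "RESET", "ENT", "CLR", "PUNCT", "BKSP",
]
DIGIT_KEYS = [str(d) for d in range(10)]
KEYPAD_20 = FUNCTION_KEYS + DIGIT_KEYS  # 10 function keys + 10 digits = 20
```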
The 20 keys could be accommodated in a unit similar to the hand-held terminal manufactured by the Termiflex Corporation. Various models of the Termiflex have the capability of displaying one or two lines of alphanumeric text, each 10 to 12 characters in length. This feature would be essential for the self-training concept, to enable the student to verify his input codes and values before entering them into the computer. Three shift keys would provide access to the letters and punctuation. An example of a self-training terminal is shown in Figure 3.
The terminal should have some features not found in the current models of the Termiflex. The keys should be backlighted, to enable reading when the cockpit is darkened for simulated night operation. Further, certain function keys (Freeze, Motion Ready/On, and Malfunction Override) should be brightened when selected, to provide a status indication to the student. Also, the Emergency Off key should be guarded, preferably with a cover that would have to be lifted before the key is actuated.
A hand-held terminal eliminates the problem of finding a blank panel in the cockpit behind which to conceal the switches. The terminal could be hung behind the seat or stowed under it, seat design permitting, and could be removed entirely for instructor-monitored missions if it had a plug-in capability similar to the Termiflex.
Displays
In the foregoing discussion of simulator functions and the options available for accomplishing them, it has been assumed that the CRT displays normally found at an instructor station would not be available in the cockpit. However, it is possible to display both graphic and alphanumeric data to the student via the visual system. This approach has been used modestly in the A-6E Night Carrier Landing Trainer (NCLT), Device 2F122; one display, depicting a plot of certain parameters (vertical velocity, bank, pitch, rpm, glide slope, and centerline) recorded during an approach, can be shown on the two visual display CRTs in front of the pilot and B/N.
In the NCLT the alphanumeric/graphic display replaces the visual scene; an alternative method is used in the A-4M OFT, Device 2F108/2834F--a line of weapon scoring data is superimposed across the bottom of the visual scene. The parameters include a hit or miss evaluation, bearing and range of the input, airspeed at release, and others. This information is available only on the instructor station visual monitor, but it could be easily displayed to the pilots if desired.
The possibility of using the visual system for alphanumeric/graphic displays provides almost unlimited opportunities for enhancing self-training. If a side-viewing visual CRT is provided in the cockpit, it could be used, as frequently as desired by the student, for any of the displays available at the instructor station. A front-viewing visual CRT could also be used, but in a more restricted way.
Of predictable interest to the student would be the procedure monitoring displays of normal and emergency procedures showing
Figure 3. Student Self-Training Terminal
the results of the student's attempts, the approach displays showing the various published approaches and a record of the aircraft track, and GCA/CCA/ILS displays with both horizontal and vertical projections of the aircraft track. Static data such as the list of input codes and the contents of the initial conditions sets, which would be available as instructor station displays, could also be presented in the cockpit, rather than in a Pocket Checklist as suggested previously. Special displays providing checklist instructions for operating the simulator could be included.
If displays are available to the student, it is possible that he will want to print the most significant ones for post-mission analysis. Checkride summary data would be particularly useful. In any event, the display printout function could be operated by a keypad entry.
Safety
In any discussion of students operating a simulator without an instructor, concern is usually expressed regarding safety. There are a number of potential sources of danger around a simulator--high voltage electricity, high pressure hydraulics, the motion system, and possibly others. However, most of these should be of concern only to maintenance personnel. There should be no reason for students to be exposed to high voltage, for example. In any event, maintenance personnel will always be present during self-training periods and can be made responsible for enforcing restrictions on access to hazardous areas.
The only source of danger of importance to the self-training concept is the motion system. The danger, of course, is to personnel working outside and in the vicinity of the student station. As discussed previously, the best solution to this problem is to have motion system turn-on procedures involving both the students and the maintenance personnel. Communication between the two is a necessary element of these procedures.
CONCLUSIONS
Excluding the basic flight training period, most training in flying is conducted without the immediate presence of an instructor. This statement is made on the premise that flying to maintain proficiency is essentially training. Throughout his career a pilot receives instrument checks, annual proficiency checks, upgrade training, and refresher training, during all of which an instructor or supervisor is very much in evidence; but in most of his flying he is expected to train himself--to assimilate guidance and instructions on the ground and to apply them in the aircraft. Especially in practicing takeoffs and landings, in instrument flying, and in weapon delivery, he learns primarily by self-correction and repetition. In brief, he is training by acquiring experience.
It follows that a pilot can learn in a simulator also by these methods. What makes self-training effective in an aircraft is the pilot's conscientiousness and professionalism; the same characteristics in the student can make self-training effective in a simulator. There is a difference in the two situations, of course. The penalty for lack of motivation or a lapse in concentration is sometimes fatally severe in an aircraft; in a simulator there is no penalty other than wasted time and a relatively small amount of money.
A controversial aspect of self-training is the effect of the loss of realism caused by the student's frequent interacting with the simulator. From time to time during a self-training exercise, the student will take such actions as reinitializing the simulator, entering and clearing malfunctions, storing flight conditions and resetting to the stored conditions, and adding fuel and armament. In addition he may display emergency procedures via the CRT system and review them before entering malfunctions, freeze the trainer and study the pertinent approach plates, and interrupt the flight and call up a demonstration. All of these actions would be foreign to his procedures as a pilot. The question is, would they reduce the transfer of training expected from the simulated control responses, instrument readings, and visual scenes that he is also perceiving?
It is believed that the answer to that question is negative. Again, professionalism and conscientiousness will enable the student to bridge the gap between managing his own training and receiving effective training experiences. One ingredient is necessary: the management task must not become too engrossing.
A related concern is whether the design of the student station--and the associated software--is sufficiently simple. It would be possible to overwhelm the student by attempting to provide all the capabilities of an instructor station. Designers will have to keep in mind the three required characteristics for student station controls cited previously--simplicity, accessibility, and inconspicuousness--and the necessity to be conservative in providing instructional features. The need for disciplined and responsive human factors engineering will be possibly greater than for any other area of development of a simulator.
In conclusion, it is recommended that specifications for flight simulators (cockpit procedures trainers, operational flight trainers, weapon system trainers, team tactics trainers, and similar training devices) include provisions for a student self-training capability, in addition to a conventional instructor station. The users will gain tremendous flexibility in simulator utilization. In the final analysis, they will be able to schedule simulators like aircraft--simply, some flights can be solo and some dual.
REFERENCES
1. Dickman, J. L. A Comparative Analysis of Three Approaches to the Man-Machine Interface Problem at the Flight Simulator Instructor/Operator Station. Proceedings of the 3rd Interservice/Industry Training Equipment Conference, Orlando, FL, November 1981.
GENERIC SIMULATOR INSTRUCTOR TRAINING

DR. GEORGE W. MENZER AND DR. STEVE R. OSBORNE
Allen Corporation of America
Flight simulation has been extensively used in both military and commercial aviation training for the past several decades. Every year the complexity and versatility of flight simulators grow. However, the full training potential of these devices is not often realized. There are, of course, a variety of reasons why this situation persists. Lack of transfer of training due to procedural or configurational dissimilarities between the simulator and parent aircraft, lack of adequate out-the-window visual displays, and the incompleteness of the mission-relevant aspects of the simulations are but a few. However, the point of this paper is that the training potential of flight simulators could be enhanced by improving the knowledge and understanding of the operating principles of the devices, by understanding instructional arrangements, and by utilizing instructional skills.
Lack of realization of the full training potential of many simulators varies a great deal from one simulator to another. This variance is created by the complexity of the mission being simulated, the age and sophistication of the simulator, the instructor/operator station design, and the availability of instructional aids.
In this paper, a solution to the problem is explored--the development of generic instructor training packages for flight simulators. In the following sections of the paper, several general topical areas relating to generic instructor training are presented. These include the advantages of simulator training, the principles of effective instruction related to simulator instructional features, and possible implementation and delivery strategies.
Advantages of Simulator Training

Ground-based flight simulators play an important role in the overall training scheme of most civilian and nearly all military aviation training programs. The advantages of simulator training are clear. It offers a means to provide cost effective and instructionally effective training in a safe environment. Cost savings are realized in both the obvious and some not so obvious training parameters. Obviously, fuel and maintenance costs associated with actual flight are conserved. Not so obviously, tremendous savings in time are realized in simulator training. These savings are observed in both student and instructor time requirements. Students get more relevant training per hour of "flight" because they can concentrate on the goals of a particular mission, while instructors may supervise much more relevant training per hour of their commitment. Simulators also allow a safe training environment. In the area of safety, the advantages are clear and obvious--simulators allow practice in flight activities where the only negative consequences are learning with some embarrassment, and where realistic responses to emergency or compound emergency situations may be practiced. Last, but from the point of view of this paper by no means least, simulators provide a necessary hands-on instructional format for flight operations. With simulators as a part of the overall flight training regimen, the opportunity is presented to allow hands-on and associated cognitive learning experiences of the proper training density in proximity to the preceding academic events and to the subsequent inflight events. These simulator events are particularly valuable because they provide practice with appropriate feedback for the learning objectives of interest, as well as allow repeated exposures to particularly difficult and important segments of a mission.
Needless to say, a component of any instructor training course related to flight simulation should be a thorough discussion of these and other advantages of simulation.
Instructional Principles and Instructional Features
A generic simulator instructor training program should amplify a few basic principles of effective instruction, particularly as they relate to the usual set of instructional features found on most in-service flight simulators. The following is a list of such principles with their relationship to some commonly available instructional features.
1. Student Preparation Leads to Effective Learning. It is a well-established, even intuitively obvious, fact that the better prepared the student is when he comes to a simulator exercise, the more he will profit from it. A generic simulator instructor training program should be designed to emphasize this point as well as to key the instructor to use some commonly available student preparation aids. These aids would include the formal simulator exercise guides and prompts/briefing aids. Formal simulator exercise guides are universally available in all Navy training squadrons. Some are squadron developed while others are the result of contractor
supported ISD development activities. The generic training should emphasize that through the use of these guides both the instructor and the student will be prepared for the content and flow of the activities in the exercise. Also, depending on the sophistication of the documents, the student may have an excellent feeling for the level of proficiency he is expected to demonstrate. At least one simulator program, the F-14 OFT program at NAS Miramar, has been supported by a very sophisticated device, the Instructor Support System, which allows CRT-displayed briefing aids. Aids of this type could become part of a device-resident generic instructor course or any course in support of a new IOS design that is sufficiently complex to contain it.
2. Accurate and Timely Feedback Leads to Improvements in Performance. Another well-known principle of effective instruction is that accurate and timely feedback leads to improvements in performance. A properly designed simulator instructor training course would emphasize this point and relate it to the several instructional features, commonly available on most simulators, that allow accurate and timely feedback. These instructional features include freeze, record/replay, automated performance measures, graphics displays, and hardcopy printouts. The freeze function allows immediate feedback for the development of fine motor control as well as directional control. It is particularly useful because it can be employed both in real time and during a performance replay session. Record/replay has an obvious application and, in fact, provides the basis of many of the other instructional features. Automated performance measures provide an excellent opportunity for lessening the workload of the instructor while providing a more accurate and complete record of performance than even the most diligent instructor is capable of providing. However, while the value of automated performance measures should be emphasized in a generic simulator instructor training program, it must be noted that the value of this instructional feature depends on the amount of instructor input to the variables, the weights of the measures, and the accuracy of the computations. While automated performance measurement systems are not generally available for most current simulators, it is clear that they will be available in future systems or in outboard support systems. Graphics displays, particularly those which display approach, weapons delivery, and pattern information, are an excellent means to provide post-session performance feedback. These displays are fairly common on many current simulators, although some are fairly unsophisticated. However, future simulator instructor stations and support systems for current simulators will include these displays. A clear emphasis should be placed on the value and use of these displays in a generic instructor course. Hardcopy performance printouts are almost universally available on all complex simulators. However, they are almost universally unused as an instructional feature. A generic instructor training course would emphasize the potential value of these printouts as well as provide suggestions about how they could be summarized to be a more useful resource. Here again, the value and acceptance of these printouts depends on the legitimacy of the measures expressed.
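The dependence of automated performance measures on instructor-supplied variables, weights, and tolerances, noted above, can be sketched as a simple weighted scoring scheme. This is a hypothetical illustration only; the parameter names, tolerances, and weights are assumptions, not taken from any fielded system.

```python
# Hypothetical sketch of an automated performance measure: the instructor
# selects the variables to score, their targets, tolerances, and weights;
# the system computes a weighted composite from recorded flight data.

def measure_score(recorded, target, tolerance):
    """Fraction of recorded samples within tolerance of the target value."""
    within = sum(1 for v in recorded if abs(v - target) <= tolerance)
    return within / len(recorded)

def composite_score(samples, criteria):
    """Weight-normalized sum of per-variable scores."""
    total_weight = sum(w for (_, _, _, w) in criteria)
    score = sum(w * measure_score(samples[name], tgt, tol)
                for (name, tgt, tol, w) in criteria)
    return score / total_weight

# Instructor-defined criteria: (variable, target, tolerance, weight)
criteria = [
    ("airspeed_kt", 250.0, 10.0, 2.0),
    ("altitude_ft", 5000.0, 200.0, 1.0),
]
samples = {
    "airspeed_kt": [248.0, 255.0, 262.0, 251.0],
    "altitude_ft": [4900.0, 5150.0, 5600.0, 5050.0],
}
print(round(composite_score(samples, criteria), 2))  # -> 0.75
```

The point the paper makes holds in miniature here: the resulting score is only as legitimate as the instructor's choice of variables, tolerances, and weights.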
3. Practice Improves Performance. Another well-established principle of effective instruction is that practice, particularly practice combined with accurate and timely feedback, improves performance. As a matter of fact, it has been conclusively demonstrated that practice beyond the point of proficiency is effective in consolidating learning and making it less likely that deterioration in future performance will occur. Opportunities for practice are provided by both supporting materials and instructional features. That is, within the supporting materials designed for specific training goals, the training objectives have been arranged to provide practice that is related to the difficulty, criticality, and frequency of the psychomotor response in question. From the point of view of instructional features, the reset function is a key feature that allows repeated practice for difficult or problem maneuvers. Reset is a commonly available feature which allows the instructor to reinitiate a training routine repeatedly with constant or variable flight conditions until satisfactory performance is observed. Malfunction insertion is another practice-relevant instructional feature. With this commonly available feature, the instructor can manually or automatically insert aircraft malfunctions at any stage in a mission. These malfunctions may be presented once, or repeatedly, at the instructor's option. This feature allows excellent practice in dealing with emergency situations.
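The reset and malfunction-insertion features described above amount to a simple control loop: reinitiate the exercise with constant or varied initial conditions, optionally inserting a malfunction, until performance reaches criterion. The following is a minimal sketch under that reading; every name, threshold, and the toy exercise function are invented for illustration.

```python
import random

# Hypothetical sketch of the reset/malfunction-insertion loop: the exercise
# is reinitiated, with constant or varied flight conditions and an optional
# malfunction, until the student's score reaches criterion.

def run_practice(fly_exercise, base_conditions, criterion=0.8,
                 vary=False, malfunction=None, max_resets=10, rng=None):
    rng = rng or random.Random(0)
    for attempt in range(1, max_resets + 1):
        conditions = dict(base_conditions)
        if vary:  # variable flight conditions on each reset
            conditions["altitude_ft"] += rng.choice([-500, 0, 500])
        if malfunction is not None:
            conditions["malfunction"] = malfunction
        score = fly_exercise(conditions)
        if score >= criterion:
            return attempt, score
    return max_resets, score

# Toy stand-in for a flown exercise: the score improves with each reset.
history = []
def fly(conditions):
    history.append(conditions)
    return 0.5 + 0.1 * len(history)

attempts, final = run_practice(fly, {"altitude_ft": 5000},
                               malfunction="hydraulic_failure")
print(attempts)  # criterion reached on the third reset
```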
A generic simulator instructor training program would emphasize the importance and use of these practice-relevant features. The course would include clear direction relating to the use of, and adherence to, the supporting materials. Several problem analytic studies have shown that departure from the training routines specified in supporting materials is a problem in many Navy training squadrons. The course would also include direction relating to the proper use of the reset and malfunction insertion features. This direction should emphasize that resetting is an excellent and timely method to provide repeated practice for problem maneuvers. Emphasis would be placed on the proper use of the malfunction insertion features. Here the theme would center on adherence to the required malfunction insertions specified in supporting materials and the proper use of random malfunction insertion. The generic course would make it clear that intentionally overloading the student with compound emergencies is unrealistic and may lead to confusion and restricted skill advancement in the areas of primary interest.
4. Guidance and Motivation Improve Performance. A final generally accepted principle of effective instruction related to simulator training is that proper guidance and motivation improve performance. Surely, one of the most important jobs of the simulator instructor is to provide proper guidance and direction to the student. The generic course would emphasize that this guidance must be organized, mission relevant, and provided in an instructional mode. That is, it must be provided in a cooperative setting characterized by jointly achieved student/instructor goals. Several instructional features relate to the organization of student guidance in the simulator setting. These include use of the trainer supporting materials and use of the prompts/briefing aids. These features provide a vehicle for guidance. However, the key vehicle for guidance is a sensitive student/instructor relationship. A generic course should emphasize that point. Instructor-provided motivation is another key feature in the student/instructor relationship. A generic course should emphasize the need for motivational comments, rewards, and open communication. In addition, the course should emphasize the use of some commonly available instructional features which may aid student motivation. These would include the demonstration feature, the freeze feature, and all the features that provide feedback about performance. A generic course should emphasize that demonstrations can be motivationally effective if they are not overused or if they are used as a feedback mechanism to provide direction on the proper procedures. The freeze feature can also have a profound motivational effect when it is used to intercept poor performance or to indicate a "terminal" condition such as striking the ground or being hit by a weapon. The course would point out clearly that the instructor is responsible for providing motivation through the thoughtful use of all of the performance feedback features. These would include automated performance measures, record/replay, graphics displays, and hardcopy printout. The emphasis here should be on the proper and sensitive use of these data sources to improve performance.
Implementation of Generic Training

There are several potential advantages to a generic simulator training program. The program would undoubtedly improve the effectiveness of the instructor cadre for every simulator where it might be implemented. This improved effectiveness should lead to a more professional approach to the overall use of simulator training, as a part of flight training in general, and to potential improvement in student performance. In addition, the course should lead to improvements in student management from both the interpersonal and administrative points of view. Interpersonal improvements could result from the course content in the areas of guidance, motivation, and performance feedback. Administrative improvements could result from the generally improved organization of simulator training and from the use of automated management aids that might be part of a device-resident training course.

A disadvantage of the proposed concept is that each instructor training course associated with a particular simulator would have to be tailored to reflect the particular instructional features and the nature of the supporting materials of that device. This requirement admittedly diminishes the generic value of the concept but, hopefully, only a small amount of tailoring would be required to address most simulators. The training course could be implemented in two distinctly different formats--device-resident or device-independent. In the device-resident format, the course could be contained in the software of the simulator computer and presented on a CRT associated with the IOS or a supporting system. With this format, the course would be continuously available, obviously convenient, and might even be made contingent for instructor utilization of the device. In the device-independent format, the course could be presented in stand-alone media, such as a workbook or slide/tape, and could be presented in an individualized study atmosphere in a learning center or training support center. This format would allow instructor access in a self-paced, individualized arrangement which is consistent with the manning demands of most training squadrons. In either format, the course would demand a minimum of intrusion into training squadron activities during its development while providing a maximum of benefit for instructor training.
APPLICATION OF ARTIFICIAL INTELLIGENCE AND VOICE RECOGNITION/SYNTHESIS TO NAVAL TRAINING
JOHN A. MODRICK, JAN D. WALD AND JOHN BROCK
HONEYWELL, INC.
SYSTEMS AND RESEARCH CENTER
MINNEAPOLIS, MN 55440
INTRODUCTION
The objective of the research reported here is to evaluate the feasibility and usefulness of applying two emerging technologies to naval training. These technologies are artificial intelligence and voice recognition and synthesis.
Artificial intelligence (AI) is a body of concepts and techniques which have been developed to permit computers/machines to do some of the complex cognitive activities that have traditionally been regarded as the unique province of human beings, in some instances exceptionally talented human beings. Representative cognitive activities are problem formulation, searching for problem solutions, diagnosis, and decisionmaking. These activities require representation of knowledge and inferential reasoning about that knowledge.
If these concepts can be combined with voice recognition and synthesis, then an interactive, oral dialogue can take place; the verbal exchanges can be flexible, meaningful, and appropriate to the context of the on-going events. Applied to training and education, computer-aided instruction moves a major step in the direction of a realistic journeyman-apprentice relationship. The journeyman can be designed to be not only a technical expert but an expert tutor as well.
The organization of this paper consistsof four parts:
Review and Evaluation of the Component Technologies and Principal Concepts in Artificial Intelligence

Review of Naval Training Needs Suitable for a Prototype Course Using AI and Voice Recognition/Synthesis

Recommended Guidelines for Instructor Support
Discussion of Implications and Issues
This research has been funded through two sources. One source is an on-going, multi-year project under Honeywell's program of Independent Research and Development. The second source is a contract with the Naval Training Equipment Center (NTEC 81-C-0093-13) entitled "Use of Voice Technology as the Instructor's Assistant."
It is assumed that the demographic predictions of unavailability of future military manpower are accurate. Therefore, it will be necessary to develop more efficient training systems and instructional methodology that are more effective in the amount of competence produced per hour of instruction. The proposed approach is feasible and will provide a training capability to meet the need. The cost versus benefit is more difficult to assess, however. We can assume that software costs will continue to accelerate and that all technology is expensive to develop. Intelligent systems of this type are software intensive. Further, the technological development needed is undetermined but certainly not trivial.
Artificial intelligence and voice technology may no longer be embryonic, but they are far from mature for most applications.
REVIEW AND EVALUATION OF THE COMPONENT TECHNOLOGIES AND PRINCIPAL CONCEPTS IN ARTIFICIAL INTELLIGENCE
The needs of the Navy in instructional capability were examined briefly for the purpose of guiding the survey of technology. The results of that survey were that the Navy needs instructor support; friendly and forgiving interfaces between the training devices and both the instructor and student; and automated performance measurement. The functions and tasks for which instructor support is needed are:
Subject Matter Expertise
Performance Evaluation
Problem Selection
Delivering Feedback
Progress Management
Documentation
Role Playing
The review of AI technology was organizedinto the following topical categories:
Knowledge Representation
Learning Theory
The Nature of Expertise
Intelligent Computer-Aided Instruction
Software Architecture for Artificial Intelligence
Natural Language Understanding
Interactive Speech
The major conclusions will be summarized briefly.
The area of Knowledge Representation is known as Domain Knowledge. There are two relevant domains for a training system: the knowledge content of the technical specialty being taught and the instructional knowledge of methods, strategies, and procedures. They present no unresolvable problems for knowledge representation. The content of the domains will become manifest as the training materials are developed. However, not all of the necessary knowledge has been explicitly articulated or well-defined.
An important area of instructional knowledge is the evaluation of student performance. The level of effectiveness of a training system is dependent on the availability of two kinds of evaluative data:

Assessment of the level of performance in terms of level of competence and one's standing in the subject matter.

Diagnosis of errors committed in terms of specific deficiencies in knowledge or skill and necessary corrective action.
Existing techniques of knowledge representation are adequate for constructing an intelligent, interactive training device to support an instructor. These techniques consist of representing things and events in templates, frames, and scripts, and the relationships among them in control rules and production rules. These rules also govern additions to knowledge derived internally by inference.
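The frames-plus-production-rules scheme described here can be illustrated in miniature. (The paper recommends LISP for actual development; this sketch uses Python purely for illustration, and every frame, slot, and rule in it is invented for the example.)

```python
# Miniature sketch of the scheme described above: frames hold things and
# events as slot-value structures, and production rules (condition -> action)
# relate them and add knowledge derived internally by inference.

frames = {
    "contact-1": {"is_a": "sonar_contact", "signature": "submarine",
                  "range_nm": 8},
}

def close_submarine(db):
    # Condition: a submarine contact inside 10 nm not yet classified.
    for name, f in db.items():
        if (f.get("signature") == "submarine" and f.get("range_nm", 99) < 10
                and "classification" not in f):
            return name
    return None

def classify_threat(db, name):
    # Action: write the inferred knowledge back into the frame base.
    db[name]["classification"] = "threat"

production_rules = [(close_submarine, classify_threat)]

def forward_chain(db, rules):
    """Fire rules until no condition matches (simple forward chaining)."""
    fired = True
    while fired:
        fired = False
        for condition, action in rules:
            match = condition(db)
            if match is not None:
                action(db, match)
                fired = True

forward_chain(frames, production_rules)
print(frames["contact-1"]["classification"])  # -> threat
```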
The contribution of Learning Theory is primarily in providing a structure and methodology for preparing training materials. One's first expectation might be that learning theory would provide models of the learning process; however, it performs that function only indirectly. Learning theory will provide a generic, transportable framework and methods for developing courseware, and rules and techniques for performing task analysis. These methods and techniques entail specification of stimuli, acceptable and unacceptable responses, feedback, sequencing, branching, display formats, and so on; they are in turn a function of the domain content, training objectives, instructional strategy, media, rules for performance assessment, and specific instructional techniques. All of this is dependent on the learning process. However, that knowledge must be encoded into the rules for instructional expertise and it will be transparent in the application.
Our interest in the nature of expertise is in how the knowledge of competent practitioners should be used in the training system. Examination of the representation of that knowledge revealed that the form of representation varies with domain. Domains differ in structure, encompassing the gamut from production rules to hierarchical organization; multiple conceptualizations for the same phenomena are available in some domains; multiple levels of abstraction are characteristic of some domains.
Problem solving methods also varied across domains. Different methods are available for different tasks. A given problem solving method is also adapted to the requirements and constraints of specific task conditions.
Therefore, it seems unlikely that expertise can be captured in a few simple, generic algorithms or procedures that are generalizable across knowledge domains or even within a domain. The interaction between the nature of expertise and knowledge representation is perhaps the most critical element in developing an intelligent training system. Knowledge structures and control rules that can represent the complex contingencies will have to be developed. The subject matter to be learned at novice levels will probably be relatively simple and without much of the troublesome complexity. This problem must be faced, though, when the training moves into more sophisticated or intermediate-level topics.
One perspective on the purpose for developing these intelligent systems is to shorten the transition time from novice to expert. We want to produce higher levels of competence in shorter periods of instruction and with lower support costs. Ideally, this transubstantiation will be accomplished on trainees who are naive with respect to the technical area and of no more than average intelligence (general intellectual ability). An instructional system that will accomplish this feat will need a model of the sequence of developmental stages from novice to expert, states within those stages, and the pedagogical techniques for transformation between states or stages. At the present state of our technology such a model must be derived within the context of a specific domain.
The Software Architecture for Artificial Intelligence was viewed as consisting of two parts: the hardware necessary to execute the program and the software. The hardware was found to be specific to the training program and thus no generalizations can be made. It was concluded that LISP is the software system that should be used for development and operation of such training systems. Knowledge representation techniques exist in LISP, as well as an assortment of supporting tools and facilities. Further, there is a history of experience with this language which provides both a body of knowledge and experienced AI practitioners to draw upon.
Interactive Speech and Natural Language Understanding present no obstacle in principle, although deriving and organizing the detailed dialogue requirements can be a challenging management task. Spoken messages can be synthesized at the current state of the art. Generation of the messages in the context of on-going events and conditions, and in an adequate approximation to real time, is uncertain and must be determined in the specific applications.
Current and near-future voice recognition systems can recognize utterances of only a few words in length; recognition and understanding of unconstrained, connected speech is not feasible and may not be practical for some time. This character of voice technology imposes a requirement that the natural language dialogue consist of short statements. Short statements may be characteristic of speech in task-oriented situations, and thus this constraint may not be a practical limitation.
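One way a dialogue layer can enforce the short-statement constraint described above is by matching utterances against a small command grammar and rejecting anything longer than the recognizer can handle. The commands, the word limit, and the interface below are invented for illustration; they are not drawn from any system discussed in the paper.

```python
# Illustrative sketch of a dialogue layer built around a limited recognizer:
# only short utterances drawn from a small command grammar are accepted.

COMMAND_GRAMMAR = {
    ("freeze",): "FREEZE_SIM",
    ("reset", "exercise"): "RESET",
    ("insert", "malfunction"): "INSERT_MALFUNCTION",
}
MAX_WORDS = 3  # recognizers of the era handled only a few words per utterance

def interpret(utterance):
    """Map a short utterance to a command token, or reject it."""
    words = tuple(utterance.lower().split())
    if len(words) > MAX_WORDS:
        return "REJECT_TOO_LONG"
    return COMMAND_GRAMMAR.get(words, "UNRECOGNIZED")

print(interpret("Reset exercise"))  # -> RESET
print(interpret("please reset the whole exercise now"))  # -> REJECT_TOO_LONG
```

The design choice mirrors the paper's observation: because task-oriented speech is naturally terse, confining the dialogue to such a grammar may cost little in practice.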
REVIEW OF NAVAL TRAINING NEEDS SUITABLE FOR A PROTOTYPE COURSE
Four training areas were reviewed forsuitability. They are:
Electronic Maintenance
Air Intercept Control
Naval Flight Officer Course
Anti-Submarine Warfare Team Training
In addition, training equipment in team training and air traffic control was analyzed. The team trainers analyzed were TACDEW, 14A2 ASW, and 14A12; the air traffic control trainers were PARTS (McCauley and Semple, 1980) and the AIC Trainer (Halley, King, and Regelson, 1978).
The criteria used to evaluate the trainers were cost-benefit, availability/shortfall of instructors, appropriateness of content to AI and voice, impact on readiness, and stability of the subject matter. The cost-benefit criterion was broken down into three parts. First, the developmental cost of the prototype was treated in terms of the availability of appropriate information, data, and on-going training. Second, operating cost was treated as the cost of instructors, the major factor differentiating among courses. The third part was a payback component in terms of readiness of trained personnel and reduction in personnel replacement.
Appropriateness of the subject matter for the use of artificial intelligence and interactive voice is an important methodological consideration. The prototype application must be able to reflect an advantage from utilizing the new technologies. Artificial intelligence will be beneficial in domains that have a significant cognitive component; voice technology is useful in areas of high workload which can be reduced by using hearing and speech as media of interaction rather than vision and manual responses.
ASW team training was chosen out of the four training areas as the best candidate for a prototype application. Electronic maintenance was rejected because it does not have a significant voice potential and developmental costs could be high. Maintenance skills are poorly understood, and significant costs and delay could be incurred in getting the necessary information. Air intercept control and the naval flight officer course were rejected because the subject matter is not stable.
RECOMMENDED GUIDELINES FOR INSTRUCTOR SUPPORT
Analysis of the instruction yielded needs for the following modules for instructor support:

Record Keeping
Scenario Set Up
Subject Matter Expert
Subject Matter Tutor
Speech Generation
Speech Recognition
Ability to Understand Errors
Omniscient Instructor
The major components of an Automated Training System are:

Instructor Work Station
Student Work Station
Domain Expert
Instructional Expert
Performance Measurement
The Instructor Work Station was examined in more detail to identify the capability it should have. The Instructor Work Station should have the following functions:

Explanation of Decisions and Actions
Record Keeping and Scheduling
Scenario Design and Debug
Communications Control
Problem Status
An important capability of the system is to recognize and cope with errors committed by the trainee. This capability entails several requirements in knowledge, and in use of the knowledge to make inferences about the trainee's behavior. The system must know the constraints reflected in the rules for correct behavior. Since the application is ASW team training, the system must also know how the trainee deals with other team members. Each player must know the roles of the other players, at a minimal level of their nominal, expected behaviors in relation to one's own role.

Therefore, the system must know not only the role structure of the team but also each trainee's beliefs about the knowledge and competence of the other players. It must be able to follow a trainee's reasoning about the knowledge of other players and know whether he attributes erroneous or unexpected behavior from another player to a lack of knowledge, misinformation, or faulty, missing, or inappropriate procedures. The inferences the trainee makes and how he responds to the other player's behavior are relevant; he may fill in missing information, correct the other player, request verification, or adjust his own behavior to compensate for the perceived deficiencies in the other player's behavior.
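A fragment of this error-attribution requirement can be sketched as rules over the role structure and a model of the trainee's beliefs. This is purely illustrative: the roles, situations, belief slots, and attribution categories below are invented for the example, not taken from the ASW trainers discussed in the paper.

```python
# Illustrative sketch: attribute a teammate's unexpected action to a lack
# of knowledge, misinformation, or a faulty procedure, using the role
# structure and a model of what the trainee believes the teammate knows.

expected_action = {  # nominal behavior for each role in a given situation
    ("sensor_operator", "new_contact"): "report_contact",
}

def attribute_error(role, situation, observed, trainee_beliefs):
    """Classify a teammate's deviation from nominal role behavior."""
    expected = expected_action.get((role, situation))
    if observed == expected:
        return "no_error"
    beliefs = trainee_beliefs.get(role, {})
    if not beliefs.get("knows_procedure", True):
        return "lack_of_knowledge"
    if beliefs.get("received_bad_data", False):
        return "misinformation"
    return "faulty_procedure"

beliefs = {"sensor_operator": {"knows_procedure": True,
                               "received_bad_data": True}}
print(attribute_error("sensor_operator", "new_contact",
                      "stay_silent", beliefs))  # -> misinformation
```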
These capabilities have been clustered into four sets which represent increasing levels of capability and are related to three stages in developing an intelligent, automated instructor or instructor's assistant, based on Crowe et al. (1981). We have modified the model of Crowe et al. by adding a fourth stage consisting of the ability to provide feedback and performance evaluation for team behaviors. The cross-correlation of these classifications is depicted in Table 1.
The correspondence between the major components of an Automated Training System and the AI technological areas on which they depend is summarized in Table 2.
DISCUSSION OF IMPLICATIONS AND ISSUES
Our intent in this section is to be explicit about some issues and concerns that were recurrent during our analyses and set aside for the purpose of attaining the immediate goal of developing some guidelines for how to proceed in developing a simulated, intelligent, talking instructor's assistant. They should at least be stated openly because they may represent potential problems or obstacles to the development, fielding, and effectiveness of this type of system. The intrepid innovator can then beware and take appropriate cautionary or preemptive actions.
These issues and implications are:
Impact on the Instructor
Reduction of Personal Contact between the Student and Instructor
The Achievable Level of Naturalness
Effect of Complexity on Friendliness and Support
Importance of Message Generation Versus Voice Synthesis in the User-System Interface
Compressibility of the "Novice-to-Expert" Curve
There are several possible effects on the instructor and his role which can have a negative impact on system effectiveness. The content of instructor training will necessarily be changed and perhaps increased in length and difficulty. Our intuition tells us that the instructor will need a better understanding of the instructional process than he presently has. At the same time his automated assistant will permit him to handle more trainees and attain higher levels of proficiency; his productivity should undergo a marked enhancement. However, without an instructionally sophisticated supervisor, the automated instructor's assistant can run amok like the sorcerer's apprentice.

The instructor's morale, involvement, and sense of using his own talents will be affected. These factors will be enhanced if the instructor perceives his automated assistant as a tool which helps him in doing his job, as something to which he delegates tasks and which increases his effectiveness and permits him to do a better job. If, in contrast, he sees the "iron monster" as something which he must labor to support and which dominates his activities, then his morale, motivation, and involvement will decline.
Reduction of personal contact between the instructor and trainee can compromise the effectiveness of social reinforcers and attenuate an important source of feedback and diagnostic information which every good instructor learns to use. It will also minimize the opportunity to use a mentor relationship to model behaviors and facilitate the instructional process. The voice of the simulated assistant may take on a personality and a reality as an individual, and thus be able to satisfy some of these social functions. Realizing them may be a function of the creativity in preparation of the dialogues, the types of interactions embedded in them, and the naturalness of the interaction as perceived by the trainee. The personality of the automated assistant may be another domain of expertise needed to develop and deploy these systems.
The achievable level of naturalness is unknown at this time, as is the required level of naturalness. Naturalness and friendliness are concepts that are not well defined and have a large subjective, intuitive component as the terms are used. Specifications are difficult to state and satisfy under such conditions. These terms need to be differentiated and made more precise.
The impact of naturalness and the complexity of the interactions on system support and maintenance is a very serious issue. The transparency of natural, friendly, simple dialogues is achieved by putting the complexity and processing behind the interfacing
TABLE 1. CORRESPONDENCE BETWEEN STAGES OF DEVELOPMENT OF AN AUTOMATED INSTRUCTOR'S ASSISTANT AND AI CAPABILITIES. Rows (AI capabilities): Expert System; Explanation; Simulation (With Errors); Speech Generation; Speech Recognition; Understanding Misconceptions & Intent; Learning Model & Feedback. Columns (Stages of System Development): Limited Role Play; Student Model; Omniscient Instructor; Tutor. [Individual cell entries are not legible in this copy.]

TABLE 2. AI TECHNOLOGIES REQUIRED BY KEY TRAINER COMPONENTS. Rows (Key Trainer Components): Instructor workstation; Student workstation; Domain expert subsystem; Instructional expert subsystem; Simulation subsystem; Performance measurement subsystem. Columns (AI technology areas): Knowledge Representation; Natural Language Understanding; Interactive Speech Technology; Learning Theory; Cognitive Science; ICAI; AI Architecture. [Individual cell entries are not legible in this copy.]
console and into the software. Experience tells us that maintenance and support increase as software and hardware become more complex; Murphy's Law is imperative.
Voice synthesis with some degree of natural-sounding speech is relatively easy to achieve in the current state of the art. However, the appropriateness of the message to the context of on-going events and to the speech community of the listener is more difficult to accomplish. These features are the problem of message generation. It is dependent on a great deal of information processing involving knowledge representation, inference, and domain knowledge. This area is critical to the effective use of voice technology.
Compressibility of the curve for transforming a novice into some level of expert is an ultimate instructional issue. There are many differences between novice and expert in the amount of knowledge, its organization, techniques for formulating and solving problems, and the use of heuristics derived from experience and the accumulated lore of the discipline. A subset of these things must be chosen in terms of their utility on the job for a given level of proficiency, the ease of learning them, and the progression of states of knowledge and skill in the growth of proficiency. There is a multitude of empirical questions inherent in this issue; they must be sorted out and addressed if we are to have an effective technology for automation of instructional systems.
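One common way to frame curve compression is the power law of practice: time-on-task after N trials falls as T(N) = a * N**(-b), and better-structured instruction compresses the curve by steepening the exponent b (or lowering a). A minimal sketch, with made-up parameters chosen only for illustration:

```python
# Illustrative power-law-of-practice model: predicted time-on-task after
# N trials is T(N) = a * N**(-b). "Compressing" the curve means reaching
# a target proficiency in fewer trials (larger b or smaller a).
# The parameter values here are hypothetical.

import math

def trials_to_reach(target_time, a=10.0, b=0.4):
    """Trials needed before predicted time-on-task falls to target_time.

    Solve a * N**(-b) <= target  =>  N >= (a / target) ** (1 / b).
    """
    return math.ceil((a / target_time) ** (1.0 / b))

baseline = trials_to_reach(2.0, a=10.0, b=0.4)    # ordinary practice
compressed = trials_to_reach(2.0, a=10.0, b=0.6)  # steeper learning curve
print(baseline, compressed)  # fewer trials under the steeper curve
```

The empirical questions in the paragraph above amount to estimating a and b for real tasks and trainee populations, and determining which instructional choices actually move them.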
CONCLUDING REMARKS
Application of artificial intelligence and voice technology to training systems in the form of an automated instructor's assistant is feasible and can provide significant increases in the cost-effectiveness of training. Artificial intelligence provides several concepts and techniques which are useful in structuring and implementing the design and use of such a system. This paper is a brief summary of an on-going investigation of the feasibility, utility, and approach to developing an automated instructor's assistant for use in training Navy personnel. ASW team training was chosen as the vehicle for a prototype application of the automated instructor's assistant.
REFERENCES
Crowe, W., Hicklin, M., Kelly, M., Obermayer, R., and Satzer, W. Team Training Through Communications Control. Report NAVTRAEQUIPCEN 80-C-0095-2, 1981. Naval Training Equipment Center, Orlando, Florida 32813.

McCauley, M. E., and Semple, C. A. Precision Approach Radar Training Systems (PARTS) Training Effectiveness Evaluation. Report NAVTRAEQUIPCEN 79-C-0042-1, 1979. Naval Training Equipment Center, Orlando, Florida.

Halley, R., King, M. P., and Regelson, E. C. Functional Requirements for Air Intercept Controller Prototype Training System. Report NAVTRAEQUIPCEN 78-C-0182-4, 1978. Naval Training Equipment Center, Orlando, Florida.
NAVTRAEQUIPCEN IH-341

DISTRIBUTION LIST
Naval Training Equipment Center, Orlando, FL 32813
Technical Library, Naval Training Equipment Center, Orlando, FL 32813
Commanding Officer, Navy Submarine Base New London, ATTN: Psychology Section, Box 600, Groton, CT 06340
Dr. Donald W. Connolly, Research Psychologist, Federal Aviation Administration, FAA NAFEC ANA-230, Bldg. 8, Atlantic City, NJ 08405
National Aviation Facilities Experimental Center, Library, Atlantic City, NJ 08405
CDR Joseph Funaro, Code 602, Human Factors Engineering Div., Naval Air Development Center, Warminster, PA 18974
Commander, Naval Air Development Center, ATTN: Technical Library, Warminster, PA 18974
Technical Library, OUSDR&E, Room 3D122, Washington, DC 20301
Director, Defense Research and Engineering, Washington, DC 20301
Commanding Officer, Air Force Office of Scientific Research, Technical Library, Washington, DC 20301
OUSDR&E (R&AT) (BLS), CAPT Paul R. Chatelier, Room 3D129, The Pentagon, Washington, DC 20301
OASD (MRA&L)/Training, Room 3B922, The Pentagon, Washington, DC 20301
Chief, Research Office, Office of the Deputy Chief of Staff for Personnel, Department of the Army, Washington, DC 20310
National Defense University, Research Directorate, Ft. McNair, DC 20319
HQ AFSC/DLS, Andrews AFB, DC 20334
AFOSR/NL (Dr. A. R. Fregley), Bolling AFB, Washington, DC 20332
Dr. Genevieve Haddad, AFOSR/NL, Bolling AFB, DC 20332
Chief of Naval Operations, OP-115D, LCDR Jim Lawhorn, Washington, DC 20350
Assistant Secretary of the Navy, Research, Engineering & Systems, Washington, DC 20350
Chief of Naval Operations, OP-115, Research, Development & Studies, Room 6836, Washington, DC 20350
Office of the Deputy Chief of Naval Operations, Manpower, Personnel & Training (OP-01), Washington, DC 20350
Chief of Naval Operations, OP-593B, Washington, DC 20350
Chief of Naval Operations, OP-987H, ATTN: Dr. R. G. Smith, Washington, DC 20350
Chief of Naval Operations, OP-596C, Washington, DC 20350
Chief of Naval Material, MAT 031M, Washington, DC 20360
Commander, Naval Electronic Systems Command, Code 03, Washington, DC 20360
CDR Dick Gibson, Naval Air Systems Command, Code 413, Washington, DC 20361
CDR Tom Jones, Naval Air Systems Command, Code 334A, Washington, DC 20361
Commander, Naval Air Systems Command, AIR 340F, Washington, DC 20361
Commander, Naval Air Systems Command, Technical Library, AIR-950D, Washington, DC 20361
LT Thomas N. Crosby, Naval Air Systems Command, Code 5313X, Washington, DC 20361
Commander, Naval Air Systems Command, AIR 413F, Washington, DC 20361
Commander, Naval Sea Systems Command, Code 61R2 (Mr. P. J. Andrews), Washington, DC 20362
Director, Personnel & Training Analysis Office, Building 200-3, Washington Navy Yard, Washington, DC 20374
Naval Research Laboratory, ATTN: Library, Washington, DC 20375
Commander, Naval Supply Systems Command, Code 03, Washington, DC 20376
HQ, Marine Corps, Code APC, ATTN: LTC J. W. Biermas, Washington, DC 20380
Scientific Advisor, HQ, US Marine Corps, Washington, DC 20380
Commandant of the Marine Corps, Code OTTP-31, Washington, DC 20380
Library, Division of Public Documents, Government Printing Office, Washington, DC 20402
Scientific Technical Information Office, NASA, Washington, DC 20546
Federal Aviation Administration, Technical Library, Bureau Research & Development, Washington, DC 20590
US Coast Guard (G-P-1/62), 400 Seventh Street, SW, Washington, DC 20590
Dr. Sam Schiflett, Naval Air Test Center, SY 721, Patuxent River, MD 20670
Commander, Naval Air Test Center, CT 176, Patuxent River, MD 20670
Karl Feintuch, ARINC Research Corporation, 2551 Riva Rd., Annapolis, MD 21401
Director of Educational Development, Academic Computing Center, US Naval Academy, Annapolis, MD 21402
Dr. Jesse Orlansky, Institute for Defense Analyses, Science & Technology Div., 400 Army-Navy Drive, Arlington, VA 22202
Dr. Martin Tolcott, Office of Naval Research, 800 N. Quincy St., Arlington, VA 22217
CDR P. M. Curran, Office of Naval Research, 800 N. Quincy St., Arlington, VA 22217
Personnel & Training Research Programs, Office of Naval Research, Code 458, Psychological Sciences Div., 800 N. Quincy St., Arlington, VA 22217
Chief of Naval Research, Code 455, 800 N. Quincy St., Arlington, VA 22217
Chief of Naval Research, Code 442PT, 800 N. Quincy St., Arlington, VA 22217
Dr. Henry M. Halff, Office of Naval Research, Code 442, Arlington, VA 22217
Marshall Farr, Personnel & Training Research Group, Code 442, Office of Naval Research, 800 N. Quincy St., Arlington, VA 22217
Center for Naval Analyses, ATTN: Dr. R. F. Lockman, 2000 N. Beauregard St., Alexandria, VA 22311
Defense Technical Information Center, Cameron Station, Alexandria, VA 22314
PERI-OU, US Army Research Institute for the Behavioral & Social Sciences, 5001 Eisenhower Ave., Alexandria, VA 22333
US Army Research Institute for the Behavioral & Social Sciences, ATTN: Russell Kirton, 5001 Eisenhower Ave., Alexandria, VA 22333
Commanding Officer, Fleet Combat Training Center Atlantic, ATTN: Mr. Hartz, Code 02A, Dam Neck, Virginia Beach, VA 23461
Robert Goodwin, Commander, Naval Air Force, US Atlantic Fleet, Norfolk, VA 23511
Army Training Support Center, ATTSC-DS, Ft. Eustis, VA 23604
Commander, HQ TRADOC, ATTN: ATTNG-PA, Ft. Monroe, VA 23651
Chief, ARI Field Unit, P.O. Box 2086, ATTN: Librarian, Ft. Benning, GA 31905
LT Frank C. Petho, MSC, USN, Chief of Naval Education & Training, Code 433, NAS Pensacola, FL 32508
Commanding Officer, Naval Education Training Program & Development Center, ATTN: Technical Library, Pensacola, FL 32509
Commanding Officer, Naval Aerospace Medical Rsch. Laboratory, Code L5, Department of Psychology, Pensacola, FL 32512
TAWC/TN, Eglin AFB, FL 32542
USAHEL/USAAVNC, ATTN: DRXHE-FR (Dr. Hofmann), P.O. Box 476, Ft. Rucker, AL 36362
Dr. Will Bickley, USARI Field Unit, P.O. Box 476, Ft. Rucker, AL 36362
Chief, ARI Field Unit, P.O. Box 476, Ft. Rucker, AL 36362
Chief of Naval Technical Tng., Code 0161, NAS Memphis (75), Millington, TN 38054
Mr. Harold A. Kottmann, ASD/YWE, Wright Patterson AFB, OH 45433
Mr. Steven P. Gunzburger, ASD/ENETC, Wright Patterson AFB, OH
Mr. Craig McLean, USAF, ASD/YWE, Wright Patterson AFB, OH 45433
Air Force Human Resources Laboratory, AFHRL/LR, Logistics Research Div., Wright Patterson AFB, OH 45433
Mr. John Schreibner, Naval Air Systems Command, Code 4131, Washington, DC 20301
Dr. Kenneth Boff, AFAMRL/HEA, Wright Patterson AFB, OH 45433
Mr. Jim Loomis, ASD/ENETV, Wright Patterson AFB, OH 45433
Mr. R. G. Cameron, ASD/ENETC, Wright Patterson AFB, OH 45433
Dr. Ralph R. Canter, US Army Research Institute Field Unit, P.O. Box 16117, Ft. Harrison, IN 46216
Headquarters, 34 Tactical Airlift Training Group/TTDI, Little Rock AFB, AR 72076
Chief, Methodology & Standards Staff, Federal Aviation Administration Academy, Aeronautical Center, AAC-914, P.O. Box 25082, Oklahoma City, OK 73125
Commandant, US Army Field Artillery School, Counterfire Department, ATTN: Mr. Eugene C. Rogers, Ft. Sill, OK 73503
Commandant, US Army Field Artillery School, ATSF-TD-TS, Mr. Inman, Ft. Sill, OK 73503
CAPT Michael E. Shannon, 4235 STS/XPX, USAF, Carswell AFB, TX 76127
Headquarters, Air Training Command, XPTI, ATTN: Mr. Goldman, Randolph AFB, TX 78148
US Air Force Human Resources Lab, AFHRL-MPM, Manpower & Personnel Div., Manpower & Force Mgm. Systems Br., Brooks AFB, TX 78235
US Air Force Human Resources Laboratory, TSZ, Brooks AFB, TX 78235
AFHRL/PE, Brooks AFB, TX 78235
Chief of Naval Air Training, ATTN: Code 333, NAS Corpus Christi, TX 78419
Mr. Harold D. Warner, AFHRL/OT (UDRI), Williams AFB, AZ 85224
Chief of Naval Education & Training Liaison Office, Human Resources Laboratory, Flying Training Division, Williams AFB, AZ 85224
US Air Force Human Resources Lab, AFHRL-OT, Operational Training Div., Williams AFB, AZ 85224
AFHRL/OTO, Luke AFB, AZ 85309
Commanding Officer, Naval Education & Training Support Center, Pacific, Code N5B (Mr. Rothenberg), San Diego, CA 92132
Commander, Naval Air Force, US Pacific Fleet (Mr. Bolwerk/Code 316), NAS North Island, San Diego, CA 92135
Commanding Officer, Fleet Training Center, ATTN: Training Department, Naval Station, San Diego, CA 92136
Commander, Training Command, ATTN: Educational Advisor, US Pacific Fleet, San Diego, CA 92147
US Air Force Human Resources Lab, AFHRL-OT (Dr. Rockway), Williams AFB, AZ 85224
Dr. Thomas Longridge, AFHRL/OTR, Williams AFB, AZ 85224
Commanding Officer, Human Resources Laboratory, Operational Training Div., Williams AFB, AZ 85224
Commanding Officer, Fleet Combat Training Center, Pacific, Code 09A, San Diego, CA 92147
Commanding Officer, Fleet Anti-Submarine Warfare Training Center, Pacific, ATTN: Code 001, San Diego, CA 92147
Navy Personnel Research & Development Center, ATTN: M. McDowell, Library, Code P201L, San Diego, CA 92152
LT Wayne R. Helm, Human Factors Engineering, Code 1226, Point Mugu, CA 93042
Commander, Pacific Missile Test Center, Point Mugu, CA 93042
Frank Fowler, 3306 TES (ATC), STOP 223, Edwards AFB, CA 93523
LTC J. E. Singhton, 3306 TES, Edwards AFB, CA 93523
Commander, Naval Weapons Center, Code 3154, ATTN: Mr. Bob Curtis, China Lake, CA 93555
Commander, Naval Weapons Center, Human Factors Branch (Code 3194), ATTN: Mr. Ronald A. Erickson, China Lake, CA 93555
Dr. Wayne D. Gray, ARI Field Unit, P.O. Box 5787, Presidio of Monterey, CA 93940
AFHRL Technology Office, ATTN: MAJ Duncan L. Dieterly, NASA-Ames Research Cen., MS 239-2, Moffett Field, CA 94035
Mr. Robert Wright, AeroMechanics Lab. (USAAVRADCOM), Ames Research Ctr., MS 239-2, Moffett Field, CA 94035
Dr. David C. Nagel, LM-239-3, NASA Ames Research Center, Moffett Field, CA 94035
Dr. J. Huddleston, Head of Personnel Psychology, Army Personnel Research Establishment, c/o RAE, Farnborough, Hants, ENGLAND
Mr. David A. Citrin, General Electric Ordnance Systems, Rm 237, 100 Plastics Avenue, Pittsfield, MA 01201
Mr. Ron Handy, Pacer Systems, 87 2nd Avenue, Burlington, MA 01803
Corporate-Tech Planning Inc., ATTN: Katherine E. Blythe, 275 Wyman St., Waltham, MA 02154
Mr. Albert Stevens, Bolt Beranek & Newman, Inc., 10 Moulton St., Cambridge, MA 02238
Mr. Harvey F. Silverman, Professor of Engineering, Brown University, Providence, RI 02912
Director, G/G-Training Dynamics, 90 National Drive, Glastonbury, CT 06033
Analysis & Technology, Inc., New London Office, P.O. Box 769, ATTN: Frederick Ball, New London, CT 06320
Mystech Associates, Inc., P.O. Box 220, Mystic, CT 06355
Eclectech Associates, North Stonington Professional Center, P.O. Box 178, North Stonington, CT 06359
Dr. Thomas J. Hammell, Eclectech Associates, P.O. Box 178, North Stonington, CT 06359
Mr. W. Bertsche, Eclectech Associates, P.O. Box 178, North Stonington, CT 06359
Analysis & Technology, Inc., ATTN: Bonnie E. Jennings, Technology Park, Route 2, P.O. Box 220, North Stonington, CT 06359
Sonalysts, Inc., 215 Parkway North, Waterford, CT 06385
Dunlap & Associates, Inc., One Parkland Drive, ATTN: J. T. Fucigna, Executive VP, Darien, CT 06820
E. P. Washchilla, Talley Industries, Research Center, 58 Progress Drive, P.O. Box 1456, Stamford, CT 06902
Rockwell International, Autonetics Marine Systems Div., New England Operations, 1028 Poquonnock Rd., Groton, CT 06340
Rowland & Co., P.O. Box 61, Haddonfield, NJ 08033
Techplan Corp., 1000 Maplewood Dr., Maple Shade, NJ 08052
Geoffrey M. Heddon, Computer Sciences Corp., P.O. Box N, Moorestown, NJ 08057
John D. Gould, IBM Research Center, Yorktown Heights, NY 10598
Lockheed Electronics Co., Inc., ATTN: Support Engineering Dept., Mail Station 1100, 1501 US Highway 22, C.S. #1, Plainfield, NJ 07061
Systems Engineering Assocs. Corp., P.O. Box 2030, 19 Cherry Hill Industrial Park, Perina Blvd., Cherry Hill, NJ 08003
Grumman Aerospace Corp., Plant 36, C02-04, ATTN: Mr. Sam Campbell, Bethpage, LI, NY 11714
Martin Plawsky, Grumman Aerospace Corp., C21-05, Bethpage, NY 11714
Glynn R. Ramsay, Gould, Inc., Simulation Systems Div., Melville, NY 11747
Martin Morganlander, Gould SSD, 50 Marcus Drive, Melville, NY 11747
Howard Jaslow, Gould Simulation Systems, 50 Marcus Dr., Melville, NY 11747
Harris Corp., PRD Electronics Div., 6801 Jericho Turnpike, Syosset, NY 11791
ERIC/IR, Syracuse University, School of Education, Syracuse, NY 13210
Drs. Cohen and Stark, Link Division, The Singer Company, Binghamton, NY 13902
Bob Gamache, Singer-Link FSD, Colesville Rd., Binghamton, NY 13902
Alexandra B. Taylor, Singer-Link FSD, Binghamton, NY 13902
SCIPAR, Inc., ATTN: Leslie A. Duffy, P.O. Box 185, Buffalo, NY 14221
Falcon Research & Development Co., ATTN: Patricia D. Pfohl, One American Drive, Buffalo, NY 14225
Calspan Corporation, 4455 Genesee St., Buffalo, NY 14225
George R. Purifoy, Jr., Applied Science Assocs., Inc., Box 158, Valencia, PA 16059
Dr. M. M. Reddi, Vice President, Engineering Dept., Franklin Research Center, 20th St. & Benjamin Franklin Parkway, Philadelphia, PA 19103
James J. Perriello, Deputy Div. Mgr., Arlington Div., Systems Consultants, Inc., 2828 Pennsylvania Ave., Washington, DC 20007
Charles B. Colton, P.E., Marketing Director, Systems Consultants, Inc., 2828 Pennsylvania Ave., Washington, DC 20007
American Institutes for Research in the Behavioral Sciences, 1055 Thomas Jefferson St., NW, Washington, DC 20007
Robert F. Archer, American Institutes for Research in the Behavioral Sciences, 1055 Thomas Jefferson St., NW, Washington, DC 20007
American Psychological Assoc., Psychological Abstracts, 1200 17th St., NW, Washington, DC 20036
Robert T. Hennessy, Committee on Human Factors, National Research Council, 2101 Constitution Ave., NW, Washington, DC 20418
TRACOR, Inc., ATTN: Training Section Head, Two Pine Hill Center, P.O. Box 600, Lexington Park, MD 20653
GP Technology Corp., 5615 Landover Rd., Landover Hills, MD 20784
B-K Dynamics, Inc., ATTN: Barbara Schardt, MIS Division, 15825 Shady Grove Rd., Rockville, MD 20850
E. Scott Baudhuin, Singer-Link, 11809 Tech. Rd., Silver Spring, MD 20904
Richard Kause, Singer-Link, Link Simulation Sys. Div., 11800 Tech. Rd., Silver Spring, MD 20904
ORI, Inc., 1400 Spring St., Silver Spring, MD 20910
Vitro Laboratories Div., Automation Industries, Inc., 14000 Georgia Ave., Silver Spring, MD 20910
Westinghouse Electric Corp., Defense Group, ILS Div., ATTN: Carole W. Forsythe, MS 7740, 1111 Schilling Road, Hunt Valley, MD 21030
Technical Information Center, General Physics Corp., 1000 Century Plaza, Columbia, MD 21044
Joyce F. Peacock, Librarian, AAI Corp., Technical Library, P.O. Box 6767, Baltimore, MD 21204
CADCOM, ATTN: Mr. James W. Foust, 107 Ridgely Ave., Annapolis, MD 21401
Ruth Ehrensberger, ARINC Research Corp., 2551 Riva Rd., Annapolis, MD 21401
American Communications Corp., Mail Station B, 7617 Little River Turnpike, Annandale, VA 22003
Synectics Corp., ATTN: Mr. E. M. Jaehne, 10400 Eaton Place, Fairfax, VA 22030
Institute of Human Performance, Circle Towers Plaza Level, 9411-R Lee Highway, Fairfax, VA 22031
Biotechnology, Inc., ATTN: Alma Collie, 3027 Rosemary Lane, Falls Church, VA 22042
Bert Alton, Sperry Systems Management, 12010 Sunrise Valley Drive, Reston, VA 22091
Joseph L. Dickman, Sperry Systems Management-SECOR, 12010 Sunrise Valley Dr., Reston, VA 22091
Library, Tactical Operations Analysis Div., Science Applications, Inc., 8400 Westpark Drive, McLean, VA 22102
Behavioral Sciences Department, ATTN: R. J. Ely, The BDM Corporation, 7915 Jones Branch Drive, McLean, VA 22102
PRC Technical Applications, Inc., ATTN: Karen B. Burklow, 7600 Old Springhouse Road, McLean, VA 22102
Dr. Edward N. Hobson, The BDM Corporation, 7915 Jones Branch Drive, McLean, VA 22102
General Research Corporation, ATTN: Mr. Robert Watt, 7655 Old Springhouse Road, Westgate Research Park, McLean, VA 22102
Advanced Technology, Inc., ATTN: Debbie Burns, Suite 500, 7923 Jones Branch Drive, McLean, VA 22102
Alan G. Windt, PhD, SAI Comsystems Corporation, 7921 Jones Branch Drive, McLean, VA 22102
Human Sciences Research, Inc., Westgate Research Park, 7710 Old Springhouse Road, McLean, VA 22102
Engineering Research Associates, ATTN: Erik Thamm, 1517 Westbranch Drive, McLean, VA 22102
Jean Hillman, Advanced Technology, Inc., 7923 Jones Branch Drive, Suite 500, McLean, VA 22102
George H. Datesman, Jr., Program Manager, Applied Science Associates, Inc., P.O. Box 975, Middleburg, VA 22117
XMCO, Inc., ATTN: Director of Training Developments, 6501 Loisdale Court, Suite 606, Springfield, VA 22150
Litton Mellonics, ATTN: Ms. Sue Tepper, Librarian, P.O. Box 1286, Springfield, VA 22151
Performance Measurement Assocs., Inc., 131 Park St., NE, Vienna, VA 22180
Delex Systems, Inc., ATTN: J. Golden, Suite 700, 4150 Leesburg Pike, Vienna, VA 22180
Raymond G. Fox, Society for Applied Learning Technology, 50 Culpeper St., Warrenton, VA 22186
Analysis & Technology, Inc., ATTN: Bradley Keyes, Taylor Building (NC-3) Penthouse, 2531 Jefferson Davis Highway, Arlington, VA 22202
Potomac Research, Inc., 1600 North Beauregard St., Alexandria, VA 22311
Rockwell International, ATTN: D. B. Lafferman, 1745 Jefferson Davis Highway, Suite 1107, Arlington, VA 22202
Mantech International, Century Building, 2341 Jefferson Davis Highway, Arlington, VA 22202
W. Kessler, Data-Design Laboratories, Suite 307, 1755 South Jefferson Davis Highway, Arlington, VA 22202
SRI International, 1611 North Kent St., Arlington, VA 22209
Don R. Lyon, Editor-in-Chief, News & Notes, Personnel & Training Research Group, Office of Naval Research, Arlington, VA 22217
Control Data Corporation, ATTN: Mr. J. W. Doescher, 1800 North Beauregard St., Alexandria, VA 22311
Dr. Edgar L. Shriver, President, Kinton Incorporated, Suite 205, 1500 North Beauregard St., Alexandria, VA 22311
Director, Human Resources Research Organization, 300 N. Washington St., Alexandria, VA 22314
Allen Corp. of America, ATTN: W. M. Rugemer, Jr., 401 Wythe St., Alexandria, VA 22314
Essex Corporation, ATTN: Dr. Thomas B. Malone, 333 N. Fairfax St., Alexandria, VA 22314
Dr. Saul Lavisky, Executive Officer, HumRRO, 300 N. Washington St., Alexandria, VA 22314
Q.E.D. Systems, Inc., ATTN: Special Projects, 4646 Witchduck Rd., Virginia Beach, VA 23455
Comptek Research, Inc., #5 Kroger Executive Center, Suite 127, Norfolk, VA 23502
Janet L. Winner, 4208 Colonial Ave., Norfolk, VA 23508
Dr. Robert C. Williges, Industrial Eng. & Operations Rsch., Virginia Polytechnic Inst. & State University, Blacksburg, VA 24061
R. L. Parker, Lockheed Georgia Co., Customer Tng. Dept., 86 S. Cobb, Marietta, GA 30063
Dr. Ross Gagliano, Head, Operations Research Branch, Georgia Tech Engineering Experiment Station, 326 Baker Bldg., Atlanta, GA 30332
David M. Eakin, Bell Aerospace Textron, Naval Coastal Systems Center, Bldg. 319, Panama City, FL 32407
B. L. Nissley, Bell Aerospace Textron, Naval Coastal Systems Center, Panama City, FL 32407
Georgia Institute of Technology, Computer Science & Tech. Lab., Engineering Experiment Station, Atlanta, GA 30332
B. E. Mulligan, University of Georgia, Dept. of Psychology, Athens, GA 30602
Paul W. Caro, Seville Research Corp., 400 Plaza Bldg., Pensacola, FL 32505
Seville Research Corporation, Pace Blvd. at Fairfield, Pensacola, FL 32505
John T. McShera, Jr., 2112 Lewis Turner Blvd., Ft. Walton Beach, FL 32548
G. H. Balz, Balz Enterprises, Inc., 2524 Chinook Trail, Maitland, FL 32751
Mr. A. F. Plogstedt, M/S 100, McDonnell Douglas Astronautics Co., P.O. Box 640, FL 32780
R. Fagan, 250 W. Lake Sue Ave., Winter Park, FL 32789
John P. Quinn, NSI, 1315 S. Semoran Blvd., Winter Park, FL 32792
Grace P. Waldrop, McDonald & Associates, Inc., 988 Woodcock Rd., Suite 136, Orlando, FL 32803
Bruce McDonald, McDonald & Associates, 988 Woodcock Rd., Orlando, FL 32803
Appli-Mation, Inc., 1291 Maguire Blvd., Suite 244, Orlando, FL 32803
Michael Skurzewski, Veda, Inc., 3203 Lawton Rd., Suite 233, Orlando, FL 32803
Jim Hazzard, Veda, Inc., 3203 Lawton Rd., Suite 233, Orlando, FL 32803
John Keegan, Allen Corporation of America, 3751 Maguire Blvd., Suite 270, Orlando, FL 32803
Bob Baker, Allen Corporation of America, 999 Woodcock Rd., Suite 308, Orlando, FL 32803
Science Applications, Inc., 3655 Maguire Blvd., Suite 150, Orlando, FL 32805
Diana Z. Bryan, Eagle Technology, 3165 McCrory Pl., Ste. 235, Orlando, FL 32803
Deborah Bush, E-Tech, Inc., 3165 McCrory Pl., Orlando, FL 32803
William Hinton, Spectrum of America, 1040 Woodcock Rd., Suite 214, Orlando, FL 32803
Moses Aronson, Aronson Industries, 3705 Wilder Lane, Orlando, FL 32804
Walter M. Komanski, Spectrum of America, 1040 Woodcock Rd., Suite 214, Orlando, FL 32803
Northrop Services, Inc., ATTN: G. F. Stackpole, 3203 Lawton Road, Suite 201, Orlando, FL 32803
Harold Rosenblum, 1310 Webster St., Orlando, FL 32804
Martin A. Traub, Appli-Mation, Inc., 6955 Hanging Moss Road, Orlando, FL 32807
Dave Daly, Rowland & Co., 6961 Hanging Moss Rd., Orlando, FL 32807
Diana Barry, Rowland & Co., 6961 Hanging Moss Rd., Orlando, FL 32807
Patricia Kilgore, Rowland & Co., 6961 Hanging Moss Rd., Orlando, FL 32807
Dr. Angelo P. Verdi, Rowland & Co., 6961 Hanging Moss Rd., Orlando, FL 32807
R. W. Schultheis, Jr., Computer Systems Division, Perkin-Elmer, 7200 Lake Ellenor Dr., Orlando, FL 32809
Jeffery L. Maxey, 3165 McCrory Pl., Orlando, FL 32813
Ergonomics Associates, Inc., P.O. Box 20987, Orlando, FL 32814
DBA Systems, Inc., ATTN: Advanced Programs, P.O. Drawer 550, Melbourne, FL 32901
T. F. McCrea, Chrysler Defense, 2423 Auburn Dr., Cocoa, FL 32922
HAP-International, Inc., Intercontinental Bank Bldg., 3899 N.W. 7th St., Suite 211, Miami, FL 33126
Thomas P. Ross, Jr., Science Applications, Inc., 2109 W. Clinton Ave., Suite 800, Huntsville, AL 35805
John M. Howard, Staff Scientist, Human Factors Div., SSS Consulting, 2600 Far Hills, Dayton, OH 45419
Rudolf K. Walter, International Aerospace Technologies, Inc., Executive Plaza, Suite 103, 4717 University Drive, Huntsville, AL 35805
Dr. Richard S. Jensen, Aviation Psychology Lab., Dept. of Aviation, The Ohio State University, Box 422, Columbus, OH 43210
Joseph R. Gleydura, Goodyear Aerospace Corp., 1210 Massillon Rd., Akron, OH 44315
George Paler, Goodyear Aerospace Corp., 1210 Massillon Rd., Akron, OH 44315
Donald K. Oldham, Goodyear Aerospace Corp., 1210 Massillon Rd., Akron, OH 44315
Simulation Technology, Inc., ATTN: R. B. Penwell, 4124 Linden Ave., Suite 200, Dayton, OH 45432
Carmine Vaccarino, SRL, Inc., 2800 Indian Ripple Rd., Dayton, OH 45440
Systems Research Lab, Inc., ATTN: George A. Whiteside, 2800 Indian Ripple Rd., Dayton, OH 45440
James R. Bridges, Program Mgr., Tng. Equipment, General Dynamics, Land Systems Div., P.O. Box 1901, Warren, Michigan 48090
Bjorksten Research Laboratories, Inc., P.O. Box 9444, ATTN: Mr. Michael J. Maloney, Madison, WI 53715
John F. Brock, Systems & Research Center, Honeywell, Inc., 2600 Ridgeway Parkway, Minneapolis, MN 55413
Tony Modrick, Honeywell Systems & Research Center, P.O. Box 312, Minneapolis, MN 55440
J. L. Runquist, Market Analyst, Sperry Univac, Univac Park, P.O. Box 3525, St. Paul, MN 55165
McDonnell Douglas Astronautics Co., ATTN: E. J. Reilly, Dept. E080, Bldg. 106, Level 2, P.O. Box 516, St. Louis, MO 63166
John W. Robbins, McDonnell Douglas Astronautics Co., Dept. 1502, Bldg. 81, Level 2, Post E10, Box, St. Louis, MO 63166
J. Germeroth, McDonnell Douglas Electronics Co., 2600 N. Third St., St. Charles, MO 63301
Ross E. Ailslieger, Crew Systems Technology, M.S. K36-40, Boeing Military Airplane Co., 3801 S. Oliver St., Wichita, KS 67210
Robert B. Grow, Simuflite Tng. Intl., 7610 Stemmons, Suite 300, Dallas, TX 75247
V. S. Perkins, Jr., Vought Corporation, P.O. Box 225907, M.S. 31-02, Dallas, TX 75265
Dr. Ronald R. Calkins, Vought Corporation, Bldg. 31, P.O. Box 226907, Dallas, TX 75265
General Dynamics/Fort Worth, ATTN: J. A. Davis, P.O. Box 746, MZ 2865, Fort Worth, TX 76101
Terry Nagle, Bell Aerospace Textron, 6800 Plaza Dr., New Orleans, LA 70127
Bert Tucker, Bell Aerospace Textron, 6800 Plaza Dr., New Orleans, LA 70127
Jerry Wages, Bell Aerospace Textron, 6800 Plaza Drive, New Orleans, LA 70127
Pete Chism, Bell Aerospace Textron, 6800 Plaza Dr., New Orleans, LA 70127
John Priest, 1651 Rockwood Trail, Fayetteville, AR 72701
Lynn R. Whisman, Burtek, Inc., P.O. Box 1677, Tulsa, OK 74101
R. L. Sands, General Dynamics, Fort Worth, P.O. Box 748, MZ 837, Fort Worth, TX 76101
Mr. Frank J. Roy, General Dynamics, Fort Worth, P.O. Box 746, MZ 2219, Fort Worth, TX 76101
Joseph C. Stein, 4235 Strategic Training Sq., 4235 STS/XPX, Carswell AFB, TX 76126
Ford Aerospace & Communications Corporation, Space Information Systems Operation, P.O. Box 58487, Houston, TX 77058
Richard L. Strong, Singer Co., 2222 Bay Area Blvd., Houston, TX 77058
Technology, Inc., Life Sciences Div., P.O. Box 32644, San Antonio, TX 78216
Texas Tech University, Psychology Department, Box 4100, ATTN: Dr. Charles G. Halcomb, Lubbock, TX 79409
Applied Science Associates, Inc., Box 6057, Ft. Bliss, TX 79916
H. Rudy Ramsey, Science Applications, Inc., 7935 E. Prentice Ave., Englewood, CO 80110
Christopher Lin, United Airlines Flight Tng. Cen., Stapleton International Airport, Denver, CO 80207
University of Denver, ATTN: Ralph E. Williams, Associate Head, University Park, Denver, Colorado 80208
Hughes Aircraft Co., Corporate Marketing/RFP Control, P.O. Box 92996, Los Angeles, CA 90009
D. W. Linder, Orgn. 2170/84, Northrop Corp., One Northrop Ave., Hawthorne, CA 90250
C. W. Meshier, Northrop Corp., One Hawthorne Ave., MS 91, Hawthorne, CA 90250
Vernon E. Carter, Northrop Corp., Human Factors, 3901 W. Broadway, Hawthorne, CA 90250
John Westland, 205 Avenue I, Suite 15, Redondo Beach, CA 90277
Edward C. Taylor, TRW Defense Systems Group, #1 TRW Circle, Redondo Beach, CA 90277
Barten & Associates, 225 Santa Monica Blvd., Santa Monica, CA 90401
William S. Meisel, Technology Service Corporation, Computer Sciences Div., 2950 31st St., Santa Monica, CA 90405
Human Factors Society, ATTN: Bulletin Editor, P.O. Box 1364, Santa Monica, CA 90406
Dr. James E. Robert, Medical Systems Technical Services, Inc., 23824 Hawthorne Blvd., Suite 206, Torrance, CA 90505
William E. Haulman, Mail Code 35-93, Douglas Aircraft Co., 3855 Lakewood Blvd., Long Beach, CA 90846
Stanley M. Aronberg, Mail Code 35-93, Douglas Aircraft Co., 3855 Lakewood Blvd., Long Beach, CA 90846
XYZYX Information Corp., ATTN: Frank Fuchs, Project Mgr., P.O. Box 193, Canoga Park, CA 91305
Clarence Semple, Canyon Research Group, Inc., 741 Lakefield Rd., Suite B, Westlake Village, CA 91361
Ms. Sarah A. S. Goldberg, Perceptronics, Inc., 6271 Variel Ave., Woodland Hills, CA 91367
Ms. Betti Huls, Lockheed-California Co., P.O. Box 551, Burbank, CA 91520
John P. Charles, ICON, Inc., 3401 Bangor Pl., San Diego, CA 92106
Instructional Science & Development, Inc., 5059 Newport Ave., Suite 303, San Diego, CA 92107
Miss Nancy Marcue, Veda, Inc., 7851 Mission Center Court, Suite 320, San Diego, CA 92108
Pace Systems, Inc., 3737 Camino Del Rio South, Suite 411, San Diego, CA 92108
TRACOR, Inc., 3420 Kenyon St., Suite 209, San Diego, CA 92110
Advanced Technology, Inc., 2120 San Diego Ave., Suite 105, San Diego, CA 92110
Mr. Raymond E. Marshall, Educational Technologist, Essex Corporation, San Diego Facility, 3211 Jefferson St., San Diego, CA 92110
Mantech International Corporation, ATTN: Mr. Robert L. McCauley, 8352 Clairemont Mesa Blvd., San Diego, CA 92111
Mr. Steve Osborne, Allen Corporation, 8361 Vickers St., Suite 307, San Diego, CA 92111
Systems Exploration, Inc., ATTN: Mr. K. Reilly, Senior Systems Engineer, 4010 Morena Boulevard, San Diego, CA 92117
Dr. H. Harvey Cohen, Program Director, Safety Sciences, Division of WSA, Inc., 7586 Trade St., San Diego, CA 92121
Mr. R. Dupree, Cubic Corporation, Defense Systems Div., 9333 Balboa Ave., San Diego, CA 92123
Mr. David A. Miller, Cubic Defense Systems Div., 9333 Balboa Ave., San Diego, CA 92123
Mr. D. C. Snyder, Cubic Corp., Defense Systems Div., 9333 Balboa Ave., San Diego, CA 92123
Dr. Gerhard Eichweber, Kurt Eichweber PRÄZISIONSGERÄTEWERK, Schuetzenstr. 75-85, 2000 Hamburg 50, W. GERMANY
Dr. Robert C. Feuge, Mathetics, Inc., P.O. Box 26655, San Diego, CA 92126
Mr. Jeff Punches, Mathetics, Inc., P.O. Box 26655, San Diego, CA 92126
SAI Comsystems Corp., P.O. Box A-81126, San Diego, CA 92138
Mr. Robbin Halley, LOGICON, P.O. Box 80158, San Diego, CA 92138
Mr. Steve Seidensticker, LOGICON, 4010 Sorrento Valley Blvd., San Diego, CA 92138
Mr. L. D. Egan, LOGICON, Inc., P.O. Box 80158, San Diego, CA 92138
Mr. Terry Kryway, LOGICON, Inc., P.O. Box 80158, San Diego, CA 92138
Dr. Steven K. Null, Hughes Aircraft, Ground Systems Group, P.O. Box 3310, MS 688/N104, Fullerton, CA 92634
John J. McMullen Assocs., Inc., ATTN: Mr. R. S. Downey, 2021 Sperry Ave., Suite 30, Ventura, CA 93003
Human Factors Research, Inc., ATTN: Dr. R. Mackie, 5775 Dawson Ave., Goleta, CA 93017
Ms. Katie M. Dent, Lockheed Missiles & Space Co., Organization 6653, Bldg. 573, Sunnyvale, CA 94086
AMETEK, Offshore Research & Engineering Div., ATTN: Mrs. Marjorie A. Burnight, 1224 Coast Village Circle, Santa Barbara, CA 93108
Mr. C. Dennis Wylie, Human Factors Research, Inc., 5775 Dawson Ave., Goleta, CA 93117
HumRRO/Western Div., Carmel Ofc., 27857 Berwick Drive, Carmel, CA 93923
Pulsepower Systems, Inc., ATTN: Ms. Patricia M. Snowden, Librarian, 815 American St., San Carlos, CA 94070
Mr. D. P. Germeraad, Mgr., Dept. 57-70, Bldg. 568, Lockheed Missiles & Space Co., Inc., P.O. Box 504, Sunnyvale, CA 94086
Mr. Bill Hefley, Lockheed Missiles & Space Co., Organ. 60-95, Bldg. 156A, P.O. Box 504, Sunnyvale, CA 94088
Mr. Henry T. Manning, 1293 Shelby Drive, Fairfield, CA 94533
Sperry Support Services, West Coast Operations, Bldg. 92, Benicia Industrial Park, Benicia, CA 94510
Mr. John B. Sinacori, P.O. Box 1043, Hollister, CA 95023
FMC Corporation, ATTN: Mr. Carl W. Sullinger, Mail Drop 770, 1105 Coleman Ave., Box 1201, San Jose, CA 95108
Ms. Winnie Lee, Crew Systems & Simulation Technology, The Boeing Aerospace Co., P.O. Box 3999, M.S. 82-87, Seattle, WA 98124
Mr. Wolf J. Habenstreit, Crew Systems & Simulation Technology, The Boeing Aerospace Co., P.O. Box 3999, M.S. 82-87, Seattle, WA 98124
Dr. D. G. Pearce, Behavioral Sciences Div., Defense & Civil Institute of Environmental Medicine, P.O. Box 2000, Downsview, Ontario M3M3, CANADA
Colonel M. D. Calnan (DODC), National Defense Headquarters, Ottawa, Ontario, CANADA K1A 0K2
Dr. Eldad Gal-Ezer, Israel Aircraft Industries Ltd., Electronics Div./MBT Weapons Systems, P.O.B. 105, Yehud 56000, ISRAEL