Paul Moore, P.E., Cdr. Eric Runnerstrom, USN (retired), Timothy Shaw, and Ryan Downs

Navy Human Computer Interface (HCI) Design Principles and Processes

ABSTRACT
As technology advances and manning aboard ships is reduced, manual operations and analog equipment are being replaced by digital controls, automation, and remote human computer interfaces (HCI) as the primary means of operating the ship. In such an environment, the ship must be optimized so that the hardware, software, and crew complement one another. The HCI, therefore, is increasingly vital to ship performance, particularly in dynamic, stressful scenarios that require operator decisions and actions. This paper outlines some of the key HCI design principles and lessons learned through Navy and industry experience.

INTRODUCTION
The use of HCIs aboard Navy ships has increased steadily over the past twenty years. When first introduced, HCIs were used rarely and only with the most cutting-edge ship systems; now HCIs are expected to be the default (and sometimes only) means of user interaction with ship systems.

The existing Navy HCI standards and guidance, on the other hand, have not kept up with the pace of technology being installed in ships. Early technologies limited the HCIs to displaying only text, with little variety in the font style, size, or color. As display resolution and color depth increased, HCIs continued to incorporate more detailed graphs, maps, charts, pictures, video, and even three-dimensional models. These relatively recent advances in affordable HCI technology provide the capability to effectively convey substantial amounts of information to the crew. With this capability to produce exceptionally good HCIs comes the flexibility to produce exceptionally poor HCIs.

HCI design specifications should include testable performance requirements representative of the operations the HCI will support. More often than not, HCIs are designed to comply with applicable military guidance and standards, but these provide only general information about human factors and workstation ergonomics. HCI designers are forced to develop supplemental documentation to support their particular applications. When considered in isolation, the quality of the resultant HCIs may be acceptable, especially when each is evaluated only for its designated function. However, compatibility with other HCIs typically is not invoked as a requirement when HCIs are developed for new systems. When several independently developed HCIs are combined to create an overall interface between the crew and ship hardware and software, inconsistencies among the various interfaces can degrade human performance and increase the costs of manpower, training, design, and maintenance.

The variety of different HCIs aboard ships today, some good and some not so good, demonstrates that existing guidance does not consistently produce good-quality displays, nor does it produce displays with the consistent look and feel needed to achieve good human performance.

MEASURABLE EFFECTS OF HCI DESIGN
As with any other part of an integrated system design, poor HCI design can degrade overall system performance and increase total ownership cost. Minor deficiencies in non-critical HCI applications may only amount to an annoyance, while significant deficiencies can result in unacceptable performance.

Of course, the HCI is only one factor that leads to correct operator actions and adequate performance. Consider an HCI intended for air defense situation awareness. One validation of that HCI might evaluate the time it takes an operator to detect a new track and correctly identify its affiliation. In addition to the HCI, system performance would be influenced by the operator's training, stress levels, workload, communication with other people and systems, etc. Factors such as these must be considered when defining system requirements and how the HCI will be designed and tested.
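
As an illustration of how such a requirement could be made testable, the sketch below derives detection-and-identification latency from a time-stamped event log recorded during a validation trial and compares it to a specified threshold. This is only a minimal sketch; the event names, log format, and the 15-second threshold are hypothetical, not values from any Navy specification.

# Minimal sketch: derive detection-and-identification latency from a trial event log.
# Event names, log format, and the threshold are hypothetical, for illustration only.
from statistics import mean

# Each tuple: (seconds from scenario start, event name, track identifier)
event_log = [
    (10.0, "TRACK_APPEARS", "TN4001"),
    (14.2, "OPERATOR_CLASSIFIES", "TN4001"),
    (31.5, "TRACK_APPEARS", "TN4002"),
    (48.9, "OPERATOR_CLASSIFIES", "TN4002"),
]

REQUIRED_MAX_LATENCY_S = 15.0  # hypothetical performance requirement

def identification_latencies(log):
    """Return seconds from track appearance to operator classification, per track."""
    appeared, latencies = {}, {}
    for t, event, track in log:
        if event == "TRACK_APPEARS":
            appeared[track] = t
        elif event == "OPERATOR_CLASSIFIES" and track in appeared:
            latencies[track] = t - appeared[track]
    return latencies

latencies = identification_latencies(event_log)
worst = max(latencies.values())
print(f"mean latency {mean(latencies.values()):.1f} s, worst {worst:.1f} s")
print("requirement met" if worst <= REQUIRED_MAX_LATENCY_S else "requirement NOT met")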

Human Error
In March of 1979, Three Mile Island Nuclear Generating Station, Unit 2 experienced a series of events that led to the partial meltdown of the reactor core and a release of nuclear material to the atmosphere. Subsequent analyses of the events indicate that the accident could have been averted if operators had taken the appropriate actions.

Until that point, plant designers had assumed that operators would not take steps that were detrimental to safe operation, accidentally or otherwise. The Nuclear Regulatory Commission (NRC) set out to quantify this risk of human error so that other existing plants could continue to operate safely. The NRC developed a human reliability analysis methodology, which considers the probability that operators will take inappropriate actions or fail to take necessary actions (i.e., operator error), the likely consequences of these errors, and any additional factors that could instigate or mitigate potential accidents.

The human reliability analysis methodology is described in NUREG/CR-1278. While that document, published in 1983, focuses on nuclear power plant interfaces, its results remain clearly relevant today for HCIs in the Navy and elsewhere. The quality of the operator's interface has a significant effect on performance and on the likelihood of human error.
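
NUREG/CR-1278 provides detailed human error probability (HEP) tables and performance shaping factors; the sketch below shows only the simplified arithmetic behind the idea, assuming independent error probabilities for the steps of a procedure. The probabilities and the HCI shaping factor are illustrative assumptions, not values from the handbook.

# Simplified sketch of combining per-step human error probabilities (HEPs).
# Assumes independent steps; HEP values are illustrative, not taken from NUREG/CR-1278.
step_heps = {
    "read alarm correctly": 0.003,
    "select correct valve": 0.001,
    "operate valve correctly": 0.002,
}

def probability_of_any_error(heps, shaping_factor=1.0):
    """P(at least one step fails), scaling each HEP by a performance shaping factor."""
    p_success = 1.0
    for p_error in heps.values():
        p_success *= 1.0 - min(1.0, p_error * shaping_factor)
    return 1.0 - p_success

print(f"P(any error), nominal HCI: {probability_of_any_error(step_heps):.4f}")
print(f"P(any error), poor HCI:    {probability_of_any_error(step_heps, shaping_factor=5.0):.4f}")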

The complexity of a situation and the need for rapid decisions as the situation changes increase the likelihood of operator error. A poor HCI exacerbates such situations, slowing the operator's response and increasing, possibly significantly, the chance of human error. Such delays and errors during critical situations (e.g., combat or damage control) can have catastrophic consequences.

Human System Performance
While two design solutions can be compared for a measure of their relative performance or effectiveness, it can be challenging to measure the human performance enabled by an HCI against absolute criteria. The ability to practically measure and test performance criteria must be considered when preparing HCI design specifications. Nevertheless, worthwhile aspects of human performance can be measured, at least in terms of the overall system (i.e., hardware, software, and people). The Chief of Naval Operations has directed the Navy to develop a Human Systems Performance Assessment Capability (HSPAC) to coordinate Navy efforts to capture and validate human systems performance. HSPAC has begun by developing a taxonomy of human system performance measures and a repository to store those measures (Winters, 2007).
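
The cited repository work suggests what such a measure record might need to capture. The following is a minimal, hypothetical sketch of a single repository entry; the field names and example values are assumptions, not the HSPAC schema.

# Hypothetical sketch of a human-system performance measure record of the kind a
# repository such as HSPAC might store. Field names and values are illustrative only.
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    name: str               # what is measured
    units: str              # how it is expressed
    task_context: str       # the operator task it applies to
    collection_method: str  # how the data are gathered

example_measure = PerformanceMeasure(
    name="time to classify a new air track",
    units="seconds",
    task_context="air defense situation awareness",
    collection_method="instrumented usability trial",
)
print(example_measure)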

Generic human system performance metrics, such as workload and situation awareness, tend to be somewhat subjective and difficult to measure; therefore, they are difficult to enforce. Discrete, measurable performance requirements, such as response time, accuracy, or probability of erroneous action, are specific to the tasks being evaluated. Consequently, the expected operators and tasks should be defined during concept design and system specification, at least in terms of a concept of operations, well before HCI development begins.

Training
Poor or inconsistent HCI design also can increase training costs. Some programs have knowingly accepted HCI deficiencies, reasoning that the design could never be perfect and that training programs would compensate for the HCI flaws and enable adequate operator performance. This approach increases the training burden, which must be borne continuously as long as the HCI is in the Fleet. Future equipment may be designed for compatibility with the deficient design, perpetuating the flaws for generations of equipment. Correcting the HCI, on the other hand, would be a one-time cost, and likely would be considerably less expensive than the recurring increase in training costs.

An HCI that is intuitive to the user, that is, one that the user naturally understands, typically would not require extensive training for an operator to achieve some minimally acceptable level of performance. As an example gauge of an intuitive HCI design, operators should be able to begin using the new HCI with less than 10 minutes of training, provided they have a good understanding of the intended ship system or function. Note that this 10-minute criterion is simply a rule of thumb based on experience with a usability test of a particularly effective HCI. Of course, a novice user would not have the level of performance of an experienced user. The point is that an intuitive HCI requires minimal training.

Commonality or compatibility between HCIs also is important given the current optimized manning initiatives. On newer ships like LCS and DDG 1000, personnel will be required to perform a variety of tasks on different systems. If the ship's HCIs were designed for compatibility, training programs could be consolidated and sailors could more quickly assume new roles. Otherwise, sailors would need more individual training programs and would likely exhibit reduced performance when changing roles.

LESSONS LEARNED
HCI designs have contributed to poor performance in a number of well-documented Navy and industrial applications. The following section does not capture all of the lessons learned, but is simply intended to illustrate some of the more common discrepancies.

Perceived User Needs
From our experience in developing displays and from assessments of displays developed by others, we conclude that HCI developers have a tendency to produce displays that are intuitive to the engineers who design the system, as opposed to the expected ship operators. Such displays tend to be oriented toward understanding the intricacies of the system and convey extensive, in-depth information about the system state. We have learned that such displays actually make the operator's task more difficult because the display is unnecessarily complex.

The shipboard operator needs to quickly and easily find the information needed to make a decision or perform a task. This information generally is a consolidated, simplified representation of the system, often much simpler than all of the information a design engineer would like displayed. By displaying unnecessary or irrelevant information, the HCI makes it more difficult for the operator to find the requisite information and make important decisions, increasing response time and increasing the probability of operator error.

The HCI developer must start the design with a good understanding of the operator's need for information and exercise the discipline to limit displayed information to only what is needed to support operator tasks.

The example HCI in Figure 1 illustrates how the HCI designers' perception of the system does not necessarily match the operators' mental model. This HCI is used to operate a JP-5 fuel transfer system aboard an aircraft carrier. The system uses purple lines to depict the fuel transfer lines, because NSTM 520 dictates that JP-5 pipes and valve handwheels shall be painted purple. Users must look at valve and pump alignment to manage fuel transfer, processing, supply, and tank ballasting operations. Rows of icons across the top and bottom of the display show the inventory and status of each tank along the port and starboard sides of the ship.

Figure 1 – HCI for JP-5 system aboard USS John C. Stennis (CVN 74). Photo by MCSN Josue Leopoldo Escobosa.

The JP-5 system can be configured to support a wide range of operations, but it is difficult to discern what operations are currently ongoing and what transfer lines are available. Static information is provided as bright purple lines, while dynamic information about valve positions appears less prominent. The design engineers might be able to determine configuration at a glance, but it is likely that the typical user will need to trace the intertwined pipes frequently to understand the system alignment.

Figure 2 – JP-5 system inventory management. Photo by MCSN Josue Leopoldo Escobosa.

The HCI may be appealing in normal office lighting conditions, but it appears to be the brightest source of light in this darkened space. To support daylight operations, a bright display might be necessary, but the window border and purple lines seem to provide too much contrast against the black background in this application. Since the system is used in darkened conditions, a dimmer display (or a dimming capability) might be warranted.
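
One way to put a number on the brightness and contrast concern is to compute the luminance contrast ratio between display colors and the background, for example with the WCAG relative luminance formulas as a rough screening check. The colors and the acceptance limit below are illustrative assumptions, not values from NSTM 520 or the CPL Guide.

# Rough screening check of luminance contrast between display colors, using the
# WCAG 2.x relative-luminance and contrast-ratio formulas. Colors and the limit
# for darkened-space use are illustrative assumptions.
def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as 0-255 components."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    lighter, darker = sorted((relative_luminance(color_a), relative_luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

BLACK_BACKGROUND = (0, 0, 0)
MAX_RATIO_FOR_DARKENED_SPACE = 7.0  # illustrative limit for night operations

for name, color in [("bright purple line", (200, 0, 255)), ("dimmed purple line", (80, 0, 100))]:
    ratio = contrast_ratio(color, BLACK_BACKGROUND)
    verdict = "acceptable" if ratio <= MAX_RATIO_FOR_DARKENED_SPACE else "too bright"
    print(f"{name}: contrast ratio {ratio:.1f}:1 ({verdict})")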

Combining Without Integrating
Many existing HCIs have been developed as stand-alone systems, apparently without any requirement for compatibility with other existing or developmental systems. It also appears that sufficient time and funding were not allocated to ensure that the HCI would enable good human performance. The lack of integration can be particularly onerous for the ship's crew when a ship is modernized and becomes a mix of older legacy systems and newly installed, state-of-the-art systems.

For example, when AEGIS class cruisers were first designed, the information provided to the 20 (or more) operators in the Combat Information Center (CIC) was consolidated from a variety of sources and presented in different locations and formats. Operators were required to integrate the information mentally and make decisions, while communicating with other personnel inside CIC and aboard nearby ships.

Figure 3 – Combat Information Center aboard USS Vincennes (CG-49). Photo by Tim Masterson.

On July 3, 1988, the USS Vincennes shot down Iran Air Flight 655 (IR655) based on a mistaken identification as a hostile aircraft. The flight profile suggested that IR655 was headed in a descending path directly toward the Vincennes, typical of an Iranian fighter. IR655 was actually ascending along a published commercial air corridor. It appears that the system automatically switched the track number for IR655 from 4474 to 4131. Some users were not aware of the switch and reported the rate of descent for track number 4474, which had been newly reassigned to an American A-6 several hundred miles away. Information about the commercial air corridor also was not integrated with the other track information, since it was available only in print form.

Mimicking Prior Designs
When modifying or designing a new HCI for an existing application, it is important to carefully define the scope and functional behavior of the HCI. In an effort to avoid the cost of rigorous analyses, some HCI projects have created requirements for the new design to have the same functionality as the existing design.

This type of requirement forces a bottom-up design approach, which is bound to perpetuate the poor performance resulting from the existing, inadequate design. Additionally, it restricts the design team's ability to improve upon the design based on lessons learned from the existing design. Contrary to the project team's belief, mimicking the previous design is likely to cause inadequate performance and capability, which could increase the system's total lifecycle cost.

DESIGN PRINCIPLES
There has been a general recognition in the Navy that shipboard HCIs need improvement. To address this need, the Human Systems Integration Directorate in the Naval Sea Systems Command (NAVSEA) developed the Common Presentation Layer (CPL) Guide. The two key objectives of the CPL Guide are to enable compatibility of HCIs across all shipboard operational displays and to foster the application of good human factors in the development of HCIs. The CPL Guide is based on Navy and DoD standards, HCI style guides from several Navy programs, and standards from other industries such as the commercial nuclear power industry. The CPL Guide also went through extensive review throughout the Navy community, and it was signed and distributed as NAVSEA Standard 03-01 on September 18, 2006. In addition, ASTM F1166, "Standard Practice for Human Engineering Design for Marine Systems, Equipment, and Facilities," was reviewed and revised to be consistent with the CPL Guide, and NAVSEA is investigating incorporating CPL and ASTM F1166 references into the American Bureau of Shipping Naval Vessel Rules.

Section 2 of the CPL Guide includes a comprehensive set of general HCI design principles, which explain that HCIs should be intuitive, consistent with user expectations, and designed to support operators' tasks. Although the CPL Guide does not address the HCI design process explicitly, it does recommend the use of a formal, rigorous, and methodical design process. These topics are discussed further below.

Intuitive
Operators develop a mental model to describe the system(s) they monitor and control. They use the model to develop an awareness of the current situation based on input from the HCI and other outside sources and to project the outcome of various system changes. Operators use this understanding of the system to decide upon the best possible course of action.

If the HCI is not aligned with the operators' mental model, understanding the current situation requires a great deal of mental (cognitive) workload. For example, operators might need to consult a legend to understand the significance of each symbol and color code used. This translation – from displayed information to the operators' mental model – increases the operator's mental workload, resulting in a slower response, a higher risk of errors, and a reduced capability to successfully accomplish multiple tasks. Such deficiencies are exacerbated, perhaps significantly, in complex, dynamic, stressful situations such as combat or damage control. An intuitive HCI minimizes the effort operators need to develop situation awareness and make decisions, enabling more reliable performance from less experienced users, even under stress.

To develop displays that match the user's mental model, HCI designers need to understand and consult with the intended users and support communities, including training and maintenance support. For example, designers should take advantage of the mental model developed and reinforced by an existing training program. Conversely, as the new HCIs are developed, designers should ensure that training programs are revised to support the new HCIs.

Different operators should be expected to have slightly different mental models, such that real displays will not be able to exactly match every operator's mental model. Consequently, the cognitive workload needed to translate display information will never be zero for all operators, particularly when conveying abstract ideas. Designers should develop simple, intuitive HCIs that will minimize the operators' workload needed for translation. This will improve operator reliability and situation awareness, while increasing the scope of tasks that can be adequately handled. The HCI should be designed to reinforce and maintain the mental model, using methods such as labeling windows and grouping related items together.

The intended users may have several different mental models. For example, some users might focus their mental model on the spatial orientation of the system components, while others focus on a sequence of actions that need to be performed. Usability testing may be needed to determine which model or models should be supported by the HCI; a minimal comparison sketch follows the list below. Among other metrics, usability testing should consider whether:

• User performance is acceptable or better with a given HCI variation (i.e., mental model);

• Users are comfortable or satisfied with the tested HCI;

• The sample users are representative of those who will eventually use the HCI (note that designers generally are not considered "representative users," because they benefit from special knowledge about the HCI and system); and

• Additional design refinements and usability testing are warranted.
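
As a minimal sketch of how the results of such a test might be summarized, the code below compares task completion times and error counts for two hypothetical HCI variants built around different mental models (a spatially organized layout versus a sequence-oriented layout). The data and variant names are invented for illustration; summary statistics like these inform, but do not replace, the judgments listed above.

# Minimal sketch: summarize usability-test results for two hypothetical HCI variants,
# one organized spatially and one organized around task sequence. Data are invented.
from statistics import mean, stdev

results = {
    "spatial layout": {"times_s": [42, 55, 38, 61, 47], "errors": [0, 1, 0, 2, 0]},
    "sequential layout": {"times_s": [35, 40, 33, 52, 37], "errors": [0, 0, 1, 0, 0]},
}

for variant, data in results.items():
    times = data["times_s"]
    errors_per_trial = sum(data["errors"]) / len(data["errors"])
    print(f"{variant}: mean time {mean(times):.1f} s (sd {stdev(times):.1f}), "
          f"errors per trial {errors_per_trial:.1f}")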

It is important to ask users about the way they conceptualize the system (i.e., their mental model) early in the design process, preferably before showing them any example concepts. Example concepts can bias the operators' description of their mental model. Additionally, users tend to be enthusiastic about aesthetically pleasing HCI concepts, even if they would not promote the best performance.

Consistent
Consistency is a very important factor for HCIs that may be used in stressful situations. Humans are good at recognizing patterns as well as differences and changes in perceived patterns. If similar information is presented in different ways, the user may infer differences that do not exist. For example, if a user is shown the speed of five pumps and only one value is displayed in bold, the user may assume, erroneously, some increased importance for, or problem with, that pump. Inconsistent methods for display navigation also increase operator error probability and slow operator response time. Difficulties such as these will be exacerbated under stressful situations.

Minor inconsistencies, such as those caused by display hardware, software, and ambient conditions, are to be expected. The realistic expectation is to achieve compatibility of HCIs, while minor inconsistencies can be ignored. Note that the terms "compatibility" and "minor inconsistencies" are subjective and context-sensitive. As an example, two TACSIT displays may use different shades of red to denote hostile contacts. The two shades of red may be "inconsistent" on some level, but could be considered compatible if users can seamlessly switch between the two systems. Alternatively, if one shade appears pink, the user may ascribe different meanings to the two shades of red, resulting in an incompatibility.
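
In software terms, one common way to promote this kind of compatibility is to define symbology and color assignments once, in a shared presentation-layer module that every display imports, rather than letting each HCI hard-code its own values. The sketch below is a hypothetical illustration of that idea, not an excerpt from the CPL Guide; the colors and symbol names are assumptions.

# Hypothetical shared style definitions that multiple shipboard displays could import,
# so that color and symbol assignments stay consistent across HCIs.
# The specific values are illustrative, not taken from the CPL Guide.
from dataclasses import dataclass

@dataclass(frozen=True)
class TrackStyle:
    color_rgb: tuple  # display color as 0-255 RGB components
    symbol: str       # symbol identifier used by the rendering layer

SHARED_TRACK_STYLES = {
    "hostile": TrackStyle(color_rgb=(255, 0, 0), symbol="diamond"),
    "friendly": TrackStyle(color_rgb=(0, 128, 255), symbol="circle"),
    "unknown": TrackStyle(color_rgb=(255, 255, 0), symbol="square"),
}

def style_for(affiliation: str) -> TrackStyle:
    """Every display resolves track styling through this single lookup."""
    return SHARED_TRACK_STYLES[affiliation]

print(style_for("hostile"))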

Task Oriented
In a well-designed system with effective human-system integration, each operator should be assigned a unique set of roles and responsibilities, giving each operator a unique perspective of the ship. Therefore, each operator needs to access different sets of information and controls to complete their job. Even when several users need access to the same information, they may each consider that data in terms of a different mental model, as described above. For example, many Navy operators need to view a map of the area surrounding the ship. However, a navigator, a remote vehicle operator, and an anti-aircraft warfare officer will be interested in different ranges and sets of supporting information. Each HCI should be tailored to the specific functions assigned to each operator.
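
As a simple illustration of such tailoring, a display configuration could be keyed to the operator's role, with each role carrying its own default map range and information overlays. The roles, ranges, and overlay names below are hypothetical examples, not requirements from any Navy program.

# Hypothetical role-tailored display configurations: each operator role gets its own
# default map range and overlay set. Roles, ranges, and overlay names are illustrative.
from dataclasses import dataclass, field

@dataclass
class DisplayConfig:
    map_range_nm: float            # default map range in nautical miles
    overlays: list = field(default_factory=list)

ROLE_CONFIGS = {
    "navigator": DisplayConfig(12, ["charted hazards", "planned track"]),
    "remote vehicle operator": DisplayConfig(5, ["vehicle position", "datalink status"]),
    "anti-aircraft warfare officer": DisplayConfig(128, ["air tracks", "engagement zones"]),
}

def config_for(role: str) -> DisplayConfig:
    """Look up the tailored display configuration for an operator role."""
    return ROLE_CONFIGS[role]

print(config_for("navigator"))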

Good Design Practices
As with software development in general, the quality of the HCI design process has a significant influence on the quality of the final design. Ideally, the HCI design process should be an integral part of the entire system design process. The following guidance documents should be applied to the extent possible:

• MIL-HDBK-46855 (Department of Defense 1999),

• The Human System Integration Guide (Virtual SYSCOM 2005), and

• The HSI Top-Down Requirements Analysis for Ship Manpower Reduction (NAVSEA 2000).

In summary, these documents call for HCI development projects to ensure that sufficient budget and schedule are provided to accommodate a rigorous design process. The design process should include:

• Analyses to understand the application and support the HCI design,

• Early and frequent input to the design from all relevant stakeholders, particularly potential users,

• The definition of testable performance requirements,

• Integration to ensure compatibility with other ship systems (including hardware, software, and people),

• Realistic, rigorous usability testing, and

• Iteration to correct and refine the HCI as a result of testing and user inputs.

Experience with less successful HCI design projects has helped identify some common issues with their design processes:

• Process steps are skipped completely or accomplished in a superficial manner.

• Process steps are performed to an extreme level of detail. While this might not generate a bad design, it is unnecessarily costly.

• Process steps are performed out of order.

• As a corollary to the previous item, many organizations expect to conduct testing only once, when the design is nearly complete. Usability testing should begin with early prototypes and continue through final validation.

• The process is too rigid to allow for inevitable redesign, or cost and schedule concerns are allowed to override the correction of design deficiencies.

• Relevant stakeholders are not involved throughout the process.

CONCLUSIONS
New HCIs are often developed to support unique applications and to take advantage of the newest technologies. Consequently, designers often are focused on advancing the state of the art and may not pay a great deal of attention to relevant Navy experience and guidance for HCI design and design processes. Even when designers follow applicable guidance, compatibility with existing systems may not be a defined requirement, so it may not be considered. Prior to the CPL Guide, available guidance for HCIs left designers with wide latitude, resulting in very different HCIs being produced in accordance with the same standards. Since Navy HCIs are developed by numerous developers using different technologies, these HCIs tend to be incompatible with one another and cannot be integrated into an effective system. The resulting poor operator performance – in terms of slow response times and errors in decision making – can have catastrophic consequences in stressful situations such as combat and damage control.

The Common Presentation Layer Guide provides a source of HCI guidance that, when applied, will move the Navy toward more compatible HCIs, with the good human factors that enable effective operator performance. The CPL Guide enables improved human performance and reduced training costs. It also helps reduce the cost of HCI development and upgrades, since most programs could simply apply the CPL Guide in lieu of preparing their own HCI style guide.

REFERENCES
Department of Defense, HSI Virtual SYSCOM Working Group, Human System Integration Guide, May 2005.

Department of Defense, MIL-HDBK-46855A, Human Engineering Program, Process, and Procedures, May 17, 1999.

Department of Defense, NAVSEA 05H (formerly NAVSEA 03), HSI Top-Down Requirements Analysis for Ship Manpower Reduction, 2000.

Department of Defense, NAVSEA Standard 03-01, Common Presentation Layer Guide, September 2006.

U.S. Nuclear Regulatory Commission, NUREG/CR-1278, Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, Final Report prepared by Sandia National Laboratories, A. D. Swain and H. E. Guttmann, August 1983.

Winters, John and Pester-DeWan, Dr. Joanne, Developing a Repository for Human Performance Measures: Possibilities, Preconditions, and Pitfalls, ASNE HSI Symposium 2007.


ACKNOWLEDGEMENTS
The authors would like to thank all of the supporters of and contributors to the Common Presentation Layer Guide, and especially Mr. J. Robert Bost of Serco, Inc. and Dr. Robert Beaton of NAVSEA 05H.

Paul Moore, P.E. is the principal author and an employee of MPR Associates, Inc. His work focuses on Human Systems Integration and control systems engineering for the U.S. Navy and the commercial nuclear power industry. He is also the principal author/editor of the Common Presentation Layer Guide.

Cdr. Eric Runnerstrom, USN (retired) has over 35 years of training and experience in naval engineering, with an emphasis in ship survivability, damage control, firefighting, supervisory control systems, and human systems integration. He holds a master's degree in naval engineering from the Massachusetts Institute of Technology.

Timothy Shaw is an engineer with MPR Associates, Inc. His work focuses on damage control engineering and software development for the U.S. Navy. He is also one of the principal developers of the Damage Modeler software application, one of the three software components of the Damage Control Tactical Management System (DCTMS).

Ryan Downs is an engineer with MPR Associates, Inc. Mr. Downs has been involved in a variety of projects for the U.S. Navy and the nuclear power industry emphasizing hardware and software design, distributed control system design and qualification, equipment qualification testing, fluid system automation, network design, shipboard test plan execution, valve diagnostics, and ship survivability design evaluations.