Lessons Learned Applying the Mission Diagnostic



Lessons Learned Applying the Mission Diagnostic

Audrey Dorofee
Lisa Marino
Christopher Alberts

    March 2008

TECHNICAL NOTE
CMU/SEI-2008-TN-004

Dynamic Systems
Unlimited distribution subject to the copyright.


    This work is sponsored by the U.S. Department of Defense.

The Software Engineering Institute is a federally funded research and development center sponsored by the U.S. Department of Defense.

    Copyright 2008 Carnegie Mellon University.

    NO WARRANTY

THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

    Use of any trademarks in this report is not intended in any way to infringe on the rights of the trademark holder.

Internal use. Permission to reproduce this document and to prepare derivative works from this document for internal use is granted, provided the copyright and "No Warranty" statements are included with all reproductions and derivative works.

External use. Requests for permission to reproduce this document or prepare derivative works of this document for external and commercial use should be addressed to the SEI Licensing Agent.

This work was created in the performance of Federal Government Contract Number FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 252.227-7013.

For information about purchasing paper copies of SEI reports, please visit the publications portion of our Web site (http://www.sei.cmu.edu/publications/pubweb.html).


    Table of Contents

Abstract

1 Introduction
  1.1 Overview of Mission Diagnostic
  1.2 Contents of this Technical Note

2 Overview of the Conducted Assessment
  2.1 Project Description
  2.2 Purpose of this Assessment
  2.3 Mission Diagnostic Process Used
  2.4 Participant Composition

3 Tailoring the Mission Diagnostic
  3.1 Product and Process Risk Analysis
  3.2 Driver Superset for a New Domain
  3.3 Teleconferencing for Interviews
  3.4 Outcome-Based Scenario Analysis

4 Results of the Conducted Assessment
  4.1 Three Outcome Scenarios
  4.2 Long-Term Issues

5 Lessons Learned
  5.1 New Set of Drivers
  5.2 Wording of Driver Questions
  5.3 New Technique for Interviewing
  5.4 Success of Outcome-Based Scenario Analysis
  5.5 Communications and Logistics
  5.6 Sponsors for this Assessment

6 Summary
  6.1 Mission Diagnostic and In-Depth Assessments
  6.2 Outcome-Based Scenario Analysis Drives New Protocol
  6.3 Basis for New Framework for Risk

References


    Acknowledgments

The authors wish to thank Linda Levine, Melissa Kasan Ludwick, Jeannine Siviy, and Mary-Catherine Ward of the Software Engineering Institute for their reviews of and comments on this document.


    Abstract

The Mission Success in Complex Environments (MSCE) research team, along with Acquisition Support Program (ASP) staff, applied the Mission Diagnostic (MD) to a project for the development and broad deployment of a software application. The purpose of this MD engagement was to conduct a rapid evaluation of the Application project's potential for success during its alpha-phase deployment, knowing additional deployments were to follow quickly. Software development and deployment was a new domain for applying the MD, and this particular project included additional constraints. Therefore, we modified the basic MD in several ways. We developed a new set of MD drivers for use with software development and deployment projects, used teleconferencing to collect data from project personnel, and included outcome-based scenarios to assess a variety of mission outcomes. From this experience, we have also derived the basis for a new success-driven framework with an integrated risk perspective. This technical note describes the adaptation of the MD necessary for this customer and the lessons we learned from its use.


1 Introduction [1]

The purpose of this technical note is to describe the lessons that were learned from applying the Mission Diagnostic (MD) to a customer's software development and deployment project. The Mission Success in Complex Environments (MSCE) research team, along with Acquisition Support Program (ASP) staff, applied the MD to a project for the development and broad deployment of a software application just prior to its initial, or alpha, deployment. Software development and deployment was a new domain for applying the MD, and this particular project included additional constraints. Therefore, we modified the basic MD in several ways. This technical note describes the adaptation of the MD necessary for this customer and the lessons we learned from its use. These lessons are from the first application of the MD with this customer; lessons from future use of the MD with this customer may be documented at a later date. The information provided in this technical note has been sanitized to protect the customer's identity and any sensitive information.

    1.1 OVERVIEW OF MISSION DIAGNOSTIC

The MD is a risk-based assessment for evaluating a mission's current condition and determining whether it is on track for success. This time-efficient assessment can be applied to projects, programs, and processes across the life cycle and throughout the supply chain. An MD assessment is straightforward to conduct, and the basic version can be self-applied by people who are responsible for overseeing projects, programs, and processes.

By providing a broad overview of the state of possible success for a mission, the MD can be viewed as a first-pass screening of a project, program, or process to diagnose any circumstances that might affect its potential for success. More detailed, follow-on evaluations might be required when a mission's potential for success is judged to be unacceptable.

The MD uses a driver-based approach for gauging a mission's potential for success. In the MD context, a driver [2] is a quantitative or qualitative parameter that provides an indirect measure of a broad concept. Drivers are used to measure the current conditions affecting a given mission. In the MD, a five-point scale is used to establish a value for each driver. A simple algorithm based on these drivers is used to estimate the potential for success and the degree of progress toward the desired outcome.

An important aspect of an MD assessment is time efficiency. To ensure that the assessment can be completed in a reasonable amount of time, you need to keep the number of drivers small. Experience has shown that good results are achieved by using between 10 and 20 drivers in an assessment.

[1] The Mission Diagnostic overview material presented here is from the technical report Mission Diagnostic Protocol, Version 1.0: A Risk-Based Approach for Assessing the Potential for Mission Success [Alberts 2008].

[2] An example of a driver (drivers are worded as questions) is "Are people at the alpha deployment site and related sites prepared to operate, maintain, and support the application?"


Each driver represents a key factor that can guide a mission toward success or failure, based on its current state or value. A success driver is a condition or circumstance that guides a mission toward a successful outcome, while a failure driver is a condition or circumstance that guides a mission toward an unsuccessful outcome. Drivers are worded as yes/no questions, where a "yes" answer denotes that the driver is a success driver in this context and "no" denotes a failure driver. [3]

Each mission has a mixture of success and failure drivers currently influencing the eventual outcome. The philosophy underlying the MD is that the relative impact of success and failure drivers can be used to gauge a mission's potential for success. A predominance of success drivers in relation to failure drivers indicates an acceptable potential for success (and vice versa).

The analysis of drivers in an MD assessment can be conceptually broken into the following two parts:

1. Evaluate each driver to determine the extent to which each is present.

2. Analyze the entire set of drivers to determine the mission's potential for success.
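
To make step 2 concrete, the Python sketch below sums five-point driver scores and maps the total onto a coarse rating, in the spirit of the simple algorithm the protocol describes. It is a minimal illustration only: the question wording, point values, and thresholds are our assumptions, not values defined by the MD protocol.

```python
# Illustrative sketch of the simple driver-scoring algorithm described
# above; question wording, point values, and thresholds are assumptions,
# not values taken from the MD protocol.

# Each driver is a yes/no question answered on a five-point scale
# (here 0 = "No" ... 4 = "Yes").
driver_values = {
    "Is the project budget adequate for all activities and resources?": 3,
    "Are people at the alpha site prepared to support the application?": 1,
    # ... a full assessment would use between 10 and 20 drivers
}

def potential_for_success(values):
    """Sum the driver scores and map the total onto a coarse rating."""
    total = sum(values.values())
    fraction = total / (4 * len(values))
    # Assumed thresholds; the protocol defines its own evaluation scale.
    if fraction >= 0.75:
        return "high"
    if fraction >= 0.50:
        return "moderate"
    return "low"

print(potential_for_success(driver_values))  # "moderate" for this toy data
```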

    1.2 CONTENTS OF THIS TECHNICAL NOTE

The contents of this technical note are as follows:

Introduction: an explanation of the purpose of this document and a brief overview of the Mission Diagnostic

Overview of this MD Assessment: customer project description and the purpose and constraints of the initial MD application

Tailoring the Mission Diagnostic: a summary of the types of tailoring that were done to the basic MD to suit its application for this customer and this project

Results of the MD Assessment: a brief summary of the results of assessing this customer project

Lessons Learned: the lessons we learned from applying the MD

Summary: the impacts on our future work

[3] For example, given the driver "Is the project budget adequate for all activities and required resources?", a "No" answer can be viewed as a failure driver and a "Yes" answer as a success driver.


    2 Overview of the Conducted Assessment

    2.1 PROJECT DESCRIPTION

This customer's project was a development and deployment mission to create a replacement application. This replacement application (hereafter referred to as the Application) would be one of a larger suite of new enterprise-wide applications servicing geographically distributed sites. This particular Application focused on enhancing system performance so that customer service representatives could do their jobs more efficiently and in a timely manner. The Application would have a phased rollout (alpha, beta, and enterprise-wide) with increasing functionality and sites in each phase.

The underlying infrastructure and support were relatively new and built to connect the enterprise sites. This project would develop one of the first new applications to integrate with the existing legacy applications and the new architecture. To accomplish this, personnel had to:

Reengineer associated business processes

Learn newer technologies (Java, Web services, etc.)

Interface with several applications being upgraded or replaced

Deliver the same functionality and improve performance

Complete successful integration testing

Retrain end users and support personnel

Deploy at staged intervals with increasing functionality and coverage

Achieve schedule compression

Complete architecture changes to the underlying infrastructure in parallel with development and deployment of the Application

The project had attracted a lot of internal attention as well as the attention of senior internal and external stakeholders, and the pressure to succeed was considerable. Several externally driven changes (e.g., increases in data security requirements and changes to integration and test resources) had occurred, which had already compressed the project's schedule when the Software Engineering Institute (SEI) was brought in to assess the situation.

    2.2 PURPOSE OF THIS ASSESSMENT

The goals of the initial use of the MD with this customer were to make a rapid evaluation of the potential for success for the alpha deployment of the Application and, as a side benefit, to identify any major issues with the rest of the development and deployment process and plan. The constraints were as follows:

Keep costs at a minimum

Disrupt project personnel as little as possible

Produce results before the first deployment (alpha) was installed and configured


    2.3 MISSION DIAGNOSTIC PROCESS USED

Our assessment process consisted of the following high-level activities, which were consistent with the general approach for conducting an MD assessment:

Met with the customer to present an MD overview

Conducted initial fact-finding interviews

Tailored the MD process and materials

Interviewed project participants (ten interviews at one hour per interview)

Conducted post-interview analysis

Developed the assessment report and presented results

    2.4 PARTICIPANT COMPOSITION

The SEI analysis team had the following skills:

Risk assessment

Software development and deployment

Project and change management

Business process and systems analysis

Training and transition

The customer personnel [4] we interviewed represented the following technical and project areas:

Business process analysis

Project management

Program management

Systems management

Testing

Training

[4] We were unable to interview any software development personnel.


    3 Tailoring the Mission Diagnostic

Previously, we used the MD in two areas: cyber-security incident response and portfolio management. For this assessment, we were analyzing a software development project near its alpha deployment. We believed the general set of 10 driver questions would need to be adjusted to consider specific development and deployment issues as well as more general project issues. A new superset of drivers relevant to software development and deployment would be needed. Two additional factors also played into this MD assessment of the project. First, the staged beta and enterprise-wide deployment of the Application was already scheduled. In addition, there was ongoing, parallel development of applications and components that would be required for this Application to fully function. Thus, for this MD assessment we also needed to consider any issues that might affect development and deployment beyond alpha.

    3.1 PRODUCT AND PROCESS RISK ANALYSIS

It was clear from early discussions with the customer that this project had critical aspects in both the product itself (the Application) and the processes being used for development and deployment. Thus, we needed to focus on product and process aspects during this assessment. In addition, most personnel were involved in and knowledgeable about only one of these two aspects. Consequently, we developed two sets of drivers, a superset, focusing on

the technical aspects of developing and deploying the product

the processes being used to manage the project and its development and deployment

This distinction would allow us to give the customer results that would help them understand whether their issues were associated with the Application itself, the processes they employed, or both.

    3.2 DRIVER SUPERSET FOR A NEW DOMAIN

To develop this superset of drivers, we began with a series of interviews with senior managers and team leads to understand the critical aspects of this project, taking care to note what they believed could lead to success or failure. We drafted a set of proposed drivers and solicited internal SEI subject matter experts to ensure all critical aspects of software development and deployment were represented. Once we were satisfied with content completeness, we began mapping this proposed set against our general 10 drivers. [5] While there was some overlap, we clearly identified several new areas that would need to be represented. Drivers specific to product and process were needed, as well as ones specifically addressing development, deployment, and management issues. The end result was the creation of a driver superset for this new domain that consisted of 10 project drivers and 8 technical drivers; one way to record such a superset as data is sketched after the two lists below.

[5] These drivers are referenced in the technical report Mission Diagnostic Protocol, Version 1.0: A Risk-Based Approach for Assessing the Potential for Mission Success [Alberts 2008].


The 10 project drivers addressed aspects of

Organizational change and pressure

Sponsorship

Project goals, funding, and schedule

Plans

Requirements

Team skills and abilities

Team coordination

Risk management

The eight technical drivers addressed aspects of

Integration

Operational functionality

User preparedness

Operator preparedness

Security

Infrastructure

Roll-back and contingency planning
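
As noted above, one way to record such a superset is as structured data from which the drivers for a given engagement can be selected. The following Python sketch is illustrative only: the area names are taken from the two lists above, while the data structure and the select_drivers helper are our own assumptions.

```python
# Illustrative representation of the driver superset for the software
# development and deployment domain. The area names mirror the report's
# lists; the structure itself is an assumption for demonstration.
DRIVER_SUPERSET = {
    "project": [  # the 10 project drivers addressed these areas
        "organizational change and pressure",
        "sponsorship",
        "project goals, funding, and schedule",
        "plans",
        "requirements",
        "team skills and abilities",
        "team coordination",
        "risk management",
    ],
    "technical": [  # the 8 technical drivers addressed these areas
        "integration",
        "operational functionality",
        "user preparedness",
        "operator preparedness",
        "security",
        "infrastructure",
        "roll-back and contingency planning",
    ],
}

def select_drivers(categories):
    """Collect the driver areas relevant to a given engagement."""
    return [area for cat in categories for area in DRIVER_SUPERSET[cat]]

# A product-focused engagement might pull only the technical areas:
print(select_drivers(["technical"]))
```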

    3.3 TELECONFERENCING FOR INTERVIEWS

Previous MD assessments conducted all interviews face-to-face. The interviewees for this MD assessment were scattered across several sites that were not geographically near one another. Because of travel costs and the need to minimize the impact on project personnel, we decided to conduct all interviews using teleconferencing instead of our usual face-to-face interviews. We interviewed managers and team leads first and, if needed, reserved the right to conduct a second round of interviews with additional project personnel. As it turned out, we did not need to conduct the additional interviews.

    3.4 OUTCOME-BASED SCENARIO ANALYSIS

In previous applications of the MD, we used a simple algorithm to estimate the potential for success. Given the complexity of this project, the customer wanted a third-party expert opinion rather than an estimate. Therefore, we chose outcome-based scenario analysis, since it is a more robust analysis approach that looks at the potential for success of different project outcomes.


    4 Results of the Conducted Assessment

We conducted the tailored MD, focusing on the alpha deployment of the Application. A sanitized summary of some of the important aspects of the results is provided here.

    4.1 THREE OUTCOME SCENARIOS

Having gathered quite a bit of information from the interviews, and a limited amount of data from documentation provided by the customer, we were able to do a root cause analysis. While the answers we received [6] for each question varied widely, it was not difficult to identify the major themes that represented critical issues:

Organizational practice

Project management

Support activities

Local or task-related activities

Integration

Based on these major themes, we created three outcome scenarios to analyze and to determine their potential for success. These outcome scenarios represented minimal, moderate, and good pictures of success for the alpha deployment of the Application.

1. Minimal success: the project would identify key problem areas with the Application, the infrastructure, and deployment sites

2. Moderate success: the Application would be able to successfully complete some core customer functions

3. Good success: the Application would be able to successfully complete core customer functions and would operate successfully with the infrastructure and all other relevant components and applications

The three outcomes were analyzed with the driver data to determine their potential for success. We determined that only the Minimal success outcome was likely to occur; the outcomes with Moderate and Good levels of success both looked unlikely. In other words, only the outcome that set a relatively low bar for success was likely to be achieved. In fact, the alpha deployment of the Application was most likely to be more of a proof of concept than a true alpha test.

[6] Nearly all of the drivers had answers from interviewees that covered the entire range of possibilities from "Yes" to "No." There was little consistency of viewpoints. In fact, some of the positive answers were based on firm beliefs by the interviewees that issues were being addressed, even if they did not know by whom.
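
To illustrate how driver data and outcome scenarios can be combined, the sketch below scores each scenario against the drivers it depends on. The scenario names follow the report; the driver wording, the scores, the driver-to-scenario mapping, and the likelihood threshold are all assumptions for demonstration.

```python
# Hypothetical scenario analysis: each outcome is judged against the
# subset of drivers that must hold for it. All data here is assumed.
driver_scores = {  # 0 = "No" ... 4 = "Yes", as in the earlier sketch
    "key problem areas can be identified": 3,
    "core customer functions work": 1,
    "integration with infrastructure and other applications works": 0,
}

scenarios = {
    "Minimal success": ["key problem areas can be identified"],
    "Moderate success": ["core customer functions work"],
    "Good success": [
        "core customer functions work",
        "integration with infrastructure and other applications works",
    ],
}

for name, required in scenarios.items():
    mean = sum(driver_scores[d] for d in required) / len(required)
    verdict = "likely" if mean >= 2.5 else "unlikely"  # assumed threshold
    print(f"{name}: {verdict} (mean driver score {mean:.1f})")
# With these toy scores, only "Minimal success" comes out likely,
# matching the shape of the result described above.
```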


    4.2 LONG-TERM ISSUES

Long-term deployment issues were also identified during the interviews. Some issues arose from specific questions, and some were identified by the analysis team as we looked at the data we collected. These issues can be characterized as having to do with planning, integration, site preparation, and user training. For example, the following issues were identified:

Management, organizational, and political conditions will likely continue to have an adverse effect on the development.

    Schedule compression is likely to continue.

    Coordination problems exist within the project team as well as enterprise-wide.

    Major problems exist related to coordination and management of external dependencies.

    No master schedule exists for the enterprise-wide deployment.

    Local software customizations may have occurred, and local infrastructures are unknown.

Increases in scale, complexity, and integration are likely.

Rollout of additional applications requiring integration is likely to occur.

    Security requirements will increase in complexity.

    No testing environment exists for the new architecture.


    5 Lessons Learned

Based on this MD assessment, there were several lessons we learned that we will apply to future MD engagements and future work in MOSAIC.

    5.1 NEW SET OF DRIVERS

Our three sets of drivers from previous customer engagements (general, incident management, and portfolio management) have been useful for most circumstances. With the completion of this project, we now have another set of drivers to use for software development and deployment projects. While we need to pilot these drivers on another project to verify their effectiveness, we are confident that they will require minimal alteration for future use.

This revised driver superset was used successfully during the interviews without change, except for one small adjustment (see Section 5.2, Wording of Driver Questions). Because we asked the right questions, tailored to this particular project, the interviews were efficient and effective, and few follow-on questions were needed to clarify points.

    5.2 WORDING OF DRIVER QUESTIONS

Two of the questions were misinterpreted by the interviewees, leading to early confusion in the answers. Specifically, one driver was troublesome because interviewees attempted to parse an implied double negative: "Do management, organizational, and political issues have little or no impact on alpha phase activities?" Interviewees heard the words "political issues" and "impact" and promptly answered "Yes," thinking this meant political issues had a large impact on the project. We eventually changed the wording of the driver to avoid confusion and, during analysis, reversed the scale on the answers already collected to be consistent with the rest. Future MD assessments will continue to keep this potential for misinterpretation in mind.

    5.3 NEW TECHNIQUE FOR INTERVIEWING

This was the first time we relied solely on teleconferencing for conducting interviews, so we had to create a new technique. [7] As part of our research, we develop, test, and refine methods and techniques for success-driven management. We refined this new technique somewhat during successive teleconferences.

Since we were not able to assess body language, we had to pay close attention to verbal cues. It was much easier than we anticipated to gauge the interviewees' level of comfort with the interview, their commitment to the project, and any concerns or biases they had. Nearly all of the interviewees were completely forthcoming, and the telephone presented no real barrier to communication or data gathering. Interviewees appeared to be candid and did not express any reservation or hesitation about participating. In addition, our more traditional face-to-face interview format of one facilitator and one or more note-takers continued to hold up well during the teleconferences. We appreciated the facilitator's ability to keep the interview moving along, since rambling could have been an issue given the short time frames for the teleconferences and the difficulties of scheduling additional time.

[7] This technique is as yet unnamed.

    5.4 SUCCESS OF OUTCOME-BASED SCENARIO ANALYSIS

While we used the outcome-based scenario analysis to provide a clearer picture of possible success to the customer, we also wanted to test the validity of our results. Therefore, we used the standard MD algorithmic analysis [8] to verify the reasonableness of our outcome-based scenario analysis. The standard algorithmic analysis, which does not provide context when interpreting results, determined that the overall potential for project success was between low and moderate. [9] If we had given the customer the results of the standard algorithmic analysis alone, it would have painted a bleak picture and simply told them the project was in trouble.

The scenario-based results we provided were just as bleak. However, by identifying and analyzing the three different outcome scenarios and different sets of tradeoffs, we were able to elaborate and provide the customer with a clearer view of which outcomes were likely for this project and assure them that some success was possible. Our results indicated that one of the outcomes was likely to occur and the other two were not. Averaging the three scenario scores produced an overall score that was equivalent to the standard algorithmic score (between low and moderate), providing an informal validation of our use of the outcome scenarios.

[8] The simplest algorithm applies a set number of points to each driver, based on the answer. These are added together to reach a total score that is compared to a defined evaluation scale to determine the potential of success for the project. For example, if there were 10 drivers, with a maximum of 10 points each, one could assume a total score of 90 would indicate a very high potential for project success.

[9] The terms low and moderate are part of the scale used in the Mission Diagnostic Protocol, Version 1.0: A Risk-Based Approach for Assessing the Potential for Mission Success [Alberts 2008].
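
The informal validation described above reduces to simple arithmetic. A minimal sketch follows; since the report does not publish its actual scores, the numbers below are assumptions chosen only to show the shape of the comparison.

```python
# Hypothetical cross-check of the scenario scores against the standard
# algorithmic result. The scores and the 0-100 scale are assumptions.
scenario_scores = {"minimal": 70, "moderate": 35, "good": 25}

average = sum(scenario_scores.values()) / len(scenario_scores)
print(f"mean scenario score: {average:.0f}")  # ~43 on this toy scale

# If the mean lands in the same band as the standard algorithmic score
# (here, "between low and moderate"), the scenario analysis is
# informally validated.
```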

    5.5 COMMUNICATIONS AND LOGISTICS

As with any assessment, all communications and logistics were to be handled by an SEI point of contact (POC) and a customer POC; however, vacation schedules led to multiple SEI points of contact. To complicate matters, the customer POC let the interviewees schedule their own interviews, creating many customer points of contact and stretching out the time frame for interviews. Some of these occurrences could have been alleviated if we had been on the customer site; often, the forcing function of a site visit makes communications and logistics run more smoothly.

To alleviate such occurrences in future analyses, we see the need to create liaison responsibilities to ensure that

one POC each exists for the SEI and the customer

all logistics and communications are properly tracked, managed, and handled in a timely fashion

any follow-up activities are completed

all analysis team personnel and customer personnel are identified and introduced

written ground rules and agreements exist for the customer and SEI POCs

    5.6 SPONSORS FOR THIS ASSESSMENT

This MD engagement had two distinct sponsors in two different parts of the customer organization. This eventually led to unclear ownership of the results and difficulties in delivering the final results, particularly as the sponsors' specific goals and objectives became increasingly disjointed as time passed. Inconsistent communications increased the amount of confusion about who the primary stakeholder actually was. In the future, we will clarify the goals and objectives of all sponsors and stakeholders and continue to verify those goals and objectives at key points in the schedule, particularly prior to delivering any results.


    6 Summary

In summary, there were several outcomes from this MD assessment that will become part of our future work and direction.

    6.1 MISSION DIAGNOSTIC AND IN-DEPTH ASSESSMENTS

A full-blown, in-depth assessment (e.g., a detailed technical assessment, complete risk evaluation, or architectural assessment) could not have been performed with the limited resources available to assess the Application alpha deployment. While the MD was a high-level rather than an in-depth assessment, it nonetheless found issues similar to those discovered in previous in-depth project assessments. As such, the MD could possibly be used as a cost-effective, quick assessment leading to more in-depth assessments or in combination with such assessments.

For this particular assessment, the MD took much longer than usual due to the teleconferencing restriction and the availability of interviewees. Done within its normal, compressed time frame (2-3 days with proper logistical planning), an MD might be advantageous in conjunction with in-depth assessments focused on specific problem areas identified by the MD. The MD drivers might also provide an alternative structure for data analysis for in-depth assessments, particularly those with less formal processes.

    6.2 OUTCOME-BASED SCENARIO ANALYSIS DRIVES NEW PROTOCOL

The use of outcome-based scenario analysis was borrowed from another assessment method, the Mission Assurance Analysis Protocol (MAAP) [Alberts 2005]. It added a degree of complexity to the basic MD. As such, we feel we have the basis for a new protocol that is more than the basic MD but not as complex as the more resource-intensive MAAP. We will document this new protocol at a future date, after additional testing and refinement.

    6.3 BASIS FOR NEW FRAMEWORK FOR RISK

Working with this customer required us to create the revised drivers and consider the different issues facing the customer, including the very distinctive layers of responsibility and communication. During the analysis, we began refining our understanding of risk and success management and the different layers of information, responsibility, communication, and risk mitigation that occur across an organization and across multiple organizations. This new understanding has led to research on a new framework for success management, complete with an integrated perspective of risk.


    References

    URLs are valid as of the publication date of this document.

[Alberts 2005]
Alberts, Christopher & Dorofee, Audrey. Mission Assurance Analysis Protocol (MAAP): Assessing Risk in Complex Environments (CMU/SEI-2005-TN-032, ADA441906). Software Engineering Institute, Carnegie Mellon University, 2005.
http://www.sei.cmu.edu/publications/documents/05.reports/05tn032.html

[Alberts 2008]
Alberts, Christopher & Dorofee, Audrey. Mission Diagnostic Protocol, Version 1.0: A Risk-Based Approach for Assessing the Potential for Mission Success (CMU/SEI-2008-TR-005). Software Engineering Institute, Carnegie Mellon University, 2008.
http://www.sei.cmu.edu/publications/documents/08.reports/08tr005.html

