
DHS Acquisition Instruction/Guidebook #102-01-001: Appendix B, Interim Version 1.9, November 7, 2008

Appendix B: Systems Engineering Life Cycle

Table of Contents

1. Introduction
   1.1 Purpose
   1.2 Applicability
   1.3 DHS SELC Process Overview
   1.4 Relationship to the DHS Acquisition Review Process and the Capital Planning and Investment Control Process
       1.4.1 Alignment with the DHS Acquisition Review Process
       1.4.2 Alignment with the DHS Capital Planning and Investment Control Process
   1.5 Governance, Roles and Responsibilities
   1.6 Document Organization
2. SELC Overview
   2.1 Overview of DHS SELC Elements
       2.1.1 SELC Entry Criteria
       2.1.2 SELC Stages
       2.1.3 SELC Stage Reviews
       2.1.4 SELC Exit Criteria
   2.2 DHS Development Methodologies
       2.2.1 Types of Development Methodologies
       2.2.2 Selecting an Appropriate Development Methodology
   2.3 Project Tailoring, Deviations, and Waivers
   2.4 Refine Documentation
   2.5 Other SELC Considerations
3. Stage A: Solution Engineering
   3.1 Solution Engineering Activities
       3.1.1 Document the Concept of Operations
       3.1.2 Identify and Analyze Potential Solution Alternatives
           3.1.2.1 Analysis of Alternatives Study Plan
           3.1.2.2 Study Plan Review
       3.1.3 Develop Life Cycle Cost Estimate
       3.1.4 Define Operational Requirements
       3.1.5 Develop Acquisition Plan
       3.1.6 Develop the Integrated Logistics Support Plan (ILSP)
       3.1.7 Establish Acquisition Program Baseline
   3.2 Documents
       3.2.1 Solution Engineering Review Participants
       3.2.2 Exit Criteria
4. Stage 1: Planning
   4.1 Planning Entry Criteria
   4.2 Planning Activities
       4.2.1 Develop SE Risk Management Plan
       4.2.2 Develop SE Quality Assurance Plan
       4.2.3 Develop Training Plan
       4.2.4 Develop Configuration Management Plan
       4.2.5 Complete Privacy Threshold Analysis (As Applicable)
       4.2.6 Develop Section 508 Electronic and Information Technology Accessibility Plan (As Applicable)
       4.2.7 Develop System Data Management Plan (IT Only)
   4.3 Documents
   4.4 Stage Review and Exit Criteria
       4.4.1 Exit Criteria
5. Stage 2: Requirements Definition
   5.1 Requirements Definition Activities
       5.1.1 Develop Functional Requirements
       5.1.2 Develop Requirements Traceability Matrix
       5.1.3 Evaluate Infrastructure and Environmental Aspects
       5.1.4 Develop Test and Evaluation Master Plan
       5.1.5 Developmental Test Plan
       5.1.6 Complete Information Security Activities (IT Only)
           5.1.6.1 Develop Security Requirements Traceability Matrix (IT Only)
           5.1.6.2 Develop Plan of Action and Milestones (IT Only)
           5.1.6.3 Initiate Development of System Security Plan (IT Only)
           5.1.6.4 Develop Security Risk Assessment (IT Only)
           5.1.6.5 Develop Security Test and Evaluation Plan (IT Only)
       5.1.7 Develop Service Level Agreements
       5.1.8 Develop Disaster Recovery Plan (IT Only)
       5.1.9 Map Project Data to HLS EA Data Architecture (IT Only)
       5.1.10 Map Documents to the HLS EA Technical Reference Model (IT Only)
   5.2 Documents
   5.3 System Definition Review and Exit Criteria
       5.3.1 Exit Criteria
6. Stage 3: Design
   6.1 Design Activities
       6.1.1 Define System Requirements and Update the Requirements Traceability Matrix
       6.1.2 Develop Logical Design
       6.1.3 Develop Data Architecture (IT Only)
       6.1.4 Conduct Preliminary Design Review
       6.1.5 Develop System Design
       6.1.6 Validation and Verification (IT Only)
       6.1.7 Develop Technology Insertion Decision Request (IT Only)
       6.1.8 Develop Interconnection Security Agreements (IT Only)
       6.1.9 Initiate Privacy Impact Assessment (IT Only)
       6.1.10 Initiate System of Records Notice (IT Only)
   6.2 Documents
   6.3 Stage Reviews and Exit Criteria
       6.3.1 CDR Participants
       6.3.2 Exit Criteria
7. Stage 4: Development
   7.1 Development Activities
       7.1.1 Build, Construct, and Configure the System
       7.1.2 Conduct Unit Testing
       7.1.3 Develop Test Case Specifications
       7.1.4 Develop Version Description Document (IT Only)
       7.1.5 Develop Documentation
           7.1.5.1 Develop Operators Manual
           7.1.5.2 Develop Maintenance Manuals
           7.1.5.3 Develop User Manuals
   7.2 Documents
   7.3 Test Readiness Review and Exit Criteria
   7.4 Exit Criteria
8. Stage 5: Integration and Test
   8.1 Integration and Test Activities
       8.1.1 Conduct System Testing and Generate Test Reports
       8.1.2 Verify and Validate User Documentation
       8.1.3 Conduct Acceptance Testing
       8.1.4 Generate Section 508 Assistive Technology Interoperability Test Results (As Applicable)
       8.1.5 Conduct Security Test and Evaluation (IT Only)
       8.1.6 Prepare Security Certification Documentation (IT Only)
       8.1.7 Complete Privacy Impact Assessment (IT Only)
       8.1.8 Complete System of Records Notice (IT Only)
       8.1.9 Complete Insertion Packages (IT Only)
   8.2 Documents
   8.3 Production Readiness Review and Exit Criteria
       8.3.1 Exit Criteria
9. Stage 6: Implementation
   9.1 Implementation Activities
       9.1.1 Conduct Pilot
       9.1.2 Complete Preparation of Operational Sites and Deploy Solution to Production Environment
       9.1.3 Perform Data Conversion and Load Production Data (IT Only)
       9.1.4 Coordinate Changes to Corresponding Business Practices
       9.1.5 Publish System of Records Notice and Acquire Privacy Office Affirmation (IT Only)
       9.1.6 Obtain Authority to Operate Letter (IT Only)
   9.2 Documents
   9.3 Operational Readiness Review and Exit Criteria
       9.3.1 Exit Criteria
10. Stage 7: Operations and Maintenance
    10.1 Operations and Maintenance Activities
        10.1.1 Operate System and System Documentation
        10.1.2 Identify and Make System Enhancements
        10.1.3 Test Disaster Recovery
        10.1.4 Document Post Implementation Review Results
        10.1.5 Conduct Operational Analyses
        10.1.6 Develop Lessons Learned Report
        10.1.7 Perform Continuous Monitoring (IT Only)
        10.1.8 Develop Section 508 Accessibility Incident Remediation Report (IT Only)
    10.2 Documents
11. Stage 8: Disposition
    11.1 Disposition Activities
    11.2 Documents

ATTACHMENTS

Attachment B1. Systems Engineering Life Cycle Development Methodologies
   B1-1 Spiral Development Methodology
   B1-2 Iterative/Incremental Methodology
   B1-3 Waterfall Methodology
Attachment B2. Other SELC Considerations
   B2-1 Index of Required Project Elements
   B2-2 Project Management
   B2-3 Organizational Change Management
   B2-4 Enterprise Architecture Alignment
   B2-5 Requirements Definition
   B2-6 Prototyping
   B2-7 Information Security
   B2-8 Privacy Compliance
   B2-9 Critical Infrastructure Protection
   B2-10 Human Factors Engineering
   B2-11 Section 508 Electronic and Information Technology Accessibility
   B2-12 Risk Management
   B2-13 Independent Verification & Validation
   B2-14 Electronic Records Management
   B2-15 National Information Exchange Model (NIEM)
   B2-16 SELC and Service Oriented Architecture (IT Only)
Attachment B3: Summary of Exit Criteria
Attachment B4: SELC Document Matrix
Attachment B5: Acronyms
Attachment B6: Glossary
Attachment B7: References

List of Tables

Table 1-1. DHS SELC Approval Authorities
Table 1-2. DHS SELC Stakeholder Roles and Responsibilities
Table 1-3. Document Organization and Descriptions
Table 2-1. Development Methodology Selection Guide
Table 2-2. Summary of Recommended Project Elements
Table 3-1. Solution Engineering Stage Documents
Table 3-2. Participants for Solution Engineering Review
Table 3-3. Solution Engineering Stage Exit Criteria
Table 4-1. DHS IT Project Management Plan Elements
Table 4-2. Planning Stage Entry Criteria
Table 4-3. Planning Stage Documents
Table 4-4. Planning Stage Exit Criteria
Table 5-1. Requirements Definition Documents
Table 5-2. Requirements Definition Stage Exit Criteria
Table 6-1. SRD Elements
Table 6-2. Logical Design Elements
Table 6-3. Detailed Design Elements
Table 6-4. Deployment Plan Elements
Table 6-5. Design Stage Documents
Table 6-6. Participants for Critical Design Review
Table 6-7. Design Stage Exit Criteria
Table 7-1. Development Stage Documents
Table 7-2. Development Stage Exit Criteria
Table 8-1. Integration and Test Documents
Table 8-2. Integration and Test Exit Criteria
Table 9-1. Implementation Stage Documents
Table 9-2. Implementation Exit Criteria
Table 10-1. O&M Security Monitoring, Security Updates, and Security Reporting
Table 10-2. O&M Documents
Table 11-1. Disposition Documents

List of Tables - Attachments

Table B1-1. Spiral Methodology: Advantages vs. Disadvantages
Table B1-2. Iterative/Incremental Methodology
Table B1-3. Waterfall Methodology: Advantages vs. Disadvantages
Table B2-1. Index of Required Project Elements
Table B2-2. Common Phases and Activities in a Project Life Cycle
Table B2-3. Integration Checklist
Table B4-1. Governing Authorities, DHS Organizations, and Websites
Table B4-2. Full Artifact Matrix

List of Figures

Figure 1-1. DHS SELC Process
Figure 1-2. DHS SELC Alignment with ARP and CPIC
Figure 2-1. SELC Summary Snapshot
Figure 5-1. Developmental Testing and Operational Testing and Key Documents

List of Figures - Attachments

Figure B1-1. Spiral Methodology
Figure B1-2. Iterative Methodology
Figure B1-3. Incremental Methodology
Figure B1-4. Waterfall Methodology


1. Introduction

1.1 Purpose

A Systems Engineering Life Cycle (SELC) is a systems engineering framework for enabling efficient and effective delivery of capability to users, and is one of several key Department of Homeland Security (DHS) processes for managing acquisitions of programs and their related projects. Carnegie Mellon’s Software Engineering Institute (SEI) defines systems engineering as:

“The interdisciplinary approach governing the total technical and managerial effort required to transform a set of customer needs, expectations, and constraints into a product solution and support that solution throughout the product’s life. This includes the definition of technical performance measures, the integration of engineering specialties towards the establishment of a product architecture, and the definition of supporting life-cycle processes that balance cost, performance, and schedule objectives.”

An SELC guides the definition, execution, and management of an interdisciplinary set of tasks required to plan, define, design, develop, implement, operate, and dispose of systems. This DHS SELC Guide documents the Systems Engineering (SE) methodology and its application throughout the DHS enterprise. To support the stated purpose, the SELC Guide describes the SELC and its relationship to other DHS enterprise-wide processes and defines the following:

• DHS SELC roles and responsibilities
• DHS SELC elements (stages, including activities and products, entry/exit criteria, and reviews)
• System development methodologies, guidance on their usage, and methods for customization

The purpose of this DHS SELC Guide is to standardize the system life cycle process across DHS Components and to ensure that DHS capabilities are efficiently and effectively delivered. The DHS SELC Guide is designed to ensure that appropriate activities are planned and implemented in each stage of the life cycle to increase the project’s success. The following key concepts form the basis for the DHS SELC Guide:

• The DHS SELC represents the systems engineering guidance for the acquisition management process.

• The DHS SELC is the framework used to guide all DHS projects, with specific guidance provided for each type of acquisition mechanism (capital investment IT and non-IT, enterprise services, etc.).

• The SELC provides flexibility by supporting tailoring based on the unique characteristics of a project (e.g., size, scope, complexity, risk, and security categorization), documented in the Project Tailoring Plan. The project and program managers are responsible for tailoring the SELC process for the project’s specific characteristics as appropriate. Tailoring is the cornerstone of any life cycle process.

• The scope of the SELC Guide begins with the Solution Engineering stage (i.e., ADE 1) and continues through Disposition.

• The SELC specifies stage reviews that are based on defined exit criteria to assess progress, quality, and readiness to proceed. Signed stage review approval letters, the Project Tailoring Plan, and the updated Project Management Plan (along with the project schedule) must be delivered to the DHS Periodic Reporting team following the conclusion of each stage review.

• The Component is responsible for:
  o Ensuring that all SELC entry criteria are satisfied.
  o Conducting SELC stage reviews to authorize projects to move to the next stage.

The DHS SELC Guide discusses nine process stages. The following figure, DHS SELC Process – Capital Assets, summarizes the SELC process. Each element of the process is discussed in the following pages.

DHS SELC Process – Capital Assets

[Figure: The SELC life cycle for capital assets, showing Stage A: Solution Engineering and Stages 1–8 (Planning, Requirements Definition, Design, Development, Integration & Test, Implementation, Operations & Maintenance, and Disposition); the associated stage reviews (SPR: Study Plan Review, SER: Solution Engineering Review, PPR: Project Planning Review, SDR: System Definition Review, PDR: Preliminary Design Review, CDR: Critical Design Review, TRR: Test Readiness Review, PRR: Production Readiness Review, ORR: Operational Readiness Review, PIR: Post Implementation Review); and Acquisition Decision Events ADE 1, ADE 2, and ADE 3. Note: A Project Tailoring Plan must be developed that defines what stages, activities, and artifacts will be completed for the project. The Project Tailoring Plan should reflect the unique characteristics of the project and provide the best opportunity to deliver the system effectively.]

The new Acquisition Management Process applies to Enterprise Services (as well as Capital Assets). The following figure depicts a modified life cycle tailored for Enterprise Services.


DHS SELC Process – Enterprise Services

[Figure: A modified SELC life cycle tailored for Enterprise Services.]

1.2 Applicability

The DHS SELC Guide is applicable to all DHS projects throughout Components or other DHS organizations (unless specifically exempted) whose purpose is to deliver a DHS capability.

IT Specific

The DHS SELC applies to all DHS IT projects. As defined in MD 0007.1, IT is:

“Any equipment or interconnected system or subsystem of equipment/software, or any national security system, that is used in the automatic acquisition, storage, manipulation, management, movement, control, display (including geospatial technologies), switching, interchange, transmission (wired or wireless telecommunications), or reception of data, voice, video, or information by an executive agency…equipment is used by DHS if the equipment is used by DHS directly or is used by DHS organizational partners (including other Federal agencies, State and local governments and private contractors) under a contract with DHS which (a) requires the use of such equipment, or (b) requires the use, to a significant extent, of such equipment in the performance of a service or the furnishing of a product. It includes computers, ancillary equipment (including imaging peripherals, input, output, and storage devices necessary for security and surveillance), peripheral equipment designed to be controlled by the central processing unit of a computer, software, firmware and similar procedures, services (including support services), and related resources. It does not include any equipment acquired by a contractor incidental to a contract, or equipment which contains imbedded information technology that is used as an integral part of the product, but the principal function of which is not the acquisition, storage, analysis, evaluation, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information.”1

1 MD 0007.1, section IV.J.

For the purpose of the SELC, IT projects include the following:

• IT projects and systems across all IT environments (e.g., mainframe, client-server, embedded, firmware, network).

• IT projects and systems across all classifications (unclassified, classified, and national security).

• IT projects and systems for applications, systems, and infrastructure, including those acquired, contractually developed, and developed in-house.

• IT projects from non-IT investments or acquisitions that involve the development of an IT system.

While minor enhancements and modifications are expected during the Operations and Maintenance (O&M) stage, major enhancements to systems in O&M require a project to be treated as a new IT project and comply with the full life cycle requirements of the SELC. Major enhancements constitute any combined change that exceeds $2.5 million in cost from planning through deployment (excluding O&M).
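For illustration only (the figures below are hypothetical and not drawn from the Guide): an enhancement effort estimated at $1.2 million for planning, design, and development plus $1.5 million for deployment totals $2.7 million from planning through deployment, which exceeds the $2.5 million threshold even though any ongoing O&M costs are excluded; it would therefore be treated as a new IT project subject to the full SELC.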

1.3 DHS SELC Process Overview

Given that the SELC is the systems engineering methodology for an acquisition, the first stage of the SELC is one that focuses on the larger acquisition program, whereas the remainder focus on the projects associated with the program. The DHS SELC methodology is composed of nine process stages:

• Solution Engineering (focuses on the Acquisition Program)
• Planning
• Requirements Definition
• Design
• Development
• Integration and Test
• Implementation
• Operations and Maintenance
• Disposition

Each stage has a defined set of activities that represents a logical unit of work. Each stage has associated documents to record the results of the activities performed. Stage Reviews are held at an appropriate point in time to validate that the acquisition has completed requirements for that stage and is ready to advance to the next stage. Exit criteria are directly related to the function of the stage and to the activities performed in the stage. For purposes of illustration, the DHS SELC stages and stage reviews are represented sequentially in Figure 1-1, which is traditionally how they have been performed.

Figure 1-1: DHS SELC Process

[Figure: The SELC stages shown sequentially (Stage A: Solution Engineering and Stages 1–8: Planning, Requirements Definition, Design, Development, Integration & Test, Implementation, Operations & Maintenance, and Disposition), with their stage reviews (SPR, SER, PPR, SDR, PDR, CDR, TRR, PRR, ORR, PIR), Acquisition Decision Events (ADE 1, ADE 2, ADE 3), and a note that a Project Tailoring Plan must be developed that defines what stages, activities, and artifacts will be completed for the project.]

1.4 Relationship to the DHS Acquisition Review Process and the Capital Planning and Investment Control Process

As one of the foundational elements of the DHS acquisition process, the objective of the DHS SELC Guide is to provide a framework to guide effective and efficient enterprise-wide capability delivery. Understanding the relationship of the DHS SELC to other DHS enterprise governance processes is essential to this objective. The primary applicable DHS governance processes are the DHS Acquisition Review Process (ARP), as mandated by Directive 102-01, and the Capital Planning and Investment Control (CPIC) Process, as mandated by MD 4200. The CPIC process will use the products and information produced in the ARP to fulfill its reporting requirements. These processes are focused on selecting and managing both IT and non-IT acquisitions (also referred to as programs and projects). Each acquisition may comprise multiple programs and projects. The SELC applies to the level of activity at which capabilities are produced and delivered, which may be at a singular acquisition level or through the related efforts of multiple types of acquisition. The applicability is determined through the SELC tailoring process based on the individual needs and conditions of each acquisition.

For purposes of illustration, the alignment of the Systems Engineering processes from a sequential point of view (see section for other delivery patterns) is depicted in Figure 1-2. This view provides insight into the alignment of the different phases of each governance process, as well as the alignment of some of the SELC stage reviews with those of the ARP. It is important to note that the responsibility for ensuring compliance with both the ARP and the SELC lies with a single project team, which may be part of a program team. The SELC is tightly integrated with the ARP. Therefore, there are dependencies between deliverables in each process (e.g., the Functional Requirements Document in the SELC is dependent upon the information found in the ARP’s Mission Needs Statement [MNS], Concept of Operations [CONOPS], and Operational Requirements Document [ORD]).


Figure 1-2. DHS SELC Alignment with ARP and CPIC

[Figure: Aligns the CPIC phases (Pre-Select, Select, Control, Evaluate); the ARP phases (Need, Analyze/Select, Obtain, Produce/Deploy/Support) with their Acquisition Decision Events (0: Collect Needs, 1: Validate Needs, 2A: Approve Approach, 2B: Approve Mechanisms, 3: Approve Deployment/Support); and the SELC stages (Solution Engineering, Planning, Requirements Definition, Design, Development, Integration & Test, Implementation, Operations & Maintenance, Disposition) with their stage reviews (SPR: Study Plan Review, SER: Solution Engineering Review, PPR: Project Planning Review, SDR: System Definition Review, PDR: Preliminary Design Review, CDR: Critical Design Review, TRR: Test Readiness Review, PRR: Production Readiness Review, ORR: Operational Readiness Review, PIR: Post Implementation Review).]


1.4.1 Alignment with the DHS Acquisition Review Process

The DHS Acquisition Review Process (ARP) (Directive 102-01) is one of the departmental governance processes that establishes governance bodies (e.g., Acquisition Review Board [ARB]) and ensures oversight, control, reporting, and review of all investments. Directive 102-01 defines acquisitions as any capital investment program/project, service contract, or interagency agreement used for the purpose of delivering Homeland Security capabilities, and categorizes investments according to investment levels. The SELC is tailorable for each type of acquisition. The ARP process, in conjunction with the SELC, ensures that acquisitions directly support and further the DHS mission(s), provide the intended benefits and capabilities to stakeholders and customers, and enhance investment sharing opportunities through standardization. The alignment of many of the SELC stage reviews with Acquisition Decision Events (ADEs) in the ARP provides opportunities to leverage deliverables and products to comply with the needs of both the ARP and the SELC (e.g., Risk Management Plan). Acquisition reviews may result in conditions being levied on the acquisitions or result in cancellation of an acquisition.

Prior to entering the acquisition phases, the Department and Components create and collect needs based on capability gaps and shortfalls. Gaps and capability deficiencies may be identified via over-budget requests in the Planning, Programming, Budgeting, and Execution (PPBE) process, executive direction, legislative mandates, or from operational/user groups within the Components. ADE 0 is the decision point where Department and Component mission area owners select their initiatives to form the basis of the activities in the Need phase. The phases of the Capital Asset ARP are contained in the Directive 102-01 Guidebook.

1.4.2 Alignment with the DHS Capital Planning and Investment Control Process

The CPIC process (MD 4200), as described in the DHS CPIC Guide, integrates strategic planning, enterprise architecture, portfolio management, privacy, security, budgeting, procurement, and the management of assets. The Department’s portfolio of capital investments comprises acquisitions that have been determined to provide the requisite mission capability by the DHS ARP and have been approved for funding through the DHS PPBE process as mandated in MD 1330. The overall portfolio of acquisitions comprises assets and services designed to achieve DHS’ strategic goals and objectives with an affordable life cycle cost and acceptable risk. CPIC will utilize information provided in the ARP and its approved documentation set to complete its reporting requirements.

CPIC emphasizes the selection of investments/programs/acquisitions. In Directive 102-01, these three terms are considered to be congruent concepts, although the use of one or another of them tends to emphasize different decision support systems. For example, the use of the term “investment” emphasizes the PPBE decision system. It is the intent of DHS to interlink the key decision support systems. CPIC is a part of acquisition considered in the broadest sense: that of providing the organization with the capabilities it needs to accomplish the mission. However, as noted above, there are some differences between CPIC processes and documentation, and acquisition processes and documentation. The plan to use acquisition information as the source information for the CPIC process is the first important step towards interlinking these DHS decision systems.

CPIC is important to the SELC process because it levies several requirements on investments. Depending on the CPIC investment thresholds and the operational state of the investment (e.g., Development, Steady-State), it requires PMs to:

• Complete an OMB Business Case Exhibit 300

• Complete an annual Exhibit 53 report

• Conduct Periodic Reporting

• Conduct Operational Analysis

The CPIC process includes four phases: Pre-Select, Select, Control, and Evaluate.

• Pre-Select Phase. All proposed IT initiatives must go through the Pre-Select Phase, during which time the sponsor presents the new initiative to review authorities for consideration. During this phase of the CPIC process, the Department identifies all new, ongoing, and operational investments for the Department's IT portfolio. Within this phase, proposed IT initiatives are screened, scored, and ranked relative to other initiatives. The principal objectives of the Pre-Select phase are to determine whether the initiative is viable and whether the appropriate level of analysis and documentation has been completed.

The Pre-Select decision is made at the Component level. Key outcomes resulting from the assessment process include: (1) defining the level of review required to approve the initiative for funding, and (2) defining the appropriate set of analysis/documentation needed to characterize the initiative fully as it moves forward in the process.

• Select Phase. The Select Phase ensures that Resource Allocation Plan (RAP)/Capital Investment Plan (CIP) submissions include the resource requirements for investments. Additionally, it ensures that new and existing investments are assessed against a uniform set of evaluation criteria and thresholds. The Exhibit 300 business case is the primary artifact used in the CPIC Select Phase to support the annual review and (re-)selection of both new and continuing investments. It is a formal justification for a project or program that outlines the associated benefits, cost, risks, and alignment with organizational objectives. The Exhibit 300 is updated annually as part of the CPIC process to reflect the performance of the investment. Exhibit 300s are reviewed and scored against OMB scoring criteria. Poorly scoring Exhibit 300s must be updated to meet the required threshold or risk not being included in the annual budget submission.

• Control Phase. The objective of the Control Phase is to ensure that the project is performing within acceptable cost, schedule, and performance parameters and to ensure the continual assessment and mitigation of potential risks through periodic reporting and IT acquisition reviews. The CPIC Guide describes the requirements for managing and reporting performance through periodic reporting. The Acquisition Program Baseline (APB) defines the critical cost, schedule, and performance parameters for the investment and is the baseline against which programs report their progress. Periodic reporting is required throughout the CPIC Control phase.

• IT Specific: It should be noted that, as part of the CPIC Control Phase, all IT procurements (contracts) must be reviewed. As mandated by Management Directive 0007.1, the DHS CIO reviews and approves all procurements of $2.5M or more; the Component CIOs review procurements of less than $2.5M. The DHS CIO has a review process for approving IT procurements that informs the ARP. Additional information on the DHS CIO’s IT review can be found on DHS Interactive (https://interactive.dhs.gov/suite/portal.do?$p=2041) or by contacting the review coordinator ([email protected]). For Component reviews, please contact the offices of the Component CIOs for review policies and procedures.

• Evaluate Phase. The purpose of the Evaluate Phase is to: 1) determine how well the investment is meeting its performance, cost, and schedule objectives, and 2) determine the extent to which the CPIC process improved the outcome of the investment. The two primary activities in the CPIC Evaluate Phase are the Post-Implementation Review (PIR), which is conducted approximately six months after deployment, and the Operational Analysis (OA), which is conducted annually to determine if the investment is meeting its performance goals.

1.5 Governance, Roles and Responsibilities

Table 1-1 lists the approval authority for each of the SELC stage reviews for each project type. Table 1-2 lists the key DHS stakeholders and describes their role in the DHS SELC. It also identifies ARP and CPIC roles.


Table 1-1: DHS SELC Approval Authorities

SELC Stage Review | Capital Investment IT | Capital Investment Non-IT | Enterprise Service
AoA Study Plan Review (SPR) | APMD/CIO | APMD/CIO | APMD/CIO
SER | CIO | Component SE group1 | Component SE group1
PPR | Component CIO | Component SE group1 | Component SE group1
SDR | Component CIO | Component SE group1 | Component SE group1
PDR | Component CIO | Component SE group1 | Component SE group1
CDR | Component CIO | Component SE group1 | Component SE group1
TRR | Component CIO | Component SE group1 | Component SE group1
PRR | Component CIO | Component SE group1 | Component SE group1
ORR | Component CIO | Component SE group1 | Component SE group1
PIR | Component CIO | Component SE group1 | Component SE group1

1 Monitored by APMD

Table 1-2: DHS SELC Stakeholder Roles and Responsibilities

Program / Project Manager
A Program Manager is the responsible person who, with significant discretionary authority, is uniquely empowered to plan the scope of work, life cycle cost, schedule, and performance acceptability levels (for an assigned program[s]), and who is responsible and accountable for accomplishing program objectives or production requirements through the acquisition of in-house, contract or reimbursable support resources, as appropriate, and as agreed to in an approved APB. The Program Manager is responsible for management of the project manager(s) and ensuring the project within the program is successfully completed. The Program / Project Manager is responsible for establishing the project team, completing the documentation set, presenting the business case and status of the project through all phases of the review and approval process, scheduling and coordinating the SELC stage reviews, and managing the performance of the project throughout the life cycle. The Program / Project Manager must be certified at the appropriate level, per Management Directive 0782.

IT Specific

Component CIO
IT Approving Authority: The Component CIO is the most senior Federal executive in the Component exercising leadership and authority over mission-unique IT policies, programs, services, solutions, and resources. The Component CIO acts to implement the policies of the DHS CIO. This includes compliance of IT acquisitions with the DHS SELC. The Component CIO is responsible for:
  1. Ensuring that all SELC entry criteria are satisfied.
  2. Conducting SELC stage reviews.

Designated Accrediting Authority (DAA)
The DAA is responsible for operating a system at an acceptable level of risk based on the System Security Plan (SSP) and final risk assessment, and is accountable for the risk he or she accepts.

DHS Chief Information Officer (DHS CIO)
The DHS CIO is responsible for ensuring that DHS IT projects are aligned with DHS strategic business objectives and that system development projects comply with DHS policies. He or she is also responsible for the proper level of oversight over IT projects and for the IT architecture.

DHS Chief Information Security Officer (CISO)
The CISO issues Department-wide IT security policy, guidance, and architecture requirements for all DHS IT systems and networks. The CISO implements and manages the Department-wide IT Security Program and ensures compliance with the Federal Information Security Management Act (FISMA) and OMB requirements.

DHS Chief Privacy Officer
The Chief Privacy Officer ensures that technology implementations across DHS sustain privacy protections as mandated by Section 208 of the E-Government Act of 2002 and Section 222 of the Homeland Security Act.

DHS Enterprise Architecture Board (EAB)
The DHS EAB, in support of the DHS investment review process, is responsible for the following:
  1. Reviewing individual investments and making recommendations to the DHS CIO for approving them consistent with the criteria and thresholds identified in DHS Directive 102-01, Acquisition Management Process.
  2. Requiring that each IT investment aligns with the HLS EA and is approved by the EAB before submission to the CIO for final approval and inclusion in the annual budget submission.
  3. Directing, overseeing, and approving the HLS EA and ensuring compliance with OMB Federal Enterprise Architecture (FEA) guidance.

DHS Enterprise Data Management Office (EDMO)
The EDMO is the organization within the DHS OCIO that coordinates and facilitates the establishment and maturation of the DHS enterprise data management policies, governance processes, and data stewardship programs.

DHS IT Portfolio Managers
The IT Portfolio Manager acts as an agent of the DHS CIO in managing an assigned IT portfolio, regardless of funding sources, across a number of IT projects. The primary responsibilities of the IT Portfolio Manager are to:
  1. Apply DHS IT portfolio management processes.
  2. Provide oversight of investments/projects within the portfolio.
  3. Support budget formulation.
  4. Review portfolio acquisitions and performance.
  5. Support the development and implementation of EA targets.
  6. Develop and review IT business cases.

Information System Security Officer (ISSO)
The ISSO ensures that appropriate steps are taken to implement information security requirements for information systems throughout the life cycle and works closely with the ISSM to interpret and apply IT security policies and implement procedures.

Information Systems Security Manager (ISSM)
The ISSM is the principal interface between the Office of the CISO and the ISSOs and other security practitioners. As such, the ISSM plays a critical role in ensuring that the DHS IT Security Program is both implemented and maintained throughout the Components and their IT systems.

Office on Accessible Systems and Technology (OAST) – Section 508
OAST is responsible for establishing Departmental policies, practices, and procedures to ensure that the Electronic and Information Technology (EIT) that is procured, developed, maintained, or used is accessible to DHS employees and customers with disabilities. Additionally, OAST is responsible for providing programmatic and technical assistance to DHS Components regarding implementation of 29 U.S.C. Section 508, as amended by P.L. 105-220 (August 1998), EIT Accessibility.

Project Sponsor
The project sponsor represents the operational needs of the business unit and the system users, and participates in all life cycle stages to ensure that the system meets the requirements and business need.


1.6 Document Organization

Table 1-3 below provides an overview of the organization of the document and includes a description of the document’s sections and attachments.

Table 1-3. Document Organization and Descriptions

Section 1: Introduction
Describes the purpose of the SELC and provides background regarding its development, objectives, and benefits. Additionally, this section covers the SELC-related authorities and provides a high-level overview of the process and its relationship to the DHS investment review and CPIC processes. It also documents the applicability of the SELC to IT projects and describes roles and responsibilities.

Section 2: SELC Overview

Describes the elements that constitute the SELC and the transition at DHS to a Service Oriented Architecture (SOA), including how this transition impacts system development. It also covers development methodologies and how projects are tailored. Finally, it identifies the required elements that all Project Managers must ensure are incorporated in IT projects.

Section 3: Solution Engineering Stage

Describes the elements (activities, documents) associated with the solution engineering needed to properly define the program capability solution.

Section 4: Planning Stage

Describes the entry criteria, activities, documents, stage review, and exit criteria for the Planning Stage.

Section 5: Requirements Definition Stage

Describes the activities, documents, stage review, and exit criteria for the Requirements Definition Stage.

Section 6: Design Stage

Describes the activities, documents, stage review, and exit criteria for the Design Stage.

Section 7: Development Stage

Describes the activities, documents, stage review, and exit criteria for the Development Stage.

Section 8: Integration and Test Stage

Describes the activities, documents, stage review, and exit criteria for the Integration and Test Stage.

Section 9: Implementation Stage

Describes the activities, documents, stage review, and exit criteria for the Implementation Stage.

Section 10: Operations and Maintenance Stage

Describes the activities, documents, stage review, and exit criteria for the Operations and Maintenance Stage.

Section 11: Disposition

Describes the activities to ensure users are notified of the disposition of the system, to secure and archive the system and associated data in accordance with DHS and Federal policies regarding retention of electronic records, and to dispose of the system assets.

Attachment B-1: SELC Development Methodologies

Describes the Spiral, Iterative/Incremental, and Waterfall methodologies and illustrates how they are implemented in the SELC.

Attachment B-2: Work Patterns

Provides work patterns (also referred to as pre-tailored plans) suitable for use or for further tailoring. Work patterns are provided for each of the three development methodologies, as well as for special-focus efforts (service delivery, Commercial Off-the-Shelf [COTS], and infrastructure).

Attachment B-3: Other SELC Considerations

Provides detailed DHS information on the required elements that must be incorporated in all projects.

Attachment B-4: Summary of Exit Criteria

Lists all of the SELC exit criteria by SELC stage.

Attachment B-5: SELC Artifact Matrix

Lists all the documents required throughout the life cycle and denotes when they are updated.

Attachment B-6: Acronyms

Lists the definitions of acronyms used in the document.

Attachment B-7: Glossary

Defines key terms.

Attachment B-8: References

Lists the references consulted in the development of this Guide.

2. SELC Overview

This section defines the DHS SELC framework and the elements it comprises. As such, it provides the basis for the detailed information on each stage of the SELC in subsequent sections of this document. For IT acquisitions, this section also provides information regarding the transition that DHS is making to a Service Oriented Architecture (SOA). Lastly, it discusses different development methodologies and technical approaches and how they may be used to tailor a project to best achieve project success.

2.1 Overview of DHS SELC Elements

This section presents general information on each element (i.e., Entry Criteria, Stages, Stage Reviews, and Exit Criteria) of the SELC framework as a preface to the detailed information on each DHS SELC stage that follows in Sections 3 through 11.

2.1.1 SELC Entry Criteria

The following represent the SELC entry criteria for the Solution Engineering stage:

• Mission Needs Statement (MNS) approval
• ADE 1 Acquisition Decision Memo (ADM)

2.1.2 SELC Stages

The SELC consists of the following nine stages:

1. Solution Engineering (program focused)
2. Planning
3. Requirements Definition
4. Design
5. Development
6. Integration and Test
7. Implementation
8. Operations and Maintenance
9. Disposition


Each SELC stage requires the completion of a set of related activities. The SELC guide provides a description of each of these activities and the requisite documents for each stage. The outputs from the activities performed in each stage are recorded in documents that are placed under Configuration Management (CM) control. Many documents are created in one stage and updated in subsequent stages as additional details become available. The completion of these activities, unless tailored out, is generally considered a prerequisite for moving to the next stage of the SELC.
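To make the document flow described above concrete, the following is a minimal sketch (illustrative only; it is not part of the Guide, and the class and field names are assumptions) of one way a project team might track a document that is created in one stage, baselined under CM control, and updated in later stages:

```python
# Illustrative sketch only; not part of the Guide. Models a document that is
# created in one SELC stage, baselined under Configuration Management (CM)
# control, and updated in subsequent stages. Names are assumptions.
from dataclasses import dataclass, field

@dataclass
class ControlledDocument:
    name: str
    created_in: str                 # SELC stage where the document is first produced
    version: int = 0
    history: list = field(default_factory=list)

    def baseline(self) -> None:
        """Place the initial version of the document under CM control."""
        self.version = 1
        self.history.append(f"v1 baselined in {self.created_in}")

    def update(self, stage: str) -> None:
        """Record an update made in a later stage as a new controlled version."""
        self.version += 1
        self.history.append(f"v{self.version} updated in {stage}")

rtm = ControlledDocument("Requirements Traceability Matrix", "Requirements Definition")
rtm.baseline()
rtm.update("Design")   # e.g., system requirements refined during the Design stage
print(rtm.name, "is at version", rtm.version)
print("\n".join(rtm.history))
```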

2.1.3 SELC Stage Reviews

The SELC process is governed through stage reviews (see Figure 1-1) that provide the opportunity to assess project progress against defined exit criteria. These reviews provide a knowledge point or mechanism for management to determine how and if a project should proceed (e.g., does more work need to be done before the project is ready to enter the next stage of the life cycle?). Stage reviews are conducted at fixed points in the SELC and have a specified list of participants and evaluation criteria. Stage reviews are conducted at the end of each stage to ensure all exit criteria for the stage have been satisfactorily addressed. For IT projects, the one exception is the final review, the PIR, which should be performed six months after deployment.2

Stage reviews are conducted by management teams specific to the organization defined by the project (e.g., by the project sponsor and Component CIO) and may also include DHS level organizations (e.g., DHS IT Portfolio Managers). The Program/Project Manager is responsible for arranging, coordinating, and leading the stage reviews, while the decision authority is responsible for sign-off that the project has satisfied all the exit criteria and is ready to proceed to the next stage. However, it is expected that the decision authority will rely on the appropriate experts (e.g., EA, testing, security, infrastructure, budget) to evaluate the readiness to proceed. Some key experts are identified in the lists of stage review participants in the following sections. Evidence of decision authority sign-off must be delivered to the DHS Periodic Reporting Team and be maintained as part of the project documentation. A scanned electronic copy of the signed stage review approval letter along with the updated project management plan (and associated project schedule) must be emailed to the DHS Periodic Reporting Team at [email protected] following the conclusion of each stage review.

Stages may be modified as part of the project tailoring process. If stages are combined as part of an approved tailoring plan, only one stage review will be held at the end of the combined stages (e.g., a single Design/Development stage review would be held rather than two stage reviews, one for the Design Stage and one for the Development Stage). Stage review participant listings can be found in each of the detailed descriptions of the stages in Sections 3-11.

The SELC documents, or pointers to the authority where the latest template can be obtained, have been posted on DHS Online under Components/Management/CIO/EBMO/SELC. The following URL is the location of the template listing.

DHS SELC Website: https://dhsonline.dhs.gov/portal/jhtml/community.jhtml?community=MGMT&index=151&id=2039580019

2 Per guidance in the DHS CPIC Guide.

2.1.4 SELC Exit Criteria

Each SELC stage is associated with specific exit criteria that must be satisfied before a project can proceed to the next stage. Exit criteria are presented in question format and categorized by domain (e.g., project management, enterprise architecture) to provide content-centered guidance rather than merely a checklist of documents to be completed. It is critical to understand that the determination of project readiness to transition to the next stage is made by satisfactory compliance with the content of the exit criteria, NOT by the evidence of documents produced. Project Managers should review the exit criteria at the start of each SELC stage and plan the stage activities accordingly to ensure successful completion of the exit criteria and to avoid rework or delays. Factors critical to successful stage reviews are:

• All exit criteria for each stage review must be satisfactorily fulfilled, including required documents, to obtain authorization for moving to the next stage.

• The downstream consumer of the products produced in a stage (e.g., a development manager dependent on using the design documentation) must attend, review, and sign off on stage completion. The PM must review any significant issues identified, assess the impact to the project, and determine if proceeding to the next stage is desirable. This is to ensure that the next stage can effectively begin and to minimize the need to return to a previous stage to correct incomplete activities. To minimize issues with availability, the downstream consumers may be limited to the lead of the next stage. For example, following Requirements Definition, the Design lead would be expected to review the requirements, while the Development lead may not be expected to review the requirements in preparation for the Requirements Definition stage review.

• At each stage review, evidence must be provided that clearly substantiates the successful completion of the exit criteria. For example, in testing requirements, tests must successfully produce the required results to be used as evidence of “successfully” meeting exit criteria. The act of testing in itself is not sufficient evidence if tests fail to produce required results.

For a complete listing of exit criteria for each stage of the DHS SELC, reference Attachment B-4 to this document.
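As an illustration of the content-centered approach described above, the following is a minimal sketch (illustrative only; it is not part of the Guide, and the domain names and criterion wording are assumed for the example) that treats exit criteria as domain-categorized questions, with readiness determined by whether every criterion is satisfied rather than by the number of documents delivered:

```python
# Illustrative sketch only; not part of the Guide. Represents exit criteria as
# domain-categorized questions and treats a stage as ready for review sign-off
# only when every criterion is satisfied. Criteria wording is assumed.
exit_criteria = {
    "Project Management": {
        "Has the Project Management Plan been updated for this stage?": True,
        "Is the project schedule current and baselined?": True,
    },
    "Enterprise Architecture": {
        "Does the solution align with the HLS EA?": False,
    },
}

def ready_to_proceed(criteria: dict) -> bool:
    """True only if every exit criterion in every domain is satisfied."""
    return all(answer for domain in criteria.values() for answer in domain.values())

for domain, questions in exit_criteria.items():
    for question, satisfied in questions.items():
        status = "satisfied" if satisfied else "open"
        print(f"[{domain}] {question} -> {status}")
print("Ready to proceed:", ready_to_proceed(exit_criteria))
```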


Figure 2-1: Summary Snapshot (summarizes the SELC stages, reviews, and products)

[Figure 2-1 depicts the SELC flow from Stage A: Solution Engineering through Stage 1: Planning, Stage 2: Requirements Definition, Stage 3: Design, Stage 4: Development, Stage 5: Integration & Test, Stage 6: Implementation, Stage 7: Operations & Maintenance, and Stage 8: Disposition, together with the stage reviews (SPR, SER, PPR, SDR, PDR, CDR, TRR, PRR, ORR, PIR), the Acquisition Decision Events (ADE 1, ADE 2, ADE 3), and the key products of each stage, including IT-specific products.]

Key: SPR: Study Plan Review; SER: Solution Engineering Review; PPR: Project Planning Review; SDR: System Definition Review; PDR: Preliminary Design Review; CDR: Critical Design Review; TRR: Test Readiness Review; PRR: Production Readiness Review; ORR: Operational Readiness Review; PIR: Post Implementation Review.

Note: A Project Tailoring Plan must be developed that defines what stages, activities, and artifacts will be completed for the project. The Project Tailoring Plan should reflect the unique characteristics of the project and provide the best opportunity to deliver the system effectively.


2.2 DHS Development Methodologies Projects vary in the challenges they present. They can range from a simple installation and configuration to intense software development; from tried and tested technology to bleeding edge integration; and from software development to systems development. Given the variety of project types and project characteristics, a single development methodology is not sufficient to meet the needs of all projects. This section discusses various methodologies and provides a guide for selecting the one most appropriate for optimizing project success.

2.2.1 Types of Development Methodologies (IT) History demonstrates that building information systems is a complex undertaking with high degrees of risk and uncertainty. Therefore, PMs must select the appropriate development methodology given their project’s unique characteristics. Though the spiral development methodology is the preferred development methodology in DHS for IT acquisitions, the characteristics of some project types may be best suited to other development methodologies, and PMs may choose to use other development methodologies more appropriate for their project. For a description of the Spiral, Iterative/Incremental and Waterfall methodologies, see Attachment B-1. For a description of the associated work patterns, see Attachment B-2. Regardless of the development methodology chosen, an approved Project Tailoring Plan, as described later in this section, must be developed and must justify the use of the chosen development methodology. Choosing an alternative development methodology does not relieve a Project Manager from adhering to the SELC or the ARP. The Project Tailoring Plan must describe how the proposed development methodology aligns with the SELC and ARP and why it is the best option for successfully completing the project.

2.2.2 Selecting an Appropriate Development Methodology (IT) PMs are responsible for selecting the development methodology best suited to address the characteristics of the project. The most important criterion is that the method selected for use is one that provides the project the best opportunity for successful delivery. As part of the selection process, the business requirements, risks, operational environment, and organizational impact should be considered. The Development Methodology Selection Guide presented in Table 2-1, adapted from the Department of the Air Force, Guidelines for Successful Acquisition and Management (GSAM) Condensed Version, February 2003, lists the system development methods addressed in the SELC Guide and the criteria to use in evaluating their suitability for a project. Depending on the characteristics of the project, PMs may elect to utilize a methodology other than one of those listed in Table 2-1. However, a Project Tailoring Plan must be approved (e.g., for Level 1 and Level 2 acquisitions, Table 2 in Directive 102-01). The use of Incremental and Spiral methodologies may require multiple ADE 2B reviews.


Table 2-1: Development Methodology Selection Guide

Project criteria (each criterion is rated for the Waterfall, Iterative/Incremental, and Spiral methodologies):

1 Requirements are known and stable
2 User needs are unclear/not well defined
3 New capability is innovative for organization
4 Many external dependencies (legislative approval, trade agreement, etc.) exist
5 An early initial operational capability is needed
6 Early functionality is needed to refine requirements for subsequent deliveries
7 Task duration/effort can be accurately predicted
8 Significant risks need to be addressed
9 Must interface with other systems
10 Need to integrate new or future technology
11 Software is large and complex
12 Software is small or limited in functionality
13 Software is highly interactive with user
14 User interface is major design factor
15 Initial cost and schedule estimates must be followed
16 Detailed documentation is necessary
17 Minimize impact on current operations
18 Full system must be implemented
19 Project management must be simpler
20 System must be responsive to user needs
21 Progress must be demonstrated early
22 User feedback is needed
23 Reduce the costs of fixes and corrections

Key: For each criterion, a rating in the Waterfall, Iterative/Incremental, or Spiral column indicates Recommended or Satisfactory; [none] = Not Recommended.
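Applying Table 2-1 amounts to judging each applicable criterion against the candidate methodologies and weighing the results. Some teams make that comparison explicit with a simple weighted tally, as in the Python sketch below. The numeric ratings and weights are hypothetical placeholders rather than the official Table 2-1 values, and the final choice still rests with the PM and the approved Project Tailoring Plan.

# Hypothetical ratings for a handful of Table 2-1 criteria
# (2 = recommended, 1 = satisfactory, 0 = not recommended). Illustrative only.
RATINGS = {
    "Requirements are known and stable":       {"Waterfall": 2, "Incremental": 1, "Spiral": 1},
    "User needs are unclear/not well defined": {"Waterfall": 0, "Incremental": 1, "Spiral": 2},
    "Significant risks need to be addressed":  {"Waterfall": 0, "Incremental": 1, "Spiral": 2},
    "Progress must be demonstrated early":     {"Waterfall": 0, "Incremental": 2, "Spiral": 2},
}

def score(applicable: dict[str, int]) -> dict[str, int]:
    """Sum ratings for the criteria that apply to this project, weighted by importance."""
    totals = {"Waterfall": 0, "Incremental": 0, "Spiral": 0}
    for criterion, weight in applicable.items():
        for method, rating in RATINGS[criterion].items():
            totals[method] += weight * rating
    return totals

if __name__ == "__main__":
    # Example: a project with unclear user needs and a need to show early progress.
    project_criteria = {
        "User needs are unclear/not well defined": 3,
        "Progress must be demonstrated early": 2,
    }
    print(score(project_criteria))  # the highest score suggests a candidate methodology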

2.3 Project Tailoring, Deviations, and Waivers The DHS SELC Guide should be applied in a tailored manner appropriate to project size, scope, complexity, risk, and security categorization. Tailoring is a technique that facilitates the flexibility in the design and application of an appropriate development life cycle to fit project characteristics, while ensuring compliance with requirements of the SELC Guide. Thus, the number of SELC activities and documents required for development efforts may differ between acquisitions due to tailoring based on project characteristics. Specific SELC requirements may be waived as part of an approved Project Tailoring Plan. Deviations – the approved alteration of the standard requirements of the DHS SELC Guide – are also part of the tailoring process.


For example, for acquisitions that are less complex than the average project, the SELC may be tailored to allow one or more required documents to be combined to form a single artifact. Whether the SELC is tailored or not, the project must provide evidence proving successful completion of required SELC activities and must ensure that project documents record them in sufficient detail.

A product called the Project Tailoring Plan (for IT and non-IT) is required to document the development approach for the program/project and is developed as an entry criterion to the Planning Stage of the DHS SELC (i.e., part of the ARP ADE 2B). The Project Tailoring Plan may be a discrete sub-set of the overall Project Management Plan rather than a stand-alone artifact. In accordance with Directive 102-01, for Level 1 and Level 2 programs (where not delegated), the Project Tailoring Plan is approved at the HQ level at ADE 2B, and any subsequent changes to it must be coordinated with the approval authority. For Level 3 and below projects, approval is at the Component level.

Project Managers are responsible for developing a Project Tailoring Plan that is best suited to address the characteristics of the project. To assist Project Managers, this DHS SELC Guide provides six work patterns in Attachment B-2. A work pattern is a tailored version of the DHS SELC and is based on either a standard development methodology (e.g., spiral) or other special use (e.g., COTS implementation). Project Managers may select one of the work patterns as is or tailor the selected work pattern further to meet the needs of the project. The Project Tailoring Plan should indicate how the chosen methodology will fit with the ARP review cycle. For example, spiral or incremental acquisitions may require separate ADE 2B reviews (or an ADE 2A review when requirements, AoAs, ORD segments, costs, and other planning for a new spiral were not approved at a prior ADE 2A). Most important is that the Project Tailoring Plan define the optimal development methodology that provides the project the best opportunity for successful delivery and meets all of the required standards for reviews. Evidence of plan approval should be maintained with the project documentation.


2.4 Refine Documentation

Many documents are prepared in the life cycle of a project; some are created and finalized in a single DHS SELC stage, while others are updated in successive stages after creation. Project baselines, the Project Management Plan, and the System Security Plan (SSP) are examples of documents that are refined over multiple stages. This applies particularly to security, privacy, and Section 508 Accessibility documentation (29 U.S.C. Section 508, as amended by P.L. 105-220, August 1998). For more information on project documents and their status per stage, refer to the DHS SELC Artifact Summary, found in Attachment B-5 of this document.

2.5 Other SELC Considerations The DHS SELC requires the use of a best practice project management methodology (e.g., PMI’s Project Management Body of Knowledge [PMBOK]). The requirement to use a best practice project management methodology is intended to increase the probability of successful project completion. In addition to the project management requirement, the DHS SELC requires Project Managers to evaluate and incorporate the project activities described in Table 2-2. Some of the elements described in the table, such as organizational change management, are best practices. Others, however, such as Information Security, are prescribed by DHS policy. The PM should evaluate these activities as a part of the project tailoring activity. The Project Tailoring Plan, therefore, should reflect the incorporation of the applicable and/or required activities. Detailed information, including a description of each activity, its benefits, and associated considerations, is located in Attachment B-3 of this document.

Table 2-2: Summary of Recommended Project Elements

Project/Program Management Process & Plan: A best practice methodology used to apply proven knowledge, skills, tools, and techniques to a broad range of activities in order to meet the cost, scope, and schedule requirements of a project. The project management plan must contain all applicable elements from both the best practice methodology selected for use and the requirements found in the DHS SELC Guide.

Organizational Change Management (OCM): The process of assessing the impact of change in an enterprise (e.g., a change in mission, strategy, and IT systems) to the people and culture of an organization.

Requirements Definition: The practice of determining and recording stakeholder needs through requirements gathering, analysis, specification, verification, validation, and management.

Prototyping: The creation of a temporary system for validating the interpretation of the requirements, as well as for eliciting new requirements.3

Critical Infrastructure Protection: The identification and protection of vital systems and assets that, if damaged, would present a severe impact to security, economic security, and national public health and/or safety.

3 Guide to the Software Engineering Body of Knowledge, IEEE Computer Society, 2004 Version.


Human Factors Engineering: A discipline devoted to optimizing the interaction of people, including persons with disabilities, with the devices and applications they must use in order to minimize errors and maximize efficiency.

Risk Management: A structured process for identifying and managing risk with the objective of balancing cost, schedule, and performance goals within program funding constraints.

IT Specific:

Enterprise Architecture Alignment: A strategic information asset base that defines the mission, the information and technologies necessary to perform the mission, and the transitional processes for implementing new technologies in response to the changing mission needs. An EA includes a baseline architecture, target architecture, and a sequencing plan. EA alignment refers to how well activities support the transition to the Homeland Security Target Architecture.

Information Security: A set of practices to address the protection of information and information systems from unauthorized activities and to promote a variety of assurances. DHS requires that all solutions using IT in any form are certified and accredited in accordance with DHS security policies before they are deployed in an operational environment.

Privacy Compliance: A structured review process to ensure that all privacy compliance requirements are met prior to final project implementation.

Section 508 EIT Accessibility: Federal law that requires that Electronic and Information Technology (EIT) is accessible to people with disabilities. 29 U.S.C. Section 508, as amended by P.L. 105-220, August 1998, was enacted to eliminate barriers in information and data, to make available new opportunities for people with disabilities, and to encourage development of technologies to achieve these goals. Section 508 standards are closely related to Human Factors Engineering (HFE). Any human factors engineering initiative needs to be inclusive of persons with disabilities, as optimizing interaction with devices and applications often has a significant impact on accessibility. Human Factors Engineering for users with disabilities is based on Section 508 EIT Accessibility Standards.

Independent Validation & Verification (IV&V): An examination and certification of the project performed by an organization independent of the development organization. The IV&V results enable more effective decision making.

Electronic Records Management: Requirements related to the creation, maintenance, use, and disposition of electronic records.

3. Stage A: Solution Engineering

The purpose of the Solution Engineering Stage is to conduct the systems engineering necessary to determine the optimal solution set to solve the problem defined in the MNS, which is approved at or prior to ADE 1. This stage will define the projects and other activities that will deliver capability to DHS users and stakeholders. It is important to note that this stage is the only SELC stage that focuses on the program and total capability, including potentially non-materiel solutions. To distinguish this difference, it is labeled Stage A, while the remaining stages are numbered 1-9.


3.1 Solution Engineering Activities The following sub-sections describe the activities performed in the Solution Engineering stage.

3.1.1 Document the Concept of Operation The Concept of Operations (CONOPS) describes the operational view of the proposed solution(s) from the user’s perspective. A CONOPS is used to communicate high-level, conceptual, future business and mission operations to the project sponsors, end-users, planning and design teams, and other stakeholders. Specifically, it provides the framework for the development of an operational capability and provides an understanding of the organizational impact of the proposed solution (e.g., the Business Impact Assessment). The content of the CONOPS includes a description of the operations processes and associated roles and responsibilities of operators. It addresses the full Doctrine, Organization, Training, Materiel, Leadership, Personnel and Facilities (DOTMLPF) and Regulations/Grants (RG) spectrum. For example, a CONOPS might describe how combinations of additional trained personnel, new doctrine, and modernized materiel systems could fill specific capability gaps.

The CONOPS enables users to visualize how the proposed solution will operate in the real-world environment. Therefore, it should account for likely situations, conditions, and external events. Scenarios, mission threads, and use cases are ways to add a sense of realism to the CONOPS and help assess its robustness. The CONOPS is also a vehicle to communicate the impact of the proposed solutions and investments to leaders and decision makers. CONOPS guidance is provided as Appendix F: Concept of Operations, of Instruction/Guidebook 102-01-001.

3.1.2 Identify and Analyze Potential Solution Alternatives An Analysis of Alternatives (AoA) is an analytical comparison of the operational effectiveness and other benefits, suitability, and risk of alternatives that satisfy established capability needs. Initially, the AoA process typically explores numerous conceptual solutions across DOTMLPF/RG with the goal of identifying the most promising options to achieve the desired capabilities, thereby guiding the Concept Refinement Phase. Subsequently, at ADE 2, the AoA is used to justify the rationale for formal initiation of the acquisition program. Ideas, concepts, and options across the DOTMLPF/RG solution space are identified and assessed against such criteria as effectiveness, suitability, feasibility, alignment (e.g., to the EA), timeliness, cost, and risk. The AoA seeks to reach a balanced solution that changes one or more of these elements to maximize mission outcomes. Therefore, the AoA process should trade off combinations of options within and across the DOTMLPF/RG factors and present a recommended solution for consideration by decision makers. It is important to understand that non-materiel solutions such as training or additional personnel may represent considerable investments and may need to be costed out. The AoA process incorporates cost analysis, benefit analysis, and risk analysis, and is a precursor to Life Cycle Cost Estimates (LCCEs), Acquisition Plans, and other implementation plans. The AoA should be developed in parallel with the ORD so that requirements, costs, and risks can be balanced. The DHS AoA process does not


assume that any one factor is held constant a priori. It assumes that decision makers will be given the information they need to make trade-offs among all factors, including requirements and costs. The Study Team should be aware of services that exist within DHS and should maximize their use and avoid duplication. Generally, COTS, reuse, and common solutions are preferred to unique solutions. The Study Team may recommend a preferred solution or solution set for decision makers or present decision makers with a range of alternative solutions for selection. The results of the analysis are documented in the AoA report.

3.1.2.1 AoA Study Plan The AoA Guidance in Appendix G: Analysis of Alternatives of Instruction/Guidebook 102-01-001 contains additional details and formats. The AoA is guided by an AoA Study Plan (AoASP). The AoASP will set assumptions, bounds, and constraints on the analysis of alternatives. It may rule out certain alternatives or may include others to open up the field of possibilities. A key purpose of the AoASP is to ensure unbiased exploration of a broad range of feasible alternatives and balanced solutions that address the DOTMLPF/RG spectrum. The AoASP will identify the participants and roles of the analysis team, the review and approval process for the AoA, and required resources, including the need for Subject Matter Experts from Components, industry, and FFRDCs/Labs. The AoASP is developed jointly by a representative from Headquarters appointed by the DUSM or designee and a representative from the lead Component appointed by the Component Head or designee. AoASP development should start shortly after ADE-1 and be completed within 30 days or less. Review and approval of the joint AoASP will depend on the program’s scope, size, criticality, and other key factors.

3.1.2.2 Study Plan Review (SPR) The purpose of the SPR is to review the initial study plan’s assumptions and the scope and methods of analysis. The Lead Component or development agent holds the SPR with Headquarters representation. The level of HQ participation will vary depending on several factors; for example, an AoASP for a program that has a significant impact on several Components may have representation from the DHS DUSM office and multiple CXOs.

3.1.3 Develop Life Cycle Cost Estimate The Life Cycle Cost Estimate (LCCE), documented with related assumptions and risks, is created using information from the Acquisition Plan, Work Breakdown Structure (WBS), project schedule, and information from entry criteria documents. The results are used to help define the Acquisition Program Baseline (APB). The LCCE estimates the total cost of a system from initiation through disposal. The LCCE is often developed at a more granular and precise level of detail than the Cost Benefit Analysis (CBA) cost estimates for the preferred or recommended solution. Guidance for the LCCE is provided in Appendix I: Life Cycle Cost Estimate of Instruction/Guidebook 102-01-001.
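At its core, the LCCE is a roll-up of estimated costs, typically organized by WBS element and spread across fiscal years from initiation through disposal. The short Python sketch below shows only that basic arithmetic; the WBS elements and dollar figures are hypothetical placeholders, and a real LCCE would follow the guidance in Appendix I and document its assumptions and risks.

# Minimal life-cycle cost roll-up: WBS element -> cost per fiscal year (in $M).
# All figures are hypothetical placeholders.
lcce = {
    "Development":              {2009: 12.0, 2010: 18.0, 2011: 9.0},
    "Deployment":               {2011: 6.5, 2012: 4.0},
    "Operations & Maintenance": {2012: 7.0, 2013: 7.2, 2014: 7.4},
    "Disposition":              {2015: 1.1},
}

def total_by_year(estimate: dict[str, dict[int, float]]) -> dict[int, float]:
    """Sum all WBS elements for each fiscal year."""
    years: dict[int, float] = {}
    for costs in estimate.values():
        for year, amount in costs.items():
            years[year] = years.get(year, 0.0) + amount
    return dict(sorted(years.items()))

if __name__ == "__main__":
    by_year = total_by_year(lcce)
    print("Cost by fiscal year ($M):", by_year)
    print("Total life cycle cost ($M):", round(sum(by_year.values()), 1))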


3.1.4 Define Operational Requirements ORD Guidance in Appendix H: Operational Requirements Document of Instruction/Guidebook 102-01-001 contains guidance on details and formats. Operational requirements are defined to capture the business or operational user needs without regard to the technical aspects of the system. Operational requirements are high-level requirements that describe the mission, objectives, and operational capabilities that are desired in the capability and project solution.

3.1.5 Develop Acquisition Plan The Acquisition Plan (AP) defines how all Government human resources, contractor support services, facilities, hardware, software, telecommunications, and other capabilities are acquired during the life of the program and project(s). The purpose of the AP is to ensure that the best alternatives and strategies for acquisition are used, Federal and DHS regulations and guidance are followed, and required resources can be obtained and are available when needed, across the entire lifecycle of the acquisition. For additional guidance, refer to Appendix E: Acquisition Plan Guide of Instruction/Guidebook 102-01-001. Based on the acquisition strategy selected, PMs may need to solicit and award support contracts to cover resource requirements that cannot be filled from within DHS and related stakeholder organizations. Elements of resource planning which may affect project schedule include activities such as training and the time required for project personnel to acquire security clearances.

3.1.6 Develop the Integrated Logistics Support Plan (ILSP) The Integrated Logistics Support Plan (ILSP) is a preliminary document at ADE-2 used to support the ADE-2A decision; a high-level strategy for providing supportability and sustainment that will be updated through the course of the acquisition cycle with increasing detail and fidelity as the program progresses. It should provide critical insight into the approach, schedule, and funding requirements for integrating supportability requirements into the systems engineering process to ensure supportability of the design and for developing/obtaining sustainment products. The ILSP provides the basis for assumptions and planning for life cycle costs reflected in the APB and LCCE, and is integrated with SELC requirements. Acquisitions will be independently assessed as meeting the approved ILSP at each succeeding ADE. Specific assessment criteria and certification requirements are under development. Guidance is provided as Appendix J: Supportability and Sustainment of Instruction/Guidebook 102-01-001.

3.1.7 Establish Acquisition Program Baseline Baseline performance measures should be established as the minimum standard for operations. Performance metrics should be developed to measure improvement and/or operational consistency. These metrics should be quantitative, to the extent possible, in order to avoid inconsistent evaluation of the data collected and resulting analyses. The Acquisition Program Baseline (APB):


• Establishes project performance requirements, schedule requirements, and a life cycle cost estimate (note that APBs should be detailed to the project level) as a “contract” between the Acquisition Decision Authority (ADA) and the PM.

• Includes parameters that, if not met, will result in the ADA reevaluating the project and potentially considering alternative program concepts or design approaches and re-baselining.

The Acquisition Program Baseline is a requirement of Directive 102-01. Additional Guidance is provided in Appendix K: Acquisition Program Baseline (APB) of the Instruction/Guidebook 102-01-001.
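In practice, each APB parameter is usually recorded with both a threshold (minimum acceptable) value and an objective value so that a breach can be detected and brought to the ADA for possible re-baselining. The Python sketch below illustrates that check; the parameter names, values, and breach rule are hypothetical examples, not DHS-prescribed content.

from dataclasses import dataclass

@dataclass
class APBParameter:
    """One APB performance, cost, or schedule parameter with threshold and objective."""
    name: str
    threshold: float      # minimum acceptable value (or maximum, for cost/schedule)
    objective: float      # desired value
    actual: float         # current estimate or measured value
    higher_is_better: bool = True

    def breached(self) -> bool:
        """True when the actual value fails the threshold, triggering ADA reevaluation."""
        if self.higher_is_better:
            return self.actual < self.threshold
        return self.actual > self.threshold

# Hypothetical example parameters for a project-level APB.
parameters = [
    APBParameter("System availability (%)", threshold=98.0, objective=99.5, actual=98.7),
    APBParameter("Life cycle cost ($M)", threshold=80.0, objective=70.0, actual=83.2,
                 higher_is_better=False),
]

if __name__ == "__main__":
    for p in parameters:
        status = "BREACH: reevaluate with ADA" if p.breached() else "within baseline"
        print(f"{p.name}: {status}")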

3.2 Documents The project team records the results of their efforts in a set of documents, which are presented in Table 3-1, Solution Engineering Stage Documents. All documents should be placed under CM control.

Table 3-1: Solution Engineering Stage Documents

Each product is tracked across the ARP phases (Need, Analyze/Select, Obtain, Produce/Deploy/Support) and the SELC stages (Solution Engineering, Planning, Requirements Definition, Design, Development, Integration & Test, Implementation, Operations & Maintenance, Disposition). Status codes: C = Create; F = Finalize; U = Update.

• Mission Need Statement; Governing Authority: ARP (DIR 102-01); status: C, F
• Capability Development Plan (CDP); Governing Authority: ARP (DIR 102-01); status: C/F
• Acquisition Plan; Governing Authority: ARP (DIR 102-01); status: C, U, U, U, F
• CONOPS; Governing Authority: ARP (DIR 102-01); status: C/F
• Analysis of Alternatives / Alternatives Analysis; Governing Authority: ARP (DIR 102-01); status: C/F
• Lifecycle Cost Estimate (LCCE); Governing Authority: ARP (DIR 102-01); status: C, U, U, U, U
• Operational Requirements Document (ORD); Governing Authority: ARP (DIR 102-01); status: C/F
• Integrated Logistics Support Plan (ILSP); Governing Authority: ARP (DIR 102-01); status: C, U, U, F
• Acquisition Program Baseline; Governing Authority: ARP (DIR 102-01); status: C, U, F
• Project Tailoring Plan; Governing Authority: DHS SELC; status: C/F
• Service Reuse Plan; Governing Authority: DHS SELC; status: C, U, U, U, U, U, U, F, F
• HLS EA Program Alignment Decision Request; Governing Authority: DHS EA Process; status: C, U, U, U, F, F
• Program Alignment Documentation Matrix; Governing Authority: DHS EA Process; status: C, U, U, U, F
• Map to Business Architecture; Governing Authority: DHS EA Process; status: C, U, U, U, F
• Map to OCIO Portfolios; Governing Authority: DHS EA Process; status: C, U, U, U, F
• Section 508 National Security Exception Determination; Governing Authority: DHS OAST; status: C/F
• Critical Infrastructure Protection Report; Governing Authority: DHS CISO; status: C, F
• FIPS 199 Security Categorization; Governing Authority: DHS CISO; status: C, U
• Preliminary Security Risk Assessment; Governing Authority: DHS CISO; status: F
• Project Authorization Review Approval Letter; Governing Authority: DHS SELC; status: C/F


If the program is not approved due to issues and/or risks, a remediation plan must be developed and approved by the approval authority. The approval authority will establish the timeframe for the next SER approval opportunity.

3.2.1 Solution Engineering Review Participants Table 3-2 lists the suggested participants for the SER. Because acquisitions have different characteristics, additional subject matter experts (e.g., architecture, performance, data, contracting, budget, infrastructure, etc.) may be needed for specific expertise.

Table 3-2: Participants for Solution Engineering Review

Program Sponsor:
• Validates that the program is aligned with the project’s objectives
• Validates that the program is within cost, schedule, and performance constraints
• Validates that all risks are defined and manageable

Program Manager:
• Presents program status across all aspects of the program (technical, cost, schedule, risk)

For IT Acquisitions

CIO:
• Validates that the program has been sufficiently engineered to enter the project stages and begin delivering incremental capabilities
• Validates that all exit criteria are met
• Approves/disapproves program to proceed

DHS IT Portfolio Manager(s):
• Validates that IT Portfolio objectives are appropriately documented in requirements
• Identifies/validates if an overlap of functionality exists between the program and investments/systems in the IT Portfolio

For Non-IT Acquisitions

Component Acquisition Group:
• Validates that the program has been sufficiently engineered to enter the project stages and begin delivering incremental capabilities
• Validates that all exit criteria are met
• Approves/disapproves program to proceed

APMD with Component Acquisition Group:
• Validates that Portfolio objectives are appropriately documented in requirements
• Identifies/validates if an overlap of functionality exists between the program and investments/systems in the Portfolio

3.2.2 Exit Criteria The objective of the SER is to evaluate the documents produced and to ensure that all exit criteria are satisfied for a given program. Table 3-3 lists the exit criteria for the Solution Engineering stage.


Table 3-3: Solution Engineering Stage Exit Criteria

Program Management:
• Have all documents defined in the approved Project Tailoring Plan been completed and reviewed for completeness?
• Has a CONOPS for the proposed system been developed and validated by users that shows how the preferred solutions would work in the real-world environment, fill existing gaps, and meet new challenges?
• Does the AoA address the full spectrum of DOTMLPF/R/G/S alternatives?
• Does the AoA address all possible options, and is it unbiased toward one or another (type) of solution?
• Have the feasible options been traded off in the AoA to arrive at an optimized solution that balances mission effectiveness, suitability, performance, cost, schedule, and risk within realistic constraints?
• Has an ORD been developed that captures Key Performance Parameters (including interoperability if applicable), Critical Operational Issues for operational testing, and derived technical parameters, as well as IOC and FOC requirements?
• Have non-materiel solutions been identified to provide holistic solutions to gaps?
• Have the total lifecycle costs, including support/sustainment, been captured within sensitivity ranges in a LCCE?
• Have all changes to policies and/or regulations that require long lead times been identified and included in the plans, and has the likelihood of such changes been included in the risk analysis?
• Has an APB been developed that is aligned to program formulation, includes an integration segment if applicable (e.g., for SoS), provides realistic and pertinent performance, cost, and schedule parameters for each project (useful segment), and is consistent with the ORD, AoA, CONOPS, and other projects?
• Does the schedule include time for staff to acquire security clearances, if necessary?
• Has a Project Management Plan (PMP), including documentation of project scope, tasks, schedule, allocated resources, and interrelationships with other projects, been developed?
• Does the integrated master schedule include project resourcing, discrete work packages, internal and external dependencies, and critical path, to the extent program formulation has identified projects?
• Does the Business Strategy adequately address the most effective mechanisms for each project?
• Have the risks been reviewed and are they deemed acceptable to move to the next stage?
• Is the Service Reuse Plan still accurate and complete?
• Does the cost estimate in the APB and LCCE fit within the existing budget?
• If the project does not qualify for a National Security Exception, has a Section 508 EIT Accessibility Plan been prepared?

Requirements:
• Have operational scenarios been analyzed and defined in the CONOPS?
• Have the business requirements collected to date been specified in clear, meaningful, and testable format using “shall” statements?
• Have all the business requirements been reviewed by the acceptance test team to ensure that the requirements are clear, meaningful, and testable?

Information Security:
• Has the security categorization been completed?
• Has the preliminary risk assessment been completed?

Privacy Compliance:
• Has a Privacy Threshold Analysis (PTA) been completed and approved by the Privacy Office?

Software Engineering:
• Have business or operational requirements been validated by end users?

Data Management:
• If appropriate (if performing inter-component or inter-agency information exchanges), has information exchange scenario planning been completed?


4. Stage 2: Planning

The purpose of the Planning Stage is to validate that sufficient analysis and planning have been conducted to begin the project. All facets of the project are analyzed to ensure that the cost, scope, and schedule are technically feasible and acceptable to stakeholders. This includes, but is not limited to, creating a project plan for the development of the proposed capability that addresses the stated business need. At the end of the Planning Stage, the project team and stakeholders should have a comprehensive project management plan (Table 4-1) that includes an integrated schedule for development, and cost estimates. The stage Exit Review for the Planning stage, the Project Planning Review (PPR), is held at the completion of the Planning Stage and ensures that project resources, activities, schedules, and tools are allocated and baselined prior to advancing to Stage 3, Requirements Definition.

Table 4-1: DHS IT Project Management Plan Elements

DHS SELC Common Project Management Plan Elements

No. Element Name

1 Project initiation documents: project charter, authority, sponsor, scope, project team organizational structure, list of stakeholders, and stakeholder roles and responsibilities

2 Resource plan

3 Relationship agreements (if any)

4 Work Breakdown Structure

5 Integrated Master Schedule

6 Identification/definition of management processes to be used

7 Identification of management reviews and oversight authority

8 Communication plans

4.1 Planning Entry Criteria Relative to the ARP, the Planning Stage occurs after ADE-2B. Many activities are performed and documents produced as part of the DHS Analyze/Select and Obtain phases of the ARP and the pre-select and select phases of the CPIC process that occur prior to the initiation of DHS SELC processes for a project. (See Section 1.4 for more information on the alignment of the SELC to the DHS ARP.) A high-level assessment of the need (documented in the Mission Need Statement) is developed into an ORD, alternative solutions are analyzed, and a lifecycle cost estimate, an APB, and an AP are developed; with approval of the acquisition at ADE-2A and 2B, contracting and development actions can begin.


Additionally, the DHS SELC requires two new documents that must be approved by the ADE-2B ADA: the Project Tailoring Plan and the Service Reuse Plan (for IT only). The Project Tailoring Plan is developed to define any SELC adaptation required to meet the needs of the project and the technical solution, while the Service Reuse Plan is created to define those services that the project will create for reuse by other systems and those that are already in service and the project plans to reuse. Table 4-2 lists the specific SELC documents that must be approved by the ADE-2B ADA to enter the Planning stage. Refer to the Directive 102-01 for the complete list of documents required for the ADE.

Table 4-2: Planning Stage Entry Criteria

Line No. Documents Description

1 Project Tailoring Plan The Project Tailoring Plan documents the system development approach in terms of the proposed SELC stages, activities, documents, and exit criteria.

2 Service Reuse Plan (IT Only)

The Service Reuse Plan identifies the service(s) that a project plans to reuse, enhance, or create new as part of the HLS EA Alignment review. The Service Reuse Plan identifies and defines the use of DHS services to meet project needs and to support deployment of the DHS SOA.

4.2 Planning Activities Following the successful completion of ADE-2B of the ARP process, the Planning stage of the SELC begins. The following sub-sections describe the activities performed in the Planning stage.

4.2.1 Develop SE Risk Management Plan The Risk Management Plan details the process used to identify, document, track, mitigate, and report project risk. Numerous best practices are available to provide guidance on the development of an effective and efficient Risk Management Plan, such as those from organizations like the Software Engineering Institute (SEI) (developer of the Capability Maturity Model Integrated [CMMI]) and the Project Management Institute (PMI). Acquisitions are encouraged to use an automated risk management tool.
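Whatever best-practice framework is selected, the resulting risk register typically records each risk with a likelihood and an impact rating so that exposure can be ranked and mitigations tracked. The Python sketch below shows that common pattern under an assumed 1-5 scoring scale; the example risks are hypothetical, and an automated risk management tool would normally provide this capability.

from dataclasses import dataclass

@dataclass
class Risk:
    """One risk register entry; 1-5 scales for likelihood and impact are assumed."""
    risk_id: str
    description: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (negligible) .. 5 (severe)
    mitigation: str

    @property
    def exposure(self) -> int:
        """Simple likelihood x impact score used to rank risks."""
        return self.likelihood * self.impact

register = [
    Risk("R-01", "Key interface partner slips delivery", 4, 3, "Negotiate interim stub interface"),
    Risk("R-02", "Security clearance delays for staff", 3, 4, "Start clearance processing early"),
    Risk("R-03", "COTS product version reaches end of life", 2, 2, "Track vendor roadmap"),
]

if __name__ == "__main__":
    for risk in sorted(register, key=lambda r: r.exposure, reverse=True):
        print(f"{risk.risk_id} (exposure {risk.exposure}): {risk.description} -> {risk.mitigation}")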

4.2.2 Develop SE Quality Assurance Plan The Quality Assurance (QA) Plan documents the mechanism for verifying that the delivered products satisfy contractual agreements, meet or exceed quality standards, and comply with the processes specifically identified for use via the tailored instance of the DHS SELC Guide approved for the project. The QA Plan identifies the requirements, approach, activities, and approvals required to assess and validate project products and deliverables against requirements.

4.2.3 Develop Training Plan The Training Plan outlines the objectives, needs, strategy, and curriculum for training users on the new or enhanced system. The plan presents the activities needed to support the development of training materials, coordination of training schedules,


reservation of personnel and facilities, planning for training needs, and other training-related tasks. If appropriate, the training plan should be jointly created with the project’s Organizational Change Management team. Training activities are developed to teach user personnel the use of the system as specified in the training criteria. The plan includes the target audience and topics on which training must be conducted. It includes how the topics will be addressed, the format of the training program, and the list of topics to be covered, materials, time, space requirements, and proposed schedules.

4.2.4 Develop Configuration Management Plan The CM Plan describes the process that will be used to identify, manage, control, and audit the system documentation and baseline. The plan should define the CM policy, procedures, structures, and roles/responsibilities to be used in executing configuration management. The plan should address the identification of configuration items, an assessment of the impact of change, and the configuration management tool and process (e.g., document versioning, storage, retrieval). The plan should also define the Change Control Board (CCB) and identify the processes used by the CCB in evaluating and approving changes to the system. The process should include assessments of both the cost and schedule impact of making changes. Acquisitions have the option of using a program-level CM Plan, if one exists. However, the PM is responsible for reviewing the program-level CM Plan to ensure that it directly applies to the project as it is written. If changes are needed to the program-level CM Plan, the PM should update the CM Plan to specifically address the project’s needs.
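The CCB’s evaluation of a proposed change, including its cost and schedule impact, can likewise be captured as a small structured record so that dispositions are auditable. The Python sketch below is a minimal illustration under assumed field names and escalation thresholds; it is not a DHS-prescribed format or rule.

from dataclasses import dataclass

@dataclass
class ChangeRequest:
    """A change request evaluated by the Change Control Board (CCB)."""
    cr_id: str
    summary: str
    cost_impact_usd: float
    schedule_impact_days: int

def ccb_disposition(cr: ChangeRequest,
                    cost_limit: float = 50_000.0,
                    schedule_limit: int = 10) -> str:
    """Assumed rule of thumb: small impacts may be approved at the CCB;
    larger ones are escalated for a program-level decision."""
    if cr.cost_impact_usd <= cost_limit and cr.schedule_impact_days <= schedule_limit:
        return "Approve at CCB"
    return "Escalate for program-level review"

if __name__ == "__main__":
    cr = ChangeRequest("CR-0042", "Add audit logging to interface", 35_000.0, 7)
    print(cr.cr_id, "->", ccb_disposition(cr))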

4.2.5 Complete Privacy Threshold Analysis (As Applicable) As part of the planning process, the project must complete a Privacy Threshold Analysis (PTA) to determine if the proposed technology, system, or program collects, maintains, and/or shares information in a form that could possibly identify or be used to identify individuals. The PTA defines the initial assessment of privacy protection issues. The PTA is a short form that asks for information related to:

• The type and status of the project
• Whether the project collects, maintains, or shares information that could relate in any way to an individual
• The nature of the information used by the project
• The status of the project in the C&A process

The PTA Template is available from the website of the DHS Privacy Office found on DHS Online and via email: [email protected]. Once completed, the PTA form should be sent to the DHS Privacy Office for review and validation. The information from the PTA is used by the DHS Privacy Office for an initial determination regarding the existence of potential privacy issues that would trigger more comprehensive privacy compliance documentation, the Privacy Impact Assessment (PIA), and the legal document called the System of Records Notification (SORN).
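The Privacy Office makes the actual determination, but the screening the PTA supports can be pictured simply: if the answers indicate that information relating to individuals is collected, maintained, or shared, more comprehensive privacy documentation (a PIA and, where a system of records is involved, a SORN) is likely to follow. The Python sketch below only illustrates that screening logic under assumed field names; it does not replace completion and submission of the PTA form to the DHS Privacy Office.

from dataclasses import dataclass

@dataclass
class PTAAnswers:
    """Simplified stand-in for the PTA questions (field names are assumptions)."""
    project_type: str
    handles_information_about_individuals: bool
    information_retrieved_by_personal_identifier: bool
    ca_status: str    # status of the project in the C&A process

def likely_privacy_documentation(pta: PTAAnswers) -> list[str]:
    """Indicative screening only; the DHS Privacy Office makes the final determination."""
    needed: list[str] = []
    if pta.handles_information_about_individuals:
        needed.append("Privacy Impact Assessment (PIA)")
        if pta.information_retrieved_by_personal_identifier:
            needed.append("System of Records Notification (SORN)")
    return needed

if __name__ == "__main__":
    answers = PTAAnswers("IT system", True, True, "Certification planned")
    print(likely_privacy_documentation(answers) or ["No further privacy documentation indicated"])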


4.2.6 Develop Section 508 Electronic and Information Technology Accessibility Plan (As Applicable) Section 508 (citation: 29 U.S.C. Section 508, as amended by P.L. 105-220, August 1998), mandates access to the use of information and data for persons with disabilities that is comparable to that enjoyed by persons without disabilities. All DHS IT system development efforts must have a Section 508 Electronic and Information Technology Accessibility Plan that defines its approach to meeting Section 508 requirements. In the initial stage of planning, it is important to determine if Section 508 technical standards apply or whether any exception to Section 508 applies based on the information available. Most exceptions cannot be determined at this point, as most are driven by the business needs and often do not fully exempt an entire project from providing comparable access. One exception that should be reviewed at this early stage is the Section 508 National Security Exception (36CFR1194.2a). If the project or part of the project qualifies for a National Security Exception, this would reduce the required planning for the remainder of the life cycle. Section 508 National Security Exceptions require approval from the DHS Office of Accessible Systems and Technology (OAST). The determination of whether or not a Section 508 National Security Exception applies can be ascertained from the answers to the following questions regarding project scope:

• Is it for the command and control of military forces?
• Is it intended to be part of a weapons system?
• Does it include cryptologic activities used to support National Security?

For a complete description of the determination criteria, refer to National Institute of Standards and Technology (NIST) Special Publication 800-59, Guideline for Identifying an Information System as a National Security System. If the project does not qualify for a National Security Exception as determined by OAST, a Section 508 EIT Accessibility Plan must be developed and include both functional and technical requirements, design, integration, and testing as well as continued operations and support. Information on EIT Accessibility is presented in Attachment B, section B2.11 later in this document.

4.2.7 Develop System Data Management Plan (IT Only) The Data Management Plan identifies the information needs, data requirements, data conversion, and data security strategies. The goals of data management include re-use of existing resources through the discovery of available data services and data repositories in order to provide timely, accurate information and supporting data protection. Project staff need to know: 1) what data are available, 2) the quality of the data, 3) how the data are used, 4) how to incorporate the data into resource management decisions, and 5) how the data will be managed over time. The Data Management Plan is a document whose content evolves over multiple phases. Incorporation of reviews of the Data Management Plan after each stage by key reviewers and data management resources ensures that impacts to the Component and Enterprise data architecture are evaluated and incorporated, where necessary.


A key activity in the Data Management Plan is the assessment of alignment products to the DHS Data Reference Model. The assessment results in the creation of several products (i.e., the Program Data Architecture alignment, Data Asset and Logical Data Model registration as well as Information Exchange planning). If inter-Component or inter-agency information exchanges (transfer of information between two parties) will be developed, the information exchanges must be developed in accordance with the National Information Exchange Model (NIEM) IEPD. The Data Management Plan must address NIEM scenario planning activities. Scenario planning is a business analysis approach to identifying information exchanges and is the first phase in the IEPD development methodology. For more information on the data management plan, refer to the Enterprise Data Management Office (EDMO).

4.3 Documents The project team records the results of their efforts in Stage 1 Planning in a set of documents, which are presented in Table 4-3, Planning Stage Artifact Summary. Documents must be traceable to the project Work Breakdown Structure (WBS). All documents should be placed under CM control. For a full listing of the documents, including others that get updated during the Planning stage, see Attachment B-5 of this document.

Table 4-3: Planning Stage Documents

Each product is tracked across the ARP phases (Need, Analyze/Select, Obtain, Produce/Deploy/Support) and the SELC stages (Solution Engineering, Planning, Requirements Definition, Design, Development, Integration & Test, Implementation, Operations & Maintenance, Disposition). Status codes as in Table 3-1: C = Create; F = Finalize; U = Update.

• Project Management Plan (Includes Integrated Master Schedule) (PMP); Governing Authority: DHS SELC; status: C, U, U, U, U, U, F
• Configuration Management Plan; Governing Authority: DHS SELC; status: C, F
• Privacy Threshold Analysis (PTA); Governing Authority: Privacy Office; status: C/F
• Section 508 EIT Accessibility Plan; Governing Authority: DHS OAST; status: C, U, U, U, U, U, F
• Risk Management Plan; Governing Authority: DHS SELC; status: C/F
• Quality Assurance Plan; Governing Authority: DHS SELC; status: C/F
• Data Management Plan; Governing Authority: DHS SELC; status: C, F
• Training Plan; Governing Authority: DHS SELC; status: C/F
• Project Planning Review Approval Letter; Governing Authority: DHS SELC; status: C/F

4.4 Stage Review and Exit Criteria The objective of the Project Planning Review (PPR) is to evaluate project readiness to proceed to Stage 2, Requirements Definition. The PPR is conducted by the project sponsor and the Project Manager to evaluate whether the products produced satisfy stated requirements, with results being approved by the approval authority. As the basis for evaluation, the PPR uses the set of exit criteria to evaluate completion of activities and products from Stage 1.


If approved, the results of the PPR are recorded in a document called the Project Planning Review Approval Letter, which must be signed by the approval authority. However, it is expected that the approval authority will rely on the appropriate experts (e.g., EA, testing, security, budget) to evaluate the readiness of each project to proceed. The PPR Approval Letter is countersigned by the PM. A scanned electronic copy of the signed stage review approval letter, the latest version of the Project Tailoring Plan, and the updated project management plan (and associated project schedule) must be emailed to the DHS Periodic Reporting team at [email protected]. If the project is not approved due to issues and/or risks, a remediation plan must be developed and approved by the approval authority. The approval authority will establish the timeframe for the next PPR approval opportunity.

4.4.1 Exit Criteria The objective of the PPR is to evaluate the products produced and to ensure that all exit criteria are satisfied for a given project. Table 4-4 lists the exit criteria for the Planning stage.

Table 4-4: Planning Stage Exit Criteria

Project Management:
• Have the risks been reviewed and are they deemed acceptable to move to the next stage?
• Do the plans accurately reflect the type of development methodology (e.g., spiral, waterfall, iterative) identified in the Project Tailoring Plan?
• Have all products defined in the approved Project Tailoring Plan been completed and reviewed for completeness?
• Has a Business CONOPS for the proposed system been developed?
• Have all changes to policies and/or regulations that require lead times been identified and included in the schedule?
• Has access to the Risk Management Database been granted to appropriate project team members?
• Has a WBS been fully developed?
• Is the project schedule executable (technical, cost)?
• Does the schedule include time for staff to acquire security clearances?
• Does the integrated master schedule include project resourcing, discrete work packages, internal and external dependencies, and critical path?
• Has a Project Management Plan (PMP), including documentation of project scope, tasks, schedule, allocated resources, and interrelationships with other projects, been developed?
• Is the Service Reuse Plan still accurate and complete?
• Does the updated cost estimate fit within the existing budget?
• If the project does not qualify for a National Security Exception, has a Section 508 EIT Accessibility Plan been prepared?

Requirements:
• Have operational scenarios been analyzed and defined?
• Have the business requirements collected to date been specified in clear, meaningful, and testable format using “shall” statements?
• Have all the business requirements been reviewed by the acceptance test team to ensure that the requirements are clear, meaningful, and testable?

Information Security:
• Has the security categorization been completed?
• Has the preliminary risk assessment been completed?

Privacy Compliance:
• Has a Privacy Threshold Analysis (PTA) been completed and approved by the Privacy Office?


Configuration Management:
• Has an initial CM Plan been defined?
• Have CM tools and processes been specified, and has tool access been granted to appropriate project team members?

Software Engineering:
• Have business requirements been validated by end users?

Data Management:
• If appropriate (if performing inter-component or inter-agency information exchanges), has information exchange scenario planning been completed?

5. Stage 2: Requirements Definition

The purpose of the Requirements Definition stage is to gather, analyze, and document requirements, including functional, performance, and data requirements. Documents related to requirements from earlier activities (i.e., MNS, ORD, CONOPS) form the basis for further user needs analysis and the development of detailed functional requirements. If analysis activities reveal new insights into the overall requirements, all documents must be revised to reflect this new information. Multiple iterations of the Requirements Definition stage may occur in the project life cycle, especially if the project strategy includes life cycle segments for prototypes or pilots, or if it uses a spiral or iterative/incremental development methodology. During each iteration, the documents appropriate for the subset of project requirements allocated to the specific iteration are revisited and further refined, as necessary. The Requirements Definition Stage continues until the requirements are defined in enough detail that design can begin.

Requirements must include considerations for performance, capacity, growth, and continuity of services (Continuity of Operations and Disaster Recovery), Accessibility (29 U.S.C. Section 508, as amended by P.L. 105-220, August 1998), and must include entries to meet the needs of persons with disabilities as required by the Rehabilitation Act of 1973, as amended in 1998. Requirements should also include security, privacy, entries for electronic records management and the records disposition schedule, and requirements unique to that Component. Acquisitions are encouraged to use an automated requirements management tool.

Formal establishment of Service Level Agreements (SLA) should begin during the Requirements Definition stage to allow time for all parties to establish an agreement that meets the needs of the project. The writing of a Memorandum of Understanding (MOU) or an SLA should simply be the restating of a subset of the requirements captured during the Requirements Definition stage. A System Definition Review (SDR) verifies that the exit criteria for the Requirements Definition Stage have been met; successful completion of the SDR authorizes the project to proceed to the Design Stage.

5.1 Requirements Definition Activities
The following sub-sections describe the activities performed in the Requirements Definition stage.


5.1.1 Develop Functional Requirements
The Functional Requirements Document (FRD) formally describes requirements in terms of system function, inputs, processes, outputs, external interfaces, performance, system availability, and maintainability. Each requirement must be measurable, testable, and traceable to a source. Source documents from which the functional requirements are derived include the ORD, MNS, CONOPS, and other appropriate sources. Requirements identification and analysis should ensure that hidden requirements critical to the success of the mission are identified and captured in the FRD.

The FRD is the complete listing of user-oriented functional and data requirements for the system, including functional (36 CFR 1194.31) and technical requirements (36 CFR 1194.21-26) as set forth in 29 U.S.C. Section 508, as amended by P.L. 105-220, August 1998. The functional requirements serve as the foundation for the definition of the system requirements and for the design and development of the solution. At a minimum, requirement definitions should include a unique number, priority, criticality, feasibility, risk, source, and type. Note that non-functional requirements (requirements that specify criteria that can be used to judge the operation of a system rather than specific behaviors, such as availability) must also be captured and described; the system will not be correctly specified unless these requirements are also identified. In the case where inter-Component or inter-Agency information exchanges are being developed, NIEM requirements must be documented in the NIEM IEPD document.
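The attribute set described above can be illustrated with a simple structured record. The following sketch is purely illustrative and uses hypothetical field names; it is not a DHS-prescribed format or tool, only one way a project team might capture FRD entries so that each requirement carries the minimum attributes listed above and non-functional requirements are flagged alongside functional ones.

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    """One illustrative FRD entry; attribute names are hypothetical, not DHS-mandated."""
    req_id: str        # unique number, e.g., "FRD-023"
    text: str          # "shall" statement, written to be measurable and testable
    req_type: str      # "functional" or "non-functional" (e.g., availability)
    source: str        # traces back to the ORD, MNS, CONOPS, or another source document
    priority: str      # e.g., "high", "medium", "low"
    criticality: str   # mission impact if the requirement is not met
    feasibility: str   # assessment of how achievable the requirement is
    risk: str          # risk associated with implementing (or failing to meet) it
    section_508: bool = False  # flags Section 508 accessibility applicability

# Example entry (illustrative content only)
example = Requirement(
    req_id="FRD-023",
    text="The system shall record an audit entry for each user login attempt.",
    req_type="functional",
    source="ORD 4.2.1",
    priority="high",
    criticality="mission-essential",
    feasibility="proven technology",
    risk="low",
)
```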

5.1.2 Develop Requirements Traceability Matrix
The requirements traceability matrix (RTM) is created to map each detailed functional requirement to its source and is updated throughout the systems engineering life cycle. The RTM provides traceability from business requirements and other sources to the FRD and into the test cases. As the life cycle progresses, additional traceability will be added from the FRD to the System Requirements Document (SRD), into high-level design elements, into detailed design components, and finally into test cases and procedures.
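As a rough illustration of the traceability chain described above, the sketch below links each functional requirement to its source and to the test cases that will verify it, and leaves placeholders for the SRD and design references that are added as the life cycle progresses. The structure and identifiers are hypothetical, not the DHS RTM template.

```python
# Minimal illustrative RTM: requirement ID -> source, downstream references, test cases.
rtm = {
    "FRD-023": {
        "source": "ORD 4.2.1",   # business/operational requirement it derives from
        "srd_refs": [],          # filled in once system requirements are defined
        "design_refs": [],       # high-level and detailed design elements
        "test_cases": ["TC-107", "TC-108"],
    },
}

def untraced_requirements(rtm: dict) -> list[str]:
    """Return requirement IDs that have no verifying test case yet."""
    return [req_id for req_id, row in rtm.items() if not row["test_cases"]]

print(untraced_requirements(rtm))  # [] once every requirement maps to at least one test case
```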

5.1.3 Evaluate Infrastructure and Environmental Aspects
In addition to traditional IT requirements identification, DHS IT project teams must also evaluate technology infrastructure, environmental and process needs, and non-IT aspects. Any requirements identified during this evaluation must also be documented in the FRD. Findings from the infrastructure evaluation should be used to develop the Site Preparation Plan, which describes activities required to prepare facilities and environments (e.g., development, integration, test, production, data center, off-site storage) in time for installation and operation of the solution at all locations, based on an analysis of existing conditions and the solution design requirements. This includes all aspects of facilities preparation (construction, electrical, networking) and compliance with all applicable legal requirements, including environmental requirements and those for Section 508 Accessibility, privacy, and security.

With regard to environmental impact, the National Environmental Policy Act (NEPA) requires Federal agencies to integrate environmental values into their decision-making processes by considering the environmental impacts of their proposed actions and reasonable alternatives to those actions. To meet this requirement, Federal agencies prepare a detailed statement known as an Environmental Impact Statement (EIS). While this activity does not apply to some projects, PMs should consider the impact of their efforts on the environment. For example, if a project plans to deploy wireless technology, the project should consider possible effects the wireless communications equipment might have on the environment.

5.1.4 Develop Test and Evaluation Master Plan
The Test and Evaluation Master Plan (TEMP) is the overarching program-level planning document for Test and Evaluation (T&E)-related activities for major acquisition programs. The TEMP describes the Developmental Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E) that need to be conducted to determine system technical performance, operational effectiveness/suitability, and limitations. The TEMP identifies all critical technical characteristics and operational issues and describes the objectives, responsibilities, resources, and schedules for all completed and planned T&E, including modeling and simulation tools used in the T&E process. The TEMP also describes all subordinate plans (e.g., DT&E and OT&E plans) and required reports (e.g., DT&E and OT&E reports), and assigns responsibility for preparing and approving these plans and reports.

Most acquisition programs are required to have an approved TEMP and subordinate test plans prior to commencing associated T&E unless a specific waiver is granted by the DHS Science & Technology (S&T) Director, T&E and Standards Division. Appendix L: Test and Evaluation Master Plan, of Instruction/Guidebook 102-01-001 provides guidance and a template for the TEMP.

The PM is responsible for coordinating the overall T&E strategy and approach for the program. The PM should prepare a TEMP in accordance with the TEMP template as outlined in this document after approval of the initial Operational Requirements Document (ORD). The program's TEMP should be approved by the Component Acquisition Executive (CAE) and the DHS S&T Director, T&E and Standards Division prior to formal submission to the Acquisition Decision Authority (ADA) at Acquisition Decision Event (ADE) 2.

The fundamental purpose of T&E in an acquisition program is to verify attainment of technical performance (in Developmental Testing) and of required operational effectiveness and suitability (in Operational Testing). As a system undergoes design, development, and integration, the emphasis in testing moves gradually from DT&E, which is concerned mainly with validating the contract requirements and the attainment of engineering design goals and manufacturing process objectives, to OT&E, which focuses on verifying operational effectiveness and suitability.

The TEMP ensures that, at a minimum, critical capabilities of the system are adequately tested and can be implemented. The TEMP documents the scope, content, methodology, sequence, management of, and responsibilities for test activities. The PM must consider the appropriate testing approach. Usually, depending upon the system being developed, more than one approach is needed, e.g., functional (behavioral), specification-based (requirements), stress, and exploratory tests.


Figure 5-1 shows the relationship of developmental and operational testing, the types of tests conducted under each, and the capstone governing documents for each. Under the overall strategic guidance of the TEMP, the PM develops a Developmental Test Plan (DTP). The TEMP describes testing at a strategic level (mission, goals, objectives), and the DTP at a tactical, more technical level.

Figure 5-1: Developmental Testing and Operational Testing and Key Documents

5.1.5 Developmental Test Plan
The Developmental Test Plan (DTP), guided by the capstone strategic TEMP, describes the purpose, activities, scope, responsibilities, schedule, resources, and other information for developmental testing. It is prepared by the PM. Unit, integration, and functional testing are performed under the direction of the PM. System Testing, Security Test and Evaluation (ST&E for IT systems), and User Acceptance Testing are performed independently of the development team. Acceptance tests will be performed by user groups in a test environment that duplicates the production environment as much as possible. The user groups must ensure that:

• Requirements are defined in a manner that is verifiable


• Requirements support the traceability from the source documentation to the design documentation to the test documentation

• Functional requirements are properly implemented by the system in the production environment

The DTP should have a schedule for DT that shows the DT activities in the context of the TEMP schedule of activities and describes any dependencies between DT and OT. The DTP should also address the data required to test the system and any procedures required to develop test data or to protect sensitive data. Note that any DHS data (non-fictitious data containing information about people or resources) used in testing must meet privacy compliance requirements, which do not distinguish between the use of data for operational systems and use of data during testing, pilots, or prototypes. This means that before actual testing occurs, all required privacy compliance documentation must be completed and approved. This policy also applies to OT.

5.1.6 Complete Information Security Activities (IT Only)
The Office of Information Security maintains both a classified and an unclassified version of the CISO-approved C&A tool. DHS 4300A, Sensitive Systems Handbook, applies to all unclassified systems and is used as the reference throughout this document. For classified systems, all references should be made to DHS 4300B, National Security Systems Handbook. Access to the classified or unclassified version of the CISO-approved C&A tool should be obtained through the Component's Information System Security Manager (ISSM).

5.1.6.1 Develop Security Requirements Traceability Matrix (IT Only)
During the Requirements Definition stage, initial steps toward the system C&A are accomplished. The purpose of these initial steps is to analyze the threats to and vulnerabilities of the system, to determine the potential for losses or compromise, and to use the analysis as a basis for identifying appropriate security controls for reducing risk. The Security Requirements Traceability Matrix (SRTM) documents the security requirements for each system. It is automatically generated by the CISO-approved C&A tool through the use of a questionnaire that facilitates the selection of appropriate requirements and controls prior to generating the SRTM. Each IT project must develop an SRTM, which becomes the basis for the security design, certification, and accreditation activities for the project.

5.1.6.2 Develop Plan of Action and Milestones (IT Only)
The Plan of Action and Milestones (POA&M) document is a tool that describes plans and associated tasks to correct identified weaknesses that can potentially be harmful to DHS systems. The POA&M is also used to document progress in correcting previously identified weaknesses; it is updated often throughout the systems engineering life cycle. For more information on the POA&M, refer to DHS 4300A.


5.1.6.3 Initiate Development of System Security Plan (IT Only)
The System Security Plan (SSP) provides an overview of the security requirements of the system and describes the controls that are currently in place or planned to meet security requirements. The SSP details the management, technical, and operational security controls required for the system based on the type of information being processed and the degree of sensitivity and risk. DHS 4300A and DHS 4300B prescribe the SSP requirements based on the C&A process appropriate for the system. The CISO-approved C&A tool currently provides SSP templates. Additionally, the security team must:

• Develop the ST&E in support of the C&A process.
• Identify necessary security metrics and plan for data collection and reporting.
• Enter initial system information in the CISO-approved tool for FISMA reporting; a completed and validated Privacy Threshold Analysis must also be loaded into the Trusted Agent FISMA tool.

5.1.6.4 Develop Security Risk Assessment (IT Only)
Like the SRTM, the Security Risk Assessment (SRA) is part of the system C&A process. As outlined in the Department of Homeland Security Certification and Accreditation Guidance for SBU Systems User's Manual, the steps for conducting a security risk assessment include:

• Inventory and System Boundary Verification
• Information Security Categorizations4
• Determination of E-Authentication Requirements5
• Risk Assessment
• Creation of C&A package using the CISO-approved C&A tool

The Department of Homeland Security Certification and Accreditation Guidance for SBU Systems User's Manual contains step-by-step directions for completing the security risk assessment as well as information about obtaining accounts for the CISO-approved C&A tool. The CISO-approved C&A tool contains templates for completing the Risk Assessment document. If contractors are used on a project, the contract must include the appropriate Homeland Security Acquisition Regulations clauses that identify the appropriate level of security clearance required of the personnel fulfilling the resource requirements for the project team and that indicate the method to be used to ensure that these and other security requirements are met. For more information regarding the SRA, refer to DHS 4300A.

4 DHS Information Security Categorization Guide.
5 DHS Information Security Categorization Guide.


5.1.6.5 Develop Security Test and Evaluation Plan (IT Only)
The ST&E Plan, part of the System Security Plan, is a document developed to support the certification and accreditation process in accordance with DHS policy found in DHS 4300A. The ST&E Plan documents the specific testing of the security features to be implemented within the system and other related systems. For more information regarding the ST&E, refer to DHS 4300A.

5.1.7 Develop Service Level Agreements
In order to facilitate project success, formal agreements (SLAs) should be developed with all organizations (internal and external) that will contribute to and/or share resources with the project. In this case, the term "resources" refers to infrastructure assets, DHS SOA services, personnel, and funding. SLAs may or may not require a monetary exchange, but will always involve some type of exchange transaction. Thus, these agreements should be in the form of a Memorandum of Agreement (MOA) or an MOU so that both parties are aware of and formally agree to the defined terms and conditions. For IT systems that have interfaces to other systems, an SLA, at minimum, should be developed. An Interface Control Document (ICD) or IEPD may be needed. See Attachment C.15 for more guidance.

When the SLA is defining the interaction for an IT service that is not already in the service component reference model, the service(s) should be submitted for insertion into the HLS EA service component reference model. The service component reference model categorizes both "above the line" services (services that cross Component organization boundaries) and "below the line" services (services that are internal to a Component). The insertion process varies depending on whether the service is "above the line" or "below the line." "Above the line" services require submission of the Service Insertion Packet (SIP). The SIP documents the service from a business, service level, and technical context. The SIP requires exchange partners to agree to the inputs and outputs at a business and technical level and the level of service to be provided.

The exchange should be documented in the form of an ICD that identifies the data exchanged and documents the protocols used to exchange the data. The data exchanged should be documented in the form of a NIEM IEPD if the exchange is inter-Component or inter-Agency. If an IEPD is developed in lieu of an ICD, supplemental documentation is required to explain the protocols used to implement the exchange. Upon approval, the SIP is registered in the HLS EA as a Department-wide service. This human-readable registry is intended to foster reuse and consolidation of IT services throughout the Department.
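Because an SLA or MOU should restate a subset of the requirements already captured, a project team could, purely as an illustration, tag service-level requirements in its requirements set and extract them when drafting the agreement. The field names below are hypothetical and the sketch is not a DHS SLA or MOU template.

```python
# Hypothetical requirements list; the "sla" flag marks entries to restate in the agreement.
requirements = [
    {"id": "FRD-040", "text": "The service shall be available 99.5% of each month.", "sla": True},
    {"id": "FRD-041", "text": "The help desk shall acknowledge incidents within 4 hours.", "sla": True},
    {"id": "FRD-042", "text": "The system shall export reports in PDF format.", "sla": False},
]

def draft_sla_terms(reqs: list[dict]) -> list[str]:
    """Restate the service-level subset of the requirements as draft SLA terms."""
    return [f'{r["id"]}: {r["text"]}' for r in reqs if r["sla"]]

for term in draft_sla_terms(requirements):
    print(term)
```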

5.1.8 Develop Disaster Recovery Plan (IT Only)
The project team must develop a Disaster Recovery Plan. This plan describes the organized process for implementing emergency responses, back-up operations, and post-disaster recovery to ensure the availability of critical IT resources and to facilitate continuous operations in a disaster. A Disaster Recovery Plan takes effect when normal procedures fail to resolve a problem.


5.1.9 Map Project Data to HLS EA Data Architecture (IT Only)
To identify and evaluate data requirements for the project, the project team needs to map the data required to support the business requirements to the HLS EA Data Architecture. This activity begins in Requirements Definition and is updated as data needs are clarified at points throughout the life cycle, to be finalized in the O&M stage. The purpose of this activity is to determine the information that is required by the project to accomplish the mission, regardless of functional, procedural, or organizational boundaries.6

The Map to the Data Architecture document is used for documenting the mapping of the project data to the HLS EA Data Architecture. EDMO will use this mapping to identify potential involvement and support needs as well as impacts to the HLS EA Data Architecture. The Enterprise Architecture Board (EAB) uses the information in the Data Architecture to ensure data consistency and availability and to promote data sharing across the enterprise. For more information on data architecture alignment, refer to the EAB Process Guide.
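A project team might keep this mapping in a simple machine-readable form while it is refined across the life cycle. The entity names below are hypothetical and the sketch is only an illustration; the actual Map to the Data Architecture document and its content are governed by the EAB Process Guide.

```python
# Hypothetical mapping of project data entities to HLS EA Data Architecture entities.
data_architecture_map = {
    "TravelerRecord": "Person",           # project entity -> EA conceptual entity (illustrative)
    "InspectionEvent": "Screening Event",
    "PortOfEntry": "Facility",
    "CaseFile": None,                     # not yet mapped; flagged for follow-up with EDMO
}

unmapped = [entity for entity, target in data_architecture_map.items() if target is None]
print(f"{len(data_architecture_map) - len(unmapped)} entities mapped, {len(unmapped)} unmapped")
```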

5.1.10 Map Documents to the HLS EA Technical Reference Model (IT Only)
The Technical Reference Model (TRM), a component of the HLS EA, provides guidance to ensure that proposed IT solutions are consistent and to promote interoperability across the DHS enterprise. The implementation of the TRM is accomplished through the use of the Standards Profile (SP). The DHS Standards Profile reflects the industry standards and actual documents that have been adopted for use throughout DHS. DHS IT project teams are responsible for mapping proposed solutions to the DHS TRM (and Target TRM) and Standards Profile in order to ensure compliance with existing standards and with the future architecture. Where no standard is in place, projects should identify the standard to be used. Projects may propose refinements to the TRM via the DHS Technology Insertion Process. For more information on DHS Technology Standards and Documents, refer to the EAB Process Guide.

5.2 Documents
The project team records the evidence of its work in the set of documents listed in Table 5-1. The tailored plan approved for use in the project determines the documents applicable for the project. The project WBS must include the corresponding line items for completion of each artifact. All documents should be placed under CM control. For a full listing of the documents, including other items that get updated during the Requirements Definition stage, see Attachment B-5 of this document.

6 EAB Governance Process Guide Version 3.0, March 31, 2006, Department of Homeland Security Enterprise Architecture Board, Appendix G, p. 121.


Table 5-1: Requirements Definition Documents

ARP Phases: NEED, ANALYZE/SELECT, OBTAIN, PRODUCE/DEPLOY/SUPPORT. SELC Stages (table columns, left to right): Solution Engineering, Planning, Requirements Definition, Design, Development, Integration & Test, Implementation, Operations & Maintenance, Disposition. C = Create; U = Update; F = Finalize.

Product | Governing Authority | C/U/F markers across SELC stages
Functional Requirements Document (FRD) | DHS SELC | C U U U U F
Requirements Traceability Matrix (RTM) | DHS SELC | C U U U U F
Test and Evaluation Master Plan (TEMP) | DHS SELC | C U F
Security Requirements Traceability Matrix (SRTM) | DHS CISO | C U F
Plan of Action & Milestones (POA&M) | DHS CISO | C U U F
System Security Plan (SSP) | DHS CISO | C U U U F
Contingency Plan | DHS CISO | U F
Disaster Recovery Plan | DHS SELC | C U F
Security Risk Assessment (SRA) | DHS CISO | C U F
Security Test & Evaluation (ST&E) Plan | DHS CISO | C F
Map to the Data Architecture | DHS EA Process | C U U F
Map to the Business Architecture | DHS EA Process | C U U F
Map to Technology Standards & Products | DHS EA Process | C U U F
System Definition Review Approval Letter | DHS SELC | C/F

5.3 System Definition Review and Exit Criteria
The objective of the SDR is to evaluate the readiness of a project to proceed to Stage 3, Design. The SDR is conducted by the project sponsor and the Project Manager. As the basis for evaluation, the SDR uses a set of exit criteria to evaluate completion of activities and products for this stage of the SELC. The results of the SDR are recorded in the System Definition Review Approval Letter, which must be signed by the approval authority along with the project sponsor, who attests that the requirements satisfy the business need. It is expected that the approval authority will rely on the appropriate experts (e.g., EA, testing, security, budget) to evaluate the readiness of each project to proceed. The SDR Approval Letter is countersigned by the PM.

A scanned electronic copy of the signed stage review approval letter, along with the updated project management plan (and associated project schedule), must be emailed to the DHS Periodic Reporting team at [email protected]. If the project is not approved due to issues and/or risks, a remediation plan must be developed and approved by the approval authority. The approval authority will establish the timeframe for the next System Definition Review approval opportunity.

5.3.1 Exit Criteria
The objective of the SDR is to evaluate the products produced and to ensure that all exit criteria have been satisfied for a given project. Table 5-2 lists the exit criteria for the Requirements Definition stage.


Table 5-2: Requirements Definition Stage Exit Criteria

Domain | Exit Criteria

Project Management
• Have the risks been reviewed and are they deemed acceptable to move to the next stage?
• Is the project on schedule or have remediation plans been developed to correct for schedule loss?
• Have all Action Items from the PPR been resolved?
• Have all products defined in the approved Project Tailoring Plan been completed and reviewed for completeness?

Requirements
• Are the business performance metrics defined?
• Can the requirements for this project support any other organizations?
• Have the requirements collected to date been specified in clear, meaningful, and testable format using "shall" statements?
• Have all the business requirements been reviewed by the acceptance test team to ensure that the requirements are clear, meaningful, and testable?
• Has requirements interdependency been considered and/or analyzed?
• Have scalability, availability, and reliability been addressed?
• Have interface requirements for both external and internal system interfaces been identified and defined?
• Do the user interface requirements clearly define all the human interface needs?
• Do the training requirements accurately account for the users and administrators of the system?
• Have the Section 508 EIT Accessibility requirements been documented?
• Do the reporting requirements ensure that the business users get the information they need?
• Have "as-is" and "to-be" business processes been defined and reflected in the ORD?

Information Security
• Has an ISSO been assigned?
• Has the boundary and inventory information been validated by the CISO?
• Has the e-authentication analysis been completed?
• Has the preliminary risk assessment been completed?
• Has the C&A package been generated in the CISO-approved C&A tool?
• Has the SRTM been developed?
• Has the SSP been drafted?
• If appropriate, has an ISA been developed?
• Has the ST&E plan been developed?
• Has the Contingency Plan been completed?

Performance
• Do the existing SLAs of any service providers to be used satisfy the project's needs?
• Has the initial Capacity and Performance analysis included business capability and performance requirements?
• Has the preliminary workload characterization for the project been documented based on the business volumetrics?
• Have the system's life-cycle costs been updated based on the updated capacity and performance analysis?
• Have the performance requirements been specified completely in clear, meaningful, and testable "shall" statements?

Data Management
• Have the data supporting the business processes been specified to a conceptual level?
• Have data management requirements been defined?
• Do data retention requirements meet the business need?
• Does the data conversion plan (documented in the data management plan) account for possible cleansing and data quality issues as well as performance impacts to the existing Data Architecture?
• Are modifications to the Data Architecture necessary to accommodate the proposed system?
• Has alignment to the HLS Data Architecture been provided?
• Have data architecture alternatives been categorized, prioritized, and cost-justified?

Configuration Management
• Are all documents, especially requirements, under CM control?
• Have Configurable Items (CIs) for the project been identified and submitted to the CM team?

Testing
• Does the Test and Evaluation Master Plan (TEMP) identify those responsible for developing the test procedures, running the test, and identifying which reports will be provided?
• Does the schedule include enough time to conduct integration testing, performance testing, Section 508 Accessibility testing, and acceptance testing while the developers are re-working the code?
• Is the DTP adequate and linked with the TEMP?
• Does the DTP describe the independent role of the acceptance test team?
• Has the test lead for acceptance testing been appointed?
• Does the DTP specify defect severity level definitions?
• Are all functional requirements stated such that they are testable?

Enterprise Architecture
• Are the technologies identified in the solution consistent with the target TRM?
• Does the technical approach embrace re-usability?
• Will the outcome result in new Service Components that can be registered in DHS Service Registries?
• Is the data required by this project already available or will it be made available to others?
• Are modifications to the EA necessary to accommodate the proposed system and have they been through the Technology Insertion (TI) Decision Request Process?
• Are the technologies identified in the solution consistent with the technology patterns and IT components?
• Does the solution conform to DHS data standards?
• Are the IT components identified in the Application Architecture?
• Does the project provide an enterprise-wide solution?
• Have elements been identified for re-use, enhancement, or creation of new services for the SOA?
• Is the project aligned with a DHS IT Portfolio?

Software Engineering
• Have business processes been specified to a logical level?
• Have processes been documented (e.g., use cases, flow diagrams)?
• Do the priorities listed for each requirement accurately represent the business' capability needs?
• Are sources documented for each requirement?
• Have business requirements been validated by end users?
• Do the SLAs address the needs of all parties by defining the expectations of each?
• Does the design include processes and capabilities to monitor and review all aspects of the SLAs?

Infrastructure
• Has an illustration depicting the conceptual network been developed and documented?
• Has the technical infrastructure been specified to a conceptual level?
• Have locations and types of infrastructure components been identified and documented?
• Have all infrastructure requirements (e.g., heating, ventilating, and air conditioning, power, fail-over, communications, redundancy, facility space) been defined?
• Have critical infrastructure designations as defined in HSPD-7 been analyzed and completed?

Operations and Maintenance
• Have system management and support processes (e.g., call center, help desk) for the system been identified?
• Has an initial Disaster Recovery Plan been developed?

Transition
• Has a Business Impact Assessment (BIA) been initiated as part of Organizational Change Management activities?

6. Stage 3: Design

The objective of the Design stage is to transform the baselined requirements into comprehensive, logical, and detailed designs to efficiently and effectively guide development. The decisions made in this stage address, in detail, how the system will meet the variety of defined requirements. Activities performed in this stage also ensure that the following objectives are met:

• The detailed design specifications align with the HLS EA (including SOA), Section 508 Accessibility standards, privacy requirements, and security requirements

• Mechanisms to handle system disruptions are considered and planned
• All required technology insertions into the DHS EA TRM are completed and approved

Design activities may be conducted in an iterative fashion, producing first a logical or preliminary design of the solution that emphasizes the functional features of the system, then a more detailed design that expands the logical design by providing all the technical detail required to implement the logical design and that allows system development to begin. Projects are encouraged to use an automated design tool. If appropriate, prototyping should be used to help refine requirements, provide stakeholders with an initial view of the system, and gain their approval to proceed with development. More information on prototyping can be found in Attachment B2, section B2.6.2.

The Design stage includes a Preliminary Design Review (PDR) and a Critical Design Review (CDR). The PDR is conducted during the Design stage while the CDR is conducted at the end of the Design stage. For information on Developing the Deployment Plan activities, see Appendix J: Supportability and Sustainment Guide of the Instruction/Guidebook 102-01-001.


6.1 Design Activities
The following sections present typical activities that are performed as part of the Design stage. The set of activities actually performed for a given project in the Design stage may differ, depending upon the SELC tailoring approved for the project.

6.1.1 Define System Requirements and Update the Requirements Traceability Matrix
System requirements are defined at the lowest level of granularity sufficient for system and sub-system generation, and are documented in the SRD. System requirements are driven by the functional requirements and ensure that all design elements can be built and tested. The RTM is updated to ensure that all requirements are directly traceable to their source and are accounted for in the design. This process of updating requirements may include scope changes and be governed by a change control board or other control. Table 6-1 lists the SRD elements that should be addressed unless they do not apply due to the characteristics of the project.

Table 6-1: SRD Elements

Typical Elements Addressed in the SRD
• Data Requirements (IT only)
• Process Requirements
• User and System Interface Requirements
• Hardware and Software Interfaces
• Communication Interfaces (IT only)
• Data Currency and Retention (IT only)
• Audit Trail Requirements (IT only)
• System Availability, Reliability, and Recoverability
• System Performance
• Fault Tolerance
• System Portability
• Operational Scenarios
• Security Requirements
• Privacy Requirements (IT only)
• Section 508 Accessibility Requirements
• System Process Models
• Data Flow Diagrams
• Context Diagrams
• Interoperability
• Capacity

6.1.2 Develop Logical Design
The logical design (often called "preliminary" or "high-level" design, as opposed to "system design" or "detailed design") identifies, analyzes, defines, and relates the functional features of the solution to the architectural components of the system and is the foundation for development of the detailed design of the system. The logical design addresses the data, applications, technical infrastructure, business processes, organizational, and location aspects of the solution.7 Elements commonly found in a logical design document are presented in Table 6-2.

7 Enterprise Life Cycle Guide v2.0, Internal Revenue Service, May 1, 2006, p.47.


Table 6-2: Logical Design Elements

Common Elements in the Logical Design Document8
• Alignment with higher-level designs and HLS EA
• Functional application subsystems and flows (IT only)
• User-system interface and/or prototype
• Interface (internal/external) designs
• Requirements allocated to logical components
• Updated RTM
• Verification of technical infrastructure (development and test environments)
• Identification of required common services
• Software architecture strategy (IT only)
• Security
• Privacy (IT only)
• Performance
• Application builds (IT only)

6.1.3 Develop Data Architecture (IT Only)
The Data Architecture describes the data and data relationships required for the proposed solution, as well as the approach for data storage, access, continuity, and implementation. The Data Architecture document includes a logical data model, a process model to describe data management, a mapping of data entities to business function, requirements, data flows, and data interoperability requirements. Data Architecture development also aligns the data quality assurance plan and data security plan with the data management goals and business requirements. The Data Architecture document is complemented by the Data Management Plan (DMP). For additional details on Data Architecture documentation, see the Data Architecture template.

A key step in the development of the data architecture is maintaining the traceability mappings of the Logical Data Model (LDM) to the EDMO Enterprise LDM and the HLS EA Conceptual Data Model. Additionally, the Data Architecture must maintain data alignment documents and develop metadata for data assets as critical elements of the Data Reference Model (DRM) artifact library. The DRM artifacts are submitted to the EA Center of Excellence (COE) and the EDMO as the Data Insertion package to be registered in the DHS Data Asset Catalog.

6.1.4 Conduct Preliminary Design Review
The Preliminary Design Review (PDR) is a multi-disciplinary technical review intended to ensure that the system under review can proceed into detailed design and that it can meet the stated performance requirements within cost (program budget), schedule (program schedule), risk, and other system constraints. Generally, this review assesses the system preliminary design, as captured in performance specifications and logical designs, and ensures that each function in the functional baseline has been allocated to one or more system configuration items. It determines whether the hardware, human, and software preliminary designs are complete, and whether the team is prepared to start detailed design and test procedure development. Often the program performs Early Operational Assessment at this time (e.g., using modeling and simulation).

8 Ibid, p. 45-47.


The PDR evaluates the set of requirements to determine whether they correctly and completely satisfy all system requirements allocated to the subsystem. The PDR also determines whether subsystem requirements track with the system design. Any concerns as to whether the logical design is aligned and supportable by the technical architecture should be raised and resolved before the detailed design begins. The PM should tailor the review to the technical scope and risk of the system. Typical PDR success criteria include affirmative answers to the following exit questions:

• Have all critical design issues been resolved?
• Can the preliminary design, as disclosed, satisfy the requirements?
• Is the preliminary design producible within the production budget?
• Is the logical design aligned and supportable by the technical architecture?
• Have human integration design factors been reviewed and included, where needed, in the overall system design?
• Are the risks known and manageable for development testing and operational testing?
• Has security been addressed throughout the design?
• Have data stores and interchanges been addressed?
• Is the program schedule executable (technical/cost risks)?
• Is the program properly staffed?
• Is the program executable with the existing budget?
• Does the updated cost estimate fit within the existing budget?

The PM should conduct the PDR when all major design issues have been resolved and work can begin on detailed design. The PDR should address and resolve critical, system-wide issues.

6.1.5 Develop System Design
The detailed design expands on the logical design to define all elements of the solution to the lowest level of granularity required to facilitate development and satisfy requirements. During system design, the allocation of requirements to configuration items and then to components of the configuration items is documented in the RTM by adding columns containing references to configuration items and components. In this way, a clear relationship between system components and requirements is established for use during subsequent Validation and Verification (V&V) activities.

If current information needs to be converted/migrated/transitioned to the new system, strategies shall be designed for those purposes, especially if converting means re-engineering existing processes. Strategies must also be designed to initiate new and/or update existing agreements, plans, privacy compliance documentation, and assessments (e.g., the Data Management Plan, System Security Plan) to reflect the impact of these conversion and implementation plans. Elements commonly found in a detailed design document are presented in Table 6-3.


Table 6-3: Detailed Design Elements

Common Elements in the Detailed Design Document9
• Demonstrated alignment with higher-level designs, other design components, and HLS EA
• Application, transactions, subsystems, interfaces (IT Only)
• Data designs (IT Only)
• Infrastructure designs
• External interface designs
• Proof of concept/prototype
• Security design
• Section 508 Accessibility approach
• Performance analysis
• Coding standards (IT Only)
• Internal interfaces and relationships designed
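Continuing the illustrative RTM sketch from Section 5.1.2, the allocation described in Section 6.1.5 can be recorded by adding configuration item and component references to each row, so that later V&V activities can trace from a requirement to the exact components that implement it. The structure and identifiers remain hypothetical, not a DHS-prescribed format.

```python
# Illustrative RTM row after detailed design: requirement -> configuration items -> components.
rtm = {
    "FRD-023": {
        "source": "ORD 4.2.1",
        "srd_refs": ["SRD-110"],
        "config_items": ["CI-AUDIT"],    # configuration item(s) implementing the requirement
        "components": ["audit_logger"],  # components within those configuration items
        "test_cases": ["TC-107", "TC-108"],
    },
}

def unallocated(rtm: dict) -> list[str]:
    """Requirements not yet allocated to any configuration item or component."""
    return [rid for rid, row in rtm.items()
            if not row["config_items"] or not row["components"]]

print(unallocated(rtm))  # ideally empty before the Critical Design Review
```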

6.1.6 Validation and Verification (IT Only)
The sooner a mistake in the development of a system is discovered, the less expensive it is to fix. Therefore, once the design is complete, each project must conduct a requirements review, V&V, to verify that the system design adequately addresses the requirements. The partitioning of requirements, allocation of requirements, identification of derived requirements, and definition of interfaces are all as important as the development of the initial requirement. See Attachment B, section 2B.13 for more information regarding V&V.

6.1.7 Develop Technology Insertion Decision Request (IT Only)
If there are items in the design that are not compliant with the HLS TRM, the project must initiate an EA Technology Insertion (TI) Decision Request process. The TI must be approved by the EAB before any purchases are made and before the project can move beyond the Design stage. For more information regarding the EA TI Decision Request process, refer to the DHS EAB Governance Process Guide.

6.1.8 Develop Interconnection Security Agreements (IT Only)
Other systems (if any) that interface with the system under development must be identified, including the exchange of data or functionality that will occur. All connecting areas need to be documented for security and information flow purposes. System interconnection requirements must be documented in the SSP. OMB Circular A-130 requires that written management authorization (an MOU or MOA) be obtained prior to connecting with other systems and/or sharing sensitive data or information. The written authorization details the rules of behavior and controls that must be maintained by the interconnecting systems. A description of the rules for interconnecting the systems and for protecting shared data must be included with the security plan.

The system interface requirements are documented in one or more Interconnection Security Agreements (ISA). ISAs provide the formal documentation used to record and control interface agreements among participating systems. Formal agreements should be negotiated and initiated between/among DHS Components, other government agencies, and non-government stakeholders as appropriate. For more information, refer to DHS 4300A, Attachment N, Preparation of Interconnection Security Agreements, which identifies the DHS requirements for ISAs.

9 Enterprise Life Cycle Guide v2.0, Internal Revenue Service, May 1, 2006, p. 48.

Table 6-4: Deployment Plan Elements

Elements in the Deployment Plan
• Process flow of deployment activities
• Description of resources/support materials required
• Description of site preparation/facilities work required
• Roles and responsibilities
• Deployment schedules
• Issue resolution plan
• Support requirements
• Support personnel requirements
• Support documentation requirements
• Acceptance test requirements at development site
• Acceptance test requirements at deployment site

6.1.9 Initiate Privacy Impact Assessment (IT Only)
The completed and validated Privacy Threshold Analysis determines the applicability of the Privacy Impact Assessment (PIA) requirement. Generally, a PIA is required when technology uses (i.e., collects, maintains, disseminates) personally identifiable information. The PIA serves as the primary mechanism for conducting and documenting the privacy analysis. It is a longer form than the Privacy Threshold Analysis and the System of Records Notice, and is organized into nine sections:
1. Information Collected and Maintained
2. Uses of the System and the Information
3. Data Retention
4. Internal Sharing and Disclosure
5. External Sharing and Disclosure
6. Notice
7. Individual Access, Redress, and Correction
8. Technical Access and Security
9. Technology

Each section includes a list of specific questions, all of which are focused on identifying the information in the system and the different ways that information is used and shared. The PIA must be completed by the PM and approved by the DHS Chief Privacy Officer prior to system implementation. The PIA need not be completed at the end of the SELC Design stage, but it must be completed by the end of the SELC Integration and Test stage.

PIAs are published to the public via the DHS Privacy Office website (www.dhs.gov/privacy – follow the links to the PIA section). The only PIAs that are not published publicly are those for national security systems; PIAs must still be completed for all systems in this category. The DHS Privacy Office distributes the PIA template along with specific guidance for completing the PIA as part of the PTA finalization process. In addition, the PIA template and guidance are available from the DHS Privacy Office website (both internal and external) and directly from the office via email: [email protected].

6.1.10 Initiate System of Records Notice (IT Only)
The DHS Privacy Office will coordinate a determination as to whether a System of Records Notice (SORN) is required for the collection(s) of information related to the system. Part of this determination includes a review of existing SORNs to identify any SORNs that may already cover the data and/or any existing SORNs that could be or must be updated to accommodate the intended uses of the system's data. The SORN, which is drafted in coordination with the Office of General Counsel (OGC), is generally organized into seven sections covering the following issue areas:
1. Categories of individuals covered by the system
2. Categories of records (information) covered by the system
3. Authority for maintaining the system
4. Purpose(s) of the system
5. Routine uses (external sharing)
6. Policies for accessing, storing, and disposing of information
7. Procedures for notice and contesting information in the system

SORNs are published to the public in the Federal Register and are also published on the DHS Privacy Office website. The project cannot be implemented until the SORN is finalized, approved, and published for a period of at least 30 days. The nature of the system and the use of data may sometimes require additional steps related to the SORN. These additional steps could include a longer period for publication prior to system implementation. Hence, additional time should be built into the pre-implementation stages to accommodate the privacy compliance documentation requirements. While the SORN does not need to be completed at the end of the Design stage, it must be completed by the end of the Integration and Test stage.

6.2 Documents
The project team records the results of its efforts in Stage 3, Design, in the set of artifacts presented in Table 6-5. Artifacts must be traceable to the project WBS. Table 6-5 only shows the artifacts that are first created in the Design stage. All artifacts should be placed under CM control. For a full listing of the artifacts, including others that get updated during the Design stage, see Attachment B-5 of this document.


Table 6-5: Design Stage Documents

ARP Phases: NEED, ANALYZE/SELECT, OBTAIN, PRODUCE/DEPLOY/SUPPORT. SELC Stages (table columns, left to right): Solution Engineering, Planning, Requirements Definition, Design, Development, Integration & Test, Implementation, Operations & Maintenance, Disposition. C = Create; U = Update; F = Finalize.

Product | Governing Authority | C/U/F markers across SELC stages
Service Level Agreements | DHS SELC | C U F F
System Requirements Document | DHS SELC | C U U U F F
Interconnection Security Agreement (ISA) | DHS CISO | C/F
Logical Design Document | DHS SELC | C/F
Data Architecture Document | DHS SELC | C/F
System Design Document | DHS SELC | C U U F
Technology Insertion Decision Request | DHS EA Process | C/F
Site Prep Plan | DHS SELC | C/F
Critical Design Review Approval Letter | DHS SELC | C/F

6.3 Stage Reviews and Exit Criteria
The fundamental purpose of the Design stage is to fully design all aspects of the system so that the development organization can build the system and the test organization can begin developing test procedures to verify the attainment of technical performance specifications, operational effectiveness, and suitability. The PDR should have been conducted during the Design stage to ensure that the logical design and architecture satisfy the requirements. For more information on the PDR, see Section 6.1.4 of this document.

At the end of the Design stage, the Project Manager and Project Sponsor conduct a CDR, evaluating the exit criteria to ensure that all exit criteria have been met. A CDR is deemed successful when the project team demonstrates that the design is complete and accurate in its specification and can produce the results defined in the baseline requirements. The approval authority evaluates the design and signs the CDR Approval Letter, certifying that the design is sufficiently detailed for the development team to build without further change to the design. Should changes to the design be required, another CDR with appropriate sign-offs shall be conducted. It is expected that the approval authority will rely on the appropriate experts (e.g., EA, testing, security, budget) to evaluate the readiness of each project to proceed. (Some key experts are identified in the following list of stage review participants.)

The PDR and CDR Approval Letters are countersigned by the PM. A scanned electronic copy of the signed stage review approval letter, along with the updated project management plan (and associated project schedule), must be emailed to the DHS Periodic Reporting team at [email protected]. For IT systems, the Critical Design Review cannot be completed until all technology insertions required for the system have been completed and approved by the EA COE.


6.3.1 CDR Participants
Table 6-6 lists the participants for the CDR.

Table 6-6: Participants for Critical Design Review

Participant | Responsibility

Component CIO
• Validates that the project has sufficiently detailed design elements to allow the development stage to begin
• Validates that all exit criteria have been met
• Approves/Disapproves project to proceed to Development Stage and signs CDR Approval Letter

DHS IT Portfolio Manager(s)
• Validates that IT Portfolio objectives are accounted for in the system design

DHS PADRT
• Validates that the design is sufficiently defined to allow development to begin

Component EAB
• Validates that the design aligns with the Component EA and that all products/tools comply with the Component TRM

Project Sponsor
• Validates that the project is aligned with the project's objectives
• Validates that the project is within cost, schedule, and performance constraints
• Validates that all risks are defined and manageable

Project Manager
• Presents project status across all aspects of the project (technical, cost, schedule, risk)

Business Stakeholder Representatives (number differs with each project)
• Validates that the design accurately and completely reflects the business needs

ISSO
• Validates that the design will provide sufficient security safeguards for the system

System Development Manager (one responsible for developing/building the system)
• Validates that the design is sufficiently defined to allow development to begin with no re-work of system design
• Signature required on CDR Approval Letter indicating design is sufficient and is approved to build

OAST or Component Section 508 Coordinator
• Approves certain exceptions for Section 508
• Validates that Section 508 requirements are fully accounted for in the system design

6.3.2 Exit Criteria
Table 6-7 lists the exit criteria for the Design stage.

Table 6-7. Design Stage Exit Criteria

Domain | Exit Criteria

Program Management
• Have the risks been reviewed and are they deemed acceptable to move to the next stage?
• Is the project on schedule or have remediation plans been developed to correct for schedule loss?
• Have all Action Items from the SDR been resolved?
• Have all artifacts defined in the approved IT Project Tailoring Plan/Project Tailoring Plan been completed and reviewed for completeness?
• Has the development organization been identified and is it ready (possibly under contract) to begin development?
• Has the development manager approved the design as sufficient to develop the solution?
• Have all documents from previous stages been updated as necessary to reflect new information and decisions from the current stage?
• Has the Cost Benefit Analysis been updated to reflect the final system design?
• Has the Concept of Operations for the proposed system been updated in accordance with the final system design?

Requirements
• Have the functional requirements been logically decomposed to an acceptable level of detail in the system requirements?
• Do the business requirements represent the needs of the system to perform successfully while in production?
• Have Section 508 Accessibility requirements been addressed in the design?
• Are the system requirements allocated to software components?
• Are the infrastructure requirements defined?
• Have all the system requirements been reviewed by the acceptance test team to ensure the requirements are clear, meaningful, and testable?
• Do users agree that the user interface design meets the business need?

Information Security
• Has security been designed in as an integral component of the preliminary system design?
• Has a contractor been identified to conduct ST&E?
• Does the security test and evaluation plan provide for the testing of all security controls?
• Does the security design satisfy the specified security categorization (FIPS 199)?
• Does the system design provide the security reports needed to audit and monitor the system in production?
• Does the design include sufficient auditing to re-create user/administrator activities?

Privacy
• Does the design appropriately protect and limit the use of personal data?

Performance
• Do the system performance requirements meet the business need?
• Has the model/simulation of the preliminary system design been refined to the level of detail sufficient to assign performance budgets to subsystems?
• Have performance budgets been assigned to subsystems for the amount of time allowed to complete a task and the resources available for that task?
• Has the model/simulation of the final system design been refined to the level of detail sufficient to predict system component performance in the development, test, and production environments?

Data Management
• Have all major functions performed by the application been defined to a level sufficient to account for transformation of all data elements processed by the function?
• Have data capacities been analyzed and incorporated into the design?
• Has test data been identified for unit testing, integration testing, and acceptance testing?
• Has the design demonstrated that the data architecture will provide the capacity to meet functional and performance requirements?
• Are the requisite Information Sharing Agreements in place?
• Has the Data Management Plan been updated?
• Has the DRM been updated and the LDM aligned with Enterprise models?

Configuration Management
• Is the system design under configuration management control?

Testing
• Has a contractor been identified for Independent Verification and Validation?
• Are all system requirements stated such that they are testable?
• Has the source for development and test data been identified?
• Have technical plans been made to conduct testing with legacy systems (either using their test environment or production environment)?
• Have specific Section 508 compliance testing requirements been identified?

Enterprise Architecture
• Is the final system design aligned with the Homeland Security EA?

Software Engineering
• Do the system requirements represent a final decomposition of the functional requirements?
• Has the logical design of all business processes been specified?
• Have all software development tools (e.g., IDE, CM) been installed and configured?
• Have requirements been updated based on the user review of the proof-of-concept, pilot, and/or prototype?
• Do software vendors acknowledge that all proposed software products have been previously integrated?
• Have human integration design factors been reviewed and included, where needed, in the overall design?
• Have Section 508 technical standards been selected so that functional performance criteria are fulfilled?
• Has the software system design been specified in sufficient detail that a different contractor could perform all coding without any additional information from the design team?
• Is each design element traceable to its source requirement?
• Has the development manager reviewed the system design and concurred that the design is sufficiently detailed to develop code?
• Are all system and functional requirements accounted for in the design?
• Will the design meet the specified performance requirements?
• Are all interfaces between software components and infrastructure defined?
• Does the bill of materials for acquisition of equipment and software represent the entire list for equipment and software?
• Is the structure of each to-be-built component and its interfaces defined?
• Are the installation and configuration parameters of all COTS products specified?

Infrastructure
• Has the logical architecture specified all infrastructure components (new and existing) by type and capability?
• Have impacts to existing equipment (e.g., routers, servers, mainframes, network circuits) been analyzed?
• Has the physical design specified all infrastructure components by vendor, model, and version?
• Is the infrastructure design specified in sufficient detail that a new contractor could build out the design without any additional information from the design team?
• Is the development environment ready for use?

Operations and Maintenance
• Has the initial Disaster Recovery Plan been updated?

Transition
• Have communications been developed notifying users of transition to the new system?
• Have changes to organizational structure (e.g., help desk, O&M staff) been developed?
• Have plans been developed for retiring system(s) that this system replaces (i.e., removing h/w, s/w, closing out unneeded interfaces, archiving/subsuming documentation)?


7. Stage 4: Development

The objective of the Development stage is to build and test the components designated by the products of the Design stage. Although much of the activity in the Development stage centers around the creation of the elements that make up the system, activities also take place to develop user documentation (operator, maintenance, and user manuals), establish the “integration and test environment” (where testing and verification takes place), etc. The Development stage includes activities for building, configuring, and preparing for testing system components against their allocated requirements. For a complex system design with multiple system components, there may be several simultaneous instances of the Development stage in play during the project life cycle. Development stage test activities are typically performed in the development environment. When all Development stage activities have been completed, all system components that constitute the system will be ready for the Integration and Test Stage.

7.1 Development Activities
The following sections represent the typical activities performed as part of the Development stage. The set of activities actually performed for a given project in the Development stage may differ depending upon the SELC tailoring approved for the project.

7.1.1 Build, Construct, and Configure the System
This activity includes the steps required to transition from the Detailed Design, which consists primarily of a detailed written description of the system, to an implementation of the design. This activity may be accomplished through internal (i.e., DHS staff) development, through a procurement/acquisition (e.g., using contractors, purchasing COTS/Government Off-the-Shelf [GOTS] software and/or equipment), or a combination of both. The approach to be used will be described in the Acquisition Plan and the PMP. Regardless of the approach (internal, contracted, or combined), the system must be developed and/or acquired in accordance with the Design Specification. Changes to the Detailed Design and/or documented requirements must continue to be managed in accordance with the approved CM Plan. The result of this activity is an initial system implementation that can be tested and eventually deployed.

7.1.2 Conduct Unit Testing
As modules or subroutines are developed or updated, the development team must test to verify that each module or subroutine works appropriately. This is the lowest level of developmental testing (DT) that can be done on a code module or unit.
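As a simple illustration of the kind of developmental test described above, the sketch below uses Python's built-in unittest module; the function compute_fee is a hypothetical stand-in for a unit under development, not part of the SELC.

    import unittest

    def compute_fee(amount, rate):
        # Hypothetical unit under test: applies a percentage rate to an amount.
        if amount < 0 or rate < 0:
            raise ValueError("amount and rate must be non-negative")
        return round(amount * rate, 2)

    class ComputeFeeUnitTest(unittest.TestCase):
        def test_nominal_case(self):
            # Verify the unit produces the expected result for a typical input.
            self.assertEqual(compute_fee(100.00, 0.05), 5.00)

        def test_invalid_input_rejected(self):
            # Verify the unit rejects inputs outside its specification.
            with self.assertRaises(ValueError):
                compute_fee(-1, 0.05)

    if __name__ == "__main__":
        unittest.main()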

7.1.3 Develop Test Case Specifications
There are essentially two sets of test cases that need to be developed, one for the independent test team (documented in the Test Case Specification) and one for the user acceptance team (documented in the System Acceptance Test Procedures). To provide feedback to the PM, test cases need to be developed that test the requirements and capabilities of the system. A test case is a document that describes an input, action, or event and an expected response; it is used to determine if a feature of a system is working correctly. A test case should contain particulars such as test case identifier, test case name, objective, test conditions/setup, input data requirements, steps, and required results. An important potential byproduct of test case development is the identification of problems in any aspect of the requirements or design of a system, including any inconsistencies with mandated requirements, since the team must consider all possible scenarios to develop a complete set of test cases. To complete this activity, the test teams must develop both the Test Case Specification and the System Acceptance Test Procedures.
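One way to see how the required particulars fit together is to model a single test case record, as in the sketch below; the field values and the test scenario are hypothetical and illustrative only.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TestCase:
        # Fields mirror the particulars a test case should contain (see above).
        identifier: str
        name: str
        objective: str
        conditions_setup: str
        input_data_requirements: str
        steps: List[str] = field(default_factory=list)
        required_results: str = ""

    # Hypothetical example entry for a Test Case Specification.
    tc_001 = TestCase(
        identifier="TC-001",
        name="Create new case record",
        objective="Verify that an operator can create and save a new case record",
        conditions_setup="Test database loaded with baseline data; operator account active",
        input_data_requirements="Valid case data set CD-17 from the identified test data source",
        steps=[
            "Log in as an operator",
            "Select 'New Case' and enter the CD-17 data",
            "Save the record and query it back",
        ],
        required_results="Record is saved, retrievable, and matches the entered data exactly",
    )

    print(tc_001.identifier, "-", tc_001.name)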

7.1.4 Develop Version Description Document (IT Only)
The Version Description Document (VDD) is developed to describe the component version to be released. It specifies equipment configurations and dependencies and inventories materials released, including software contents, software installation files, and software source files. VDDs must be updated through deployment to accurately reflect the system configuration.
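A VDD is a document, but its inventory content can be captured as structured data. The sketch below is a minimal, hypothetical representation of the items listed above (equipment configurations, dependencies, and released materials); the system name, versions, and file names are illustrative assumptions.

    # Minimal, hypothetical VDD inventory for one release; all names are illustrative.
    version_description = {
        "system": "Example Case Management System",
        "version": "1.2.0",
        "equipment_configurations": [
            {"host": "application server", "os": "Linux", "cpu": 4, "memory_gb": 16},
        ],
        "dependencies": ["Database server 11.x", "Java runtime 1.6"],
        "released_materials": {
            "software_contents": ["case-mgmt-app-1.2.0.war"],
            "installation_files": ["install-1.2.0.sh", "config-template.properties"],
            "source_files": ["case-mgmt-src-1.2.0.tar.gz"],
        },
    }

    # The inventory must stay current through deployment; a simple completeness
    # check that every category of released material is populated is shown here.
    assert all(version_description["released_materials"].values())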

7.1.5 Develop Documentation
A variety of user documentation must be developed to assist those who operate, maintain, and use the new system. It is expected that documentation will be updated with a full description of the system components, end user information, and troubleshooting information as the system progresses through the Development and Integration & Test stages. Unless exceptions are granted, manuals should conform to Section 508 requirements for accessibility. The following subsections define the documents required to be produced.

7.1.5.1 Develop Operators Manuals
Documentation must be produced that describes the components of the system and how the system is to work from the perspective of those who are to operate it. Everything that is required to operate and support the operation of the system needs to be provided in the operator’s manual, including a full description of the system components, the expectations for system operation, system functionality, and operational troubleshooting.

7.1.5.2 Develop Maintenance Manuals
The Maintenance Manual describes the components of the system and the practices required to keep the system functioning. Performance monitoring, periodic testing of facilities and system functionality, and periodic maintenance activities to ensure the system functions as expected are included in the manual. Troubleshooting guides and other help content are also included.


7.1.5.3 Develop User Manuals
User Manuals describe how the end users – those for whom the system was developed – are to interact with the system in order to receive the benefits it provides. Information describing the functionality that the system provides and how to access it is included in the user manual. User scenarios that depict common interactions are often provided, as are instructions for help desk access, guidance for general troubleshooting from the user’s perspective, and where to go for training.

7.2 Documents
The project team records the results of its efforts in Stage 4, Development, in a set of documents, presented in Table 7-1. Documents must be traceable to the project WBS. All documents should be placed under CM control. For a full listing of the documents, including other items that get updated during the Development stage, see Attachment B-5 of this document.

Table 7-1: Development Stage Documents

Product | Governing Authority | Marker(s)
Training Materials | DHS SELC | C/F
Test Case Specification | DHS SELC | C/F
System Acceptance Test Procedures | DHS SELC | C/F
Test Readiness Review Approval Letter | DHS SELC | C/F

7.3 Test Readiness Review and Exit Criteria
The Project Manager and the project sponsor conduct the Test Readiness Review (TRR) at the end of the Development stage in order to review the results of the stage and to ensure that all exit criteria have been met. Unit testing should be complete for all configuration items, test cases should all be approved, and the test environment should be ready to use. The TRR includes the participation of the approval authority, who confirms that all exit criteria have been satisfied by signing the TRR Approval Letter to indicate the project can proceed to the Integration and Test Stage. It is expected that the approval authority will rely on the appropriate experts (e.g., EA, testing, security, budget) to evaluate the readiness of each project to proceed. The TRR Approval Letter is countersigned by the Project Manager. A scanned electronic copy of the signed stage review approval letter, along with the updated project management plan (and associated project schedule), must be emailed to the DHS Periodic Reporting team at [email protected]. If it is determined that all exit criteria have not been satisfied, another TRR with appropriate sign-offs shall be conducted.


7.4 Exit Criteria
Table 7-2 lists the exit criteria for the Development stage.

Table 7-2: Development Stage Exit Criteria

Program Management
• Have the risks been reviewed and are they deemed acceptable to move to the next stage?
• Is the project on schedule or have remediation plans been developed to correct for schedule loss?
• Have all products defined in the approved Project Tailoring Plan been completed and reviewed for completeness?
• Have all Action Items from the CDR been resolved?
• Has the independent test team been identified and is it prepared to begin testing?
• Have all documents from previous stages been updated as necessary to reflect new information and decisions from the current stage?
• Has the Cost Benefit Analysis been updated to incorporate the built-out System Design?

Information Security
• Have all security controls been unit tested?

Privacy
• Have all obligations and limitations identified in the privacy compliance documentation been met?

Performance
• Has the model/simulation of the developed system been calibrated using the results of unit/development performance tests?

Data Management
• Have all changes to the Data Architecture design been recorded in the corresponding documents?

Configuration Management
• Is all development code under CM control?
• Are all COTS product configurations under CM control?

Testing
• Do the test cases (integration and acceptance) appropriately test all the requirements?
• Are all the planned test scenarios traceable to the requirements?
• Have all development test issues been resolved?
• Has the test team lead (integration and acceptance) reviewed the code summary metrics and deemed the system ready for testing?
• Have the integration and acceptance test schedules been approved by the appropriate test leads?
• Do unit test results for components, subsystems, and systems form a satisfactory basis for proceeding into integration and acceptance testing?
• Has the test plan been reviewed and does it provide an actionable plan that completely validates that the system satisfies all the requirements?
• Have test cases for testing Section 508 technical standards and functional performance requirements been developed?

Software Engineering
• Has the software been documented sufficiently so that a different contracting team could understand the coding?
• If applicable, have independent code reviews been passed successfully?

Infrastructure
• Is the test environment (integration and acceptance) ready for use?
• Have upgrades to existing equipment (e.g., routers, servers, mainframes, network circuits) been contracted for?

Operations and Maintenance
• Have all user, operator, and maintenance manuals and procedures been developed?
• Has the final Disaster Recovery Plan been developed?

Transition
• Have a Training Plan and all training manuals been developed and documented?


8. Stage 5: Integration and Test

The purpose of the Integration and Test Stage is to demonstrate, through testing, that the solution developed satisfies all defined requirements and to complete the integration of configuration items that have been readied during the Development stage. For complex systems, the Integration and Test Stage may involve a plan to integrate configuration items incrementally rather than all at once, allowing a staggered development and/or deployment schedule. Projects are encouraged to use automated testing tools, if appropriate. When complete, Stage 5 results in a fully integrated and qualified system, to the extent that it can be verified in an Integration and Test environment that simulates the characteristics of the target operating environment.

8.1 Integration and Test Activities
The recommended activities associated with Integration and Test include the verification of user documentation (operators, maintenance, and user manuals), developing and conducting Systems Testing along with the resultant reports, conducting Acceptance Testing, generating Section 508 Assistive Technology Interoperability Test Results, conducting Security Testing and Evaluation, preparing the Security Accreditation Documentation, completing a PIA, and completing a System of Records Notice, if needed. The following subsections present the details of these activities. The set of activities actually performed for a given project in the Integration and Test Stage may differ depending upon the approved Project Tailoring Plan.

8.1.1 Conduct System Testing and Generate Test Reports
System testing is conducted to evaluate how well the developed system satisfies the defined and approved requirements. Components (e.g., hardware, software, communications) are assembled, or simulated, to provide a test bed in which to validate requirements. The system is tested to ensure that the integrated functionality delivered meets defined mission needs. To ensure that the system is appropriately tested for Section 508 Accessibility, the test team must also include users with disabilities or those with the knowledge, tools, and skills necessary to emulate users with disabilities. All tests, results, and follow up must be analyzed and documented in the System Test Report. Failed components are migrated back to the Development stage for rework/retest; components that pass are migrated ahead for implementation. System testing cannot be conducted by the development organization. This does not mean that the same contractor cannot conduct the testing, but to minimize any conflict of interest, the test organization must report directly to the PM and personnel cannot be shared between the development organization and the test organization. Tests conducted must address the following:
• Functional capability
• Integration of all system components
• Performance
• Security
• Interoperability with external components, including legacy systems
• Section 508 Accessibility
• Data conversion/loading
• Installation process
• Operational procedures
• Backup/recovery
• Disaster recovery/fail-over

To enable effective defect analysis and improvements in system design and development, testing activities must include the collection, documentation, and reporting of defect data. The following required characteristics of defects must be documented, tracked, and used in test reporting; a minimal record sketch follows the list.
• Defect number
• Defect description
• Date defect detected/opened
• Date defect fixed/closed
• Defect severity
• Defect priority
• Defect status
• Name of tester who detected the defect
• Name of developer assigned to correct the defect
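The sketch below is one minimal way to carry the required defect characteristics as records and write them to a simple log that can feed test report metrics; the repository format (a CSV file) and all field values are hypothetical.

    import csv
    from datetime import date

    # Field names follow the required defect characteristics listed above.
    DEFECT_FIELDS = [
        "defect_number", "description", "date_detected", "date_fixed",
        "severity", "priority", "status", "tester", "developer",
    ]

    defects = [
        {
            "defect_number": "DEF-0042",
            "description": "Search results omit records entered on the current day",
            "date_detected": date(2008, 11, 3).isoformat(),
            "date_fixed": "",               # still open
            "severity": "2 - Major",
            "priority": "High",
            "status": "Open",
            "tester": "J. Tester",
            "developer": "A. Developer",
        },
    ]

    # Writing the records to a CSV file gives the test team a simple defect log
    # from which counts by severity and status can be summarized for reporting.
    with open("defect_log.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=DEFECT_FIELDS)
        writer.writeheader()
        writer.writerows(defects)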

8.1.2 Verify and Validate User Documentation
The content of the user and Operations and Maintenance manuals must be verified against the system to be delivered. In addition, this user documentation must be validated by a representative group of end users to ensure that the level of detail will meet the needs of the end user community.

8.1.3 Conduct Acceptance Testing
A representative group of end users is identified to test the system in order to ensure that the functionality delivered meets defined mission needs. The test should be conducted by the government; test scenarios should allow for scripted and unscripted tests that include evaluation of system documentation (e.g., user manuals, operator manuals). The PM and Test Lead work with customers to choose a final test option and to receive approval. Approval documentation should be kept with the Test Plan. Results are reviewed by the PM and documented in the Acceptance Test Report.

8.1.4 Generate Section 508 Assistive Technology Interoperability Test Results (As Applicable)
Requirements set forth in 29 U.S.C. Section 508, as amended by P.L. 105-220, August 1998, include standards that define the types of technology covered. It also provides for standards for the minimum level of accessibility for electronic and information technologies in the federal sector, including those used for communication, duplication, computing, storage, presentation, control, transport, and production.10 Section 508 Accessibility requirements define functional performance criteria to ensure that the individual assistive technology components defined as part of a DHS solution are interoperable and can be used to create an accessible product. Tests must be performed and test reports generated to prove that Section 508 Accessibility requirements for assistive technology interoperability are met by the system. These test results must be documented in a Section 508 Assistive Technology Interoperability Test Report. For more information on assistive technology interoperability, contact the DHS OAST.

8.1.5 Conduct Security Test and Evaluation (IT Only)
Results of the ST&E and the Contingency Plan tests are documented in the SAR. Additional information on the ST&E and SAR activities is provided in the Department of Homeland Security Certification and Accreditation Guidance for SBU Systems User’s Manual. In addition, POA&Ms must be created for any weakness that is to be mitigated as part of the C&A process. Weaknesses that will be accepted and not mitigated are documented in the final SAR and agreed to by the Designated Accrediting Authority (DAA) prior to operation. POA&Ms focus on managing on-going corrective actions.11

8.1.6 Prepare Security Certification Documentation (IT Only)
With the successful completion of the ST&E (Certification phase), the security accreditation package is assembled and submitted to the DAA for certification signature. The minimum acceptable accreditation package contains the following documents:
• SSP
• SAR
• POA&M
• Certifying Official Transmittal Letter

Supplemental information can be provided as requested by the DAA or Certifying Official. This supplemental information can include items such as the Contingency Plan, final RA, CM plan, standard operating procedures, CONOPS, and other documents.
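A trivial completeness check over the minimum package contents listed above is sketched here; the folder and file names are hypothetical and are not part of the DHS C&A process.

    import os

    # Minimum accreditation package contents per the list above; file names are hypothetical.
    required_documents = {
        "SSP": "ssp.pdf",
        "SAR": "sar.pdf",
        "POA&M": "poam.xls",
        "Certifying Official Transmittal Letter": "transmittal_letter.pdf",
    }

    package_dir = "accreditation_package"   # hypothetical local folder holding the package
    missing = [name for name, filename in required_documents.items()
               if not os.path.exists(os.path.join(package_dir, filename))]

    if missing:
        print("Package incomplete; missing:", ", ".join(missing))
    else:
        print("Minimum accreditation package contents are present.")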

8.1.7 Complete Privacy Impact Assessment (IT Only)
Prior to the completion of the Integration and Test stage, the PIA must be completed and approved by the Privacy Office.

10 Summary of Section 508 Standards, General (Sub-Part A), www.section508.gov.
11 For detailed directions or guidance on the DHS POA&M process, consult the DHS 4300A Sensitive Systems Handbook, Attachment H -- POA&M Process Guide.


8.1.8 Complete System of Records Notice (IT Only)
Prior to the completion of the Integration and Test stage, the SORN must be completed and approved by the Privacy Office. See Section 6.1.10 for more information on the SORN.

8.1.9 Complete Insertion Packages (IT Only)
The SIP consists of all the information that must be submitted by the Submitter to the EAB for a decision to update the HLS EA SRM in order for Components to formally state a need for an enterprise (inter-Component or inter-Agency) service or to deploy a service on an enterprise basis. For acquisitions that are using services, the project team must provide the appropriate supporting documentation to assist the decision process. The SIP form has been arranged into three sections that correspond to key stages in the process flow for the SIP review process. See the DHS EA Process Guide for more information on SIP requirements.

The Data Insertion Package consists of the Data Asset collection worksheet, information sharing agreements, and Information Exchange Packages. The complete package is required for EA COE and EDMO approval and must be submitted for registration. Updates to the Data Insertion Package by Components are key to an effective DHS DRM and improved information sharing opportunities.

8.1.10 Operational Testing
Normally programs and projects enter formal Operational Testing (OT) and Evaluation (OT&E) in this phase prior to production and deployment. OT is governed by Appendix L: Test and Evaluation Master Plan (TEMP) of the Instruction/Guidebook 102-01-001.

8.2 Documents
The project team records the results of its efforts in Stage 5, Integration and Test, in a set of documents identified in Table 8-1. Documents must be traceable to the project WBS. All documents should be placed under CM control. For a full listing of the documents, including other items that get updated during the Integration and Test stage, see Attachment B-5.


Table 8-1: Integration and Test Documents

Product | Governing Authority | Marker(s)
Operators Manuals | DHS SELC | C F
Maintenance Manuals | DHS SELC | C F
User Manuals | DHS SELC | C F
System Test Report | DHS SELC | C/F
Acceptance Test Report | DHS SELC | C/F
Section 508 Assistive Technology Interoperability Test Report | DHS OAST | C/F
Service Insertion Package (SIP) | DHS EA Process | C U F F
Security Assessment Report (SAR) | DHS CISO | C/F
Security Accreditation Package | DHS CISO | C/F
Privacy Impact Assessment (PIA) | Privacy Office | C/F
DHS Periodic Reporting | CPIC | C U U U
Production Readiness Review Approval Letter | DHS SELC | C/F

8.3 Production Readiness Review and Exit Criteria
The project sponsor and the PM must conduct the PRR to review the results of the Integration and Test Stage to validate that the system developed meets the defined requirements, to evaluate the exit criteria to ensure that all exit criteria have been met, and to assess system readiness for the move to production. The PRR includes the participation of the approval authority, who confirms that all exit criteria are satisfied by signing the PRR Approval Letter to indicate the project can proceed to the Implementation stage. It is expected that the approval authority will rely on the appropriate experts (e.g., EA, testing, security, budget) to evaluate the readiness of each project to proceed. The PRR Approval Letter is countersigned by the PM. A scanned electronic copy of the signed stage review approval letter, along with the updated project management plan (and associated project schedule), must be emailed to the DHS Periodic Reporting team at [email protected]. Should the project not gain PRR approval, the issues must be addressed and another PRR with appropriate sign-offs must be conducted.

8.3.1 Exit Criteria
Table 8-2 lists the exit criteria for the Integration and Test stage.


Table 8-2: Integration and Test Exit Criteria

Project Management
• Have the risks been reviewed and are they deemed acceptable for moving to the next stage?
• Is the project on schedule or have remediation plans been developed to correct for schedule loss?
• Have all products defined in the approved Project Tailoring Plan been completed and reviewed for completeness?
• Have all Action Items from the TRR been resolved?
• Have all documents from previous stages been updated as necessary to reflect new information and decisions from the current stage?

Requirements
• Has the system been user-tested by individuals with the knowledge, tools, and ability to use assistive technologies commonly used by people with disabilities and applicable to Section 508 requirements?

Information Security
• Has the ST&E been completed?
• Has the Contingency Plan been tested?
• Has the SSP been updated?
• Has the SAR been completed?
• Have POA&Ms been completed as required?
• Has the Security Accreditation Package been assembled?

Privacy
• If appropriate, has a SORN been published?
• Have all system functionalities been tested against requirements and limitations in privacy compliance documentation?

Performance
• Have all performance problems identified in system integration tests been resolved and documented?
• Has a range of performance scenarios been tested, considering possible peak workloads and competition for resources?
• Have all end-to-end performance tests been passed?
• Have all scalability issues been resolved?

Data Management
• Has the data load been successfully tested?
• Has the Data Insertion Package been updated and submitted?

Configuration Management
• Is the production system under CM control and ready to be pushed to the production environment?

Testing
• Have all integration tests been completed successfully?
• Did end users successfully complete acceptance testing?
• Does the test report identify the number of defects, their severity level, and their expected resolution date?
• Do all defects have resolution plans?
• Have all defects, variations, problems, and known errors been recorded in a defect repository?
• Is Section 508 acceptance testing complete?

Enterprise Architecture
• Has user acceptance testing identified any gaps in required capabilities?
• Will the system provide all of the business capability as planned?
• Are there gaps in business capability?
• Is the data required by this system already available or will it be made available?
• Does the system include all components assigned to it for each release?
• Does the system include all technology assigned to it for each release?
• Is the EA Alignment Template complete through Section 5b?
• Are all changes required for the EA Alignment Template completed for this stage?

Software Engineering
• Have all SLAs been negotiated, agreed to, and signed by all parties?

Infrastructure
• Do systems and facilities affected by the release have a current Authority to Operate (ATO)?
• Has equipment installation been coordinated with site-specific personnel?
• Has the appropriate Help Desk been notified of impending changes?
• Have end user workstations been tested to ensure software interoperability?

Operations and Maintenance
• Have all user, operator, and maintenance manuals and procedures been reviewed, tested, and accepted by the operations team?
• Have all operational and failover conditions been tested, if appropriate?
• Have all operations and batch processing schedules been defined in an SLA?

Transition
• Have all training manuals been reviewed and accepted by the operations team?

9. Stage 6: Implementation

The objective of Stage 6, Implementation, is to prepare the system, operational environment, organization, and users for the intended use of the new solution. For information on Transition to Support and Training activities, see Appendix J: Supportability and Sustainment, of the Instruction/Guidebook 102-01-001.

9.1 Implementation Activities
The activities associated with the Implementation stage include completing the preparation of operational sites and deploying the solution to the production environment; performing data extractions, transformations, and loads into production; coordinating changes to business practices; conducting training; developing version description documents; developing a transition-to-support document; finalizing the identification of critical infrastructure assets; conducting pilots and/or parallel operations, if required; publishing a SORN and acquiring Privacy Office affirmation; and obtaining an ATO letter. The set of activities actually performed for a given project in the Implementation stage may differ depending upon the approved Project Tailoring Plan. The following represent the activities performed as part of the Implementation stage.


9.1.1 Conduct Pilot
A pilot is an initial release of a system, not a prototype or a separate project. A pilot is developed using full SELC requirements, including those for HLS EA alignment, privacy, security, and Section 508 Accessibility.

Privacy compliance requirements do not distinguish between the use of personally identifiable information in operations versus its use in test activities or in pilots. Thus, in order for personally identifiable information to be used as part of a pilot and/or as part of any parallel operations, the privacy compliance documentation must be drafted and approved to accommodate that specific type of use. The privacy compliance documentation must be finalized, approved, and published (as required) prior to the use of personally identifiable information for testing and/or parallel operations. Compliance with all privacy requirements is required prior to use.

Pilots are often used in parallel operation with the current system in order to provide the opportunity to analyze pilot results through information reconciliation and data integrity validation. Information and reports generated from the “old” system can be used as a point of reference for the new system for analysis and results documentation. Pilot results provide feedback on the implementation process, the "actual versus planned" functional capability, and end-user acceptability. They can also enhance decision making for current and future projects. More information on the use of pilots can be found in the appendix of the EA Governance Process Guide.

9.1.2 Complete Preparation of Operational Sites and Deploy Solution to Production Environment
All stakeholders must be informed of implementation schedules and increases in resource requirements prior to deployment. If any changes to the network and/or systems environment have occurred since the initial design plan was submitted and approved, the design and implementation plan should be modified to reflect those changes. In addition, any interface and data conversion constraint(s) should be documented, including recommended work-arounds and/or required modification to the original plan. If the constraints will have a significant impact on resource requirements, the revised plan should be submitted to the ARB for review and approval. The operational sites must be prepared and all hardware must be installed. The CM team should load the latest baseline version of the system to all hardware items and a final regression test should be conducted to ensure that all capabilities are functioning properly.

9.1.3 Perform Data Conversion and Load Production Data (IT Only)
Data may be extracted from legacy systems, converted or transformed, and uploaded into the target system at multiple points within the development life cycle. Security and privacy policies must be adhered to. It is also advisable that as each batch upload or system interface procedure is completed, both data integrity and business function testing procedures be applied. These procedures may include running parallel operations testing, reconciliations, or combinations of both. Depending upon the size and complexity of legacy data, converting legacy data and loading them to a production environment may take considerable time and may need to be started before the Implementation stage.
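The sketch below illustrates the kind of extract-transform-load batch with a reconciliation count check described above; the table names, column names, and name-format transformation are purely hypothetical and do not represent any DHS system.

    import sqlite3

    # Hypothetical example: migrate legacy person records into the target system
    # and reconcile row counts after the batch load.
    legacy = sqlite3.connect(":memory:")
    target = sqlite3.connect(":memory:")

    legacy.execute("CREATE TABLE legacy_person (id INTEGER, full_name TEXT)")
    legacy.executemany("INSERT INTO legacy_person VALUES (?, ?)",
                       [(1, "DOE, JANE"), (2, "ROE, RICHARD")])
    target.execute("CREATE TABLE person (person_id INTEGER, last_name TEXT, first_name TEXT)")

    # Extract from the legacy source, transform the name format, load into the target.
    rows = legacy.execute("SELECT id, full_name FROM legacy_person").fetchall()
    transformed = []
    for person_id, full_name in rows:
        last, first = [part.strip().title() for part in full_name.split(",")]
        transformed.append((person_id, last, first))
    target.executemany("INSERT INTO person VALUES (?, ?, ?)", transformed)
    target.commit()

    # Reconciliation: confirm the batch loaded completely before business-function testing.
    source_count = legacy.execute("SELECT COUNT(*) FROM legacy_person").fetchone()[0]
    target_count = target.execute("SELECT COUNT(*) FROM person").fetchone()[0]
    assert source_count == target_count, "Row counts do not reconcile; investigate before proceeding"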

9.1.4 Coordinate Changes to Corresponding Business Practices
Organizational change management is primarily the responsibility of the project sponsor and end users, but the project team must identify (or recommend) the business process changes required to help the system and its users operate efficiently. Business practice changes are often included in the functional business requirements, CONOPS, and/or BIA as well as in the user training documentation developed in previous stages. This coordination activity should be scheduled in concert with training and deployment activities. Effective management requires that follow-up evaluations be performed in order to ensure that the business process changes are meeting the mission needs as planned.

9.1.5 Publish System of Records Notice and Acquire Privacy Office Affirmation (IT Only)
A SORN is a publicly published document that describes a certain type of data collection and how that data is used. A “system of records” is a group of any records under the control of any agency from which information is retrieved by the name of the individual or by some identifying number, symbol, or other identifier assigned to the individual. The Privacy Act requires each agency to publish notice of its systems of records in the Federal Register. Prior to implementation, the project must receive an affirmative statement from the DHS Privacy Office that all privacy compliance requirements are met. For more information on DHS privacy compliance requirements, contact the Privacy Office, U.S. Department of Homeland Security, Washington, DC 20528, available via email [email protected]. Examples of PIAs and SORNs, as well as additional educational materials related to compliance, are available on the DHS Privacy Office website: www.dhs.gov/privacy.

9.1.6 Obtain Authority to Operate Letter (IT Only)
Once security testing is completed and final measures employed, an accreditation package is prepared for obtaining approval to operate from a DAA. In order to operate (and possibly pilot) a DHS system, an ATO letter must be signed by the DAA, finalizing the accreditation process.

9.2 Documents
The project team records the results of its efforts in Stage 6, Implementation, in a set of documents that is presented in the Implementation Stage Document summary, shown in Table 9-1. Documents must be traceable to the project WBS. All documents should be placed under CM control. For a full listing of the documents, including other items that get updated during the Implementation stage, see Attachment B-5.


Table 9-1: Implementation Stage Documents

Product | Governing Authority | Marker(s)
Pilot Results Report | ARP (DIR 102-01) | C/F
Follow-on Test Results | ARP (DIR 102-01) | C/F
Version Description Document | DHS SELC | C/F
Critical Infrastructure Protection Report | CIP | C/F
System of Records Notice (SORN) | DHS Privacy | C/F
Authority To Operate (ATO) Letter | DHS CISO | C/F
Operational Readiness Review Approval Letter | DHS SELC | C/F

9.3 Operational Readiness Review and Exit Criteria
An Operational Readiness Review (ORR) is conducted at the conclusion of the Implementation Stage to review the results of the stage and to evaluate whether the system, as implemented, continues to meet mission need and is ready to be moved into production. The ORR requires the participation of the approval authority, who confirms that all exit criteria have been satisfied by signing the ORR Approval Letter to indicate the project can proceed to the Operations and Maintenance stage. It is expected that the approval authority will rely on the appropriate experts (e.g., EA, testing, security, budget) to evaluate the readiness of each project to proceed. The Project Manager countersigns the ORR Approval Letter. A scanned electronic copy of the signed stage review approval letter, along with the updated project management plan (and associated project schedule), must be emailed to the DHS Periodic Reporting team at [email protected]. Should the project not gain ORR approval, the issues must be addressed and another ORR with appropriate sign-offs must be conducted.

9.3.1 Exit Criteria
The exit criteria for the Implementation stage are listed in Table 9-2.

Table 9-2: Implementation Exit Criteria

Program Management
• Have the risks been reviewed and are they deemed acceptable to move to the next stage?
• Is the project on schedule or have remediation plans been developed to correct for schedule loss?
• Have all products defined in the approved Project Tailoring Plan been completed and reviewed for completeness?
• Have all Action Items from the PRR been resolved?
• Have all documents from previous stages been updated as necessary to reflect new information and decisions from the current stage?

Information Security
• Has the C&A package been signed by the DAA?
• Have all users, administrators, and operators successfully completed security awareness training?
• Has the ATO Letter been signed by the DAA?

Performance
• Have all performance problems identified in the production environment tests been resolved?
• Has the model/simulation of the system been archived for possible future use during operations?

Data Management
• Have all data been loaded into the system and are they ready for use?
• Are all external data sources ready and available for the deployment of this system?

Configuration Management
• Has this system been properly placed under CM control?
• Does a CCB exist to evaluate and approve proposed changes to the system baseline?

Testing
• Have all production environment tests been completed satisfactorily?
• Are all continuity and recovery processes and procedures complete and tested?

Enterprise Architecture
• When deployed, will the system provide all of the business capability as originally planned?
• Have any new business capabilities been identified?
• Has user acceptance testing identified any gaps in required capabilities?
• Is it known who will be utilizing the data created by this system?
• Are the requisite Information Sharing Agreements in place?
• Are the application components being deployed as planned?
• Is the technology interoperable with the infrastructure?
• Is the technology being deployed still in alignment with the DHS TRM?
• Are all changes required for the EA Alignment Template completed for this stage?

Software Engineering
• Have all created services been added to the DHS Service Catalogue and submitted to the EA PMO for registry in the service component reference model?
• Are all the availability calculations (algorithms) agreed upon and documented in an SLA?

Infrastructure
• Is the production environment ready for use?
• Can this system/enhancement be deployed into the production environment given the current threat level or security posture (e.g., system lock-down)?

Operations and Maintenance
• Are all O&M staff ready to begin operations?
• Is all scheduled downtime documented in an SLA and agreed upon by all affected stakeholders?

Transition
• Have all users, operators, and maintenance personnel been adequately trained on the new system?

10. Stage 7: Operations and Maintenance

The objective of Stage 7, Operations and Maintenance (O&M), is to operate the system, make minor enhancements to the system, and conduct periodic reviews (e.g., security). O&M personnel must monitor the current system, identify problems to be fixed, and identify ways to improve the system.


10.1 Operations and Maintenance Activities
The following subsections represent the typical activities performed in the O&M stage.

10.1.1 Operate System and System Documentation
Regular maintenance (e.g., backup, archival) schedules must be documented and adhered to in order to ensure optimum system/asset performance. Baseline documentation (i.e., Mission Need Statement, Business Case, Security Certifications, IT Contingency Plan, SSP, CM Plan, Incident Response Plan, Risk Assessment, POA&M, PIA, and Section 508 Accessibility Plan) should be regularly maintained and updated, as appropriate.

10.1.2 Identify and Make System Enhancements
Users, customers, and maintenance personnel identify modifications to the system needed to resolve issues, enhance system performance, or provide new capabilities. New capabilities may take the form of routine maintenance or constitute enhancements to the system in response to user requests for new or improved capabilities. All system changes shall be reviewed and approved by the system’s Configuration Control Board before implementation in accordance with the system’s CM Plan. Any changes to the system baseline must be made within the CM system and pushed into production, according to the system’s CM Plan.

IT Only: While minor enhancements and modifications are expected during O&M, major enhancements to a system in O&M must be treated as new IT projects and comply with the full life cycle requirements of the SELC. Major enhancements are any combined changes that exceed $2.5 million in cost from planning through deployment (excluding O&M costs) unless exempted by the Component CIO.

10.1.3 Test Disaster Recovery
Disaster recovery plans are tested during this phase. To the extent possible, all testing should be accomplished by simulating actual conditions. Problems identified during testing, along with optimum solutions, should be discussed and disseminated to appropriate parties as lessons learned. Disaster recovery testing should be performed at regular, periodic intervals in order to create and maintain a sense of awareness and preparedness among personnel. System documentation and procedures should be updated to reflect any corrections made to deficiencies identified during these exercises.

10.1.4 Document Post Implementation Review Results
Within six months of deployment, the PIR should be conducted. The PIR should document deployment/implementation and coordination issues, how they were resolved, and how they could be prevented in future releases. The results of the PIR are documented in the PIR Results Report, which describes how the newly implemented solution meets business requirements. The PIR focuses on three areas:
• Impact to stakeholders and customers
• Delivery of expected capability
• Achievement of baseline goals

The PIR is to be conducted six months after the solution is deployed, with the exception of projects that have been cancelled. Cancelled projects are to conduct a PIR immediately. For more information on the PIR and PIR Results Report, see the DHS CPIC Guide.

10.1.5 Conduct Operational Analyses
Operational analysis is the method used to measure the performance and cost of an established project in the O&M phase against baselines, as defined in DHS Operational Analysis Guidance. The objective is to measure project achievement in meeting cost, schedule, and performance goals. Major investments that are in the steady state or O&M phase are required to use Operational Analysis as the performance-measurement process to measure the performance and cost of those assets against the established baseline. Operational analyses should be conducted annually or tailored to the nature of the asset. For more information on Operational Analysis, refer to the DHS Operational Analysis Guidance.
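As a simple illustration of measuring achievement against a baseline, the sketch below compares hypothetical actual annual cost and performance figures to baselined values; the figures, measures, and thresholds are assumptions for illustration only and are not DHS guidance.

    # Hypothetical baseline vs. actual figures for one year of O&M.
    baseline = {"annual_cost": 4_000_000, "availability_pct": 99.0, "avg_response_sec": 2.0}
    actual   = {"annual_cost": 4_300_000, "availability_pct": 98.6, "avg_response_sec": 2.4}

    # Cost variance against the established baseline, expressed as a percentage.
    cost_variance_pct = 100 * (actual["annual_cost"] - baseline["annual_cost"]) / baseline["annual_cost"]
    print(f"Cost variance: {cost_variance_pct:+.1f}% against baseline")

    # Performance measures are compared side by side for the analysis report.
    for measure in ("availability_pct", "avg_response_sec"):
        print(f"{measure}: baseline {baseline[measure]}, actual {actual[measure]}")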

10.1.6 Develop Lessons Learned Report
The Lessons Learned Report defines opportunities for improving enterprise processes based on the experience of the acquisition team and other stakeholders during the life cycle of the acquisition. The content for the Lessons Learned Report comes from PIR findings, from various reviews held, or from the conduct of required activities during the project life cycle. The objective is to make lessons learned from DHS acquisitions available throughout the Department in order to increase the probability of success for future acquisitions through the improvement of processes, tools, and other project-related entities. For more information on lessons learned, refer to the DHS CPIC Guide.

10.1.7 Perform Continuous Monitoring (IT Only)
Security activities conducted during the O&M stage include security administration, oversight and monitoring of the security controls on an ongoing basis, and informing the DAA when changes that may impact the security of the system occur. Security monitoring, security update, and security incident reporting activities, among others, are presented in Table 10-1.


Table 10-1: O&M Security Monitoring, Security Updates, and Security Reporting

Security Operations and Administration
• Perform backups
• Hold annual security awareness classes for all users
• Maintain current user administration and access privileges
• Hold specialized security training for security personnel
• Update security software

Configuration Management and Control
• Use DHS CM and control procedures to document proposed or actual changes to the system (including hardware, software, firmware, and surrounding environment)
• Analyze changes to determine their security impact

Ongoing Security Control Verification
• Identify potential security-related problems in the system not identified through the security impact analysis conducted during the CM task
• Select an appropriate set of security controls and evaluate the effectiveness of the controls through security reviews, self-assessments, security testing and evaluation, vulnerability assessments, or security audits (selection to be made by the Information Security System Engineer and project sponsor)

SSP Maintenance and Security Incident Reports
• Update the SSP as changes occur (annually, at a minimum) to reflect the most recent changes to the system and any identified or potential security impacts
• Report proposed or actual changes and identified or potential security impacts to the authorizing official or designated representative
• Use information from status reports to determine the need for security re-accreditation

C&A Maintenance
• Validate that existing documentation meets requirements for a signed ATO letter (operational systems must have a signed ATO letter that is good for 3 years or is issued when a major change occurs to the system or the system environment)
• Validate that C&A documentation (defined in the Department of Homeland Security Certification and Accreditation Guidance for SBU Systems User’s Manual) is current and is uploaded to the CISO-approved FISMA reporting tool

NIST SP 800-53 Controls Test
• Perform annual tests of NIST SP 800-53 controls using NIST SP 800-53A tests

POA&M Updates
• Develop corrective action plans for newly identified weaknesses
• Provide monthly status updates for previously identified weaknesses

Contingency Plan
• Update the Contingency Plan as changes occur (annually, at a minimum)
• Perform annual tests of the Contingency Plan

Monthly Updates for Security Reports
• Perform monthly updates of IT system data included in the CISO-approved FISMA reporting tool to facilitate 1) Information Security Scorecard development, 2) quarterly FISMA report development, and 3) annual FISMA report development

10.1.8 Develop Section 508 Accessibility Incident Remediation Report (IT Only)
The Accessibility Incident Remediation Report (29 U.S.C. Section 508) documents all end-user problems with Section 508 Accessibility and describes the changes made to remediate the problem(s). The report is included in project documentation with the Section 508 Accessibility Plan; a copy is to be provided to DHS OAST. For more information on the Section 508 Accessibility Incident Remediation Report, refer to DHS OAST.


10.2 Documents
The project team records the evidence of its work in the set of documents listed in the project’s tailoring plan. The determination of applicable items for a project is made based on the approved tailoring plan for the project. For a full listing of the documents, including others that get updated during the O&M stage, see Attachment B-5.

Table 10-2: O&M Documents

Product | Governing Authority | Marker(s)
Performance Reports | ARP (DIR 102-01) | U
Post Implementation Review (PIR) Results | CPIC | U
DHS Periodic Reporting | CPIC | U U
Operational Analyses | CPIC | U
Lessons Learned | CPIC | U
FISMA metrics reports | DHS CISO | U
Security Incident reports | DHS CISO | U
C&A Updates (every 3 years or when major change is made) | DHS CISO | U
Privacy Documentation (updated for systems decommissioned) | Privacy Office | U

11. Stage 8: Disposition

Disposition is the act of eliminating all or parts of a system to include IT (e.g., shutting down databases) and non-IT (e.g., decommissioning helicopters). The emphasis in Disposition is to ensure that the system (or parts of the system), data, procedures, and documentation are packaged and archived in an orderly fashion, making it possible to reinstall and bring the system back to an operational status if necessary, and to retain all data records in accordance with DHS and Federal policies regarding retention of electronic records. A Disposition Plan is required to address all facets of archiving, transferring, and disposing of all or a part of a system and its corresponding data.

For IT systems, particular emphasis is given to proper preservation of the data processed by the system so that they are effectively migrated to another system or archived in accordance with applicable records management regulations and policies for potential future access. (Refer to Attachment B-2, Section B2.14, for more information on Electronic Records Management.) For example, a common error is to dispose of legacy tape drives without testing to verify that newer technology can read the data from the legacy tapes or planning and maintaining an extract, transform, and load capability to convert legacy data into a format usable by the new system.

11.1 Disposition Activities
Refer to Appendix J for a detailed discussion of Disposition Phase Activities.


11.2 Documents
The project team records the evidence of its work in the set of documents listed in the project tailoring plan. The determination of applicable documents for a project is made based on the approved tailoring plan for the project. For a full listing of the documents, including other items that get updated during the Disposition stage, see Attachment B-5.

Table 11-1: Disposition Documents

Product | Governing Authority | Marker(s)
Disposition Approval Request | DHS SELC | C/F
Archived Data | DHS SELC | C/F
Archived System | DHS SELC | C/F


Attachment B-1
Systems Engineering Life Cycle Development Methodologies

This attachment provides additional information on the DHS SELC system development methodologies discussed in Section 2. The methodologies described are Spiral, Iterative/Incremental, and Waterfall. Each of the following sections discusses the advantages and disadvantages of each methodology and the applications for which each is most suitable.

B1.1 Spiral Development Methodology

The Spiral methodology is a risk reduction-oriented development methodology that breaks a project into multiple sequential cycles for development. It was conceived through recognition that projects are risky and need an approach to better manage fluctuations in project direction. Each cycle sequentially addresses one or more major risks of the project until all major risks have been addressed. Poorly understood requirements, unfamiliar technology, changes in mission, and potential performance concerns are examples of project risk.
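To make the cycle structure concrete, the following sketch (an illustration only, not part of the DHS SELC) walks a project's major risks in order, prototyping and refining within each cycle until no major risks remain; the risk names and refinement steps are hypothetical.

    # Illustrative sketch of the risk-driven Spiral cycle described above.
    major_risks = ["poorly understood requirements", "unfamiliar technology", "performance concerns"]
    requirements, design = [], []

    def prototype(risk):
        # Stand-in for building and evaluating a prototype that explores one risk.
        print(f"Prototyping to explore: {risk}")
        return {"risk": risk, "resolved": True}

    for cycle, risk in enumerate(major_risks, start=1):
        print(f"Cycle {cycle}: plan the cycle and identify the major risk")
        result = prototype(risk)
        # Refine requirements and design based on what the prototype showed,
        # then commit to the approach for the next risk in the sequence.
        requirements.append(f"refined requirements after addressing {result['risk']}")
        design.append(f"design decision committed after cycle {cycle}")

    print("All major risks addressed; proceed through integration, test, and implementation.")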

Figure B1-1 depicts the Spiral methodology. The approach is to start the project on a small scale (shown at the center of the spiral), identify a major risk facing the project, explore the risk with a prototype, refine requirements and design based on the success of the prototype, and commit to an approach for the next risk(s) in the sequence. Each cycle of the spiral leads the project team to a better understanding of mission needs, user needs, technology limitations, and performance considerations. All SELC stages are executed and most are executed multiple times. The same is true for the SELC stage reviews – all are held and some may be held more than once.

Figure B1-1. Spiral Methodology

The strengths of the Spiral methodology are that 1) not all requirements need to be well understood at the beginning of the project and 2) there is a heightened focus on risk reduction. Risk is accounted for early in the project life cycle, before large-scale development activity has begun, through the use of prototypes.


The methodology provides frequent checkpoints for risk mitigation and management oversight. If the risk of a given cycle proves insurmountable, an alternative solution can be explored.

The assessment of the Spiral development methodology presented in Table B1-1 is adapted from the Department of the Air Force, Guidelines for Successful Acquisition and Management (GSAM) Condensed Version (February 2003).

Table B1-1: Spiral Methodology: Advantages vs. Disadvantages

Advantages:
• Provides better risk management than other methodologies
• Requirements are better defined
• System is more responsive to user needs

Disadvantages:
• More complex and harder to manage
• Usually increases development costs and schedule

Application: The Spiral method should be considered for projects where risks are high, requirements must be refined, and user needs are important.

B1.2 Iterative/Incremental Methodology

Iterative and Incremental methodologies are similar enough to be described in a single section of this SELC Guide. In the Spiral methodology, a single project cycles through sequential iterations of the SELC stages, starting with a small group of allocated requirements and building greater understanding, functionality, and an increased number of requirements with each spiral. The Iterative and Incremental methodologies, by contrast, segment the requirements among multiple sub-projects and process them concurrently through the remaining stages of the SELC.

To initiate Iterative/Incremental development, a project team divides the project into smaller sub-projects that develop a functional component or build of a system. Project requirements are allocated to each functional sub-project. Each sub-project is then independently and concurrently managed to completion. It is similar to the Waterfall methodology, to be described in the next section, in that each stage must be completed before the subsequent stage can begin.

In the Iterative methodology, depicted in Figure B1-2, each sub-project cycles through the Requirements, Design, Development, and Integration and Test stages several times, with each iteration building upon the previous one. After all cycles are completed, the production system is implemented in the production environment.

Figure B1-2. Iterative Methodology. (The figure shows an approved project within the DHS SLC divided into sub-projects, each cycling repeatedly through planning, requirements, design, develop/FQT (developmental test), and integration and test before a single implementation, followed by operations and support and Disposition.)

The Incremental methodology, by contrast, allocates requirements among multiple sub-projects (increments) but cycles only once through design, development, test, and implementation, as depicted in the diagram in Figure B1-3. The key distinction is that each sub-project (increment) is deployed into production as it is completed.

Figure B1-3. Incremental Methodology. (The figure shows an approved project within the DHS SLC divided into increments, each passing once through planning/requirements, design, development, and integration and test, and each deployed to implementation, operations and support, and eventually Disposition as it is completed.)

The fundamental similarity between the Iterative and Incremental methodologies is that both require a good understanding of the requirements at the beginning of the project. The Iterative and Incremental methodologies are best suited to those projects with a well-defined and stable scope that need the flexibility to divide the project into multiple parallel efforts.

Table B1-2 presents the advantages and disadvantages of Iterative/Incremental methodologies and is adapted from the Department of the Air Force, GSAM Condensed Version (February 2003).


Table B1-2: Iterative/Incremental Methodology

Advantages:
• Provides feedback, allowing later development cycles to learn from previous ones
• Allows some requirements modification; may allow addition of requirements
• Usable product is available with first release; each cycle delivers functionality
• Project can be stopped any time after the first cycle and delivers a working product
• Risk is spread over multiple cycles
• Project management is easier for incremental projects

Disadvantages:
• Majority of requirements must be known at beginning
• Formal reviews are more complex for incremental releases as opposed to a single development effort
• Interfaces must be well defined at outset due to development over multiple iterations
• Cost and schedule overruns may result in an unfinished system
• More frequent impact to operations with multiple releases

Application (good for projects):
• Needing early delivery of functionality
• With stable requirements at the beginning
• That benefit from feedback of earlier cycles
• With funding uncertainties, as each cycle produces a working system
• With low to medium levels of risk

B1.3 Waterfall Methodology

The Waterfall methodology is a highly structured development process and the one most commonly used. It is referred to as Waterfall because the stages cascade from one to the next with limited, if any, opportunity to revisit a completed stage. In the Waterfall methodology, stakeholders “sign off” at each stage, agreeing that the documentation is sufficient to move to the next stage of development.

Figure B1-4 depicts the Waterfall methodology and its traditional cascading of stages.

Figure B1-4. Waterfall Methodology. (The figure shows an approved project cascading through the Planning, Requirements Definition, Design, Development, Integration and Test, Implementation, Operations and Support, and Disposition stages, with the PPR, SDR, PDR, CDR, TRR, PRR, and ORR reviews at stage transitions and the PIR following implementation.)


The Waterfall methodology is documentation-intensive because early stages document the capability that must be achieved and subsequent stages provide greater granularity to requirements already defined, then define how the required capability will be achieved. The output from one stage serves as the input to the next stage, with the project flowing from one step to the next in a cascading fashion. Stages are typically sequential, with only localized feedback during the transition between stages. Comprehensive reviews validate the work of one stage and require the resolution of any problems before development is allowed to proceed to the next stage.

An important consideration for selection of the Waterfall methodology is that fixes or modifications are often put off until the maintenance stage (after implementation). This can be costly, as the cost of error correction is magnified in later stages. The advantages and disadvantages presented in Table B1-3 are adapted from the Department of the Air Force, GSAM Condensed Version (February, 2003).

Table B1-3: Waterfall Methodology: Advantages vs. Disadvantages

Advantages:
• Stages align with CPIC, ARP, and typical project management phases
• Cost and schedule estimates may be lower and more accurate

Disadvantages:
• Progress and success are not observable until later stages
• Errors or deficiencies made in earlier stages may not be discovered until implementation
• Risks are dealt with in a single development effort
• Only local feedback is available at stage transition points (due to sequential nature)
• Working product is not available until late in project
• Corrections are addressed in maintenance stage

Application (useful when):
• Requirements are well understood at project start
• Requirements are not expected to change/evolve over the life of the project
• Project risk is relatively low
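The selection guidance in Tables B1-1 through B1-3 can be summarized, for illustration only, in the short sketch below; the decision rules, input labels, and thresholds are simplifications assumed for this example and are not DHS policy.

# Illustrative (non-normative) helper that restates the selection guidance in
# Tables B1-1 through B1-3 as simple rules. Labels and rules are assumptions.

def suggest_methodology(risk: str, requirements_stable: bool, early_delivery_needed: bool) -> str:
    """risk is 'low', 'medium', or 'high'; returns a suggested methodology."""
    if risk == "high" or not requirements_stable:
        # Table B1-1: Spiral suits high risk and requirements that must be refined.
        return "Spiral"
    if early_delivery_needed:
        # Table B1-2: Iterative/Incremental suits stable requirements with early delivery needs.
        return "Iterative/Incremental"
    # Table B1-3: Waterfall suits well-understood, stable, relatively low-risk projects.
    return "Waterfall"

if __name__ == "__main__":
    print(suggest_methodology(risk="high", requirements_stable=False, early_delivery_needed=False))
    print(suggest_methodology(risk="low", requirements_stable=True, early_delivery_needed=True))
    print(suggest_methodology(risk="low", requirements_stable=True, early_delivery_needed=False))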


Attachment B-2: Other SELC Considerations

B2.1 Index of Required Project Elements

As documented in Section 2.5 of this SELC Guide, project managers should evaluate each of the required elements below for applicability to their DHS projects. Table B2-1 lists the sections in which these elements are discussed.

Table B2-1: Index of Required Project Elements

Element – Section
Project Management – B2.2
Organizational Change Management (OCM) – B2.3
Enterprise Architecture Alignment – B2.4
Requirements Definition – B2.5
Prototyping – B2.6
Information Security – B2.7
Privacy Compliance – B2.8
Critical Infrastructure Protection – B2.9
Human Factors Engineering – B2.10
Section 508 EIT Accessibility – B2.11
Risk Management – B2.12
Independent Validation and Verification (IV&V) – B2.13
Electronic Records Management – B2.14
National Information Exchange Model (NIEM) – B2.15

B2.2 Project Management

B2.2.1 Definitions

The following paragraphs define project management-related terms found in this DHS SELC Guide for use in DHS acquisitions.

Project Management: Project Management is the application of knowledge, skills, tools, and techniques to a broad range of activities in order to meet the requirements of a particular project. The Project Management Institute (PMI) groups Project Management activities into five process groups: Project Initiation, Project Planning, Project Execution, Project Monitor and Control, and Project Close (known collectively as the project life cycle).12 PMI also identifies nine supporting knowledge areas: Project Integration Management, Project Scope Management, Project Time Management, Project Cost Management, Project Quality Management, Project Human Resources Management, Project Communications Management, Project Risk Management, and Project Procurement Management.

Program: A program is a group of related projects or a project that is large in scope and dedicated to a specific purpose. In this DHS SELC Guide, for the sake of convenience, the term “project management” will be used to mean either “program” or “project” management.

Project: PMI defines a project as a temporary endeavor undertaken to create a unique product, service, or result.

12 PMI website, Professional Practices section, “About the Profession,” www.pmi.org.


Program Manager: DHS Management Directive (MD) 0784 defines the role of the Program Manager and the associated tasks as follows:13 A Program Manager is the responsible person who, with significant discretionary authority, is uniquely empowered to make final the scope of work, capital investment, and performance acceptability decisions (for an assigned program[s]), and who is responsible for accomplishing program objectives or production requirements through the acquisition of in-house, contract or reimbursable support resources, as appropriate. The Program Manager is responsible for management and oversight of the Integrated Project Team. In general, the Program Manager is the manager of an acquisition program, but may be a manager of a procurement that does not rise to the level of an acquisition program (i.e., janitorial services, HR services, bulk commodity purchases).

Project Management Methodology: A defined process (methodology) with the specific purpose of managing a project or program through all phases of the project life cycle. The methodology is a framework that acts as a mechanism for coordination in combining the responsibilities of managing both an organization (Integrated Project Team – IPT) and a process (project life cycle) for the life of a project.

Integrated Project Team: A group of subject matter experts (SME) who represent the interests of their sponsoring functional organizations in performing project activities as members of a project team under the direction of a Project Manager for the duration of a project.

Project Life Cycle: A standardized collection of distinct phases grouped together to create the process that a Project Manager uses to manage a project. PMI defines the set of common phases and activities in the project life cycle (Table B2-2) as follows:

Table B2-2: Common Phases and Activities in a Project Life Cycle14

Functional Phase – High Level Description of Common Activities
Project Initiation – Prepares a project proposal, obtains approval and reserves budget for the project, and establishes the project team.
Project Planning – Identifies stakeholders, defines project objectives, prepares the project plans for achieving the project objectives, and obtains final allocation of budget.
Project Execution – Implements the project plans; coordinates people and other resources to carry out the project plans.
Project Monitor & Control – Ensures that project objectives are met by monitoring and measuring progress periodically to identify variances from the baselines (e.g., earned value management – EVM), taking corrective action when necessary, tracking the variances, and managing changes.
Project Close – Brings the project to an orderly end, formalizes and communicates the acceptance or outcomes of a project, transfers deliverables to the clients or the ongoing responsible area, and conducts a post-implementation review.
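The earned value management (EVM) reference under Project Monitor & Control in Table B2-2 can be illustrated with the minimal sketch below. The formulas shown (cost and schedule variance, CPI, SPI) are the standard EVM relationships; the dollar figures and function name are illustrative assumptions, not SELC requirements.

# Illustrative earned value management (EVM) sketch. All figures are hypothetical;
# the formulas are the standard EVM relationships:
#   cost variance CV = EV - AC, schedule variance SV = EV - PV,
#   cost performance index CPI = EV / AC, schedule performance index SPI = EV / PV.

def evm_metrics(planned_value: float, earned_value: float, actual_cost: float) -> dict:
    """Return basic EVM variance and index metrics for one reporting period."""
    return {
        "cost_variance": earned_value - actual_cost,
        "schedule_variance": earned_value - planned_value,
        "cpi": earned_value / actual_cost,
        "spi": earned_value / planned_value,
    }

if __name__ == "__main__":
    # Hypothetical status: $500K of work planned, $450K earned, $480K spent.
    metrics = evm_metrics(planned_value=500_000, earned_value=450_000, actual_cost=480_000)
    for name, value in metrics.items():
        print(f"{name}: {value:,.2f}")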

B2.2.2 Benefits

Project management conveys several benefits. It:

• Provides a standard method to ensure that projects are defined, monitored, and implemented in a structured, consistent manner that promotes predictability and quality of outcome so that projects are completed on time and within scope and budget.

13 Office of the Chief Procurement Officer, Department of Homeland Security, Acquisition Oversight Program Guidebook, July 2005, p. 77, https://dhsonline.dhs.gov/portal/jhtml/tracking/viewdoc2.jhtml?doid=42070.
14 IEEE Std 1490-2003, IEEE Guide Adoption of PMI Standard – A Guide to the Project Management Body of Knowledge.


• Supports major functional processes within the life cycle, providing a uniform method to comply with and integrate multiple life cycle requirements and tasks (e.g., strategic planning, enterprise architecture [EA], investment planning, SELC) to ensure that projects meet stated goals.

• Uses a defined project management approach/methodology to coordinate the management of organizational needs and expertise with the management of the project management process in order to achieve results.

B2.2.3 Considerations

DHS Acquisition Managers, including Program Managers and Project Managers, are responsible for managing projects as “investment stewards,” ensuring compliance with MD 0784.15 Though this DHS SELC Guide does not define or specify a particular project management methodology to be used, it does make the following recommendations with regard to the management of DHS IT projects:

• Program and/or Project Managers must meet the PM certification requirements as defined in MD 0784.

• Project goals and objectives should directly relate to the DHS mission.

• Key stakeholders (internal and external) should be identified early and involved throughout the life of the project to ensure that business needs are accurately defined and met.

• Project Management Plans should identify and deliver measurable benefits to stakeholders, such as reduced cost, improved quality, and more efficient response time.

• Project Managers are responsible for identifying and communicating risks and issues that may hinder the successful delivery of a project.

• Project Management Plans must be flexible and should use a methodology that efficiently supports DHS’ dynamic development environment, which is characterized by requirements and priorities that can change due to external events.

• Project Management Plans must include program and project reviews defined in this DHS SELC Guide in order to ensure effective reporting and management of project progression.

• Project Managers should define levels of effort and project detail relative to the level of risk to the program, project, and/or task.

• Project Managers must develop and maintain comprehensive elements to support the project management processes of project initiation, project planning, project execution, project monitor and control, and project close, including a work breakdown structure, project schedule, ADEs, resource plans, and identification of critical path, in order to ensure effective and efficient project management and delivery to meet project objectives.

• Project Management Plans must include all appropriate standard elements found in best practice project management methodologies to manage the project (e.g., project communication plan, project training plan).

• Project Managers must ensure that all privacy compliance requirements (including analysis and documentation) are met prior to implementation.

15 Department of Homeland Security, Management Directive (MD) 0784, Acquisition Oversight Program, issued December 19, 2005, p. 2, https://dhsonline.dhs.gov/portal/jhtml/tracking/viewdoc2.jhtml?doid=4097002.


B2.3 Organizational Change Management

B2.3.1 Definition

Organizational Change Management (OCM) helps organizations plan for, adapt to, and integrate change that occurs as a result of changes in enterprise mission, IT systems,16 or process. The implementation, modification, or disposal of any enterprise IT system or component may result in significant change to related business processes, organizational structure or culture, and workforce. OCM is the set of management practices used to proactively assess and address the impact of significant organizational change at appropriate points in the life cycle. OCM activities (e.g., reviews, assessments, tasks, communications, and production of documents and products) should be conducted when necessary to promote project success.

Changes to enterprise mission or IT systems can result from several types of initiatives, such as:

• Business Process Engineering/Business Process Improvement

• Business Architecture Development, Transition, and Sequencing (EA)

• Project Prioritization/Re-prioritization via IT Portfolio Management

B2.3.2 Benefits

The benefits of employing OCM include the following:

• Ownership and commitment to the enterprise change among both the leadership team and other stakeholders

• Facilitation of new roles and responsibilities (e.g., strategy drives structure)

• Enhanced workforce readiness to support and successfully implement the mission

• Earlier identification and resolution of critical issues (e.g., unintended consequences)

B2.3.3 Considerations

• Organizational and business process improvements identified using OCM techniques should be integrated with the Homeland Security (HLS) EA Business Architecture.

• Information Technology (IT) does not drive OCM (e.g., processes can be improved to meet mission needs more effectively than simply by automating).

• While DHS does not mandate specific approaches to manage enterprise change, OCM is recognized as a best practice throughout government and industry.

• OCM should be employed for all major17 DHS development projects, especially those involving significant change to technology, process, or organization structure.

• Project Managers should incorporate OCM concepts and activities in the project plan (e.g., Business Impact Assessments, Outreach and Communication tasks).

16 Dr. Craig J. Petrun, The MITRE Corporation, Organizational Change Management – Practice Area Overview, Slide 1 of 27, August 3, 2006.
17 Refer to Directive 102-01 for the current investment thresholds for major projects.


B2.4 Enterprise Architecture Alignment

B2.4.1 Definition

The domain of EA includes baseline architecture, target architecture, and a sequencing plan.18 EAs are essential for evolving information systems, developing new systems, and inserting emerging technologies which optimize mission value. The existence of an EA is federally mandated by the Clinger-Cohen Act, which makes Chief Information Officers responsible for developing, maintaining, and facilitating the implementation of a sound and integrated IT architecture in their organizations.19 The DHS EA is aligned to the Federal Enterprise Architecture (FEA). The FEA is used to document, describe, and develop the baseline architecture, the target architecture, and the sequencing plan. Component EAs are required to align with the HLS EA. Individual projects are evaluated through the EA Governance process for architectural alignment. The HLS EA target architecture is evolving to a Service Oriented Architecture (SOA).

B2.4.2 Benefits

Among its benefits, an EA:20

• Establishes the enterprise-wide roadmap to achieve the enterprise mission within the IT environment

• Acts as a “blueprint” for systematically and completely defining the current (baseline) and desired (target) IT environment, thereby facilitating change and promoting standardization, alignment, and integration

• Improves communication and decision making through use of a standard vocabulary, the capture of enterprise facts, and support for enterprise analyses and decision making

• Provides a mechanism for sharing services

• Expedites the integration of legacy, migration, and new systems

• Ensures legal and regulatory compliance

• Facilitates transition to new EA concepts

B2.4.3 Considerations

• Official DHS guidance states that the role of the target architecture in the HLS EA is to establish a cohesive and consistent view of the future in terms of the data, software, and technology required to implement it.21

• The HLS EA contains:
– DHS architectural principles
– A baseline inventory of “as-is” business practices and technology resources
– A four-layer target architecture (consisting of business, data, applications, and technology layers)
– A transition strategy and plan intended to move the HLS enterprise toward the evolving target architecture

18 A Practical Guide to Federal Enterprise Architecture, version 1.0, p. 1, Chief Information Officers Council, February 2001.
19 Division E – Information Technology Management Reform Act (now the Clinger/Cohen Act), s. 1124, as found at http://www.cio.gov/Documents/it_management_reform_act_Feb_1996.html.
20 Ibid, p. 8.
21 From HLS Target Architecture, Version 2.0, October 29, 2004.


• DHS IT systems must align with the HLS EA. EA alignment refers to how well IT investments support the enterprise architecture and the transition to the target architecture. To guide IT project alignment to the HLS EA, appropriate EA-related exit criteria are defined in each DHS SELC stage.

• Project Managers and system architects are responsible for designing DHS systems that take the following into account:
– Adherence of all IT system development efforts to the principles, standards, and processes defined by the DHS Service Component Based Architecture
– Leverage of existing services (reuse of registered services)
– Identification of opportunities to create additional services (new services or enhancements of existing services) for reuse throughout DHS
– Leverage of existing data assets to increase information sharing opportunities

• The EA Alignment Process is conducted:
– As part of the project approval process, to ensure EA alignment of proposed projects
– Periodically, as part of the ARP, to:
  – Validate alignment throughout the life of the investment (e.g., as IT investments are defined, designed, developed, tested, implemented, maintained, and disposed of)
  – Ensure that variances are identified, assessed, and either approved or rectified

• The HLS Technical Reference Model (TRM) provides technical standards for all DHS IT systems. The TRM applies to both the development of new systems and the enhancement of existing systems. Additionally, it applies to both pilots and prototypes. Pilots must be compliant with the TRM; prototypes may use emerging technologies if approved by the DHS EAB.

• The EAB Governance Process Guide describes the EA Decision Request (DR) processes. Specific types of EA DRs include the following:
– Program Alignment DR (confirm alignment of a program to the HLS EA)
– Technology Insertion DR (approve insertion of technology products/standards into the HLS TRM)
– Service Insertion DR
– Data Insertion DR
– Performance Insertion DR

For more information on EA planning, technical insertion, and assessment processes or the DHS TRM, contact either the EA staff in the DHS Office of the CIO’s Office of Applied Technology or the Technology and Architecture staff of the DHS Component organization of interest.

B2.5 Requirements Definition

B2.5.1 Definition

The activities in the requirements definition phase (commonly referred to as requirements engineering by the Software Engineering Institute [SEI]) of the systems engineering life cycle include the gathering, analysis, specification, verification, validation, and management over time of the stakeholder-desired capabilities and characteristics of a product and/or product component to be produced in support of mission needs.

B2.5.2 Benefits

The benefits of effective and efficient requirements engineering include the following:


• Greater likelihood of successful delivery that meets project cost, scope, and schedule (“requirements development is one of three factors critical to successful acquisitions”)22

• Improved risk management due to increased project stability

• More effective project-related communication

• Enhanced enterprise decision making via defined requirements baselines

B2.5.3 Considerations

Many of the concepts and descriptions below are adapted from the Requirements Engineering Section of an SEI article called “A Framework for Software Product Line Practice.”23

• Effective and efficient configuration management must be used to establish baselines from which to manage change; requirements engineering affects all aspects of systems development and maintenance as changes in requirements in one project may affect other related systems and/or components (e.g., changes in requirements for a development project may affect the requirements for maintenance of existing systems and/or components, as well as the requirements of parallel development efforts).

• Requirements engineering is a communications exercise among the many internal and external stakeholders it involves: citizens, executives, systems engineers, operations personnel, field personnel, taxpayers, other “end-users” (including persons with disabilities), contractors, oversight agencies, other projects under development, among others. Therefore, effective communication plans, activities, and products must be established and managed with appropriate levels of detail available for all stakeholders.

• Requirements must be defined with the appropriate level of both behavioral (e.g., the specificity to support functional need, and the generality at appropriate points to deal with change over the lifetime of the component) and quality characteristics (performance, reliability, security).24

• This DHS SELC Guide recognizes a series of related activities and products that must be addressed in requirements engineering efforts for DHS IT projects. These include the following:
– Identification of desired business capabilities from activities conducted as part of Strategic Planning, Enterprise Engineering, and Operations, and from other appropriate components of the enterprise life cycle
– The creation of a Concept of Operations (CONOPS), a Business Impact Assessment (BIA), and subsequent requirements documents as defined in Section 5, Requirements Definition, including the Operational Requirements Document (ORD), Functional Requirements Document (FRD), Requirements Traceability Matrix (RTM), and Systems Requirements Document (SRD) (an illustrative traceability sketch follows this list)
– The appropriate set of activities, products, tools, and techniques (e.g., testing, prototyping, modeling, analyses, documentation and communication of trade-off decisions and results), as recommended by recognized best practice organizations (e.g., SEI, PMI, IEEE) and with evidence of successful execution, to support the effective and efficient development of systems and/or components required for successful project completion in support of the DHS mission
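The Requirements Traceability Matrix referenced in the list above can be pictured as a simple structure linking each requirement to its source, design elements, and verifying tests. The sketch below is illustrative only; the field names and sample entries are assumptions rather than a DHS-mandated RTM format.

# Illustrative requirements traceability sketch (field names and entries are
# assumptions, not a DHS-prescribed RTM layout).
from dataclasses import dataclass, field
from typing import List

@dataclass
class RequirementTrace:
    req_id: str                 # e.g., "ORD-012" (hypothetical identifier)
    statement: str              # the requirement text
    source: str                 # e.g., CONOPS section or stakeholder need
    design_refs: List[str] = field(default_factory=list)   # design elements satisfying it
    test_refs: List[str] = field(default_factory=list)     # test cases verifying it

def untraced(requirements: List[RequirementTrace]) -> List[str]:
    """Return IDs of requirements with no verifying test case yet."""
    return [r.req_id for r in requirements if not r.test_refs]

if __name__ == "__main__":
    rtm = [
        RequirementTrace("ORD-001", "System shall archive records per policy.",
                         "CONOPS 3.2", design_refs=["SDD 4.1"], test_refs=["TC-017"]),
        RequirementTrace("ORD-002", "System shall support Section 508 access.", "CONOPS 3.5"),
    ]
    print("Requirements lacking test coverage:", untraced(rtm))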

22 Effective Acquisition Practices = Successful Government Program Management Offices (GPMO), Lisa M. De Mello, MBA, PMP; Joseph A. Pegnato, DPA; William Sullivan, MS; Scott Webb, MS, PMP; September 2005.
23 Software Engineering Institute, "A Framework for Software Product Line Practice, Version 4.2," c. 2007.
24 Ibid.


B2.6 Prototyping

B2.6.1 Definition

Prototyping is defined as the development and implementation of a model (e.g., physical, electronic, digital, analytical) of a product built to do the following:

• Assess the feasibility of a new or unfamiliar technology

• Assess or mitigate technical risk

• Validate requirements or a product

• Demonstrate critical features

• Verify a product

• Determine enabling product readiness

• Characterize performance or product features

• Discover physical principles25

• Demonstrate accessibility (Section 508)

The terms “prototype” and “pilot” are often used interchangeably; from a DHS SELC perspective, however, a prototype is not the same as a pilot. Prototypes or “proof of concepts” are initiated as a means of demonstrating functionality, technology integration, program management, and/or business process changes. As such, they are typically low-cost, require short development time, are not robust, and do not follow a rigorous development methodology. They are not intended to provide operational support. Prototypes are used to determine if a particular approach is feasible and acceptable. Based on the prototype results, the approach may be expanded into a formal program to deliver an operational system.26

Unlike a prototype, a pilot is intended for operational use and, as such, follows a formal development methodology (SELC) with full life-cycle documentation, including requirements specification, design documents, and test plans. The pilot is required to pass security certification and, therefore, must contain sufficient features for secure operational use. The pilot will be implemented in a much more limited deployment than the full production system. This could mean fewer geographic sites, fewer users, and/or more limited features.27

B2.6.2 Benefits

In many traditional system development methodologies, prototypes were not used, were not available during development, and/or were maintained only long enough to establish technical feasibility. It is now recognized that prototyping can provide a variety of benefits throughout the systems development life cycle, rather than at a single time for a single purpose:

• Prototyping is an effective technique used in product design and evaluation to accomplish the following:
– Discover physical principles of a product
– Assess and/or confirm product feasibility, requirements, performance, and/or features
– Mitigate project and program risks
– Evaluate technical integration feasibility or alternative solutions
– Elicit user feedback and refine requirements

25 EIA 632: Processes for Engineering a System, January 1999.
26 EAB Governance Process Guide, ver. 3.0, September 2006.
27 Ibid.


• Prototyping is considered a best practice in commercial software development, especially as it applies to the design of user interfaces and complex systems.28

B2.6.3 Considerations

• Prototyping provides increased value in the use of evolutionary/Spiral development methodologies, such as Rapid Application Development (RAD), Agile Development, and Extreme Programming.

• All major29 DHS IT development projects should use prototyping to facilitate requirements analysis and technical feasibility.

• All major30 DHS IT development projects should continue to leverage the prototype as part of the initial build or baseline capability, where feasible.

• Privacy compliance requirements do not distinguish between pilot and prototype and instead look purely at the data and data usage; consequently, privacy requirements must be met for both.

B2.7 Information Security

B2.7.1 Definition

The term “information security” means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide:

• Integrity, which means guarding against improper information modification or destruction and includes ensuring information non-repudiation and authenticity

• Confidentiality, which means preserving restrictions on unauthorized access and disclosure, including ways to protect personal privacy and proprietary information

• Availability, which means ensuring timely and reliable access to and use of information31

B2.7.2 Benefits

It is DHS policy that all information systems that generate, store, process, transfer, or communicate sensitive information shall be protected at a level commensurate with the threat. The level of protection will be determined by the criticality and sensitivity of the information and of the mission supported by the information system, and in compliance with national policy and standards. Information security policies, processes, and procedures provide a variety of benefits throughout the systems development life cycle.32

• Establish policies and procedures for the overall DHS Information Security Program and information security in order to ensure incorporation of standards into information security technology efforts.

• Ensure that information assurance requirements are addressed by the mission areas and that security is adequately incorporated into information system functions throughout the systems engineering life cycle.

• Ensure that all DHS operational components can exploit the maximum advantage from information in order to accomplish their missions while keeping residual risk to a minimum.

• Provide security policies and architecture that provide a common, interoperable, cost-effective means of protecting DHS information resources.

28 IEEE, Guide to the Software Engineering Body of Knowledge.
29 Refer to Directive 102-01 for the current investment thresholds for major projects.
30 Ibid.
31 Federal Information Security Management Act (FISMA) of 2002, Public Law 107-347, December 17, 2002.
32 DHS Information Security Program Strategic Plan, April 4, 2004, version 1.


• Ensure that DHS information, systems, major applications, and general support systems are sufficiently secure to accomplish the DHS priority missions.

• Ensure compliance with Federal Information Security Management Act of 2002 (FISMA), National Institute of Standards and Technology, Office of Management and Budget (OMB), National Security Agency, and Central Intelligence Agency guidance as well as all applicable laws, directives, policies, and directed actions on a continuing basis.

B2.7.3 Considerations

Government agencies are required to protect information assets against loss, theft, damage, and unauthorized destruction, modification, and access. To fulfill this responsibility, various risk management activities must be performed throughout the systems development life cycle. These activities include the following:

• Identification of sensitive systems

• Performance of security risk analyses and assessments

• Development of system security plans, contingency plans, and secure operating procedures

• Testing of systems security

• Certification and accreditation of systems for operation in the assigned security mode

B2.7.3.1 Information Security Controls

FISMA requires that information security controls be selected based on risk, as determined by the categorization of a system resulting from a Federal Information Processing Standards (FIPS) 199 review. The availability portion of the FIPS 199 system categorization must be synchronized with Business Continuity planning activities by ensuring that it is consistent with the Maximum Allowable Outage identified in the BIA.
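As an illustration of the categorization logic described above, the sketch below applies the FIPS 199 high water mark convention (overall impact is the highest of the confidentiality, integrity, and availability impacts) and a simple consistency check against a hypothetical BIA Maximum Allowable Outage; the outage thresholds are assumptions for illustration, not DHS or NIST values.

# Illustrative FIPS 199 categorization sketch. The high-water-mark rule follows
# FIPS 199; the MAO-to-availability mapping below is a hypothetical assumption.
LEVELS = {"low": 1, "moderate": 2, "high": 3}

def overall_categorization(confidentiality: str, integrity: str, availability: str) -> str:
    """Return the system's overall FIPS 199 category (high water mark)."""
    return max((confidentiality, integrity, availability), key=lambda lvl: LEVELS[lvl])

def availability_consistent_with_mao(availability: str, mao_hours: float) -> bool:
    """Hypothetical check: shorter allowable outages imply a higher availability impact."""
    required = "high" if mao_hours <= 4 else "moderate" if mao_hours <= 24 else "low"
    return LEVELS[availability] >= LEVELS[required]

if __name__ == "__main__":
    print(overall_categorization("moderate", "low", "high"))           # -> high
    print(availability_consistent_with_mao("moderate", mao_hours=12))  # -> True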

A review of the effectiveness of the controls and control testing must be performed at least annually. The determination of the level of rigor to be used in the review is based upon the characteristics of the system, such as the:

• Acceptable level of risk to the system and its information

• Extent to which system configurations and settings are documented and continuously monitored

• Extent to which patch management occurs

• Comprehensive nature of the past review

• Age of the most recent in-depth testing and evaluation

B2.7.3.2 Certification and Accreditation

DHS requires that all solutions using IT in any form are Certified and Accredited (C&A) in accordance with DHS security policies before they are deployed to the operational environment. The Chief Information Security Officer (CISO) has established a security C&A process to ensure that all DHS IT solution development initiatives comply with DHS security policies and Federal regulations.

Key requirements of the DHS C&A process include the following:

• Identification of an Information Systems Security Officer (ISSO) to serve as the principal point of contact for all IT security aspects of the system.

• Use of the CISO-approved tool for conducting certification and accreditation. The Office of Information Security maintains both a classified and unclassified version of the CISO-approved C&A tool. DHS 4300A applies to all unclassified systems and is used throughout this document as reference. For classified systems, all references should be made to DHS 4300B. Access to the classified or unclassified version of the CISO-approved C&A tool should be made through the Component’s Information System Security Manager (ISSM).

• Use of the CISO-approved tool for conducting annual self assessments and for FISMA reporting.

• Development of a Security Risk Assessment, System Security Plan (SSP), and Plan of Action and Milestones (POA&M).

• Completion of an independent Security Test and Evaluation (ST&E) with the effort documented in the Security Assessment Report (SAR).

• A validated Privacy Threshold Analysis reviewed and approved by the DHS Privacy Office.

• IT system C&A completed every three years except in cases where a DHS IT system or an operational environment has experienced a “major” change. If an IT system or operational environment is significantly changed before the system reaches its three-year milestone, the affected system(s) must be re-certified and re-accredited before being allowed to operate again in the DHS environment.

All DHS IT projects must be in compliance with DHS Systems Security requirements; compliance is verified throughout the systems development life cycle via exit criteria used in SELC stage reviews. Additional information regarding the DHS IT security program is presented in DHS 4300A Sensitive Systems Policy and Handbook, 4300B National Security Systems Policy and Handbook, and Security Architecture Guides Volumes 1-3.

B2.7.3.3 Continuous Monitoring

The purpose of the Continuous Monitoring Phase is to provide oversight and monitoring of the security controls in the IT system on an ongoing basis and to inform the authorizing official or designated representative when changes occur that may impact the security of the system. The Continuous Monitoring Phase monitors the status of the IT system to ensure that residual risk is kept within an acceptable level. During this phase, it is necessary to identify any significant changes to the system configuration or to the operational/threat environment that might affect system security.

B2.8 Privacy Compliance

B2.8.1 Definition

Privacy Compliance is a structured review process to ensure that all privacy compliance requirements are met prior to final project implementation. Section 208 of the E-Government Act of 2002 requires all Federal government agencies to conduct Privacy Impact Assessments (PIA) for all new or substantially changed technology that collects, maintains, or disseminates personally identifiable information. The DHS Chief Privacy Officer (CPO) is required by Section 222 of the Homeland Security Act to ensure that the technology used by the Department sustains and does not erode privacy protections. The PIA is one mechanism through which the CPO fulfills this statutory mandate. The CPO is also required by Section 222 to conduct PIAs for proposed rulemakings of the Department. The CPO approves PIAs conducted by the Department’s offices and programs.

B2.8.2 Benefits

Privacy Compliance conveys the following benefits:

• Compliance with laws and regulations pertaining to the identification of, and protection from unlawful disclosure and use of, personally identifiable information collected, maintained, or disseminated by DHS personnel and/or technology

• Auditable requirements, processes, and training that govern a standard approach to privacy protection of personally identifiable information within DHS


B2.8.3 Considerations

B2.8.3.1 Privacy Compliance Documentation

Privacy Threshold Analysis

A Privacy Threshold Analysis (PTA) is used to determine whether a full PIA is needed. The PTA template is available on the Privacy Office website on DHS Online and www.dhs.gov/privacy. This form must be completed and reviewed by the Privacy Office. If a PIA is required, the Privacy Office will send the project manager (the submitter) a copy of the PIA Guidance and accompanying template to complete and return.
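The threshold question a PTA answers can be pictured with the minimal sketch below; the inputs and decision rule are a simplification for illustration and do not replace the Privacy Office's PTA template or determination.

# Illustrative sketch of the kind of threshold question a PTA captures
# (simplified assumption; the actual determination is made by the DHS Privacy Office).
def pia_likely_required(collects_pii: bool, new_or_substantially_changed: bool) -> bool:
    """Per Section 208 of the E-Government Act, a PIA applies to new or substantially
    changed technology that collects, maintains, or disseminates PII."""
    return collects_pii and new_or_substantially_changed

if __name__ == "__main__":
    print(pia_likely_required(collects_pii=True, new_or_substantially_changed=True))   # True
    print(pia_likely_required(collects_pii=False, new_or_substantially_changed=True))  # False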

Privacy Impact Assessment

A Privacy Impact Assessment (PIA) is an analysis of how personally identifiable information is collected, stored, protected, shared, and managed. The term “personally identifiable information” or “PII” refers to any information that permits the identity of an individual to be directly or indirectly inferred, including any other information that is linked or is linkable to that individual regardless of whether the individual is a U.S. citizen, legal permanent resident, or a visitor to the U.S. The purpose of a PIA is to demonstrate that project sponsors and developers have consciously incorporated privacy protections throughout the entire life cycle of a system. This involves making certain that privacy protections are built into the system from the beginning, when such protections are less costly and more effective. Addressing privacy issues publicly through a PIA builds citizen trust in the operations of the Department.

System of Records Notice

A System of Records Notice (SORN) is a publicly published document that describes a certain type of data collection and how that data is used. A “system of records” is a group of any records under the control of any agency from which information is retrieved by the name of the individual or by some identifying number, symbol, or other identifier assigned to the individual. The Privacy Act requires each agency to publish notice of its systems of records in the Federal Register.

B2.8.3.2 Sources for Privacy Protection

The Privacy Act of 1974

The Privacy Act of 1974 regulates the Federal government’s use of personal information. It places restrictions on the collection, use, maintenance, and release of information about individuals and gives individuals the right, under certain circumstances, to see records about themselves, to obtain copies of their records, to have records corrected or amended, and to have a statement of disagreement filed in their records if the correction or amendment is not approved.

The E-Government Act of 2002

Section 208 mandates that all Federal Executive Branch departments and agencies and their contractors conduct PIAs whenever they develop or procure new IT involving the collection, maintenance, or dissemination of personally identifiable information or make substantial changes to existing technology for managing such information.

The Homeland Security Act of 2002

Section 222 mandates the appointment of a senior-level privacy officer responsible for privacy policy. The duties of that privacy officer include: ensuring that the use of technologies sustains, and does not erode, privacy protections relating to the use, collection, and disclosure of personal information; ensuring that systems of records comply with the Privacy Act of 1974; evaluating legislative and regulatory proposals involving the collection, use, and disclosure of personal information; conducting PIAs on rules proposed by DHS; and preparing an annual report to Congress on privacy at DHS.

For more information on DHS privacy compliance requirements, contact the Privacy Office, U.S. Department of Homeland Security, Washington, DC 20528, available via email at [email protected]. Examples of PIAs and SORNs, as well as additional educational materials related to privacy compliance, are available on the DHS Privacy Office website: www.dhs.gov/privacy.

B2.9 Critical Infrastructure Protection

B2.9.1 Definition

The USA Patriot Act of 2001 defines critical infrastructure as:33

“Systems and assets, whether physical or virtual, so vital to the United States that the incapacity or destruction of such systems and assets would have a debilitating impact on security, national economic security, national public health or safety, or any combination of those matters.”

Homeland Security Presidential Directive (HSPD)-7, issued on December 17, 2003, establishes a national policy for Federal departments and agencies to follow in identifying, prioritizing, and protecting critical infrastructure and key resources throughout the United States. Consistent with these acts and directives, the DHS Critical Infrastructure Protection (CIP) Program is one of the eight programs described in the DHS Information Security Program Strategy. The CIP Program reports to OMB annually on the functions and services identified and prioritized as critical infrastructure, the assets they use, and the programmatic actions and efforts taken by the Component to protect those assets and to ensure continuity during, or recovery after, disruption. The CIP Program follows the guidance of the Federal Executive Agent (DHS Infrastructure Protection) and is responsible for identifying mission-critical functions and services, determining which of them are candidates, and then making the final determination as to whether each function and/or service is or is not national critical infrastructure.

B2.9.2 Benefits

CIP conveys the following benefits:

• Systems and assets vital to the security, economy, public health, and/or safety of the United States are identified and prioritized, allowing the implementation of plans to protect them from incapacity, destruction, and/or to ensure continuity or recovery after disruption.

• A national policy exists to ensure a unified approach to the identification, prioritization, and protection of vital systems and assets.

B2.9.3 Considerations

Additional information on the CIP program can be found in the DHS Sensitive Systems Handbook Security Program Policy. Additional information on the DHS CIP program is available from the organization supporting Continuity Planning for Critical DHS Assets within the DHS Office of the Chief Information Officer (DHS OCIO). For information at the Component organizational level, contact the CIP Officer for the specific Component organization of interest.

B2.10 Human Factors Engineering

B2.10.1 Definition

Human factors engineering (HFE) is concerned with optimizing the interaction of people (often referred to as “end users” or “users”) with the devices, applications, systems, processes, and equipment they must use in order to minimize errors, time spent performing a task, lost productivity, staff turnover, and training costs, while maximizing efficiency, clarity, ease of learning, effectiveness, and productivity.34

33 USA PATRIOT Act of 2001 (42 U.S.C. 5195c(e)), Sec. 1016, Critical Infrastructures Protection, October 18, 2001, from http://fl1.findlaw.com/news.findlaw.com/cnn/docs/terrorism/hr3162.pdf.

B2.10.2 Benefits

Applying the principles of HFE conveys the following benefits:

• The user-centered design process creates a design that is user-evaluated and altered at several points early in the project life cycle, resulting in an end-product or service more likely to meet user needs.

• Cost efficiencies in service/product development can be gained because corrections are made earlier in the development life cycle when they are less costly than those made later in the life cycle.

B2.10.3 Considerations

• User-centered design requires that project developers:
– Know the user(s), including users with disabilities,35 and their requirements
– Involve the users early and often in the design process
– Use an iterative approach so that each iteration is targeted to improve a specific facet of the design, with the total number of iterations determined by usability acceptance criteria and user priority36

• A variety of factors in the operational environment affect the importance of “usability” (or “ease of use”) in the design of a product or service. For example, in products or services affecting human safety or security, usability is probably a greater priority than in other applications.

• While DHS has no formal policies that specifically mandate the use of HFE by name, HFE concepts are present in many activities related to the SELC. The user-centered design characteristics of understanding user needs, involving the user early in design, and using an iterative design process are analogous to activities in Strategic Planning (e.g., identify desired business capabilities), EA (e.g., Business Reference Models, Service Component Based Architecture), and in many facets of Requirements Engineering (e.g., develop a CONOPS, BIA, ORD, use cases, prototyping).

• HFE should be considered in all design activities. Several standards and other resources are available to guide designers in this area:
– ANSI/HFS 100-1988, published by the Human Factors and Ergonomics Society (HFES): American National Standard for Human Factors Engineering of Visual Display Terminal Workstations
– ISO 9241, Parts 1-17:1997, Ergonomic requirements for office work with visual display terminals
– ISO 9241-110:2006, Ergonomics of human-system interaction
– ISO 13407:1999, Human-centered design processes for interactive systems
– ISO/TR 16982:2002, Ergonomics of human-system interaction – Usability methods supporting human-centered design
– ISO/TR 18529:2000, Ergonomics – Ergonomics of human-system interaction – Human-centered lifecycle process descriptions

34 US-VISIT Human Factors Engineering Plan, May 19, 2006, USVISIT-APMO-CONTHSSCHQ04D0096T004-PLN060150-D.
35 Note that Fitness for Duty requirements do not preclude Section 508 compliance.
36 IEC, International Engineering Consortium; http://www.iec.org/online/tutorials/hmi/topic05.html.


B2.11 Section 508 Electronic and Information Technology Accessibility

B2.11.1 Definition

In 1998, Congress amended the Rehabilitation Act to require Federal agencies to make their Electronic and Information Technology (EIT) accessible to people with disabilities. Inaccessible technology interferes with an individual's ability to obtain and use information quickly and easily. Section 508 was enacted to eliminate barriers in IT, to make available opportunities for people with disabilities, and to encourage development of technologies that will help achieve these goals. The law applies to all Federal agencies when they develop, procure, maintain, or use electronic and information technology. Under Section 508 of the Rehabilitation Act, as amended by the Workforce Investment Act of 1998 (P.L. 105-220), agencies must give disabled employees and members of the public access to information that is comparable to the access available to others.37

B2.11.2 Benefits

The following are benefits of complying with Section 508:

• Compliance with Federal law.

• Experience with the new technologies and with the HFE used in applications to support Section 508 compliance may be reusable in other DHS design applications.

• Disabled employees and members of the public will have access to DHS information equal to that of employees and members of the public who are not disabled.

B2.11.3 Considerations

• Each project must engage the DHS Office on Accessible Systems and Technology (OAST) or the Component Section 508 Coordinator to ensure that it is compliant with all DHS accessibility requirements. Accessibility reviews may require the project to prepare a plan of remediation and additional project requirements.

• All DHS IT system development efforts must address Section 508 requirements throughout the life cycle. Each project must have a Section 508 EIT Accessibility Plan to define its approach to meeting Section 508 requirements.

• DHS considers accessibility to EIT a priority for all employees and external customers, including those with disabilities. MD 4010 established OAST within the OCIO and indirectly within the Office of Civil Rights and Civil Liberties as well as policy regarding EIT accessibility for DHS employees and customers with disabilities.38

• Section 508 requirements apply to all DHS Components and to all EIT developed, procured, maintained, or used by DHS Components. Additional guidance for accessibility is provided by numerous Public Laws, regulations, OMB circulars, and directives:
– The Homeland Security Act of 2002, P.L. 107-296 (November 25, 2002), Section 102
– Section 508 of the Rehabilitation Act of 1973, as amended by the Workforce Investment Act of 1998 (65 FR 80500, December 21, 2000; 66 FR 20894, April 25, 2001)
– Electronic and Information Technology (EIT) Accessibility Standards (36 CFR Part 1194)
– Federal Acquisition Regulation (FAR), Subpart 39.2 – Electronic and Information Technology
– OMB Circular A-130, Management of Federal Information Resources (61 FR 6428, February 20, 1996)

37 Federal Section 508 website, http://www.Section508.gov.
38 Department of Homeland Security MD 4010, Section 508 Program Management Office & Electronic and Information Technology Accessibility


– Clinger-Cohen Act of 1996 (40 U.S.C. 1401(3))
– Section 202(d) of the E-Government Act of 2002, Accessibility to Persons with Disabilities
– DHS MD 4010.2, Electronic Information Technology Accessibility

For further information regarding Section 508, email [email protected], call OAST at 202-447-0476 or TTY 202-447-5857, or visit the OAST site at: https://dhsonline.dhs.gov/portal/jhtml/community.jhtml?index=130&community=MGMT&id=2025980011.

B2.12 Risk Management39

B2.12.1 Definition

Risk management is a continuous process, applied throughout the SELC, that provides a repeatable method for balancing cost, schedule, and performance goals within program funding. The process is made up of:

• Risk Identification

• Risk Analysis

• Risk Mitigation Planning

• Risk Mitigation Plan Implementation

• Risk Tracking

B2.12.2 Benefits

A properly structured and administered risk management practice will generally help to ensure a successful acquisition program with the following benefits:

• The ability to anticipate risks before they become issues because of a planned risk management process integral to the acquisition process, especially to the technical planning processes

• Continuous, event-driven technical reviews to help define risk areas to meet users’ needs within acceptable risk

• An independent risk perspective because the risk management practice is independent of the Project Manager

• A defined set of success criteria for performance, schedule, and cost elements

B2.12.3 Considerations

Risks have three components:

• A potential future event that, if it occurs, will create a programmatic issue

• A probability assessed in the present of that future event occurring

• The programmatic impact of that future event

Risk can be quantified as the product of the probability of the event occurring and its programmatic impact (a notional calculation is sketched below). An issue is an event that has already occurred and has had an impact on the program that requires management intervention. A primary objective of risk management is to develop risk mitigation strategies that will either prevent an event from occurring or reduce its impact. The practice of risk management draws from many management disciplines, including but not limited to program management, systems engineering, earned value management, production planning, quality assurance, logistics, system safety and mishap prevention, and requirements definition. It establishes a process for achieving program objectives for cost, schedule, and performance; that is, risk management is a primary practice for keeping a program from failing.
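The quantification described above can be illustrated with a short sketch. This is a notional example only, not a DHS-prescribed tool or format; the Risk class, the probability and impact scales, and the sample entries are assumptions made for illustration.

```python
# Illustrative sketch only (not DHS-prescribed): scoring risk exposure as
# probability x impact, as described above. Names and scales are assumptions.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    probability: float  # assessed likelihood of the future event, 0.0-1.0
    impact: int         # programmatic impact if it occurs, e.g., 1 (low) to 5 (high)

    @property
    def exposure(self) -> float:
        # Quantified risk = probability of occurrence times programmatic impact
        return self.probability * self.impact

risks = [
    Risk("Key COTS component slips delivery", probability=0.4, impact=4),
    Risk("Test facility unavailable for ST&E window", probability=0.2, impact=3),
]

# Rank risks so mitigation planning focuses on the highest exposures first
for r in sorted(risks, key=lambda r: r.exposure, reverse=True):
    print(f"{r.description}: exposure = {r.exposure:.1f}")
```

Ranking by exposure in this way is one simple means of focusing mitigation planning on the risks most likely to become issues.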

39 Price, Gordon, Software Testing Terminology, from http://www.stsc.hill.af.mil/crosstalk/1994/07/xt94d07l.asp. Original citation credited to IEEE 729/610.12, Glossary of Software Engineering Terminology, New York, 1990.



B2.13 Independent Verification & Validation

B2.13.1 Definition

Independent Verification and Validation (IV&V) is a specific certification practice. Commonly, IV&V is performed by an individual or organization that is technically, managerially, and financially independent of the development organization.40

B2.13.2 Benefits

• The independent nature of IV&V provides management with insight into the project without conflict of interest.

• IV&V provides the opportunity for more effective decision making affecting project direction.

B2.13.3 Considerations

The objectives of IV&V are to:41

• Facilitate early detection and correction of software anomalies

• Enhance management insight into process and product risk

• Support the life cycle processes to ensure conformance to program performance, schedule, and budget

• Provide an early assessment of software and system performance

• Provide objective evidence of software and system conformance to support a formal certification process

• Improve the software development and maintenance processes

• Support process improvement for an integrated systems analysis model

IV&V teams analyze and review software development outputs to help ensure that software requirements do not conflict with any standards or requirements applicable to other system components. Typically, IV&V teams do not conduct tests against the system (e.g., unit, integration, user acceptance); instead, they often witness these tests to gauge the effectiveness of the testing process in accurately and completely testing requirements and in identifying defects. While DHS does not mandate IV&V at this time, IV&V is considered a best practice throughout government and industry and should be employed for all major development projects.

Verification and Validation entail:

• Verifying requirements: Proving that each requirement has been satisfied. Verification can be done by logical argument, inspection, modeling, simulation, analysis, expert review, test, or demonstration (a notional traceability sketch follows this list).

• Validating requirements: Ensuring that (1) the set of requirements is correct, complete, and consistent; (2) a model can be created that satisfies the requirements; and (3) a real-world solution can be built and tested to prove that it satisfies the requirements.

40 Price, Gordon, Software Testing Terminology, from http://www.stsc.hill.af.mil/crosstalk/1994/07/xt94d07l.asp. Original citation credited to IEEE 729/610.12, Glossary of Software Engineering Terminology, New York, 1990.
41 IEEE, Draft Standard for Software Verification and Validation, IEEE P1012/D12, September 12, 2004, p. 3.


• Verifying a system: Building the system right: ensuring that the system complies with the system requirements and conforms to its design.

• Validating a system: Building the right system: making sure that the system does what it is supposed to do in its intended environment. Validation determines the correctness and completeness of the end product, and ensures that the system will satisfy the actual needs of the stakeholders.
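As a notional illustration of tracking requirement verification status (not a DHS artifact or template), the sketch below checks a simplified requirements traceability structure for requirements that lack an assigned verification method or completed verification evidence. The field names and sample data are assumptions made for illustration.

```python
# Notional sketch (not a DHS artifact): a simplified requirements traceability
# check of the kind an IV&V team or RTM review might perform. All names,
# methods, and sample data are assumptions for illustration only.
requirements = {
    "REQ-001": {"verification_method": "test",       "verified": True},
    "REQ-002": {"verification_method": "inspection", "verified": False},
    "REQ-003": {"verification_method": None,         "verified": False},
}

def traceability_gaps(reqs):
    """Return requirements that lack a verification method or evidence."""
    gaps = []
    for req_id, entry in reqs.items():
        if entry["verification_method"] is None:
            gaps.append((req_id, "no verification method assigned"))
        elif not entry["verified"]:
            gaps.append((req_id, "verification not yet completed"))
    return gaps

for req_id, reason in traceability_gaps(requirements):
    print(f"{req_id}: {reason}")
```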

The requirement for IV&V may be placed on a high risk program by Congress as a condition of release of funds. When this occurs, the CIO is typically required to certify that the program has an IV&V contract in place. The DHS CIO provides this certification upon review of supporting documentation from the program. Certification is based on the following criteria:

• An IV&V contract is in place that includes the subject program within the scope.

• The contractor’s IV&V approach meets IEEE 1012 standards.

• The contractor’s IV&V approach ensures that items are completed and are of sufficient quality and that all outcomes will meet the needs of the user.

• The contractor’s IV&V technical approach is developed for all the necessary IV&V activities.

• The contractor’s IV&V technical approach identifies a strategy or method for determining which activities will need to be conducted, how those activities will be performed, and when those activities will be conducted.

B2.14 Electronic Records Management

B2.14.1 Definition

National Archives and Records Administration (NARA) regulations affecting Federal agencies and their records management programs are found in Subchapter B of 36 Code of Federal Regulations Chapter XII. Part 1234 establishes the basic requirements related to the creation, maintenance, use, and disposition of electronic records. Electronic records include numeric, graphic, and text information, which may be recorded on any medium capable of being read by a computer and which satisfies the definition of a record. This includes, but is not limited to, magnetic media, such as tapes and disks, and optical disks. Unless otherwise noted, these requirements apply to all electronic information systems, whether on microcomputers, minicomputers, or mainframe computers, regardless of storage media, in network or stand-alone configurations. This part also covers the creation, maintenance and use, and disposition of Federal records created by individuals using electronic mail applications.42

NARA regulations applicable to electronic records management must be considered by Project Managers throughout all phases of the SELC. Additional resources and guidance are available at http://www.archives.gov/records-mgmt/.

B2.14.2 Benefits

According to information from the NARA Records Management website, electronic records management:

• Contributes to the smooth operation of an agency’s acquisitions by making the information needed for decision making and operations readily available

• Helps deliver services in a consistent and equitable manner

• Facilitates effective performance of activities throughout an agency

42 http://www.archives.gov/records-mgmt/policy/guidance-regulations.html


• Protects the rights of the agency, its employees, and its customers

• Provides continuity in the event of a disaster

• Protects records from inappropriate and unauthorized access

• Meets statutory and regulatory requirements, including archival, audit, and oversight activities

• Provides protection and support in litigation

B2.14.3 Considerations

The Records Management (RM) Profile43 recommends that agencies embed records management requirements in the earliest stages of the SELC. For more information, contact the DHS records management point of contact at NARA at http://www.archives.gov/recordsmgmt/appraisal/work-group-all.html. Table B2-3 provides a checklist that Project Managers can use to integrate records management into the SELC stages.

Table B2-3: Integration Checklist – Records Management Integration into the SELC

Planning
1. Is the agency Records Officer included from the beginning in the system design process?
2. Are records that support the business process identified?
3. a. Do current record schedules apply to the new system?
   b. Is a new record schedule required because of changes in the records?
4. Is the agency Records Officer's signature on the agency Investment Summary Proposal?

Requirements Definition
5. Are all records-related requirements identified and incorporated into the final CONOPS and Operational Requirements Document?
6. Are new records schedules being drafted, if needed?
7. Is the agency Records Officer's signature on the requirements document?

Design
8. Are all records management requirements incorporated into the system design document?
9. Is the agency Records Officer's signature on the system design document?
10. Is the agency records management staff included in project status meetings as needed?

Development
11. Is the agency records management staff included in project status meetings as needed?

Integration and Test
12. Are proposed records schedules submitted to NARA?
13. Are records management requirements incorporated into the system?
14. Is the agency Records Officer's signature on the Systems Test Report?

Implementation
15. Is the agency records management staff included in project status meetings as needed?
16. Is the agency Records Officer's signature on the document approving deployment of the system?

Operations and Maintenance
17. Is the Mid-Cycle Review complete? (The review to occur three years after going to production to validate records management requirements and records schedules.)
18. Are disposition authorities being implemented in accordance with appropriate dispositions?
19. Is the Mid-Cycle Review report sent to the agency records management staff for review?
20. Is the agency Records Officer's signature on the Mid-Cycle Review certification document?

System Disposition
21. At the time of retirement or rollover of the system, are records preserved, retained, and made fully accessible for the full retention periods in accordance with appropriate dispositions?
22. At the time of retirement or rollover of the system, are temporary records destroyed in accordance with appropriate dispositions?
23. At the time of retirement or rollover of the system, are permanent records transferred to NARA in accordance with the appropriate dispositions?

43 Refer to Federal Enterprise Records Management Profile, Sections 4.1.1 through 4.1.6; http://www.archives.gov/records-mgmt/policy/rm-profile.html.
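To make the disposition items in the checklist above concrete, the following notional sketch computes whether a record's retention period has elapsed. It is illustrative only: actual retention periods, cutoffs, and dispositions come from NARA-approved records schedules, and the field names and sample values below are assumptions.

```python
# Notional sketch only (not NARA or DHS guidance): checking whether a record
# has met its retention period under an approved schedule. The schedule fields,
# retention rule, and sample data are assumptions for illustration.
from dataclasses import dataclass
from datetime import date

@dataclass
class ScheduledRecord:
    identifier: str
    cutoff_date: date      # date the record was cut off (e.g., end of fiscal year)
    retention_years: int   # retention period from the approved records schedule
    permanent: bool        # permanent records transfer to NARA; temporary records are destroyed

    def disposition_due(self, as_of: date) -> bool:
        """True when the retention period has elapsed as of the given date."""
        due = self.cutoff_date.replace(year=self.cutoff_date.year + self.retention_years)
        return as_of >= due

rec = ScheduledRecord("CASE-2001-0042", date(2001, 9, 30), retention_years=7, permanent=False)
if rec.disposition_due(date.today()):
    action = "transfer to NARA" if rec.permanent else "destroy per schedule"
    print(f"{rec.identifier}: retention elapsed; {action}")
```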

B2.15 National Information Exchange Model (NIEM)

B2.15.1 Stage 1 - NIEM Scenario Planning

NIEM provides the development and life cycle processes to create and document information exchanges. An IEPD provides users with the documentation for an information exchange, including business rules and use cases, in addition to the schemas that describe the elements. The process used to create an IEPD and maintain it through its life cycle ensures a consistent approach to sharing information among systems across DHS and allows for increased efficiency in deploying new services. Once completed, the IEPD products are stored in the DHS Information Exchange Repository to facilitate reuse by other Component agencies. When applicable, they are linked to services that are available for providing access to parties external to DHS. Detailed information on the IEPD development process and products can be found in the NIEM Concept of Operations (http://www.niem.gov//files/NIEM_Concept_of_Operations.pdf). Detailed technical documentation and assistance are available from the DHS Enterprise Data Management Office.

When Should a Program or Project Use NIEM? If inter-component or inter-agency information exchanges (transfers of information between two parties) will be developed, they must be developed in accordance with the NIEM IEPD process. SELC Planning activities should address NIEM scenario planning. Scenario planning is a business analysis approach to identifying information exchanges and is the first phase of the IEPD development methodology.

B2.15.2 Stage 2 - NIEM Requirements Analysis

In the case where inter-component or inter-agency information exchanges are being developed, two NIEM products are required:

• NIEM Exchange Model – To be documented in the NIEM IEPD main document.

• NIEM Exchange Business Requirements - To be documented in the NIEM IEPD main document.

B2.15.3 Stage 3 - NIEM Map and Model

If NIEM IEPDs are being developed as a result of inter-component or inter-agency information exchanges, the following products should be completed as part of the NIEM Map and Model Phase:

• Domain Model


• Component Mapping Template

B2.15.4 Stage 4 - NIEM IEPD Build and Validate / Assemble and Document

If a NIEM IEPD is being developed in support of an inter-component or inter-agency information exchange, the NIEM IEPD lifecycle activities and products prescribed in the Build and Validate and Assemble and Document phases should be performed in this SELC phase. Those products include:

• Want list

• XML Schemas – exchange, subset, and extension schemas

• IEPD Metadata

• Main IEPD document including packaged IEPD products as specified in previous phases

Please contact the DHS EDMO for the current list of DHS required NIEM IEPD products and an IEPD main document outline.

B2.15.5 Stage 5 - NIEM IEPD Publish and Implement

Upon completion of the Development SELC phase and the Assemble and Document NIEM IEPD lifecycle phase, the NIEM IEPD should be packaged and uploaded to the DHS Information Exchange Repository. Details on how to package the IEPD and how to submit the IEPD to the DHS Information Exchange Repository can be obtained from the DHS Enterprise Data Management Office (EDMO).
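The packaging step can be pictured with the following sketch. It is illustrative only: the authoritative list of required IEPD products, file naming, and the submission procedure come from the DHS EDMO, and the artifact names below are assumptions made for illustration.

```python
# Notional packaging sketch only. The authoritative list of required IEPD
# products and the submission procedure come from the DHS EDMO; the file names
# and required-artifact list below are assumptions for illustration.
import zipfile
from pathlib import Path

REQUIRED_ARTIFACTS = [
    "main-document.doc",     # main IEPD document
    "metadata.xml",          # IEPD metadata
    "want-list.xml",         # want list
    "schemas/exchange.xsd",  # exchange schema
    "schemas/subset.xsd",    # subset schema
]

def package_iepd(iepd_dir: str, archive_name: str = "iepd-package.zip") -> Path:
    """Verify that the expected artifacts exist, then bundle them for upload."""
    root = Path(iepd_dir)
    missing = [a for a in REQUIRED_ARTIFACTS if not (root / a).is_file()]
    if missing:
        raise FileNotFoundError(f"IEPD package incomplete; missing: {missing}")
    archive = root / archive_name
    with zipfile.ZipFile(archive, "w") as zf:
        for artifact in REQUIRED_ARTIFACTS:
            zf.write(root / artifact, arcname=artifact)
    return archive

# Example: package_iepd("./my-iepd") would produce ./my-iepd/iepd-package.zip
```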

B2.16 SELC and the DHS Service Oriented Architecture (IT Only)

PMs and developers also need to consider the shift in the DHS architectural strategy from one based on traditional development of IT systems to one based on an SOA, also referred to as Service Component Based Architecture (SCBA). The impact of this change is that, as DHS transitions to an SOA, greater emphasis will be placed on service creation, reuse, and modification of established services for reuse, and IT acquisitions will increasingly be analyzed for compliance with SOA principles, requirements, and technology during project authorization activities.

B2.16.1 SOA Definition

The HLS-EA Target Architecture, Version 2.0, describes the DHS approach to SOA and defines the new architectural style for developing, maintaining, and enhancing IT systems:

The Homeland Security Application and Component Architecture is based on commercial best practices and a new paradigm promoted by the Office of Management and Budget (OMB) and the U.S. Chief Information Officer (CIO) Council: Service/Component-Based Architecture. The benefit of Service/Component-Based Architecture is that it defines the functionality of the applications by the services or capabilities provided to the users. These services are made available through reusable software components generating significant cost savings and facilitating information sharing and commonality. Constructing applications is a matter of assembling components into groupings that provide the capability required to satisfy a business need.

The Service/Component Based paradigm has many advantages for the Department. Among these advantages are reduced user training, easily modified applications (plug and play), reuse of components in multiple applications, and better interoperability of applications. The capabilities needed to detect and respond to new mechanisms must be highly adaptable. The Services/Component Based paradigm provides DHS with the needed adaptability.44

44 HLS-EA Target Architecture, Version 2.0, October 29, 2004.


In the new architectural environment, DHS systems will consist of a set of loosely coupled services that will be used internally and exposed externally to other DHS applications through a service registry mediated by the DHS Enterprise Service Bus (ESB). The requirement for communication through the ESB does not necessarily extend to communication between internal components of an application (e.g., direct API or SQL calls). DHS SELC Guide requirements reflect DHS architectural standards as well as those required by other enterprise life cycle processes. In its role of supporting the evolution to SOA, the DHS SELC Guide focuses more specifically on service creation and reuse. An artifact called the Service Reuse Plan is required as an entry criterion to the DHS SELC; this plan documents which services a project will create, modify, or reuse. For more information on the DHS SOA vision and plan, see the DHS Service Oriented Architecture Framework, which defines how DHS plans to transition to a service-oriented architecture.
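The loose-coupling idea can be pictured with a minimal sketch. The registry, service name, and interface below are hypothetical stand-ins for the registry/ESB mediation described above; they do not represent the DHS ESB or any actual DHS service.

```python
# Minimal sketch of loose coupling: consumers resolve a capability through a
# registry instead of calling another system's internals directly. The registry,
# service names, and interfaces are hypothetical illustrations only.
from typing import Callable, Dict

class ServiceRegistry:
    """A toy in-process stand-in for an enterprise service registry."""
    def __init__(self) -> None:
        self._services: Dict[str, Callable] = {}

    def register(self, name: str, provider: Callable) -> None:
        self._services[name] = provider

    def lookup(self, name: str) -> Callable:
        return self._services[name]

registry = ServiceRegistry()

# A reusable service published once by its providing project...
registry.register("person-screening", lambda person_id: {"person_id": person_id, "status": "clear"})

# ...and reused by any consuming application through the registry, rather than
# through direct API or database calls into the providing system.
screen = registry.lookup("person-screening")
print(screen("A-12345"))
```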

B2.16.2 Multiple Track Development for Services

The SELC Guide recommends the use of a multiple track approach for acquisitions that will build or enhance SOA services. The first track is focused on the main system development effort, which includes the following tasks:

• Designing and developing the core system capabilities

• Analyzing and selecting existing services to reuse on the project

The other tracks are focused on:

• Defining, developing, and deploying new or enhanced services into the DHS SOA with the primary purpose of supporting the needs of the sponsoring project

• Defining, developing, and deploying the new or enhanced services with a secondary purpose of supporting the needs of other/future systems

Acquisitions are encouraged to reuse existing services. However, if an existing service must be enhanced, it is the requesting project's responsibility to fund the enhancement. Note that an acquisition may consist of one, multiple, or no service development efforts; the number of tracks depends on the number of services the project needs to create or enhance. The intent is that the primary project will break out the smaller service development efforts into separate (sub-)projects that can be independently managed. All IT acquisitions are expected to leverage and build toward the SOA, which means placing more emphasis on service reuse. When structuring the Program into a set of projects, as described in Directive 102-01, SOA (sub-)projects may be treated as separately managed projects for purposes of the APB and the ARP, depending on the size, cost, schedule, distinctness, and interdependencies of the services to be reused, enhanced, or newly built.


Attachment B3: Summary of Exit Criteria

The following table lists all the exit criteria for all SELC stages.

-- Under Development --


Attachment B4: SELC Document Matrix

Table B4-1 lists each Governing Authority (the DHS organization or process that mandates development of the artifact), the associated DHS organization to contact for more information, and a DHS website where more information can be found.

Table B4-1: Governing Authorities, DHS Organizations, and Websites

Governing Authority | DHS Organization to Contact | Website
ARP (Directive 102-01) | Chief Procurement Office's Acquisition Program Management Division |
DHS EA Process | CIO's Office of Applied Technology (OAT) | https://interactive.dhs.gov/suite/personalization/grouppage.do?groupid=847
DHS OAST | CIO's Office on Accessible Systems and Technology (OAST) | https://dhsonline.dhs.gov/portal/jhtml/community.jhtml?community=MGMT&index=130&id=2025980011
DHS CISO | CIO's Chief Information Security Officer (CISO) | https://dhsonline.dhs.gov/portal/jhtml/community.jhtml?index=10&community=MGMT&id=2002980003
Privacy Office | DHS Privacy Office | http://www.dhs.gov/privacy
DHS CPIC | CIO's Enterprise Business Management Office (EBMO) | https://dhsonline.dhs.gov/portal/jhtml/community.jhtml?community=MGMT&index=0&id=1
DHS SELC | CIO's EBMO | https://dhsonline.dhs.gov/portal/jhtml/community.jhtml?community=MGMT&index=141&id=2036980003

Table B4-2 depicts the SELC products and their update schedule. Each artifact is marked in the appropriate SELC stage as Create (C), Update (U), or Finalize (F). This indicates the status of each artifact at each SELC stage and how it should mature throughout the life cycle.


Table B4-2: Full Document Matrix

The matrix columns are the ARP phases (Need; Analyze/Select; Obtain; Produce; Deploy/Support) and, beneath them, the SELC stages (Solution Engineering; Planning; Requirements Definition; Design; Development; Integration & Test; Implementation; Operations & Maintenance; Disposition). For each document, the Create (C), Update (U), and Finalize (F) marks are listed in the order they appear across the SELC stages.

No. | Document | Governing Authority | Status Marks
1 | Mission Need Statement | ARP (Dir 102-01) | C, F
2 | Capability Development Plan (CDP) | ARP (Dir 102-01) | C/F
3 | Acquisition Plan | ARP (Dir 102-01) | C, U, U, U, U
4 | CONOPS | ARP (Dir 102-01) | C/F
5 | Analysis of Alternatives | ARP (Dir 102-01) | C/F
6 | Lifecycle Cost Estimate (LCCE) | ARP (Dir 102-01) | C, U, U, U
7 | Operational Requirements Document (ORD) | | C/F
8 | Integrated Logistics Support Plan (ILSP) | | C, U
9 | Acquisition Program Baseline | ARP (Dir 102-01) | C, U, U
10 | Project Tailoring Plan | DHS SELC | C/F
11 | Service Reuse Plan | DHS SELC | C, U, U, U, U, U, U, F, F
12 | HLS EA Program Alignment Decision Request | DHS EA Process | C, U, U, U, F, F
13 | Program Alignment Documentation Matrix | | C, U, U, U, F, F
14 | Map to Business Architecture | DHS EA Process | C, U, U, U, F, F
15 | Map to OCIO Portfolios | DHS EA Process | C, U, U, U, F, F
16 | Section 508 National Security Exception Determination | DHS OAST | C/F
17 | Critical Infrastructure Protection Report | DHS CISO | C, F
18 | FIPS 199 Security Categorization | DHS CISO | C, U
19 | Preliminary Security Risk Assessment | DHS CISO | C/F
20 | Solution Engineering Review Approval Letter | DHS SELC | C/F
21 | Project Management Plan (Includes Integrated Master Schedule) (PMP) | ARP (Dir 102-01) | C, U, U, U, U, U, F, F
22 | Configuration Management Plan | DHS SELC | C, F
23 | Privacy Threshold Analysis (PTA) | Privacy Office | C/F
24 | Section 508 EIT Accessibility Plan | DHS OAST | C, U, U, U, U, U, F, F
25 | Risk Management Plan | ARP (Dir 102-01) | C/F
26 | Quality Assurance Plan | DHS SELC | C/F
27 | Data Management Plan | DHS SELC | C, F
28 | Training Plan | DHS SELC | C/F
29 | Project Planning Review Approval Letter | DHS SELC | C/F
30 | Functional Requirements Document (FRD) | DHS SELC | C, U, U, U, U, F, F
31 | Requirements Traceability Matrix (RTM) | DHS SELC | C, U, U, U, U, F, F
32 | Test and Evaluation Master Plan (TEMP) | ARP (Dir 102-01) | C, U, F
 | Developmental Test Plan (DTP) | ARP (Dir 102-01) | C, U, F
33 | Security Requirements Traceability Matrix (SRTM) | DHS CISO | C, U, F
34 | Plan of Action & Milestone (POA&M) | DHS CISO | C, U, U, F
35 | System Security Plan (SSP) | DHS CISO | C, U, U, U, F
36 | Contingency Plan | DHS CISO | U, F
37 | Disaster Recovery Plan | DHS SELC | C, U, F
38 | Security Risk Assessment (SRA) | DHS CISO | C, U, F
39 | Security Test & Evaluation (ST&E) Plan | DHS CISO | C, F
40 | Map to the Data Architecture | DHS EA Process | C, U, U, F, F
41 | Map to the Business Architecture | DHS EA Process | C, U, U, F, F
42 | Map to Technology Standards & Products | | C, U, U, F, F
43 | System Definition Review Approval Letter | DHS SELC | C/F
44 | Service Level Agreements | ARP (Dir 102-01) | C, U, F, F
45 | System Requirements Document | DHS SELC | C, U, U, U, F, F
46 | Interconnection Security Agreement (ISA) | DHS CISO | C/F
47 | Logical Design Document | DHS SELC | C/F
48 | Data Architecture Document | DHS SELC | C/F
49 | System Design Document | DHS SELC | C, U, U, F
50 | Technology Insertion Decision Request | | C/F
51 | Site Prep Plan | DHS SELC | C/F
52 | Deployment Plan | DHS SELC | C/F
53 | Critical Design Review Approval Letter | DHS SELC | C/F
54 | Training Materials | DHS SELC | C/F
55 | Test Case Specification | DHS SELC | C/F
56 | System Acceptance Test Procedures | DHS SELC | C/F
57 | Test Readiness Review Approval Letter | DHS SELC | C/F
58 | Operators Manuals | DHS SELC | C, F
59 | Maintenance Manuals | DHS SELC | C, F
60 | User Manuals | DHS SELC | C, F
61 | System Test Report | DHS SELC | C/F
62 | Acceptance Test Report | DHS SELC | C/F
63 | Section 508 Assistive Technology Interoperability Test Report | | C/F
64 | Service Insertion Package (SIP) | DHS EA Process | C, U, F, F
65 | Security Assessment Report (SAR) | DHS CISO | C/F
66 | Security Accreditation package | DHS CISO | C/F
67 | Privacy Impact Assessment (PIA) | Privacy Office | C/F
68 | DHS Periodic Reporting | CPIC | C, U, U, U
69 | Production Readiness Review Approval Letter | DHS SELC | C/F
70 | Pilot Results Report | IRP (MD 1400) | C/F
71 | Follow on Test Results | IRP (MD 1400) | C/F
72 | Version Description Document | DHS SELC | C/F
73 | Transition To Support Document | DHS SELC | C/F
74 | Critical Infrastructure Protection Report | CIP | C/F
75 | System of Record Notice (SORN) | DHS Privacy | C/F
76 | Authority To Operate (ATO) Letter | DHS CISO | C/F
77 | Operational Readiness Review Approval Letter | DHS SELC | C/F
78 | Performance Reports | IRP (MD 1400) | U, U
79 | Post Implementation Review (PIR) Results | | U, U
80 | Disposal Plan | DHS SELC | C/F, C/F
81 | DHS Periodic Reporting | CPIC | U, U
82 | Operational Analyses | CPIC | U, U
83 | Lessons Learned | CPIC | U, U
84 | FISMA metrics reports | DHS CISO | U, U
85 | Security Incident reports | DHS CISO | U, U
86 | C&A Updates (every 3 years or when major change is made) | | U, U
87 | Privacy Documentation (updated for systems decommissioned) | | U, U
88 | Disposition Approval Request | DHS SELC | C/F
89 | Disposition Plan | DHS SELC | C/F
90 | Archived Data | DHS SELC | C/F
91 | Archived System | DHS SELC | C/F


Attachment B5 - Acronyms

ANSI – American National Standards Institute
APB – Acquisition Program Baseline
API – Application Programming Interface
ATO – Authority to Operate
BCP – Business Continuity Planning
BIA – Business Impact Assessment
BRM – Business Reference Model
C – Create
C&A – Certification and Accreditation
CCB – Change Control Board
CDD – Capability Development and Demonstration
CDM – Conceptual Data Model
CDR – Critical Design Review
CFO – Chief Financial Officer
CFR – Code of Federal Regulations
CI – Configuration Item
CIO – Chief Information Officer
CIP – Critical Infrastructure Protection
CISO – Chief Information Security Officer
CM – Configuration Management
CMMI – Capability Maturity Model Integrated
CONOPS – Concept of Operations
COOP – Continuity of Operations Plan
COTS – Commercial Off-the-Shelf
CPIC – Capital Planning and Investment Control
CPO – Chief Privacy Officer
CTD – Concept and Technology Development
DAA – Designated Accrediting Authority
DHS – Department of Homeland Security
DR – Decision Request
DRM – Data Reference Model
DT – Developmental Test
DTP – Developmental Test Plan
EA – Enterprise Architecture
EAB – Enterprise Architecture Board
EBMO – Enterprise Business Management Office
EIA – Electronic Industries Alliance
EIT – Electronic and Information Technology
EOA – Early Operational Assessment
ESB – Enterprise Service Bus
EVM – Earned Value Management
F – Finalize
FAR – Federal Acquisition Regulation
FBRM – Federal Business Reference Model
FEA – Federal Enterprise Architecture
FIPS – Federal Information Processing Standards
FRD – Functional Requirements Document
FYHSP – Future Years Homeland Security Program
GSAM – Guidelines for Successful Acquisition and Management
HFE – Human Factors Engineering
HFES – Human Factors and Ergonomics Society
HFS – Human Factors Society


HLS – Homeland Security
HR – Human Resources
HSPD – Homeland Security Presidential Directive
IDE – Integrated Development Environment
IEC – International Engineering Consortium
IEEE – Institute of Electrical and Electronics Engineers
IEPD – Information Exchange Package Description
IPG – Integrated Programming Guidance
IPT – Integrated Project Team
IRB – Investment Review Board
IRP – Investment Review Process
ISA – Interconnection Security Agreement
ISO – International Organization for Standardization
ISO/TR – International Organization for Standardization/Technical Report
ISSM – Information Systems Security Manager
ISSO – Information System Security Officer
IT – Information Technology
IV&V – Independent Verification and Validation
JRC – Joint Requirements Council
LCCE – Life Cycle Cost Estimate
LDM – Logical Data Model
MBA – Master of Business Administration
MD – Management Directive
MDP – Milestone Decision Point
MNS – Mission Needs Statement
MOA – Memorandum of Agreement
MOU – Memorandum of Understanding
N/A – Not Applicable
NARA – National Archives and Records Administration
NCSD – National Cyber Security Division
NIEM – National Information Exchange Model
NIST – National Institute of Standards and Technology
NPG – NASA Procedures and Guidelines
O&M – Operations and Maintenance
O&S – Operations and Support
OAST – Office on Accessible Systems and Technology
OAT – Office of Applied Technology
OCIO – Office of the Chief Information Officer
OCM – Organizational Change Management
OGC – Office of General Counsel
OMB – Office of Management and Budget
OT – Operational Test
ORD – Operational Requirements Document
ORR – Operational Readiness Review
P&D – Production and Development
P.L. – Public Law
PA&E – Program Analysis and Evaluation
PADRT – Program Assessment and Design Review Team
PAR – Project Authorization Review
PART – Program Assessment Rating Tool
PDM – Physical Data Model
PDR – Preliminary Design Review
PfM – IT Portfolio Manager
PIA – Privacy Impact Assessment
PII – Personally Identifiable Information
PIR – Post Implementation Review


PM – Project Manager
PMBOK – Project Management Body of Knowledge
PMI – Project Management Institute
PMO – Project Management Office
PMP – Project Management Plan; Project Management Professional
POA&M – Plan of Action and Milestones
PPBE – Planning, Programming, Budgeting and Execution
PPR – Project Planning Review
PRR – Production Readiness Review
PTA – Privacy Threshold Assessment
QA – Quality Assurance
RA – Risk Assessment
RAD – Rapid Application Development
RAP – Resource Allocation Plan
RTM – Requirements Traceability Matrix
SAR – Security Assessment Report
SBU – Sensitive But Unclassified
SCBA – Services and Components Based Architectures
SDLC – System Development Life Cycle
SDR – System Definition Review
SEI – Software Engineering Institute
SIP – Service Insertion Package
SLA – Service Level Agreement
SELC – Systems Engineering Life Cycle
SME – Subject Matter Expert
SO – System Owner
SOA – Service Oriented Architecture
SORN – System of Records Notice
SP – Standards Profile
SQL – Structured Query Language
SRA – Security Risk Assessment
SRD – System Requirements Document
SRM – Service Reference Model
SRTM – Security Requirements Traceability Matrix
SSP – System Security Plan
ST&E – Security Test and Evaluation
TEMP – Test and Evaluation Master Plan
TI – Technical Insertion
TRR – Test Readiness Review
U – Update
U.S.C. – U.S. Code
URL – Uniform Resource Locator
V&V – Verification and Validation
VDD – Version Description Document
WBS – Work Breakdown Structure


Attachment B6 - Glossary

Acceptance Testing Testing conducted in a simulated operational environment to determine whether a system satisfies its acceptance criteria (i.e., initial requirements and current needs of its user) and to enable the customer to determine whether to accept the system. (Per IEEE Std 1012-1998)

Acquisition

The conceptualization, initiation, design, development, testing, contracting, production, deployment, support, modification, and disposal of systems, supplies, or services (including construction) to satisfy agency needs. “Acquisition” means the acquiring by contract with appropriated funds of supplies or services (including construction) by and for the use of the Federal Government through purchase or lease, whether the supplies or services are already in existence or must be created, developed, demonstrated, and evaluated. Acquisition begins at the point when agency needs are established and includes the description of requirements to satisfy agency needs, solicitation and selection of sources, award of contracts, contract financing, contract performance, contract administration, and those technical and management functions directly related to the process of fulfilling agency needs by contract. [Federal Acquisition Regulation]

Acquisition Planning

Preparing, developing, or acquiring the information to be used to design a project; assess the benefits, risks, and risk-adjusted life-cycle cost of alternative solutions; and establish realistic cost, schedule, and performance goals for the selected alternative, before proceeding to full acquisition of the capital project or useful segment or terminating the project. Planning must progress to the point of commitment to achieving specific goals for the completion of the acquisition. Information-gathering activities may include market research of available solutions, architectural drawings, geological studies, environmental planning, environmental and safety studies, engineering and design studies, and prototypes. Planning is a useful segment of capital investment. Depending on the nature of the project, one or more planning segments may be necessary [OMB Circular No. A-11]. “Acquisition planning” means the process by which the efforts of all personnel responsible for an acquisition are coordinated and integrated through a comprehensive plan for fulfilling the agency need in a timely manner and at a reasonable cost. It includes developing the overall strategy for managing the acquisition. [Federal Acquisition Regulation]

Acquisition Program Baseline (APB)

The APB establishes the project’s performance requirements, schedule requirements, and estimate of total acquisition cost. APB parameter values shall represent the project as it is expected to be produced or deployed. In the case of an evolutionary acquisition strategy, the APB shall include parameters for the next phase and, if known, for follow-on phases. The APB will contain parameters that, if not met, will require the IRB to reevaluate the project and consider alternative project concepts or design approaches. APB parameters are typically established for each useful segment, as well as for the program as a whole, and mirror the objectives in the Future Years Homeland Security Program (FYHSP) and the Program Assessment Rating Tool (PART).

Application Programming Interface (API)

A set of routines, protocols, and tools for building software applications. A good API makes it easier to develop a program by providing all the building blocks. A programmer puts the blocks together. (Per Services and Components Based Architectures [SCBA], version 3.5)

Baseline A formally approved version of a configuration item, regardless of media, formally designated and fixed at a specific time during the configuration item’s life cycle. (Per ISO/IEC 12207)


Baseline Goals Baseline cost, schedule, and performance goals are the standard against which actual work is measured. They are the basis for the annual report to the Congress required by the Federal Acquisition Streamlining Act Title V on variances of 10 percent or more from cost and schedule goals and any deviation from performance goals. OMB must approve the goals and any changes to the goals. The baseline cost and schedule goals should be realistic projections of total cost, total time to complete the project, and interim cost and schedule goals. The interim cost and schedule goals should be based on the value of work performed or a comparable concept. The performance goals should be realistic assessments of what the acquisition is intended to accomplish, expressed in quantitative terms if possible.

Business Continuity Planning (BCP)

Per National Institute of Standards and Technology (NIST) SP 800-34: “The documentation of a predetermined set of instructions or procedures that describe how an organization’s business functions will be sustained during and after a significant disruption.”

Business Reference Model

Per OMB Circular A-11, Business Reference Model (BRM) is a function-driven framework used to describe the lines of business and sub-functions performed by the Federal Government independent of the agencies performing them. IT investments are mapped to the BRM to identify collaboration opportunities.

Business Requirement

A requirement that outlines the procedures and information flows, the proposed changes to those procedures, the user’s assessment of information needs, a preliminary description of the desired system, and an outline of the user’s acceptance requirements.

Capital Planning and Investment Control

A decision-making process for ensuring that investments integrate strategic planning, architecture, security, budgeting, procurement, and the management of the investment in support of agency missions and business needs. The term comes from the Clinger-Cohen Act of 1996; while originally focused on IT, it now applies also to non-IT investments (OMB Circular No. A-11).

Certification and Accreditation

Per OMB Circular A-11, Certification and Accreditation (C&A) is a comprehensive assessment of the management, operational, and technical security controls in an information system, made in support of security accreditation, to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements of the system.

Component (organizational)

All the entities that directly report to the Office of the Secretary, which includes the Secretary, Deputy Secretary and his or her staff, Chief of Staff and his or her staff, and Counselors and their staff. See Management Directive 0010.2.

Component (as related to services)

An independently deployable unit of software that exposes its functionality through a set of services accessed via well-defined interfaces. A component is based on a component standard, is described by a specification, and has an implementation. Components can be assembled to create applications or larger-grained components. (Per SCBA)

Component Based Architecture

An architecture process that enables the design of enterprise solutions using pre-manufactured components. The focus of the architecture may be a specific project or the entire enterprise. This architecture provides a plan of what needs to be built and an overview of what has been built already. (Per SCBA. v3.5)


Conceptual Data Model (CDM)

An abstract representation that illustrates the overall structure of an organization's data by identifying entity types and the relationships between them, independent of any database management system or other implementation considerations. The Conceptual Data Model serves as the common foundation for normalizing data access across the enterprise to support improved data sharing. The Conceptual Data Model satisfies two main objectives: (1) to provide a common vocabulary across the enterprise and (2) to provide an understanding of the fundamental (data) structure of the enterprise.

Configuration Item An entity within a configuration that satisfies an end use function and that can be uniquely identified at a given reference point. (Per ISO/IEC 12207)

Configuration Management

A management process for establishing and maintaining consistency of a product’s performance, functional, and physical attributes with its requirements, design, and operational management information throughout its life. (Per ANSI/EIA-649)

Continuity of Operations Plan (COOP)

Per NIST SP 800-34: “A predetermined set of instructions or procedures that describe how an organization’s essential functions will be sustained for up to 30 days as a result of a disaster event before returning to normal operations.”

Control Phase Capital planning phase that requires ongoing monitoring of information technology investments against schedules, budgets, and performance measures.

Customer An individual, organization, or enterprise that (1) commissions the engineering of a system; (2) is a prospective purchaser of the end products of a system, or portions thereof; or (3) is an acquirer of a product. (Per ANSI/EIA-632-1998)

Data A value or set of values, representing a specific concept or concepts in a formalized manner, suitable for communication, interpretation, or processing by humans or by automatic means. Data becomes “information” when analyzed and possibly combined with other data in order to extract meaning, and provide context. The meaning of data can vary according to its context.

Data Architecture Architectural framework for how data is stored, managed, and used in a system. It describes how data is persistently stored, how components and processes reference and manipulate this data, how external/legacy systems access the data, the interfaces to data managed by external/legacy systems, and the implementation of common data operations. Data architecture establishes common guidelines for data operations that make it possible to predict, model, gauge, and control the flow of data in the system. (Carnegie Mellon Software Engineering Institute)

Data Asset A managed container for data; examples include relational database, Web site, document repository, directory or data service. (FEA DRM 2.0)

Data Exchange Categorization of information being exchanged between one or more parties; such as the regular exchange of environment testing data among federal, state, local, and tribal entities. (FEA DRM 2.0)

Data Model A graphic and/or lexical representation of the data and information required to support the operation of any set of business processes and/or the systems used to automate them (based on FEA DRM 2.0). A description of the organization of data in a manner that reflects an information structure; (ISO 11179-1). A representation of data, specifying their properties, structure and inter-relationships; (ISO 11179-3). A model that describes in an abstract way how data is represented in a business organization, an information system, or a database management system.


Data Reference Model (DRM)

A representational framework whose primary purpose is to enable information sharing and reuse across all levels of the government via the standard description and discovery of common data and the promotion of uniform data management practices. A Data Reference Model (DRM) is one of the five reference models of the Federal Enterprise Architecture (FEA).

Acquisition Decision Event

A predetermined point within the acquisition lifecycle phases at which the investment will undergo a review prior to commencement of the next phase.

Deliverable An item agreed to be delivered to an acquirer as specified in an agreement. This item can be a document, a hardware item, a software item, a service, or any type of work product. (Per ANSI/EIA-632-1998)

Development The action by which a set of requirements is translated into a solution definition for a set of products that satisfy stakeholders. (Per ANSI/EIA-632-1998)

Developmental Test Any testing used to assist in the development and maturation of products, product elements, or manufacturing or support processes. Also any engineering-type test used to verify that design risks are minimized, substantiate achievement of contract technical performance, and certify readiness for Operational Testing.

Earned Value Management

A management methodology for integrating scope of work with schedule and cost elements for optimum investment planning and control. A project (investment) management tool effectively integrating the investment scope of work with schedule and cost elements for optimum investment planning and control. The qualities and operating characteristics of earned value management (EVM) systems are described in American National Standards Institute/Electronic Industries Alliance (ANSI/EIA) Standard 748-A-1998, Earned Value Management Systems, approved May 19, 1998. It was reaffirmed on August 28, 2002.

E-Government Per the E-Government Act of 2002, the term “electronic Government” means the use by the Government of web-based Internet applications and other information technologies, combined with processes that implement these technologies, to: (a) enhance the access to and delivery of Government information and services to the public, other agencies, and other Government entities; or (b) bring about improvements in Government operations that may include effectiveness, efficiency, service quality, or transformation.

Electronic Records Electronic, or machine-readable records, are records on electronic storage media (A Glossary for Archivists, Manuscript Curators, and Records Managers, Society of American Archivists: Chicago, 1992 p. 12). Electronic record, as defined in NARA regulations (36 CFR 1234.2), means any information that is recorded in a form that only a computer can process and that satisfies the definition of a Federal record per the Federal Records Act definition supplied above. Federal electronic records are not necessarily kept in a “recordkeeping system” but may reside in a generic electronic information system or are produced by an application such as word processing or electronic mail.

Electronic Records Management

Electronic records management is using automated techniques to manage records regardless of format. Electronic records management is the broadest term that refers to electronically managing records on varied formats, be they electronic, paper, microform, etc.

Enterprise Service Bus

An enterprise integration architecture that allows incremental integration driven by business requirements, not technology limitations. (Per SCBA, v3.5)


Evaluate Phase Capital planning phase that requires information technology investments to be reviewed once they are operational to determine whether the investments meet expectations.

Evaluation A systematic determination of the extent to which an entity meets its specified criteria. (Per ISO/IEC 12207)

Environment (1) The natural conditions (weather, climate, ocean conditions, terrain, vegetation, dust, etc.) and induced conditions (electromagnetic interference, heat, vibration, etc.) that constrain the design definitions for end products and their enabling products. (2) External factors affecting an enterprise or project. (3) External factors affecting development tools, methods, or processes. (Per ANSI/EIA-632-1998)

Exhibit 300 Business Case

Exhibit 300 business cases are also referred to as capital asset plans. They are required by OMB Circular A-11 and provide budget justification and reporting requirements for investments. They provide agencies with the format to report on the budgeting, acquisition, and management of federal capital assets.

Exhibit 53 Exhibit 53s are also referred to as agency IT investment portfolios. They are required by OMB Circular A-11 and provide summary budget information for all agency major and non-major IT investments.

Exit Criteria (per Directive 102-01)

Project-specific accomplishments that must be demonstrated satisfactorily before a project can either progress further in the current acquisition phase or transition to the next acquisition phase. Exit criteria are normally selected to track progress in important technical, schedule, or management risk areas. Exit criteria shall serve as gates that, when successfully passed or exited, demonstrate that the project is on track to achieve its final goals and should be allowed to continue with additional activities within an acquisition phase or be considered for continuation into the next acquisition phase. Exit criteria include some level of demonstrated performance outcome (e.g., level of engine thrust), the accomplishment of some process at some level of efficiency (e.g., manufacturing yield), the successful accomplishment of some event (e.g., first flight), or some other criterion (e.g., establishment of a training program or inclusion of a particular clause in the follow on contract) that indicates that the particular aspect of the project is progressing satisfactorily.

Federal Enterprise Architecture

Federal Enterprise Architecture (FEA) is a business-based framework for government-wide improvement. It describes the relationship between business functions and the technologies and information supporting them. The FEA is being constructed through a collection of interrelated “reference models” designed to facilitate cross-agency analysis and the identification of duplicative investments, gaps, and opportunities for collaboration within and across federal agencies. More information about the FEA reference models is available at http://www.egov.gov.

Financial System Per OMB Circular A-127, the term “financial system” means an information system, comprised of one or more applications, that is used for any of the following:

• collecting, processing, maintaining, transmitting, and reporting data about financial events;

• supporting financial planning or budgeting activities;

• accumulating and reporting cost information; or

• supporting the preparation of financial statements.


Functional Requirement

A requirement that defines what system products must do and their desired behavior in terms of an effect produced, or an action or service to be performed. (Per ANSI/EIA-632-1998)

Independent Verification and Validation (IV&V)

Verification and Validation processes performed by an organization with a specified degree of technical, managerial, and financial independence from the development organization. (Per IEEE Std 1012-1998)

Information Security’ Per FISMA, the term “information security” means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide: (A) Integrity, which means guarding against improper information modification or destruction, and includes ensuring information non-repudiation and authenticity; (B) Confidentiality, which means preserving authorized restrictions on access and disclosure, including means for protecting personal privacy and proprietary information; and (C) Availability, which means ensuring timely and reliable access to and use of information.

Information System Per OMB Circular A-130: The term “information system” means a discrete set of information resources organized for the collection, processing, maintenance, transmission, and dissemination of information, in accordance with defined procedures, whether automated or manual.

Information Technology

Any equipment or interconnected system(s) or subsystem(s) of equipment/software, or any national security system, that is used in the automatic acquisition, storage, manipulation, management, movement, control, display (including geospatial technologies), switching, interchange, transmission (wired or wireless telecommunications), or reception of data, voice, video, or information by an executive agency. For purposes of this definition, equipment is used by DHS if the equipment is used by DHS directly or is used by DHS organizational partners (including other federal agencies, state and local governments, and private contractors) under a contract with DHS that requires the use, to a significant extent, of such equipment in the performance of a service or the furnishing of a product. The term IT includes computers; ancillary equipment (including imaging peripherals, input, output, and storage devices necessary for security and surveillance); peripheral equipment designed to be controlled by the central processing unit of a computer; software; firmware and similar procedures; services (including support services); and related resources. The term IT does not include any equipment that is acquired by a contractor incidental to a contract, or any equipment that contains embedded IT that is used as an integral part of the product but whose principal function is not the acquisition, storage, analysis, evaluation, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information. For example, heating, ventilation, and air conditioning equipment, such as thermostats or temperature control devices, and medical equipment for which IT is integral to operation, are not IT [Federal Acquisition Regulation 2.101]. The EAB will review all IT investments, including any investments categorized as non-IT on the E300 but that contain IT components.

Integrated Project Team

A multi-disciplinary team led by a PM responsible and accountable for planning, budgeting, procurement and life-cycle management of the investment to achieve its cost, schedule and performance goals. Team skills include budgetary, financial, capital planning, procurement, user, program, architecture, earned value management, security, and other staff as appropriate. An IPT may include members from both government (including a contracting officer) and industry, after award.

Integration Testing An orderly progression of testing of incremental pieces of the software program, in which software elements, hardware elements, or both are combined and tested until the entire system has been integrated, in order to show compliance with the program design and with the capabilities and requirements of the system. (Per IEEE Std 1012-1988)

Integrity Per FIPS Publication 199, the term “integrity” refers to guarding against improper information modification or destruction, and includes ensuring information non-repudiation and authenticity. [44 U.S.C., SEC. 3542]

Interoperability (1) Ability of systems, personnel, and equipment to provide and receive functionality, data, information, and/or services to and from other systems, personnel, and equipment, between both public and private agencies, departments, and other organizations, in a manner enabling them to operate effectively together. (2) Per the E-Government Act of 2002, the ability of different operating and software systems, applications, and services to communicate and exchange data in an accurate, effective, and consistent manner.

Investment DHS cost, outlays, or expenditure to achieve goals and objectives that results in the acquisition and/or sustainment of a needed capability (including processes) for furthering the DHS mission. Examples of investments are expenditures for personnel, research and development (R&D), capital assets, information technology (IT), services, operations and maintenance, and decommissioning and disposal of replaced systems. Investment decisions are a balance of requirements, risks, and funding. Investment decisions spur and guide acquisition and contracting decisions and baselines. DHS has categorized major investments as Levels 1 and 2 and Level 3 IT.

Life Cycle Cost The total cost to the Federal government of acquiring, operating, supporting, and, if applicable, disposing of the items being acquired [FAR 7.101]; the sum of all costs over the useful life of a building, system, or product; the sum total of the direct, indirect, recurring, nonrecurring, and other related costs incurred or estimated to be incurred in the design, development, production, operation, maintenance, support, and final disposition of a major system over its anticipated useful life span and salvage (resale) value, if any [FAR 52.248-2(b)]. Where system or project planning anticipates the use of existing sites or facilities, restoration and refurbishment costs should be included [OMB Circular A-94, Appendix A].
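
The total in a life cycle cost estimate is the arithmetic roll-up of every cost element over the anticipated useful life, less any salvage (resale) value. The sketch below is purely illustrative; the cost categories and dollar figures are hypothetical and are not taken from this Guidebook, and discounting per OMB Circular A-94 is omitted for brevity.

# Illustrative life cycle cost roll-up (hypothetical figures, in $ millions).
cost_elements = {
    "design_and_development": 40.0,
    "production": 110.0,
    "operations_and_maintenance": 18.0 * 10,  # $18M per year over a 10-year life
    "disposal": 6.0,
}
salvage_value = 3.0  # resale value, if any, offsets the total

life_cycle_cost = sum(cost_elements.values()) - salvage_value
print(f"Estimated life cycle cost: ${life_cycle_cost:.1f}M")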

Life Cycle Model A framework containing the processes, activities, and tasks involved in the development, operation, and maintenance of a software product, spanning the life of the system from the definition of its requirements to the termination of its use. (Per ISO/IEC 12207)

Logical Data Model (LDM)

A graphical representation of the information requirements of a business area at a more granular level than a Conceptual Data Model and includes data objects and their interrelationships. The Logical Data Model contains objects and elements expressed in business terms and is the basis for developing physical data models.
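
As an illustration only, the fragment below sketches how a logical-level view might express two business objects drawn from this glossary (Program and Project) and one interrelationship in business terms, with no database-specific detail; the attribute names are hypothetical.

from dataclasses import dataclass

# Hypothetical logical-model fragment: business objects and one
# interrelationship, expressed in business terms only (no DBMS detail).
@dataclass
class Program:
    program_name: str
    sponsor: str

@dataclass
class Project:
    project_name: str
    parent_program: str  # a Project may be the whole or a part of a Program
    start_date: str
    end_date: str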

Major Information System

Per OMB Circular A-130, the term “major information system” means an information system that requires special management attention because of its importance to an agency mission; its high development, operating, or maintenance costs; or its significant role in the administration of agency programs, finances, property, or other resources.

Major Investment At DHS, major investments include all Level 1 and 2 investments, as well as Level 3 IT investments in accordance with the investment thresholds defined in Directive 102-01.

Metadata The information that is stored as the description of a unique piece of data and all the properties associated with it.
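
As a minimal, hypothetical sketch: the record below describes a single data element and its properties, held separately from any value of that element. Every field name shown is an assumption made for illustration.

# Hypothetical metadata record: properties that describe one data element,
# stored apart from the data value itself.
metadata = {
    "element_name": "project_end_date",
    "definition": "Planned completion date of the project",
    "data_type": "date",
    "format": "YYYY-MM-DD",
    "steward": "Program Management Office",  # hypothetical data steward
    "source_system": "project tracking system",
}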

Monitoring An examination of the status of the activities of a supplier and of their results by the acquirer or a third party. (Per ISO/IEC 12207)

National Information Exchange Model (NIEM)

A Federal, State, Local and Tribal interagency initiative providing a foundation for seamless information exchange. It leverages the data exchange standards efforts successfully implemented by the Global Justice Information Sharing Initiative (Global) and extends the Global Justice XML Data Model (GJXDM) to facilitate timely, secure information sharing across the whole of the justice, public safety, emergency and disaster management, intelligence, and homeland security enterprise. NIEM was launched on February 28, 2005, through a partnership agreement between the U.S. Department of Justice (DOJ) and the U.S. Department of Homeland Security (DHS) and signed by Chief Information Officers.

National Security System

Per OMB Circular A-130, the term “national security system” means any telecommunications or information system operated by the United States Government, the function, operation, or use of which (1) involves intelligence activities; (2) involves cryptologic activities related to national security; (3) involves command and control of military forces; (4) involves equipment that is an integral part of a weapon or weapons system; or (5) is critical to the direct fulfillment of military or intelligence missions, but excluding any system that is to be used for routine administrative and business applications (including payroll, finance, logistics, and personnel management applications).

Non-Functional Requirement

Non-functional requirements specify criteria that can be used to judge the operation of a system rather than specific behaviors; they contrast with functional requirements, which specify particular behaviors or functions. For example, “the system shall produce a monthly obligation report” is a functional requirement, whereas “the system shall be available 99.9 percent of the time” is non-functional. Typical non-functional requirements address reliability, scalability, availability, and cost.

Operational Per OMB Circular A-11, operational (steady state) means an asset or a part of an asset with a delivered component performing the mission.

Operational Analysis

Operational analysis is a method of examining the ongoing performance of an operating asset investment and measuring that performance against an established set of cost, schedule, and performance goals. An operational analysis is, by nature, less structured than performance reporting methods applied to developmental projects and should trigger considerations of how the investment's objectives could be better met, how costs could be reduced, and whether the organization should continue performing a particular function. [OMB Circular A-11] In essence, operational analysis is used to examine whether an investment in Operations and Maintenance still meets its intended objectives and yields expected benefits. See the DHS Operational Analysis Guidance for more information.
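
A minimal sketch of the comparison an operational analysis performs is shown below, with entirely hypothetical goals and actuals; real analyses should follow the DHS Operational Analysis Guidance rather than this illustration.

# Compare an operating asset's actual performance against its established
# cost, schedule, and performance goals (all values hypothetical).
goals = {"annual_cost_musd": 12.0, "availability_pct": 99.0, "avg_response_sec": 2.0}
actuals = {"annual_cost_musd": 13.5, "availability_pct": 98.2, "avg_response_sec": 1.7}

for measure, goal in goals.items():
    actual = actuals[measure]
    higher_is_better = measure == "availability_pct"
    meets = actual >= goal if higher_is_better else actual <= goal
    print(f"{measure}: goal={goal}, actual={actual}, "
          f"{'meets goal' if meets else 'review needed'}")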

Operational Scenario A sequence of events expected during operation of system products. Includes the environmental conditions and usage rates as well as expected stimuli (inputs) and responses (outputs). (Per ANSI/EIA-632-1998)

Operational Test A field test of any system or key component of a system, performed under realistic conditions and overseen and evaluated by an activity independent of the agency developer and user organizations, to determine the effectiveness and suitability of that system or component when used by typical users in the expected operating environment.

Periodic Reporting

A DHS reporting process for major investments that establishes communication among investment Program Managers, DHS Component senior leadership, and DHS oversight entities regarding the health and status of major DHS investments. The information provided via Periodic Reporting enables DHS to provide oversight and to ensure compliance with Department and OMB requirements, along with preparing required reports related to the OMB Information Technology High Risk Template and the President’s Management Agenda e-Government initiative.

Performance Requirement

A requirement that defines how well the system products are required to perform a function, along with the conditions under which the function is performed. (Per ANSI/EIA-632-1998)

Personally Identifiable Information

The term “personally identifiable information” or “PII” means any information that permits the identity of an individual to be directly or indirectly inferred, including any information that is linked or linkable to that individual, regardless of whether the individual is a U.S. Citizen, Legal Permanent Resident, or a visitor to the U.S.

Physical Data Model (PDM)

A representation of a data design which takes into account the facilities and constraints of a given database management system. It is typically derived from the Logical Data Model and may include all the database objects required to create relationships between tables or achieve performance goals, such as indexes, constraint definitions, linking tables, partitioned tables, or clusters. At this level, the data modeler specifies how the logical data model will be realized in the database schema (Conceptual, Logical, and Physical Data Models).
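
Continuing the hypothetical Program/Project fragment sketched under Logical Data Model above, the physical level commits to one specific database management system and adds structures, such as constraints and an index, chosen for that product; SQLite is used here purely for illustration.

import sqlite3

# Hypothetical physical realization of the logical fragment for one DBMS.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE program (
        program_id   INTEGER PRIMARY KEY,
        program_name TEXT NOT NULL
    );
    CREATE TABLE project (
        project_id   INTEGER PRIMARY KEY,
        program_id   INTEGER NOT NULL REFERENCES program(program_id),
        project_name TEXT NOT NULL,
        start_date   TEXT,
        end_date     TEXT
    );
    -- Physical-level choice made for performance; not present in the LDM.
    CREATE INDEX idx_project_program ON project(program_id);
""")
conn.close()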

Planning Preparing, developing or acquiring the information used to: design the investment; assess the benefits, risks, and risk-adjusted life-cycle costs of alternative solutions; and establish realistic cost, schedule, and performance goals, for the selected alternative, before either proceeding to full acquisition of the capital project (investment) or useful segment or terminating the investment. Planning must progress to the point where the project is ready to commit to achieving specific goals for the completion of the acquisition before proceeding to the acquisition phase. Information gathering activities may include market research of available solutions, architectural drawings, geological studies, engineering and design studies, and prototypes. Planning is a useful segment of a capital project (investment). Depending on the nature of the investment, one or more planning segments may be necessary.

Plan of Action and Milestones

As defined in OMB Memorandum 02-01, a plan of action and milestones (POA&M), also referred to as a corrective action plan, is a tool that identifies activities that need to be accomplished. It details resources required to accomplish the elements of the plan, any milestones in meeting the task, and scheduled completion dates for the milestones. The purpose of the POA&M is to assist agencies in identifying, assessing, prioritizing, and monitoring the progress of corrective efforts for security weaknesses found in programs and systems.
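
As a purely illustrative sketch of the elements described above, the record below captures one hypothetical weakness together with required resources, milestones, and scheduled completion dates; the field names, dates, and weakness are assumptions, not DHS-prescribed content.

# Hypothetical POA&M entry for a single security weakness.
poam_entry = {
    "weakness": "Audit logging not enabled on application servers",
    "resources_required": "2 staff-weeks; no additional funding",
    "milestones": [
        {"description": "Enable audit logging in test environment", "scheduled": "2009-01-15"},
        {"description": "Deploy logging configuration to production", "scheduled": "2009-02-27"},
    ],
    "point_of_contact": "Component ISSO",
    "status": "Ongoing",
}

for milestone in poam_entry["milestones"]:
    print(f"{milestone['scheduled']}: {milestone['description']}")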

Portfolio Management The management of broad categories of investments linked by their relationship to the mission to ensure effective performance, correspondence to the DHS EA, minimization of overlapping functions, and proper funding.

Post-Implementation Review

Evaluation of the investment after it has been fully implemented or terminated to determine whether the targeted outcome (e.g., performance measures) of the investment has been achieved.

Pre-Select Phase Capital planning phase that provides a process to assess whether information technology investments support strategic and mission needs.

Privacy Impact Assessment

Per OMB Circular A-11, Privacy Impact Assessment (PIA) is a process for examining the risks and ramifications of using information technology to collect, maintain and disseminate information in identifiable form from or about members of the public, and for identifying and evaluating protections and alternative processes to mitigate the impact to privacy of collecting such information.

Program

The totality of activities directed to accomplish specific goals and objectives, which may provide new or improved capabilities in response to approved requirements and/or sustain existing capabilities, and which may have multiple projects to obtain specific capability requirements or capital assets. A Program is funded by one or more Investments.

Program Manager The responsible agency customer, who, with significant discretionary authority, is uniquely empowered to make final scope-of-work, capital-investment, and performance acceptability decisions and who is responsible for accomplishing program objectives or production requirements through the acquisition of any mix of in-house, contract, or reimbursable support resources. The Program Manager is responsible for management and oversight of the IPT.

Project A planned undertaking with a definite beginning, objective and ending. A project involves the definition, acquisition, and fielding of a unique product, service or result in accordance with specified resources and requirements. A project may be the whole or a part of a Program, and is funded by those Investments related to the Program. A project is a basic building block related to a program that is individually planned, approved, and managed. All investment elements with a start and end date and producing a defined capability are considered projects.

Project Manager A project manager (PM) is the official assigned responsibility for accomplishing a specifically designated unit of work effort established to achieve stated or designated objectives, defined tasks, or other units of related effort on a schedule and in support of the program mission. The project manager is responsible for the planning, controlling, and reporting of the project, and for the management of a specific function or functions, performance of the schedule, formulation of the budget, and execution of the approved budget. The project manager works closely with the Program Manager to ensure project objectives meet program objectives.

Prototype A model (physical, electronic, digital, analytical, etc.) of a product built for the purpose of a) assessing the feasibility of a new or unfamiliar technology; b) assessing or mitigating technical risk; c) validating requirements; d) demonstrating critical features; e) verifying a product; f) validating a product; g) determining enabling product readiness; h) characterizing performance or product features; or i) discovering physical principles. (Per ANSI/EIA-632-1998)

Qualification The process of demonstrating whether an entity is capable of fulfilling specified requirements. (Per ISO/IEC 12207)

Quality Assurance All the planned and systematic activities implemented within the quality system, and demonstrated as needed, to provide adequate confidence that an entity will fulfill requirements for quality. (Per ISO/IEC 12207)

Registry A database providing information describing and categorizing objects, but which does not contain the objects themselves. Registries usually provide information as to how to access the objects they describe. For example, a “Service Registry” provides information on services. (Per SCBA, v3.5)
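
A minimal sketch of the idea: the registry below describes and categorizes a service and records how to reach it, but does not contain the service itself. The service name and endpoint URL are hypothetical.

# Hypothetical service registry: descriptions and access information only.
service_registry = {
    "person-query": {
        "category": "data access",
        "description": "Retrieves person records by identifier",
        "endpoint": "https://services.example.gov/person-query",  # hypothetical URL
    },
}

entry = service_registry["person-query"]
print(f"{entry['description']} -- available at {entry['endpoint']}")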

Release A particular version of a configuration item that is made available for a specific purpose (for example, test release). (Per ISO/IEC 12207)

Requirement (1) Something that governs what, how well, and under what conditions a product will achieve a given purpose. (2) Normative elements that govern implementation of this Standard, including certain documents such as agreements, plans, or specifications. (Per ANSI/EIA-632-1998)

Resource Allocation Decision

The Secretary’s formal approval of Components’ RAPs. Resource Allocation Decisions set resource allocation targets for Components for the FYHSP and become the basis for the budget.

Resource Allocation Plan (RAP)

In the programming phase of the PPBE, the Components annually develop proposed programs consistent with the IPG. These programs, expressed in the RAP, reflect systematic allocation of resources required to achieve missions, objectives, and priorities, and potential alternative methods of accomplishing them. Resource requirements reflected in RAPs are translated into time-phased funding requirements. RAPs must account for long-term requirements and resources including human capital, construction and investments, operating and maintenance, and potential disposal or termination costs, and program performance goals.

Reuse Any use of a preexisting software artifact (component, specification, etc.) in a context different from that in which it was created. (Per SCBA, v3.5)

Risk Management An organized process for identifying and assessing risks and for implementing means to avoid them or mitigate their effect if they occur. (Per ANSI/EIA-632-1998)

Security Category Per FIPS Publication 199, the characterization of information or an information system based on an assessment of the potential impact that a loss of confidentiality, integrity, or availability of such information or information system would have on organizational operations, organizational assets, or individuals.
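
FIPS Publication 199 also describes a “high water mark” for systems: for each security objective, the system-level impact is the highest impact assigned among the information types the system handles. The sketch below illustrates that roll-up; the information types and impact levels are hypothetical.

# Illustrative FIPS 199-style roll-up of impact levels (hypothetical data).
LEVELS = {"low": 1, "moderate": 2, "high": 3}

information_types = {
    "public web content":   {"confidentiality": "low",      "integrity": "moderate", "availability": "low"},
    "case management data": {"confidentiality": "moderate", "integrity": "moderate", "availability": "high"},
}

system_category = {
    objective: max((impacts[objective] for impacts in information_types.values()),
                   key=LEVELS.get)
    for objective in ("confidentiality", "integrity", "availability")
}
print(system_category)
# {'confidentiality': 'moderate', 'integrity': 'moderate', 'availability': 'high'}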

Security Controls Per FIPS Publication 199, the term “security controls” refers to the management, operational, and technical controls (i.e., safeguards or countermeasures) prescribed for an information system to protect the confidentiality, integrity, and availability of the system and its information.

Select Phase Capital planning phase used to identify all new, ongoing, and operational investments for inclusion into the agency’s investment portfolio(s).

Service Discrete unit of functionality that can be requested (provided a set of preconditions is met), performs one or more operations (typically applying business rules and accessing a database), and returns a set of results to the requester. Completion of a service always leaves business and data integrity intact. (Per SCBA, v3.5)

Service Component A self-contained business process or service with predetermined and well-defined functionality that may be exposed through a well defined and documented business or technology interface. Well-designed Service Components are “loosely coupled” and collaborate primarily by exchanging messages. (Per SCBA, v3.5)
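
A minimal sketch of the “loosely coupled” idea: the consumer below collaborates with a service component only by exchanging messages through a documented interface, never by reaching into its internals. The service name, message fields, and validation rule are all hypothetical.

# Hypothetical service component exposed through a documented interface.
def address_validation_service(request: dict) -> dict:
    """Accepts a message {'request_id', 'street', 'zip'}; returns a result message."""
    is_valid = bool(request.get("street")) and len(request.get("zip", "")) == 5
    return {"request_id": request.get("request_id"), "valid": is_valid}

# A consumer collaborates by sending a message and reading the reply.
reply = address_validation_service(
    {"request_id": "42", "street": "123 Main Street", "zip": "20001"}
)
print(reply)  # {'request_id': '42', 'valid': True}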

Services and Components Based Architecture

Services and Components Based Architecture (SCBA) leverages the Federal Enterprise Architecture (FEA) and builds upon the concepts, principles, and benefits of Service Oriented Architecture (SOA). SCBA represents a practical, results-oriented, approach to modernizing enterprises. It is intended to help organizations reduce long-term costs, improve quality of service, improve information sharing, and help achieve a vision of flexible business processes supported by customer-focused applications, which can be altered in a matter of days instead of months. SCBA builds upon SOA principles in three ways: (1) it is tightly integrated with the Federal Enterprise Architecture, (2) it provides a description of what the architecture is (clarifying the varying descriptions that exist), and (3) it identifies the organizational, cultural, and process elements, as well as technological elements, that need to exist for these architectures to be successful. The most important aspect of SCBA is its focus on reuse of services and components – better referred to as Service Components. (Per SCBA, v3.5)

Service Component Reference Model

Per OMB Circular A-11, Service Component Reference Model (SRM) is a common framework and vocabulary used for characterizing the IT and business components collectively comprising an IT investment. The SRM helps agencies rapidly assemble IT solutions through the sharing and re-use of business and IT components. A component is a self-contained process, service, or IT capability with pre-determined functionality that may be exposed through a business or technology interface.

Service Level Agreement (SLA)

A contract or memorandum of agreement between a service provider and a customer that specifies, usually in measurable terms, what services the service provider will furnish. Information technology departments in major enterprises have adopted the idea of writing a service level agreement so that services for their customers (users in other departments within the enterprise) can be measured, justified, and perhaps compared with those of external (sourcing) service providers. (Per SCBA, v3.5)

Service Oriented Architecture

(1) Architecture that describes an entity (e.g., application or enterprise) as a set of interdependent services. SOA provides for reuse of existing services and the rapid deployment of new business capabilities based on exploiting existing assets. (2) Representation of a system where the functionality is provided as a set of services called by other parts of the system. (3) Policies, practices and frameworks that enable application functionality to be provided and requested as sets of services published at a granularity relevant to the service Requestor, which are abstracted away from the implementation using a single, standards based form of interface. (Per SCBA, v3.5)

Specification A document that contains specified requirements for a product and the means to be used to determine that the product satisfies these requirements. (Per ANSI/EIA-632-1998)

Stage A period within the life cycle of an entity that relates to the state of its description or realization. (Per IEEE P15288/CD1)

Stakeholder An individual or organization having a right, share, claim, or interest in a system or in its possession of characteristics that meet their needs and expectations. (Per IEEE P15288/CD1)

Standard A document that establishes engineering and technical requirements for products, processes, procedures, practices, and methods that have been decreed by authority or adopted by consensus. (Per ANSI/EIA-632-1998)

Steady State An asset or part of an asset that has been delivered and is performing the mission. The steady state phase may also be termed “operational.”

Subsystem A grouping of items that perform a set of functions within a particular end product. (Per ANSI/EIA-632-1998)

System An aggregation of end products and enabling products to achieve a given purpose. (Per ANSI/EIA-632-1998)

System Of Records Notice

Per OMB Circular A-11, System Of Records Notice (SORN) means a statement providing to the public notice of the existence and character of a group of any records under the control of any agency from which information is retrieved by the name of the individual or by some identifying number, symbol, or other identifying particular assigned to the individual. The Privacy Act of 1974 requires this notice to be published in the Federal Register upon establishment or substantive revision of the system, and establishes what information about the system must be included.

System Requirement A requirement derived from one or more functional requirements and stated in technical terms. (Per ANSI/EIA-632-1998)

System Testing The activities of testing an integrated hardware and software system to verify and validate whether the system meets its original objectives. (Per IEEE Std 1012-1988)

Test Case Documentation that specifies inputs, predicted results, and a set of execution conditions for a test item. (Per IEEE Std 1012-1988)

Test Plan Documentation that specifies the scope, approach, resources, and schedule of intended testing activities. (Per IEEE Std 1012-1988)

Total Acquisition Cost All costs for acquiring, by contract, interagency agreement (IA), and/or other funding instruments, supplies and/or services for a designated investment through purchase or lease, whether the supplies are already in existence or must be created, developed, demonstrated, and evaluated, and without regard to the type(s) of funds used, whether appropriated or non-appropriated. Service contracts that are part of the investment must be considered part of the total acquisition cost.

Traceability The ability to identify the relationship between various products of the development process, i.e., the lineage of requirements, the relationship between a design decision and the affected requirements and design features, the assignment of requirements to design features, the relationship of test results to the original source of requirements. (Per ANSI/EIA-632-1998)
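
A minimal sketch of requirement-to-design-to-test lineage; the requirement, design element, and test case identifiers are hypothetical.

# Hypothetical traceability links: each requirement traced to the design
# element that satisfies it and the test case that verifies it.
trace_matrix = [
    {"requirement": "FR-012", "design_element": "Report Generator module", "test_case": "TC-045"},
    {"requirement": "PR-003", "design_element": "Query Optimizer", "test_case": "TC-051"},
]

# Follow the lineage from a test result back to its originating requirement.
failed_test = "TC-051"
origins = [row["requirement"] for row in trace_matrix if row["test_case"] == failed_test]
print(f"{failed_test} traces back to requirement(s): {origins}")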

Trade-Off Decision-making actions that select from various requirements and alternative solutions on the basis of net benefit to the stakeholders. (Per IEEE P15288/CD1)

Use Case A use case is a technique for capturing functional requirements of business systems and, potentially, of an IT system to support the business system. The use case model uses “Actors” and “use cases.” An Actor is the representation of a person or system that exists outside the system under study and that performs a sequence of activities in a dialogue with the system. A Use Case represents a single interaction between a primary actor (who initiates the interaction), other (secondary) actors, and the system itself. The interaction is presented as a sequence of simple steps.

User An individual or organization that uses the operational system to perform a specific function. (Per ISO/IEC 12207)

Unit Testing

Testing performed by the Development Team during the Development Stage (Stage 4) to verify the code or changes to the code within a particular module or subroutine. This is the lowest level of testing that can be done on a code module or unit.
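
A minimal sketch of a unit test written with Python's standard unittest framework, exercising a single hypothetical function in isolation; the function and expected values are illustrative and not drawn from this Guidebook.

import unittest

def total_acquisition_cost(contract_costs):
    """Unit under test: sums the individual contract costs for an investment."""
    return sum(contract_costs)

class TestTotalAcquisitionCost(unittest.TestCase):
    def test_sums_all_contract_costs(self):
        self.assertEqual(total_acquisition_cost([10.0, 2.5, 0.5]), 13.0)

if __name__ == "__main__":
    unittest.main()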

Validation Confirmation, by examination and provision of objective evidence, that the requirements for a specific intended use have been fulfilled. (Per ISO/IEC 12207)

Verification Confirmation, through the provision of objective evidence, that specified requirements have been fulfilled. (Per IEEE P15288/CD1)

Attachment B7 - References

1. DHS Privacy Impact Assessments: The Privacy Office Official Guidance, March 2006.

2. DHS ISSM Guide to the DHS Information Security Program, Version 2.0, July 19, 2004.

3. DHS ISSO Guide to the DHS Information Security Program, Version 0.6, July 26, 2004.

4. DHS EAB Governance Process Guide, Version 3.0, September 28, 2006.

5. HLS-EA Target Architecture, Version 2.0, October 29, 2004.

6. DHS NCSD, Security in the Software Lifecycle: Making Software Development Processes—and Software Produced by Them—More Secure, DRAFT Version 1.1, July 2006.

7. Customs and Border Protection, Systems Engineering Life Cycle Handbook, CIS HB 5500-07B, Version 1.0, November 9, 2006.

8. ICE System Lifecycle Management, Version 1.0, February 2005.

9. TSA, Systems Development Life Cycle Guidance Document, Version 2.0.4, July 2005.

10. U.S. Coast Guard, Office of Command, Control, Communications, Computers and Information Technology (CG-6), System Development Life Cycle (SDLC), February 11, 2005.

11. CIO Council, Services and Components Based Architectures: A Strategic Guide for Implementing Distributed and Reusable Components and Services in the Federal Government, Chapter 1: Executive Strategy, Version 3.5, January 31, 2006.

12. IEEE, Guide to the Software Engineering Body of Knowledge, Version 1.0, May 2001.

13. IEEE, ISO/IEC IEEE P12207/CD1: Systems and Software Engineering — Software Life Cycle Processes, June 2006.

14. IEEE, ISO/IEC IEEE P15288/CD1: Software and Systems Engineering — Systems Engineering Life Cycle Processes, June 2006.

15. IEEE, ISO/IEC WD 27478: Systems and software engineering — Life cycle management — Guide for Life Cycle Management, June 2006.

16. ANSI, ANSI/EIA-632-1998: Processes for Engineering a System, January 7, 1999.

17. Information Technology Infrastructure Library (ITIL) series, 2004.

18. Department of Labor, System Development Life Cycle Management Manual, Exposure Draft, Version 2.1, December 2002.

19. Department of Defense, DoD 5000 series policy documents, 2003.

20. National Aeronautics and Space Administration, NASA Procedures and Guidelines, NPG: 7120, November 2002.

21. Internal Revenue Service, Enterprise Life Cycle Guide, Version 2.0, May 2006.

22. Standish Group International, The CHAOS Report, 1995.

23. Standish Group International, 2004 Third Quarter Research Report, 2004.

24. U.S. Patent and Trademark Office, Managed Evolutionary Development Guidebook, Second Edition, June 1993.

25. Department of the Air Force, Software Technology Support Center, Guidelines for Successful Acquisition and Management of Software-Intensive Systems: Weapon Systems, Command and Control Systems, Management Information Systems, Condensed Version, February 2003.

26. Royce, Winston W., “Managing the Development of Large Software Systems,” Proceedings, IEEE WESCON, 1970.

27. CIO Council, Services and Components Based Architectures: A Strategic Guide for Implementing Distributed and Reusable Components and Services in the Federal Government, Version 3.5, Chapter 1: Executive Strategy, January 2006.

28. Sensitive Systems Policy Directive DHS 4300A, June 1, 2006.

29. National Security Systems Policy Directive DHS 4300B, June 1, 2006.

30. DHS 4300A Sensitive Systems Handbook, Version 4.0, June 1, 2006.

31. DHS 4300B National Security Systems Handbook, June 1, 2006.

32. DHS Security Certification and Accreditation Guidance for SBU Systems User’s Manual, Version 2.0, May 5, 2006.

33. DHS Security Information Security Categorization Guide, Version 3.0, April 14, 2006.

34. Standards for Security Categorization of Federal Information and Information Systems; Federal Information Processing Standards (FIPS) Publication 199, NIST, February 2004.

35. Security Considerations in the Information System Development Life Cycle, NIST Special Publication 800-64, October 2003.

36. Contingency Planning Guide for Information Technology Systems, NIST Special Publication 800-34, June 2002.

37. Guide for the Security Certification and Accreditation of Federal Information Systems, NIST Special Publication 800-37, May 2004.

38. Guide for Developing Security Plans for Information Technology Systems, NIST Special Publication 800-18, December 1998.

39. Integrating IT Security into the Capital Planning and Investment Control Process, NIST Special Publication 800-65, January 2005.

40. Recommended Security Controls for Federal Information Systems, NIST Special Publication 800-53, Revision 1, December 2006.

41. Guide for Assessing the Security Controls in Federal Information Systems, Draft NIST Special Publication 800-53A, April 2006.

42. Managing Risk from Information Systems: An Organizational Perspective, Draft NIST Special Publication 800-39, October 2007.

43. DHS Service Oriented Architecture Framework, Version 0.1, September/October 2006.

44. NIEM Concept of Operations, NIEM Program Management Office, Revision 0.5, January 2007.