VILNIUS UNIVERSITY
FACULTY OF MATHEMATICS AND INFORMATICS
DEPARTMENT OF SOFTWARE ENGINEERING
Test and configuration management plans
2nd assignment
Made by: Rita Birgelytė
Daina Dirmaitė
Andriy Voitenko
Tadej Rola
Kristina Naudžiūnaitė
Vilnius – 2018
CONTENTS
1. TEST PLAN OBJECTIVE ........................................................................................................................................ 4
2. SCOPE ................................................................................................................................................................ 4
3. TEST STRATEGY .................................................................................................................................................. 4
TEST TYPES ............................................................................................................................................................ 5
3.1.1. Functional testing ...................................................................................................................................... 5
3.1.1.1. Unit testing ........................................................................................................................................... 5
3.1.1.2. Integration testing ................................................................................................................................ 5
3.1.1.3. System testing ...................................................................................................................................... 6
3.1.1.4. Regression testing ................................................................................................................................ 6
3.1.1.5. Acceptance testing ............................................................................................................................... 6
3.1.2. Non-functional testing ............................................................................................................................... 6
3.1.2.1. Performance testing ............................................................................................................................. 6
3.1.2.2. Load testing .......................................................................................................................................... 6
3.1.2.3. Stress testing ........................................................................................................................................ 7
3.1.2.4. Security testing ..................................................................................................................................... 7
3.1.2.5. Configuration testing ............................................................................................................................ 7
3.1.2.6. Recovery testing ................................................................................................................................... 7
3.1.3. Static testing .............................................................................................................................................. 8
3.1.4. Data migration testing ............................................................................................................................... 8
3.1.4.1. Database integrity testing .................................................................................................................... 8
3.1.4.2. Data flow correctness testing ............................................................................................................... 9
3.1.4.3. Test data generation ............................................................................................................................ 9
GENERAL RULES .................................................................................................................................................... 10
4. CRITERIA .......................................................................................................................................................... 10
5. TOOLS .............................................................................................................................................................. 10
6. HUMAN RESOURCES ........................................................................................................................................ 12
7. MILESTONES .................................................................................................................................................... 15
8. DELIVERABLES .................................................................................................................................................. 17
9. RISKS ................................................................................................................................................................ 18
10. INTRODUCTION TO CONFIGURATION MANAGEMENT ................................................................................... 20
11. REFERENCE DOCUMENTS ............................................................................................................................... 20
12. MANAGEMENT .............................................................................................................................................. 20
ORGANIZATION .................................................................................................................................................. 20
RESPONSIBILITIES ................................................................................................................................................ 21
13. ACTIVITIES ...................................................................................................................................................... 22
CONFIGURATION IDENTIFICATION ........................................................................................................................... 22
13.1.1. Documents ............................................................................................................................................. 22
13.1.2. Software executables ............................................................................................................................. 23
13.1.3. Source code items .................................................................................................................................. 23
CONFIGURATION CHANGE CONTROL........................................................................................................................ 23
CONFIGURATION STORAGE .................................................................................................................................... 24
13.3.1. Archive items .......................................................................................................................................... 24
13.3.2. Evolving items ........................................................................................................................................ 24
13.3.3. Source code items .................................................................................................................................. 25
CONFIGURATION AUDITS AND REVIEWS ................................................................................................................... 25
14. RESOURCES .................................................................................................................................................... 26
1. TEST PLAN OBJECTIVE
This document describes the testing plan for the new payment hub configuration and its integration
with internal VIRBank systems. The document supports the following objectives:
1. Prepare data migration testing strategy;
2. Prepare testing requirements and define testing acceptance criteria;
3. List the deliverable elements of the test activities;
4. Identify the required resources;
5. Identify testing process risks.
2. SCOPE
The overall testing process is divided into 2 different streams: functional/non-functional testing
and data migration testing.
During the testing phase we are going to run these tests:
1. Functional: each configured functionality and process orchestration, internal and
external integration testing, regression, acceptance;
2. Non-functional: security, performance, load, stress, configuration, recovery.
Data migration testing scope:
1. Database integrity testing;
2. Data flow correctness testing.
During the testing phase we are not going to test:
1. The UI of the new payment hub (it is not going to be widely used);
2. COTS product functionality (it is assumed that the COTS product functionality has
already been tested by the software provider).
3. TEST STRATEGY
This section describes how the system will be tested. The main considerations for the test
strategy are the techniques to be used and the criteria for knowing when testing is
complete.
The project uses 4 different environments. The testing environment is used for integration,
system, regression, non-functional and database testing, while unit testing is done in the
development environment, and acceptance, performance and recovery testing in the staging
environment.
Due to the importance of some non-functional testing types, the staging environment should be
as close to the production environment as possible.
Test types
This section lists the types of testing that will be performed.
3.1.1. Functional testing
3.1.1.1. Unit testing
Unit tests are written by developers once a story / task is considered done from the
development point of view. Unit tests have to be automated. Unit tests are run by the developer
and every time the application is built.
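Although the project's unit tests will be written with JUnit / NUnit (see the Tools section), the practice can be sketched in Python; the fee calculator below is a purely hypothetical unit, not part of the payment hub:

```python
# Hypothetical unit under test: a transfer-fee calculator for the payment hub.
def payment_fee(amount, domestic=True):
    """Return the transfer fee: flat for domestic, 1% for foreign payments."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    return 0.5 if domestic else round(amount * 0.01, 2)

# Automated unit tests, runnable on every build (e.g. via pytest).
def test_domestic_flat_fee():
    assert payment_fee(100) == 0.5

def test_foreign_percentage_fee():
    assert payment_fee(200, domestic=False) == 2.0

def test_rejects_non_positive_amount():
    try:
        payment_fee(0)
    except ValueError:
        pass  # expected: invalid input is rejected
    else:
        raise AssertionError("expected ValueError for non-positive amount")

test_domestic_flat_fee()
test_foreign_percentage_fee()
test_rejects_non_positive_amount()
```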
3.1.1.2. Integration testing
Integration tests should not be run together with unit tests.
Integration testing can be performed once unit tests have passed with at least a 90% success rate.
A bottom-up approach is used for integration testing, meaning that bottom-level units are tested
first and upper-level units step by step afterwards.
Integration tests should be automated whenever possible.
API testing should also be done as part of integration testing, to check whether the requests and
responses of the payment hub's and external systems' web services match the specification and
whether the two sides work correctly together. Testing is done using mocks and stubs where necessary.
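The stub-based approach can be sketched as follows; `PaymentClient`, `FxService` and the canned rates are illustrative assumptions, not the project's real interfaces:

```python
# Sketch of API-level testing with a stub standing in for an external system.
class FxService:
    def rate(self, currency):
        # The real implementation would call an external web service.
        raise NotImplementedError

class PaymentClient:
    def __init__(self, fx_service):
        self.fx = fx_service  # dependency injected, so a stub can replace it

    def convert(self, amount, currency):
        return round(amount * self.fx.rate(currency), 2)

class StubFxService(FxService):
    """Stub returning canned rates so the external system is not needed."""
    def rate(self, currency):
        return {"EUR": 1.0, "USD": 1.1}[currency]

# The integration test exercises the client against the stub.
client = PaymentClient(StubFxService())
assert client.convert(100, "USD") == 110.0
```

In practice the same substitution happens at the web-service boundary, e.g. SoapUI mock services replacing an external endpoint.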
3.1.1.3. System testing
System testing can be started once an independent payment hub component has been developed
and integrated with the related external systems.
3.1.1.4. Regression testing
Regression testing should be automated.
Regression tests are executed in the following cases:
1. When new functionality is added to the application;
2. When there is a change in requirements;
3. When there is a defect fix;
4. When there is a performance issue fix;
5. When there is a change in environment used.
3.1.1.5. Acceptance testing
Acceptance tests always have to pass.
Acceptance testing is done each time a new component is successfully completed and integrated
with one of the external systems.
Acceptance testing is done based on prepared testing scenarios. Those scenarios are prepared
prior to acceptance testing.
3.1.2. Non-functional testing
3.1.2.1. Performance testing
Performance testing measures response times, transaction rates, and other time-sensitive
requirements.
Performance testing should be completed in the staging environment before acceptance testing.
Performance tests are run under different background loads: normal and peak.
3.1.2.2. Load testing
Load testing should be applied to API endpoints, either as a mix or individually.
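The project will use JMeter / The Grinder for this, but the shape of a load test can be sketched in Python; `call_endpoint` is a simulated stand-in for a real HTTP request:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Minimal load-test harness sketch; call_endpoint is a stand-in for a real
# HTTP request to a payment-hub API endpoint.
def call_endpoint():
    start = time.perf_counter()
    time.sleep(0.01)  # simulated request latency
    return time.perf_counter() - start

def run_load(concurrent_users, requests_per_user):
    # Fire all requests from a pool sized to the simulated user count.
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [pool.submit(call_endpoint)
                   for _ in range(concurrent_users * requests_per_user)]
        latencies = [f.result() for f in futures]
    return {"requests": len(latencies),
            "avg_s": sum(latencies) / len(latencies),
            "max_s": max(latencies)}

stats = run_load(concurrent_users=5, requests_per_user=4)
assert stats["requests"] == 20
```

A real tool adds ramp-up schedules, percentile reporting and assertions on response codes, but the measurement loop is the same idea.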
3.1.2.3. Stress testing
Stress testing is done each time a new component is successfully completed and integrated with
one of the external systems.
Testing is done in these scenarios:
1. Maximum number of users (defined during load testing) performing the critical
operations at the same time;
2. Hardware failures occur, such as the database server going down.
3.1.2.4. Security testing
Security testing is done based on a prepared taxonomy of the most common security
issues:
1. Access to Application (authentication, authorisation);
2. Data protection;
3. Brute-Force Attack (using tool);
4. SQL Injection and Cross-Site Scripting (XSS);
5. Session Management.
Security testing should be done after major changes to the system.
3.1.2.5. Configuration testing
Configuration testing steps:
1. Creation of matrix which consists of various combinations of software and hardware
configurations;
2. Prioritizing the configurations, since it is difficult to test all of them;
3. Testing every configuration based on prioritization.
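Steps 1 and 2 above can be sketched as follows; the platform options and priority weights are purely illustrative assumptions:

```python
from itertools import product

# Step 1: build the configuration matrix (illustrative values only).
os_versions = ["RHEL 7", "Windows Server 2016"]
databases = ["Oracle 12c", "SQL Server 2017"]
jvms = ["JDK 8", "JDK 11"]

matrix = [dict(os=o, db=d, jvm=j)
          for o, d, j in product(os_versions, databases, jvms)]

# Step 2: prioritize, since testing every combination is too expensive.
# Assumed weights reflect how common each option is in production.
weight = {"RHEL 7": 3, "Windows Server 2016": 1,
          "Oracle 12c": 3, "SQL Server 2017": 2,
          "JDK 8": 2, "JDK 11": 1}

prioritized = sorted(
    matrix,
    key=lambda c: weight[c["os"]] + weight[c["db"]] + weight[c["jvm"]],
    reverse=True)

assert len(matrix) == 8
assert prioritized[0] == {"os": "RHEL 7", "db": "Oracle 12c", "jvm": "JDK 8"}
```

Step 3 is then simply running the test suite against the configurations in priority order until the budget is exhausted.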
Software configuration testing begins when:
1. The configurability requirements to be tested are specified;
2. The test environment is ready;
3. Unit and integration tests have passed.
3.1.2.6. Recovery testing
Whenever possible, tests should be done on the hardware being restored to, and in the staging
environment.
Interfaces, hardware, and code should be a replica of the live system.
Recovery tests should be conducted on a regularly scheduled basis – at the end of the sprint.
Recovery tests should also be scheduled whenever major technology changes are implemented,
including physical upgrades, new installations, and related re-configurations.
Backups should be kept at multiple locations, not just one.
3.1.3. Static testing
Reviews are performed to find and eliminate errors in test cases.
Tools and code reviews are used for static analysis to find:
1. A variable with an undefined value;
2. Inconsistent interface between modules and components;
3. Variables that are declared but never used;
4. Unreachable (dead) code;
5. Programming standards violations;
6. Security vulnerabilities;
7. Syntax violations.
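The project relies on SonarQube for this, but item 3 above (declared-but-unused variables) can be illustrated with a toy checker built on Python's `ast` module:

```python
import ast

# Toy static check: report variables that are assigned but never read.
def unused_variables(source):
    tree = ast.parse(source)
    assigned, used = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            if isinstance(node.ctx, ast.Store):
                assigned.add(node.id)  # name appears on the left of an assignment
            else:
                used.add(node.id)      # name is read somewhere
    return sorted(assigned - used)

code = """
total = price * qty
discount = 0.1
print(total)
"""
# 'discount' is assigned but never read.
assert unused_variables(code) == ["discount"]
```

Production analysers go much further (data flow, dead code, security rules), but each check follows this same pattern of walking the syntax tree and matching a defect signature.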
3.1.4. Data migration testing
3.1.4.1. Database integrity testing
Testing is done in order to:
1. Ensure correct data mapping between the front end and the database, such that all
operations performed at the front end are correctly and consistently reflected in the
database and vice versa;
2. Ensure data integrity, such that changes to the data after CRUD operations are
reflected correctly and consistently everywhere the data is stored.
The integrity of a database has to be evaluated by testing two different types of integrity:
1. Domain integrity: ensures that all the fields or columns used in the database are
declared and defined under a specific and valid domain. It has to be tested using null,
default and invalid values;
2. Entity integrity: mainly concerns the non-duplication of records and ensures that each
row of a table has a non-null primary key, where each of these keys is unique. It has to
be tested by providing similar / duplicate or null values.
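Both integrity types above can be exercised against the database engine itself; the sketch below uses an in-memory SQLite table with an assumed `payment` schema as a stand-in for the real payment database (the project itself uses Tricentis and tSQLt):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE payment (
        payment_id INTEGER PRIMARY KEY,               -- entity integrity: unique key
        amount     REAL NOT NULL CHECK (amount > 0),  -- domain integrity
        currency   TEXT NOT NULL CHECK (length(currency) = 3)
    )""")
conn.execute("INSERT INTO payment VALUES (1, 100.0, 'EUR')")

def violates_integrity(sql, params):
    """True if the database rejects the row with an integrity error."""
    try:
        conn.execute(sql, params)
        return False
    except sqlite3.IntegrityError:
        return True

# Entity integrity: a duplicate primary key must be rejected.
assert violates_integrity("INSERT INTO payment VALUES (?, ?, ?)", (1, 5.0, "USD"))
# Domain integrity: invalid and null values must be rejected.
assert violates_integrity("INSERT INTO payment VALUES (?, ?, ?)", (2, -5.0, "USD"))
assert violates_integrity("INSERT INTO payment VALUES (?, ?, ?)", (3, 5.0, None))
```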
3.1.4.2. Data flow correctness testing
Testing is done in order to:
1. Ensure that the legacy database is not updated during tests after the database migration
is completed;
2. Ensure that the mapping at field and table levels does not change the output logic;
3. Ensure that data is migrated accurately and completely.
Data flow correctness has to be evaluated by:
1. Verifying that queries executed in the new database yield the same results as in the old
one;
2. Verifying that the number of records in the old and new databases is the same;
3. Verifying that there are no redundancies and that the new database provides exactly the
same results as the old one;
4. Verifying the query performance (time taken to execute complex queries) of the new
database;
5. Verifying the completeness of data: row counts, min / max, avg and sum checks,
row-by-row comparison, job run times, etc.
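The comparison steps above can be sketched as running identical queries against both databases and checking that the results match; the in-memory SQLite databases and the `account` schema are illustrative stand-ins for the legacy and migrated systems:

```python
import sqlite3

def make_db(rows):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE account (id INTEGER, balance REAL)")
    db.executemany("INSERT INTO account VALUES (?, ?)", rows)
    return db

legacy = make_db([(1, 100.0), (2, 250.5), (3, 40.0)])
migrated = make_db([(1, 100.0), (2, 250.5), (3, 40.0)])

def compare(query, old_db, new_db):
    """True if both databases return identical results for the query."""
    return old_db.execute(query).fetchall() == new_db.execute(query).fetchall()

checks = {
    "row count":  "SELECT COUNT(*) FROM account",
    "aggregates": "SELECT MIN(balance), MAX(balance), SUM(balance) FROM account",
    "row by row": "SELECT * FROM account ORDER BY id",
}
results = {name: compare(sql, legacy, migrated) for name, sql in checks.items()}
assert all(results.values())
```

Timing each query on the new database (step 4) slots naturally into the same loop.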
3.1.4.3. Test data generation
Test data is generated either by jobs that automatically copy production data to the testing
environment or by third-party automated test data generation tools. The general process to
produce test data from the production environment should follow these steps:
1. Set up production jobs to copy the data to a common test environment (data should not
be generated manually);
2. All PII (Personally Identifiable Information) is modified along with other sensitive
data. The PII is replaced with logically correct, but non-personal data;
3. Remove data that is irrelevant;
4. After the data is copied, testers or developers can copy this data to their individual test
environment. They can modify it as per their requirement.
Generated test data should be used only for testing purposes and should be updated
regularly.
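Step 2 above (replacing PII with logically correct but non-personal data) can be sketched as follows; the field names and masking rules are assumptions for illustration:

```python
import hashlib

# Sketch of PII masking: personal fields are replaced with logically correct
# but non-personal values; non-PII fields are kept as-is.
def mask_record(record):
    masked = dict(record)
    # Deterministic pseudonym, so relationships between rows survive masking.
    token = hashlib.sha256(record["name"].encode()).hexdigest()[:8]
    masked["name"] = f"Customer-{token}"
    # Keep the country/bank prefix so the IBAN stays logically correct.
    masked["iban"] = record["iban"][:4] + "X" * (len(record["iban"]) - 4)
    masked["email"] = f"user-{token}@example.test"
    return masked

prod_row = {"name": "Jonas Petraitis",
            "iban": "LT601010012345678901",
            "email": "jonas@realmail.lt",
            "amount": 120.50}  # non-PII field, copied unchanged

test_row = mask_record(prod_row)
assert test_row["amount"] == 120.50
assert test_row["iban"].startswith("LT60") and "12345" not in test_row["iban"]
```

Deterministic masking matters when the same customer appears in several copied tables: every occurrence maps to the same pseudonym, so joins in test scenarios still work.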
General rules
Code contribution:
1. Contributed code should be reviewed and accepted by at least 2 senior developers;
2. Before contributing, unit tests should be executed.
Test coverage:
1. Unit test coverage should be at least 80%;
2. Integration test coverage should be at least 90%;
3. Test cases should cover 100% of the requirements.
4. CRITERIA
ENTRY criteria
Testing can be started when all entry criteria are met:
1. All the necessary documentation should be available;
2. All the standard software tools, including the testing tools, must have been successfully
installed and must be functioning properly;
3. The test environment should be ready (in some cases also staging);
4. Proper test data is available;
5. QA resources have completely understood the requirements;
6. Test scenarios and test cases have been reviewed.
EXIT criteria
Testing can be finished when all exit criteria are met:
1. 100% requirements coverage has been achieved;
2. Fewer than 6 minor and fewer than 3 medium faults, and no major faults, are left outstanding;
3. All high-risk areas have been fully tested, with only minor residual risks left outstanding.
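The numeric exit criteria above can be expressed as an automated gate; in practice the fault counts would come from the incident-reporting tool (JIRA), here they are passed in directly:

```python
# Automated gate over the exit criteria: 100% requirements coverage,
# fewer than 6 minor and 3 medium faults, and zero major faults.
def exit_criteria_met(requirements_coverage_pct, minor, medium, major):
    return (requirements_coverage_pct == 100
            and minor < 6
            and medium < 3
            and major == 0)

assert exit_criteria_met(100, minor=5, medium=2, major=0)
assert not exit_criteria_met(100, minor=5, medium=2, major=1)  # a major fault blocks
assert not exit_criteria_met(98, minor=0, medium=0, major=0)   # coverage below 100%
```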
5. TOOLS
This section will describe the testing tools necessary to conduct the tests.
For what? | Tool | Comment
Testing deliverables creation | TESTFLO for JIRA | Used for test case creation and management, and for test reports.
Data migration testing | Tricentis | Used for database integrity and data flow correctness testing.
Data generation | DTM Data Generator | Used to create data values and optional schema objects such as views, procedures, tables and triggers.
Static analysis | SonarQube | Used to measure and analyse source code quality.
Static analysis | Crucible | Used by programmers to review code.
System testing | Selenium WebDriver | Existing UI automation tests; will be run during regression testing.
Integration testing | SoapUI, Postman | Used for manual testing of SOAP services / REST APIs.
Integration testing | Karate, REST Assured | Frameworks used for automated SOAP / REST API integration testing.
Unit testing | JUnit, NUnit | Used to write unit tests for components written in Java / C#.
Unit testing | tSQLt | Used to run unit tests for SQL Server databases.
Performance / load / stress testing | Apache JMeter, The Grinder | Used to load test functional behaviour and measure performance.
Performance / load / stress testing | HammerDB | Used for database load testing.
Security testing | Fiddler | Used to monitor, manipulate and reuse HTTP requests.
Security testing | Wireshark | Used for protocol analysis.
Security testing | Metasploit Framework, Wapiti | Used for penetration testing.
Security testing | IBM Security AppScan | Used to test web applications for security vulnerabilities during the development process.
Incident reporting | JIRA | Used to report all kinds of incidents and anomalies.
6. HUMAN RESOURCES
Role | Resources | Specific Responsibilities / Comments

Developer | Junior: 4, Regular: 6, Senior: 5 | Develops and tests units.
Responsibilities (all levels):
1. Develop units;
2. Design and create unit tests;
3. Run unit tests.
Senior developers additionally:
4. Integrate units;
5. Consult testers about implementation details.

Project Manager | 2 | Provides oversight of testing on the project.
Responsibilities:
1. Interact with business people;
2. Set the testing strategy;
3. Provide business requirements for testing to the test managers (high-level testing plan);
4. Define the processes used to ensure the quality of the deliverables;
5. Coordinate test managers, system analysts and security specialists;
6. Ensure that deadlines are met;
7. Decide the test budget and schedule;
8. Management reporting;
9. Approve testing results.

Test Manager | 3 | Provides management oversight in a team.
Responsibilities:
1. Help to set the testing strategy;
2. Create and maintain the functional test plan;
3. Identify, prioritize and implement test cases;
4. Prepare test metrics;
5. Provide technical direction;
6. Acquire appropriate resources;
7. Management reporting (test summary report, etc.);
8. Share updates on testing with the project manager.

Test Engineer | 6 | Executes the test plan.
Responsibilities:
1. Participate in test case creation (some engineers);
2. Execute tests;
3. Automate tests (integration, regression);
4. Log results (prepare reports);
5. Document defects;
6. Prepare tasks for developers.

Test Engineer (higher level) | 4 | Ensures that the test data (database) environment and the related processes and assets are managed and maintained.
Responsibilities:
1. Prepare test data;
2. Administer test data;
3. Prepare the database integrity testing plan;
4. Prepare the data flow correctness testing plan;
5. Execute the prepared plans;
6. Provide other teams with test data;
7. Document defects.

Software Architect | 3 | Plans and designs software solutions and advises on non-functional test scenarios.
Responsibilities:
1. Envision, model and provide initial models and designs that can be built upon during development;
2. Lead the technical implementation in a team;
3. Advise project members on any software-related questions;
4. Participate in requirements analysis and the design of non-functional test scenarios.

System Analyst | 3 | Plans system non-functional testing and leads its execution. Ensures the test environment and assets are managed and maintained. Works directly with the security specialists.
Responsibilities:
1. Analyse requirements and design non-functional test scenarios;
2. Administer the test environment;
3. Install and manage worker access to test systems;
4. Develop and conduct performance, load, stress and configuration test designs, cases and procedures;
5. Ensure access to and usability of the tools used for testing.

Security Specialist | 4 | Plans security and recovery testing and leads its execution. Works directly with the system analysts.
Responsibilities:
1. Analyse requirements and design non-functional test scenarios;
2. Develop and conduct recovery test designs, cases and procedures;
3. Prepare the taxonomy for security testing, containing the most common security issues.
7. MILESTONES
Taking into account that 40 employees from the IT department will participate in the project
implementation (including 3 architects, 15 developers, 3 test managers and 10 test engineers)
and that the teams are going to apply the Scrum framework, it was decided to form 3 Scrum
component teams (rather than feature teams) of developers and testers, because of:
1. The diverse technology stack;
2. The developers' wish to specialize;
3. Balanced workload;
4. Component ownership.
In total we are going to have 5 different Scrum teams:
1. Team 1 / Team 2 / Team 3: 1 architect, 5 developers, 1 test manager, 2 test engineers;
2. Team 4: 4 test engineers;
3. Team 5: 4 security specialists and 3 system analysts.
Each of Teams 1-3 is going to be responsible for developing one component at a time and
successfully integrating it with one of the external systems. Team 4, made of test engineers,
will handle database testing, while the security specialists and system analysts in Team 5 are
going to work together on non-functional testing.
Each sprint is planned to take 3 weeks. Before each sprint a so-called "Firefighters" team of
3 employees (1 full-time, 2 on demand) is going to be formed: 2 developers and 1 tester, who
will be responsible for fixing production environment bugs.
[Figure: sprint board layout. A technical backlog and the product backlog feed three boards
(columns: Input, In progress, In review, Testing, Done, Blocked), one per team; the 3 teams are
each made of 1 architect, 5 developers, 1 test manager and 2 test engineers. Separate lanes
exist for the BA / security specialists and the Firefighters team. Before each sprint the
3 architects gather together to discuss next steps.]
Taking into account that teams will be working using sprints, most of the activities listed below
will be repeated in each sprint or after an independent payment hub component is developed and
integrated with related external systems.
No. | Milestone task | When is it done? | Effort
1. | Test plan | Each sprint | 2 days per sprint
2. | Test scenarios & test cases | Each sprint | 1 day per sprint
3. | Testing taxonomies | After integration with an external system | 3 days per sprint
4. | Test development | - | -
5. | Test data | Each sprint | 1-2 days per sprint
6. | Unit / integration / regression / system testing | Each sprint | 4 days per sprint
7. | Acceptance testing | After integration with an external system | 10 days per sprint
8. | Non-functional testing | Each sprint | 5 days per sprint
9. | Data migration testing | After integration with an external system | 15 days per sprint
10. | Test metrics | After integration with an external system | 1 day per sprint
11. | Incident report | Each sprint | 1 day per sprint
12. | Test summary report | After integration with an external system | 2 days per sprint
8. DELIVERABLES
Deliverable | Responsible | Comments
Test plan | Project manager, Test manager | Document which contains the plan for all the testing activities to be done to deliver a quality product.
Test scenarios & test cases | Test manager and test engineers | Document which defines the test conditions and contains high-level and low-level test cases.
Testing taxonomies | Security specialists | Testing taxonomy for security testing, containing the most common security issues.
Test data | Test engineers | Prepared test data which will be used by the testers to run the test cases.
Test metrics | Test manager | Prepared to estimate the progress, quality and health of the software testing effort.
Incident report | Test engineers | Defects found during test execution are reported in JIRA; the incident report is generated using the same tool.
Test summary report | Test manager | Contains the final test results and a summary of the test activities.
9. RISKS
1. Data uploaded into third-party products / tools for testing purposes may cause data leakage;
2. Customers' data may be compromised during the testing phase;
3. Data migration testing may take a lot of time and therefore require repetitive work and
too many resources to produce fake / testing data;
4. Production environment data may not be fully usable for testing the developed
business scenarios;
5. Testers may spend more time than planned communicating with architects, database
administrators and BAs to gather data;
6. Testers may not be able to properly test process orchestration because of the payment hub
product's built-in functionality;
7. Testers may not be able to set up an integration between the testing tools and the payment hub;
8. Testers may not understand how the payment hub works and may not receive proper subject
training;
9. At the beginning of each sprint, testers may have nothing to do.
10. INTRODUCTION TO CONFIGURATION MANAGEMENT
This part of the document contains the software configuration management plan for the
implementation of a new payment hub into the core banking systems, replacing legacy payment
processing components.
Configuration Management (CM) is essentially the process of identifying and assuring the
retention and control of all the various artefacts (documents, source code, executables, e-mail,
etc.) generated during the software development life cycle.
The primary objective of the Configuration Management process is to establish and manage
baselines of product releases.
In software engineering, a baseline is defined as: "A set of requirements,
design, source code files and the associated executable code, build files, and user documentation
(associated entities) that have been assigned a unique identifier can be considered to be a
baseline".
11. REFERENCE DOCUMENTS
Our software configuration management plan references various standards and policies
provided by the Institute of Electrical and Electronics Engineers (IEEE).
1. IEEE 828 - Standard for Software Configuration Management Plans
1.1. establishes the minimum required content of an SCMP;
1.2. specifies which activities are required;
1.3. states requirements for any part of the product's life cycle.
2. IEEE 1042 - Guide to Software Configuration Management
2.1. provides basic guidelines for planning an SCMP;
2.2. guidelines are compatible with IEEE 828.
12. MANAGEMENT
Organization
Since we did not create dedicated positions for configuration management roles, we will
allocate the time required for them to a few people from the developers, test managers and test
engineers groups.
For this purpose, we will introduce some new informal roles in our team. We augment with:
1. Configuration manager - from the test managers group;
2. Development lead (team lead) - from the developers group;
3. Release manager - from the developers group;
4. QA manager - from the test engineers group.
Responsibilities
Role Responsibilities
Configuration
Manager
1. Educates project team members in SCM “best practices”;
2. Establishes, promotes, and releases baselines;
3. Validates final builds;
4. Prepares release package and version description documents.
Project Manager 1. Ensures we are correctly following the IEEE standard;
2. Checks on the SCM process;
3. Evaluates all other change requests;
4. Identifies dependent projects.
Development Lead
1. Develops and maintains artefacts following proper version
control procedures using the SCM plan;
2. Submits build / release requests;
3. Coordinates development activities and assigns tasks;
4. Ensures all developers are following the SCM plan;
5. Ensures all SCM procedures are implemented and followed.
Release Manager 1. Coordinates the release and deployment of software;
2. Assures products meet all exit criteria prior to release;
3. Assures change control and SCM processes have been followed
as defined.
Developers 1. Develops and maintains artefacts following proper version
control procedures using the SCM plan;
2. Maintain accurate, detailed information for all assigned change
requests in the change request database;
3. Document build, release and installation instructions.
Quality Assurance
manager
1. Responsible for testing installed releases, as SCM provides
releases from development;
2. Updates change requests assigned to them according to test
activity results;
3. Determines pass / fail for each change request scheduled for a
release (and opens new change requests).
13. ACTIVITIES
Configuration identification
We have identified two main types of components that are going to be altered during project
execution: documents and software executables.
13.1.1. Documents
Document items are assigned unique identifiers, each representing exactly one project
component, along with the current revision level.
The identifier consists of one to three parts:
1. Acronym;
2. Acronym-class;
3. Acronym-component-class.
• Documents such as policies, standards, process descriptions, or guidance are identified with
an “acronym”. An example is SCMP.
• Components that are project-specific but not associated with any other component in the
software are identified with “acronym-class”. An example is COTSIntegration-SPMP, which
represents the project management plan for the COTS integration project.
• Further, documents about a specific component inside the project are identified as
“acronym-component-class”. For example, the requirements definition for the payment
component would be COTS-PAYM-REQD.
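The identifier scheme above can be checked mechanically. The following is a minimal sketch (the function name and regular expression are illustrative, not part of the plan) that validates an identifier and splits it into its one to three dash-separated parts:

```python
import re

# One- to three-part identifiers: ACRONYM, ACRONYM-CLASS,
# or ACRONYM-COMPONENT-CLASS (parts are alphabetic, dash-separated).
ID_PATTERN = re.compile(r"^[A-Za-z]+(?:-[A-Za-z]+){0,2}$")

def split_identifier(identifier: str) -> list[str]:
    """Validate an identifier and return its dash-separated parts."""
    if not ID_PATTERN.match(identifier):
        raise ValueError(f"Malformed identifier: {identifier!r}")
    return identifier.split("-")

# The three examples from the plan:
print(split_identifier("SCMP"))                  # ['SCMP']
print(split_identifier("COTSIntegration-SPMP"))  # ['COTSIntegration', 'SPMP']
print(split_identifier("COTS-PAYM-REQD"))        # ['COTS', 'PAYM', 'REQD']
```

A check like this could run as a pre-commit hook so that malformed identifiers never enter the repository.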
We also maintain version control for each document. The version consists of two numbers,
for example 2.1, where:
1. The version number changes only when the core architecture of the item changes;
2. The revision number changes when existing content is changed, but the overall structure
and flow remain the same.
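These two rules can be sketched as a small helper. Note one assumption not stated in the plan: a version bump is taken to reset the revision number to 0.

```python
def bump_document_version(version: str, core_changed: bool) -> str:
    """Bump a <version>.<revision> document number per the two rules above."""
    major, revision = (int(part) for part in version.split("."))
    if core_changed:
        # Rule 1: core architecture changed -> new version number
        # (resetting the revision to 0 is our assumption, not the plan's).
        return f"{major + 1}.0"
    # Rule 2: content edited, structure and flow unchanged -> new revision.
    return f"{major}.{revision + 1}"

print(bump_document_version("2.1", core_changed=False))  # 2.2
print(bump_document_version("2.1", core_changed=True))   # 3.0
```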
13.1.2. Software executables
Software executables are identified by name and version number. The name convention for each
software item is defined by the development team.
The version consists of three components, for example 2.1a, where:
1. The version number changes when the core architecture of the software item or the user
interface changes;
2. The revision number changes when a new feature is added, e.g. additional functionality or
any other kind of new content in the executable file;
3. The update character is added when defects in the software item are fixed, without adding
new functionality / modules.
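A parser for this three-component scheme might look as follows (a sketch; the function name and regular expression are ours). Because Python tuples compare element-wise, the parsed form also gives a natural ordering of versions:

```python
import re

# <version>.<revision><optional update character>, e.g. "2.1a" or "3.0".
VERSION_RE = re.compile(r"^(\d+)\.(\d+)([a-z]?)$")

def parse_software_version(version: str) -> tuple[int, int, str]:
    """Split e.g. '2.1a' into (version, revision, update character)."""
    match = VERSION_RE.match(version)
    if not match:
        raise ValueError(f"Malformed version: {version!r}")
    return int(match.group(1)), int(match.group(2)), match.group(3)

print(parse_software_version("2.1a"))  # (2, 1, 'a')
print(parse_software_version("3.0"))   # (3, 0, '')
```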
13.1.3. Source code items
These items are identified by the configuration management tool we will use for this
project; the tools are described in the last section (Resources). Numbering / identification
of source code items is therefore in the tool's domain.
Configuration change control
The change control process covers the verification, approval, and execution of fixes for
issues identified during development or production operations.
The process comprises four stages:
1. Discovery (new issues identified during production or the development lifecycle are
entered into the workflow and validated);
2. Analysis (the development team creates or modifies the documentation or code to fix
the issue);
3. Authorization (the configuration manager assigns the verified issue to a specific
iteration of the software development cycle);
4. Iteration (the fix is implemented during the software development iteration, including
any documentation changes).
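Since a change request moves through these four stages strictly in order, the workflow can be modelled as a tiny linear state machine (names below are illustrative):

```python
from enum import Enum, auto

class Stage(Enum):
    DISCOVERY = auto()      # issue identified and validated
    ANALYSIS = auto()       # fix or documentation drafted
    AUTHORIZATION = auto()  # configuration manager assigns an iteration
    ITERATION = auto()      # fix implemented during the iteration

# Allowed forward transitions of a change request (linear workflow).
NEXT_STAGE = {
    Stage.DISCOVERY: Stage.ANALYSIS,
    Stage.ANALYSIS: Stage.AUTHORIZATION,
    Stage.AUTHORIZATION: Stage.ITERATION,
}

def advance(stage: Stage) -> Stage:
    """Move a change request to its next stage; fail past the last one."""
    if stage not in NEXT_STAGE:
        raise ValueError(f"{stage.name} is the final stage")
    return NEXT_STAGE[stage]
```

Encoding the transitions in one table makes it easy for an issue tracker such as Jira to reject out-of-order stage changes.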
Configuration storage
Each configuration item we mention has different storage requirements.
Archive items are usually easy to manage; evolving items and source code items require
more care.
13.3.1. Archive items
Items that are suitable for archiving are stored in appropriate directories for the project.
Directories are available on a central file server. These items are later on divided into more
topical subdirectories.
13.3.2. Evolving items
A copy of each item is stored on the project web server, with a link from the project home
page to the item. If a new item is a revision or update of an existing item, it replaces the
old one on the web server under the same URL.
This way we can be confident that every project member has access to the
latest version of each item.
All versions of each item are also stored in the configuration management tool. It is described in
section 14.
13.3.3. Source code items
These items are created by the development team and are directly connected with the
configuration management tool. The tool handles:
1. Storage;
2. Identification;
3. Versioning;
4. Check in / check out operations.
Each member of our team can add new files to our source code management tool.
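To make the four responsibilities above concrete, here is a toy in-memory sketch of storage, versioning, and check-in / check-out locking. All names are illustrative; in practice the real tool (GitLab) provides this:

```python
class VersionStore:
    """Toy stand-in for the tool's storage / versioning / check-out locking."""

    def __init__(self) -> None:
        self._versions: dict[str, list[str]] = {}  # path -> version history
        self._checked_out: set[str] = set()        # paths locked for editing

    def check_out(self, path: str) -> str:
        """Lock an item for editing and return its latest stored content."""
        if path in self._checked_out:
            raise RuntimeError(f"{path} is already checked out")
        self._checked_out.add(path)
        history = self._versions.get(path, [])
        return history[-1] if history else ""

    def check_in(self, path: str, content: str) -> int:
        """Release the lock, store a new revision, return its number."""
        self._checked_out.discard(path)
        self._versions.setdefault(path, []).append(content)
        return len(self._versions[path])
```

The lock-on-check-out model shown here is the pessimistic style; Git-based tools like GitLab instead allow concurrent edits and reconcile them at merge time.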
Configuration audits and reviews
The revision management process governs the creation or modification of any configuration
item, as well as builds of one or more configuration items. Builds that pass all reviews
(tests) become the new baseline.
The process is separated into three stages:
1. Informal iteration process (new configuration items are created and checked into
our configuration management tool);
2. Formal iteration process (configuration items are finalized for review and
prepared for the stage's release build);
3. In-stage assessment process (the review is performed and the build is conducted; the
build is then either rejected for re-work or accepted and becomes the new baseline).
14. RESOURCES
Items and their management tools:
• Documents and software code: GitLab
• Builds: Jenkins
• Deploying the product to the next environment after a build: Jenkins
• Tasks and errors: Jira
When a change is made, it is stored separately from the published code.
When a release is made, the changes are merged into the general database structure code.
The configuration of the environment will be done through the database scripts, using different
data for each environment.
The system will use different configuration files at start-up (for example, app.config).
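As a sketch of per-environment configuration selection (the environment names, setting keys, and the APP_ENV variable below are our assumptions, not part of the plan):

```python
import configparser
import os

# Hypothetical per-environment configuration contents; in practice these
# would live in separate files (e.g. app.dev.config / app.prod.config).
CONFIGS = {
    "dev":  "[database]\nhost = localhost\n",
    "prod": "[database]\nhost = db.internal.example\n",
}

def load_config(environment: str) -> configparser.ConfigParser:
    """Pick the configuration matching the current environment."""
    parser = configparser.ConfigParser()
    parser.read_string(CONFIGS[environment])
    return parser

# The environment is chosen at start-up, e.g. from an environment variable.
env = os.environ.get("APP_ENV", "dev")
cfg = load_config(env)
print(cfg["database"]["host"])
```

Keeping environment-specific values out of the code itself means the same build artifact can be promoted unchanged from one environment to the next.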
GitLab is a single application for the whole DevOps lifecycle: it includes not only
configuration management but also capabilities for project management, source code
management, CI/CD, and monitoring.