IPA/SEC White Paper 2007 on
Software Development Projects in Japan
1,770 projects of IT companies.
Development trends revealed by quantitative analysis
IPA/SEC Software Engineering Center
Information-Technology Promotion Agency, Japan
Edited and translated by: Software Engineering Center, Information-Technology Promotion Agency, Japan (IPA/SEC)
© 2010, IPA/SEC. All rights reserved.
For more information on this White Paper, including corrections and amendments, visit the Web site of IPA/SEC at the following URL:
http://sec.ipa.go.jp/

Trademarks
* Microsoft Office Excel is a trademark of Microsoft Corporation. For more information on the product, consult the corporation.
* SPSS is a trademark of SPSS Inc. For more information on the SPSS product, consult the company.
* Product names in this White Paper are used for identification purposes only. Product names other than those listed above also appear in this White Paper.
Preface

About the IPA/SEC White Paper 2007 on Software Development Projects in Japan
Since fiscal 2004, the Software Engineering Center (SEC) has been defining data items for software development projects and collecting, analyzing, and applying the resulting data. SEC publishes the "IPA/SEC White Paper 20xx on Software Development Projects in Japan" every fiscal year to present the results of the data analysis conducted in the preceding fiscal year. The White Paper 2007 is the third issue of the White Paper.
With the cooperation of twenty companies that offered their project data, SEC obtained data on nearly 400 additional projects, bringing the total number of target projects to 1,774. SEC defined the major project data items and has been collecting them under the policy that its data analysis should provide reference data with distributions useful to many companies. Although minor modifications are made to the data item definitions each year, SEC maintains the fundamental structure of the basic data items. The White Paper 2007 was published after intensive inspection of the project data.
The three issues of the White Paper cover the following topics.
• White Paper 2005: Presents the framework for cross-company data collection and analysis and the results of the initial analysis.
• White Paper 2006: Presents the results of analysis of the relationships among major development-project elements (such as size, effort, development schedule, and productivity) and of the relationship between the planned and actual data.
• White Paper 2007: Presents the results of analysis of additional items: a new project type, Enhancement (which in this document covers both Maintenance/Support and Enhancement), and a new business type, Public Service. This issue also adds the results of an analysis of reliability based on the number of defects identified after system cutover.
Structure of this White Paper
The White Paper 2007 consists of the following chapters.
Chapter 1: Scope and objectives.
Chapter 2: Data collection policy and the yearly count of target projects.
Chapter 3: Basic data sampling policy and basic analysis procedure.
Chapter 4: Profiles of every analyzed data item.
Chapter 5: Basic distribution of project size, development schedule, and number of staff per month.
Chapter 6: Stratified analysis of the relationships among project size, effort, development schedule, productivity, number of staff, and others.
Chapter 7: Reliability data analyzed with various strata.
Chapter 8: Distribution of project data with respect to the phases from basic design to acceptance test.
Chapter 9: Analysis of the relationship between planned and actual data about size, effort, and development schedule, and thematic analysis of productivity characteristics.
In chapters 5 through 8, projects are categorized into three types: Development, Maintenance/Support, and Enhancement. The analysis of Enhancement projects presented in chapters 5 through 8 and the thematic analysis of productivity characteristics in chapter 9 are completely new topics.
We anticipate that this White Paper will promote the application of quantitative data about software development and help the many people engaged in managing high-quality and efficient software development.
Contents
Preface ..... 3
Contents ..... 4
1 Background and Objectives ..... 6
2 Data Collection ..... 8
  2.1 Data Collection Policy ..... 8
  2.2 Data Offering Status ..... 10
3 Data Analysis ..... 13
  3.1 Analysis Policy ..... 13
  3.2 Analysis Conventions ..... 15
  3.3 Analysis Guidelines ..... 17
4 Profiles of Collected Data ..... 21
  4.1 Data Adoption Criteria and Format ..... 21
  4.2 General Characteristics of Development Projects ..... 22
  4.3 Project Applications ..... 24
  4.4 System Characteristics ..... 26
  4.5 Development Techniques ..... 32
  4.6 User Requirement Management ..... 34
  4.7 Development Staff Skills and Experiences ..... 36
  4.8 Size ..... 38
  4.9 Development Schedule ..... 41
  4.10 Effort ..... 43
  4.11 Personnel Assignment ..... 50
  4.12 Reliability ..... 53
  4.13 Development Process Phase Combinations ..... 56
  4.14 Project Evaluation ..... 57
5 Statistics of Major Project Elements ..... 59
  5.1 Adoption Conventions for Chapter 5 ..... 59
  5.2 FP Size ..... 62
  5.3 SLOC Size ..... 71
  5.4 Development Schedule ..... 82
  5.5 Effort ..... 91
  5.6 Number of Staff per Month ..... 104
6 Analysis of the Relationship among Effort, Development Schedule, and Size ..... 111
  6.1 Scope of This Chapter ..... 111
  6.2 Distribution of Major Factors ..... 113
  6.3 Effort and Development Schedule ..... 114
  6.4 FP Size and Effort ..... 121
  6.5 FP_Productivity ..... 132
  6.6 SLOC Size and Effort ..... 154
  6.7 SLOC_Productivity ..... 173
  6.8 Relationship Between the FP Size and SLOC Size ..... 193
7 Analysis of Reliability ..... 194
  7.1 Scope of This Chapter ..... 194
  7.2 FP Size and Number_of_Identified_Defects ..... 196
  7.3 FP Size and Identified Defect Density ..... 200
  7.4 SLOC Size and Number_of_Identified_Defects ..... 208
  7.5 SLOC Size and SLOC_Identified_Defect_Density ..... 212
8 Development-Phase-Based Analysis ..... 224
  8.1 Development-Phase-Based Analysis of Effort and Development Schedule ..... 224
  8.2 Number of Issues Pointed Out in Reviews ..... 228
  8.3 Test-Phase-Based Test Cases and Identified Software Failures ..... 229
9 Estimates-Results Analysis and Productivity Cross-Analysis ..... 241
  9.1 Estimates-Results Analysis ..... 241
  9.2 Productivity Analysis ..... 246
10 Postscript ..... 251
Appendix ..... 253
  A: Data Item Definitions ..... 254
    A.1 Mapping Between Phase Names and SLCP ..... 254
    A.2 Data Item Definitions Version 2.3 ..... 255
    A.3 Industry Classification ..... 271
    A.4 Derived Indicators Names and Definitions ..... 272
  B: Data Entry Form Version 2.3 ..... 275
  C: Per-Data-Item Reply Status ..... 278
  D: Glossary ..... 288
  E: References ..... 290
    E.1 Reference Materials ..... 290
    E.2 Reference Information ..... 291
List of Figures and Tables ..... 292
1 Background and Objectives
Background of IT Industry
Information technology (IT) has been playing a critical role in the remarkable progress of science, technology, and industry. Proliferating into almost every area, IT is now indispensable for our economic activities and daily life. As market competition intensifies, IT systems are required to provide more multi-functional capabilities and higher performance. This trend indicates that IT will play an even wider role in the foreseeable future.
IT systems, on the other hand, have been suffering from never-ending malfunctions caused by software bugs, which can have fatal consequences for human life. One of the major causes of software defects is the increasing burden on the programmers and system engineers engaged in the development, maintenance, and operation of the core software of IT systems. These programmers and engineers have to develop and implement system requirements in ever shorter time, which is another source of system malfunctions.
To tackle these problems, knowledge and expertise have to be concentrated industry-wide so that they provide a knowledge base available to everyone. In essence, what is urgently needed now are the technologies, processes, and experts that produce safe, reliable software.
Objectives
Against this background, IPA/SEC is standardizing software development processes, establishing rules for their quantitative measurement, and promoting these standardization and measurement rules. Process standardization is well established in other areas, including the manufacturing industry. Since its establishment in October 2004, SEC has been collecting, examining, and analyzing process data, as well as defining quantitative data items, systematizing the analyzed data step by step to make it beneficial for anyone engaged in software development.
One objective of this White Paper is "to publish the results of yearly statistical analysis, which serves as a kind of measurement standard whose precision improves year by year". Another objective is "to publish new standards and new viewpoints found in extended analysis of particular subjects". By making these efforts continuously, SEC expects the results to help the many people who are trying to reduce software defects and who have to make problem-solving decisions.
With many background factors to take into account, it is not easy to derive a standard from quantitative data quickly. We ask the reader to appreciate this difficulty in standardization.
Remarks
Successful project completion requires a faithful partnership between the user and the vendor. Successful project management requires that development staff be able to know the state of the software development processes whenever necessary, so that they can make accurate predictions and take effective actions. The state of the processes has to be measured and presented quantitatively to let the staff make decisions objectively.
This White Paper provides basic quantitative standards (hereafter, metrics) for software development processes, such as the relationship among size, effort, and development schedule, and that between effort and development schedule. Readers are encouraged to collect their own data and to make that data more accurate. Once the necessary data are collected, the reader can use this White Paper as a reference when making decisions or predictions.
Basic information presented in this White Paper
Statistics of software development primary data (profile)
The profile depicts the distribution of the primary actual data (including the degree of variance and the median) about software development. The primary data include size, development schedule, effort, reliability, and personnel assignment. In addition to actual data, the White Papers published in 2006 and later present planned values for size, effort, and development schedule. The profile is the result of analysis of the collected project data.

Software development analysis
The results of software development analysis show the relationships among major elements including size, effort, development schedule, reliability, and personnel assignment. The results provide standard information for software development planning. The information is applicable to the early stages of the software development process as well as to later ones. Many companies offered data about the software development projects presented in this White Paper, and in most cases the companies mapped the data to the data definitions made by SEC. In some cases, however, the analysis includes data prepared with measurement methods proprietary to the companies. Note that the software development analysis in this White Paper can only show trends in the data.

Collected data items
The collected data items are listed in the Data Item Definition and the Data Entry Form in Appendix A and Appendix B, respectively. The definitions and the form let companies apply the same standard to data collection and analysis when improving their quantitative management of software development projects.
Expected readers
Executives of user and vendor companies
This White Paper provides basic information on software development, including size, development schedule, effort, and reliability, so that the information serves as a foundation upon which users and vendors can share the same viewpoint on many aspects of software development. User companies can use the information as fundamental data on the management resources needed for their software development projects. Vendor companies are encouraged to use the basic information to achieve successful completion of their projects.

Managers of business divisions and IT system divisions
Managers of business divisions and IT system divisions are encouraged to use the basic information presented in this White Paper to lead software development staff to adopt the quantitative analysis approach, including data collection, quantitative management, and accuracy improvement, and to achieve successful completion of their projects.

Project managers (PMs) and project leaders (PLs)
To complete a software development project successfully, it is necessary to promote project management based on quantitative data. To make project management successful, project managers and project leaders should measure project characteristics quantitatively in each development process, estimate or predict size, effort, development schedule, and quality, and take the necessary actions. Project managers and project leaders can evaluate their project data by referring to this White Paper.

Project management office (PMO) staff and quality assurance (QA) staff
Project management office staff and quality assurance staff are strongly encouraged to use this White Paper, especially the Data Item Definition and the Data Entry Form in Appendix A and Appendix B, as a reference when building databases of quantitative data about their projects or when benchmarking the projects.
2 Data Collection
This chapter describes the policy applied to collection of the data published in this White Paper, and presents associated information including the amount of data and when the data was collected.
This White Paper presents data on 1,774 software development projects offered by twenty companies in Japan. Most data items were actually measured; for the primary data items (size, development schedule, and effort), the actual measurements are accompanied by the estimates proposed in the planning phase.
The analyzed projects developed applications or other kinds of systems for general-purpose computers, meaning computers that do not run embedded software. Based on the collected data, Chapter 4 presents the distribution of project characteristics (including type of development, type of industry, type of business, architecture, and programming language), system size, development schedule, effort, and others.
2.1 Data Collection Policy

2.1.1 Basic Policy
Data collection in 2006 expanded the major data items defined for the White Paper 2006 by adding two project types, "Maintenance/Support" and "Enhancement". A new business type, "Public Service", was also added. To obtain effective reliability data, data collection in 2006 adopted a policy of encouraging companies to report to SEC the number of defects identified after system cutover. The data items and definitions used for the data collection in 2006 are almost the same as those in the White Paper 2006.
Major data items collected in 2006 are listed below.
• Type of software development project: Development, Maintenance/Support, and Enhancement.
• Architecture: intranet/Internet, 2-layer client/server, 3-layer client/server.
• Type of business: finance/insurance, information, manufacturing, public service, etc.
• Programming language: Java, VB, C, COBOL, C++, etc.
• Platform: Windows, Unix.
• Size index: Function Point (FP) or SLOC (Source Lines of Code). (Mandatory item)
• Number of defects identified after system cutover. (Mandatory item)
• Effort, size, and development schedule. (Actual data mandatory)
• Major-development phases from basic design to system test. (Mandatory) The five phases are necessary for equal-condition comparison between different projects.
• Priority was given to projects completed in the period between April 2003 and June 2006, in favor of recent projects.
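As a rough illustration only (not part of the White Paper), the major items above could be modeled as a single project record. All field names here are hypothetical stand-ins; the official item definitions are those in Appendix A.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProjectRecord:
    # Hypothetical field names sketching the major collected items.
    project_type: str           # "Development", "Maintenance/Support", or "Enhancement"
    business_type: str          # e.g. "finance/insurance", "public service"
    language: str               # e.g. "Java", "COBOL"
    platform: str               # e.g. "Windows", "Unix"
    fp_size: Optional[float]    # Function Points (either FP or SLOC is mandatory)
    sloc_size: Optional[int]    # Source Lines of Code
    defects_after_cutover: int  # mandatory reliability item
    actual_effort_ph: float     # actual effort, person-hours (actual data mandatory)
    actual_schedule_months: float

# An invented example record.
rec = ProjectRecord("Enhancement", "public service", "Java", "Unix",
                    350.0, None, 2, 2800.0, 6.5)
print(rec.project_type, rec.defects_after_cutover)
```

A record like this makes the "mandatory" constraints easy to check mechanically, e.g. that at least one of `fp_size` and `sloc_size` is present.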
2.1.2 Collection Method
Using the Data Item Definition in Appendix A and the Data Entry Form in Appendix B, SEC collected actual project data in the period from August to November 2006. As described in Appendix C, SEC gave collection priority to about 80 data items at two priority levels: "Mandatory" and "Critical". Appendix C presents the reply ratio for each data item.
Data Item Definition Version 2.3
Appendix A presents Data Item Definition Version 2.3, which made improvements to the definition descriptions written in Data Item Definition Version 2.0 in Appendix A of the White Paper 2006 while keeping basic definitions unchanged.
Data Entry Form Version 2.3
Appendix B presents Data Entry Form Version 2.3, which was used for data collection. SEC created the entry form as its own tool using Microsoft Office Excel.

2.1.3 Data Inspection
SEC made the following efforts to keep the accuracy of the data as high as possible.
• SEC accepted project data that passed inspection or reviews by the quality assurance division or production control division of the data-offering companies. Since accurate FP size data was needed, SEC mainly used FP size data that was measured by company staff who had received measurement method lectures from the measurement support group of the data-offering companies, or data that passed reviews by that group.
• SEC inspected the offered data to find abnormal values or mistyped data. On finding such errors, SEC asked the data-offering company to examine them. This process was repeated until the offered data was found to be error-free and ready for analysis.
2.2 Data Offering Status

2.2.1 Companies That Offered Project Data
Companies that offered their project data to SEC are listed below.
Argo21 Corporation
Toshiba Information Systems Corporation
NEC Soft, Ltd.
Nihon Unisys Ltd.
NTT Software Corporation
Nomura Research Institute Ltd.
NTT Data Corporation
Hitachi Systems & Services, Ltd.
Oki Software Co., Ltd.
Hitachi Ltd.
Oki Electric Industry Co., Ltd.
Hitachi Software Engineering Co., Ltd.
KOZO KEIKAKU ENGINEERING Inc.
Fujitsu Ltd.
CSK SYSTEMS CORP.
Matsushita Electric Industrial Co., Ltd.
NS Solutions Corporation
Mitsubishi Electric Information Systems Corporation
TIS Inc.
Ricoh Software Inc.
2.2.2 Data Volume
The following graph illustrates the relationship between the number of projects whose data was offered and the number of companies that offered the data.

Figure 2-2-1 Data Volume and Company Count
2.2.3 Data-Updated Project Count per Fiscal Year
Figure 2-2-2 illustrates how much project data was updated in each fiscal year. The project data offered in fiscal 2005 and fiscal 2006 follow SEC's data collection policy: the data originate largely from projects in recent years and clearly state the beginning date and completion date of every project.
Data on 356 projects were updated or newly added in fiscal 2006. These 356 projects make up 27% of the 1,314 projects whose beginning and completion dates are clearly known.
[Figure 2-2-1 is a histogram: x-axis, Number of Projects (largest bin "Over 275"); y-axis, Number of Companies.]
Figure 2-2-2 Data-Updated Project Count per Fiscal Year
N = 1,774
* Figure description:
2006: Projects whose data was newly added in fiscal 2006.
2005: Projects whose data was newly added in fiscal 2005 or was updated and offered again after the first offer made in March 2005 or before.
2004: Projects whose data was offered in March 2005 or before and was not updated in fiscal 2005.
2.2.4 Per-Beginning-Year and Per-Completion-Year Cross-Total
Figure 2-2-3 illustrates the number of projects on a per-beginning-year and a per-completion-year basis for each fiscal year in which the project data was updated, that is, offered. Tables 2-2-4 and 2-2-5 show the cross-totals between the project beginning year and the data update fiscal year, and those between the project completion year and the data update fiscal year.
As much as 65% of the 1,314 projects whose beginning and completion dates are clearly known began in 2003 or later, and the projects completed in 2003 or later make up 74%. This shows that data about recent projects make up a major portion of the whole. The projects that began in 2002 or later make up 78%, and those that began in 2001 or later make up 87%. The projects completed in 2002 or later make up 85%, and those completed in 2001 or later make up 94%. The projects completed in 2004 make up the largest portion, 28%, followed by those completed in 2005 at about 22%.

Figure 2-2-3 Per-Beginning-Year and Per-Completion-Year Project Count
Table 2-2-4 Per-Beginning-Year and Per-Completion-Year Cross-Total
Project count on a per-beginning-year and per-completion-year basis, by data update fiscal year

Data update    Beginning/          1995 1996 1997 1998 1999 2000 2001 2002 2003 2004 2005 2006   Total
fiscal year    Completion
2004           Beginning year         1    3    0   16   31   62   76  106  157   86    2    0     540
2004           Completion year        0    3    1    9   21   40   67  102  104  191    2    0     540
2005           Beginning year         0    1    0    3    7   43   44   43   90  161   27    0     419
2005           Completion year        0    0    0    0    2    9   44   50   38  148  127    0     418
2006           Beginning year         0    0    0    0    0    2    4   23   21  100  199    7     356
2006           Completion year        0    0    0    0    0    0    0    0   26   26  157  147     356
[Figure 2-2-3 consists of two bar charts: x-axes, Beginning year and Completion year; y-axes, Number of projects; series, Data update fiscal year.]
* Sample count (beginning year): 5149_Beginning Date (actual), all projects (N = 1,315; no reply: 459)
* Sample count (completion year): 5158_Completion Date (actual), all projects (N = 1,314; no reply: 460)
Table 2-2-5 Per-Beginning-Year and Per-Completion-Year Project Count
[Table 2-2-5 consists of two panels: project count on a per-beginning-year basis and on a per-completion-year basis, each broken down by data update (offering) fiscal year.]
3 Data Analysis
This chapter describes the data analysis policy, the data filtering criteria, and the conventions for presenting the results. Chapter 4 and later chapters mainly present the results of analysis in accordance with the data filtering criteria. Useful knowledge obtained during analysis is presented in the relevant sections as notes.
3.1 Analysis Policy
The analysis for this White Paper was conducted under an approach focusing on "typical elements that serve as a foundation upon which users and vendors can share the same viewpoint on many aspects of software development". The approach also aims to clarify the relationships among the elements.

3.1.1 Analysis Scheme for Fiscal 2006
Figure 3-1-1 illustrates the major characteristic elements (shown as ovals) of software development projects and the relationships among them (shown as arrows). Elements that affect the projects are categorized into two groups (shown as two rectangles): "constraints on systematization" and "requirements for systematization". The two element groups are the key to successful project completion. To know how this key works, we have to make elaborate efforts to collect various kinds of project data and analyze the relationships among the elements.

Figure 3-1-1 Characteristic Elements and Their Mutual Relationship
[Figure 3-1-1 depicts elements such as size, effort, cost, price, development schedule, productivity, reliability, required quality, actual quality, technology, staff (expertise, experience), personnel assignment, number of persons, outsourcing, environment, training/education, development standard, measurement, analysis, improvement, preciseness of requirements, policy, value, development method, architecture, platform, review, test, and degree of satisfaction, with the end user and vendor, grouped under the rectangles "constraints on systematization" and "requirements for systematization".]
3.1.2 Analysis Procedure
This section describes the analysis procedure. Each of the following chapters describes the details of the procedure in its opening paragraph.
(1) Inspect each set of collected data to eliminate erroneous data that has one or more missing data items or inconsistent data values. Note that "erroneous" here does not mean that a data value is out of the acceptable range; data whose values are out of the acceptable range are labeled as outliers, as described later. If, for example, a data set lacks some project profile data or has inconsistent sums of data values, the data set is considered erroneous. On finding an erroneous data set offered by a company, SEC makes every possible attempt to ask the company to examine and correct the data set and send back an error-free version.
(2) Use scattergrams to understand the distribution of values of every data item and the relationships among variables. What is important at this stage is to extract the "inherent trends of the data values" from the scattergrams. Do not draw regression lines on the scattergrams or jump to an impetuous conclusion.
(3) Reveal the distribution of the size, effort, productivity, development schedule, and reliability (the state of quality, represented by the number of defects identified after system cutover). Analyze finer elements, such as architecture and platform, as necessary.
(4) Analyze the relationships among the characteristic elements illustrated in Figure 3-1-1.
(5) "Stratify" the data, taking other project characteristics into account, to make a detailed analysis. For example, analyze vendor-side factors such as organization, project staff, personnel assignment, and environment.
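As a rough illustration only, steps (1), (3), and (5) of such a procedure might be sketched as follows. The record fields and values are invented, and this is not SEC's actual tooling (the White Paper's analysis was prepared with tools such as Microsoft Office Excel and SPSS).

```python
from statistics import median

# Invented example records; a missing item marks a record as erroneous.
projects = [
    {"language": "Java",  "fp_size": 500,  "effort_ph": 4000},
    {"language": "COBOL", "fp_size": 1200, "effort_ph": 9000},
    {"language": "Java",  "fp_size": None, "effort_ph": 2500},  # missing data item
    {"language": "COBOL", "fp_size": 800,  "effort_ph": 6100},
]

# (1) Eliminate records with missing data items.
clean = [p for p in projects if all(v is not None for v in p.values())]

# (3) Examine the distribution of a primary element, e.g. the median FP size.
fp_median = median(p["fp_size"] for p in clean)

# (5) Stratify the remaining records by a project characteristic,
#     here the primary programming language.
strata = {}
for p in clean:
    strata.setdefault(p["language"], []).append(p)

print(len(clean), fp_median, sorted(strata))
```

In a real pipeline, step (2) (scattergram inspection) and outlier labeling would sit between these stages; the point of the sketch is only the order of operations: clean first, then describe, then stratify.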
3.2 Analysis Conventions
This section describes the data sampling conventions and the data item conventions.
3.2.1 Data Sampling Conventions
The following conventions are applied to data sampling.
(1) Comparison under equal conditions requires that every compared data set contain data about all the Major-development phases. For equal-condition comparison, sample projects that went through the Major-development phases from basic design to system test. (Refer to 5-phase type 1 and 5-phase type 2 listed in Table 4-13-1, Development Phase Combinations.)
(2) Analysis of effort data requires that the effort be measured for each of the Major-development phases.
(3) Analysis of size data requires that the measurement method be precisely known. FP-based size analysis requires that both "701_actual FP size measurement method" and "10124_purity of actual FP size measurement method" be known. SLOC-based size analysis requires that the name of "primary programming language 1" be clearly known.
3.2.2 Data Item Conventions
This section describes conventions applied to data items. Data analysis assumes that the data item conventions are fully observed. For detailed definitions of data items and derived items, refer to Appendix A.
Project type
• Group projects whose type is "Maintenance/Support" or "Enhancement" together as projects of the "Enhancement" type.
• Handle projects of the "Development" type and those of the "Redevelopment" type as distinct types. With only a small amount of "Redevelopment" project data available, the White Paper 2007 does not present data about "Redevelopment" projects.
• The "all project type" category refers to any of the following types: "Development", "Enhancement" ("Maintenance/Support" and "Enhancement"), and "Redevelopment".
FP size
• The functional size is represented in function points (FPs) in many cases. Several FP measurement methods are known; the functional size measured by any of these methods is referred to as the “FP size”.
• Use the unadjusted functional size only. The adjusted functional size is yielded through an adjustment process that varies from method to method. The definition of “unadjusted functional size” follows JIS X 0135-1:1999.
• For a system whose functionality is improved by a Maintenance/Support or Enhancement project, the functional size refers to the size of the added functionality only, excluding that of the base system to which the functionality was added. To follow this convention, eliminate every project whose functional size includes an unknown functional size of the base system.
• The following FP measurement methods are categorized as the “IFPUG group”: the IFPUG method, the NESMA estimated (FP count) method, and the SPR method. Depending on the purpose of analysis, the “IFPUG group” is used when two or more kinds of FP measurement methods coexist in the same project data set. The following conditions must be satisfied for FP measurement methods to be grouped together.
(a) The definition of the measurement method is open to the public.
<Examples of open methods>
- IFPUG method: ISO/IEC 20926:2003 Software engineering -- IFPUG 4.1 Unadjusted functional size measurement method -- Counting practices manual
- NESMA estimated (FP count) method: ISO/IEC 24570:2005 Software engineering -- NESMA functional size measurement method version 2.1 -- Definitions and counting guidelines for the application of Function Point Analysis
- SPR method: Capers Jones, pp. 82-88 and Appendix B of “Applied Software Measurement: Assuring Productivity and Quality, Second Edition”
(b) The same software model is used. The NESMA estimated (FP count) method and the SPR method are simplified versions of the IFPUG method. This means that these three methods share the same software model. Refer to the “SEC Journal No. 5” pp.36-43 for more information on the software model.
16 IPA/SEC White Paper 2007 on Software Development Projects in Japan
• FP size values are expressed in the smallest unit, FP. In some cases, “KFP” is used to represent 1,000 FPs.
SLOC size
• SLOC stands for Source Lines of Code, the number of source lines of program code.
• The “SLOC size of an Enhancement project” is indicated by “SLOC size (enhancement)”.
• The SLOC size excludes comment lines and blank lines.
• If the SLOC size in the offered data includes comment lines or blank lines, use the ratio of comment lines or blank lines included in the data to calculate their number, and subtract it from that SLOC size to obtain the net SLOC size.
• The unit “KSLOC” is used to represent the number of source code lines in multiples of one thousand SLOCs.
• If types of programming languages have to be stratified in terms of SLOC size, use the “primary programming language 1”.
• Primary programming languages commonly used are referred to as the “primary programming language group”.
• Analysis based on the SLOC size requires that the types of programming languages are presented as the conditions under which the analysis was made.
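As an illustrative sketch of the net-SLOC convention above (the function name and ratio parameters are ours, not defined by SEC):

```python
def net_sloc(reported_sloc, comment_ratio=0.0, blank_ratio=0.0):
    """Return the net SLOC size: the reported count minus the comment
    and blank lines estimated from their ratios in the offered data."""
    removed = round(reported_sloc * (comment_ratio + blank_ratio))
    return reported_sloc - removed

# A project reporting 50,000 lines, of which 20% are comments and 5% blanks:
print(net_sloc(50_000, comment_ratio=0.20, blank_ratio=0.05))  # 37500
```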
Effort
• The effort used for the project data analysis is the sum of “in-house effort” and “outsourced effort”. The in-house effort consists of the effort of “development”, “management”, and “every other kind” of in-house task, including “tasks that do not belong to any defined type”. If conversion of effort into person-months is necessary, use the conversion coefficient included in the offered project data. Conversion of effort into person-hours uses a coefficient of 160 hours per month, obtained by multiplying the legal working hours per day (8 hours, provided by the Labor Standards Act) by the average net working days per month (20 days). Note that the White Paper 2006 used 165 hours per month for the same purpose.
• If a project went through all Major-development phases from basic design to system test, effort used for analysis of the project is the sum of effort consumed in the “five phases” and effort “that does not belong to any defined type”.
• The total amount of project effort may be used depending on the purpose of analysis.
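A minimal sketch of these conversion rules (the helper names are ours; the White Paper defines only the 160 hours/month coefficient):

```python
HOURS_PER_MONTH = 160  # 8 legal working hours/day x 20 net working days/month
                       # (the White Paper 2006 used 165 hours/month instead)

def to_person_months(person_hours, hours_per_month=HOURS_PER_MONTH):
    """Convert effort in person-hours to person-months."""
    return person_hours / hours_per_month

def to_person_hours(person_months, hours_per_month=HOURS_PER_MONTH):
    """Convert effort in person-months to person-hours."""
    return person_months * hours_per_month

def staff_per_month(effort_person_hours, schedule_months,
                    hours_per_month=HOURS_PER_MONTH):
    """Average staffing level: effort / months / conversion coefficient
    (the "number of staff per month" derived indicator of Section 3.2.2)."""
    return effort_person_hours / schedule_months / hours_per_month

print(to_person_months(1600))    # 10.0 person-months
print(staff_per_month(3200, 4))  # 5.0 persons
```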
Development schedule
• The development schedule spans from the beginning of the basic design phase to the end of the system test phase, covering all Major-development phases.
• The overall project schedule may be used depending on the purpose of the analysis.
Number of staff per month
• Number of staff per month = Actual effort (Major-development phases) ÷ Actual months (Major-development phases) ÷ Conversion ratio between person-months and person-hours. The “number of staff” is a derived indicator listed in Appendix A.4.
• Conversion of effort into person-months uses the same method as that for effort conversion.

3.2.3 Other Conventions
Outliers
Project data analysis does not mechanically exclude data values that deviate excessively from the average or from the typical distribution. When outliers are eliminated, the data subject to analysis is published through a process that clarifies the reason for the elimination.

Figure 3-2-1 Example of Outliers
(Histogram: frequency vs. data value intervals)
3. Data Analysis
3.3 Analysis Guidelines
This section describes the analysis guidelines applied to the analysis results presented in this White Paper, including the publishing standard, evaluation criteria, and the basic format of analysis results. The significance of the analysis results is described in each chapter with reference to profile data such as the purpose of analysis, analyzed data, and their characteristics.
Note: Read the analysis results and their profile data together as a complete set, to avoid an oversimplified evaluation that is based only on the numbers.
3.3.1 General
Adoption criteria
• Results are adopted if the number of analyzed samples yielding the results is ten or more.
• Results based on multiple strata are adopted if any of the strata has ten or more samples.
• Similarly, basic statistics or box-and-whisker plots are published if any stratum has ten or more samples.
• Results are published only if the sampled data yielding the results is not biased in favor of a particular company: samples must be provided by three or more companies, and no single company may provide more than 70% of the samples.
• Results that do not satisfy the above criteria may be adopted for special reasons, with relevant remarks.
Unit notation
The following table defines the unit notation used in graphs and diagrams in this White Paper.

Table 3-3-1 Unit Notation

Data Type                                  Typical Unit Notation
FP size                                    Omitted (Assume FP when omitted.)
SLOC size                                  Omitted (Assume SLOC when omitted.)
Multiples of 1,000 SLOCs                   [KSLOC]
Multiples of 1,000 FPs                     [1,000FP] or [KFP] (K is used if enough space is not available.)
Effort in person-hours                     [person-hours]
Effort in person-months                    [person-months]
Development schedule duration in months    [months]
Number of identified defects               Omitted
Number of staff                            [persons]
Adoption conventions
This section describes the adoption conventions applied to Chapter 6 through Chapter 10.
• Notation of used data
Sampling conditions for the population and analyzed data are noted as shown in the following example.
Example: The population is sampled in accordance with conditions 1, 2, and 3 to analyze the relationship between data 1 and data 2.

Strata definition:
• Condition 1 (1st sampling condition)
• Condition 2 (2nd sampling condition)
• Condition 3 (3rd sampling condition)

Analyzed data:
• X-axis: Data 1 (1st analyzed data)
• Y-axis: Data 2 (2nd analyzed data)

If analyzed data, for example data 1, serves as a derived indicator, the data accompanies an appropriate note such as “Data 1 (derived indicator)” in the “Analyzed data” part. Refer to Appendix A.2 and Appendix A.4 for data definitions.
• Examples of derived indicators: FP productivity; SLOC productivity; number of identified defects; FP defect density; SLOC defect density; number of staff per month; outsourcing ratio; target platform group (Windows or Unix); FP measurement method mixture.
• Representation of analysis results
• “Scattergram”: Displays the dispersion and the fluctuation trend of data values.
• “Basic statistics”: Displays the fluctuation trend of statistical data.
• “Box-and-whisker plot”: Displays the median and the 25th and 75th percentiles to indicate the distribution trends.

3.3.2 Basic Statistics
Publishing standard for basic statistics
Basic statistics are published if the statistics have ten or more samples. Note that basic statistics having fewer than ten samples are published in either of the following cases.
• The basic statistics accompany data arranged in multiple strata, any of which has ten or more samples.
• The basic statistics are productivity data, in which case only the number of instances (N) and the median are presented.
Basic statistics format
The “basic statistics” are presented in either of the two formats defined in Table 3-3-2. “N”, “Min”, “P25”, “Med”, “P75”, “Max”, “Mean”, and “S.D.” mean the number of instances, minimum, 25th percentile, median, 75th percentile, maximum, mean, and standard deviation, respectively. Both “P25” and “P75” are omitted if “N” is equal to or less than 3. Refer to Appendix D for more information on terminology.

Table 3-3-2 Basic Statistics Formats
Format 1: N | Min | P25 | Med | P75 | Max | Mean | S.D.
Format 2: N | Med | Mean | S.D.
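A sketch of how the Table 3-3-2 fields can be computed. The exact percentile interpolation SEC used is not stated, so the “inclusive” quantile method below is an assumption:

```python
import statistics

def basic_stats(values):
    """Compute the Table 3-3-2 fields for a list of numeric values."""
    p25, med, p75 = statistics.quantiles(values, n=4, method="inclusive")
    return {
        "N": len(values),
        "Min": min(values), "P25": p25, "Med": med, "P75": p75,
        "Max": max(values),
        "Mean": statistics.mean(values),
        "S.D.": statistics.stdev(values),  # sample standard deviation
    }

print(basic_stats([2, 4, 4, 4, 5, 5, 7, 9]))
```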
Basic statistics evaluation criteria
Table 3-3-3 defines the evaluation criteria for the basic statistics.

Table 3-3-3 Evaluation Criteria When Using Basic Statistics
Item | Criteria
1. Number of cases per stratum n | Minimum requirement: n ≥ 10. Desirable: n ≥ 30.
2. Choice of the statistical average | An absolute skewness |s| larger than 2 generally indicates a large amount of asymmetry. Hence, choose the median instead of the mean if |s| > 2.
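Criterion 2 can be sketched as follows. The White Paper does not specify which skewness estimator it uses, so the adjusted Fisher-Pearson sample skewness below is an assumption:

```python
import statistics

def representative_average(values):
    """Return the mean, or the median when |skewness| > 2 (Table 3-3-3)."""
    n = len(values)
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    # adjusted Fisher-Pearson sample skewness (one common estimator)
    s = n / ((n - 1) * (n - 2)) * sum(((x - mean) / sd) ** 3 for x in values)
    return statistics.median(values) if abs(s) > 2 else mean

print(representative_average([1, 2, 3, 4, 5]))  # 3   (symmetric: mean)
print(representative_average([1] * 9 + [100]))  # 1.0 (heavily skewed: median)
```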
3.3.3 Regression Analysis
Adoption criteria for regression analysis results
The results of regression analysis are adopted if all three adoption criteria listed in Table 3-3-4 are satisfied. A regression equation is published under these criteria if it yields a reasonable relationship with respect to both the number of data values and the correlation; the same applies to the adoption of regression lines or curves. Note that the criteria do not apply when the trend of data is illustrated just for visualization purposes or when coefficients are necessary for description.

Table 3-3-4 Evaluation Criteria When Using Regression Analysis
Item | Criteria
1. Number of cases per stratum n | n ≥ 30
2. Evaluation of correlation | Coefficient of determination R2 ≥ 0.75: large correlation; R2 ≥ 0.64: medium-to-large correlation
3. Significance of correlation | Significant if p-value < 0.05 (at the risk ratio of 5%)
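A sketch of criteria 1 and 2 (the function name is ours, and criterion 3, the p-value test, is omitted here): fit y = ax + b by least squares and check the sample count and coefficient of determination.

```python
def regression_screen(xs, ys, n_min=30, r2_min=0.64):
    """Fit y = a*x + b by least squares; return (adoptable, r_squared)
    per Table 3-3-4 criteria 1 and 2 (p-value test not included)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx                     # slope
    b = my - a * mx                   # intercept
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot          # coefficient of determination
    return (n >= n_min and r2 >= r2_min), r2
```

Note that with fewer than 30 cases the result is rejected regardless of how well the line fits.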
3.3.4 Scattergram with Confidence Width
Format of scattergram with confidence width
The confidence width indicates the confidence interval (such as 50% or 95%) obtained by calculation from measured values. The “confidence width of 50%” for an estimated value, for example, indicates that the value falls within the width with a probability of 50%.
The example analysis result illustrated in Figure 3-3-5 shows the confidence width by using two curves labeled y (50%) and y (-50%). The curve y (50%) indicates the upper limit of the confidence width, and the other indicates the lower limit. Thus, the confidence width is represented by its upper and lower limits. The confidence width of 95%, for example, is represented by y (95%) and y (-95%).
Note: The “Multivariate Statistical Analysis” (Gendai-Sugakusha) was used as a reference for calculation of confidence width.

Figure 3-3-5 Example of Scattergram with Confidence Width
(Scattergram of observed values with upper and lower confidence-width curves)
3.3.5 Box-And-Whisker Plot
Box-and-whisker plot format
Box-and-whisker plots graphically show the degree of dispersion as well as the median for each population, thereby making it easier to grasp differences among populations. As illustrated in Figure 3-3-6, a box-and-whisker plot consists of a “box” and “whiskers”. The top edge and bottom edge of the box are referred to as the upper hinge and the lower hinge, respectively. The upper hinge represents the upper quartile, above which the highest 25% portion of the population resides. Similarly, the lower hinge represents the lower quartile, under which the lowest 25% portion of the population resides. The box is divided by a line that indicates the median, which divides the population into equal halves of 50%. The lower whisker extends from the lower hinge to the value of the minimum observation except for lower outliers. The upper whisker extends from the upper hinge to the value of the maximum observation except for higher outliers.

Figure 3-3-6 Box-And-Whisker Plot Example
(Figure labels: upper hinge, median, lower hinge; inner boundaries (max./min. except for outliers) at 1.5 box heights from the hinges; outer boundaries at 3 box heights; outliers between the boundaries; extreme outliers beyond the outer boundaries.)
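The plot elements described above can be sketched as follows; the inner boundaries are taken at 1.5 box heights from the hinges as in Figure 3-3-6, while the quantile interpolation method is our assumption:

```python
import statistics

def box_plot_stats(values):
    """Compute hinges, median, whisker ends, and outliers for a box plot."""
    q1, med, q3 = statistics.quantiles(values, n=4, method="inclusive")
    box = q3 - q1                            # box height (interquartile range)
    lo, hi = q1 - 1.5 * box, q3 + 1.5 * box  # inner boundaries
    inliers = [v for v in values if lo <= v <= hi]
    return {
        "lower hinge": q1, "median": med, "upper hinge": q3,
        "lower whisker": min(inliers),       # min. except for outliers
        "upper whisker": max(inliers),       # max. except for outliers
        "outliers": sorted(v for v in values if v < lo or v > hi),
    }

print(box_plot_stats([1, 2, 3, 4, 5, 6, 7, 8, 9, 100]))
```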
4 Profiles of Collected Data
This chapter presents profiles of the data of 1,774 projects in total, which SEC had collected by December 2006.
4.1 Data Adoption Criteria and Format
This section describes criteria applied to adoption of the profile data.

(1) Adoption criteria
Data items labeled “Mandatory”, “Critical”, or “Recommended” are presented in this chapter if the projects providing the data items add up to approximately 10% or more of the total number of projects. In the case of this White Paper, the 10% threshold equals approximately 180 projects. Refer to Appendix C for more information.

(2) Formats
Profiles of the project data are presented in the following three formats.
• “Pie chart”: Shows the percentage and frequency of each alternative (such as a, b, or c) associated with a data item.
• “Bar chart and table”: For a data item providing multiple-choice alternatives (such as a, b, and c), a bar chart and a table are used together to show the percentage and frequency of each alternative.
• “Histogram and table of basic statistics”: For a data item having variable numerical values, a histogram and a table of basic statistics are used together to show the distribution of the values.
(3) Numeric Information presented together with charts
Pie charts and tables of frequencies show the names of alternatives, frequencies, and frequency percentages.
• The name of an alternative is shown only if the alternative has a non-zero frequency. Refer to Data Item Definition in Appendix A for available alternatives of each data item.
• The frequency of a data item alternative refers to the number of projects that chose the alternative. If the data item provides multiple-choice alternatives, the frequency of each of the alternatives is presented. Note that the frequency of the second or third alternative is left blank if the alternative has a zero frequency.
• The ratio of a data item alternative refers to the percentage of projects that chose the alternative with respect to the total number of projects that chose any alternative of the data item. If a data item provides multiple-choice alternatives, only the frequency ratio of the first choice is listed in frequency tables. Data Item Definition in Appendix A defines the first choice as the alternative that is most typical or is chosen most frequently. This is why only the first alternative is listed in the table.
(4) Treatment of Missing Data Items
Frequencies of data item alternatives exclude no-reply counts. This means that the frequencies of a data item's alternatives may add up to a value smaller than the total number of target projects, that is, 1,774. In any case, a no-reply count n is noted in the form “No-reply count: n.” If a non-zero no-reply count is found in the second or third choice of multiple-choice alternatives as well as in the first choice, only the non-zero no-reply count of the first choice is presented.

(5) Data item definition
Data items used to yield the project data profile are presented in the form “Analyzed data item(s): Data ID_Data name.” Some types of profile such as FP productivity are obtained via derived indicators calculated from the source data values. Refer to Appendix A.2 and A.4 for data item definitions and derived indicators, respectively.
4.2 General Characteristics of Development Projects
This section presents basic characteristics of development projects.
(1) Type of project
(2) Category of project
(3) Purpose of project
(4) New customer or not
(5) New business or not
(6) Using new technology or not

Figure 4-2-1 Types of Projects
N = 1,774 (No-reply count: 0) * Analyzed data item(s): 103_Project type

Figure 4-2-2 Category of Project
N = 1,774 (No-reply count: 0) * Analyzed data item(s): 105_Project category
Projects of the “Development” type and those of the “Maintenance/Support” type take up 60% and nearly 30%, respectively. These two types take up the major portion.
Projects in the “Entrusted Development” category take up more than 90%.
Figure 4-2-3 Purpose of Project
N = 1,774 (No-reply count: 2) * Analyzed data item(s): 107_Project purpose_1, ... , and 107_Project purpose_12
Most of the projects share the purpose of “software development”. Projects aimed at “project management” or “system migration” take up 10 to 20%, and those aimed at “infrastructure build-up” take up 10%.
(Bar chart: number of projects by purpose. Categories: software development; infrastructure-building; operational environment preparation; system migration; maintenance; operation support; consulting; project management; quality assurance; on-site environment preparation/adjustment for a running system; customer training; others)
(Figure 4-2-1 values: a: Development, 1,035, 58.3%; b: Maintenance/support, 472, 26.6%; c: Redevelopment, 86, 4.8%; d: Enhancement, 181, 10.2%)
(Figure 4-2-2 values: a: Commercial package development, 101, 5.7%; b: Entrusted development, 1,634, 92.1%; c: For in-house use, 11, 0.6%; d: Prototyping, 16, 0.9%; e: Other, 12, 0.7%)
Figure 4-2-4 New Customer or Not
N = 603 (No-reply count: 1,171) * Analyzed data item(s): 108_New customer or not

Figure 4-2-5 New Business or Not
N = 554 (No-reply count: 1,220) * Analyzed data item(s): 109_New business or not
Projects for “current customers” take up more than 80%, and those for “new customers” take up nearly 20%.
Projects for “current businesses” take up more than 80%, and those for “new businesses” take up nearly 20%.
Figure 4-2-6 Using New Technology or Not
N = 489 (No-reply count: 1,285) * Analyzed data item(s): 111_Using new technology or not
Projects that “used new technology” take up less than one fourth of the whole.
(Figure 4-2-4 values: a: New customer, 113, 18.7%; b: Old customer, 490, 81.3%)
(Figure 4-2-5 values: a: New industry or business, 92, 16.6%; b: Old business or industry, 462, 83.4%)
(Figure 4-2-6 values: a: Used new technology, 114, 23.3%; b: Did not use new technology, 375, 76.7%)
4.3 Project Applications
This section presents the following application characteristics of the projects.
(1) Type of industry (Major type)
(2) Type of business
(3) User accessibility

Figure 4-3-1 Type of Industry (Major Type)

Type of Industry (Major Type)                  1st alt.    Ratio   2nd alt.  3rd alt.
A: Agriculture                                     5        0.3%
C: Fisheries                                       1        0.1%
E: Construction                                   23        1.5%
F: Manufacturing                                 232       15.2%       1
G: Electricity, gas, heat supply and water        33        2.2%
H: Information and communications                216       14.2%      14
I: Transport                                      74        4.9%       2         1
J: Wholesale and retail trade                    123        8.1%       4         1
K: Finance and insurance                         501       32.9%       5         3
L: Real estate                                    24        1.6%
M: Eating and drinking places, accommodations      7        0.5%
N: Medical, health care and welfare               34        2.2%
O: Education, learning support                     8        0.5%
P: Compound services                               6        0.4%
Q: Services, N.E.C.                               67        4.4%       2
R: Government, N.E.C.                            121        7.9%       2
S: Industries unable to classify                  50        3.3%
Total                                          1,525      100.0%      30         5

N = 1,525 (No-reply count: 249)
* Analyzed data item(s): 201_Industry type_1 (Major type), 201_Industry type_2 (Major type), and 201_Industry type_3 (Major type)
“Finance and insurance” is the largest, taking up more than 30%. “Manufacturing”, “Information and communications”, “Wholesale and retail trade”, and “Government” follow in descending order.
Figure 4-3-2 Type of Business

Type of Business                       1st alt.    Ratio   2nd alt.  3rd alt.
a: Management/planning                    14        1.0%       2
b: Accounting                             95        7.0%       8
c: Sales                                 160       11.8%      10
d: Production/distribution                62        4.6%       2         2
e: Personnel/welfare                      43        3.2%       1
f: General management                    162       12.0%       8         3
g: General affairs                        24        1.8%       7
h: Research/development                   27        2.0%
i: Technology/control                     55        4.1%       2
j: Master management                      23        1.7%       3         3
k: Ordering/inventory                     79        5.8%      10         1
l: Distribution management                15        1.1%       3
m: Subcontractor management                2        0.1%       1
n: Contract/transfer                      52        3.8%       5
o: Customer management                    65        4.8%       3         3
p: Product planning (per-product)         13        1.0%                 1
q: Product management (per-product)       47        3.5%       7         1
r: Facility (stores)                      23        1.7%
s: Information analysis                   73        5.4%       8         4
t: Other                                 321       23.7%      12         1
Total                                  1,355      100.0%      92        19

N = 1,355 (No-reply count: 419)
* Analyzed data item(s): 202_Business type_1, 202_Business type_2, and 202_Business type_3
“General management” and “Sales” are the largest, and “Accounting” and “Ordering/inventory” follow in descending order (excluding “Other”).

Figure 4-3-3 User Accessibility
N = 1,641 (No-reply count: 133) * Analyzed data item(s): 204_User accessibility
More than 80% of the projects developed systems that are “accessible to limited users”; the rest (nearly 20%) developed systems “open to the public”.
(Figure 4-3-3 values: a: Accessible to limited users, 1,382, 84.2%; b: Open to the public, 259, 15.8%)
4.4 System Characteristics
This section presents the following profiles, which represent characteristics of developed systems. The profiles were used for stratification and classification.
(1) Type of developed system
(2) Use of business application package
(3) Mode of processing
(4) Architecture
(5) Target platform
(6) Use of Web technology
(7) Programming language
(8) Use of DBMS

Figure 4-4-1 Type of Developed System
N = 1,758 (No-reply count: 16) * Analyzed data item(s): 301_Type of developed system
More than 90% of the projects developed systems of “application software”. Thus, most of the projects developed business applications.
Figure 4-4-2 Use of Business Application Package
N = 1,304 (No-reply count: 470)
* Analyzed data item(s): 302_Use of business application package
* Reference data item: 303_Using business application package for the first time (Reply count: 100)
Approximately 20% of the projects “used business application packages” for system development. One hundred replies were made to the item asking whether an application package was used for the first time or had been used before. Among the 100 replies, 21 confirmed that application packages were used for the first time, and 77 confirmed that application packages had been used before; the remaining two replies left this uncertain.
(Figure 4-4-1 values: a: Application software, 1,660, 94.4%; b: System software (middleware, operating system), 67, 3.8%; c: Tool software, 14, 0.8%; d: Development environment software, 9, 0.5%; e: Other, 8, 0.5%)
(Figure 4-4-2 values: a: Yes, 249, 19.1%; b: No, 1,055, 80.9%)
Figure 4-4-3 Mode of Processing

Mode of Processing                1st alt.    Ratio   2nd alt.
a: Batch processing                  57       10.7%      25
b: Interactive processing           366       68.8%      27
c: Online transaction processing     99       18.6%       8
d: Other                             10        1.9%
Total                               532      100.0%      60

N = 531 (No-reply count: 1,243)
* Analyzed data item(s): 307_Processing Mode_1 and 307_Processing Mode_2
Nearly 70% of the projects developed systems for “interactive processing”, which processes data by exchanging information between computers and human beings via human-machine interfaces such as the keyboard, mouse, and display. The percentage of interactive processing is far greater than that of other modes of processing such as “online transaction processing” and “batch processing”.

Figure 4-4-4 Architecture
Architecture                  1st alt.    Ratio   2nd alt.  3rd alt.
a: Stand-alone                   221       13.2%       2
b: Mainframe                     143        8.5%       2
c: 2-layer client/server         448       26.7%      11         1
d: 3-layer client/server         317       18.9%       8         1
e: Intranet/Internet             475       28.4%      24
f: Other                          71        4.2%       5         2
Total                          1,675      100.0%      52         4

N = 1,675 (No-reply count: 99)
* Analyzed data item(s): 308_Architecture_1, 308_Architecture_2, and 308_Architecture_3
The largest portion is the “intranet/Internet” architecture, taking up nearly 30%. The “2-layer client-server” architecture and the “3-layer client-server” architecture follow in descending order.
Figure 4-4-5 Target Platform

Target Platform               1st alt.    Ratio   2nd alt.  3rd alt.
a: Windows 95, 98, or Me          32        2.1%      67         7
b: Windows NT, 2000, or XP       706       47.1%     133        16
c: Windows Server 2003            82        5.5%      31         2
d: HP-UX                         126        8.4%      43         2
e: HI-UX                          22        1.5%       7         2
f: AIX                            39        2.6%      16         4
g: Solaris                       178       11.9%      60         6
h: Redhat Linux                   23        1.5%       8         1
k: Turbo Linux                     3        0.2%       1
l: Other type of Linux             5        0.3%       1
m: Linux                          37        2.5%       5         1
n: Other type of UNIX             45        3.0%      14         1
o: MVS                            60        4.0%       1         1
p: IMS                             7        0.5%       2
q: TRON                            1        0.1%
r: Office computer system         11        0.7%       2
s: Other                         121        8.1%      27        12
Total                          1,498      100.0%     418        55

N = 1,498 (No-reply count: 276)
* Analyzed data item(s): 309_Target platform_1, 309_Target platform_2, and 309_Target platform_3
Major platform types found in the first choice are “Windows platforms (alternatives a, b, and c)” taking up more than 50% and “Unix platforms including Solaris, HP-UX, AIX, and Linux (alternatives d through n)” taking up approximately 30%. On the other hand, the second and third choices add up to more than 400 in number. This implies that the major architectures (intranet/Internet and client/server) favor multi-platform configurations.
Figure 4-4-6 Use of Web technology

Use of Web technology   1st alt.    Ratio   2nd alt.  3rd alt.
a: HTML                     77        7.2%      25         4
b: XML                      17        1.6%      17         1
c: JavaScript               95        8.8%      41         8
d: ASP                      53        4.9%       6         2
e: JSP                      38        3.5%      15        14
f: J2EE                     22        2.0%      18         9
g: Apache                   27        2.5%       9         8
h: IIS                      29        2.7%       8         3
i: Tomcat                    5        0.5%      12         9
j: JBOSS                     1        0.1%       1
k: OracleAS                  4        0.4%       1
l: WebLogic                 40        3.7%       4         4
m: WebSphere                38        3.5%       5         6
n: Coldfusion                4        0.4%
o: WebService                2        0.2%       1
p: Other                    61        5.7%      16         6
q: None                    563       52.3%
Total                    1,076      100.0%     178        75

N = 1,076 (No-reply count: 698)
* Analyzed data item(s): 310_Use of Web technology_1, 310_Use of Web technology_2, and 310_Use of Web technology_3
Projects that “did not use Web technologies” take up more than half of the whole. Among the technologies, “JavaScript” takes up the largest portion, and “HTML”, “JSP”, “ASP”, and “J2EE” follow in descending order.
Figure 4-4-7 Programming Language

Programming Language     1st alt.    Ratio   2nd alt.  3rd alt.
a: Assembly language         1        0.1%       2
b: COBOL                   318       19.8%      47         4
c: PL/I                      8        0.5%       3         2
d: Pro*C                    17        1.1%      14         6
e: C++                      87        5.4%      21         9
f: Visual C++               72        4.5%      33         8
g: C                       220       13.7%     101        21
h: VB                      276       17.1%     120        28
i: Excel (VBA)              11        0.7%       9         4
j: PowerBuilder              7        0.4%       8         7
k: Developer2000            17        1.1%       1
l: InputMan                                      3
m: PL/SQL                   37        2.3%      59        25
n: ABAP                     13        0.8%
o: C#                       25        1.6%       4         2
p: Visual Basic .NET        32        2.0%       6         1
q: Java                    310       19.3%      77        35
r: Perl                      7        0.4%       7         8
s: Shell script              2        0.1%      16        11
t: Delphi                    5        0.3%      10         3
u: HTML                     12        0.7%      47         8
v: XML                       3        0.2%       7         6
w: Others                  130        8.1%     112        65
Total                    1,610      100.0%     707       253

N = 1,611 (No-reply count: 163)
* Analyzed data item(s): 312_Primary programming language_1, 312_Primary programming language_2, 312_Primary programming language_3, and 312_Primary programming language_4
For the first choice, “COBOL” and “Java” each take up approximately 20%, the largest portions. “VB”, “C”, “C++”, and “Visual C++” follow in descending order.
Figure 4-4-8 Use of DBMS

Use of DBMS        1st alt.    Ratio   2nd alt.  3rd alt.
A: Oracle             634       48.7%       5         2
B: SQL Server         102        7.8%       9         1
C: PostgreSQL          15        1.2%       1
D: MySQL                6        0.5%
E: Sybase              10        0.8%
F: Informix             1        0.1%       1
G: ISAM                 6        0.5%       1
H: DB2                 53        4.1%      15
I: Access              24        1.8%       7
J: HiRDB               50        3.8%       3
K: IMS                 45        3.5%
L: Other              157       12.0%       9         2
M: None               200       15.3%       1         1
Total               1,303      100.0%      52         6

N = 1,303 (No-reply count: 471)
* Analyzed data item(s): 313_Use of DBMS_1, 313_Use of DBMS_2, and 313_Use of DBMS_3
Nearly 90% of the projects used DBMS products. Among the products, “Oracle” takes up the largest portion of approximately 50%.
4.5 Development Techniques
This section presents the following profiles, which represent development techniques.
(1) Development life cycle model
(2) Examined similar projects or not
(3) Application of development methods
(4) Use of development frameworks
(5) Using tool software or not

Figure 4-5-1 Development Life Cycle Model
N = 1,678 (No-reply count: 96) * Analyzed data item(s): 401_Development life cycle model

Figure 4-5-2 Examined Similar In-House Projects or Not
N = 244 (No-reply count: 1,530) * Analyzed data item(s): 403_Examined similar projects or not
Projects that used the "waterfall model" take up more than 90%. Projects that used the iterative and incremental model (SLCP) take up only a very small portion.

Projects that "examined similar in-house projects" take up more than 60%.
Figure 4-5-3 Application of Development Methods
Figure 4-5-4 Use of Development Frameworks

N = 407 (No-reply count: 1,367)
* Analyzed data item(s): 412_Application of development methods
"Projects that applied development methods" take up 80%. Among the applied methods, the "structured analysis and design" method takes up the largest portion, nearly 40%. The "object-oriented analysis and design" method and the "data-oriented approach (DOA)" follow in descending order.
N = 255 (No-reply count: 1,519)
* Analyzed data item(s): 422_Use of development frameworks

Projects that "used development frameworks" take up only one fourth of the whole. The majority of the projects used no development frameworks.
[Pie chart data:
Figure 4-5-1: a: Waterfall, 1,617, 96.6%; b: Iterative, 41, 2.4%; c: Other, 20, 1.2%
Figure 4-5-2: a: Yes, 160, 65.6%; b: No, 84, 34.4%
Figure 4-5-3: a: Structured analysis and design, 151, 37.1%; b: Object-oriented analysis and design, 52, 12.8%; c: Data-oriented approach (DOA), 21, 5.2%; d: Other, 105, 25.8%; e: None, 78, 19.2%
Figure 4-5-4: a: Yes, 59, 23.1%; b: No, 196, 76.9%]
Figure 4-5-5 Using Tool Software or Not
Analyzed data                                 a: Yes   b: No     N    No answer
404_Use of project management tool              214     314     528     1,246
405_Use of configuration management tool        225     294     519     1,255
406_Use of design support tool                   87     417     504     1,270
407_Use of documentation tool                   168     330     498     1,276
408_Use of debug/testing support tool           215     350     565     1,209
409_Use of CASE tool                             20     247     267     1,507
411_Use of code generator                        41     232     273     1,501
Projects that used "configuration management tool" software take up approximately 40%. This percentage also applies to "documentation tool" software, "debug/testing support tool" software, and "project management tool" software. Projects that used "design support tool" software or a "code generator" account for nearly 20%. Projects that used a "CASE tool" take up nearly 10%.
4.6 User Requirement Management
This section presents the following profiles, which represent user requirements, difficulties that lie in the requirements, and user participation in projects.
(1) User requirements and participation
(2) Level of requirements

Figure 4-6-1 User Requirements and Participation
(For each item, alternatives a through d range from desirable to undesirable.)

Analyzed data                                               a     b     c    d     N    No answer
501_Clearness of user requirements specifications          74   348   203   47    672    1,102
502_User participation in user requirement specifications 174   210   190   26    600    1,174
503_User expertise in computing                            81   175    64   26    346    1,428
505_Clearness of user role and responsibility              43   112    23    7    185    1,589
507_User comprehension of system design                    50   101    22    1    174    1,600
509_User participation in acceptance test                 117   201    42   64    424    1,350

* Alternatives a, b, c, and d:
[501_Clearness of user requirements] a: Very clear, b: Clear, c: Ambiguous, d: Very ambiguous
[502_User participation in user requirement specifications] a: Full, b: Adequate, c: Inadequate, d: None
[503_User expertise in computing] a: Comprehensive, b: Adequate, c: Inadequate, d: None
[505_Clearness of user role and responsibility] a: Very clear, b: Clear, c: Ambiguous, d: Very ambiguous
[507_User comprehension of system design] a: Full, b: Adequate, c: Inadequate, d: None
[509_User participation in acceptance test] a: Full, b: Adequate, c: Inadequate, d: None
Projects that confirmed their user requirements were "Very clear" or "Clear" take up more than 60%, while projects that confirmed their requirements were "Ambiguous" or "Very ambiguous" take up nearly 40%. Projects that confirmed their users' participation in requirement specifications was "Inadequate" or "None" take up nearly 40%. The percentage of projects that confirmed their users' participation in requirement specifications was "Adequate" has grown since the White Paper 2006.
Projects that confirmed their users' participation in the acceptance test was “Adequate” take up nearly 50%. Those that confirmed their users' participation in acceptance test was “Full” or “Adequate” take up more than 70%. Note that projects that confirmed their users' participation in acceptance test was “None” take up nearly 20%, a portion that is not negligible.
User expertise in computing, user comprehension of system design, and clearness of user role and responsibility received generally favorable evaluations.
Figure 4-6-2 Level of Requirements
Analyzed data                                           a: Very high   b: High   c: Medium   d: Low    N    No answer
512_Level of requirements (Reliability)                      76          186        294        22     578     1,196
513_Level of requirements (Usability)                        19           79         77         7     182     1,592
514_Level of requirements (Performance and efficiency)       52          205        374        28     659     1,115
515_Level of requirements (Maintainability)                  17           47        104        14     182     1,592
516_Level of requirements (Portability)                      13           26         87        62     188     1,586
517_Level of requirements (Running cost)                      6           22        102        41     171     1,603
518_Level of requirements (Security)                         27          109        263        30     429     1,345
"Usability", "reliability", "maintainability", and "performance and efficiency" are the aspects of quality that are required to be at a high or very high level. The percentage of projects that faced a very high or high level of user requirements varies with the aspect of quality, between nearly 40% and approximately 50%. More than 10% of the projects were required to implement a very high level of "reliability" and "usability". "Portability" and "running cost" were required to be at a relatively low level.
4.7 Development Staff Skills and Experiences
This section presents the following profiles, which represent the skills and experience of development staff.
(1) Skills and experience of project managers (PMs)
(2) Experience of development staff
(3) Personnel assignment and skills for testing

Figure 4-7-1 Experiences and Skills of PMs
N = 498 (No-reply count: 1,276) * Analyzed data item(s): 601_PM skill
Projects whose project managers had skills of "level 4" take up 50%. Projects whose project managers had skills of "level 5" or "level 6 or 7" (comprehensive expertise) take up approximately 20% each.
Note: About the definition of 601_PM skill
IT Skill Standard Version 2 is used to classify project managers' skills by referring to the "Project Management" occupation defined in the standard. For the skill level index and the degree of skill, refer to "IT Skill Standard Version 2 Project Management" at http://www.ipa.go.jp/jinzai/itss.
The following table shows the mapping between the skill levels defined in the skill standard and the skill levels used in this White Paper.

Grade guideline for "system development, application development, or system integration":
- The person experienced management of 500 staff or more at the maximum, or experienced projects each of which had per-year contract money of ¥1 billion or more. → a: Level 6 or 7
- The person experienced management of N staff (50 ≤ N < 500) at the maximum, or experienced projects each of which had per-year contract money of ¥0.5 billion or more. → a: Level 6 or 7
- The person experienced management of N staff (10 ≤ N < 50) at the maximum, or experienced projects each of which had per-year contract money of ¥0.1 billion or more. → b: Level 5
- The person experienced management of less than 10 staff at the maximum. → c: Level 4
- The number of staff is irrelevant at this level. → d: Level 3
[Pie chart, Figure 4-7-1: a: Level 6 or 7, 89, 17.9%; b: Level 5, 108, 21.7%; c: Level 4, 249, 50.0%; d: Level 3, 52, 10.4%]
Figure 4-7-2 Experiences of Development Staff
Analyzed data                                                        a     b     c    d     N    No answer
602_Staff skill_application domain experience                       151   377   132   27    687    1,087
603_Staff skill_analysis and design experience                      108   249    91    3    451    1,323
604_Staff skill_programming language and software tool experience   151   374   110    7    642    1,132
605_Staff skill_development platform experience                     171   260    82   12    525    1,249

* Alternatives a, b, c, and d:
a: All the staff members have enough skills, b: Half of the staff members have enough skills and the rest have adequate skills, c: Half of the staff members have adequate skills and the rest have no skills, d: All staff members have no skills
For all the items ("application domain", "analysis and design", "programming language and software tool", and "development platform" experience), approximately 50% of the projects had development staff of which half had enough skills and the rest had adequate skills. These projects and the projects whose staff members all had enough skills add up to approximately 80%.

Figure 4-7-3 Personnel Assignment and Skills for Testing
N = 314 (No-reply count: 1,460)
* Analyzed data item(s): 1010_Personnel assignment for testing
The projects that had “enough testing skills and personnel” take up 30%. These projects and the projects having “enough testing skills” add up to more than 50% (alternatives a and b).
[Pie chart, Figure 4-7-3: a: The members assigned to testing tasks were enough in number and had enough skills, 98, 31.2%; b: The members had enough skills but were not enough in number, 75, 23.9%; c: The members were enough in number but had insufficient skills, 95, 30.3%; d: The members were not enough in number and had insufficient skills, 46, 14.6%]
4.8 Size
This section presents the following profiles, which represent the size of the developed software. Each profile in this section is accompanied by the number of projects concerned and the number of companies that offered the data.
(1) Types of software sizing scale (FP or number of source code lines)
(2) FP measurement method
(3) Purity of FP measurement method (original or customized)
(4) Actual FP size
(5) Actual SLOC (Source Lines of Code) size

Figure 4-8-1 Types of Software Sizing Scale (by Number of Projects)
N = 1,774 (No-reply count: 0)
*1 Analyzed data item(s): 5001_Actual FP size (unadjusted), whether or not the actual net SLOC size (derived indicator) exists
*2 Other indices include design documents, number of database tables, and number of GUI window types (including no-reply count)
Projects that used the "FP size" for software sizing take up more than 40%. This percentage also applies to the "SLOC size". Projects that used both sizes take up 6.1%.
Figure 4-8-2 Types of Software Sizing Scale (by Number of Companies)
N = 20 (Number of companies)
*1 Analyzed data item(s): 5001_Actual FP size (unadjusted), whether or not the actual net SLOC size (derived indicator) exists
*2 Other indices include design documents, number of database tables, and number of GUI window types (including no-reply count)
Thirteen companies used the "FP size" for software sizing, while sixteen companies used the "SLOC size". Nine companies used both.
Figure 4-8-3 FP Measurement Method (by Number of Projects)
N = 860 (No-reply count: 908)
*1 Analyzed data item(s): 701_Primary FP measurement method
*2 This analysis is made for the 860 projects that used the "FP size only" or "both the FP size and SLOC size" as shown in Figure 4-8-1.
The projects that used the “IFPUG method”, “NESMA estimated (FP count) method”, or “SPR method” take up nearly 60%.
[Pie chart data:
Figure 4-8-1: a: FP, 751, 42.3%; b: SLOC, 729, 41.1%; c: Both FP and SLOC, 109, 6.1%; d: Other indices, 185, 10.4%
Figure 4-8-2: a: FP, 4, 20.0%; b: SLOC, 7, 35.0%; c: Both FP and SLOC, 9, 45.0%
Figure 4-8-3: a: IFPUG, 214, 24.9%; b: SPR, 240, 27.9%; c: NESMA indicative method, 2, 0.2%; d: NESMA estimated method, 52, 6.0%; f: Others, 308, 35.8%; Measurement method unknown, 44, 5.1%]
Figure 4-8-4 FP Measurement Method (by Number of Companies)

*1 Analyzed data item(s): 701_Primary FP measurement method
*2 This analysis is made for the 816 projects selected among the 860 projects used for analysis of "FP measurement method by number of projects". The selection excludes projects that could not identify the measurement method. This analysis is also made for the thirteen companies that used only the "FP size" or "both the FP size and SLOC size" as shown in Figure 4-8-2. Some companies used more than one measurement method.
Among the thirteen companies, the “IFPUG” and the “NESMA methods” are most frequently used. Some companies used more than one measurement method.
Figure 4-8-5 Purity of FP Measurement Method
N = 816 (No-reply count: 952)
*1 Analyzed data item(s): 10124_Purity of measurement method for actual FP size
*2 This analysis is made for the 816 projects selected among the 860 projects used for analysis of "FP measurement method by number of projects". The selection excludes projects that could not identify the measurement method. This analysis is also made for the thirteen companies that used only the "FP size" or "both the FP size and SLOC size" as shown in Figure 4-8-2. Some companies used more than one measurement method.
Most of the projects used FP measurement methods in their original forms. Most of the FP measurement methods categorized as others are customized versions of IFPUG.
Figure 4-8-6 Actual FP Size
N    Min  P25  Med  P75    Max    Mean   S.D.
860    5  177  400  882  14,545    857  1,446

N = 860 (No-reply count: 908)
*1 Analyzed data item(s): 5001_Actual FP size (unadjusted)
*2 This analysis is made for the 860 projects that used the "FP size only" or "both the FP size and SLOC size" as shown in Figure 4-8-1.
Approximately 60% of the projects have the size of 500 FPs or less, while projects of 2,000 FPs or larger size take up approximately 10%.
[Histogram: Number of Projects by Actual FP Size (unadjusted); rightmost bin: over 2,400]
Table for Figure 4-8-5: FP Measurement Method Purity (Number of Projects)

FP Measurement Method         a: Original method   b: Customized method   Purity unknown
a: IFPUG                             188                    8                  18
b: SPR                               229                    0                  11
c: NESMA indicative method             1                    1                   0
d: NESMA estimated method             52                    0                   0
f: Other                               1                  307                   0
Total: 816
Figure 4-8-7 Actual SLOC Size
The following bar graph magnifies the lowest part of the actual SLOC size values shown in the above bar graph.
N    Min   P25   Med    P75      Max     Mean   S.D.
838  0.4  23.9  58.1  200.9  12,100.0  228.9  649.2

N = 838 (No-reply count: 932)
*1 Analyzed data item(s): Actual net SLOC size (derived indicators)
*2 Actual net SLOC size: Actual SLOC size excluding comment lines and blank lines
*3 This analysis is made for the 838 projects that used "only the SLOC size" or "both the SLOC size and FP size" as shown in Figure 4-8-1.
Approximately 60% of the projects have the size of 100 KSLOC or less.
[Histogram: Actual Net SLOC Size, 200 KSLOC or less, 10-KSLOC intervals; N = 626]
[Histogram: Actual Net SLOC Size, full range, 50-KSLOC intervals; rightmost bin: over 1,000 KSLOC]
4.9 Development Schedule
This section presents the following profiles, which represent the development schedules of the projects.
(1) Planned project duration in months
(2) Actual project duration in months
(3) Planned Major-development phase duration in months
(4) Actual Major-development phase duration in months

Figure 4-9-1 Planned Project Duration in Months
N = 826 (No-reply count: 948)
*1 Analyzed data item(s): Planned duration in months (whole project)
*2 Planned duration in months (whole project): 5140_Planned work period of whole project. If data item 5140 is unavailable, 10126_Duration in months (planned)_whole project (offered data) is used instead.

N    Min  P25  Med  P75   Max   Mean  S.D.
826  0.6  5.1  8.0  11.7  57.4   9.3   6.2
The median planned project duration is 8 months. Most of the projects have a duration of one year or less.

Note: The number of projects that offered planned project duration values is approximately half the number of projects that offered actual project duration values shown in Figure 4-9-2. This means that a simple comparison between Figure 4-9-1 and Figure 4-9-2 is meaningless.

Figure 4-9-2 Actual Project Duration in Months
N = 1,669 (No-reply count: 105)
*1 Analyzed data item(s): Actual duration in months (whole project)
*2 Actual duration in months (whole project): 5167_Actual work period of whole project. If data item 5167 is unavailable, 10128_Duration in months (actual)_whole project (offered data) is used instead.

N      Min  P25  Med  P75   Max   Mean  S.D.
1,669  0.5  4.1  6.5  10.4  57.4   8.3   6.1
The median actual project duration is 6.5 months. Projects whose actual duration is one year or less take up 80%.
[Histogram: Number of Projects by Planned Months (whole project); rightmost bin: over 30]
[Histogram: Number of Projects by Actual Months (whole project); rightmost bin: over 30]
Figure 4-9-3 Planned Major-development Phase Duration in Months

N = 327 (No-reply count: 917)
*1 Analyzed data item(s): Planned duration in months (Major-development phases) (derived indicators)
*2 Planned duration in months (Major-development phases): The duration in months obtained by subtracting the planned beginning date of the basic design phase from the planned completion date of the system test phase.
*3 This analysis is made for the 327 projects selected among the 1,244 projects that went through all the Major-development phases.
*4 The 1,244 projects mentioned above are those that measured relevant data in each of the Major-development phases from basic design to system test. These projects have the symbol ○ or ⇒ written in the relevant entry field.

N    Min  P25  Med  P75  Max   Mean  S.D.
327  0.1  3.8  5.5  8.5  57.4   6.9   5.6
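The duration derivation in note *2 above is simple date arithmetic. A minimal sketch follows; the function name and phase dates are hypothetical, and the days-per-month divisor (365.25/12) is an assumption, since the White Paper does not state its day-to-month conversion rule:

```python
from datetime import date

def duration_in_months(start, end, days_per_month=30.4375):
    # Derived indicator of Figure 4-9-3: completion date of the system
    # test phase minus beginning date of the basic design phase,
    # expressed in months. The 30.4375 divisor (365.25 / 12) is an
    # assumption, not a rule stated in the White Paper.
    return (end - start).days / days_per_month

# Hypothetical phases: basic design begins 2006-04-01,
# system test completes 2006-11-15 (228 days).
print(round(duration_in_months(date(2006, 4, 1), date(2006, 11, 15)), 1))  # 7.5
```

The same subtraction with planned dates yields the planned duration (this figure) and with actual dates the actual duration (Figure 4-9-4).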
The median planned Major-development phase duration is 5.5 months. Most of the projects have a planned Major-development phase duration of one year or less.

Note: The number of projects that offered planned Major-development phase duration values is approximately 60% of the number of projects that offered actual Major-development phase duration values shown in Figure 4-9-4. This means that a simple comparison between Figure 4-9-3 and Figure 4-9-4 is meaningless.

Figure 4-9-4 Actual Major-development Phase Duration in Months
N = 528 (No-reply count: 716)
*1 Analyzed data item(s): Actual duration in months (Major-development phases) (derived indicators)
*2 Actual duration in months (Major-development phases): The duration in months obtained by subtracting the actual beginning date of the basic design phase from the actual completion date of the system test phase.
*3 This analysis is made for the 528 projects selected among the 1,244 projects that went through all Major-development phases.
*4 The 1,244 projects mentioned above are those that measured relevant data in each of the Major-development phases from basic design to system test. These projects have the symbol ○ or ⇒ written in the relevant entry field.

N    Min  P25  Med  P75   Max   Mean  S.D.
528  0.2  3.9  6.1  10.2  57.4   7.6   5.5
The median actual Major-development phase duration is 6.1 months. Projects with an actual Major-development phase duration of one year or less take up 80%.
[Histogram: Number of Projects by Planned Months (Major-development phases); rightmost bin: over 30]
[Histogram: Number of Projects by Actual Months (Major-development phases); rightmost bin: over 30]
4.10 Effort
This section presents the following profiles, which represent the development effort used in the projects. Profiles (1) to (3) represent effort values in person hours; profiles (4) to (6) represent effort values converted to person months. The data item "902_Conversion ratio among person hours/person months" is used for conversion between person-hour values and person-month values.
(1) Planned effort of the whole project (person hours)
(2) Actual effort of the whole project (person hours)
(3) Actual effort of the Major-development phases (person hours)
(4) Planned effort of the whole project (person months)
(5) Actual effort of the whole project (person months)
(6) Actual effort of the Major-development phases (person months)
(7) Effort unit (person hours or person months)
(8) Conversion ratio between person months and person hours
Figure 4-10-1 Project Effort Planned at the Beginning of Basic Design (Person Hours)
The following bar graph magnifies the lowest part of the planned effort values shown in the above bar graph.
N    Min   P25    Med     P75      Max      Mean    S.D.
444   62  3,000  8,102  21,664  900,000   27,856  73,393

N = 444 (No-reply count: 1,330)
* Analyzed data item(s): 11015_Project effort in person hours (planned at the beginning of basic design)
The median planned project effort is 8,102 person hours. Projects that began with a planned effort of approximately 21,000 person hours or less take up three fourths of the whole.

Note: The number of projects that offered planned project effort values is approximately 25% of the number of projects that offered actual project effort values shown in Figure 4-10-2. This means that a simple comparison between Figure 4-10-1 and Figure 4-10-2 is meaningless.
[Histogram: Planned Effort (whole project, at the beginning of basic design), in person hours; 20,000 person-hours or less, 1,000-person-hour intervals; N = 328]
[Histogram: Planned Effort (whole project, at the beginning of basic design), in person hours; full range, 5,000-person-hour intervals; rightmost bin: over 100,000]
Figure 4-10-2 Actual Project Effort (Person Hours)
The following bar graph magnifies the lowest part of the actual effort values shown in the above bar graph.
N      Min   P25    Med     P75      Max      Mean    S.D.
1,758   18  1,910  5,251  16,262  956,505   21,398  56,719

N = 1,758 (No-reply count: 16)
* Analyzed data item(s): Actual effort (whole project) (derived indicators)
The median of actual project effort is 5,251 person hours. Half of the projects used the project effort of 5,000 person hours or less.
[Histogram: Actual Effort (whole project), in person hours; 20,000 person-hours or less, 1,000-person-hour intervals; N = 1,382]
[Histogram: Actual Effort (whole project), in person hours; full range, 5,000-person-hour intervals; rightmost bin: over 100,000]
Figure 4-10-3 Actual Major-development Phase Effort (Person Hours)
The following bar graph magnifies the lowest part of actual effort values shown in the above bar graph.
N      Min   P25    Med     P75      Max      Mean    S.D.
1,204   62  2,621  7,436  21,482  956,505   27,176  65,542

N = 1,204 (No-reply count: 40)
*1 Analyzed data item(s): Actual Major-development phase effort (person hours) (derived indicators)
*2 Actual Major-development phase effort (person hours): The sum of the effort of the Major-development phases and the in-house and external effort that does not fall into any defined categories.
*3 This analysis is made for the 1,204 projects selected among the 1,244 projects that went through all Major-development phases.
*4 The 1,244 projects mentioned above are those that measured relevant data in each of the Major-development phases from basic design to system test. These projects have the symbol ○ or ⇒ written in the relevant entry field.
The median of the actual Major-development phase effort is 7,436 person hours. Projects that used the Major-development phase effort of 20,000 person hours or less take up approximately 70%.
[Histogram: Actual Effort (Major-development phases), in person hours; 20,000 person-hours or less, 1,000-person-hour intervals; N = 875]
[Histogram: Actual Effort (Major-development phases), in person hours; full range, 5,000-person-hour intervals; rightmost bin: over 100,000]
Figure 4-10-4 Project Effort Planned at the Beginning of Basic Design (Person Months)
The following bar graph magnifies the lowest part of the planned effort values shown in the above bar graph.
N    Min    P25    Med     P75       Max      Mean    S.D.
444  0.40  18.56  50.69  129.05  6,000.00   170.20  464.82

N = 444 (No-reply count: 1,330)
* Analyzed data item(s): Project effort converted into person months from 11015_Project effort in person hours (planned at the beginning of basic design). Offered project effort data in person months were used as is. Person-hour values were converted to person-month values by using the coefficient of 160 hours per person month.
* SEC began to collect planned effort data in fiscal 2005.
The median of the planned project effort is approximately 50.7 person months. Approximately half of the projects used project effort of 50 person months or less.
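The conversion rule described in the note above can be sketched as follows; the function name and unit strings are illustrative, but the 160-hours-per-person-month coefficient is the one stated in the note:

```python
def to_person_months(effort, unit, hours_per_month=160.0):
    # Conversion rule used for the person-month figures: values offered
    # in person months are used as is; person-hour values are divided by
    # the coefficient of 160 hours per person month.
    if unit == "person-months":
        return effort
    if unit == "person-hours":
        return effort / hours_per_month
    raise ValueError(f"unknown effort unit: {unit!r}")

# The median planned effort of 8,102 person hours (Figure 4-10-1)
# converts to roughly 50.6 person months, consistent with the median
# of 50.69 person months reported in Figure 4-10-4.
print(round(to_person_months(8102, "person-hours"), 1))  # 50.6
```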
[Histogram: Planned Effort (whole project, at the beginning of basic design), in person months; 150 person-months or less, 10-person-month intervals; N = 350]
[Histogram: Planned Effort (whole project, at the beginning of basic design), in person months; full range, 50-person-month intervals; rightmost bin: over 700]
Figure 4-10-5 Actual Project Effort (Person Months)
The following bar graph magnifies the lowest part of the actual effort values shown in the above bar graph.
N      Min    P25    Med     P75       Max      Mean    S.D.
1,758  0.11  11.94  32.66  101.10  5,626.50   131.63  345.34

N = 1,758 (No-reply count: 16)
* Analyzed data item(s): Project effort converted to person months from Actual effort (whole project) (derived indicators). Offered project effort data in person months were used as is. Person-hour values were converted to person-month values by using the coefficient of 160 hours per person month.
The median of the actual project effort is approximately 33 person months.
[Histogram: Actual Effort (whole project), in person months; 150 person-months or less, 10-person-month intervals; N = 1,444]
[Histogram: Actual Effort (whole project), in person months; full range, 50-person-month intervals; rightmost bin: over 700]
Figure 4-10-6 Actual Major-development Phase Effort (Person Months)
The following bar graph magnifies the lowest part of the actual effort values shown in the above bar graph.
N      Min    P25    Med     P75       Max      Mean    S.D.
1,204  0.39  16.01  45.09  132.27  5,626.50   166.23  397.39

N = 1,204 (No-reply count: 40)
*1 Analyzed data item(s): Actual Major-development phase effort (person months) (derived indicators)
*2 Actual Major-development phase effort (person months): The sum of the person-month effort of the Major-development phases and the in-house and external person-month effort that does not fall into any defined categories. Offered project effort data in person months were used as is. Person-hour values were converted to person-month values by using the coefficient of 160 hours per person month.
*3 This analysis is made for the 1,204 projects selected among the 1,244 projects that went through all Major-development phases.
*4 The 1,244 projects mentioned above are those that measured relevant data in each of the Major-development phases from basic design to system test. These projects have the symbol ○ or ⇒ written in the relevant entry field.
The median of the actual Major-development phase effort is approximately 45 person months.
[Histogram: Actual Effort (Major-development phases), in person months; 150 person-months or less, 10-person-month intervals; N = 930]
[Histogram: Actual Effort (Major-development phases), in person months; full range; rightmost bin: over 700]
Figure 4-10-7 Effort Unit
N = 1,774 (No-reply count: 0) * Analyzed data item(s): 901_Unit of effort
Projects that used “person-month” effort values take up more than 60%.
Figure 4-10-8 Conversion Ratio Among Person-Month and Person-Hour
N      Med    Mean   S.D.
1,095  162.0  161.4  15.2

N = 1,095 (No-reply count: 0)
* Analyzed data item(s): 902_Conversion ratio among person hours/person months
* This analysis is made for the 1,095 projects that chose "b: person months" for 901_Unit of effort.
The above bar graph and table show statistics of the conversion ratios (person hours per person month) that the companies offered for converting person-hour effort values into person months.

Note: SEC did not collect any coefficient for conversion from person hours to person months. Strictly speaking, the above graph and table do not present a complete picture of the effort data; they are provided as reference information.
[Histogram: Number of Projects by conversion ratio (person hours per person month)]
[Pie chart, Figure 4-10-7: a: Person-hours, 679, 38.3%; b: Person-months, 1,095, 61.7%]
4.11 Personnel Assignment
This section presents the following profiles, which represent project personnel assignment.
(1) Number of staff members per month
(2) The ratio of outsourced effort amount
(3) The ratio of expenditure on outsourced work
(4) The source of outsourced workforce
(5) New subcontractors or not

Figure 4-11-1 Number of Staff Members per Month
N    Min  P25  Med  P75   Max    Mean  S.D.
523  0.3  3.6  7.3  15.4  385.9  15.1  26.4

N = 523 (No-reply count: 721)
*1 Analyzed data item(s): Number of staff members per month (derived indicators)
*2 Number of staff members per month (derived indicators) = Actual Major-development phase effort ÷ Actual duration in months (Major-development phases) ÷ Conversion ratio between person months and person hours. The conversion ratio takes the value assigned to 902_Conversion ratio among person hours/person months if "b: person months" is chosen for 901_Unit of effort. If "a: person hours" is chosen instead, the conversion ratio takes the value of 160.
*3 This analysis is made for the 523 projects selected among the 1,244 projects that went through all Major-development phases.
The number of in-house and external staff members per month involved in the Major-development phases from basic design to system test yields the median of approximately 7 persons per month.
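The derived indicator in note *2 above can be sketched as follows; the function name is illustrative, and the example plugs in the median effort and duration values reported elsewhere in this White Paper (Figures 4-10-3 and 4-9-4), not values from a real project:

```python
def staff_per_month(effort_person_hours, duration_months, hours_per_month=160.0):
    # Derived indicator of Figure 4-11-1: actual Major-development phase
    # effort divided by the conversion ratio (160 hours per person month
    # when effort is recorded in person hours) and by the actual
    # Major-development phase duration in months.
    return effort_person_hours / hours_per_month / duration_months

# Hypothetical project at the medians reported earlier: 7,436 person
# hours (Figure 4-10-3) over 6.1 months (Figure 4-9-4) gives about
# 7.6 staff members per month, close to the median of roughly 7 above.
print(round(staff_per_month(7436, 6.1), 1))  # 7.6
```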
[Histogram: Number of Projects by number of staff members per month; rightmost bin: over 26]
4. Profiles of Collected Data
Figure 4-11-2 The Ratio of Outsourced Effort Amount
N Min P25 Med P75 Max Mean S.D. 732 0.0% 45.2% 71.1% 86.9% 100.0% 62.1% 30.7%
N = 732 (No-reply count: 1,042)
*1 Analyzed data item(s): The ratio of outsourced effort amount (derived indicator)
*2 The ratio of outsourced effort amount (derived indicator) = Total outsourced effort used from basic design to system test ÷ Actual effort total
*3 Outsourced effort values explicitly reported as "0" are included in the above distribution as an effort ratio of 0%.
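The ratio defined in the notes above is a plain quotient; a minimal sketch, with illustrative names:

```python
def outsourced_effort_ratio(outsourced_effort, total_effort):
    """Ratio of outsourced effort amount: total outsourced effort
    from basic design to system test over the actual effort total,
    returned as a percentage. An explicitly reported 0 yields 0.0,
    which stays in the distribution. Names are illustrative."""
    return 100.0 * outsourced_effort / total_effort

# 3,200 of 4,500 person-hours outsourced:
print(outsourced_effort_ratio(3200, 4500))  # about 71.1
```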
More than 60% of the projects had an outsourced effort ratio of 70% or higher. Projects that relied only on in-house effort take up more than 10%.

Figure 4-11-3 The Ratio of Expenditure on Outsourced Work
N Min P25 Med P75 Max Mean S.D. 272 0.0% 41.0% 71.9% 84.5% 100.0% 60.9% 29.6%
N = 272 (No-reply count: 1,502)
*1 Analyzed data item(s): 5204_Actual outsourcing data (expenditure ratio)
*2 Values of data item 5204 explicitly reported as "0" are included in the above distribution as an expenditure ratio of 0%.
The ratio of expenditure on outsourced work has a distribution similar to that of the ratio of outsourced effort.
[Histogram (Figure 4-11-3): x-axis The Ratio of Expenditure on Outsourced Work [%]; y-axis Number of Projects]
[Histogram (Figure 4-11-2): x-axis The Ratio of Outsourced Effort [%]; y-axis Number of Projects]
Figure 4-11-4 Source of Outsourced Workforce
N = 407 (No-reply count: 1,367)
* Analyzed data item(s): 118_Source of outsourced workforce_1
Over 40% of the projects outsourced workforce from Japanese companies that are not affiliated with or grouped under the main company. Another 40-plus percent of the projects outsourced workforce from affiliated or grouped Japanese companies. Projects that outsourced workforce from Japanese companies take up nearly 90%.
Figure 4-11-5 New Subcontractors or Not
N = 396 (No-reply count: 1,378)
* Analyzed data item(s): 110_New subcontractors or not
Projects that “outsourced from the same subcontractors more than once” take up more than 90%.
[Pie chart (Figure 4-11-4): a: Japanese company (intra-group/affiliate), 173 projects (42.5%); b: Japanese company (out-of-group/non-affiliate), 177 projects (43.5%); c: Foreign company (intra-group/affiliate), 7 projects (1.7%); d: Foreign company (out-of-group/non-affiliate), 12 projects (2.9%); e: No outsourcing, 38 projects (9.3%)]
[Pie chart (Figure 4-11-5): a: The subcontractors were new to the company of the project, 33 projects (8.3%); b: The company of the project used the subcontractors more than once, 363 projects (91.7%)]
4.12 Reliability
This section presents the following profiles, which represent reliability of developed software.
(1) Number of defects identified after system cutover
(2) Number of defects identified after system cutover (number of failure)
(3) Number of defects identified after system cutover (number of fault)
(4) Personnel assignment for quality assurance
(5) Practice of quality assurance standard and review
The number of defects identified after system cutover refers to the maximum among the following values: the number of defects identified within 1 month after system cutover, within 3 months after system cutover, and within 6 months after system cutover. Projects that reported zero defects are included in this analysis.
Figure 4-12-1 integrates Figure 4-12-2 and Figure 4-12-3, using the number of faults in preference to the number of failures when both are available. The numbers of identified defects presented in Chapter 7 are based on this integration.

Figure 4-12-1 Number of Defects Identified After System Cutover
N Min P25 Med P75 Max Mean S.D. 717 0 0.0 2.0 10.0 1,262 19.1 76.8
N = 717 (No-reply count: 1,057)
* Analyzed data item(s): the maximum among the following values, with the number of faults used in preference to the number of failures: 5267_Number of failure (total)_1 month, 5268_Number of failure (total)_3 months, 5269_Number of failure (total)_6 months, 10112_Number of fault (total)_1 month, 10113_Number of fault (total)_3 months, and 10114_Number of fault (total)_6 months
Most of the projects identified 5 defects or fewer after system cutover. Projects that identified 10 defects or fewer after system cutover take up 75%.
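The integration rule described above (faults preferred over failures, maximum over the 1-, 3-, and 6-month windows, zero counted as a valid answer) can be sketched as follows. The function shape and the use of None for "no answer" are assumptions, not the White Paper's encoding.

```python
def defects_after_cutover(failures, faults):
    """Integrated defect count as in Figure 4-12-1: fault counts are
    used in preference to failure counts when any are available, then
    the maximum over the 1-, 3- and 6-month windows is taken. Zero is
    a valid reported count; None stands for "no answer" (an assumed
    encoding)."""
    counts = faults if any(v is not None for v in faults) else failures
    reported = [v for v in counts if v is not None]
    return max(reported) if reported else None

# Fault counts available, so they win over the failure counts:
print(defects_after_cutover([1, 4, 9], [0, 2, 5]))  # 5
# Only failure counts reported:
print(defects_after_cutover([0, 0, 3], [None, None, None]))  # 3
```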
[Histogram (Figure 4-12-1): x-axis Number of Defects Identified After System Cutover, rightmost bin over 70; y-axis Number of Projects]
Figure 4-12-2 Number of Defects Identified After System Cutover (Number of failure)
N Min P25 Med P75 Max Mean S.D. 623 0 0.0 2.0 10.0 1,262 18.5 79.1
N = 623 (No-reply count: 1,151)
* Analyzed data item(s): the maximum among the following values: 5267_Number of failure (total)_1 month, 5268_Number of failure (total)_3 months, and 5269_Number of failure (total)_6 months
Most of the projects identified 5 failures or fewer after system cutover. Projects that identified 10 failures or fewer after system cutover take up 75%.

Figure 4-12-3 Number of Defects Identified After System Cutover (Number of fault)
N Min P25 Med P75 Max Mean S.D. 226 0 0.0 1.0 4.0 285 12.1 40.5
N = 226 (No-reply count: 1,548)
* Analyzed data item(s): the maximum among the following values: 10112_Number of fault (total)_1 month, 10113_Number of fault (total)_3 months, and 10114_Number of fault (total)_6 months
Most of the projects identified 5 faults or fewer after system cutover. Projects that identified 10 faults or fewer after system cutover take up approximately 80%. The distribution of the number of identified faults is narrower than that of identified failures.
[Histogram (Figure 4-12-2): x-axis Number of Defects Identified After System Cutover, rightmost bin over 70; y-axis Number of Projects]
[Histogram (Figure 4-12-3): x-axis Number of Defects Identified After System Cutover, rightmost bin over 70; y-axis Number of Projects]
Figure 4-12-4 Personnel Assignment for Quality Assurance
N = 380 (No-reply count: 1,394)
* Analyzed data item(s): 5241_Personnel assignment for quality assurance
Projects that assigned project members to quality assurance tasks take up nearly 70%. Projects that assigned special staff dedicated to quality assurance tasks take up more than 30%, an increase of 67 projects (8.8%) since last year.
Figure 4-12-5 Practice of Quality Assurance Standard and Review
Analyzed data                                              a: Yes  b: No    N   No answer
1011_Existence of quantitative delivery quality standards    245     32   277   1,497
1013_Existence of third-party reviews                        206     21   227   1,547
More than 90% of the projects had quality assurance standards for product shipment or carried out quality assurance reviews.
[Bar chart (Figure 4-12-5): rows Quantitative delivery quality standards and Third-party reviews, split into a: Yes and b: No]
[Pie chart (Figure 4-12-4): a: Project members were assigned to quality assurance tasks, 252 projects (66.3%); b: Special members were dedicated to quality assurance tasks, 124 projects (32.6%); c: No member was assigned to quality assurance tasks, 4 projects (1.1%)]
4.13 Development Process Phase Combinations
This section presents what combinations of development phases the projects went through. Projects that went through the same phase combination belong to the same group.
Figure 4-13-1 Development Phase Combinations
[Figure 4-13-1 column headings: phase-existence marks for Development planning, Requirements analysis, Basic design, Detailed design, Implementation, Integration test, System test, and Acceptance test, followed by project counts for Development, Enhancement, and Mixed project types]
5-phase type 1  135  109  248
5-phase type 2  579  342  996
58 58 118 9 2 11 4 2 7 20 30 51
Basic design involved  7  8  15
Detailed design to integration test  16  17  34
Construction to system test  3  2  5
Integration test, system test  0  3  3
Requirements analysis involved  10  9  19
No answer (Unknown) (all phase cells blank)  1  7  8
Others  193  64  259
Total  1,035  653  1,774
Legend: The above table uses the following symbols to indicate how certain it is that each phase was or was not involved in a project, based on interpretation of the answerers' selection of the phase-existence symbol ( , ⇒, ).
 : This phase was involved. (" " or "⇒" was selected.)
 : This phase may have been involved. (" " or "⇒" was selected, or no answer was made.)
 : This phase was not involved. (" " was selected.)
 : Whether or not this phase was involved is irrelevant to this combination. (" ", "⇒", or " " was selected, or no answer was made.)
N = 1,774 (No-reply count: 0)
* Analyzed data item(s): 5106_Phase existence_Systematization planning, 5107_Phase existence_Requirements definition, 5108_Phase existence_Basic design, 5109_Phase existence_Detailed design, 5110_Phase existence_Construction, 5111_Phase existence_Integration test, 5112_Phase existence_System test, 5113_Phase existence_Acceptance test
Projects that went through the five phases from basic design to system test are classified into two types: 5-phase type 1 and 5-phase type 2, shown in Figure 4-13-1. Projects of 5-phase type 1 and those of 5-phase type 2 add up to 70%.
These projects are referred to as “5-phase projects” in Chapter 5 and later.
4.14 Project Evaluation
This section presents the following profiles, which represent self-evaluation of project success. Evaluation of project planning and actual results is analyzed from the three viewpoints of QCD, and the degree of success is classified into discrete levels.
(1) Evaluation of planning (QCD)
(2) Evaluation of actual results (QCD)
(3) Self-evaluation of project success
(4) Subjective evaluation of customer satisfaction by the vendor

Figure 4-14-1 Evaluation of Planning (QCD)
Analyzed data                                        a    b   c    N   No answer
120_Evaluation of planning (Cost)                   537   66   0  603   1,171
121_Evaluation of planning (Quality)                486   59  35  580   1,194
122_Evaluation of planning (Development schedule)   544   54   5  603   1,171
* Alternatives a, b, and c:
[120_Evaluation of planning (Cost)]
a: The basis of cost estimation is clear and feasibility was confirmed.
b: The basis of cost estimation is unclear or feasibility was not confirmed.
c: No planning.
[121_Evaluation of planning (Quality)]
a: Quality objectives are clear and feasibility was confirmed.
b: Quality objectives are unclear or feasibility was not confirmed.
c: No planning.
[122_Evaluation of planning (Schedule)]
a: The basis of schedule planning is clear and feasibility was confirmed.
b: The basis of the development schedule is unclear or feasibility was not confirmed.
c: No planning.
Among projects that offered evaluation data, 80 to 90% evaluated that they had a clear basis for cost estimation, quality objectives, and schedule planning, and that they confirmed the feasibility of these planning elements.
[Bar chart (Figure 4-14-1): Evaluation of planning (Cost), (Quality), and (Development schedule)]
Figure 4-14-2 Evaluation of Actual Results (QCD)
Analyzed data                                        a    b    c   d   e    N   No answer
123_Evaluation of results (Cost)                    96  549   57  14  27  743   1,031
124_Evaluation of results (Quality)                 51  303   67  20  17  458   1,316
125_Evaluation of results (Development schedule)    24  581   29  37  70  741   1,033
* Alternatives a to e:
[123_Evaluation of results (Cost)]
a: Actual cost is less than planned cost by 10% or more.
b: Actual cost is close to planned cost, with an error smaller than ±10%.
c: Actual cost is larger than planned cost with an error of 30% or less.
d: Actual cost is larger than planned cost with an error of 50% or less.
e: Actual cost is larger than planned cost with an error larger than 50%.
[124_Evaluation of results (Quality)]
a: The number of defects identified after system cutover (hereafter Nd) is less than the planned value by 20% or more.
b: Nd is less than the planned value.
c: Nd is larger than the planned value by 50% or less.
d: Nd is larger than the planned value by 100% or less.
e: Nd is larger than the planned value by more than 100%.
[125_Evaluation of results (Schedule)]
a: The project finished before the planned delivery date.
b: The project finished on schedule.
c: The project finished with a delay of less than 10 days.
d: The project finished with a delay of less than 30 days.
e: The project finished with a delay of 30 days or more.
More than 80% of the projects finished nearly as planned in terms of cost and schedule (alternatives a and b). Projects that more or less accomplished the planned quality take up nearly 80% (alternatives a and b).
Figure 4-14-3 Self-Evaluation of Project Success
Figure 4-14-4 Subjective Evaluation of Customer Satisfaction by the Vendor
N = 279 (No-reply count: 1,495)
* Analyzed data item: 116_Project success_Self-evaluation
N = 298 (No-reply count: 1,476)
* Analyzed data item: 117_Subjective evaluation of customer satisfaction
Projects that evaluated themselves as “successful in terms of quality, cost, and delivery date (QCD)” take up 60%. Projects that evaluated themselves as “successful in terms of two elements of QCD” take up nearly 30%.
More than 60% of the projects subjectively evaluated that their customers were "almost satisfied". More than 20% of the projects subjectively evaluated that their customers were "fully satisfied".
[Pie chart (Figure 4-14-3): a: All the QCD elements are successful, 166 projects (59.5%); b: Two of the QCD elements are successful, 79 projects (28.3%); c: One of the QCD elements is successful, 26 projects (9.3%); d: None of the QCD elements is successful, 8 projects (2.9%)]
[Pie chart (Figure 4-14-4): a: The customer is fully satisfied, 74 projects (24.8%); b: The customer is almost satisfied, 183 projects (61.4%); c: The customer is dissatisfied at some points, 30 projects (10.1%); d: The customer is not at all satisfied, 11 projects (3.7%)]
[Bar chart (Figure 4-14-2): Evaluation of results (Cost), (Quality), and (Development schedule)]
5 Statistics of Major Project Elements
5.1 Adoption Conventions for Chapter 5
This chapter presents project counts and project frequency distributions with respect to size, development schedule, effort, and the number of staff per month, stratified by project type, industry type, architecture, and business type, thereby showing the whole picture of the collected project data.
The project data used to present statistics of major project elements in Chapter 5 is sampled using the same criteria as the project data used for Chapter 6 and later. When you see graphs or tables in Chapter 6 or later and want to see the distribution of the data they are based on, Chapter 5 is the place to look.

5.1.1 Analyzed Data
Analyzed projects
Chapter 5 analyzes the projects that went through the five phases from basic design to system test. In other words, these projects assigned the label "existed" to all five data items from 5108_Phase existence_Basic design to 5112_Phase existence_System test. These projects belong to 5-phase type 1 or type 2 as shown in Figure 4-13-1 in Chapter 4.
Scope of per-project-type project size
This White Paper uses the following conventions when analyzing the size of a project in terms of FP size or SLOC size. The same conventions are applied irrespective of the project type.
• The size of a project of the development type equals the FP or SLOC size of the whole system developed by the project.
• The size of a project of the enhancement type equals the FP or SLOC size of the enhanced part of the system. The enhanced part consists of additional functionality, modified functionality, and/or removed functionality. Thus, the size of an enhancement project excludes the size of the existing system to which additions, modifications, and/or removals were made.
• The above enhancement size definition applies whether the enhancement project size is presented on its own or together with the sizes of development projects.
The size of a project of the enhancement type is calculated from the size of the enhanced (added and/or modified) part of the system developed by the project. That is, the size of an enhancement project does not equal the size of the whole enhanced system. (In this White Paper, no calculation is made to determine the size of the whole enhanced system of any kind.) Note that small size values of enhancement projects do not always indicate that their systems are small.
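The enhancement-size convention above can be written out as a one-line calculation; the function and argument names are illustrative assumptions.

```python
def enhancement_size(added, modified, removed):
    """Size of an enhancement project: only the enhanced part counts
    (added + modified + removed functionality); the existing base
    system is excluded. Works for FP or SLOC values alike."""
    return added + modified + removed

# 120 FP added, 60 FP modified, 20 FP removed: project size 200 FP,
# however large the base system may be.
print(enhancement_size(120, 60, 20))  # 200
```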
FP (Function Point) size
This White Paper uses the unadjusted FP size (5001_FP size_unadjusted) and only the data for which the FP measurement method is explicitly stated. The data sets used for this White Paper include FP values measured with different FP measurement methods, unless otherwise noted.
The "FP size" refers to the size of a system measured in function points. In some cases, the unit KFP is used to represent function points in multiples of one thousand FPs.
SLOC (Source Lines of Code) size
"SLOC" stands for Source Lines of Code. The unit KSLOC is used to represent source lines of code in multiples of one thousand SLOCs. The total number of source code lines is calculated in SLOCs even when the lines are written in different programming languages. This White Paper uses SLOC values of source code only if the name of the programming language is explicitly stated. Source code is written in different programming languages, unless otherwise noted. Refer to the derived indicators in Appendix A.4 for the definition of size calculation for project types of development and enhancement.
Effort
The effort of a project is the total amount of actual effort, in person-hours, that the project used through the five phases from basic design to system test, including effort for in-house tasks (development, management, and tasks that do not fall into any defined category) and effort for outsourced tasks. The derived indicator "Actual effort (Major-development phases)" in Appendix A.4 describes more details. This White Paper presents effort values in person-hours.
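The effort definition above, in-house effort across its task categories plus outsourced effort, can be sketched as follows; the dict layout and field names are illustrative assumptions.

```python
def actual_effort_total(in_house, outsourced):
    """Actual effort (Major-development phases): in-house effort over
    its task categories (development, management, uncategorized) plus
    outsourced effort, all in person-hours from basic design through
    system test. The dict layout is an illustrative assumption."""
    return sum(in_house.values()) + outsourced

print(actual_effort_total(
    {"development": 5200, "management": 640, "other": 160}, 3200))  # 9200
```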
Development schedule
The duration of a project is calculated from the actual completion date of the system test phase and the actual beginning date of the basic design phase. The derived indicator "Actual months (Major-development phases)" in Appendix A.4 describes more details. This White Paper presents schedule duration in months.
Number of staff per month
The number of staff per month is calculated from the actual effort (Major-development phases), the actual months (Major-development phases), and the conversion ratio between person-months and person-hours. The derived indicator "Number of staff per month" in Appendix A.4 describes more details. This White Paper presents the number of staff per month in persons per month.

5.1.2 Stratification Scheme
The following sections of Chapter 5 present the FP size, SLOC size, development schedule, effort, and number of staff per month by using project count tables, frequency distribution histograms, and basic statistics tables based on the stratification scheme illustrated in Figure 5-1-1. The scheme stratifies project data in a hierarchical manner as described below.
(1) The analysis presented first is the project count, distribution, and basic statistics of all projects, followed by those of each project type (all project types, development, and enhancement). The "enhancement" type integrates the "maintenance/support" type and the "enhancement" type into one.
(2) The analysis that is presented secondly is the project count, distribution, and basic statistics of each project type (development and enhancement) formatted on a per industry type basis (F: manufacturing, H: information and telecommunications, J: wholesale/retail trade, K: finance/insurance, and R: public service). These industry types are encoded as 201_Industry type (Major type) and they are the top five types most frequently found in the collected project data as shown in Figure 4-3-1.
(3) The analysis presented next is the project count, distribution, and basic statistics of each project type (development and enhancement) formatted on a per-architecture basis (a: stand-alone, b: mainframe, c: 2-layer client/server, d: 3-layer client/server, and e: intranet/Internet).
(4) The analysis that is presented next is the project count, distribution, and basic statistics of each project type (development and enhancement) formatted on a per business type basis (b: accounting, c: sales, f: management, i: technology/control, k: ordering/inventory, o: customer management, and s: information analysis). These business types are encoded as “202_Business type” and they are the top seven types frequently found in the collected project data as shown in Figure 4-3-2.
Some kinds of business-characteristics data came from a relatively small number of projects. When presenting such data, some information is omitted. A frequency table is presented for each business type in any case. If the number of projects is 10 or more, the full set of data about the projects is presented; if the number of projects is 9 or less, only a table of basic statistics is presented, with the distribution graph omitted. If most of the project counts to be listed in the table of basic statistics are 9 or less, only project counts are presented.
Note that Chapter 6 and later do not include the type of business in the sampling criteria, because only a small amount of business type data is available.
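The hierarchical stratification described in (1) to (4) above amounts to grouping projects by (project type, second key); a minimal sketch with illustrative field names and toy data:

```python
from collections import defaultdict

# Toy project records; field names are illustrative, not the White
# Paper's data-item codes.
projects = [
    {"type": "development", "industry": "K", "fp": 769},
    {"type": "development", "industry": "F", "fp": 483},
    {"type": "enhancement", "industry": "K", "fp": 183},
    {"type": "development", "industry": "K", "fp": 347},
]

# Stratify hierarchically: project type first, then industry type
# (the second key could equally be architecture or business type).
strata = defaultdict(list)
for p in projects:
    strata[(p["type"], p["industry"])].append(p["fp"])

for (ptype, industry), sizes in sorted(strata.items()):
    print(ptype, industry, "N =", len(sizes))
```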
5.1.3 Distribution
Histograms are used to present the frequency distributions.
The table format shown in Table 5-1-2 is used to present basic statistics in terms of the frequency (column N), minimum, 25th percentile (P25), median, 75th percentile (P75), maximum, mean, and standard deviation (S.D.).

Figure 5-1-1 Stratification Scheme and Analyzed Elements
Table 5-1-2 Basic Statistics Presentation Format
N Min P25 Med P75 Max Mean S.D.
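The statistics row of Table 5-1-2 can be reproduced with the Python standard library. The percentile interpolation method used by the White Paper is not stated here, so the default ("exclusive") method of statistics.quantiles is an assumption.

```python
import statistics

def basic_stats(values):
    """One row of the basic-statistics table: N, Min, P25, Med, P75,
    Max, Mean, S.D. The percentile method ('exclusive', the default
    of statistics.quantiles) is an assumption, not the White Paper's
    documented rule."""
    p25, med, p75 = statistics.quantiles(values, n=4)
    return {"N": len(values), "Min": min(values), "P25": p25,
            "Med": med, "P75": p75, "Max": max(values),
            "Mean": statistics.mean(values),
            "S.D.": statistics.stdev(values)}

print(basic_stats([13, 235, 539, 1309, 14545]))
```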
[Diagram (Figure 5-1-1): the analyzed elements FP size and SLOC size are presented for all project types, development, and enhancement, and then stratified by industry type, architecture type, and business type; effort, development schedule, and number of staff per month are stratified in the same way for development and enhancement]
5.2 FP Size

5.2.1 Per-Project-Type FP Size
This section presents the per-project-type FP size distribution of the projects that measured their size in FPs.
Among all projects concerned (451 projects), projects of 100 to 200 FPs take up 16% (72 projects). Projects of 1,000 FPs or less take up a major portion, 69% (309 projects).
Projects of the "development" type take up 75% (340 projects), and those of the "maintenance/support" type take up 17% (75 projects). These two types add up to 92%, while those of the "redevelopment" type take up 4%, as do those of the "enhancement" type.
Among projects of the “development” type (340 projects), projects of 200 to 400 FPs take up the largest portion of 18% (60 projects). Development projects of 3,000 FPs or more take up 9% (32 projects). Thus, the FP size spreads widely from low values to high values.
Among projects of the "enhancement" type (93 projects), projects of 200 FPs or less take up the largest portion, 43% (40 projects). "Enhancement" projects have a median of 242 FPs, smaller than the 677 FP median of "development" projects. Note that small FP values of "enhancement" projects do not indicate that their systems are small. As described in section 5.1, the size of enhancement projects does not include the size of the base system to which the enhancement was made.

Table 5-2-1 Per-Project-Type FP Size Project Count
103_Project type  105_Project category  Number of Projects
a: Development (340 projects)
  a: Commercial package development  14
  b: Entrusted development  318
  c: For in-house use  3
  d: Prototyping  4
  e: Other  1
b: Maintenance/support (75 projects)
  a: Commercial package development  6
  b: Entrusted development  67
  c: For in-house use  2
  d: Prototyping  0
  e: Other  0
c: Redevelopment (18 projects)
  a: Commercial package development  3
  b: Entrusted development  15
  c: For in-house use  0
  d: Prototyping  0
  e: Other  0
d: Enhancement (18 projects)
  a: Commercial package development  1
  b: Entrusted development  12
  c: For in-house use  0
  d: Prototyping  0
  e: Other  5
Total  451
Table 5-2-2 Per-FP-Measurement-Method FP Size Project Count
701_Primary FP measurement method (actual) Number of Projects
a: IFPUG  171
b: SPR  89
c: NESMA estimated  41
f: Other  150
Total  451
Figure 5-2-3 FP Size Distribution
Figure 5-2-4 Per-Project-Type FP Size Distribution
Table 5-2-5 Per-Project-Type FP Size Basic Statistics
(unit: FP)
Project Type  N  Min  P25  Med  P75  Max  Mean  S.D.
All project types  451  13  235  539  1,309  14,545  1,171  1,818
a: Development  340  21  300  677  1,503  14,545  1,326  2,001
b: Maintenance/support  75  17  146  183  397  2,901  404  569
c: Redevelopment  18  163  595  1,108  1,797  4,914  1,435  1,228
d: Enhancement  18  13  359  606  1,482  4,500  1,164  1,281
[Histogram (Figure 5-2-3): x-axis Actual FP Size (unadjusted), rightmost bin over 2,000; y-axis Number of Projects]
[Histogram by project type (a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement): x-axis Actual FP Size (unadjusted), rightmost bin over 3,000; y-axis Number of Projects]
Figure 5-2-6 FP Size Distribution (Development, Mixed FP Measurement Methods)
Figure 5-2-7 FP Size Distribution (Development, IFPUG Group)
Table 5-2-8 Per-FP-Measurement-Method FP Size Basic Statistics (Development)
(unit: FP)
FP Size Measurement Method  N  Min  P25  Med  P75  Max  Mean  S.D.
Mixed FP size measurement methods  340  21  300  677  1,503  14,545  1,326  2,001
IFPUG group  218  21  334  776  1,767  14,545  1,615  2,266
[Histograms (Figures 5-2-6 and 5-2-7): x-axis Actual FP Size (unadjusted), rightmost bin over 5,000; y-axis Number of Projects]
Figure 5-2-9 FP Size Distribution (Enhancement, Mixed FP Measurement Methods)
Figure 5-2-10 FP Size Distribution (Enhancement, IFPUG Group)
Table 5-2-11 Per-FP-Measurement-Method FP Size Basic Statistics (Enhancement)
(unit: FP)
FP Size Measurement Method  N  Min  P25  Med  P75  Max  Mean  S.D.
Mixed FP size measurement methods  93  13  154  242  508  4,500  551  809
IFPUG group  65  13  171  319  641  4,500  665  883
[Histograms (Figures 5-2-9 and 5-2-10): x-axis Actual FP Size (unadjusted), rightmost bin over 2,000; y-axis Number of Projects]
5.2.2 Per-Industry-Type FP Size
This section presents the per-industry-type FP size distribution of the projects that measured their size in FPs. The distribution is presented for different project types separately. This analysis is made for the top five “major industry types” (manufacturing, information and telecommunications, wholesale/retail trade, finance/insurance, and public service), which are most frequently found in the collected project data.
Development projects
Table 5-2-12 lists the per-industry-type project count of "development" projects measured in FPs. Figure 5-2-13 illustrates the FP size distribution for each of the five major industry types, and Table 5-2-14 lists the FP size basic statistics for each of them. The "finance/insurance" type takes up the largest portion (86 projects), followed by "manufacturing" (66 projects), "service" (33 projects), "information and telecommunications" (31 projects), "wholesale/retail trade" (30 projects), and "public service" (14 projects). Among the whole development type (340 projects), the five industry types (excluding the "service" industry) take up 67% (227 projects).

Table 5-2-12 Per-Industry-Type FP-Size Project Count
(Development, Mixed FP measurement methods)
201_Industry Type (Major Type) Number of Projects
E: Construction  5
F: Manufacturing  66
G: Electricity, gas, heat supply and water  5
H: Information and communications  31
I: Transport  11
J: Wholesale and retail trade  30
K: Finance and insurance  86
L: Real estate  8
M: Eating and drinking places, accommodations  6
N: Medical, health care and welfare  7
Q: Services, N.E.C.  33
R: Government, N.E.C.  14
S: Industries unable to classify  4
No answer  34
Total  340
Figure 5-2-13 Per-Industry-Type FP Size Distribution
(Development, Mixed FP measurement methods)
Table 5-2-14 Per-Industry-Type FP Size Basic Statistics
(Development, Mixed FP measurement methods) (unit: FP)
Industry Type (Major Type)  N  Min  P25  Med  P75  Max  Mean  S.D.
F: Manufacturing  66  21  224  483  1,014  13,080  1,123  2,190
H: Information and communications  31  57  187  397  709  2,248  583  545
J: Wholesale and retail trade  30  236  668  1,096  2,154  11,670  1,829  2,262
K: Finance and insurance  86  61  347  769  1,942  14,545  1,597  2,264
R: Government, N.E.C.  14  45  267  343  661  7,770  970  1,996
[Histogram (Figure 5-2-13) by industry type (F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government): x-axis Actual FP Size (unadjusted), rightmost bin over 2,000; y-axis Number of Projects]
Enhancement projects
Table 5-2-15 lists the per-industry-type project count of "enhancement" projects measured in FPs. Figure 5-2-16 illustrates the FP size distribution for each of the five major industry types, and Table 5-2-17 lists the FP size basic statistics for each of them. The "finance/insurance" type takes up the largest portion (37 projects), followed by "wholesale/retail trade" (7 projects), "manufacturing" (6 projects), "public service" (6 projects), and "information and telecommunications" (2 projects). Among the whole enhancement type (93 projects), the five industry types take up 62% (58 projects). The "enhancement" projects generally have lower median values than the "development" projects.

Table 5-2-15 Per-Industry-Type FP-Size Project Count
(Enhancement, Mixed FP measurement methods)
201_Industry Type (Major Type) Number of Projects
F: Manufacturing  6
H: Information and communications  2
J: Wholesale and retail trade  7
K: Finance and insurance  37
L: Real estate  1
N: Medical, health care and welfare  2
Q: Services, N.E.C.  1
R: Government, N.E.C.  6
S: Industries unable to classify  2
No answer  29
Total  93
Figure 5-2-16 Per-Industry-Type FP Size Distribution
(Enhancement, Mixed FP measurement methods)
Table 5-2-17 Per-Industry-Type FP Size Basic Statistics
(Enhancement, Mixed FP measurement methods) (unit: FP)
Industry Type (Major Type)  N  Min  P25  Med  P75  Max  Mean  S.D.
F: Manufacturing  6  146  297  545  1,653  2,901  1,062  1,125
H: Information and communications  2  219  -  410  -  600  410  269
J: Wholesale and retail trade  7  182  252  307  1,317  4,143  1,110  1,540
K: Finance and insurance  37  17  109  183  388  4,500  447  829
R: Government, N.E.C.  6  130  175  301  448  1,742  517  615
[Histogram (Figure 5-2-16) by industry type (F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government): x-axis Actual FP Size (unadjusted), rightmost bin over 2,000; y-axis Number of Projects]
5.2.3 Per-Architecture FP Size
This section presents the per-architecture FP size distribution of “development” projects and the per-architecture FP size basic statistics of these projects. This section also presents those of “enhancement” projects separately.
Development projects
Among the whole "development" projects concerned, the "2-layer client/server", "3-layer client/server", and "intranet/Internet" types collectively take up a major portion of 88% (298 projects). The "intranet/Internet" type spreads into the large-FP-size area.

Table 5-2-18 Per-Architecture-Type FP-Size Project Count
(Development, Mixed FP measurement methods)
308_Architecture Number of Projects
a : Stand-alone 23
b : Mainframe 13
c : 2-layer client/server 94
d : 3-layer client/server 90
e : Intranet/Internet 114
f : Other 4
No answer 2
Total 340
Figure 5-2-19 Per-Architecture-Type FP Size Distribution
(Development, Mixed FP measurement methods)
Table 5-2-20 Per-Architecture-Type FP Size Basic Statistics
(Development, Mixed FP measurement methods) (unit: FP)
Architecture N Min P25 Med P75 Max Mean S.D.
a : Stand-alone 23 21 161 282 580 2,626 467 559
b : Mainframe 13 57 125 203 890 2,938 656 853
c : 2-layer client/server 94 61 290 543 1,494 11,724 1,013 1,443
d : 3-layer client/server 90 69 313 630 1,208 6,428 1,086 1,328
e : Intranet/Internet 114 62 478 941 1,960 14,545 2,031 2,805
[Histogram omitted: x-axis "Actual FP Size (unadjusted)", final bin "Over 2,000"; y-axis "Number of Projects"; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
5. Statistics of Major Project Elements
Enhancement projects
"Enhancement" projects of the "mainframe" type take up a large portion of 31% (29 projects), with a median smaller than that of any other architecture type.

Table 5-2-21 Per-Architecture-Type FP-Size Project Count
(Enhancement, Mixed FP measurement methods)
308_Architecture Number of Projects
a : Stand-alone 19
b : Mainframe 29
c : 2-layer client/server 16
d : 3-layer client/server 5
e : Intranet/Internet 24
No answer 0
Total 93
Figure 5-2-22 Per-Architecture-Type FP Size Distribution
(Enhancement, Mixed FP measurement methods)
Table 5-2-23 Per-Architecture-Type FP Size Basic Statistics
(Enhancement, Mixed FP measurement methods) (unit: FP)
Architecture N Min P25 Med P75 Max Mean S.D.
a : Stand-alone 19 98 148 179 412 2,000 357 434
b : Mainframe 29 17 88 164 264 2,563 273 458
c : 2-layer client/server 16 130 164 359 1,009 2,310 646 644
d : 3-layer client/server 5 352 436 612 1,513 1,742 931 648
e : Intranet/Internet 24 13 180 360 805 4,500 897 1,259
[Histogram omitted: x-axis "Actual FP Size (unadjusted)", final bin "Over 2,000"; y-axis "Number of Projects"; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
5.2.4 Per-Business-Type FP Size
This section presents the per-business-type FP size distribution of "development" projects and the per-business-type FP size basic statistics of these projects. This section also presents those of "enhancement" projects separately. Where too few projects provide business-characteristics data for a given analysis, only the project count is presented, and the distribution graphs and basic statistics tables are omitted.
Development projects
Among the whole "development" type (340 projects), projects belonging to any of the top four business types ("sales", "accounting", "general management", and "ordering/inventory") take up 32% (110 projects). These business types are the most frequent in the collected data. Projects of the type "other" and those with no answer to the business type inquiry add up to as much as 43% (146 projects).
Enhancement projects
The "enhancement" projects amount to 93, far fewer than the development projects. Among the 93 projects, those of the type "other" and those with no answer to the business type inquiry add up to 68% (63 projects), a larger portion than among development projects. Because each business type has fewer than ten samples of business-characteristics data, this section omits the basic statistics of per-business-type FP size for enhancement projects.

Table 5-2-24 Per-Business-Type FP-Size Project Count (Development, Mixed FP measurement methods)
Table 5-2-25 Per-Business-Type FP-Size Project Count (Enhancement, Mixed FP measurement methods)
202_Business type (Development) Number of Projects
a : Management/planning 3
b : Accounting 23
c : Sales 46
d : Production/distribution 11
e : Personnel/welfare 8
f : General management 19
g : General affairs 1
h : Research/development 2
j : Master management 3
k : Ordering/inventory 22
l : Distribution management 1
n : Contract/transfer 12
o : Customer management 12
p : Product planning (per-product) 6
q : Product management (per-product) 5
r : Facility (stores) 2
s : Information analysis 18
t : Other 61
No answer 85
Total 340

202_Business type (Enhancement) Number of Projects
b : Accounting 1
c : Sales 8
d : Production/distribution 2
e : Personnel/welfare 1
f : General management 5
h : Research/development 1
i : Technology/control 1
j : Master management 1
k : Ordering/inventory 2
l : Distribution management 1
n : Contract/transfer 5
o : Customer management 1
q : Product management (per-product) 1
t : Other 6
No answer 57
Total 93
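The small-sample rule applied above (report only a project count when a business type has fewer than ten data points) can be sketched as follows. The `(business_type, fp_size)` record layout is hypothetical, not the actual SEC data format.

```python
from collections import defaultdict

MIN_N = 10  # below this, only a count is reported, no basic statistics

def split_by_sample_size(records):
    """Group FP sizes by business type; return (groups large enough to
    summarize, count-only groups). `records` is a list of hypothetical
    (business_type, fp_size) pairs."""
    groups = defaultdict(list)
    for business_type, fp_size in records:
        groups[business_type].append(fp_size)
    summarizable = {k: v for k, v in groups.items() if len(v) >= MIN_N}
    count_only = {k: len(v) for k, v in groups.items() if len(v) < MIN_N}
    return summarizable, count_only
```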
Table 5-2-26 Per-Business-Type FP Size Basic Statistics
(Development, Mixed FP measurement methods) (unit: FP)
Business Type N Min P25 Med P75 Max Mean S.D.
b : Accounting 23 62 684 1,028 1,584 6,428 1,471 1,516
c : Sales 46 82 353 858 2,014 6,318 1,417 1,525
f : General management 19 117 210 464 1,802 2,938 1,003 1,027
k : Ordering/inventory 22 98 421 771 1,851 11,670 1,728 2,535
o : Customer management 12 136 471 769 1,096 13,080 2,135 3,773
s : Information analysis 18 106 334 665 1,229 7,945 1,383 1,984
5.3 SLOC Size
5.3.1 Per-Project-Type SLOC Size
This section presents the per-project-type SLOC size distribution and basic statistics of the projects that measured their size in SLOCs. The analysis in this section does not distinguish between programming languages: when a project used several languages, its size is presented as the total SLOC across all of them.
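As a sketch of that convention, a project recording SLOC for several languages contributes a single total, here converted to KSLOC. The per-language figures are hypothetical.

```python
def total_ksloc(per_language_sloc):
    """Collapse a project's per-language SLOC counts into the single total
    KSLOC value used in this section (languages are not distinguished)."""
    return sum(per_language_sloc.values()) / 1000.0

# Hypothetical project: COBOL batch code plus a Java front end
print(total_ksloc({"COBOL": 120_000, "Java": 35_000, "SQL": 8_000}))  # 163.0
```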
Among all projects concerned (668 projects), projects of 50 KSLOCs or less take up 49% (326 projects). Projects of 10 KSLOCs or less take up the largest portion, 14% (96 projects), and larger SLOC sizes account for progressively smaller portions.
Among the whole "development" type (354 projects), projects of 50 KSLOCs or less take up 40% (140 projects). Projects of 20 to 30 KSLOCs take up the largest portion, about 10% (34 projects). Projects of 10 KSLOCs or less take up 6 to 8%, and so do those of 10 to 20 KSLOCs and those of 30 to 40 KSLOCs.
Among the whole "enhancement" type (279 projects), projects of 50 KSLOCs or less take up 62% (171 projects). Projects of 10 KSLOCs or less take up the largest portion, 23% (63 projects), and the project count decreases as the SLOC value increases.
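The "X KSLOCs or less take up Y%" figures quoted above are simple threshold shares; a sketch with made-up sizes:

```python
def share_at_or_below(sizes, threshold):
    """Count projects at or below `threshold` and their share of the whole,
    rounded to a whole percent as in the text (e.g. '50 KSLOCs or less: 49%')."""
    n = sum(1 for s in sizes if s <= threshold)
    return n, round(100.0 * n / len(sizes))

sizes = [5.2, 12.0, 48.9, 50.0, 75.3, 210.0]  # made-up KSLOC values
print(share_at_or_below(sizes, 50.0))  # (4, 67)
```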
Projects of the "development" type have the largest median value of 90.8 KSLOCs, followed by those of "redevelopment" (57.8 KSLOCs), "maintenance/support" (43.7 KSLOCs), and "enhancement" (24.0 KSLOCs). The enhancement data set analyzed later in this section (279 projects) has a median of 35.0 KSLOCs, still smaller than that of "development" projects.

Table 5-3-1 Per-Project-Type SLOC Size Project Count
Figure 5-3-2 Per-Primary-Programming-Language SLOC Size Project Count
103_Project type / 105_Project category Number of Projects
a : Development 354
    a : Commercial package development 19
    b : Entrusted development 329
    c : For in-house use 3
    d : Prototyping 0
    e : Other 3
b : Maintenance/support 170
    a : Commercial package development 4
    b : Entrusted development 165
    c : For in-house use 0
    d : Prototyping 0
    e : Other 1
c : Redevelopment 35
    a : Commercial package development 3
    b : Entrusted development 32
    c : For in-house use 0
    d : Prototyping 0
    e : Other 0
d : Enhancement 109
    a : Commercial package development 11
    b : Entrusted development 96
    c : For in-house use 2
    d : Prototyping 0
    e : Other 0
Total 668

312_Primary programming language Number of Projects
b : COBOL 152
c : PL/I 1
d : Pro*C 4
e : C++ 27
f : Visual C++ 32
g : C 93
h : VB 84
i : Excel (VBA) 1
k : Developer2000 5
m : PL/SQL 3
n : ABAP 3
o : C# 20
p : Visual Basic.NET 18
q : Java 147
r : Perl 1
t : Delphi 4
u : HTML 4
w : Other 69
Total 668
72 IPA/SEC White Paper 2007 on Software Development Projects in Japan
Figure 5-3-3 SLOC Size Distribution (Mixed Primary Programming Languages)
The following bar graph magnifies the lowest part of the SLOC values shown in the above bar graph.
Figure 5-3-4 Per-Project-Type SLOC Size Distribution
(Mixed Primary Programming Languages)
Table 5-3-5 Per-Project-Type SLOC Size Basic Statistics
(Mixed Primary Programming Languages) (unit: KSLOC)
Project Type N Min P25 Med P75 Max Mean S.D.
All project types 668 0.02 20.9 53.7 192.2 12,100.0 211.6 636.2
a : Development 354 0.51 30.6 90.8 251.2 12,100.0 291.3 837.6
b : Maintenance/support 170 0.02 11.6 43.7 189.6 1,940.0 155.4 279.0
c : Redevelopment 35 1.96 31.4 57.8 118.5 1,201.0 157.4 279.2
d : Enhancement 109 0.50 11.0 24.0 54.1 700.0 58.0 97.3
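The histograms behind Figures 5-3-3 and 5-3-4 bin projects into fixed-width KSLOC intervals with a final open-ended "over" bucket. A sketch of that binning, with the edge convention ("or less" includes the upper edge) assumed rather than documented:

```python
import math

def histogram_bins(ksloc_values, width=10.0, cutoff=200.0):
    """Counts per `width`-KSLOC interval up to `cutoff`, plus an 'over cutoff'
    bucket, mirroring the '(200-KSLOCs or less, 10-KSLOC intervals)' panels.
    A value exactly on a bin's upper edge counts as that bin ('or less')."""
    n_bins = int(cutoff / width)
    counts = [0] * n_bins
    over = 0
    for v in ksloc_values:
        if v > cutoff:
            over += 1
        else:
            counts[max(0, math.ceil(v / width) - 1)] += 1
    return counts, over

counts, over = histogram_bins([0.5, 9.9, 10.0, 10.1, 199.0, 250.0])
print(counts[0], counts[1], over)  # 3 1 1
```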
[Histogram omitted: x-axis "Actual Net SLOC Size [KSLOC]", final bin "Over 500"; y-axis "Number of Projects"; series: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement]
[Histograms omitted: x-axis "Actual Net SLOC Size [KSLOC]"; y-axis "Number of Projects"; panels: "(200-KSLOCs or less, 10-KSLOC intervals) N = 505" and "(Full range, 50-KSLOC intervals)" with final bin "Over 1,500"]
Figure 5-3-6 SLOC Size Distribution (Development, Mixed Primary Programming Languages)
The following bar graph magnifies the lowest part of the SLOC values shown in the above bar graph.
Table 5-3-7 SLOC Size Basic Statistics (Development)
(unit: KSLOC)
Primary Programming Language N Min P25 Med P75 Max Mean S.D.
Mixed primary programming languages 354 0.5 30.6 90.8 251.2 12,100.0 291.3 837.6
b : COBOL 76 5.8 57.2 171.5 432.2 12,100.0 588.2 1,590.6
g : C 48 0.5 34.7 70.5 243.5 2,653.6 277.4 546.1
h : VB 47 3.2 20.6 101.3 255.2 1,710.0 221.5 351.2
q : Java 84 1.9 27.4 59.6 252.7 3,866.0 254.0 551.7
[Histograms omitted: x-axis "Actual Net SLOC Size [KSLOC]"; y-axis "Number of Projects"; panels: "(200-KSLOCs or less, 10-KSLOC intervals) N = 245" and "(Full range, 50-KSLOC intervals)" with final bin "Over 1,000"]
Figure 5-3-8 SLOC Size Distribution (Enhancement, Mixed Primary Programming Languages)
The following bar graph magnifies the lowest part of the SLOC values shown in the above bar graph.
Table 5-3-9 SLOC Size Basic Statistics (Enhancement)
(unit: KSLOC)
Primary Programming Language N Min P25 Med P75 Max Mean S.D.
Mixed primary programming languages 279 0.0 11.6 35.0 106.0 1,940.0 117.3 230.8
b : COBOL 61 3.4 18.2 40.9 265.7 1,940.0 194.1 341.1
g : C 43 1.0 12.2 27.5 58.9 1,272.0 86.8 202.1
h : VB 32 0.1 2.3 25.9 88.0 327.8 67.0 95.7
q : Java 58 0.5 10.5 34.5 83.0 650.0 91.9 154.0
[Histograms omitted: x-axis "Actual Net SLOC Size_Enhancement [KSLOC]"; y-axis "Number of Projects"; panels: "(200-KSLOCs or less, 10-KSLOC intervals) N = 230" and "(Full range, 50-KSLOC intervals)" with final bin "Over 1,000"]
5.3.2 Per-Industry-Type SLOC Size
This section presents the per-industry-type SLOC size distribution of development projects that measured their size in SLOCs. This section also presents those of enhancement projects separately. This analysis is made for the top five “major industry types” (manufacturing, information and telecommunications, wholesale/retail trade, finance/insurance, and public service), which are most frequently found in the collected project data.
Development projects
Table 5-3-10 lists the per-industry-type SLOC-size project count of the "development" type. Figure 5-3-11 illustrates the per-industry-type SLOC size distribution of the development projects, and Table 5-3-12 lists the per-industry-type SLOC size basic statistics of the projects.
Among the “manufacturing” type (49 projects), projects of 50 KSLOCs or less take up 43% (21 projects). Among the “information and telecommunications” type (43 projects), projects of 50 KSLOCs or less take up 67%
(29 projects). Among the “wholesale/retail trade” type (39 projects), projects of 100 KSLOCs or less take up 49% (19 projects)
and the number of projects of 50 KSLOCs or less is almost the same as that of 50 to 100 KSLOCs. Among the “finance/insurance” type (110 projects), projects of 150 KSLOCs or less take up 58% (64 projects). The
number of projects of 50 to 100 KSLOCs is less than half of that of 50 KSLOCs or less, and so is that of 100 to 150 KSLOCs.
Among the "public service" type (36 projects), projects of 50 KSLOCs or less take up 53% (19 projects).
Projects of the "wholesale/retail trade" and "finance/insurance" types have larger median values than those of "manufacturing", "information and telecommunications", or "public service". The median values are as follows: 123.0 KSLOCs ("wholesale/retail trade"), 120.6 KSLOCs ("finance/insurance"), 69.3 KSLOCs ("manufacturing"), 44.8 KSLOCs ("public service"), and 34.3 KSLOCs ("information and telecommunications"). Because the SLOC data mixes different programming languages, however, a simple comparison between these industry types is of limited value.

Table 5-3-10 Per-Industry-Type SLOC Size Project Count
(Development, Mixed Primary Programming Languages)
201_Industry Type (Major Type) Number of Projects
A : Agriculture 1
C : Fisheries 1
E : Construction 5
F : Manufacturing 49
G : Electricity, gas, heat supply and water 11
H : Information and communications 43
I : Transport 18
J : Wholesale and retail trade 39
K : Finance and insurance 110
L : Real estate 1
N : Medical, health care and welfare 6
O : Education, learning support 3
P : Compound services 2
Q : Services, N.E.C. 14
R : Government, N.E.C. 36
S : Industries unable to classify 15
Total 354
Figure 5-3-11 Per-Industry-Type SLOC Size Distribution (Development, Mixed Primary Programming Languages)
Table 5-3-12 Per-Industry-Type SLOC Size Basic Statistics
(Development, Mixed Primary Programming Languages) (unit: KSLOC)
Industry Type (Major Type) N Min P25 Med P75 Max Mean S.D.
F : Manufacturing 49 2.0 33.1 69.3 171.0 596.3 128.3 142.7
H : Information and communications 43 0.5 11.6 34.3 77.2 563.3 76.6 119.0
J : Wholesale and retail trade 39 3.2 50.1 123.0 226.0 2,653.6 281.0 533.2
K : Finance and insurance 110 2.8 40.5 120.6 365.8 12,100.0 416.2 1,215.9
R : Government, N.E.C. 36 1.9 24.7 44.8 192.0 2,200.2 229.5 463.3
Enhancement projects
Table 5-3-13 lists the per-industry-type SLOC-size project count of the "enhancement" type. Figure 5-3-14 illustrates the per-industry-type SLOC size distribution of the enhancement projects, and Table 5-3-15 lists the per-industry-type SLOC size basic statistics of the projects.
Among the "manufacturing" type (18 projects), projects of 50 KSLOCs or less take up 44% (8 projects). Among the "information and telecommunications" type (45 projects), projects of 50 KSLOCs or less take up 67% (30 projects). Among the "wholesale/retail trade" type (18 projects), projects of 50 KSLOCs or less take up 56% (10 projects). Among the "finance/insurance" type (94 projects), projects of 50 KSLOCs or less take up 63% (59 projects). Among the "public service" type (39 projects), projects of 50 KSLOCs or less take up 51% (20 projects).
Projects of the "manufacturing" type have the largest median value of 79.9 KSLOCs, followed by those of "public service" (48.5 KSLOCs), "wholesale/retail trade" (39.7 KSLOCs), "information and telecommunications" (31.6 KSLOCs), and "finance/insurance" (29.9 KSLOCs). This ordering differs from that of "development" projects. Because the SLOC data mixes different programming languages, however, a simple comparison between these industry types is of limited value.
[Histogram omitted: x-axis "Actual Net SLOC Size_Enhancement [KSLOC]", final bin "Over 500"; y-axis "Number of Projects"; series: F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government]
Table 5-3-13 Per-Industry-Type SLOC Size Project Count (Enhancement, Mixed Primary Programming Languages)
201_Industry Type (Major Type) Number of Projects
A : Agriculture 2
E : Construction 2
F : Manufacturing 18
G : Electricity, gas, heat supply and water 5
H : Information and communications 45
I : Transport 16
J : Wholesale and retail trade 18
K : Finance and insurance 94
L : Real estate 3
N : Medical, health care and welfare 8
O : Education, learning support 3
P : Compound services 4
Q : Services, N.E.C. 7
R : Government, N.E.C. 39
S : Industries unable to classify 15
Total 279
Figure 5-3-14 Per-Industry-Type SLOC Size Distribution
(Enhancement, Mixed Primary Programming Languages)
Table 5-3-15 Per-Industry-Type SLOC Size Basic Statistics
(Enhancement, Mixed Primary Programming Languages) (unit: KSLOC)
Industry Type (Major Type) N Min P25 Med P75 Max Mean S.D.
F : Manufacturing 18 2.3 20.8 79.9 207.9 620.0 137.7 164.1
H : Information and communications 45 0.4 11.0 31.6 93.7 1,023.5 91.7 186.7
J : Wholesale and retail trade 18 1.7 13.6 39.7 151.3 700.0 113.6 172.6
K : Finance and insurance 94 0.0 8.0 29.9 75.0 1,940.0 114.1 272.2
R : Government, N.E.C. 39 7.8 28.5 48.5 191.5 1,144.1 164.9 253.3
[Histogram omitted: x-axis "Actual Net SLOC Size_Enhancement [KSLOC]", final bin "Over 500"; y-axis "Number of Projects"; series: F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government]
5.3.3 Per-Architecture-Type SLOC Size
This section presents the per-architecture-type SLOC size distribution of “development” projects and the per-architecture-type SLOC size basic statistics of these projects. This section also presents those of “enhancement” projects separately.
Development projects
"Development" projects of the "intranet/Internet" type and those of the "2- or 3-layer client/server" types add up to 77% (273 projects). Among the whole "development" type (354 projects), projects of 50 KSLOCs or less take up 32% (115 projects) and those of 150 KSLOCs or less take up 53% (188 projects).

Table 5-3-16 Per-Architecture-Type SLOC Size Project Count
(Development, Mixed Primary Programming Languages)
308_Architecture Number of Projects
A : Stand-alone 13
B : Mainframe 19
C : 2-layer client/server 79
D : 3-layer client/server 56
E : Intranet/Internet 138
F : Other 13
No answer 36
Total 354
Figure 5-3-17 Per-Architecture-Type SLOC Size Distribution
(Development, Mixed Primary Programming Languages)
Table 5-3-18 Per-Architecture-Type SLOC Size Basic Statistics
(Development, Mixed Primary Programming Languages) (unit: KSLOC)
Architecture N Min P25 Med P75 Max Mean S.D.
A : Stand-alone 13 0.5 4.7 8.3 114.2 818.8 108.7 227.9
B : Mainframe 19 5.8 116.8 272.0 508.0 12,100.0 1,249.8 2,982.2
C : 2-layer client/server 79 3.2 26.0 70.0 180.5 2,212.0 214.5 419.5
D : 3-layer client/server 56 1.9 42.3 105.9 253.2 2,773.0 310.5 565.5
E : Intranet/Internet 138 2.8 33.3 113.1 270.0 3,866.0 270.5 502.0
[Histogram omitted: x-axis "Actual Net SLOC Size_Enhancement [KSLOC]", final bin "Over 500"; y-axis "Number of Projects"; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
Enhancement projects
"Enhancement" projects of the "2-layer client/server" type take up the largest portion of 25% (70 projects), and those of the "intranet/Internet" type take up the next largest of 23% (63 projects). Projects of the "intranet/Internet" type have the smallest median value.
Among the whole "enhancement" type (279 projects), projects of 50 KSLOCs or less take up 52% (146 projects).

Table 5-3-19 Per-Architecture-Type SLOC Size Project Count
(Enhancement, Mixed Primary Programming Languages)
308_Architecture Number of Projects
a : Stand-alone 19
b : Mainframe 19
c : 2-layer client/server 70
d : 3-layer client/server 54
e : Intranet/Internet 63
f : Other 15
No answer 39
Total 279
Figure 5-3-20 Per-Architecture-Type SLOC Size Distribution
(Enhancement, Mixed Primary Programming Languages)
Table 5-3-21 Per-Architecture-Type SLOC Size Basic Statistics
(Enhancement, Mixed Primary Programming Languages) (unit: KSLOC)
Architecture N Min P25 Med P75 Max Mean S.D.
a : Stand-alone 19 0.5 4.0 24.2 46.4 252.5 39.3 57.5
b : Mainframe 19 2.0 12.4 40.2 106.5 543.2 93.1 138.5
c : 2-layer client/server 70 0.1 17.7 38.1 120.5 1,144.1 132.2 225.0
d : 3-layer client/server 54 0.5 9.1 28.4 84.3 1,940.0 117.8 283.2
e : Intranet/Internet 63 1.0 8.0 23.8 55.1 700.0 63.6 125.7
[Histogram omitted: x-axis "Actual Net SLOC Size_Enhancement [KSLOC]", final bin "Over 500"; y-axis "Number of Projects"; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
5.3.4 Per-Business-Type SLOC Size
This section presents the per-business-type SLOC size distribution of "development" projects and the per-business-type SLOC size basic statistics of these projects. This section also presents those of "enhancement" projects separately. Where too few projects provide business-characteristics data for a given analysis, only the project count is presented, and the distribution graphs and basic statistics tables are omitted.
Development projects
Among the whole "development" type (354 projects), projects that fall into any of the seven business types highlighted in Table 5-3-22 take up 60% (212 projects). Projects of the type "other" and those with no answer to the business type inquiry add up to as much as 21% (73 projects).

Table 5-3-22 Per-Business-Type SLOC Size Project Count (Development, Mixed Primary Programming Languages)
Figure 5-3-23 Per-Business-Type SLOC Size Distribution (Development, Mixed Primary Programming Languages)
202_Business type Number of Projects
a : Management/planning 4
b : Accounting 28
c : Sales 45
d : Production/distribution 14
e : Personnel/welfare 4
f : General management 54
g : General affairs 5
h : Research/development 3
i : Technology/control 21
j : Master management 1
k : Ordering/inventory 22
l : Distribution management 8
m : Subcontractor management 1
n : Contract/transfer 13
o : Customer management 19
p : Product planning (per-product) 3
q : Product management (per-product) 10
r : Facility (stores) 3
s : Information analysis 23
t : Other 69
No answer 4
Total 354
Table 5-3-24 Per-Business-Type SLOC Size Basic Statistics
(Development, Mixed Primary Programming Languages) (unit: KSLOC)
Business Type N Min P25 Med P75 Max Mean S.D.
b : Accounting 28 12.8 82.0 170.2 265.1 12,100.0 769.5 2,341.2
c : Sales 45 2.8 27.0 50.1 265.2 1,757.0 258.6 451.7
f : General management 54 1.9 36.3 122.5 270.0 1,000.0 197.0 211.9
i : Technology/control 21 0.5 13.1 30.4 42.0 384.2 47.4 80.7
k : Ordering/inventory 22 7.8 28.5 54.0 167.5 2,653.6 321.8 715.0
o : Customer management 19 2.0 19.6 35.1 136.7 614.8 98.9 147.7
s : Information analysis 23 7.7 22.1 28.3 115.8 639.0 119.5 182.6
[Histogram omitted: x-axis "Actual Net SLOC Size_Enhancement [KSLOC]", final bin "Over 500"; y-axis "Number of Projects"; series: b: Accounting, c: Sales, f: General management, i: Technology/control, k: Ordering/inventory, o: Customer management, s: Information analysis]
Enhancement projects
Among the whole "enhancement" type (279 projects), projects that fall into any of the seven highlighted business types take up 51% (143 projects). Projects of the type "other" and those with no answer to the business type inquiry add up to as much as 24% (68 projects).

Table 5-3-25 Per-Business-Type SLOC Size Project Count (Enhancement, Mixed Primary Programming Languages)
Figure 5-3-26 Per-Business-Type SLOC Size Distribution (Enhancement, Mixed Primary Programming Languages)
202_Business type Number of Projects
a : Management/planning 2
b : Accounting 13
c : Sales 37
d : Production/distribution 10
e : Personnel/welfare 13
f : General management 45
g : General affairs 11
h : Research/development 4
i : Technology/control 10
j : Master management 8
k : Ordering/inventory 14
l : Distribution management 2
m : Subcontractor management 1
n : Contract/transfer 2
o : Customer management 14
q : Product management (per-product) 13
r : Facility (stores) 2
s : Information analysis 10
t : Other 61
No answer 7
Total 279
Table 5-3-27 Per-Business-Type SLOC Size Basic Statistics
(Enhancement, Mixed Primary Programming Languages) (unit: KSLOC)
Business Type N Min P25 Med P75 Max Mean S.D.
b : Accounting 13 0.4 11.0 31.6 54.1 305.2 69.3 102.8
c : Sales 37 0.0 4.4 20.7 62.0 700.0 70.9 126.4
f : General management 45 0.1 14.2 57.5 246.0 1,940.0 228.7 415.9
i : Technology/control 10 1.0 2.8 11.2 24.4 359.0 46.9 110.2
k : Ordering/inventory 14 1.7 20.2 31.4 60.5 310.4 54.3 77.6
o : Customer management 14 3.2 5.9 16.2 44.5 252.5 50.6 81.1
s : Information analysis 10 1.0 15.3 60.9 106.5 1,023.5 167.8 311.5
[Histogram omitted: x-axis "Actual Net SLOC Size_Enhancement [KSLOC]", final bin "Over 500"; y-axis "Number of Projects"; series: b: Accounting, c: Sales, f: General management, i: Technology/control, k: Ordering/inventory, o: Customer management, s: Information analysis]
5.4 Development Schedule
5.4.1 Per-Project-Type Development Schedule
This section presents the distribution of development schedule duration of the development projects that provide development schedule data, and presents the basic statistics of development schedule duration of these projects. This section also separately presents those of the enhancement projects that provide development schedule data.
Among all projects concerned (523 projects), projects with a development schedule of 14 months or less take up a major portion of 89% (468 projects). Projects with a development schedule of 1 year (12 months) or less take up 82% (428 projects). The median schedule duration lies between 5 and 7 months regardless of project type: 6.1 months for all projects, 7.1 months for "development" projects, and 5.0 months for "maintenance/support" projects.
Among the whole "development" type (294 projects), projects with a development schedule of 14 months or less take up 89% (262 projects) and those with a schedule of 1 year or less take up 79% (233 projects).
Among the whole "enhancement" type (209 projects), projects with a development schedule of 14 months or less take up 90% (188 projects) and those with a schedule of 1 year or less take up 85% (177 projects).
Projects with a development schedule of 2 to 8 months take up 50% (147 projects) of the whole "development" type and 65% (135 projects) of the whole "enhancement" type.

Table 5-4-1 Per-Project-Type Schedule Duration Project Count
103_Project type / 105_Project category Number of Projects
a : Development 294
    a : Commercial package development 8
    b : Entrusted development 280
    c : For in-house use 3
    d : Prototyping 0
    e : Other 3
b : Maintenance/support 120
    a : Commercial package development 4
    b : Entrusted development 113
    c : For in-house use 0
    d : Prototyping 0
    e : Other 3
c : Redevelopment 20
    a : Commercial package development 0
    b : Entrusted development 20
    c : For in-house use 0
    d : Prototyping 0
    e : Other 0
d : Enhancement 89
    a : Commercial package development 3
    b : Entrusted development 79
    c : For in-house use 2
    d : Prototyping 0
    e : Other 5
Total 523
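The "2 to 8 months" figure quoted above is a simple closed-range count; a sketch with hypothetical durations:

```python
def share_in_range(durations, low, high):
    """Count projects whose schedule duration falls in [low, high] months,
    and return the count with its whole-percent share."""
    n = sum(1 for d in durations if low <= d <= high)
    return n, round(100.0 * n / len(durations))

months = [1.5, 2.0, 4.3, 7.1, 8.0, 11.1, 30.4]  # hypothetical values
print(share_in_range(months, 2.0, 8.0))  # (4, 57)
```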
Figure 5-4-2 Schedule Duration Distribution
[Histogram omitted: x-axis "Actual Months (Major-development phases) [months]", final bin "Over 31"; y-axis "Number of Projects"]
Figure 5-4-3 Per-Project-Type Schedule Duration Distribution
Table 5-4-4 Per-Project-Type Schedule Duration Basic Statistics
(unit: months)
Project Type N Min P25 Med P75 Max Mean S.D.
All project types 523 0.2 3.9 6.1 10.2 57.4 7.6 5.5
a : Development 294 1.0 4.3 7.1 11.1 30.4 8.2 4.9
b : Maintenance/support 120 0.2 3.2 5.0 8.2 57.4 6.9 6.9
c : Redevelopment 20 2.0 4.1 5.9 7.5 20.3 7.1 4.9
d : Enhancement 89 0.9 3.8 5.7 7.8 30.4 6.8 5.2
[Histogram omitted: x-axis "Actual Months (Major-development phases) [months]", final bin "Over 20"; y-axis "Number of Projects"; series: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement]
Figure 5-4-5 Schedule Duration Distribution (Development)
Table 5-4-6 Schedule Duration Basic Statistics (Development)
(unit: months)
N Min P25 Med P75 Max Mean S.D.
294 1.0 4.3 7.1 11.1 30.4 8.2 4.9

Figure 5-4-7 Schedule Duration Distribution (Enhancement)
Table 5-4-8 Schedule Duration Basic Statistics (Enhancement)
(unit: months)
N Min P25 Med P75 Max Mean S.D.
209 0.2 3.3 5.3 8.1 57.4 6.9 6.2
[Histograms omitted (Figures 5-4-5 and 5-4-7): x-axis "Actual Months (Major-development phases) [months]", final bin "Over 30"; y-axis "Number of Projects"]
5.4.2 Per-Industry-Type Development Schedule
This section presents the per-industry-type schedule duration distribution of “development” projects and the per-industry-type schedule duration basic statistics of these projects. This section also presents those of “enhancement” projects separately. This analysis is made for the top five “major industry types” (manufacturing, information and telecommunications, wholesale/retail trade, finance/insurance, and public service), which are most frequently found in the collected project data.
Development projects
Many "development" projects have a schedule duration of 10 months or less. The percentage of 10-month-or-less projects varies with the type of industry as follows: 80% (39 of 49 projects) for "manufacturing", 88% (45 of 51) for "information and telecommunications", 73% (19 of 26) for "wholesale/retail trade", 58% (42 of 72) for "finance/insurance", and 67% (16 of 24) for "public service". More than half of the "information and telecommunications" type consists of projects with a schedule duration of 6 months or less, and the same applies to the "public service" type. Among projects of the "finance/insurance" type, those with a schedule duration of 1 year (12 months) or more take up 26% (19 projects).
The median schedule duration varies with the type of industry as follows: 8.2 months ("finance/insurance"), 7.1 months ("wholesale/retail trade"), 6.1 months ("manufacturing"), 6.0 months ("public service"), and 4.1 months ("information and telecommunications").

Table 5-4-9 Per-Industry-Type Schedule Duration Project Count (Development)
Figure 5-4-10 Per-Industry-Type Schedule Duration Distribution (Development)
201_Industry Type (Major Type) Number of Projects
E : Construction 3
F : Manufacturing 49
G : Electricity, gas, heat supply and water 8
H : Information and communications 51
I : Transport 17
J : Wholesale and retail trade 26
K : Finance and insurance 72
L : Real estate 6
M : Eating and drinking places, accommodations 6
N : Medical, health care and welfare 4
O : Education, learning support 1
P : Compound services 1
Q : Services, N.E.C. 18
R : Government, N.E.C. 24
S : Industries unable to classify 8
No answer 0
Total 294
Table 5-4-11 Per-Industry-Type Schedule Duration Basic Statistics (Development)
(unit: months)
Industry Type (Major Type) N Min P25 Med P75 Max Mean S.D.
F : Manufacturing 49 1.3 4.3 6.1 9.1 15.2 6.7 3.4
H : Information and communications 51 1.0 3.1 4.1 7.8 17.2 5.8 4.0
J : Wholesale and retail trade 26 1.8 5.0 7.1 10.0 18.3 7.7 3.6
K : Finance and insurance 72 1.3 5.5 8.2 12.4 24.3 9.4 5.1
R : Government, N.E.C. 24 1.5 3.5 6.0 11.3 30.4 8.3 6.8
(Histogram axes: Actual Months (Major-development phases) [months] vs. Number of Projects; rightmost bin: over 20. Series: F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government.)
86 IPA/SEC White Paper 2007 on Software Development Projects in Japan
Enhancement projects Many “enhancement” projects have a schedule duration of 6 months or less. The percentage of 6-month-or-less
projects varies with the type of industry as follows: 36% (5 of 14 projects) for "manufacturing", 63% (29 of 46) for "information and telecommunications", 83% (10 of 12) for "wholesale/retail trade", 54% (32 of 59) for "finance/insurance", and 58% (18 of 31) for "public service". Among "finance/insurance" projects, those with a schedule duration of 1 year (12 months) or more take up 17% (10 projects).
The median schedule duration varies with the type of industry as follows: 6.8 months ("manufacturing"), 5.1 months ("finance/insurance"), 5.3 months ("public service"), 4.5 months ("information and telecommunications"), and 3.9 months ("wholesale/retail trade").
Table 5-4-12 Per-Industry-Type Schedule Duration Project Count (Enhancement)
Figure 5-4-13 Per-Industry-Type Schedule Duration Distribution (Enhancement)
201_Industry Type (Major Type)  Number of Projects
E : Construction  2
F : Manufacturing  14
G : Electricity, gas, heat supply and water  5
H : Information and communications  46
I : Transport  14
J : Wholesale and retail trade  12
K : Finance and insurance  59
L : Real estate  2
N : Medical, health care and welfare  2
O : Education, learning support  3
P : Compound services  2
Q : Services, N.E.C.  4
R : Government, N.E.C.  31
S : Industries unable to classify  13
Total  209
Table 5-4-14 Per-Industry-Type Schedule Duration Basic Statistics (Enhancement)
(unit: months)
Industry Type (Major Type)  N  Min  P25  Med  P75  Max  Mean  S.D.
F : Manufacturing  14  1.9  5.6  6.8  7.4  24.3  8.0  5.7
H : Information and communications  46  0.8  2.3  4.5  8.4  14.9  5.8  4.1
J : Wholesale and retail trade  12  2.0  3.0  3.9  5.6  12.2  4.8  2.8
K : Finance and insurance  59  0.6  3.3  5.1  8.1  30.4  6.9  5.4
R : Government, N.E.C.  31  2.0  3.7  5.3  10.1  57.4  9.5  11.5
(Histogram axes: Actual Months (Major-development phases) [months] vs. Number of Projects; rightmost bin: over 20. Series: F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government.)
5. Statistics of Major Project Elements
5.4.3 Per-Architecture-Type Development Schedule
This section presents the per-architecture-type schedule duration distribution of “development” projects and the per-architecture-type schedule duration basic statistics of these projects. This section also presents those of “enhancement” projects separately.
Development projects “Development” projects that belong to the “2- or 3-layer client/server” type or the “intranet/Internet” type take up a
major portion of 83% (243 projects). Development projects having a schedule duration of 1 year or more take up 17% (49 projects).
Table 5-4-15 Per-Architecture-Type Schedule Duration Project Count (Development)
308_Architecture  Number of Projects
a : Stand-alone  12
b : Mainframe  10
c : 2-layer client/server  87
d : 3-layer client/server  69
e : Intranet/Internet  87
f : Other  15
No answer  14
Total  294
Figure 5-4-16 Per-Architecture-Type Schedule Duration Distribution (Development)
Table 5-4-17 Per-Architecture-Type Schedule Duration Basic Statistics (Development)
(unit: months)
Architecture  N  Min  P25  Med  P75  Max  Mean  S.D.
a : Stand-alone  12  1.5  3.3  4.6  7.8  15.5  6.2  4.5
b : Mainframe  10  3.0  8.7  10.1  12.6  14.2  9.6  3.9
c : 2-layer client/server  87  1.5  4.1  7.1  11.2  24.3  7.8  4.5
d : 3-layer client/server  69  1.3  4.1  7.6  11.1  24.3  8.1  4.8
e : Intranet/Internet  87  1.0  4.5  6.5  11.2  30.4  8.5  5.3
(Histogram axes: Actual Months (Major-development phases) [months] vs. Number of Projects; rightmost bin: over 20. Series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet.)
Enhancement projects “Enhancement” projects that belong to the “2- or 3-layer client/server” type or the “intranet/Internet” type take up a
major portion of 65% (136 projects). Enhancement projects having a schedule duration of 1 year or less take up 56% (118 projects).
Table 5-4-18 Per-Architecture-Type Schedule Duration Project Count (Enhancement)
308_Architecture  Number of Projects
a : Stand-alone  20
b : Mainframe  19
c : 2-layer client/server  60
d : 3-layer client/server  27
e : Intranet/Internet  49
f : Other  18
No answer  16
Total  209
Figure 5-4-19 Per-Architecture-Type Schedule Duration Distribution (Enhancement)
Table 5-4-20 Per-Architecture-Type Schedule Duration Basic Statistics (Enhancement)
(unit: months)
Architecture  N  Min  P25  Med  P75  Max  Mean  S.D.
a : Stand-alone  20  1.4  3.2  5.9  7.9  24.3  6.7  5.2
b : Mainframe  19  1.5  3.0  3.0  12.1  18.3  6.8  5.2
c : 2-layer client/server  60  0.8  4.3  5.7  8.1  57.4  8.1  9.0
d : 3-layer client/server  27  2.0  3.5  4.4  6.4  18.2  5.7  3.9
e : Intranet/Internet  49  1.6  3.1  4.0  7.0  19.3  5.8  4.4
(Histogram axes: Actual Months (Major-development phases) [months] vs. Number of Projects; rightmost bin: over 20. Series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet.)
5.4.4 Per-Business-Type Development Schedule
This section presents the per-business-type schedule duration distribution of “development” projects and the per-business-type schedule duration basic statistics of these projects. This section also presents those of “enhancement” projects separately.
Development projects Among the whole “development” type (294 projects), projects belonging to any of the top four major business
types (“sales”, “accounting”, “management”, and “ordering/inventory”) take up 41% (121 projects). These business types are most frequently found in the collected data. Projects of the type “others” and those with “no reply” to the business type inquiry add up to as much as 25% (74 projects).
The median schedule duration varies with the type of business as follows: 6.4 months ("management"), 7.1 months ("ordering/inventory"), 7.1 months ("sales"), and 10.1 months ("accounting").
Table 5-4-21 Per-Business-Type Schedule Duration Project Count (Development)
Figure 5-4-22 Per-Business-Type Schedule Duration Distribution (Development)
202_Business type  Number of Projects
a : Management/planning  4
b : Accounting  25
c : Sales  45
d : Production/distribution  8
e : Personnel/welfare  3
f : General management  30
g : General affairs  5
h : Research/development  5
i : Technology/control  19
j : Master management  1
k : Ordering/inventory  21
l : Distribution management  7
n : Contract/transfer  8
o : Customer management  14
p : Product planning (per-product)  1
q : Product management (per-product)  6
s : Information analysis  18
t : Other  53
No answer  21
Total  294
Table 5-4-23 Per-Business-Type Schedule Duration Basic Statistics (Development)
(unit: months)
Business Type  N  Min  P25  Med  P75  Max  Mean  S.D.
b : Accounting  25  1.5  8.0  10.1  11.2  24.3  10.1  5.1
c : Sales  45  1.0  5.0  7.1  11.1  24.3  8.0  4.9
f : General management  30  2.3  4.2  6.4  12.1  19.2  8.2  4.6
i : Technology/control  19  1.5  2.9  3.6  5.3  17.2  4.9  3.9
k : Ordering/inventory  21  2.0  4.5  7.1  10.4  18.3  7.6  4.3
o : Customer management  14  3.2  4.5  6.2  10.6  16.6  8.0  4.7
s : Information analysis  18  1.8  4.1  5.4  9.5  18.3  7.0  4.4
(Histogram axes: Actual Months (Major-development phases) [months] vs. Number of Projects; rightmost bin: over 20. Series: b: Accounting, c: Sales, f: General management, i: Technology/control, k: Ordering/inventory, o: Customer management, s: Information analysis.)
Enhancement projects Among the whole “enhancement” type (209 projects), projects belonging to any of the top four major business
types (“sales”, “accounting”, “management”, and “ordering/inventory”) take up 35% (73 projects). These business types are most frequently found in the collected data. Projects of the type “others” and those with “no reply” to the business type inquiry add up to as much as 29% (61 projects).
The median schedule duration varies with the type of business as follows: 3.9 months ("ordering/inventory"), 4.5 months ("management"), 5.1 months ("accounting"), and 5.6 months ("sales"). "Enhancement" projects of the "accounting" type have a smaller median value (5.1 months) than "development" projects of the same type. However, this does not necessarily indicate an inherent difference between the enhancement and development types, because too few enhancement projects of the accounting type were available for this analysis.
Table 5-4-24 Per-Business-Type Schedule Duration Project Count (Enhancement)
Figure 5-4-25 Per-Business-Type Schedule Duration Distribution (Enhancement)
202_Business type  Number of Projects
a : Management/planning  1
b : Accounting  8
c : Sales  30
d : Production/distribution  4
e : Personnel/welfare  7
f : General management  21
g : General affairs  7
h : Research/development  5
i : Technology/control  19
j : Master management  3
k : Ordering/inventory  14
m : Subcontractor management  1
n : Contract/transfer  6
o : Customer management  11
q : Product management (per-product)  5
s : Information analysis  6
t : Other  49
No answer  12
Total  209
Table 5-4-26 Per-Business-Type Schedule Duration Basic Statistics (Enhancement)
(unit: months)
Business Type  N  Min  P25  Med  P75  Max  Mean  S.D.
b : Accounting  8  0.8  3.5  5.1  5.7  16.2  5.6  4.7
c : Sales  30  0.6  3.5  5.6  7.8  18.3  6.3  4.2
f : General management  21  1.7  3.2  4.5  6.1  13.8  5.6  3.5
i : Technology/control  19  1.6  2.1  4.3  6.9  14.9  5.3  4.0
k : Ordering/inventory  14  2.1  3.1  3.9  6.0  12.2  4.8  2.7
o : Customer management  11  2.7  3.2  3.8  4.6  9.8  4.3  2.0
s : Information analysis  6  1.6  4.6  7.3  9.7  12.2  7.1  3.9
(Histogram axes: Actual Months (Major-development phases) [months] vs. Number of Projects; rightmost bin: over 20. Series: b: Accounting, c: Sales, f: General management, i: Technology/control, k: Ordering/inventory, o: Customer management, s: Information analysis.)
5.5 Effort
5.5.1 Per-Project-Type Effort
This section presents the per-project-type effort distribution of the projects that measured their development effort. This section also presents the effort distribution of the projects that measured their size in FPs and that of the projects that measured their size in SLOCs.
Among all the projects concerned (1,204 projects), projects that required an effort of 5,000 person-hours or less take up 40% (487 projects). Projects that required an effort of 2,000 person-hours or less take up 20% (243 projects).
The median effort varies with the type of project as follows: 9,000 person-hours ("development"), 4,202 person-hours ("maintenance/support"), 5,524 person-hours ("enhancement"), and 13,992 person-hours ("redevelopment").
Table 5-5-1 Per-Project-Type Effort Project Count
103_Project type / 105_Project category  Number of Projects
a : Development  691
  a : Commercial package development  29
  b : Entrusted development  649
  c : For in-house use  6
  d : Prototyping  4
  e : Other  3
b : Maintenance/support  300
  a : Commercial package development  12
  b : Entrusted development  283
  c : For in-house use  2
  d : Prototyping  0
  e : Other  3
c : Redevelopment  71
  a : Commercial package development  8
  b : Entrusted development  63
  c : For in-house use  0
  d : Prototyping  0
  e : Other  0
d : Enhancement  142
  a : Commercial package development  12
  b : Entrusted development  123
  c : For in-house use  2
  d : Prototyping  0
  e : Other  5
Total  1,204
Figure 5-5-2 Effort Distribution
The following bar graph magnifies the lowest part of the effort values shown in the above bar graph.
Figure 5-5-3 Per-Project-Type Effort Distribution
Table 5-5-4 Per-Project-Type Effort Basic Statistics
(unit: person-hours)
Project Type  N  Min  P25  Med  P75  Max  Mean  S.D.
All project types  1,204  62  2,621  7,436  21,482  956,505  27,176  65,542
a : Development  691  62  3,074  9,000  24,101  956,505  31,301  77,513
b : Maintenance/support  300  165  1,592  4,202  16,522  353,685  23,776  52,426
c : Redevelopment  71  481  6,856  13,992  31,061  237,610  27,496  36,143
d : Enhancement  142  101  2,767  5,524  11,910  162,880  14,128  22,211
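The effort histograms in this section use fixed-width bins plus one open-ended top bin (for example, 1,000-person-hour intervals up to 20,000 person-hours, then a single "over" bin). A minimal sketch of that binning follows; the interval width and cutoff are taken from the figure captions, and the exact bin-edge convention is an assumption.

```python
def bin_efforts(efforts, width=1000, cutoff=20000):
    """Count projects per fixed-width effort bin, with one open-ended
    'over cutoff' bin, in the style of the magnified effort histograms
    (20,000 person-hours or less, 1,000-person-hour intervals)."""
    bins = [0] * (cutoff // width)  # bins[i] counts efforts in (i*width, (i+1)*width]
    over = 0
    for e in efforts:
        if e > cutoff:
            over += 1
        else:
            # e.g. efforts 1..1000 fall into bins[0], 1001..2000 into bins[1]
            idx = max((int(e) + width - 1) // width - 1, 0)
            bins[idx] += 1
    return bins, over

# Hypothetical effort values (person-hours):
bins, over = bin_efforts([500, 1000, 1500, 7436, 25000])
print(bins[0], bins[1], over)  # 2 1 1
```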
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects. Series: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement; rightmost bin: over 50,000. Magnified panel: 20,000 person-hours or less, 1,000-person-hour intervals, N = 875. Full-range panel: 5,000-person-hour intervals, rightmost bin over 100,000.)
The rest of this section presents analysis of a project data set that provides FP values and of another project data set that provides SLOC values. For each data set, the effort distribution and the effort basic statistics of "development" projects are presented, followed by those of "enhancement" projects.
Chapter 6 analyzes the relationships among size, effort, and productivity for FP-size projects and SLOC-size projects based on these data sets. The effort distributions presented in the rest of this section serve as the fundamental data for Chapter 6.
Development projects (FP size) The frequency distribution of development projects that measured their size in FPs shows that the project count
reaches its maximum in the effort range between 1,000 and 2,000 person-hours, and the project count remains high up to 3,000 person-hours. Among development projects that used mixed FP measurement methods (340 projects in total), projects that used 3,000 person-hours or less take up 36% (122 projects). Among development projects that used FP measurement methods of the IFPUG group (218 projects in total), projects that used 3,000 person-hours or less take up 28% (60 projects). The 218 IFPUG-group projects have a larger median value (10,123 person-hours) than the 340 mixed-method projects (5,432 person-hours).
Figure 5-5-5 Effort Distribution of FP-Size Projects (Development, Mixed FP Measurement Methods)
The following bar graph magnifies the lowest part of the actual Major-development phase effort values shown in the above bar graph.
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects. Magnified panel: 20,000 person-hours or less, 1,000-person-hour intervals, N = 270. Full-range panel: 5,000-person-hour intervals, rightmost bin over 100,000.)
Figure 5-5-6 Effort Distribution of FP-Size Projects (Development, IFPUG Group)
The following bar graph magnifies the lowest part of the actual Major-development phase effort values shown in the above bar graph.
Table 5-5-7 Per-FP-Measurement-Method Effort Basic Statistics (Development)
(unit: person-hours)
FP Size Measurement Method  N  Min  P25  Med  P75  Max  Mean  S.D.
Mixed FP size measurement methods  340  62  1,917  5,432  16,509  285,417  19,688  37,622
IFPUG group  218  62  2,699  10,123  29,255  285,417  27,849  44,766
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects. Magnified panel: 20,000 person-hours or less, 1,000-person-hour intervals, N = 150. Full-range panel: 5,000-person-hour intervals, rightmost bin over 100,000.)
Enhancement projects (FP size) The frequency distribution of enhancement projects that measured their size in FPs shows that the project count
reaches its maximum in the effort range under 1,000 person-hours, regardless of FP measurement method. Among enhancement projects that used "mixed FP measurement methods" (93 projects in total), projects that used 1,000 person-hours or less take up 27% (25 projects). Among enhancement projects that used FP measurement methods of the "IFPUG group" (65 projects in total), projects that used 1,000 person-hours or less take up 28% (18 projects). These 65 projects have a larger median value (2,685 person-hours) than the mixed-method projects (2,250 person-hours).
Figure 5-5-8 Effort Distribution of FP-Size Projects (Enhancement, Mixed FP Measurement Methods)
Figure 5-5-9 Effort Distribution of FP-Size Projects (Enhancement, IFPUG Group)
Table 5-5-10 Per-FP-Measurement-Method Effort Basic Statistics (Enhancement)
(unit: person-hours)
FP Size Measurement Method  N  Min  P25  Med  P75  Max  Mean  S.D.
Mixed FP size measurement methods  93  165  900  2,250  6,240  103,360  8,775  17,874
IFPUG group  65  180  900  2,685  8,809  103,360  10,967  20,673
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects; rightmost bin: over 20,000.)
Development projects (SLOC size) The magnified distribution of development projects that measured their size in SLOCs shows that the project
count remains high in the effort range between 3,000 and 6,000 person-hours. In the higher effort range up to 25,000 person-hours, the project count fluctuates in the medium-to-low range.
Figure 5-5-11 Effort Distribution of SLOC-Size Projects (Development, Mixed Primary Programming Languages)
The following bar graph magnifies the lowest part of the actual Major-development phase effort values shown in the above bar graph.
Table 5-5-12 Per-Primary-Programming-Language SLOC-Size Projects Effort Basic Statistics (Development)
(unit: person-hours)
Primary Programming Language  N  Min  P25  Med  P75  Max  Mean  S.D.
Mixed primary programming languages  354  90  5,286  15,478  42,054  956,505  47,980  102,408
b : COBOL  76  2,090  9,891  25,216  91,114  956,505  84,932  161,592
g : C  48  243  5,460  16,153  32,073  589,050  56,635  116,332
h : VB  47  403  4,399  14,484  33,152  283,290  36,479  60,134
q : Java  84  250  3,963  10,384  37,593  609,620  40,773  87,792
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects. Magnified panel: 20,000 person-hours or less, 1,000-person-hour intervals, N = 197. Full-range panel: 5,000-person-hour intervals, rightmost bin over 100,000.)
Enhancement projects (SLOC size) The magnified distribution of enhancement projects that measured their size in SLOCs shows that the project
count remains high in the effort range between 1,000 and 3,000 person-hours. In the higher effort range up to 10,000 person-hours, the project count fluctuates in the medium-to-low range.
Figure 5-5-13 Effort Distribution of SLOC-Size Projects (Enhancement, Mixed Primary Programming Languages)
The following bar graph magnifies the lowest part of the actual Major-development phase effort values shown in the above bar graph.
Table 5-5-14 Per-Primary-Programming-Language SLOC-Size Projects Effort Basic Statistics (Enhancement)
(unit: person-hours)
Primary Programming Language  N  Min  P25  Med  P75  Max  Mean  S.D.
Mixed primary programming languages  279  101  2,592  6,408  19,029  353,685  24,774  49,695
b : COBOL  61  1,656  6,679  11,638  46,496  353,685  45,572  77,041
g : C  43  1,112  2,659  4,420  8,939  57,221  10,265  13,846
h : VB  32  334  1,733  3,026  8,940  59,456  11,255  17,413
q : Java  58  545  2,566  6,640  16,048  199,325  17,930  34,960
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects. Magnified panel: 20,000 person-hours or less, 1,000-person-hour intervals, N = 215. Full-range panel: 5,000-person-hour intervals, rightmost bin over 100,000.)
5.5.2 Per-Industry-Type Effort
This section presents the per-industry-type effort distribution of the development projects and that of enhancement projects separately. This analysis is made for the top five “major industry types” (manufacturing, information and telecommunications, wholesale/retail trade, finance/insurance, and public service), which are most frequently found in the collected project data.
Development projects “Development” projects that used 5,000 person-hours or less take up a large portion. The percentage of such
projects varies with the type of industry as follows: 50% (57 of 115 projects) for "manufacturing", 55% (42 of 77) for "information and telecommunications", 17% (12 of 71) for "wholesale/retail trade", 41% (21 of 51) for "public service", and 23% (43 of 188) for "finance/insurance".
Projects of "finance/insurance" have the largest median value of 16,496 person-hours, and those of "wholesale/retail trade" have the next largest, 11,880 person-hours. The remaining industry types have median values below 10,000 person-hours: 9,282 person-hours ("public service"), 5,120 person-hours ("manufacturing"), and 4,344 person-hours ("information and telecommunications"). The last value is about one quarter of the "finance/insurance" median.
Table 5-5-15 Per-Industry-Type Effort Project Count (Development)
Figure 5-5-16 Per-Industry-Type Effort Distribution (Development)
201_Industry Type (Major Type)  Number of Projects
A : Agriculture  1
C : Fisheries  1
E : Construction  10
F : Manufacturing  115
G : Electricity, gas, heat supply and water  14
H : Information and communications  77
I : Transport  29
J : Wholesale and retail trade  71
K : Finance and insurance  188
L : Real estate  10
M : Eating and drinking places, accommodations  6
N : Medical, health care and welfare  12
O : Education, learning support  3
P : Compound services  2
Q : Services, N.E.C.  43
R : Government, N.E.C.  51
S : Industries unable to classify  17
No answer  41
Total  691
Table 5-5-17 Per-Industry-Type Effort Basic Statistics (Development)
(unit: person-hours)
Industry Type (Major Type)  N  Min  P25  Med  P75  Max  Mean  S.D.
F : Manufacturing  115  249  1,902  5,120  14,603  170,363  14,152  24,626
H : Information and communications  77  243  1,800  4,344  10,500  76,177  9,207  13,820
J : Wholesale and retail trade  71  1,400  7,714  11,880  33,058  334,390  35,132  61,582
K : Finance and insurance  188  220  5,247  16,496  57,791  956,505  57,191  114,189
R : Government, N.E.C.  51  90  1,945  9,282  23,729  196,482  20,479  35,793
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects; rightmost bin: over 50,000. Series: F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government.)
Enhancement projects “Enhancement” projects that used 5,000 person-hours or less take up a large portion. The percentage of such
projects varies with the type of industry as follows: 38% (11 of 29 projects) for "manufacturing", 40% (25 of 63) for "information and telecommunications", 38% (8 of 21) for "wholesale/retail trade", 44% (19 of 43) for "public service", and 51% (75 of 148) for "finance/insurance".
Enhancement projects of the "manufacturing" type and those of the "information and telecommunications" type have median values of 8,245 person-hours and 6,390 person-hours, respectively. Both values are larger than the effort medians of "development" projects of the same industry types.
Table 5-5-18 Per-Industry-Type Effort Project Count (Enhancement)
Figure 5-5-19 Per-Industry-Type Effort Distribution (Enhancement)
201_Industry Type (Major Type)  Number of Projects
A : Agriculture  2
E : Construction  4
F : Manufacturing  29
G : Electricity, gas, heat supply and water  10
H : Information and communications  63
I : Transport  24
J : Wholesale and retail trade  21
K : Finance and insurance  148
L : Real estate  5
N : Medical, health care and welfare  10
O : Education, learning support  4
P : Compound services  4
Q : Services, N.E.C.  9
R : Government, N.E.C.  43
S : Industries unable to classify  22
No answer  44
Total  442
Table 5-5-20 Per-Industry-Type Effort Basic Statistics (Enhancement)
(unit: person-hours)
Industry Type (Major Type)  N  Min  P25  Med  P75  Max  Mean  S.D.
F : Manufacturing  29  445  3,422  8,245  19,440  119,350  15,953  23,447
H : Information and communications  63  101  2,734  6,390  16,951  353,685  31,638  69,400
J : Wholesale and retail trade  21  1,260  3,799  5,655  10,540  105,000  16,389  27,192
K : Finance and insurance  148  165  2,159  4,798  20,025  242,420  22,520  42,105
R : Government, N.E.C.  43  563  2,529  6,408  22,291  172,294  18,824  32,215
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects; rightmost bin: over 50,000. Series: F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government.)
5.5.3 Per-Architecture-Type Effort
This section presents the per-architecture-type effort distribution of “development” projects and the per-architecture-type effort basic statistics of these projects. This section also presents those of “enhancement” projects separately.
Development projects Among all "development" projects, those that used 10,000 person-hours or less take up 49% (340 projects). The project count decreases as the effort value increases. Among the "intranet/Internet" type (244 projects), projects of 10,000 person-hours or less take up 41% (100 projects), while those of 30,000 person-hours or less take up 72% (176 projects).
Table 5-5-21 Per-Architecture-Type Effort Project Count (Development)
308_Architecture  Number of Projects
a : Stand-alone  42
b : Mainframe  30
c : 2-layer client/server  176
d : 3-layer client/server  136
e : Intranet/Internet  244
f : Other  21
No answer  42
Total  691
Figure 5-5-22 Per-Architecture-Type Effort Distribution (Development)
Table 5-5-23 Per-Architecture-Type Effort Basic Statistics (Development)
(unit: person-hours)
Architecture  N  Min  P25  Med  P75  Max  Mean  S.D.
a : Stand-alone  42  62  761  1,417  2,969  144,838  7,092  23,856
b : Mainframe  30  220  3,813  16,530  84,448  956,505  112,367  231,948
c : 2-layer client/server  176  349  2,310  6,389  16,094  589,050  20,421  52,490
d : 3-layer client/server  136  249  2,767  6,802  19,164  609,620  31,782  86,879
e : Intranet/Internet  244  90  4,623  14,501  36,199  341,250  34,612  52,835
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects; rightmost bin: over 50,000. Series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet.)
Enhancement projects Among all "enhancement" projects, those that used 5,000 person-hours or less take up 47% (207 projects). The project count decreases as the effort value increases. Enhancement projects used less effort than "development" projects regardless of the type of architecture.
Table 5-5-24 Per-Architecture-Type Effort Project Count (Enhancement)
308_Architecture  Number of Projects
a : Stand-alone  53
b : Mainframe  55
c : 2-layer client/server  108
d : 3-layer client/server  66
e : Intranet/Internet  92
f : Other  27
No answer  41
Total  442
Figure 5-5-25 Per-Architecture-Type Effort Distribution (Enhancement)
Table 5-5-26 Per-Architecture-Type Effort Basic Statistics (Enhancement)
(unit: person-hours)
Architecture  N  Min  P25  Med  P75  Max  Mean  S.D.
a : Stand-alone  53  180  716  1,443  3,400  20,736  2,997  3,905
b : Mainframe  55  165  1,668  3,780  7,272  232,271  18,466  44,452
c : 2-layer client/server  108  324  1,978  5,707  14,894  353,685  20,212  47,973
d : 3-layer client/server  66  1,112  2,939  7,659  28,980  309,068  35,946  68,721
e : Intranet/Internet  92  101  1,995  4,109  8,547  204,575  13,560  27,591
(Histogram axes: Actual Effort (Major-development phases) [person-hours] vs. Number of Projects; rightmost bin: over 50,000. Series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet.)
5.5.4 Per-Business-Type Effort
This section presents the per-business-type effort distribution of “development” projects and the per-business-type effort basic statistics of these projects. This section also presents those of “enhancement” projects separately. If projects that provide their business characteristics data for an analysis case are insufficient in number, this section presents only the project count for the case while omitting distribution graphs and basic statistics tables.
Development projects “Development” projects of the “accounting” type have the largest median value of 16,740 person-hours, and those
of the “management” type have the next largest of 16,375 person-hours. Development projects of “customer management” also have a relatively large median value of 11,520 person-hours. Development projects of the “technology/control” type have the smallest median value of 3,984 person-hours, less than one quarter of that of “accounting” projects.
Enhancement projects “Enhancement” projects of the “accounting” type have the largest median value of 13,650 person-hours, and those
of the “sales” type have the next largest of 7,905 person-hours. Enhancement projects of the “management” type also have a relatively large median value of 7,683 person-hours. Enhancement projects of customer “management” have the smallest median value of 2,264 person-hours. Table 5-5-27 Per-Business-Type
Effort Project Count (Development)
Table 5-5-28 Per-Business-Type Effort Project Count (Enhancement)
202_Business type (Development)  Number of Projects
a : Management/planning  8
b : Accounting  49
c : Sales  76
d : Production/distribution  25
e : Personnel/welfare  12
f : General management  76
g : General affairs  6
h : Research/development  6
i : Technology/control  24
j : Master management  4
k : Ordering/inventory  43
l : Distribution management  9
m : Subcontractor management  1
n : Contract/transfer  22
o : Customer management  33
p : Product planning (per-product)  10
q : Product management (per-product)  14
r : Facility (stores)  5
s : Information analysis  38
t : Other  133
No answer  97
Total  691

202_Business type (Enhancement)  Number of Projects
a : Management/planning  2
b : Accounting  17
c : Sales  43
d : Production/distribution  11
e : Personnel/welfare  15
f : General management  55
g : General affairs  13
h : Research/development  7
i : Technology/control  20
j : Master management  8
k : Ordering/inventory  16
l : Distribution management  4
m : Subcontractor management  1
n : Contract/transfer  8
o : Customer management  20
q : Product management (per-product)  21
r : Facility (stores)  2
s : Information analysis  11
t : Other  89
No answer  79
Total  442
Table 5-5-29 Per-Business-Type Effort Basic Statistics (Development)
(unit: person-hours)
Business Type  N  Min  P25  Med  P75  Max  Mean  S.D.
b : Accounting  49  373  7,260  16,740  44,778  956,505  53,054  139,900
c : Sales  76  543  3,880  8,256  24,097  609,620  42,949  104,358
f : General management  76  127  5,709  16,375  38,186  260,575  35,165  49,844
i : Technology/control  24  243  2,887  3,984  5,951  162,850  13,239  32,898
k : Ordering/inventory  43  90  2,670  7,031  23,025  489,090  36,924  89,071
o : Customer management  33  290  4,483  11,520  21,038  149,325  21,559  32,655
s : Information analysis  38  447  3,641  6,038  15,955  106,689  19,194  29,463

Table 5-5-30 Per-Business-Type Effort Basic Statistics (Enhancement)
(unit: person-hours)
Business Type  N  Min  P25  Med  P75  Max  Mean  S.D.
b : Accounting  17  334  3,057  13,650  42,030  309,068  55,397  92,266
c : Sales  43  207  3,793  7,905  22,175  140,000  19,524  27,910
f : General management  55  651  4,033  7,683  36,109  242,420  29,115  46,972
i : Technology/control  20  714  3,049  6,498  10,697  20,736  8,178  6,434
k : Ordering/inventory  16  839  1,942  4,000  5,756  29,835  5,709  6,851
o : Customer management  20  1,130  1,699  2,264  3,454  6,872  2,986  1,888
s : Information analysis  11  101  2,071  4,218  36,948  353,685  48,200  104,867
5.6 Number of Staff per Month 5.6.1 Per-Project-Type Number of Staff Per Month
The head-count per month is calculated from effort and the number of months. See the derived indicator “Number of staff per month” in Appendix A.4 for more details.
This section presents the distribution and the basic statistics of the head-count per month with respect to the development projects that provide valid data about the head-count per month. This section also presents separately the same analysis of the enhancement projects that provide valid per-month head-count data.
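As a sketch of this derived indicator, the average head-count per month is total effort divided by schedule length. The conversion of person-hours to person-months below assumes a 160-hour person-month purely for illustration; the official definition is the one given in Appendix A.4.

```python
def staff_per_month(effort_person_hours, duration_months, hours_per_person_month=160):
    """Average number of staff per month: total effort divided by duration.

    The 160-hour person-month is an assumed conversion factor for this
    sketch; the White Paper's official formula is defined in Appendix A.4.
    """
    if duration_months <= 0:
        raise ValueError("duration_months must be positive")
    effort_person_months = effort_person_hours / hours_per_person_month
    return effort_person_months / duration_months

# A 16,000 person-hour project that ran for 10 months averaged
# 16,000 / 160 / 10 = 10 staff per month.
print(staff_per_month(16_000, 10))  # → 10.0
```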
Among the whole projects concerned (523 projects), projects that used 2 to 4 staff per month take up the largest portion of 20% (104 projects). Projects that used 10 staff per month or less take up a major portion of 63% (331 projects).
The “development” projects have the median value between 7 and 8 staff per month, and so do the “enhancement” projects. The “maintenance/support” projects have the median value of 6 staff per month.
Among the whole “development” type (294 projects), projects that used 2 to 4 staff per month take up the largest portion of 20% (58 projects). Development projects that used 10 staff per month or less take up a major portion of 61% (180 projects).
Among the whole “enhancement” type (209 projects), projects that used 2 to 4 staff per month take up the largest portion of 20% (42 projects). Enhancement projects that used 10 staff per month or less take up a major portion of 67% (139 projects).

Table 5-6-1 Per-Project-Type Head-Count Per Month Project Count
103_Project type  Number of Projects  (105_Project category breakdown)
a : Development 294
  a : Commercial package development 8
  b : Entrusted development 280
  c : For in-house use 3
  d : Prototyping 0
  e : Other 3
b : Maintenance/support 120
  a : Commercial package development 4
  b : Entrusted development 113
  c : For in-house use 0
  d : Prototyping 0
  e : Other 3
c : Redevelopment 20
  a : Commercial package development 0
  b : Entrusted development 20
  c : For in-house use 0
  d : Prototyping 0
  e : Other 0
d : Enhancement 89
  a : Commercial package development 3
  b : Entrusted development 79
  c : For in-house use 2
  d : Prototyping 0
  e : Other 5
Total 523
Figure 5-6-2 Distribution of Head-Count Per Month
[Histogram omitted: X-axis: Head-Count Per Month [persons] (up to “Over 60”); Y-axis: Number of Projects]
Figure 5-6-3 Distribution of Per-Project-Type Head-Count Per Month
Table 5-6-4 Basic Statistics of Per-Project-Type Head-Count Per Month
(unit: persons)
Project Type  N  Min  P25  Med  P75  Max  Mean  S.D.
All project types 523 0.3 3.6 7.3 15.4 385.9 15.1 26.4
a : Development 294 0.4 3.8 7.3 16.8 129.5 15.4 21.4
b : Maintenance/support 120 0.3 3.0 6.0 13.7 385.9 16.0 40.2
c : Redevelopment 20 1.0 4.7 7.9 29.0 152.6 21.4 33.6
d : Enhancement 89 0.3 3.9 7.8 13.2 68.3 11.8 13.4
Figure 5-6-5 Distribution of Head-Count Per Month (Development)
(unit: persons)
N  Min  P25  Med  P75  Max  Mean  S.D.
294 0.4 3.8 7.3 16.8 129.5 15.4 21.4

Figure 5-6-6 Distribution of Head-Count Per Month (Enhancement)

(unit: persons)
N  Min  P25  Med  P75  Max  Mean  S.D.
209 0.3 3.4 7.2 13.5 385.9 14.2 31.7
[Histograms omitted for Figures 5-6-3, 5-6-5, and 5-6-6: X-axis: Head-Count Per Month [persons]; Y-axis: Number of Projects. Legend for Figure 5-6-3: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement]
5.6.2 Per-Industry-Type Number of Staff Per Month
This section presents the distribution and the basic statistics of head-count per month on a per industry type basis with respect to development projects that provide valid data about head-count per month. This section also presents separately the same analysis of enhancement projects that provide valid per-month head-count data. These analyses are made for the top five “major industry types” (manufacturing, information and telecommunications, wholesale/retail trade, finance/insurance, and public service), which are most frequently found in the collected project data.
Development projects
The percentage of “development” projects that used 10 staff per month or less varies with the type of industry as follows: 80% (39 projects) of the 49 “manufacturing” projects, 76% (39 projects) of the 51 “information and telecommunications” projects, 54% (14 projects) of the 26 “wholesale/retail trade” projects, 50% (36 projects) of the 72 “finance/insurance” projects, and 50% (12 projects) of the 24 “public service” projects. Development projects of the “finance/insurance” type that used 60 staff per month or more take up as much as 14% (10 projects), which indicates that the finance/insurance type includes more large-size projects than other industry types.
Development projects of the “manufacturing” type have the median value of 4.3 staff per month, and those of the “information and telecommunications” type have the median value of 5.5 staff per month. The rest of the industry types have larger median values: 9.2 staff per month (“wholesale/retail trade”), 9.8 staff per month (“public service”), and 10.0 staff per month (“finance/insurance”). Table 5-5-17 Per-Industry-Type Effort Basic Statistics (Development) shows that these three industry types also have large effort median values. That is, development projects of these three types include large-size projects.

Table 5-6-7 Per-Industry-Type Head-Count Per Month Project Count (Development)
Figure 5-6-8 Distribution of Per-Industry-Type Head-Count Per Month (Development)
201_Industry Type (Major Type)  Number of Projects
E : Construction 3
F : Manufacturing 49
G : Electricity, gas, heat supply and water 8
H : Information and communications 51
I : Transport 17
J : Wholesale and retail trade 26
K : Finance and insurance 72
L : Real estate 6
M : Eating and drinking places, accommodations 6
N : Medical, health care and welfare 4
O : Education, learning support 1
P : Compound services 1
Q : Services, N.E.C. 18
R : Government, N.E.C. 24
S : Industries unable to classify 8
Total 294
Table 5-6-9 Basic Statistics of Per-Industry-Type Head-Count Per Month (Development)
(unit: persons)
Industry Type (Major Type)  N  Min  P25  Med  P75  Max  Mean  S.D.
F : Manufacturing 49 0.5 2.5 4.3 8.4 80.9 8.7 12.9
H : Information and communications 51 0.8 3.5 5.5 9.4 32.8 8.2 7.1
J : Wholesale and retail trade 26 2.0 4.0 9.2 17.9 62.3 15.8 17.2
K : Finance and insurance 72 1.7 5.0 10.0 33.9 114.6 24.0 28.1
R : Government, N.E.C. 24 0.6 3.2 9.8 15.4 111.6 14.7 22.1
[Histogram omitted: X-axis: Head-Count Per Month [persons] (up to “Over 20”); Y-axis: Number of Projects; series: F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government]
Enhancement projects
The percentage of “enhancement” projects that used 10 staff per month or less varies with the type of industry as follows: 71% (10 projects) of the 14 “manufacturing” projects, 52% (24 projects) of the 46 “information and telecommunications” projects, 75% (9 projects) of the 12 “wholesale/retail trade” projects, 63% (37 projects) of the 59 “finance/insurance” projects, and 74% (23 projects) of the 31 “public service” projects.
As opposed to the “development” projects, enhancement projects of the “manufacturing” type and those of the “information and telecommunications” type have larger median values (8.4 staff per month and 9.7 staff per month, respectively). Enhancement projects of the “finance/insurance” type have a smaller median value of 6.8 staff per month, with small secondary peaks in the range between 14 and 18 staff per month.

Table 5-6-10 Per-Industry-Type Head-Count Per Month Project Count (Enhancement)
201_Industry Type (Major Type)  Number of Projects
E : Construction 2
F : Manufacturing 14
G : Electricity, gas, heat supply and water 5
H : Information and communications 46
I : Transport 14
J : Wholesale and retail trade 12
K : Finance and insurance 59
L : Real estate 2
N : Medical, health care and welfare 2
O : Education, learning support 3
P : Compound services 2
Q : Services, N.E.C. 4
R : Government, N.E.C. 31
S : Industries unable to classify 13
Total 209
Figure 5-6-11 Distribution of Per-Industry-Type Head-Count Per Month (Enhancement)
Table 5-6-12 Basic Statistics of Per-Industry-Type Head-Count Per Month (Enhancement)
(unit: persons)
Industry Type (Major Type)  N  Min  P25  Med  P75  Max  Mean  S.D.
F : Manufacturing 14 1.2 6.5 8.4 12.0 27.9 9.6 6.9
H : Information and communications 46 0.4 4.5 9.7 15.4 170.5 18.9 30.1
J : Wholesale and retail trade 12 1.6 3.5 6.9 8.7 44.5 9.6 11.6
K : Finance and insurance 59 0.3 3.7 6.8 14.5 68.1 11.5 13.1
R : Government, N.E.C. 31 0.3 3.1 7.6 10.4 68.3 11.9 15.9
[Histogram omitted: X-axis: Head-Count Per Month [persons] (up to “Over 20”); Y-axis: Number of Projects; series: F: Manufacturing, H: Information and communications, J: Wholesale and retail trade, K: Finance and insurance, R: Government]
5.6.3 Per-Architecture-Type Number of Staff Per Month
This section presents the distribution and the basic statistics of the head-count per month on a per-architecture-type basis with respect to “development” projects. This section also presents those of “enhancement” projects separately.
Development projects
The distribution of staff per month has a broad peak as follows: 28% (24 projects) of the 87 “2-layer client/server” projects used 2 to 4 staff per month, 22% (15 projects) of the 69 “3-layer client/server” projects used 2 to 4 staff per month, and 16% (14 projects) of the 87 “intranet/Internet” projects used 6 to 8 staff per month.
The median value of the development projects varies with the type of architecture as follows: 5.2 staff per month (“2-layer client/server”), 7.1 staff per month (“3-layer client/server”), 9.4 staff per month (“intranet/Internet”), and 10.6 staff per month (“mainframe”).

Table 5-6-13 Per-Architecture-Type Head-Count Per Month Project Count (Development)
308_Architecture  Number of Projects
a : Stand-alone 12
b : Mainframe 10
c : 2-layer client/server 87
d : 3-layer client/server 69
e : Intranet/Internet 87
f : Other 15
No answer 14
Total 294
Figure 5-6-14 Distribution of Per-Architecture-Type Head-Count Per Month (Development)
Table 5-6-15 Basic Statistics of Per-Architecture-Type Head-Count Per Month (Development)
(unit: persons)
Architecture  N  Min  P25  Med  P75  Max  Mean  S.D.
a : Stand-alone 12 1.0 2.2 2.8 5.5 58.4 10.3 17.5
b : Mainframe 10 3.3 7.5 10.6 20.1 33.6 13.9 9.8
c : 2-layer client/server 87 0.8 2.8 5.2 10.9 63.5 8.8 10.1
d : 3-layer client/server 69 0.5 4.0 7.1 16.2 129.5 17.8 26.4
e : Intranet/Internet 87 0.8 4.5 9.4 25.1 123.6 21.5 26.4
[Histogram omitted: X-axis: Head-Count Per Month [persons] (up to “Over 20”); Y-axis: Number of Projects; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
Enhancement projects
The distribution has a broad peak in the range between 2 and 12 staff per month as follows: 28% (17 projects) of the 60 “2-layer client/server” projects used 2 to 4 staff per month, 44% (12 projects) of the 27 “3-layer client/server” projects used 2 to 6 staff per month, and 78% (38 projects) of the 49 “intranet/Internet” projects used 1 to 12 staff per month.
The enhancement projects come in two groups with respect to the median value. One group consists of only the “stand-alone” type, which has the median value of 4.9 staff per month. The other consists of the rest of the architecture types, which have the following median values: 5.1 staff per month (“2-layer client/server”), 7.6 staff per month (“3-layer client/server”), 7.7 staff per month (“intranet/Internet”), and 7.2 staff per month (“mainframe”).

Table 5-6-16 Per-Architecture-Type Head-Count Per Month Project Count (Enhancement)
308_Architecture  Number of Projects
a : Stand-alone 20
b : Mainframe 19
c : 2-layer client/server 60
d : 3-layer client/server 27
e : Intranet/Internet 49
f : Other 18
No answer 16
Total 209
Figure 5-6-17 Distribution of Per-Architecture-Type Head-Count Per Month (Enhancement)
Table 5-6-18 Basic Statistics of Per-Architecture-Type Head-Count Per Month (Enhancement)
(unit: persons)
Architecture  N  Min  P25  Med  P75  Max  Mean  S.D.
a : Stand-alone 20 0.8 3.8 4.9 7.2 22.9 6.3 5.2
b : Mainframe 19 0.3 3.7 7.2 11.7 96.9 13.3 21.9
c : 2-layer client/server 60 0.3 2.8 5.1 11.9 170.5 11.6 23.7
d : 3-layer client/server 27 2.3 4.4 7.6 12.7 39.6 10.3 9.0
e : Intranet/Internet 49 0.3 3.5 7.7 11.4 68.3 11.4 14.4
[Histogram omitted: X-axis: Head-Count Per Month [persons] (up to “Over 20”); Y-axis: Number of Projects; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
5.6.4 Per-Business-Type Number of Staff Per Month
This section presents the distribution and the basic statistics of the head-count per month on a per business type basis with respect to “development” projects. This section also presents those of “enhancement” projects separately. If projects that provide their business characteristics data for an analysis case are insufficient in number, this section presents only the project count for the case while omitting distribution graphs and basic statistics tables.
Development projects
Among the “development” type, projects of the “ordering/inventory” type have the smallest median value of 4.3 staff per month, while projects of the “accounting”, “sales”, or “general management” type have larger median values (7.3, 7.4, and 7.4 staff per month, respectively).

Enhancement projects
As opposed to the “development” projects, the median value of “enhancement” projects of the “ordering/inventory” type (6.8 staff per month) is not the smallest.

Table 5-6-19 Per-Business-Type Head-Count Per Month Project Count (Development)
Table 5-6-20 Per-Business-Type Head-Count Per Month Project Count (Enhancement)
202_Business type  Number of Projects

(Development)
A : Management/planning 4
B : Accounting 25
C : Sales 45
D : Production/distribution 8
E : Personnel/welfare 3
F : General management 30
G : General affairs 5
H : Research/development 5
I : Technology/control 19
J : Master management 1
K : Ordering/inventory 21
L : Distribution management 7
N : Contract/transfer 8
O : Customer management 14
P : Product planning (per-product) 1
Q : Product management (per-product) 6
S : Information analysis 18
T : Other 53
No answer 21
Total 294

(Enhancement)
a : Management/planning 1
b : Accounting 8
c : Sales 30
d : Production/distribution 4
e : Personnel/welfare 7
f : General management 21
g : General affairs 7
h : Research/development 5
i : Technology/control 19
j : Master management 3
k : Ordering/inventory 14
m : Subcontractor management 1
n : Contract/transfer 6
o : Customer management 11
q : Product management (per-product) 5
s : Information analysis 6
t : Other 49
No answer 12
Total 209
Table 5-6-21 Basic Statistics of Per-Business-Type Head-Count Per Month (Development)
(unit: persons)
Business Type  N  Min  P25  Med  P75  Max  Mean  S.D.
b : Accounting 25 0.5 3.4 7.3 29.2 87.4 18.5 21.6
c : Sales 45 1.2 4.7 7.4 17.9 114.6 17.5 23.4
f : General management 30 0.6 2.9 7.4 12.8 39.1 10.7 11.0
i : Technology/control 19 1.0 3.9 6.7 9.1 32.8 8.5 8.0
k : Ordering/inventory 21 1.5 2.7 4.3 11.1 62.3 11.3 16.5
o : Customer management 14 0.5 4.0 4.8 9.9 80.9 15.0 23.3
s : Information analysis 18 0.4 3.7 5.8 19.2 50.5 11.4 12.7
Table 5-6-22 Basic Statistics of Per-Business-Type Head-Count Per Month (Enhancement)
(unit: persons)
Business Type  N  Min  P25  Med  P75  Max  Mean  S.D.
b : Accounting 8 2.1 2.6 3.6 9.7 68.3 14.0 23.2
c : Sales 30 1.2 3.0 7.3 13.9 57.2 12.0 13.8
f : General management 21 0.3 3.3 7.8 9.9 28.1 8.1 6.5
i : Technology/control 19 2.2 8.3 9.1 14.4 23.2 11.0 6.0
k : Ordering/inventory 14 3.1 4.3 6.8 7.6 14.4 6.6 3.0
o : Customer management 11 1.6 2.8 3.9 6.5 11.0 5.0 3.3
s : Information analysis 6 0.4 2.6 4.8 24.4 170.5 35.6 67.1
6 Analysis of the Relationship among Effort, Development Schedule, and Size
6.1 Scope of This Chapter
This chapter analyzes the relationship among effort, development schedule, size, and other factors.

6.1.1 Introduction
Major factors analyzed in Chapter 6 include effort, development schedule, size (FP and SLOC), FP_productivity (FP size per effort), and SLOC_productivity (SLOC size per effort). Table 6-1-1 shows the combinations of major factors analyzed and presented in this chapter. These factors were analyzed with stratification by different characteristics such as “all project types”, “development projects”, “industry types”, or “system architectures”.
Table 6-1-2 shows which factor is analyzed in this chapter against what characteristic with what stratification. This table lists factors in the topmost row and characteristics in the second row to present the combinations of factors and characteristics. (For example, effort against development schedule, and FP size against effort.) The third and succeeding rows show in what strata analyses were made (for example, project types or industry types). The numbers (x.x.x) in Table 6-1-2 are the section numbers of this chapter. You can pick a section number in the table and look up the row and column intersecting at the number to know what kind of analysis is presented in that section.
Section 6.3 analyzes the relationship between effort and development schedule, with no consideration of FP or SLOC size.
Sections 6.4 and 6.5 analyze the FP size, and sections 6.6 and 6.7 analyze the SLOC size. Section 6.8 analyzes the relationship between the FP size and SLOC size.
This chapter uses the note “Mixed_FP_measurement_methods” to refer to an aggregate of projects each of which has a clearly written FP “measurement method name”, including IFPUG, SPR, NESMA estimated method, and other methods proprietary to companies. This chapter also uses the note “Mixed primary programming languages” to refer to an aggregate of projects each of which has a primary programming language name written down in its “Primary programming language_1, _2, or _3”. See Chapter 3 for more information on the “IFPUG_group” and “major_programming_language_groups”.
Based on the FP measurement method, the FP size was analyzed under the following sampling criteria: the “mixed_FP_measurement_methods” and the “IFPUG_group”. The SLOC size was analyzed with stratification by the “mixed primary programming languages” or the “major_programming_language_group”, based on the programming language actually used by the projects.

Table 6-1-1 Combinations of Major Factors
[Matrix omitted: rows and columns both list FP size, SLOC size, Effort, Development schedule, FP_productivity, SLOC_productivity, Reliability (number of defects, defect density), Personnel assignment (Number_of_staff_per_month), and Outsourcing ratio; cells mark the factor pairs analyzed. The combinations involving Reliability are analyzed in Chapter 7.]
Table 6-1-2 Combinations of Factors, Characteristics, and Stratification
[Matrix omitted: the original table maps each factor (effort, development schedule, FP size, SLOC size, FP_productivity, SLOC_productivity, Number_of_staff_per_month, Outsourcing_ratio), its sampling criterion (mixed FP measurement methods, IFPUG_group, mixed primary programming languages, primary_programming_language_group), and its analysis scope (whole project or major development phases) to the section presenting the analysis. The section numbers, grouped by project type and stratum, are:

Development projects
• All project types: 6.3.1, 6.3.2, 6.4.1 to 6.4.4, 6.5.1, 6.5.2, 6.5.7 to 6.5.10, 6.6.1, 6.6.2
• Type of industry: 6.3.3, 6.4.5, 6.5.3, 6.6.8, 6.7.3
• Architecture: 6.3.4, 6.4.6, 6.5.4, 6.6.9, 6.7.4
• Primary programming language: 6.3.5, 6.5.5, 6.6.3, 6.6.4 to 6.6.7, 6.7.1, 6.7.2, 6.7.6, 6.7.7, 6.8.1
• Platform: 6.5.6, 6.7.5

Enhancement projects
• All project types: 6.3.6, 6.3.7, 6.4.7, 6.4.8, 6.5.11, 6.5.12, 6.5.17, 6.5.18
• Type of industry: 6.3.8, 6.4.9, 6.5.13, 6.6.15, 6.7.10
• Architecture: 6.3.9, 6.4.10, 6.5.14, 6.6.16, 6.7.11
• Primary programming language: 6.3.10, 6.5.15, 6.6.10, 6.6.11 to 6.6.14, 6.7.8, 6.7.9, 6.7.13, 6.7.14
• Platform: 6.5.16, 6.7.12]
6.1.2 Analyzed Data
The data analyzed in this chapter is the same data set defined in Section 5.1.1 “Analyzed Data”, unless otherwise noted. Where a different data set is used, the difference is noted in the description of the relevant stratification. If an analysis is made for the whole development schedule of projects, for example, this condition is written where the analysis is presented.

6.1.3 Analysis Procedure
The analyses presented in this chapter were made in accordance with the procedure described in Section 3.1.2. The data was analyzed and examined with the “stratification” shown in Table 6-1-2.

6.1.4 Analysis Approach
The relationship among effort, development schedule, FP size, and SLOC size was analyzed in terms of two kinds of correlation, linear and exponential; R2 was then calculated for each to choose and present in Chapter 6 whichever model exhibits the better trend. See Appendix F for more information.
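As a sketch of this model-selection step, the snippet below fits both a linear model and a log-linear (power-law) model of the form y = A × x^B, which matches the Months = A × (effort)^B equations presented later in this chapter, and compares their R2 values. The data and helper names are invented for illustration; this is not SEC's analysis code.

```python
import math

def r_squared(y, y_hat):
    """Coefficient of determination R^2."""
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))
    ss_tot = sum((a - mean_y) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot

def fit_linear(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def fit_power(x, y):
    """Least squares in log-log space for y = A * x**B; returns (A, B)."""
    log_a, b = fit_linear([math.log(v) for v in x], [math.log(v) for v in y])
    return math.exp(log_a), b

# Invented toy data that roughly follows months ~ effort**(1/3).
effort = [1_000, 8_000, 27_000, 64_000, 125_000]   # person-hours
months = [3.1, 6.2, 8.9, 12.4, 15.3]

a_lin, b_lin = fit_linear(effort, months)
A, B = fit_power(effort, months)
r2_lin = r_squared(months, [a_lin + b_lin * e for e in effort])
r2_pow = r_squared(months, [A * e ** B for e in effort])
best = "power" if r2_pow > r2_lin else "linear"
print(best, round(B, 2))  # → power 0.33
```

For this toy data, the power model wins with an exponent near 1/3, the cube-root trend the White Paper reports.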
6.2 Distribution of Major Factors
Major factors presented in Chapter 6 include effort, development schedule, and size (FP size and SLOC size). Chapter 5 presents the distribution of these factors in histograms and basic statistics. Use these histograms and statistics as a reference when you examine the relationships among the factors analyzed in Section 6.3 and later.

Table 6-2-1 Major Factors and Associated Sections
Factors: Associated sections
FP size: 5.2
SLOC size: 5.3
Development schedule: 5.4
Effort: 5.5
Number_of_staff_per_month: 5.6
6.3 Effort and Development Schedule
This section presents the relationship between effort and development schedule. Some of the data item names presented in this section carry the note “Derived indicator”. See Appendix A.4 for the definitions and derivation methods of those data items.

6.3.1 Effort and Development Schedule: Development, Whole Project
This section presents the relationship between the actual whole-project effort (including effort in the major development phases) and the actual whole-project development schedule (months) of the development projects that went through the major development phases (from basic design to system test).
Note that the whole-project data may include data about system planning and/or system test as well as data about the major development phases.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Actual_effort_ (whole_project) > 0
• Actual_months_ (whole_project) > 0

Analyzed data
• X-axis: Actual_effort_ (whole_project) (Derived indicator)
• Y-axis: Actual_months_ (whole_project) (Derived indicator)
The following equation gives the approximate correlation found here between effort and development schedule.

Months = A × (effort)^B, where B = 0.32 and R2 = 0.55

The trend found here is that the whole-project development schedule (from system planning to acceptance test *1) is roughly proportional to the cube root of effort.
Very few projects are found under the lower boundary of the 95% confidence interval, as shown in the scattergram. The above trend can therefore serve only as a reference for evaluating the feasibility of a whole-project development schedule based on whole-project effort; the correlation between effort and development schedule varies more or less from project to project.

*1) Including projects that went through the major development phases only.

Figure 6-3-1 Whole-Project Effort and Development Schedule (Development) with Confidence Intervals of 50% and 95%
[Scattergram omitted (N=645): X-axis: Actual_effort_ (Whole_project) [person-hours], 0 to 1,000,000; Y-axis: Actual_months_ (Whole_project) [months], 0 to 50; with 50% and 95% confidence-interval curves]
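Because the fitted relationship has the form Months = A × (effort)^B, the ratio between two expected schedules depends only on the effort ratio, and the constant A (which the White Paper does not tabulate) cancels out. A minimal sketch, assuming B = 0.32 as reported above:

```python
# Schedule scaling implied by Months = A * effort**B with B = 0.32,
# the exponent reported for whole development projects. The constant A
# cancels when comparing two effort levels, so it is not needed here.
B = 0.32

def schedule_ratio(effort_ratio, b=B):
    """Factor by which the expected schedule grows when effort grows by effort_ratio."""
    return effort_ratio ** b

# Doubling effort stretches the expected schedule by only about 25%,
# and an 8x effort increase roughly doubles it.
print(round(schedule_ratio(2), 2))   # → 1.25
print(round(schedule_ratio(8), 2))   # → 1.95
```

This illustrates why schedules compress poorly: cutting the schedule in half would require roughly 1 / (0.5 ** (1 / 0.32)) ≈ 8.7 times the effort under this model.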
6.3.2 Effort and Development Schedule: Development
This section presents the relationship between the actual effort in the major development phases and development schedule (months) of the development projects that went through the major development phases (from basic design to system test).
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Actual_effort_ (major_development_phases) > 0
• Actual_months_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_effort_ (major_development_phases) (Derived indicator)
• Y-axis: Actual_months_ (major_development_phases) (Derived indicator)
The following equation gives the approximate correlation found here between effort and development schedule.

Months = A × (effort)^B, where B = 0.31 and R2 = 0.51

The trend found here is that the development schedule is roughly proportional to the cube root of effort, though the correlation between effort and development schedule varies more or less from project to project.

Figure 6-3-2 Major-Development-Phase Effort and Development Schedule (Development) with Confidence Intervals of 50% and 95%
[Scattergram omitted (N=283): X-axis: Actual_effort_ (Major_development_phases) [person-hours], 0 to 300,000; Y-axis: Actual_months_ (Major_development_phases) [months], 0 to 35; with 50% and 95% confidence-interval curves]
6.3.3 Industry-Type-Based Effort and Development Schedule: Development
This section presents the relationship between the actual effort in the major development phases and development schedule (months) of the development projects that went through the major development phases (from basic design to system test) with stratification by the five major industry types to which many of the sampled projects belong.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 201_Industry_type_1, _2, and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”.
• Actual_effort_ (major_development_phases) > 0
• Actual_months_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_effort_ (major_development_phases) (Derived indicator)
• Y-axis: Actual_months_ (major_development_phases) (Derived indicator)
Many projects of the “finance and insurance” type have larger effort and longer development schedules than those of other types. Projects of the “government” type tend to use effort of 80,000 person-hours or less over relatively long development schedules, with some exceptions.
116 IPA/SEC White Paper 2007 on Software Development Projects in Japan
Figure 6-3-3 Industry-Type-Based Effort and Development Schedule (Development)
[Scattergram omitted (N=221): X-axis: Actual_effort_ (Major_development_phases) [person-hours], 0 to 300,000; Y-axis: Actual_months_ (Major_development_phases) [months], 0 to 35; series: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government]
6.3.4 Architecture-Based Effort and Development Schedule: Development
This section presents on a per-system-architecture basis the relationship between the actual effort in the major development phases and development schedule (months) of the development projects that went through the major development phases (from basic design to system test). Because of multiple choices allowed for the system architecture alternatives, projects were categorized based on “Architecture_1, _2, or _3”, whichever was relevant.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 308_Architecture_1, _2, or _3 has a defined value.
• Actual_effort_ (major_development_phases) > 0
• Actual_months_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_effort_ (major_development_phases) (Derived indicator)
• Y-axis: Actual_months_ (major_development_phases) (Derived indicator)
Many projects of the “intranet/Internet” or “3-layer client/server” architecture have larger effort and longer development schedules than those of other architectures.

Figure 6-3-4 Architecture-Based Effort and Development Schedule (Development)
[Scattergram omitted (N=264): X-axis: Actual_effort_ (Major_development_phases) [person-hours], 0 to 300,000; Y-axis: Actual_months_ (Major_development_phases) [months], 0 to 35; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
6. Analysis of the relationship among effort, development schedule, and size
IPA/SEC White Paper 2007 on Software Development Projects in Japan 117
6.3.5 Primary-Programming-Language-Based Effort and Development Schedule: Development
This section presents on a per-primary-programming-language basis the relationship between the actual
effort in the major development phases and development schedule (months) of the development projects that went through the major development phases (from basic design to system test). Because of multiple choices allowed for the primary programming language alternatives, projects were categorized based on “Primary programming language_1, _2, or _3”, whichever was relevant.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_effort_ (major_development_phases) > 0
• Actual_months_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_effort_ (major_development_phases) (Derived indicator)
• Y-axis: Actual_months_ (major_development_phases) (Derived indicator)
Most projects have a development schedule of 20 months or less regardless of their actual effort, with a few exceptions.

Figure 6-3-5 Primary-Programming-Language-Based Effort and Development Schedule (Development)
[Scattergram omitted (N=230): X-axis: Actual_effort_ (Major_development_phases) [person-hours], 0 to 300,000; Y-axis: Actual_months_ (Major_development_phases) [months], 0 to 35; series: b: COBOL, g: C, h: VB, q: Java]
6.3.6 Effort and Development Schedule: Enhancement, Whole Project
This section presents the relationship between the actual whole-project effort (including effort in the major development phases) and development schedule (months) of the enhancement projects that went through the major development phases (from basic design to system test). Note that the whole-project data may include data about system planning and/or system test as well as data about the major development phases.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Actual_effort_ (whole_project) > 0
• Actual_months_ (whole_project) > 0

Analyzed data
• X-axis: Actual_effort_ (whole_project) (Derived indicator)
• Y-axis: Actual_months_ (whole_project) (Derived indicator)
The following equation gives the approximate correlation found here between effort and development schedule:
Months = A × (Effort)^B, where B = 0.29 and R² = 0.47
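A power-law relationship of this form can in principle be recovered by ordinary least squares on log-transformed data. The sketch below uses synthetic data, since the raw project records are not published; it illustrates the fitting technique only, not the White Paper's actual computation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic (effort, months) pairs following months = A * effort**B with
# multiplicative noise, standing in for the (unpublished) project records.
A_true, B_true = 0.35, 0.29
effort = rng.uniform(500, 300_000, size=400)            # person-hours
months = A_true * effort**B_true * rng.lognormal(0, 0.25, size=400)

# A power law is linear in log space: log(months) = log(A) + B*log(effort),
# so OLS on the logs recovers the exponent B.
logx, logy = np.log(effort), np.log(months)
B_hat, logA_hat = np.polyfit(logx, logy, 1)

# Coefficient of determination in log space.
resid = logy - (logA_hat + B_hat * logx)
r2 = 1 - resid.var() / logy.var()

print(round(B_hat, 2))
```

The fitted exponent should land close to the simulated B = 0.29; the R² depends on the assumed noise level and is not calibrated to the White Paper's value.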
118 IPA/SEC White Paper 2007 on Software Development Projects in Japan
Figure 6-3-6 Whole-Project Effort and Development Schedule (Enhancement) with Confidence Intervals of 50% and 95%
[Scatter plot, N = 405, with y(50%), y(−50%), y(95%), and y(−95%) interval curves. X-axis: Actual_effort_ (Whole_project) [person-hours], 0–400,000. Y-axis: Actual_months_ (Whole_project) [months], 0–60.]
6.3.7 Effort and Development Schedule: Enhancement
This section presents the relationship between the actual effort in the major development phases and development schedule (months) of the enhancement projects that went through the major development phases (from basic design to system test).
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “b: Maintenance/Support” or
“d: Enhancement”. • Actual_effort_ (major_development_phases) > 0 • Actual_months_ (major_development_phases) > 0
Analyzed data • X-axis: Actual_effort_
(major_development_phases) (Derived indicator)
• Y-axis: Actual_months_ (major_development_phases) (Derived indicator)
The following equation gives the approximate correlation found here between effort and development schedule:
Months = A × (Effort)^B, where B = 0.31 and R² = 0.37
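The exact construction of the 50% and 95% interval curves shown in the figures is not specified in this section. One plausible sketch, on synthetic data, builds bands from residual quantiles around a log-log fit; this is an assumption about the method, not the SEC's documented procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic effort/months data around months = A * effort**B (illustrative only).
A, B = 0.35, 0.31
effort = rng.uniform(1_000, 400_000, size=300)
months = A * effort**B * rng.lognormal(0, 0.3, size=300)

# Fit in log space, then take residual quantiles to form bands expected to
# contain ~50% and ~95% of the observations.
logx, logy = np.log(effort), np.log(months)
slope, intercept = np.polyfit(logx, logy, 1)
resid = logy - (intercept + slope * logx)

lo50, hi50 = np.quantile(resid, [0.25, 0.75])
lo95, hi95 = np.quantile(resid, [0.025, 0.975])

def band(e, offset):
    """Months predicted at effort e, shifted by a residual-quantile offset."""
    return np.exp(intercept + offset) * e**slope

e = 100_000  # person-hours
print(band(e, lo50) < band(e, 0.0) < band(e, hi50))  # True
```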
Figure 6-3-7 Major-Development-Phase Effort and Development Schedule (Enhancement) with Confidence Intervals of 50% and 95%
[Scatter plot, N = 197, with y(50%), y(−50%), y(95%), and y(−95%) interval curves. X-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–400,000. Y-axis: Actual_months_ (Major_development_phases) [months], 0–60.]
6. Analysis of the relationship among effort, development schedule, and size
6.3.8 Industry-Type-Based Effort and Development Schedule: Enhancement
This section presents the relationship between the actual effort in the major development phases and
development schedule (months) of the enhancement projects that went through the major development phases (from basic design to system test) with stratification by the five major industry types to which many of the sampled projects belong.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “b: Maintenance/Support” or
“d: Enhancement”. • Any of the three data items, 201_Industry_type_1, _2,
and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”.
• Actual_effort_ (major_development_phases) > 0 • Actual_months_ (major_development_phases) > 0
Analyzed data • X-axis: Actual_effort_
(major_development_phases) (Derived indicator)
• Y-axis: Actual_months_ (major_development_phases) (Derived indicator)
Many projects of the “information and communications” type show larger effort with shorter development schedules than projects of other industry types. Projects of the “government” type tend to have longer development schedules for their maintenance work.
Figure 6-3-8 Industry-Type-Based Effort and Development Schedule (Enhancement)
[Scatter plot, N = 165. X-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–400,000. Y-axis: Actual_months_ (Major_development_phases) [months], 0–60. Legend: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government.]
6.3.9 Architecture-Based Effort and Development Schedule:
Enhancement
This section presents on a per-system-architecture basis the relationship between the actual effort in the major development phases and development schedule (months) of the enhancement projects that went through the major development phases (from basic design to system test). Because of multiple choices allowed for the system architecture alternatives, projects were categorized based on “Architecture_1, _2, or _3”, whichever was relevant.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “b: Maintenance/Support” or
“d: Enhancement”. • 308_Architecture_1, _2, or _3 has a defined value. • Actual_effort_ (major_development_phases) > 0 • Actual_months_ (major_development_phases) > 0
Analyzed data • X-axis: Actual_effort_
(major_development_phases) (Derived indicator)
• Y-axis: Actual_months_ (major_development_phases) (Derived indicator)
Many projects of the 2-layer client/server architecture have longer development schedules than projects with other architectures.
Figure 6-3-9 Architecture-Based Effort and Development Schedule (Enhancement)
[Scatter plot, N = 175. X-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–400,000. Y-axis: Actual_months_ (Major_development_phases) [months], 0–60. Legend: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet.]
6.3.10 Primary-Programming-Language-Based Effort and
Development Schedule: Enhancement
This section presents on a per-primary-programming-language basis the relationship between the actual effort in the major development phases and development schedule (months) of the enhancement projects that went through the major development phases (from basic design to system test). Because of multiple choices allowed for the primary programming language alternatives, projects were categorized based on “Primary programming language_1, _2, or _3”, whichever was relevant.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “b: Maintenance/Support” or
“d: Enhancement”. • Any of the three data items,
312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_effort_ (major_development_phases) > 0 • Actual_months_ (major_development_phases) > 0
Analyzed data • X-axis: Actual_effort_
(major_development_phases) (Derived indicator)
• Y-axis: Actual_months_ (major_development_phases) (Derived indicator)
Projects using COBOL as their primary programming language tend to complete their enhancement activities within a limited range of schedule lengths, regardless of the amount of effort.

Figure 6-3-10 Primary-Programming-Language-Based Effort and Development Schedule (Enhancement)
[Scatter plot, N = 167. X-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–400,000. Y-axis: Actual_months_ (Major_development_phases) [months], 0–35. Legend: b: COBOL, g: C, h: VB, q: Java.]
6.4 FP Size and Effort
This section presents the relationship between FP size and effort. Some of the data item names presented in this section are accompanied by the note “Derived indicator”; Appendix A.4 describes the definitions and derivation methods of those data items.
This section analyzes projects that have valid FP size values and defined FP measurement methods. The first part presents, for every project type, the analysis results of the projects categorized under the mixed_FP_measurement_methods, thereby showing the whole picture. The second part analyzes only IFPUG_group projects to ensure the reliability of the FP size values.

6.4.1 FP Size and Effort: All Project Types, Mixed_FP_Measurement_Methods
This section presents the relationship between the FP size and effort of projects of every type (development, maintenance/support, redevelopment, and enhancement) that are categorized as the mixed_FP_measurement_methods.
Stratification criteria • Projects that went through the major development phases • 103_Project_type has a defined value. • 701_FP_measurement_method_ (actual) has a defined
value. • 5001_Actual_FP_size_ (unadjusted) > 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: Actual_effort_
(major_development_phases) (Derived indicator)
Projects of the “development” type have larger FP sizes and larger effort than those of other types.
Figure 6-4-1 FP Size and Effort (All Project Types, Mixed_FP_Measurement_Methods) with Confidence Interval of 50%
[Scatter plot, N = 451, with y(50%) and y(−50%) interval curves. X-axis: Actual_FP_Size_ (unadjusted), 0–16,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–300,000. Legend: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement.]
Figure 6-4-2 FP Size and Effort (All Project Types, Mixed_FP_Measurement_Methods) Magnified with Confidence Interval of 50% (FP ≤ 2,000 and effort ≤ 50,000)
[Scatter plot, N = 451, with y(50%) and y(−50%) interval curves. X-axis: Actual_FP_Size_ (unadjusted), 0–2,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–50,000.]
Figure 6-4-3 FP Size and Effort (All Project Types, Mixed_FP_Measurement_Methods) Logarithmic Scale
[Log-log scatter plot, N = 451. X-axis: Actual_FP_Size_ (unadjusted), 1–100,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 1–1,000,000.]
6.4.2 FP Size and Effort: All Project Types, IFPUG_Group
This section presents the relationship between the FP size and effort of projects of every type (development, maintenance/support, redevelopment, and enhancement) whose FP measurement method belongs to the IFPUG_group.
Stratification criteria • Projects that went through the major development phases • 103_Project_type has a defined value. • 701_FP_measurement_method_ (actual) =
“a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”. • 5001_Actual_FP_size_ (unadjusted) > 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: Actual_effort_
(major_development_phases) (Derived indicator)
Projects of the “development” type have larger FP sizes and larger effort than those of other types.
Figure 6-4-4 FP Size and Effort (All Project Types, IFPUG_Group) with Confidence Interval of 50%
[Scatter plot, N = 301, with y(50%) and y(−50%) interval curves. X-axis: Actual_FP_Size_ (unadjusted), 0–16,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–300,000. Legend: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement.]
Figure 6-4-5 FP Size and Effort (All Project Types, IFPUG_Group) Magnified with Confidence Interval of 50% (FP ≤ 2,000 and effort ≤ 50,000)
[Scatter plot, N = 301, with y(50%) and y(−50%) interval curves. X-axis: Actual_FP_Size_ (unadjusted), 0–2,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–50,000.]
Figure 6-4-6 FP Size and Effort (All Project Types, IFPUG_Group) Logarithmic Scale
[Log-log scatter plot, N = 301. X-axis: Actual_FP_Size_ (unadjusted), 1–100,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 1–1,000,000. Legend: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement.]
6.4.3 FP Size and Effort: Development, Mixed_FP_Measurement_Methods
This section presents the relationship between the FP size and effort of the development projects that are
categorized as the mixed_FP_measurement_methods. One of the graphs presented in this section has logarithmic-scale X- and Y-axes.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “a: Development” • 701_FP_measurement_method_ (actual) has a defined
value. • 5001_Actual_FP_size_ (unadjusted) > 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: Actual_effort_
(major_development_phases) (Derived indicator)
The following equation gives the approximate correlation found here between FP size and effort:
Effort = A × (FP size)^B, where B = 1.13 and R² = 0.73
FP size and effort show a fairly strong positive correlation.
Figure 6-4-7 FP Size and Effort (Development, Mixed_FP_Measurement_Methods) with Confidence Interval of 50%
[Scatter plot, N = 340, with y(50%) and y(−50%) interval curves. X-axis: Actual_FP_Size_ (unadjusted), 0–16,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–300,000.]
Figure 6-4-8 FP Size and Effort (Development, Mixed_FP_Measurement_Methods) Logarithmic Scale
[Log-log scatter plot, N = 340. X-axis: Actual_FP_Size_ (unadjusted), 1–100,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 1–1,000,000.]
6.4.4 FP Size and Effort: Development, IFPUG_Group
This section presents the relationship between the FP size and effort of the development projects whose FP measurement method belongs to the IFPUG_group. One of the graphs presented in this section has logarithmic-scale X- and Y-axes.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “a: Development” • 701_FP_measurement_method_ (actual) =
“a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”. • 5001_Actual_FP_size_ (unadjusted) > 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: Actual_effort_
(major_development_phases) (Derived indicator)
The following equation gives the approximate correlation found here between FP size and effort:
Effort = A × (FP size)^B, where B = 1.17 and R² = 0.78
FP size and effort show a strong positive correlation.
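Only the exponent B and the coefficient of determination are reported; the constant A is not given here. If A is calibrated from an organization's own completed project, the model can serve as a rough effort estimator. The calibration point below is hypothetical:

```python
B = 1.17  # exponent reported for development projects in the IFPUG_group

# Hypothetical calibration point from one's own data: a 500-FP project
# that took 9,000 person-hours in the major development phases.
fp_ref, effort_ref = 500, 9_000
A = effort_ref / fp_ref**B  # solve Effort = A * FP**B for A

def predict_effort(fp_size):
    """Rough major-development-phase effort estimate (person-hours)."""
    return A * fp_size**B

# Doubling the FP size scales effort by 2**1.17 ≈ 2.25, i.e. slightly
# faster than linear, consistent with B > 1.
print(round(predict_effort(1_000) / predict_effort(500), 2))  # 2.25
```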
Figure 6-4-9 FP Size and Effort (Development, IFPUG_Group) with Confidence Interval of 50%
[Scatter plot, N = 218, with y(50%) and y(−50%) interval curves. X-axis: Actual_FP_Size_ (unadjusted), 0–16,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–300,000.]
Figure 6-4-10 FP Size and Effort (Development, IFPUG_Group) Magnified with Confidence Interval of 50% (FP ≤ 2,000 and effort ≤ 60,000)
[Scatter plot, N = 218, with y(50%) and y(−50%) interval curves. X-axis: Actual_FP_Size_ (unadjusted), 0–2,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–60,000.]
Figure 6-4-11 FP Size and Effort (Development, IFPUG_Group) Logarithmic Scale
[Log-log scatter plot, N = 218. X-axis: Actual_FP_Size_ (unadjusted), 1–100,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 1–1,000,000.]
6.4.5 Industry-Type-Based FP Size and Effort:
Development, IFPUG_Group
This section presents on a per-major-industry-type basis the relationship between the FP size and effort of the development projects whose FP measurement method belongs to the IFPUG_group. The major industry types presented in this section are the five industry types to which many of the sampled projects belong.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “a: Development” • Any of the three data items, 201_Industry_type_1, _2,
and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”.
• 701_FP_measurement_method_ (actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”.
• 5001_Actual_FP_size_ (unadjusted)> 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: Actual_effort_
(major_development_phases) (Derived indicator)
Projects of the “finance and insurance” type have larger effort than those of other industry types regardless of their FP size.

Figure 6-4-12 Industry-Type-Based FP Size and Effort (Development, IFPUG_Group)
[Scatter plot, N = 136. X-axis: Actual_FP_Size_ (unadjusted), 0–16,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–300,000. Legend: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government.]
6.4.6 Architecture-Based FP Size and Effort: Development, IFPUG_Group
This section presents on a per-system-architecture basis the relationship between the FP size and effort of the
development projects whose FP measurement method belongs to the IFPUG_group. Because of multiple choices allowed for the system architecture alternatives, projects were categorized based on “Architecture_1, _2, or _3”, whichever was relevant.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “a: Development” • 308_Architecture_1, _2, or _3 has a defined value. • 701_FP_measurement_method_ (actual) =
“a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”. • 5001_Actual_FP_size_ (unadjusted) > 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: Actual_effort_
(major_development_phases) (Derived indicator)
Many projects of the “intranet/Internet” or “client/server” type are large in both the FP size and effort.
Figure 6-4-13 Architecture-Based FP Size and Effort (Development, IFPUG_Group)
[Scatter plot, N = 212. X-axis: Actual_FP_Size_ (unadjusted), 0–16,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–300,000. Legend: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet.]
6.4.7 FP Size and Effort: Enhancement, Mixed_FP_Measurement_Methods
This section presents the relationship between the FP size and effort of the enhancement projects that are
categorized as the mixed_FP_measurement_methods. One of the graphs presented in this section has logarithmic-scale X- and Y-axes.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “b: Maintenance/Support” or
“d: Enhancement”. • 701_FP_measurement_method_ (actual) has a defined
value. • 5001_Actual_FP_size_ (unadjusted)> 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: Actual_effort_
(major_development_phases) (Derived indicator)
The following equation gives the approximate correlation found here between FP size and effort:
Effort = A × (FP size)^B, where B = 0.94 and R² = 0.49
The trend found here is that, as FP size increases, the effort of enhancement projects does not grow as much as that of development projects.

Figure 6-4-14 FP Size and Effort (Enhancement, Mixed_FP_Measurement_Methods) with Confidence Interval of 50%
[Scatter plot, N = 93, with y(50%) and y(−50%) interval curves. X-axis: Actual_FP_Size_ (unadjusted), 0–5,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–120,000.]
Figure 6-4-15 FP Size and Effort (Enhancement, Mixed_FP_Measurement_Methods) Logarithmic Scale
[Log-log scatter plot, N = 93. X-axis: Actual_FP_Size_ (unadjusted), 1–10,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 1–1,000,000.]
6.4.8 FP Size and Effort: Enhancement, IFPUG_Group
This section presents the relationship between the FP size and effort of the enhancement projects whose FP measurement method belongs to the IFPUG_group. One of the graphs presented in this section has logarithmic-scale X- and Y-axes.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “b: Maintenance/Support” or
“d: Enhancement”. • 701_FP_measurement_method_ (actual) =
“a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”. • 5001_Actual_FP_size_ (unadjusted)> 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: Actual_effort_
(major_development_phases) (Derived indicator)
The following equation gives the approximate correlation found here between FP size and effort:
Effort = A × (FP size)^B, where B = 0.95 and R² = 0.43
The trend found here is that, as FP size increases, the effort of enhancement projects does not grow as much as that of development projects.

Figure 6-4-16 FP Size and Effort (Enhancement, IFPUG_Group) with Confidence Interval of 50%
[Scatter plot, N = 65, with y(50%) and y(−50%) interval curves. X-axis: Actual_FP_Size_ (unadjusted), 0–5,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–120,000.]
Figure 6-4-17 FP Size and Effort (Enhancement, IFPUG_Group) Logarithmic Scale
[Log-log scatter plot, N = 65. X-axis: Actual_FP_Size_ (unadjusted), 1–100,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 1–1,000,000.]
ur]
6.4.9 Industry-Type-Based FP Size and Effort: Enhancement, IFPUG_Group
This section presents on a per-major-industry-type basis the relationship between the FP size and effort of the
enhancement projects whose FP measurement method belongs to the IFPUG_group. The major industry types presented in this section are the five industry types to which many of the sampled projects belong.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “b: Maintenance/Support” or
“d: Enhancement”. • Any of the three data items, 201_Industry_type_1, _2,
and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”.
• 701_FP_measurement_method_ (actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”.
• 5001_Actual_FP_size_ (unadjusted) > 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: Actual_effort_
(major_development_phases) (Derived indicator)
Figure 6-4-18 Industry-Type-Based FP Size and Effort (Enhancement, IFPUG_Group)
[Scatter plot, N = 31. X-axis: Actual_FP_Size_ (unadjusted), 0–5,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–120,000. Legend: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government.]
6.4.10 Architecture-Based FP Size and Effort: Enhancement, IFPUG_Group
This section presents on a per-system-architecture basis the relationship between the FP size and effort of the
enhancement projects whose FP measurement method belongs to the IFPUG_group. Because of multiple choices allowed for the system architecture alternatives, projects were categorized based on “Architecture_1, _2, or _3”, whichever was relevant.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “b: Maintenance/Support” or
“d: Enhancement”. • 308_Architecture_1, _2, or _3 has a defined value. • 701_FP_measurement_method_ (actual) =
“a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”. • 5001_Actual_FP_size_ (unadjusted) > 0 • Actual_effort_ (major_development_phases) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted) • Y-axis: Actual_effort_ (major_development_phases) (Derived indicator)
Figure 6-4-19 Architecture-Based FP Size and Effort (Enhancement, IFPUG_Group)
[Scatter plot, N = 65. X-axis: Actual_FP_Size_ (unadjusted), 0–5,000. Y-axis: Actual_effort_ (Major_development_phases) [person-hours], 0–120,000. Legend: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet.]
6.5 FP_Productivity
This section presents the analysis results of FP_productivity. “FP_productivity” is the quotient obtained by dividing the FP size by the effort spent in the major development phases; that is, the FP size delivered per person-hour or per person-month. (A coefficient of 160 hours per person-month is assumed for conversion between person-hours and person-months.) Some of the data item names presented in this section are accompanied by the note “Derived indicator”; Appendix A.4 describes the definitions and derivation methods of those data items. This section analyzes projects that have valid FP size values and defined FP measurement methods.
The first part of this section presents, for every project type, the FP_productivity of the projects categorized under the mixed_FP_measurement_methods, thereby showing the whole picture. The second part analyzes only IFPUG_group projects to ensure the accuracy of the FP size, which is used as the numerator in calculating FP_productivity.

6.5.1 FP Size and FP_Productivity: Development, Mixed_FP_Measurement_Methods

This section presents the relationship between the FP size and FP_productivity of the development projects categorized under the mixed_FP_measurement_methods. The first part presents the whole picture of FP_productivity in a scattergram; the second part presents the basic statistics of FP_productivity on a per-FP-size-class basis.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “a: Development” • 701_FP_measurement_method_ (actual) has a defined
value. • 5001_Actual_FP_size_ (unadjusted) > 0 • FP_productivity (FP size / major-development-phases
effort) > 0
Analyzed data • X-axis: 5001_Actual FP size (unadjusted) • Y-axis: FP_productivity (FP size /
major-development-phases effort) (Derived indicator) [FPs / person hour]
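As a small illustration of the FP_productivity definition and the 160 hours/person-month conversion used in this chapter (the function names are illustrative, not data items from the SEC definition):

```python
HOURS_PER_PERSON_MONTH = 160  # conversion coefficient assumed in this chapter

def fp_productivity_per_hour(fp_size, effort_person_hours):
    """FP size delivered per person-hour of major-development-phase effort."""
    return fp_size / effort_person_hours

def fp_productivity_per_month(fp_size, effort_person_hours):
    """FP size per person-month, via the 160 hours/person-month coefficient."""
    return fp_productivity_per_hour(fp_size, effort_person_hours) * HOURS_PER_PERSON_MONTH

# Example: 0.104 FPs per person-hour corresponds to about 16.6 FPs
# per person-month (0.104 * 160 = 16.64).
print(round(fp_productivity_per_month(1040, 10_000), 2))  # 16.64
```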
Projects of 3,000 FPs or larger do not show high FP_productivity. Some projects of less than 1,000 FPs show high FP_productivity, with large variations.

Figure 6-5-1 FP Size and FP_Productivity (Development, Mixed_FP_Measurement_Methods)
[Scatter plot, N = 340. X-axis: Actual_FP_Size_ (unadjusted), 0–16,000. Y-axis: FP_productivity (FP size / major-development-phases effort) [FPs/person-hour], 0.0–0.9.]
Figure 6-5-2 FP Size and FP_Productivity (Development, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
[Box-and-whisker plot of FP_productivity (0.00–0.80) for the FP-size classes: less than 400 FPs; 400 FPs or more and less than 1,000 FPs; 1,000 FPs or more and less than 3,000 FPs; and 3,000 FPs or more.]

Table 6-5-3 FP Size and FP_Productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods)
(Unit: FPs/person-hour, FPs/160 person-hours)

Unit: FPs/person-hour
  FP size                       N    Min    P25    Med    P75    Max    Mean   S.D.
  All project types             340  0.013  0.061  0.104  0.195  0.837  0.149  0.126
  Less than 400 FPs             115  0.015  0.082  0.126  0.237  0.658  0.172  0.130
  400 to less than 1,000 FPs    106  0.014  0.062  0.112  0.185  0.837  0.150  0.126
  1,000 to less than 3,000 FPs   87  0.013  0.053  0.100  0.209  0.739  0.140  0.131
  3,000 FPs or more              32  0.018  0.046  0.063  0.104  0.312  0.086  0.063

Unit: FPs/160 person-hours
  FP size                       N    Min    P25    Med    P75    Max     Mean   S.D.
  All project types             340  2.13   9.75   16.64  31.17  133.88  23.83  20.15
  Less than 400 FPs             115  2.37   13.04  20.12  37.92  105.29  27.51  20.77
  400 to less than 1,000 FPs    106  2.30   9.99   18.00  29.63  133.88  24.03  20.14
  1,000 to less than 3,000 FPs   87  2.13   8.54   15.96  33.51  118.21  22.47  20.96
  3,000 FPs or more              32  2.94   7.33   10.12  16.66  49.89   13.70  10.04

Figure 6-5-4 FP_Productivity Distribution (Development, Mixed_FP_Measurement_Methods)
[Histogram of the number of projects per FP_productivity class, in steps of 0.025 from 0.000 up to 0.500, plus an “over 0.500” class; Y-axis: 0–70 projects.]
6.5.2 FP Size and FP_Productivity: Development, IFPUG_Group
This section presents the relationship between the FP size and FP_productivity of the development projects whose FP measurement method belongs to the IFPUG_group.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “a: Development” • 701_FP_measurement_method_ (actual) =
“a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”. • 5001_Actual_FP_size_ (unadjusted) > 0 • FP_productivity (FP size / major-development-phases
effort) > 0
Analyzed data • X-axis: 5001_Actual_FP_size_ (unadjusted)• Y-axis: FP_productivity (FP size /
major-development-phases effort) (Derived indicator) [FPs / person hour]
Projects of 1,000 FPs or larger have lower productivity than projects of less than 1,000 FPs. The FP_productivity varies widely in the range below 1,000 FPs.

Figure 6-5-5 FP Size and FP_Productivity (Development, IFPUG_Group)
[Scatter plot, N = 218. X-axis: Actual_FP_Size_ (unadjusted), 0–16,000. Y-axis: FP_productivity (FP size / major-development-phases effort) [FPs/person-hour], 0.0–0.8.]
Figure 6-5-6 FP-Size-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot
[Box-and-whisker plot of FP_productivity (0.00–0.80) for the FP-size classes: less than 400 FPs; 400 FPs or more and less than 1,000 FPs; 1,000 FPs or more and less than 3,000 FPs; and 3,000 FPs or more.]
Table 6-5-7 FP-Size-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)

(Unit: FPs/person-hour)
FP size                                    N    Min    P25    Med    P75     Max   Mean   S.D.
All projects                             218  0.013  0.053  0.081  0.148   0.739  0.119  0.109
Less than 400 FPs                         61  0.020  0.080  0.108  0.181   0.429  0.145  0.099
400 FPs or more and less than 1,000 FPs   66  0.014  0.057  0.081  0.150   0.521  0.122  0.106
1,000 FPs or more and less than 3,000 FPs 61  0.013  0.044  0.061  0.102   0.739  0.111  0.135
3,000 FPs or more                         30  0.018  0.042  0.063  0.099   0.225  0.076  0.047

(Unit: FPs/160 person-hour)
FP size                                    N    Min    P25    Med    P75     Max   Mean   S.D.
All projects                             218   2.13   8.51  12.94  23.72  118.21  19.02  17.43
Less than 400 FPs                         61   3.22  12.75  17.34  28.98   68.71  23.12  15.92
400 FPs or more and less than 1,000 FPs   66   2.30   9.10  13.03  24.01   83.31  19.49  16.91
1,000 FPs or more and less than 3,000 FPs 61   2.13   7.04   9.76  16.39  118.21  17.82  21.60
3,000 FPs or more                         30   2.94   6.72  10.01  15.88   35.97  12.11   7.45
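The table's two units are related by a fixed scale: FPs/160 person-hour treats 160 person-hours as one person-month, so each value is the FPs/person-hour figure scaled by 160 when computed from the raw data (e.g. the overall median 0.081 × 160 ≈ 13, consistent with the 12.94 shown; the small difference comes from rounding of the printed values). A sketch of the percentile computation behind the P25/Med/P75 columns, with hypothetical productivity values:

```python
# Sketch of the basic-statistics computation for tables such as Table 6-5-7.
# Uses linear-interpolation percentiles; the five sample values are hypothetical.

def percentile(sorted_vals, q):
    """Percentile with linear interpolation on an already-sorted list, 0 <= q <= 1."""
    idx = q * (len(sorted_vals) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_vals) - 1)
    frac = idx - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

fp_productivity = sorted([0.02, 0.05, 0.08, 0.10, 0.15])   # FPs/person-hour
p25 = percentile(fp_productivity, 0.25)
med = percentile(fp_productivity, 0.50)
p75 = percentile(fp_productivity, 0.75)
per_person_month = [v * 160 for v in fp_productivity]      # FPs/160 person-hour
print(p25, med, p75)
```

The same routine applied per FP-size class yields one table row per stratum.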
6.5.3 Industry-Type-Based FP_Productivity: Development, IFPUG_Group
This section presents on a per-major-target-industry-type basis the FP_productivity of the development
projects whose FP measurement method belongs to the IFPUG_group. The major target industry types presented in this section are the five major industry types to which many of the sampled projects belong.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 201_Industry_type_1, _2, and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
Projects of the finance and insurance type have lower productivity than those of other types in terms of both the mean and median. This low productivity cannot be attributed to industry-type-specific characteristics alone, so it is not meaningful to simply compare the productivity values of industry types with each other. See also the analysis results presented in Section 6.4.5.

Figure 6-5-8 Industry-Type-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot

Table 6-5-9 Industry-Type-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
(Unit: FPs/person-hour)
Industry Type (Major Type)           N    Min    P25    Med    P75    Max   Mean   S.D.
F: Manufacturing                    29  0.026  0.071  0.111  0.175  0.332  0.130  0.072
H: Information and communications   18  0.050  0.070  0.114  0.169  0.478  0.143  0.108
J: Wholesale/retail trade           24  0.041  0.060  0.067  0.093  0.739  0.117  0.145
K: Finance and insurance            53  0.013  0.029  0.040  0.059  0.116  0.048  0.027
R: Government, N.E.C.               12  0.020  0.074  0.099  0.131  0.266  0.113  0.069
[Box-and-whisker plot for Figure 6-5-8: FP_productivity, 0.00–0.50 FPs/person-hour, by type of industry (major type): F: Manufacturing; H: Information and communications; J: Wholesale/retail trade; K: Finance and insurance; R: Government]
6.5.4 Architecture-Based FP_Productivity: Development, IFPUG_Group
This section presents on a per-system-architecture basis the FP_productivity of the development projects
whose FP measurement method belongs to the IFPUG_group. Because of multiple choices allowed for the system architecture alternatives, projects were categorized based on “Architecture_1, _2, or _3”, whichever was relevant.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 308_Architecture_1, _2, or _3 has a defined value
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
Projects for architectures that exploit networks, such as the “client/server” and “intranet/Internet” architectures, generally have lower productivity than those of the stand-alone architecture. Many projects of the “3-layer client/server” or “intranet/Internet” architecture have large FP sizes with small variations in productivity. See also the size data presented in Figure 6-4-13.

Figure 6-5-10 Architecture-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot

Table 6-5-11 Architecture-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
(Unit: FPs/person-hour)
Architecture                N    Min    P25    Med    P75    Max   Mean   S.D.
a: Stand-alone             23  0.026  0.158  0.226  0.348  0.429  0.238  0.131
b: Mainframe                4  0.036  0.042  0.048  0.064  0.099  0.058  0.028
c: 2-layer client/server   41  0.014  0.064  0.100  0.173  0.739  0.149  0.148
d: 3-layer client/server   33  0.018  0.044  0.074  0.114  0.499  0.103  0.094
e: Intranet/Internet      111  0.013  0.053  0.070  0.104  0.478  0.093  0.071
[Box-and-whisker plot for Figure 6-5-10: FP_productivity, 0.00–0.70 FPs/person-hour, by architecture: a: Stand-alone; b: Mainframe; c: 2-layer client/server; d: 3-layer client/server; e: Intranet/Internet]
6.5.5 Primary-Programming-Language-Based FP_Productivity: Development, IFPUG_Group
This section presents the FP_productivity of the development projects whose FP measurement method is
the IFPUG_group, for each of the four major programming languages. Because of multiple choices allowed for the primary programming language alternatives, projects were categorized based on “Primary programming language_1, _2, or _3”, whichever was relevant.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
Projects that used “COBOL” as their primary programming language have lower FP_productivity in both the median and mean. Because simultaneous use of different programming languages is common, this kind of analysis also needs to consider the combinations of languages actually used, together with the platform-based analysis presented in Section 6.5.6.

Figure 6-5-12 Primary-Programming-Language-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot

Table 6-5-13 Primary-Programming-Language-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
(Unit: FPs/person-hour)
Primary programming language   N    Min    P25    Med    P75    Max   Mean   S.D.
b: COBOL                      31  0.014  0.037  0.053  0.068  0.225  0.059  0.039
g: C                          34  0.015  0.045  0.079  0.168  0.403  0.114  0.100
h: VB                         43  0.018  0.062  0.100  0.147  0.739  0.131  0.129
q: Java                       33  0.013  0.063  0.074  0.112  0.478  0.101  0.082
[Box-and-whisker plot for Figure 6-5-12: FP_productivity, 0.00–0.50 FPs/person-hour, by primary programming language: b: COBOL; g: C; h: VB; q: Java]
6.5.6 Platform-Based FP_Productivity: Development, IFPUG_Group
This section presents on a per-target-platform basis the FP_productivity of the development projects whose FP measurement method belongs to the IFPUG_group. Because of multiple choices allowed for the target_platform_alternatives, projects were categorized into the Windows platform and Unix platform (see Appendix A.4) based on “Target platform_1, _2, or _3”, whichever was relevant.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Windows platform or Unix platform (Derived indicator) derived from 309_Target_platform_1, _2, or _3
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
Projects of the Windows platform have slightly higher productivity than those of the Unix platform.
Figure 6-5-14 Platform-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot

Table 6-5-15 Platform-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
(Unit: FPs/person-hour)
Platform     N    Min    P25    Med    P75    Max   Mean   S.D.
Windows    117  0.020  0.057  0.082  0.149  0.739  0.115  0.102
Unix        48  0.013  0.041  0.058  0.099  0.225  0.068  0.042
[Box-and-whisker plot for Figure 6-5-14: FP_productivity, 0.00–0.50 FPs/person-hour, by platform: Windows; Unix]
6.5.7 Per-Month_Number_of_Staff and FP_Productivity: Development, Mixed_FP_Measurement_Methods
This section presents the relationship between the per-month_number_of_staff and FP_productivity
of the development projects that are categorized as the mixed_FP_measurement_methods. The per-month_number_of_staff was calculated from the actual_effort_ (major_development_phases) and the actual_months_ (major_development_phases). See Derived Indicators in Appendix A.4 for detailed definitions.
The first part of this section shows a scattergram to present the whole picture. The second part shows box-and-whisker plots of the productivity, separately for projects whose per-month_number_of_staff is less than 10 and for those whose per-month_number_of_staff is 10 or more.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 701_FP_measurement_method_(actual) has a defined value
• Actual_months_(major_development_phases) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• X-axis: Per-month_number_of_staff (Derived indicator)
• Y-axis: FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
The following equation gives the approximate correlation found here between the per-month_number_of_staff and FP_productivity:

FP_productivity = A × (per-month_number_of_staff)^B, where B = -0.41 and R² = 0.36

The box-and-whisker plots show that projects whose per-month_number_of_staff is 10 or more have rather lower productivity than projects whose per-month_number_of_staff is less than 10.

Figure 6-5-16 Per-Month_Number_of_Staff and FP_Productivity (Development, Mixed_FP_Measurement_Methods)
[Scattergram: X-axis Number_of_staff_per_month, 0–140 persons; Y-axis FP_productivity (FP size / major-development-phases effort), 0.0–0.8 FPs/person-hour; N = 149]
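A power-law fit of this kind can be reproduced by ordinary least squares on log-transformed data: taking logarithms of FP_productivity = A × (staff)^B gives ln(productivity) = ln A + B·ln(staff), a straight line. The sketch below uses synthetic points; the B = -0.41 and R² = 0.36 reported above come from the White Paper's own project data, not from this sample.

```python
# Log-log least-squares sketch of the fit FP_productivity = A * staff**B.
# The six (staff, productivity) points below are synthetic illustrations.
import math

staff = [2, 5, 10, 20, 40, 80]                      # per-month number of staff
productivity = [0.30, 0.20, 0.12, 0.08, 0.05, 0.04]  # FPs/person-hour

xs = [math.log(s) for s in staff]
ys = [math.log(p) for p in productivity]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Slope B and intercept ln(A) of the regression line in log-log space.
B = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
lnA = my - B * mx
A = math.exp(lnA)

# Coefficient of determination R^2 in log space.
ss_res = sum((y - (lnA + B * x)) ** 2 for x, y in zip(xs, ys))
ss_tot = sum((y - my) ** 2 for y in ys)
r2 = 1 - ss_res / ss_tot
print(B, A, r2)
```

The per-month_number_of_staff itself is a derived indicator; with a 160-hour person-month it would be effort_hours / (160 × months), though the White Paper's exact derivation is given in Appendix A.4.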
Figure 6-5-17 Per-Month-Number-of-Staff-Based FP_Productivity (Development, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot

Table 6-5-18 Per-Month-Number-of-Staff-Based FP_Productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods)
(Unit: FPs/person-hour)
Number_of_staff_per_month   N    Min    P25    Med    P75    Max   Mean   S.D.
Less than 10               89  0.015  0.093  0.142  0.215  0.739  0.168  0.122
10 or more                 60  0.013  0.036  0.055  0.071  0.312  0.068  0.056
[Box-and-whisker plot for Figure 6-5-17: FP_productivity, 0.00–0.50 FPs/person-hour, by Number_of_staff_per_month class: less than 10; 10 or more]
6.5.8 Per-Month_Number_of_Staff and FP_Productivity: Development, IFPUG_Group
This section presents the relationship between the per-month_number_of_staff and FP_productivity
of the development projects whose FP measurement method belongs to the IFPUG_group. The per-month_number_of_staff was calculated from the actual_effort_ (major_development_phases) and the actual_months_ (major_development_phases). See Derived Indicators in Appendix A.4 for detailed definitions.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• Actual_months_(major_development_phases) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• X-axis: Per-month_number_of_staff (Derived indicator)
• Y-axis: FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
The following equation gives the approximate correlation found here between the per-month_number_of_staff and FP_productivity:

FP_productivity = A × (per-month_number_of_staff)^B, where B = -0.37 and R² = 0.34

The box-and-whisker plots show that projects whose per-month_number_of_staff is 10 or more have rather lower productivity than projects whose per-month_number_of_staff is less than 10.

Figure 6-5-19 Per-Month_Number_of_Staff and FP_Productivity (Development, IFPUG_Group)
[Scattergram: X-axis Number_of_staff_per_month, 0–140 persons; Y-axis FP_productivity (FP size / major-development-phases effort), 0.0–0.8 FPs/person-hour; N = 101]
Figure 6-5-20 Per-Month-Number-of-Staff-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot

Table 6-5-21 Per-Month-Number-of-Staff-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
(Unit: FPs/person-hour)
Number_of_staff_per_month   N    Min    P25    Med    P75    Max   Mean   S.D.
Less than 10               46  0.022  0.081  0.117  0.176  0.739  0.153  0.127
10 or more                 55  0.013  0.036  0.055  0.067  0.225  0.060  0.039
[Box-and-whisker plot for Figure 6-5-20: FP_productivity, 0.00–0.50 FPs/person-hour, by Number_of_staff_per_month class: less than 10; 10 or more]
6.5.9 Outsourcing_Ratio and FP_Productivity: Development, Mixed_FP_Measurement_Methods
This section presents the relationship among the outsourcing_ratio, FP size, and FP_productivity of the
development projects that are categorized as the mixed_FP_measurement_methods. See Derived Indicators in Appendix A.4 for the definition of outsourcing_ratio.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 701_FP_measurement_method_(actual) has a defined value
• Outsourcing_ratio ≥ 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• X-axis: Outsourcing_ratio (Derived indicator)
• Y-axis: 5001_Actual_FP_size_(unadjusted), or FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
No project has an outsourcing_ratio of zero. There is no clear correlation between the outsourcing_ratio and FP_productivity, although some of the projects with larger FP sizes have higher outsourcing_ratios.

Figure 6-5-22 Outsourcing_Ratio and FP Size (Development, Mixed_FP_Measurement_Methods)
[Scattergram: X-axis Outsourcing_ratio, 0.0–1.0; Y-axis Actual_FP_Size_(unadjusted), 0–16,000 FPs; N = 143]
Figure 6-5-23 Outsourcing_Ratio and FP_Productivity
(Development, Mixed_FP_Measurement_Methods)
[Scattergram: X-axis Outsourcing_ratio, 0.0–1.0; Y-axis FP_productivity (FP size / major-development-phases effort), 0.0–0.8 FPs/person-hour; N = 143]
6.5.10 Outsourcing_Ratio and FP_Productivity: Development, IFPUG_Group
This section presents the relationship among the outsourcing_ratio, FP size, and FP_productivity of the
development projects whose FP measurement method belongs to the IFPUG_group. See Derived Indicators in Appendix A.4 for the definition of outsourcing_ratio.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• Outsourcing_ratio ≥ 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• X-axis: Outsourcing_ratio (Derived indicator)
• Y-axis: 5001_Actual_FP_size_(unadjusted), or FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
No project has an outsourcing_ratio of zero. There is no clear correlation between the outsourcing_ratio and FP_productivity, although some of the projects with larger FP sizes have higher outsourcing_ratios.

Figure 6-5-24 Outsourcing_Ratio and FP Size (Development, IFPUG_Group)
[Scattergram: X-axis Outsourcing_ratio, 0.0–1.0; Y-axis Actual_FP_Size_(unadjusted), 0–16,000 FPs; N = 133]
Figure 6-5-25 Outsourcing_Ratio and FP_Productivity (Development, IFPUG_Group)
[Scattergram: X-axis Outsourcing_ratio, 0.0–1.0; Y-axis FP_productivity (FP size / major-development-phases effort), 0.0–0.8 FPs/person-hour; N = 133]
6.5.11 FP Size and FP_Productivity: Enhancement, Mixed_FP_Measurement_Methods
This section presents the relationship between the FP size and FP_productivity of the enhancement projects that are categorized as the mixed_FP_measurement_methods. The first part of this section shows a scattergram to present the whole picture. The second part shows that relationship on a per-class-of-FP-size basis.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 701_FP_measurement_method_(actual) has a defined value
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
In the range of FP sizes below 1,000 FPs, some projects have high FP_productivity, and variations in FP_productivity are large.

Figure 6-5-26 FP Size and FP_Productivity (Enhancement, Mixed_FP_Measurement_Methods)
[Scattergram: X-axis Actual_FP_Size_(unadjusted), 0–5,000 FPs; Y-axis FP_productivity (FP size / major-development-phases effort), 0.0–1.6 FPs/person-hour; N = 93]
Figure 6-5-27 FP Size and FP_Productivity (Enhancement,
Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
[Box-and-whisker plot for Figure 6-5-27: FP_productivity, 0.00–0.70 FPs/person-hour, by FP size class: less than 400 FPs; 400 FPs or more and less than 1,000 FPs; 1,000 FPs or more]
Table 6-5-28 FP Size and FP_Productivity Basic Statistics (Enhancement, Mixed_FP_Measurement_Methods)

(Unit: FPs/person-hour)
FP size                                    N    Min    P25    Med    P75     Max   Mean   S.D.
All projects                              93  0.002  0.061  0.108  0.236   1.474  0.171  0.194
Less than 400 FPs                         61  0.002  0.061  0.116  0.217   0.672  0.158  0.139
400 FPs or more and less than 1,000 FPs   19  0.016  0.063  0.079  0.261   1.474  0.223  0.334
1,000 FPs or more                         13  0.022  0.070  0.096  0.222   0.416  0.155  0.140

(Unit: FPs/160 person-hour)
FP size                                    N    Min    P25    Med    P75     Max   Mean   S.D.
All projects                              93   0.27   9.77  17.30  37.78  235.77  27.28  31.04
Less than 400 FPs                         61   0.27   9.77  18.54  34.72  107.56  25.22  22.16
400 FPs or more and less than 1,000 FPs   19   2.54  10.01  12.61  41.70  235.77  35.64  53.47
1,000 FPs or more                         13   3.48  11.13  15.43  35.59   66.57  24.74  22.38

Figure 6-5-29 FP_Productivity Distribution
(Enhancement, Mixed_FP_Measurement_Methods)
[Histogram: X-axis FP_productivity in 0.025-wide classes up to 0.500, plus an “over 0.500” class; Y-axis number of projects, 0–16]
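The distribution above groups FP_productivity into 0.025-wide classes with one open-ended class above 0.500. A sketch of that binning follows; the productivity values and the exact boundary handling are illustrative assumptions, not the White Paper's specification.

```python
# Sketch of the histogram binning behind Figure 6-5-29: 0.025-wide classes
# up to 0.500, with a final open-ended class for values over 0.500.
# Sample values are hypothetical.

def bin_index(value, width=0.025, top=0.500):
    """Return the class index for `value`; the last class collects values above `top`."""
    n_regular = round(top / width)          # 20 regular classes
    if value > top:
        return n_regular                    # the "over 0.500" class
    return min(int(value / width), n_regular - 1)

values = [0.012, 0.030, 0.049, 0.110, 0.620]   # FP_productivity [FPs/person-hour]
n_bins = round(0.500 / 0.025) + 1              # 20 regular classes + 1 "over" class
counts = [0] * n_bins
for v in values:
    counts[bin_index(v)] += 1
print(counts[0], counts[1], counts[-1])  # → 1 2 1
```

Plotting `counts` as a bar chart reproduces the shape of the distribution figure.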
6.5.12 FP Size and FP_Productivity: Enhancement, IFPUG_Group
This section presents the relationship between the FP size and FP_productivity of the enhancement projects whose FP measurement method belongs to the IFPUG_group.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
Projects whose FP sizes are 400 FPs or more and less than 1,000 FPs show large variations in FP_productivity.
Figure 6-5-30 FP Size and FP_Productivity (Enhancement, IFPUG_Group)
[Scattergram: X-axis Actual_FP_Size_(unadjusted), 0–5,000 FPs; Y-axis FP_productivity (FP size / major-development-phases effort), 0.0–1.6 FPs/person-hour; N = 65]
Figure 6-5-31 FP Size and FP_Productivity (Enhancement, IFPUG_Group)
Box-and-Whisker Plot
Table 6-5-32 FP Size and FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)

(Unit: FPs/person-hour)
FP size                                    N    Min    P25    Med    P75     Max   Mean   S.D.
All projects                              65  0.002  0.064  0.128  0.255   1.474  0.190  0.217
Less than 400 FPs                         37  0.002  0.074  0.158  0.243   0.672  0.175  0.146
400 FPs or more and less than 1,000 FPs   16  0.016  0.053  0.103  0.307   1.474  0.247  0.360
1,000 FPs or more                         12  0.022  0.066  0.097  0.247   0.416  0.160  0.145

(Unit: FPs/160 person-hour)
FP size                                    N    Min    P25    Med    P75     Max   Mean   S.D.
All projects                              65   0.27  10.24  20.47  40.81  235.77  30.40  34.74
Less than 400 FPs                         37   0.27  11.90  25.25  38.93  107.56  27.99  23.38
400 FPs or more and less than 1,000 FPs   16   2.54   8.45  16.54  49.11  235.77  39.57  57.65
1,000 FPs or more                         12   3.48  10.53  15.53  39.50   66.57  25.61  23.14
[Box-and-whisker plot for Figure 6-5-31: FP_productivity, 0.00–0.70 FPs/person-hour, by FP size class: less than 400 FPs; 400 FPs or more and less than 1,000 FPs; 1,000 FPs or more]
6.5.13 Industry-Type-Based FP_Productivity: Enhancement, IFPUG_Group
This section presents on a per-major-target-industry-type basis the FP_productivity of the enhancement
projects whose FP measurement method belongs to the IFPUG_group. The major target industry types presented in this section are the five major industry types to which many of the sampled projects belong.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 201_Industry_type_1, _2, and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
Projects of the “finance and insurance” type have lower productivity than those of other types in terms of the median. This low productivity cannot be attributed to industry-type-specific characteristics alone, so it is not meaningful to simply compare the productivity values of industry types with each other. See also the analysis results presented in Section 6.4.9.

Figure 6-5-33 Industry-Type-Based FP_Productivity (Enhancement, IFPUG_Group) Box-and-Whisker Plot

Table 6-5-34 Industry-Type-Based FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
(Unit: FPs/person-hour)
Industry Type (Major Type)           N    Min    P25    Med    P75    Max   Mean   S.D.
F: Manufacturing                     4      —      —  0.092      —      —      —      —
H: Information and communications    3      —      —  0.146      —      —      —      —
J: Wholesale/retail trade            7      —      —  0.055      —      —      —      —
K: Finance and insurance            11  0.002  0.017  0.031  0.075  0.416  0.076  0.118
R: Government, N.E.C.                6      —      —  0.076      —      —      —      —
[Box-and-whisker plot for Figure 6-5-33: FP_productivity, 0.00–0.50 FPs/person-hour, by type of industry (major type): F: Manufacturing; H: Information and communications; J: Wholesale/retail trade; K: Finance and insurance; R: Government]
6.5.14 Architecture-Based FP_Productivity: Enhancement, IFPUG_Group
This section presents on a per-system-architecture basis the FP_productivity of the enhancement projects
whose FP measurement method belongs to the IFPUG_group. Because of multiple choices allowed for the system architecture alternatives, projects were categorized based on “Architecture_1, _2, or _3”, whichever was relevant.
Stratification criteria:
• Projects that went through the major enhancement phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 308_Architecture_1, _2, or _3 has a defined value
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / effort of major enhancement phases) > 0

Analyzed data:
• FP_productivity (FP size / effort of major enhancement phases) (Derived indicator) [FPs/person-hour]
Projects for architectures that exploit networks, such as the “client/server” and “intranet/Internet” architectures, generally have lower productivity than those of the stand-alone architecture. Many projects of the “intranet/Internet” architecture have larger FP sizes. See also the analysis results presented in Section 6.4.10.

Figure 6-5-35 Architecture-Based FP_Productivity (Enhancement, IFPUG_Group) Box-and-Whisker Plot

Table 6-5-36 Architecture-Based FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
(Unit: FPs/person-hour)
Architecture                N    Min    P25    Med    P75    Max   Mean   S.D.
a: Stand-alone             19  0.074  0.154  0.256  0.332  1.474  0.329  0.320
b: Mainframe                3      —      —  0.012      —      —      —      —
c: 2-layer client/server   16  0.031  0.075  0.134  0.198  0.425  0.164  0.119
d: 3-layer client/server    5      —      —  0.049      —      —      —      —
e: Intranet/Internet       22  0.008  0.046  0.078  0.166  0.465  0.117  0.113
[Box-and-whisker plot for Figure 6-5-35: FP_productivity, 0.00–0.70 FPs/person-hour, by architecture: a: Stand-alone; b: Mainframe; c: 2-layer client/server; d: 3-layer client/server; e: Intranet/Internet]
6.5.15 Primary-Programming-Language-Based FP_Productivity: Enhancement, IFPUG_Group
This section presents the FP_productivity of the enhancement projects whose FP measurement method is
the IFPUG_group, for each of the four major programming languages. Because of multiple choices allowed for the primary programming language alternatives, projects were categorized based on “Primary programming language_1, _2, or _3”, whichever was relevant.
Stratification criteria:
• Projects that went through the major enhancement phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / effort of major enhancement phases) > 0

Analyzed data:
• FP_productivity (FP size / effort of major enhancement phases) (Derived indicator) [FPs/person-hour]
Because of a small amount of available data, it is difficult to capture any trend.
Figure 6-5-37 Primary-Programming-Language-Based FP_Productivity (Enhancement,
IFPUG_Group) Box-and-Whisker Plot
Table 6-5-38 Primary-Programming-Language-Based FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
(Unit: FPs/person-hour)
Primary programming language   N    Min    P25    Med    P75    Max   Mean   S.D.
b: COBOL                       9      —      —  0.031      —      —      —      —
g: C                           5      —      —  0.112      —      —      —      —
h: VB                         12  0.008  0.076  0.139  0.213  0.416  0.169  0.139
q: Java                        8      —      —  0.084      —      —      —      —
[Box-and-whisker plot for Figure 6-5-37: FP_productivity, 0.00–0.40 FPs/person-hour, by primary programming language: b: COBOL; g: C; h: VB; q: Java]
6.5.16 Platform-Based FP_Productivity: Enhancement, IFPUG_Group
This section presents on a per-target-platform basis the FP_productivity of the enhancement projects whose FP measurement method belongs to the IFPUG_group. Because of multiple choices allowed for the target_platform_alternatives, projects were categorized into the Windows platform and Unix platform (see Appendix A.4) based on “Target platform_1, _2, or _3”, whichever was relevant.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Windows platform or Unix platform (Derived indicator) derived from 309_Target_platform_1, _2, or _3
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
Projects of the Windows platform have higher productivity than those of the Unix platform, with larger variations in productivity.
Figure 6-5-39 Platform-Based FP_Productivity (Enhancement, IFPUG_Group)
Box-and-Whisker Plot
Table 6-5-40 Platform-Based FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
(Unit: FPs/person-hour)
Platform    N    Min    P25    Med    P75    Max   Mean   S.D.
Windows    21  0.008  0.077  0.123  0.158  0.408  0.134  0.088
Unix       12  0.016  0.029  0.031  0.050  0.115  0.043  0.028
[Box-and-whisker plot for Figure 6-5-39: FP_productivity, 0.00–0.40 FPs/person-hour, by platform: Windows; Unix]
6.5.17 Per-Month_Number_of_Staff and FP_Productivity: Enhancement, Mixed_FP_Measurement_Methods
This section presents the relationship between the per-month_number_of_staff and FP_productivity
of the enhancement projects that are categorized as the mixed_FP_measurement_methods. The per-month_number_of_staff was calculated from the actual_effort_ (major_development_phases) and the actual_months_ (major_development_phases). See Derived Indicators in Appendix A.4 for detailed definitions.
The first part of this section shows a scattergram to present the whole picture. The second part shows box-and-whisker plots of the productivity, separately for projects whose per-month_number_of_staff is less than 10 and for those whose per-month_number_of_staff is 10 or more.
Stratification criteria:
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 701_FP_measurement_method_(actual) has a defined value
• Actual_months_(major_development_phases) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data:
• X-axis: Per-month_number_of_staff (Derived indicator)
• Y-axis: FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs/person-hour]
The trend found here is that projects with a larger per-month_number_of_staff have lower productivity.
Figure 6-5-41 Per-Month_Number_of_Staff and FP_Productivity
(Enhancement, Mixed_FP_Measurement_Methods)
[Scattergram: X-axis Number_of_staff_per_month, 0–50 persons; Y-axis FP_productivity (FP size / major-development-phases effort), 0.00–0.45 FPs/person-hour; N = 33]
Figure 6-5-42 Per-Month-Number-of-Staff-Based FP_Productivity (Enhancement, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot

Table 6-5-43 Per-Month-Number-of-Staff-Based FP_Productivity Basic Statistics (Enhancement, Mixed_FP_Measurement_Methods)
(Unit: FPs/person-hour)
Number_of_staff_per_month   N    Min    P25    Med    P75    Max   Mean   S.D.
Less than 10               20  0.012  0.073  0.091  0.132  0.416  0.121  0.102
10 or more                 13  0.015  0.022  0.035  0.051  0.076  0.040  0.020
[Box-and-whisker plot for Figure 6-5-42: FP_productivity, 0.00–0.40 FPs/person-hour, by Number_of_staff_per_month class: less than 10; 10 or more]
6.5.18 Per-Month_Number_of_Staff and FP_Productivity: Enhancement, IFPUG_Group
This section presents the relationship between the per-month_number_of_staff and FP_productivity
of the enhancement projects whose FP measurement method belongs to the IFPUG_group. The per-month_number_of_staff was calculated from the actual_effort_ (major_development_phases) and the actual_months_ (major_development_phases). See Derived Indicators in Appendix A.4 for detailed definitions.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 701_FP_measurement_method_ (actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• Actual_months_ (major_development_phases) > 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data
• X-axis: Per-month_number_of_staff (Derived indicator)
• Y-axis: FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs / person hour]
The trend found here is that projects using larger per-month_number_of_staff have lower productivity.
Figure 6-5-44 Per-Month_Number_of_Staff and FP_Productivity
(Enhancement, IFPUG_Group)
[Scattergram: X-axis Number_of_staff_per_month [persons]; Y-axis FP_productivity (FP size / major-development-phases effort); N=25]
Figure 6-5-45 Per-Month-Number-of-Staff-Based
FP_Productivity (Enhancement, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-46 Per-Month-Number-of-Staff-Based FP_Productivity Basic Statistics
(Enhancement, IFPUG_Group) (Unit: FPs/person-hour)
Number_of_staff_per_month   N    Min    P25    Med    P75    Max    Mean   S.D.
Less than 10                14   0.012  0.059  0.088  0.140  0.416  0.112  0.099
10 or more                  11   0.016  0.026  0.048  0.053  0.076  0.042  0.020
[Box-and-whisker plot: FP_productivity by Number_of_staff_per_month (“Less than 10” vs. “10 or more”)]
6. Analysis of the relationship among effort, development schedule, and size
6.5.19 Outsourcing_Ratio and FP_Productivity: Enhancement, Mixed_FP_Measurement_Methods
This section shows a scattergram to present the relationship among the outsourcing_ratio, FP size, and
FP_productivity of the enhancement projects that are categorized as the mixed_FP_measurement_methods. See Derived Indicators in Appendix A.4 for the definition of outsourcing_ratio,.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 701_FP_measurement_method_ (actual) has a defined value
• Outsourcing_ratio ≥ 0
• FP_productivity (FP size / major-development-phases effort) > 0

Analyzed data
• X-axis: Outsourcing_ratio (Derived indicator)
• Y-axis: 5001_Actual_FP size_ (unadjusted) or FP_productivity (FP size / major-development-phases effort) (Derived indicator) [FPs / person hour]
No clear correlation is found between the outsourcing_ratio and FP_productivity, while many of the projects with larger FP sizes have higher outsourcing_ratios.
Figure 6-5-47 Outsourcing_Ratio and FP Size (Enhancement,
Mixed_FP_Measurement_Methods)
[Scattergram: X-axis Outsourcing_ratio; Y-axis Actual_FP_Size_ (unadjusted); N=29]
Figure 6-5-48 Outsourcing_Ratio and FP_Productivity (Enhancement,
Mixed_FP_Measurement_Methods)
[Scattergram: X-axis Outsourcing_ratio; Y-axis FP_productivity (FP size / major-development-phases effort); N=29]
6.6 SLOC Size and Effort
This section presents the relationship between the SLOC size and effort. Some of the data item names presented in this section are accompanied by the note “Derived indicator”. Appendix A.4 describes the definitions and deriving methods of those data items.
6.6.1 SLOC Size and Effort:
All Project Types, Mixed Primary Programming Languages
This section presents the relationship between the SLOC size and effort of all project types (development, maintenance/support, redevelopment, and enhancement) that used any of the defined primary programming languages.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type has a defined value.
• 312_Primary_programming_language has a defined value.
• Actual_net_SLOC_size > 0
• Actual_effort_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Actual_effort_ (major_development_phases) (Derived indicator)
The following equation gives the approximate correlation found here between the SLOC size and effort: Effort = A × (SLOC size)^B, where B = 0.68 and R² = 0.57.
Because it ignores the characteristic differences among project types and primary programming languages, the above analysis cannot clarify statistical trends specific to any primary programming language. Programming-language-specific trends are analyzed in the sections that follow.
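The B and R² values quoted throughout this chapter come from fits of the form Effort = A × (SLOC size)^B; such a fit is conventionally obtained by ordinary least squares on the log-transformed data. Below is a minimal sketch with synthetic data for illustration; it is not the White Paper's actual fitting procedure or project data.

```python
import math

def fit_power_law(sizes, efforts):
    """OLS fit of log(effort) = log(A) + B*log(size); returns (A, B, R^2)."""
    xs = [math.log(s) for s in sizes]
    ys = [math.log(e) for e in efforts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx                      # exponent B
    log_a = my - b * mx                # intercept log(A)
    ss_res = sum((y - (log_a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return math.exp(log_a), b, 1.0 - ss_res / ss_tot

# Synthetic check: data generated with A = 2 and B = 0.68 is recovered
sizes = [1_000, 10_000, 100_000]
efforts = [2 * s ** 0.68 for s in sizes]
a, b, r2 = fit_power_law(sizes, efforts)
```

Note that R² here measures fit quality in log-log space, which is the natural space for a power-law model of this kind.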
Figure 6-6-1 SLOC Size and Effort (All Project Types, Mixed Primary Programming
Languages) with Confidence Interval of 50%
[Scattergram: X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; with 50% confidence interval curves y(50%) and y(-50%); N=671]
Figure 6-6-2 SLOC Size and Effort (All Project Types, Mixed Primary Programming Languages) Magnified with Confidence Interval of 50% (SLOC size ≤ 500,000 and effort ≤ 200,000)
[Scattergram (magnified): X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; with 50% confidence interval curves y(50%) and y(-50%); N=671]
Figure 6-6-3 SLOC Size and Effort (All Project Types, Mixed Primary Programming
Languages) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=671]
6.6.2 SLOC Size and Effort: All Project Types, Major_Programming_Language_Group
This section presents the relationship between the SLOC size and effort of all project types whose primary
programming languages belong to the group of four major programming languages.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type has a defined value.
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
• Actual_effort_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Actual_effort_ (major_development_phases) (Derived indicator)
Projects of any type (“development”, “maintenance/support”, “redevelopment”, or “enhancement”) seem to have a certain degree of correlation between their SLOC size and effort. Figure 6-6-5 shows that “development” projects have smaller variations in effort while “maintenance/support” projects have larger variations in effort.
Figure 6-6-4 SLOC Size and Effort
(All Project Types, Major_Programming_Language_Group)
[Scattergram: X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=520; series: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement]
Figure 6-6-5 SLOC Size and Effort
(All Project Types, Major_Programming_Language_Group) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=520; series: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement]
6.6.3 Primary-Programming-Language-Based SLOC Size and Effort: Development, Major_Programming_Language_Group
This section presents on a per-primary-programming-language basis the relationship between the SLOC size
and effort of development projects. The subsections following this section analyze that relationship for each of the major programming languages separately.
The following two sections serve as informative references for this section: Section 6.7.1 “SLOC Size and SLOC_Productivity: Development, Major_Programming_Language_Group” and Section 6.7.2 “Primary-Programming-Language-Based SLOC_Productivity: Development”.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
• Actual_effort_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Actual_effort_ (major_development_phases) (Derived indicator)
Projects whose primary programming language is “COBOL” or “C” have large SLOC sizes. Projects whose
primary programming language is “VB” have medium to low SLOC sizes. Figure 6-6-6 Primary-Programming-Language-Based SLOC Size and Effort (Development)
[Scattergram: X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=280; series: b: COBOL, g: C, h: VB, q: Java]
Figure 6-6-7 Primary-Programming-Language-Based SLOC Size and Effort (Development) Magnified (SLOC size ≤ 500,000 and effort ≤ 200,000)
[Scattergram (magnified): X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=280; series: b: COBOL, g: C, h: VB, q: Java]
Figure 6-6-8 Primary-Programming-Language-Based SLOC Size and Effort (Development)
Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=280; series: b: COBOL, g: C, h: VB, q: Java]
Primary-Programming-Language-Based SLOC Size and Effort: Development, COBOL
The following equation gives the approximate correlation found here between the SLOC size and effort of the projects that used “COBOL”: Effort = A × (SLOC size)^B, where B = 0.86 and R² = 0.76. The SLOC size and effort are strongly correlated. Among the four major programming languages, “COBOL” exhibits the second strongest correlation, following “C”.
Figure 6-6-9 Primary-Programming-Language-Based SLOC Size and Effort
(Development, COBOL)
[Scattergram: X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=80]
Figure 6-6-10 Primary-Programming-Language-Based SLOC Size and Effort
(Development, COBOL) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=80]
Primary-Programming-Language-Based SLOC Size and Effort: Development, C
The following equation gives the approximate correlation found here between the SLOC size and effort of the projects that used “C”: Effort = A × (SLOC size)^B, where B = 0.88 and R² = 0.77. The SLOC size and effort are strongly correlated. Among the four major programming languages, “C” exhibits the strongest correlation.
Figure 6-6-11 Primary-Programming-Language-Based SLOC Size and Effort
(Development, C)
[Scattergram: X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=52]
Figure 6-6-12 Primary-Programming-Language-Based SLOC Size and Effort (Development,
C) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=52]
Primary-Programming-Language-Based SLOC Size and Effort: Development, VB
The following equation gives the approximate correlation found here between the SLOC size and effort of the projects that used “VB”: Effort = A × (SLOC size), where R² = 0.74. The SLOC size and effort are fairly strongly correlated.
Figure 6-6-13 Primary-Programming-Language-Based SLOC Size and Effort
(Development, VB)
[Scattergram: X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=58]
Figure 6-6-14 Primary-Programming-Language-Based SLOC Size and Effort
(Development, VB) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=58]
Primary-Programming-Language-Based SLOC Size and Effort: Development, Java
The following equation gives the approximate correlation found here between the SLOC size and effort of the projects that used “Java”: Effort = A × (SLOC size)^B, where B = 0.84 and R² = 0.75. The SLOC size and effort are fairly strongly correlated. Projects of 500 KSLOCs or larger have large variations in their effort.
Figure 6-6-15 Primary-Programming-Language-Based SLOC Size and Effort
(Development, Java)
[Scattergram: X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=90]
Figure 6-6-16 Primary-Programming-Language-Based SLOC Size and Effort
(Development, Java) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=90]
6.6.4 Industry-Type-Based SLOC Size and Effort: Development, Major_Programming_Language_Group
This section presents on a per-major-target-industry-type basis the relationship between the SLOC size and effort of the development projects whose primary programming languages belong to the group of four major programming languages. Because multiple choices are allowed for the industry type alternatives, projects were categorized, based on “Industry type_1, _2, or _3” (major type), whichever was relevant, into the five major industry types to which many of the sampled projects belong.
Section 6.7.3 “Industry-Type-Based SLOC_Productivity: Development, Major_Programming_Language_Group” serves as an informative reference for this section.
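The industry-type categorization described above can be sketched as follows. The codes and names follow this section's stratification criteria, but the function name is illustrative, and the rule of taking the first matching code when a project lists more than one major type is an assumption.

```python
# The five major industry types analyzed in this section (201_Industry_type codes)
MAJOR_INDUSTRY_TYPES = {
    "F": "Manufacturing",
    "H": "Information and communications",
    "J": "Wholesale/retail trade",
    "K": "Finance and insurance",
    "R": "Government",
}

def major_industry_type(industry_type_1, industry_type_2=None, industry_type_3=None):
    """Return the name of the first of the project's (up to three) industry-type
    codes that falls in the five major types, or None if none does."""
    for code in (industry_type_1, industry_type_2, industry_type_3):
        if code in MAJOR_INDUSTRY_TYPES:
            return MAJOR_INDUSTRY_TYPES[code]
    return None
```

Projects whose codes all fall outside the five major types are excluded from this stratification.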
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 201_Industry_type_1, _2, and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”.
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
• Actual_effort_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Actual_effort_ (major_development_phases) (Derived indicator)
Many projects of the finance and insurance type have large size and effort.
Figure 6-6-17 Industry-Type-Based SLOC Size and Effort (Development,
Major_Programming_Language_Group)
[Scattergram: X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=225; series: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government]
* The above scattergram does not cover two projects that were larger than 4,000 KSLOCs.
6.6.5 Architecture-Based SLOC Size and Effort: Development, Major_Programming_Language_Group
This section presents on a per-system-architecture basis the relationship between the SLOC size and effort of
the development projects whose primary programming languages belong to the group of four major programming languages. Because of multiple choices allowed for the system architecture alternatives, projects were categorized based on “Architecture_1, _2, or _3”, whichever was relevant.
Section 6.7.4 “Architecture-Based SLOC_Productivity: Development, Major_Programming_Language_Group” serves as an informative reference for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 308_Architecture_1, _2, or _3 has a defined value.
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
• Actual_effort_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Actual_effort_ (major_development_phases) (Derived indicator)
Projects of the “intranet/Internet” or “client/server” architecture have relatively large sizes.
Figure 6-6-18 Architecture-Based SLOC Size and Effort
(Development, Major_Programming_Language_Group)
[Scattergram: X-axis Actual_net_SLOC_size; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=240; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
* The above scattergram does not cover four projects that were larger than 4,000 KSLOCs.
6.6.6 Primary-Programming-Language-Based SLOC Size and Effort: Enhancement, Major_Programming_Language_Group
This section presents on a per-primary-programming-language basis the relationship between the SLOC size
and effort of enhancement projects. The subsections following this section analyze that relationship for each of the major programming languages separately.
Section 6.7.8 “SLOC Size and SLOC_Productivity: Enhancement, Major_Programming_Language_Group” and Section 6.7.9 “Primary-Programming-Language-Based SLOC_Productivity: Enhancement” serve as informative references for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size_Enhancement > 0
• Actual_effort_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size_Enhancement (Derived indicator)
• Y-axis: Actual_effort_ (major_development_phases) (Derived indicator)
Projects that used “COBOL” have large size and large effort. Projects that used “VB” are scattered in the small-size region.
Figure 6-6-19 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement)
[Scattergram: X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=209; series: b: COBOL, g: C, h: VB, q: Java]
Figure 6-6-20 Primary-Programming-Language-Based SLOC Size and Effort (Enhancement) Magnified (SLOC size ≤ 500,000 and effort ≤ 200,000)
[Scattergram (magnified): X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=209; series: b: COBOL, g: C, h: VB, q: Java]
Figure 6-6-21 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=209; series: b: COBOL, g: C, h: VB, q: Java]
Primary-Programming-Language-Based SLOC Size and Effort: Enhancement, COBOL
The following equation gives the approximate correlation found here between the SLOC size and effort of the projects that used “COBOL”: Effort = A × (SLOC size)^B, where B = 0.66 and R² = 0.51. The trend found here is that, as the SLOC size increases, the effort of enhancement projects does not increase as much as that of development projects.
Figure 6-6-22 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement, COBOL)
[Scattergram: X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=63]
Figure 6-6-23 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement, COBOL) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=63]
Primary-Programming-Language-Based SLOC Size and Effort: Enhancement, C
The following equation gives the approximate correlation found here between the SLOC size and effort of the projects that used “C”: Effort = A × (SLOC size)^B, where B = 0.27 and R² = 0.16.
Figure 6-6-24 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement, C)
[Scattergram: X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=46]
Figure 6-6-25 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement, C) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=46]
Primary-Programming-Language-Based SLOC Size and Effort: Enhancement, VB
The following equation gives the approximate correlation found here between the SLOC size and effort of the projects that used “VB”: Effort = A × (SLOC size)^B, where B = 0.49 and R² = 0.53. The trend found here is that, as the SLOC size increases, the effort of enhancement projects does not increase as much as that of development projects.
Figure 6-6-26 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement, VB)
[Scattergram: X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=38]
Figure 6-6-27 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement, VB) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=38]
Primary-Programming-Language-Based SLOC Size and Effort: Enhancement, Java
The following equation gives the approximate correlation found here between the SLOC size and effort of the projects that used “Java”: Effort = A × (SLOC size)^B, where B = 0.55 and R² = 0.48. The trend found here is that, as the SLOC size increases, the effort of enhancement projects does not increase as much as that of development projects.
Figure 6-6-28 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement, Java)
[Scattergram: X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=62]
Figure 6-6-29 Primary-Programming-Language-Based SLOC Size and Effort
(Enhancement, Java) Logarithmic Scale
[Scattergram (logarithmic scale): X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=62]
6.6.7 Industry-Type-Based SLOC Size and Effort: Enhancement, Major_Programming_Language_Group
This section presents on a per-major-target-industry-type basis the relationship between the SLOC size and effort of the enhancement projects whose primary programming languages belong to the group of four major programming languages. Because multiple choices are allowed for the industry type alternatives, projects were categorized, based on “Industry type_1, _2, or _3” (major type), whichever was relevant, into the five major industry types to which many of the sampled projects belong.
Section 6.7.10 “Industry-Type-Based SLOC_Productivity: Enhancement, Major_Programming_Language_Group” serves as an informative reference for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 201_Industry_type_1, _2, and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”.
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size_Enhancement > 0
• Actual_effort_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size_Enhancement (Derived indicator)
• Y-axis: Actual_effort_ (major_development_phases) (Derived indicator)
Projects of the “finance and insurance” type have relatively large size and large effort.
Figure 6-6-30 Industry-Type-Based SLOC Size and Effort
(Enhancement, Major_Programming_Language_Group)
[Scattergram: X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=163; series: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government]
6.6.8 Architecture-Based SLOC Size and Effort: Enhancement, Major_Programming_Language_Group
This section presents on a per-system-architecture basis the relationship between the SLOC size and effort of
the enhancement projects whose primary programming languages belong to the group of four major programming languages. Because of multiple choices allowed for the system architecture alternatives, projects were categorized based on “Architecture_1, _2, or _3”, whichever was relevant.
Section 6.7.11 “Architecture-Based SLOC_Productivity: Enhancement, Major_Programming_Language_Group” serves as an informative reference for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 308_Architecture_1, _2, or _3 has a defined value.
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size_Enhancement > 0
• Actual_effort_ (major_development_phases) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size_Enhancement (Derived indicator)
• Y-axis: Actual_effort_ (major_development_phases) (Derived indicator)
Projects of the “intranet/Internet” or “client/server” architecture have relatively large sizes.
Figure 6-6-31 Architecture-Based SLOC Size and Effort
(Enhancement, Major_Programming_Language_Group)
[Scattergram: X-axis Actual_net_SLOC_size_Enhancement; Y-axis Actual_effort_ (major_development_phases) [person-hour]; N=173; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
6.7 SLOC_Productivity
This section presents the analysis results of SLOC_Productivity. The SLOC_Productivity refers to the quotient obtained by dividing the SLOC size by the amount of effort used in the major development phases. That is, the SLOC_Productivity is the SLOC size per person hour or per person month. (The coefficient of 160 hours/person month is assumed for conversion between person hours and person months.)
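The per-hour and per-month forms of SLOC_Productivity are related by the 160 hours/person-month coefficient stated above. A minimal sketch (function names are illustrative):

```python
HOURS_PER_PERSON_MONTH = 160  # conversion coefficient stated in this section

def sloc_productivity_per_hour(sloc_size, effort_person_hours):
    """SLOCs produced per person-hour of major-development-phases effort."""
    return sloc_size / effort_person_hours

def per_hour_to_per_month(slocs_per_person_hour):
    """Convert SLOCs/person-hour to SLOCs/person-month."""
    return slocs_per_person_hour * HOURS_PER_PERSON_MONTH

# Example: 64,000 SLOCs for 16,000 person-hours
per_hour = sloc_productivity_per_hour(64000, 16000)   # 4.0 SLOCs/person-hour
per_month = per_hour_to_per_month(per_hour)           # 640.0 SLOCs/person-month
```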
Some of the data item names presented in this section are accompanied by the note “Derived indicator”. Appendix A.4 describes the definitions and deriving methods of those data items. This section analyzes projects that have valid SLOC size values with defined programming language names. The primary programming languages analyzed in this section belong to the group of major programming languages used by many of the sampled projects.
“Primary programming language_1” of a project is defined as the programming language that was used most widely in that project. The notation 312_Primary_programming_language_1, _2, or _3 used in the criteria description means that “312_Primary programming language_1, _2, or _3” satisfies the specified criterion. 6.7.1 SLOC Size and SLOC_Productivity:
Development, Major_Programming_Language_Group
This section presents the relationship between the SLOC size and SLOC_Productivity of the development projects whose primary programming languages belong to the group of four major programming languages. Many of these projects used more than one programming language at a time.
Presenting the relationship between the SLOC size and effort of almost the same projects, Section 6.6.3 “Primary-Programming-Language-Based SLOC Size and Effort: Development, Major_Programming_Language_Group” serves as an informative reference for this section.
The first part of this section presents the relationship between the SLOC size and SLOC_Productivity for each of the four programming languages in Figure 6-7-1. The succeeding figures present that relationship for each of the major programming languages separately: “COBOL” (Figure 6-7-2), “C” (Figure 6-7-3), “VB” (Figure 6-7-4), and “Java” (Figure 6-7-5). The second part presents the SLOC_productivity on a per-SLOC-size-class basis, and presents the distribution of SLOC_productivity in a cross-analysis between the SLOC size class and the primary programming language.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
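The stratification criteria above amount to a simple record-level filter. The sketch below illustrates this under assumed field names, which are hypothetical stand-ins for the numbered data items; it is not the White Paper's actual tooling:

```python
# Illustrative filter implementing the stratification criteria above.
# Field names are hypothetical stand-ins for the White Paper's data items.

MAJOR_LANGUAGES = {"b: COBOL", "g: C", "h: VB", "q: Java"}

def passes_stratification(project: dict) -> bool:
    # Collect the up-to-three primary programming languages of the project.
    languages = {
        project.get("primary_language_1"),
        project.get("primary_language_2"),
        project.get("primary_language_3"),
    }
    return (
        project.get("went_through_major_phases", False)
        and project.get("project_type") == "a: Development"
        and bool(languages & MAJOR_LANGUAGES)          # any of the three matches
        and project.get("actual_net_sloc_size", 0) > 0
        and project.get("sloc_productivity", 0) > 0
    )

sample = {
    "went_through_major_phases": True,
    "project_type": "a: Development",
    "primary_language_1": "q: Java",
    "actual_net_sloc_size": 120_000,
    "sloc_productivity": 6.9,
}
print(passes_stratification(sample))  # True
```

The same filter structure recurs in every subsection of 6.7, with only the project type and size criterion varying.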
The trend found here is that development projects of smaller sizes show larger variations in SLOC_productivity, regardless of which primary programming language was used.

Figure 6-7-1 Primary-Programming-Language-Based SLOC Size and SLOC_Productivity (Development)
[Scatter plot: X-axis Actual_net_SLOC_size; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=280; legend: b: COBOL, g: C, h: VB, q: Java]
Figure 6-7-2 SLOC Size and SLOC_Productivity (Development, COBOL)
[Scatter plot: X-axis Actual_net_SLOC_size; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=80]
Figure 6-7-3 SLOC Size and SLOC_Productivity (Development, C)
[Scatter plot: X-axis Actual_net_SLOC_size; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=52]
Figure 6-7-4 SLOC Size and SLOC_Productivity (Development, VB)
[Scatter plot: X-axis Actual_net_SLOC_size; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=58]
Figure 6-7-5 SLOC Size and SLOC_Productivity (Development, Java)
[Scatter plot: X-axis Actual_net_SLOC_size; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=90]
Table 6-7-6 and Figures 6-7-7 and 6-7-8 show the distribution of SLOC_productivity on a per-SLOC-size-class basis. Because different programming languages are used at the same time in many cases, a more precise productivity analysis would need to take into account the combinations in which programming languages are used.

Table 6-7-6 SLOC-Size-Based SLOC_Productivity Basic Statistics (Development, Major_Programming_Language_Group) (Unit: SLOCs/person-hour, KSLOCs/160 person-hour)

SLOC size                                  Unit                    N    Min   P25   Med   P75   Max   Mean  S.D.
All project types                          SLOCs/person-hour       280  0.4   3.6   5.9   9.6   51.4  7.5   6.4
Less than 40KSLOCs                         SLOCs/person-hour       85   0.4   2.3   4.2   7.2   20.6  5.4   4.3
40KSLOCs or more and less than 100KSLOCs   SLOCs/person-hour       55   0.9   4.3   6.9   10.1  34.3  7.7   5.4
100KSLOCs or more and less than 300KSLOCs  SLOCs/person-hour       73   1.7   4.7   6.4   9.6   51.4  8.7   7.9
300KSLOCs or more                          SLOCs/person-hour       67   1.4   4.0   6.5   9.8   29.5  8.9   7.2
All project types                          KSLOCs/160 person-hour  280  0.07  0.57  0.94  1.53  8.23  1.21  1.03
Less than 40KSLOCs                         KSLOCs/160 person-hour  85   0.07  0.37  0.68  1.15  3.29  0.86  0.68
40KSLOCs or more and less than 100KSLOCs   KSLOCs/160 person-hour  55   0.14  0.68  1.10  1.62  5.49  1.23  0.86
100KSLOCs or more and less than 300KSLOCs  KSLOCs/160 person-hour  73   0.27  0.75  1.03  1.54  8.23  1.40  1.26
300KSLOCs or more                          KSLOCs/160 person-hour  67   0.22  0.64  1.03  1.57  4.72  1.42  1.15

Figure 6-7-7 SLOC-Size-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-8 SLOC-Size-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
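The basic statistics columns used throughout these tables (N, Min, P25, Med, P75, Max, Mean, S.D.) can be reproduced with Python's standard library. This is a minimal sketch; note that the quartile convention may differ slightly from the statistical package used to produce the White Paper's tables:

```python
import statistics

def basic_stats(values):
    """N, Min, P25, Med, P75, Max, Mean, S.D. as reported in the tables.

    Quartile conventions vary between tools; statistics.quantiles with
    method="inclusive" is one common choice and may not match the package
    used for the White Paper exactly.
    """
    p25, med, p75 = statistics.quantiles(values, n=4, method="inclusive")
    return {
        "N": len(values),
        "Min": min(values),
        "P25": p25,
        "Med": med,
        "P75": p75,
        "Max": max(values),
        "Mean": statistics.mean(values),
        "S.D.": statistics.stdev(values),  # sample standard deviation
    }

print(basic_stats([2.0, 4.0, 6.0, 8.0, 10.0]))
```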
[Box-and-whisker plots of SLOC_productivity by SLOC size class (Less than 40KSLOCs; 40KSLOCs or more and less than 100KSLOCs; 100KSLOCs or more and less than 300KSLOCs; 300KSLOCs or more), overall and broken out by primary programming language (b: COBOL, g: C, h: VB, q: Java)]
6.7.2 Primary-Programming-Language-Based SLOC_Productivity: Development
This section presents on a per-primary-programming-language basis the distribution of
SLOC_productivity of development projects. Because of multiple choices allowed for the primary programming language alternatives, projects were categorized based on “Primary programming language_1, _2, or _3”, whichever was relevant.
Presenting the relationship between the SLOC size and effort of the same projects, Section 6.6.3 “Primary-Programming-Language-Based SLOC Size and Effort: Development, Major_Programming_Language_Group” serves as an informative reference for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
The median values listed in Table 6-7-10 show that there is no significant difference in SLOC_productivity among the programming languages.

Figure 6-7-9 Primary-Programming-Language-Based SLOC_Productivity (Development) Box-and-Whisker Plot
Table 6-7-10 Primary-Programming-Language-Based SLOC_Productivity Basic Statistics (Development) (Unit: SLOCs/person-hour)

Primary programming language  N   Min  P25  Med  P75  Max   Mean  S.D.
b: COBOL                      80  0.7  3.7  5.9  9.0  51.4  7.5   6.8
g: C                          52  1.7  3.1  5.1  8.0  34.3  7.3   6.5
h: VB                         58  0.7  3.0  5.5  9.6  41.6  7.4   7.0
q: Java                       90  0.4  4.0  6.9  9.7  29.5  7.9   5.8
[Box-and-whisker plot of SLOC_productivity by primary programming language (b: COBOL, g: C, h: VB, q: Java)]
6.7.3 Industry-Type-Based SLOC_Productivity: Development, Major_Programming_Language_Group
This section presents on a per-major-target-industry-type basis the distribution of SLOC_productivity of
the development projects whose primary programming languages belong to the group of four major programming languages. The major target industry types presented in this section are the five industry types to which many of the sampled projects belong.
The first part of this section presents major-industry-type-based SLOC_productivity. The second part presents box-and-whisker plots to show the SLOC_productivity in a cross-analysis between the primary programming language and the major industry type.
Presenting the relationship between the SLOC size and effort of the same projects, Section 6.6.4 “Industry-Type-Based SLOC Size and Effort: Development, Major_Programming_Language_Group” serves as an informative reference for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 201_Industry_type_1, _2, and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”.
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
The analysis of projects in the major_programming_language_group shows that the SLOC_productivity of manufacturing-type projects differs from that of finance-and-insurance-type projects. SLOC_productivity also varies depending on the primary programming language. Projects of the finance and insurance type have relatively low SLOC_productivity even when examined on a per-programming-language basis.
Figure 6-7-11 Industry-Type-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-12 Industry-Type-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-13 Industry-Type-Based SLOC_productivity Basic Statistics (Development, Major_Programming_Language_Group) (Unit: SLOCs/person-hour)

Industry type (Major type)         N   Min  P25  Med  P75   Max   Mean  S.D.
F: Manufacturing                   30  2.8  6.7  8.3  11.6  34.3  9.9   6.2
H: Information and communications  34  1.7  4.4  6.0  9.8   28.1  8.1   6.1
J: Wholesale/retail trade          34  0.9  4.3  5.8  8.8   18.6  7.0   4.4
K: Finance and insurance           96  0.5  2.4  4.3  7.1   28.9  5.5   4.4
R: Government, N.E.C.              30  0.4  3.3  6.1  9.7   29.5  7.5   5.9
[Box-and-whisker plots of SLOC_productivity by type of industry (F: Manufacturing; H: Information and communications; J: Wholesale/retail trade; K: Finance and insurance; R: Government), overall and broken out by primary programming language (b: COBOL, g: C, h: VB, q: Java)]
6.7.4 Architecture-Based SLOC_Productivity: Development, Major_Programming_Language_Group
This section presents on a per-system-architecture basis the distribution of SLOC_productivity of the
development projects whose primary programming languages belong to the group of four major programming languages. Because of multiple choices allowed for the system architecture alternatives, projects were categorized based on “Architecture_1, _2, or _3”, whichever was relevant.
The first part of this section presents SLOC_productivity of each architecture type. The second part presents box-and-whisker plots to show the SLOC_productivity in a cross-analysis between the primary programming language and the architecture.
Presenting the relationship between the SLOC size and effort of the same projects, Section 6.6.5 “Architecture-Based SLOC Size and Effort: Development, Major_Programming_Language_Group” serves as an informative reference for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 308_Architecture_1, _2, or _3 has a defined value
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
The median values listed in Table 6-7-16 show that there is no significant difference in the SLOC_productivity values with respect to the architecture, while the width of the distribution varies considerably depending on the architecture.
Figure 6-7-14 Architecture-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-15 Architecture-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-16 Architecture-Based SLOC_Productivity Basic Statistics (Development, Major_Programming_Language_Group) (Unit: SLOCs/person-hour)

Architecture              N    Min  P25  Med  P75  Max   Mean  S.D.
a: Stand-alone            8    —    —    4.2  —    —     —     —
b: Mainframe              17   0.7  3.8  6.2  7.8  51.4  8.6   11.5
c: 2-layer client/server  65   0.7  2.3  6.1  9.9  34.3  8.3   7.5
d: 3-layer client/server  46   0.9  4.2  5.6  7.4  21.9  6.6   4.2
e: Intranet/Internet      104  0.5  3.8  6.6  9.6  41.6  7.8   6.2
[Box-and-whisker plots of SLOC_productivity by architecture (a: Stand-alone; b: Mainframe; c: 2-layer client/server; d: 3-layer client/server; e: Intranet/Internet), overall and broken out by primary programming language (b: COBOL, g: C, h: VB, q: Java)]
6.7.5 Platform-Based SLOC_Productivity: Development, Major_Programming_Language_Group
This section presents on a per-target-platform basis the distribution of SLOC_productivity of the development projects whose primary programming languages belong to the group of four major programming languages. Because of multiple choices allowed for the target platform alternatives, projects were categorized into the Windows platform and the Unix platform based on “Target platform_1, _2, or _3”, whichever was relevant.
The first part of this section presents SLOC_productivity for each of the two platforms. The second part presents box-and-whisker plots to show the SLOC_productivity in a cross-analysis between the two platforms and the primary programming languages.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Windows platform or Unix platform (Derived indicator) derived from 309_Target_platform_1, _2, or _3.
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
Projects of the Windows platform have slightly higher productivity than those of the Unix platform.
Figure 6-7-17 Platform-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-18 Platform-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-19 Platform-Based SLOC_Productivity Basic Statistics (Development, Major_Programming_Language_Group) (Unit: SLOCs/person-hour)

Platform  N    Min  P25  Med  P75   Max   Mean  S.D.
Windows   149  0.5  4.3  6.7  10.2  41.6  8.1   6.4
Unix      107  0.4  3.3  5.4  8.5   28.9  6.9   5.5
[Box-and-whisker plots of SLOC_productivity by platform (Windows; Unix), overall and broken out by primary programming language (b: COBOL, g: C, h: VB, q: Java)]
6.7.6 Per-Month-Number-of-Staff and SLOC_Productivity: Development, Major_Programming_Language_Group
This section presents on a per-primary-programming-language basis the relationship between the
per-month_number_of_staff and SLOC_productivity of the development projects whose primary programming languages belong to the group of four major programming languages. The per-month_number_of_staff was calculated from the actual_effort_ (major_development_phases) and the actual_months_ (major_development_phases). See Derived Indicators in Appendix A.4 for detailed definitions.
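As an illustration only, the derived indicator can be approximated as below, assuming effort recorded in person-hours and the 160 hours/person-month coefficient used elsewhere in this section; Appendix A.4 gives the authoritative definition:

```python
# Illustrative approximation of the per-month_number_of_staff indicator.
# Assumes effort in person-hours and 160 person-hours per person-month;
# see Appendix A.4 for the authoritative definition.

HOURS_PER_PERSON_MONTH = 160

def staff_per_month(effort_person_hours: float, duration_months: float) -> float:
    """Average number of staff per month over the major development phases."""
    if duration_months <= 0:
        raise ValueError("duration must be positive")
    person_months = effort_person_hours / HOURS_PER_PERSON_MONTH
    return person_months / duration_months

print(staff_per_month(16_000, 10))  # 10.0 persons on average
```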
The first part of this section shows a scattergram to present the whole picture. The second part shows the productivity in box-and-whisker plots for above and below per-month_number_of_staff of 10 separately. The second part also presents box-and-whisker plots to show SLOC_productivity in a cross-analysis between the class of per-month_number_of_staff and the primary programming language.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_months_ (major_development_phases) > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• X-axis: Per-month_number_of_staff (Derived indicator)
• Y-axis: SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
The four programming languages commonly exhibit a trend in which SLOC_productivity decreases as the per-month_number_of_staff increases.

Figure 6-7-20 Per-Month-Number-of-Staff and SLOC_productivity (Development, Primary-Programming-Language-Based)
[Scatter plot: X-axis Number_of_staff_per_month [persons]; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=151; legend: b: COBOL, g: C, h: VB, q: Java]
Figure 6-7-21 Per-Month-Number-of-Staff-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-22 Per-Month-Number-of-Staff-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-23 Per-Month-Number-of-Staff-Based SLOC_Productivity Basic Statistics (Development, Major_Programming_Language_Group) (Unit: SLOCs/person-hour)

Number_of_staff_per_month  N   Min  P25  Med  P75   Max   Mean  S.D.
Less than 10               76  0.7  3.6  7.0  11.0  34.3  8.4   6.4
10 or more                 75  0.4  3.8  5.7  8.6   29.5  7.1   5.8
[Box-and-whisker plots of SLOC_productivity by Number_of_staff_per_month class (Less than 10; 10 or more), overall and broken out by primary programming language (b: COBOL, g: C, h: VB, q: Java)]
6.7.7 Outsourcing_Ratio and SLOC_Productivity: Development, Major_Programming_Language_Group
This section presents on a per-primary-programming-language basis the relationship between the outsourcing_ratio and the SLOC size, and the relationship between the outsourcing_ratio and SLOC_productivity, of the development projects whose primary programming languages belong to the group of four major programming languages. See Derived Indicators in Appendix A.4 for the detailed definition of the outsourcing_ratio.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Outsourcing_ratio ≥ 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• X-axis: Outsourcing_ratio (Derived indicator)
• Y-axis: Actual_net_SLOC_size [SLOCs], SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
Many projects with larger SLOC sizes have higher outsourcing_ratios, and none of the projects with an outsourcing_ratio of zero has a large SLOC size. The data analyzed here are not sufficient to prove any significant correlation between the outsourcing_ratio and SLOC_productivity.

Figure 6-7-24 Outsourcing_Ratio and SLOC Size (Development, Primary-Programming-Language-Based)
(Development, Primary-Programming-Language-Based)
0
1,000,000
2,000,000
3,000,000
4,000,000
5,000,000
6,000,000
7,000,000
0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
Outsourcing_ratio
Actu
al_n
et_S
LOC_
size
Copyright IPA SEC
N=240
Figure 6-7-25 Outsourcing_Ratio and SLOC_Productivity
(Development, Primary-Programming-Language-Based)
[Scatter plot: X-axis Outsourcing_ratio; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=240; legend: b: COBOL, g: C, h: VB, q: Java]
6.7.8 SLOC Size and SLOC_Productivity: Enhancement, Major_Programming_Language_Group
This section presents the relationship between the SLOC size and SLOC_productivity of the enhancement
projects whose primary programming languages belong to the group of four major programming languages. Many of these projects used more than one programming language at a time.
Presenting the relationship between the SLOC size and effort of the same projects, Section 6.6.6 “Primary-Programming-Language-Based SLOC Size and Effort: Enhancement, Major_Programming_Language_Group” serves as an informative reference for this section. The first part of this section presents the relationship between the SLOC size and SLOC_productivity for each of the four programming languages in Figure 6-7-26. The succeeding figures present that relationship for each of the major programming languages separately: “COBOL” (Figure 6-7-27), “C” (Figure 6-7-28), “VB” (Figure 6-7-29), and “Java” (Figure 6-7-30). The second part presents the SLOC_productivity on a per-SLOC-size-class basis, and presents the distribution of SLOC_productivity in a cross-analysis between the SLOC size class and the primary programming language.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size_Enhancement > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• X-axis: Actual_net_SLOC_size_Enhancement (Derived indicator)
• Y-axis: SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
The primary-programming-language-based and SLOC-size-class-based analysis of SLOC_productivity shows that enhancement projects of 40 KSLOCs exhibit the same trend in SLOC_productivity regardless of the primary programming language, and that projects of large sizes have a wide SLOC_productivity distribution.

Figure 6-7-26 Primary-Programming-Language-Based SLOC Size and SLOC_Productivity (Enhancement)
[Scatter plot: X-axis Actual_net_SLOC_size_Enhancement; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=209; legend: b: COBOL, g: C, h: VB, q: Java]
Figure 6-7-27 SLOC Size and SLOC_Productivity (Enhancement, COBOL)
[Scatter plot: X-axis Actual_net_SLOC_size_Enhancement; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=63]
Figure 6-7-28 SLOC Size and SLOC_Productivity (Enhancement, C)
[Scatter plot: X-axis Actual_net_SLOC_size_Enhancement; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=46]
Figure 6-7-29 SLOC Size and SLOC_Productivity (Enhancement, VB)
[Scatter plot: X-axis Actual_net_SLOC_size_Enhancement; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=38]
Figure 6-7-30 SLOC Size and SLOC_Productivity (Enhancement, Java)
[Scatter plot: X-axis Actual_net_SLOC_size_Enhancement; Y-axis SLOC_productivity (SLOC size / major-development-phases effort); N=62]
Table 6-7-31 and Figures 6-7-32 and 6-7-33 show the distribution of SLOC_productivity on a per-SLOC-size-class basis. Because different programming languages are used at the same time in many cases, this kind of analysis needs to consider other factors, such as the combinations of programming languages actually used.

Table 6-7-31 SLOC-Size-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group) (Unit: SLOCs/person-hour, KSLOCs/160 person-hour)

SLOC size                                  Unit                    N    Min   P25   Med   P75   Max    Mean  S.D.
All project types                          SLOCs/person-hour       209  0.0   2.1   4.4   8.9   69.9   7.9   10.0
Less than 40KSLOCs                         SLOCs/person-hour       117  0.0   1.4   3.0   6.2   34.9   4.9   6.1
40KSLOCs or more and less than 100KSLOCs   SLOCs/person-hour       42   0.3   3.0   5.4   10.3  46.4   8.7   9.1
100KSLOCs or more and less than 300KSLOCs  SLOCs/person-hour       30   1.2   4.5   7.5   24.1  69.9   14.8  15.7
300KSLOCs or more                          SLOCs/person-hour       20   1.9   3.5   8.4   20.1  40.8   13.5  12.5
All project types                          KSLOCs/160 person-hour  209  0.01  0.34  0.70  1.42  11.18  1.26  1.61
Less than 40KSLOCs                         KSLOCs/160 person-hour  117  0.01  0.22  0.48  1.00  5.58   0.79  0.98
40KSLOCs or more and less than 100KSLOCs   KSLOCs/160 person-hour  42   0.04  0.48  0.86  1.64  7.42   1.38  1.45
100KSLOCs or more and less than 300KSLOCs  KSLOCs/160 person-hour  30   0.19  0.72  1.19  3.85  11.18  2.37  2.51
300KSLOCs or more                          KSLOCs/160 person-hour  20   0.30  0.56  1.35  3.22  6.53   2.15  2.01

Figure 6-7-32 SLOC-Size-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-33 SLOC-Size-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
[Box-and-whisker plots of SLOC_productivity by SLOC size class (Less than 40KSLOCs; 40KSLOCs or more and less than 100KSLOCs; 100KSLOCs or more and less than 300KSLOCs; 300KSLOCs or more), overall and broken out by primary programming language (b: COBOL, g: C, h: VB, q: Java)]
6.7.9 Primary-Programming-Language-Based SLOC_Productivity: Enhancement
This section presents on a per-primary-programming-language basis the distribution of
SLOC_productivity of enhancement projects whose primary programming languages belong to the group of four major programming languages. Because of multiple choices allowed for the primary programming language alternatives, projects were categorized based on “Primary programming language_1, _2, or _3”, whichever was relevant.
Presenting the relationship between the SLOC size and effort of the same projects, Section 6.6.6 “Primary-Programming-Language-Based SLOC Size and Effort: Enhancement, Major_Programming_Language_Group” serves as an informative reference for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size_Enhancement > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
The median values obtained in the above analysis show that there is no significant difference in the SLOC_productivity values with respect to the programming language. All the programming languages except “COBOL” have a slightly wide distribution.

Figure 6-7-34 Primary-Programming-Language-Based SLOC_Productivity (Enhancement) Box-and-Whisker Plot
Table 6-7-35 Primary-Programming-Language-Based SLOC_Productivity Basic Statistics (Enhancement) (Unit: SLOCs/person-hour)

Primary programming language  N   Min  P25  Med  P75   Max   Mean  S.D.
b: COBOL                      63  0.1  2.3  3.8  6.8   40.8  6.6   8.3
g: C                          46  0.1  2.1  4.5  11.2  69.9  10.0  14.0
h: VB                         38  0.1  1.8  4.4  10.7  39.9  9.3   11.1
q: Java                       62  0.0  1.8  4.0  9.3   26.6  6.8   6.8
[Box-and-whisker plot of SLOC_productivity by primary programming language (b: COBOL, g: C, h: VB, q: Java)]
6.7.10 Industry-Type-Based SLOC_Productivity: Enhancement, Major_Programming_Language_Group
This section presents on a per-major-target-industry-type basis the distribution of SLOC_productivity of
the enhancement projects whose primary programming languages belong to the group of four major programming languages. The major target industry types presented in this section are the five industry types to which many of the sampled projects belong.
The first part of this section presents major-industry-type-based SLOC_productivity. The second part presents box-and-whisker plots to show the SLOC_productivity in a cross-analysis between the major industry type and the four programming languages used by many of the sampled projects.
Presenting the relationship between the SLOC size and effort of almost the same projects, Section 6.6.7 “Industry-Type-Based SLOC Size and Effort: Enhancement, Major_Programming_Language_Group” serves as an informative reference for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 201_Industry_type_1, _2, and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”.
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size_Enhancement > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person hour]
Projects of the “manufacturing” type have higher productivity than those of the other types, and projects of the “government” type have slightly high productivity. Projects of the “finance and insurance” type tend to have low productivity, even when examined on a per-programming-language basis.
Figure 6-7-36 Industry-Type-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-37 Industry-Type-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-38 Industry-Type-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group) (Unit: SLOCs/person-hour)

Industry type (Major type)         N   Min  P25  Med   P75   Max   Mean  S.D.
F: Manufacturing                   14  0.7  6.7  15.4  22.2  30.9  14.9  9.4
H: Information and communications  33  0.1  2.0  3.4   7.5   32.7  5.5   6.3
J: Wholesale/retail trade          11  0.3  1.3  4.4   5.5   28.8  6.9   9.2
K: Finance and insurance           78  0.0  1.6  3.2   6.4   69.9  6.1   10.3
R: Government, N.E.C.              27  0.7  3.6  8.4   22.0  40.8  12.8  11.6
188 IPA/SEC White Paper 2007 on Software Development Projects in Japan
6.7.11 Architecture-Based SLOC_Productivity: Enhancement, Major_Programming_Language_Group
This section presents, on a per-system-architecture basis, the distribution of SLOC_productivity of the enhancement projects whose primary programming languages belong to the group of four major programming languages. Because multiple choices were allowed for the system architecture item, projects were categorized and statistically analyzed based on whichever of “Architecture_1, _2, or _3” was relevant.
The first part of this section presents SLOC_productivity for each architecture type. The second part presents box-and-whisker plots to show the SLOC_productivity in a cross-analysis between the architecture and the primary programming language.
Presenting the relationship between the SLOC size and effort of almost the same projects, Section 6.6.8 “Architecture-Based SLOC Size and Effort: Enhancement, Major_Programming_Language_Group” serves as an informative reference for this section.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 308_Architecture_1, _2, or _3 has a defined value
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size_Enhancement > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person-hour]
The median values obtained in the above analysis show that there is no significant difference in the
SLOC_productivity values with respect to the architecture and that the width of distribution varies widely depending on the architecture.
Figure 6-7-39 Architecture-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-40 Architecture-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-41 Architecture-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group) (Unit: SLOCs/person-hour)

Architecture             | N  | Min | P25 | Med | P75  | Max  | Mean | S.D.
a: Stand-alone           | 14 | 0.7 | 1.1 | 3.3 | 7.9  | 34.9 | 7.4  | 10.1
b: Mainframe             | 17 | 0.9 | 2.0 | 3.1 | 7.0  | 39.9 | 6.7  | 9.4
c: 2-layer client/server | 55 | 0.1 | 2.3 | 4.6 | 11.9 | 40.8 | 9.2  | 10.3
d: 3-layer client/server | 44 | 0.0 | 1.5 | 4.8 | 6.9  | 69.9 | 8.3  | 13.2
e: Intranet/Internet     | 43 | 0.4 | 2.4 | 4.5 | 7.7  | 32.7 | 6.4  | 6.3
6. Analysis of the relationship among effort, development schedule, and size
6.7.12 Platform-Based SLOC_Productivity: Enhancement, Major_Programming_Language_Group
This section presents, on a per-target-platform basis, the distribution of SLOC_productivity of the enhancement projects whose primary programming languages belong to the group of four major programming languages. Because multiple choices were allowed for the target platform item, projects were categorized into the Windows platform and the Unix platform based on whichever of “Target_platform_1, _2, or _3” was relevant.
The first part of this section presents SLOC_productivity for each of the two platforms. The second part presents box-and-whisker plots to show the SLOC_productivity in a cross-analysis between the two platforms and four primary programming languages used by many of the sampled projects.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Windows platform or Unix platform (Derived indicator) derived from 309_Target_platform_1, _2, or _3
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size_Enhancement > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person-hour]
Projects of the Windows platform have slightly higher productivity than those of the Unix platform, even when they are examined from the programming-language-based point of view.

Figure 6-7-42 Platform-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-43 Platform-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-44 Platform-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group) (Unit: SLOCs/person-hour)

Platform | N  | Min | P25 | Med | P75 | Max  | Mean | S.D.
Windows  | 94 | 0.0 | 2.1 | 4.5 | 9.6 | 40.8 | 7.9  | 8.9
Unix     | 85 | 0.1 | 2.0 | 3.7 | 7.0 | 37.3 | 6.5  | 8.0
6.7.13 Per-Month-Number-of-Staff and SLOC_Productivity: Enhancement, Major_Programming_Language_Group
This section presents the relationship between the per-month_number_of_staff and
SLOC_productivity of the enhancement projects whose primary programming languages belong to the group of four major programming languages. The per-month_number_of_staff was calculated from the actual_effort_ (major_development_phases) and the actual_months_ (major_development_phases). See Derived Indicators in Appendix A.4 for detailed definitions.
The first part of this section shows a scattergram to present the whole picture. The second part shows the productivity in box-and-whisker plots for above and below the per-month_number_of_staff of 10 separately. The second part also presents box-and-whisker plots to show SLOC_productivity in a cross-analysis between the class of per-month_number_of_staff and the four primary programming languages.
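As a rough illustration of this derived indicator, the sketch below computes an average monthly head count from total effort and schedule length. The 160 working hours per person-month is an assumed nominal value, and the authoritative definition is the one in Appendix A.4:

```python
# Sketch of a plausible per-month_number_of_staff derivation: average head
# count = total effort (converted to person-months) divided by the number of
# months. HOURS_PER_PERSON_MONTH is an assumption, not a White Paper figure.
HOURS_PER_PERSON_MONTH = 160  # assumed nominal working hours per person-month

def per_month_number_of_staff(effort_person_hours, months):
    person_months = effort_person_hours / HOURS_PER_PERSON_MONTH
    return person_months / months

# Illustrative project: 4,800 person-hours over 6 months.
staff = per_month_number_of_staff(4800, 6)
```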
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_months_(major_development_phases) > 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• X-axis: Per-month_number_of_staff (Derived indicator)
• Y-axis: SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person-hour]
The four programming languages commonly exhibit a trend that SLOC_productivity decreases as the per-month_number_of_staff increases. This trend is similar to that of development projects.

Figure 6-7-45 Per-Month-Number-of-Staff and SLOC_Productivity (Enhancement, Primary-Programming-Language-Based)
[Scattergram omitted: X-axis Number_of_staff_per_month [persons] (0–180); Y-axis SLOC_productivity (SLOC size/major-development-phases effort) (0–45); plotted by primary programming language (b: COBOL, g: C, h: VB, q: Java); N = 118]
Figure 6-7-46 Per-Month-Number-of-Staff-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-47 Per-Month-Number-of-Staff-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-48 Per-Month-Number-of-Staff-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group) (Unit: SLOCs/person-hour)

Number_of_staff_per_month | N  | Min | P25 | Med | P75 | Max  | Mean | S.D.
Less than 10              | 78 | 0.1 | 1.6 | 5.2 | 9.7 | 39.9 | 8.1  | 9.3
10 or more                | 40 | 0.0 | 2.3 | 3.3 | 6.8 | 35.6 | 6.2  | 7.3
6.7.14 Outsourcing_Ratio and SLOC_Productivity: Enhancement, Major_Programming_Language_Group
This section presents, on a per-primary-programming-language basis, the relationship between the outsourcing_ratio and the SLOC size, and the relationship between the outsourcing_ratio and SLOC_productivity, of the enhancement projects whose primary programming languages belong to the group of four major programming languages. See Derived Indicators in Appendix A.4 for the detailed definition of the outsourcing_ratio.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Outsourcing_ratio ≥ 0
• SLOC_productivity (SLOC size / major-development-phases effort) > 0

Analyzed data
• X-axis: Outsourcing_ratio (Derived indicator)
• Y-axis: Actual_net_SLOC_size_Enhancement [SLOCs], SLOC_productivity (SLOC size / major-development-phases effort) (Derived indicator) [SLOCs / person-hour]
Many projects with larger SLOC sizes have higher outsourcing_ratios. There is no significant correlation between the outsourcing_ratio and SLOC_productivity.
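A check of this kind can be sketched as a correlation coefficient between the two indicators. The helper and the data points below are illustrative, not survey values or SEC's actual procedure:

```python
# Sketch: Pearson's r between outsourcing_ratio and SLOC_productivity.
# A value near zero indicates no linear relationship.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

outsourcing_ratio = [0.1, 0.3, 0.5, 0.7, 0.9]
productivity      = [4.2, 9.8, 3.1, 7.5, 5.0]   # SLOCs/person-hour
r = pearson_r(outsourcing_ratio, productivity)  # near zero for this sample
```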
Figure 6-7-49 Outsourcing_Ratio and SLOC Size (Enhancement, Primary-Programming-Language-Based)
[Scattergram omitted: X-axis Outsourcing_ratio (0.0–1.0); Y-axis Actual_net_SLOC_size_Enhancement (0–700,000 SLOCs); N = 163]
Figure 6-7-50 Outsourcing_Ratio and SLOC_Productivity (Enhancement, Primary-Programming-Language-Based)
[Scattergram omitted: X-axis Outsourcing_ratio (0.0–1.0); Y-axis SLOC_productivity (SLOC size/major-development-phases effort) (0–45); plotted by primary programming language (b: COBOL, g: C, h: VB, q: Java); N = 163]
6.8 Relationship Between the FP Size and SLOC Size
This section presents the relationship between the FP size and SLOC size.
6.8.1 FP and SLOC: Development, IFPUG_Group, Primary-Programming-Language-Based
This section presents the relationship between the FP size and SLOC size of the development projects that used programming languages belonging to the major_programming_language_group while using FP measurement methods of the IFPUG_group. This section also shows a graph with logarithmic-scale X- and Y-axes.
Only 11 enhancement projects satisfy these criteria, which is an insufficient number; therefore, the same kinds of graphs are not presented for enhancement projects.
Stratification criteria
• 103_Project_type = “a: Development”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• 701_FP_measurement_method_(actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Actual_net_SLOC_size > 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Actual_net_SLOC_size (Derived indicator)
The following equation gives the approximate correlation found here between the FP size and SLOC size of the projects with mixed programming languages:

SLOC size = A × (FP size)^B, where B = 1.03 and R² = 0.61

The programming-language-based analysis using the same type of equation shows that, among the four programming languages, Java exhibits the correlation curve (B = 1.02, R² = 0.84) closest to the one above. Thus, Java exhibits a strong positive correlation between the FP size and SLOC size. Note that this analysis was made against a small number of samples (55 in total: Java 14, VB 17, C 10, and COBOL 14).
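A least-squares fit on the logarithms is one common way to obtain this A × (FP size)^B form; the sketch below illustrates it with made-up data points (the `power_fit` helper and its sample values are hypothetical, not the survey data or SEC's actual procedure):

```python
# Sketch: fit SLOC size = A * (FP size)^B by linear least squares on the
# logarithms; R^2 is computed on the log scale.
import math

def power_fit(fp_sizes, sloc_sizes):
    xs = [math.log(x) for x in fp_sizes]
    ys = [math.log(y) for y in sloc_sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    log_a = my - b * mx
    ss_res = sum((y - (log_a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return math.exp(log_a), b, 1 - ss_res / ss_tot

# Perfectly power-law data, SLOC = 100 * FP^1.0, as a sanity check.
a, b, r2 = power_fit([100, 500, 2000], [10000, 50000, 200000])
```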
Figure 6-8-1 Primary-Programming-Language-Based FP Size and SLOC Size (Development, IFPUG_Group)
[Scattergram omitted: X-axis Actual_FP_size_(unadjusted) (0–14,000); Y-axis Actual_net_SLOC_size (0–2,500,000); plotted by primary programming language (b: COBOL, g: C, h: VB, q: Java); N = 55]
7 Analysis of Reliability
7.1 Scope of This Chapter
This chapter presents the reliability of the software developed by the sampled projects, based on the number and density of identified defects after system cutover.

7.1.1 Introduction
Major data items analyzed in Chapter 7 include the number_of_defects_identified after system cutover and the size (FP size and SLOC size).
The “number_of_identified_defects” refers to the cumulative number of defects identified within 6 months after system cutover, unless otherwise noted. Note that the number_of_identified_defects does not include defects identified in the test phase before development completion.
The “density of identified defects” refers to the number_of_identified_defects per unit of software size. The number_of_identified_defects per 1,000 FPs is used as the “FP_identified_defect_density” for projects that measured their sizes in FPs. Likewise, the number_of_identified_defects per 1,000 lines of source code (1 KSLOC) is used as the “SLOC_identified_defect_density” for projects that measured their sizes in SLOCs.
Sections 7.2 and 7.3 present the reliability of the software developed by projects that measured their sizes in FPs. Sections 7.4 and 7.5 present the reliability of the software developed by projects that measured their sizes in SLOCs.
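The two density indicators defined above amount to simple ratios; the sketch below states them as code with illustrative values (the numbers are not survey data):

```python
# Sketch of the two identified-defect-density indicators.
def fp_defect_density(defects, fp_size):
    """Number_of_identified_defects per 1,000 FPs."""
    return defects / fp_size * 1000

def sloc_defect_density(defects, sloc_size):
    """Number_of_identified_defects per KSLOC (1,000 lines of source code)."""
    return defects / sloc_size * 1000

# Illustrative project: 12 defects against 3,000 FPs, 25 defects against 50 KSLOC.
d1 = fp_defect_density(12, 3000)
d2 = sloc_defect_density(25, 50000)
```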
Table 7-1-1 lists the basic combinations of factors and characteristics used in this chapter to analyze the reliability of software. The reliability analyses presented in this chapter were made with stratification by different characteristics such as the project type, “industry type”, and system “architecture”.
Table 7-1-1 lists factors in the topmost row and characteristics in the second row to present the combinations of factors and characteristics (for example, the size against the number of defects). The third and succeeding rows show in what strata the analysis is made (for example, project types or industry types). The numbers (x.x.x) in the table are the section numbers of this chapter. You can pick a section number in the table and look up the row and column intersecting at that number to know what kind of analysis is presented in that section.

Table 7-1-1 Combinations of Factors, Characteristics, and Stratification
Types of Projects | Stratification                       | FP size: Number_of_identified_defects | FP size: FP_defect_density | SLOC size: Number_of_identified_defects | SLOC size: SLOC_defect_density
All project types | Mixed_FP_measurement_methods         | 7.2.1 | 7.3.1 | —     | —
All project types | FP: IFPUG_group                      | 7.2.2 | 7.3.2 | —     | —
All project types | Mixed primary programming languages  | —     | —     | 7.4.1 | 7.5.1
All project types | Primary_programming_language_group   | —     | —     | 7.4.2 | 7.5.2
Development       | FP: IFPUG_group                      | 7.2.3 | 7.3.3 | —     | —
Development       | — by Type of Industry                | —     | 7.3.4 | —     | —
Development       | — by Architecture                    | —     | 7.3.5 | —     | —
Development       | Primary_programming_language_group   | —     | —     | 7.4.3 | 7.5.3
Development       | — by Type of Industry                | —     | —     | —     | 7.5.4
Development       | — by Architecture                    | —     | —     | —     | 7.5.5
Enhancement       | FP: IFPUG_group                      | 7.2.4 | 7.3.6 | —     | —
Enhancement       | — by Type of Industry                | —     | 7.3.7 | —     | —
Enhancement       | — by Architecture                    | —     | 7.3.8 | —     | —
Enhancement       | Primary_programming_language_group   | —     | —     | 7.4.4 | 7.5.6
Enhancement       | — by Type of Industry                | —     | —     | —     | 7.5.7
Enhancement       | — by Architecture                    | —     | —     | —     | 7.5.8
7.1.2 Analyzed Data
The data analyzed in this chapter is the same data set defined in Section 5.1.1 “Analyzed Data”, unless otherwise noted. Where a different data set is used, the difference is noted near the description of the relevant stratification. If an analysis is made for the whole-project development schedule, for example, this condition is written where the analysis is presented.

7.1.3 Analysis Procedure
The analyses presented in this chapter were made in accordance with the procedure described in Section 3.1.2. The data was analyzed and examined with the “stratification” shown in Table 7-1-1.
This chapter does not examine correlation (for example, regression equations) of any kind.

7.1.4 Distribution of Major Factors
Among the major factors presented in this chapter, the basic distributions of the FP size and SLOC size are presented as histograms and basic statistics in Chapter 5. We recommend using those histograms and statistics as a reference when reading about the relationships among these factors presented in Section 7.2 and later.
7.2 FP Size and Number_of_Identified_Defects
This section presents the number_of_identified_defects of the projects that measured their actual FP sizes. The number_of_identified_defects refers to the cumulative number of defects identified within 6 months after system cutover, unless otherwise noted. Note that the number_of_identified_defects does not include defects identified in the test phase before development completion.

7.2.1 FP Size and Number_of_Identified_Defects: All Project Types, Mixed_FP_Measurement_Methods
This section presents the relationship between the FP size and the number_of_identified_defects of the projects of every type that measured their FP sizes. The FP size measurement methods used by these projects include any kind of method such as methods with unknown names. This section presents that relationship in a scattergram and the basic statistics for each project type.
Stratification criteria
• 103_Project_type has a defined value.
• 701_FP_measurement_method = Any value (including “Unknown”)
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects (Derived indicator)
A comparison of various project types of the same class of size shows that the development projects have more identified defects.

Figure 7-2-1 FP Size and Number_of_Identified_Defects (Mixed_FP_Measurement_Methods)
[Scattergram omitted: X-axis Actual_FP_size_(unadjusted) (0–16,000); Y-axis Number_of_identified_defects (0–350); plotted by project type (a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement); N = 260]
* The above scattergram does not cover one project (y = 800 approximately).

Table 7-2-2 Number_of_Identified_Defects Basic Statistics (Mixed_FP_Measurement_Methods) (Unit: number of defects)

Types of Projects      | N   | Min | P25 | Med | P75  | Max | Mean | S.D.
All project types      | 260 | 0   | 1.0 | 3.0 | 14.3 | 818 | 21.6 | 67.0
a: Development         | 184 | 0   | 0.0 | 3.0 | 16.3 | 818 | 26.3 | 77.9
b: Maintenance/support | 35  | 0   | 1.0 | 4.0 | 10.5 | 63  | 8.8  | 12.7
c: Redevelopment       | 13  | 0   | 1.0 | 8.0 | 16.0 | 160 | 21.2 | 42.9
d: Enhancement         | 28  | 0   | 0.0 | 1.0 | 7.0  | 70  | 7.1  | 15.0
7.2.2 FP Size and Number_of_Identified_Defects: All Project Types, IFPUG_Group
This section presents the relationship between the FP size and the number_of_identified_defects of
projects of every type that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).
Stratification criteria
• 103_Project_type has a defined value.
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects (Derived indicator)
A comparison of various project types of the same class of size shows that the development projects have more identified defects.

Figure 7-2-3 FP Size and Number_of_Identified_Defects (IFPUG_Group)
[Scattergram omitted: X-axis Actual_FP_size_(unadjusted) (0–16,000); Y-axis Number_of_identified_defects (0–350); plotted by project type; N = 248]
* The above scattergram does not cover one project (y = 800 approximately).

Figure 7-2-4 Number_of_Identified_Defects Distribution (IFPUG_Group)
[Histogram omitted: X-axis Number_of_identified_defects (0–14, Over 15); Y-axis Number of projects (0–60); by project type]
Table 7-2-5 Number_of_Identified_Defects Basic Statistics (IFPUG_Group) (Unit: number of defects)

Types of Projects      | N   | Min | P25 | Med | P75  | Max   | Mean | S.D.
All project types      | 248 | 0.0 | 1.0 | 3.5 | 15.0 | 818.0 | 21.8 | 67.9
a: Development         | 178 | 0.0 | 0.0 | 3.0 | 16.8 | 818.0 | 27.0 | 79.1
b: Maintenance/support | 30  | 0.0 | 1.0 | 4.0 | 10.8 | 63.0  | 9.7  | 13.5
c: Redevelopment       | 12  | 0.0 | 1.0 | 6.0 | 15.3 | 33.0  | 9.7  | 10.3
d: Enhancement         | 28  | 0.0 | 0.0 | 1.0 | 7.0  | 70.0  | 7.1  | 15.0
7.2.3 FP Size and Number_of_Identified_Defects: Development, IFPUG_Group
This section presents the relationship between the FP size and the number_of_identified_defects of the
development projects that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).
Stratification criteria
• 103_Project_type = “a: Development”
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects (Derived indicator)
Projects of small to medium FP sizes have large variations in the number_of_identified_defects. The number_of_identified_defects appears to increase as the FP size increases.

Figure 7-2-6 FP Size and Number_of_Identified_Defects (Development, IFPUG_Group)
[Scattergram omitted: X-axis Actual_FP_size_(unadjusted) (0–16,000); Y-axis Number_of_identified_defects (0–350); N = 178]
* The above scattergram does not cover one project (y = 800 approximately).

Table 7-2-7 Number_of_Identified_Defects Basic Statistics (Development, IFPUG_Group) (Unit: number of defects)

N   | Min | P25 | Med | P75  | Max | Mean | S.D.
178 | 0   | 0.0 | 3.0 | 16.8 | 818 | 27.0 | 79.1
7.2.4 FP Size and Number_of_Identified_Defects: Enhancement, IFPUG_Group
This section presents the relationship between the FP size and the number_of_identified_defects of the
enhancement projects that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).
Stratification criteria
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects (Derived indicator)
A comparison of the graphs in this section with those in Section 7.2.3 shows that the “enhancement” projects have fewer identified defects than the “development” projects of the same class of size.

Figure 7-2-8 FP Size and Number_of_Identified_Defects (Enhancement, IFPUG_Group)
[Scattergram omitted: X-axis Actual_FP_size_(unadjusted) (0–5,000); Y-axis Number_of_identified_defects (0–80); N = 58]
Table 7-2-9 Number_of_Identified_Defects Basic Statistics (Enhancement, IFPUG_Group) (Unit: number of defects)

N  | Min | P25 | Med | P75 | Max | Mean | S.D.
58 | 0   | 1.0 | 3.5 | 8.8 | 70  | 8.4  | 14.2
7.3 FP Size and Identified Defect Density
This section presents the density of the defects identified after system cutover for the projects that measured their actual FP sizes. The FP_identified_defect_density refers to the number_of_identified_defects per 1,000 FPs. The number_of_identified_defects refers to the cumulative number of defects identified within 6 months after system cutover, unless otherwise noted.

7.3.1 FP Size and Identified Defect Density: All Project Types, Mixed_FP_Measurement_Methods
This section presents the relationship between the FP size and identified defect density of projects of every type that measured their FP sizes. This relationship is shown in a scattergram and the basic statistics for each project type. The FP size measurement methods used by these projects include any kind of method such as methods with unknown names.
Stratification criteria
• 103_Project_type has a defined value.
• 701_FP_measurement_method = Any value (including “Unknown”)
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects per 1,000 FPs (Derived indicator) [defects / 1,000 FPs]
Projects of small to medium FP sizes have large variations in the identified defect density. The identified defect density values converge to a range below a certain level as the FP size increases, leaving no high density values above that level. All project types exhibit single-digit median values, while the “maintenance/support” and “redevelopment” types exhibit slightly higher median values than the “development” or “enhancement” type.
Figure 7-3-1 FP Size and FP_Identified_Defect_Density (Mixed_FP_Measurement_Methods)
[Scattergram omitted: X-axis Actual_FP_size_(unadjusted) (0–16,000); Y-axis Number_of_identified_defects per 1,000 FPs (0–500); plotted by project type; N = 260]
Table 7-3-2 FP_Identified_Defect_Density Basic Statistics (Mixed_FP_Measurement_Methods) (Unit: defects / 1,000 FPs)

Types of Projects      | N   | Min | P25 | Med | P75  | Max   | Mean | S.D.
All project types      | 260 | 0.0 | 0.2 | 4.0 | 15.3 | 465.5 | 26.6 | 67.5
a: Development         | 184 | 0.0 | 0.0 | 3.1 | 14.6 | 411.3 | 25.7 | 66.5
b: Maintenance/support | 35  | 0.0 | 2.4 | 8.0 | 41.4 | 465.5 | 51.7 | 98.5
c: Redevelopment       | 13  | 0.0 | 1.6 | 9.3 | 12.5 | 33.0  | 9.6  | 9.8
d: Enhancement         | 28  | 0.0 | 0.0 | 3.1 | 14.2 | 54.4  | 9.6  | 14.5
7.3.2 FP Size and Identified Defect Density: All Project Types, IFPUG_Group
This section presents the relationship between the FP size and identified defect density of projects of every
type that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).
Stratification criteria
• 103_Project_type has a defined value.
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects per 1,000 FPs (Derived indicator) [defects / 1,000 FPs]
Because they account for the largest portion of the sampled projects, projects of the IFPUG_group exhibit almost the same distribution as that presented in Section 7.3.1 for projects categorized as the mixed_FP_measurement_methods.

Figure 7-3-3 FP Size and FP_Identified_Defect_Density (IFPUG_Group)
[Scattergram omitted: X-axis Actual_FP_size_(unadjusted) (0–16,000); Y-axis Number_of_identified_defects per 1,000 FPs (0–500); plotted by project type; N = 248]
Table 7-3-4 FP_Identified_Defect_Density Basic Statistics (IFPUG_Group) (Unit: defects / 1,000 FPs)

Types of Projects      | N   | Min | P25 | Med | P75  | Max   | Mean | S.D.
All project types      | 248 | 0.0 | 0.3 | 4.0 | 15.2 | 465.5 | 26.1 | 67.8
a: Development         | 178 | 0.0 | 0.0 | 3.1 | 14.2 | 411.3 | 25.6 | 66.9
b: Maintenance/support | 30  | 0.0 | 2.8 | 8.3 | 36.0 | 465.5 | 50.9 | 102.8
c: Redevelopment       | 12  | 0.0 | 1.4 | 8.2 | 11.6 | 33.0  | 9.0  | 9.9
d: Enhancement         | 28  | 0.0 | 0.0 | 3.1 | 14.2 | 54.4  | 9.6  | 14.5
7.3.3 FP Size and Identified Defect Density: Development, IFPUG_Group
This section presents the relationship between the FP size and identified defect density of the development
projects that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).
Stratification criteria
• 103_Project_type = “a: Development”
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects per 1,000 FPs (Derived indicator) [defects / 1,000 FPs]
Accounting for the largest portion among all project types of the IFPUG_group, the development projects of that group exhibit almost the same distribution as the development projects categorized as the mixed_FP_measurement_methods.
The identified defect density values converge to a range below a certain level as the FP size increases, leaving no high density values above that level. The median of the FP_identified_defect_density values is as low as 3.1.
Figure 7-3-5 FP Size and FP_Identified_Defect_Density (Development, IFPUG_Group)
[Scattergram omitted: X-axis Actual_FP_size_(unadjusted) (0–16,000); Y-axis Number_of_identified_defects per 1,000 FPs (0–450); N = 178]
Table 7-3-6 FP_Identified_Defect_Density Basic Statistics (Development, IFPUG_Group) (Unit: defects / 1,000 FPs)

FP Size                                   | N   | Min | P25 | Med | P75  | Max   | Mean | S.D.
All project types                         | 178 | 0.0 | 0.0 | 3.1 | 14.2 | 411.3 | 25.6 | 66.9
Less than 400FPs                          | 41  | 0.0 | 0.0 | 0.0 | 11.8 | 366.5 | 28.2 | 74.9
400FPs or more and less than 1,000FPs     | 58  | 0.0 | 2.5 | 6.6 | 22.6 | 411.3 | 42.0 | 91.2
1,000FPs or more and less than 3,000FPs   | 52  | 0.0 | 0.6 | 1.9 | 9.2  | 130.3 | 11.6 | 24.7
3,000FPs or more                          | 27  | 0.0 | 0.6 | 3.1 | 8.2  | 174.0 | 13.4 | 35.8
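The FP-size stratification used in the table above can be sketched as a classification step followed by per-class statistics. The sample projects below are illustrative pairs of (FP size, FP_identified_defect_density), not survey data:

```python
# Sketch: classify projects into the four FP-size classes of Table 7-3-6
# and compute the median identified defect density per class.
import statistics

def fp_size_class(fp):
    if fp < 400:
        return "Less than 400FPs"
    if fp < 1000:
        return "400FPs or more and less than 1,000FPs"
    if fp < 3000:
        return "1,000FPs or more and less than 3,000FPs"
    return "3,000FPs or more"

# Illustrative (FP size, defects per 1,000 FPs) pairs.
projects = [(350, 0.0), (800, 6.6), (900, 8.0), (2500, 1.9), (5000, 3.1)]

by_class = {}
for fp, density in projects:
    by_class.setdefault(fp_size_class(fp), []).append(density)
medians = {k: statistics.median(v) for k, v in by_class.items()}
```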
7.3.4 Industry-Type-Based FP Size and Identified Defect Density: Development, IFPUG_Group
This section presents on a per-major-target-industry-type basis the relationship between the FP size and
identified defect density of the development projects that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).
Stratification criteria
• 103_Project_type = “a: Development”
• Any of the three data items, 201_Industry_type_1, _2, and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects per 1,000 FPs (Derived indicator) [defects / 1,000 FPs]
Projects of the “manufacturing” type and those of the “information and communications” type exhibit low
median values (0.3 and 0.0, respectively) with respect to the FP_Identified_Defect_Density, while projects of the “wholesale/retail trade” type exhibit a higher median value of 5.2.
Figure 7-3-7 Industry-Type-Based FP Size and FP_Identified_Defect_Density (Development, IFPUG_Group)
[Scattergram: X-axis Actual_FP_size_(unadjusted), 0 to 16,000; Y-axis Number_of_identified_defects per 1,000 FPs, 0 to 100; N=116; series: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government]
* The above scattergram does not cover two projects (y = between 250 and 330 approximately).

Table 7-3-8 Industry-Type-Based FP_Identified_Defect_Density Basic Statistics (Development, IFPUG_Group)
(Unit: defects / 1,000 FPs)
Industry Type (Major Type) | N | Min | P25 | Med | P75 | Max | Mean | S.D.
F: Manufacturing | 26 | 0.0 | 0.0 | 0.3 | 7.0 | 33.2 | 5.2 | 8.3
H: Information and communications | 13 | 0.0 | 0.0 | 0.0 | 12.3 | 74.0 | 15.4 | 26.5
J: Wholesale/retail trade | 22 | 0.0 | 2.1 | 5.2 | 16.9 | 80.9 | 12.1 | 18.1
K: Finance and insurance | 46 | 0.0 | 0.8 | 2.9 | 9.8 | 329.9 | 13.9 | 48.6
R: Government, N.E.C. | 9 | 0.0 | 0.0 | 2.1 | 8.9 | 268.5 | 33.8 | 88.2
7.3.5 Architecture-Based FP Size and Identified Defect Density: Development, IFPUG_Group
This section presents on a per-system-architecture basis the relationship between the FP size and identified defect density of the development projects that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).

Stratification criteria
• 103_Project_type = “a: Development”
• 308_Architecture_1, _2, or _3 has a defined value.
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects per 1,000 FPs (Derived indicator) [defects / 1,000 FPs]
Projects of the “2-layer client/server” architecture exhibit a high median value and a high mean value with respect to the FP_Identified_Defect_Density. Note that these projects are generally smaller than projects of other architecture types.
Figure 7-3-9 Architecture-Based FP Size and FP_Identified_Defect_Density (Development, IFPUG_Group)
[Scattergram: X-axis Actual_FP_size_(unadjusted), 0 to 16,000; Y-axis Number_of_identified_defects per 1,000 FPs, 0 to 450; N=166; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]

Table 7-3-10 Architecture-Based FP_Identified_Defect_Density Basic Statistics (Development, IFPUG_Group)
(Unit: defects / 1,000 FPs)
Architecture | N | Min | P25 | Med | P75 | Max | Mean | S.D.
a: Stand-alone | 23 | 0.0 | 0.0 | 3.4 | 11.9 | 163.6 | 17.4 | 37.5
b: Mainframe | 3 | 0.3 | — | 0.6 | — | 22.6 | 7.9 | 12.8
c: 2-layer client/server | 28 | 0.0 | 0.0 | 4.6 | 174.4 | 411.3 | 87.1 | 131.7
d: 3-layer client/server | 25 | 0.0 | 2.1 | 8.9 | 15.2 | 329.9 | 24.5 | 65.8
e: Intranet/Internet | 87 | 0.0 | 0.7 | 2.6 | 11.7 | 174.0 | 10.8 | 22.8
7.3.6 FP Size and Identified Defect Density: Enhancement, IFPUG_Group
This section presents the relationship between the FP size and identified defect density of the enhancement projects that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).

Stratification criteria
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects per 1,000 FPs (Derived indicator) [defects / 1,000 FPs]
The “enhancement” projects exhibit a higher median value of 6.0 with respect to the FP_Identified_Defect_Density than the “development” projects (3.1) shown in Table 7-3-6. Note that the enhancement projects are generally smaller than the development projects.
Figure 7-3-11 FP Size and FP_Identified_Defect_Density (Enhancement, IFPUG_Group)
[Scattergram: X-axis Actual_FP_size_(unadjusted), 0 to 5,000; Y-axis Number_of_identified_defects per 1,000 FPs, 0 to 500; N=58]

Table 7-3-12 FP_Identified_Defect_Density Basic Statistics (Enhancement, IFPUG_Group)
(Unit: defects / 1,000 FPs)
FP Size | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All projects | 58 | 0.0 | 1.1 | 6.0 | 22.8 | 465.5 | 31.0 | 76.9
Less than 400 FPs | 29 | 0.0 | 0.0 | 11.6 | 38.9 | 465.5 | 52.9 | 104.1
400 FPs or more and less than 1,000 FPs | 20 | 0.0 | 1.7 | 5.3 | 11.9 | 61.3 | 11.0 | 17.0
1,000 FPs or more and less than 3,000 FPs | 7 | 0.0 | 1.3 | 2.5 | 5.6 | 8.5 | 3.5 | 3.3
3,000 FPs or more | 2 | 1.4 | — | 8.5 | — | 15.6 | 8.5 | 10.0
7.3.7 Industry-Type-Based FP Size and Identified Defect Density: Enhancement, IFPUG_Group
This section presents on a per-major-target-industry-type basis the relationship between the FP size and identified defect density of the enhancement projects that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).

Stratification criteria
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 201_Industry_type_1, _2, or _3 (major type) = “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects per 1,000 FPs (Derived indicator) [defects / 1,000 FPs]
Because a limited number of sampled projects are available for each industry type, read the results presented in this section as non-definitive information.

Figure 7-3-13 Industry-Type-Based FP Size and FP_Identified_Defect_Density (Enhancement, IFPUG_Group)
[Scattergram: X-axis Actual_FP_size_(unadjusted), 0 to 5,000; Y-axis Number_of_identified_defects per 1,000 FPs, 0 to 80; N=30; series: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government]

Table 7-3-14 Industry-Type-Based FP_Identified_Defect_Density Basic Statistics (Enhancement, IFPUG_Group)
(Unit: defects / 1,000 FPs)
Industry Type (Major Type) | N | Min | P25 | Med | P75 | Max | Mean | S.D.
F: Manufacturing | 10 | 0.0 | 0.0 | 2.1 | 18.1 | 39.4 | 10.7 | 14.6
H: Information and communications | 6 | 0.0 | 0.8 | 5.2 | 9.0 | 38.9 | 9.8 | 14.8
J: Wholesale/retail trade | 4 | 0.0 | 1.1 | 2.4 | 21.2 | 74.9 | 19.9 | 36.7
K: Finance and insurance | 8 | 1.0 | 2.0 | 2.7 | 4.0 | 15.6 | 4.2 | 4.7
R: Government, N.E.C. | 2 | 0.6 | — | 1.4 | — | 2.2 | 1.4 | 1.1
7.3.8 Architecture-Based FP Size and Identified Defect Density: Enhancement, IFPUG_Group
This section presents on a per-system-architecture basis the relationship between the FP size and identified defect density of the enhancement projects that measured their sizes by any FP measurement method of the IFPUG_group (IFPUG, SPR, or NESMA estimated method).

Stratification criteria
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 308_Architecture_1, _2, or _3 has a defined value.
• 701_FP_measurement_method = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_(unadjusted) > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: 5001_Actual_FP_size_(unadjusted)
• Y-axis: Number_of_identified_defects per 1,000 FPs (Derived indicator) [defects / 1,000 FPs]
Because a limited number of sampled projects are available for each architecture type, read the results presented in this section as non-definitive information.

Figure 7-3-15 Architecture-Based FP Size and FP_Identified_Defect_Density (Enhancement, IFPUG_Group)
[Scattergram: X-axis Actual_FP_size_(unadjusted), 0 to 5,000; Y-axis Number_of_identified_defects per 1,000 FPs, 0 to 500; N=57; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]

Table 7-3-16 Architecture-Based FP_Identified_Defect_Density Basic Statistics (Enhancement, IFPUG_Group)
(Unit: defects / 1,000 FPs)
Architecture | N | Min | P25 | Med | P75 | Max | Mean | S.D.
a: Stand-alone | 26 | 0.0 | 3.5 | 12.5 | 36.0 | 465.5 | 51.4 | 106.5
b: Mainframe | 2 | 3.9 | — | 4.0 | — | 4.1 | 4.0 | 0.2
c: 2-layer client/server | 15 | 0.0 | 0.0 | 4.1 | 9.4 | 170.1 | 20.2 | 44.4
d: 3-layer client/server | 4 | 0.6 | 1.6 | 2.1 | 6.6 | 19.6 | 6.1 | 9.0
e: Intranet/Internet | 10 | 0.0 | 1.6 | 2.6 | 13.7 | 74.9 | 12.4 | 22.7
7.4 SLOC Size and Number_of_Identified_Defects

This section presents the number_of_identified_defects of the projects that measured their actual SLOC sizes. The number_of_identified_defects refers to the cumulative number of defects identified within six months after system cutover, unless otherwise noted.

7.4.1 SLOC Size and Number_of_Identified_Defects: All Project Types, Mixed Primary Programming Languages

This section presents the relationship between the SLOC size and the number_of_identified_defects of projects of every type. The projects that provided the SLOC size data analyzed here used programming languages of any kind, including languages whose names are unknown.

Stratification criteria
• 103_Project_type has a defined value.
• 312_Primary_programming_language_1 = Any value (including “Unknown”)
• Actual_net_SLOC_size > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Number_of_identified_defects (Derived indicator)
A comparison of project types within the same size class shows that the “development” projects have more identified defects.

Figure 7-4-1 SLOC Size and Number_of_Identified_Defects (Mixed Primary Programming Languages)
[Scattergram: X-axis SLOC size, 0 to 4,000 KSLOC; Y-axis Number_of_identified_defects, 0 to 200; N=427; series: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement]
* The above scattergram does not cover five projects (y = between 320 and 1,300 approximately).

Table 7-4-2 Number_of_Identified_Defects Basic Statistics (Mixed Primary Programming Languages)
(Unit: defects)
Types of Projects | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 427 | 0 | 0.0 | 2.0 | 8.0 | 1,262 | 20.4 | 87.3
a: Development | 244 | 0 | 0.0 | 2.5 | 12.3 | 1,262 | 27.5 | 111.2
b: Maintenance/support | 108 | 0 | 0.0 | 1.0 | 7.0 | 320 | 13.9 | 40.3
c: Redevelopment | 25 | 0 | 0.0 | 2.0 | 5.0 | 160 | 12.8 | 32.9
d: Enhancement | 50 | 0 | 0.0 | 0.0 | 1.0 | 49 | 3.1 | 9.4
7.4.2 Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects: All Project Types
This section presents on a per-primary-programming-language basis (COBOL, C, VB, and Java) the relationship between the SLOC size and the number_of_identified_defects of projects of every type.

Stratification criteria
• 103_Project_type has a defined value.
• Any of the three data items 312_Primary_programming_language_1, _2, and _3 equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Number_of_identified_defects (Derived indicator)
COBOL exhibits a higher median value of 3.0 than the other three languages (1.0) with respect to the
number_of_identified_defects. It is uncertain whether the relationship between the SLOC size and the number_of_identified_defects varies depending on any characteristics specific to programming languages.
Figure 7-4-3 Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects (All Project Types)
[Scattergram: X-axis SLOC size, 0 to 4,000 KSLOC; Y-axis Number_of_identified_defects, 0 to 200; N=336; series: b: COBOL, g: C, h: VB, q: Java]
* The above scattergram does not cover four projects (y = between 320 and 1,300 approximately).

Figure 7-4-4 Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects (All Project Types) Box-and-Whisker Plot
[Box-and-whisker plot: X-axis Primary programming language (b: COBOL, g: C, h: VB, q: Java); Y-axis Number_of_identified_defects, 0 to 60]

Table 7-4-5 Primary-Programming-Language-Based Number_of_Identified_Defects Basic Statistics (All Project Types)
(Unit: defects)
Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 336 | 0 | 0.0 | 2.0 | 8.0 | 1,262 | 20.1 | 87.0
b: COBOL | 77 | 0 | 1.0 | 3.0 | 14.0 | 630 | 23.1 | 77.2
g: C | 62 | 0 | 0.0 | 1.0 | 5.8 | 1,262 | 28.3 | 160.5
h: VB | 81 | 0 | 0.0 | 1.0 | 8.0 | 321 | 14.0 | 42.2
q: Java | 116 | 0 | 0.0 | 1.0 | 6.3 | 432 | 18.0 | 56.0
7.4.3 Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects: Development
This section presents on a per-primary-programming-language basis (COBOL, C, VB, and Java) the relationship between the SLOC size and the number_of_identified_defects of development projects.

Stratification criteria
• 103_Project_type = “a: Development”
• 312_Primary_programming_language_1 = “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Number_of_identified_defects (Derived indicator)
It is uncertain whether the relationship between the SLOC size and the number_of_identified_defects varies depending on any characteristics specific to programming languages.

Figure 7-4-6 Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects (Development)
[Scattergram: X-axis SLOC size, 0 to 4,000 KSLOC; Y-axis Number_of_identified_defects, 0 to 180; N=173; series: b: COBOL, g: C, h: VB, q: Java]
* The above scattergram does not cover two projects (y = between 320 and 1,300 approximately).

Table 7-4-7 Primary-Programming-Language-Based Number_of_Identified_Defects Basic Statistics (Development)
(Unit: defects)
Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
b: COBOL | 43 | 0 | 2.0 | 3.0 | 18.5 | 630 | 32.8 | 100.0
g: C | 31 | 0 | 0.0 | 2.0 | 11.5 | 1,262 | 52.5 | 226.0
h: VB | 36 | 0 | 0.0 | 1.5 | 10.0 | 139 | 12.9 | 26.5
q: Java | 63 | 0 | 0.0 | 2.0 | 6.0 | 432 | 16.5 | 57.7
7.4.4 Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects: Enhancement
This section presents on a per-primary-programming-language basis (COBOL, C, VB, and Java) the relationship between the SLOC size and the number_of_identified_defects of enhancement projects.

Stratification criteria
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 312_Primary_programming_language_1 = “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size_enhancement > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size_enhancement (Derived indicator)
• Y-axis: Number_of_identified_defects (Derived indicator)
It is uncertain whether the relationship between the SLOC size and the number_of_identified_defects varies depending on any characteristics specific to programming languages.

Figure 7-4-8 Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects (Enhancement)
[Scattergram: X-axis Actual_net_SLOC_size_enhancement, 0 to 2,000 KSLOC; Y-axis Number_of_identified_defects, 0 to 180; N=104; series: b: COBOL, g: C, h: VB, q: Java]
* The above scattergram does not cover one project (y = 320 approximately).

Table 7-4-9 Primary-Programming-Language-Based Number_of_Identified_Defects Basic Statistics (Enhancement)
(Unit: defects)
Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
b: COBOL | 21 | 0 | 0.0 | 2.0 | 7.0 | 146 | 13.0 | 32.4
g: C | 23 | 0 | 0.0 | 0.0 | 4.0 | 24 | 3.3 | 6.4
h: VB | 23 | 0 | 0.0 | 1.0 | 2.0 | 87 | 7.2 | 19.9
q: Java | 37 | 0 | 0.0 | 1.0 | 3.0 | 320 | 20.0 | 59.5
7.5 SLOC Size and SLOC_Identified_Defect_Density

This section presents the SLOC_identified_defect_density of the projects that measured their actual SLOC sizes. The SLOC_identified_defect_density refers to the number_of_identified_defects per 1,000 lines of source code (1 KSLOC). The number_of_identified_defects refers to the cumulative number of defects identified within six months after system cutover, unless otherwise noted.

7.5.1 SLOC Size and Identified Defect Density: All Project Types, Mixed Primary Programming Languages

This section presents the relationship between the SLOC size and identified defect density of projects of every type that measured their actual SLOC sizes. The projects that provided the SLOC size data analyzed here used programming languages of any kind, including languages whose names are unknown.

Stratification criteria
• 103_Project_type has a defined value.
• 312_Primary_programming_language_1 = Any value (including “Unknown”)
• Actual_net_SLOC_size > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Number_of_identified_defects per 1 KSLOC (Derived indicator) [defects / KSLOC]
The SLOC_identified_defect_density varies widely when the SLOC size is small. “Development” projects and “redevelopment” projects exhibit relatively high median values of 0.026 and 0.032, respectively, with respect to the SLOC_identified_defect_density. “Maintenance/support” projects and “enhancement” projects exhibit lower median values of 0.004 and 0.000, respectively.
Figure 7-5-1 SLOC Size and SLOC_Identified_Defect_Density (Mixed Primary Programming Languages)
[Scattergram: X-axis SLOC size, 0 to 4,000 KSLOC; Y-axis defects per KSLOC, 0.0 to 1.4; N=427; series: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement]
* The above scattergram does not cover six projects (x = nearly zero, y = between 2 and 6) and one project (x = 12,100 approximately, y = nearly zero).

Table 7-5-2 SLOC_Identified_Defect_Density Basic Statistics (Mixed Primary Programming Languages)
(Unit: defects / KSLOC)
Types of Projects | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 427 | 0.000 | 0.000 | 0.018 | 0.075 | 5.845 | 0.132 | 0.465
a: Development | 244 | 0.000 | 0.000 | 0.026 | 0.084 | 4.708 | 0.119 | 0.375
b: Maintenance/support | 108 | 0.000 | 0.000 | 0.004 | 0.052 | 5.845 | 0.175 | 0.694
c: Redevelopment | 25 | 0.000 | 0.000 | 0.032 | 0.172 | 2.041 | 0.181 | 0.417
d: Enhancement | 50 | 0.000 | 0.000 | 0.000 | 0.057 | 0.957 | 0.077 | 0.192
7.5.2 Primary-Programming-Language-Based SLOC Size and SLOC Identified Defect Density: All Project Types
This section presents on a per-primary-programming-language basis the relationship between the SLOC size and identified defect density of the projects of every type that used COBOL, C, VB, or Java for their primary programming language.

Stratification criteria
• 103_Project_type has a defined value.
• 312_Primary_programming_language_1, _2, or _3 = “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Number_of_identified_defects per 1 KSLOC (Derived indicator) [defects / KSLOC]
The SLOC_identified_defect_density varies widely when the SLOC size is small. Both “COBOL” and “VB” exhibit a high median value of 0.024 with respect to the identified defect density, and “C” exhibits the lowest median value of 0.004.
Figure 7-5-3 Primary-Programming-Language-Based SLOC Size and SLOC_Identified_Defect_Density (All Project Types)
[Scattergram: X-axis SLOC size, 0 to 3,000 KSLOC; Y-axis defects per KSLOC, 0.0 to 1.4; N=336; series: b: COBOL, g: C, h: VB, q: Java]
* The above scattergram does not cover two projects (x = nearly zero, y = between 2 and 2.5) and two projects (x = between 3,800 and 12,100, y = nearly zero).

Figure 7-5-4 Primary-Programming-Language-Based SLOC_Identified_Defect_Density (All Project Types) Box-and-Whisker Plot
[Box-and-whisker plot: X-axis Primary programming language (b: COBOL, g: C, h: VB, q: Java); Y-axis SLOC_identified_defect_density, 0.00 to 0.40 defects/KSLOC]

Table 7-5-5 Primary-Programming-Language-Based SLOC_Identified_Defect_Density Basic Statistics (All Project Types)
(Unit: defects / KSLOC)
Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 336 | 0.000 | 0.000 | 0.018 | 0.067 | 2.413 | 0.093 | 0.236
b: COBOL | 77 | 0.000 | 0.004 | 0.024 | 0.063 | 1.093 | 0.075 | 0.160
g: C | 62 | 0.000 | 0.000 | 0.004 | 0.045 | 0.476 | 0.051 | 0.100
h: VB | 81 | 0.000 | 0.000 | 0.024 | 0.083 | 1.269 | 0.103 | 0.206
q: Java | 116 | 0.000 | 0.000 | 0.013 | 0.088 | 2.413 | 0.122 | 0.330
7.5.3 Primary-Programming-Language-Based SLOC Size and Identified Defect Density: Development
This section presents on a per-primary-programming-language basis the relationship between the SLOC size and identified defect density of the development projects that used COBOL, C, VB, or Java for their primary programming language.

Stratification criteria
• 103_Project_type = “a: Development”
• 312_Primary_programming_language_1 = “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Number_of_identified_defects per 1 KSLOC (Derived indicator) [defects / KSLOC]
The SLOC_identified_defect_density varies widely when the SLOC size is small. “COBOL” and “VB” exhibit high median values of 0.037 and 0.026, respectively, with respect to the SLOC_identified_defect_density, and “Java” exhibits the lowest median value of 0.011.
Figure 7-5-6 Primary-Programming-Language-Based SLOC Size and SLOC_Identified_Defect_Density (Development)
[Scattergram: X-axis SLOC size, 0 to 3,000 KSLOC; Y-axis defects per KSLOC, 0.0 to 1.4; N=173; series: b: COBOL, g: C, h: VB, q: Java]
* The above scattergram does not cover two projects (x = nearly zero, y = 2.4 approximately) and one project (x = 12,100 approximately, y = nearly zero).

Table 7-5-7 SLOC-Size-Based SLOC_Identified_Defect_Density Basic Statistics (Development, Major_Programming_Language_Group)
(Unit: defects / KSLOC)
SLOC size | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All projects | 173 | 0.000 | 0.000 | 0.024 | 0.065 | 2.413 | 0.086 | 0.237
Less than 40 KSLOCs | 56 | 0.000 | 0.000 | 0.000 | 0.061 | 1.269 | 0.074 | 0.191
40 KSLOCs or more and less than 100 KSLOCs | 32 | 0.000 | 0.000 | 0.029 | 0.084 | 0.342 | 0.067 | 0.088
100 KSLOCs or more and less than 300 KSLOCs | 40 | 0.000 | 0.006 | 0.031 | 0.100 | 2.413 | 0.150 | 0.415
300 KSLOCs or more | 45 | 0.000 | 0.005 | 0.020 | 0.052 | 0.476 | 0.060 | 0.107
Figure 7-5-8 Primary-Programming-Language-Based SLOC_Identified_Defect_Density (Development) Box-and-Whisker Plot
[Box-and-whisker plot: X-axis Primary programming language (b: COBOL, g: C, h: VB, q: Java); Y-axis SLOC_identified_defect_density, 0.00 to 0.40 defects/KSLOC]

Table 7-5-9 Primary-Programming-Language-Based SLOC_Identified_Defect_Density Basic Statistics (Development)
(Unit: defects / KSLOC)
Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
b: COBOL | 43 | 0.000 | 0.016 | 0.037 | 0.060 | 1.093 | 0.075 | 0.175
g: C | 31 | 0.000 | 0.000 | 0.014 | 0.055 | 0.476 | 0.061 | 0.106
h: VB | 36 | 0.000 | 0.000 | 0.026 | 0.101 | 1.269 | 0.104 | 0.231
q: Java | 63 | 0.000 | 0.000 | 0.011 | 0.065 | 2.413 | 0.097 | 0.315
7.5.4 Industry-Type-Based SLOC Size and Identified Defect Density: Development, Major_Programming_Language_Group
This section presents on a per-major-target-industry-type basis the relationship between the SLOC size and identified defect density of the development projects that used COBOL, C, VB, or Java for their primary programming language.

Stratification criteria
• 103_Project_type = “a: Development”
• 201_Industry_type_1, _2, or _3 (major type) = “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”
• 312_Primary_programming_language_1 = “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Number_of_identified_defects per 1 KSLOC (Derived indicator) [defects / KSLOC]
Projects of the “manufacturing” type and those of the “government” type exhibit high median values of
0.081 and 0.050, respectively, with respect to the SLOC_identified_defect_density, and projects of the “information and communications” type exhibit the lowest median value of 0.017.
Figure 7-5-10 Industry-Type-Based SLOC Size and SLOC_Identified_Defect_Density (Development, Major_Programming_Language_Group)
[Scattergram: X-axis SLOC size, 0 to 3,000 KSLOC; Y-axis defects per KSLOC, 0.0 to 1.4; N=139; series: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government]
* The above scattergram does not cover two projects: one project (x = nearly zero, y = 2.4 approximately) and another (x = 12,000 approximately, y = nearly zero).

Figure 7-5-11 Industry-Type-Based SLOC_Identified_Defect_Density (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
[Box-and-whisker plot: X-axis Industry type (major type: F, H, J, K, R); Y-axis SLOC_identified_defect_density, 0.00 to 0.40 defects/KSLOC]
Table 7-5-12 Industry-Type-Based SLOC_Identified_Defect_Density Basic Statistics (Development, Major_Programming_Language_Group)
(Unit: defects / KSLOC)
Industry Type (Major Type) | N | Min | P25 | Med | P75 | Max | Mean | S.D.
F: Manufacturing | 10 | 0.000 | 0.003 | 0.081 | 0.189 | 0.342 | 0.109 | 0.118
H: Information and communications | 16 | 0.000 | 0.000 | 0.017 | 0.074 | 0.307 | 0.060 | 0.095
J: Wholesale/retail trade | 25 | 0.000 | 0.000 | 0.036 | 0.073 | 2.413 | 0.161 | 0.482
K: Finance and insurance | 63 | 0.000 | 0.000 | 0.021 | 0.052 | 0.151 | 0.035 | 0.042
R: Government, N.E.C. | 25 | 0.000 | 0.007 | 0.050 | 0.142 | 1.269 | 0.180 | 0.335
7.5.5 Architecture-Based SLOC Size and Identified Defect Density: Development, Major_Programming_Language_Group
This section presents on a per-system-architecture basis the relationship between the SLOC size and identified defect density of the development projects that used COBOL, C, VB, or Java for their primary programming language.

Stratification criteria
• 103_Project_type = “a: Development”
• 308_Architecture_1, _2, or _3 has a defined value.
• 312_Primary_programming_language_1 = “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size (Derived indicator)
• Y-axis: Number_of_identified_defects per 1 KSLOC (Derived indicator) [defects / KSLOC]
The SLOC_identified_defect_density varies widely when the SLOC size is small. Projects of the “3-layer client/server” architecture show larger variations in the SLOC_identified_defect_density and exhibit a slightly higher median value than projects of other architecture types. Projects of the “intranet/Internet” architecture and those of the “2-layer client/server” architecture exhibit low median values.
Figure 7-5-13 Architecture-Based SLOC Size and SLOC_Identified_Defect_Density (Development, Major_Programming_Language_Group)
[Scattergram: X-axis SLOC size, 0 to 3,000 KSLOC; Y-axis defects per KSLOC, 0.00 to 0.50; N=140; series: a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet]
* The above scattergram does not cover two projects (x = between 3,800 and 12,000 approximately, y = nearly zero).

Figure 7-5-14 Architecture-Based SLOC_Identified_Defect_Density (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
[Box-and-whisker plot: X-axis Architecture (a: Stand-alone, b: Mainframe, c: 2-layer client/server, d: 3-layer client/server, e: Intranet/Internet); Y-axis SLOC_identified_defect_density, 0.00 to 0.30 defects/KSLOC]

Table 7-5-15 Architecture-Based SLOC_Identified_Defect_Density Basic Statistics (Development, Major_Programming_Language_Group)
(Unit: defects / KSLOC)
Architecture | N | Min | P25 | Med | P75 | Max | Mean | S.D.
a: Stand-alone | 6 | 0.000 | 0.000 | 0.000 | 0.000 | 0.151 | 0.025 | 0.062
b: Mainframe | 7 | 0.000 | 0.016 | 0.044 | 0.055 | 0.117 | 0.043 | 0.039
c: 2-layer client/server | 35 | 0.000 | 0.000 | 0.016 | 0.035 | 0.223 | 0.030 | 0.049
d: 3-layer client/server | 38 | 0.000 | 0.000 | 0.037 | 0.120 | 0.238 | 0.062 | 0.071
e: Intranet/Internet | 54 | 0.000 | 0.002 | 0.017 | 0.056 | 0.476 | 0.050 | 0.090
7.5.6 Primary-Programming-Language-Based SLOC Size and Identified Defect Density: Enhancement
This section presents on a per-primary-programming-language basis the relationship between the SLOC size and identified defect density of the enhancement projects that used COBOL, C, VB, or Java for their primary programming language.

Stratification criteria
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 312_Primary_programming_language_1 = “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size_enhancement > 0
• Number_of_identified_defects ≥ 0

Analyzed data
• X-axis: Actual_net_SLOC_size_enhancement (Derived indicator)
• Y-axis: Number_of_identified_defects per 1 KSLOC (Derived indicator) [defects / KSLOC]
The SLOC_identified_defect_density varies widely when the SLOC size is small. The “enhancement” projects have lower SLOC_identified_defect_density values than the “development” projects presented in Section 7.5.3.
Figure 7-5-16 Primary-Programming-Language-Based SLOC Size and SLOC_Identified_Defect_Density (Enhancement)
[Scattergram: X-axis Actual_net_SLOC_size_enhancement, 0 to 2,000 KSLOC; Y-axis defects per KSLOC, 0.0 to 1.0; N=104; series: b: COBOL, g: C, h: VB, q: Java]

Table 7-5-17 SLOC-Size-Based SLOC_Identified_Defect_Density Basic Statistics (Enhancement, Major_Programming_Language_Group)
(Unit: defects / KSLOC)
SLOC size | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All projects | 104 | 0.000 | 0.000 | 0.004 | 0.052 | 1.007 | 0.076 | 0.166
Less than 40 KSLOCs | 44 | 0.000 | 0.000 | 0.000 | 0.007 | 0.526 | 0.036 | 0.097
40 KSLOCs or more and less than 100 KSLOCs | 21 | 0.000 | 0.018 | 0.044 | 0.151 | 0.688 | 0.137 | 0.182
100 KSLOCs or more and less than 300 KSLOCs | 20 | 0.000 | 0.000 | 0.010 | 0.172 | 1.007 | 0.126 | 0.241
300 KSLOCs or more | 19 | 0.000 | 0.000 | 0.005 | 0.022 | 0.704 | 0.048 | 0.160
Figure 7-5-18 Primary-Programming-Language-Based SLOC_Identified_Defect_Density (Enhancement) Box-and-Whisker Plot
[Box-and-whisker plot: X-axis Primary programming language (b: COBOL, g: C, h: VB, q: Java); Y-axis SLOC_identified_defect_density, 0.00 to 0.40 defects/KSLOC]

Table 7-5-19 Primary-Programming-Language-Based SLOC_Identified_Defect_Density Basic Statistics (Enhancement)
(Unit: defects / KSLOC)
Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
b: COBOL | 21 | 0.000 | 0.000 | 0.005 | 0.024 | 0.412 | 0.047 | 0.100
g: C | 23 | 0.000 | 0.000 | 0.000 | 0.015 | 0.387 | 0.047 | 0.106
h: VB | 23 | 0.000 | 0.000 | 0.003 | 0.049 | 0.526 | 0.064 | 0.135
q: Java | 37 | 0.000 | 0.000 | 0.010 | 0.128 | 1.007 | 0.117 | 0.230
7.5.7 Industry-Type-Based SLOC Size and Identified Defect
Density: Enhancement, Major_Programming_Language_Group
This section presents on a per-major-target-industry-type basis the relationship between the SLOC size and
identified defect density of the enhancement projects that used COBOL, C, VB, or Java for their primary programming language.
Stratification criteria
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 201_Industry_type_1, _2, or _3 (major type) = “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”
• 312_Primary_programming_language_1 = “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size_enhancement > 0
• Number_of_identified_defects ≥ 0
Analyzed data
• X-axis: Actual_net_SLOC_size_enhancement (Derived indicator)
• Y-axis: Number_of_identified_defects per 1 KSLOCs (Derived indicator) [defects / KSLOC]
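The Y-axis value is a derived indicator. A minimal sketch of how it might be computed, with hypothetical argument names mirroring the data items and stratification criteria listed above (this is an illustration, not SEC's actual tooling):

```python
def sloc_identified_defect_density(identified_defects, net_sloc_size_ksloc):
    """Derived indicator: Number_of_identified_defects per 1 KSLOCs
    [defects/KSLOC]. The checks mirror the stratification criteria:
    SLOC size must be positive and the defect count non-negative."""
    if net_sloc_size_ksloc <= 0:
        raise ValueError("Actual_net_SLOC_size_enhancement must be > 0")
    if identified_defects < 0:
        raise ValueError("Number_of_identified_defects must be >= 0")
    return identified_defects / net_sloc_size_ksloc

# e.g. a hypothetical project: 10 defects found in a 250-KSLOC enhancement
density = sloc_identified_defect_density(10, 250)
```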
The SLOC_identified_defect_density varies widely when the SLOC size is small. Because only a limited
number of sampled projects is available for some of the industry types, it is uncertain whether the SLOC_identified_defect_density varies depending on the industry type.
7. Analysis of Reliability
Figure 7-5-20 Industry-Type-Based SLOC Size and SLOC_Identified_Defect_Density (Enhancement, Major_Programming_Language_Group)
[Scatter plot omitted: X-axis Actual_net_SLOC_size_enhancement [KSLOC], Y-axis defects per KSLOC; N=83]
Figure 7-5-21 Industry-Type-Based SLOC_Identified_Defect_Density (Enhancement,
Major_Programming_Language_Group) Box-and-Whisker Plot
Table 7-5-22 Industry-Type-Based SLOC_Identified_Defect_Density Basic Statistics
(Enhancement, Major_Programming_Language_Group) (Unit: defects/KSLOC)
Industry Type (Major Type) N Min P25 Med P75 Max Mean S.D.
F : Manufacturing 8 0.000 0.009 0.094 0.276 0.408 0.148 0.160
H : Information and communications 13 0.000 0.000 0.000 0.013 0.031 0.007 0.012
J : Wholesale/retail trade 7 0.000 0.000 0.000 0.098 1.007 0.172 0.372
K : Finance and insurance 38 0.000 0.000 0.000 0.061 0.688 0.062 0.138
R : Government, N.E.C. 17 0.000 0.003 0.018 0.041 0.412 0.059 0.112
7.5.8 Architecture-Based SLOC Size and Identified Defect Density: Enhancement, Major_Programming_Language_Group
This section presents on a per-system-architecture basis the relationship between the SLOC size and
identified defect density of the enhancement projects that used COBOL, C, VB, or Java for their primary programming language.
Stratification criteria
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 308_Architecture_1, _2, or _3 has a defined value.
• 312_Primary_programming_language_1 = “b: COBOL”, “g: C”, “h: VB”, or “q: Java”
• Actual_net_SLOC_size_enhancement > 0
• Number_of_identified_defects ≥ 0
Analyzed data
• X-axis: Actual_net_SLOC_size_enhancement (Derived indicator)
• Y-axis: Number_of_identified_defects per 1 KSLOCs (Derived indicator) [defects / KSLOC]
The SLOC_identified_defect_density fluctuates when the SLOC size is small. It is uncertain whether the SLOC_identified_defect_density varies depending on the type of system architecture.
Figure 7-5-23 Architecture-Based SLOC Size and SLOC_Identified_Defect_Density
(Enhancement, Major_Programming_Language_Group)
[Scatter plot omitted: X-axis Actual_net_SLOC_size_enhancement [KSLOC], Y-axis defects per KSLOC; N=82]
Figure 7-5-24 Architecture-Based
SLOC_Identified_Defect_Density (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
a: Stand-alone b: Mainframe c: 2-layer client/server d: 3-layer client/server e: Intranet/Internet
Table 7-5-24 Architecture-Based SLOC_Identified_Defect_Density Basic Statistics (Enhancement, Major_Programming_Language_Group)
(Unit: defects/KSLOC)
Architecture N Min P25 Med P75 Max Mean S.D.
a : Stand-alone 14 0.000 0.000 0.000 0.006 1.007 0.082 0.268
b : Mainframe 4 0.000 0.010 0.017 0.074 0.231 0.066 0.110
c : 2-layer client/server 28 0.000 0.000 0.014 0.067 0.704 0.076 0.153
d : 3-layer client/server 19 0.000 0.000 0.004 0.067 0.688 0.087 0.171
e : Intranet/Internet 17 0.000 0.000 0.000 0.036 0.345 0.044 0.090
8 Development-Phase-Based Analysis
This chapter presents the results of development-phase-based analyses of effort, development schedule, the number of issues pointed out in reviews, the test case density, the identified software failure density, and the identified software fault density.
8.1 Development-Phase-Based Analysis of Effort and Development Schedule
This section presents the analysis results of the ratio of months spent and the ratio of effort spent in each of
the major development phases. This analysis was made for the projects that stated that they went through the major development phases (from basic design to system test) by marking all the relevant phase-check data items with “ ”. This section calculates and presents the ratio of effort or months spent in a phase by dividing that effort or months by total actual effort or months spent in all the major development phases.
* The term “system test” used in figures and tables in this chapter refers to the system test done by the vendor.
8.1.1 Phase-Based Development Schedule: Development
This section presents the ratio of actual months spent in each of the major development phases of development projects. The first part describes the stratification criteria and analyzed data applied to the analysis in this chapter, and the second part presents the ratio of actual months in box-and-whisker plots and basic statistics.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “a: Development”
• All data items for the major development phases’ actual months have positive values.
Analyzed data
• Actual months_Basic design, Actual months_Detailed design, Actual months_Construction, Actual months_Integration test, and Actual months_System test
* All the above five data items are derived indicators holding each phase's actual months calculated from the actual beginning date and actual completion date of each phase. For projects lacking one or both of these dates but providing relevant data on actual months, the analysis in this section used the data instead.
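The ratios analyzed in this section can be sketched as follows; the function name and dictionary layout are hypothetical, and the sketch assumes the five actual-months values are already available as numbers:

```python
PHASES = ["Basic design", "Detailed design", "Construction",
          "Integration test", "System test"]

def phase_month_ratios(actual_months):
    """actual_months: dict mapping each of the five major phases to its
    actual months (all positive, per the stratification criteria).
    Returns each phase's share of the total months spent in all the
    major development phases."""
    if any(actual_months[p] <= 0 for p in PHASES):
        raise ValueError("every major phase needs a positive actual-months value")
    total = sum(actual_months[p] for p in PHASES)
    return {p: actual_months[p] / total for p in PHASES}

# Hypothetical project: 2, 1.5, 2.5, 1, 1 months per phase (total 8 months)
ratios = phase_month_ratios({
    "Basic design": 2, "Detailed design": 1.5, "Construction": 2.5,
    "Integration test": 1, "System test": 1,
})
```

The same division applies to the effort ratios in Sections 8.1.3 and 8.1.4, with person-hours in place of months.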
The development projects have high ratios of design-phase months and construction-phase months.
Figure 8-1-1 Phase-Based Actual Month Ratio (Development) Box-and-Whisker Plot
Table 8-1-2 Phase-Based Actual Month Ratio Basic Statistics (Development)
(Unit: ratio)
Phase N Min P25 Med P75 Max Mean S.D.
Basic design 69 0.033 0.167 0.237 0.308 0.522 0.243 0.107
Detailed design 69 0.033 0.141 0.200 0.245 0.400 0.196 0.087
Implementation 69 0.047 0.205 0.261 0.333 0.902 0.282 0.134
Integration test 69 0.016 0.075 0.125 0.176 0.386 0.131 0.072
System test 69 0.016 0.074 0.118 0.192 0.571 0.148 0.109
8.1.2 Phase-Based Development Schedule: Enhancement
This section presents the ratio of actual months spent in each of the major development phases of enhancement projects. The first part describes the stratification criteria and analyzed data applied to the analysis in this chapter, and the second part presents the ratio of actual months in box-and-whisker plots and basic statistics.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• All data items for the major development phases’ actual months have positive values.
Analyzed data
• Actual months_Basic design, Actual months_Detailed design, Actual months_Construction, Actual months_Integration test, and Actual months_System test
* All the above five data items are derived indicators holding each phase’s actual months calculated from the actual beginning date and actual completion date of each phase. For projects lacking one or both of these dates but providing relevant data on actual months, the analysis in this section used the data instead.
The enhancement projects have higher ratios of test phase months than the development projects.
Figure 8-1-3 Phase-Based Actual Month Ratio (Enhancement) Box-and-Whisker Plot
Table 8-1-4 Phase-Based Actual Month Ratio Basic Statistics (Enhancement)
(Unit: ratio)
Phase N Min P25 Med P75 Max Mean S.D.
Basic design 54 0.045 0.143 0.181 0.229 0.444 0.194 0.083
Detailed design 54 0.051 0.117 0.174 0.222 0.690 0.184 0.103
Implementation 54 0.050 0.168 0.272 0.352 0.643 0.275 0.129
Integration test 54 0.022 0.112 0.155 0.206 0.458 0.160 0.079
System test 54 0.027 0.101 0.158 0.260 0.471 0.187 0.114
8.1.3 Phase-Based Effort: Development
This section presents the ratio of actual effort spent in each of the major development phases of development projects. The first part describes the stratification criteria and analyzed data applied to the analysis in this section, and the second part presents the ratio of actual effort in box-and-whisker plots and basic statistics.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “a: Development”
• All data items for the major development phases' actual effort have positive values.
Analyzed data
• Actual effort (total person hours)_Basic design, Actual effort (total person hours)_Detailed design, Actual effort (total person hours)_Construction, Actual effort (total person hours)_Integration test, and Actual effort (total person hours)_System test
* All the above five data items are derived indicators holding each phase's actual person hours converted from the sum of actual in-house effort and actual outsourced effort.
The median values show that the effort spent in the construction phase takes up one-third or more of the entire effort.
Figure 8-1-5 Phase-Based Actual Effort Ratio (Development) Box-and-Whisker Plot
Table 8-1-6 Phase-Based Actual Effort Ratio Basic Statistics (Development)
(Unit: ratio)
Phase N Min P25 Med P75 Max Mean S.D.
Basic design 313 0.001 0.092 0.140 0.203 0.589 0.162 0.099
Detailed design 313 0.016 0.111 0.166 0.219 0.613 0.173 0.087
Implementation 313 0.018 0.268 0.350 0.455 0.847 0.375 0.156
Integration test 313 0.002 0.103 0.150 0.208 0.508 0.162 0.091
System test 313 0.000 0.058 0.116 0.177 0.564 0.128 0.091
8.1.4 Phase-Based Effort: Enhancement
This section presents the ratio of actual effort spent in each of the major development phases of enhancement projects. The first part describes the stratification criteria and analyzed data applied to the analysis in this section, and the second part presents the ratio of actual effort in box-and-whisker plots and basic statistics.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• All data items for the major development phases’ actual effort have positive values.
Analyzed data
• Actual effort (total person hours)_Basic design, Actual effort (total person hours)_Detailed design, Actual effort (total person hours)_Construction, Actual effort (total person hours)_Integration test, and Actual effort (total person hours)_System test
* All the above five data items are derived indicators holding each phase's actual person hours converted from the sum of actual in-house effort and actual outsourced effort.
The median values show that the effort spent in the construction phase takes up one-third of the entire effort. The enhancement projects have higher ratios of test phase effort than the development projects.
Figure 8-1-7 Phase-Based Actual Effort Ratio (Enhancement) Box-and-Whisker Plot
Table 8-1-8 Phase-Based Actual Effort Ratio Basic Statistics (Enhancement)
(Unit: ratio)
Phase N Min P25 Med P75 Max Mean S.D.
Basic design 215 0.005 0.094 0.140 0.197 0.557 0.152 0.088
Detailed design 215 0.012 0.104 0.158 0.213 0.567 0.169 0.087
Implementation 215 0.029 0.238 0.333 0.463 0.825 0.351 0.156
Integration test 215 0.007 0.117 0.172 0.228 0.680 0.182 0.097
System test 215 0.000 0.073 0.125 0.196 0.680 0.147 0.109
8.2 Number of Issues Pointed Out in Reviews
This section presents the analysis results of the number of issues pointed out in reviews. This analysis was made for the projects that stated that they went through the major development phases (from basic design to system test) by marking all the relevant phase-check data items with “ ”.
* The term “system test” used in figures and tables in this chapter refers to the system test done by the vendor.
8.2.1 Number of Issues Pointed Out in Reviews in Basic Design
Phase: All Project Types
This section presents the number of issues pointed out in reviews in the basic design phase per 1,000 FPs, per 1 KSLOCs, per 1,000 person hours, or per 160 person hours, which is referred to as the review-pointed-out issue density. Because of insufficient samples available for other development phases, this section does not present the review-pointed-out issue density in other phases.
The analysis presented in this section used the actual effort spent in each of the major development phases to calculate the review-pointed-out issue density with two bases: per 1,000 person hours and per 160 person hours.
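The two effort-based densities can be sketched as a single hypothetical helper; reading 160 person-hours as roughly one person-month is our interpretation, not something the source states:

```python
def review_issue_density(issues, actual_effort_person_hours, basis_hours):
    """Review-pointed-out issues per `basis_hours` of actual effort spent
    in the phase. This section uses basis_hours = 1000 and basis_hours = 160
    (160 person-hours is commonly treated as one person-month)."""
    if actual_effort_person_hours <= 0:
        raise ValueError("actual effort must be > 0")
    return issues * basis_hours / actual_effort_person_hours

# Hypothetical basic-design phase: 8 issues pointed out,
# 2,000 person-hours of actual effort
per_1000_ph = review_issue_density(8, 2000, 1000)  # 4.0 issues/1,000 person-hours
per_160_ph = review_issue_density(8, 2000, 160)    # 0.64 issues/160 person-hours
```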
The first part describes the stratification criteria and analyzed data applied to the analysis in this section, and the second part presents the basic statistics of review-pointed-out issue density in the basic design phase.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type has a defined value.
• 5249_Design_phase_review-pointed-out_issues_ (Basic design) has a valid value.
• For review-pointed-out issues per unit FP size: 5001_Actual_FP_size_ (unadjusted) > 0
• For review-pointed-out issues per unit SLOC size: Actual_net_SLOC_size > 0
• For review-pointed-out issues per unit effort: Actual effort (major development phases) > 0
Analyzed data
• 5249_Design_phase_review-pointed-out_issues_ (Basic design)
Table 8-2-1 Basic Design Review-Pointed-Out Issues per Unit FP Size Basic Statistics
(Unit: Issues/1,000 FPs)
N Min P25 Med P75 Max Mean S.D.
21 0.000 22.340 59.782 140.323 470.730 91.870 105.993

Table 8-2-2 Basic Design Review-Pointed-Out Issues per Unit SLOC Size Basic Statistics
(Unit: Issues/KSLOCs)
N Min P25 Med P75 Max Mean S.D.
102 0.000 0.166 0.758 2.739 69.659 2.778 7.679

Table 8-2-3 Basic Design Review-Pointed-Out Issues per Unit Effort Basic Statistics (1)
(Unit: Issues/1,000 person-hours)
N Min P25 Med P75 Max Mean S.D.
126 0.000 1.224 3.162 7.493 85.973 7.922 14.148

Table 8-2-4 Basic Design Review-Pointed-Out Issues per Unit Effort Basic Statistics (2)
(Unit: Issues/160 person-hours)
N Min P25 Med P75 Max Mean S.D.
126 0.000 0.196 0.506 1.199 13.756 1.268 2.264
8.3 Test-Phase-Based Test Cases and Identified Software Failures
This section presents the number of test cases used, the number of software failures identified, and the
number of software faults identified in the integration test or system test per unit size or unit effort. The analysis in this section was made for the projects that stated that they went through the major development phases (from basic design to system test) by marking all the relevant phase-check data items with “ ”. These projects make up the population similar to that analyzed in Section 8.1.
* The term “system test” used in figures and tables in this chapter refers to the system test done by the vendor.
8.3.1 FP-Size-Based Test Case Density and Software
Failure/Fault Density: All Project Types
This section presents the number of test cases per unit FP size, the number of identified software failures per unit FP size, and the number of identified software faults per unit FP size. The first part describes the stratification criteria and analyzed data applied to the analysis in this section, and the second part presents for each test phase the number of test cases per unit FP size, the number of software failures per unit FP size, and the number of software faults per unit FP size in box-and-whisker plots and basic statistics.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type has a defined value.
• 701_FP_measurement_method_ (actual) = Any value (Including “Unknown”)
• 5001_Actual_FP_size_ (unadjusted) > 0
Analyzed data
• Number of test cases (Data items 5251 and 5252)
• Number of identified software failures (Data items 5253 and 5254)
• Number of identified software faults (Data items 10098 and 10099)
The median values show that the test case density in the integration test phase is approximately three times that in the system test phase.
Test cases used in the system test phase are fewer in number than those in the integration test phase. The projects presenting both the number of identified software failures and the number of identified software faults are insufficient in number. The data shown below therefore cannot adequately explain the difference between these two numbers.
Figure 8-3-1 Test Cases and Identified Software Failures/Faults per Unit FP Size (All
Project Types) Box-and-Whisker Plot
Table 8-3-2 Test Cases and Identified Software Failures/Faults per Unit FP Size Basic Statistics
(Unit: Cases/1,000 FPs)
N Min P25 Med P75 Max Mean S.D.
Number of test cases for integration test/KFP 89 16.8 396.4 1,192.2 2,912.3 125,000.0 4,471.0 14,539.9
Number of test cases for system test/KFP 101 2.9 170.9 367.9 1,011.2 149,538.5 4,005.6 19,314.9
Number of failures for integration test/KFP 84 0.0 31.7 80.5 151.4 13,074.4 315.3 1,432.7
Number of failures for system test/KFP 96 0.0 8.5 30.9 101.3 6,537.2 187.8 781.1
Number of faults for integration test/KFP 77 0.0 24.6 50.6 100.4 558.5 89.4 113.2
Number of faults for system test/KFP 77 0.0 8.4 28.0 68.6 390.6 53.4 76.0
8.3.2 FP-Size-Based Test Case Density and Software Failure/Fault Density: Development
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “a: Development”
• 701_FP_measurement_method_ (actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_ (unadjusted) > 0
Analyzed data
• Number of test cases (Data items 5251 and 5252)
• Number of identified software failures (Data items 5253 and 5254)
• Number of identified software faults (Data items 10098 and 10099)
The median values show that the test case density in the integration test phase is four or more times that in the system test phase.
Test cases used in the system test phase are fewer in number than those in the integration test phase. The projects presenting both the number of identified software failures and the number of identified software faults are insufficient in number. The data shown below therefore cannot adequately explain the difference between these two numbers.
Figure 8-3-3 Test Cases and Identified Software Failures/Faults per Unit FP Size
(Development, IFPUG_Group) Box-and-Whisker Plot
Table 8-3-4 Test Cases and Identified Software Failures/Faults per Unit FP Size Basic
Statistics (Development, IFPUG_Group) (Unit: Cases/1,000 FPs)
N Min P25 Med P75 Max Mean S.D.
Number of test cases for integration test/KFP 32 16.8 749.2 1,759.2 2,517.1 13,296.3 2,216.2 2,532.1
Number of test cases for system test/KFP 33 16.8 292.9 418.7 1,529.6 12,069.9 1,412.0 2,350.7
Number of failures for integration test/KFP 22 0.0 28.8 77.7 132.8 558.5 116.2 139.0
Number of failures for system test/KFP 22 0.0 3.0 24.1 69.7 336.5 56.9 84.6
Number of faults for integration test/KFP 45 4.3 27.0 56.0 104.3 558.5 104.7 128.4
Number of faults for system test/KFP 44 0.0 7.9 20.9 66.0 390.6 55.9 87.9
8.3.3 FP-Size-Based Software Fault Density: Enhancement
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• 701_FP_measurement_method_ (actual) = “a: IFPUG”, “b: SPR”, or “d: NESMA estimated method”
• 5001_Actual_FP_size_ (unadjusted) > 0
Analyzed data
• Number of identified software faults (Data items 10098 and 10099)
The median values show that the enhancement projects have a higher density of software faults identified in the system test phase than the development projects.
Figure 8-3-5 Identified Software Faults per Unit FP Size (Enhancement, IFPUG_Group) Box-and-Whisker Plot
Table 8-3-6 Identified Software Faults per Unit FP Size Basic Statistics
(Enhancement, IFPUG_Group) (Unit: Cases/1,000 FPs)
N Min P25 Med P75 Max Mean S.D.
Number of faults for integration test/KFP 16 0.0 22.8 45.6 89.3 456.0 82.0 112.8
Number of faults for system test/KFP 20 0.0 10.9 36.2 72.8 230.8 58.4 68.4
8.3.4 SLOC-Size-Based Test Cases and Software Failure/Fault Density: All Project Types
This section presents the number of test cases per unit SLOC size, the number of identified software
failures per unit SLOC size, and the number of identified software faults per unit SLOC size. The first part describes the stratification criteria and analyzed data applied to the analysis in this section, and the second part presents for each test phase the number of test cases per unit SLOC size, the number of software failures per unit SLOC size, and the number of software faults per unit SLOC size in box-and-whisker plots and basic statistics.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type has a defined value.
• 312_Primary_programming_language_1, _2, or _3 = Any value (Including “Unknown”)
• Actual_net_SLOC_size > 0
Analyzed data
• Number of test cases (Data items 5251 and 5252)
• Number of identified software failures (Data items 5253 and 5254)
• Number of identified software faults (Data items 10098 and 10099)
The median values show that the test case density in the integration test phase is two or more times that in the system test phase. Test cases used in the system test phase are fewer in number than those in the integration test phase. The projects presenting both the number of identified software failures and the number of identified software faults are insufficient in number. The data shown below therefore cannot adequately explain the difference between these two numbers.
Figure 8-3-7 Test Cases and Identified Software Failures/Faults per Unit SLOC Size (All
Project Types) Box-and-Whisker Plot
Table 8-3-8 Test Cases and Identified Software Failures/Faults per Unit SLOC Size Basic
Statistics (All Project Types) (Unit: Cases/KSLOC)
N Min P25 Med P75 Max Mean S.D.
Number of test cases for integration test/KSLOC 229 0.088 8.927 23.612 60.317 1,964.000 55.675 147.221
Number of test cases for system test/KSLOC 298 0.019 3.303 10.148 29.338 604.000 37.094 78.158
Number of failures for integration test/KSLOC 234 0.000 0.415 1.257 2.325 47.833 2.491 5.437
Number of failures for system test/KSLOC 306 0.000 0.074 0.323 1.091 32.066 1.189 2.679
Number of faults for integration test/KSLOC 58 0.000 0.376 1.185 2.212 40.000 2.289 5.365
Number of faults for system test/KSLOC 78 0.000 0.004 0.132 0.458 32.066 1.181 4.123
8.3.5 SLOC-Size-Based Test Cases and Software Failure/Fault Density: Development
This section presents the SLOC-size-based test case density and the SLOC-size-based software failure or
fault density with respect to development projects, and then presents the basic statistics of these kinds of density for each of the major primary programming languages.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “a: Development”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
Analyzed data
• Number of test cases (Data items 5251 and 5252)
• Number of identified software failures (Data items 5253 and 5254)
• Number of identified software faults (Data items 10098 and 10099)
Figure 8-3-9 Test Cases and Identified Software Failures/Faults per Unit SLOC Size
(Development, Major_Primary_Programming_Language_Group) Box-and-Whisker Plot
Table 8-3-10 Primary-Programming-Language-Based Integration Test Cases per Unit SLOC Size Basic Statistics (Development)
(Unit: Cases/KSLOC)
Primary programming language N Min P25 Med P75 Max Mean S.D.
All project types 90 0.088 10.034 20.556 46.231 139.817 32.190 31.496
b : COBOL 25 0.365 5.565 13.225 46.921 117.644 28.320 30.365
g : C 11 5.030 11.140 24.328 59.982 124.357 41.008 38.824
h : VB 20 0.125 12.280 19.122 49.474 114.612 33.395 32.881
q : Java 34 0.088 10.044 25.274 40.457 139.817 31.474 29.780
Table 8-3-11 Primary-Programming-Language-Based System Test Cases per Unit SLOC
Size Basic Statistics (Development) (Unit: Cases/KSLOC)
Primary programming language N Min P25 Med P75 Max Mean S.D.
All project types 112 0.027 2.124 6.360 18.215 509.991 28.787 77.293
b : COBOL 31 0.097 1.149 5.599 20.481 287.827 20.219 51.394
g : C 17 0.027 2.326 5.255 10.630 219.094 20.129 51.920
h : VB 27 0.125 1.957 7.799 18.150 509.991 55.220 133.739
q : Java 37 0.068 2.212 5.405 16.462 152.985 20.654 37.587
Table 8-3-12 Primary-Programming-Language-Based Identified Software Failures per
Unit SLOC Size Integration Test Basic Statistics (Development) (Unit: Cases/KSLOC)
Primary programming language N Min P25 Med P75 Max Mean S.D.
All project types 92 0.000 0.660 1.461 2.556 16.289 2.070 2.299
b : COBOL 28 0.000 0.435 1.070 2.129 5.156 1.390 1.397
g : C 12 0.000 0.812 1.388 2.522 5.228 1.805 1.519
h : VB 20 0.052 0.508 1.395 4.778 16.289 3.310 4.116
q : Java 32 0.074 1.333 1.854 2.551 4.403 1.990 1.081
Table 8-3-13 Primary-Programming-Language-Based Identified Software Failures per
Unit SLOC Size System Test Basic Statistics (Development) (Unit: Cases/KSLOC)
Primary programming language N Min P25 Med P75 Max Mean S.D.
All project types 122 0.000 0.134 0.480 1.350 9.611 1.067 1.578
b : COBOL 36 0.000 0.036 0.362 0.972 3.210 0.636 0.784
g : C 19 0.000 0.150 0.568 1.244 4.058 0.949 1.136
h : VB 28 0.000 0.147 0.468 1.360 6.487 1.061 1.488
q : Java 39 0.000 0.126 0.797 1.994 9.611 1.526 2.185
Table 8-3-14 Primary-Programming-Language-Based Identified Software Faults per Unit
SLOC Size Integration Test Basic Statistics (Development) (Unit: Cases/KSLOC)
Primary programming language N Min P25 Med P75 Max Mean S.D.
Primary_programming_language_group 17 0.250 1.255 1.467 3.942 9.872 2.651 2.445
Table 8-3-15 Primary-Programming-Language-Based Identified Software Faults per Unit
SLOC Size System Test Basic Statistics (Development) (Unit: Cases/KSLOC)
Primary programming language N Min P25 Med P75 Max Mean S.D.
Primary_programming_language_group 25 0.000 0.043 0.145 0.439 6.487 0.754 1.590
8.3.6 SLOC-Size-Based Test Cases and Identified Software Failures/Faults: Enhancement
This section presents the SLOC-size-based test case density and the SLOC-size-based software failure or
fault density with respect to enhancement projects, and then presents the basic statistics of these kinds of density for each of the four major primary programming languages.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Any of the three data items, 312_Primary_programming_language_1, _2, and _3, equals “b: COBOL”, “g: C”, “h: VB”, or “q: Java”.
• Actual_net_SLOC_size > 0
Analyzed data
• Number of test cases (Data items 5251 and 5252)
• Number of identified software failures (Data items 5253 and 5254)
• Number of identified software faults (Data items 10098 and 10099)
The median values show that the test case density in the integration test phase is two or more times that in the system test phase.
Figure 8-3-16 Test Cases and Identified Software Defects/Faults per Unit SLOC Size
(Enhancement, Major_Primary_Programming_Language_Group) Box-and-Whisker Plot
[Box-and-whisker plots: left panel, number of test cases per unit SLOC size (Enhancement) for integration test/KSLOC and system test/KSLOC (scale 0–160 Cases/KSLOC); right panel, number of identified software failures and faults per unit SLOC size (Enhancement) for integration test/KSLOC and system test/KSLOC (scale 0–8 Cases/KSLOC).]
8. Development-Phase-Based Analysis
Table 8-3-17 Primary-Programming-Language-Based Integration Test Cases per Unit SLOC Size Basic Statistics (Enhancement) (Unit: Cases/KSLOC)

Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 88 | 0.321 | 7.152 | 22.889 | 60.471 | 1,964.000 | 72.660 | 221.701
b: COBOL | 31 | 0.321 | 3.829 | 15.000 | 25.808 | 82.438 | 21.185 | 22.353
g: C | 19 | 3.632 | 6.818 | 22.167 | 77.630 | 314.680 | 74.224 | 103.790
h: VB | 12 | 1.445 | 7.249 | 28.146 | 91.147 | 631.064 | 92.243 | 175.367
q: Java | 26 | 3.774 | 9.187 | 40.542 | 84.888 | 1,964.000 | 123.854 | 378.102
Table 8-3-18 Primary-Programming-Language-Based System Test Cases per Unit SLOC Size Basic Statistics (Enhancement) (Unit: Cases/KSLOC)

Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 111 | 0.019 | 3.527 | 10.295 | 30.643 | 604.000 | 38.246 | 81.973
b: COBOL | 42 | 0.019 | 0.685 | 6.350 | 17.914 | 82.875 | 13.414 | 17.655
g: C | 27 | 1.250 | 7.384 | 20.277 | 84.027 | 316.628 | 58.106 | 79.036
h: VB | 16 | 0.723 | 4.836 | 18.866 | 36.700 | 215.700 | 46.684 | 70.695
q: Java | 26 | 0.115 | 1.609 | 8.492 | 18.281 | 604.000 | 52.542 | 133.564
Table 8-3-19 Primary-Programming-Language-Based Identified Software Failures per Unit SLOC Size Integration Test Basic Statistics (Enhancement) (Unit: Cases/KSLOC)

Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 89 | 0.000 | 0.334 | 1.099 | 2.128 | 42.358 | 2.919 | 6.762
b: COBOL | 33 | 0.000 | 0.109 | 0.455 | 2.000 | 5.226 | 1.137 | 1.380
g: C | 19 | 0.067 | 0.292 | 1.333 | 2.625 | 42.358 | 6.358 | 12.783
h: VB | 12 | 0.181 | 0.758 | 0.884 | 3.075 | 10.946 | 2.771 | 3.524
q: Java | 25 | 0.000 | 0.820 | 1.542 | 2.075 | 24.000 | 2.728 | 4.805
Table 8-3-20 Primary-Programming-Language-Based Identified Software Failures per Unit SLOC Size System Test Basic Statistics (Enhancement) (Unit: Cases/KSLOC)

Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 111 | 0.000 | 0.041 | 0.196 | 0.763 | 13.333 | 1.119 | 2.377
b: COBOL | 45 | 0.000 | 0.025 | 0.091 | 0.567 | 12.222 | 0.902 | 2.304
g: C | 25 | 0.000 | 0.188 | 0.418 | 1.074 | 13.333 | 1.566 | 3.006
h: VB | 15 | 0.036 | 0.160 | 0.487 | 1.604 | 4.917 | 1.175 | 1.559
q: Java | 26 | 0.000 | 0.000 | 0.104 | 0.455 | 10.000 | 1.033 | 2.276
Table 8-3-21 Primary-Programming-Language-Based Identified Software Faults per Unit SLOC Size Integration Test Basic Statistics (Enhancement) (Unit: Cases/KSLOC)

Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
Primary_programming_language_group | 22 | 0.000 | 0.217 | 0.649 | 1.779 | 40.000 | 2.831 | 8.387
Table 8-3-22 Primary-Programming-Language-Based Identified Software Faults per Unit SLOC Size System Test Basic Statistics (Enhancement) (Unit: Cases/KSLOC)

Primary programming language | N | Min | P25 | Med | P75 | Max | Mean | S.D.
Primary_programming_language_group | 28 | 0.000 | 0.000 | 0.081 | 0.306 | 13.333 | 0.746 | 2.518
8.3.7 Effort-Based Test Cases and Identified Software Failures/Faults: All Project Types
This section presents the number of test cases, the number of identified software failures, and the number of identified software faults per unit effort. The analysis used the actual effort spent in each of the major development phases to calculate the test case density and the software failure or fault density on two bases: per 1,000 person-hours and per 160 person-hours. The first part describes the stratification criteria and analyzed data applied in this section, and the second part presents the test case density and the software failure or fault density in box-and-whisker plots and basic statistics.
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type has a defined value.
• Actual effort (major development phases) > 0

Analyzed data
• Number of test cases (Data items 5251 and 5252)
• Number of identified software failures (Data items 5253 and 5254)
• Number of identified software faults (Data items 10098 and 10099)
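The two effort bases used below differ only by a constant factor: a density per 160 person-hours is 0.16 times the same density per 1,000 person-hours. A sketch; the function names are ours, not the White Paper's:

```python
def per_1000_person_hours(count, effort_hours):
    # Density on the per-1,000-person-hour basis of the "(1)" tables.
    return count * 1000.0 / effort_hours

def per_160_person_hours(count, effort_hours):
    # Density on the per-160-person-hour basis of the "(2)" tables.
    return count * 160.0 / effort_hours

# 53 failures identified over 4,000 person-hours of testing effort:
d1000 = per_1000_person_hours(53, 4_000)  # 13.25 cases per 1,000 person-hours
d160 = per_160_person_hours(53, 4_000)    # the same density per 160 person-hours
```
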
The median values show that the test case density in the integration test phase is at least twice that in the system test phase.
Figure 8-3-23 Test Cases and Identified Software Failures/Faults per Unit Effort (All Project Types) Box-and-Whisker Plot
Table 8-3-24 Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (All Project Types) (1) (Unit: Cases/1,000 person-hours)

Indicator | N | Min | P25 | Med | P75 | Max | Mean | S.D.
Number of test cases for integration test/effort | 329 | 0.3 | 53.8 | 132.5 | 348.4 | 11,389.3 | 330.7 | 783.0
Number of test cases for system test/effort | 433 | 0.1 | 19.1 | 54.4 | 186.4 | 12,511.1 | 277.8 | 949.6
Number of failures for integration test/effort | 325 | 0.0 | 2.7 | 7.4 | 16.3 | 1,191.3 | 18.4 | 70.8
Number of failures for system test/effort | 436 | 0.0 | 0.5 | 2.0 | 6.5 | 736.8 | 10.5 | 48.2
Number of faults for integration test/effort | 146 | 0.0 | 3.7 | 9.0 | 16.7 | 84.8 | 12.8 | 13.7
Number of faults for system test/effort | 180 | 0.0 | 0.3 | 2.1 | 7.3 | 230.8 | 7.5 | 19.7
Table 8-3-25 Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (All Project Types) (2) (Unit: Cases/160 person-hours)

Indicator | N | Min | P25 | Med | P75 | Max | Mean | S.D.
Number of test cases for integration test/effort | 329 | 0.0 | 8.6 | 21.2 | 55.8 | 1,822.3 | 52.9 | 125.3
Number of test cases for system test/effort | 433 | 0.0 | 3.1 | 8.7 | 29.8 | 2,001.8 | 44.5 | 151.9
Number of failures for integration test/effort | 325 | 0.0 | 0.4 | 1.2 | 2.6 | 190.6 | 2.9 | 11.3
Number of failures for system test/effort | 436 | 0.0 | 0.1 | 0.3 | 1.0 | 117.9 | 1.7 | 7.7
Number of faults for integration test/effort | 146 | 0.0 | 0.6 | 1.4 | 2.7 | 13.6 | 2.0 | 2.2
Number of faults for system test/effort | 180 | 0.0 | 0.1 | 0.3 | 1.2 | 36.9 | 1.2 | 3.2
[Box-and-whisker plots: left panel, number of test cases per unit effort for integration test/K person-hours and system test/K person-hours (scale 0–1,200 Cases/K person-hours); right panel, number of identified software failures/faults per unit effort for integration test and system test (scale 0–50 Cases/K person-hours).]
8.3.8 Effort-Based Test Cases and Identified Software Failures/Faults: Development
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “a: Development”
• Actual effort (major development phases) > 0

Analyzed data
• Number of test cases (Data items 5251 and 5252)
• Number of identified software failures (Data items 5253 and 5254)
• Number of identified software faults (Data items 10098 and 10099)
The median values show that the test case density in the integration test phase is approximately three times that in the system test phase, and that the software failure or fault density in the integration test phase is likewise approximately three times that in the system test phase.
Figure 8-3-26 Test Cases and Identified Software Failures/Faults per Unit Effort (Development) Box-and-Whisker Plot
Table 8-3-27 Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (Development) (1) (Unit: Cases/1,000 person-hours)

Indicator | N | Min | P25 | Med | P75 | Max | Mean | S.D.
Number of test cases for integration test/effort | 185 | 0.3 | 53.5 | 123.0 | 326.1 | 11,389.3 | 318.7 | 892.8
Number of test cases for system test/effort | 231 | 0.2 | 18.6 | 44.0 | 169.1 | 12,511.1 | 282.7 | 1,155.3
Number of failures for integration test/effort | 182 | 0.0 | 4.0 | 8.4 | 17.3 | 1,191.3 | 22.8 | 91.6
Number of failures for system test/effort | 234 | 0.0 | 1.2 | 3.2 | 8.9 | 736.8 | 15.7 | 64.5
Number of faults for integration test/effort | 73 | 0.0 | 5.0 | 10.0 | 17.2 | 84.8 | 14.3 | 15.0
Number of faults for system test/effort | 83 | 0.0 | 0.7 | 2.9 | 9.4 | 230.8 | 10.7 | 27.7
Table 8-3-28 Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (Development) (2) (Unit: Cases/160 person-hours)

Indicator | N | Min | P25 | Med | P75 | Max | Mean | S.D.
Number of test cases for integration test/effort | 185 | 0.0 | 8.6 | 19.7 | 52.2 | 1,822.3 | 51.0 | 142.9
Number of test cases for system test/effort | 231 | 0.0 | 3.0 | 7.0 | 27.1 | 2,001.8 | 45.2 | 184.9
Number of failures for integration test/effort | 182 | 0.0 | 0.6 | 1.3 | 2.8 | 190.6 | 3.7 | 14.7
Number of failures for system test/effort | 234 | 0.0 | 0.2 | 0.5 | 1.4 | 117.9 | 2.5 | 10.3
Number of faults for integration test/effort | 73 | 0.0 | 0.8 | 1.6 | 2.7 | 13.6 | 2.3 | 2.4
Number of faults for system test/effort | 83 | 0.0 | 0.1 | 0.5 | 1.5 | 36.9 | 1.7 | 4.4
[Box-and-whisker plots: left panel, number of test cases per unit effort (Development) for integration test/K person-hours and system test/K person-hours (scale 0–1,200 Cases/K person-hours); right panel, number of identified software failures/faults per unit effort (Development) for integration test and system test (scale 0–50 Cases/K person-hours).]
8.3.9 Effort-Based Test Cases and Identified Software Failures/Faults: Enhancement
Stratification criteria
• All phase-check data items for the major development phases = Marked with “ ”
• 103_Project_type = “b: Maintenance/Support” or “d: Enhancement”
• Actual effort (major development phases) > 0

Analyzed data
• Number of test cases (Data items 5251 and 5252)
• Number of identified software failures (Data items 5253 and 5254)
• Number of identified software faults (Data items 10098 and 10099)
The median values show that the test case density in the integration test phase is slightly under twice that in the system test phase, and that the software failure or fault density in the integration test phase is approximately five times that in the system test phase.
Figure 8-3-29 Test Cases and Identified Software Failures/Faults per Unit Effort (Enhancement) Box-and-Whisker Plot
Table 8-3-30 Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (Enhancement) (1) (Unit: Cases/1,000 person-hours)

Indicator | N | Min | P25 | Med | P75 | Max | Mean | S.D.
Number of test cases for integration test/effort | 133 | 0.4 | 51.1 | 138.5 | 347.0 | 5,399.9 | 332.0 | 626.1
Number of test cases for system test/effort | 190 | 0.1 | 18.5 | 77.2 | 215.7 | 6,586.0 | 274.0 | 656.2
Number of failures for integration test/effort | 134 | 0.0 | 2.1 | 5.5 | 12.2 | 234.0 | 11.8 | 23.6
Number of failures for system test/effort | 191 | 0.0 | 0.2 | 1.1 | 4.1 | 120.9 | 4.2 | 10.6
Number of faults for integration test/effort | 64 | 0.0 | 2.7 | 6.6 | 16.3 | 58.8 | 11.4 | 12.1
Number of faults for system test/effort | 88 | 0.0 | 0.1 | 1.4 | 6.4 | 40.9 | 5.0 | 7.5
Table 8-3-31 Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (Enhancement) (2) (Unit: Cases/160 person-hours)

Indicator | N | Min | P25 | Med | P75 | Max | Mean | S.D.
Number of test cases for integration test/effort | 133 | 0.1 | 8.2 | 22.2 | 55.5 | 864.0 | 53.1 | 100.2
Number of test cases for system test/effort | 190 | 0.0 | 3.0 | 12.4 | 34.5 | 1,053.8 | 43.8 | 105.0
Number of failures for integration test/effort | 134 | 0.0 | 0.3 | 0.9 | 1.9 | 37.4 | 1.9 | 3.8
Number of failures for system test/effort | 191 | 0.0 | 0.0 | 0.2 | 0.7 | 19.3 | 0.7 | 1.7
Number of faults for integration test/effort | 64 | 0.0 | 0.4 | 1.0 | 2.6 | 9.4 | 1.8 | 1.9
Number of faults for system test/effort | 88 | 0.0 | 0.0 | 0.2 | 1.0 | 6.5 | 0.8 | 1.2
[Box-and-whisker plots: left panel, number of test cases per unit effort (Enhancement) for integration test/K person-hours and system test/K person-hours (scale 0–1,200 Cases/K person-hours); right panel, number of identified software failures/faults per unit effort (Enhancement) for integration test and system test (scale 0–50 Cases/K person-hours).]
9 Estimates-Results Analysis and Productivity Cross-Analysis
9.1 Estimates-Results Analysis
This section presents analyses of the differences between project estimates and the associated actual results with respect to size, effort, development schedule, and other factors. Some of the factors are stratified by project type to examine project-type-dependent trends.

The analyses in this section used data item values predicted at the “beginning of basic design” phase as the estimates and those taken “after project completion” as the actual results.
This section analyzes the difference between estimates and results in terms of estimation error, the percentage difference of the actual result from its estimate, that is, {(Actual result - Estimate) ÷ Estimate} × 100. Variations in the estimation error of the major factors, taken from the range between P25 and P75, are: 0 to +15% (size), -1 to +28% (effort), and 0 to +6% (development schedule).
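The estimation-error formula above can be computed directly; a one-line helper, ours for illustration:

```python
def estimation_error_pct(actual, estimate):
    # {(Actual result - Estimate) / Estimate} x 100
    return (actual - estimate) / estimate * 100.0

# A project estimated at 8,000 person-hours that actually took 10,000
# has an estimation error of +25%, near the P75 effort bound of +28%.
overrun = estimation_error_pct(10_000, 8_000)  # → 25.0
```
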
The results presented in this section seem to imply a trend that the project size grows as the system specifications are broken down into finer detail over the course of development. The results also show that relatively few projects exceed their planned development schedule. This may be attributed to stringent business constraints that forced projects to keep to the agreed schedule at all costs, or to increases in project work force that absorbed the growth in project size.
Note that the above observations apply only to general aspects of estimation error. When evaluating the estimation error of size, effort, or development schedule in the planning phase of a particular project, always take the characteristics specific to that project into account.

Figure 9-1-1 Overgrowth of Size, Effort, and Development Schedule
(Concept of Estimation Error)
[Diagram: for effort, duration, and size, the actual value is shown as the planned value plus an excess.]
9.1.1 Estimates-Results Analysis of Size
This section presents the analysis of size estimation error of the projects that provide their planned FP size determined in the basic design phase and their actual FP size. Figure 9-1-2 shows the distribution of planned size and actual size, and Table 9-1-3 shows the percentage of increase from the planned size to actual size, that is, {(Actual size - Planned size) ÷ Planned size} × 100.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type has a defined value. (Any project type)
• 701_FP_measurement_method (actual) has a defined value.
• 5001_Actual_FP_size_ (unadjusted) > 0
• Actual FP size_unadjusted (planned) > 0

Analyzed data
• X-axis: FP size_unadjusted (planned) determined in the basic design phase
• Y-axis: FP size_unadjusted (actual)
The green diagonal line (y = x) in Figure 9-1-2 indicates that projects under the line have actual FP sizes within their planned sizes, while projects above the line exceed their planned sizes. As a reference, Figure 9-1-2 also shows the regression line (orange) obtained from the samples of planned and actual FP sizes.
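A reference regression line of this kind can be reproduced with an ordinary least-squares fit. Whether the White Paper used plain OLS is not stated, so treat this as a sketch with hypothetical data:

```python
def ols_line(xs, ys):
    # Ordinary least-squares fit y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical planned vs. actual FP sizes; points with y > x lie above the diagonal.
planned = [400.0, 900.0, 2_000.0, 5_000.0]
actual = [420.0, 950.0, 2_300.0, 6_100.0]
slope, intercept = ols_line(planned, actual)
exceeded = sum(y > x for x, y in zip(planned, actual))  # projects above the y = x line
```
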
Table 9-1-3 shows that the actual FP sizes differ from the planned FP sizes by 0 to +15% (in the range between “P25” and “P75”).
The graph shows that many of the “development” projects have large FP sizes and that the ratio of development projects whose actual sizes exceed their planned sizes is higher than that of other project types. The actual sizes of the “maintenance/support” projects and “enhancement” projects do not significantly exceed their planned sizes. The trend presented here seems to support a common view that development projects and large projects have difficulties in size estimation and tend to result in an actual size larger than the planned size.
Figure 9-1-2 Planned FP Size and Actual FP Size
[Scatter plot (N=80): planned FP size at the beginning of basic design (x-axis, 0–14,000) vs. actual FP size (y-axis, 0–14,000), with points classified by project type (a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement).]
Table 9-1-3 FP Size Estimation Error Percentage

N | P10 | P25 | Med | P75 | P90
80 | -0.159 | 0.000 | 0.005 | 0.149 | 0.613
Figure 9-1-4 FP Size Estimation Error Percentage Distribution
9.1.2 Estimates-Results Analysis of Effort
This section presents the analysis of effort estimation error of the projects that provide their planned effort determined in the basic design phase and their actual effort. The analysis in this section excludes projects that do not provide planned effort determined in the basic design phase.
Figure 9-1-5 shows the distribution of planned effort and actual effort, and Table 9-1-6 shows the percentage of increase from the planned effort to actual effort, that is, {(Actual effort - Planned effort) ÷ Planned effort} × 100.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type has a defined value. (Any project type)
• Planned effort (whole project) determined in the basic design phase > 0
• Actual_effort_ (whole_project) > 0

Analyzed data
• X-axis: Whole-project effort_planned determined in the basic design phase (in person-hours)
• Y-axis: Whole-project effort_actual (in person-hours)
The green diagonal line in Figure 9-1-5 (y = x) indicates that effort values of the projects under the line are
within their planned effort values, and that the effort values of the projects above the line exceed their planned effort values. Figure 9-1-5 shows as a reference the regression line (orange) obtained from the samples of the planned effort and actual effort.
Table 9-1-6 shows that the actual effort values differ from the planned effort values by -1 to +28% (in the range between “P25” and “P75”).
The graph shows that almost all large projects are of the “development” or “maintenance/support” type and that many of these projects have actual effort values exceeding their planned effort values. The actual effort values of “redevelopment” projects exceed their planned effort values by a small amount.
[Histogram for Figure 9-1-4: number of projects vs. the ratio of FP size excess to planned FP size, split into within/exceeding planned size, with bins up to “over 1.2”.]
Figure 9-1-5 Planned Effort and Actual Effort
Table 9-1-6 Effort Estimation Error Percentage

N | P10 | P25 | Med | P75 | P90
433 | -0.167 | -0.018 | 0.030 | 0.278 | 0.823
Figure 9-1-7 Effort Estimation Error Percentage Distribution
9.1.3 Estimates-Results Analysis of Development Schedule
This section presents the analysis of development-schedule estimation error for the projects that provide both their planned months determined in the basic design phase and their actual months, i.e., both the planned and actual development schedule for the major development phases. The population analyzed in this section does not match the projects whose size or effort is analyzed in Section 9.1.1 or Section 9.1.2. The reason for this mismatch is that not every project provides both the planned and actual values of size, effort, or development schedule, and the estimates-results analyses exclude every project that lacks either the x value or the y value (by leaving the associated data entry field blank).
Figure 9-1-8 shows the distribution of planned months and actual months, and Table 9-1-9 shows the percentage of increase from the planned months to actual months, that is, {(Actual months - Planned months) ÷ Planned months} × 100.
[Figure 9-1-5: scatter plot of planned effort (overall project, at the beginning of basic design) vs. actual effort (overall project), with points classified by project type (a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement). Figure 9-1-7: histogram of the number of projects vs. the ratio of effort excess to planned effort, split into within/exceeding planned effort, with bins up to “over 3.0”.]
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type has a defined value. (Any project type)
• Major development phases months_actual > 0
• Major development phases months_planned > 0

Analyzed data
• X-axis: Major development phases months_planned
• Y-axis: Major development phases months_actual
The green diagonal line of Figure 9-1-8 (y = x) indicates that development schedule values of the projects under the line are within their planned values, and that the development schedule values of the projects above the line exceed their planned values. Figure 9-1-8 shows as a reference the regression line (orange) obtained from the samples of the planned development schedule and actual development schedule.
Table 9-1-9 shows that the actual months differ from the planned months by 0 to +6% (in the range between “P25” and “P75”).
Among the projects with a planned development schedule of one year or less, some “development”, “maintenance/support”, and “redevelopment” projects show large schedule overruns. Many of the “redevelopment” projects exceed their planned months, though only by a small amount; few redevelopment projects finish within their planned schedule.
Figure 9-1-8 Planned Development Schedule and Actual Development Schedule
[Scatter plot (N=324): planned duration in months (x-axis, 0–35) vs. actual duration in months (y-axis, 0–35).]
Table 9-1-9 Development Schedule Estimation Error Percentage

N | P10 | P25 | Med | P75 | P90
324 | -0.085 | 0.000 | 0.000 | 0.061 | 0.269
Figure 9-1-10 Development Schedule Estimation Error Percentage Distribution
[Histogram: number of projects vs. the ratio of schedule excess to planned schedule, split into within/exceeding planned schedule, with bins up to “over 0.60”; legend: a: Development, b: Maintenance/support, c: Redevelopment, d: Enhancement.]
9.2 Productivity Analysis
This section presents analyses of FP size and FP_productivity from a newer viewpoint than that used in the FP_productivity analysis in Chapter 6. The analyses combine two or more characteristics to examine productivity in relation to the system size, the project team size, the clearness of requirements specifications, and the level of reliability requirements.

9.2.1 FP_productivity: FP-Size-Based and Stratified by Industry Type
This section presents the FP_productivity based on the combination of the FP size and industry type of the development projects that measured their FP sizes.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• Any of the three data items, 201_Industry_type_1, _2, and _3 (major type), equals “F: Manufacturing”, “H: Information and communications”, “K: Finance and insurance”, “J: Wholesale/retail trade”, or “R: Government”.
• 701_FP_measurement_method (actual) has a defined value.
• 5001_Actual_FP_size_ (unadjusted) > 0
• Actual_effort (major_development_phases) > 0

Analyzed data
• X-axis: FP size_unadjusted (actual)
• Y-axis: FP_productivity (FP / major development phases) (Derived indicator) [FPs/person-hour]
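FP_productivity, the derived indicator on the y-axis, is the FP size divided by the actual effort of the major development phases; Table 9-2-2 below restates Table 9-2-1 on a per-160-person-hour basis, i.e. multiplied by 160. A sketch with function names of our own:

```python
def fp_productivity(fp_size, effort_hours):
    # FPs delivered per person-hour of major-development-phase effort (Table 9-2-1 units).
    return fp_size / effort_hours

def fp_productivity_160(fp_size, effort_hours):
    # The same indicator in FPs per 160 person-hours (Table 9-2-2 units).
    return fp_size * 160.0 / effort_hours

# 800 FPs delivered with 8,000 person-hours of effort:
p = fp_productivity(800, 8_000)         # 0.1 FPs/person-hour
p160 = fp_productivity_160(800, 8_000)  # 16.0 FPs/160 person-hours
```

The medians in Tables 9-2-1 and 9-2-2 (0.100 vs. 15.96 for all project types) are consistent with this factor-of-160 relationship up to rounding.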
The results presented in this section show an obvious trend that the productivity of “manufacturing” projects decreases as the size increases. The “manufacturing” projects also show large variations in productivity.

The “finance and insurance” projects have relatively small variations in productivity and show a weak trend, as with “manufacturing”, of productivity decreasing as size increases.

The productivity differences among industry types are attributable to various factors, such as the characteristics of the developed systems and the applied technologies. Note that the results presented in this section provide no basis for judging which industry type is more productive than another.
Table 9-2-1 Size-Based FP_productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods) (FPs / Person Hours) (Unit: FPs/person-hour)

FP size | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 227 | 0.013 | 0.054 | 0.100 | 0.185 | 0.837 | 0.141 | 0.131
Less than 400 FPs | 81 | 0.015 | 0.078 | 0.106 | 0.195 | 0.658 | 0.160 | 0.133
400 FPs or more and less than 1,000 FPs | 69 | 0.014 | 0.058 | 0.103 | 0.183 | 0.837 | 0.141 | 0.129
1,000 FPs or more and less than 3,000 FPs | 58 | 0.013 | 0.044 | 0.068 | 0.212 | 0.739 | 0.136 | 0.141
3,000 FPs or more | 19 | 0.018 | 0.036 | 0.052 | 0.101 | 0.312 | 0.074 | 0.068

Table 9-2-2 Size-Based FP_productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods) (FPs / 160 Person Hours) (Unit: FPs/160 person-hours)

FP size | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 227 | 2.13 | 8.72 | 15.96 | 29.53 | 133.88 | 22.58 | 20.96
Less than 400 FPs | 81 | 2.37 | 12.50 | 16.92 | 31.16 | 105.29 | 25.63 | 21.35
400 FPs or more and less than 1,000 FPs | 69 | 2.30 | 9.22 | 16.47 | 29.20 | 133.88 | 22.58 | 20.57
1,000 FPs or more and less than 3,000 FPs | 58 | 2.13 | 7.09 | 10.81 | 33.85 | 118.21 | 21.83 | 22.56
3,000 FPs or more | 19 | 2.94 | 5.82 | 8.33 | 16.08 | 49.89 | 11.90 | 10.82
Figure 9-2-3 Size-Based, Industry-Type-Based FP_productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
9.2.2 FP_productivity: FP-Size-Based and Stratified by Project Team Size

This section presents the FP_productivity of the development projects that measured their FP sizes, based on the combination of the FP size and the number_of_staff_per_month.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 701_FP_measurement_method (actual) has a defined value.
• 5001_Actual_FP_size_ (unadjusted) > 0
• Actual_months_ (major_development_phases) > 0
• Actual_effort (major_development_phases) > 0

Analyzed data
• Y-axis: FP_productivity (FP / major development phases) (Derived indicator) [FPs/person-hour]
• Number_of_staff_per_month (Derived indicator)
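The White Paper does not spell out how Number_of_staff_per_month is derived; one plausible reading, assuming the 160-person-hour month used elsewhere in this chapter, is the average staffing level, i.e. total effort spread over the schedule:

```python
def staff_per_month(effort_hours, duration_months, hours_per_month=160.0):
    # Average team size under the assumed 160-hour person-month.
    return effort_hours / (duration_months * hours_per_month)

# 9,600 person-hours over 6 months averages 10 staff, which would fall in
# the "10 or more" stratum of Figure 9-2-4.
team = staff_per_month(9_600, 6)  # → 10.0
```
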
The medium-to-large projects with 10 or more staff exhibit an obvious trend of decreasing productivity as size increases. In contrast, the projects with fewer than 10 staff show no such trend in the current data: up to a certain project size, they seem able to maintain or even increase their productivity. We will continue to watch this aspect of FP_productivity.
[Box-and-whisker plots: FP_productivity (y-axis, 0.00–0.60) by FP size class (0: less than 400 FPs; 1: 400 FPs or more and less than 1,000 FPs; 2: 1,000 FPs or more and less than 3,000 FPs; 3: 3,000 FPs or more) for each industry type: F: Manufacturing, H: Information and communications, J: Wholesale/retail trade, K: Finance and insurance, R: Government.]
Figure 9-2-4 Team-Size-Based FP_productivity (Development, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
9.2.3 FP Size and FP_productivity: Stratified by Clearness of Requirements Specifications

This section presents the relationship of the clearness of requirements specifications to the FP size and to the FP_productivity of the development projects that measured their FP sizes.
Stratification criteria
• Projects that went through the major development phases
• 103_Project_type = “a: Development”
• 501_Clearness_of_requirements_specifications has a valid value.
• 701_FP_measurement_method (actual) has a defined value.
• 5001_Actual_FP_size_ (unadjusted) > 0
• Actual_effort (major_development_phases) > 0

Analyzed data
• X-axis: FP size_unadjusted (actual)
• Y-axis: FP_productivity (FP / major development phases) (Derived indicator) [FPs/person-hour]
Many of the projects with “very clear” or “clear” requirements specifications are of small to medium size; few large projects have such specifications. The median FP sizes show that projects with “ambiguous” or “very ambiguous” requirements specifications are larger than those with clear specifications, indicating that large projects tend to have delays in completing their requirements specifications.

Among projects of 1,000 FPs or less, many projects with “clear” requirements specifications have high productivity, while projects with “ambiguous” specifications have low productivity on average. That is, projects with clear requirements specifications tend to have high productivity.
[Box-and-whisker plots: FP_productivity (y-axis, 0.00–0.80) by FP size class (less than 400 FPs; 400 FPs or more and less than 1,000 FPs; 1,000 FPs or more and less than 3,000 FPs; 3,000 FPs or more), separately for teams of less than 10 staff and 10 or more staff.]
Figure 9-2-5 Requirements-Specifications-Clearness-Based Distribution of FP Size and FP_productivity (Development, Mixed_FP_Measurement_Methods)
[Scatter plot (N=121): Actual_FP_Size_ (unadjusted) (x-axis, 0–12,000) vs. FP_productivity (FPs / major development phases) (y-axis, 0.0–0.8), with points classified by clearness of requirements specifications (a: Very clear, b: Clear, c: Ambiguous, d: Very ambiguous).]
Table 9-2-6 Requirements-Specifications-Clearness-Based Effort Basic Statistics (Development, Mixed_FP_Measurement_Methods) (Unit: person-hours)

Clearness of user requirements specifications | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 121 | 62 | 3,317 | 12,540 | 29,205 | 285,417 | 30,372 | 47,331
a: Very clear | 11 | 450 | 8,816 | 19,767 | 50,308 | 95,370 | 31,805 | 30,952
b: Clear | 66 | 62 | 2,661 | 11,633 | 32,810 | 285,417 | 33,528 | 55,095
c: Ambiguous | 35 | 774 | 4,687 | 12,788 | 25,318 | 142,725 | 21,020 | 26,492
d: Very ambiguous | 9 | 1,035 | 4,980 | 16,210 | 27,885 | 188,034 | 41,846 | 64,372
Table 9-2-7 Requirements-Specifications-Clearness-Based FP Size Basic Statistics (Development, Mixed_FP_Measurement_Methods) (Unit: FPs)

Clearness of user requirements specifications | N | Min | P25 | Med | P75 | Max | Mean | S.D.
All project types | 121 | 25 | 495 | 934 | 1,996 | 14,545 | 1,788 | 2,359
a: Very clear | 11 | 175 | 456 | 1,001 | 2,626 | 4,042 | 1,538 | 1,436
b: Clear | 66 | 25 | 467 | 761 | 2,116 | 14,545 | 1,859 | 2,677
c: Ambiguous | 35 | 194 | 691 | 970 | 1,886 | 5,038 | 1,470 | 1,205
d: Very ambiguous | 9 | 245 | 754 | 1,279 | 1,654 | 11,670 | 2,812 | 3,837
9.2.4 FP Size and FP_productivity: Stratified by Level of Reliability Requirements

This section presents the relationship of the level of reliability requirements to the FP size and to the FP_productivity.
Stratification criteria • Projects that went through the major development phases • 103_Project_type = “a: Development” • 514_Level_of_requirements (Reliability) has a valid
value. • 701_FP_measurement_method (actual) has a defined
value. • 5001_Actual_FP_size_ (unadjusted) > 0 • Actual_effort (major_development_phases) > 0
Analyzed data • X-axis: FP size_unadjusted (actual) • Y-axis: FP_productivity (FP / major
development phases) (Derived indicator) [ FPs / person hours ]
Projects with a low level of reliability requirements tend to have low FP_productivity of 0.25 or less (the area marked “I” in Figure 9-2-9). Projects with a very high level of reliability requirements have very low FP_productivity of 0.1 or less (the area marked “II” in Figure 9-2-9), possibly because of their large sizes.
The results presented in this section show a clear trend among projects with a high level of reliability requirements: small projects have high productivity while large projects have low productivity (the area marked “III” in Figure 9-2-9).
Projects with a medium level of reliability requirements show some variation in FP_productivity regardless of their size.
Figure 9-2-8 Reliability-Requirements-Level-Based Distribution of FP Size and FP_productivity (Development, Mixed_FP_Measurement_Methods)
[Scatter plot, N = 75. X-axis: Actual_FP_size (unadjusted), 0 to 12,000; Y-axis: FP_productivity (FPs / major development phases), 0.0 to 0.8. Legend: a: Very high; b: High; c: Medium; d: Low. Copyright IPA SEC 2007]
Figure 9-2-9 Reliability-Requirements-Level-Based Distribution of FP Size and FP_productivity (Development, Mixed_FP_Measurement_Methods), X-Axis in Logarithmic Scale
[Scatter plot, N = 75. X-axis: Actual_FP_size (unadjusted), logarithmic scale from 1 to 100,000; Y-axis: FP_productivity (FPs / major development phases), 0.0 to 0.8. Legend: a: Very high; b: High; c: Medium; d: Low. The areas marked “I”, “II”, and “III” referenced in the text are indicated on the plot. Copyright IPA SEC 2007]
10 Postscript
The White Paper 2007 is the third edition of the White Paper. In fiscal 2006, SEC analyzed software development data on 1,774 projects in cooperation with 20 companies and published the results as this third edition. The third edition was created under the policy of making no major changes to the data items while focusing on collecting data from companies about projects that modify existing base systems (these projects are classified as maintenance/support projects and enhancement projects). As before, SEC also continued to collect data about development projects.
SEC makes painstaking efforts to inspect the collected data every year. The data inspection consists of many tasks, including checking whether necessary data items exist in the collected data, checking whether the data is valid and consistent, and correcting errors. The primary aim of data inspection is to ensure the reliability of the collected data. At the same time, this constant inspection effort enables SEC, step by step, to expand the scope of analysis, increase analysis variations, and conduct experimental analysis.
In fiscal 2006, SEC continued its theme-based productivity factor analysis and made a new attempt at experimental analysis of the relationship between productivity and the clearness of requirements specifications or the level of reliability requirements (see Chapter 9). The experimental analysis was a great success in that it revealed that productivity is affected by requirements and constraints as well as by system characteristics. The ongoing analysis objectives are to analyze in finer detail the trade-offs among size, development schedule, effort, productivity, and reliability, and to analyze the relationship between productivity/reliability and other characteristics. To achieve these objectives, SEC must collect project data from more companies and must collect and analyze reliability data.
The opinions and viewpoints presented in this White Paper were formed through discussions in many task forces with members from industry, government, and academia, and through the various activities of SEC researchers. Because it involves many intertwined factors and issues, quantitative analysis of project data offers no quick answers to successful software development. This, however, does not contradict the essential fact that it is crucial to discuss concrete subjects based on factual data. SEC's analysis efforts are building up a shared model viewpoint for seeing the underlying relationships among many factors in a systematic manner. For this reason, SEC believes it is highly valuable to continue these efforts.
SEC will take concrete actions to improve software development projects through quantitative analysis while cooperating with activities in related areas. SEC will also follow domestic and international benchmark standardization activities to identify effective tools and methods. In addition to its publications, SEC will hold quantitative-data practice seminars and release its software tools to the public, thereby providing solutions to those who face difficulties in software development. SEC welcomes reader responses such as opinions, feedback on the application of SEC's analysis results or tools, and participation in SEC's activities.
(This page is intentionally left blank)
Appendix
A: Data Item Definitions ................................ 254
  A.1 Mapping Between Phase Names and SLCP .............. 254
  A.2 Data Item Definitions Version 2.3 ................. 255
  A.3 Industry Classification ........................... 271
  A.4 Derived Indicator Names and Definitions ........... 272
B: Data Entry Form Version 2.3 .......................... 275
C: Per-Data-Item Reply Status ........................... 278
D: Glossary ............................................. 288
E: References ........................................... 290
  E.1 Reference Materials ............................... 290
  E.2 Reference Information ............................. 291
Appendix A: Data Item Definitions

A.1 Mapping Between Phase Names and SLCP
The following table lists the mapping between SLCP (see ISO/IEC 12207) and the names of the software development phases used for the data item definitions presented in this White Paper. The “Phase” column of the table lists the phase names used in the definitions of collected data items and in other sections of this White Paper. The “SLCP Process/Activity” column lists the corresponding SLCP processes or activities, and the “SLCP Definitions” column lists the SLCP definitions of those processes or activities.
Phase | SLCP Process/Activity | SLCP Definitions
Development planning | System development planning | The planner confirms the basic requirements for system development, checks feasibility, sets up schedules, determines a policy for system selection, establishes personnel assignment to project promotion tasks, clarifies the basic policy for system migration and system operation and maintenance, clarifies the basic policy for environment preparation, training, and quality, makes a development plan, and asks for approval.
Requirements definition | System requirements analysis; Software requirements analysis | The developer fixes software requirements including quality specifications, documents the requirements, evaluates the software requirements based on defined criteria and documents the results, and holds joint reviews to establish agreed software requirements.
Basic design | System architecture design; Software architecture design | The developer translates requirements for the software product into software architectures, and clarifies at the highest level the software architectures, software components of the software product, database design, provisional test requirements for software integration testing, and a tentative schedule of the test. The developer also compiles provisional versions of user documents and holds joint reviews.
Detailed design | Software detailed design | The developer draws up the detailed design of each software component of the software product. The developer subdivides software components into software units, each of which is separately coded, compiled, and tested. The developer draws up the detailed design of interfaces and databases, updates user documents as required, and defines requirements for software unit testing and test schedules. The developer holds joint reviews.
Development | Software coding and testing | The developer develops software units and databases, and sets up test procedures and test data for the units and databases. The developer also tests the units and databases to check that they satisfy relevant requirements. The developer updates user documents and other documents based on test results if necessary.
Integration test | Software integration; System integration | The developer makes a plan to integrate software units and components into the software product, and completes the product. The developer also carries out integration tests. The developer updates user documents and other documents based on test results if necessary. The developer also holds joint reviews.
System test | Software conformance test; System conformance test | The developer carries out conformance tests in accordance with the requirements for software product conformance testing. The developer updates user documents and other documents based on test results if necessary. The developer also carries out an audit.
Acceptance test | Software installation support; Software acquisition support | The developer makes a plan to install the software product in the real environment specified by the contract and installs the product. The developer supports software product acquisition reviews and tests carried out by the party who acquires the product. The developer also provides the party with continuous initial training and support as specified by the contract.
Follow-up (operation) | Operation process | The operator operates the software product and provides the user with operation support. The operator's tasks in the operation process include establishing the base environment for the operation process in accordance with the management process established to manage the operation process.
A.2 Data Item Definitions Version 2.3
This section presents the definitions of data items used in this White Paper. The project data presented in this White Paper were collected and analyzed in accordance with the definitions.
The definitions of data items are listed in tables with three columns: “Data Name”, “Definitions”, and “Allowable Values”. The “Data Name” column lists the names of data items in the format “Number_Name.” The “Definitions” column lists the definitions of data items. The “Allowable Values” column lists the allowable values for data items as the optional choices for the inquiries written in the Data Entry Form presented in Appendix B. If a data item needs free text, a pair of parentheses is provided in the Allowable Values column for the text. In some cases, an example free text or supplementary information is written in the column. (0) Administration office data
Data Name Definitions Allowable Values
101_Project ID
The identifier of the project. (The administration office assigns a unique project ID to each project in a manner that the assigned ID never lets anyone identify the company that offered data about the project.)
1, 2, 3, ... : IDs distinguishing whole system projects from each other. 1-1, 1-2, ... : IDs distinguishing sub-system projects from each other.
102_Data reliability The reliability of project data classified in four grades from A to D. The administration office assigns a reliability value to this data item.
A: The project data is confirmed as reasonable and completely consistent.
B: The project data looks reasonable, but it has several factors that affect consistency of the data.
C: Consistency of the project data cannot be evaluated because critical data items are missing.
D: The project data has one or more factors indicating that the data is unreliable.
(1) Development project general data items
Data Name Definitions Allowable Values
10084_Proprietary project ID
The project identification assigned by the company that offered the project data. This data item is also used for sub-system identification. * If a project data set is captured for each sub-system, do not combine the data sets of sub-systems into one.
1, 2, 3, ... : IDs distinguishing whole-system projects from each other. 1-1, 1-2, ... : IDs distinguishing sub-system projects from each other.
11001_Whole/sub flag The flag that identifies whether the data belongs to the whole system project or a sub-system project.
a: Whole system project. b: Sub-system project
11002_Grouping ID
A free text that defines a group identification for project grouping. *1: Write a free text for this data item regardless of the choice made for data item 11001. *2: Assign a positive integer (1, 2, 3, ... ) to the grouping ID. *3: Leave this data item blank if there is no project grouping.
( ) Example 1: Assign “1” to the whole system
project and to the two sub-system projects. Example 2: Assign “2” to the two sub-system
projects. * It may be considered in an analysis process to aggregate whole system/sub-system projects assigned the same grouping ID.
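The aggregation suggested in the note above (combining records that share a grouping ID into one whole-system view) might look like the following in analysis code. The field names and the summed measure are hypothetical; they are not the official data item names.

```python
from collections import defaultdict

# Sketch of aggregating sub-system project records that share a
# grouping ID (data item 11002) into one whole-system total.
# Field names here are illustrative assumptions.

records = [
    {"grouping_id": 1, "whole_sub": "b", "effort_hours": 400},
    {"grouping_id": 1, "whole_sub": "b", "effort_hours": 600},
    {"grouping_id": 2, "whole_sub": "a", "effort_hours": 900},
]

# Group records by grouping ID.
groups = defaultdict(list)
for r in records:
    groups[r["grouping_id"]].append(r)

# Sum one additive measure (effort) per group.
aggregated = {
    gid: sum(r["effort_hours"] for r in rs)
    for gid, rs in groups.items()
}
print(aggregated)  # → {1: 1000, 2: 900}
```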
10085_Data reliability The reliability of project data classified in four grades from A to D.
A: The project data is confirmed as reasonable and completely consistent.
B: The project data looks reasonable, but it has several factors that affect consistency of the data.
C: Consistency of the project data cannot be evaluated because critical data items are missing.
D: The project data has one or more factors indicating that the data is unreliable.
Data Name Definitions Allowable Values
103_Project type The type of project (development or not)
a: Development—The project developed a system from scratch, or newly developed 90% or more of the existing system.
b: Maintenance/support—The project made corrections or modifications such as function addition to the existing system after the system entered the operation phase. (The newly developed portion amounts to less than 10% of the whole system.)
c: Redevelopment—The project rebuilt the system based on the existing system without making any changes to the functional specifications. (This type is also referred to as replacement.)
d: Enhancement—The project modified functionality of and/or added new functionality to the existing system. (The newly developed portion amounts to 10% to about 90%.)
104_Existing system stability
The stability of the existing system. This data item is valid if data item 103 has the value of “b” (“Maintenance/support”), “c” (“Redevelopment”), or “d” (“Enhancement”).
a: Stable. b: Reaching the stable state. c: Unstable. d: Stability unknown.
105_Project category The category of the project.
a: Commercial package development. b: Entrusted development. c: For in-house use. d: Prototyping. e: Other (description)
106_Entrusted development working site
The working site for entrusted development. This data item is valid if data item 105 has the value of “b” (“Entrusted development”).
a: Customer’s site. b: In-house site. c: Other (description)
107_Project purpose The primary purpose(s) of the project (multiple choice).
a: Software development b: Infrastructure-building c: Operational environment preparation d: System migration e: Maintenance f: Operation support g: Consulting h: Project management i: Quality assurance j: On-site environment preparation/adjustment
for a running system k: Customer training l: Other (description)
108_New customer or old customer
Did the project serve a new customer or an old customer? a: New customer. b: Old customer.
109_New business or not Was the project aimed at a new industry or business or an old industry or business?
a: New industry or business. b: Old industry or business.
118_Source of outsourced workforce
Where did the outsourced workforce come from? Choose one to three alternatives if the project used outsourced workforce. * An affiliate refers to a company that has capital transactions with another.
a: Japanese company (intra-group/affiliate).
b: Japanese company (out-of-group/non-affiliate).
c: Foreign company (intra-group/affiliate).
d: Foreign company (out-of-group/non-affiliate).
e: No outsourcing.
119_Outsourcing country
The name of the country from which the outsourced workforce came. Write one or more country names if data item 118 has the value of “c” (“Foreign company (intra-group/affiliate)”), or “d” (“Foreign company (out-of-group/non-affiliate)”). Example: China, India
( )
110_New subcontractors or not
Whether or not the project used one or more new subcontractors. Choose one alternative if data item 118 has any value other than “e” (“No outsourcing”).
a: The subcontractors were new to the company of the project.
b: The company of the project used the subcontractors more than once.
111_Using new technology or not Whether or not the project used new technology. a: Used new technology.
b: Did not use new technology.
Data Name Definitions Allowable Values
112_Clearness of responsibility and roles of project team members
How clearly were the responsibility and roles of project team members defined?
a: Very clear. b: Clear. c: Not so clear. d: Unclear
113_Clearness of goals and priority
How clearly were the project objectives (delivery date, quality, technologies, etc.) and their priority defined?
a: Very clear. b: Clear. c: Not so clear. d: Unclear
114_Working space The working space for the project team.
a: Enough closed space for each member.
b: Adequate space for each member, and a very good working environment for concentrated brainwork.
c: Stuffy open space, interrupting concentrated brainwork.
d: Very packed open space lacking space for documents and computers.
115_Project environment (acoustic noise)
The level of acoustic noise in the working environment.
a: No noise and minimum interruption by phone calls.
b: Noise level below human awareness and intermittent interruption by phone calls.
c: Occasional high-level noise and frequent interruption by phone calls.
d: Deafening noise constantly hampering concentrated brainwork. Interruption by phone call repeats within one hour.
116_Project success_Self-evaluation
Evaluate whether or not the project is an overall success with respect to QCD. A project is a success if its planning is appropriate and if its planned goals were achieved. A project that had no planning is a success if it ends up with desirable results.
a: All the QCD elements are successful. b: Two of the QCD elements are successful. c: One of the QCD elements is successful. d: None of the QCD elements is successful.
117_Subjective evaluation of customer satisfaction
How do you feel about the customer's satisfaction? Choose one alternative based on your own feeling.
a: The customer is fully satisfied. b: The customer is almost satisfied. c: The customer is dissatisfied at some points. d: The customer is not at all satisfied.
120_Evaluation of planning (Cost)
Whether or not the cost planning was valid. a: The basis of cost estimation was clear and feasibility was confirmed.
b: The basis of cost estimation was unclear or feasibility was not confirmed.
c: No planning.
121_Evaluation of planning (Quality)
Whether or not the objectives of delivered quality were valid.
a: The quality objectives were clear and feasibility was confirmed.
b: The quality objectives were unclear or feasibility was not confirmed.
c: No planning.
122_Evaluation of planning (Development schedule)
Whether or not the development schedule planning was valid.
a: The basis of development schedule planning was clear and feasibility was confirmed.
b: The basis of development schedule planning was unclear or feasibility was not confirmed.
c: No planning.
123_Evaluation of results (Cost)
Evaluation of the results of cost planning. a: The actual cost is less than the planned cost by 10% or more.
b: The actual cost nearly equals the planned cost with an error less than ±10%.
c: The actual cost exceeded the planned cost by 30% or less.
d: The actual cost exceeded the planned cost by 50% or less.
e: The actual cost exceeded the planned cost by more than 50%.
124_Evaluation of results (Quality)
Evaluation of achievement of delivered quality objectives.
a: The number of defects after system cutover is less than the planned value by 20% or more.
b: The number of defects after system cutover is less than the planned value.
c: The number of defects after system cutover exceeded the planned value by 50% or less.
d: The number of defects after system cutover exceeded the planned value by 100% or less.
e: The number of defects after system cutover exceeded the planned value by more than 100%.
Data Name Definitions Allowable Values
125_Evaluation of results (Development schedule)
Evaluation of achievement of development schedule planning objectives. Whether the product was delivered before or after the planned delivery date or the delivery date agreed with the customer.
a: The product was delivered before the planned delivery date.
b: The product was delivered on the planned delivery date.
c: The actual delivery delayed by 10 days or less.
d: The actual delivery delayed by 30 days or less.
e: The actual delivery delayed more than 30 days.
126_Reason for QCD objectives failure
The reason why the cost, quality, and delivery objectives were not achieved. (For example, data item 123 has the value “c”, “d”, or “e.”) Choose one to three alternatives. Note: Data item “803_difference between the plan and results (reason for early/delayed delivery)” of the Data Item Definition Version 1.0 was obsoleted and integrated into this data item.
a: Incomplete development objectives.
b: Incomplete RFP contents.
c: Delayed completion of requirements specifications.
d: Insufficient analysis of requirements.
e: Miscasting of in-house members.
f: Subcontractor selection failure.
g: Insufficient capability of the development team.
h: Incomplete test planning.
i: Insufficient acceptance inspection.
j: Insufficient system test and/or acceptance test.
k: Insufficient ability of project managers.
l: Other (description)
1012_General comment
Write a comment such as considerations necessary for analysis of the offered data, or a note for SEC. Example 1: The project used outsourced
workforce, but the ratio of outsourced effort is left blank because it is unknown.
Example 2: More than 30% of the in-house development effort was used for infrastructure-building.
( ) * Free text up to 512 characters.
(2) Project applications Data Name Definitions Allowable Values
201_Industry type The type of industry the developed system is used for, or the type of industry in which the project's customer works.
Choose an appropriate type among “industry types” from 01 to 99 listed in Appendix A.3.
202_Business type The type of business the developed system is used for.
a: Management/planning. b: Accounting. c: Sales. d: Production/distribution. e: Personnel/welfare. f: General management. g: General affairs. h: Research/development. i: Technology/control. j: Master management. k: Ordering/inventory. l: Distribution management. m: Subcontractor management. n: Contract/transfer. o: Customer management. p: Product planning (per-product). q: Product management (per-product). r: Facility (stores). s: Information analysis. t: Other (description)
203_System applications The application of the system developed by the project.
a: Workflow support and management b: Network management c: Job management and monitoring d: Process control e: Security management f: Finance dealing g: Reporting h: Online analysis and reporting i: Data management/data mining j: Web portal site k: ERP l: SCM m: CRM_CTI n: Document management o: Knowledge management p: Catalog management q: Mathematical modeling (finance/engineering) r: 3D modeling/animation s: Geographic/spatial positioning t: Graphics and publishing tools u: Imaging v: Video processing w: Voice processing x: Built-in software (for machine control) y: Device drivers/interface drivers z: OS/software utilities A: Software development tools B: Consumer products (word processors, spreadsheets, etc.) C: EDI D: EAI E: Emulators F: File transfer G: Other (description)
204_User accessibility Whether the system developed by the project is accessible to limited users or is open to the public.
a: Accessible to limited users. b: Open to the public.
205_Number of users The number of users who use the developed system. This data item is valid if data item 204 has the value “a” (“Accessible to limited users”).
About ( ) users.
206_Number of user sites The number of user sites where servers or other devices are installed. ( ) sites.
207_User concurrency The maximum number of users who concurrently use the developed system. ( ) users.
(3) System characteristics Data Name Definitions Allowable Values
301_Type of developed system The type of the software developed by the project.
a: Application software b: System software (middleware, operating system) c: Tool software d: Development environment software e: Other (description)
302_Use of business application package
Did the project use one or more business software packages? *: Except for in-house business software packages.
a: Yes. b: No.
303_First-time use of business application package
Whether or not the company of the project used the business software package(s) for the first time. This data item is valid if data item 302 has the value “a” (“Yes”).
a: The business software packages were used for the first time.
b: The business software packages were used more than once.
c: How many times the business software packages were used is unknown.
304_Name of business software package
The name of the software package(s) used for the project. This data item is valid if data item 302 has the value “a” (“Yes”). Ex: SAP, Oracle Applications
( )
305_The functional size ratio of business software package
Make a rough estimation of the ratio of the total functional size of the used business software package(s) to the functional size of the whole developed system. This data item is valid if data item 302 has the value “a” (“Yes”).
About ( )%
306_Customization cost ratio of business software package
The ratio of customization cost of the used business software package(s) to the total cost of the package(s). This data item is valid if data item 302 has the value “a” (Yes).
( ) %
307_Processing Mode In what processing mode the developed system is used.
a: Batch processing b: Interactive processing c: Online transaction processing d: Other (description)
308_Architecture The type of architecture of the developed system. * Up to three types are selectable from the largest size to smaller ones.
a: Stand-alone b: Mainframe c: 2-layer client/server d: 3-layer client/server e: Intranet/Internet f: Other (description)
309_Target platform The primary operating system platform of the developed system.
a: Windows 95, 98, or Me b: Windows NT, 2000, or XP c: Windows Server 2003 d: HP-UX, e: HI-UX, f: AIX, g: Solaris, h: Redhat Linux, i: SUSE Linux, j: Miracle Linux, k: Turbo Linux, l: Other type of Linux, m: Linux, n: Other type of UNIX, o: MVS, p: IMS, q: TRON, r: Office computer system, s: Other (description)
310_Use of Web technology What kinds of Web technology did the project use?
a: HTML, b: XML, c: Java Script, d: ASP, e: JSP, f: J2EE, g: Apache, h: IIS, i: Tomcat, j: JBOSS, k: OracleAS, l: WebLogic, m: WebSphere, n: Coldfusion, o: WebService, p: Other (description), q: None.
311_Online transaction processing system
The software used for online transaction processing.
a: TUXEDO, b: CICS, c: OPENTP1, d: Other (description), e: None.
312_Primary programming language
The programming language primarily used. * Up to 5 languages are selectable from the most frequently used one to lesser ones. * Choose “w: Other” for unlisted languages such as CGI, Java applets, and EJB and write the names of the languages.
a: Assembly language, b: COBOL, c: PL/I, d: Pro*C, e: C++, f: Visual C++, g: C, h: VB, i: Excel (VBA), (j and k are intentionally omitted.), l: InputMan, m: PL/SQL, n: ABAP, o: C#, p: Visual Basic.NET, q: Java, r: Perl, s: Shell script, t: Delphi, u: HTML, v: XML, w: Other (description).
313_Use of DBMS What kind of DBMS did the project use?
a: Oracle, b: SQL Server, c: PostgreSQL, d: MySQL, e: Sybase, f: Informix, g: ISAM, h: DB2, i: Access, j: HiRDB, k: IMS, l: Other (description), m: None.
(4) Development techniques
Data Name Definitions Allowable Values
401_Development life cycle model The development life cycle model. a: Waterfall. b: Iterative. c: Other (description).
402_Use of operation support tool Did the project use an operation support tool? a: JP1, b: SystemWalker, c: Senju, d: A-Auto, e: Other (description), f: None.
403_Examined similar projects or not
Did the project examine one or more similar past projects in the planning phase? * Choose “b: No” if similar projects existed but the project did not examine them.
a: Yes. b: No.
404_Use of project management tool Did the project use a project management tool? a: Yes. b: No.
405_Use of configuration management tool
Did the project use a configuration management tool? * Example tools: ClearCase, CVS, Subversion, PVCS, SCCS, VSS.
a: Yes (Name of the tool). b: No.
406_Use of design support tool
Did the project use a design support tool? a: Yes (Name of the tool). b: No.
407_Use of documentation tool
Did the project use a documentation tool? a: Yes (Name of the tool). b: No.
408_Use of debug/test tool Did the project use a debug/testing support tool? a: Yes (Name of the tool). b: No.
409_Use of CASE tool Did the project use an upstream or integrated CASE tool? * Data item “410_Use of integrated CASE tool” of the Data Item Definition Version 1.0 was obsoleted and integrated into data item 409 of the Data Item Definition Version 2.3.
a: Yes (Name of the tool). b: No.
411_Use of code generator Did the project use a code generator? * If the code generator is used in-house and its name cannot be disclosed, write “In-house tool.”
a: Yes (Name of the generator). b: No.
412_Application of Development Methods
The schematic development approach applied to the project.
a: Structured analysis and design. b: Object-oriented analysis and design. c: Data-oriented approach (DOA). d: Other (description). e: None.
413_Reuse rate of development plan document
The ratio of number of reused pages of the development plan document to the total number of pages of the document.
( ) %
414_Reuse rate of requirements definition document
The ratio of number of reused pages of the requirements definition document to the total number of pages of the document.
( ) %
415_Reuse rate of basic design document
The ratio of number of reused pages of the basic design document to the total number of pages of the document.
( ) %
416_Reuse rate of detailed design document
The ratio of number of reused pages of the detailed design document to the total number of pages of the document.
( ) %
417_Reuse rate of source code
The ratio of reused SLOC size of the source code to the total SLOC size of source code. ( ) %
418_Reuse rate of software components
The approximate ratio of the functional size of reused software components, such as library components, to the total functional size of the developed system.
About ( ) %
419_Reuse rate of test cases for integration test
The ratio of the number of test cases reused for the integration test to the total number of test cases. ( ) %
420_Reuse rate of test cases for system test
The ratio of the number of test cases reused for the system test to the total number of test cases. ( ) %
421_Reuse rate of test cases for acceptance test
The ratio of the number of test cases reused for the acceptance test to the total number of test cases. ( ) %
422_Use of development frameworks
Did the project use a development framework? * Example development frameworks: Struts, .NET, JBoss, J2EE.
a: Yes (Name of the framework). b: No.
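(Information) Data items 413 through 421 all define a reuse rate as the same ratio: the reused amount divided by the total amount, in percent. A minimal illustrative sketch (the function and variable names are ours, not part of the definitions):

```python
def reuse_rate(reused: float, total: float) -> float:
    """Reuse rate in percent: reused amount / total amount * 100.

    Applies equally to document pages (items 413-416), SLOC (417),
    functional size (418), and test cases (419-421)."""
    if total <= 0:
        raise ValueError("total must be positive")
    return 100.0 * reused / total

# Example: 30 of 120 basic design document pages were reused.
print(round(reuse_rate(30, 120), 1))  # 25.0
```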
262 IPA/SEC White Paper 2007 on Software Development Projects in Japan
(5) User requirement management
Data Name Definitions Allowable Values
501_Clearness of user requirements specifications
The degree of clearness the requirements specifications had at the beginning of the basic design phase.
a: Very clear. b: Clear. c: Ambiguous. d: Very ambiguous.
502_User participation in user requirement specifications
The degree of user participation in the requirement specifications.
a: Full. (For example, the user compiled the whole specifications.)
b: Adequate. (For example, the user compiled the basic part and the vendor compiled the rest.)
c: Inadequate. (For example, the user made a rough draft and the vendor did all the rest.)
d: None. (For example, the vendor compiled the whole specifications.)
503_User expertise in development
The level of user expertise in computer systems and system development.
a: Comprehensive. (For example, the user fully understood the vendor talking about the system.)
b: Adequate. (For example, the user understood the vendor talking about the system for the most part.)
c: Inadequate. (For example, the vendor had to provide additional, detailed explanations of the system in many cases.)
d: None. (For example, the vendor had to explain everything about the system.)
504_User expertise in applied business The level of user expertise in the applied business.
a: Comprehensive. (For example, the user fully understood the vendor talking about the applied business.)
b: Adequate. (For example, the user understood the vendor talking about the applied business for the most part.)
c: Inadequate. (For example, the vendor had to provide additional, detailed explanations of the applied business in many cases.)
d: None. (For example, the vendor had to explain everything about the applied business.)
505_Clearness of user role and responsibility
How clearly were the role and responsibility of the user and those of the vendor defined?
a: Very clear. b: Clear. c: Ambiguous. d: Very ambiguous.
506_User acknowledgment of requirements specifications
Did the user acknowledge the requirements specifications? a: Yes. b: No.
507_User comprehension of system design
The degree of user understanding of the system design
a: Full. b: Adequate. c: Inadequate. d: None.
508_User acknowledgment of system design Did the user acknowledge the system design? a: Yes. b: No.
509_User participation in acceptance test
The degree of user participation in the “acceptance test”.
a: Full. b: Adequate. c: Inadequate. d: None.
5114-5121_Changes to requirements specifications (per-phase)
How did the requirements specifications change in each phase, and how did the changes affect the effort?
* Choose one alternative for each phase. a: No change. b: Minor change. c: Major change. d: Critical change.
511_Number of members who participated in requirements definition
The number of key persons who defined the requirements. ( ) persons
512_Level of requirements (Reliability)
The level of reliability requirements in terms of the failure rate, recovery time, data recovery, and other factors.
a: Very high. b: High. c: Medium. d: Low.
513_Level of requirements (Usability)
The level of usability requirements in terms of the ease of software learning, ease of operation learning, ease of operation management, the sophistication of graphical interface design, and other factors.
a: Very high. b: High. c: Medium. d: Low.
Appendix A: Data Item Definitions
514_Level of requirements (Performance and efficiency)
The level of performance and efficiency requirements in terms of the response time, processing time, processing power, the utilization of system resources such as hard disks and memory, and other factors.
a: Very high. b: High. c: Medium. d: Low.
515_Level of requirements (Maintainability)
The level of maintainability requirements in terms of the ease of software correction, ease of fault locating, ease of fault identification, ease of software change, protection against possible troubles in software change, the ease of software correction validity verification, and other factors.
a: Very high. b: High. c: Medium. d: Low.
516_Level of requirements (Portability)
The level of portability requirements in terms of the ease of adjustment to a new environment, ease of installation in the environment, ease of concurrent operation with other software components, ease of porting from other software, and other factors.
a: Very high. b: High. c: Medium. d: Low.
517_Level of requirements (Running cost)
The level of requirements in terms of the system running cost.
a: Very high. b: High. c: Medium. d: Low.
518_Level of requirements (Security) The level of system security requirements. a: Very high. b: High. c: Medium.
d: Low.
519_Legal restrictions Legal restrictions placed on the developed system.
a: Industrial legal restrictions. b: Regular legal restrictions. c: None. * Example industrial legal restrictions include the Banking Law, and the Securities and Exchange Law.
(6) Skills and experience of staff
Data Name Definitions Allowable Values
601_PM skill The skill level of project managers defined by the IT Skill Standard Version 2 "Project Management". * For the skill level index and the degree of skill, refer to "IT Skill Standard Version 2" at http://www.ipa.go.jp/jinzai/itss.
a: Level 6 or 7 b: Level 5 c: Level 4 d: Level 3
602_Staff skill_application domain experience
The skill level of staff with respect to the application domain at which the developed system is aimed.
a: All staff had enough experience. b: Half of the staff had enough experience, and
the rest had adequate experience. c: Half of the staff had adequate experience,
and the rest had no experience. d: All staff were without experience.
603_Staff skill_analysis and design experience
The skill level of staff with respect to the system analysis and design.
a: All staff had enough experience. b: Half of the staff had enough experience, and
the rest had adequate experience. c: Half of the staff had adequate experience,
and the rest had no experience. d: All staff were without experience.
604_Staff skill_programming language and software tool experience
The skill level of staff with respect to programming languages and software tools.
a: All staff had enough experience. b: Half of the staff had enough experience, and
the rest had adequate experience. c: Half of the staff had adequate experience,
and the rest had no experience. d: All staff were without experience.
605_Staff skill_development platform experience
The skill level of staff with respect to the use of the development platform.
a: All staff had enough experience. b: Half of the staff had enough experience, and
the rest had adequate experience. c: Half of the staff had adequate experience,
and the rest had no experience. d: All staff were without experience.
(Information) The following table maps the 601_PM skill alternatives of the Data Item Definitions Version 1.0 to those of the Data Item Definitions Version 2.0 or later (the IT Skill Standard levels), together with the grade guideline for system development, application development, or system integration.
• Old a: The person experienced project management in many intricate large- or medium-size projects. → New a: Level 6 or 7. Guideline: management of 500 staff or more at the maximum, or projects each of which had per-year contract money of ¥1 billion or more; or management of N staff (50 ≤ N < 500) at the maximum, or projects each of which had per-year contract money of ¥0.5 billion or more.
• Old b: The person experienced project management in a few intricate medium-to-large-size projects. → New b: Level 5. Guideline: management of N staff (10 ≤ N < 50) at the maximum, or projects each of which had per-year contract money of ¥0.1 billion or more.
• Old c: The person experienced project management in only small-to-medium-size projects. → New c: Level 4. Guideline: management of less than 10 staff at the maximum.
• Old d: The person has no project management experience. → New d: Level 3. Guideline: the number of staff is irrelevant at this level.
(7) System size
Data Name Definitions Allowable Values
701_Primary FP size measurement method
The measurement method used to measure the actual FP size. * Excluding the use-case point.
a: IFPUG. b: SPR. c: NESMA indicative method. d: NESMA estimated method. e: COSMIC-FFP. f: Other (description).
10124_Purity of measurement method for actual FP size 10125_Name of the method
Conformity of the FP size measurement method to the measurement standard. a: The method used conforms to the measurement standard (ISO or JIS). b: A standard method was customized in accordance with proprietary rules.
a: Original method. b: Customized method. (Write the name of the method if any.)
702_FP measurement support technology
Did the project use an FP measurement support tool, or did it have staff dedicated to FP measurement?
a: Yes. (Support tool or dedicated staff) b: No.
11018_Inclusiveness of existing FP size
The flag indicating whether or not the value of 5001_Actual FP size (unadjusted) includes the FP size of the existing system to which the upgrade was made. This data item is valid if data item 103_Project type has the value “b” (Maintenance/support) or “c” (Enhancement).
0: Unknown. 1: The existing FP size included. 2: The existing FP size excluded.
Fluctuations of planned FP size and the measurement method for planned FP size
5082_Unadjusted FP size_after development planning 10116_Measurement method 10117_Name of the customized measurement method
The unadjusted FP size measured after development planning, and the method used to measure the unadjusted FP size. If the method is a customized method, write its name.
( ) FPs. ( ) method.
5083_Unadjusted FP size_after requirements definition 10118_Measurement method 10119_Name of the customized measurement method
The unadjusted FP size measured after requirements definition, and the method used to measure the unadjusted FP size. If the method is a customized method, write its name.
( ) FPs. ( ) method.
5084_Unadjusted FP size_after basic design 10120_Name of the measurement method 10121_Name of the customized measurement method
The unadjusted FP size measured after basic design, and the method used to measure the unadjusted FP size. If the method is a customized method, write the name of the method.
( ) FPs. ( ) method.
5085_Unadjusted FP size_after detailed design 10122_Name of the measurement method 10123_Name of the customized measurement method
The unadjusted FP size measured after detailed design, and the method used to measure the unadjusted FP size. If the method is a customized method, write the name of the method.
( ) FPs. ( ) method.
5001_Actual FP size (unadjusted) The actual FP size that was measured at the completion of system test, without adjustment by the value adjustment factor.
( ) FPs
5002_Actual FP size (adjusted) The actual FP size that was measured at the completion of system test with adjustment by the value adjustment factor.
( ) FPs
5003_FP value adjustment factor The factor to adjust the FP value. ( )
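(Information) Data items 5001 through 5003 are related by a single multiplication: the adjusted FP size is the unadjusted FP size multiplied by the value adjustment factor. A minimal sketch (the function name is illustrative):

```python
def adjusted_fp(unadjusted_fp: float, vaf: float) -> float:
    """Adjusted FP size (5002) = unadjusted FP size (5001)
    multiplied by the FP value adjustment factor (5003)."""
    return unadjusted_fp * vaf

# Example: 500 unadjusted FPs with a value adjustment factor of 1.07.
print(adjusted_fp(500, 1.07))  # 535.0
```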
706_Unadjusted FP size reliability
The reliability level of the unadjusted FP size classified in four grades from A to D. The administration office assigns a reliability value to this data item.
A: The unadjusted FP size is confirmed as reasonable and completely consistent.
B: The unadjusted FP size looks reasonable, but its consistency cannot be evaluated because the adjusted FP size or the FP value adjustment factor is missing.
C: The unadjusted FP size is unobtainable because the offered data does not include the unadjusted FP size or the detailed FP size.
D: The unadjusted FP size is unreliable because of one or more factors.
Detailed FP size (for IFPUG)
5026-5033_EI External inputs. Write relevant numbers in the following pairs of parentheses if planned values exist.
• The number of functions. High ( ). Average ( ). Low ( ).
• FP size ( ).
5034-5041_EO External outputs. Write relevant numbers in the following pairs of parentheses if planned values exist.
• The number of functions. High ( ). Average ( ). Low ( ).
• FP size ( ).
5042-5049_EQ External inquiries. Write relevant numbers in the following pairs of parentheses if planned values exist.
• The number of functions. High ( ). Average ( ). Low ( ).
• FP size ( ).
5050-5057_ILF Internal logical files. Write relevant numbers in the following pairs of parentheses if planned values exist.
• The number of functions. High ( ). Average ( ). Low ( ).
• FP size ( ).
5058-5065_EIF External interface files. Write relevant numbers in the following pairs of parentheses if planned values exist.
• The number of functions. High ( ). Average ( ). Low ( ).
• FP size ( ).
Detailed FP size (for methods other than IFPUG)
5066-5069_Transactional functions
Equivalent to EI, EO, and EQ of IFPUG. Write relevant numbers in the following pairs of parentheses if planned values exist.
Number of functions ( ). FP size ( ).
5070-5073_Data functions Equivalent to ILF and EIF of IFPUG. Write relevant numbers in the following pairs of parentheses if planned values exist.
Number of functions ( ). FP size ( ).
Actual enhancement FP size (5022-5025)
The following four kinds of FP size of an enhancement project. • Existing FP (5022) • Added FP (5023) • Changed FP (5024) • Deleted FP (5025)
Existing : ( ) FPs Added : ( ) FPs Changed : ( ) FPs Deleted : ( ) FPs
Planned enhancement FP size (11007-11010)
The following four kinds of FP size of an enhancement project. • Existing FP (11007) • Added FP (11008) • Changed FP (11009) • Deleted FP (11010) * The above FP sizes are mandatory if the corresponding FP sizes (5022-5025) have valid values.
Existing : ( ) FPs Added : ( ) FPs Changed : ( ) FPs Deleted : ( ) FPs
COSMIC-FFP detailed values
5074_Number of triggering events The number of COSMIC-FFP triggering events. ( )
5075_Number of functional processes The number of COSMIC-FFP functional processes. ( )
5076_Number of data groups The number of COSMIC-FFP data groups. ( )
5077_Entry The value of the COSMIC-FFP entry. ( )
5078_Exit The value of the COSMIC-FFP exit. ( )
5079_Read The value of the COSMIC-FFP read. ( )
5080_Write The value of the COSMIC-FFP write. ( )
5081_Cfsu The value of the COSMIC-FFP Cfsu. ( )
Fluctuations of planned SLOC size
5086_After development planning The planned SLOC size after completion of development planning. ( ) SLOCs
5087_After requirements definition
The planned SLOC size after completion of requirements definition. ( ) SLOCs
5088_After basic design The planned SLOC size after completion of basic design. ( ) SLOCs
5089_After detailed design The planned SLOC size after completion of detailed design. ( ) SLOCs
Actual SLOC size
Actual SLOC size (5004, 5005, 5006, 10086, 10087)
The following kinds of actual SLOC size measured on completion of system test. • SLOC size (5004) • Excluding comment lines (5005), with the comment line ratio (10086) • Excluding blank lines (5006), with the blank line ratio (10087) *1 The above sizes are mandatory if FP sizes are unavailable. Write the above sizes if both SLOC sizes and FP sizes are available. *2 The SLOC size is measured as the number of lines, not the number of kilo lines.
( ) SLOCs Comment lines: a: Included. b: Excluded.
If comment lines are included, write the comment line ratio in multiples of 5%, for example, 25%.
Blank lines: a: Included. b: Excluded. If blank lines are included, write the blank line ratio in multiples of 5%, for example, 25%.
11003_Actual SLOC size (Existing)
Write the actual SLOC size of the existing system if data item 5004 has a valid value. * The SLOC size is measured as the number of lines, not the number of kilo lines.
( ) SLOCs
11004_Actual SLOC size (added/developed)
Write the actual SLOC size of the added or developed part if data item 5004 has a valid value. * The SLOC size is measured as the number of lines, not the number of kilo lines.
( ) SLOCs
11005_Actual SLOC size (change)
Write the actual SLOC size of the changed part if data item 5004 has a valid value. * The SLOC size is measured as the number of lines, not the number of kilo lines.
( ) SLOCs
11006_Actual SLOC size (deletion)
Write the actual SLOC size of the deleted part if data item 5004 has a valid value. * The SLOC size is measured as the number of lines, not the number of kilo lines.
( ) SLOCs
11011_Planned SLOC size (Existing)
Write the planned SLOC size of the existing system if data item 5004 has a valid value. * The SLOC size is measured as the number of lines, not the number of kilo lines.
( ) SLOCs
11012_Planned SLOC size (added/developed)
Write the planned SLOC size of the added or developed part if data item 5004 has a valid value. * The SLOC size is measured as the number of lines, not the number of kilo lines.
( ) SLOCs
11013_Planned SLOC size (change)
Write the planned SLOC size of the changed part if data item 5004 has a valid value. * The SLOC size is measured as the number of lines, not the number of kilo lines.
( ) SLOCs
11014_Planned SLOC size (deletion)
Write the planned SLOC size of the deleted part if data item 5004 has a valid value. * The SLOC size is measured as the number of lines, not the number of kilo lines.
( ) SLOCs
5007-5021, 10001-10005, 10088-10097_Per-language actual SLOC size
Write the SLOC sizes of the top five programming languages from the most frequently used language to the lesser ones if more than one programming language was used. • Programming language (10001-10005) • SLOC size (5007, 5010, 5013, 5016, 5019) • Comment lines (5008, 5011, 5014, 5017, 5020) • Comment line ratio (10088, 10090, 10092, 10094, 10096) • Blank lines (5009, 5012, 5015, 5018, 5021) • Blank line ratio (10089, 10091, 10093, 10095, 10097)
a: Language ( ), ( ) SLOCs
b: Language ( ), ( ) SLOCs
c: Language ( ), ( ) SLOCs
d: Language ( ), ( ) SLOCs
e: Language ( ), ( ) SLOCs
Choose the following alternatives for each language. • Comment lines: a: Included. b: Excluded.
If comment lines are included, write the comment line ratio in multiples of 5%, for example, 25%.
• Blank lines: a: Included. b: Excluded. If blank lines are included, write the blank line ratio in multiples of 5%, for example, 25%.
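(Information) The comment line and blank line ratios above are written in multiples of 5%. One way to compute them, assuming rounding to the nearest multiple of 5% (the definitions do not state a rounding rule) and a naive "#"-prefix comment check for illustration only:

```python
def line_ratios(source: str):
    """Return (total lines, comment-line ratio %, blank-line ratio %),
    with both ratios rounded to the nearest multiple of 5 %.
    Comment detection is a naive '#'-prefix check, for illustration."""
    lines = source.splitlines()
    total = len(lines)
    blank = sum(1 for line in lines if not line.strip())
    comment = sum(1 for line in lines if line.strip().startswith("#"))

    def to_5pct(n):
        # Round the percentage to the nearest multiple of 5.
        return 5 * round(100.0 * n / total / 5) if total else 0

    return total, to_5pct(comment), to_5pct(blank)

src = "# header\nx = 1\n\ny = 2\n# note\n"
print(line_ratios(src))  # (5, 40, 20)
```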
11017_Inclusiveness of Existing SLOC size
The flag indicating whether or not the SLOC size of the existing system is included in the value of data item 5004_Actual SLOC size if data item 103_Project type has the value “b” (Maintenance/support) or “d” (Enhancement).
0: Unknown. 1: Included. 2: Excluded.
Volume of design documents (Actual)
5090_Development plan document
The actual number of pages of the development plan document. ( ) pages
5091_Requirements definition document
The actual number of pages of the requirement definition document. ( ) pages
5092_Basic design document The actual number of pages of the basic design document. ( ) pages
5093_Detailed design document The actual number of pages of the detailed design document. ( ) pages
Other size indicators
5094_Number of DFD data items The number of Data Flow Diagram (DFD) data items. ( )
5095_Number of DFD processes The number of DFD processes. ( )
5096_Number of DB tables The number of database tables. ( )
5097_Number of GUI screen types
The number of Graphical User Interface (GUI) screen types. ( )
5098_Number of report formats The number of report formats. ( )
5099_Number of batch programs The number of batch programs. ( )
5100-5102_Number of use-cases The number of use-cases classified in the following three grades: simple (5100), typical (5101), complex (5102).
Simple : ( ). Typical : ( ). Complex : ( ).
5103-5105_Number of actors The number of actors classified in the following three grades: simple (5103), typical (5104), complex (5105).
Simple : ( ). Typical : ( ). Complex : ( ).
(8) Development schedule
Data Name Definitions Allowable Values
5123-5148_ Planned work period of phases
The planned beginning date and the planned completion date of each development phase, or the duration in months (to one decimal place) obtained by “subtracting the planned beginning date of each development phase from its planned completion date”. Write the total planned months of all tasks that do not fall into any defined category.
The beginning date and completion date in the form yy/mm/dd (dd is optional), or the duration in months. • Beginning date (yy/mm/[dd]), and
completion date (yy/mm/[dd]) • ( ) months
5150-5175_ Actual work period of phases
The actual beginning date and the actual completion date of each development phase, or the duration in months (to one decimal place) obtained by “subtracting the actual beginning date of each development phase from its actual completion date”. Write the total actual months of all tasks that do not fall into any defined category.
The beginning date and completion date in the form yy/mm/dd (dd is optional), or the duration in months. • Beginning date (yy/mm/[dd]), and
completion date (yy/mm/[dd]) • ( ) months
5122, 5131, 5140_ Planned work period of whole project
The planned beginning date and the planned completion date of the project, or the duration in months (to one decimal place) obtained by “subtracting the planned beginning date of the project from its planned completion date”. The beginning date of a project refers to the date on which effort of the project originates for the first time. The completion date of a project refers to the date on which effort of the project ceases to originate.
The beginning date and completion date in the form yy/mm/dd (dd is optional), or the duration in months. • Beginning date (yy/mm/[dd]), and
completion date (yy/mm/[dd]) • ( ) months
5149, 5158, 5167_ Actual work period of whole project
The actual beginning date and the actual completion date of the project, or the duration in months (to one decimal place) obtained by subtracting the actual beginning date of the project from its actual completion date. The beginning date of a project refers to the date on which effort of the project originates for the first time. The completion date of a project refers to the date on which effort of the project ceases to originate. For example, the completion date of inspection by the ordering company or the delivery date.
The beginning date and completion date in the form yy/mm/dd (dd is optional), or the duration in months. • Beginning date (yy/mm/[dd]), and
completion date (yy/mm/[dd]) • ( ) months
806_Idling duration
The duration in which the project remained idle (for example, waiting for a customer signature or for the arrival of test data). The active duration of the project is obtained by subtracting the idling duration from the work period of the whole project.
( ) months
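(Information) The work period items above allow the duration in months (to one decimal place) in place of explicit dates. A sketch of one plausible conversion; the average-month divisor (365.25 / 12 days) is our assumption, as the definitions do not fix one:

```python
from datetime import date

def duration_months(begin: date, end: date) -> float:
    """Work period in months, to one decimal place, obtained by
    subtracting the beginning date from the completion date.
    Assumes an average month of 365.25 / 12 days (not specified
    by the data item definitions)."""
    return round((end - begin).days / (365.25 / 12), 1)

whole = duration_months(date(2006, 4, 1), date(2006, 12, 15))
print(whole)                  # 8.5
# Active duration = whole work period minus 806_Idling duration.
print(round(whole - 0.5, 1))  # 8.0
```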
(9) Effort (cost)
Data Name Definitions Allowable Values
901_Unit of effort Person-hours or person-months. a: Person-hours. b: Person-months.
902_Conversion ratio between person-months and person-hours
The ratio for conversion from person-month values to person-hour values. Write how many person-hours equal one person-month. Example: 160 person-hours/person-month.
• 1 person-month = ( ) person-hours.
5106-5113_Development phases involved in the total project effort
Mark the phases involved in the project (from "development planning" to "acceptance test"). Use the following symbols:
: The project had actual tasks in this phase, and associated data such as effort are written for this phase.
: The project had no task in this phase.
⇒: The project had actual tasks in this phase, but associated data such as effort are included in another phase's data and not written here.
If data sets of multiple phases are aggregated to manage those phases as one phase, or for convenience, associate the aggregated data with the most downstream phase. Example: If the data sets of the basic design, detailed design, and implementation phases are aggregated, mark the basic design and detailed design phases with ⇒ and the implementation phase with the first symbol.
• Development planning ( ) • Requirements definition ( ) • Basic design ( ) • Detailed design ( ) • Implementation ( ) • Integration test ( ) • System test ( ) • Acceptance test ( )
Actual in-house effort
Actual effort provided by staff including temporary staff who worked with the regular staff. (a) Development effort (5176-5184, 10130) (b) Management effort (5185-5193, 10131) (c) Other effort (10006-10014, 10132): The actual
effort that does not fall into the development or management category. Example: Infrastructure-building, operation environment preparation, system migration, operation support, and consulting.
(d) Out-of-category effort (5194, 10133-10141): The actual effort that does not fall into any defined category.
*: Write effort on a per-phase basis. The total project effort is calculated automatically. *: For the "out-of-category phase", write the actual effort of those tasks that do not fall into any defined category.
(a) Development ( ). (b) Management ( ). (c) Other effort ( ). (d) Out-of-category effort ( ).
Actual review effort
The actual effort of in-house reviews. (Part of in-house effort) *: Per-phase (5206-5213, 10146) *: The review effort of the whole project (5205) is calculated automatically.
( )
Actual number of reviews
The number of times the project held a review. *: Per-phase (5215-5222, 10147) *: The number of reviews of the whole project (5214) is calculated automatically.
( ) times.
Number of issues pointed out in reviews
The number of issues pointed out in reviews. *: Per-phase (5249, 5250, 10078-10083, 10150) *: The number of issues of the whole project (10077) is calculated automatically.
( ) issues.
Outsourced effort
The amount of outsourced effort. (Not included in the in-house effort.) *: Per-phase (5196-5203, 10145) *: The outsourced effort of the whole project (5195) is calculated automatically.
( )
Use of outsourced effort (10033-10040, 10144)
Whether or not the project used outsourced effort. This data item is assigned the symbol “ ” automatically if the amount of outsourced effort is assigned a valid value.
< Automatic entry >
5204_The ratio of expenditure on outsourced work
Write the ratio of the expenditure on outsourced work to the total cost if the amount of outsourced effort is unknown.
( ) %
Average number of regular staff (5223-5231)
The average number of regular staff. ( ) persons
Number of regular staff at peak time (5232-5240)
The number of regular staff at the peak. ( ) persons
Average number of outsourced staff (10059-10067)
The average number of outsourced staff. ( ) persons
Number of outsourced staff at peak time (10068-10076)
The number of outsourced staff at the peak. ( ) persons
11015_Planned effort of whole project (basic design)
The planned effort of the whole project estimated at the beginning of basic design (including in-house and outsourced effort).
( )
11016_Planned effort of whole project (detailed design)
The planned effort of the whole project estimated at the beginning of detailed design (including in-house and outsourced effort).
( )
(10) Quality
Data Name Definitions Allowable Values
Defects identified after system cutover
5267-5270, 10112-10115_Total number of defects identified after system cutover
The number of defects reported after system cutover (after the beginning of in-service operation), classified into the number of failures and the number of faults. Each of these numbers is accumulated over the designated period after cutover (1, 3, or 6 months). *1: For example, if the system has been operating for just 5 months, write the number of defects identified in 1 month after cutover and the number identified in 3 months after cutover. *2: Do not write these numbers if the date of system cutover is unknown.
The number of defects identified in 1, 3, or 6 months after system cutover. • 1 month: ( ) failures and ( ) faults. • 3 months: ( ) failures and ( ) faults. • 6 months: ( ) failures and ( ) faults. * You can write defect counts for more than one month-period.
5255-5266, 10100-10111_ Number of defects identified after system cutover (per-degree-of- criticalness)
The number of defects reported after system cutover, classified into the number of failures and the number of faults per degree of criticalness. The degrees of criticalness:
• Very critical: The defect causes damage to the customer, and quick countermeasures have to be taken.
• Critical: The defect causes no damage to the customer, but quick countermeasures have to be taken.
• Insignificant: The defect causes no damage to the customer, and there is no need for quick countermeasures.
Each of these numbers is accumulated over the designated period after cutover (1, 3, or 6 months). *1: For example, if the system has been operating for just 5 months, write the number of defects identified in 1 month after cutover and the number identified in 3 months after cutover. *2: Do not write these numbers if the date of system cutover is unknown.
The number of defects identified in 1, 3, or 6 months after system cutover. • 1 month: ( ) failures and ( ) faults
Very critical: ( ) failures and ( ) faults. Critical: ( ) failures and ( ) faults. Insignificant: ( ) failures and ( ) faults.
• 3 months: ( ) failures and ( ) faults. Very critical: ( ) failures and ( ) faults. Critical: ( ) failures and ( ) faults. Insignificant: ( ) failures and ( ) faults.
• 6 months: ( ) failures and ( ) faults. Very critical: ( ) failures and ( ) faults. Critical: ( ) failures and ( ) faults. Insignificant: ( ) failures and ( ) faults.
* You can write defect counts for more than one month-period and for more than one degree of criticalness.
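(Information) Note *1 above implies that a cumulative defect count is reportable only for periods through which the system has already operated. A small sketch of that rule (the function name is ours):

```python
def reporting_windows(months_in_service: float):
    """Cumulative defect-count windows (1, 3, and 6 months after
    cutover) that can already be reported: a window qualifies only
    once the system has been in service at least that long."""
    return [m for m in (1, 3, 6) if months_in_service >= m]

print(reporting_windows(5))   # [1, 3]  -- matches the *1 example
print(reporting_windows(12))  # [1, 3, 6]
```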
Per-test-phase number of test cases
5251, 1005_ Integration test
Number of test cases for integration test (5251), and the definition of test cases for integration test (1005).
• Number of test cases: ( ) cases • Supplementary information on the definition of test case count (optional)
5252, 1005_System test
Number of test cases for system test (5252), and the definition of test cases for system test (1005).
• Number of test cases: ( ) cases • Supplementary information on the definition of test case count (optional)
Per-test-phase number of identified software defects
5253, 10098, 1007_Integration test
The number of failures caused by software defects (5253), the number of faults caused by software defects (10098), and the definition of bug count (1007).
• The number of identified software defects Number of failures: ( ). Number of faults: ( ).
• Supplementary information on the definition of software bug count (optional).
5254, 10099, 1007_System test
The number of failures caused by software defects (5254), the number of faults caused by software defects (10099), and the definition of bug count (1007).
• The number of identified software defects Number of failures: ( ). Number of faults: ( ).
• Supplementary information on the definition of software bug count (optional).
5241_Personnel assignment for quality assurance
Personnel assignment to quality assurance tasks during system development. * The per-phase data items from 5242 to 5248 of the Data Item Definitions Version 1.0 were obsoleted and do not exist in the Data Item Definitions Version 2.0 or later.
a: Project members were assigned to quality assurance tasks.
b: Special members were dedicated to quality assurance tasks.
c: No member was assigned to quality assurance tasks.
1010_Personnel assignment to testing
Personnel assignment to testing tasks.
a: The members assigned to testing tasks were enough in number and had enough skills.
b: The members assigned to testing tasks had enough skills, but they were not enough in number.
c: The members assigned to testing tasks were enough in number, but they had insufficient skills.
d: The members assigned to testing tasks were not enough in number and had insufficient skills.
1011_Existence of quantitative delivery quality standards
Did the project have quantitative delivery quality standards? a: Yes (description). b: No.
1013_Existence of third-party reviews
Did the project hold third-party reviews? * The third party refers to those who are not formal members of the project (for example, quality assurance staff and PMO).
a: Yes. b: No.
Appendix A: Data Item Definitions
IPA/SEC White Paper 2007 on Software Development Projects in Japan 271
A.3 Industry Classification
The following table lists the industry classification applied to the collected data. Major industry types are labeled with capital letters (A, B, C, ...), and each major type has one or more
subtypes labeled with 2-digit numbers (01, 02, ... ).
The classification follows the Japan Standard Industrial Classification (revised March 2002; applied to surveys in October 2002 and later), quoted from the Web site of the Statistics Bureau of the Ministry of Internal Affairs and Communications at http://www.stat.go.jp/index/seido/sangyo/
A - AGRICULTURE
  01 AGRICULTURE
B - FORESTRY
  02 FORESTRY
C - FISHERIES
  03 FISHERIES
  04 AQUACULTURE
D - MINING
  05 MINING
E - CONSTRUCTION
  06 CONSTRUCTION WORK, GENERAL, INCLUDING PUBLIC AND PRIVATE CONSTRUCTION WORK
  07 CONSTRUCTION WORK BY SPECIALIST CONTRACTOR, EXCEPT EQUIPMENT INSTALLATION WORK
  08 EQUIPMENT INSTALLATION WORK
F - MANUFACTURING
  09 MANUFACTURE OF FOOD
  10 MANUFACTURE OF BEVERAGES, TOBACCO AND FEED
  11 MANUFACTURE OF TEXTILE MILL PRODUCTS, EXCEPT APPAREL AND OTHER FINISHED PRODUCTS MADE FROM FABRICS AND SIMILAR MATERIALS
  12 MANUFACTURE OF APPAREL AND OTHER FINISHED PRODUCTS MADE FROM FABRICS AND SIMILAR MATERIALS
  13 MANUFACTURE OF LUMBER AND WOOD PRODUCTS, EXCEPT FURNITURE
  14 MANUFACTURE OF FURNITURE AND FIXTURES
  15 MANUFACTURE OF PULP, PAPER AND PAPER PRODUCTS
  16 PRINTING AND ALLIED INDUSTRIES
  17 MANUFACTURE OF CHEMICAL AND ALLIED PRODUCTS
  18 MANUFACTURE OF PETROLEUM AND COAL PRODUCTS
  19 MANUFACTURE OF PLASTIC PRODUCTS, EXCEPT OTHERWISE CLASSIFIED
  20 MANUFACTURE OF RUBBER PRODUCTS
  21 MANUFACTURE OF LEATHER TANNING, LEATHER PRODUCTS AND FUR SKINS
  22 MANUFACTURE OF CERAMIC, STONE AND CLAY PRODUCTS
  23 MANUFACTURE OF IRON AND STEEL
  24 MANUFACTURE OF NON-FERROUS METALS AND PRODUCTS
  25 MANUFACTURE OF FABRICATED METAL PRODUCTS
  26 MANUFACTURE OF GENERAL MACHINERY
  27 MANUFACTURE OF ELECTRICAL MACHINERY, EQUIPMENT AND SUPPLIES
  28 MANUFACTURE OF INFORMATION AND COMMUNICATION ELECTRONICS EQUIPMENT
  29 ELECTRONIC PARTS AND DEVICES
  30 MANUFACTURE OF TRANSPORTATION EQUIPMENT
  31 MANUFACTURE OF PRECISION INSTRUMENTS AND MACHINERY
  32 MISCELLANEOUS MANUFACTURING INDUSTRIES
G - ELECTRICITY, GAS, HEAT SUPPLY AND WATER
  33 PRODUCTION, TRANSMISSION AND DISTRIBUTION OF ELECTRICITY
  34 MANUFACTURE OF GAS
  35 HEAT SUPPLY
  36 COLLECTION, PURIFICATION AND DISTRIBUTION OF WATER, AND SEWAGE COLLECTION, PROCESSING AND DISPOSAL
H - INFORMATION AND COMMUNICATIONS
  37 COMMUNICATIONS
  38 BROADCASTING
  39 INFORMATION SERVICES
  40 INTERNET BASED SERVICES
  41 VIDEO PICTURE, SOUND INFORMATION, CHARACTER INFORMATION PRODUCTION AND DISTRIBUTION
I - TRANSPORT
  42 RAILWAY TRANSPORT
  43 ROAD PASSENGER TRANSPORT
  44 ROAD FREIGHT TRANSPORT
  45 WATER TRANSPORT
  46 AIR TRANSPORT
  47 WAREHOUSING
  48 SERVICES INCIDENTAL TO TRANSPORT
J - WHOLESALE AND RETAIL TRADE
  49 WHOLESALE TRADE, GENERAL MERCHANDISE
  50 WHOLESALE TRADE (TEXTILE AND APPAREL)
  51 WHOLESALE TRADE (FOOD AND BEVERAGES)
  52 WHOLESALE TRADE (BUILDING MATERIALS, MINERALS AND METALS, ETC.)
  53 WHOLESALE TRADE (MACHINERY AND EQUIPMENT)
  54 MISCELLANEOUS WHOLESALE TRADE
  55 RETAIL TRADE, GENERAL MERCHANDISE
  56 RETAIL TRADE (DRY GOODS, APPAREL AND APPAREL ACCESSORIES)
  57 RETAIL TRADE (FOOD AND BEVERAGES)
  58 RETAIL TRADE (MOTOR VEHICLES AND BICYCLES)
  59 RETAIL TRADE (FURNITURE, HOUSEHOLD UTENSIL AND HOUSEHOLD APPLIANCE)
  60 MISCELLANEOUS RETAIL TRADE
K - FINANCE AND INSURANCE
  61 BANKING
  62 FINANCIAL INSTITUTIONS FOR COOPERATIVE ORGANIZATIONS
  63 INSTITUTIONS DEALING WITH POSTAL SAVINGS, GOVERNMENT-RELATED FINANCIAL INSTITUTIONS
  64 NON-DEPOSIT MONEY CORPORATIONS ENGAGED IN THE PROVISION OF FINANCE, CREDIT AND INVESTMENT
  65 SECURITIES AND FUTURES COMMODITY DEALING ACTIVITIES
  66 FINANCIAL AUXILIARIES
  67 INSURANCE INSTITUTIONS, INCLUDING INSURANCE AGENTS, BROKERS AND SERVICES
L - REAL ESTATE
  68 REAL ESTATE AGENCIES
  69 REAL ESTATE LESSORS AND MANAGERS
M - EATING AND DRINKING PLACES, ACCOMMODATIONS
  70 GENERAL EATING AND DRINKING PLACES
  71 SPREE EATING AND DRINKING PLACES
  72 ACCOMMODATIONS
N - MEDICAL, HEALTH CARE AND WELFARE
  73 MEDICAL AND OTHER HEALTH SERVICES
  74 PUBLIC HEALTH AND HYGIENE
  75 SOCIAL INSURANCE AND SOCIAL WELFARE
O - EDUCATION, LEARNING SUPPORT
  76 SCHOOL EDUCATION
  77 MISCELLANEOUS EDUCATION, LEARNING SUPPORT
P - COMPOUND SERVICES
  78 POSTAL SERVICES, EXCEPT OTHERWISE CLASSIFIED
  79 COOPERATIVE ASSOCIATIONS, N.E.C.
Q - SERVICES, N.E.C.
  80 PROFESSIONAL SERVICES, N.E.C.
  81 SCIENTIFIC AND DEVELOPMENT RESEARCH INSTITUTES
  82 LAUNDRY, BEAUTY AND BATH SERVICES
  83 MISCELLANEOUS LIVING-RELATED AND PERSONAL SERVICES
  84 SERVICES FOR AMUSEMENT AND HOBBIES
  85 WASTE DISPOSAL BUSINESS
  86 AUTOMOBILE MAINTENANCE SERVICES
  87 MACHINE, ETC. REPAIR SERVICES, EXCEPT OTHERWISE CLASSIFIED
  88 GOODS RENTAL AND LEASING
  89 ADVERTISING
  90 MISCELLANEOUS BUSINESS SERVICES
  91 POLITICAL, BUSINESS AND CULTURAL ORGANIZATIONS
  92 RELIGION
  93 MISCELLANEOUS SERVICES
R - GOVERNMENT, N.E.C.
  94 FOREIGN GOVERNMENTS AND INTERNATIONAL AGENCIES IN JAPAN
  95 NATIONAL GOVERNMENT SERVICES
  96 LOCAL GOVERNMENT SERVICES
S - INDUSTRIES UNABLE TO CLASSIFY
  99 INDUSTRIES UNABLE TO CLASSIFY
A.4 Derived indicators Names and Definitions
Appendix A.4 lists the names and definitions of the derived indicators computed from the data items defined in Appendix A.2.
* The term “derived indicators” used in this White Paper corresponds to the “derived measurement quantity” in ISO/IEC 15939, Software Measurement Process.
Category Name Definition
Actual net SLOC size The SLOC size that excludes the SLOC size of comment lines and blank lines. The SLOC size of comment lines and blank lines is obtained by using the ratio of comment lines (10086_Actual SLOC size_comment line ratio) and the ratio of blank lines (10087_Actual SLOC size_blank line ratio). The net SLOC size is obtained by subtracting the SLOC size of comment lines and blank lines from the SLOC size defined as 5004_Actual SLOC size_SLOC. This White Paper uses the terms SLOC size, net SLOC size, and actual net SLOC size interchangeably. The unit KSLOC is used to represent multiples of one thousand SLOCs.
Actual SLOC size_enhancement The SLOC size of a project of type b (maintenance/support) or type d (enhancement), excluding the SLOC size of the existing system to which the enhancement was made. The value of Actual SLOC size_enhancement is calculated as shown below.
(1) If 11003_Actual SLOC size (existing) + 11004_Actual SLOC size (addition/development) + 11005_Actual SLOC size (modification) + 11006_Actual SLOC size (removal) > 0, then Actual SLOC size_enhancement = 11004_Actual SLOC size (addition/development) + 11005_Actual SLOC size (modification) + 11006_Actual SLOC size (removal).
(2) If 11003_Actual SLOC size (existing), 11004_Actual SLOC size (addition/development), 11005_Actual SLOC size (modification), and 11006_Actual SLOC size (removal) have no value, and if 11017_existing SLOC size inclusion = 1, then Actual SLOC size_enhancement = 5004_Actual SLOC size_SLOC.
Note: If the four data items from 11003 to 11006 have no value, and if 11017_existing SLOC size inclusion = 0 or 2, Actual SLOC size_enhancement is not calculated.
Actual net SLOC size_enhancement Actual net SLOC size_enhancement equals Actual SLOC size_enhancement that excludes comment lines and blank lines as shown below. Actual net SLOC size_enhancement = Actual SLOC size_enhancement – the SLOC size of comment lines and blank lines The SLOC size of comment lines and blank lines is calculated by using the ratio of comment lines (10086_Actual SLOC size_comment line ratio) and the ratio of blank lines (10087_Actual SLOC size_blank line ratio).
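As an illustrative sketch (the function names are ours, not part of the White Paper's tooling, and the comment-line and blank-line ratios of items 10086 and 10087 are assumed to be expressed as fractions of the total SLOC size), the subtractions above can be written as:

```python
def actual_net_sloc(sloc, comment_ratio, blank_ratio):
    """Actual net SLOC size: subtract comment and blank lines from the
    actual SLOC size (item 5004), using the comment-line ratio (10086)
    and blank-line ratio (10087), assumed to be fractions of total SLOC."""
    return sloc - sloc * (comment_ratio + blank_ratio)


def enhancement_sloc(existing, added, modified, removed):
    """Actual SLOC size_enhancement, case (1): the sum of the added,
    modified, and removed SLOC sizes when any of items 11003-11006 is
    reported; the existing size itself is excluded from the sum."""
    if existing + added + modified + removed > 0:
        return added + modified + removed
    return None  # not calculable from these four items alone
```

For example, a 120,000-SLOC system with 20% comment lines and 5% blank lines has 90,000 net SLOC.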
Data functions The value of data functions equals the sum of the values of two data items measured by the IFPUG method as shown below. Data functions = 5057_Actual ILF_FP + 5065_Actual EIF_FP
Size
Transactional functions The value of transactional functions equals the sum of the values of three data items measured by the IFPUG method as shown below. Transactional functions = 5053_Actual EI + 5041_Actual EO + 5049_Actual EQ
Actual months_whole project Actual months_whole project equals the value of 5167_Actual work period of whole project. If 5167_Actual work period of whole project has no value, use 10128_Months (actual)_whole project (stated value) instead.
Development schedule
Actual months (Major-development phases)
The amount that is obtained by dividing the duration in days from the beginning date of the Major-development phase period to the end of the period by 30 days per month. This calculation uses 5165_Completion date of system test (actual) and 5152_Beginning date of basic design (actual).
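For illustration, the 30-day-month conversion above can be sketched as follows (a hypothetical helper; the item numbers in the comment refer to the data items named in the definition):

```python
from datetime import date


def actual_months_major(basic_design_start, system_test_end):
    """Actual months (Major-development phases): the span in days from
    the beginning of basic design (item 5152) to the completion of
    system test (item 5165), divided by 30 days per month."""
    return (system_test_end - basic_design_start).days / 30
```

For example, a period from 2006-01-01 to 2006-07-30 (210 days) counts as 7.0 months.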
Actual effort (Major-development phases)
The sum of effort in person-hours that was used in all five phases from basic design to system test and the effort used for every task that does not fall into any defined category. See note 1 presented under this table. The value of Actual effort (Major-development phases) is calculated only for projects that went through all five phases. The effort consists of staff effort (development, management, other, and out-of-category) and outsourced effort.
Effort
Actual effort (whole project) The sum of effort in person-hours that was used in the whole project duration from development planning to acceptance test and the effort used for every task that does not fall into any defined category. The effort consists of staff effort (development, management, other, and out-of-category) and outsourced effort.
The ratio of outsourcing The ratio of outsourcing equals the ratio of outsourced effort (see the next data item). If the ratio of outsourced effort cannot be calculated, use 5204_Actual outsourcing (expenditure ratio) instead.
The ratio of outsourced effort The amount obtained by dividing the outsourced effort (the sum of outsourced effort used in all five phases from basic design to system test and in every task that does not fall into any defined category) by Actual effort (Major-development phases), as shown below. The ratio of outsourced effort = Outsourced effort ÷ Actual effort (Major-development phases). The ratio of outsourced effort equals 0% if the outsourced effort is explicitly stated as zero.
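A sketch of the fallback between the effort-based and expenditure-based ratios (the function and parameter names are illustrative, not defined by the White Paper):

```python
def outsourcing_ratio(outsourced_effort, major_phase_effort,
                      expenditure_ratio=None):
    """The ratio of outsourcing: prefer the effort-based ratio
    (outsourced effort / Actual effort, Major-development phases);
    fall back to the expenditure-based ratio (item 5204) when the
    effort-based one cannot be calculated."""
    if outsourced_effort is not None and major_phase_effort:
        # An explicitly reported zero yields 0.0, per the definition.
        return outsourced_effort / major_phase_effort
    return expenditure_ratio
```

For example, 800 outsourced person-hours against 1,600 person-hours of major-phase effort gives a ratio of 50%.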
The ratio of basic design effort The ratio of effort used in the basic design phase to actual effort used in all five phases. The ratio of basic design effort = Actual basic design effort ÷ Actual effort (Major-development phases)
FP productivity FPs per person-hour FP productivity = 5001_Actual FP size_unadjusted ÷ Actual effort (Major-development phases)
SLOC productivity SLOCs per person-hour SLOC productivity = Actual net SLOC size ÷ Actual effort (Major-development phases)
Productivity
SLOC productivity_enhancement SLOCs per person-hour SLOC productivity_enhancement = Actual net SLOC size_enhancement ÷ Actual effort (Major-development phases) Actual net SLOC size_enhancement excludes the SLOC size of the existing system to which enhancement was made. Note that the actual effort used in this calculation may include effort for tasks done to the existing system.
Number of defects identified after system cutover
The number of defects identified after system cutover equals the number of defects identified after system cutover (faults) described in the next row. If the number of defects identified after system cutover (faults) is unavailable, use the number of defects identified after system cutover (failures) instead.
Number of defects identified after system cutover (faults)
The number of faults identified after system cutover. The value of this derived indicator is selected from the following data items by choosing those with a non-zero fault count and then, among them, the one with the longest count period. • 10112_Number of faults identified after system cutover (total)_1 month • 10113_Number of faults identified after system cutover (total)_3 month • 10114_Number of faults identified after system cutover (total)_6 month
Number of defects identified after system cutover (failures)
The number of failures identified after system cutover. The value of this derived indicator is selected from the following data items by choosing those with a non-zero failure count and then, among them, the one with the longest count period. • 5267_Number of failures identified after system cutover (total)_1 month • 5268_Number of failures identified after system cutover (total)_3 month • 5269_Number of failures identified after system cutover (total)_6 month
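The selection rule shared by the fault-based and failure-based counts (pick the non-zero count with the longest observation period) can be sketched as:

```python
def defects_after_cutover(counts_by_months):
    """counts_by_months maps the observation period in months (1, 3, 6)
    to a defect count, or None when that period was not reported.
    Return the non-zero count with the longest period, or None."""
    for months in (6, 3, 1):  # longest period first
        count = counts_by_months.get(months)
        if count:  # present and non-zero
            return count
    return None
```

For example, a project reporting 2 faults at 1 month, 5 at 3 months, and 0 at 6 months yields 5 (the 3-month count is the longest non-zero period).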
Defect density per FP The number of identified defects per FP. Defect density per FP = Number of defects identified after system cutover ÷ 5001_Actual FP size (unadjusted)
Reliability
Defect density per SLOC The number of identified defects per KSLOC. Defect density per SLOC = (Number of defects identified after system cutover ÷ Actual net SLOC size) × 1,000
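The two density formulas, with the per-KSLOC scaling made explicit (an illustrative sketch; the function names are ours):

```python
def defect_density_per_fp(defects, fp_unadjusted):
    """Defects per function point (item 5001, unadjusted FP size)."""
    return defects / fp_unadjusted


def defect_density_per_ksloc(defects, net_sloc):
    """Defects per thousand net source lines: the per-SLOC ratio is
    multiplied by 1,000 to express it per KSLOC."""
    return defects / net_sloc * 1000
```

For example, 12 post-cutover defects in a 60,000-net-SLOC system give a density of 0.2 defects per KSLOC.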
Personnel assignment
Number of staff per month Number of staff per month = Actual effort (Major-development phases) ÷ Actual months (Major-development phases) ÷ Conversion ratio between person-months and person-hours. The conversion ratio equals 902_Conversion ratio among person-month and person-hour if 901_Unit of effort has the value “b” (person-months), and 160 if 901_Unit of effort has the value “a” (person-hours).
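With the conversion ratio described above (160 person-hours per person-month when effort is reported in person-hours), the head-count estimate can be sketched as (function name is illustrative):

```python
def staff_per_month(effort_person_hours, months, hours_per_month=160):
    """Number of staff per month: effort over the major-development
    phases, divided by its duration in months, converted from
    person-hours using the person-month conversion ratio (item 902,
    or the default 160 hours when item 901 is 'a': person-hours)."""
    return effort_person_hours / months / hours_per_month
```

For example, 4,800 person-hours over 6 months corresponds to 5 staff on average.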
Primary stratification categories
The category of target platform type Windows category or Unix category, depending on 309_Target platform_1, 309_Target platform_2, and 309_Target platform_3.
“Windows category”: a: Windows 95, 98, or Me. b: Windows NT, 2000, or XP. c: Windows Server 2003.
“Unix category”: d: HP-UX, e: HI-UX, f: AIX, g: Solaris, h: Redhat Linux, i: SUSE Linux, j: Miracle Linux, k: Turbo Linux, l: Other Linux, m: Linux, n: Other Unix.
“Other”: Any platform that does not fall into the types from a to n.
Primary programming language group (* A group of project data sets whose primary programming languages match any of the designated languages. If 312_Primary programming language_1, 312_Primary programming language_2, or 312_Primary programming language_3 of a project data set matches any of the designated languages, the data set belongs to the group.)
The primary programming language group consists of project data sets whose 312_Primary programming language_1, 312_Primary programming language_2, or 312_Primary programming language_3 matches any of the designated languages (COBOL, C, VB, and Java in this case). Member data sets of the group make up another data set. Whether a project data set belongs to the group is determined by searching for the first match, starting with the project's 312_Primary programming language_1 and ending with 312_Primary programming language_3. If, for example, 312_Primary programming language_1 matches one of the designated languages, the search ends there.
match takes place at Java to finish the search here and the data set belongs to the group.
312_Primary programming language_1 = “a: Assembly language” 312_Primary programming language_2 = “c: PL/I” 312_Primary programming language_3 = “q: Java”
Example 2. If a project data set has the following data values, the first match takes place at VB to finish the search here and the data set belongs to the group.
Thus the language in question falls into the VB category, and no further check is made of 312_Primary programming language_2 and later. 312_Primary programming language_1 = “h: VB” 312_Primary programming language_2 = “g: C” 312_Primary programming language_3 = “a: Assembly language”
Example 3. If a project data set has the following data values, no match takes place and the data set does not belong to the group.
312_Primary programming language_1 = “c: PL/I” 312_Primary programming language_2 = “m: PL/SQL” 312_Primary programming language_3 = “a: Assembly language”
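The first-match search illustrated by the three examples above can be sketched as follows (an illustrative function; the language names are simplified from the form's alternative labels):

```python
DESIGNATED = frozenset({"COBOL", "C", "VB", "Java"})


def in_primary_language_group(lang1, lang2, lang3, designated=DESIGNATED):
    """Return True if the project belongs to the primary programming
    language group: scan 312_Primary programming language_1 to _3 in
    order and stop at the first designated language found."""
    for lang in (lang1, lang2, lang3):
        if lang in designated:
            return True  # first match ends the search
    return False
```

Mirroring the examples: ("Assembly language", "PL/I", "Java") belongs to the group, ("VB", "C", "Assembly language") matches at VB with no further check, and ("PL/I", "PL/SQL", "Assembly language") does not belong.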
Mixed FP measurement methods If a project does not distinguish the following FP measurement methods from each other, the FP measurement method of the project is categorized as Mixed FP measurement methods: IFPUG, SPR, NESMA estimated method, and proprietary FP measurement methods.
FP measurement method categories
IFPUG group The general term for the following FP size measurement methods: IFPUG, SPR, and NESMA estimated method.
*1 Elements of Actual effort (Major-development phases)
Actual effort (Major-development phases) is calculated for each project that went through all five phases from basic design to system test. The following table illustrates what types of effort constitute Actual effort (Major-development phases). Yellow cells of the table are the places to hold effort values that make up Actual effort (Major-development phases).
The Actual effort (Major-development phases) is obtained by adding up all the yellow-cell effort values and converting the sum of the values to a person-hour value.
The table columns are the phases: Development planning, Requirements definition, Basic design, Detailed design, Development, Integration test, System test, Acceptance test, and Out-of-phase. The Major-development phases span Basic design through System test.
The rows are the effort elements:
• [In-house] development effort
• [In-house] management effort
• [In-house] other effort
• [In-house] out-of-category effort
• [Outsourced] development effort
Appendix B Data Entry Form Version 2.3
Appendix B presents the Data Entry Form Version 2.3, which was used to collect project data presented in this White Paper.
Refer to Appendix A.2 for definitions of the data items listed in the entry form. Data items whose item number or item name is written in blue letters were modified after the issue of the White Paper 2006.
Data Entry Form (1/3)
Legend: Rose: Mandatory. Beige: Mandatory. Light yellow: Important. Light green: Recommended. Light blue: Alternative. Automatic entry (entry disabled).
Data Entry Form Ver.2.3
Copyright (C) 2005-2006 IPA SEC. All rights reserved.
Category Item No. (*) Choose an alternative
10084 Proprietary project ID
11001 Whole/sub flag (*) The flag that identifies whether the data belongs to the whole system project or a sub-system project.
11002 Grouping ID
10085 Reliability of company-evaluated project data (*) The reliability of project data.
103 Project type (*) The type of project (development or not)
104 The stability of the existing system (*)
105 Project category (*) The category of the project. ← Name for "Other"
106 Entrusted development working site (*) Choose one to three alternatives for the working site if "entrusted development" is chosen for item 105.
Software development (*) Infrastructure-building (*) Operational environment preparation(*)
System migration (*) Maintenance (*) Operation support (*)
107 Project purpose The primary purpose(s) of the project (multiple choice). Consulting (*) Project management (*) Quality assurance (*)
* Write "O" for every alternative that fits your project.
On-site environment preparation/adjustmentfor a running system (*) Customer training (*) Other (description)
108 New customer or old customer (*) Did the project serve a new customer or an old customer?
109 New business or not (*) Was the project aimed at a new industry or business or an old industry or business?
118 Source of outsourced workforce (*)
119 Outsourcing country
110 New subcontractors or not (*)
111 Using new technology or not (*) Whether or not the project used new technology.
112 How clearly were the responsibility and roles of project team members defined?
113 Clearness of goals and priority (*)
114 Working space (*) The working space for the project team.
115 Project environment (acoustic noise) (*) The level of acoustic noise in the working environment.
116 Project success_Self-evaluation (*)
120 Evaluation of planning (Cost) (*) Whether or not the cost planning was valid.
121 Evaluation of planning (Quality) (*) Whether or not the objectives of delivered quality were valid.
122 Evaluation of planning (Development schedule) (*) Whether or not the development schedule planning was valid.
123 Evaluation of results (Cost) (*) Evaluation of the results of cost planning.
124 Evaluation of results (Quality) (*) Evaluation of achievement of delivered quality objectives.
125 Evaluation of results (Development schedule) (*)
126 Reason for QCD objectives failure (*)
117 Subjective evaluation of customer satisfaction (*)
201 Industry type (*)
202 Business type (*) The type of business the developed system is used for. (Choose one to three alternatives.)
203 System applications (*) The application of the system developed by the project. (Choose one to three alternatives.)
204 User accessibility (*)
205 Number of users (persons)
206 Number of user sites The number of user sites where servers or other devices are installed. (sites)
207 User concurrency The maximum number of users who concurrently use the developed system. (persons)
301 Type of developed system (*) The type of the software developed by the project. ← Name for "Other."
302 Use of business application package (*)
303 First-time use of business application package (*)
304 Name of business software package
305 The functional size ratio of business software package (%)
306 Customization cost ratio of business software package (%)
307 Processing Mode (*) In what processing mode the developed system is used. (Choose one to three alternatives.)
308 Architecture (*)
309 Target platform (*) The primary operating system platform of the developed system. (Choose one to three alternatives.)
310 Use of Web technology (*) What kinds of Web technology did the project use? (Choose one to three alternatives.)
311 Online transaction processing system (*) The software used for online transaction processing. ← Name for "Other."
312 Primary programming language (1) (*) ← Language for "Other."
312 Primary programming language (2) (*) ← Language for "Other."
312 Primary programming language (3) (*) ← Language for "Other."
312 Primary programming language (4) (*) ← Language for "Other."
312 Primary programming language (5) (*) ← Language for "Other."
313 Use of DBMS (*) What kind of DBMS did the project use? (Choose one to three alternatives.)
401 Development life cycle model (*) Development life cycle model ← Name for "Other."
402 Use of operation support tool (*) Did the project use an operation support tool? ← Name for "Other."
403 Examined similar projects or not (*)
404 Use of project management tool (*) Did the project use a project management tool?
405 Use of configuration management tool (*) ← Write the name of the tool(s) if you choose Yes.
406 Use of design support tool (*) Did the project use a design support tool? ← Write the name of the tool(s) if you choose Yes.
407 Use of documentation tool (*) Did the project use a documentation tool? ← Write the name of the tool(s) if you choose Yes.
408 Use of debug/testing support tool (*) Did the project use a debug/testing support tool? ← Write the name of the tool(s) if you choose Yes.
409 Use of CASE tool (*) Did the project use an upstream or integrated CASE tool? ← Write the name of the tool(s) if you choose Yes.
411 Use of code generator (*) ← Write the name of the tool(s) if you choose Yes.
412 Application of Development Methods (*) The schematic development approach applied to the project. ← Name for "Other."
413 Re-use rate_development planning document Number of re-used pages/Number of total pages (%)
414 Re-use rate_requirements definition document Number of re-used pages/Number of total pages (%)
415 Re-use rate_basic design document Number of re-used pages/Number of total pages (%)
416 Re-use rate_detailed design document Number of re-used pages/Number of total pages (%)
The name of the software package(s) used for the project. This data item is valid if data item 302 has the value “a” (Yes). Ex: SAP, Oracle Applications
Free text, or alternatives
Choose an alternative for the stability of the existing system if "maintenance/service" or "enhancement" is chosen for item 103.
The number of users who use the developed system. This data item is valid if data item 204 has the value “a” (Accessible to limited users).
Whether or not the company of the project used the business software package(s) for the first time. This data item is valid if data item 302 has the value “a” (Yes).
Description
Whether or not the schedule planning was valid. Evaluate the schedule planning based on the state of delay in product delivery with respect to the delivery date specified by the customer.
The reason why the cost, quality, and development schedule (delivery date) objectives were not achieved. (For example, data item 123 has the value "c", "d", or "e.") Choose one to three alternatives.
(1) General Characteristics of Development Projects
(2) Project Applications
(3) System Characteristics
(4) Development Techniques
Data Item
Did the project use a code generator?
* If the code generator used is an in-house tool and making its name open is not allowed, write “In-house tool.”
How do you feel about the customer's satisfaction? Choose one alternative based on your own feeling.
The type of industry the developed system is used for, or the type of industry in which the project's customer works. (Choose one to three alternatives.)
Whether the system developed by the project is accessible to limited users or is open to the public.
Evaluate whether or not the project is an overall success with respect to QCD.
* A project is a success if its planning is appropriate and if its planned goals were achieved. A project that had no planning is a success if it ends up with desirable results.
Choose one to three alternatives if data item 118 has any value other than "e." (Keep consistency with item 118.)
Clearness of responsibility and roles of project team members (*)
Did the project use one or more business software packages?
# Except for in-house business software packages.
Make a rough estimation of the ratio of the total functional size of the used business software package(s) to the functional size of the whole developed system. This data item is valid if data item 302 has the value “a” (Yes).
The programming language primarily used.
*1 Up to 5 languages are selectable from the most frequently used one to lesser ones.
*2 Choose "w: Other" for unlisted languages such as CGI, Java applets, and EJB and write the names of the languages.
Did the project examine one or more similar past projects in the planning phase?
# Choose “b: No” if the project did not examine one or more similar projects that existed.
Did the project use a configuration management tool?
# Example configuration management tools: ClearCase, CVS, Subversion, PVCS, SCCS, VSS.
The ratio of customization cost of the used business software package(s) to the total cost of the package(s). This data item is valid if data item 302 has the value “a” (Yes).
The type of architecture of the developed system. (Up to three types are selectable from the largest size to smaller ones.)
The project identification assigned by the company that offered the project data. This data item is also used for sub-system identification. Example: 1-1, 1-2, … (IDs distinguishing sub-system projects from each other)
Assign the same group ID to member projects of the same group.
* Write a free text for this data item regardless of the choice made for the data item 11001.
Choose one to three alternatives if the project used outsourced workforce.
* An affiliate refers to a company that has capital transactions with another.
Write one or more country names if item 118 has the value of "c" or "d." Example: China, India
How clearly were the project objectives (delivery date, quality, technologies, etc.) and their priority defined?
Data Entry Form (2/3)
Category Item No. (*) Choose an alternative
417 Reuse rate of source code The ratio of reused SLOC size of the source code to the total SLOC size of source code. (%)
418 Reuse rate of software components (%)
419 Reuse rate of test cases for integration test Number of re-used test cases/Number of total test cases (%)
420 Reuse rate of test cases for system test Number of re-used test cases/Number of total test cases (%)
421 Reuse rate of test cases for acceptance test Number of re-used test cases/Number of total test cases (%)
422 Use of development frameworks (*) ← Write the name of the tool(s) if you choose Yes.
501 Clearness of user requirements specifications (*) The degree of clearness the requirements specifications had at the beginning of the basic design phase.
502 User participation in user requirement specifications (*) The degree of user participation in the requirement specifications.
503 User expertise in computing (*) The level of user expertise in computer systems and system development.
504 User expertise in applied business (*) The level of user expertise in the applied business.
505 Clearness of user role and responsibility (*) How clearly were the role and responsibility of the user and those of the vendor defined?
506 User acknowledgment of requirements specifications (*) Did the user acknowledge the requirements specifications?
507 User comprehension of system design (*) The degree of user understanding of the system design.
508 User acknowledgment of system design (*) Did the user acknowledge the system design?
509 User participation in acceptance test (*) The degree of user participation in the acceptance test.
511 Number of members participated in requirements definition The number of key persons who defined the requirements. (persons)
512 Level of requirements (Reliability) (*)
513 Level of requirements (Usability) (*)
514 Level of requirements (Performance and efficiency) (*)
515 Level of requirements (Maintainability) (*)
516 Level of requirements (Portability) (*)
517 Level of requirements (Running cost) (*) The level of requirements in terms of the system running cost.
518 Level of requirements (Security) (*) The level of system security requirements.
519 Legal restrictions (*) Legal restrictions placed on the developed system.
601 PM skill (*)
Staff skills
602 Staff skill_application domain experience (*)
603 Staff skill_analysis and design experience (*) The skill level of staff with respect to system analysis and design.
604 Staff skill_programming language and software tool experience (*) The skill level of staff with respect to programming languages and software tools.
605 Staff skill_development platform experience (*) The skill level of staff with respect to the use of the development platform.
1012 General comment
■ Size
(1) FP / (2) FP size of upgraded part
* Enter the FP size of the existing system and the FP sizes of the added, changed, and/or deleted parts if "upgrade" is chosen for item 103.
Phase / FP size: Measurement method (*), Name for "Other" method
Part: Existing FPs, Added FPs, Changed FPs, Deleted FPs / Actual FP size / Planned FP size
Phases: After system planning, After requirements definition, After basic design, After detailed design
Planned FP size (unadjusted)
Actual FP size: Unadjusted / Adjusted
Adjustment factor
Purity of measurement method for actual FP size (*) ← Write the name of the method if it is a customized version.
FP measurement support technology (*)
(3) SLOC * Write SLOC sizes in SLOCs (not in KSLOCs).
Planned SLOC size / Actual SLOC size
Phases: After system planning / After requirements definition / After basic design / After detailed design
Actual size: Comment line inclusion (*), Comment line ratio (*), Blank line inclusion (*), Blank line ratio (*)
Part-based SLOC size: (Part) Planned / Actual, for the Existing, Added/new, Changed, and Deleted parts (independent of the per-language table)
Per-programming-language actual SLOC size (top 5 languages): Language, Actual, Comment line inclusion (*), Comment line ratio (*), Blank line inclusion (*), Blank line ratio (*) ← Enter the actual SLOC sizes of the top five programming languages.
(4) Detailed FP size (for IFPUG) * If "IFPUG" is chosen for item 701, enter the number and the FP size of each basic FP element (EI, EO, EQ, ILF, and EIF) per degree of complexity.
Number of Functions: Large / Medium / Small (Planned and Actual rows per element)
EI: * FP = Large × 6 + Medium × 4 + Small × 3
EO: * FP = Large × 7 + Medium × 5 + Small × 4
EQ: * FP = Large × 6 + Medium × 4 + Small × 3
ILF: * FP = Large × 15 + Medium × 10 + Small × 7
EIF: * FP = Large × 10 + Medium × 7 + Small × 5
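The per-element formulas on this form are the standard IFPUG complexity weights. Assuming the rows correspond to EI, EO, EQ, ILF, and EIF in that order (as the weights suggest), the unadjusted FP size can be sketched as (names are illustrative):

```python
# IFPUG complexity weights (Large, Medium, Small) per basic FP element,
# matching the formulas printed on the form.
WEIGHTS = {
    "EI":  (6, 4, 3),
    "EO":  (7, 5, 4),
    "EQ":  (6, 4, 3),
    "ILF": (15, 10, 7),
    "EIF": (10, 7, 5),
}

def element_fp(element: str, large: int, medium: int, small: int) -> int:
    """FP contribution of one element type, e.g. EI."""
    wl, wm, ws = WEIGHTS[element]
    return large * wl + medium * wm + small * ws

def unadjusted_fp(counts: dict) -> int:
    """Sum the contributions of all entered element types."""
    return sum(element_fp(e, *c) for e, c in counts.items())

# Example: 2 complex EIs, 10 medium EOs, 3 simple ILFs
print(unadjusted_fp({"EI": (2, 0, 0), "EO": (0, 10, 0), "ILF": (0, 0, 3)}))
# → 2*6 + 10*5 + 3*7 = 83
```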
(5) Detailed FP size (for methods other than IFPUG) * If the used FP measurement method is the type "Other" that is created based on the NESMA indicative, NESMA estimated, or IFPUG method, enter the total number of transactional functions, that of data functions, and their total FP sizes.
(6) Detailed FP size (COSMIC-FFP) * If the used FP measurement method is COSMIC-FFP, enter its detailed data.
Number of functions / FP value
Transactional functions: Planned / Actual
Data functions: Planned / Actual
COSMIC-FFP details: Number of triggering events, Number of functional processes, Number of data groups
Subprocesses: Entry, Exit, Read, Write
Cfsu
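In COSMIC-FFP, the functional size in Cfsu is the count of data movements: each Entry, Exit, Read, or Write subprocess contributes one Cfsu, so the Cfsu row is the sum of the four subprocess counts. A minimal sketch (the function name is illustrative):

```python
def cosmic_size_cfsu(entries: int, exits: int, reads: int, writes: int) -> int:
    """COSMIC-FFP functional size: each data movement counts as 1 Cfsu."""
    return entries + exits + reads + writes

print(cosmic_size_cfsu(entries=4, exits=3, reads=2, writes=1))  # → 10
```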
(7) Other indices related to the size
Value: Simple / Typical / Complex
Development planning: Use-case (Number of use-cases), Design document volume
Requirements definition: Number of actors
Basic design / Detailed design
DFD: Number of data sets, Number of processes
Number of database tables / Number of GUI screen types / Number of report formats / Number of batch processes
[Residual column labels from the form tables above: "Free text, or alternatives"; the category headings "(4) Development Techniques" and "(5) User Requirement Management"; the IFPUG element rows EI, EO, EQ, ILF, and EIF with their Function and Data Item columns; and the Transactional functions / Data functions / FP labels.]
① Contract of this project (primary subcontract, secondary subcontract, in-house contract).
② If the system size is measured in SLOCs, clarify in what kind of quantity the size was measured (number of lines, number of steps, number of physical lines, or number of logical lines).
③ Write remarks such as the outsourced effort (converted from the amount of money ordered with actual effort).
The level of usability requirements in terms of the ease of software learning, ease of operation learning, ease of operation management, the sophistication of graphical interface design, and other factors.
The level of performance and efficiency requirements in terms of the response time, processing time, processing power, the utilization of system resources such as hard disks and memory, and other factors.
The level of maintainability requirements in terms of the ease of software correction, ease of fault locating, ease of fault identification, ease of software change, protection against possible troubles in software change, the ease of software correction validity verification, and other factors.
The level of portability requirements in terms of the ease of adjustment to a new environment, ease of installation in the environment, ease of concurrent operation with other software components, ease of porting from other software, and other factors.
The skill level of project managers. Score the PM skill in accordance with the job "Project Management" of the IT Skill Standard Version 1.1.
The skill level of staff with respect to the application domain at which the developed system is aimed.
The level of reliability requirements in terms of the failure rate, recovery time, data recovery, and other factors.
The reuse ratio of reused software components, such as library components, in terms of functional size: the approximate ratio of the functional size of the reused software components to the total functional size of the developed system.
Did the project use a development framework? Examples: Struts, .NET Framework, JBoss, J2EE.
Description
Appendix B: Data Entry Form Version 2.3
Data Entry Form (3/3)
■ Effort, development schedule, number of staff
← If the effort in the project data is measured in person-months, enter the number of hours one person works per month at the working rate of 100%. Otherwise, enter "1" here.
At the beginning of basic design [person-hours] / At the beginning of detailed design [person-hours]
Item: Development Planning / Requirements Definition / Basic Design / Detailed Design / Implementation / Integration Test / System Test / Acceptance Test / Out-of-Category / Whole Project
Origination of actual tasks (*)
Origination of requirements specifications changes (*)
Development schedule (*1), planned: Beginning date, Completion date, Months (offered data, reference)
Development schedule (*1), actual: Beginning date, Completion date, Months; Idling duration (*2)
Actual effort, in-house: Development (offered data, reference), Management (*3), Other (*4), Out-of-category (*5)
< Subtotal > In-house [person-hours] / Hours
Review effort (in-house): Number of times, Number of issues
Outsourced: Origination of tasks, Development, Expenditure ratio (%)
< Total > In-house + Outsourced [person-hours] / Hours
Number of staff: In-house Average / Peak; Outsourced Average / Peak
(*1) Enter the beginning date and the completion date, or the number of months (to one decimal place), in the development schedule cells. You can enter both of them in the cells.
(*2) The duration in which the project remained idle (for example, waiting for a signature of the customer or for the arrival of test data). The active duration of the project is obtained by subtracting the idling duration from the whole project development schedule.
(*3) If the project management effort was collected separately, enter its amount.
(*4) If the project has some actual effort that falls into neither the development effort nor the management effort, enter that effort (for example, effort for infrastructure-building, operation environment preparation, system migration, operation support, or consulting).
(*5) Enter the effort that does not fall into any defined category.
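The conversion ratio requested on this form normalizes effort to person-hours: person-month figures are multiplied by the number of hours one person works per month at a 100% working rate, while person-hour figures use a ratio of 1. A sketch under those rules (names are illustrative, not part of the form):

```python
def effort_in_person_hours(effort: float, unit: str, hours_per_month: float) -> float:
    """Normalize recorded effort to person-hours using the form's conversion ratio.

    If effort is recorded in person-months, multiply by the hours one person
    works per month at a 100% working rate; person-hour data is kept as-is.
    """
    ratio = hours_per_month if unit == "person-month" else 1.0
    return effort * ratio

print(effort_in_person_hours(12.5, "person-month", 160))  # → 2000.0
```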
■ Quality and reliability
← If you choose Yes, write a description.
Follow-up (operation): 1 month / 3 months / 6 months / 12 months
Number of test cases; Failures; Faults
Identified defects, Failures: Very critical / Critical / Insignificant / Total (*1)
Identified defects, Faults: Very critical / Critical / Insignificant / Total (*1)
(*1) Grades of criticalness:
Very critical: The defect causes damage to the customer, and quick countermeasures have to be taken.
Critical: The defect causes no damage to the customer, but quick countermeasures have to be taken.
Insignificant: The defect causes no damage to the customer, and there is no need for quick countermeasures.
Number of identified defects per development schedule phase: Integration Test / System Test
Existence of quantitative delivery quality standard (*)
Personnel assignment to quality assurance tasks (*)
Existence of third-party reviews (*)
Personnel assignment to test tasks (*)
Unit of effort (*)
Planned development effort
Conversion ratio from person-month to person-hour
Definition of test case count
Definition of software bug count
Appendix C Per-Data-Item Reply Status
Appendix C presents the actual response to each data item inquiry. [Table conventions] • The Data Item column lists data items in the same order as those listed in the data item definitions of
Appendix A.2. • The Priority column lists the following priority symbols that indicate the degree of necessity for data
collection.
◎: Entry is mandatory.
□: Entry is mandatory under a certain condition. (Suppose that a data item has an alternative "Other (description)" among others. If the answerer chooses this alternative, he or she is required to write the name of the requested item. Such an alternative is marked with this symbol.)
○: Entry is important.
△: Entry is recommended.
Blank: Entry is optional.
• The "Total" column lists the total number of entries for each data item. In the following example, the Total column lists totals of 1,774, 1,774, and 13 for 103_Project type, 105_Project category, and 105_Project category_by name, respectively.
• The column "Alternative n" lists the number of entries for alternative n. The following example shows the entry counts of the alternatives of 103_Project type: Alternative 1 (a: Development) 1,035; Alternative 2 (b: Maintenance/support) 472; Alternative 3 (c: Redevelopment) 86; Alternative 4 (d: Enhancement) 181 (Total 1,774).
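For a mandatory single-choice item such as 103_Project type, the per-alternative counts should sum to the Total column, as in the example above (1,035 + 472 + 86 + 181 = 1,774). A quick consistency check (a hypothetical helper, not part of the White Paper's tooling):

```python
def check_reply_total(total: int, alternatives: list) -> bool:
    """True when the per-alternative entry counts sum to the Total column."""
    return sum(alternatives) == total

# 103_Project type from the example above
print(check_reply_total(1774, [1035, 472, 86, 181]))  # → True
```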
Example Reply Status
Data Item Priority Total Alternative 1 Alternative 2 Alternative 3 Alternative 4 Alternative 5
a: Development b: Maintenance/support c: Redevelopment d: Enhancement
103_Project type ◎ 1,774 1,035 472 86 181
a: Commercial package development b: Entrusted development c: For in-house use d: Prototyping e: Other
105_Project category ◎ 1,774 101 1,634 11 16 12
105_Project category_by name □ 13
• Alternatives whose cells are green are newly added items of Data Item Definitions Version 2.3, 2007.
• Alternatives whose cells are yellow hold the item (alternative) names associated with categorical data items. For data items 126, 201, 202, 203, 309, 310, 312, and 313, which have many alternatives, a separate table is provided for each alternative to present the reply status on a per-alternative basis.
Per-data-item reply status (1/7)
Data Item | Priority | Total | Alternative 1 | Alternative 2 | Alternative 3 | Alternative 4 | Alternative 5
10084_Proprietary project ID ◎ 1,774
a: Whole system project b: Sub-system project
11001_Whole/sub flag ◎ 1,165 1,017 148
11002_Grouping ID ○ 74
A B C D Unknown
10085_Data reliability company-evaluated 412 184 177 16 4 31
a: Development b: Maintenance/support c: Redevelopment d: Enhancement
103_Project type ◎ 1,774 1,035 472 86 181
a: Stable b: Reaching the stable state c: Unstable d: Stability unknown
104_Existing system stability ○ 351 233 82 14 22
a: Commercial package development b: Entrusted development c: For in-house use d: Prototyping e: Other
105_Project category ◎ 1,774 101 1,634 11 16 12
105_Project category_by name □ 13
a: Customer's site b: In-house site c: Other
106_Entrusted development working site_1 513 51 454 8
106_Entrusted development working site_2 74 46 12 16
106_Entrusted development working site_3 3 0 0 3
Alternative 1: ○
107_Project purpose 1_software development □ 1,765 1,765
107_Project purpose 2_infrastructure-building □ 172 172
107_Project purpose 3_operation environment preparation □ 109 109
107_Project purpose 4_system migration □ 306 306
107_Project purpose 5_maintenance □ 156 156
107_Project purpose 6_operation support □ 37 37
107_Project purpose 7_consulting □ 11 11
107_Project purpose 8_project management □ 428 428
107_Project purpose 9_quality assurance □ 122 122
107_Project purpose 10_on-site environment preparation/adjustment for a running system □ 141 141
107_Project purpose 11_customer training □ 99 99
○ Customer test support Installation
107_Project purpose 12_other (description) □ 7 2 2 3
a: New customer b: Old customer
108_New customer or old customer ○ 603 113 490
a: New industry or business b: Old industry or business
109_New business or not ○ 554 92 462
a: Japanese company (intra-group/affiliate) b: Japanese company (out-of-group/non-affiliate) c: Foreign company (intra-group/affiliate) d: Foreign company (out-of-group/non-affiliate) e: No outsourcing
118_Source of outsourced workforce_1 △ 407 173 177 7 12 38
118_Source of outsourced workforce_2 △ 69 10 50 0 9 0
118_Source of outsourced workforce_3 △ 10 0 0 7 3 0
119_Outsourcing country △ 39
a: The subcontractors were new to the company of the project b: The company of the project used the subcontractors more than once
110_New subcontractors or not_1 ○ 396 33 363
110_New subcontractors or not_2 30 12 18
110_New subcontractors or not_3 0 0 0
a: Used new technology b: Did not use new technology
111_Using new technology or not ○ 489 114 375
a: Very clear b: Clear c: Not very clear d: Unclear
112_Clearness of responsibility and roles of project team members ○ 506 189 289 26 2
113_Clearness of goals and priority ○ 476 127 313 31 5
Per-data-item reply status (2/7)
Data Item | Priority | Total | Alternative 1 | Alternative 2 | Alternative 3 | Alternative 4 | Alternative 5 | Alternative 6
a: Enough closed space for each member b: Adequate space for each member, and a very good working environment for concentrated brainwork c: Stuffy open space, interrupting concentrated brainwork d: Very packed open space lacking space for documents and computers
114_Working space ○ 425 15 262 144 4
a: No noise and minimum interruption by phone calls b: Noise level below human awareness and intermittent interruption by phone calls c: Occasional high-level noise and frequent interruption by phone calls d: Deafening noise constantly hampering concentrated brainwork; interruption by phone calls repeats within one hour
115_Project environment (acoustic noise) ○ 397 23 306 68 0
a: All the QCD elements are successful b: Two of the QCD elements are successful c: One of the QCD elements is successful d: None of the QCD elements is successful
116_Project success_Self-evaluation ○ 279 166 79 26 8
a: Successful b: Almost successful c: Failed a little d: Failed
116_Project success_Self-evaluation_old ○ 811 292 462 43 14
a: The basis of cost estimation was clear and feasibility was confirmed b: The basis of cost estimation was unclear or feasibility was not confirmed c: No planning
120_Evaluation of planning (Cost) ◎ 603 537 66 0
a: The quality objectives were clear and feasibility was confirmed b: The quality objectives were unclear or feasibility was not confirmed c: No planning
121_Evaluation of planning (Quality) ◎ 580 486 59 35
a: The basis of development schedule planning was clear and feasibility was confirmed b: The basis of development schedule planning was unclear or feasibility was not confirmed c: No planning
122_Evaluation of planning (Development schedule) ◎ 603 544 54 5
a: The actual cost is less than the planned cost by 10% or more b: The actual cost nearly equals the planned cost, with an error less than ±10% c: The actual cost exceeded the planned cost by 30% or less d: The actual cost exceeded the planned cost by 50% or less e: The actual cost exceeded the planned cost by more than 50%
123_Evaluation of results (Cost) ◎ 743 96 549 57 14 27
a: The number of defects after system cutover is less than the planned value by 20% or more b: The number of defects after system cutover is less than the planned value c: The number of defects after system cutover exceeded the planned value by 50% or less d: The number of defects after system cutover exceeded the planned value by 100% or less e: The number of defects after system cutover exceeded the planned value by more than 100%
124_Evaluation of results (Quality) ◎ 458 51 303 67 20 17
a: The product was delivered before the planned delivery date b: The product was delivered on the planned delivery date c: The actual delivery was delayed by 10 days or less d: The actual delivery was delayed by 30 days or less e: The actual delivery was delayed by more than 30 days
125_Evaluation of results (Development schedule) ◎ 741 24 581 29 37 70
a: The customer is fully satisfied b: The customer is almost satisfied c: The customer is dissatisfied at some points d: The customer is not at all satisfied
117_Subjective evaluation of customer satisfaction ○ 298 74 183 30 11
a: Accessible to limited users b: Open to the public
204_User accessibility ◎ 1,641 1,382 259
205_Number of users ○ 162
206_Number of user sites 210
207_User concurrency 85
a: Application software b: System software (middleware, operating system) c: Tool software d: Development environment software e: Other
301_Type of developed system ◎ 1,758 1,660 67 14 9 8
301_Type of developed system_by name □ 4
a: Yes b: No
302_Use of business application package ◎ 1,304 249 1,055
a: The business software packages were used for the first time b: The business software packages were used more than once c: How many times the business software packages were used is unknown
303_Using business application package for the first time □ 100 21 77 2
304_Name of business software package □ 195
305_The functional size ratio of business software package 27
306_Customization cost ratio of business software package 17
a: Batch processing b: Interactive processing c: Online transaction processing d: Other
307_Mode of operation_1 532 57 366 99 10
307_Mode of operation_2 60 25 27 8 0
307_Mode of operation_3 2 0 0 2 0
a: Stand-alone b: Mainframe c: 2-layer client/server d: 3-layer client/server e: Intranet/Internet f: Other
308_Architecture_1 ◎ 1,675 221 143 448 317 475 71
308_Architecture_2 52 2 2 11 8 24 5
308_Architecture_3 4 0 0 1 1 0 2
Per-data-item reply status (3/7)
Data Item | Priority | Total | Alternative 1 | Alternative 2 | Alternative 3 | Alternative 4 | Alternative 5 | Alternative 6
a: TUXEDO b: CICS c: OPENTP1 d: Other e: None
311_Online transaction processing system 154 15 0 8 26 105
311_Online transaction processing_by name 3
312_Primary programming language_1_name □ 78
312_Primary programming language_2_name □ 31
312_Primary programming language_3_name □ 29
312_Primary programming language_4_name 8
312_Primary programming language_5_name 2
a: Waterfall b: Iterative c: Other
401_Development life cycle model ◎ 1,678 1,617 41 20
401_Development life cycle model_by name □ 24
a: JP1 b: SystemWalker c: Senju d: A-Auto e: Other f: None
402_Use of operation support tool 205 49 2 14 1 30 109
402_Use of operation tools_by name □ 20
a: Yes b: No
403_Examined similar projects or not △ 244 160 84
404_Use of project management tool △ 528 214 314
405_Use of configuration management tool △ 519 225 294
405_Use of configuration management tool_name □ 132
406_Use of design support tool △ 504 87 417
406_Use of design support tool_name □ 39
407_Use of documentation tool △ 498 168 330
407_Use of documentation tool_name □ 19
408_Use of debug/testing support tool △ 565 215 350
408_Use of debug/testing support tool_name □ 68
409_Use of CASE tool △ 267 20 247
409_Use of CASE tool_name □ 11
411_Use of code generator △ 273 41 232
411_Use of code generator_name □ 30
a: Structured analysis and design b: Object-oriented analysis and design c: Data-oriented approach (DOA) d: Other e: None
412_Application of development methods △ 407 151 52 21 105 78
412_Use of schematic development approach_name □ 27
413_Reuse rate of development plan document 41
414_Reuse rate of requirements definition document 43
415_Reuse rate of basic design document 45
416_Reuse rate of detailed design document 45
417_Reuse rate of source code △ 194
418_Reuse rate of software components 38
419_Reuse rate of test cases for integration test 42
420_Reuse rate of test cases for system test 42
421_Reuse rate of test cases for acceptance test 37
a: Yes b: No
422_Use of development frameworks ○ 255 59 196
422_Use of development framework_name □ 59
a: Very clear b: Clear c: Ambiguous d: Very ambiguous
501_Clearness of user requirements ○ 672 74 348 203 47
a: Full b: Adequate c: Inadequate d: None
502_User participation in user requirement specifications ○ 600 174 210 190 26
a: Comprehensive b: Adequate c: Inadequate d: None
503_User expertise in computing △ 346 81 175 64 26
504_User expertise in applied business 124 48 66 7 3
a: Very clear b: Clear c: Ambiguous d: Very ambiguous
505_Clearness of user role and responsibility △ 185 43 112 23 7
a: Yes b: No
506_User acknowledgment of requirements specifications △ 180 172 8
a: Full b: Adequate c: Inadequate d: None
507_User comprehension of system design △ 174 50 101 22 1
a: Yes b: No
508_User acknowledgment of system design △ 180 167 13
a: Full b: Adequate c: Inadequate d: None
509_User participation in acceptance test ○ 424 117 201 42 64
511_Number of members participated in requirements definition △ 106
Per-data-item reply status (4/7)
Data Item | Priority | Total | Alternative 1 | Alternative 2 | Alternative 3 | Alternative 4 | Alternative 5 | Alternative 6
a: Very high b: High c: Medium d: Low
512_Level of requirements (Reliability) ○ 578 76 186 294 22
513_Level of requirements (Usability) △ 182 19 79 77 7
514_Level of requirements (Performance and efficiency) ○ 659 52 205 374 28
515_Level of requirements (Maintainability) △ 182 17 47 104 14
516_Level of requirements (Portability) △ 188 13 26 87 62
517_Level of requirements (Running cost) △ 171 6 22 102 41
518_Level of requirements (Security) ○ 429 27 109 263 30
a: Industrial legal restrictions b: Regular legal restrictions c: None
519_Legal restrictions △ 373 31 58 284
a: Level 6 or 7 b: Level 5 c: Level 4 d: Level 3
601_PM skill ○ 498 89 108 249 52
a: All staff had enough experience b: Half of the staff had enough experience, and the rest had adequate experience c: Half of the staff had adequate experience, and the rest had no experience d: All staff was without experience
602_Staff skill_application domain experience ○ 687 151 377 132 27
603_Staff skill_analysis and design experience ○ 451 108 249 91 3
604_Staff skill_programming language and software tool experience ○ 642 151 374 110 7
605_Staff skill_development platform experience ○ 525 171 260 82 12
a: IFPUG b: SPR c: NESMA indicative method d: NESMA estimated method e: COSMIC-FFP f: Other
10116_FP measurement method (after development planning) □ 22 21 0 0 0 0 1
10117_Name of the customized measurement method (after development planning) □ 1
10118_FP measurement method (after requirements definition) □ 29 15 4 1 8 0 1
10119_Name of the customized measurement method (after requirements definition) □ 0
10120_FP measurement method (after basic design) □ 82 50 6 2 22 0 2
10121_Name of the customized measurement method (after basic design) □ 2
10122_FP measurement method (after detailed design) □ 31 9 2 0 19 0 1
10123_Name of the customized measurement method (after detailed design) □ 1
701_Primary FP measurement method (actual) □ 832 223 240 2 54 0 313
701_Primary FP measurement method (actual)_name □ 312
a: Original method b: Customized method
10124_Purity of measurement method for actual FP size (actual) □ 802 486 316
10124_Purity of measurement method for actual FP size (actual)_name □ 312
a: Yes (Support tool or dedicated staff) b: No
702_FP measurement support technology 508 495 13
706_Unadjusted FP size reliability 0
806_Idling duration 23
901_Unit of effort ◎ 1,774
902_Conversion ratio between person-month and person-hour ◎ 1,774
1005_Definition of test case count △ 45
1007_Definition of software bug count △ 38
a: The members assigned to testing tasks were enough in number and had enough skills b: The members assigned to testing tasks had enough skills, but they were not enough in number c: The members assigned to testing tasks were enough in number, but they had insufficient skills d: The members assigned to testing tasks were not enough in number and had insufficient skills
1010_Personnel assignment to testing 314 98 75 95 46
a: Yes b: No
1011_Existence of quantitative delivery quality standards △ 277 245 32
1011_Existence of quantitative delivery quality standards_name 186
1012_General comment △ 104
Per-data-item reply status (5/7)
Data Item | Priority | Total | Alternative 1 | Alternative 2
5001_Actual FP size (unadjusted) ◎ 860
5002_Actual FP size (adjusted) ○ 644
5003_FP value adjustment factor ○ 704
5004_Actual SLOC size_SLOC ◎ 983
a: Included b: Excluded
5005_Actual SLOC size comment line inclusion ◎ 960 222 738
10086_Actual SLOC size_comment line ratio ○ 128
5006_Actual SLOC size blank line inclusion ◎ 960
10087_Actual SLOC size_blank line ratio ○ 32
11003_Actual SLOC size (existing) □ 177
11004_Actual SLOC size (added/developed) □ 411
11005_Actual SLOC size (changed) □ 191
11006_Actual SLOC size (deleted) □ 55
10001_Per-language actual SLOC size (1)_name ○ 717
5007_Per-language actual SLOC size_SLOC_1 □ 629
5008_Per-language actual SLOC size_SLOC_comment line inclusion_1 □ 687 132 555
10088_Per-language actual SLOC size (1)_comment line ratio 95
5009_Per-language actual SLOC size_SLOC_blank line inclusion_1 574 34 540
10089_Per-language actual SLOC size (1)_blank line ratio 28
10002_Per-language actual SLOC size (2)_name ○ 307
5010_Per-language actual SLOC size_SLOC_2 □ 281
5011_Per-language actual SLOC size_SLOC_comment line inclusion_2 □ 314 61 253
10090_Per-language actual SLOC size (2)_comment line ratio 29
5012_Per-language actual SLOC size_SLOC_blank line inclusion_2 252 21 231
10091_Per-language actual SLOC size (2)_blank line ratio 14
10003_Per-language actual SLOC size (3)_name ○ 114
5013_Per-language actual SLOC size_SLOC_3 □ 107
5014_Per-language actual SLOC size_SLOC_comment line inclusion_3 □ 129 30 99
10092_Per-language actual SLOC size (3)_comment line ratio 17
5015_Per-language actual SLOC size_SLOC_blank line inclusion_3 105 14 91
10093_Per-language actual SLOC size (3)_blank line ratio 8
10004_Per-language actual SLOC size (4)_name 32
5016_Per-language actual SLOC size_SLOC_4 26
5017_Per-language actual SLOC size_SLOC_comment line inclusion_4 29 5 24
10094_Per-language actual SLOC size (4)_comment line ratio 4
5018_Per-language actual SLOC size_SLOC_blank line inclusion_4 25 2 23
10095_Per-language actual SLOC size (4)_blank line ratio 1
10005_Per-language actual SLOC size (5)_name 8
5019_Per-language actual SLOC size_SLOC_5 7
5020_Per-language actual SLOC size_SLOC_comment line inclusion_5 7 0 7
10096_Per-language actual SLOC size (5)_comment line ratio 0
5021_Per-language actual SLOC size_SLOC_blank line inclusion_5 7 0 7
10097_Per-language actual SLOC size (5)_blank line ratio 0
5022_Actual enhancement FP size (existing) □ 87
5023_Actual enhancement FP size (added) □ 52
5024_Actual enhancement FP size (changed) □ 47
5025_Actual enhancement FP size (deleted) □ 14
11007_Actual enhancement FP size (existing) △ 2
11008_Actual enhancement FP size (added) △ 3
11009_Actual enhancement FP size (changed) △ 2
11010_Actual enhancement FP size (deleted) △ 2
5026_Planned EI_highly complex 1
5027_Planned EI_complex 1
5028_Planned EI_not complex 1
5029_Planned EI_FP 52
5030_Actual EI_highly complex 13
5031_Actual EI_complex 15
5032_Actual EI_not complex 14
5033_Actual EI_FP ○ 81
5034_Planned EO_highly complex 1
5035_Planned EO_complex 1
5036_Planned EO_not complex 1
5037_Planned EO_FP 49
5038_Actual EO_highly complex 12
5039_Actual EO_complex 14
5040_Actual EO_not complex 11
5041_Actual EO_FP ○ 76
5042_Planned EQ_highly complex 1
5043_Planned EQ_complex 1
5044_Planned EQ_not complex 1
5045_Planned EQ_FP 46
5046_Actual EQ_highly complex 14
5047_Actual EQ_complex 14
5048_Actual EQ_not complex 14
5049_Actual EQ_FP ○ 75
5050_Planned ILF_highly complex 1
5051_Planned ILF_complex 1
5052_Planned ILF_not complex 1
5053_Planned ILF_FP 51
5054_Actual ILF_highly complex 8
5055_Actual ILF_complex 12
5056_Actual ILF_not complex 13
5057_Actual ILF_FP ○ 338
5058_Planned EIF_highly complex 1
5059_Planned EIF_complex 1
5060_Planned EIF_not complex 1
5061_Planned EIF_FP 46
5062_Actual EIF_highly complex 8
5063_Actual EIF_complex 11
5064_Actual EIF_not complex 15
5065_Actual EIF_FP ○ 286
5066_Planned transactional functions_number of functions 20
5067_Planned transactional functions_FP 20
5068_Actual transactional functions_number of functions 237
5069_Actual transactional functions_FP ○ 325
5070_Planned data functions_number of functions 19
5071_Planned data functions_FP 19
5072_Actual data functions_number of functions 249
5073_Actual data functions_FP ○ 350
5074_COSMIC-FFP details_number of triggering events 0
5075_COSMIC-FFP details_number of functional processes 0
5076_COSMIC-FFP details_number of data groups 0
5077_COSMIC-FFP details_Entry 0
5078_COSMIC-FFP details_Exit 0
5079_COSMIC-FFP details_Read 0
5080_COSMIC-FFP details_Write 0
5081_COSMIC-FFP details_Cfsu 0
5082_Unadjusted planned FP size_after development planning □ 24
5083_Unadjusted planned FP size_after requirements definition □ 31
5084_Unadjusted planned FP size_after basic design □ 87
5085_Unadjusted planned FP size_after detailed design □ 31
5086_Planned SLOC size_after development planning □ 203
5087_Planned SLOC size_after requirements definition □ 86
5088_Planned SLOC size_after basic design □ 129
5089_Planned SLOC size_after detailed design □ 104
11011_Planned SLOC size (existing) □ 122
11012_Planned SLOC size (added/developed) □ 262
11013_Planned SLOC size (changed) □ 121
11014_Planned SLOC size (deleted) □ 27
5090_Design document volume_development plan document 32
5091_Design document volume_requirements definition document ○ 166
5092_Design document volume_basic design document ○ 263
5093_Design document volume_detailed design document ○ 256
5094_Number of DFD data items 3
5095_Number of DFD processes 2
5096_Other size index_number of DB tables ○ 313
5097_Other size index_number of GUI screen types ○ 366
5098_Other size index_number of report formats ○ 296
5099_Other size index_number of batch programs ○ 180
5100_Number of use-cases_simple 10
5101_Number of use-cases_typical 8
5102_Number of use-cases_complex 9
5103_Number of actors_simple 10
5104_Number of actors_typical 7
5105_Number of actors_complex 9
284 IPA/SEC White Paper 2007 on Software Development Projects in Japan
Per-data-item reply status (6/7) Data Item Priority Total Alternative 1 Alternative 2 Alternative 3 Alternative 4
○ ⇒ ×
5106_Phase existence_Development plan ◎ 1,343 140 132 1,071
5107_Phase existence_Requirements definition ◎ 1,528 747 294 487
5108_Phase existence_Basic design ◎ 1,692 1,412 176 104
5109_Phase existence_Detailed design ◎ 1,738 1,280 373 85
5110_Phase existence_Construction ◎ 1,750 1,511 194 45
5111_Phase existence_Integration test ◎ 1,690 1,270 297 123
5112_Phase existence_System test ◎ 1,656 1,392 53 211
5113_Phase existence_Acceptance test ◎ 1,312 234 17 1,061
a: None b: A few changes c: Many changes d: Serious changes
5114_Origination of requirements specifications changes_development plan 7 7 0 0 0
5115_Origination of requirements specifications changes_requirements definition 21 10 6 1 4
5116_Origination of requirements specifications changes_basic design ○ 64 17 32 11 4
5117_Origination of requirements specifications changes_detailed design ○ 63 24 26 10 3
5118_Origination of requirements specifications changes_construction ○ 56 27 23 4 2
5119_Origination of requirements specifications changes_integration test ○ 53 32 15 5 1
5120_Origination of requirements specifications changes_system test ○ 38 18 16 3 1
5121_Origination of requirements specifications changes_acceptance test ○ 22 9 10 3 0
5122_Planned beginning date_whole project ◎ 700
5123_Planned beginning date_development plan △ 13
5124_Planned beginning date_requirements definition △ 97
5125_Planned beginning date_basic design □ 400
5126_Planned beginning date_detailed design △ 136
5127_Planned beginning date_construction △ 147
5128_Planned beginning date_integration test △ 135
5129_Planned beginning date_system test △ 109
5130_Planned beginning date_acceptance test △ 39
5131_Planned completion date_whole project ◎ 700
5132_Planned completion date_development plan △ 13
5133_Planned completion date_requirements definition △ 64
5134_Planned completion date_basic design △ 165
5135_Planned completion date_detailed design △ 139
5136_Planned completion date_construction △ 162
5137_Planned completion date_integration test △ 125
5138_Planned completion date_system test □ 376
5139_Planned completion date_acceptance test △ 69
10126_Duration in months (planned)_whole project (offered data) △ 165
5141_Duration in months (planned)_development plan 2
5142_Duration in months (planned)_requirements definition 26
5143_Duration in months (planned)_basic design △ 52
5144_Duration in months (planned)_detailed design △ 48
5145_Duration in months (planned)_construction △ 49
5146_Duration in months (planned)_integration test △ 49
5147_Duration in months (planned)_system test △ 34
5148_Duration in months (planned)_acceptance test 17
10127_Duration in months (planned)_out of category 3
5149_Actual beginning date_whole project ◎ 1,315
5150_Actual beginning date_development plan △ 14
5151_Actual beginning date_requirements definition △ 290
5152_Actual beginning date_basic design □ 696
5153_Actual beginning date_detailed design △ 209
5154_Actual beginning date_construction △ 218
5155_Actual beginning date_integration test △ 139
5156_Actual beginning date_system test △ 181
5157_Actual beginning date_acceptance test △ 74
5158_Actual completion date_whole project ◎ 1,314
5159_Actual completion date_development plan △ 13
5160_Actual completion date_requirements definition △ 120
5161_Actual completion date_basic design △ 228
5162_Actual completion date_detailed design △ 211
5163_Actual completion date_construction △ 239
5164_Actual completion date_integration test △ 210
5165_Actual completion date_system test □ 726
5166_Actual completion date_acceptance test △ 128
10128_Duration in months (actual)_whole project (offered data) △ 871
5168_Duration in months (actual)_development plan 5
5169_Duration in months (actual)_requirements definition 172
5170_Duration in months (actual)_basic design △ 227
5171_Duration in months (actual)_detailed design △ 253
5172_Duration in months (actual)_construction △ 270
5173_Duration in months (actual)_integration test △ 179
5174_Duration in months (actual)_system test △ 229
5175_Duration in months (actual)_acceptance test 56
10129_Duration in months (actual)_out of category 35
Appendix C: Per-Data-Item Reply Status
Per-data-item reply status (7/7) Data Item Priority Total Alternative 1 Alternative 2 Alternative 3
11015_Project effort in person-hours (planned at the beginning of basic design) □ 444
11016_Project effort in person-hours (planned at the beginning of basic design) ○ 144
5177_Actual effort (development)_development plan ○ 110
5178_Actual effort (development)_requirements definition ○ 614
5179_Actual effort (development)_basic design □ 1,001
5180_Actual effort (development)_detailed design □ 955
5181_Actual effort (development)_construction □ 1,079
5182_Actual effort (development)_integration test □ 876
5183_Actual effort (development)_system test □ 994
5184_Actual effort (development)_acceptance test ○ 102
10130_Actual effort (development)_out of category ○ 656
5186_Actual effort (management)_development plan ○ 12
5187_Actual effort (management)_requirements definition ○ 37
5188_Actual effort (management)_basic design ○ 63
5189_Actual effort (management)_detailed design ○ 44
5190_Actual effort (management)_construction ○ 60
5191_Actual effort (management)_integration test ○ 42
5192_Actual effort (management)_system test ○ 59
5193_Actual effort (management)_acceptance test ○ 16
10131_Actual effort (management)_out of category 647
10007_Actual effort (management)_development plan 2
10008_Actual effort (management)_requirements definition 12
10009_Actual effort (management)_basic design 27
10010_Actual effort (management)_detailed design 5
10011_Actual effort (management)_construction 17
10012_Actual effort (management)_integration test 22
10013_Actual effort (management)_system test 23
10014_Actual effort (management)_acceptance test 6
10132_Actual effort (management)_out of category 365
10133_Actual effort (out-of-category)_development plan 7
10134_Actual effort (out-of-category)_requirements definition 7
10135_Actual effort (out-of-category)_basic design 9
10136_Actual effort (out-of-category)_detailed design 6
10137_Actual effort (out-of-category)_construction 6
10138_Actual effort (out-of-category)_integration test 5
10139_Actual effort (out-of-category)_system test 14
10140_Actual effort (out-of-category)_acceptance test 29
10141_Actual effort (out-of-category)_out of category 134
5196_Actual outsourcing (effort)_development plan ○ 23
5197_Actual outsourcing (effort)_requirements definition ○ 164
5198_Actual outsourcing (effort)_basic design ○ 394
5199_Actual outsourcing (effort)_detailed design ○ 385
5200_Actual outsourcing (effort)_construction ○ 480
5201_Actual outsourcing (effort)_integration test ○ 370
5202_Actual outsourcing (effort)_system test ○ 408
5203_Actual outsourcing (effort)_acceptance test ○ 99
10145_Actual outsourcing (effort)_out of category 515
5204_Actual outsourcing data (expenditure ratio) □ 274
5206_Actual review status (effort)_development plan 8
5207_Actual review status (effort)_requirements definition 22
5208_Actual review status (effort)_basic design △ 109
5209_Actual review status (effort)_detailed design △ 108
5210_Actual review status (effort)_construction △ 58
5211_Actual review status (effort)_integration test △ 26
5212_Actual review status (effort)_system test △ 25
5213_Actual review status (effort)_acceptance test 6
5215_Actual review status (number of times)_development plan 3
5216_Actual review status (number of times)_requirements definition 16
5217_Actual review status (number of times)_basic design 57
5218_Actual review status (number of times)_detailed design 43
5219_Actual review status (number of times)_construction 23
5220_Actual review status (number of times)_integration test 35
5221_Actual review status (number of times)_system test 17
5222_Actual review status (number of times)_acceptance test 5
5223_Average number of staff_whole project ◎ 742
5224_Average number of staff_development plan 19
5225_Average number of staff_requirements definition 29
5226_Average number of staff_basic design △ 81
5227_Average number of staff_detailed design △ 77
5228_Average number of staff_construction △ 84
5229_Average number of staff_integration test △ 78
5230_Average number of staff_system test △ 55
5231_Average number of staff_acceptance test 20
5332_Number of staff at peak time_whole project ◎ 1,000
5333_Number of staff at peak time_development plan 17
5334_Number of staff at peak time_requirements definition 48
5335_Number of staff at peak time_basic design ○ 96
5336_Number of staff at peak time_detailed design ○ 93
5337_Number of staff at peak time_construction ○ 102
5338_Number of staff at peak time_integration test ○ 94
5339_Number of staff at peak time_system test ○ 83
5340_Number of staff at peak time_acceptance test 20
10059_Average number of outsourced staff_whole project △ 144
10060_Average number of outsourced staff_development plan 3
10061_Average number of outsourced staff_requirements definition 13
10062_Average number of outsourced staff_basic design △ 38
10063_Average number of outsourced staff_detailed design △ 41
10064_Average number of outsourced staff_construction △ 49
10065_Average number of outsourced staff_integration test △ 42
10066_Average number of outsourced staff_system test △ 24
10067_Average number of outsourced staff_acceptance test 13
10068_Number of outsourced staff at peak time_whole project 117
10069_Number of outsourced staff at peak time_development plan 3
10070_Number of outsourced staff at peak time_requirements definition 13
10071_Number of outsourced staff at peak time_basic design 35
10072_Number of outsourced staff at peak time_detailed design 39
10073_Number of outsourced staff at peak time_construction 45
10074_Number of outsourced staff at peak time_integration test 40
10075_Number of outsourced staff at peak time_system test 23
10076_Number of outsourced staff at peak time_acceptance test 14
a: Project members were assigned b: Dedicated members were assigned c: No members were assigned
5241_Personnel assignment for quality assurance ○ 380 252 124 4
a: Yes b: No
1013_Existence of third-party reviews ○ 227 206 21
10079_Number of issues pointed out in reviews_requirements definition 17
5249_Per-phase number of issues pointed out in reviews_basic design △ 171
5250_Per-phase number of issues pointed out in reviews_detailed design △ 0
10080_Number of issues pointed out in reviews_construction △ 59
10081_Number of issues pointed out in reviews_integration test △ 26
10082_Number of issues pointed out in reviews_system test △ 17
10083_Number of issues pointed out in reviews_acceptance test 5
5251_Number of test cases_integration test ○ 701
5252_Number of test cases_system test ○ 857
5253_Number of identified failures caused by software defects_integration test ○ 660
5254_Number of identified failures caused by software defects_system test ○ 853
10098_Number of identified faults caused by software defects_integration test ○ 245
10099_Number of identified faults caused by software defects_system test ○ 368
5255_Number of identified failures (Very critical)_1 month ○ 248
5256_Number of identified failures (Very critical)_3 month ○ 205
5257_Number of identified failures (Very critical)_6 month △ 59
5259_Number of identified failures (Critical)_1 month ○ 244
5260_Number of identified failures (Critical)_3 month ○ 152
5261_Number of identified failures (Critical)_6 month △ 59
5263_Number of identified failures (Insignificant)_1 month △ 248
5264_Number of identified failures (Insignificant)_3 month △ 153
5265_Number of identified failures (Insignificant)_6 month △ 70
5267_Number of identified failures (Total)_1 month ◎ 520
5268_Number of identified failures (Total)_3 month ◎ 452
5269_Number of identified failures (Total)_6 month △ 104
10100_Number of identified faults (Very critical)_1 month 68
10101_Number of identified faults (Very critical)_3 month 65
10102_Number of identified faults (Very critical)_6 month 51
10104_Number of identified faults (Critical)_1 month 73
10105_Number of identified faults (Critical)_3 month 71
10106_Number of identified faults (Critical)_6 month 53
10108_Number of identified faults (Insignificant)_1 month 78
10109_Number of identified faults (Insignificant)_3 month 77
10110_Number of identified faults (Insignificant)_6 month 59
10112_Number of identified faults (Total)_1 month ○ 149
10113_Number of identified faults (Total)_3 month ○ 181
10114_Number of identified faults (Total)_6 month △ 102
Per-alternative reply status: Data item 201
Data Item 201_Industry_1 201_Industry_2 201_Industry_3
Priority ◎
Total 1,525 30 5
01: Agriculture 5 0 0
02: Forestry 0 0 0
03: Fisheries 1 0 0
04: Aquaculture 0 0 0
05: Mining 0 0 0
06: Construction work, general, including public and private construction work 13 0 0
07: Construction work by specialist contractor, except equipment installation work 5 0 0
08: Equipment installation work 5 0 0
09: Manufacture of food 24 0 0
10: Manufacture of beverages, tobacco and feed 2 0 0
11: Manufacture of textile mill products, except apparel and other finished products made from fabrics and similar materials 0 0 0
12: Manufacture of apparel and other finished products made from fabrics and similar materials 4 0 0
13: Manufacture of lumber and wood products, except furniture 0 0 0
14: Manufacture of furniture and fixtures 4 0 0
15: Manufacture of pulp, paper and paper products 4 0 0
16: Printing and allied industries 9 0 0
17: Manufacture of chemical and allied products 19 0 0
18: Manufacture of petroleum and coal products 1 0 0
19: Manufacture of plastic products, except otherwise classified 0 0 0
20: Manufacture of rubber products 1 0 0
21: Manufacture of leather tanning, leather products and fur skins 1 0 0
22: Manufacture of ceramic, stone and clay products 1 0 0
23: Manufacture of iron and steel 4 0 0
24: Manufacture of non-ferrous metals and products 3 0 0
25: Manufacture of fabricated metal products 4 0 0
26: Manufacture of general machinery 5 0 0
27: Manufacture of electrical machinery, equipment and supplies 24 0 0
28: Manufacture of information and communication electronics equipment 13 0 0
29: Electronic parts and devices 17 0 0
30: Manufacture of transportation equipment 36 0 0
31: Manufacture of precision instruments and machinery 17 1 0
32: Miscellaneous manufacturing industries 29 0 0
33: Production, transmission and distribution of electricity 18 0 0
34: Manufacture of gas 8 0 0
35: Heat supply 5 0 0
36: Collection, purification and distribution of water, and sewage collection, processing and disposal 0 0 0
37: Communications 80 4 0
38: Broadcasting 9 0 0
39: Information services 102 3 0
40: Internet based services 9 6 0
41: Video picture, sound information, character information production and distribution 7 1 0
42: Railway transport 19 0 0
43: Road passenger transport 6 0 0
44: Road freight transport 3 1 0
45: Water transport 4 0 0
46: Air transport 17 0 0
47: Warehousing 2 1 0
48: Services incidental to transport 21 0 1
49: Wholesale trade, general merchandise 10 0 1
50: Wholesale trade (textile and apparel) 4 0 0
51: Wholesale trade (food and beverages) 13 2 0
52: Wholesale trade (building materials, minerals and metals, etc.) 3 0 0
53: Wholesale trade (machinery and equipment) 0 1 0
54: Miscellaneous wholesale trade 4 0 0
55: Retail trade, general merchandise 46 0 0
56: Retail trade (dry goods, apparel and apparel accessories) 6 1 0
57: Retail trade (food and beverages) 3 0 0
58: Retail trade (motor vehicles and bicycles) 12 0 0
59: Retail trade (furniture, household utensil and household appliance) 3 0 0
60: Miscellaneous retail trade 16 0 0
61: Banking 149 3 0
62: Financial institutions for cooperative organizations 21 0 1
63: Institutions dealing with postal savings, government-related financial institutions 7 1 2
64: Non-deposit money corporations engaged in the provision of finance, credit and investment 65 0 0
65: Securities and futures commodity dealing activities 85 0 0
66: Financial auxiliaries 6 0 0
67: Insurance institutions, including insurance agents, brokers and services 158 1 0
68: Real estate agencies 11 0 0
69: Real estate lessors and managers 13 0 0
70: General eating and drinking places 5 0 0
71: Spree eating and drinking places 0 0 0
72: Accommodations 2 0 0
73: Medical and other health services 13 0 0
74: Public health and hygiene 1 0 0
75: Social insurance and social welfare 9 0 0
76: School education 7 0 0
77: Miscellaneous education, learning support 1 0 0
78: Postal services, except otherwise classified 3 0 0
79: Cooperative associations, n.e.c. 3 0 0
80: Professional services, n.e.c. 32 0 0
81: Scientific and development research institutes 6 0 0
82: Laundry, beauty and bath services 1 0 0
83: Miscellaneous living-related and personal services 5 1 0
84: Services for amusement and hobbies 7 0 0
85: Waste disposal business 0 0 0
86: Automobile maintenance services 2 1 0
87: Machine, etc. repair services, except otherwise classified 0 0 0
88: Goods rental and leasing 1 0 0
89: Advertising 8 0 0
90: Miscellaneous business services 1 0 0
91: Political, business and cultural organizations 0 0 0
92: Religion 0 0 0
93: Miscellaneous services 1 0 0
Unknown 0 0 0
94: Foreign governments and international agencies in Japan 60 2 0
95: National government services 49 0 0
96: Local government services 50 0 0
99: Industries unable to classify 62 0 0
Per-alternative reply status: Data item 202
Data Item 202_Business Type_1 202_Business Type_2 202_Business Type_3
Priority ◎
Total 1,355 92 19
a: Management/planning 14 2 0
b: Accounting 95 8 0
c: Sales 160 10 0
d: Production/distribution 62 2 2
e: Personnel/welfare 43 0 1
f: General management 162 8 3
g: General affairs 24 7 0
h: Research/development 27 0 0
i: Technology/control 55 2 0
j: Master management 23 3 3
k: Ordering/inventory 79 10 1
l: Distribution management 15 3 0
m: Subcontractor management 2 1 0
n: Contract/transfer 52 5 0
o: Customer management 65 3 3
p: Product planning (per-product) 13 1 0
q: Product management (per-product) 47 7 1
r: Facility (stores) 23 0 0
s: Information analysis 73 8 4
t: Other 321 12 1
Per-alternative reply status: Data item 203
Data Item 203_System applications_1 203_System applications_2 203_System applications_3
Priority ○
Total 336 23 2
a: Workflow support and management 65 6 1
b: Network management 13 0 0
c: Job management and monitoring 6 0 0
d: Process control 2 1 0
e: Security management 4 2 0
f: Finance dealing 68 3 0
g: Reporting 7 2 0
h: Online analysis and reporting 6 0 0
i: Data management/data mining 41 3 0
j: Web portal site 6 2 0
k: ERP 7 1 0
l: SCM 6 0 0
m: CRM_CTI 15 0 0
n: Document management 6 0 0
o: Knowledge management 0 0 0
p: Catalog management 0 1 0
q: Mathematics modeling (finance/engineering) 0 0 0
r: 3D modeling/animation 2 0 0
s: Geographic/spatial positioning 8 0 0
t: Graphics and publishing tools 1 1 0
u: Imaging 1 0 0
v: Video processing 0 0 0
w: Voice processing 1 0 0
x: Built-in software (for machine control) 4 0 0
y: Device drivers/interface drivers 0 0 0
z: OS/software utilities 0 0 0
A: Software development tools 1 0 0
B: Consumer products (word processors, spreadsheets, etc.) 1 0 0
C: EDI 2 1 0
D: EAI 1 0 0
E: Emulators 1 0 0
F: File transfer 0 0 1
G: Other 61 0 0
Per-alternative reply status: Data item 309
Data Item 309_Target platform_1 309_Target platform_2 309_Target platform_3
Priority ◎
Total 1,498 418 55
a: Windows 95, 98, or Me 32 67 7
b: Windows NT, 2000, or XP 706 133 16
c: Windows Server 2003 82 31 2
d: HP-UX 126 43 2
e: HI-UX 22 7 2
f: AIX 39 16 4
g: Solaris 178 60 6
h: Redhat Linux 23 8 1
i: SUSE Linux 0 0 0
j: Miracle Linux 0 0 0
k: Turbo Linux 3 1 0
l: Other type of Linux 5 1 0
m: Linux 37 5 1
n: Other type of UNIX 45 14 1
o: MVS 60 1 1
p: IMS 7 2 0
q: TRON 1 0 0
r: Office computer system 11 2 0
s: Other 121 27 12
Per-alternative reply status: Data item 310
Data Item 310_Use of Web technology_1 310_Use of Web technology_2 310_Use of Web technology_3
Priority ◎
Total 1,076 178 75
a: HTML 77 25 4
b: XML 17 17 1
c: Java Script 95 41 8
d: ASP 53 6 2
e: JSP 38 15 14
f: J2EE 22 18 9
g: Apache 27 9 8
h: IIS 29 8 3
i: Tomcat 5 12 9
j: JBOSS 1 1 0
k: OracleAS 4 1 0
l: WebLogic 40 4 4
m: WebSphere 38 5 6
n: Coldfusion 4 0 0
o: WebService 2 0 1
p: Other 61 16 6
q: None 563 0 0
Per-alternative reply status: Data item 312
Data Item 312_Primary programming language_1 312_Primary programming language_2 312_Primary programming language_3 312_Primary programming language_4 312_Primary programming language_5
Priority ◎ ○ ○
Total 1,610 707 253 19 4
a: Assembly language 1 2 0 0 0
b: COBOL 318 47 4 0 0
c: PL/I 8 3 2 0 0
d: Pro*C 17 14 6 2 0
e: C++ 87 21 9 0 0
f: Visual C++ 72 33 8 0 0
g: C 220 101 21 3 1
h: VB 276 120 28 2 0
i: Excel (VBA) 11 9 4 1 0
j: PowerBuilder 7 8 7 0 0
k: Developer2000 17 1 0 0 0
l: InputMan 0 3 0 0 0
m: PL/SQL 37 59 25 1 0
n: ABAP 13 0 0 0 0
o: C# 25 4 2 0 0
p: Visual Basic.NET 32 6 1 0 0
q: Java 310 77 35 1 0
r: Perl 7 7 8 0 1
s: Shell script 2 16 11 0 0
t: Delphi 5 10 3 0 0
u: HTML 12 47 8 0 0
v: XML 3 7 6 1 0
w: Other 130 112 65 8 2
Per-alternative reply status: Data item 313
Data Item 313_Use of DBMS_1 313_Use of DBMS_2 313_Use of DBMS_3
Priority ◎
Total 1,303 52 6
a: Oracle 634 5 2
b: SQL Server 102 9 1
c: PostgreSQL 15 1 0
d: MySQL 6 0 0
e: Sybase 10 0 0
f: Informix 1 1 0
g: ISAM 6 1 0
h: DB2 53 15 0
i: Access 24 7 0
j: HiRDB 50 3 0
k: IMS 45 0 0
l: Other 157 9 2
m: None 200 1 1
Per-alternative reply status: Data item 126
Data Item 126_Reason of QCD objectives failure_1 126_Reason of QCD objectives failure_2 126_Reason of QCD objectives failure_3
Priority
Total 82 30 11
a: Incomplete development objectives 1 0 0
b: Incomplete RFP contents 1 0 0
c: Delayed completion of requirements specifications 24 1 0
d: Insufficient analysis of requirements 11 7 0
e: Miscasting of in-house members 10 9 0
f: Subcontractor selection failure 2 3 4
g: Insufficient capability of the development team 1 1 4
h: Incomplete test planning 7 1 1
i: Insufficient acceptance inspection 1 2 1
j: Insufficient system test and/or acceptance test 6 1 0
k: Insufficient ability of project managers 6 1 0
l: Other 12 4 1
Appendix D Glossary
Appendix D presents brief descriptions of the statistical terms used in the analyses in this White Paper. The descriptions are based on the Statistics Science Dictionary (Asakura Publishing) and other references.
• Median (50th percentile)
Suppose that we arrange the samples in order of their values. If the number of samples is odd, the median is the value of the middle sample, which divides the samples into equal halves of higher-value and lower-value samples. If the number of samples is even, the median is the mean of the two samples adjacent to the center. The median is especially suitable for indicating the location of an asymmetric distribution, because outliers do not affect it.
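As an illustration of the definition above, Python's standard statistics module behaves exactly as described (sample values chosen arbitrarily):

```python
import statistics

# Odd number of samples: the value of the middle sample.
m_odd = statistics.median([3, 1, 5])
# Even number of samples: the mean of the two samples adjacent to the center.
m_even = statistics.median([1, 3, 5, 7])
# The median is robust: one extreme value does not move it.
m_outlier = statistics.median([1, 3, 5, 7, 10_000])
assert m_odd == 3 and m_even == 4.0 and m_outlier == 5
```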
• Mean
The mean of samples is the sum of the sample values divided by the number of samples.
• Variance
Suppose that we have n samples X1, X2, ..., Xn from a distribution F. The variance of the samples is the sum of the squared differences between each Xi (i = 1, ..., n) and the sample mean, divided by the number of samples.
• Standard deviation
The standard deviation equals the square root of the variance. (The standard deviation indicates the degree of dispersion.)
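The two definitions above can be checked with a short sketch. Note that the glossary's variance divides by the number of samples n (the population form), which matches statistics.pvariance and statistics.pstdev; statistics.variance instead divides by n - 1:

```python
import math
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean = sum(data) / len(data)                           # 5.0
# Sum of squared differences from the mean, divided by n.
var = sum((x - mean) ** 2 for x in data) / len(data)
# The standard deviation is the square root of the variance.
sd = math.sqrt(var)
assert var == statistics.pvariance(data) == 4.0
assert sd == statistics.pstdev(data) == 2.0
```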
• Standard error
Suppose that we have a statistic T computed from samples. The standard deviation of T over repeated sampling is referred to as the standard error of T. If, for example, we have n samples (X1, X2, ..., Xn) from a distribution that has variance σ2, the sample mean equals (X1 + X2 + ... + Xn) ÷ n, and the standard error of the sample mean equals σ ÷ √n. In some cases, the standard error is reported in place of the standard deviation (the square root of the variance).
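A small seeded simulation, with σ and n chosen arbitrarily, illustrates that the spread of repeated sample means matches σ ÷ √n:

```python
import math
import random
import statistics

random.seed(0)
sigma, n = 2.0, 25
theoretical_se = sigma / math.sqrt(n)   # sigma / sqrt(n) = 0.4

# Draw many samples of size n and compute the mean of each;
# the standard deviation of those means approximates the standard error.
means = [statistics.fmean(random.gauss(0.0, sigma) for _ in range(n))
         for _ in range(2000)]
empirical_se = statistics.pstdev(means)
assert abs(empirical_se - theoretical_se) < 0.05
```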
• Normal distribution
The normal distribution has a distribution curve that spreads symmetrically on both sides of the mean. The curve peaks at the mean and descends gradually on both sides as it moves away from the mean. The curve becomes flatter as the standard deviation becomes larger, and more sharply peaked as the standard deviation becomes smaller.
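The described shape can be verified numerically from the normal density formula (the helper below is written purely for illustration):

```python
import math

def normal_pdf(x: float, mu: float, sigma: float) -> float:
    """Density of the normal distribution with mean mu and standard deviation sigma."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Symmetric around the mean, and highest at the mean.
assert normal_pdf(1.0, 0.0, 1.0) == normal_pdf(-1.0, 0.0, 1.0)
assert normal_pdf(0.0, 0.0, 1.0) > normal_pdf(0.5, 0.0, 1.0)
# A larger standard deviation flattens the peak.
assert normal_pdf(0.0, 0.0, 2.0) < normal_pdf(0.0, 0.0, 1.0)
```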
• Histogram
The y-axis of a histogram shows the absolute or relative frequency and its x-axis shows continuous intervals of some value, or continuous categories. A histogram illustrates a series of bars like a bar chart, and each bar indicates the absolute or relative frequency of cases that belong to the category indicated by the bar.
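A minimal sketch of the binning step behind a histogram, using arbitrary values and a bin width of 10; each count is the absolute frequency a bar would show:

```python
from collections import Counter

values = [3, 7, 12, 15, 18, 24, 31, 33, 38, 39]
# Map each value to the lower edge of its interval, then count per interval.
bins = Counter((v // 10) * 10 for v in values)
assert bins == {0: 2, 10: 3, 20: 1, 30: 4}
```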
• Skewness
Skewness indicates the extent to which a distribution is concentrated toward the right or left, making its curve asymmetric compared with the symmetric normal distribution.
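One common way to quantify this is the third central moment divided by the cubed standard deviation. Several sample-correction conventions exist; the sketch below uses the simplest (population) form, which is an assumption of this example rather than the White Paper's own definition:

```python
import statistics

def skewness(data: list[float]) -> float:
    """Population skewness: third central moment over the cubed standard deviation."""
    mean = statistics.fmean(data)
    sd = statistics.pstdev(data)
    return sum((x - mean) ** 3 for x in data) / (len(data) * sd ** 3)

# A symmetric sample gives zero; a long right tail gives positive skewness.
sym = skewness([1.0, 2.0, 3.0, 4.0, 5.0])
tailed = skewness([1.0, 1.0, 1.0, 1.0, 10.0])
assert sym == 0.0 and tailed > 0
```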
• Correlation coefficient
The correlation coefficient indicates the degree of correlation between two variables. If variables x and y have a linear relationship between them, they are correlated. The correlation coefficient takes a value between -1 and 1. The closer it is to 1 or -1, the more linear the relationship between the two variables; the closer it is to 0, the less linear. A correlation coefficient at or near zero indicates that the two variables have no linear correlation.
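This definition corresponds to the Pearson correlation coefficient, which can be computed directly (illustrative data):

```python
import math
import statistics

def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / math.sqrt(sum((x - mx) ** 2 for x in xs) *
                           sum((y - my) ** 2 for y in ys))

xs = [1.0, 2.0, 3.0, 4.0]
r_up = pearson_r(xs, [2.0, 4.0, 6.0, 8.0])    # perfectly linear, increasing
r_down = pearson_r(xs, [8.0, 6.0, 4.0, 2.0])  # perfectly linear, decreasing
assert r_up == 1.0 and r_down == -1.0
```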
• Box-and-whisker plot
A box-and-whisker plot displays a summary of major statistics (median, first and third quartiles, maximum, and minimum) of a population as an illustration of box and whiskers with outliers taken into account. Two opposite edges of the box reside at the upper and lower quartiles. Thus the box covers 50% of the population. A whisker extends from each of the two edges up or down to the highest or lowest observation except for outliers. The line that divides the box into two portions resides at the median. Refer to Section 3.3 for details.
• Quartiles (25th, 50th, and 75th percentiles)
Quartiles divide a probability or frequency distribution into four equal portions. These quartiles are referred to as the first, second, and third quartiles from the lowest one to the highest. The second quartile equals the median.
• 25th percentile The 25th percentile equals the value below which 25% of the observations reside.
• 75th percentile
The 75th percentile equals the value that divides the observations into the lower 75% portion and the upper 25% portion.
• Outliers
Outliers are observations that fall in one of two ranges, upper and lower. The upper range extends from the level 1.5 times the box's height above the upper hinge to the level 3 times the box's height above the upper hinge. Similarly, the lower range extends from the level 1.5 times the box's height below the lower hinge to the level 3 times the box's height below the lower hinge.
• Extreme outlier
Extreme outliers reside above the upper boundary or below the lower boundary. The upper boundary resides at the level 3 times the box's height above the upper hinge; similarly, the lower boundary resides at the level 3 times the box's height below the lower hinge.
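A sketch of both fences, taking the hinges as the first and third quartiles (one common convention) and the box's height as their difference; the data set is arbitrary:

```python
import statistics

def classify(data: list[float]) -> dict[str, list[float]]:
    """Split observations into regular points, outliers, and extreme outliers
    using the box-plot fences described above."""
    q1, _, q3 = statistics.quantiles(data, n=4, method="inclusive")
    iqr = q3 - q1  # the box's height
    out: dict[str, list[float]] = {"regular": [], "outlier": [], "extreme": []}
    for x in data:
        distance = max(q1 - x, x - q3, 0.0)  # distance beyond the nearer hinge
        if distance > 3 * iqr:
            out["extreme"].append(x)
        elif distance > 1.5 * iqr:
            out["outlier"].append(x)
        else:
            out["regular"].append(x)
    return out

result = classify([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 25.0, 60.0])
assert result["outlier"] == [25.0]
assert result["extreme"] == [60.0]
```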
• p-value
The p-value is the probability, computed under a null hypothesis, of obtaining a value of the test statistic at least as extreme as the value actually observed. (A null hypothesis is the hypothesis that a test starts from.) If a small p-value is obtained, the null hypothesis and the observations are considered mutually inconsistent, and the null hypothesis is considered false (and is rejected).
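A hypothetical one-sided example: after observing 8 heads in 10 flips of a coin assumed fair under the null hypothesis, the p-value is the probability of a result at least as extreme as the one observed:

```python
from math import comb

# P(X >= 8) for X ~ Binomial(10, 0.5): the chance, under the null
# hypothesis of a fair coin, of a result at least as extreme as 8 heads.
n, observed = 10, 8
p_value = sum(comb(n, k) for k in range(observed, n + 1)) / 2 ** n
assert abs(p_value - 0.0546875) < 1e-12
# Not below the conventional 0.05 level, so the null is not rejected here.
assert p_value > 0.05
```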
• Function point method
The function point method captures the size of software by quantifying the size of the functions of the software. The terms function point and function point method correspond to functional size and functional size measurement in JIS X 0135-1:1999 (ISO/IEC 14143-1). Many function point measurement methods have been created. Popular methods suitable for business application software development include the IFPUG method, the NESMA estimated method, and the SPR method.
• Functional size
(Based on a quotation from the Software Measurement—Functional Size Measurement—JIS X 0135-1:1999) The functional size is the size of software obtained by quantifying functional user requirements. Functional user requirements constitute a subset of user requirements and specify the user tasks and user procedures that have to be implemented in software to satisfy user needs. Functional user requirements exclude quality requirements and technical requirements.
• FP size
The FP size refers to the size of software represented in function points.
• SLOC size, KSLOC
SLOC stands for Source Lines of Code. The SLOC size refers to the size of software represented in source lines of code. The unit KSLOC is used to represent multiples of one thousand SLOCs.
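SLOC counting rules vary between organizations (see Park [16] for a full counting framework); purely as an illustration, the sketch below shows one common convention, counting non-blank lines that are not pure comments.

```python
def count_sloc(source_text, comment_prefix="#"):
    """Count non-blank, non-comment physical source lines."""
    return sum(
        1
        for line in source_text.splitlines()
        if line.strip() and not line.strip().startswith(comment_prefix)
    )

sample = "# setup\nx = 1\n\ny = x + 1\n"
print(count_sloc(sample))  # → 2
```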
Appendix E References
Appendix E.1 presents reference materials related to this White Paper, including reference books and documents, books describing definitions, and industry standards. Appendix E.2 lists information sources in Japan and other countries and the software programs used for the analysis presented in this White Paper.
E.1 Reference Materials
[1] Tadakazu Okuno, Hitoshi Kume, Toshirou Haga, Tadashi Yoshizawa, "Multivariate Analysis," Union of Japanese Scientists and Engineers, 1971
[2] Yutaka Tanaka, Kazuaki Wakimoto, "Multivariate Statistics and Analysis," Gendai-Sugakusha, 1983
[3] "Daijirin, 2nd ed.," SANSEIDO Publishing, 1998
[4] B. S. Everitt (translated by Yoshikazu Shimizu), "Dictionary of Statistics," Asakura Publishing, 2003
[5] JIS X 0135-1:1999 (ISO/IEC 14143-1:1998 Information technology -- Software measurement -- Functional size measurement -- Part 1: Definition of concepts)
[6] JIS X 0141:2004 (ISO/IEC 15939:2002 Software engineering -- Software measurement process)
[7] JIS X 0160:1996 (ISO/IEC 12207:1995 Information technology -- Software life cycle processes)
[8] ISO/IEC 12207:1995/Amd 1:2002
[9] ISO/IEC 12207:1995/Amd 2:2004
[10] ISO/IEC 20926:2003 Software engineering -- IFPUG 4.1 Unadjusted functional size measurement method -- Counting practices manual
[11] ISO/IEC 24570:2005 Software engineering -- NESMA functional size measurement method version 2.1 -- Definitions and counting guidelines for the application of Function Point Analysis (the NESMA method)
[12] ISO/IEC 19761:2003 Software engineering -- COSMIC-FFP -- A functional size measurement method
[13] Japan Function Point User Group (JFPUG), "Counting Practice Manual 4.2"
[14] C. J. Lokan, "Function Points," in M. Zelkowitz (ed.), Advances in Computers, Volume 65, Chapter 7, Academic Press, 2005
[15] Capers Jones, "Applied Software Measurement, 2nd ed.," New York: McGraw-Hill, 1996; translated and edited by Tsuruho and Tomino as "Quantitative Method for Software Development, 2nd ed.," KYORITSU SHUPPAN
[16] R. E. Park, "Software Size Measurement: A Framework for Counting Source Statements," Technical Report CMU/SEI-92-TR-020, 1992
[17] W. B. Goethert, E. K. Bailey, M. B. Busby, "Software Effort & Schedule Measurement: A Framework for Counting Staff-hours and Reporting Schedule Information," Technical Report CMU/SEI-92-TR-021, 1992
[18] W. A. Florac, "Software Quality Measurement: A Framework for Counting Problems and Defects," Technical Report CMU/SEI-92-TR-022, 1992
[19] B. W. Boehm, et al., "Software Cost Estimation with COCOMO II," Prentice Hall PTR, 2000
[20] David Garmus and David Herron (translated and edited by Kodama), "Function Point Measurement and Analysis," Pearson Education, 2002
[21] S. H. Kan (translated and edited by Koyama and Tomino), "Measure and Model for Software Quality Engineering," KYORITSU SHUPPAN, 2004
[22] ISBSG, "The Benchmark Release 6," http://www.isbsg.org.au
[23] ISBSG, "The Benchmark Release 8," http://www.isbsg.org.au
[24] SEC journal, IPA Software Engineering Center, Ohmsha, http://sec.ipa.go.jp/secjournal/
[25] Shigeru Nishiyama, "Technology Review: Current Trend of Software Functional Size Measurement Methods," SEC journal No. 5, IPA Software Engineering Center, Ohmsha, 2006
[26] Kadota, Mashima, Masuda, Hatano, Isono, Utsumi, Kikuchi, Hattori, Hosoya, Mori, "Technology Review: Analysis of Factors of Schedule Tightness," SEC journal No. 10, IPA Software Engineering Center, Ohmsha, 2007
[27] Kikuchi, “Quantitative Data Analysis”, SEC journal No.10, IPA Software Engineering Center, Ohmsha, 2007
[28] IPA Software Engineering Center, “White Paper 2005 on Software Development Projects in Japan” Nikkei Business Publications, 2005
[29] IPA Software Engineering Center, “White Paper 2006 on Software Development Projects in Japan” Nikkei Business Publications, 2006
E.2 Reference Information
● User groups and associations in Japan and other countries
[1] Japan Function Point User Group (JFPUG)
[2] NESMA (Netherlands Software Metrics Users Association), http://www.nesma.nl/sectie/home/, Japanese: http://www.nesma.nl/japanese/index.htm
[3] ISBSG, http://www.isbsg.org.au
● Software programs
The following software programs were used to analyze the project data presented in this White Paper.
• Microsoft Office Excel 2003
Excel was used to draw or calculate pie charts, bar graphs, histograms, basic statistics, correlation coefficients, correlation curves, and percentiles.
• SPSS 13.0J for Windows, SPSS Japan
SPSS was used to draw box-and-whisker plots.
• SEC-proprietary software tools
Software tools proprietary to SEC were used to calculate the values necessary to draw the confidence-interval graphs.
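The SEC tools themselves are not public. As a hypothetical sketch of the kind of calculation involved, the function below computes the half-width of an approximate confidence band around a simple linear regression line, using a normal approximation with a fixed z value; the actual tools may use exact t quantiles and a different regression model.

```python
import math

def regression_band_halfwidth(xs, ys, x_new, z=1.96):
    """Approximate half-width of the confidence band for the regression
    mean at x_new (z = 1.96 for ~95%, z = 0.674 for ~50%)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    # Ordinary least-squares fit
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sxx
    intercept = mean_y - slope * mean_x
    # Residual standard error with n - 2 degrees of freedom
    sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    s = math.sqrt(sse / (n - 2))
    return z * s * math.sqrt(1 / n + (x_new - mean_x) ** 2 / sxx)
```

Note that the band widens as x_new moves away from the mean of the x data, which is why confidence bands in scattergrams flare outward at both ends.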
* Microsoft Office Excel 2003 is a registered trademark of Microsoft Corporation. * SPSS is a registered trademark of SPSS Inc.
For more information about these software programs, contact the relevant companies.
List of Figures and Tables
Chapter 2 Data Collection
Figure 2-2-1 ● Data Volume and Company Count
Figure 2-2-2 ● Data-Updated Project Count per Fiscal Year
Figure 2-2-3 ● Per-Beginning-Year and Per-Completion-Year Project Count
Table 2-2-4 ● Per-Beginning-Year and Per-Completion-Year Cross-Total
Table 2-2-5 ● Per-Beginning-Year and Per-Completion-Year Project Count
Chapter 3 Data Analysis
Figure 3-1-1 ● Characteristic Elements and Their Mutual Relationship
Figure 3-2-1 ● Example of Outliers
Table 3-3-1 ● Unit Notation
Table 3-3-2 ● Basic Statistics Formats
Table 3-3-3 ● Evaluation Criteria When Using Basic Statistics
Table 3-3-4 ● Evaluation Criteria When Using Regression Analysis
Figure 3-3-5 ● Example of Scattergram with Confidence Width
Figure 3-3-6 ● Box-And-Whisker Plot Example
Chapter 4 Profiles of Collected Data
Figure 4-2-1 ● Types of Projects
Figure 4-2-2 ● Category of Project
Figure 4-2-3 ● Purpose of Project
Figure 4-2-4 ● New Customer or Not
Figure 4-2-5 ● New Business or Not
Figure 4-2-6 ● Using New Technology or Not
Figure 4-3-1 ● Type of Industry (Major Type)
Figure 4-3-2 ● Type of Business
Figure 4-3-3 ● User Accessibility
Figure 4-4-1 ● Type of Developed System
Figure 4-4-2 ● Use of Business Application Package
Figure 4-4-3 ● Mode of Processing
Figure 4-4-4 ● Architecture
Figure 4-4-5 ● Target Platform
Figure 4-4-6 ● Use of Web Technology
Figure 4-4-7 ● Programming Language
Figure 4-4-8 ● Use of DBMS
Figure 4-5-1 ● Development Life Cycle Model
Figure 4-5-2 ● Examined Similar In-House Projects or Not
Figure 4-5-3 ● Application of Development Methods
Figure 4-5-4 ● Use of Development Frameworks
Figure 4-5-5 ● Using Tool Software or Not
Figure 4-6-1 ● User Requirements and Participation
Figure 4-6-2 ● Level of Requirements
Figure 4-7-1 ● Experiences and Skills of PMs
Figure 4-7-2 ● Experiences of Development Staff
Figure 4-7-3 ● Personnel Assignment and Skills for Testing
Figure 4-8-1 ● Types of Software Sizing Scale (by Number of Projects)
Figure 4-8-2 ● Types of Software Sizing Scale (by Number of Companies)
Figure 4-8-3 ● FP Measurement Method (by Number of Projects)
Figure 4-8-4 ● FP Measurement Method (by Number of Companies)
Figure 4-8-5 ● Purity of FP Measurement Method
Figure 4-8-6 ● Actual FP Size
Figure 4-8-7 ● Actual SLOC Size
Figure 4-9-1 ● Planned Project Duration in Months
Figure 4-9-2 ● Actual Project Duration in Months
Figure 4-9-3 ● Planned Major-development Phase Duration in Months
Figure 4-9-4 ● Actual Major-development Phase Duration in Months
Figure 4-10-1 ● Project Effort Planned at the Beginning of Basic Design (Person-Hours)
Figure 4-10-2 ● Actual Project Effort (Person-Hours)
Figure 4-10-3 ● Actual Major-development Phase Effort (Person-Hours)
Figure 4-10-4 ● Project Effort Planned at the Beginning of Basic Design (Person-Months)
Figure 4-10-5 ● Actual Project Effort (Person-Months)
Figure 4-10-6 ● Actual Major-development Phase Effort (Person-Months)
Figure 4-10-7 ● Effort Unit
Figure 4-10-8 ● Conversion Ratio Between Person-Months and Person-Hours
Figure 4-11-1 ● Number of Staff Members per Month
Figure 4-11-2 ● The Ratio of Outsourced Effort Amount
Figure 4-11-3 ● The Ratio of Expenditure on Outsourced Work
Figure 4-11-4 ● Source of Outsourced Workforce
Figure 4-11-5 ● New Subcontractors or Not
Figure 4-12-1 ● Number of Defects Identified After System Cutover
Figure 4-12-2 ● Number of Defects Identified After System Cutover (Number of Failures)
Figure 4-12-3 ● Number of Defects Identified After System Cutover (Number of Faults)
Figure 4-12-4 ● Personnel Assignment for Quality Assurance
Figure 4-12-5 ● Practice of Quality Assurance Standard and Review
Figure 4-13-1 ● Development Phase Combinations
Figure 4-14-1 ● Evaluation of Planning (QCD)
Figure 4-14-2 ● Evaluation of Actual Results (QCD)
Figure 4-14-3 ● Self-Evaluation of Project Success
Figure 4-14-4 ● Subjective Evaluation of Customer Satisfaction by the Vendor
Chapter 5 Statistics of Major Project Elements
Figure 5-1-1 ● Stratification Scheme and Analyzed Elements
Table 5-1-2 ● Basic Statistics Presentation Format
Table 5-2-1 ● Per-Project-Type FP Size Project Count
Table 5-2-2 ● Per-FP-Measurement-Method FP Size Project Count
Figure 5-2-3 ● FP Size Distribution
Table 5-2-4 ● Per-Project-Type FP Size Distribution
Figure 5-2-5 ● Per-Project-Type FP Size Basic Statistics
Figure 5-2-6 ● FP Size Distribution (Development, Mixed FP Measurement Methods)
Figure 5-2-7 ● FP Size Distribution (Development, IFPUG Group)
Table 5-2-8 ● Per-FP-Measurement-Method FP Size Basic Statistics (Development)
Figure 5-2-9 ● FP Size Distribution (Enhancement, Mixed FP Measurement Methods)
Figure 5-2-10 ● FP Size Distribution (Enhancement, IFPUG Group)
Table 5-2-11 ● Per-FP-Measurement-Method FP Size Basic Statistics (Enhancement)
Table 5-2-12 ● Per-Industry-Type FP-Size Project Count (Development, Mixed FP Measurement Methods)
Figure 5-2-13 ● Per-Industry-Type FP Size Distribution (Development, Mixed FP Measurement Methods)
Table 5-2-14 ● Per-Industry-Type FP Size Basic Statistics (Development, Mixed FP Measurement Methods)
Table 5-2-15 ● Per-Industry-Type FP-Size Project Count (Enhancement, Mixed FP Measurement Methods)
Figure 5-2-16 ● Per-Industry-Type FP Size Distribution (Enhancement, Mixed FP Measurement Methods)
Table 5-2-17 ● Per-Industry-Type FP Size Basic Statistics (Enhancement, Mixed FP Measurement Methods)
Table 5-2-18 ● Per-Architecture-Type FP-Size Project Count (Development, Mixed FP Measurement Methods)
Figure 5-2-19 ● Per-Architecture-Type FP Size Distribution (Development, Mixed FP Measurement Methods)
Table 5-2-20 ● Per-Architecture-Type FP Size Basic Statistics (Development, Mixed FP Measurement Methods)
Table 5-2-21 ● Per-Architecture-Type FP-Size Project Count (Enhancement, Mixed FP Measurement Methods)
Figure 5-2-22 ● Per-Architecture-Type FP Size Distribution (Enhancement, Mixed FP Measurement Methods)
Table 5-2-23 ● Per-Architecture-Type FP Size Basic Statistics (Enhancement, Mixed FP Measurement Methods)
Table 5-2-24 ● Per-Business-Type FP-Size Project Count (Development, Mixed FP Measurement Methods)
Table 5-2-25 ● Per-Business-Type FP Size Count (Enhancement, Mixed FP Measurement Methods)
Table 5-2-26 ● Per-Business-Type FP Size Basic Statistics (Development, Mixed FP Measurement Methods)
Table 5-3-1 ● Per-Project-Type SLOC Size Project Count
Figure 5-3-2 ● Per-Primary-Programming-Language SLOC Size Project Count
Figure 5-3-3 ● SLOC Size Distribution (Mixed Primary Programming Languages)
Figure 5-3-4 ● Per-Project-Type SLOC Size Distribution (Mixed Primary Programming Languages)
Table 5-3-5 ● Per-Project-Type SLOC Size Basic Statistics (Mixed Primary Programming Languages)
Figure 5-3-6 ● SLOC Size Distribution (Development, Mixed Primary Programming Languages)
Table 5-3-7 ● SLOC Size Basic Statistics (Development)
Figure 5-3-8 ● SLOC Size Distribution (Enhancement, Mixed Primary Programming Languages)
Table 5-3-9 ● SLOC Size Basic Statistics (Enhancement)
Table 5-3-10 ● Per-Industry-Type SLOC Size Project Count (Development, Mixed Primary Programming Languages)
Figure 5-3-11 ● Per-Industry-Type SLOC Size Distribution (Development, Mixed Primary Programming Languages)
Table 5-3-12 ● Per-Industry-Type SLOC Size Basic Statistics (Development, Mixed Primary Programming Languages)
Table 5-3-13 ● Per-Industry-Type SLOC Size Project Count (Enhancement, Mixed Primary Programming Languages)
Figure 5-3-14 ● Per-Industry-Type SLOC Size Distribution (Enhancement, Mixed Primary Programming Languages)
Table 5-3-15 ● Per-Industry-Type SLOC Size Basic Statistics (Enhancement, Mixed Primary Programming Languages)
Table 5-3-16 ● Per-Architecture-Type SLOC Size Project Count (Development, Mixed Primary Programming Languages)
Figure 5-3-17 ● Per-Architecture-Type SLOC Size Distribution (Development, Mixed Primary Programming Languages)
Table 5-3-18 ● Per-Architecture-Type SLOC Size Basic Statistics (Development, Mixed Primary Programming Languages)
Table 5-3-19 ● Per-Architecture-Type SLOC Size Project Count (Enhancement, Mixed Primary Programming Languages)
Figure 5-3-20 ● Per-Architecture-Type SLOC Size Distribution (Enhancement, Mixed Primary Programming Languages)
Table 5-3-21 ● Per-Architecture-Type SLOC Size Basic Statistics (Enhancement, Mixed Primary Programming Languages)
Table 5-3-22 ● Per-Business-Type SLOC Size Project Count (Development, Mixed Primary Programming Languages)
Figure 5-3-23 ● Per-Business-Type SLOC Size Distribution (Development, Mixed Primary Programming Languages)
Table 5-3-24 ● Per-Business-Type SLOC Size Basic Statistics (Development, Mixed Primary Programming Languages)
Table 5-3-25 ● Per-Business-Type SLOC Size Project Count (Enhancement, Mixed Primary Programming Languages)
Figure 5-3-26 ● Per-Business-Type SLOC Size Distribution (Enhancement, Mixed Primary Programming Languages)
Table 5-3-27 ● Per-Business-Type SLOC Size Basic Statistics (Enhancement, Mixed Primary Programming Languages)
Table 5-4-1 ● Per-Project-Type Schedule Duration Project Count
Figure 5-4-2 ● Schedule Duration Distribution
Figure 5-4-3 ● Per-Project-Type Schedule Duration Distribution
Table 5-4-4 ● Per-Project-Type Schedule Duration Basic Statistics
Figure 5-4-5 ● Schedule Duration Distribution (Development)
Table 5-4-6 ● Schedule Duration Basic Statistics (Development)
Figure 5-4-7 ● Schedule Duration Distribution (Enhancement)
Table 5-4-8 ● Schedule Duration Basic Statistics (Enhancement)
Table 5-4-9 ● Per-Industry-Type Schedule Duration Project Count (Development)
Figure 5-4-10 ● Per-Industry-Type Schedule Duration Distribution (Development)
Table 5-4-11 ● Per-Industry-Type Schedule Duration Basic Statistics (Development)
Table 5-4-12 ● Per-Industry-Type Schedule Duration Project Count (Enhancement)
Figure 5-4-13 ● Per-Industry-Type Schedule Duration Distribution (Enhancement)
Table 5-4-14 ● Per-Industry-Type Schedule Duration Basic Statistics (Enhancement)
Table 5-4-15 ● Per-Architecture-Type Schedule Duration Project Count (Development)
Figure 5-4-16 ● Per-Architecture-Type Schedule Duration Distribution (Development)
Table 5-4-17 ● Per-Architecture-Type Schedule Duration Basic Statistics (Development)
Table 5-4-18 ● Per-Architecture-Type Schedule Duration Project Count (Enhancement)
Figure 5-4-19 ● Per-Architecture-Type Schedule Duration Distribution (Enhancement)
Table 5-4-20 ● Per-Architecture-Type Schedule Duration Basic Statistics (Enhancement)
Table 5-4-21 ● Per-Business-Type Schedule Duration Project Count (Development)
Figure 5-4-22 ● Per-Business-Type Schedule Duration Distribution (Development)
Table 5-4-23 ● Per-Business-Type Schedule Duration Basic Statistics (Development)
Table 5-4-24 ● Per-Business-Type Schedule Duration Project Count (Enhancement)
Figure 5-4-25 ● Per-Business-Type Schedule Duration Distribution (Enhancement)
Table 5-4-26 ● Per-Business-Type Schedule Duration Basic Statistics (Enhancement)
Table 5-5-1 ● Per-Project-Type Effort Project Count
Figure 5-5-2 ● Effort Distribution
Figure 5-5-3 ● Per-Project-Type Effort Distribution
Table 5-5-4 ● Per-Project-Type Effort Basic Statistics
Figure 5-5-5 ● Effort Distribution of FP-Size Projects (Development, Mixed FP Measurement Methods)
Figure 5-5-6 ● Effort Distribution of FP-Size Projects (Development, IFPUG Group)
Table 5-5-7 ● Per-FP-Measurement-Method Effort Basic Statistics (Development)
Figure 5-5-8 ● Effort Distribution of FP-Size Projects (Enhancement, Mixed FP Measurement Methods)
Figure 5-5-9 ● Effort Distribution of FP-Size Projects (Enhancement, IFPUG Group)
Table 5-5-10 ● Per-FP-Measurement-Method Effort Basic Statistics (Enhancement)
Figure 5-5-11 ● Effort Distribution of SLOC-Size Projects (Development, Mixed Primary Programming Languages)
Table 5-5-12 ● Per-Primary-Programming-Language SLOC-Size Projects Effort Basic Statistics (Development)
Figure 5-5-13 ● Effort Distribution of SLOC-Size Projects (Enhancement, Mixed Primary Programming Languages)
Table 5-5-14 ● Per-Primary-Programming-Language SLOC-Size Projects Effort Basic Statistics (Enhancement)
Table 5-5-15 ● Per-Industry-Type Effort Project Count (Development)
Figure 5-5-16 ● Per-Industry-Type Effort Distribution (Development)
Table 5-5-17 ● Per-Industry-Type Effort Basic Statistics (Development)
Table 5-5-18 ● Per-Industry-Type Effort Project Count (Enhancement)
Figure 5-5-19 ● Per-Industry-Type Effort Distribution (Enhancement)
Table 5-5-20 ● Per-Industry-Type Effort Basic Statistics (Enhancement)
Table 5-5-21 ● Per-Architecture-Type Effort Project Count (Development)
Figure 5-5-22 ● Per-Architecture-Type Effort Distribution (Development)
Table 5-5-23 ● Per-Architecture-Type Effort Basic Statistics (Development)
Table 5-5-24 ● Per-Architecture-Type Effort Project Count (Enhancement)
Figure 5-5-25 ● Per-Architecture-Type Effort Distribution (Enhancement)
Table 5-5-26 ● Per-Architecture-Type Effort Basic Statistics (Enhancement)
Table 5-5-27 ● Per-Business-Type Effort Project Count (Development)
Table 5-5-28 ● Per-Business-Type Effort Project Count (Enhancement)
Table 5-5-29 ● Per-Business-Type Effort Basic Statistics (Development)
Table 5-5-30 ● Per-Business-Type Effort Basic Statistics (Enhancement)
Table 5-6-1 ● Per-Project-Type Head-Count Per Month Project Count
Figure 5-6-2 ● Distribution of Head-Count Per Month
Figure 5-6-3 ● Distribution of Per-Project-Type Head-Count Per Month
Table 5-6-4 ● Basic Statistics of Per-Project-Type Head-Count Per Month
Figure 5-6-5 ● Distribution of Head-Count Per Month (Development)
Figure 5-6-6 ● Distribution of Head-Count Per Month (Enhancement)
Figure 5-6-7 ● Per-Industry-Type Head-Count Per Month Project Count (Development)
Table 5-6-8 ● Distribution of Per-Industry-Type Head-Count Per Month (Development)
Table 5-6-9 ● Basic Statistics of Per-Industry-Type Head-Count Per Month (Development)
Table 5-6-10 ● Per-Industry-Type Head-Count Per Month Project Count (Enhancement)
Figure 5-6-11 ● Distribution of Per-Industry-Type Head-Count Per Month (Enhancement)
Table 5-6-12 ● Basic Statistics of Per-Industry-Type Head-Count Per Month (Enhancement)
Table 5-6-13 ● Per-Architecture-Type Head-Count Per Month Project Count (Development)
Figure 5-6-14 ● Distribution of Per-Architecture-Type Head-Count Per Month (Development)
Table 5-6-15 ● Basic Statistics of Per-Architecture-Type Head-Count Per Month (Development)
Table 5-6-16 ● Per-Architecture-Type Head-Count Per Month Project Count (Enhancement)
Figure 5-6-17 ● Distribution of Per-Architecture-Type Head-Count Per Month (Enhancement)
Table 5-6-18 ● Basic Statistics of Per-Architecture-Type Head-Count Per Month (Enhancement)
Table 5-6-19 ● Per-Business-Type Head-Count Per Month Project Count (Development)
Table 5-6-20 ● Per-Business-Type Head-Count Per Month Project Count (Enhancement)
Table 5-6-21 ● Basic Statistics of Per-Business-Type Head-Count Per Month (Development)
Table 5-6-22 ● Basic Statistics of Per-Business-Type Head-Count Per Month (Enhancement)
Chapter 6 Analysis of the Relationship Among Effort, Development Schedule, and Size
Table 6-1-1 ● Combinations of Major Factors
Table 6-1-2 ● Combinations of Factors, Characteristics, and Stratification
Table 6-2-1 ● Major Factors and Associated Sections
Figure 6-3-1 ● Whole-Project Effort and Development Schedule (Development) with Confidence Intervals of 50% and 95%
Figure 6-3-2 ● Major-Development-Phase Effort and Development Schedule (Development) with Confidence Intervals of 50% and 95%
Figure 6-3-3 ● Industry-Type-Based Effort and Development Schedule (Development)
Figure 6-3-4 ● Architecture-Based Effort and Development Schedule (Development)
Figure 6-3-5 ● Primary-Programming-Language-Based Effort and Development Schedule (Development)
Figure 6-3-6 ● Whole-Project Effort and Development Schedule (Enhancement) with Confidence Intervals of 50% and 95%
Figure 6-3-7 ● Major-Development-Phase Effort and Development Schedule (Enhancement) with Confidence Intervals of 50% and 95%
Figure 6-3-8 ● Industry-Type-Based Effort and Development Schedule (Enhancement)
Figure 6-3-9 ● Architecture-Based Effort and Development Schedule (Enhancement)
Figure 6-3-10 ● Primary-Programming-Language-Based Effort and Development Schedule (Enhancement)
Figure 6-4-1 ● FP Size and Effort (All Project Types, Mixed_FP_Measurement_Methods) with Confidence Interval of 50%
Figure 6-4-2 ● FP Size and Effort (All Project Types, Mixed_FP_Measurement_Methods) Magnified with Confidence Interval of 50% (FP ≤ 2,000 and effort ≤ 50,000)
Figure 6-4-3 ● FP Size and Effort (All Project Types, Mixed_FP_Measurement_Methods) Logarithmic Scale
Figure 6-4-4 ● FP Size and Effort (All Project Types, IFPUG_Group) with Confidence Interval of 50%
Figure 6-4-5 ● FP Size and Effort (All Project Types, IFPUG_Group) Magnified with Confidence Interval of 50% (FP ≤ 2,000 and effort ≤ 50,000)
Figure 6-4-6 ● FP Size and Effort (All Project Types, IFPUG_Group) Logarithmic Scale
Figure 6-4-7 ● FP Size and Effort (Development, Mixed_FP_Measurement_Methods) with Confidence Interval of 50%
Figure 6-4-8 ● FP Size and Effort (Development, Mixed_FP_Measurement_Methods) Logarithmic Scale
Figure 6-4-9 ● FP Size and Effort (Development, IFPUG_Group) with Confidence Interval of 50%
Figure 6-4-10 ● FP Size and Effort (Development, IFPUG_Group) Magnified with Confidence Interval of 50% (FP ≤ 2,000 and effort ≤ 60,000)
Figure 6-4-11 ● FP Size and Effort (Development, IFPUG_Group) Logarithmic Scale
Figure 6-4-12 ● Industry-Type-Based FP Size and Effort (Development, IFPUG_Group)
Figure 6-4-13 ● Architecture-Based FP Size and Effort (Development, IFPUG_Group)
Figure 6-4-14 ● FP Size and Effort (Enhancement, Mixed_FP_Measurement_Methods) with Confidence Interval of 50%
Figure 6-4-15 ● FP Size and Effort (Enhancement, Mixed_FP_Measurement_Methods) Logarithmic Scale
Figure 6-4-16 ● FP Size and Effort (Enhancement, IFPUG_Group) with Confidence Interval of 50%
Figure 6-4-17 ● FP Size and Effort (Enhancement, IFPUG_Group) Logarithmic Scale
Figure 6-4-18 ● Industry-Type-Based FP Size and Effort (Enhancement, IFPUG_Group)
Figure 6-4-19 ● Architecture-Based FP Size and Effort (Enhancement, IFPUG_Group)
Figure 6-5-1 ● FP Size and FP_Productivity (Development, Mixed_FP_Measurement_Methods)
Figure 6-5-2 ● FP Size and FP_Productivity (Development, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
Table 6-5-3 ● FP Size and FP_Productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods)
Figure 6-5-4 ● FP_Productivity Distribution (Development, Mixed_FP_Measurement_Methods)
Figure 6-5-5 ● FP Size and FP_Productivity (Development, IFPUG_Group)
Figure 6-5-6 ● FP-Size-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-7 ● FP-Size-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
Figure 6-5-8 ● Industry-Type-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-9 ● Industry-Type-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
Figure 6-5-10 ● Architecture-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-11 ● Architecture-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
Figure 6-5-12 ● Primary-Programming-Language-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-13 ● Primary-Programming-Language-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
Figure 6-5-14 ● Platform-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-15 ● Platform-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
Figure 6-5-16 ● Per-Month_Number_of_Staff and FP_Productivity (Development, Mixed_FP_Measurement_Methods)
Figure 6-5-17 ● Per-Month-Number-of-Staff-Based FP_Productivity (Development, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
Table 6-5-18 ● Per-Month-Number-of-Staff-Based FP_Productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods)
Figure 6-5-19 ● Per-Month_Number_of_Staff and FP_Productivity (Development, IFPUG_Group)
Figure 6-5-20 ● Per-Month-Number-of-Staff-Based FP_Productivity (Development, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-21 ● Per-Month-Number-of-Staff-Based FP_Productivity Basic Statistics (Development, IFPUG_Group)
Figure 6-5-22 ● Outsourcing_Ratio and FP Size (Development, Mixed_FP_Measurement_Methods)
Figure 6-5-23 ● Outsourcing_Ratio and FP_Productivity (Development, Mixed_FP_Measurement_Methods)
Figure 6-5-24 ● Outsourcing_Ratio and FP Size (Development, IFPUG_Group)
Figure 6-5-25 ● Outsourcing_Ratio and FP_Productivity (Development, IFPUG_Group)
Figure 6-5-26 ● FP Size and FP_Productivity (Enhancement, Mixed_FP_Measurement_Methods)
Figure 6-5-27 ● FP Size and FP_Productivity (Enhancement, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
Table 6-5-28 ● FP Size and FP_Productivity Basic Statistics (Enhancement, Mixed_FP_Measurement_Methods)
Figure 6-5-29 ● FP_Productivity Distribution (Enhancement, Mixed_FP_Measurement_Methods)
Figure 6-5-30 ● FP Size and FP_Productivity (Enhancement, IFPUG_Group)
Figure 6-5-31 ● FP Size and FP_Productivity (Enhancement, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-32 ● FP Size and FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
Figure 6-5-33 ● Industry-Type-Based FP_Productivity (Enhancement, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-34 ● Industry-Type-Based FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
Figure 6-5-35 ● Architecture-Based FP_Productivity (Enhancement, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-36 ● Architecture-Based FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
Figure 6-5-37 ● Primary-Programming-Language-Based FP_Productivity (Enhancement, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-38 ● Primary-Programming-Language-Based FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
Figure 6-5-39 ● Platform-Based FP_Productivity (Enhancement, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-40 ● Platform-Based FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
Figure 6-5-41 ● Per-Month_Number_of_Staff and FP_Productivity (Enhancement, Mixed_FP_Measurement_Methods)
Figure 6-5-42 ● Per-Month-Number-of-Staff-Based FP_Productivity (Enhancement, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
Table 6-5-43 ● Per-Month-Number-of-Staff-Based FP_Productivity Basic Statistics (Enhancement, Mixed_FP_Measurement_Methods)
Figure 6-5-44 ● Per-Month_Number_of_Staff and FP_Productivity (Enhancement, IFPUG_Group)
Figure 6-5-45 ● Per-Month-Number-of-Staff-Based FP_Productivity (Enhancement, IFPUG_Group) Box-and-Whisker Plot
Table 6-5-46 ● Per-Month-Number-of-Staff-Based FP_Productivity Basic Statistics (Enhancement, IFPUG_Group)
Figure 6-5-47 ● Outsourcing_Ratio and FP Size (Enhancement, Mixed_FP_Measurement_Methods)
Figure 6-5-48 ● Outsourcing_Ratio and FP_Productivity (Enhancement, Mixed_FP_Measurement_Methods)
Figure 6-6-1 ● SLOC Size and Effort (All Project Types, Mixed Primary Programming Languages) with Confidence Interval of 50%
Figure 6-6-2 ● SLOC Size and Effort (All Project Types, Mixed Primary Programming Languages) Magnified with Confidence Interval of 50% (SLOC size ≤ 500,000 and effort ≤ 200,000)
Figure 6-6-3 ● SLOC Size and Effort (All Project Types, Mixed Primary Programming Languages) Logarithmic Scale
Figure 6-6-4 ● SLOC Size and Effort (All Project Types, Major_Programming_Language_Group)
Figure 6-6-5 ● SLOC Size and Effort (All Project Types, Major_Programming_Language_Group) Logarithmic Scale
Figure 6-6-6 ● Primary-Programming-Language-Based SLOC Size and Effort (Development)
Figure 6-6-7 ● Primary-Programming-Language-Based SLOC Size and Effort (Development) Magnified (SLOC size ≤ 500,000 and effort ≤ 200,000)
Figure 6-6-8 ● Primary-Programming-Language-Based SLOC Size and Effort (Development) Logarithmic Scale
Figure 6-6-9 ● Primary-Programming-Language-Based SLOC Size and Effort (Development, COBOL)
Figure 6-6-10 ● Primary-Programming-Language-Based SLOC Size and Effort (Development, COBOL) Logarithmic Scale
Figure 6-6-11 ● Primary-Programming-Language-Based SLOC Size and Effort (Development, C)
Figure 6-6-12 ● Primary-Programming-Language-Based SLOC Size and Effort (Development, C) Logarithmic Scale
Figure 6-6-13 ● Primary-Programming-Language-Based SLOC Size and Effort (Development, VB)
Figure 6-6-14 ● Primary-Programming-Language-Based SLOC Size and Effort (Development, VB) Logarithmic Scale
Figure 6-6-15 ● Primary-Programming-Language-Based SLOC Size and Effort (Development, Java)
Figure 6-6-16 ● Primary-Programming-Language-Based SLOC Size and Effort (Development, Java) Logarithmic Scale
Figure 6-6-17 ● Industry-Type-Based SLOC Size and Effort (Development, Major_Programming_Language_Group)
Figure 6-6-18 ● Architecture-Based SLOC Size and Effort (Development, Major_Programming_Language_Group)
Figure 6-6-19 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement)
Figure 6-6-20 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement) Magnified (SLOC size ≤ 500,000 and effort ≤ 200,000)
Figure 6-6-21 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement) Logarithmic Scale
Figure 6-6-22 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement, COBOL)
Figure 6-6-23 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement, COBOL) Logarithmic Scale
Figure 6-6-24 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement, C)
Figure 6-6-25 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement, C) Logarithmic Scale
Figure 6-6-26 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement, VB)
Figure 6-6-27 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement, VB) Logarithmic Scale
Figure 6-6-28 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement, Java)
Figure 6-6-29 ● Primary-Programming-Language-Based SLOC Size and Effort (Enhancement, Java) Logarithmic Scale
Figure 6-6-30 ● Industry-Type-Based SLOC Size and Effort (Enhancement, Major_Programming_Language_Group)
Figure 6-6-31 ● Architecture-Based SLOC Size and Effort (Enhancement, Major_Programming_Language_Group)
Figure 6-7-1 ● Primary-Programming-Language-Based SLOC Size and SLOC_Productivity (Development)
Figure 6-7-2 ● SLOC Size and SLOC_Productivity (Development, COBOL)
Figure 6-7-3 ● SLOC Size and SLOC_Productivity (Development, C)
Figure 6-7-4 ● SLOC Size and SLOC_Productivity (Development, VB)
Figure 6-7-5 ● SLOC Size and SLOC_Productivity (Development, Java)
Table 6-7-6 ● SLOC-Size-Based SLOC_Productivity Basic Statistics (Development, Major_Programming_Language_Group)
Figure 6-7-7 ● SLOC-Size-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-8 ● SLOC-Size-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
Figure 6-7-9 ● Primary-Programming-Language-Based SLOC_Productivity (Development) Box-and-Whisker Plot
Table 6-7-10 ● Primary-Programming-Language-Based SLOC_Productivity Basic Statistics (Development)
Figure 6-7-11 ● Industry-Type-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-12 ● Industry-Type-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-13 ● Industry-Type-Based SLOC_Productivity Basic Statistics (Development, Major_Programming_Language_Group)
Figure 6-7-14 ● Architecture-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-15 ● Architecture-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-16 ● Architecture-Based SLOC_Productivity Basic Statistics (Development, Major_Programming_Language_Group)
List of Figures and Tables
IPA/SEC White Paper 2007 on Software Development Projects in Japan 299
Figure 6-7-17 ● Platform-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-18 ● Platform-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-19 ● Platform-Based SLOC_Productivity Basic Statistics (Development, Major_Programming_Language_Group)
Figure 6-7-20 ● Per-Month-Number-of-Staff and SLOC_Productivity (Development, Primary-Programming-Language-Based)
Figure 6-7-21 ● Per-Month-Number-of-Staff-Based SLOC_Productivity (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-22 ● Per-Month-Number-of-Staff-Based SLOC_Productivity (Development, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-23 ● Per-Month-Number-of-Staff-Based SLOC_Productivity Basic Statistics (Development, Major_Programming_Language_Group)
Figure 6-7-24 ● Outsourcing_Ratio and SLOC Size (Development, Primary-Programming-Language-Based)
Figure 6-7-25 ● Outsourcing_Ratio and SLOC_Productivity (Development, Primary-Programming-Language-Based)
Figure 6-7-26 ● Primary-Programming-Language-Based SLOC Size and SLOC_Productivity (Enhancement)
Figure 6-7-27 ● SLOC Size and SLOC_Productivity (Enhancement, COBOL)
Figure 6-7-28 ● SLOC Size and SLOC_Productivity (Enhancement, C)
Figure 6-7-29 ● SLOC Size and SLOC_Productivity (Enhancement, VB)
Figure 6-7-30 ● SLOC Size and SLOC_Productivity (Enhancement, Java)
Table 6-7-31 ● SLOC-Size-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group)
Figure 6-7-32 ● SLOC-Size-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-33 ● SLOC-Size-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
Figure 6-7-34 ● Primary-Programming-Language-Based SLOC_Productivity (Enhancement) Box-and-Whisker Plot
Table 6-7-35 ● Primary-Programming-Language-Based SLOC_Productivity Basic Statistics (Enhancement)
Figure 6-7-36 ● Industry-Type-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-37 ● Industry-Type-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-38 ● Industry-Type-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group)
Figure 6-7-39 ● Architecture-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-40 ● Architecture-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-41 ● Architecture-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group)
Figure 6-7-42 ● Platform-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-43 ● Platform-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-44 ● Platform-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group)
Figure 6-7-45 ● Per-Month-Number-of-Staff and SLOC_Productivity (Enhancement, Primary-Programming-Language-Based)
Figure 6-7-46 ● Per-Month-Number-of-Staff-Based SLOC_Productivity (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Figure 6-7-47 ● Per-Month-Number-of-Staff-Based SLOC_Productivity (Enhancement, Primary-Programming-Language-Based) Box-and-Whisker Plot
Table 6-7-48 ● Per-Month-Number-of-Staff-Based SLOC_Productivity Basic Statistics (Enhancement, Major_Programming_Language_Group)
Figure 6-7-49 ● Outsourcing_Ratio and SLOC Size (Enhancement, Primary-Programming-Language-Based)
Figure 6-7-50 ● Outsourcing_Ratio and SLOC_Productivity (Enhancement, Primary-Programming-Language-Based)
Figure 6-8-1 ● Primary-Programming-Language-Based FP Size and SLOC Size (Development, IFPUG_Group)
Chapter 7 Analysis of Reliability
Table 7-1-1 ● Combinations of Factors, Characteristics, and Stratification
Figure 7-2-1 ● FP Size and Number_of_Identified_Defects (Mixed_FP_Measurement_Methods)
Table 7-2-2 ● Number_of_Identified_Defects Basic Statistics (Mixed_FP_Measurement_Methods)
Figure 7-2-3 ● FP Size and Number_of_Identified_Defects (IFPUG_Group)
Figure 7-2-4 ● Number_of_Identified_Defects Distribution (IFPUG_Group)
Table 7-2-5 ● Number_of_Identified_Defects Basic Statistics (IFPUG_Group)
Figure 7-2-6 ● FP Size and Number_of_Identified_Defects (Development, IFPUG_Group)
Table 7-2-7 ● Number_of_Identified_Defects Basic Statistics (Development, IFPUG_Group)
Figure 7-2-8 ● FP Size and Number_of_Identified_Defects (Enhancement, IFPUG_Group)
Table 7-2-9 ● Number_of_Identified_Defects Basic Statistics (Enhancement, IFPUG_Group)
Figure 7-3-1 ● FP Size and FP_Identified_Defect_Density (Mixed_FP_Measurement_Methods)
Table 7-3-2 ● FP_Identified_Defect_Density Basic Statistics (Mixed_FP_Measurement_Methods)
Figure 7-3-3 ● FP Size and FP_Identified_Defect_Density (IFPUG_Group)
Table 7-3-4 ● FP_Identified_Defect_Density Basic Statistics (IFPUG_Group)
Figure 7-3-5 ● FP Size and FP_Identified_Defect_Density (Development, IFPUG_Group)
Table 7-3-6 ● FP_Identified_Defect_Density Basic Statistics (Development, IFPUG_Group)
Figure 7-3-7 ● Industry-Type-Based FP Size and FP_Identified_Defect_Density (Development, IFPUG_Group)
Table 7-3-8 ● Industry-Type-Based FP_Identified_Defect_Density Basic Statistics (Development, IFPUG_Group)
Figure 7-3-9 ● Architecture-Based FP Size and FP_Identified_Defect_Density (Development, IFPUG_Group)
Table 7-3-10 ● Architecture-Based FP_Identified_Defect_Density Basic Statistics (Development, IFPUG_Group)
Figure 7-3-11 ● FP Size and FP_Identified_Defect_Density (Enhancement, IFPUG_Group)
Table 7-3-12 ● FP_Identified_Defect_Density Basic Statistics (Enhancement, IFPUG_Group)
Figure 7-3-13 ● Industry-Type-Based FP Size and FP_Identified_Defect_Density (Enhancement, IFPUG_Group)
Table 7-3-14 ● Industry-Type-Based FP_Identified_Defect_Density Basic Statistics (Enhancement, IFPUG_Group)
Figure 7-3-15 ● Architecture-Based FP Size and FP_Identified_Defect_Density (Enhancement, IFPUG_Group)
Table 7-3-16 ● Architecture-Based FP_Identified_Defect_Density Basic Statistics (Enhancement, IFPUG_Group)
Figure 7-4-1 ● SLOC Size and Number_of_Identified_Defects (Mixed Primary Programming Languages)
Table 7-4-2 ● Number_of_Identified_Defects Basic Statistics (Mixed Primary Programming Languages)
Figure 7-4-3 ● Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects (All Project Types)
Figure 7-4-4 ● Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects (All Project Types) Box-and-Whisker Plot
Table 7-4-5 ● Primary-Programming-Language-Based Number_of_Identified_Defects Basic Statistics (All Project Types)
Figure 7-4-6 ● Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects (Development)
Table 7-4-7 ● Primary-Programming-Language-Based Number_of_Identified_Defects Basic Statistics (Development)
Figure 7-4-8 ● Primary-Programming-Language-Based SLOC Size and Number_of_Identified_Defects (Enhancement)
Table 7-4-9 ● Primary-Programming-Language-Based Number_of_Identified_Defects Basic Statistics (Enhancement)
Figure 7-5-1 ● SLOC Size and SLOC_Identified_Defect_Density (Mixed Primary Programming Languages)
Table 7-5-2 ● SLOC_Identified_Defect_Density Basic Statistics (Mixed Primary Programming Languages)
Figure 7-5-3 ● Primary-Programming-Language-Based SLOC Size and SLOC_Identified_Defect_Density (All Project Types)
Figure 7-5-4 ● Primary-Programming-Language-Based SLOC_Identified_Defect_Density (All Project Types) Box-and-Whisker Plot
Table 7-5-5 ● Primary-Programming-Language-Based SLOC_Identified_Defect_Density Basic Statistics (All Project Types)
Figure 7-5-6 ● Primary-Programming-Language-Based SLOC Size and SLOC_Identified_Defect_Density (Development)
Table 7-5-7 ● SLOC-Size-Based SLOC_Identified_Defect_Density Basic Statistics (Development, Major_Programming_Language_Group)
Figure 7-5-8 ● Primary-Programming-Language-Based SLOC_Identified_Defect_Density (Development) Box-and-Whisker Plot
Table 7-5-9 ● Primary-Programming-Language-Based SLOC_Identified_Defect_Density Basic Statistics (Development)
Figure 7-5-10 ● Industry-Type-Based SLOC Size and SLOC_Identified_Defect_Density (Development, Major_Programming_Language_Group)
Figure 7-5-11 ● Industry-Type-Based SLOC_Identified_Defect_Density (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Table 7-5-12 ● Industry-Type-Based SLOC_Identified_Defect_Density Basic Statistics (Development, Major_Programming_Language_Group)
Figure 7-5-13 ● Architecture-Based SLOC Size and SLOC_Identified_Defect_Density (Development, Major_Programming_Language_Group)
Figure 7-5-14 ● Architecture-Based SLOC_Identified_Defect_Density (Development, Major_Programming_Language_Group) Box-and-Whisker Plot
Table 7-5-15 ● Architecture-Based SLOC_Identified_Defect_Density Basic Statistics (Development, Major_Programming_Language_Group)
Figure 7-5-16 ● Primary-Programming-Language-Based SLOC Size and SLOC_Identified_Defect_Density (Enhancement)
Table 7-5-17 ● SLOC-Size-Based SLOC_Identified_Defect_Density Basic Statistics (Enhancement, Major_Programming_Language_Group)
Figure 7-5-18 ● Primary-Programming-Language-Based SLOC_Identified_Defect_Density (Enhancement) Box-and-Whisker Plot
Table 7-5-19 ● Primary-Programming-Language-Based SLOC_Identified_Defect_Density Basic Statistics (Enhancement)
Figure 7-5-20 ● Industry-Type-Based SLOC Size and SLOC_Identified_Defect_Density (Enhancement, Major_Programming_Language_Group)
Figure 7-5-21 ● Industry-Type-Based SLOC_Identified_Defect_Density (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Table 7-5-22 ● Industry-Type-Based SLOC_Identified_Defect_Density Basic Statistics (Enhancement, Major_Programming_Language_Group)
Figure 7-5-23 ● Architecture-Based SLOC Size and SLOC_Identified_Defect_Density (Enhancement, Major_Programming_Language_Group)
Figure 7-5-24 ● Architecture-Based SLOC_Identified_Defect_Density (Enhancement, Major_Programming_Language_Group) Box-and-Whisker Plot
Table 7-5-25 ● Architecture-Based SLOC_Identified_Defect_Density Basic Statistics (Enhancement, Major_Programming_Language_Group)

Chapter 8 Development-Phase-Based Analysis
Figure 8-1-1 ● Phase-Based Actual Month Ratio (Development) Box-and-Whisker Plot
Table 8-1-2 ● Phase-Based Actual Month Ratio Basic Statistics (Development)
Figure 8-1-3 ● Phase-Based Actual Month Ratio (Enhancement) Box-and-Whisker Plot
Table 8-1-4 ● Phase-Based Actual Month Ratio Basic Statistics (Enhancement)
Figure 8-1-5 ● Phase-Based Actual Effort Ratio (Development) Box-and-Whisker Plot
Table 8-1-6 ● Phase-Based Actual Effort Ratio Basic Statistics (Development)
Figure 8-1-7 ● Phase-Based Actual Effort Ratio (Enhancement) Box-and-Whisker Plot
Table 8-1-8 ● Phase-Based Actual Effort Ratio Basic Statistics (Enhancement)
Table 8-2-1 ● Basic Design Review-Pointed-Out Issues per Unit FP Size Basic Statistics
Table 8-2-2 ● Basic Design Review-Pointed-Out Issues per Unit SLOC Size Basic Statistics
Table 8-2-3 ● Basic Design Review-Pointed-Out Issues per Unit Effort Basic Statistics (1)
Table 8-2-4 ● Basic Design Review-Pointed-Out Issues per Unit Effort Basic Statistics (2)
Figure 8-3-1 ● Test Cases and Identified Software Failures/Faults per Unit FP Size (All Project Types) Box-and-Whisker Plot
Table 8-3-2 ● Test Cases and Identified Software Failures/Faults per Unit FP Size Basic Statistics
Figure 8-3-3 ● Test Cases and Identified Software Failures/Faults per Unit FP Size (Development, IFPUG_Group) Box-and-Whisker Plot
Table 8-3-4 ● Test Cases and Identified Software Failures/Faults per Unit FP Size Basic Statistics (Development, IFPUG_Group)
Figure 8-3-5 ● Identified Software Faults per Unit FP Size (Enhancement, IFPUG_Group) Box-and-Whisker Plot
Table 8-3-6 ● Identified Software Faults per Unit FP Size Basic Statistics (Enhancement, IFPUG_Group)
Figure 8-3-7 ● Test Cases and Identified Software Failures/Faults per Unit SLOC Size (All Project Types) Box-and-Whisker Plot
Table 8-3-8 ● Test Cases and Identified Software Failures/Faults per Unit SLOC Size Basic Statistics (All Project Types)
Figure 8-3-9 ● Test Cases and Identified Software Failures/Faults per Unit SLOC Size (Development, Major_Primary_Programming_Language_Group) Box-and-Whisker Plot
Table 8-3-10 ● Primary-Programming-Language-Based Integration Test Cases per Unit SLOC Size Basic Statistics (Development)
Table 8-3-11 ● Primary-Programming-Language-Based System Test Cases per Unit SLOC Size Basic Statistics (Development)
Table 8-3-12 ● Primary-Programming-Language-Based Identified Software Failures per Unit SLOC Size Integration Test Basic Statistics (Development)
Table 8-3-13 ● Primary-Programming-Language-Based Identified Software Failures per Unit SLOC Size System Test Basic Statistics (Development)
Table 8-3-14 ● Primary-Programming-Language-Based Identified Software Faults per Unit SLOC Size Integration Test Basic Statistics (Development)
Table 8-3-15 ● Primary-Programming-Language-Based Identified Software Faults per Unit SLOC Size System Test Basic Statistics (Development)
Figure 8-3-16 ● Test Cases and Identified Software Defects/Faults per Unit SLOC Size (Enhancement, Major_Primary_Programming_Language_Group) Box-and-Whisker Plot
Table 8-3-17 ● Primary-Programming-Language-Based Integration Test Cases per Unit SLOC Size Basic Statistics (Enhancement)
Table 8-3-18 ● Primary-Programming-Language-Based System Test Cases per Unit SLOC Size Basic Statistics (Enhancement)
Table 8-3-19 ● Primary-Programming-Language-Based Identified Software Failures per Unit SLOC Size Integration Test Basic Statistics (Enhancement)
Table 8-3-20 ● Primary-Programming-Language-Based Identified Software Failures per Unit SLOC Size System Test Basic Statistics (Enhancement)
Table 8-3-21 ● Primary-Programming-Language-Based Identified Software Faults per Unit SLOC Size Integration Test Basic Statistics (Enhancement)
Table 8-3-22 ● Primary-Programming-Language-Based Identified Software Faults per Unit SLOC Size System Test Basic Statistics (Enhancement)
Figure 8-3-23 ● Test Cases and Identified Software Failures/Faults per Unit Effort (All Project Types) Box-and-Whisker Plot
Table 8-3-24 ● Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (All Project Types) (1)
Table 8-3-25 ● Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (All Project Types) (2)
Figure 8-3-26 ● Test Cases and Identified Software Failures/Faults per Unit Effort (Development) Box-and-Whisker Plot
Table 8-3-27 ● Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (Development) (1)
Table 8-3-28 ● Test Cases and Identified Software Failures/Faults per Unit Effort Basic Statistics (Development) (2)
Figure 8-3-29 ● Test Cases and Identified Software Defects/Faults per Unit Effort (Enhancement) Box-and-Whisker Plot
Table 8-3-30 ● Test Cases and Identified Software Defects/Faults per Unit Effort Basic Statistics (Enhancement) (1)
Table 8-3-31 ● Test Cases and Identified Software Defects/Faults per Unit Effort Basic Statistics (Enhancement) (2)
Chapter 9 Estimates-Results Analysis and Productivity Cross-Analysis
Figure 9-1-1 ● Overgrowth of Size, Effort, and Development Schedule (Concept of Estimation Error)
Figure 9-1-2 ● Planned FP Size and Actual FP Size
Table 9-1-3 ● FP Size Estimation Error Percentage
Figure 9-1-4 ● FP Size Estimation Error Percentage Distribution
Figure 9-1-5 ● Planned Effort and Actual Effort
Table 9-1-6 ● Effort Estimation Error Percentage
Figure 9-1-7 ● Effort Estimation Error Percentage Distribution
Figure 9-1-8 ● Planned Development Schedule and Actual Development Schedule
Table 9-1-9 ● Development Schedule Estimation Error Percentage
Figure 9-1-10 ● Development Schedule Estimation Error Percentage Distribution
Table 9-2-1 ● Size-Based FP_productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods) (FPs / Person Hours)
Table 9-2-2 ● Size-Based FP_productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods) (FPs / 160 Person Hours)
Figure 9-2-3 ● Size-Based, Industry-Type-Based FP_productivity Basic Statistics (Development, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
Figure 9-2-4 ● Team-Size-Based FP_productivity (Development, Mixed_FP_Measurement_Methods) Box-and-Whisker Plot
Figure 9-2-5 ● Requirements-Specifications-Clearness-Based Distribution of FP Size and FP_productivity (Development, Mixed_FP_Measurement_Methods)
Table 9-2-6 ● Requirements-Specifications-Clearness-Based Effort Basic Statistics (Development, Mixed_FP_Measurement_Methods)
Table 9-2-7 ● Requirements-Specifications-Clearness-Based FP Size Basic Statistics (Development, Mixed_FP_Measurement_Methods)
Figure 9-2-8 ● Reliability-Requirements-Level-Based Distribution of FP Size and FP_productivity (Development, Mixed_FP_Measurement_Methods)
Figure 9-2-9 ● Reliability-Requirements-Level-Based Distribution of FP Size and FP_productivity (Development, Mixed_FP_Measurement_Methods) X-Axis in Logarithmic Scale