
Particle Swarm Optimization Based Effort

Estimation Using Function Point Analysis

Mandeep Kaur GNDEC, Ludhiana, India

[email protected]

Abstract- Software development effort estimation is one of the most important activities in software project management. Various models have been proposed to construct a relationship between software size and effort; however, there are many problems. This is because project data, available in the initial stages of a project, is often incomplete, inconsistent, uncertain and unclear. Accurate estimation of the software effort and schedule affects the budget computation. Inaccurate estimates lead to failure to obtain a profit, increased probability of project incompletion and delay of the project delivery date. Function Points (FP) are one of the size metrics used for estimating the effort of a project, and Particle Swarm Optimization (PSO), a swarm intelligence technique, is used to tune the parameters of the Value Adjustment Factor (VAF), which is used to obtain the function count. From this optimized function count, an optimized Albrecht & Gaffney effort is estimated. The estimated effort is compared with the existing effort models and performance analysis is done on the basis of %MARE and RMSE. The research shows that the results of the proposed model are far better than those of the existing models.

Keywords- Effort Estimation; Particle Swarm Optimization; Function Point Analysis.

I. INTRODUCTION

The modern day software industry is all about efficiency. With the increase in the expanse of software projects, it has become important to analyze the software requirements early in the software development phase. For a given set of requirements, it is desirable to evaluate the amount of time and money required for delivering the project prolifically [7]. An accurate estimate means better planning and efficient use of project resources like cost, duration and effort, especially for space and military projects. Efficient software project estimation is one of the most important tasks in software development [1]. Accurate estimation of the problem size is very important to obtain a satisfactory estimation of the effort, cost and time duration of a software project. The size of the problem is not the number of bytes that the source code occupies or the byte size of the executable code; rather, it is a measure of the problem complexity in terms of the effort and time required to develop the product. Two widely used metrics to estimate the size are Lines of Code (LOC) and Function Points (FP) [9].


Sumeet Kaur Sehra, Assistant Professor, GNDEC, Ludhiana, India
[email protected]

A line-of-code count measures the number of source instructions of a program. While counting the number of source instructions, the lines used for commenting the code and the header lines are ignored. There are several cost, effort and schedule estimation models which use LOC as an input parameter, such as the widely used Constructive Cost Model (COCOMO), a series of models developed by Dr. Barry Boehm [3]. However, LOC as a measure of problem size has several shortcomings:

1. Higher-level languages such as Delphi, VB, JavaScript, or other visual languages require far fewer lines of code than Assembly, COBOL, or C to provide the same functionality. That is, the higher the level of the language, the lower the apparent work output when output is measured in LOC.

2. LOC gives a numerical value of problem size that can vary widely with individual coding style; different programmers lay out their code in different ways [9].

3. LOC focuses on the coding activity only; however, a good problem size measure should consider the total effort required to specify, design, code and test the software.

4. The LOC count can be accurately computed only after the code has been fully developed. Therefore, the LOC metric is of little use to project managers during project planning, since planning is carried out well before the development activity starts.

With modernization and advancement in technology, new software is being developed to meet ever-expanding user requirements, so a method to understand and communicate size must also be used. Function point analysis is a structured technique which breaks systems into smaller parts, so they can be better understood and analyzed. Function Point Analysis (FPA) denotes a family of algorithmic methods for size estimation. This method separately evaluates two classes of the attributes of a software system: size factors and influence factors. The first version of FPA, invented by Albrecht [10] at IBM in 1979, proposed a new metric (i.e., the function point) for software size rather than lines of code.

Software effort estimation is the process of predicting the amount of effort required to complete the project [4]. Effort estimation consists of predicting how many hours of work and how many workers are needed to develop a project.


Particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality [11]. The algorithm and its concept were introduced by James Kennedy and Russell Eberhart [8] in 1995.

The paper focuses on using the particle swarm optimization technique to fine-tune the parameters of the value adjustment factor in order to optimize the function point count, which is then used to estimate effort.

II. FUNCTION POINT ANALYSIS

FPA was developed in an attempt to provide a mechanism to predict the effort associated with software development and to overcome the difficulties associated with lines of code as a measure of software size. Function point analysis measures software by quantifying the functionality the software provides to the user, based primarily on logical design.

The objectives of function point analysis are to:

• Measure the functionality that the user requests and receives.

• Measure software development and maintenance independently of the technology used for implementation.

FP (Function Points) is the most widespread functional type metric suitable for quantifying a software application. It uses five user-identifiable logical "functions", which are divided into two data function types and three transactional function types [2]. Data function types evaluate the functionality provided to the user for the data storage requirements of the application. Transactional function types evaluate the functionality provided to the user for the processing requirements of the application. The resulting numbers (Unadjusted FP) are combined with the Value Adjustment Factor (VAF) to obtain the final number of FP.

UFP specifies the functionality provided to the user by the project or application. This calculation begins with the counting of the five function types of a project or application:

Data function types-

• Internal Logical File (ILF): a user identifiable group of logically related data or control information maintained within the boundary of the application.

• External Interface File (EIF): a user identifiable group of logically related data or control information referenced by the application but maintained within the boundary of another application. This means that an EIF counted for one application is an ILF of another application.

Transactional function types-

• External Input (EI): An EI processes data or control information that comes from outside the application's boundary.

• External Output (EO): An EO is an elementary process that generates data or control information to be sent outside the application's boundary.

• External Inquiry (EQ): An EQ is an elementary process made up of an input-output combination that results in data retrieval.

TABLE I: UFP CALCULATION

Function Type                Weight by Functional Complexity              Total
                             Low          Average         High
External Input               __ * 3       __ * 4          __ * 6
External Output              __ * 4       __ * 5          __ * 7
External Inquiry             __ * 3       __ * 4          __ * 6
External Interface File      __ * 5       __ * 7          __ * 10
Internal Logical File        __ * 7       __ * 10         __ * 15
Total Number of Unadjusted Function Points =

The above-mentioned function types are then ranked according to their complexity as shown in Table I: Low, Average or High, using a set of prescriptive standards. Organizations that use FP methods develop criteria for determining whether a particular entry is Low, Average or High. After classifying each of the five function types, the UFP is computed using predefined weights for each function type.
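To make the counting procedure concrete, the following is a minimal sketch of the UFP calculation. The function name, the input dictionary layout and the example counts are illustrative assumptions; the weights are the standard ones from Table I.

```python
# Standard FPA weights from Table I: function type -> (low, average, high)
WEIGHTS = {
    "EI":  (3, 4, 6),    # External Input
    "EO":  (4, 5, 7),    # External Output
    "EQ":  (3, 4, 6),    # External Inquiry
    "EIF": (5, 7, 10),   # External Interface File
    "ILF": (7, 10, 15),  # Internal Logical File
}
COMPLEXITY_INDEX = {"low": 0, "average": 1, "high": 2}

def unadjusted_fp(counts):
    """counts: {function_type: {complexity: number_of_functions}} -> UFP."""
    total = 0
    for ftype, by_complexity in counts.items():
        for complexity, n in by_complexity.items():
            total += n * WEIGHTS[ftype][COMPLEXITY_INDEX[complexity]]
    return total

# Illustrative counts (not taken from the paper's data set).
example = {"EI": {"low": 4, "average": 2}, "ILF": {"average": 3}, "EQ": {"high": 1}}
print(unadjusted_fp(example))  # 4*3 + 2*4 + 3*10 + 1*6 = 56
```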

The next step involves assessing the environment and processing complexity of the project or application as a whole. Here the 14 general system characteristics are rated on a scale from 0 to 5 in terms of their likely effect on the project or application.

These general system characteristics are:

1. Data communications

2. Distributed data processing

3. Performance

4. Heavily used configuration

5. Transaction rate

6. Online data entry

7. End-user efficiency

8. Online update

9. Complex processing

10. Reusability

11. Installation ease

12. Operational ease

13. Multiple sites

14. Facilitate change

The degree of influence of each characteristic is determined as a rating on a scale of 0 to 5, as defined below.


0 = Not present or no Influence

1 = Incidental influence

2 = Moderate influence

3 = Average influence

4 = Significant influence

5 = Strong influence throughout

Once all the 14 GSCs have been answered, the VAF is tabulated using the IFPUG Value Adjustment Factor equation:

VAF = 0.65 + [(Σ(i=1 to 14) Ci) * 0.01]    (1)

where i indexes the GSCs from 1 to 14, Ci is the degree of influence of each General System Characteristic, and Σ denotes the summation over all 14 GSCs.
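As a quick illustration, here is a small sketch of equation (1); the function name and the example ratings are assumptions made for demonstration only.

```python
def value_adjustment_factor(gsc_ratings):
    """Equation (1): 14 degree-of-influence ratings, each 0..5, -> VAF."""
    assert len(gsc_ratings) == 14 and all(0 <= c <= 5 for c in gsc_ratings)
    return 0.65 + sum(gsc_ratings) * 0.01

# Ratings summing to a total degree of influence of 44 (as for project 1 in Table II).
print(round(value_adjustment_factor([3] * 12 + [4, 4]), 2))  # 1.09
```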

After determining the Unadjusted Function Point count (UFP) from the transactional and data function types, and rating the general system characteristics to obtain the total degree of influence, the Value Adjustment Factor (VAF) is calculated and thus the final Function Point count (FP) can be calculated using the formula:

FP = UFP * VAF    (2)

A typical estimation model is derived using regression analysis on data collected from past software projects. The basic structure of such models is

E = A + B * (ev)^C    (3)

where A, B and C are empirically derived constants, E is the effort measured in person-months, and ev is the estimation variable (FP or LOC). The FP-oriented models are

E = -13.39 + 0.0545 * FP    (4)  (Albrecht and Gaffney model)

E = 60.62 * 7.728 * 10^-8 * FP^3    (5)  (Kemerer model)
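The sketch below strings equations (2), (4) and (5) together; the function names are assumptions, while the constants come directly from the models quoted above.

```python
def function_points(ufp, vaf):
    """Equation (2): adjusted function point count."""
    return ufp * vaf

def albrecht_gaffney_effort(fp):
    """Equation (4): effort in person-months."""
    return -13.39 + 0.0545 * fp

def kemerer_effort(fp):
    """Equation (5): effort in person-months."""
    return 60.62 * 7.728e-8 * fp ** 3

# Project 1 from Table II: UFP = 173, VAF = 1.09 -> FP = 188.57, which reproduces
# the Albrecht & Gaffney (-3.11) and Kemerer (31.41) efforts of Table III to two decimals.
fp = function_points(173, 1.09)
print(round(fp, 2), round(albrecht_gaffney_effort(fp), 2), round(kemerer_effort(fp), 2))
```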

III. PARTICLE SWARM OPTIMIZATION

The main concept of particle swarm optimization is that there are particles of a swarm moving in a problem space and evaluating their positions through a fitness function. The general-purpose optimization method known as Particle Swarm Optimization (PSO) was introduced by Kennedy and Eberhart [6], and works by maintaining a swarm of particles that move around in the search space influenced by the improvements discovered by the other particles. The specification of the fitness function depends on the problem being optimized (especially in its dimensions) and as such it is simply referred to as f(Xi), short for f(xi,0, ..., xi,d). This function represents how well the i-th particle's position in the multidimensional space has achieved the desired goal; d represents the dimensions to be optimized. Since PSO is a multi-dimensional algorithm, the positions and velocities of the particles have d components, where positions are Xi = (xi,0, ..., xi,d) and velocities are Vi = (vi,0, ..., vi,d).

In this algorithm there is a completely connected swarm, i.e. all the particles share information; every particle knows the best position ever visited by any particle in the swarm. Every particle has a position (6) and a velocity (7), which are calculated as follows:

Xi,d(it + 1) = Xi,d(it) + Vi,d(it + 1)    (6)

Vi,d(it + 1) = Vi,d(it) + C1 * Rnd(0,1) * [pbi,d(it) - Xi,d(it)] + C2 * Rnd(0,1) * [gbd(it) - Xi,d(it)]    (7)

where
i is used as a particle identifier;
d is the dimension being considered; each particle has a position and a velocity for each dimension;
it is the iteration number, as the algorithm is iterative;
Xi,d is the position of particle i in dimension d;
Vi,d is the velocity of particle i in dimension d;
C1 is the acceleration constant for the cognitive component;
Rnd is a random value between 0 and 1;
pbi,d is the location in dimension d with the best fitness of all the visited locations in that dimension of particle i;
C2 is the acceleration constant for the social component;
gbd is the location in dimension d with the best fitness among all the visited locations in that dimension of all the particles.
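The following is a minimal sketch of the update rules (6) and (7) for a single particle; the function name and the list-based representation are assumptions made for illustration.

```python
import random

def pso_update(position, velocity, pbest, gbest, c1=2.0, c2=2.0):
    """Apply equations (7) and (6) to every dimension of one particle."""
    new_velocity, new_position = [], []
    for d in range(len(position)):
        v = (velocity[d]
             + c1 * random.random() * (pbest[d] - position[d])   # cognitive component
             + c2 * random.random() * (gbest[d] - position[d]))  # social component
        new_velocity.append(v)
        new_position.append(position[d] + v)
    return new_position, new_velocity
```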

IV. MODEL EXPERIMENTATION

The project data for twenty projects from a software company provided the UFP values and the total degree of influence values for each project, which were obtained based on the 14 general system characteristics rated on the scale of 0 to 5. From these values, the value adjustment factor (VAF) was evaluated for each project using the formula

VAF = 0.65 + (TDI * 0.01)    (8)

where TDI is the total degree of influence.

The FP values and measured effort values were also provided. Table II presents the UFP, VAF and FP values. The proposed model used particle swarm optimization to optimize the value adjustment factor, thereby obtaining an optimized function point count. This optimized function point count was used to estimate the effort for the Albrecht & Gaffney model. In total 20 projects were considered, the number of iterations was 50, and the number of particles was 20.
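As a quick check of equation (8), and assuming the hypothetical helper name below, project 1 in Table II (UFP = 173, TDI = 44) reproduces the tabulated VAF and FP:

```python
def vaf_from_tdi(tdi):
    """Equation (8)."""
    return 0.65 + tdi * 0.01

ufp, tdi = 173, 44                              # project 1 in Table II
vaf = vaf_from_tdi(tdi)
print(round(vaf, 2), round(ufp * vaf, 2))       # 1.09 188.57, matching Table II
```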


TABLE II: UFP, VAF & FP VALUES

Project No.   UFP   TDI   VAF    FP
 1.           173   44    1.09   188.57
 2.           330   47    1.12   369.6
 3.           294   41    1.06   311.64
 4.           303   47    1.12   339.36
 5.           268   43    1.08   289.44
 6.           284   28    0.93   264.12
 7.           280   32    0.97   271.6
 8.           269   45    1.10   295.9
 9.           289   33    0.98   283.22
10.           217   32    0.97   210.49
11.           228   26    0.91   207.48
12.           281   29    0.94   264.14
13.           301   41    1.06   319.06
14.           259   44    1.09   282.31
15.           242   36    1.01   244.42
16.           234   38    1.03   241.02
17.           268   37    1.02   273.36
18.           208   42    1.07   222.56
19.           251   46    1.11   278.61
20.           214   39    1.04   222.56

A. PSO Algorithm

Input: Size of software projects (FP Count), Measured Efforts.

Output: Optimized parameters for estimating effort.

Step 1: Initialize swarm coefficients and other parameters: k = 1, c1 = c2 = 2, φ = 4.1, λ = 0.73.

Step 2: Initialize the particles' velocity and position vectors randomly.

Step 3: For i = 1 to n, evaluate the fitness function for each particle. %MARE was used as the fitness function.

Step 4: Find the local best position of each particle by comparing the fitness value with the old local best position of the particle. If the new value is promising, then update the local best position.

Step 5: Find the global best particle among all local best particles.

Step 6: In order to make convergence faster, consider the constriction factor, which is evaluated as

λ = 2k / |2 - φ - sqrt(φ^2 - 4φ)|    (9)

where φ = c1 + c2, φ > 4.    (10)

The particles' velocities and positions are updated using the particle local best and global best positions (pBestPos and gBestPos) with the following formulae:

Vi,d(t + 1) = Vi,d(t) + C1 * Rnd(0,1) * [pBestPosi,d(t) - Xi,d(t)] + C2 * Rnd(0,1) * [gBestPosd(t) - Xi,d(t)]    (11)

Xi,d(t + 1) = Xi,d(t) + Vi,d(t + 1)    (12)

Step 7: Steps 4 to 6 are repeated for 50 iterations.

Step 8: Global solution parameters are obtained as optimal solution.

Step 9: Exit.
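A condensed sketch of the algorithm above, tuning the two VAF parameters (the additive constant and the per-point weight) against %MARE. The helper names, search bounds and the way the constriction factor λ is applied to the whole velocity update are assumptions for illustration (equation (11) does not show λ explicitly); this is not the paper's MATLAB implementation.

```python
import random

def mare(measured, estimated):
    """%MARE fitness (equation 13)."""
    return sum(abs(m - e) / m for m, e in zip(measured, estimated)) / len(measured) * 100

def effort_from_params(params, ufp, tdi):
    """Albrecht & Gaffney effort with a parameterized VAF = a + b * TDI."""
    a, b = params
    return [-13.39 + 0.0545 * u * (a + b * t) for u, t in zip(ufp, tdi)]

def pso_tune_vaf(ufp, tdi, measured, particles=20, iterations=50,
                 c1=2.0, c2=2.0, lam=0.73, bounds=(-1.0, 2.0)):
    dims = 2
    pos = [[random.uniform(*bounds) for _ in range(dims)] for _ in range(particles)]
    vel = [[0.0] * dims for _ in range(particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [mare(measured, effort_from_params(p, ufp, tdi)) for p in pos]
    gbest = pbest[min(range(particles), key=lambda i: pbest_fit[i])][:]
    for _ in range(iterations):
        for i in range(particles):
            for d in range(dims):
                # constriction factor applied to the whole update (one common convention)
                vel[i][d] = lam * (vel[i][d]
                                   + c1 * random.random() * (pbest[i][d] - pos[i][d])
                                   + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fit = mare(measured, effort_from_params(pos[i], ufp, tdi))
            if fit < pbest_fit[i]:                      # update local best
                pbest_fit[i], pbest[i] = fit, pos[i][:]
        gbest = pbest[min(range(particles), key=lambda i: pbest_fit[i])][:]
    return gbest   # optimized VAF parameters (a, b)
```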

V. RESULTS

The parameters 0.65 and 0.01 of the VAF were optimized by implementing the above PSO algorithm in MATLAB, and the values obtained were -0.20379 and 1. The new parameters were used to estimate the effort of the proposed model; the estimated effort values are given in Table III. Function points have an advantage over lines of code: FP takes the user's point of view into consideration and is technology independent. PSO, one of the swarm intelligence techniques used for optimization, effectively tuned the parameters of the function point count.
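As a quick sanity check of the reported parameters, and assuming the optimized VAF keeps the same linear form as equation (8), project 2 from Table II reproduces the optimized FP count and optimized effort listed in Table III:

```python
a, b = -0.20379, 1                        # optimized VAF parameters reported above
ufp, tdi = 330, 47                        # project 2 in Table II
fp_opt = ufp * (a + b * tdi)              # optimized function point count
effort_opt = -13.39 + 0.0545 * fp_opt     # optimized Albrecht & Gaffney effort
print(round(fp_opt), round(effort_opt, 2))  # 15443 828.24, matching Table III
```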

A. Performance Analysis

The two criteria used for comparing the performance of the proposed model with the existing models are:

1. Mean Absolute Relative Error (MARE):

MARE is a performance evaluation criterion computed as the mean of the absolute relative difference between the measured effort and the estimated effort.

%MARE = (1/n) * Σ(i=1 to n) [ |MEi - EEi| / MEi ] * 100    (13)

where MEi is the Measured Effort, EEi is the Estimated Effort, and n is the number of projects.

2. Root Mean Square Error (RMSE):

Calculation of the RMSE involves a sequence of three simple steps. The total squared error is obtained first as the sum of the individual squared errors; it is then divided by n, where n is the number of observations, which yields the mean squared error (MSE). The third and final step is to take the RMSE as the square root of the MSE.

RMSE = sqrt( (1/n) * Σ(i=1 to n) (MEi - EEi)^2 )    (14)
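A minimal sketch of both criteria, with function names chosen for illustration:

```python
import math

def mare_percent(measured, estimated):
    """Equation (13): mean absolute relative error, as a percentage."""
    n = len(measured)
    return sum(abs(m - e) / m for m, e in zip(measured, estimated)) / n * 100

def rmse(measured, estimated):
    """Equation (14): root mean square error."""
    n = len(measured)
    return math.sqrt(sum((m - e) ** 2 for m, e in zip(measured, estimated)) / n)
```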


Table II shows the calculation of Function points based upon the project data for twenty projects gathered from a software company.

TABLE III: ESTIMATED EFFORT OF VARIOUS MODELS

Project No.  FP Count  Optimized FP Count  Measured Effort  Albrecht & Gaffney Effort  Kemerer Effort  Optimized Effort
 1.          188.57    7567.7              3313.8           -3.1129                    31.412          399.54
 2.          369.6     15443               6373.2            6.7532                    236.53          828.24
 3.          311.64    11994               5393.7            3.5944                    141.79          640.29
 4.          339.36    14179               5862.2            5.1051                    183.09          759.38
 5.          289.44    11469               5018.5            2.3845                    113.59          611.69
 6.          264.12    7894.1              4590.6            1.0045                    86.315          416.84
 7.          271.6     8902.9              4717              1.4122                    93.858          471.82
 8.          295.9     12050               5127.7            2.7366                    121.37          643.34
 9.          283.22    9478.1              4913.4            2.0455                    106.43          503.17
10.          210.49    6899.8              3684.3           -1.9183                    43.69           362.65
11.          207.48    5881.5              3633.4           -2.0823                    41.842          307.15
12.          264.14    8091.7              4591              1.0056                    86.335          427.61
13.          319.06    12280               5519.1            3.9988                    152.60          655.85
14.          282.31    11343               4898              1.9959                    105.41          604.82
15.          244.42    8662.7              4257.7           -0.06911                   68.406          458.73
16.          241.02    8844.3              4200.2           -0.25441                   65.591          468.63
17.          273.36    9861.4              4746.8            1.5081                    95.695          524.06
18.          222.56    8693.6              3888.3           -1.2605                    51.645          460.41
19.          278.61    11495               4835.5            1.7942                    101.31          613.08
20.          222.56    8302.4              3888.3           -1.2605                    51.645          439.09

From the given values of function points, the effort is calculated according to the Albrecht & Gaffney and Kemerer models [5, 10], and the optimized effort using PSO is also evaluated, as shown in Table III.

Table IV shows that the proposed model gives better results than the Albrecht & Gaffney and Kemerer models in terms of %MARE and RMSE. The proposed model based upon PSO results in a %MARE value of 88.77 and an RMSE value of 4192.1, which are considerably lower than the values obtained from the Albrecht & Gaffney and Kemerer models.

TABLE IV: PERFORMANCE ANALYSIS

Model                 %MARE   RMSE
Albrecht & Gaffney    99.98   4732.4
Kemerer Model         97.99   4628.8
Proposed Model        88.77   4192.1

Figure 1 presents the absolute error obtained from the Albrecht & Gaffney model, the Kemerer model and the proposed model. The figure clearly shows that the absolute error of the proposed model is far less than that of the other two models.


[Figure 1 plots the absolute error (0-7000) against project number (1-20) for the Albrecht & Gaffney model, the Kemerer model and the proposed model.]

FIG. 1: ABSOLUTE ERROR OF DIFFERENT MODELS

VI. CONCLUSION & FUTURE WORK

In this research, particle swarm optimization was used to fine-tune the parameters of the value adjustment factor in order to optimize the function point count, which was then used to estimate the effort. The values of estimated effort were compared to the effort obtained from the Albrecht & Gaffney model and the Kemerer model [5, 10]. The performance measures used for comparison were the Mean Absolute Relative Error and the Root Mean Square Error. The results obtained showed that the proposed model performed better than the other models, as the %MARE and RMSE values for the proposed model were the lowest.

This research concentrates on tuning the parameters of the VAF. In future, PSO can be used to optimize other parameters, such as those of the UFP, and other optimization techniques can also be applied to optimize function points.

REFERENCES

[1] A. Sheta, D. Rine and A. Ayesh, "Development of Software Effort and Schedule Estimation Models Using Soft Computing Technique", IEEE Congress on Evolutionary Computation, Hong Kong, pp. 1283-1289, 2008.

[2] B. Jeng, D. Yeh, D. Wang, S. Chu and C. Chen, "A Specific Effort Estimation Method Using Function Point", Journal of Information Science and Engineering, Vol. 27, pp. 1363-1376, 2011.

[3] B.W. Boehm, Software Engineering Economics, Englewood Cliffs, NJ: Prentice-Hall, 1981.

[4] CH.V.M.K. Hari and P.V.G.D. Reddy, "A Fine Tuning Parameter for COCOMO 81 Software Effort Estimation using Particle Swarm Optimization", Journal of Software Engineering, Vol. 5, No. 1, pp. 38-48, 2011.

[5] C.F. Kemerer, "An Empirical Validation of Software Cost Estimation Models", Communications of the ACM, Vol. 30, No. 5, pp. 416-429, 1987.

[6] M.E.H. Pedersen and A.J. Chipperfield, "Simplifying Particle Swarm Optimization", Applied Soft Computing, Vol. 10, No. 2, pp. 618-628, 2009.

[7] P.V.G.D. Reddy and CH.V.M.K. Hari, "Software Effort Estimation Using Particle Swarm Optimization with Inertia Weight", International Journal of Software Engineering (IJSE), Vol. 2, No. 4, pp. 87-96, 2011.

[8] Q. Bai, "Analysis of Particle Swarm Optimization Algorithm", Computer and Information Science, Vol. 3, No. 1, pp. 180-184, 2008.

[9] R. Mall, Fundamentals of Software Engineering, PHI Learning Private Limited, New Delhi, 2008.

[10] R.S. Pressman, Software Engineering: A Practitioner's Approach, McGraw-Hill Series in Computer Science, New York, 2001.

[11] S.K. Sehra, Y.S. Brar and N. Kaur, "Soft Computing Techniques for Software Project Effort Estimation", International Journal of Advanced Computer and Mathematical Sciences, Vol. 2, No. 3, pp. 160-167, 2011.
