
Evaluation of the Accuracy of Naslund’s Approximation for

Managerial Decision Making under Uncertainty

Hüseyin Sarper

University of Southern Colorado 2200 Bonforte Boulevard Pueblo, Colorado 81001-4901

Transmitted by Dr. Melvin R. Scott

ABSTRACT

This paper uses Chance-Constrained Mathematical Programming (CCP) as a tool in project selection when the coefficients involved are independent normal random variables. The main contribution of the paper centers on showing how well chance-constrained programs can be solved using Naslund's approximation instead of other mathematical methods. If a given decision process involves a degree of uncertainty in its parameters and only binary decisions are required, it is shown that the approximation is the only available solution method.

1. INTRODUCTION

Risk is a major element in the decision process. Chance-Constrained Mathematical Programming (CCP) is one way to include risk in ordinary linear mathematical models. CCP expresses constraints in the form of probability levels of attainment. Rao [9] provides a complete treatment of the origins and theory of CCP. In CCP, the coefficients have typically been assumed to follow a normal distribution with known means and variances. Consider the following model:

Maximize F(X) = Σ_{j=1}^{n} c_j X_j    (1)

APPLIED MATHEMATICS AND COMPUTATION 55:73-87 (1993)

© Elsevier Science Publishing Co., Inc., 1993
655 Avenue of the Americas, New York, NY 10010    0096-3003/93/$6.00


subject to

Pr[Σ_{j=1}^{n} a_ij X_j ≤ b_i] ≥ p_i    for i = 1, 2, ..., m    (2)

X_j ≥ 0    for j = 1, 2, ..., n    (3)

where c_j, a_ij, and b_i are normal random variables and the p_i's are prespecified probabilities. The above model is the extreme case of stochasticity, since all three sets of coefficients are random variables. Typically, the b_i's and/or a_ij's and/or c_j's are considered random. Considering only independently and normally distributed coefficients, the three basic cases are as follows:

1) When only the b_i's are random: Let e_i (nonnegative) represent the standard normal variate at which Φ(-e_i) = 1 - p_i, i.e., Φ(e_i) = p_i. Then (1) remains the same and each constraint of type (2) takes the deterministic form (4). No nonlinear terms result in this case:

Σ_{j=1}^{n} ā_ij X_j - b̄_i + e_i [Var(b_i)]^{1/2} ≤ 0    for i = 1, 2, ..., m.    (4)

2) When only the a_ij's are random: (1) remains the same, but the constraints (2) become nonlinear, as shown below.

Σ_{j=1}^{n} ā_ij X_j + e_i [Σ_{j=1}^{n} Var(a_ij) X_j²]^{1/2} - b_i ≤ 0    for i = 1, 2, ..., m.    (5)

3) When only the c_j's are random: (2) remains the same, but the objective function (1) becomes nonlinear, as shown below.

Maximize F(X) = k_1 Σ_{j=1}^{n} c̄_j X_j + k_2 [Σ_{j=1}^{n} Var(c_j) X_j²]^{1/2}    (6)

In (6) above, k_1 and k_2 are nonnegative coefficients that indicate the relative importance of the mean and the variance. For example, setting both k_1 and k_2 equal to 1 indicates that the mean and the variance are equally important. If all three coefficients are random, then the objective function remains as in (6) and each constraint (2) takes the form shown below.

Y_i + e_i [Σ_{j=1}^{n} Var(a_ij) X_j² + Var(b_i)]^{1/2} ≤ 0    for i = 1, 2, ..., m,    (7)

where Y_i = Σ_{j=1}^{n} ā_ij X_j - b̄_i for all i.

When some or all of the coefficients are correlated, equations similar to (4)-(7) above are given by Rao [9]. If the coefficients follow a distribution other than the normal, no closed-form equations have been reported in the literature, as nonnormal coefficients have not really been studied in the CCP context. One solution method would then be to approximate such distributions by a normal one.

The difficulty of a CCP solution starts after the deterministic equivalents are written. Except when only the b_i's are random, each constraint and/or the objective function consists of two segments: linear terms and nonlinear terms. The purpose of this paper is to evaluate the performance of a method that approximates the nonlinear segment by a linear one. The nonlinear segment has the form (Σ_{j=1}^{n} V_j X_j)^{1/2}, where V_n is the variance, if any, of the b_i term; the problem has n - 1 decision variables with variances V_1 through V_{n-1}, and the variable X_n, which arises, is simply ignored. Once approximated, the coefficients of the linearized segment are added to the linear segment coefficients to end up with a completely linear model. If a nonlinear solution is desired, the decision maker has some tools available. Wasil et al. [12] review six major nonlinear optimization software packages and compare them using a four-variable example in which the objective function is mildly nonlinear and all the constraints are linear; the authors state that many real-world problems fit this format. The NonLinear Problem (NLP) class presented in this paper, then, has not been covered in [12]: a linear or nonlinear objective function coupled with a nonlinear constraint set represents a different level of complexity. The problem expressed by (4)-(7) is certainly a very real type of problem that describes the risk inherent in all decision making processes, such as the project selection problem under budget constraints. It is reasonable to expect that the available budget and the project requirements become more uncertain in future years, and the problem becomes more realistic if each available budget is expressed as a random variable. This real-world problem can either be solved using one of the software packages reviewed in [12], or it can be approximated by a Linear Problem (LP). [12] does not clearly claim that GINO [4] is the best NLP software package, but GINO appears to perform better than the


other five based upon various criteria. Thus, GINO is used in this paper to solve all nonlinear programs.

Another major shortcoming of all NLP software is the inability to solve binary or even general integer problems; in fact, [12] does not mention integer variables at all. The purpose of this paper is to evaluate the performance of Naslund's method [5] of approximating the NLP types presented earlier. The GINO software is used in solving a small and a large problem for comparison with the approximate solution. GINO uses a generalized reduced gradient method that requires that the constraint set be convex. While manually solving another four-variable deterministic equivalent of a CCP, Panne and Popp [7] prove that constraints of the form (5) provide a convex set even when covariances exist. The use of GINO is therefore possible.

2. NASLUND’S APPROXIMATION [5]

This approximation is based on fitting an affine function through n + 1 solution points. The nonlinear square root portion of each constraint of the deterministic equivalent is converted into an approximate linear form and then added to the rest of the constraint. If the objective function has any nonlinear (square root) terms due to stochastic cj coefficients, this approxi- mation, shown below, can also be used to linearize the objective function.

(Σ_{j=1}^{n} V_j X_j)^{1/2} ≈ Σ_{j=1}^{n} [S^{1/2} - (S - V_j)^{1/2}] X_j + S^{1/2} - Σ_{j=1}^{n} [S^{1/2} - (S - V_j)^{1/2}],    where S = Σ_{j=1}^{n} V_j.    (8)

V_j is the coefficient of each decision variable X_j in the square-root section. At the end of the approximation process a constant is obtained, and it is carried over to the right-hand side after changing its sign. This method requires no additional variables or constraints. The right-hand side of (8) can easily be programmed as a subroutine of any Mathematical Programming Software (MPS) formatted input generator code; any general-purpose mathematical optimizer can then be used for the solution. The example below illustrates how this approximation can be used to linearize the square


root of a sum of binary variables. The expression (9) below is typically encountered in binary CCP problems,

(103.7X_1² + 112.5X_2² + 68.5X_3² + 76X_4² + 40X_5² + 102X_6² + 61X_7² + 75X_8² + 14X_9² + 36X_{10}²)^{1/2}    (9)

where X_1, ..., X_{10} ∈ {0, 1}. Application of Naslund's approximation (8) to the above nonlinear expression results in the following linear form:

2.05X_1 + 2.24X_2 + 1.34X_3 + 1.49X_4 + 0.77X_5 + 2.02X_6 + 1.19X_7 + 1.47X_8 + 0.26X_9 + 0.69X_{10} + 12.72.    (10)

The constant 12.72 is ignored if (9) is part of an objective function and carried over to the right-hand side if (9) is part of a constraint. If all X's are 0, then (10) equals 12.72 while (9) equals 0; this represents the largest error between the original expression (9) and its approximation (10). If all X's are 1's, then (9) equals 26.2431, while (10) gives 26.2400, an error of only 0.012%. In project selection decisions the number of variables is often large, and the approximation performs better as the number of variables increases. A code was used to compare the actual values of (9) with those of (10) for all 1,024 combinations of the ten 0-1 variables. The average error is 8.9% over all combinations, but the error rate falls rapidly as the number of 1's in a given combination increases. For example:

- The average error is 7.93% if there are at least two 1's.
- The average error is 3.30% if there are at least five 1's.
- The average error is 2.00% if there are at least seven 1's.
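The comparison just described can be reproduced with a short script. The sketch below (Python) assumes, consistently with the numbers in (10), that the fit behind (8) is the affine function through the all-ones point and the ten points with a single variable switched off; it rebuilds the coefficients of (10) from (9) and re-runs the 1,024-combination comparison:

```python
import itertools
import math

V = [103.7, 112.5, 68.5, 76, 40, 102, 61, 75, 14, 36]  # coefficients in (9)
S = sum(V)                       # total: 688.7
sqrt_S = math.sqrt(S)            # value of (9) when every X_j = 1

# Affine fit through the all-ones point and the ten one-variable-off points.
slope = [sqrt_S - math.sqrt(S - v) for v in V]
constant = sqrt_S - sum(slope)   # ~12.70 (12.72 in (10) after rounding)

exact = lambda x: math.sqrt(sum(v * xj for v, xj in zip(V, x)))
approx = lambda x: sum(s * xj for s, xj in zip(slope, x)) + constant

print(round(constant, 2))                              # error at all zeros
print(round(exact([1] * 10), 4), round(approx([1] * 10), 4))  # both 26.2431

# Average relative error over combinations with at least k ones.
for k in (2, 5, 7):
    errs = [abs(approx(x) - exact(x)) / exact(x)
            for x in itertools.product((0, 1), repeat=10) if sum(x) >= k]
    print(k, round(100 * sum(errs) / len(errs), 2))
```

The coefficients and the constant agree with (10) to within the two-decimal rounding used in the text; the exact averages printed by the loop depend on how the low-count combinations are weighted, so they may differ slightly from the figures quoted above.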

3. LITERATURE REVIEW ON CCP SOLUTIONS

It is not the intent here to review the entire field of CCP; it will simply be noted that CCP is often described as one of the major components of stochastic programming. Although there are many references on CCP itself, not all provide examples with clear solution methods. Hillier [2] provides a mathematical method for solving problems similar to the ones considered here. Olson and Swenseth [6] provide another approximation that can only be used with continuous decision variables. Rakes et al. [8] use a separable programming method to solve expressions like (5) in their production planning example, which has three decision variables and three time


periods. Seppälä and Orpana [11] present a very different approximation method that converts (5) into linear form by creating many additional supporting constraints. Recently, Weintraub and Vera [13] proposed a convergent cutting plane algorithm to solve the deterministic equivalents of CCP's when the coefficients are normal random variables. This new method applies strictly to the non-integer case. The authors compare the accuracy of their method with nonlinear solutions found using the MINOS package, one of the packages reviewed in [12]. Without referencing Panne and Popp [7], Weintraub and Vera [13] themselves show that constraints of the form (5) do indeed form a convex set, thus enabling the use of NLP codes such as MINOS and GINO. [13] does not refer to [5] either while reviewing other solution methods such as [6, 11].

It is clear that the value of Naslund's approximation (8) has not been widely noticed. This approximation appears to be the only direct means of solving binary CCP's. Only two references use (8) in making decisions: De et al. [1] and Keown and Taylor [3] use it to solve the nonlinear programs encountered once the deterministic equivalents of the CCP are found, and both use relatively small examples. No reference in the literature evaluates the accuracy of Naslund's approximation other than Naslund [5] himself, who geometrically explains the accuracy of his method using small examples.

4. EXAMPLES

This section uses two sample problems to evaluate the performance of Naslund’s approximation. These problems are similar to project selection problems with the goal of maximization of net present value subject to various resource constraints.

4.1. Small Problem

Consider the following project selection problem, in which the available resources and the resources required by each project are independent normal random variables. The objective function coefficients are, initially, deterministic:

Maximize [10X_1 + 15X_2 + 20X_3 + 14X_4]    (11)

subject to:

(100;5)X_1 + (150;6)X_2 + (215;8)X_3 + (85;3)X_4 ≤ (500;15)    (12)

(25;2)X_1 + (15;2)X_2 + (10;2)X_3 + (35;3)X_4 ≤ (74;4)    (13)


(40;3)X_1 + (0.5;0.1)X_2 + (20;2)X_3 + (5;1)X_4 ≤ (60;5)    (14)

X_j ∈ {0, 1}    (Case 1)

0 ≤ X_j ≤ 1    (Case 2)

X_j ≥ 0    (Case 3)

Case 1 is the most typical scenario encountered in decision making: the decision maker needs to know which projects to accept and which to reject. There is little distinction between the nonlinear and linear programs where binary choice is concerned, since all binary problems can be written in linear form, though sometimes at the cost of extra variables. This case is still considered because it is very important in the decision process. Case 2 is also relevant: it allows the decision maker to round the resulting X_j's up or down if there is no easy way to solve the resulting nonlinear program of Case 1. In fact, this is exactly what current nonlinear programming software cannot accomplish. Case 3 could apply when the X_j's represent product-mix decisions rather than Yes/No project selection decisions.

The numbers in each parenthesis in (12)-(14) indicate the mean and the standard deviation of the respective coefficient. If management requires that each constraint hold with at least 99% probability, the constraints take the following deterministic-equivalent, but nonlinear, form; (5) is used in converting (12)-(14) into (15)-(17).

100X_1 + 150X_2 + 215X_3 + 85X_4 + 2.33(25X_1² + 36X_2² + 64X_3² + 9X_4² + 225)^{1/2} ≤ 500    (15)

25X_1 + 15X_2 + 10X_3 + 35X_4 + 2.33(4X_1² + 4X_2² + 4X_3² + 9X_4² + 16)^{1/2} ≤ 74    (16)

40X_1 + 0.5X_2 + 20X_3 + 5X_4 + 2.33(9X_1² + 0.01X_2² + 4X_3² + X_4² + 25)^{1/2} ≤ 60    (17)
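Because the small problem has only four binary variables, the Case 1 optimum can be checked by brute force. A minimal sketch (Python) builds the deterministic equivalents (15)-(17) directly from the (mean; standard deviation) data in (12)-(14) and enumerates all 16 binary combinations:

```python
import itertools
import math

# (mean, std) pairs for the coefficients of (12)-(14); last pair is the RHS.
rows = [
    ([(100, 5), (150, 6), (215, 8), (85, 3)], (500, 15)),
    ([(25, 2), (15, 2), (10, 2), (35, 3)], (74, 4)),
    ([(40, 3), (0.5, 0.1), (20, 2), (5, 1)], (60, 5)),
]
e = 2.33                 # standard normal value for the 99% level
c = [10, 15, 20, 14]     # objective coefficients in (11)

def feasible(x):
    """Check x against the deterministic equivalents (15)-(17)."""
    for coeffs, (b_mean, b_std) in rows:
        lin = sum(m * xj for (m, s), xj in zip(coeffs, x))
        var = sum(s ** 2 * xj ** 2 for (m, s), xj in zip(coeffs, x)) + b_std ** 2
        if lin + e * math.sqrt(var) > b_mean:
            return False
    return True

best = max((x for x in itertools.product((0, 1), repeat=4) if feasible(x)),
           key=lambda x: sum(cj * xj for cj, xj in zip(c, x)))
print(best, sum(cj * xj for cj, xj in zip(c, best)))  # → (0, 1, 1, 1) 49
```

The result matches the Case 1 columns of Table 1: projects 2, 3, and 4 are selected with Z = 49, and the all-ones combination is cut off by (15).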

Figure A.1 shows the convex set formed when (15) is plotted after dropping variables X_3 and X_4. The linearized forms of the constraints


FIG. A.1. Plot of the constraint Y = f(X_1, X_2) = 100X_1 + 150X_2 + 2.33[25X_1² + 36X_2² + 225]^{1/2}.

(15)-(17) under Naslund's approximation (8) are shown below.

101.56X_1 + 152.27X_2 + 219.13X_3 + 85.56X_4 ≤ 464.37    (18)

25.79X_1 + 15.79X_2 + 10.79X_3 + 36.84X_4 ≤ 64.03    (19)

41.79X_1 + 0.50X_2 + 20.77X_3 + 5.19X_4 ≤ 48.19    (20)
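As a cross-check, (18)-(20) can be regenerated from the data in (12)-(14). The sketch below (Python) treats the constant Var(b_i) term as one more entry under the square root, applies the affine fit through the all-ones and one-variable-off points (an assumption about the fit behind (8), not a documented solver feature), and carries the resulting constant to the right-hand side:

```python
import math

e = 2.33  # standard normal value for the 99% probability level
rows = [  # (linear means, Var(a_ij), Var(b_i), mean b_i) for (12)-(14)
    ([100, 150, 215, 85], [25, 36, 64, 9], 225, 500),
    ([25, 15, 10, 35], [4, 4, 4, 9], 16, 74),
    ([40, 0.5, 20, 5], [9, 0.01, 4, 1], 25, 60),
]

results = []
for lin, var, vb, b in rows:
    S = sum(var) + vb                    # total variance under the root
    slope = [math.sqrt(S) - math.sqrt(S - v) for v in var]
    vb_slope = math.sqrt(S) - math.sqrt(S - vb)
    const = math.sqrt(S) - sum(slope) - vb_slope
    coeffs = [c + e * s for c, s in zip(lin, slope)]
    rhs = b - e * (const + vb_slope)     # constant moved to the RHS
    results.append((coeffs, rhs))
    print([round(c, 2) for c in coeffs], round(rhs, 2))
# → [101.56, 152.27, 219.13, 85.56] 464.37
# → [25.79, 15.79, 10.79, 36.84] 64.03
# → [41.79, 0.5, 20.77, 5.19] 48.19
```

The printed coefficients and right-hand sides reproduce (18)-(20) to the two decimals given in the text.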

Using LINDO [10], Table 1 shows the solutions found under each of the three cases using both methods. It is clear that the approximation (8) performs very well, and it actually allows one to solve binary CCP's. Nonlinear general-integer problems, and even binary ones, are among the hard problems of Operations Research, and there is no known software that can solve them when the problem size gets very large. Next, the objective function is made stochastic by letting each c_j have a standard deviation equal to 10%


TABLE 1
COMPARISON USING THE SMALL EXAMPLE

           Case 1             Case 2                  Case 3
        Naslund's  (*)    Naslund's   GINO        Naslund's   GINO
X_1        0        0     0.143851    0.087032    0           0
X_2        1        1     1.0         1.0         0           0.542023
X_3        1        1     1.0         1.0         1.626532    1.289455
X_4        1        1     0.915855    0.984935    1.261664    1.148632
Z         49       49     49.26       49.66       50.19       50.00

* This column is found by complete enumeration of Equations (11), (15), (16), and (17). GINO, at present, does not handle integer models.

of the mean value shown in (11). The nonlinear objective function is shown in (21), with k_1 = k_2 = 1.

Maximize [10X_1 + 15X_2 + 20X_3 + 14X_4 + (X_1² + 2.25X_2² + 4X_3² + 1.96X_4²)^{1/2}]    (21)

Figure A.2 shows another convex set, formed by plotting (21) after dropping variables X_3 and X_4. Both Figures A.1 and A.2 are convex, although both appear almost planar. This is not unexpected, as each expression is the sum of linear and some mildly nonlinear components. As noted before, convexity is proved in [7] and [13]. Using (8), the linearized form of the deterministic equivalent of the objective function (21) is shown in (22) below.

10.1695X_1 + 15.3966X_2 + 20.7523X_3 + 14.3422X_4 + 1.374    (22)
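The coefficients in (22) can be regenerated the same way from the variances in (21). A sketch (Python, under the same assumed affine fit through the all-ones and one-variable-off points):

```python
import math

c_mean = [10, 15, 20, 14]
c_var = [1.0, 2.25, 4.0, 1.96]  # (10% of each mean in (11)) squared
S = sum(c_var)                  # 9.21
slope = [math.sqrt(S) - math.sqrt(S - v) for v in c_var]
const = math.sqrt(S) - sum(slope)
coeffs = [m + s for m, s in zip(c_mean, slope)]
print([round(x, 4) for x in coeffs], round(const, 3))
# → [10.1695, 15.3966, 20.7523, 14.3422] 1.374
```

All four coefficients and the constant match (22) to the digits printed in the text.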

Table 2 compares the solutions for the same problem when the objective function is stochastic. Again, the results are essentially the same.

4.2. Large Problem

A 14 × 40 matrix is used. The mean of each a_ij is randomly chosen in the range 1 to 99; Table A.1 shows this data. The standard deviation of each a_ij is set at 10% (not shown) of the mean value. The c_j's (net present values, or project benefits) are deterministic and shown in Table A.2. The means of the right-hand sides, the b_i's, are shown in Table A.3. For each row, the


FIG. A.2. Plot of the objective function Y = f(X_1, X_2) = 10X_1 + 15X_2 + (X_1² + 2.25X_2²)^{1/2}.

TABLE 2
COMPARISON USING THE SMALL EXAMPLE WITH RANDOM c_j's

           Case 1             Case 2                  Case 3
        Naslund's  (*)    Naslund's   GINO        Naslund's   GINO
X_1        0        0     0.143851    0.087029    0           0
X_2        1        1     1.0         1.0         0           0
X_3        1        1     1.0         1.0         1.626532    1.606930
X_4        1        1     0.915855    0.984939    1.261664    1.266324
Z        51.87    51.87   52.12       52.52       53.22       53.54

* This column is found by complete enumeration of Equations (15), (16), (17), and (22).


TABLE A.1
14 × 40 RANDOMLY PICKED MEAN a_ij COEFFICIENT MATRIX

Columns 1-20:
Row 1:  16 37 08 99 12 34 23 64 36 35 68 90 35 60 97 09 34 33 50 50
Row 2:  66 31 85 63 73 24 38 22 50 13 36 91 58 39 98 91 49 45 23 68
Row 3:  08 11 83 88 99 45 43 36 46 46 70 32 12 92 76 86 46 16 28 35
Row 4:  65 80 74 69 09 40 51 54 16 68 45 96 33 94 75 08 99 23 37 08
Row 5:  91 80 44 12 63 83 77 05 15 40 43 34 44 48 80 33 69 45 98 26
Row 6:  61 15 94 42 23 89 20 69 45 96 33 83 77 03 68 58 70 73 41 35
Row 7:  04 35 59 46 32 05 15 40 43 34 44 89 20 14 03 33 40 42 05 08
Row 8:  69 19 45 94 98 69 31 97 05 59 02 35 58 04 49 35 24 94 45 86
Row 9:  33 80 79 18 74 40 44 01 76 64 19 09 80 10 25 61 96 27 93 35
Row 10: 54 11 48 69 07 34 45 02 05 03 14 39 06 71 24 72 23 28 72 95
Row 11: 49 41 38 87 63 86 87 17 17 77 66 14 68 46 05 88 52 36 01 39
Row 12: 79 19 76 35 58 26 85 26 95 67 97 73 75 86 77 28 14 40 34 88
Row 13: 40 44 01 10 51 64 26 45 01 87 20 01 19 39 09 47 34 07 35 44
Row 14: 82 16 15 01 84 36 45 41 96 71 98 77 80 77 55 73 22 70 97 79

Columns 21-40:
Row 1:  07 34 24 23 38 64 36 35 26 45 74 77 74 51 92 43 37 37 29 65
Row 2:  47 68 90 35 22 50 13 36 39 45 95 93 42 58 26 65 27 31 53 39
Row 3:  54 91 58 45 43 36 46 46 15 47 44 52 66 95 27 07 99 21 36 78
Row 4:  09 32 12 40 51 59 54 16 38 48 82 39 61 01 18 33 21 60 15 94
Row 5:  94 68 45 96 33 83 77 05 66 46 16 28 35 54 94 75 08 70 99 23
Row 6:  53 56 15 40 43 34 44 89 37 08 92 48 42 58 26 04 27 22 05 52
Row 7:  37 20 69 31 97 05 59 02 28 25 62 90 10 33 93 33 78 39 56 01
Row 8:  25 35 58 40 91 04 47 43 06 70 61 74 29 41 85 39 41 21 18 38
Row 9:  33 68 98 74 52 87 03 34 32 97 92 65 75 53 14 03 33 12 40 42
Row 10: 29 42 06 64 13 71 82 42 05 08 23 41 13 74 67 78 18 14 47 54
Row 11: 22 39 88 99 33 08 94 70 06 10 68 71 17 78 17 60 97 51 09 34
Row 12: 88 45 01 25 20 96 93 23 33 50 50 07 39 98 94 86 43 09 19 94
Row 13: 13 48 36 71 24 68 54 34 36 16 81 68 51 34 88 88 15 08 53 01
Row 14: 01 19 52 05 14 07 94 98 54 03 54 56 05 01 45 11 76 89 18 63


TABLE A.2
OBJECTIVE FUNCTION COEFFICIENTS FOR THE LARGE EXAMPLE

X_j:  1    2    3    4    5    6    7    8    9    10   11   12   13   14   15   16   17   18   19   20
c_j:  450  789  345  123  567  156  150  …    189  123  267  356  120  289  145  153  167  256  220  489

X_j:  21   22   23   24   25   26   27   28   29   30   31   32   33   34   35   36   37   38   39   40
c_j:  345  463  367  426  320  429  385  261  157  506  422  439  435  233  357  236  232  528  284  294

TABLE A.3
MEAN RIGHT-HAND SIDE (RHS) VALUES (RESOURCE LIMITS) FOR THE LARGE EXAMPLE

Row:  1       2       3       4       5       6       7      8       9       10     11      12      13     14
RHS:  1080.6  1222.2  1202.4  1096.2  1287.0  1134.0  887.4  1130.4  1182.4  907.8  1168.8  1314.6  906.6  1188


mean b_i's were set at 60% of the sum of the requirements shown in Table A.1. The standard deviation of each b_i was set at 15% of the mean. The above data corresponds to a project selection problem with 40 competing projects: the optimal project mix is to be selected under 14 different resource constraints when the a_ij's and b_i's are independent normal random variables.

1. Version 1. This version of the large problem contains the first five constraints and the first 30 variables of the problem shown in Tables A.1, A.2, and A.3. Since a Case 1 (binary) solution is not possible for the nonlinear deterministic equivalents, it was decided to enumerate all 2^30 possible binary combinations of the decision variables X_1, X_2, ..., X_30. A FORTRAN code was used to check the feasibility of each combination in constraints similar to (15)-(17) and to select the combination that maximized the objective function. This process took about 79 hours of CPU time on a PRIME 6150 mainframe computer operating at eight million operations per second. The deterministic equivalents were also approximated using (8) and solved quickly with LINDO [10]. Under both methods, an identical 15-project mix with objective function value of 6,286 was selected.

2. Version 2. This version involves the full data. Case 1 is not solvable using GINO. Table 3 shows the optimal values found under both methods. Case 1 using the approximation (8) results in selection of projects No. 1, 2, 5, 12, 20, 22, 24, 25, 26, 27, 28, 30, 33, 37, and 38. Case 2, using GINO, results in selection of projects No. 2, 5, 12, 20, 21, 22, 26, 27, 28, 30, 32, 33, and 38; the corresponding decision variables were found to be exactly 1.0 by GINO, although each was only restricted as X_j ≤ 1. GINO also found X_1 = 0.748, X_25 = 0.692, and X_37 = 0.501. If management decides to round all X's greater than 0.50 up to 1.0, projects No. 1, 25, and 37 could also be selected. Then, projects No. 21, 24, 25, and 32 are the only ones not selected under both scenarios (Case 1 using the approximation and Case 2 using

TABLE 3
COMPARISON OF OPTIMAL VALUES USING THE LARGE EXAMPLE (VERSION 2)

          Optimal Value Found Via
Method    Naslund's    GINO
Case 1    6,636.00     not solvable
Case 2    6,851.00     6,847.95
Case 3    9,087.00     8,901.27


GINO). Comparison of the Case 2 solutions under each method gives almost identical values. Finally, the Case 3 solutions compare as follows: the five nonzero variables take the values 6.52, 0.38, 4.96, 0.60, and 2.21 under the approximation, versus 6.32, 0.43, 4.83, 0.54, and 2.25 under GINO.

5. CONCLUSION

This note shows that the little-used Naslund approximation is a very good tool for the quick solution of CCP's. Project selection problems involving few decision variables can be solved even by complete enumeration, but as the number of decision variables grows, enumeration becomes prohibitive. The set of projects selected via Naslund's approximation can also be used as a lower bound or a trial solution in a branch-and-bound based implicit enumeration scheme. Case 3 solutions obtained from the approximation may also be useful when the number of continuous decision variables exceeds the limit of the particular NLP software available.

REFERENCES

1. P. K. De, D. Acharya, and K. C. Sahu, A chance-constrained goal programming model for capital budgeting, J. Oper. Res. Soc. Japan 33:635-638 (1982).
2. F. S. Hillier, Chance-constrained programming with zero-one or bounded continuous decision variables, Management Sci. 14:34-57 (1967).
3. A. J. Keown and B. W. Taylor, A chance-constrained integer goal programming model for capital budgeting in the production area, J. Oper. Res. Soc. 31:579-589 (1980).
4. J. Liebman, L. Lasdon, L. Schrage, and A. Waren, Modeling and Optimization with GINO, The Scientific Press, Redwood City, California, 1986.
5. B. Naslund, A model of capital budgeting under risk, Journal of Business 39:257-271 (1966).
6. D. L. Olson and S. R. Swenseth, A linear approximation for chance-constrained programming, J. Oper. Res. Soc. 38:261-267 (1987).
7. V. D. Panne and W. Popp, Minimum cost cattle feed under probabilistic protein constraints, Management Sci. 9:405-430 (1962).
8. R. T. Rakes, L. S. Franz, and J. A. Wynne, Aggregate production planning using chance-constrained goal programming, International Journal of Production Research 22:673-684 (1984).
9. S. Rao, Optimization Theory and Applications, 2nd ed., J. Wiley & Sons, New York, 1984.
10. L. Schrage, Linear, Interactive, and Discrete Optimizer (LINDO), 4th ed., The Scientific Press, Redwood City, California, 1989.
11. Y. Seppälä and T. Orpana, Experimental study on the efficiency and accuracy of a chance-constrained programming algorithm, European J. Oper. Res. 16:345-357 (1984).
12. E. Wasil, B. Golden, and L. Liu, State-of-the-art in nonlinear optimization software for the microcomputer, Comput. Oper. Res. 16:497-512 (1989).
13. A. Weintraub and J. Vera, A cutting plane approach for chance constrained linear programs, Oper. Res. 39:776-785 (1991).