
NNpred


Page 1: NNpred

Instructions on Using the Tool (Building a Prediction Model)

Step 1: Enter Your Data

Usually one builds a prediction model with 1 output only. If you have, say, 2 output variables Y1 and Y2, both of which depend on the same set of input variables, you may be better off building 2 separate models: one with Y1 as output, another with Y2 as output.

Make sure that the number of Input (Cat & Cont) columns exactly matches the number entered in the UserInput sheet.

If a value is missing in a Cont column, the application will replace it with the column mean.

If a value is missing in a Cat column, the application will replace it with the most frequently occurring category.

If one of the categories of a Cat column has only 1 observation, you should do one of the following: remove that observation, OR rename the category to one of the other categories of that Cat column.
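The imputation rules above can be sketched in a few lines (a minimal illustration only, not the tool's actual code; the function names are made up):

```python
from statistics import mean
from collections import Counter

def impute_cont(values):
    """Replace any non-number in a Cont column with the column mean."""
    nums = [v for v in values if isinstance(v, (int, float))]
    m = mean(nums)
    return [v if isinstance(v, (int, float)) else m for v in values]

def impute_cat(values):
    """Replace blanks in a Cat column with the most frequent category.
    Labels are compared case-insensitively, as the tool does."""
    labels = [str(v).lower() for v in values if v not in (None, "")]
    mode = Counter(labels).most_common(1)[0][0]
    return [str(v).lower() if v not in (None, "") else mode for v in values]
```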

Step 2: Fill up Model Inputs

Step 3: Results of Modeling

At the end of the run, the final set of weights is saved in the Calc sheet.

MSE values are recorded as the training of the model progresses, and two charts showing the training and validation MSEs have already been provided in the Output sheet.

A new file will be created containing the model inputs, your data, and the fitted model (i.e., the weights). You will be able to use this file as a calculator to do prediction, given any new input.
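The MSE and ARE figures reported in the Output sheet can be read as in this sketch (assuming ARE stands for average relative error, |true - predicted| / |true| averaged over rows and shown as a percent; the workbook's exact formula may differ):

```python
def mse(y_true, y_pred):
    """Mean squared error over a set of (true, predicted) pairs."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def are_pct(y_true, y_pred):
    """Average relative error in percent; zero targets are skipped,
    which is where Excel would show #DIV/0! instead."""
    terms = [abs(t - p) / abs(t) for t, p in zip(y_true, y_pred) if t != 0]
    return 100.0 * sum(terms) / len(terms)
```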

Step 4: Study Profiles

A profile plot is the next best way to visualize this fitted surface.

(A) Enter your data in the Data worksheet, starting from cell AC105.
(B) The observations should be in rows and the variables should be in columns.
(C) Above each column, choose the appropriate Type (Omit, Output, Cont, Cat).

To drop a column from the model, set the type = Omit
To treat a column as categorical Input, set type = Cat
To treat a column as continuous Input, set type = Cont
To treat a column as Output, set type = Output

You can have at most 10 output variables. The application will automatically treat them all as continuous variables.

You can have at most 50 input variables, out of which at most 40 can be categorical.

(D) Please make sure that your data does not have blank rows or blank columns.
(E) Continuous Inputs:

Any non-number in a Cont column will be treated as a missing value.

(F) Categorical Inputs: Any blank cell, or cell containing an Excel error, in a Cat column will be treated as a missing value.

Category labels are case insensitive: the labels good, Good, GoOd, and GOOD will all be treated as the same category. There should be at least 2 observations in each category of a Cat column.

(A) Fill up the model inputs in the UserInput page.
(B) Make sure that your inputs are within the range of values allowed by the application.
(C) Click the 'Build Model' button to start modeling.

(A) A Neural Network model is basically a set of weights between the layers of the net.

(B) The Output page of this file will show you the values of MSE and ARE on the training and validation sets.

(C) If, in the UserInput page, you have asked to save the model in a separate file, then a new file will be created.

The fitted model is a surface in p dimensions, where p is the number of your inputs. Unless p is 2 or less, it is not possible to show the surface graphically.
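The "set of weights between the layers" can be made concrete with a bare-bones forward pass (a sketch only: sigmoid activations and a bias-first row layout are assumed here, and the numbers are illustrative, not taken from the Calc sheet):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, layers):
    """Propagate input x through a list of weight matrices.
    Each matrix row holds [bias, w1, w2, ...] for one neuron."""
    a = x
    for W in layers:
        a = [sigmoid(row[0] + sum(w * v for w, v in zip(row[1:], a)))
             for row in W]
    return a

# 2 inputs -> 2 hidden neurons -> 1 output (illustrative weights)
hidden = [[0.0, 0.5, -0.5], [0.1, 1.0, 1.0]]
output = [[0.0, 1.0, -1.0]]
y = forward([0.4, 0.6], [hidden, output])
```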

Page 2: NNpred

The profile plot is really a one-dimensional cross section of the high-dimensional surface. In the Profile sheet you can specify which predictor to vary and the values at which the other predictors should be held fixed.

If the predictor you choose to vary is categorical, then the other info (# of points to be generated, start and end values) will be ignored, and the graph will show you the predicted response for each category of the predictor you have chosen to vary.

Profile plot lets you study the following things:

(E.g., Y increases as X increases, OR Y decreases as X increases, OR the relationship is non-linear: Y first increases and then decreases with X, etc.)

Suppose there are two predictors X and Z, and we are studying the profile of Y as X varies. Suppose we look at the profile by keeping Z fixed at 1 and varying X between -10 and 10. Now keep Z fixed at 2 instead of 1 and vary X between -10 and 10. If the shapes of the profiles in these two scenarios are drastically different (e.g., one is increasing and the other is decreasing), then that says that X and Z have an interaction. In other words, the effect of X on the response is not the same at all levels of Z. To study the effect of X, it matters where Z is set.
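The interaction check above can be mimicked numerically: compute the profile of Y at two fixed values of Z and compare the shapes. In this sketch a hypothetical fitted function stands in for the trained network:

```python
def f(x, z):
    # hypothetical fitted surface; the x*z term creates the interaction
    return x * (z - 1.5)

def profile(fixed_z, xs):
    """Vary x over xs while holding z fixed: a 1-D cross section."""
    return [f(x, fixed_z) for x in xs]

xs = list(range(-10, 11))
p_z1 = profile(1, xs)  # decreasing in x
p_z2 = profile(2, xs)  # increasing in x: the shape flips, so X and Z interact
```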

A few more points …

Initial weights: For the training of the model, we need to start with an initial set of values for the network weights.

Next time you want to train a model with same architecture and same data, the application will ask you whether to start with the weights already saved in Calc sheet.

Specifying your choice of starting weights is a bit non-trivial for this application. Here is how you do it.

This will just set up the Calc page without doing any training. Now go to the Calc sheet and write down your choice of weights in the appropriate places of the weight matrices.

Now come back to the UserInput sheet, specify the number of training cycles you want, and click on the Build Model button. When the application asks whether to use the already saved weights, click on the YES button. Now your network will be trained with the starting weights specified by you.

By varying only one predictor between two values and keeping all the others fixed at some pre-specified values, we get the profile plot.

Click the Create Profile button to generate the profile.

(1) Nature of the relationship between a particular predictor X and the response Y.

(2) Profile plots also let you study the interaction between predictors.

By default, the weights are initialized with random values between -w and w, where w is a number between 0 and 1, specified by you in the UserInput page. (A) Once you build a model, the final weights are stored in the Calc page.

If you say YES, these weights are used. If you say NO, the weights are re-initialized with random values. (B) Instead of starting with random weights, you may want to start with your own choice of weights.

Specify the inputs in the UserInput page and specify the number of training cycles as 0.
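The default initialization can be sketched as drawing every weight (including biases) uniformly from (-w, w); the function name and row layout here are illustrative:

```python
import random

def init_weights(n_in, n_out, w=0.7, seed=None):
    """One layer: n_out rows of [bias, w1, ..., wn], each drawn from (-w, w)."""
    rng = random.Random(seed)
    return [[rng.uniform(-w, w) for _ in range(n_in + 1)]
            for _ in range(n_out)]

layer = init_weights(2, 3, w=0.5)  # 3 neurons, each with a bias and 2 weights
```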


Page 5: NNpred

Cont. Var. | Cat. Var. | Values | Dummy | # Missing Value | # Levels

Labels

Page 6: NNpred

Save Network weights (options):
From very last cycle (1)
With least Training Error (2)
With least Validation Error (3)

Training / Validation Set (options):
Partition data into Training / Validation set (1)
Use whole data as training set (2)

Page 7: NNpred

Network Architecture    Options

2

2

0.7
Initial Wt Range ( 0 +/- w): w = 0.5

Training Options
13

Present Inputs in Random order while Training ? NO

Save Network weights With least Training Error

Training / Validation Set Partition data into Training / Validation set

If you want to partition, how do you want to select the Validation set? Please choose one option: 1
Option 1 : Randomly select
Option 2 : Use last
Please fill up the input necessary for the selected option

Save model in a separate workbook? NO

Number of Inputs ( between 2 and 50) Number of Outputs

Number of Hidden Layers ( 1 or 2 ) Hidden Layer sizes

Learning parameter (between 0 and 1)

Momentum (between 0 and 1)

Total #rows in your data ( Minimum 10 ) No. of Training cycles

Training Mode

Page 8: NNpred

1

Hidden 1: 6    Hidden 2: 3

Initial Wt Range ( 0 +/- w): w = 0.99

500

Sequential

Partition data into Training / Validation set

Option 1 : Randomly select 10%
Option 2 : Use last 6 rows of the data as validation set

Number of Outputs ( between 1 and 10 )

Hidden Layer sizes ( Maximum 20 )

No. of Training cycles ( Maximum 500 )

Training Mode (Batch or Sequential )

of data as Validation set (between 1% and 50%)
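The two partitioning options (randomly hold out a percentage, or use the last N rows) can be sketched as follows (hypothetical helper names; the workbook does this internally):

```python
import random

def split_random(rows, pct, seed=0):
    """Option 1: randomly hold out pct% of the rows as the validation set."""
    idx = list(range(len(rows)))
    random.Random(seed).shuffle(idx)
    k = max(1, round(len(rows) * pct / 100))
    held = set(idx[:k])
    train = [r for i, r in enumerate(rows) if i not in held]
    valid = [r for i, r in enumerate(rows) if i in held]
    return train, valid

def split_last(rows, n):
    """Option 2: use the last n rows of the data as the validation set."""
    return rows[:-n], rows[-n:]
```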

Page 9: NNpred

Enter your Data in this sheet
Instructions:

Make sure that row 104 is blank. Specify variable type in row 102.

For each continuous Input, there will be 1 neuron in Input Layer.

Var Type:  Cont   Cont   Output   Omit
Var Name:  X1     X2     Y

X1    X2    Y
0     1     2.65
1     -2    14.1
2     3     30.85
3     5     76.75
-1    -5    62.75
-2    6     88.4
-3    0     -10.5
-4    1     -11.35
5     2     28.1
3     10    275.5
1     2     14.1
-1    -10   261.5
5     5     83.75

Start Entering your data from cell AC105.

Cont - for continuous Input, Cat - for categorical Input, Output - for Output var, Omit - if you don't want to use the variable in the model.

For each categorical Input with K levels, there will be K neurons in the Input Layer.
Please make sure that there are no more than 50 neurons in the Input Layer.
There should be at most 10 Output variables - the application will treat them all as Continuous.
There should be no more than 40 Categorical Input Variables.
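The neuron-count rule (1 neuron per Cont input, K dummy neurons per K-level Cat input) can be checked with a small sketch (the helper name and example counts are made up):

```python
def input_layer_size(n_cont, cat_levels):
    """n_cont: number of Cont columns; cat_levels: one K per Cat column.
    Each Cat column contributes K dummy neurons to the Input Layer."""
    return n_cont + sum(cat_levels)

# e.g. 3 continuous inputs plus two categoricals with 4 and 2 levels
n = input_layer_size(3, [4, 2])
assert n <= 50, "no more than 50 neurons in the Input Layer"
```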

Page 10: NNpred

Specify variable name in row 103.


Page 22: NNpred

Neural Network Model for Prediction Created On : 6-Nov-07

MSE(Training) 154.621 MSE(Validation) 267.5323

Number of Hidden Layers: 2
Layer Sizes: 2 6 3 1

True Output (if available)           RMSE 9.6373
Model (Predicted) Output             9.6373
ABS( (True - Predicted) / True )     #DIV/0!

Cont  Cont
Bias  X1  X2

Raw Input 1
Bias  X1  X2

Transformed Input  1  0.4444  0.5000
Hdn1_bias  0.0000  0.0000  0.0000  0.0000
Hdn1_Nrn1  0.0524  2.5165  -0.4399  0.9509
Hdn1_Nrn2  1.2854  0.1413  1.9945  2.3454
Hdn1_Nrn3  -0.7819  -0.7233  -2.3057  -2.2563
Hdn1_Nrn4  2.2004  -0.9966  -10.6639  -3.5744
Hdn1_Nrn5  -0.1531  0.6052  -1.8043  -0.7863
Hdn1_Nrn6  -7.2927  0.2966  8.4806  -2.9206

1.0000  0.7213  0.9126  0.0948
Hdn2_bias  0.0000  0.0000  0.0000  0.0000
Hdn2_Nrn1  -0.1402  -1.2248  -0.7537  0.7570
Hdn2_Nrn2  -0.4941  0.5985  -1.1056  0.4482
Hdn2_Nrn3  0.3762  0.2046  1.1232  -0.2514

1.0000  0.1376  0.2649  0.7978
Op_bias  0.0000  0.0000  0.0000  0.0000
Op_Nrn1  0.0699  -0.6589  4.2641  -4.5722

1.0000 0.0732

Output - Predicted by the model

Enter your Inputs in the range AG115:AH115 - the cells marked in green.


Page 25: NNpred

EpochMSE (Original Scale) ARE (%) MSE (Original Scale) ARE (%)

1 9138.942 395.89% 3049.480 347.69%2 9110.262 404.66% 3148.870 353.08%3 9100.245 408.04% 3188.305 355.15%4 9096.318 409.40% 3204.333 355.99%5 9094.499 410.02% 3211.658 356.37%6 9093.431 410.37% 3215.788 356.58%7 9092.640 410.63% 3218.738 356.73%8 9091.952 410.84% 3221.247 356.86%9 9091.309 411.04% 3223.580 356.98%10 9090.687 411.24% 3225.835 357.10%11 9090.077 411.43% 3228.051 357.21%12 9089.476 411.61% 3230.236 357.33%13 9088.883 411.80% 3232.399 357.44%14 9088.300 411.98% 3234.539 357.55%15 9087.724 412.17% 3236.658 357.66%16 9087.155 412.35% 3238.756 357.76%17 9086.592 412.53% 3240.832 357.87%18 9086.038 412.70% 3242.886 357.98%19 9085.491 412.88% 3244.918 358.08%20 9084.951 413.05% 3246.928 358.18%21 9084.418 413.22% 3248.917 358.28%22 9083.891 413.39% 3250.884 358.38%23 9083.374 413.56% 3252.828 358.48%24 9082.859 413.72% 3254.750 358.58%25 9082.354 413.89% 3256.649 358.68%26 9081.854 414.05% 3258.527 358.77%27 9081.360 414.21% 3260.381 358.87%28 9080.876 414.37% 3262.214 358.96%29 9080.395 414.52% 3264.023 359.05%30 9079.920 414.68% 3265.810 359.14%31 9079.450 414.83% 3267.574 359.23%32 9078.989 414.98% 3269.317 359.32%33 9078.532 415.13% 3271.036 359.41%34 9078.082 415.28% 3272.732 359.49%35 9077.638 415.43% 3274.406 359.58%36 9077.199 415.57% 3276.058 359.66%37 9076.765 415.71% 3277.687 359.74%38 9076.337 415.85% 3279.294 359.82%39 9075.913 415.99% 3280.877 359.90%40 9075.496 416.13% 3282.439 359.98%41 9075.084 416.26% 3283.979 360.06%42 9074.677 416.40% 3285.497 360.13%43 9074.273 416.53% 3286.992 360.21%44 9073.875 416.66% 3288.466 360.28%45 9073.483 416.79% 3289.917 360.35%46 9073.095 416.91% 3291.347 360.42%47 9072.713 417.04% 3292.755 360.49%

Avg. error per Input (Original Scale)

(Training Set)

Avg. error per Input (Original Scale)

(Validation Set)

0 100 200 300 400 500 6000.000

1000.000

2000.000

3000.000

4000.000

5000.000

6000.000

7000.000

8000.000

9000.000

10000.000

MSE (Training)

Epoch

0 100 200 300 400 500 6000.000

500.0001000.0001500.0002000.0002500.0003000.0003500.0004000.000

MSE (Validation)

Epoch

Page 26: NNpred

48 9072.334 417.16% 3294.141 360.56%49 9071.960 417.28% 3295.507 360.63%50 9071.590 417.40% 3296.851 360.70%51 9071.225 417.52% 3298.174 360.76%52 9070.862 417.64% 3299.475 360.83%53 9070.505 417.75% 3300.756 360.89%54 9070.151 417.86% 3302.017 360.95%55 9069.804 417.98% 3303.256 361.02%56 9069.457 418.09% 3304.475 361.08%57 9069.117 418.19% 3305.674 361.13%58 9068.777 418.30% 3306.852 361.19%59 9068.446 418.41% 3308.012 361.25%60 9068.115 418.51% 3309.151 361.31%61 9067.786 418.61% 3310.270 361.36%62 9067.464 418.72% 3311.370 361.41%63 9067.143 418.82% 3312.451 361.47%64 9066.826 418.91% 3313.512 361.52%65 9066.513 419.01% 3314.556 361.57%66 9066.201 419.11% 3315.580 361.62%67 9065.895 419.20% 3316.585 361.67%68 9065.588 419.29% 3317.572 361.72%69 9065.288 419.39% 3318.541 361.77%70 9064.988 419.48% 3319.492 361.81%71 9064.691 419.56% 3320.425 361.86%72 9064.398 419.65% 3321.341 361.90%73 9064.108 419.74% 3322.240 361.94%74 9063.819 419.82% 3323.120 361.99%75 9063.534 419.91% 3323.985 362.03%76 9063.250 419.99% 3324.831 362.07%77 9062.970 420.07% 3325.662 362.11%78 9062.691 420.15% 3326.475 362.15%79 9062.415 420.23% 3327.272 362.19%80 9062.140 420.31% 3328.054 362.22%81 9061.866 420.38% 3328.819 362.26%82 9061.597 420.46% 3329.568 362.30%83 9061.328 420.53% 3330.302 362.33%84 9061.062 420.61% 3331.020 362.37%85 9060.798 420.68% 3331.724 362.40%86 9060.535 420.75% 3332.412 362.43%87 9060.274 420.82% 3333.084 362.46%88 9060.015 420.89% 3333.742 362.50%89 9059.757 420.96% 3334.386 362.53%90 9059.501 421.02% 3335.016 362.56%91 9059.246 421.09% 3335.630 362.58%92 9058.993 421.15% 3336.231 362.61%93 9058.741 421.22% 3336.819 362.64%94 9058.491 421.28% 3337.392 362.67%95 9058.242 421.34% 3337.952 362.69%96 9057.994 421.40% 3338.498 362.72%97 9057.747 421.46% 3339.030 362.74%98 9057.501 421.52% 3339.550 362.77%99 9057.257 421.58% 3340.056 362.79%

100 9057.014 421.64% 3340.550 362.81%101 9056.772 421.69% 3341.032 362.84%

Page 27: NNpred

102 9056.530 421.75% 3341.501 362.86%103 9056.290 421.80% 3341.958 362.88%104 9056.050 421.86% 3342.401 362.90%105 9055.812 421.91% 3342.833 362.92%106 9055.573 421.96% 3343.253 362.94%107 9055.335 422.01% 3343.661 362.95%108 9055.098 422.06% 3344.057 362.97%109 9054.861 422.11% 3344.442 362.99%110 9054.626 422.16% 3344.815 363.01%111 9054.391 422.21% 3345.177 363.02%112 9054.157 422.26% 3345.527 363.04%113 9053.923 422.30% 3345.867 363.05%114 9053.688 422.35% 3346.196 363.07%115 9053.455 422.40% 3346.514 363.08%116 9053.222 422.44% 3346.821 363.09%117 9052.989 422.48% 3347.117 363.11%118 9052.755 422.53% 3347.403 363.12%119 9052.522 422.57% 3347.679 363.13%120 9052.290 422.61% 3347.945 363.14%121 9052.059 422.65% 3348.198 363.15%122 9051.824 422.69% 3348.444 363.16%123 9051.593 422.73% 3348.679 363.17%124 9051.358 422.77% 3348.904 363.18%125 9051.126 422.81% 3349.120 363.19%126 9050.894 422.85% 3349.325 363.20%127 9050.660 422.88% 3349.521 363.20%128 9050.428 422.92% 3349.707 363.21%129 9050.194 422.96% 3349.884 363.22%130 9049.959 422.99% 3350.051 363.22%131 9049.725 423.03% 3350.209 363.23%132 9049.491 423.06% 3350.358 363.24%133 9049.255 423.09% 3350.498 363.24%134 9049.019 423.13% 3350.629 363.24%135 9048.782 423.16% 3350.750 363.25%136 9048.545 423.19% 3350.863 363.25%137 9048.308 423.22% 3350.968 363.25%138 9048.067 423.25% 3351.063 363.26%139 9047.830 423.28% 3351.150 363.26%140 9047.589 423.31% 3351.228 363.26%141 9047.348 423.34% 3351.297 363.26%142 9047.108 423.37% 3351.358 363.26%143 9046.865 423.40% 3351.410 363.26%144 9046.621 423.43% 3351.453 363.26%145 9046.376 423.45% 3351.489 363.26%146 9046.132 423.48% 3351.516 363.26%147 9045.884 423.51% 3351.536 363.26%148 9045.637 423.53% 3351.546 363.26%149 9045.388 423.56% 3351.549 363.25%150 9045.138 423.58% 3351.543 363.25%151 9044.887 423.61% 3351.529 363.25%152 9044.632 423.63% 3351.507 363.25%153 9044.379 423.65% 3351.478 363.24%154 9044.122 423.68% 3351.439 363.24%155 9043.864 423.70% 3351.394 363.23%

Page 28: NNpred

156 9043.605 423.72% 3351.341 363.23%157 9043.345 423.74% 3351.279 363.22%158 9043.082 423.76% 3351.210 363.22%159 9042.819 423.78% 3351.133 363.21%160 9042.554 423.80% 3351.047 363.20%161 9042.285 423.82% 3350.955 363.19%162 9042.017 423.84% 3350.854 363.19%163 9041.745 423.86% 3350.746 363.18%164 9041.473 423.88% 3350.629 363.17%165 9041.196 423.90% 3350.505 363.16%166 9040.918 423.91% 3350.374 363.15%167 9040.640 423.93% 3350.235 363.14%168 9040.357 423.95% 3350.088 363.13%169 9040.071 423.96% 3349.933 363.12%170 9039.786 423.98% 3349.770 363.11%171 9039.497 424.00% 3349.600 363.10%172 9039.205 424.01% 3349.423 363.09%173 9038.910 424.03% 3349.237 363.08%174 9038.613 424.04% 3349.044 363.07%175 9038.313 424.05% 3348.843 363.05%176 9038.012 424.07% 3348.634 363.04%177 9037.706 424.08% 3348.418 363.03%178 9037.397 424.09% 3348.194 363.01%179 9037.087 424.10% 3347.963 363.00%180 9036.772 424.12% 3347.724 362.98%181 9036.454 424.13% 3347.475 362.97%182 9036.134 424.14% 3347.221 362.95%183 9035.810 424.15% 3346.958 362.94%184 9035.483 424.16% 3346.687 362.92%185 9035.152 424.17% 3346.408 362.90%186 9034.816 424.18% 3346.122 362.89%187 9034.478 424.19% 3345.827 362.87%188 9034.135 424.20% 3345.525 362.85%189 9033.790 424.21% 3345.214 362.83%190 9033.440 424.21% 3344.896 362.81%191 9033.086 424.22% 3344.569 362.79%192 9032.728 424.23% 3344.234 362.78%193 9032.365 424.23% 3343.891 362.76%194 9031.998 424.24% 3343.540 362.73%195 9031.628 424.25% 3343.180 362.71%196 9031.253 424.25% 3342.813 362.69%197 9030.872 424.26% 3342.436 362.67%198 9030.486 424.26% 3342.051 362.65%199 9030.097 424.27% 3341.658 362.63%200 9029.700 424.27% 3341.257 362.60%201 9029.301 424.27% 3340.846 362.58%202 9028.894 424.28% 3340.426 362.56%203 9028.483 424.28% 3339.998 362.53%204 9028.066 424.28% 3339.561 362.51%205 9027.645 424.28% 3339.115 362.48%206 9027.217 424.28% 3338.660 362.45%207 9026.782 424.29% 3338.197 362.43%208 9026.341 424.29% 3337.722 362.40%209 9025.895 424.29% 3337.240 362.37%

Page 29: NNpred

210 9025.442 424.29% 3336.748 362.35%211 9024.983 424.29% 3336.246 362.32%212 9024.516 424.28% 3335.735 362.29%213 9024.044 424.28% 3335.214 362.26%214 9023.563 424.28% 3334.683 362.23%215 9023.076 424.28% 3334.143 362.20%216 9022.580 424.28% 3333.592 362.17%217 9022.078 424.27% 3333.030 362.14%218 9021.568 424.27% 3332.459 362.11%219 9021.049 424.26% 3331.877 362.07%220 9020.522 424.26% 3331.286 362.04%221 9019.987 424.25% 3330.682 362.01%222 9019.443 424.25% 3330.068 361.97%223 9018.892 424.24% 3329.442 361.94%224 9018.331 424.24% 3328.806 361.90%225 9017.759 424.23% 3328.158 361.87%226 9017.179 424.22% 3327.500 361.83%227 9016.588 424.21% 3326.829 361.79%228 9015.988 424.20% 3326.146 361.76%229 9015.378 424.19% 3325.451 361.72%230 9014.757 424.18% 3324.743 361.68%231 9014.125 424.17% 3324.024 361.64%232 9013.481 424.16% 3323.293 361.60%233 9012.827 424.15% 3322.548 361.56%234 9012.160 424.14% 3321.790 361.52%235 9011.482 424.13% 3321.018 361.47%236 9010.791 424.11% 3320.234 361.43%237 9010.089 424.10% 3319.435 361.39%238 9009.372 424.08% 3318.622 361.34%239 9008.642 424.07% 3317.795 361.30%240 9007.898 424.05% 3316.954 361.25%241 9007.142 424.04% 3316.097 361.20%242 9006.370 424.02% 3315.226 361.16%243 9005.584 424.00% 3314.339 361.11%244 9004.780 423.98% 3313.437 361.06%245 9003.963 423.96% 3312.518 361.01%246 9003.129 423.94% 3311.584 360.96%247 9002.277 423.92% 3310.632 360.91%248 9001.411 423.90% 3309.664 360.85%249 9000.524 423.88% 3308.679 360.80%250 8999.621 423.86% 3307.676 360.74%251 8998.699 423.83% 3306.654 360.69%252 8997.758 423.81% 3305.615 360.63%253 8996.798 423.78% 3304.557 360.58%254 8995.816 423.76% 3303.479 360.52%255 8994.813 423.73% 3302.382 360.46%256 8993.791 423.70% 3301.264 360.40%257 8992.746 423.67% 3300.126 360.34%258 8991.679 423.64% 3298.968 360.27%259 8990.586 423.61% 3297.787 360.21%260 8989.471 423.58% 3296.585 360.14%261 8988.329 423.54% 3295.360 360.08%262 8987.163 423.51% 3294.112 360.01%263 8985.972 423.47% 3292.841 359.94%

Page 30: NNpred

264 8984.751 423.44% 3291.544 359.87%265 8983.503 423.40% 3290.224 359.80%266 8982.226 423.36% 3288.878 359.73%267 8980.919 423.32% 3287.506 359.65%268 8979.582 423.28% 3286.107 359.58%269 8978.212 423.23% 3284.680 359.50%270 8976.810 423.19% 3283.227 359.42%271 8975.374 423.14% 3281.743 359.34%272 8973.902 423.10% 3280.231 359.26%273 8972.395 423.05% 3278.688 359.18%274 8970.848 423.00% 3277.113 359.09%275 8969.267 422.95% 3275.507 359.01%276 8967.642 422.89% 3273.868 358.92%277 8965.978 422.84% 3272.194 358.83%278 8964.270 422.78% 3270.487 358.74%279 8962.519 422.72% 3268.743 358.64%280 8960.721 422.66% 3266.961 358.55%281 8958.875 422.60% 3265.143 358.45%282 8956.981 422.53% 3263.287 358.35%283 8955.036 422.47% 3261.389 358.25%284 8953.040 422.40% 3259.451 358.15%285 8950.987 422.33% 3257.470 358.04%286 8948.880 422.25% 3255.444 357.93%287 8946.713 422.18% 3253.374 357.82%288 8944.485 422.10% 3251.257 357.71%289 8942.196 422.02% 3249.092 357.59%290 8939.840 421.93% 3246.878 357.47%291 8937.417 421.85% 3244.614 357.35%292 8934.925 421.76% 3242.295 357.23%293 8932.355 421.67% 3239.922 357.10%294 8929.713 421.57% 3237.495 356.97%295 8926.992 421.47% 3235.008 356.84%296 8924.188 421.37% 3232.462 356.70%297 8921.297 421.26% 3229.854 356.56%298 8918.319 421.16% 3227.181 356.42%299 8915.249 421.04% 3224.442 356.27%300 8912.081 420.93% 3221.635 356.12%301 8908.814 420.80% 3218.758 355.97%302 8905.443 420.68% 3215.808 355.81%303 8901.963 420.55% 3212.780 355.65%304 8898.368 420.41% 3209.676 355.48%305 8894.655 420.27% 3206.488 355.31%306 8890.819 420.13% 3203.217 355.14%307 8886.854 419.98% 3199.859 354.96%308 8882.755 419.82% 3196.410 354.77%309 8878.515 419.66% 3192.867 354.58%310 8874.128 419.49% 3189.226 354.39%311 8869.587 419.32% 3185.485 354.19%312 8864.884 419.14% 3181.637 353.98%313 8860.016 418.95% 3177.680 353.77%314 8854.970 418.76% 3173.610 353.55%315 8849.740 418.55% 3169.422 353.33%316 8844.316 418.34% 3165.111 353.10%317 8838.694 418.12% 3160.674 352.86%

Page 31: NNpred

318 8832.856 417.89% 3156.103 352.61%319 8826.798 417.66% 3151.394 352.36%320 8820.507 417.41% 3146.540 352.10%321 8813.969 417.15% 3141.538 351.83%322 8807.176 416.88% 3136.377 351.56%323 8800.112 416.60% 3131.055 351.27%324 8792.766 416.31% 3125.563 350.97%325 8785.120 416.00% 3119.893 350.67%326 8777.162 415.68% 3114.038 350.35%327 8768.873 415.35% 3107.991 350.03%328 8760.238 415.01% 3101.741 349.69%329 8751.236 414.64% 3095.280 349.34%330 8741.850 414.26% 3088.599 348.98%331 8732.061 413.87% 3081.688 348.61%332 8721.842 413.45% 3074.535 348.22%333 8711.173 413.01% 3067.132 347.82%334 8700.029 412.56% 3059.465 347.40%335 8688.386 412.08% 3051.521 346.97%336 8676.212 411.58% 3043.288 346.52%337 8663.483 411.06% 3034.752 346.06%338 8650.165 410.51% 3025.900 345.58%339 8636.226 409.93% 3016.715 345.07%340 8621.629 409.32% 3007.181 344.55%341 8606.345 408.68% 2997.283 344.01%342 8590.327 408.01% 2987.002 343.45%343 8573.539 407.31% 2976.320 342.86%344 8555.937 406.56% 2965.219 342.25%345 8537.476 405.78% 2953.677 341.61%346 8518.106 404.96% 2941.675 340.94%347 8497.784 404.09% 2929.189 340.25%348 8476.454 403.18% 2916.199 339.53%349 8454.062 402.22% 2902.680 338.77%350 8430.551 401.21% 2888.609 337.98%351 8405.864 400.14% 2873.963 337.16%352 8379.942 399.01% 2858.714 336.29%353 8352.720 397.83% 2842.839 335.39%354 8324.135 396.57% 2826.313 334.45%355 8294.124 395.25% 2809.111 333.47%356 8262.619 393.86% 2791.208 332.44%357 8229.551 392.39% 2772.580 331.36%358 8194.858 390.84% 2753.207 330.24%359 8158.470 389.21% 2733.065 329.06%360 8120.320 387.49% 2712.137 327.83%361 8080.348 385.68% 2690.406 326.55%362 8038.491 383.78% 2667.860 325.20%363 7994.686 381.78% 2644.488 323.80%364 7948.880 379.68% 2620.285 322.34%365 7901.017 377.47% 2595.250 320.81%366 7851.053 375.17% 2569.385 319.22%367 7798.941 372.75% 2542.701 317.57%368 7744.646 370.23% 2515.214 315.84%369 7688.131 367.60% 2486.943 314.06%370 7629.371 364.87% 2457.914 312.21%371 7568.343 362.03% 2428.158 310.29%

Page 32: NNpred

Per-epoch training log from the Output sheet (excerpt, epochs 372-500). Columns: Epoch, MSE (Training), ARE (Training), MSE (Validation), ARE (Validation).

Epoch   MSE (Training)   ARE (Training)   MSE (Validation)   ARE (Validation)
372     7505.028         359.08%          2397.716           308.30%
...
400     4726.832         252.41%          1506.646           242.47%
...
450     397.984          81.18%           434.761            130.12%
...
500     154.621          57.68%           267.532            99.80%

Page 33: NNpred
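The two error measures in the log can be computed as follows. This is a minimal sketch; the workbook does not spell out its formulas, so ARE is assumed here to be the mean absolute relative error expressed as a percentage:

```python
def mse(y_true, y_pred):
    """Mean squared error over the observations."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def are(y_true, y_pred):
    """Average relative error as a percentage (assumed definition)."""
    return 100.0 * sum(abs(t - p) / abs(t)
                       for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy data, not from the workbook: each prediction is off by 0.5.
y_true = [2.0, 4.0, 5.0]
y_pred = [2.5, 3.5, 5.5]
print(round(mse(y_true, y_pred), 4))  # 0.25
print(round(are(y_true, y_pred), 2))  # 15.83
```

Note that ARE as defined above is undefined when a true value is 0, which is one reason tools sometimes report it alongside MSE rather than alone.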


[Chart: MSE (Training) vs. Epoch; x-axis 0-600, y-axis 0-10000]

[Chart: MSE (Validation) vs. Epoch; x-axis 0-600, y-axis 0-4000]
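The Output sheet's two charts plot MSE against epoch for the training and validation sets. A minimal matplotlib sketch of the same pair of charts, using made-up per-epoch values in place of the workbook's log:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Hypothetical per-epoch values standing in for the Output-sheet columns.
epochs = list(range(1, 501))
mse_train = [10000.0 / e for e in epochs]
mse_valid = [4000.0 / e + 200.0 for e in epochs]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(epochs, mse_train)
ax1.set_title("MSE (Training)")
ax1.set_xlabel("Epoch")
ax2.plot(epochs, mse_valid)
ax2.set_title("MSE (Validation)")
ax2.set_xlabel("Epoch")
fig.savefig("mse_curves.png")
```

Watching the two curves together is the usual way to spot overfitting: training MSE keeps falling while validation MSE flattens or turns upward.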


Profile plot for the fitted model

Generate profile for an output by varying one predictor, keeping the other predictors fixed at the specified values.

Outputs: Y        Predictors: X1, X2

Profile table: X1 varied from 0 to 0.99 in steps of 0.01, with the predicted Y at each grid point (excerpt):

X1      Predicted Y
0.00    0.825191
0.01    0.838926
0.02    0.851391
...
0.31    0.953885
...
0.99    0.556349

Predicted Y peaks at about 0.954 near X1 = 0.31 and declines thereafter.

Side panel: Predictor X1; Fixed Value 0.692; Min / Max in Original Data (for user's reference only): Min -4.00, Max 5.00.


Profile plot for the fitted model

Generate profile for Y: 100 data points, X1 varied between -4 and 5,

keeping the other predictors fixed at the specified values

X2: 1.385; Min / Max in Original Data (for user's reference only): -10.00 / 10.00.
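The profile generation described above amounts to evaluating the fitted model on a grid of X1 values while holding X2 at its specified fixed value. A minimal sketch, using a made-up stand-in predictor since the real fitted network lives in the workbook's Calc sheet:

```python
import math

def fitted_model(x1, x2):
    """Stand-in for the fitted network; the actual weights are in the Calc sheet."""
    return 1.0 / (1.0 + math.exp(-(0.5 * x1 - 0.2 * x2)))

def profile(model, lo, hi, n_points, fixed_x2):
    """Vary X1 over [lo, hi) on an n-point grid, holding X2 fixed."""
    step = (hi - lo) / n_points
    grid = [lo + i * step for i in range(n_points)]
    return [(x1, model(x1, fixed_x2)) for x1 in grid]

# Values from the sheet: 100 data points, X1 between -4 and 5, X2 fixed at 1.385.
rows = profile(fitted_model, -4.0, 5.0, 100, 1.385)
```

Plotting `rows` gives the one-dimensional slice of the p-dimensional fitted surface that the profile sheet charts.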