
Research Article
Quality Prediction Model Based on Novel Elman Neural Network Ensemble

Hindawi Complexity, Volume 2019, Article ID 9852134, 11 pages. https://doi.org/10.1155/2019/9852134

Lan Xu 1,2 and Yuting Zhang 1

1 School of Economics and Management, Jiangsu University of Science and Technology, Zhenjiang 212003, China
2 Department of Mechanical and Manufacturing Engineering, University of Calgary, Calgary, Canada T2N 1N4

Correspondence should be addressed to Lan Xu; xulan1111@just.edu.cn

Received 24 January 2019; Revised 17 April 2019; Accepted 7 May 2019; Published 21 May 2019

Academic Editor: Danilo Comminiello

Copyright © 2019 Lan Xu and Yuting Zhang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

In this paper, we propose a novel prediction algorithm based on an improved Elman neural network (NN) ensemble for quality prediction, thus achieving quality control of designed products at the product design stage. First, the Elman NN parameters are optimized using the grasshopper optimization (GRO) method, and then the weighted average method is improved to combine the outputs of the individual NNs, where the weights are determined by the training errors. Simulations were conducted to compare the proposed method with other NN methods and evaluate its performance. The results demonstrated that the proposed algorithm obtained better prediction accuracy than the other NN methods. In summary, the Elman NN is combined with GRO to yield an optimized Elman network ensemble model with high generalization ability and prediction accuracy for quality prediction during product design.

1. Introduction

During product design, once the scheme design is complete, designers expect to obtain a reliable estimation of the overall characteristic index of the product system for the purpose of further adjusting the designing parameters and improving overall performance. For large complex product systems, there are many designing parameters, and complex relationships, which are nonlinear and strongly coupled, often exist between these parameters and the product's overall quality characteristics. Therefore, establishing an organic link between designing parameters and a product's overall quality characteristics is an essential problem that needs to be solved during product design.

Once the designing parameters are determined, designers start to conduct a detailed technical design according to corresponding technical codes, in addition to their experience. During this stage, differences caused by different designers and technicians only appear in part of the product and do not have a substantial influence on the quality characteristics of the entire product system. Therefore, there exists a certain correspondence between the product's quality characteristic index and the various designing parameters determined in the scheme's design phase. However, this type of correspondence is difficult to express using an exact explicit function. For some small and simple products, the overall characteristics are determined accurately with the aid of approaches such as the finite element and modal synthesis methods. The overall characteristics can even be obtained by creating models and conducting relative experiments.

Many quality prediction techniques are available in the literature, such as the Bayesian model [1], support vector machine [2], neural network (NN) [3–6], state-dependent parameter model [7], and deep learning techniques [8]. Yu et al. [1] proposed a Bayesian model-based multi-kernel Gaussian process regression approach to predict the quality of nonlinear batch processes with multiple operating phases. Bidar et al. [7] proposed a data-driven soft sensor approach for online quality prediction, which uses state-dependent parameter models. Through a comparison with other methods, the proposed model is much more robust and reliable with fewer model parameters, which makes it useful for industrial applications.



In view of quality prediction in manufacturing, Bai et al. [9] conducted a comparative study of intelligent learning approaches. Two categories of intelligent learning approaches, that is, shallow learning and deep learning, were investigated and compared for manufacturing quality prediction. The authors' experiments showed that the deep framework outperformed the shallow architecture in terms of the root-mean-square error (RMSE), threshold statistics, and mean absolute percentage error. Zheng et al. [10] proposed a new phase adaptive relevance vector machine (RVM) model for quality prediction in multiphase batch processes. Based on the information transfer of relevance vectors in each RVM model, different phases are connected one after another, which provides simultaneous information for the prediction of final product quality. To predict the quality of a complex production process, Zhang et al. [11] proposed a multimodel modeling approach based on fuzzy C-means clustering and support vector regression to solve the problems of a nonlinear wide operating condition range and difficult prediction. Svalina et al. [12] provided models based on moving least-squares and moving least absolute deviations methods to predict machined surface quality. Comparisons with NNs were also conducted to show that the total mean square deviation in the proposed models was considerably higher than that in the application of the NN method. To address issues of the real-time prediction of final product quality during batch operation, Wang [13] presented a robust data-driven modeling approach using available process information up to the current points to capture their time-varying relationships with final product quality during the course of operation. Chu et al. [14] proposed an improved JYKPLS (Joint-Y kernel partial least squares) process transfer model to solve the issue of final quality prediction for new batch processes. To cope with the difficulty of online quality prediction for multigrade processes, Liu et al. [15] proposed a just-in-time latent variable modeling method based on extracting the common and special features of multigrade processes. Garcia et al. [16] applied regression models to estimate physical quality indices in a tube extrusion process. Park et al. [17] developed a system for predicting printed part quality during the SLM (selective laser melting) process by simulation. Zhu et al. [18] presented an improved analytical variation propagation model for a multistage machining process based on a geometric constraint equation. Their study provided a method to predict part deviation under the influence of the fixture error, datum error, and machining error. Additionally, there are many studies on water quality prediction. Quality control is a process through which one seeks to ensure that product quality is maintained or improved with either reduced or zero errors. In this study, we address "quality control" at the stage of product design. The quality of the final product is a dependent variable that is determined by some independent input variables. We explore the relationship between the independent input variables and the dependent variable (product quality). In this sense, although the cases of air and water quality control are not fields clearly related to the case studied in this paper, the logic of quality control is similar, that is, the control of final quality through the control of the independent input variables.

Final quality is predicted through establishing the relationship between input variables and quality characteristics. Liu et al. [2] proposed a water quality prediction model based on support vector regression. A hybrid approach known as real-value genetic algorithm support vector regression was presented and applied to predict the aquaculture water quality collected from the aquatic factories of Yixing in China. Li et al. [8] proposed a novel spatiotemporal deep learning-based air quality prediction method that inherently considers spatial and temporal correlations. Statistical models are widely used in the prediction of water quality. Avila et al. [19] compared the performance of a wide range of statistical models, including a naïve model, multiple linear regression, dynamic regression, regression tree, Markov chain, classification tree, random forest, multinomial logistic regression, and Bayesian network, in the prediction of water quality for the weekly data collected over the summer months from 2006 to 2014 from the Oreti River in Wallacetown in New Zealand. Their results demonstrated that the Bayesian network was superior to all other models.

However, for large complex products (systems), all these methods are difficult to apply. Our purpose is to propose a novel prediction algorithm based on an improved Elman NN ensemble for quality prediction. The contributions of this paper include the following:

(1) A novel quality prediction approach based on the Elman NN ensemble is proposed. The prediction model helps to establish an organic link between designing parameters and a product's overall quality characteristics, which is essential in product design.

(2) A novel grasshopper optimization- (GRO-) based Elman NN ensemble model is designed to improve the generalization ability and effectiveness for quality prediction. First, the GRO algorithm is used to enhance the performance of an individual Elman NN with respect to the required quality prediction. Then, while yielding an NN ensemble, the improved weighted average is used to combine the outputs of the different individual networks, in which the weights are determined by the training errors of the individual NNs.

The remainder of this paper is structured as follows. In Section 2, we present a literature review, which provides the basis of our research. In Section 3, we present the quality prediction model based on novel Elman NN ensembles. We conduct simulations of our proposed approach, in addition to comparisons with other algorithms, in Section 4. In Section 5, we provide a conclusion.

2. Literature Review

The artificial neural network (ANN) has been one of the main techniques for quality prediction, and NNs have been applied to various areas for prediction. Xu and Liu [6] combined the wavelet transform with the BPNN to establish a short-term wavelet NN water quality prediction model.


Najah et al. [20] investigated different artificial intelligence techniques in water quality prediction, including multilayer perceptron NNs, ensemble NNs, and the support vector machine, to develop a computationally efficient and robust approach for predicting water quality. Han et al. [5] presented a flexible-structure radial basis function NN (FS-RBFNN) and applied it to water quality prediction. To build an ANN with a self-organizing architecture and a suitable learning algorithm for nonlinear system modeling, Han et al. [21] developed an automatic axon-NN, which can self-organize the architecture and weights, thus improving network performance. Sheoran et al. [22] proposed a hybrid method for software quality prediction, which used an advanced NN that incorporated a hybrid cuckoo search optimization algorithm for better prediction accuracy. Russo et al. [4] applied optimal ANNs to air quality forecasting. Their proposed model used methods in stochastic data analysis to derive a set of a few stochastic variables that represent relevant information about a multivariate stochastic system and are used as input for NNs. Liu et al. [3] presented a SNCCDBAGG-based NN ensemble approach for quality prediction in the injection molding process, in which bagging is used to create NNs for the ensemble and negative correlation learning via correlation-corrected data (NCCD) is used to achieve negative correlation of each network's error against the errors of the remainder of the ensemble. Ma et al. [23] presented a new outsourcing model for privacy-preserving neural network prediction under a two-noncolluding-servers framework, and proposed a fully noninteractive privacy-preserving neural network prediction scheme. Olawoyin [24] used a BP ANN as a prediction tool to study the potential toxicity of PAH carcinogens in soils. Liang et al. [25] proposed a neural network prediction control model based on an improved rolling optimization algorithm; the control model realizes advanced prediction to achieve intelligent control of unbalanced drilling's underpressure value and performs fast and stable self-feedback control of the output prediction results. Gu et al. [26] used an improved BP neural network based on the GA algorithm to develop a yield-irrigation prediction model for a subsurface drip irrigation system. Bardak et al. [27] presented an application of ANN to predict wood bonding quality based on pressing conditions. Heidari et al. [28] optimized the multilayer perceptron NN using the GRO algorithm, which was applied to many popular datasets.

As seen from the above review of related work, many studies have considered the problem of quality prediction. There are many studies on water quality prediction, air quality prediction, and quality prediction in manufacturing. However, publications on quality prediction during the stage of product design are scarce; modeling the relationships between designing parameters and a product's overall quality characteristics has been ignored. Considering the NN's ability to simulate the complex input and output relationship of a real system and its powerful ability in nonlinear modeling, a quality prediction model based on an NN is built in this study. To improve the prediction accuracy, a novel prediction algorithm based on the Elman ensemble is proposed. First, the Elman NN parameters are optimized using the GRO algorithm, and then the weighted average method is improved to combine the outputs of the individual NNs, where the weights are determined by the training errors.

3. Quality Prediction Model Based on Elman Neural Network Ensembles

3.1. Elman Neural Network. The Elman NN is a type of locally recurrent network, which is considered as a special type of feedforward NN with additional memory neurons and local feedback. Thus, the BP algorithm, which is typically used for feedforward NN training, can be used to train the Elman network.

As shown in Figure 1, the Elman NN consists of the context layer, input layer, hidden layer, and output layer [29–31]. W_1 denotes the weight from the context layer to the hidden layer, W_2 denotes the weight from the input layer to the hidden layer, and W_3 denotes the weight from the hidden layer to the output layer. U(t-1) denotes the network input vector at the (t-1)th iteration, V(t) denotes the hidden layer output vector at the tth iteration, and Z(t) denotes the network output vector at the tth iteration. The context layer retains the hidden layer output vector from the previous iteration; that is, V_C(t) denotes the context layer output vector at the tth iteration, and its value equals the hidden layer output vector at the (t-1)th iteration. The thresholds of the hidden layer unit and output layer are \theta_j and \theta_k, respectively. Suppose the transfer functions of the hidden layer and output layer units are g(\cdot) and h(\cdot), respectively; then the transfer relations of the Elman network are expressed as

V(t) = g(W_1 V_C(t) + W_2 U(t-1) - \theta_j)   (1)

where V_C(t) = V(t-1). The output of the Elman NN is represented as

Z(t) = h(W_3 V(t) - \theta_k)   (2)
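For concreteness, the transfer relations (1) and (2) can be sketched in a few lines of NumPy; the choice of tanh for g(\cdot) and an identity output for h(\cdot) is an illustrative assumption, since the paper does not fix the transfer functions, and the class name is ours.

```python
import numpy as np

class ElmanNN:
    """Minimal Elman network implementing the transfer relations (1) and (2)."""

    def __init__(self, n_in, n_hid, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.uniform(-1, 1, (n_hid, n_hid))   # context -> hidden weights W_1
        self.W2 = rng.uniform(-1, 1, (n_hid, n_in))    # input   -> hidden weights W_2
        self.W3 = rng.uniform(-1, 1, (n_out, n_hid))   # hidden  -> output weights W_3
        self.theta_j = np.zeros(n_hid)                 # hidden-layer thresholds
        self.theta_k = np.zeros(n_out)                 # output-layer thresholds
        self.Vc = np.zeros(n_hid)                      # context output V_C(t) = V(t-1)

    def step(self, u):
        """Feed the input vector U(t-1) and return the network output Z(t)."""
        v = np.tanh(self.W1 @ self.Vc + self.W2 @ u - self.theta_j)  # eq. (1), g = tanh (assumed)
        z = self.W3 @ v - self.theta_k                                # eq. (2), h = identity (assumed)
        self.Vc = v                                                   # context layer stores V(t) for the next step
        return z

# Example: the 7-3-3 configuration used later in the simulation study
net = ElmanNN(n_in=7, n_hid=3, n_out=3)
print(net.step(np.zeros(7)))
```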

The Elman NN is one of the most widely used and most effective NN models among ANNs and has powerful processing ability for nonlinear decisions [30]. It can be considered as a special kind of feedforward NN with additional memory neurons and local feedback. Because its learning efficiency, approximation ability, and memory ability are better than those of other neural networks, the Elman NN can be used not only in time series prediction but also in system identification and prediction [32, 33]. Therefore, the Elman NN is chosen for quality prediction in our paper.

However, the Elman NN also has some disadvantages, such as a low convergence rate, easily becoming trapped at a local minimum, and a lack of theory to determine the initial weights and thresholds of the network. Generally, optimization methods can overcome the deficiencies of NNs. Additionally, the NN ensemble is a technique that can significantly improve the generalization ability of NNs through training a number of NNs and then combining them [31]. The most common combination methods for the NN ensemble model are simple averaging and weighted averaging for regression problems [34]. In this study, the GRO method is used to make up for the deficiency of the Elman NN, and the input weights and output layer thresholds are optimized. Then, a novel method based on the training error is established to construct an ensemble model.


Figure 1: Elman neural network model.

3.2. Grasshopper Optimization Algorithm. The GRO algorithm was proposed by Saremi et al. [34]. The main characteristics of a swarm of grasshoppers in the larval phase are slow movement and small steps. The mathematical model used to simulate the swarming behavior of grasshoppers, which can provide random behavior, is presented as follows:

X_i = r_1 S_i + r_2 G_i + r_3 A_i   (3)

where X_i denotes the position of the ith grasshopper, S_i denotes social interaction, G_i denotes the gravity force on the ith grasshopper, A_i denotes the wind advection, and r_1, r_2, and r_3 are random numbers in [0, 1].

The mathematical model (3) cannot be used directly to solve optimization problems, mainly because the grasshoppers quickly reach the comfort zone and the swarm does not converge to a specified point. A modified version of this equation is proposed as follows to solve optimization problems:

X_i^d = c ( \sum_{j=1, j \neq i}^{N} c \frac{u_d - l_d}{2} s(d_{ij}) \hat{d}_{ij} ) + T_d   (4)

where u_d denotes the upper bound in the dth dimension, l_d denotes the lower bound in the dth dimension, T_d denotes the value of the dth dimension in the target (the best solution found so far), and c denotes a decreasing coefficient to shrink the comfort zone, repulsion zone, and attraction zone. Here, d_{ij} is the distance between the ith and jth grasshoppers, \hat{d}_{ij} = (x_j - x_i)/d_{ij} is the corresponding unit vector, and s(\cdot) is the social-force function defined in [34]. Equation (4) shows that the next position of a grasshopper is defined based on its current position, the position of the target, and the positions of all other grasshoppers. Note that the first component of this equation considers the location of the current grasshopper with respect to the other grasshoppers.

The pseudocode of the GRO algorithm is as follows [34]:

Initialize the swarm X_i (i = 1, 2, ..., N)
Initialize c_max, c_min, and the maximum number of iterations L
Calculate the fitness f of each search agent
X_best = the best search agent
while (l < L)
    Update c
    for each search agent
        Normalize the distances between grasshoppers
        Update the position of the current search agent using (4)
        Bring the current agent back if it goes outside the boundaries
    end for
    Update X_best if there is a better solution: X_best = X_i^d if f(X_best) > f(X_i^d)
    l = l + 1
end while
Return X_best
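The position update in (4) together with the boundary handling in the pseudocode can be sketched as follows. This is a minimal NumPy illustration: the social-force constants (f = 0.5, l = 1.5) and the mapping of inter-grasshopper distances into [2, 4) follow the reference formulation and code of [34] and are assumptions about this paper's implementation, not details stated in it.

```python
import numpy as np

def s(r, f=0.5, l=1.5):
    """Social-force function s(r) = f*exp(-r/l) - exp(-r), as defined in [34]."""
    return f * np.exp(-r / l) - np.exp(-r)

def gro_step(swarm, target, c, lb, ub):
    """One swarm update according to equation (4).

    swarm  -- (N, D) array of grasshopper positions
    target -- (D,) best solution found so far (T_d)
    c      -- decreasing comfort-zone coefficient
    lb, ub -- (D,) lower and upper bounds (l_d, u_d)
    """
    n, dim = swarm.shape
    new_swarm = np.empty_like(swarm)
    for i in range(n):
        social = np.zeros(dim)
        for j in range(n):
            if j == i:
                continue
            diff = swarm[j] - swarm[i]
            dist = np.linalg.norm(diff)
            unit = diff / (dist + 1e-12)      # unit vector \hat{d}_{ij}
            r = 2.0 + dist % 2.0              # distance normalized into [2, 4), as in the reference code of [34]
            social += c * (ub - lb) / 2.0 * s(r) * unit
        # eq. (4) plus the boundary handling from the pseudocode
        new_swarm[i] = np.clip(c * social + target, lb, ub)
    return new_swarm

# The comfort-zone coefficient shrinks linearly over the L iterations in [34]:
# c = c_max - l * (c_max - c_min) / L, with c_max = 1 and c_min = 1e-5.
```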

3.3. Improved Elman NN. In this study, the GRO algorithm is used to support NN training, as shown in Figure 2. In the Elman-GRO combination, the NN manages the nonlinearity, whereas GRO is used to optimize the NN parameters for an accurate fit.


Figure 2: Elman NN optimization process using GRO (initialize parameters → feed input data → Elman NN training → check convergence; if not converged, apply GRO to adjust the parameters; if converged, use the GRO-Elman NN with the optimized parameters for testing).

GRO optimizes a problem using individual grasshoppers, which move around in the exploration space according to a simple mathematical model. A grasshopper position represents one of the solutions for determining the input weights and output thresholds of an Elman NN. The length of the grasshopper position equals the dimensionality of the input weights and output layer thresholds. The objective of the GRO is to determine an input weight and output threshold vector that can result in the minimum RMSE from the NN.
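As a concrete illustration of this encoding (using the 7-3-3 network and the 21 input weights plus 3 output-layer thresholds reported in Section 4; the helper names are ours, not the authors'):

```python
import numpy as np

N_IN, N_HID, N_OUT = 7, 3, 3            # network sizes used in the simulation study
DIM = N_IN * N_HID + N_OUT              # 21 input weights + 3 output-layer thresholds = 24

def unpack(position):
    """Split one grasshopper position vector into the optimized Elman parameters."""
    W2 = position[:N_IN * N_HID].reshape(N_HID, N_IN)   # input -> hidden weights
    theta_k = position[N_IN * N_HID:]                   # output-layer thresholds
    return W2, theta_k

def pack(W2, theta_k):
    """Flatten the optimized parameters back into a position vector."""
    return np.concatenate([W2.ravel(), theta_k])

# A random candidate solution with the correct dimensionality
candidate = np.random.uniform(-1.0, 1.0, DIM)
W2, theta_k = unpack(candidate)
```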

The RMSE is designed as the objective function; thus, the GRO is engaged to minimize the RMSE during the training phase. The RMSE measures the deviation of the predicted values from the true values. The RMSE of the prediction with respect to the computed variable v_{ck} is determined as the square root of the mean-squared error and is given by

RMSE = \sqrt{ \sum_{k=1}^{n} (v_{dk} - v_{ck})^2 / n }   (5)

where v_{dk} denotes the originally observed value of the kth data instance and v_{ck} denotes the value predicted by the NN.

Convergence is checked after each iteration of GRO instead of after each epoch. The objective of the optimization is to determine an input weight and output threshold vector that can meet the requirement on the RMSE, which is written as

RMSE < RMSE_T   (6)

where RMSE_T is the desired threshold. As shown in Figure 2, if the convergence condition is met, the optimization stops.
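A minimal sketch of the fitness computation in (5) and the stopping test in (6); the threshold value used here is an arbitrary placeholder, not a value from the paper.

```python
import numpy as np

def rmse(v_d, v_c):
    """Equation (5): RMSE between observed values v_dk and predicted values v_ck."""
    v_d, v_c = np.asarray(v_d, dtype=float), np.asarray(v_c, dtype=float)
    return np.sqrt(np.mean((v_d - v_c) ** 2))

def converged(v_d, v_c, rmse_t=0.02):
    """Equation (6): stop the GRO loop once the training RMSE drops below RMSE_T."""
    return rmse(v_d, v_c) < rmse_t
```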

3.4. NN Ensemble Model for Quality Prediction. An ANN ensemble is a composite model of multiple NNs with better generalization ability and stability than a single network. When using the improved Elman NN to construct the NN ensemble, first the individual NNs are produced, and then the outputs of the individual NNs are combined.

Bagging is a statistical resampling technique for generating training data for each individual in an ensemble model [35, 36]. Each training dataset is drawn randomly with replacement from the original training set. Additionally, the size of each training set is typically the same as that of the original training set. The bagging technique increases the difference between individuals using bootstrap sampling and improves the generalization ability of the NN ensemble.


Figure 3: Layout of the main beam's web members.

Then, the outputs of the individual networks are weighted to form the overall output O of the ensemble:

O(I, a) = \sum_{i=1}^{Z} a_i O_i(I_i)   (7)

where I_i is the input vector, a_i is the weight applied to the output of the ith individual network, and Z is the number of NNs in the ensemble model.

In this algorithm, a_i is designed according to the training error e_i of each NN:

a_i = (\mathrm{sum}(E) - e_i) / \mathrm{sum}(E)   (8)

where E = [e_1, e_2, ..., e_Z] and \mathrm{sum}(\cdot) is the summation function of a vector.
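The bagging resampling and the training-error-based weighting in (7) and (8) can be sketched as follows; the member interface (one prediction function per trained network) is an illustrative assumption rather than the paper's implementation.

```python
import numpy as np

def bagging_sets(X, Y, n_members, seed=0):
    """Draw one bootstrap training set per ensemble member (same size as the original set)."""
    rng = np.random.default_rng(seed)
    for _ in range(n_members):
        idx = rng.integers(0, len(X), size=len(X))      # sampling with replacement
        yield X[idx], Y[idx]

def ensemble_weights(train_errors):
    """Equation (8): a_i = (sum(E) - e_i) / sum(E); a smaller training error gives a larger weight."""
    e = np.asarray(train_errors, dtype=float)
    return (e.sum() - e) / e.sum()

def ensemble_output(predict_fns, weights, x):
    """Equation (7): weighted combination of the individual network outputs O_i(I_i)."""
    outputs = np.array([predict(x) for predict in predict_fns])   # shape (Z, n_out)
    return weights @ outputs
```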

4. Simulation of Quality Prediction and Analysis

4.1. Simulation. To verify the effectiveness of the proposed approach, computer simulations were conducted. A 10 m diameter circular paraboloid antenna is considered as an example in this section. The quality prediction model was established on the basis of relationships between the designing parameters and the quality characteristic indices.

A circular paraboloid antenna is a structure with high precision. It is widely used in areas such as aeronautics and astronautics, radar technique, and satellite communication. The requirements of its quality characteristics include reflector accuracy, the self-weight deformation requirement, and the homologous design requirement. As shown in Figure 3, the section characteristics of beams (1), (2), (3), (6), and (7) and the heights of beams (4) and (8) greatly affect the overall quality of the antenna. h_A and h_B were introduced to denote the ordinates of point 6 and point 8, respectively, which are the heights of beams (4) and (8). Accordingly, the designing parameters for the quality design of the circular paraboloid antenna are determined as the section characteristics of beams (1), (2), (3), (6), and (7), h_A, and h_B, denoted by x_1, x_2, x_3, x_4, x_5, x_6, x_7, respectively. (x_1, x_2, x_3, x_4, x_5, x_6, x_7) is identified as the input vector of the quality prediction model to be established. Simultaneously, the indices of the quality characteristics of the circular paraboloid antenna are identified as the mean square error (MSE), dead weight, and fundamental frequency, denoted by y_1, y_2, y_3, respectively, which form the output vector of the model.

The fractional factorial design of 2^{k-p} was selected to conduct the experiments [37].

In the simulation, k = 7 and p = 2, and the 32 sets of data were divided into two parts, a training set and a testing set, which consisted of 25 and 7 samples, respectively. There were 7 input nodes, 3 hidden nodes, 5 context nodes, and 3 output nodes in the Elman NN. Therefore, 21 input weights and 3 output layer thresholds were optimized.


Figure 4: Prediction of y1 at the seven testing points (curves: Elman, Grasshopper-Elman, Ensemble1, proposed ensemble method, and true value).

During NN training, the epoch size was set to 1000. The grasshopper number (size of the swarm) N and the iteration number L were set to 200 and 200, respectively. Four Elman NNs were used to build the ensemble model. Due to the characteristics of artificial intelligence optimization and the bagging technique for the ensemble, the 4 Elman NNs are designed to be different from each other. Although only 25 sets of data are used in the simulation, they are employed to train the Elman NN 4 times, which is approximately equivalent to using 100 sets of training data in our simulation. Owing to the excellent performance of the Elman NN, the small-sample training problem was also handled in [38], which used 60 sets of data to train the Elman NN and obtained high detection accuracy.

Three levels were determined for each designing parameter. Detailed setting values are shown in Table 1. The outer diameters of the steel pipes of beams (1) and (2) were set to 50 mm, and the thickness varied from 2 mm to 4 mm; a thickness of 3 mm was set to be level 0, whereas 2 mm and 4 mm were set to levels -1 and 1, respectively. The outer diameters of beams (3) and (7) were set to 65 mm, and the thickness varied from 3 mm to 5 mm; a thickness of 4 mm was set to be level 0, whereas 3 mm and 5 mm were set to levels -1 and 1, respectively. The outer diameter of beam (6) was set to 40 mm, and the thickness varied from 1 mm to 3 mm; a thickness of 2 mm was set to be level 0, whereas 1 mm and 3 mm were set to levels -1 and 1, respectively. The test data acquired through the experiments included the combinations of designing parameters from the 32 tests and the corresponding values of the indices of the quality characteristics.

Table 1: Levels of designing parameters and the corresponding actual values (unit: mm).

Designing parameters | Level -1 | Level 0 | Level 1
x_1 - (1)            | φ50 × 2  | φ50 × 3 | φ50 × 4
x_2 - (2)            | φ50 × 2  | φ50 × 3 | φ50 × 4
x_3 - (3)            | φ65 × 3  | φ65 × 4 | φ65 × 5
x_4 - (6)            | φ40 × 1  | φ40 × 2 | φ40 × 3
x_5 - (7)            | φ65 × 3  | φ65 × 4 | φ65 × 5
x_6 - h_A            | 1110     | 710     | 310
x_7 - h_B            | 1990     | 1590    | 1190
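To make the coding explicit, the mapping from the coded levels in Table 1 to the actual parameter values can be sketched as below; the dictionary layout and helper name are ours, and sections are written as (outer diameter, wall thickness) in mm.

```python
# Coded factor levels (-1, 0, 1) and the corresponding actual values from Table 1.
LEVELS = {
    "x1": {-1: (50, 2), 0: (50, 3), 1: (50, 4)},   # beam (1)
    "x2": {-1: (50, 2), 0: (50, 3), 1: (50, 4)},   # beam (2)
    "x3": {-1: (65, 3), 0: (65, 4), 1: (65, 5)},   # beam (3)
    "x4": {-1: (40, 1), 0: (40, 2), 1: (40, 3)},   # beam (6)
    "x5": {-1: (65, 3), 0: (65, 4), 1: (65, 5)},   # beam (7)
    "x6": {-1: 1110, 0: 710, 1: 310},              # h_A (mm)
    "x7": {-1: 1990, 0: 1590, 1: 1190},            # h_B (mm)
}

def decode(run):
    """Translate one coded design-matrix row, e.g. {'x1': -1, ...}, into actual values."""
    return {factor: LEVELS[factor][level] for factor, level in run.items()}
```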

Part of the data (80% of the samples) was used to train the NN, whereas the remainder (20% of the samples) was used to verify the efficiency and generalization ability of the proposed model. Comparisons were conducted with other methods, including the Elman NN method, the GRO-based Elman method, and the weighted ensemble method [35]. The prediction results based on the NN methods are shown in Figures 4–9. Table 2 shows the average RMSE of all the methods, which demonstrates the superiority of the proposed ensembled Elman NN model. With the same population size and iteration number, we also compared the convergence performance of GRO with the popular GOA [39] in one optimization process, as shown in Figure 10. It shows that GRO had better convergence performance, which can provide more advantages in optimizing the Elman NN.


Table 2: Average prediction RMSE.

Methods                  | y1     | y2     | y3
Elman                    | 0.1658 | 0.1223 | 0.4748
Grasshopper-Elman        | 0.0712 | 0.0697 | 0.3472
Ensemble1                | 0.0367 | 0.0365 | 0.2468
Proposed ensemble method | 0.0214 | 0.0234 | 0.1729

Figure 5: Prediction RMSE of y1 for the Elman, Grasshopper-Elman, Ensemble1, and proposed ensemble methods at the seven testing points.

Figure 6: Prediction of y2 at the seven testing points (curves: Elman, Grasshopper-Elman, Ensemble1, proposed ensemble method, and true value).

4.2. Analysis and Discussion. We established that the single Elman method had the worst performance compared with the other methods.

Figure 7: Prediction RMSE of y2 for the four methods at the seven testing points.

Figure 8: Prediction of y3 at the seven testing points (curves: Elman, Grasshopper-Elman, Ensemble1, proposed ensemble method, and true value).

Meanwhile, the proposed GRO-based Elman method achieved better performance than the others. The simulation results also established that the proposed ensemble model outperformed all the other methods.

An ANN ensemble has better generalization ability and stability than a single network. In this study, we combined the Elman NN with GRO to yield an optimized Elman network ensemble model. The Elman NN supported by GRO proved its efficiency for prediction. Using GRO to enhance the performance of the Elman NN to achieve the required prediction is the innovation of the current study.


Figure 9: Prediction RMSE of y3 for the four methods at the seven testing points.

Figure 10: Convergence performance comparison of GRO and GOA (objective function value versus iteration time).

The improved weighted average was used to combine the outputs of the individual networks, and the weights were determined by the training errors of the individual NNs, which improved the generalization ability and effectiveness of the model. However, it required a long time to optimize and ensemble the Elman NN. Therefore, the proposed method can only be used in the offline prediction field or in fields that do not have a strict runtime requirement.

5. Conclusion

In this paper, a novel prediction algorithm based on the Elman NN ensemble model was proposed for quality prediction. The Elman NN parameters were optimized using the GRO method, and then a novel NN ensemble model was designed for quality prediction. The quality prediction model for a circular paraboloid antenna was considered as an example to verify the proposed algorithm. An Elman NN ensemble model that described the correlation between seven designing parameters and three indices of quality characteristics was built. The simulation results proved that the proposed algorithm obtained better prediction accuracy.

Data Availability

The simulation data used to support the findings of this study are from reference [37].

Conflicts of Interest

No potential conflict of interest was reported by the authors.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant number 71403109, the Humanities and Social Science Foundation of Jiangsu Province under Grant number 16JYC002, and the Humanities and Social Science Foundation of the Ministry of Education under Grant number 19YJA880068.


References

[1] J. Yu, K. Chen, and M. M. Rashid, "A Bayesian model averaging based multi-kernel Gaussian process regression framework for nonlinear state estimation and quality prediction of multiphase batch processes with transient dynamics and uncertainty," Chemical Engineering Science, vol. 93, no. 4, pp. 96–109, 2013.
[2] S. Liu, H. Tai, Q. Ding, D. Li, L. Xu, and Y. Wei, "A hybrid approach of support vector regression with genetic algorithm optimization for aquaculture water quality prediction," Mathematical and Computer Modelling, vol. 58, no. 3-4, pp. 458–465, 2013.
[3] Y. Liu, F.-L. Wang, Y.-Q. Chang, and C. Li, "A SNCCDBAGG-based NN ensemble approach for quality prediction in injection molding process," IEEE Transactions on Automation Science and Engineering, vol. 8, no. 2, pp. 424–427, 2011.
[4] A. Russo, F. Raischel, and P. G. Lind, "Air quality prediction using optimal neural networks with stochastic variables," Atmospheric Environment, vol. 79, pp. 822–830, 2013.
[5] H. G. Han, Q. L. Chen, and J. F. Qiao, "An efficient self-organizing RBF neural network for water quality prediction," Neural Networks, vol. 24, no. 7, pp. 717–725, 2011.
[6] L. Xu and S. Liu, "Study of short-term water quality prediction model based on wavelet neural network," Mathematical & Computer Modelling, vol. 58, no. 3-4, pp. 807–813, 2013.
[7] B. Bidar, J. Sadeghi, F. Shahraki, and M. M. Khalilipour, "Data-driven soft sensor approach for online quality prediction using state dependent parameter models," Chemometrics and Intelligent Laboratory Systems, vol. 162, pp. 130–141, 2017.
[8] X. Li, L. Peng, Y. Hu, J. Shao, and T. Chi, "Deep learning architecture for air quality predictions," Environmental Science and Pollution Research, vol. 23, no. 22, pp. 22408–22417, 2016.
[9] Y. Bai, Z. Sun, J. Deng, L. Li, J. Long, and C. Li, "Manufacturing quality prediction using intelligent learning approaches: a comparative study," Sustainability, vol. 10, no. 1, article 85, 2017.
[10] J. Zheng, Z. Ge, Z. Song, and F. Gao, "Phase adaptive RVM model for quality prediction of multiphase batch processes with limited modeling batches," Chemometrics and Intelligent Laboratory Systems, vol. 156, pp. 81–88, 2016.
[11] M. Zhang, Z. Cai, and W. Cheng, "Multi-model quality prediction approach using fuzzy c-means clustering and support vector regression," Advances in Mechanical Engineering, vol. 9, no. 8, 2017.
[12] I. Svalina, K. Sabo, and G. Simunovic, "Machined surface quality prediction models based on moving least squares and moving least absolute deviations methods," The International Journal of Advanced Manufacturing Technology, vol. 57, no. 9-12, pp. 1099–1106, 2011.
[13] D. Wang, "Robust data-driven modeling approach for real-time final product quality prediction in batch process operation," IEEE Transactions on Industrial Informatics, vol. 7, no. 2, pp. 371–377, 2011.
[14] F. Chu, X. Cheng, R. Jia, F. Wang, and M. Lei, "Final quality prediction method for new batch processes based on improved JYKPLS process transfer model," Chemometrics and Intelligent Laboratory Systems, vol. 183, pp. 1–10, 2018.
[15] J. Liu, T. Liu, and J. Chen, "Quality prediction for multi-grade processes by just-in-time latent variable modeling with integration of common and special features," Chemical Engineering Science, vol. 191, pp. 31–41, 2018.
[16] V. García, J. S. Sánchez, L. A. Rodríguez-Pic et al., "Using regression models for predicting the product quality in a tubing extrusion process," Journal of Intelligent Manufacturing, pp. 1–10, 2018.
[17] H. S. Park, N. H. Tran, and D. S. Nguyen, "Development of a predictive system for SLM product quality," in Proceedings of the IOP Conference Series: Materials Science and Engineering, vol. 227, pp. 1747–8981, 2017.
[18] L. Zhu, G. He, and Z. Song, "Improved quality prediction model for multistage machining process based on geometric constraint equation," Chinese Journal of Mechanical Engineering, vol. 29, no. 2, pp. 430–438, 2016.
[19] R. Avila, B. Horn, E. Moriarty, R. Hodson, and E. Moltchanova, "Evaluating statistical model performance in water quality prediction," Journal of Environmental Management, vol. 206, pp. 910–919, 2018.
[20] A. Najah, A. El-Shafie, O. A. Karim, O. Jaafar, and A. H. El-Shafie, "An application of different artificial intelligences techniques for water quality prediction," International Journal of Physical Sciences, vol. 6, no. 22, pp. 5298–5308, 2011.
[21] H.-G. Han, L.-D. Wang, and J.-F. Qiao, "Efficient self-organizing multilayer neural network for nonlinear system modeling," Neural Networks, vol. 43, no. 7, pp. 22–32, 2013.
[22] K. Sheoran, P. Tomar, and R. Mishra, "Software quality prediction model with the aid of advanced neural network with HCS," Procedia Computer Science, vol. 92, pp. 418–424, 2016.
[23] X. Ma, X. Chen, and X. Zhang, "Non-interactive privacy-preserving neural network prediction," Information Sciences, vol. 481, pp. 507–519, 2019.
[24] R. Olawoyin, "Application of backpropagation artificial neural network prediction model for the PAH bioremediation of polluted soil," Chemosphere, vol. 161, pp. 145–150, 2016.
[25] H. Liang, Z. Li, and G. Li, "Neural network prediction model to achieve intelligent control of unbalanced drilling's underpressure value," Multimedia Tools and Applications, pp. 1–29, 2018.
[26] J. Gu, G. Yin, P. Huang, J. Guo, and L. Chen, "An improved back propagation neural network prediction model for subsurface drip irrigation system," Computers and Electrical Engineering, vol. 60, pp. 58–65, 2017.
[27] S. Bardak, S. Tiryaki, G. Nemli, and A. Aydin, "Investigation and neural network prediction of wood bonding quality based on pressing conditions," International Journal of Adhesion and Adhesives, vol. 68, pp. 115–123, 2016.
[28] A. A. Heidari, H. Faris, I. Aljarah, and S. Mirjalili, "An efficient hybrid multilayer perceptron neural network with grasshopper optimization," Soft Computing, pp. 1–18, 2018.
[29] J. L. Elman, "Finding structure in time," Cognitive Science, vol. 14, no. 2, pp. 179–211, 1990.
[30] X. Gao, D. You, and S. Katayama, "Seam tracking monitoring based on adaptive Kalman filter embedded Elman neural network during high-power fiber laser welding," IEEE Transactions on Industrial Electronics, vol. 59, no. 11, pp. 4315–4325, 2012.
[31] D. P. Niu, F. L. Wang, L. L. Zhang, D. K. He, and M. X. Jia, "Neural network ensemble modeling for nosiheptide fermentation process based on partial least squares regression," Chemometrics and Intelligent Laboratory Systems, vol. 105, no. 1, pp. 125–130, 2011.
[32] S. Ding, Y. Zhang, J. Chen, and W. Jia, "Research on using genetic algorithms to optimize Elman neural networks," Neural Computing and Applications, vol. 23, no. 2, pp. 293–297, 2013.
[33] W. Z. Sun and J. S. Wang, "Elman neural network soft-sensor model of conversion velocity in polymerization process optimized by chaos whale optimization algorithm," IEEE Access, vol. 5, pp. 13062–13076, 2017.
[34] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.
[35] F. Xie, H. Fan, Y. Li, Z. Jiang, R. Meng, and A. Bovik, "Melanoma classification on dermoscopy images using a neural network ensemble model," IEEE Transactions on Medical Imaging, vol. 36, no. 3, pp. 849–858, 2017.
[36] R. Chen and J. Yu, "An improved bagging neural network ensemble algorithm and its application," in Proceedings of the 3rd International Conference on Natural Computation (ICNC), vol. 5, pp. 730–734, 2007.
[37] A. Wang, T. Jiang, and G. Liu, Intelligent Design, Higher Education Press, 2008.
[38] V. Srinivasan, C. Eswaran, and N. Sriraam, "Approximate entropy-based epileptic EEG detection using artificial neural networks," IEEE Transactions on Information Technology in Biomedicine, vol. 11, no. 3, pp. 288–295, 2007.
[39] K. Andras and A. Janos, "Redesign of the supply of mobile mechanics based on a novel genetic optimization algorithm using Google Maps API," Engineering Applications of Artificial Intelligence, vol. 38, pp. 122–130, 2015.


Page 2: Quality Prediction Model Based on Novel Elman Neural Network …downloads.hindawi.com/journals/complexity/2019/9852134.pdf · 2019. 7. 30. · model for subsurface drip irrigation

2 Complexity

manufacturing Bai et al [9] conducted a comparative studyof intelligent learning approaches Two categories of intelli-gent learning approaches that is shallow learning and deeplearning were investigated and compared for manufacturingquality prediction The authorsrsquo experiments showed thatthe deep framework overwhelmed the shallow architecturein terms of the root-mean-square error (RMSE) thresholdstatistics and mean absolute percentage error Zheng et al[10] proposed a new phase adaptive relevance vector machine(RVM) model for quality prediction in multiphase batchprocesses Based on the information transfer of relevancevectors in each RVM model different phases are connectedone after another which provides simultaneous informationfor the prediction of final product quality To predict thequality of a complex production process Zhang et al [11]proposed a multimodel modeling approach based on fuzzyC-means clustering and support vector regression to solvethe problems of a nonlinear wide operating condition rangeand difficult prediction Svalina et al [12] provided modelsbased on moving least-squares and moving least absolutedeviations methods to predict machined surface qualityComparisons with NNs were also conducted to show thatthe total mean square deviation in the proposed modelswas considerably higher than that in the application of theNN method To address issues of the real-time predictionof final product quality during batch operation Wang [13]presented a robust data-driven modeling approach usingavailable process information up to the current points tocapture their time-varying relationships with final productquality during the course of operation Chu et al [14]proposed an improved JYKPLS (Joint-Y Kernel partial leastsquares) process transfer model to solve the issue of finalquality prediction for new batch processes To cope withthe difficulty of online quality prediction for multigradeprocesses Liu et al [15] proposed a just-in-time latentvariable modeling method based on extracting the com-mon and special features of multigrade processes Garciaet al [16] applied regression models to estimate physicalquality indices in a tube extrusion process Park et al [17]developed a system for predicting the printed part qualityduring SLM (selective laser melting) process by simulationZhu et al [18] presented an improved analytical variationpropagation model for a multistage machining process basedon a geometric constraint equation Their study provided amethod to predict part deviation under the influence of thefixture error datum error andmachining error Additionallythere are many studies on water quality prediction Qualitycontrol is a process through which one seeks to ensurethat product quality is maintained or improved with eitherreduced or zero errors In this study we address ldquoqualitycontrolrdquo at the stage of product design The quality of thefinal product is a dependent variable that is determinedby some independent input variables We explore the rela-tionship between the independent input variables and thedependent variable (product quality) In this sense althoughthe cases of air and water quality control are not fields clearlyrelated to the case studied in this paper the logic of qualitycontrol is similar that is the control of final quality throughthe control of dependent input variables Final quality is

predicted through establishing the relationship betweeninput variables and quality characteristics Liu et al [2]proposed a water quality prediction model based on supportvector regression A hybrid approach known as real-valuegenetic algorithm support vector regression was presentedand applied to predict the aquaculture water quality collectedfrom the aquatic factories of Yixing in China Li et al [8]proposed a novel spatiotemporal deep learning-based airquality prediction method that inherently considers spatialand temporal correlations Statistical models are widely usedin the prediction of water quality Avila et al [19] comparedthe performance of a wide range of statistical modelsincluding a naıve model multiple linear regression dynamicregression regression tree Markov chain classification treerandom forest multinomial logistic regression and Bayesiannetwork in the prediction of water quality for the weekly datacollected over the summer months from 2006 to 2014 fromthe Oreti River inWallacetown in New Zealand Their resultsdemonstrated that the Bayesian network was superior to allother models

However for large complex products (systems) all thesemethods are difficult to apply Our purpose is to proposea novel prediction algorithm based on an improved ElmanNN ensemble for quality predictionThe contributions of thispaper include the following

(1) A novel quality prediction approach based on theElmanNNensemble is proposedThe predictionmodel helpsto establish an organic link between designing parametersand a productrsquos overall quality characteristics which isessential in product design

(2) A novel grasshopper optimization- (GRO-) basedElman NN ensemble model is designed to improve thegeneralization ability and effectiveness for quality predictionFirst the GRO algorithm is used to enhance the performanceof an individual Elman NN with respect to the requiredquality prediction Then while yielding an NN ensemblethe improved weighted average is used to combine theoutputs of different individual networks in which the weightsare determined by the training error of the individualNNs

The remainder of this paper is structured as follows InSection 2 we present a literature review which provides thebasis of our research In Section 3 we present the qualityprediction model based on novel Elman NN ensembles Weconduct simulations of our proposed approach in additionto comparisons with other algorithms in Section 4 InSection 5 we provide a conclusion

2 Literature Review

Artificial neural network (ANN) has been one of the maintechniques for quality prediction NNs have been applied tovarious areas for prediction Xu and Liu [6] combined thewavelet transform with the BPNN to establish the short-term wavelet NN water quality prediction model Najah etal [20] investigated different artificial intelligence techniquesin water quality prediction including multilayer perceptronNNs ensemble NNs and the support vector machine to

Complexity 3

develop a computationally efficient and robust approach forpredicting water quality Han et al [5] presented a flexiblestructure radial basis function NN (FS-RBFNN) and appliedit to water quality prediction To build an ANN with a self-organizing architecture and suitable learning algorithm fornonlinear system modeling Han et al [21] developed anautomatic axon-NN which can self-organize the architectureand weights thus improving network performance Sheoranet al [22] proposed a hybrid method for software qualityprediction which used an advanced NN that incorporateda hybrid cuckoo search optimization algorithm for bet-ter prediction accuracy Russo et al [4] applied optimalANNs to air quality forecasting Their proposed model usedmethods in stochastic data analysis to derive a set of afew stochastic variables that represent relevant informationabout a multivariate stochastic system and are used asinput for NNs Liu et al [3] presented a SNCCDBAGG-based NN ensemble approach for quality prediction in theinjection molding process in which bagging is used to createNNs for the ensemble and negative correlation learningvia correlation-corrected data (NCCD) is used to achievenegative correlation of each networkrsquos error against errors forthe remainder of the ensemble Ma et al [23] presented a newoutsourcing model for privacy-preserving neural networkprediction under two noncolluding servers framework Theyproposed a neural network prediction scheme for fullynoninteractive privacy-preserving Olawoyin [24] used BPANN as a prediction tool to study the potential toxicityof PAH carcinogens in soils Liang et al [25] proposed aneural network prediction control model based on improvedrolling optimization algorithm The control model realizesadvanced prediction to achieve intelligent control of unbal-anced drillingrsquos underpressure value and performs fast andstable self-feedback control of the output prediction resultsGu et al [26] used an improved BP neural network basedon GA algorithm to develop the yield-irrigation predictionmodel for subsurface drip irrigation system Bardak et al[27] presented an application of ANN to predict the woodbonding quality based on pressed conditions Heidari etal [28] optimized the multilayer perceptron NN usingthe GRO algorithm which was applied to many populardatasets

As seen from the above review of related work manystudies have considered the problem of quality predictionThere are many studies on water quality prediction airquality prediction and quality prediction in manufacturingHowever publications on quality prediction during the stageof product design are scarce Modeling the relationshipsbetween designing parameters and a productrsquos overall qualitycharacteristics has been ignored Considering theNNrsquos abilityto simulate the complex input and output relationship of areal system and its powerful ability in nonlinear modelinga quality prediction model based on a NN is built in thisstudy To improve the prediction accuracy a novel predictionalgorithm based on the Elman ensemble is proposed Firstthe ElmanNNparameters are optimized using theGRO algo-rithm and then the weighted average method is improved tocombine the outputs of the individual NNswhere theweightsare determined by the training errors

3 Quality Prediction Model Based on ElmanNeural Network Ensembles

31 ElmanNeural Network TheElmanNN is a type of locallyrecurrent network which is considered as a special type offeedforward NN with additional memory neurons and localfeedback Thus the BP algorithm which is typically used forfeedforward NN training can be used to train the Elmannetwork

As shown in Figure 1, the Elman NN consists of the context layer, input layer, hidden layer, and output layer [29-31]. $W_1$ denotes the weight from the context layer to the hidden layer, $W_2$ denotes the weight from the input layer to the hidden layer, and $W_3$ denotes the weight from the hidden layer to the output layer. $U(t-1)$ denotes the network input vector at the $(t-1)$th iteration, $V(t)$ denotes the hidden layer output vector at the $t$th iteration, and $Z(t)$ denotes the network output vector at the $t$th iteration. The context layer retains the hidden layer output vector from the previous iteration; that is, $V_C(t)$ denotes the context layer output vector at the $t$th iteration, and its value equals the hidden layer output vector at the $(t-1)$th iteration. The thresholds of the hidden layer unit and output layer are $\theta_j$ and $\theta_k$, respectively. Suppose the transfer functions of the hidden layer and output layer units are $g(\cdot)$ and $h(\cdot)$, respectively; then the transfer relations of the Elman network are expressed as

$$V(t) = g\left(W_1 V_C(t) + W_2 U(t-1) - \theta_j\right), \quad (1)$$

where $V_C(t) = V(t-1)$. The output of the Elman NN is represented as

$$Z(t) = h\left(W_3 V(t) - \theta_k\right). \quad (2)$$

The Elman NN is one of the most widely used and most effective NN models and has powerful processing ability for nonlinear decisions [30]. It can be considered a special kind of feedforward NN with additional memory neurons and local feedback. Because of its better learning efficiency, approximation ability, and memory ability compared with other neural networks, the Elman NN can be used not only for time series prediction but also for system identification and prediction [32, 33]. Therefore, the Elman NN is chosen for quality prediction in this paper.
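For illustration, the transfer relations (1) and (2) can be realized in a few lines of NumPy, as in the minimal sketch below. The choice of tanh for $g(\cdot)$, a linear $h(\cdot)$, the layer sizes, and all variable names are illustrative assumptions rather than the settings used in this study.

import numpy as np

def elman_forward(U_prev, V_prev, W1, W2, W3, theta_j, theta_k,
                  g=np.tanh, h=lambda x: x):
    # One Elman step implementing Eqs. (1)-(2):
    #   V(t) = g(W1 V_C(t) + W2 U(t-1) - theta_j), with V_C(t) = V(t-1)
    #   Z(t) = h(W3 V(t) - theta_k)
    V_C = V_prev                              # context layer copies the previous hidden output
    V = g(W1 @ V_C + W2 @ U_prev - theta_j)   # hidden layer output, Eq. (1)
    Z = h(W3 @ V - theta_k)                   # network output, Eq. (2)
    return V, Z

# Illustrative sizes (7 inputs, 3 hidden/context units, 3 outputs):
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 7, 3, 3
W1 = rng.normal(size=(n_hid, n_hid))          # context -> hidden
W2 = rng.normal(size=(n_hid, n_in))           # input -> hidden
W3 = rng.normal(size=(n_out, n_hid))          # hidden -> output
theta_j, theta_k = np.zeros(n_hid), np.zeros(n_out)

V = np.zeros(n_hid)                           # initial hidden state
for U in rng.normal(size=(5, n_in)):          # a short input sequence
    V, Z = elman_forward(U, V, W1, W2, W3, theta_j, theta_k)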

However, the Elman NN also has some disadvantages, such as a low convergence rate, easily becoming trapped at a local minimum, and a lack of theory to determine the initial weights and thresholds of the network. Generally, optimization methods can overcome these deficiencies of NNs. Additionally, the NN ensemble is a technique that can significantly improve the generalization ability of NNs by training a number of NNs and then combining them ([31]; He and Cao, 2012). Generally, the most common combination methods for the NN ensemble model are simple averaging and weighted averaging for regression problems [34]. In this study, the GRO method is used to compensate for the deficiencies of the Elman NN, and the input weights and output layer thresholds are optimized. Then, a novel method based on the training error is established to construct an ensemble model.


Figure 1: Elman neural network model (input layer $U(t-1)$, context layer $V_C(t)$ fed back through a unit delay $Z^{-1}$, hidden layer $V(t)$ with transfer function $g(\cdot)$ and thresholds $\theta_j$, and output layer $Z(t)$ with transfer function $h(\cdot)$ and thresholds $\theta_k$, connected by the weights $W_1$, $W_2$, and $W_3$).

3.2. Grasshopper Optimization Algorithm. The GRO algorithm was proposed by Saremi et al. [34]. The main characteristics of a swarm of grasshoppers in the larval phase are slow movement and small steps. The mathematical model used to simulate the swarming behavior of grasshoppers, which includes random behavior, is presented as follows:

$$X_i = r_1 S_i + r_2 G_i + r_3 A_i, \quad (3)$$

where $X_i$ denotes the position of the $i$th grasshopper, $S_i$ denotes the social interaction, $G_i$ denotes the gravity force on the $i$th grasshopper, $A_i$ denotes the wind advection, and $r_1$, $r_2$, and $r_3$ are random numbers in $[0, 1]$.

The mathematical model (3) cannot be used directly to solve optimization problems, mainly because the grasshoppers quickly reach the comfort zone and the swarm does not converge to a specified point. A modified version of this equation is proposed as follows to solve optimization problems:

$$X_i^d = c\left( \sum_{\substack{j=1 \\ j \neq i}}^{N} c\, \frac{u_d - l_d}{2}\, s\!\left(d_{ij}\right) \widehat{d}_{ij} \right) + T_d, \quad (4)$$

where $u_d$ denotes the upper bound in the $d$th dimension, $l_d$ denotes the lower bound in the $d$th dimension, $T_d$ denotes the value of the $d$th dimension in the target (the best solution found so far), $s(\cdot)$ is the social force function, $d_{ij}$ is the distance between the $i$th and $j$th grasshoppers with $\widehat{d}_{ij}$ the corresponding unit vector, and $c$ denotes a decreasing coefficient that shrinks the comfort zone, repulsion zone, and attraction zone. Equation (4) shows that the next position of a grasshopper is defined based on its current position, the position of the target, and the positions of all other grasshoppers. Note that the first component of this equation considers the location of the current grasshopper with respect to the other grasshoppers.

The pseudocode of the GRO algorithm is as follows [34]:

Initialize the swarm X_i (i = 1, 2, ..., N)
Initialize c_max, c_min, and the maximum number of iterations L
Calculate the fitness f of each search agent
X_best = the best search agent
while (l < L)
    Update c
    for each search agent
        Normalize the distances between grasshoppers
        Update the position of the current search agent
        Bring the current agent back if it goes outside the boundaries
    end for
    Update X_best if there is a better solution: X_best = X_i^d if f(X_best) > f(X_i^d)
    l = l + 1
end while
Return X_best
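The pseudocode translates almost line by line into the following NumPy sketch of a GRO run built around the update rule (4). It is a simplified illustration only: the distance normalization step of the pseudocode is omitted, the social-force constants (f = 0.5, l = 1.5) follow values commonly reported for this algorithm, and the function and parameter names are assumptions.

import numpy as np

def s(r, f=0.5, l=1.5):
    # social force between two grasshoppers (attraction minus repulsion)
    return f * np.exp(-r / l) - np.exp(-r)

def gro_minimize(obj, dim, lb, ub, N=30, L=200, c_max=1.0, c_min=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(N, dim))            # initialize the swarm
    fitness = np.array([obj(x) for x in X])
    best = X[fitness.argmin()].copy()                 # target T_d = best solution so far
    best_fit = fitness.min()
    for it in range(L):
        c = c_max - it * (c_max - c_min) / L          # linearly decreasing coefficient c
        X_new = np.empty_like(X)
        for i in range(N):
            move = np.zeros(dim)
            for j in range(N):
                if j == i:
                    continue
                dist = np.linalg.norm(X[j] - X[i]) + 1e-12
                unit = (X[j] - X[i]) / dist           # unit vector towards grasshopper j
                move += c * (ub - lb) / 2.0 * s(dist) * unit
            X_new[i] = np.clip(c * move + best, lb, ub)   # position update of Eq. (4)
        X = X_new
        fitness = np.array([obj(x) for x in X])
        if fitness.min() < best_fit:                  # keep the best solution found so far
            best_fit = fitness.min()
            best = X[fitness.argmin()].copy()
    return best, best_fit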

3.3. Improved Elman NN. In this study, the GRO algorithm is used to support NN training, as shown in Figure 2. In the Elman-GRO combination, the NN manages the nonlinearity, whereas GRO is used to optimize the NN parameters for an accurate fit.


Figure 2: Elman NN optimization process using GRO (initialize parameters, feed input data, and train the Elman NN; check convergence; if not converged, apply GRO to adjust the parameters; otherwise, use the GRO-Elman NN with the optimized parameters for testing and stop).

GRO optimizes a problem using individual grasshoppers, which move around the exploration space according to a simple mathematical model. A grasshopper position represents one candidate solution for the input weights and output thresholds of an Elman NN. The length of the grasshopper position equals the dimensionality of the input weights and output layer thresholds. The objective of the GRO is to determine an input weight and output threshold vector that results in the minimum RMSE from the NN.

The RMSE is designed as the objective function; thus, the GRO is engaged to minimize the RMSE during the training phase. It measures the difference between the predicted values and the true values. The RMSE of the prediction with respect to the computed variable $v_{ck}$ is determined as the square root of the mean-squared error and is given by

$$RMSE = \sqrt{\frac{\sum_{k=1}^{n} \left(v_{dk} - v_{ck}\right)^2}{n}}, \quad (5)$$

where $v_{dk}$ denotes the originally observed value of the $k$th data instance and $v_{ck}$ denotes the value predicted by the NN.

Convergence is checked after each iteration of GRO instead of after each training epoch. The objective of the optimization is to determine an input weight and output threshold vector that meets the RMSE requirement, which is written as

$$RMSE < RMSE_T, \quad (6)$$

where $RMSE_T$ is the desired threshold. As shown in Figure 2, if the convergence condition is met, the optimization stops.
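As a small illustration of how the objective is evaluated, the sketch below computes the training RMSE of Eq. (5) for a parameter vector holding the 21 input weights and 3 output-layer thresholds that are optimized later in Section 4.1. The helper names and the forward_fn wrapper are assumptions, not the implementation used in this study.

import numpy as np

def rmse(v_d, v_c):
    # Eq. (5): root-mean-square error between observed and predicted values
    v_d, v_c = np.asarray(v_d), np.asarray(v_c)
    return np.sqrt(np.mean((v_d - v_c) ** 2))

def decode(position, n_in=7, n_hid=3, n_out=3):
    # Split a grasshopper position into the parameters being optimized:
    # n_hid * n_in input weights (W2) followed by n_out output-layer thresholds
    position = np.asarray(position)
    W2 = position[: n_hid * n_in].reshape(n_hid, n_in)
    theta_k = position[n_hid * n_in : n_hid * n_in + n_out]
    return W2, theta_k

def training_rmse(position, train_U, train_Y, forward_fn):
    # GRO objective: the training-set RMSE of the network encoded by `position`;
    # `forward_fn(u, W2, theta_k)` is an assumed wrapper around the Elman forward pass
    W2, theta_k = decode(position)
    preds = np.array([forward_fn(u, W2, theta_k) for u in train_U])
    return rmse(train_Y, preds)

# Convergence test of Eq. (6): after each GRO iteration, stop as soon as
# training_rmse(best_position, ...) < RMSE_T for a chosen threshold RMSE_T.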

3.4. NN Ensemble Model for Quality Prediction. An ANN ensemble is a composite model of multiple NNs with better generalization ability and stability than a single network. When using the improved Elman NN to construct the NN ensemble, first the individual NNs are produced, and then the outputs of the individual NNs are combined.

Bagging is a statistical resampling technique for generating the training data for each individual in an ensemble model [35, 36]. Each training dataset is drawn randomly with replacement from the original training set. Additionally, the size of each training set is typically the same as that of the original training set. The bagging technique increases the


Figure 3: Layout of the main beam's web members (nodes numbered 1-14, web members numbered (1)-(10); dimension annotations in mm: 650, 2200, 4920, 7600, 10000).

difference between individuals using bootstrap sampling and improves the generalization ability of the NN ensemble. Then, the outputs of the individual networks are weighted to form the overall output $O$ of the ensemble:

$$O(I, a) = \sum_{i=1}^{Z} a_i O_i(I_i), \quad (7)$$

where $I_i$ is the input vector, $a_i$ is the weight applied to the output of the $i$th individual network, and $Z$ is the number of NNs in the ensemble model.

In this algorithm, $a_i$ is designed according to the training error $e_i$ of each NN:

$$a_i = \frac{\mathrm{sum}(E) - e_i}{\mathrm{sum}(E)}, \quad (8)$$

where $E = [e_1, e_2, \ldots, e_Z]$ and $\mathrm{sum}(\cdot)$ is the summation function of a vector.
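A minimal sketch of this ensemble construction is given below: bootstrap (bagging) resampling builds each member's training set, and the members are then combined with the error-based weights of Eqs. (7)-(8). The model objects and their train/predict methods are assumed stand-ins for the GRO-optimized Elman networks.

import numpy as np

def bootstrap_sample(U, Y, rng):
    # draw a training set of the same size as the original, with replacement
    idx = rng.integers(0, len(U), size=len(U))
    return U[idx], Y[idx]

def train_ensemble(models, U, Y, seed=0):
    # Train each member on its own bootstrap replicate and compute the
    # error-based weights a_i = (sum(E) - e_i) / sum(E) of Eq. (8).
    rng = np.random.default_rng(seed)
    U, Y = np.asarray(U), np.asarray(Y)
    errors = []
    for m in models:
        Ub, Yb = bootstrap_sample(U, Y, rng)
        m.train(Ub, Yb)                                   # assumed method of each member
        preds = np.array([m.predict(u) for u in Ub])      # assumed method of each member
        errors.append(np.sqrt(np.mean((Yb - preds) ** 2)))  # training RMSE e_i
    E = np.array(errors)
    return (E.sum() - E) / E.sum()                        # weights a_i of Eq. (8)

def ensemble_predict(models, a, u):
    # Eq. (7): weighted combination of the individual network outputs
    outputs = np.array([m.predict(u) for m in models])
    return np.tensordot(np.asarray(a), outputs, axes=1)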

4. Simulation of Quality Prediction and Analysis

4.1. Simulation. To verify the effectiveness of the proposed approach, computer simulations were conducted. A 10 m diameter circular paraboloid antenna is considered as an example in this section. The quality prediction model was established on the basis of the relationships between the designing parameters and the quality characteristic indices.

A circular paraboloid antenna is a high-precision structure that is widely used in areas such as aeronautics and astronautics, radar, and satellite communication. The requirements on its quality characteristics include reflector accuracy, the self-weight deformation requirement, and the homologous design requirement. As shown in Figure 3, the section characteristics of beams (1), (2), (3), (6), and (7) and the heights of beams (4) and (8) greatly affect the overall quality of the antenna. $h_A$ and $h_B$ were introduced to denote the ordinates of point 6 and point 8, respectively, which are the heights of beams (4) and (8). Accordingly, the designing parameters for the quality design of the circular paraboloid antenna are the section characteristics of beams (1), (2), (3), (6), and (7) together with $h_A$ and $h_B$, denoted by $x_1, x_2, \ldots, x_7$, respectively. The vector $(x_1, x_2, x_3, x_4, x_5, x_6, x_7)$ is identified as the input vector of the quality prediction model to be established. Simultaneously, the indices of the quality characteristics of the circular paraboloid antenna are identified as the mean square error (MSE), the dead weight, and the fundamental frequency, denoted by $y_1$, $y_2$, and $y_3$, respectively, which form the output vector of the model.

A $2^{k-p}$ fractional factorial design was selected to conduct the experiments [37].

In the simulation, $k = 7$ and $p = 2$, and the 32 sets of data were divided into two parts, a training set and a testing set, which consisted of 25 and 7 samples, respectively. There were 7 input nodes, 3 hidden nodes, 5 context nodes, and 3 output nodes in the Elman NN. Therefore, 21 input weights and 3 output layer thresholds were optimized.


Figure 4: Prediction of y1 (S1) at testing points 1-7 (curves: Elman, Grasshopper-Elman, Ensemble1, proposed ensemble method, and true value).

During NN training, the epoch size was set to 1000. The grasshopper number (swarm size) N and the iteration number L were both set to 200. Four Elman NNs were used to build the ensemble model. Because of the stochastic nature of the artificial-intelligence optimization and of the bagging technique used for the ensemble, the 4 Elman NNs differ from each other. Although only 25 sets of data are used in the simulation, they are employed to train the Elman NNs 4 times, which is approximately equivalent to using 100 sets of data in our simulation. Regarding the excellent performance of the Elman NN on small training sets, the small-sample training problem was also addressed by Srinivasan et al. [38], who trained an Elman NN with 60 sets of data and obtained high detection accuracy.
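For concreteness, the reported settings could be wired to the earlier sketches roughly as below. This is only an illustrative assembly: it reuses gro_minimize and training_rmse from the sketches in Sections 3.2 and 3.3, the dummy data and search-space bounds are assumptions, and the simple linear read-out used as forward_fn stands in for the full Elman forward pass.

import numpy as np
# assumes gro_minimize(...) and training_rmse(...) from the earlier sketches are available

rng = np.random.default_rng(0)
train_U = rng.uniform(-1, 1, size=(25, 7))       # dummy stand-in for the 25 coded training runs
train_Y = rng.uniform(0, 1, size=(25, 3))        # dummy stand-in for (y1, y2, y3)

def forward_fn(u, W2, theta_k):
    # simplified linear read-out so the sketch runs end to end;
    # in the actual model this would be the Elman forward pass of Eqs. (1)-(2)
    return W2 @ u - theta_k

dim = 7 * 3 + 3                                  # 21 input weights + 3 output-layer thresholds
objective = lambda pos: training_rmse(pos, train_U, train_Y, forward_fn)
best_pos, best_rmse = gro_minimize(objective, dim=dim, lb=-1.0, ub=1.0,
                                   N=200, L=200)  # swarm size and iteration number from the text
# Repeating this call on 4 bootstrap replicates of the 25 training samples and
# weighting the resulting networks by Eq. (8) yields the ensemble described above.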

Three levels were determined for each designing parameter; the detailed setting values are shown in Table 1. The outer diameters of the steel pipes of beams (1) and (2) were set to 50 mm, and the thickness varied from 2 mm to 4 mm: a thickness of 3 mm was set as level 0, whereas 2 mm and 4 mm were set as levels -1 and 1, respectively. The outer diameters of beams (3) and (7) were set to 65 mm, and the thickness varied from 3 mm to 5 mm: a thickness of 4 mm was set as level 0, whereas 3 mm and 5 mm were set as levels -1 and 1, respectively. The outer diameter of beam (6) was set to 40 mm, and the thickness varied from 1 mm to 3 mm: a thickness of 2 mm was set as level 0, whereas 1 mm and 3 mm were set as levels -1 and 1, respectively. The test data acquired through the experiments included the combinations of designing parameters from the 32 tests and the corresponding values of the indices for the quality characteristics. Part of the data

Table 1: Levels of designing parameters and the corresponding actual values (unit: mm).

Designing parameter      Level -1     Level 0      Level 1
x1 - beam (1)            ϕ50 × 2      ϕ50 × 3      ϕ50 × 4
x2 - beam (2)            ϕ50 × 2      ϕ50 × 3      ϕ50 × 4
x3 - beam (3)            ϕ65 × 3      ϕ65 × 4      ϕ65 × 5
x4 - beam (6)            ϕ40 × 1      ϕ40 × 2      ϕ40 × 3
x5 - beam (7)            ϕ65 × 3      ϕ65 × 4      ϕ65 × 5
x6 - h_A                 1110         710          310
x7 - h_B                 1990         1590         1190

(80% of the samples) was used to train the NNs, whereas the remainder (20% of the samples) was used to verify the efficiency and generalization ability of the proposed model. Comparisons were conducted with other methods, including the Elman NN method, the GRO-based Elman method, and the weighted ensemble method of [35]. The prediction results of the NN methods are shown in Figures 4-9. Table 2 shows the average RMSE of all the methods, which demonstrates the superiority of the proposed ensembled Elman NN model. With the same population size and iteration number, we also compared the convergence performance of GRO with the popular genetic optimization algorithm (GOA) [39] in one optimization process, which is shown in Figure 10. GRO had better convergence performance, which provides an advantage in optimizing the Elman NN.


Table 2: Average prediction RMSE.

Method                       y1        y2        y3
Elman                        0.1658    0.1223    0.4748
Grasshopper-Elman            0.0712    0.0697    0.3472
Ensemble1                    0.0367    0.0365    0.2468
Proposed ensemble method     0.0214    0.0234    0.1729

Figure 5: Prediction RMSE of y1 (S1) at testing points 1-7 (curves: Elman, Grasshopper-Elman, Ensemble1, and proposed ensemble method).

Figure 6: Prediction of y2 (S2) at testing points 1-7 (curves: Elman, Grasshopper-Elman, Ensemble1, proposed ensemble method, and true value).

4.2. Analysis and Discussion. We established that the single Elman method had the worst performance compared with

Figure 7: Prediction RMSE of y2 (S2) at testing points 1-7 (curves: Elman, Grasshopper-Elman, Ensemble1, and proposed ensemble method).

Figure 8: Prediction of y3 (S3) at testing points 1-7 (curves: Elman, Grasshopper-Elman, Ensemble1, proposed ensemble method, and true value).

the other methods. Meanwhile, the proposed GRO-based Elman method achieved better performance than the others. The simulation results also established that the proposed ensemble model outperformed all the other methods.

An ANN ensemble has better generalization ability and stability than a single network. In this study, we combined the Elman NN with GRO to yield an optimized Elman network ensemble model. The Elman NN supported by GRO proved its efficiency for prediction. Using GRO to enhance the performance of the Elman NN to achieve the required prediction


Figure 9: Prediction RMSE of y3 (S3) at testing points 1-7 (curves: Elman, Grasshopper-Elman, Ensemble1, and proposed ensemble method).

Figure 10: Convergence performance comparison of GRO and GOA [39] (objective function value versus iteration number, over 200 iterations).

is the innovation of the current study. The improved weighted average was used to combine the outputs of the individual networks, and the weights were determined by the training errors of the individual NNs, which improved the generalization ability and effectiveness of the model. However, a long time was required to optimize and ensemble the Elman NNs. Therefore, the proposed method can only be used for offline prediction or in applications that do not have strict runtime requirements.

5. Conclusion

In this paper, a novel prediction algorithm based on the Elman NN ensemble model was proposed for quality prediction. The Elman NN parameters were optimized using the GRO method, and then a novel NN ensemble model was designed for quality prediction. The quality prediction model for a circular paraboloid antenna was considered as an example to verify the proposed algorithm. An Elman NN ensemble model that describes the correlation between seven designing parameters and three indices of quality characteristics was built. The simulation results proved that the proposed algorithm obtained better prediction accuracy.

Data Availability

The simulation data used to support the findings of this study are from reference [37].

Conflicts of Interest

No potential conflict of interest was reported by the authors

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant number 71403109, the Humanities and Social Science Foundation of Jiangsu Province under Grant number 16JYC002, and the Humanities and Social Science Foundation of the Ministry of Education under Grant number 19YJA880068.


References

[1] J. Yu, K. Chen, and M. M. Rashid, "A Bayesian model averaging based multi-kernel Gaussian process regression framework for nonlinear state estimation and quality prediction of multiphase batch processes with transient dynamics and uncertainty," Chemical Engineering Science, vol. 93, no. 4, pp. 96-109, 2013.

[2] S. Liu, H. Tai, Q. Ding, D. Li, L. Xu, and Y. Wei, "A hybrid approach of support vector regression with genetic algorithm optimization for aquaculture water quality prediction," Mathematical and Computer Modelling, vol. 58, no. 3-4, pp. 458-465, 2013.

[3] Y. Liu, F.-L. Wang, Y.-Q. Chang, and C. Li, "A SNCCDBAGG-based NN ensemble approach for quality prediction in injection molding process," IEEE Transactions on Automation Science and Engineering, vol. 8, no. 2, pp. 424-427, 2011.

[4] A. Russo, F. Raischel, and P. G. Lind, "Air quality prediction using optimal neural networks with stochastic variables," Atmospheric Environment, vol. 79, pp. 822-830, 2013.

[5] H. G. Han, Q. L. Chen, and J. F. Qiao, "An efficient self-organizing RBF neural network for water quality prediction," Neural Networks, vol. 24, no. 7, pp. 717-725, 2011.

[6] L. Xu and S. Liu, "Study of short-term water quality prediction model based on wavelet neural network," Mathematical & Computer Modelling, vol. 58, no. 3-4, pp. 807-813, 2013.

[7] B. Bidar, J. Sadeghi, F. Shahraki, and M. M. Khalilipour, "Data-driven soft sensor approach for online quality prediction using state dependent parameter models," Chemometrics and Intelligent Laboratory Systems, vol. 162, pp. 130-141, 2017.

[8] X. Li, L. Peng, Y. Hu, J. Shao, and T. Chi, "Deep learning architecture for air quality predictions," Environmental Science and Pollution Research, vol. 23, no. 22, pp. 22408-22417, 2016.

[9] Y. Bai, Z. Sun, J. Deng, L. Li, J. Long, and C. Li, "Manufacturing quality prediction using intelligent learning approaches: a comparative study," Sustainability, vol. 10, no. 1, article 85, 2017.

[10] J. Zheng, Z. Ge, Z. Song, and F. Gao, "Phase adaptive RVM model for quality prediction of multiphase batch processes with limited modeling batches," Chemometrics and Intelligent Laboratory Systems, vol. 156, pp. 81-88, 2016.

[11] M. Zhang, Z. Cai, and W. Cheng, "Multi-model quality prediction approach using fuzzy c-means clustering and support vector regression," Advances in Mechanical Engineering, vol. 9, no. 8, 2017.

[12] I. Svalina, K. Sabo, and G. Simunovic, "Machined surface quality prediction models based on moving least squares and moving least absolute deviations methods," The International Journal of Advanced Manufacturing Technology, vol. 57, no. 9-12, pp. 1099-1106, 2011.

[13] D. Wang, "Robust data-driven modeling approach for real-time final product quality prediction in batch process operation," IEEE Transactions on Industrial Informatics, vol. 7, no. 2, pp. 371-377, 2011.

[14] F. Chu, X. Cheng, R. Jia, F. Wang, and M. Lei, "Final quality prediction method for new batch processes based on improved JYKPLS process transfer model," Chemometrics and Intelligent Laboratory Systems, vol. 183, pp. 1-10, 2018.

[15] J. Liu, T. Liu, and J. Chen, "Quality prediction for multi-grade processes by just-in-time latent variable modeling with integration of common and special features," Chemical Engineering Science, vol. 191, pp. 31-41, 2018.

[16] V. García, J. S. Sánchez, L. A. Rodríguez-Pic et al., "Using regression models for predicting the product quality in a tubing extrusion process," Journal of Intelligent Manufacturing, pp. 1-10, 2018.

[17] H. S. Park, N. H. Tran, and D. S. Nguyen, "Development of a predictive system for SLM product quality," in Proceedings of the IOP Conference Series: Materials Science and Engineering, vol. 227, pp. 1747-8981, 2017.

[18] L. Zhu, G. He, and Z. Song, "Improved quality prediction model for multistage machining process based on geometric constraint equation," Chinese Journal of Mechanical Engineering, vol. 29, no. 2, pp. 430-438, 2016.

[19] R. Avila, B. Horn, E. Moriarty, R. Hodson, and E. Moltchanova, "Evaluating statistical model performance in water quality prediction," Journal of Environmental Management, vol. 206, pp. 910-919, 2018.

[20] A. Najah, A. El-Shafie, O. A. Karim, O. Jaafar, and A. H. El-Shafie, "An application of different artificial intelligences techniques for water quality prediction," International Journal of Physical Sciences, vol. 6, no. 22, pp. 5298-5308, 2011.

[21] H.-G. Han, L.-D. Wang, and J.-F. Qiao, "Efficient self-organizing multilayer neural network for nonlinear system modeling," Neural Networks, vol. 43, no. 7, pp. 22-32, 2013.

[22] K. Sheoran, P. Tomar, and R. Mishra, "Software quality prediction model with the aid of advanced neural network with HCS," Procedia Computer Science, vol. 92, pp. 418-424, 2016.

[23] X. Ma, X. Chen, and X. Zhang, "Non-interactive privacy-preserving neural network prediction," Information Sciences, vol. 481, pp. 507-519, 2019.

[24] R. Olawoyin, "Application of backpropagation artificial neural network prediction model for the PAH bioremediation of polluted soil," Chemosphere, vol. 161, pp. 145-150, 2016.

[25] H. Liang, Z. Li, and G. Li, "Neural network prediction model to achieve intelligent control of unbalanced drilling's underpressure value," Multimedia Tools and Applications, pp. 1-29, 2018.

[26] J. Gu, G. Yin, P. Huang, J. Guo, and L. Chen, "An improved back propagation neural network prediction model for subsurface drip irrigation system," Computers and Electrical Engineering, vol. 60, pp. 58-65, 2017.

[27] S. Bardak, S. Tiryaki, G. Nemli, and A. Aydin, "Investigation and neural network prediction of wood bonding quality based on pressing conditions," International Journal of Adhesion and Adhesives, vol. 68, pp. 115-123, 2016.

[28] A. A. Heidari, H. Faris, I. Aljarah, and S. Mirjalili, "An efficient hybrid multilayer perceptron neural network with grasshopper optimization," Soft Computing, pp. 1-18, 2018.

[29] J. L. Elman, "Finding structure in time," Cognitive Science, vol. 14, no. 2, pp. 179-211, 1990.

[30] X. Gao, D. You, and S. Katayama, "Seam tracking monitoring based on adaptive Kalman filter embedded Elman neural network during high-power fiber laser welding," IEEE Transactions on Industrial Electronics, vol. 59, no. 11, pp. 4315-4325, 2012.

[31] D. P. Niu, F. L. Wang, L. L. Zhang, D. K. He, and M. X. Jia, "Neural network ensemble modeling for nosiheptide fermentation process based on partial least squares regression," Chemometrics and Intelligent Laboratory Systems, vol. 105, no. 1, pp. 125-130, 2011.

[32] S. Ding, Y. Zhang, J. Chen, and W. Jia, "Research on using genetic algorithms to optimize Elman neural networks," Neural Computing and Applications, vol. 23, no. 2, pp. 293-297, 2013.

[33] W. Z. Sun and J. S. Wang, "Elman neural network soft-sensor model of conversion velocity in polymerization process optimized by chaos whale optimization algorithm," IEEE Access, vol. 5, pp. 13062-13076, 2017.


[34] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30-47, 2017.

[35] F. Xie, H. Fan, Y. Li, Z. Jiang, R. Meng, and A. Bovik, "Melanoma classification on dermoscopy images using a neural network ensemble model," IEEE Transactions on Medical Imaging, vol. 36, no. 3, pp. 849-858, 2017.

[36] R. Chen and J. Yu, "An improved bagging neural network ensemble algorithm and its application," in Proceedings of the 3rd International Conference on Natural Computation (ICNC), vol. 5, pp. 730-734, 2007.

[37] A. Wang, T. Jiang, and G. Liu, Intelligent Design, Higher Education Press, 2008.

[38] V. Srinivasan, C. Eswaran, and N. Sriraam, "Approximate entropy-based epileptic EEG detection using artificial neural networks," IEEE Transactions on Information Technology in Biomedicine, vol. 11, no. 3, pp. 288-295, 2007.

[39] K. Andras and A. Janos, "Redesign of the supply of mobile mechanics based on a novel genetic optimization algorithm using Google Maps API," Engineering Applications of Artificial Intelligence, vol. 38, pp. 122-130, 2015.

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 3: Quality Prediction Model Based on Novel Elman Neural Network …downloads.hindawi.com/journals/complexity/2019/9852134.pdf · 2019. 7. 30. · model for subsurface drip irrigation

Complexity 3

develop a computationally efficient and robust approach forpredicting water quality Han et al [5] presented a flexiblestructure radial basis function NN (FS-RBFNN) and appliedit to water quality prediction To build an ANN with a self-organizing architecture and suitable learning algorithm fornonlinear system modeling Han et al [21] developed anautomatic axon-NN which can self-organize the architectureand weights thus improving network performance Sheoranet al [22] proposed a hybrid method for software qualityprediction which used an advanced NN that incorporateda hybrid cuckoo search optimization algorithm for bet-ter prediction accuracy Russo et al [4] applied optimalANNs to air quality forecasting Their proposed model usedmethods in stochastic data analysis to derive a set of afew stochastic variables that represent relevant informationabout a multivariate stochastic system and are used asinput for NNs Liu et al [3] presented a SNCCDBAGG-based NN ensemble approach for quality prediction in theinjection molding process in which bagging is used to createNNs for the ensemble and negative correlation learningvia correlation-corrected data (NCCD) is used to achievenegative correlation of each networkrsquos error against errors forthe remainder of the ensemble Ma et al [23] presented a newoutsourcing model for privacy-preserving neural networkprediction under two noncolluding servers framework Theyproposed a neural network prediction scheme for fullynoninteractive privacy-preserving Olawoyin [24] used BPANN as a prediction tool to study the potential toxicityof PAH carcinogens in soils Liang et al [25] proposed aneural network prediction control model based on improvedrolling optimization algorithm The control model realizesadvanced prediction to achieve intelligent control of unbal-anced drillingrsquos underpressure value and performs fast andstable self-feedback control of the output prediction resultsGu et al [26] used an improved BP neural network basedon GA algorithm to develop the yield-irrigation predictionmodel for subsurface drip irrigation system Bardak et al[27] presented an application of ANN to predict the woodbonding quality based on pressed conditions Heidari etal [28] optimized the multilayer perceptron NN usingthe GRO algorithm which was applied to many populardatasets

As seen from the above review of related work manystudies have considered the problem of quality predictionThere are many studies on water quality prediction airquality prediction and quality prediction in manufacturingHowever publications on quality prediction during the stageof product design are scarce Modeling the relationshipsbetween designing parameters and a productrsquos overall qualitycharacteristics has been ignored Considering theNNrsquos abilityto simulate the complex input and output relationship of areal system and its powerful ability in nonlinear modelinga quality prediction model based on a NN is built in thisstudy To improve the prediction accuracy a novel predictionalgorithm based on the Elman ensemble is proposed Firstthe ElmanNNparameters are optimized using theGRO algo-rithm and then the weighted average method is improved tocombine the outputs of the individual NNswhere theweightsare determined by the training errors

3 Quality Prediction Model Based on ElmanNeural Network Ensembles

31 ElmanNeural Network TheElmanNN is a type of locallyrecurrent network which is considered as a special type offeedforward NN with additional memory neurons and localfeedback Thus the BP algorithm which is typically used forfeedforward NN training can be used to train the Elmannetwork

As shown in Figure 1 the Elman NN consists of thecontext layer input layer hidden layer and output layer [29ndash31] 1198821 denotes the weight from the context layer to thehidden layer W2 denotes the weight from the input layerto the hidden layer and W3 denotes the weight from thehidden layer to the output layer119880(119905minus1) denotes the networkinput vector at the (119905minus1)th iteration 119881(119905) denotes the hiddenlayer output vector at the 119905th iteration and 119885(119905) denotesthe network output vector at the 119905th iteration The contextlayer retains the hidden layer output vector from the previousiteration that is119881119862(119905) denotes the context layer output vectorat the 119905th iteration and its value equals the hidden layeroutput vector at the (119905 minus 1)th iteration The thresholds of thehidden layer unit and output layer are 120579119895 and 120579119896 respectivelySuppose the transfer functions of the hidden layer and outputlayer units are 119892(sdot) and ℎ(sdot) respectively then the transferrelations of the Elman network are expressed as119881 (t) = 119892 (1198821119881119862 (119905) + 1198822119880 (119905 minus 1) minus 120579119895) (1)

where 119881119862(119905) = 119881(119905 minus 1)The output of the Elman NN is represented as119885 (119905) = ℎ (1198823119881 (119905) minus 120579119896) (2)

The Elman NN is one of the most widely used and mosteffective NN models in ANNs and has powerful processingability for nonlinear decisions [30] The Elman NN canbe considered as a special kind of feedforward NN withadditionalmemory neurons and local feedback Because of itsbetter learning efficiency approximation ability andmemoryability than other neural network the Elman NN can notonly be used in time series prediction but also in systemidentification and prediction [32 33] Therefore Elman NNis chosen for quality prediction in our paper

However the Elman NN also has some disadvantagessuch as a low convergence rate easily becoming trappedat the local minimum and lack of theory to determinethe initial weights and threshold of the network Generallyoptimization methods can overcome the deficiencies of NNsAdditionally the NN ensemble is a technique that can sig-nificantly improve the generalization ability of NNs throughtraining a number of NNs and then combining them ([31]He and Cao 2012) Generally the most commonmethods forthe NN ensemble model are simple averaging and weightedaveraging for regression problems [34] In this study theGROmethod is used to make up the deficiency of the ElmanNN and the input weights and output layer threshold areoptimized Then a novel method based on the training erroris established to construct an ensemble model

4 Complexity

Z-1

Vc (t)

W1

W2

U(t-1)

g()

g()

g()

j

V(t)

W3

k∙

ℎ() Z(t)

Figure 1 Elman neural network model

32 Grasshopper Optimization Algorithm The GRO algo-rithmwas proposed by Saremi et al [34]Themain character-istics of a swarm of grasshoppers in the larval phase are slowmovement and small steps The mathematical model used tosimulate the swarming behavior of grasshoppers is presentedas follows which can provide random behavior119883119894 = 1199031119878119894 + 1199032119866119894 + 1199033119860119894 (3)

where 119883119894 denotes the position of the ith grasshopper 119878119894denotes social interaction 119866119894 denotes the gravity force on theith grasshopper 119860 119894 denotes the wind advection and 1199031 1199032and 1199033 are random numbers in [0 1]

The mathematical model (3) cannot be used directly tosolve optimization problems mainly because grasshoppersquickly reach the comfort zone and the swarm does not con-verge to a specified point Amodified version of this equationis proposed as follows to solve optimization problems

119883119889119894 = 119888( 119873sum119895=1119895 =119894

119888119906119889 minus 1198971198892 119904 (119889119894119895) 1198891015840119894119895)+ 119879119889 (4)

where 119906119889 denotes the upper bound in the 119889th dimension 119897119889denotes the lower bound in the 119889th dimension 119879119889 denotesthe value of the 119889th dimension in the target and 119888 denotes adecreasing coefficient to shrink the comfort zone repulsionzone and attraction zone It shows that the next position of agrasshopper is defined based on its current position positionof the target and position of all other grasshoppers Note thatthe first component of this equation considers the location ofthe current grasshopper with respect to other grasshoppers

The pseudocode of the genetic optimization algorithm(GOA) algorithm is as follows [34]

Initialize the swarm Xi (i=12 N)Initialize cmax cmin and the maximum number ofiterationsCalculate fitness f of each search agentXbest=the best search agentWhile (lltL)Update cfor each search agentNormalize the distance between grasshoppersUpdate the position of the current search agentBring the current agent back if it goes outside theboundariesend forUpdate Xbest if there is a better solution119883119887119890119904119905 = 119883119889119894 if 119891(119883119887119890119904119905) gt 119891(119883119889119894 )119897 = 119897 + 1end whileReturn 119883119887119890119904119905

33 Improved Elman NN In this study the GRO algorithmis used to support NN training which is shown in Figure 2During the Elman-GRO combination the NN managesnonlinearity whereas GRO is used to optimize the NNparameters for an accurate fit

Complexity 5

Initialize Parameters

Feed Input Data

Elman NN

Training

Convergence

Use GRO-Elman-NN withoptimized parameters for testing

Stop

NOApply GRO to adjust

parameters

NO

YES

Figure 2 Elman NN optimization process using GRO

GRO optimizes a problem using individual grasshopperswhich move around in the exploration space accordingto a simple mathematical model A grasshopper positionrepresents one of the solutions for determining input weightsand output thresholds for an Elman NN The length of thegrasshopper position equals the dimensionality of the inputweights and output layer thresholdsTheobjective of theGROis to determine an input weight and output threshold vectorthat can result in the minimum RMSE from the NN

The RMSE is designed as the objective function thus theGRO is engaged to minimize the RMSE during the trainingphase It is calculated as the difference between the valuesanticipated by the predicted value and the true value TheRMSE of the predictionwith respect to the computed variableV119888119896 is determined as the square root of the mean-squarederror and is given by

119877119872119878119864 = radicsum119899119896=1 (V119889119896 minus V119888119896)2119899 (5)

where V119889119896 denotes the originally observed value of the 119896thdata instance and V119888119896 denotes the value predicted by the NN

Convergence is checked after each iteration of GROinstead of each epoch The objective of the optimization is todetermine an input weight and output threshold vector thatcan meet the requirement of RMSE which is written as119877119872119878119864 lt 119877119872119878119864119879 (6)

where 119877119872119878119864119879 is the desired thresholdAs shown in Figure 2 if the convergence condition ismet

the optimization stops

34 NN Ensemble Model for Quality Prediction An ANNensemble is a composite model of multiple NNs with bettergeneralization ability and stability than a single networkWhen using the improved Elman NN to construct the NNensemble first the individual NNs are produced and thenthe outputs of the individual NNs are combined

Bagging is a statistical resampling technique for generat-ing training data for each individual in an ensemble model[35 36] Each training dataset is drawn randomly with areplacement from the original training set Additionally thesize of the training set is typically the same as that of theoriginal training set The bagging technique increases the

6 Complexity

10000

2200

4920

7600

650

1

2

3

4

14

(3)

10

(1)

(2)

(4)

(7)

5

11 (5)

(6)

(8)

(10)7

12

8

(9)

9

13

6

Figure 3 Layout of the main beamrsquos web members

difference between individuals using bootstrap sampling andimproves the generalization ability of theNNensembleThenthe outputs of the individual networks are weighted to formthe overall output 119874 of the ensemble

119874 (119868 119886) = 119885sum119894=1

119886119894119874119894 (119868119894) (7)

where 119868119894 is the input vector 119886119894 is the weight applied to theoutput of the 119894th individual network and 119885 is the number ofNNs in the ensemble model

In this algorithm 119886119894 is designed according to trainingerror 119890119894 for each NN

119886119894 = (sum (119864) minus 119890119894)sum (119864) (8)

where 119864 = [1198901 1198902 119890119901] and 119904119906119898( ) is the summationfunction of a vector

4 Simulation of QualityPrediction and Analysis

41 Simulation To verify the effectiveness of the proposedapproach computer simulations were conducted A 10 mdiameter circular paraboloid antenna is considered as anexample in this section The quality prediction model wasestablished on the basis of relationships between the design-ing parameters and quality characteristics indices A circular

paraboloid antenna is a structure with high precision It iswidely used in areas such as aeronautics and astronauticsradar technique and satellite communication The require-ments of its quality characteristics include reflector accu-racy self-weight deformation requirement and homologousdesign requirement As is shown in Figure 3 the sectioncharacteristics of the beam (1) (2) (3) (6) and (7) and theheight of the beam (4) and (8) affect the overall quality ofthe antenna greatly ℎ119860 and ℎ119861 were introduced to denote theordinates of point 6 and point 8 respectively which are theheight of the beam (4) and (8) Accordingly the designingparameters for the quality design of the circular paraboloidantenna are determined which are the section characteristicsof the beam (1) (2) (3) (6) and (7) ℎ119860 and ℎ119861 denoted by1199091 1199092 1199093 1199094 1199095 1199096 1199097 respectively (1199091 1199092 1199093 1199094 1199095 1199096 1199097)is identified as the input vector of the quality predictionmodel to be established Simultaneously the indices of thequality characteristics of the circular paraboloid antenna areidentified as mean square error (MSE) dead weight andfundamental frequency denoted by 1199101 1199102 1199103 respectivelywhich is the output vector of the model

The fractional factorial design of 2119896minus119901 was selected toconduct experiments [37]

In the simulation 119896 = 7 119901 = 2 and 32 sets of datawere divided into two parts a training set and a testing setwhich consisted of 25 and 7 samples respectively There were7 input nodes 3 hidden nodes 5 context nodes and 3 outputnodes in the Elman NN Therefore 21 input weights and 3output layer thresholds were optimized During NN training

Complexity 7

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

045

05

055

06

065

07

075

08

S 1

2 3 4 5 6 71Testing points

Figure 4 Prediction of y1

the epoch size was set to 1000The grasshopper number (sizeof swarm) N and iteration number Lwere designed to be 200and 200 respectively Four ElmanNNswere used for buildingthe ensemble model Due to the characteristics of artificialintelligence optimization and the bagging technique for theensemble the 4 Elman NNs are designed to be different witheach other Although only 25 sets of data are used in thesimulation they are employed to train the ElmanNN4 timeswhich can approximately suppos that there are 100 sets of dataused in our simulation Based on the excellent performance ofElmanNN the small-size data training problem is also solvedin Vairavan et al [38] which uses 60 sets of data to train theElman NN and obtains higher detection accuracy

Three levels were determined for each designing param-eter Detailed setting values are shown in Table 1 Outerdiameters of the steel pipe of beam (1) and (2) were set to50 mm and the thickness varied from 2 mm to 4 mm Thethickness of 3 mm was set to be level 0 whereas 2 mmand 4 mm were set to levels -1 and level 1 respectively Theouter diameters of the beam (3) and (7) were set to 65mm and the thickness varied from 3 mm to 5 mm Thethickness of 4 mm was set to be level 0 whereas 3 mmand 5 mm were set to levels -1 and level 1 respectively Theouter diameter of the beam (6) was set to 40 mm and thethickness varied from 1 mm to 3 mm The thickness of 2mm was set to be level 0 whereas 1 mm and 3 mm wereset to levels -1 and level 1 respectively Test data acquiredthrough experiments included combinations of designingparameters from the 32 tests and the corresponding valuesof the indices for the quality characteristics Part of the data

Table 1 Levels of designing parameters and the correspondingactual values (unit mm)

Designing parameters Levels-1 0 11199091 minus (1) 12060150 times 2 12060150 times 3 12060150 times 41199092 minus (2) 12060150 times 2 12060150 times 3 12060150 times 41199093 minus (3) 12060165 times 3 12060165 times 4 12060165 times 51199094 minus (6) 12060140 times 1 12060140 times 2 12060140 times 31199095 minus (7) 12060165 times 3 12060165 times 4 12060165 times 51199096 minus ℎ119860 1110 710 3101199097 minus ℎ119861 1990 1590 1190

(80 of the samples) was used to train the NN whereasthe remainder (20 of the samples) was used to verify theefficiency and generalization ability of the proposed modelComparisons were conducted with other methods includingthe Elman NN method GRO-based Elman method andweighted ensemble method [35]The prediction results basedon the NNmethods are shown in Figures 4ndash9 Table 2 showsthe average RMSE of all the methods which proves thesuperiority of the proposed ensembled Elman NN modelWith the same population size and iteration number wealso compared the convergence performance of GRO withthe popular GOA [39] in one optimization process whichis shown in Figure 10 It shows that GRO had better conver-gence performance which can provide more advantages inoptimizing the Elman NN

8 Complexity

Table 2 Average prediction RMSE

Methods y1 y2 y3Elman 01658 01223 04748Grasshopper-Elman 00712 00697 03472Ensemble1 00367 00365 02468Proposed Ensemble method 00214 00234 01729

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

2 3 4 5 6 71Testing points

0

005

01

015

02

025

03

035

RMSE

ofS

1

Figure 5 Prediction RMSE of y1

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

31

315

32

325

33

335

34

345

35

355

36

S 2

2 3 4 5 6 71Testing points

Figure 6 Prediction of y2

42 Analysis and Discussion We established that the singleElman method had the worst performance compared with

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

002

004

006

008

01

012

014

016

018

02

RMSE

ofS

2

2 3 4 5 6 71Testing points

Figure 7 Prediction RMSE of y2

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

37

375

38

385

39

395

S 3

2 3 4 5 6 71Testing points

Figure 8 Prediction of y3

the other methods Meanwhile the proposed GRO-basedElman method achieved better performance than othersThe simulation results also established that the proposedensemble model outperformed all the other methods

An ANN ensemble has better generalization ability andstability than a single network In this study we combinedElman NN with GRO to yield an optimized Elman networkensemble model The Elman NN supported by GRO provedits efficiency for prediction Using GRO to enhance theperformance of Elman NN to achieve the required prediction

Complexity 9

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

05

1

15

RMSE

ofS

3

2 3 4 5 6 71Testing points

Figure 9 Prediction RMSE of y3

Obj

ectiv

e fun

ctio

n

GROGOA

0

02

04

06

08

1

12

14

20 40 60 80 100 120 140 160 180 2000Iteration time

Figure 10 Convergence performance comparison

is the innovation of the current studyThe improved weightedaverage was used to combine the outputs of the individualnetworks and the weights were determined by the trainingerrors of the individual NNs which improved the general-ization ability and effectiveness of the model However itrequired a long time to optimize and ensemble the ElmanNN Therefore the proposed method can only be used inthe offline prediction field or fields that do not have a strictruntime requirement

5 Conclusion

In this paper a novel prediction algorithm based on theElman NN ensemble model was proposed for quality predic-tion Elman NN parameters were optimized using the GROmethod and then a novel NN ensemble model was designedfor quality prediction The quality prediction model for acircular paraboloid antenna was considered as an exampleto verify the proposed algorithm An Elman NN ensemblemodel that described the correlation between seven design-ing parameters and three indices of quality characteristicswas built The simulation results proved that the proposedalgorithm obtained better prediction accuracy

Data Availability

The simulation data used to support the findings of this studyare from the reference [37]

Conflicts of Interest

No potential conflict of interest was reported by the authors

Acknowledgments

This work was supported by the National Natural Sci-ence Foundation of China under Grant number 71403109the Humanities and Social Science Foundation of JiangsuProvince underGrant number 16JYC002 and theHumanitiesand Social Science Foundation of Ministry of Educationunder Grant number 19YJA880068

10 Complexity

References

[1] J Yu K Chen andMM Rashid ldquoA Bayesian model averagingbased multi-kernel Gaussian process regression framework fornonlinear state estimation and quality prediction of multiphasebatch processes with transient dynamics and uncertaintyrdquoChemical Engineering Science vol 93 no 4 pp 96ndash109 2013

[2] S Liu H Tai Q Ding D Li L Xu and Y Wei ldquoA hybridapproach of support vector regression with genetic algorithmoptimization for aquaculture water quality predictionrdquoMathe-matical and Computer Modelling vol 58 no 3-4 pp 458ndash4652013

[3] Y Liu F-L Wang Y-Q Chang and C Li ldquoA SNCCDBAGG-basedNNensemble approach for quality prediction in injectionmolding processrdquo IEEETransactions onAutomation Science andEngineering vol 8 no 2 pp 424ndash427 2011

[4] A Russo F Raischel and P G Lind ldquoAir quality predictionusing optimal neural networks with stochastic variablesrdquoAtmo-spheric Environment vol 79 pp 822ndash830 2013

[5] H G Han Q L Chen and J F Qiao ldquoAn efficient self-organizing RBF neural network for water quality predictionrdquoNeural Networks vol 24 no 7 pp 717ndash725 2011

[6] L Xu and S Liu ldquoStudy of short-term water quality predictionmodel based on wavelet neural networkrdquo Mathematical ampComputer Modelling vol 58 no 3-4 pp 807ndash813 2013

[7] B Bidar J Sadeghi F Shahraki and M M KhalilipourldquoData-driven soft sensor approach for online quality predictionusing state dependent parameter modelsrdquo Chemometrics andIntelligent Laboratory Systems vol 162 pp 130ndash141 2017

[8] X Li L Peng Y Hu J Shao and T Chi ldquoDeep learningarchitecture for air quality predictionsrdquo Environmental Scienceand Pollution Research vol 23 no 22 pp 22408ndash22417 2016

[9] Y Bai Z Sun J Deng L Li J Long and C Li ldquoManufacturingquality prediction using intelligent learning approaches acomparative studyrdquo Sustainability vol 10 no 1 article 85 2017

[10] J Zheng Z Ge Z Song and F Gao ldquoPhase adaptive RVMmodel for quality prediction of multiphase batch processeswith limited modeling batchesrdquo Chemometrics and IntelligentLaboratory Systems vol 156 pp 81ndash88 2016

[11] M Zhang Z Cai and W Cheng ldquoMulti-model quality pre-diction approach using fuzzy c-means clustering and supportvector regressionrdquo Advances in Mechanical Engineering vol 9no 8 2017

[12] I Svalina K Sabo and G Simunovic ldquoMachined surfacequality prediction models based on moving least squares andmoving least absolute deviations methodsrdquo e InternationalJournal of Advanced Manufacturing Technology vol 57 no 9-12 pp 1099ndash1106 2011

[13] DWang ldquoRobust data-drivenmodeling approach for real-timefinal product quality prediction in batch process operationrdquoIEEETransactions on Industrial Informatics vol 7 no 2 pp 371ndash377 2011

[14] F Chu X Cheng R Jia F Wang and M Lei ldquoFinal qualityprediction method for new batch processes based on improvedJYKPLS process transfer modelrdquo Chemometrics and IntelligentLaboratory Systems vol 183 pp 1ndash10 2018

[15] J Liu T Liu and J Chen ldquoQuality prediction for multi-gradeprocesses by just-in-time latent variable modeling with inte-gration of common and special featuresrdquo Chemical EngineeringScience vol 191 pp 31ndash41 2018

[16] V Garcıa J S Sanchez L A Rodrıguez-Pic et al ldquoUsingregressionmodels for predicting the product quality in a tubing

extrusion processrdquo Journal of Intelligent Manufacturing pp 1ndash10 2018

[17] H S Park N H Tran and D S Nguyen ldquoDevelopment of apredictive system for SLMproduct qualityrdquo in Proceedings of theIOP Conference Series Materials Science and Engineering vol227 pp 1747ndash8981 2017

[18] L ZhuG He and Z Song ldquoImproved quality predictionmodelformultistagemachiningprocess basedon geometric constraintequationrdquo Chinese Journal of Mechanical Engineering vol 29no 2 pp 430ndash438 2016

[19] R Avila B Horn E Moriarty R Hodson and E MoltchanovaldquoEvaluating statistical model performance in water qualitypredictionrdquo Journal of EnvironmentalManagement vol 206 pp910ndash919 2018

[20] A Najah A El-Shafie O A Karim O Jaafar and A HEl-Shafie ldquoAn application of different artificial intelligencestechniques for water quality predictionrdquo International Journalof Physical Sciences vol 6 no 22 pp 5298ndash5308 2011

[21] H-GHan L-DWang and J-FQiao ldquoEfficient self-organizingmultilayer neural network for nonlinear system modelingrdquoNeural Networks vol 43 no 7 pp 22ndash32 2013

[22] K Sheoran P Tomar and R Mishra ldquoSoftware quality predic-tionmodel with the aid of advanced neural network with HCSrdquoProcedia Computer Science vol 92 pp 418ndash424 2016

[23] X Ma X Chen and X Zhang ldquoNon-interactive privacy-preserving neural network predictionrdquo Information Sciencesvol 481 pp 507ndash519 2019

[24] R Olawoyin ldquoApplication of backpropagation artificial neuralnetwork prediction model for the PAH bioremediation ofpolluted soilrdquo Chemosphere vol 161 pp 145ndash150 2016

[25] H Liang Z Li and G Li ldquoNeural network prediction model toachieve intelligent control of unbalanced drillings underpres-sure valuerdquoMultimedia Tools and Applications pp 1ndash29 2018

[26] J Gu G Yin P Huang J Guo and L Chen ldquoAn improved backpropagation neural network prediction model for subsurfacedrip irrigation systemrdquo Computers and Electrical Engineeringvol 60 pp 58ndash65 2017

[27] S Bardak S Tiryaki G Nemli and A Aydin ldquoInvestigationand neural network prediction of wood bonding quality basedon pressing conditionsrdquo International Journal of Adhesion andAdhesives vol 68 pp 115ndash123 2016

[28] A A Heidari H Faris I Aljarah and S Mirjalili ldquoAn efficienthybrid multilayer perceptron neural network with grasshopperoptimizationrdquo So Computing pp 1ndash18 2018


Figure 1: Elman neural network model.

3.2. Grasshopper Optimization Algorithm. The GRO algorithm was proposed by Saremi et al. [34]. The main characteristics of a swarm of grasshoppers in the larval phase are slow movement and small steps. The mathematical model used to simulate the swarming behavior of grasshoppers, which provides random behavior, is

$$X_i = r_1 S_i + r_2 G_i + r_3 A_i \quad (3)$$

where $X_i$ denotes the position of the $i$th grasshopper, $S_i$ denotes the social interaction, $G_i$ denotes the gravity force on the $i$th grasshopper, $A_i$ denotes the wind advection, and $r_1$, $r_2$, and $r_3$ are random numbers in $[0, 1]$.

The mathematical model (3) cannot be used directly to solve optimization problems, mainly because the grasshoppers quickly reach the comfort zone and the swarm does not converge to a specified point. A modified version of this equation is therefore used to solve optimization problems:

$$X_i^d = c \left( \sum_{\substack{j=1 \\ j \neq i}}^{N} c \, \frac{u_d - l_d}{2} \, s\left(d_{ij}\right) d'_{ij} \right) + T_d \quad (4)$$

where $u_d$ denotes the upper bound in the $d$th dimension, $l_d$ denotes the lower bound in the $d$th dimension, $T_d$ denotes the value of the $d$th dimension in the target, and $c$ denotes a decreasing coefficient that shrinks the comfort, repulsion, and attraction zones. Here, $s(\cdot)$ is the social-force function, $d_{ij}$ is the distance between the $i$th and $j$th grasshoppers, and $d'_{ij}$ is the corresponding unit vector [34]. Equation (4) shows that the next position of a grasshopper is defined by its current position, the position of the target, and the positions of all other grasshoppers. Note that the first component of this equation considers the location of the current grasshopper with respect to the other grasshoppers.

The pseudocode of the GRO algorithm is as follows [34]:

Initialize the swarm $X_i$ ($i = 1, 2, \ldots, N$)
Initialize $c_{max}$, $c_{min}$, and the maximum number of iterations $L$
Calculate the fitness $f$ of each search agent
$X_{best}$ = the best search agent
while ($l < L$)
  Update $c$
  for each search agent
    Normalize the distances between grasshoppers
    Update the position of the current search agent using (4)
    Bring the current agent back if it goes outside the boundaries
  end for
  Update $X_{best}$ if there is a better solution: $X_{best} = X_i^d$ if $f(X_{best}) > f(X_i^d)$
  $l = l + 1$
end while
Return $X_{best}$
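To make the position update in (4) concrete, the following is a minimal Python sketch of one GRO iteration over a swarm stored as an N-by-D array. The social-force function s(.) and its constants are the form reported in Saremi et al. [34] and are assumptions here, since the present paper does not restate them; the helper names are illustrative, not part of the original method description.

```python
import numpy as np

def s(r, f=0.5, ell=1.5):
    # Social-force function s(r) = f*exp(-r/ell) - exp(-r), assumed from [34]
    return f * np.exp(-r / ell) - np.exp(-r)

def gro_step(X, target, c, lb, ub):
    """One application of equation (4) to every grasshopper in the swarm X (shape (N, D))."""
    N, D = X.shape
    X_new = np.empty_like(X)
    for i in range(N):
        accum = np.zeros(D)
        for j in range(N):
            if j == i:
                continue
            d_ij = np.linalg.norm(X[j] - X[i])          # distance between grasshoppers i and j
            unit = (X[j] - X[i]) / (d_ij + 1e-12)       # d'_ij, the unit vector towards grasshopper j
            accum += c * (ub - lb) / 2.0 * s(d_ij) * unit
        X_new[i] = np.clip(c * accum + target, lb, ub)  # add the target position T_d and clamp to the bounds
    return X_new
```

In line with the pseudocode's "Update c" step, $c$ is decreased from $c_{max}$ to $c_{min}$ over the iterations so that the comfort zone shrinks as the search proceeds.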

3.3. Improved Elman NN. In this study, the GRO algorithm is used to support NN training, as shown in Figure 2. In the Elman-GRO combination, the NN manages the nonlinearity, whereas GRO is used to optimize the NN parameters for an accurate fit.

Figure 2: Elman NN optimization process using GRO. (Flow: initialize parameters → feed input data → Elman NN training → convergence check; if not converged, apply GRO to adjust the parameters and repeat; if converged, use the GRO-Elman NN with optimized parameters for testing, then stop.)

GRO optimizes a problem using individual grasshoppers, which move around the exploration space according to a simple mathematical model. A grasshopper position represents one candidate solution for the input weights and output thresholds of an Elman NN; the length of the position vector equals the number of input weights plus output layer thresholds. The objective of GRO is to determine the input weight and output threshold vector that yields the minimum RMSE from the NN.

The RMSE is used as the objective function; thus, GRO is engaged to minimize the RMSE during the training phase. It measures the deviation of the predicted values from the true values. The RMSE of the prediction with respect to the computed variable $v_{ck}$ is determined as the square root of the mean squared error and is given by

$$RMSE = \sqrt{\frac{\sum_{k=1}^{n} \left(v_{dk} - v_{ck}\right)^2}{n}} \quad (5)$$

where $v_{dk}$ denotes the originally observed value of the $k$th data instance and $v_{ck}$ denotes the value predicted by the NN.

Convergence is checked after each iteration of GRO instead of after each epoch. The objective of the optimization is to determine an input weight and output threshold vector that meets the RMSE requirement, which is written as

$$RMSE < RMSE_T \quad (6)$$

where $RMSE_T$ is the desired threshold. As shown in Figure 2, if the convergence condition is met, the optimization stops.
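The RMSE objective (5) and the stopping rule (6) can be wired to the GRO loop of Figure 2 roughly as in the sketch below. This is a sketch only: `elman_predict` stands for a forward pass of an Elman NN whose trainable parameters are decoded from the grasshopper position, `gro_step` is the earlier GRO sketch, and the linear schedule for c, the threshold value, and the parameter names are assumptions rather than details stated in the paper.

```python
import numpy as np

def rmse(v_d, v_c):
    # Equation (5): root of the mean squared difference between observed and predicted values
    v_d, v_c = np.asarray(v_d, dtype=float), np.asarray(v_c, dtype=float)
    return float(np.sqrt(np.mean((v_d - v_c) ** 2)))

def train_elman_with_gro(elman_predict, X_train, y_train, swarm, lb, ub,
                         c_max=1.0, c_min=1e-4, L=200, rmse_t=0.05):
    """Figure 2 loop: minimize training RMSE with GRO until RMSE < RMSE_T (eq. (6)) or L iterations."""
    fitness = lambda pos: rmse(y_train, elman_predict(pos, X_train))  # hypothetical forward pass
    best = min(swarm, key=fitness).copy()
    best_fit = fitness(best)
    for l in range(1, L + 1):
        c = c_max - l * (c_max - c_min) / L        # assumed linearly shrinking comfort-zone coefficient
        swarm = gro_step(swarm, best, c, lb, ub)   # best solution so far plays the role of the target
        for pos in swarm:
            f = fitness(pos)
            if f < best_fit:                       # smaller RMSE is better
                best, best_fit = pos.copy(), f
        if best_fit < rmse_t:                      # convergence condition (6)
            break
    return best, best_fit
```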

3.4. NN Ensemble Model for Quality Prediction. An ANN ensemble is a composite model of multiple NNs with better generalization ability and stability than a single network. When using the improved Elman NN to construct the NN ensemble, the individual NNs are produced first, and then their outputs are combined.

Bagging is a statistical resampling technique for generating the training data of each individual network in an ensemble model [35, 36]. Each training dataset is drawn randomly with replacement from the original training set, and its size is typically the same as that of the original training set.

Figure 3: Layout of the main beam's web members.

The bagging technique increases the difference between individuals through bootstrap sampling and improves the generalization ability of the NN ensemble. The outputs of the individual networks are then weighted to form the overall output $O$ of the ensemble:

$$O(I, a) = \sum_{i=1}^{Z} a_i O_i\left(I_i\right) \quad (7)$$

where $I_i$ is the input vector, $a_i$ is the weight applied to the output of the $i$th individual network, and $Z$ is the number of NNs in the ensemble model.

In this algorithm, $a_i$ is determined from the training error $e_i$ of each NN:

$$a_i = \frac{\operatorname{sum}(E) - e_i}{\operatorname{sum}(E)} \quad (8)$$

where $E = [e_1, e_2, \ldots, e_p]$ and $\operatorname{sum}(\cdot)$ is the summation function of a vector.
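As a concrete illustration of the bagging step and of the error-based weighting in (7) and (8), the sketch below assumes the individual networks have already been trained and that their training errors $e_i$ are known; the helper names are illustrative and the error values in the example are made up.

```python
import numpy as np

def bagging_indices(n_samples, n_models, seed=0):
    # Bootstrap resampling: one index set per ensemble member, drawn with replacement
    rng = np.random.default_rng(seed)
    return [rng.integers(0, n_samples, size=n_samples) for _ in range(n_models)]

def ensemble_weights(train_errors):
    # Equation (8): a_i = (sum(E) - e_i) / sum(E); a smaller training error gives a larger weight
    E = np.asarray(train_errors, dtype=float)
    return (E.sum() - E) / E.sum()

def ensemble_output(member_outputs, weights):
    # Equation (7): weighted combination of the Z individual network outputs
    member_outputs = np.asarray(member_outputs, dtype=float)  # shape (Z, n_outputs)
    return weights @ member_outputs

# Illustrative use with four ensemble members, as in this study (error values are made up):
w = ensemble_weights([0.07, 0.05, 0.06, 0.04])
print(w)  # note: by (8) the weights need not sum to 1
```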

4. Simulation of Quality Prediction and Analysis

4.1. Simulation. To verify the effectiveness of the proposed approach, computer simulations were conducted. A 10 m diameter circular paraboloid antenna is considered as an example in this section. The quality prediction model was established on the basis of the relationships between the designing parameters and the quality characteristics indices. A circular paraboloid antenna is a high-precision structure that is widely used in areas such as aeronautics and astronautics, radar, and satellite communication. The requirements on its quality characteristics include reflector accuracy, the self-weight deformation requirement, and the homologous design requirement. As shown in Figure 3, the section characteristics of beams (1), (2), (3), (6), and (7) and the heights of beams (4) and (8) greatly affect the overall quality of the antenna. $h_A$ and $h_B$ were introduced to denote the ordinates of point 6 and point 8, respectively, which are the heights of beams (4) and (8). Accordingly, the designing parameters for the quality design of the circular paraboloid antenna are the section characteristics of beams (1), (2), (3), (6), and (7) together with $h_A$ and $h_B$, denoted by $x_1, x_2, x_3, x_4, x_5, x_6, x_7$, respectively. The vector $(x_1, x_2, x_3, x_4, x_5, x_6, x_7)$ is identified as the input of the quality prediction model to be established. Simultaneously, the indices of the quality characteristics of the circular paraboloid antenna are identified as the mean square error (MSE), dead weight, and fundamental frequency, denoted by $y_1, y_2, y_3$, respectively, which form the output vector of the model.

The fractional factorial design $2^{k-p}$ was selected to conduct the experiments [37].

In the simulation, $k = 7$ and $p = 2$, and the resulting 32 sets of data were divided into a training set and a testing set consisting of 25 and 7 samples, respectively. The Elman NN had 7 input nodes, 3 hidden nodes, 5 context nodes, and 3 output nodes. Therefore, 21 input weights and 3 output layer thresholds were optimized.
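Since the grasshopper position must encode exactly these 21 input weights and 3 output layer thresholds, each position is a 24-dimensional vector. The sketch below shows one possible decoding into the Elman NN parameters; the packing order is an assumption, as the paper does not specify it.

```python
import numpy as np

N_IN, N_HIDDEN, N_OUT = 7, 3, 3     # architecture used in the simulation
DIM = N_IN * N_HIDDEN + N_OUT       # 21 input weights + 3 output thresholds = 24

def decode_position(position):
    """Split a 24-dimensional grasshopper position into the Elman NN parameters
    optimized by GRO (row-major packing is assumed, not stated in the paper)."""
    position = np.asarray(position, dtype=float)
    assert position.size == DIM
    w_in = position[:N_IN * N_HIDDEN].reshape(N_IN, N_HIDDEN)  # input-to-hidden weights
    b_out = position[N_IN * N_HIDDEN:]                         # output layer thresholds
    return w_in, b_out
```

With such a mapping, the bound vectors `lb` and `ub` used in the GRO sketches would also be 24-dimensional.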

Figure 4: Prediction of y1 (Elman, Grasshopper-Elman, Ensemble1, the proposed ensemble method, and the true value over the 7 testing points).

During NN training, the epoch size was set to 1000. The grasshopper number (swarm size) N and the iteration number L were both set to 200. Four Elman NNs were used to build the ensemble model. Owing to the artificial intelligence optimization and the bagging technique used for the ensemble, the 4 Elman NNs differ from each other. Although only 25 sets of data are used in the simulation, they are employed to train the Elman NNs 4 times, which is approximately equivalent to using 100 sets of data. The suitability of the Elman NN for small training sets is also supported by [38], where 60 sets of data were used to train the network and high detection accuracy was obtained.

Three levels were determined for each designing parameter; the detailed settings are shown in Table 1. The outer diameters of the steel pipes of beams (1) and (2) were set to 50 mm, with the thickness varying from 2 mm to 4 mm; a thickness of 3 mm was set as level 0, whereas 2 mm and 4 mm were set as levels -1 and 1, respectively. The outer diameters of beams (3) and (7) were set to 65 mm, with the thickness varying from 3 mm to 5 mm; a thickness of 4 mm was set as level 0, whereas 3 mm and 5 mm were set as levels -1 and 1, respectively. The outer diameter of beam (6) was set to 40 mm, with the thickness varying from 1 mm to 3 mm; a thickness of 2 mm was set as level 0, whereas 1 mm and 3 mm were set as levels -1 and 1, respectively. The test data acquired through the experiments included the combinations of designing parameters from the 32 tests and the corresponding values of the quality characteristic indices.

Table 1: Levels of designing parameters and the corresponding actual values (unit: mm).

Designing parameters    Level -1     Level 0      Level 1
x1 - (1)                ϕ50 × 2      ϕ50 × 3      ϕ50 × 4
x2 - (2)                ϕ50 × 2      ϕ50 × 3      ϕ50 × 4
x3 - (3)                ϕ65 × 3      ϕ65 × 4      ϕ65 × 5
x4 - (6)                ϕ40 × 1      ϕ40 × 2      ϕ40 × 3
x5 - (7)                ϕ65 × 3      ϕ65 × 4      ϕ65 × 5
x6 - hA                 1110         710          310
x7 - hB                 1990         1590         1190

Part of the data (80% of the samples) was used to train the NN, whereas the remainder (20% of the samples) was used to verify the efficiency and generalization ability of the proposed model. Comparisons were conducted with other methods, including the Elman NN method, the GRO-based Elman method, and the weighted ensemble method [35]. The prediction results of the NN methods are shown in Figures 4-9. Table 2 shows the average RMSE of all the methods, which demonstrates the superiority of the proposed ensembled Elman NN model. With the same population size and iteration number, we also compared the convergence performance of GRO with the popular GOA [39] in one optimization process, as shown in Figure 10. GRO had better convergence performance, which provides more advantages in optimizing the Elman NN.

Table 2: Average prediction RMSE.

Method                       y1        y2        y3
Elman                        0.1658    0.1223    0.4748
Grasshopper-Elman            0.0712    0.0697    0.3472
Ensemble1                    0.0367    0.0365    0.2468
Proposed Ensemble method     0.0214    0.0234    0.1729

Figure 5: Prediction RMSE of y1.

Figure 6: Prediction of y2.

4.2. Analysis and Discussion. We established that the single Elman method had the worst performance compared with the other methods.

Figure 7: Prediction RMSE of y2.

Figure 8: Prediction of y3.

Meanwhile, the proposed GRO-based Elman method achieved better performance than the others. The simulation results also established that the proposed ensemble model outperformed all the other methods.

An ANN ensemble has better generalization ability and stability than a single network. In this study, we combined the Elman NN with GRO to yield an optimized Elman network ensemble model. The Elman NN supported by GRO proved its efficiency for prediction.

Figure 9: Prediction RMSE of y3.

Figure 10: Convergence performance comparison of GRO and GOA (objective function value versus iteration).

Using GRO to enhance the performance of the Elman NN so as to achieve the required prediction accuracy is the innovation of the current study. The improved weighted average was used to combine the outputs of the individual networks, with the weights determined by the training errors of the individual NNs, which improved the generalization ability and effectiveness of the model. However, optimizing and ensembling the Elman NNs required a long time. Therefore, the proposed method can only be used for offline prediction or in fields that do not have strict runtime requirements.

5. Conclusion

In this paper, a novel prediction algorithm based on an Elman NN ensemble model was proposed for quality prediction. The Elman NN parameters were optimized using the GRO method, and then a novel NN ensemble model was designed for quality prediction. The quality prediction model of a circular paraboloid antenna was considered as an example to verify the proposed algorithm. An Elman NN ensemble model describing the correlation between seven designing parameters and three quality characteristic indices was built. The simulation results proved that the proposed algorithm obtained better prediction accuracy.

Data Availability

The simulation data used to support the findings of this study are taken from reference [37].

Conflicts of Interest

No potential conflict of interest was reported by the authors

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant number 71403109, the Humanities and Social Science Foundation of Jiangsu Province under Grant number 16JYC002, and the Humanities and Social Science Foundation of the Ministry of Education under Grant number 19YJA880068.


References

[1] J. Yu, K. Chen, and M. M. Rashid, "A Bayesian model averaging based multi-kernel Gaussian process regression framework for nonlinear state estimation and quality prediction of multiphase batch processes with transient dynamics and uncertainty," Chemical Engineering Science, vol. 93, no. 4, pp. 96–109, 2013.
[2] S. Liu, H. Tai, Q. Ding, D. Li, L. Xu, and Y. Wei, "A hybrid approach of support vector regression with genetic algorithm optimization for aquaculture water quality prediction," Mathematical and Computer Modelling, vol. 58, no. 3-4, pp. 458–465, 2013.
[3] Y. Liu, F.-L. Wang, Y.-Q. Chang, and C. Li, "A SNCCDBAGG-based NN ensemble approach for quality prediction in injection molding process," IEEE Transactions on Automation Science and Engineering, vol. 8, no. 2, pp. 424–427, 2011.
[4] A. Russo, F. Raischel, and P. G. Lind, "Air quality prediction using optimal neural networks with stochastic variables," Atmospheric Environment, vol. 79, pp. 822–830, 2013.
[5] H. G. Han, Q. L. Chen, and J. F. Qiao, "An efficient self-organizing RBF neural network for water quality prediction," Neural Networks, vol. 24, no. 7, pp. 717–725, 2011.
[6] L. Xu and S. Liu, "Study of short-term water quality prediction model based on wavelet neural network," Mathematical & Computer Modelling, vol. 58, no. 3-4, pp. 807–813, 2013.
[7] B. Bidar, J. Sadeghi, F. Shahraki, and M. M. Khalilipour, "Data-driven soft sensor approach for online quality prediction using state dependent parameter models," Chemometrics and Intelligent Laboratory Systems, vol. 162, pp. 130–141, 2017.
[8] X. Li, L. Peng, Y. Hu, J. Shao, and T. Chi, "Deep learning architecture for air quality predictions," Environmental Science and Pollution Research, vol. 23, no. 22, pp. 22408–22417, 2016.
[9] Y. Bai, Z. Sun, J. Deng, L. Li, J. Long, and C. Li, "Manufacturing quality prediction using intelligent learning approaches: a comparative study," Sustainability, vol. 10, no. 1, article 85, 2017.
[10] J. Zheng, Z. Ge, Z. Song, and F. Gao, "Phase adaptive RVM model for quality prediction of multiphase batch processes with limited modeling batches," Chemometrics and Intelligent Laboratory Systems, vol. 156, pp. 81–88, 2016.
[11] M. Zhang, Z. Cai, and W. Cheng, "Multi-model quality prediction approach using fuzzy c-means clustering and support vector regression," Advances in Mechanical Engineering, vol. 9, no. 8, 2017.
[12] I. Svalina, K. Sabo, and G. Simunovic, "Machined surface quality prediction models based on moving least squares and moving least absolute deviations methods," The International Journal of Advanced Manufacturing Technology, vol. 57, no. 9-12, pp. 1099–1106, 2011.
[13] D. Wang, "Robust data-driven modeling approach for real-time final product quality prediction in batch process operation," IEEE Transactions on Industrial Informatics, vol. 7, no. 2, pp. 371–377, 2011.
[14] F. Chu, X. Cheng, R. Jia, F. Wang, and M. Lei, "Final quality prediction method for new batch processes based on improved JYKPLS process transfer model," Chemometrics and Intelligent Laboratory Systems, vol. 183, pp. 1–10, 2018.
[15] J. Liu, T. Liu, and J. Chen, "Quality prediction for multi-grade processes by just-in-time latent variable modeling with integration of common and special features," Chemical Engineering Science, vol. 191, pp. 31–41, 2018.
[16] V. García, J. S. Sánchez, L. A. Rodríguez-Pic et al., "Using regression models for predicting the product quality in a tubing extrusion process," Journal of Intelligent Manufacturing, pp. 1–10, 2018.
[17] H. S. Park, N. H. Tran, and D. S. Nguyen, "Development of a predictive system for SLM product quality," in Proceedings of the IOP Conference Series: Materials Science and Engineering, vol. 227, pp. 1747–8981, 2017.
[18] L. Zhu, G. He, and Z. Song, "Improved quality prediction model for multistage machining process based on geometric constraint equation," Chinese Journal of Mechanical Engineering, vol. 29, no. 2, pp. 430–438, 2016.
[19] R. Avila, B. Horn, E. Moriarty, R. Hodson, and E. Moltchanova, "Evaluating statistical model performance in water quality prediction," Journal of Environmental Management, vol. 206, pp. 910–919, 2018.
[20] A. Najah, A. El-Shafie, O. A. Karim, O. Jaafar, and A. H. El-Shafie, "An application of different artificial intelligences techniques for water quality prediction," International Journal of Physical Sciences, vol. 6, no. 22, pp. 5298–5308, 2011.
[21] H.-G. Han, L.-D. Wang, and J.-F. Qiao, "Efficient self-organizing multilayer neural network for nonlinear system modeling," Neural Networks, vol. 43, no. 7, pp. 22–32, 2013.
[22] K. Sheoran, P. Tomar, and R. Mishra, "Software quality prediction model with the aid of advanced neural network with HCS," Procedia Computer Science, vol. 92, pp. 418–424, 2016.
[23] X. Ma, X. Chen, and X. Zhang, "Non-interactive privacy-preserving neural network prediction," Information Sciences, vol. 481, pp. 507–519, 2019.
[24] R. Olawoyin, "Application of backpropagation artificial neural network prediction model for the PAH bioremediation of polluted soil," Chemosphere, vol. 161, pp. 145–150, 2016.
[25] H. Liang, Z. Li, and G. Li, "Neural network prediction model to achieve intelligent control of unbalanced drillings underpressure value," Multimedia Tools and Applications, pp. 1–29, 2018.
[26] J. Gu, G. Yin, P. Huang, J. Guo, and L. Chen, "An improved back propagation neural network prediction model for subsurface drip irrigation system," Computers and Electrical Engineering, vol. 60, pp. 58–65, 2017.
[27] S. Bardak, S. Tiryaki, G. Nemli, and A. Aydin, "Investigation and neural network prediction of wood bonding quality based on pressing conditions," International Journal of Adhesion and Adhesives, vol. 68, pp. 115–123, 2016.
[28] A. A. Heidari, H. Faris, I. Aljarah, and S. Mirjalili, "An efficient hybrid multilayer perceptron neural network with grasshopper optimization," Soft Computing, pp. 1–18, 2018.
[29] J. L. Elman, "Finding structure in time," Cognitive Science, vol. 14, no. 2, pp. 179–211, 1990.
[30] X. Gao, D. You, and S. Katayama, "Seam tracking monitoring based on adaptive Kalman filter embedded Elman neural network during high-power fiber laser welding," IEEE Transactions on Industrial Electronics, vol. 59, no. 11, pp. 4315–4325, 2012.
[31] D. P. Niu, F. L. Wang, L. L. Zhang, D. K. He, and M. X. Jia, "Neural network ensemble modeling for nosiheptide fermentation process based on partial least squares regression," Chemometrics and Intelligent Laboratory Systems, vol. 105, no. 1, pp. 125–130, 2011.
[32] S. Ding, Y. Zhang, J. Chen, and W. Jia, "Research on using genetic algorithms to optimize Elman neural networks," Neural Computing and Applications, vol. 23, no. 2, pp. 293–297, 2013.
[33] W. Z. Sun and J. S. Wang, "Elman neural network soft-sensor model of conversion velocity in polymerization process optimized by chaos whale optimization algorithm," IEEE Access, vol. 5, pp. 13062–13076, 2017.
[34] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.
[35] F. Xie, H. Fan, Y. Li, Z. Jiang, R. Meng, and A. Bovik, "Melanoma classification on dermoscopy images using a neural network ensemble model," IEEE Transactions on Medical Imaging, vol. 36, no. 3, pp. 849–858, 2017.
[36] R. Chen and J. Yu, "An improved bagging neural network ensemble algorithm and its application," in Proceedings of the 3rd International Conference on Natural Computation (ICNC), vol. 5, pp. 730–734, 2007.
[37] A. Wang, T. Jiang, and G. Liu, Intelligent Design, Higher Education Press, 2008.
[38] V. Srinivasan, C. Eswaran, and N. Sriraam, "Approximate entropy-based epileptic EEG detection using artificial neural networks," IEEE Transactions on Information Technology in Biomedicine, vol. 11, no. 3, pp. 288–295, 2007.
[39] K. Andras and A. Janos, "Redesign of the supply of mobile mechanics based on a novel genetic optimization algorithm using Google Maps API," Engineering Applications of Artificial Intelligence, vol. 38, pp. 122–130, 2015.

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 5: Quality Prediction Model Based on Novel Elman Neural Network …downloads.hindawi.com/journals/complexity/2019/9852134.pdf · 2019. 7. 30. · model for subsurface drip irrigation

Complexity 5

Initialize Parameters

Feed Input Data

Elman NN

Training

Convergence

Use GRO-Elman-NN withoptimized parameters for testing

Stop

NOApply GRO to adjust

parameters

NO

YES

Figure 2 Elman NN optimization process using GRO

GRO optimizes a problem using individual grasshopperswhich move around in the exploration space accordingto a simple mathematical model A grasshopper positionrepresents one of the solutions for determining input weightsand output thresholds for an Elman NN The length of thegrasshopper position equals the dimensionality of the inputweights and output layer thresholdsTheobjective of theGROis to determine an input weight and output threshold vectorthat can result in the minimum RMSE from the NN

The RMSE is designed as the objective function thus theGRO is engaged to minimize the RMSE during the trainingphase It is calculated as the difference between the valuesanticipated by the predicted value and the true value TheRMSE of the predictionwith respect to the computed variableV119888119896 is determined as the square root of the mean-squarederror and is given by

119877119872119878119864 = radicsum119899119896=1 (V119889119896 minus V119888119896)2119899 (5)

where V119889119896 denotes the originally observed value of the 119896thdata instance and V119888119896 denotes the value predicted by the NN

Convergence is checked after each iteration of GROinstead of each epoch The objective of the optimization is todetermine an input weight and output threshold vector thatcan meet the requirement of RMSE which is written as119877119872119878119864 lt 119877119872119878119864119879 (6)

where 119877119872119878119864119879 is the desired thresholdAs shown in Figure 2 if the convergence condition ismet

the optimization stops

34 NN Ensemble Model for Quality Prediction An ANNensemble is a composite model of multiple NNs with bettergeneralization ability and stability than a single networkWhen using the improved Elman NN to construct the NNensemble first the individual NNs are produced and thenthe outputs of the individual NNs are combined

Bagging is a statistical resampling technique for generat-ing training data for each individual in an ensemble model[35 36] Each training dataset is drawn randomly with areplacement from the original training set Additionally thesize of the training set is typically the same as that of theoriginal training set The bagging technique increases the

6 Complexity

10000

2200

4920

7600

650

1

2

3

4

14

(3)

10

(1)

(2)

(4)

(7)

5

11 (5)

(6)

(8)

(10)7

12

8

(9)

9

13

6

Figure 3 Layout of the main beamrsquos web members

difference between individuals using bootstrap sampling andimproves the generalization ability of theNNensembleThenthe outputs of the individual networks are weighted to formthe overall output 119874 of the ensemble

119874 (119868 119886) = 119885sum119894=1

119886119894119874119894 (119868119894) (7)

where 119868119894 is the input vector 119886119894 is the weight applied to theoutput of the 119894th individual network and 119885 is the number ofNNs in the ensemble model

In this algorithm 119886119894 is designed according to trainingerror 119890119894 for each NN

119886119894 = (sum (119864) minus 119890119894)sum (119864) (8)

where 119864 = [1198901 1198902 119890119901] and 119904119906119898( ) is the summationfunction of a vector

4 Simulation of QualityPrediction and Analysis

41 Simulation To verify the effectiveness of the proposedapproach computer simulations were conducted A 10 mdiameter circular paraboloid antenna is considered as anexample in this section The quality prediction model wasestablished on the basis of relationships between the design-ing parameters and quality characteristics indices A circular

paraboloid antenna is a structure with high precision It iswidely used in areas such as aeronautics and astronauticsradar technique and satellite communication The require-ments of its quality characteristics include reflector accu-racy self-weight deformation requirement and homologousdesign requirement As is shown in Figure 3 the sectioncharacteristics of the beam (1) (2) (3) (6) and (7) and theheight of the beam (4) and (8) affect the overall quality ofthe antenna greatly ℎ119860 and ℎ119861 were introduced to denote theordinates of point 6 and point 8 respectively which are theheight of the beam (4) and (8) Accordingly the designingparameters for the quality design of the circular paraboloidantenna are determined which are the section characteristicsof the beam (1) (2) (3) (6) and (7) ℎ119860 and ℎ119861 denoted by1199091 1199092 1199093 1199094 1199095 1199096 1199097 respectively (1199091 1199092 1199093 1199094 1199095 1199096 1199097)is identified as the input vector of the quality predictionmodel to be established Simultaneously the indices of thequality characteristics of the circular paraboloid antenna areidentified as mean square error (MSE) dead weight andfundamental frequency denoted by 1199101 1199102 1199103 respectivelywhich is the output vector of the model

The fractional factorial design of 2119896minus119901 was selected toconduct experiments [37]

In the simulation 119896 = 7 119901 = 2 and 32 sets of datawere divided into two parts a training set and a testing setwhich consisted of 25 and 7 samples respectively There were7 input nodes 3 hidden nodes 5 context nodes and 3 outputnodes in the Elman NN Therefore 21 input weights and 3output layer thresholds were optimized During NN training

Complexity 7

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

045

05

055

06

065

07

075

08

S 1

2 3 4 5 6 71Testing points

Figure 4 Prediction of y1

the epoch size was set to 1000The grasshopper number (sizeof swarm) N and iteration number Lwere designed to be 200and 200 respectively Four ElmanNNswere used for buildingthe ensemble model Due to the characteristics of artificialintelligence optimization and the bagging technique for theensemble the 4 Elman NNs are designed to be different witheach other Although only 25 sets of data are used in thesimulation they are employed to train the ElmanNN4 timeswhich can approximately suppos that there are 100 sets of dataused in our simulation Based on the excellent performance ofElmanNN the small-size data training problem is also solvedin Vairavan et al [38] which uses 60 sets of data to train theElman NN and obtains higher detection accuracy

Three levels were determined for each designing param-eter Detailed setting values are shown in Table 1 Outerdiameters of the steel pipe of beam (1) and (2) were set to50 mm and the thickness varied from 2 mm to 4 mm Thethickness of 3 mm was set to be level 0 whereas 2 mmand 4 mm were set to levels -1 and level 1 respectively Theouter diameters of the beam (3) and (7) were set to 65mm and the thickness varied from 3 mm to 5 mm Thethickness of 4 mm was set to be level 0 whereas 3 mmand 5 mm were set to levels -1 and level 1 respectively Theouter diameter of the beam (6) was set to 40 mm and thethickness varied from 1 mm to 3 mm The thickness of 2mm was set to be level 0 whereas 1 mm and 3 mm wereset to levels -1 and level 1 respectively Test data acquiredthrough experiments included combinations of designingparameters from the 32 tests and the corresponding valuesof the indices for the quality characteristics Part of the data

Table 1 Levels of designing parameters and the correspondingactual values (unit mm)

Designing parameters Levels-1 0 11199091 minus (1) 12060150 times 2 12060150 times 3 12060150 times 41199092 minus (2) 12060150 times 2 12060150 times 3 12060150 times 41199093 minus (3) 12060165 times 3 12060165 times 4 12060165 times 51199094 minus (6) 12060140 times 1 12060140 times 2 12060140 times 31199095 minus (7) 12060165 times 3 12060165 times 4 12060165 times 51199096 minus ℎ119860 1110 710 3101199097 minus ℎ119861 1990 1590 1190

(80 of the samples) was used to train the NN whereasthe remainder (20 of the samples) was used to verify theefficiency and generalization ability of the proposed modelComparisons were conducted with other methods includingthe Elman NN method GRO-based Elman method andweighted ensemble method [35]The prediction results basedon the NNmethods are shown in Figures 4ndash9 Table 2 showsthe average RMSE of all the methods which proves thesuperiority of the proposed ensembled Elman NN modelWith the same population size and iteration number wealso compared the convergence performance of GRO withthe popular GOA [39] in one optimization process whichis shown in Figure 10 It shows that GRO had better conver-gence performance which can provide more advantages inoptimizing the Elman NN

8 Complexity

Table 2 Average prediction RMSE

Methods y1 y2 y3Elman 01658 01223 04748Grasshopper-Elman 00712 00697 03472Ensemble1 00367 00365 02468Proposed Ensemble method 00214 00234 01729

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

2 3 4 5 6 71Testing points

0

005

01

015

02

025

03

035

RMSE

ofS

1

Figure 5 Prediction RMSE of y1

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

31

315

32

325

33

335

34

345

35

355

36

S 2

2 3 4 5 6 71Testing points

Figure 6 Prediction of y2

42 Analysis and Discussion We established that the singleElman method had the worst performance compared with

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

002

004

006

008

01

012

014

016

018

02

RMSE

ofS

2

2 3 4 5 6 71Testing points

Figure 7 Prediction RMSE of y2

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

37

375

38

385

39

395

S 3

2 3 4 5 6 71Testing points

Figure 8 Prediction of y3

the other methods Meanwhile the proposed GRO-basedElman method achieved better performance than othersThe simulation results also established that the proposedensemble model outperformed all the other methods

An ANN ensemble has better generalization ability andstability than a single network In this study we combinedElman NN with GRO to yield an optimized Elman networkensemble model The Elman NN supported by GRO provedits efficiency for prediction Using GRO to enhance theperformance of Elman NN to achieve the required prediction

Complexity 9

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

05

1

15

RMSE

ofS

3

2 3 4 5 6 71Testing points

Figure 9 Prediction RMSE of y3

Obj

ectiv

e fun

ctio

n

GROGOA

0

02

04

06

08

1

12

14

20 40 60 80 100 120 140 160 180 2000Iteration time

Figure 10 Convergence performance comparison

is the innovation of the current studyThe improved weightedaverage was used to combine the outputs of the individualnetworks and the weights were determined by the trainingerrors of the individual NNs which improved the general-ization ability and effectiveness of the model However itrequired a long time to optimize and ensemble the ElmanNN Therefore the proposed method can only be used inthe offline prediction field or fields that do not have a strictruntime requirement

5 Conclusion

In this paper a novel prediction algorithm based on theElman NN ensemble model was proposed for quality predic-tion Elman NN parameters were optimized using the GROmethod and then a novel NN ensemble model was designedfor quality prediction The quality prediction model for acircular paraboloid antenna was considered as an exampleto verify the proposed algorithm An Elman NN ensemblemodel that described the correlation between seven design-ing parameters and three indices of quality characteristicswas built The simulation results proved that the proposedalgorithm obtained better prediction accuracy

Data Availability

The simulation data used to support the findings of this studyare from the reference [37]

Conflicts of Interest

No potential conflict of interest was reported by the authors

Acknowledgments

This work was supported by the National Natural Sci-ence Foundation of China under Grant number 71403109the Humanities and Social Science Foundation of JiangsuProvince underGrant number 16JYC002 and theHumanitiesand Social Science Foundation of Ministry of Educationunder Grant number 19YJA880068

10 Complexity

References

[1] J Yu K Chen andMM Rashid ldquoA Bayesian model averagingbased multi-kernel Gaussian process regression framework fornonlinear state estimation and quality prediction of multiphasebatch processes with transient dynamics and uncertaintyrdquoChemical Engineering Science vol 93 no 4 pp 96ndash109 2013

[2] S Liu H Tai Q Ding D Li L Xu and Y Wei ldquoA hybridapproach of support vector regression with genetic algorithmoptimization for aquaculture water quality predictionrdquoMathe-matical and Computer Modelling vol 58 no 3-4 pp 458ndash4652013

[3] Y Liu F-L Wang Y-Q Chang and C Li ldquoA SNCCDBAGG-basedNNensemble approach for quality prediction in injectionmolding processrdquo IEEETransactions onAutomation Science andEngineering vol 8 no 2 pp 424ndash427 2011

[4] A Russo F Raischel and P G Lind ldquoAir quality predictionusing optimal neural networks with stochastic variablesrdquoAtmo-spheric Environment vol 79 pp 822ndash830 2013

[5] H G Han Q L Chen and J F Qiao ldquoAn efficient self-organizing RBF neural network for water quality predictionrdquoNeural Networks vol 24 no 7 pp 717ndash725 2011

[6] L Xu and S Liu ldquoStudy of short-term water quality predictionmodel based on wavelet neural networkrdquo Mathematical ampComputer Modelling vol 58 no 3-4 pp 807ndash813 2013

[7] B Bidar J Sadeghi F Shahraki and M M KhalilipourldquoData-driven soft sensor approach for online quality predictionusing state dependent parameter modelsrdquo Chemometrics andIntelligent Laboratory Systems vol 162 pp 130ndash141 2017

[8] X Li L Peng Y Hu J Shao and T Chi ldquoDeep learningarchitecture for air quality predictionsrdquo Environmental Scienceand Pollution Research vol 23 no 22 pp 22408ndash22417 2016

[9] Y Bai Z Sun J Deng L Li J Long and C Li ldquoManufacturingquality prediction using intelligent learning approaches acomparative studyrdquo Sustainability vol 10 no 1 article 85 2017

[10] J Zheng Z Ge Z Song and F Gao ldquoPhase adaptive RVMmodel for quality prediction of multiphase batch processeswith limited modeling batchesrdquo Chemometrics and IntelligentLaboratory Systems vol 156 pp 81ndash88 2016

[11] M Zhang Z Cai and W Cheng ldquoMulti-model quality pre-diction approach using fuzzy c-means clustering and supportvector regressionrdquo Advances in Mechanical Engineering vol 9no 8 2017

[12] I Svalina K Sabo and G Simunovic ldquoMachined surfacequality prediction models based on moving least squares andmoving least absolute deviations methodsrdquo e InternationalJournal of Advanced Manufacturing Technology vol 57 no 9-12 pp 1099ndash1106 2011

[13] DWang ldquoRobust data-drivenmodeling approach for real-timefinal product quality prediction in batch process operationrdquoIEEETransactions on Industrial Informatics vol 7 no 2 pp 371ndash377 2011

[14] F Chu X Cheng R Jia F Wang and M Lei ldquoFinal qualityprediction method for new batch processes based on improvedJYKPLS process transfer modelrdquo Chemometrics and IntelligentLaboratory Systems vol 183 pp 1ndash10 2018

[15] J Liu T Liu and J Chen ldquoQuality prediction for multi-gradeprocesses by just-in-time latent variable modeling with inte-gration of common and special featuresrdquo Chemical EngineeringScience vol 191 pp 31ndash41 2018

[16] V Garcıa J S Sanchez L A Rodrıguez-Pic et al ldquoUsingregressionmodels for predicting the product quality in a tubing

extrusion processrdquo Journal of Intelligent Manufacturing pp 1ndash10 2018

[17] H S Park N H Tran and D S Nguyen ldquoDevelopment of apredictive system for SLMproduct qualityrdquo in Proceedings of theIOP Conference Series Materials Science and Engineering vol227 pp 1747ndash8981 2017

[18] L ZhuG He and Z Song ldquoImproved quality predictionmodelformultistagemachiningprocess basedon geometric constraintequationrdquo Chinese Journal of Mechanical Engineering vol 29no 2 pp 430ndash438 2016

[19] R Avila B Horn E Moriarty R Hodson and E MoltchanovaldquoEvaluating statistical model performance in water qualitypredictionrdquo Journal of EnvironmentalManagement vol 206 pp910ndash919 2018

[20] A Najah A El-Shafie O A Karim O Jaafar and A HEl-Shafie ldquoAn application of different artificial intelligencestechniques for water quality predictionrdquo International Journalof Physical Sciences vol 6 no 22 pp 5298ndash5308 2011

[21] H-GHan L-DWang and J-FQiao ldquoEfficient self-organizingmultilayer neural network for nonlinear system modelingrdquoNeural Networks vol 43 no 7 pp 22ndash32 2013

[22] K Sheoran P Tomar and R Mishra ldquoSoftware quality predic-tionmodel with the aid of advanced neural network with HCSrdquoProcedia Computer Science vol 92 pp 418ndash424 2016

[23] X Ma X Chen and X Zhang ldquoNon-interactive privacy-preserving neural network predictionrdquo Information Sciencesvol 481 pp 507ndash519 2019

[24] R Olawoyin ldquoApplication of backpropagation artificial neuralnetwork prediction model for the PAH bioremediation ofpolluted soilrdquo Chemosphere vol 161 pp 145ndash150 2016

[25] H Liang Z Li and G Li ldquoNeural network prediction model toachieve intelligent control of unbalanced drillings underpres-sure valuerdquoMultimedia Tools and Applications pp 1ndash29 2018

[26] J Gu G Yin P Huang J Guo and L Chen ldquoAn improved backpropagation neural network prediction model for subsurfacedrip irrigation systemrdquo Computers and Electrical Engineeringvol 60 pp 58ndash65 2017

[27] S Bardak S Tiryaki G Nemli and A Aydin ldquoInvestigationand neural network prediction of wood bonding quality basedon pressing conditionsrdquo International Journal of Adhesion andAdhesives vol 68 pp 115ndash123 2016

[28] A A Heidari H Faris I Aljarah and S Mirjalili ldquoAn efficienthybrid multilayer perceptron neural network with grasshopperoptimizationrdquo So Computing pp 1ndash18 2018

[29] J L Elman ldquoFinding structure in timerdquo Cognitive Science vol14 no 2 pp 179ndash211 1990

[30] X Gao D You and S Katayama ldquoSeam tracking monitoringbased on adaptive Kalman filter embedded elman neural net-work during high-power fiber laser weldingrdquo IEEETransactionson Industrial Electronics vol 59 no 11 pp 4315ndash4325 2012

[31] D P Niu F LWang L L Zhang D K He andM X Jia ldquoNeu-ral network ensemble modeling for nosiheptide fermentationprocess based on partial least squares regressionrdquoChemometricsIntelligent Laboratory Systems vol 105 no 1 pp 125ndash130 2011

[32] S Ding Y Zhang J Chen and W Jia ldquoResearch on usinggenetic algorithms to optimize Elman neural networksrdquoNeuralComputing and Applications vol 23 no 2 pp 293ndash297 2013

[33] W Z Sun and J S Wang ldquoElman neural network soft-sensor model of conversion velocity in polymerization processoptimized by chaoswhale optimization algorithmrdquo IEEEAccessvol 5 pp 13062ndash13076 2017

Complexity 11

[34] S Saremi S Mirjalili and A Lewis ldquoGrasshopper optimisationalgorithm theory and applicationrdquo Advances in EngineeringSoware vol 105 pp 30ndash47 2017

[35] F XieH Fan Y Li Z Jiang RMeng andA Bovik ldquoMelanomaclassification on dermoscopy images using a neural networkensemble modelrdquo IEEE Transactions on Medical Imaging vol36 no 3 pp 849ndash858 2017

[36] R Chen and J Yu ldquoAn improved bagging neural networkensemble algorithm and its applicationrdquo in Proceedings of the3rd International Conference on Natural Computation (ICNC)vol 5 pp 730ndash734 2007

[37] A Wang T Jiang and G Liu Intelligent Design HigherEducation Press 2008

[38] V Srinivasan C Eswaran and N Sriraam ldquoApproximateentropy-based epileptic EEG detection using artificial neuralnetworksrdquo IEEE Transactions on Information Technology inBiomedicine vol 11 no 3 pp 288ndash295 2007

[39] K Andras and A Janos ldquoRedesign of the supply of mobilemechanics based on a novel genetic optimization algorithmusing Google Maps APIrdquo Engineering Applications of ArtificialIntelligence vol 38 pp 122ndash130 2015

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 6: Quality Prediction Model Based on Novel Elman Neural Network …downloads.hindawi.com/journals/complexity/2019/9852134.pdf · 2019. 7. 30. · model for subsurface drip irrigation

6 Complexity

10000

2200

4920

7600

650

1

2

3

4

14

(3)

10

(1)

(2)

(4)

(7)

5

11 (5)

(6)

(8)

(10)7

12

8

(9)

9

13

6

Figure 3 Layout of the main beamrsquos web members

difference between individuals using bootstrap sampling andimproves the generalization ability of theNNensembleThenthe outputs of the individual networks are weighted to formthe overall output 119874 of the ensemble

119874 (119868 119886) = 119885sum119894=1

119886119894119874119894 (119868119894) (7)

where 119868119894 is the input vector 119886119894 is the weight applied to theoutput of the 119894th individual network and 119885 is the number ofNNs in the ensemble model

In this algorithm 119886119894 is designed according to trainingerror 119890119894 for each NN

119886119894 = (sum (119864) minus 119890119894)sum (119864) (8)

where 119864 = [1198901 1198902 119890119901] and 119904119906119898( ) is the summationfunction of a vector

4 Simulation of QualityPrediction and Analysis

41 Simulation To verify the effectiveness of the proposedapproach computer simulations were conducted A 10 mdiameter circular paraboloid antenna is considered as anexample in this section The quality prediction model wasestablished on the basis of relationships between the design-ing parameters and quality characteristics indices A circular

paraboloid antenna is a structure with high precision It iswidely used in areas such as aeronautics and astronauticsradar technique and satellite communication The require-ments of its quality characteristics include reflector accu-racy self-weight deformation requirement and homologousdesign requirement As is shown in Figure 3 the sectioncharacteristics of the beam (1) (2) (3) (6) and (7) and theheight of the beam (4) and (8) affect the overall quality ofthe antenna greatly ℎ119860 and ℎ119861 were introduced to denote theordinates of point 6 and point 8 respectively which are theheight of the beam (4) and (8) Accordingly the designingparameters for the quality design of the circular paraboloidantenna are determined which are the section characteristicsof the beam (1) (2) (3) (6) and (7) ℎ119860 and ℎ119861 denoted by1199091 1199092 1199093 1199094 1199095 1199096 1199097 respectively (1199091 1199092 1199093 1199094 1199095 1199096 1199097)is identified as the input vector of the quality predictionmodel to be established Simultaneously the indices of thequality characteristics of the circular paraboloid antenna areidentified as mean square error (MSE) dead weight andfundamental frequency denoted by 1199101 1199102 1199103 respectivelywhich is the output vector of the model

The fractional factorial design of 2119896minus119901 was selected toconduct experiments [37]

In the simulation 119896 = 7 119901 = 2 and 32 sets of datawere divided into two parts a training set and a testing setwhich consisted of 25 and 7 samples respectively There were7 input nodes 3 hidden nodes 5 context nodes and 3 outputnodes in the Elman NN Therefore 21 input weights and 3output layer thresholds were optimized During NN training

Complexity 7

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

045

05

055

06

065

07

075

08

S 1

2 3 4 5 6 71Testing points

Figure 4 Prediction of y1

the epoch size was set to 1000The grasshopper number (sizeof swarm) N and iteration number Lwere designed to be 200and 200 respectively Four ElmanNNswere used for buildingthe ensemble model Due to the characteristics of artificialintelligence optimization and the bagging technique for theensemble the 4 Elman NNs are designed to be different witheach other Although only 25 sets of data are used in thesimulation they are employed to train the ElmanNN4 timeswhich can approximately suppos that there are 100 sets of dataused in our simulation Based on the excellent performance ofElmanNN the small-size data training problem is also solvedin Vairavan et al [38] which uses 60 sets of data to train theElman NN and obtains higher detection accuracy

Three levels were determined for each designing param-eter Detailed setting values are shown in Table 1 Outerdiameters of the steel pipe of beam (1) and (2) were set to50 mm and the thickness varied from 2 mm to 4 mm Thethickness of 3 mm was set to be level 0 whereas 2 mmand 4 mm were set to levels -1 and level 1 respectively Theouter diameters of the beam (3) and (7) were set to 65mm and the thickness varied from 3 mm to 5 mm Thethickness of 4 mm was set to be level 0 whereas 3 mmand 5 mm were set to levels -1 and level 1 respectively Theouter diameter of the beam (6) was set to 40 mm and thethickness varied from 1 mm to 3 mm The thickness of 2mm was set to be level 0 whereas 1 mm and 3 mm wereset to levels -1 and level 1 respectively Test data acquiredthrough experiments included combinations of designingparameters from the 32 tests and the corresponding valuesof the indices for the quality characteristics Part of the data

Table 1 Levels of designing parameters and the correspondingactual values (unit mm)

Designing parameters Levels-1 0 11199091 minus (1) 12060150 times 2 12060150 times 3 12060150 times 41199092 minus (2) 12060150 times 2 12060150 times 3 12060150 times 41199093 minus (3) 12060165 times 3 12060165 times 4 12060165 times 51199094 minus (6) 12060140 times 1 12060140 times 2 12060140 times 31199095 minus (7) 12060165 times 3 12060165 times 4 12060165 times 51199096 minus ℎ119860 1110 710 3101199097 minus ℎ119861 1990 1590 1190

(80 of the samples) was used to train the NN whereasthe remainder (20 of the samples) was used to verify theefficiency and generalization ability of the proposed modelComparisons were conducted with other methods includingthe Elman NN method GRO-based Elman method andweighted ensemble method [35]The prediction results basedon the NNmethods are shown in Figures 4ndash9 Table 2 showsthe average RMSE of all the methods which proves thesuperiority of the proposed ensembled Elman NN modelWith the same population size and iteration number wealso compared the convergence performance of GRO withthe popular GOA [39] in one optimization process whichis shown in Figure 10 It shows that GRO had better conver-gence performance which can provide more advantages inoptimizing the Elman NN

8 Complexity

Table 2 Average prediction RMSE

Methods y1 y2 y3Elman 01658 01223 04748Grasshopper-Elman 00712 00697 03472Ensemble1 00367 00365 02468Proposed Ensemble method 00214 00234 01729

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

2 3 4 5 6 71Testing points

0

005

01

015

02

025

03

035

RMSE

ofS

1

Figure 5 Prediction RMSE of y1

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

31

315

32

325

33

335

34

345

35

355

36

S 2

2 3 4 5 6 71Testing points

Figure 6 Prediction of y2

42 Analysis and Discussion We established that the singleElman method had the worst performance compared with

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

002

004

006

008

01

012

014

016

018

02

RMSE

ofS

2

2 3 4 5 6 71Testing points

Figure 7 Prediction RMSE of y2

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

37

375

38

385

39

395

S 3

2 3 4 5 6 71Testing points

Figure 8 Prediction of y3

the other methods Meanwhile the proposed GRO-basedElman method achieved better performance than othersThe simulation results also established that the proposedensemble model outperformed all the other methods

An ANN ensemble has better generalization ability andstability than a single network In this study we combinedElman NN with GRO to yield an optimized Elman networkensemble model The Elman NN supported by GRO provedits efficiency for prediction Using GRO to enhance theperformance of Elman NN to achieve the required prediction

Complexity 9

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

05

1

15

RMSE

ofS

3

2 3 4 5 6 71Testing points

Figure 9 Prediction RMSE of y3

Obj

ectiv

e fun

ctio

n

GROGOA

0

02

04

06

08

1

12

14

20 40 60 80 100 120 140 160 180 2000Iteration time

Figure 10 Convergence performance comparison

is the innovation of the current studyThe improved weightedaverage was used to combine the outputs of the individualnetworks and the weights were determined by the trainingerrors of the individual NNs which improved the general-ization ability and effectiveness of the model However itrequired a long time to optimize and ensemble the ElmanNN Therefore the proposed method can only be used inthe offline prediction field or fields that do not have a strictruntime requirement

5 Conclusion

In this paper a novel prediction algorithm based on theElman NN ensemble model was proposed for quality predic-tion Elman NN parameters were optimized using the GROmethod and then a novel NN ensemble model was designedfor quality prediction The quality prediction model for acircular paraboloid antenna was considered as an exampleto verify the proposed algorithm An Elman NN ensemblemodel that described the correlation between seven design-ing parameters and three indices of quality characteristicswas built The simulation results proved that the proposedalgorithm obtained better prediction accuracy

Data Availability

The simulation data used to support the findings of this studyare from the reference [37]

Conflicts of Interest

No potential conflict of interest was reported by the authors

Acknowledgments

This work was supported by the National Natural Sci-ence Foundation of China under Grant number 71403109the Humanities and Social Science Foundation of JiangsuProvince underGrant number 16JYC002 and theHumanitiesand Social Science Foundation of Ministry of Educationunder Grant number 19YJA880068

10 Complexity

References

[1] J. Yu, K. Chen, and M. M. Rashid, "A Bayesian model averaging based multi-kernel Gaussian process regression framework for nonlinear state estimation and quality prediction of multiphase batch processes with transient dynamics and uncertainty," Chemical Engineering Science, vol. 93, no. 4, pp. 96–109, 2013.

[2] S. Liu, H. Tai, Q. Ding, D. Li, L. Xu, and Y. Wei, "A hybrid approach of support vector regression with genetic algorithm optimization for aquaculture water quality prediction," Mathematical and Computer Modelling, vol. 58, no. 3-4, pp. 458–465, 2013.

[3] Y. Liu, F.-L. Wang, Y.-Q. Chang, and C. Li, "A SNCCDBAGG-based NN ensemble approach for quality prediction in injection molding process," IEEE Transactions on Automation Science and Engineering, vol. 8, no. 2, pp. 424–427, 2011.

[4] A. Russo, F. Raischel, and P. G. Lind, "Air quality prediction using optimal neural networks with stochastic variables," Atmospheric Environment, vol. 79, pp. 822–830, 2013.

[5] H. G. Han, Q. L. Chen, and J. F. Qiao, "An efficient self-organizing RBF neural network for water quality prediction," Neural Networks, vol. 24, no. 7, pp. 717–725, 2011.

[6] L. Xu and S. Liu, "Study of short-term water quality prediction model based on wavelet neural network," Mathematical & Computer Modelling, vol. 58, no. 3-4, pp. 807–813, 2013.

[7] B. Bidar, J. Sadeghi, F. Shahraki, and M. M. Khalilipour, "Data-driven soft sensor approach for online quality prediction using state dependent parameter models," Chemometrics and Intelligent Laboratory Systems, vol. 162, pp. 130–141, 2017.

[8] X. Li, L. Peng, Y. Hu, J. Shao, and T. Chi, "Deep learning architecture for air quality predictions," Environmental Science and Pollution Research, vol. 23, no. 22, pp. 22408–22417, 2016.

[9] Y. Bai, Z. Sun, J. Deng, L. Li, J. Long, and C. Li, "Manufacturing quality prediction using intelligent learning approaches: a comparative study," Sustainability, vol. 10, no. 1, article 85, 2017.

[10] J. Zheng, Z. Ge, Z. Song, and F. Gao, "Phase adaptive RVM model for quality prediction of multiphase batch processes with limited modeling batches," Chemometrics and Intelligent Laboratory Systems, vol. 156, pp. 81–88, 2016.

[11] M. Zhang, Z. Cai, and W. Cheng, "Multi-model quality prediction approach using fuzzy c-means clustering and support vector regression," Advances in Mechanical Engineering, vol. 9, no. 8, 2017.

[12] I. Svalina, K. Sabo, and G. Simunovic, "Machined surface quality prediction models based on moving least squares and moving least absolute deviations methods," The International Journal of Advanced Manufacturing Technology, vol. 57, no. 9-12, pp. 1099–1106, 2011.

[13] D. Wang, "Robust data-driven modeling approach for real-time final product quality prediction in batch process operation," IEEE Transactions on Industrial Informatics, vol. 7, no. 2, pp. 371–377, 2011.

[14] F. Chu, X. Cheng, R. Jia, F. Wang, and M. Lei, "Final quality prediction method for new batch processes based on improved JYKPLS process transfer model," Chemometrics and Intelligent Laboratory Systems, vol. 183, pp. 1–10, 2018.

[15] J. Liu, T. Liu, and J. Chen, "Quality prediction for multi-grade processes by just-in-time latent variable modeling with integration of common and special features," Chemical Engineering Science, vol. 191, pp. 31–41, 2018.

[16] V. García, J. S. Sánchez, L. A. Rodríguez-Pic, et al., "Using regression models for predicting the product quality in a tubing extrusion process," Journal of Intelligent Manufacturing, pp. 1–10, 2018.

[17] H. S. Park, N. H. Tran, and D. S. Nguyen, "Development of a predictive system for SLM product quality," in Proceedings of the IOP Conference Series: Materials Science and Engineering, vol. 227, pp. 1747–8981, 2017.

[18] L. Zhu, G. He, and Z. Song, "Improved quality prediction model for multistage machining process based on geometric constraint equation," Chinese Journal of Mechanical Engineering, vol. 29, no. 2, pp. 430–438, 2016.

[19] R. Avila, B. Horn, E. Moriarty, R. Hodson, and E. Moltchanova, "Evaluating statistical model performance in water quality prediction," Journal of Environmental Management, vol. 206, pp. 910–919, 2018.

[20] A. Najah, A. El-Shafie, O. A. Karim, O. Jaafar, and A. H. El-Shafie, "An application of different artificial intelligences techniques for water quality prediction," International Journal of Physical Sciences, vol. 6, no. 22, pp. 5298–5308, 2011.

[21] H.-G. Han, L.-D. Wang, and J.-F. Qiao, "Efficient self-organizing multilayer neural network for nonlinear system modeling," Neural Networks, vol. 43, no. 7, pp. 22–32, 2013.

[22] K. Sheoran, P. Tomar, and R. Mishra, "Software quality prediction model with the aid of advanced neural network with HCS," Procedia Computer Science, vol. 92, pp. 418–424, 2016.

[23] X. Ma, X. Chen, and X. Zhang, "Non-interactive privacy-preserving neural network prediction," Information Sciences, vol. 481, pp. 507–519, 2019.

[24] R. Olawoyin, "Application of backpropagation artificial neural network prediction model for the PAH bioremediation of polluted soil," Chemosphere, vol. 161, pp. 145–150, 2016.

[25] H. Liang, Z. Li, and G. Li, "Neural network prediction model to achieve intelligent control of unbalanced drillings underpressure value," Multimedia Tools and Applications, pp. 1–29, 2018.

[26] J. Gu, G. Yin, P. Huang, J. Guo, and L. Chen, "An improved back propagation neural network prediction model for subsurface drip irrigation system," Computers and Electrical Engineering, vol. 60, pp. 58–65, 2017.

[27] S. Bardak, S. Tiryaki, G. Nemli, and A. Aydin, "Investigation and neural network prediction of wood bonding quality based on pressing conditions," International Journal of Adhesion and Adhesives, vol. 68, pp. 115–123, 2016.

[28] A. A. Heidari, H. Faris, I. Aljarah, and S. Mirjalili, "An efficient hybrid multilayer perceptron neural network with grasshopper optimization," Soft Computing, pp. 1–18, 2018.

[29] J. L. Elman, "Finding structure in time," Cognitive Science, vol. 14, no. 2, pp. 179–211, 1990.

[30] X. Gao, D. You, and S. Katayama, "Seam tracking monitoring based on adaptive Kalman filter embedded Elman neural network during high-power fiber laser welding," IEEE Transactions on Industrial Electronics, vol. 59, no. 11, pp. 4315–4325, 2012.

[31] D. P. Niu, F. L. Wang, L. L. Zhang, D. K. He, and M. X. Jia, "Neural network ensemble modeling for nosiheptide fermentation process based on partial least squares regression," Chemometrics and Intelligent Laboratory Systems, vol. 105, no. 1, pp. 125–130, 2011.

[32] S. Ding, Y. Zhang, J. Chen, and W. Jia, "Research on using genetic algorithms to optimize Elman neural networks," Neural Computing and Applications, vol. 23, no. 2, pp. 293–297, 2013.

[33] W. Z. Sun and J. S. Wang, "Elman neural network soft-sensor model of conversion velocity in polymerization process optimized by chaos whale optimization algorithm," IEEE Access, vol. 5, pp. 13062–13076, 2017.

[34] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[35] F. Xie, H. Fan, Y. Li, Z. Jiang, R. Meng, and A. Bovik, "Melanoma classification on dermoscopy images using a neural network ensemble model," IEEE Transactions on Medical Imaging, vol. 36, no. 3, pp. 849–858, 2017.

[36] R. Chen and J. Yu, "An improved bagging neural network ensemble algorithm and its application," in Proceedings of the 3rd International Conference on Natural Computation (ICNC), vol. 5, pp. 730–734, 2007.

[37] A. Wang, T. Jiang, and G. Liu, Intelligent Design, Higher Education Press, 2008.

[38] V. Srinivasan, C. Eswaran, and N. Sriraam, "Approximate entropy-based epileptic EEG detection using artificial neural networks," IEEE Transactions on Information Technology in Biomedicine, vol. 11, no. 3, pp. 288–295, 2007.

[39] K. Andras and A. Janos, "Redesign of the supply of mobile mechanics based on a novel genetic optimization algorithm using Google Maps API," Engineering Applications of Artificial Intelligence, vol. 38, pp. 122–130, 2015.



ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

2 3 4 5 6 71Testing points

0

005

01

015

02

025

03

035

RMSE

ofS

1

Figure 5 Prediction RMSE of y1

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

31

315

32

325

33

335

34

345

35

355

36

S 2

2 3 4 5 6 71Testing points

Figure 6 Prediction of y2

42 Analysis and Discussion We established that the singleElman method had the worst performance compared with

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

002

004

006

008

01

012

014

016

018

02

RMSE

ofS

2

2 3 4 5 6 71Testing points

Figure 7 Prediction RMSE of y2

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

37

375

38

385

39

395

S 3

2 3 4 5 6 71Testing points

Figure 8 Prediction of y3

the other methods Meanwhile the proposed GRO-basedElman method achieved better performance than othersThe simulation results also established that the proposedensemble model outperformed all the other methods

An ANN ensemble has better generalization ability andstability than a single network In this study we combinedElman NN with GRO to yield an optimized Elman networkensemble model The Elman NN supported by GRO provedits efficiency for prediction Using GRO to enhance theperformance of Elman NN to achieve the required prediction

Complexity 9

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

05

1

15

RMSE

ofS

3

2 3 4 5 6 71Testing points

Figure 9 Prediction RMSE of y3

Obj

ectiv

e fun

ctio

n

GROGOA

0

02

04

06

08

1

12

14

20 40 60 80 100 120 140 160 180 2000Iteration time

Figure 10 Convergence performance comparison

is the innovation of the current studyThe improved weightedaverage was used to combine the outputs of the individualnetworks and the weights were determined by the trainingerrors of the individual NNs which improved the general-ization ability and effectiveness of the model However itrequired a long time to optimize and ensemble the ElmanNN Therefore the proposed method can only be used inthe offline prediction field or fields that do not have a strictruntime requirement

5 Conclusion

In this paper a novel prediction algorithm based on theElman NN ensemble model was proposed for quality predic-tion Elman NN parameters were optimized using the GROmethod and then a novel NN ensemble model was designedfor quality prediction The quality prediction model for acircular paraboloid antenna was considered as an exampleto verify the proposed algorithm An Elman NN ensemblemodel that described the correlation between seven design-ing parameters and three indices of quality characteristicswas built The simulation results proved that the proposedalgorithm obtained better prediction accuracy

Data Availability

The simulation data used to support the findings of this studyare from the reference [37]

Conflicts of Interest

No potential conflict of interest was reported by the authors

Acknowledgments

This work was supported by the National Natural Sci-ence Foundation of China under Grant number 71403109the Humanities and Social Science Foundation of JiangsuProvince underGrant number 16JYC002 and theHumanitiesand Social Science Foundation of Ministry of Educationunder Grant number 19YJA880068

10 Complexity

References

[1] J Yu K Chen andMM Rashid ldquoA Bayesian model averagingbased multi-kernel Gaussian process regression framework fornonlinear state estimation and quality prediction of multiphasebatch processes with transient dynamics and uncertaintyrdquoChemical Engineering Science vol 93 no 4 pp 96ndash109 2013

[2] S Liu H Tai Q Ding D Li L Xu and Y Wei ldquoA hybridapproach of support vector regression with genetic algorithmoptimization for aquaculture water quality predictionrdquoMathe-matical and Computer Modelling vol 58 no 3-4 pp 458ndash4652013

[3] Y Liu F-L Wang Y-Q Chang and C Li ldquoA SNCCDBAGG-basedNNensemble approach for quality prediction in injectionmolding processrdquo IEEETransactions onAutomation Science andEngineering vol 8 no 2 pp 424ndash427 2011

[4] A Russo F Raischel and P G Lind ldquoAir quality predictionusing optimal neural networks with stochastic variablesrdquoAtmo-spheric Environment vol 79 pp 822ndash830 2013

[5] H G Han Q L Chen and J F Qiao ldquoAn efficient self-organizing RBF neural network for water quality predictionrdquoNeural Networks vol 24 no 7 pp 717ndash725 2011

[6] L Xu and S Liu ldquoStudy of short-term water quality predictionmodel based on wavelet neural networkrdquo Mathematical ampComputer Modelling vol 58 no 3-4 pp 807ndash813 2013

[7] B Bidar J Sadeghi F Shahraki and M M KhalilipourldquoData-driven soft sensor approach for online quality predictionusing state dependent parameter modelsrdquo Chemometrics andIntelligent Laboratory Systems vol 162 pp 130ndash141 2017

[8] X Li L Peng Y Hu J Shao and T Chi ldquoDeep learningarchitecture for air quality predictionsrdquo Environmental Scienceand Pollution Research vol 23 no 22 pp 22408ndash22417 2016

[9] Y Bai Z Sun J Deng L Li J Long and C Li ldquoManufacturingquality prediction using intelligent learning approaches acomparative studyrdquo Sustainability vol 10 no 1 article 85 2017

[10] J Zheng Z Ge Z Song and F Gao ldquoPhase adaptive RVMmodel for quality prediction of multiphase batch processeswith limited modeling batchesrdquo Chemometrics and IntelligentLaboratory Systems vol 156 pp 81ndash88 2016

[11] M Zhang Z Cai and W Cheng ldquoMulti-model quality pre-diction approach using fuzzy c-means clustering and supportvector regressionrdquo Advances in Mechanical Engineering vol 9no 8 2017

[12] I Svalina K Sabo and G Simunovic ldquoMachined surfacequality prediction models based on moving least squares andmoving least absolute deviations methodsrdquo e InternationalJournal of Advanced Manufacturing Technology vol 57 no 9-12 pp 1099ndash1106 2011

[13] DWang ldquoRobust data-drivenmodeling approach for real-timefinal product quality prediction in batch process operationrdquoIEEETransactions on Industrial Informatics vol 7 no 2 pp 371ndash377 2011

[14] F Chu X Cheng R Jia F Wang and M Lei ldquoFinal qualityprediction method for new batch processes based on improvedJYKPLS process transfer modelrdquo Chemometrics and IntelligentLaboratory Systems vol 183 pp 1ndash10 2018

[15] J Liu T Liu and J Chen ldquoQuality prediction for multi-gradeprocesses by just-in-time latent variable modeling with inte-gration of common and special featuresrdquo Chemical EngineeringScience vol 191 pp 31ndash41 2018

[16] V Garcıa J S Sanchez L A Rodrıguez-Pic et al ldquoUsingregressionmodels for predicting the product quality in a tubing

extrusion processrdquo Journal of Intelligent Manufacturing pp 1ndash10 2018

[17] H S Park N H Tran and D S Nguyen ldquoDevelopment of apredictive system for SLMproduct qualityrdquo in Proceedings of theIOP Conference Series Materials Science and Engineering vol227 pp 1747ndash8981 2017

[18] L ZhuG He and Z Song ldquoImproved quality predictionmodelformultistagemachiningprocess basedon geometric constraintequationrdquo Chinese Journal of Mechanical Engineering vol 29no 2 pp 430ndash438 2016

[19] R Avila B Horn E Moriarty R Hodson and E MoltchanovaldquoEvaluating statistical model performance in water qualitypredictionrdquo Journal of EnvironmentalManagement vol 206 pp910ndash919 2018

[20] A Najah A El-Shafie O A Karim O Jaafar and A HEl-Shafie ldquoAn application of different artificial intelligencestechniques for water quality predictionrdquo International Journalof Physical Sciences vol 6 no 22 pp 5298ndash5308 2011

[21] H-GHan L-DWang and J-FQiao ldquoEfficient self-organizingmultilayer neural network for nonlinear system modelingrdquoNeural Networks vol 43 no 7 pp 22ndash32 2013

[22] K Sheoran P Tomar and R Mishra ldquoSoftware quality predic-tionmodel with the aid of advanced neural network with HCSrdquoProcedia Computer Science vol 92 pp 418ndash424 2016

[23] X Ma X Chen and X Zhang ldquoNon-interactive privacy-preserving neural network predictionrdquo Information Sciencesvol 481 pp 507ndash519 2019

[24] R Olawoyin ldquoApplication of backpropagation artificial neuralnetwork prediction model for the PAH bioremediation ofpolluted soilrdquo Chemosphere vol 161 pp 145ndash150 2016

[25] H Liang Z Li and G Li ldquoNeural network prediction model toachieve intelligent control of unbalanced drillings underpres-sure valuerdquoMultimedia Tools and Applications pp 1ndash29 2018

[26] J Gu G Yin P Huang J Guo and L Chen ldquoAn improved backpropagation neural network prediction model for subsurfacedrip irrigation systemrdquo Computers and Electrical Engineeringvol 60 pp 58ndash65 2017

[27] S Bardak S Tiryaki G Nemli and A Aydin ldquoInvestigationand neural network prediction of wood bonding quality basedon pressing conditionsrdquo International Journal of Adhesion andAdhesives vol 68 pp 115ndash123 2016

[28] A A Heidari H Faris I Aljarah and S Mirjalili ldquoAn efficienthybrid multilayer perceptron neural network with grasshopperoptimizationrdquo So Computing pp 1ndash18 2018

[29] J L Elman ldquoFinding structure in timerdquo Cognitive Science vol14 no 2 pp 179ndash211 1990

[30] X Gao D You and S Katayama ldquoSeam tracking monitoringbased on adaptive Kalman filter embedded elman neural net-work during high-power fiber laser weldingrdquo IEEETransactionson Industrial Electronics vol 59 no 11 pp 4315ndash4325 2012

[31] D P Niu F LWang L L Zhang D K He andM X Jia ldquoNeu-ral network ensemble modeling for nosiheptide fermentationprocess based on partial least squares regressionrdquoChemometricsIntelligent Laboratory Systems vol 105 no 1 pp 125ndash130 2011

[32] S Ding Y Zhang J Chen and W Jia ldquoResearch on usinggenetic algorithms to optimize Elman neural networksrdquoNeuralComputing and Applications vol 23 no 2 pp 293ndash297 2013

[33] W Z Sun and J S Wang ldquoElman neural network soft-sensor model of conversion velocity in polymerization processoptimized by chaoswhale optimization algorithmrdquo IEEEAccessvol 5 pp 13062ndash13076 2017

Complexity 11

[34] S Saremi S Mirjalili and A Lewis ldquoGrasshopper optimisationalgorithm theory and applicationrdquo Advances in EngineeringSoware vol 105 pp 30ndash47 2017

[35] F XieH Fan Y Li Z Jiang RMeng andA Bovik ldquoMelanomaclassification on dermoscopy images using a neural networkensemble modelrdquo IEEE Transactions on Medical Imaging vol36 no 3 pp 849ndash858 2017

[36] R Chen and J Yu ldquoAn improved bagging neural networkensemble algorithm and its applicationrdquo in Proceedings of the3rd International Conference on Natural Computation (ICNC)vol 5 pp 730ndash734 2007

[37] A Wang T Jiang and G Liu Intelligent Design HigherEducation Press 2008

[38] V Srinivasan C Eswaran and N Sriraam ldquoApproximateentropy-based epileptic EEG detection using artificial neuralnetworksrdquo IEEE Transactions on Information Technology inBiomedicine vol 11 no 3 pp 288ndash295 2007

[39] K Andras and A Janos ldquoRedesign of the supply of mobilemechanics based on a novel genetic optimization algorithmusing Google Maps APIrdquo Engineering Applications of ArtificialIntelligence vol 38 pp 122ndash130 2015

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 8: Quality Prediction Model Based on Novel Elman Neural Network …downloads.hindawi.com/journals/complexity/2019/9852134.pdf · 2019. 7. 30. · model for subsurface drip irrigation

8 Complexity

Table 2 Average prediction RMSE

Methods y1 y2 y3Elman 01658 01223 04748Grasshopper-Elman 00712 00697 03472Ensemble1 00367 00365 02468Proposed Ensemble method 00214 00234 01729

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

2 3 4 5 6 71Testing points

0

005

01

015

02

025

03

035

RMSE

ofS

1

Figure 5 Prediction RMSE of y1

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

31

315

32

325

33

335

34

345

35

355

36

S 2

2 3 4 5 6 71Testing points

Figure 6 Prediction of y2

42 Analysis and Discussion We established that the singleElman method had the worst performance compared with

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

002

004

006

008

01

012

014

016

018

02

RMSE

ofS

2

2 3 4 5 6 71Testing points

Figure 7 Prediction RMSE of y2

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble methodTrue value

37

375

38

385

39

395

S 3

2 3 4 5 6 71Testing points

Figure 8 Prediction of y3

the other methods Meanwhile the proposed GRO-basedElman method achieved better performance than othersThe simulation results also established that the proposedensemble model outperformed all the other methods

An ANN ensemble has better generalization ability andstability than a single network In this study we combinedElman NN with GRO to yield an optimized Elman networkensemble model The Elman NN supported by GRO provedits efficiency for prediction Using GRO to enhance theperformance of Elman NN to achieve the required prediction

Complexity 9

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

05

1

15

RMSE

ofS

3

2 3 4 5 6 71Testing points

Figure 9 Prediction RMSE of y3

Obj

ectiv

e fun

ctio

n

GROGOA

0

02

04

06

08

1

12

14

20 40 60 80 100 120 140 160 180 2000Iteration time

Figure 10 Convergence performance comparison

is the innovation of the current studyThe improved weightedaverage was used to combine the outputs of the individualnetworks and the weights were determined by the trainingerrors of the individual NNs which improved the general-ization ability and effectiveness of the model However itrequired a long time to optimize and ensemble the ElmanNN Therefore the proposed method can only be used inthe offline prediction field or fields that do not have a strictruntime requirement

5 Conclusion

In this paper a novel prediction algorithm based on theElman NN ensemble model was proposed for quality predic-tion Elman NN parameters were optimized using the GROmethod and then a novel NN ensemble model was designedfor quality prediction The quality prediction model for acircular paraboloid antenna was considered as an exampleto verify the proposed algorithm An Elman NN ensemblemodel that described the correlation between seven design-ing parameters and three indices of quality characteristicswas built The simulation results proved that the proposedalgorithm obtained better prediction accuracy

Data Availability

The simulation data used to support the findings of this studyare from the reference [37]

Conflicts of Interest

No potential conflict of interest was reported by the authors

Acknowledgments

This work was supported by the National Natural Sci-ence Foundation of China under Grant number 71403109the Humanities and Social Science Foundation of JiangsuProvince underGrant number 16JYC002 and theHumanitiesand Social Science Foundation of Ministry of Educationunder Grant number 19YJA880068

10 Complexity

References

[1] J Yu K Chen andMM Rashid ldquoA Bayesian model averagingbased multi-kernel Gaussian process regression framework fornonlinear state estimation and quality prediction of multiphasebatch processes with transient dynamics and uncertaintyrdquoChemical Engineering Science vol 93 no 4 pp 96ndash109 2013

[2] S Liu H Tai Q Ding D Li L Xu and Y Wei ldquoA hybridapproach of support vector regression with genetic algorithmoptimization for aquaculture water quality predictionrdquoMathe-matical and Computer Modelling vol 58 no 3-4 pp 458ndash4652013

[3] Y Liu F-L Wang Y-Q Chang and C Li ldquoA SNCCDBAGG-basedNNensemble approach for quality prediction in injectionmolding processrdquo IEEETransactions onAutomation Science andEngineering vol 8 no 2 pp 424ndash427 2011

[4] A Russo F Raischel and P G Lind ldquoAir quality predictionusing optimal neural networks with stochastic variablesrdquoAtmo-spheric Environment vol 79 pp 822ndash830 2013

[5] H G Han Q L Chen and J F Qiao ldquoAn efficient self-organizing RBF neural network for water quality predictionrdquoNeural Networks vol 24 no 7 pp 717ndash725 2011

[6] L Xu and S Liu ldquoStudy of short-term water quality predictionmodel based on wavelet neural networkrdquo Mathematical ampComputer Modelling vol 58 no 3-4 pp 807ndash813 2013

[7] B Bidar J Sadeghi F Shahraki and M M KhalilipourldquoData-driven soft sensor approach for online quality predictionusing state dependent parameter modelsrdquo Chemometrics andIntelligent Laboratory Systems vol 162 pp 130ndash141 2017

[8] X Li L Peng Y Hu J Shao and T Chi ldquoDeep learningarchitecture for air quality predictionsrdquo Environmental Scienceand Pollution Research vol 23 no 22 pp 22408ndash22417 2016

[9] Y Bai Z Sun J Deng L Li J Long and C Li ldquoManufacturingquality prediction using intelligent learning approaches acomparative studyrdquo Sustainability vol 10 no 1 article 85 2017

[10] J Zheng Z Ge Z Song and F Gao ldquoPhase adaptive RVMmodel for quality prediction of multiphase batch processeswith limited modeling batchesrdquo Chemometrics and IntelligentLaboratory Systems vol 156 pp 81ndash88 2016

[11] M Zhang Z Cai and W Cheng ldquoMulti-model quality pre-diction approach using fuzzy c-means clustering and supportvector regressionrdquo Advances in Mechanical Engineering vol 9no 8 2017

[12] I Svalina K Sabo and G Simunovic ldquoMachined surfacequality prediction models based on moving least squares andmoving least absolute deviations methodsrdquo e InternationalJournal of Advanced Manufacturing Technology vol 57 no 9-12 pp 1099ndash1106 2011

[13] DWang ldquoRobust data-drivenmodeling approach for real-timefinal product quality prediction in batch process operationrdquoIEEETransactions on Industrial Informatics vol 7 no 2 pp 371ndash377 2011

[14] F Chu X Cheng R Jia F Wang and M Lei ldquoFinal qualityprediction method for new batch processes based on improvedJYKPLS process transfer modelrdquo Chemometrics and IntelligentLaboratory Systems vol 183 pp 1ndash10 2018

[15] J Liu T Liu and J Chen ldquoQuality prediction for multi-gradeprocesses by just-in-time latent variable modeling with inte-gration of common and special featuresrdquo Chemical EngineeringScience vol 191 pp 31ndash41 2018

[16] V Garcıa J S Sanchez L A Rodrıguez-Pic et al ldquoUsingregressionmodels for predicting the product quality in a tubing

extrusion processrdquo Journal of Intelligent Manufacturing pp 1ndash10 2018

[17] H S Park N H Tran and D S Nguyen ldquoDevelopment of apredictive system for SLMproduct qualityrdquo in Proceedings of theIOP Conference Series Materials Science and Engineering vol227 pp 1747ndash8981 2017

[18] L ZhuG He and Z Song ldquoImproved quality predictionmodelformultistagemachiningprocess basedon geometric constraintequationrdquo Chinese Journal of Mechanical Engineering vol 29no 2 pp 430ndash438 2016

[19] R Avila B Horn E Moriarty R Hodson and E MoltchanovaldquoEvaluating statistical model performance in water qualitypredictionrdquo Journal of EnvironmentalManagement vol 206 pp910ndash919 2018

[20] A Najah A El-Shafie O A Karim O Jaafar and A HEl-Shafie ldquoAn application of different artificial intelligencestechniques for water quality predictionrdquo International Journalof Physical Sciences vol 6 no 22 pp 5298ndash5308 2011

[21] H-GHan L-DWang and J-FQiao ldquoEfficient self-organizingmultilayer neural network for nonlinear system modelingrdquoNeural Networks vol 43 no 7 pp 22ndash32 2013

[22] K Sheoran P Tomar and R Mishra ldquoSoftware quality predic-tionmodel with the aid of advanced neural network with HCSrdquoProcedia Computer Science vol 92 pp 418ndash424 2016

[23] X Ma X Chen and X Zhang ldquoNon-interactive privacy-preserving neural network predictionrdquo Information Sciencesvol 481 pp 507ndash519 2019

[24] R Olawoyin ldquoApplication of backpropagation artificial neuralnetwork prediction model for the PAH bioremediation ofpolluted soilrdquo Chemosphere vol 161 pp 145ndash150 2016

[25] H Liang Z Li and G Li ldquoNeural network prediction model toachieve intelligent control of unbalanced drillings underpres-sure valuerdquoMultimedia Tools and Applications pp 1ndash29 2018

[26] J Gu G Yin P Huang J Guo and L Chen ldquoAn improved backpropagation neural network prediction model for subsurfacedrip irrigation systemrdquo Computers and Electrical Engineeringvol 60 pp 58ndash65 2017

[27] S Bardak S Tiryaki G Nemli and A Aydin ldquoInvestigationand neural network prediction of wood bonding quality basedon pressing conditionsrdquo International Journal of Adhesion andAdhesives vol 68 pp 115ndash123 2016

[28] A A Heidari H Faris I Aljarah and S Mirjalili ldquoAn efficienthybrid multilayer perceptron neural network with grasshopperoptimizationrdquo So Computing pp 1ndash18 2018

[29] J L Elman ldquoFinding structure in timerdquo Cognitive Science vol14 no 2 pp 179ndash211 1990

[30] X Gao D You and S Katayama ldquoSeam tracking monitoringbased on adaptive Kalman filter embedded elman neural net-work during high-power fiber laser weldingrdquo IEEETransactionson Industrial Electronics vol 59 no 11 pp 4315ndash4325 2012

[31] D P Niu F LWang L L Zhang D K He andM X Jia ldquoNeu-ral network ensemble modeling for nosiheptide fermentationprocess based on partial least squares regressionrdquoChemometricsIntelligent Laboratory Systems vol 105 no 1 pp 125ndash130 2011

[32] S Ding Y Zhang J Chen and W Jia ldquoResearch on usinggenetic algorithms to optimize Elman neural networksrdquoNeuralComputing and Applications vol 23 no 2 pp 293ndash297 2013

[33] W Z Sun and J S Wang ldquoElman neural network soft-sensor model of conversion velocity in polymerization processoptimized by chaoswhale optimization algorithmrdquo IEEEAccessvol 5 pp 13062ndash13076 2017

Complexity 11

[34] S Saremi S Mirjalili and A Lewis ldquoGrasshopper optimisationalgorithm theory and applicationrdquo Advances in EngineeringSoware vol 105 pp 30ndash47 2017

[35] F XieH Fan Y Li Z Jiang RMeng andA Bovik ldquoMelanomaclassification on dermoscopy images using a neural networkensemble modelrdquo IEEE Transactions on Medical Imaging vol36 no 3 pp 849ndash858 2017

[36] R Chen and J Yu ldquoAn improved bagging neural networkensemble algorithm and its applicationrdquo in Proceedings of the3rd International Conference on Natural Computation (ICNC)vol 5 pp 730ndash734 2007

[37] A Wang T Jiang and G Liu Intelligent Design HigherEducation Press 2008

[38] V Srinivasan C Eswaran and N Sriraam ldquoApproximateentropy-based epileptic EEG detection using artificial neuralnetworksrdquo IEEE Transactions on Information Technology inBiomedicine vol 11 no 3 pp 288ndash295 2007

[39] K Andras and A Janos ldquoRedesign of the supply of mobilemechanics based on a novel genetic optimization algorithmusing Google Maps APIrdquo Engineering Applications of ArtificialIntelligence vol 38 pp 122ndash130 2015

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 9: Quality Prediction Model Based on Novel Elman Neural Network …downloads.hindawi.com/journals/complexity/2019/9852134.pdf · 2019. 7. 30. · model for subsurface drip irrigation

Complexity 9

ElmanGrasshopper-ElmanEnsemble1Proposed Ensemble method

0

05

1

15

RMSE

ofS

3

2 3 4 5 6 71Testing points

Figure 9 Prediction RMSE of y3

Obj

ectiv

e fun

ctio

n

GROGOA

0

02

04

06

08

1

12

14

20 40 60 80 100 120 140 160 180 2000Iteration time

Figure 10 Convergence performance comparison

is the innovation of the current studyThe improved weightedaverage was used to combine the outputs of the individualnetworks and the weights were determined by the trainingerrors of the individual NNs which improved the general-ization ability and effectiveness of the model However itrequired a long time to optimize and ensemble the ElmanNN Therefore the proposed method can only be used inthe offline prediction field or fields that do not have a strictruntime requirement

5 Conclusion

In this paper a novel prediction algorithm based on theElman NN ensemble model was proposed for quality predic-tion Elman NN parameters were optimized using the GROmethod and then a novel NN ensemble model was designedfor quality prediction The quality prediction model for acircular paraboloid antenna was considered as an exampleto verify the proposed algorithm An Elman NN ensemblemodel that described the correlation between seven design-ing parameters and three indices of quality characteristicswas built The simulation results proved that the proposedalgorithm obtained better prediction accuracy

Data Availability

The simulation data used to support the findings of this studyare from the reference [37]

Conflicts of Interest

No potential conflict of interest was reported by the authors

Acknowledgments

This work was supported by the National Natural Sci-ence Foundation of China under Grant number 71403109the Humanities and Social Science Foundation of JiangsuProvince underGrant number 16JYC002 and theHumanitiesand Social Science Foundation of Ministry of Educationunder Grant number 19YJA880068

10 Complexity

References

[1] J Yu K Chen andMM Rashid ldquoA Bayesian model averagingbased multi-kernel Gaussian process regression framework fornonlinear state estimation and quality prediction of multiphasebatch processes with transient dynamics and uncertaintyrdquoChemical Engineering Science vol 93 no 4 pp 96ndash109 2013

[2] S Liu H Tai Q Ding D Li L Xu and Y Wei ldquoA hybridapproach of support vector regression with genetic algorithmoptimization for aquaculture water quality predictionrdquoMathe-matical and Computer Modelling vol 58 no 3-4 pp 458ndash4652013

[3] Y Liu F-L Wang Y-Q Chang and C Li ldquoA SNCCDBAGG-basedNNensemble approach for quality prediction in injectionmolding processrdquo IEEETransactions onAutomation Science andEngineering vol 8 no 2 pp 424ndash427 2011

[4] A Russo F Raischel and P G Lind ldquoAir quality predictionusing optimal neural networks with stochastic variablesrdquoAtmo-spheric Environment vol 79 pp 822ndash830 2013

[5] H G Han Q L Chen and J F Qiao ldquoAn efficient self-organizing RBF neural network for water quality predictionrdquoNeural Networks vol 24 no 7 pp 717ndash725 2011

[6] L Xu and S Liu ldquoStudy of short-term water quality predictionmodel based on wavelet neural networkrdquo Mathematical ampComputer Modelling vol 58 no 3-4 pp 807ndash813 2013

[7] B Bidar J Sadeghi F Shahraki and M M KhalilipourldquoData-driven soft sensor approach for online quality predictionusing state dependent parameter modelsrdquo Chemometrics andIntelligent Laboratory Systems vol 162 pp 130ndash141 2017

[8] X Li L Peng Y Hu J Shao and T Chi ldquoDeep learningarchitecture for air quality predictionsrdquo Environmental Scienceand Pollution Research vol 23 no 22 pp 22408ndash22417 2016

[9] Y Bai Z Sun J Deng L Li J Long and C Li ldquoManufacturingquality prediction using intelligent learning approaches acomparative studyrdquo Sustainability vol 10 no 1 article 85 2017

[10] J Zheng Z Ge Z Song and F Gao ldquoPhase adaptive RVMmodel for quality prediction of multiphase batch processeswith limited modeling batchesrdquo Chemometrics and IntelligentLaboratory Systems vol 156 pp 81ndash88 2016

[11] M Zhang Z Cai and W Cheng ldquoMulti-model quality pre-diction approach using fuzzy c-means clustering and supportvector regressionrdquo Advances in Mechanical Engineering vol 9no 8 2017

[12] I Svalina K Sabo and G Simunovic ldquoMachined surfacequality prediction models based on moving least squares andmoving least absolute deviations methodsrdquo e InternationalJournal of Advanced Manufacturing Technology vol 57 no 9-12 pp 1099ndash1106 2011

[13] DWang ldquoRobust data-drivenmodeling approach for real-timefinal product quality prediction in batch process operationrdquoIEEETransactions on Industrial Informatics vol 7 no 2 pp 371ndash377 2011

[14] F Chu X Cheng R Jia F Wang and M Lei ldquoFinal qualityprediction method for new batch processes based on improvedJYKPLS process transfer modelrdquo Chemometrics and IntelligentLaboratory Systems vol 183 pp 1ndash10 2018

[15] J Liu T Liu and J Chen ldquoQuality prediction for multi-gradeprocesses by just-in-time latent variable modeling with inte-gration of common and special featuresrdquo Chemical EngineeringScience vol 191 pp 31ndash41 2018

[16] V Garcıa J S Sanchez L A Rodrıguez-Pic et al ldquoUsingregressionmodels for predicting the product quality in a tubing

extrusion processrdquo Journal of Intelligent Manufacturing pp 1ndash10 2018

[17] H S Park N H Tran and D S Nguyen ldquoDevelopment of apredictive system for SLMproduct qualityrdquo in Proceedings of theIOP Conference Series Materials Science and Engineering vol227 pp 1747ndash8981 2017

[18] L ZhuG He and Z Song ldquoImproved quality predictionmodelformultistagemachiningprocess basedon geometric constraintequationrdquo Chinese Journal of Mechanical Engineering vol 29no 2 pp 430ndash438 2016

[19] R Avila B Horn E Moriarty R Hodson and E MoltchanovaldquoEvaluating statistical model performance in water qualitypredictionrdquo Journal of EnvironmentalManagement vol 206 pp910ndash919 2018

[20] A Najah A El-Shafie O A Karim O Jaafar and A HEl-Shafie ldquoAn application of different artificial intelligencestechniques for water quality predictionrdquo International Journalof Physical Sciences vol 6 no 22 pp 5298ndash5308 2011

[21] H-GHan L-DWang and J-FQiao ldquoEfficient self-organizingmultilayer neural network for nonlinear system modelingrdquoNeural Networks vol 43 no 7 pp 22ndash32 2013

[22] K Sheoran P Tomar and R Mishra ldquoSoftware quality predic-tionmodel with the aid of advanced neural network with HCSrdquoProcedia Computer Science vol 92 pp 418ndash424 2016

[23] X Ma X Chen and X Zhang ldquoNon-interactive privacy-preserving neural network predictionrdquo Information Sciencesvol 481 pp 507ndash519 2019

[24] R Olawoyin ldquoApplication of backpropagation artificial neuralnetwork prediction model for the PAH bioremediation ofpolluted soilrdquo Chemosphere vol 161 pp 145ndash150 2016

[25] H Liang Z Li and G Li ldquoNeural network prediction model toachieve intelligent control of unbalanced drillings underpres-sure valuerdquoMultimedia Tools and Applications pp 1ndash29 2018

[26] J Gu G Yin P Huang J Guo and L Chen ldquoAn improved backpropagation neural network prediction model for subsurfacedrip irrigation systemrdquo Computers and Electrical Engineeringvol 60 pp 58ndash65 2017

[27] S Bardak S Tiryaki G Nemli and A Aydin ldquoInvestigationand neural network prediction of wood bonding quality basedon pressing conditionsrdquo International Journal of Adhesion andAdhesives vol 68 pp 115ndash123 2016

[28] A A Heidari H Faris I Aljarah and S Mirjalili ldquoAn efficienthybrid multilayer perceptron neural network with grasshopperoptimizationrdquo So Computing pp 1ndash18 2018

[29] J L Elman ldquoFinding structure in timerdquo Cognitive Science vol14 no 2 pp 179ndash211 1990

[30] X Gao D You and S Katayama ldquoSeam tracking monitoringbased on adaptive Kalman filter embedded elman neural net-work during high-power fiber laser weldingrdquo IEEETransactionson Industrial Electronics vol 59 no 11 pp 4315ndash4325 2012

[31] D P Niu F LWang L L Zhang D K He andM X Jia ldquoNeu-ral network ensemble modeling for nosiheptide fermentationprocess based on partial least squares regressionrdquoChemometricsIntelligent Laboratory Systems vol 105 no 1 pp 125ndash130 2011

[32] S Ding Y Zhang J Chen and W Jia ldquoResearch on usinggenetic algorithms to optimize Elman neural networksrdquoNeuralComputing and Applications vol 23 no 2 pp 293ndash297 2013

[33] W Z Sun and J S Wang ldquoElman neural network soft-sensor model of conversion velocity in polymerization processoptimized by chaoswhale optimization algorithmrdquo IEEEAccessvol 5 pp 13062ndash13076 2017

Complexity 11

[34] S Saremi S Mirjalili and A Lewis ldquoGrasshopper optimisationalgorithm theory and applicationrdquo Advances in EngineeringSoware vol 105 pp 30ndash47 2017

[35] F XieH Fan Y Li Z Jiang RMeng andA Bovik ldquoMelanomaclassification on dermoscopy images using a neural networkensemble modelrdquo IEEE Transactions on Medical Imaging vol36 no 3 pp 849ndash858 2017

[36] R Chen and J Yu ldquoAn improved bagging neural networkensemble algorithm and its applicationrdquo in Proceedings of the3rd International Conference on Natural Computation (ICNC)vol 5 pp 730ndash734 2007

[37] A Wang T Jiang and G Liu Intelligent Design HigherEducation Press 2008

[38] V Srinivasan C Eswaran and N Sriraam ldquoApproximateentropy-based epileptic EEG detection using artificial neuralnetworksrdquo IEEE Transactions on Information Technology inBiomedicine vol 11 no 3 pp 288ndash295 2007

[39] K Andras and A Janos ldquoRedesign of the supply of mobilemechanics based on a novel genetic optimization algorithmusing Google Maps APIrdquo Engineering Applications of ArtificialIntelligence vol 38 pp 122ndash130 2015

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 10: Quality Prediction Model Based on Novel Elman Neural Network …downloads.hindawi.com/journals/complexity/2019/9852134.pdf · 2019. 7. 30. · model for subsurface drip irrigation

10 Complexity

References

[1] J Yu K Chen andMM Rashid ldquoA Bayesian model averagingbased multi-kernel Gaussian process regression framework fornonlinear state estimation and quality prediction of multiphasebatch processes with transient dynamics and uncertaintyrdquoChemical Engineering Science vol 93 no 4 pp 96ndash109 2013

[2] S Liu H Tai Q Ding D Li L Xu and Y Wei ldquoA hybridapproach of support vector regression with genetic algorithmoptimization for aquaculture water quality predictionrdquoMathe-matical and Computer Modelling vol 58 no 3-4 pp 458ndash4652013

[3] Y Liu F-L Wang Y-Q Chang and C Li ldquoA SNCCDBAGG-basedNNensemble approach for quality prediction in injectionmolding processrdquo IEEETransactions onAutomation Science andEngineering vol 8 no 2 pp 424ndash427 2011

[4] A Russo F Raischel and P G Lind ldquoAir quality predictionusing optimal neural networks with stochastic variablesrdquoAtmo-spheric Environment vol 79 pp 822ndash830 2013

[5] H G Han Q L Chen and J F Qiao ldquoAn efficient self-organizing RBF neural network for water quality predictionrdquoNeural Networks vol 24 no 7 pp 717ndash725 2011

[6] L Xu and S Liu ldquoStudy of short-term water quality predictionmodel based on wavelet neural networkrdquo Mathematical ampComputer Modelling vol 58 no 3-4 pp 807ndash813 2013

[7] B Bidar J Sadeghi F Shahraki and M M KhalilipourldquoData-driven soft sensor approach for online quality predictionusing state dependent parameter modelsrdquo Chemometrics andIntelligent Laboratory Systems vol 162 pp 130ndash141 2017

[8] X Li L Peng Y Hu J Shao and T Chi ldquoDeep learningarchitecture for air quality predictionsrdquo Environmental Scienceand Pollution Research vol 23 no 22 pp 22408ndash22417 2016

[9] Y Bai Z Sun J Deng L Li J Long and C Li ldquoManufacturingquality prediction using intelligent learning approaches acomparative studyrdquo Sustainability vol 10 no 1 article 85 2017

[10] J Zheng Z Ge Z Song and F Gao ldquoPhase adaptive RVMmodel for quality prediction of multiphase batch processeswith limited modeling batchesrdquo Chemometrics and IntelligentLaboratory Systems vol 156 pp 81ndash88 2016

[11] M Zhang Z Cai and W Cheng ldquoMulti-model quality pre-diction approach using fuzzy c-means clustering and supportvector regressionrdquo Advances in Mechanical Engineering vol 9no 8 2017

[12] I Svalina K Sabo and G Simunovic ldquoMachined surfacequality prediction models based on moving least squares andmoving least absolute deviations methodsrdquo e InternationalJournal of Advanced Manufacturing Technology vol 57 no 9-12 pp 1099ndash1106 2011

[13] DWang ldquoRobust data-drivenmodeling approach for real-timefinal product quality prediction in batch process operationrdquoIEEETransactions on Industrial Informatics vol 7 no 2 pp 371ndash377 2011

[14] F. Chu, X. Cheng, R. Jia, F. Wang, and M. Lei, "Final quality prediction method for new batch processes based on improved JYKPLS process transfer model," Chemometrics and Intelligent Laboratory Systems, vol. 183, pp. 1–10, 2018.

[15] J. Liu, T. Liu, and J. Chen, "Quality prediction for multi-grade processes by just-in-time latent variable modeling with integration of common and special features," Chemical Engineering Science, vol. 191, pp. 31–41, 2018.

[16] V. García, J. S. Sánchez, L. A. Rodríguez-Picón et al., "Using regression models for predicting the product quality in a tubing extrusion process," Journal of Intelligent Manufacturing, pp. 1–10, 2018.

[17] H. S. Park, N. H. Tran, and D. S. Nguyen, "Development of a predictive system for SLM product quality," in Proceedings of the IOP Conference Series: Materials Science and Engineering, vol. 227, pp. 1747–8981, 2017.

[18] L. Zhu, G. He, and Z. Song, "Improved quality prediction model for multistage machining process based on geometric constraint equation," Chinese Journal of Mechanical Engineering, vol. 29, no. 2, pp. 430–438, 2016.

[19] R. Avila, B. Horn, E. Moriarty, R. Hodson, and E. Moltchanova, "Evaluating statistical model performance in water quality prediction," Journal of Environmental Management, vol. 206, pp. 910–919, 2018.

[20] A. Najah, A. El-Shafie, O. A. Karim, O. Jaafar, and A. H. El-Shafie, "An application of different artificial intelligences techniques for water quality prediction," International Journal of Physical Sciences, vol. 6, no. 22, pp. 5298–5308, 2011.

[21] H.-G. Han, L.-D. Wang, and J.-F. Qiao, "Efficient self-organizing multilayer neural network for nonlinear system modeling," Neural Networks, vol. 43, no. 7, pp. 22–32, 2013.

[22] K. Sheoran, P. Tomar, and R. Mishra, "Software quality prediction model with the aid of advanced neural network with HCS," Procedia Computer Science, vol. 92, pp. 418–424, 2016.

[23] X. Ma, X. Chen, and X. Zhang, "Non-interactive privacy-preserving neural network prediction," Information Sciences, vol. 481, pp. 507–519, 2019.

[24] R. Olawoyin, "Application of backpropagation artificial neural network prediction model for the PAH bioremediation of polluted soil," Chemosphere, vol. 161, pp. 145–150, 2016.

[25] H. Liang, Z. Li, and G. Li, "Neural network prediction model to achieve intelligent control of unbalanced drillings underpressure value," Multimedia Tools and Applications, pp. 1–29, 2018.

[26] J. Gu, G. Yin, P. Huang, J. Guo, and L. Chen, "An improved back propagation neural network prediction model for subsurface drip irrigation system," Computers and Electrical Engineering, vol. 60, pp. 58–65, 2017.

[27] S. Bardak, S. Tiryaki, G. Nemli, and A. Aydin, "Investigation and neural network prediction of wood bonding quality based on pressing conditions," International Journal of Adhesion and Adhesives, vol. 68, pp. 115–123, 2016.

[28] A. A. Heidari, H. Faris, I. Aljarah, and S. Mirjalili, "An efficient hybrid multilayer perceptron neural network with grasshopper optimization," Soft Computing, pp. 1–18, 2018.

[29] J. L. Elman, "Finding structure in time," Cognitive Science, vol. 14, no. 2, pp. 179–211, 1990.

[30] X. Gao, D. You, and S. Katayama, "Seam tracking monitoring based on adaptive Kalman filter embedded Elman neural network during high-power fiber laser welding," IEEE Transactions on Industrial Electronics, vol. 59, no. 11, pp. 4315–4325, 2012.

[31] D. P. Niu, F. L. Wang, L. L. Zhang, D. K. He, and M. X. Jia, "Neural network ensemble modeling for nosiheptide fermentation process based on partial least squares regression," Chemometrics and Intelligent Laboratory Systems, vol. 105, no. 1, pp. 125–130, 2011.

[32] S. Ding, Y. Zhang, J. Chen, and W. Jia, "Research on using genetic algorithms to optimize Elman neural networks," Neural Computing and Applications, vol. 23, no. 2, pp. 293–297, 2013.

[33] W. Z. Sun and J. S. Wang, "Elman neural network soft-sensor model of conversion velocity in polymerization process optimized by chaos whale optimization algorithm," IEEE Access, vol. 5, pp. 13062–13076, 2017.

[34] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[35] F. Xie, H. Fan, Y. Li, Z. Jiang, R. Meng, and A. Bovik, "Melanoma classification on dermoscopy images using a neural network ensemble model," IEEE Transactions on Medical Imaging, vol. 36, no. 3, pp. 849–858, 2017.

[36] R. Chen and J. Yu, "An improved bagging neural network ensemble algorithm and its application," in Proceedings of the 3rd International Conference on Natural Computation (ICNC), vol. 5, pp. 730–734, 2007.

[37] A. Wang, T. Jiang, and G. Liu, Intelligent Design, Higher Education Press, 2008.

[38] V. Srinivasan, C. Eswaran, and N. Sriraam, "Approximate entropy-based epileptic EEG detection using artificial neural networks," IEEE Transactions on Information Technology in Biomedicine, vol. 11, no. 3, pp. 288–295, 2007.

[39] K. Andras and A. Janos, "Redesign of the supply of mobile mechanics based on a novel genetic optimization algorithm using Google Maps API," Engineering Applications of Artificial Intelligence, vol. 38, pp. 122–130, 2015.
