

Research Article

A Hybrid Intelligent Method of Predicting Stock Returns

Akhter Mohiuddin Rather

Woxsen School of Business, Sadasivpet, Kamkol, Hyderabad 502291, India

Correspondence should be addressed to Akhter Mohiuddin Rather; akhteraki@gmail.com

Received 16 May 2014; Revised 26 August 2014; Accepted 26 August 2014; Published 7 September 2014

Academic Editor: Ozgur Kisi

Copyright © 2014 Akhter Mohiuddin Rather. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

This paper proposes a novel method for predicting stock returns by means of a hybrid intelligent model. Initially, predictions are obtained by a linear model, and thereby prediction errors are collected and fed into a recurrent neural network, which is actually an autoregressive moving reference neural network. The recurrent neural network results in minimized prediction errors because of its nonlinear processing and also because of its configuration. These prediction errors are used to obtain final predictions by a summation method as well as by a multiplication method. The proposed model is thus a hybrid of both a linear and a nonlinear model. The model has been tested on stock data obtained from the National Stock Exchange of India. The results indicate that the proposed model can be a promising approach in predicting future stock movements.

1. Introduction

Prediction of stock returns has attracted many researchers in the past, and at present it is still an emerging area both in academia and in industry. Mathematically, the techniques involved in obtaining predictions of stock returns can be broadly classified into two categories. The first category involves linear models such as autoregressive moving average models, exponential smoothing, linear trend prediction, the random walk model, generalized autoregressive conditional heteroskedasticity, and the stochastic volatility model [1]. The second category involves models based on artificial intelligence, such as artificial neural networks (ANNs) [2], support vector machines [3], genetic algorithms (GA), and particle swarm optimization (PSO) [4]. Linear models have a common limitation: their linear form prevents them from detecting nonlinear patterns in data. Due to instability in the stock market, stock data is volatile in nature; thus, linear models are unable to detect nonlinear patterns in such data. Nonlinear models overcome this limitation, as ANNs embody useful nonlinear functions which are able to detect nonlinear patterns in data [5]. As a consequence, prediction performance improves by using nonlinear models [6, 7].

A lot of work has been done in this field; for instance, a radial basis neural network was used for stock prediction on the Shanghai Stock Exchange, wherein artificial fish swarm optimization was introduced to optimize the radial basis function [8]. In time series prediction, ANNs have received overwhelming attention from researchers. For instance, Freitas et al. [9], Wang et al. [10], Khashei and Bijari [11], Chen et al. [12], and Jain and Kumar [13] report the use of ANNs in time series stock prediction. A new approach, the wavelet denoising based backpropagation neural network, was proposed for predicting stock prices [14]. In another work, researchers explored the use of activation functions in ANNs; to improve the performance of ANNs, they suggest the use of three new simple functions, with financial time series data used in the experiments [15]. For seasonal time series prediction, an ANN was proposed which considers the seasonal time periods in the series; the purpose of such consideration is to determine the number of input and output neurons [16]. Multilayer perceptron and generalized regression neural networks were used to predict the Kuwait Stock Exchange [17]. The results showed that the models were useful in predicting stock exchange movements in emerging markets. Different ANN models were proposed that are able to capture temporal aspects of inputs [5]. For time series predictions, two types of ANN models have proved successful: time-lagged feedforward networks and dynamically driven recurrent (feedback) networks [18].

Hindawi Publishing Corporation, Advances in Artificial Neural Systems, Volume 2014, Article ID 246487, 7 pages. http://dx.doi.org/10.1155/2014/246487


ANNs are not always guaranteed to yield desired results. In order to solve such problems, researchers attempted to find a global optimization approach for ANNs to predict the stock price index [19]. With the goal of further improving the performance of predictors, researchers have also attempted to develop hybrid models for prediction of stock returns. The hybridization may include integration of linear and nonlinear models [10, 20, 21]. Researchers developed a hybrid forecasting model by integrating a recurrent neural network based on artificial bee colony and wavelet transforms [22]. In another work, an adaptive network-based fuzzy inference system was used to develop a hybrid prediction model [23]. A hybrid system was proposed by integrating a linear autoregressive integrated moving average model and an ANN [11]. To overcome the prediction problem with the use of technical indicators, researchers proposed a hybrid model by merging ANNs and genetic programming [24]; feature selection was also used to improve the performance of the system. In another work, a random walk prediction model was merged with an ANN, and a hybrid model was thus developed [25]. Researchers also introduced a hybrid ANN architecture of particle swarm optimization and adaptive radial basis functions [26]. Various regressive ANN models, such as self-organising maps and support vector regressions, were used to form a hybrid model to predict foreign exchange currency rates [27]. For business failure prediction, a hybrid model was proposed in which particle swarm optimization and a network-based fuzzy inference system were used [28]. In a recent work, researchers created a hybrid prediction model by integrating an autoregressive moving average model and differential evolution based training of its feedforward and feedback parameters [29]. This hybrid model has been compared with other similar hybrid models, and the results confirm that it outperforms them.

The rest of the paper is arranged as follows. Section 2 discusses various prediction-based models, including those used in this work. The proposed hybrid model is described in detail in Section 3. Section 4 discusses two commonly used error metrics. In Section 5, experiments and results are presented. Finally, Section 6 presents conclusions.

2. Prediction-Based Models

Stock return, or simply return, refers to the profit on an investment. The return $R_t$ at time $t$ is calculated using (1), where $p_t$ and $p_{t-1}$ are the prices of the stock at times $t$ and $t-1$, respectively:

$$R_t = \frac{p_t - p_{t-1}}{p_{t-1}} \quad (1)$$
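Equation (1) can be sketched in a few lines of Python; this is an illustrative helper (the function name is mine, not the paper's), using numpy to vectorize the computation over a whole price series.

```python
import numpy as np

def simple_returns(prices):
    """Return series R_t = (p_t - p_{t-1}) / p_{t-1}, as in equation (1)."""
    p = np.asarray(prices, dtype=float)
    return (p[1:] - p[:-1]) / p[:-1]

prices = [100.0, 102.0, 99.96]
r = simple_returns(prices)
# r[0] = (102 - 100) / 100 = 0.02
```

A series of $T+1$ prices thus yields $T$ returns, which matches the paper's later use of 165 daily prices to produce 164 returns.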

A few prediction-based models available in the literature, including those used in this work, are described below.

2.1. Exponential Smoothing Model. The exponential smoothing model is used to obtain predictions on time series data (Brown [30]). It computes a one-step-ahead prediction by means of a geometric sum of past observations, as shown in the following equation:

$$\hat{r}_{t+1} = \hat{r}_t + \alpha \left( r_t - \hat{r}_t \right) \quad (2)$$

where $\hat{r}_t$ is the prediction of $r_t$, $\hat{r}_{t+1}$ is the prediction for the future value, $\alpha$ is a smoothing parameter in the range (0, 1), and $r_t - \hat{r}_t$ is the prediction error. Exponential smoothing assigns exponentially decreasing weights over time: observations are weighted such that more weight is given to the most recent observations.
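The recursion in (2) can be sketched as follows; this is a minimal illustration in which the first prediction is initialized to the first observation (a common convention that the paper does not specify).

```python
import numpy as np

def exp_smooth_predictions(returns, alpha):
    """One-step-ahead ESM predictions per equation (2):
    rhat_{t+1} = rhat_t + alpha * (r_t - rhat_t).
    Initialization rhat_0 = r_0 is an assumption, not from the paper."""
    r = np.asarray(returns, dtype=float)
    rhat = np.empty_like(r)
    rhat[0] = r[0]
    for t in range(len(r) - 1):
        rhat[t + 1] = rhat[t] + alpha * (r[t] - rhat[t])
    return rhat

r = [0.01, 0.03, -0.02, 0.00]
rhat = exp_smooth_predictions(r, alpha=0.5)
```

With $\alpha$ close to 1 the predictor tracks the latest observation; with $\alpha$ close to 0 it changes slowly, which is the weighting behavior described above.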

2.2. Autoregressive Moving Average Model. The autoregressive moving average model was introduced by Box and Jenkins [31] for time series prediction and was inspired by the early work of Yule [32] and Wold [33]. The model consists of an autoregressive part AR($p$) and a moving average part MA($q$), which is why it is referred to as the ARMA($p$, $q$) model. The AR($p$) process can be generally expressed as shown in the following equation:

$$r_t = \phi_1 r_{t-1} + \phi_2 r_{t-2} + \cdots + \phi_p r_{t-p} + \varepsilon_t \quad (3)$$

Similarly, the MA($q$) process can be generally expressed as shown in the following equation:

$$r_t = \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q} \quad (4)$$

Hence, the ARMA($p$, $q$) model can be expressed as:

$$r_t = \phi_1 r_{t-1} + \phi_2 r_{t-2} + \cdots + \phi_p r_{t-p} + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q} \quad (5)$$

where $p$ and $q$ represent the orders of the autoregressive model and of the moving average model, respectively; $\phi$ and $\theta$ are coefficients that satisfy stationarity of the series; and $\varepsilon_t$ are random errors, or white noise, at time $t$ with zero mean and variance $\sigma_\varepsilon^2$.
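As an illustration of equation (5), the following sketch simulates an ARMA(1, 1) series with Gaussian white noise; the parameter values are arbitrary examples, not taken from the paper.

```python
import numpy as np

def simulate_arma11(phi1, theta1, n, seed=0):
    """Generate a series from equation (5) with p = q = 1:
    r_t = phi1 * r_{t-1} + eps_t - theta1 * eps_{t-1}."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, 1.0, n)   # white noise, zero mean
    r = np.zeros(n)
    for t in range(1, n):
        r[t] = phi1 * r[t - 1] + eps[t] - theta1 * eps[t - 1]
    return r

# |phi1| < 1 keeps the AR part stationary
r = simulate_arma11(phi1=0.6, theta1=0.3, n=500)
```

Fitting such a model to data (estimating $\phi$ and $\theta$) is typically done with a dedicated time series library rather than by hand.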

2.3. Autoregressive Moving Reference Regression Model. This is a new regression model proposed by Freitas et al. [9], which is also adopted in this work. The same model has been used in [34, 35] along with different ANN models.

Consider time series data with $T$ past returns, as shown in the following equation:

$$r' = \left( r_{t-(T-1)}, \ldots, r_{t-1}, r_t \right) \quad (6)$$

Based on the available historical data, prediction of the future return of a stock can be defined as the process in which elements of the past returns are used to obtain an estimate of the future return $\hat{r}_{t+l}$, where $l \ge 1$. The value of $l$ directly affects the choice of the adopted prediction method.

Input-output pairs are generated by this model and given to the ANN in a supervised manner; the resulting network can thus be called an autoregressive moving reference neural network, AR-MRNN($p$, $k$), where $p$ is the order of regression and $k$ is the delay from the point of reference.


Using an autoregressive predictor, say $\mathcal{R}$, AR-MRNN($p$, $k$) implements the prediction system shown in the following equation:

$$\hat{r}_{t+1} - z = \mathcal{R}\left( r_{t-(p-1)} - z, \ldots, r_{t-1} - z, r_t - z \right) \quad (7)$$

where $\hat{r}_{t+1} - z$ is the prediction for $r_{t+1} - z$ obtained at time $t$ from the information available in the historical series $r' = (r_{t-(T-1)}, \ldots, r_{t-1}, r_t)$, $p$ is the order of regression, and $z$ is the moving reference given in the following equation:

$$z = r_{t-(p-1)-k} \quad (8)$$

After training the neural network, the prediction $\hat{r}_{t+1}$ is obtained as shown in the following equation:

$$\hat{r}_{t+1} = \left( \hat{r}_{t+1} - z \right) + z \quad (9)$$

According to (7), inputs to the neural network are given as differences rather than original values; the network thus requires smaller weights, which improves its ability to generalize. The output obtained from the neural network is not the final prediction; rather, the final prediction is calculated by adding the value $z$ to the output.
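The construction of AR-MRNN($p$, $k$) training pairs from equations (7) and (8) can be sketched as below; the function name and the toy series are mine, for illustration only.

```python
import numpy as np

def armrnn_pairs(series, p, k):
    """Build AR-MRNN(p, k) input-output pairs per equations (7)-(8).
    For each position t: inputs are (r_{t-(p-1)} - z, ..., r_t - z),
    the target is r_{t+1} - z, and the moving reference is
    z = r_{t-(p-1)-k}."""
    r = np.asarray(series, dtype=float)
    X, y, zs = [], [], []
    # earliest usable t needs index t-(p-1)-k >= 0; latest needs t+1 in range
    for t in range(p - 1 + k, len(r) - 1):
        z = r[t - (p - 1) - k]
        X.append(r[t - (p - 1): t + 1] - z)
        y.append(r[t + 1] - z)
        zs.append(z)
    return np.array(X), np.array(y), np.array(zs)

r = np.arange(10, dtype=float)   # toy series 0, 1, ..., 9
X, y, z = armrnn_pairs(r, p=3, k=1)
# final predictions are recovered per equation (9): rhat = network_output + z
```

Note that the targets stored here are already referenced to $z$; after training, predictions in the original scale are recovered by adding $z$ back, exactly as in (9).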

3. The Proposed Hybrid Intelligent Prediction Model

This section discusses the proposed hybrid intelligent prediction model (HIPM) in detail. Consider the actual returns of a time series given by $r_t$ $(t = 1, \ldots, T)$. Let the predictions obtained by any linear model be denoted by $\hat{r}_t$ $(t = 1, \ldots, T)$. The difference between the actual series ($r_t$) and the predicted series ($\hat{r}_t$) is known as the prediction error, or simply error, calculated as $\varepsilon_t = r_t - \hat{r}_t$.

In HIPM, the predictions are obtained via two methods, namely, the summation method and the multiplicative method; these methods are defined below.

3.1. Summation Method. According to the summation method, the actual data equals the predicted linear data plus the error terms, as shown in the following equation:

$$r_t = \hat{r}_t + \varepsilon_t \quad (10)$$

The error terms are thus calculated as shown in the following equation:

$$\varepsilon_t = r_t - \hat{r}_t \quad (11)$$

3.2. Multiplicative Method. For the multiplicative method, the actual data equals the predicted linear data multiplied by the error terms, as shown in the following equation:

$$r_t = \hat{r}_t \times \varepsilon_t \quad (12)$$

The error terms are multiplied back into the linear predictions because these error terms were calculated as shown in the following equation:

$$\varepsilon_t = \frac{r_t}{\hat{r}_t} \quad (13)$$

[Figure 1: Work flow of HIPM. Actual returns $r_t$ pass through the linear model to give predictions $\hat{r}_t$; errors $\varepsilon_t$ are fed into the ANN to obtain minimized errors; final predictions are produced by the summation and multiplicative methods.]

The series of errors obtained by the above two methods is given to the ANN by means of AR-MRNN($p$, $k$) with $k = 1$. Thus, in terms of AR-MRNN, (7) to (9) are modified as given below:

$$\hat{\varepsilon}_{t+1} - z = \mathcal{R}\left( \varepsilon_{t-(p-1)} - z, \ldots, \varepsilon_{t-1} - z, \varepsilon_t - z \right) \quad (14)$$

where $\hat{\varepsilon}_{t+1} - z$ is the estimate for $\varepsilon_{t+1} - z$ obtained at time $t$ from the information available in the previously obtained error series $\varepsilon' = (\varepsilon_{t-(T-1)}, \ldots, \varepsilon_{t-1}, \varepsilon_t)$, $p$ is the order of regression, and $z$ is the moving reference given in the following equation:

$$z = \varepsilon_{t-(p-1)-k} \quad (15)$$

After training the neural network, $\hat{\varepsilon}_{t+1}$ is obtained as shown in (16); thus, minimized errors are obtained:

$$\hat{\varepsilon}_{t+1} = \left( \hat{\varepsilon}_{t+1} - z \right) + z \quad (16)$$

These minimized errors replace the original errors in (10) and (12). Hence, final predictions from HIPM are obtained by the summation method and the multiplicative method as shown in the following two equations, respectively:

$$y_t = \hat{r}_t + \hat{\varepsilon}_t \quad (17)$$

$$y_t = \hat{r}_t \times \hat{\varepsilon}_t \quad (18)$$
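The final combination step, equations (17) and (18), is simple once the ANN-refined errors are available; a minimal sketch (function and variable names are mine, and the sample values below are made up for illustration):

```python
import numpy as np

def hipm_final_predictions(rhat_linear, eps_refined, method="summation"):
    """Combine linear predictions with ANN-refined error terms,
    per equation (17) (summation) or equation (18) (multiplicative)."""
    rhat = np.asarray(rhat_linear, dtype=float)
    eps = np.asarray(eps_refined, dtype=float)
    if method == "summation":
        return rhat + eps            # equation (17)
    if method == "multiplicative":
        return rhat * eps            # equation (18)
    raise ValueError("method must be 'summation' or 'multiplicative'")

rhat = np.array([0.010, 0.020])
eps_sum = np.array([0.002, -0.004])   # additive errors, as in equation (11)
eps_mul = np.array([1.10, 0.90])      # error ratios, as in equation (13)
y_sum = hipm_final_predictions(rhat, eps_sum, "summation")
y_mul = hipm_final_predictions(rhat, eps_mul, "multiplicative")
```

Note that the two methods require error terms computed consistently with their own definition: differences for (17) and ratios for (18).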

Figure 1 shows the complete work flow of HIPM. Initially, actual returns $r_t$ are given as input to the linear prediction model, through which predictions $\hat{r}_t$ are obtained. In the next step, errors $\varepsilon_t$ are calculated. These errors are fed into the ANN, which performs nonlinear processing. The minimized errors $\hat{\varepsilon}_t$ obtained from the ANN are used to calculate the final predictions via two methods, that is, the summation method and the multiplicative method.

3.3. Recurrent Neural Network with Autoregressive Moving Reference. The importance of a recurrent neural network (RNN) is that it responds to the same input pattern differently at different times. Figure 2 shows the recurrent neural network with one hidden layer which is used in this work. In this type of network model, the input layer is not only fed to the hidden layer but also fed back into the input layer. The network is a supervised AR-MRNN with inputs $(\varepsilon_t - z, \varepsilon_{t-1} - z, \ldots, \varepsilon_{t-(p-1)} - z)$; the desired output is $\varepsilon_{t+1} - z$. As shown, the receiving end of the input layer is a long-term memory for the network. The function of this memory is to hold the data and pass it to the hidden layer immediately after each pattern is sent from the input layer. In this way the network is able to see the previous knowledge it had about previous inputs. Thus the long-term memory remembers the new input data and uses it when the next pattern is processed. The disadvantage of this network is that it takes longer to train.

[Figure 2: Recurrent neural network used in HIPM, with inputs $\varepsilon_t - z, \ldots, \varepsilon_{t-(p-1)} - z$, hidden neurons $h_{11}, \ldots, h_{1n}$, a long-term memory block feeding the hidden layer, and output $\hat{\varepsilon}_{t+1} - z$.]

The input layer has a number of input neurons equal to the chosen regression order, that is, $p$, possessing a linear activation function. The hidden layer has $n$ neurons, $h_{11}, \ldots, h_{1n}$, possessing some activation function. The output neuron, $h_{21}$, also possesses some activation function. Thus the network is able to learn how to predict complex nonlinear patterns in the data. This network is also known as a Jordan-Elman neural network, and it trains itself using the backpropagation algorithm [36, 37].
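An Elman-style network of this kind can be sketched in plain numpy, as below. This is an illustrative sketch only: the layer sizes, learning rate, initialization, and the toy sine task are my assumptions, not the paper's configuration (which uses 16 hidden neurons on return-error series), and the update treats the context as a fixed extra input at each step (Elman's original simplification) rather than backpropagating through time.

```python
import numpy as np

class ElmanRNN:
    """Minimal Elman-style network: one sigmoid hidden layer whose
    previous activations feed back as context ('long-term memory')."""

    def __init__(self, n_in, n_hidden, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.5, (n_hidden, n_in))
        self.W_ctx = rng.normal(0.0, 0.5, (n_hidden, n_hidden))
        self.W_out = rng.normal(0.0, 0.5, n_hidden)
        self.ctx = np.zeros(n_hidden)    # context / memory units
        self.lr = lr

    @staticmethod
    def _sig(x):
        return 1.0 / (1.0 + np.exp(-x))

    def step(self, x, target):
        """Forward pass plus one plain-backpropagation update."""
        h = self._sig(self.W_in @ x + self.W_ctx @ self.ctx)
        out = self._sig(self.W_out @ h)
        err = target - out
        d_out = err * out * (1.0 - out)
        d_h = d_out * self.W_out * h * (1.0 - h)
        self.W_out += self.lr * d_out * h
        self.W_in += self.lr * np.outer(d_h, x)
        self.W_ctx += self.lr * np.outer(d_h, self.ctx)
        self.ctx = h                     # remember for the next pattern
        return out, err * err

# toy task: one-step-ahead prediction of a sine series scaled into (0, 1)
t = np.linspace(0.0, 4.0 * np.pi, 200)
s = 0.5 + 0.4 * np.sin(t)
net = ElmanRNN(n_in=3, n_hidden=8)
mse_first = None
for epoch in range(200):
    net.ctx[:] = 0.0
    sq_errs = []
    for i in range(2, len(s) - 1):
        _, e2 = net.step(s[i - 2: i + 1], s[i + 1])
        sq_errs.append(e2)
    if mse_first is None:
        mse_first = float(np.mean(sq_errs))
mse_last = float(np.mean(sq_errs))
```

The sigmoid output restricts predictions to (0, 1), which is why the toy targets are scaled into that range; for return-error series one would rescale inputs and targets similarly.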

4. Error Metrics

The performance of HIPM is checked here by using two error metrics: mean square error and mean absolute error. A brief discussion of these error metrics is given in the following subsections.

4.1. Mean Square Error (MSE). MSE is the arithmetic mean of the sum of the squares of the forecast errors. MSE is a standard metric for comparing the differences between two time series and is defined as shown in the following equation:

$$\mathrm{MSE} = \frac{1}{T} \sum_{t=1}^{T} \left( r_t - \hat{r}_t \right)^2 \quad (19)$$

where $r_t$ and $\hat{r}_t$ are actual returns and predicted returns, respectively, and $T$ is the length of the series.

4.2. Mean Absolute Error (MAE). MAE is a quantity used to measure how close forecasts are to the eventual outcomes. MAE measures the average magnitude of the errors in a set of forecasts without considering their direction. The mean absolute error is given in the following equation:

$$\mathrm{MAE} = \frac{1}{T} \sum_{t=1}^{T} \left| r_t - \hat{r}_t \right| \quad (20)$$

where $r_t$ and $\hat{r}_t$ are actual returns and predicted returns, respectively, and $T$ is the length of the series.

Table 1: Company list.

1: Tech Mahindra    2: HCL    3: Mindtree
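The two error metrics, equations (19) and (20), can be sketched directly; the small example values below are made up for illustration.

```python
import numpy as np

def mse(actual, predicted):
    """Mean square error, equation (19)."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean((a - p) ** 2))

def mae(actual, predicted):
    """Mean absolute error, equation (20)."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs(a - p)))

actual = [0.01, -0.02, 0.03]
pred = [0.02, -0.02, 0.01]
mse_val = mse(actual, pred)   # errors: -0.01, 0.00, 0.02
mae_val = mae(actual, pred)
```

MSE penalizes large errors more heavily because of the squaring, while MAE weighs all errors proportionally, which is why the paper reports both.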

5. Application of HIPM on Stock Market Data

In order to verify the performance of the HIPM predictor, real-world stock data has been used in the experiments. The stock data of three different information technology companies has been used here; these companies are given in Table 1.

Stock data of the above three companies has been obtained from the Bombay Stock Exchange, India (http://www.bseindia.com). Daily adjusted closing prices of the three stocks have been taken from 14-05-2013 to 30-12-2013, and returns for 164 days (15-05-2013 to 30-12-2013) are calculated using (1). Predictions were obtained using ESM and RNN.

5.1. Predictions Using ESM. Initially, predictions are obtained using a linear prediction model; here, ESM has been chosen for the purpose. The value of the smoothing factor $\alpha$ in ESM has been obtained using the following optimization model:

$$\min \; \frac{1}{T} \sum_{t=1}^{T} \left( r_t - \hat{r}_{\mathrm{ESM}(t)} \right)^2 \quad \text{s.t.} \quad 0 \le \alpha \le 1 \quad (21)$$

where $r_t$ is the actual return, $\hat{r}_{\mathrm{ESM}(t)}$ is the prediction obtained from ESM, $\alpha$ is the smoothing factor, and $T$ is the length of the historical series. The smoothing factor $\alpha$ is associated with the term $\hat{r}_{\mathrm{ESM}(t)}$ as shown in (2). Thus (21) is an objective function that minimizes the MSE of the predictions obtained from the exponential smoothing technique, and its constraint guarantees that the value of the smoothing factor $\alpha$ ranges between 0 and 1. Since ESM is a linear prediction model, it did not produce satisfactory predictions on its own, resulting in high prediction error.

[Figure 3: Multiplicative method output of HIPM for stock 2, showing actual returns (solid line) and predicted returns $\hat{r}_{t+1}$ (dotted line) over the test period.]
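The one-dimensional, box-constrained problem (21) can be solved by a simple grid search over $\alpha$; this sketch is my substitute for whatever solver the paper used, and the random-walk toy series below is an assumption for illustration.

```python
import numpy as np

def esm_mse(returns, alpha):
    """MSE of one-step-ahead ESM predictions (equation (2)) for a given alpha."""
    r = np.asarray(returns, dtype=float)
    rhat = np.empty_like(r)
    rhat[0] = r[0]
    for t in range(len(r) - 1):
        rhat[t + 1] = rhat[t] + alpha * (r[t] - rhat[t])
    return float(np.mean((r - rhat) ** 2))

def best_alpha(returns, grid=np.linspace(0.0, 1.0, 101)):
    """Grid search over [0, 1] for the alpha minimizing objective (21)."""
    scores = [esm_mse(returns, a) for a in grid]
    return float(grid[int(np.argmin(scores))])

rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(0.0, 0.01, 300))   # toy random-walk series
alpha_star = best_alpha(series)
```

For a random-walk-like series the optimum lies near $\alpha = 1$ (the predictor then tracks the latest observation), while smoother series favor smaller $\alpha$.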

5.2. Predictions Using RNN. After predictions were obtained using ESM and the series of errors calculated, these errors were given to the RNN by means of AR-MRNN($p$, $k$) ((14) to (16)). AR-MRNN(6, 1) was used to obtain the stock predictions. Regression order $p = 6$ was chosen after trial and error, as it was observed that with this particular regression order the RNN produced less prediction error. For each stock, the data was divided into two equal parts (50:50). Out of 164 returns, 50% of the data, that is, 82 returns between 15-05-2013 and 05-09-2013, was kept for training the RNN, and the remaining 50%, that is, 82 returns between 06-09-2013 and 30-12-2013, for testing. Sliding windows, each of 82 returns, were created. For each window, input-output pairs were calculated using the AR-MRNN(6, 1) method, which results in 76 input-output pairs per window. 83 sliding windows were formed, each giving a prediction for a future period; thus 83 windows give 83 future predictions, with the initial window giving the prediction for the period $r_{t+1}$. By combining the 83 sliding windows, 6308 input-output pairs were obtained. These sliding windows were given to the RNN and trained in a supervised manner, and the procedure was repeated for all stocks.

In the chosen RNN model there are 16 neurons in the hidden layer possessing a sigmoid activation function, and one output neuron in the output layer, also possessing a sigmoid activation function. An error threshold of MSE = 0.0002 was preset for the RNN; that is, the RNN was considered converged only after the average error fell below the threshold. The RNN took over 10000 epochs for each stock to reach the preset error.

Figures 3 and 4 show the prediction output of HIPM (between 06-09-2013 and 31-12-2013) via the multiplicative method and the summation method for stock 2 and stock 3, respectively. Actual returns are shown by a blue solid line, whereas predictions are shown by an orange dotted line. The HIPM predictor is able to capture nonlinear patterns of the data very well. As shown, actual and predicted returns are very close to each other, which implies that the predictions are satisfactory. Six similar figures were obtained in total, of which only two are displayed here.

[Figure 4: Summation method output of HIPM for stock 3, showing actual returns (solid line) and predicted returns $\hat{r}_{t+1}$ (dotted line) over the test period.]

Table 2: Values of error metrics for HIPM.

                        Stock 1   Stock 2   Stock 3
Summation method
  MSE                   0.0005    0.0008    0.0025
  MAE                   0.0176    0.0205    0.0362
Multiplicative method
  MSE                   0.0004    0.0005    0.0007
  MAE                   0.0172    0.0175    0.0200

The performance of the HIPM predictor can be better judged from Table 2, which displays the values of the error metrics obtained. As seen, the multiplicative method outperforms the summation method in terms of lower prediction error.

6. Conclusions

A new and promising approach for the prediction of stock returns is presented in this paper. A hybrid intelligent prediction model is developed by combining predictions obtained from a linear prediction model and a nonlinear model. The linear model chosen is the exponential smoothing model, while an autoregressive moving reference neural network is chosen as the nonlinear model. This is a new approach wherein errors are fed into a neural network so as to obtain minimized errors. Initially, predictions of stock returns are obtained using the exponential smoothing model, and the prediction errors are calculated. The autoregressive moving reference method is used to calculate input-output pairs for the errors just obtained. These errors are fed into a recurrent neural network, and the network learns using the backpropagation algorithm in a supervised manner. Finally, the prediction of stocks is calculated via two methods: the summation method and the multiplicative method. Based on the results, it is observed that the proposed model is able to detect the nonlinear patterns of the data very well, and the results are satisfactory. Input to the neural network is given as differences rather than original values; the network thus needs to find smaller weights, increasing its prediction performance. The performance of the proposed hybrid model can be further improved and applied in other areas too; this is certainly an important avenue for future research.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

References

[1] G. B. Durham, "SV mixture models with application to S&P 500 index returns," Journal of Financial Economics, vol. 85, no. 3, pp. 822–856, 2007.

[2] C. H. Chen, "Neural networks for financial market prediction," in Proceedings of the 1994 IEEE International Conference on Neural Networks, pp. 1199–1202, Orlando, Fla, USA, June 1994.

[3] C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121–167, 1998.

[4] R. Majhi, G. Panda, G. Sahoo, A. Panda, and A. Choubey, "Prediction of S&P 500 and DJIA stock indices using particle swarm optimization technique," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1276–1282, Hong Kong, China, June 2008.

[5] S. Haykin, Neural Networks: A Comprehensive Foundation, MacMillan College, New York, NY, USA, 1994.

[6] G. Armano, M. Marchesi, and A. Murru, "A hybrid genetic-neural architecture for stock indexes forecasting," Information Sciences, vol. 170, no. 1, pp. 3–33, 2005.

[7] K.-J. Kim and I. Han, "Genetic algorithms approach to feature discretization in artificial neural networks for the prediction of stock price index," Expert Systems with Applications, vol. 19, no. 2, pp. 125–132, 2000.

[8] W. Shen, X. Guo, C. Wu, and D. Wu, "Forecasting stock indices using radial basis function neural networks optimized by artificial fish swarm algorithm," Knowledge-Based Systems, vol. 24, no. 3, pp. 378–385, 2011.

[9] F. D. Freitas, A. F. de Souza, and A. R. de Almeida, "Prediction-based portfolio optimization model using neural networks," Neurocomputing, vol. 72, no. 10–12, pp. 2155–2170, 2009.

[10] J.-J. Wang, J.-Z. Wang, Z.-G. Zhang, and S.-P. Guo, "Stock index forecasting based on a hybrid model," Omega, vol. 40, no. 6, pp. 758–766, 2012.

[11] M. Khashei and M. Bijari, "An artificial neural network (p, d, q) model for timeseries forecasting," Expert Systems with Applications, vol. 37, no. 1, pp. 479–489, 2010.

[12] A.-S. Chen, M. T. Leung, and H. Daouk, "Application of neural networks to an emerging financial market: forecasting and trading the Taiwan Stock Index," Computers & Operations Research, vol. 30, no. 6, pp. 901–923, 2003.

[13] A. Jain and A. M. Kumar, "Hybrid neural network models for hydrologic time series forecasting," Applied Soft Computing Journal, vol. 7, no. 2, pp. 585–592, 2007.

[14] J.-Z. Wang, J.-J. Wang, Z.-G. Zhang, and S.-P. Guo, "Forecasting stock indices with back propagation neural network," Expert Systems with Applications, vol. 38, no. 11, pp. 14346–14355, 2011.

[15] G. S. da S. Gomes, T. B. Ludermir, and L. M. M. R. Lima, "Comparison of new activation functions in neural network for forecasting financial time series," Neural Computing and Applications, vol. 20, no. 3, pp. 417–439, 2011.

[16] C. Hamzacebi, "Improving artificial neural networks' performance in seasonal time series forecasting," Information Sciences, vol. 178, no. 23, pp. 4550–4559, 2008.

[17] M. M. Mostafa, "Forecasting stock exchange movements using neural networks: empirical evidence from Kuwait," Expert Systems with Applications, vol. 37, no. 9, pp. 6302–6309, 2010.

[18] S. Samarasinghe, Neural Networks for Applied Sciences and Engineering, Auerbach Publications, Taylor & Francis, New York, NY, USA, 2007.

[19] K.-J. Kim and H. Ahn, "Simultaneous optimization of artificial neural networks for financial forecasting," Applied Intelligence, vol. 36, no. 4, pp. 887–898, 2012.

[20] Y.-K. Kwon and B.-R. Moon, "A hybrid neurogenetic approach for stock forecasting," IEEE Transactions on Neural Networks, vol. 18, no. 3, pp. 851–864, 2007.

[21] P. G. Zhang, "Time series forecasting using a hybrid ARIMA and neural network model," Neurocomputing, vol. 50, pp. 159–175, 2003.

[22] T.-J. Hsieh, H.-F. Hsiao, and W.-C. Yeh, "Forecasting stock markets using wavelet transforms and recurrent neural networks: an integrated system based on artificial bee colony algorithm," Applied Soft Computing Journal, vol. 11, no. 2, pp. 2510–2525, 2011.

[23] M.-Y. Chen, D.-R. Chen, M.-H. Fan, and T.-Y. Huang, "International transmission of stock market movements: an adaptive neuro-fuzzy inference system for analysis of TAIEX forecasting," Neural Computing and Applications, vol. 23, pp. 369–378, 2013.

[24] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[25] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[26] G. Sermpinis, K. Theofilatos, A. Karathanasopoulos, E. F. Georgopoulos, and C. Dunis, "Forecasting foreign exchange rates with adaptive neural networks using radial-basis functions and particle swarm optimization," European Journal of Operational Research, vol. 225, no. 3, pp. 528–540, 2013.

[27] H. Ni and H. Yin, "Exchange rate prediction using hybrid neural networks and trading indicators," Neurocomputing, vol. 72, no. 13–15, pp. 2815–2823, 2009.

[28] M.-Y. Chen, "A hybrid ANFIS model for business failure prediction utilizing particle swarm optimization and subtractive clustering," Information Sciences, vol. 220, pp. 180–195, 2013.

[29] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University – Computer and Information Sciences, vol. 26, pp. 7–18, 2014.

[30] R Brown Smoothing Forecasting and Prediction of DiscreteTime Series Courier Dover Publications 2004

Advances in Artificial Neural Systems 7

[31] G E P Box and G M Jenkins Time Series Analysis Forecastingand Control Holden-day San Francisco Calif USA 1970

[32] G U Yule ldquoWhy do we sometimes get nonsense correlationsbetween time series A study in sampling and the nature of timeseriesrdquo Journal of the Royal Statistical Society vol 89 pp 30ndash411926

[33] H O Wold A Study in the Analysis of Stationary Time SeriesAlmgrist amp Wiksell Stockholm Sweden 1938

[34] A M Rather ldquoA prediction based approach for stock returnsusing autoregressive neural networksrdquo in Proceedings of theWorld Congress on Information and Communication Technolo-gies (WICT rsquo11) pp 1271ndash1275 Mumbai India December 2011

[35] A M Rather ldquoOptimization of predicted portfolio usingvarious autoregressive neural networksrdquo in Proceedings of theInternational Conference on Communication Systems and Net-work Technologies (CSNT rsquo12) pp 265ndash269 IEEE May 2012

[36] D Rumelhart and J McClelland Parallel Distributed ProcessingMIT Press Cambridge Mass USA 1986

[37] S Seker E Ayaz and E Turkcan ldquoElmanrsquos recurrent neuralnetwork applications to condition monitoring in nuclear powerplant and rotating machineryrdquo Engineering Applications ofArtificial Intelligence vol 16 no 7-8 pp 647ndash656 2003



2 Advances in Artificial Neural Systems

ANNs are not always guaranteed to yield the desired results. In order to solve such problems, researchers attempted a global optimization approach for ANNs to predict the stock price index [19]. With the goal of further improving the performance of predictors, researchers have also attempted to develop hybrid models for the prediction of stock returns; the hybridization may include the integration of linear and nonlinear models [10, 20, 21]. Researchers developed a hybrid forecasting model by integrating a recurrent neural network based on the artificial bee colony algorithm with wavelet transforms [22]. In another work, an adaptive network-based fuzzy inference system was used to develop a hybrid prediction model [23]. A hybrid system was proposed by integrating a linear autoregressive integrated moving average model and an ANN [11]. To overcome the prediction problem arising with the use of technical indicators, researchers proposed a hybrid model merging ANNs and genetic programming [24]; feature selection was also used to improve the performance of that system. In another work, a random walk prediction model was merged with an ANN to form a hybrid model [25]. Researchers also introduced a hybrid ANN architecture combining particle swarm optimization and adaptive radial basis functions [26]. Various regressive ANN models, such as self-organising maps and support vector regressions, were combined into a hybrid model to predict foreign exchange currency rates [27]. For business failure prediction, a hybrid model was proposed in which particle swarm optimization and a network-based fuzzy inference system were used [28]. In a recent work, researchers created a hybrid prediction model by integrating an autoregressive moving average model with differential evolution based training of its feedforward and feedback parameters [29]; this hybrid model has been compared with other similar hybrid models, and the results confirm that it outperforms them.

The rest of the paper is arranged as follows. Section 2 discusses various prediction-based models, including those used in this work. The proposed hybrid model is described in detail in Section 3. Section 4 discusses two commonly used error metrics. In Section 5, experiments and results are presented. Finally, Section 6 presents conclusions.

2. Prediction-Based Models

Stock return, or simply return, refers to the profit on an investment. The return $R_t$ at time $t$ is calculated using (1), where $p_t$ and $p_{t-1}$ are the prices of the stock at times $t$ and $t-1$, respectively:

$$R_t = \frac{p_t - p_{t-1}}{p_{t-1}}. \qquad (1)$$

A few prediction-based models available in the literature, including those used in this work, are described below.

2.1. Exponential Smoothing Model. The exponential smoothing model (ESM) is used to obtain predictions on time series data [30]. It computes a one-step-ahead prediction by means of a geometric sum of past observations, as shown in the following equation:

$$\hat{r}_{t+1} = \hat{r}_t + \alpha\,(r_t - \hat{r}_t), \qquad (2)$$

where $\hat{r}_t$ is the prediction of $r_t$, $\hat{r}_{t+1}$ is the prediction for the future value, $\alpha$ is a smoothing parameter in the range $(0, 1)$, and $r_t - \hat{r}_t$ is the prediction error. Exponential smoothing assigns exponentially decreasing weights over time: the observations are weighted, with more weight given to the most recent ones.
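The recursion in (2) can be sketched in a few lines of Python. This is an illustrative sketch, not the author's code; the return series is made up, and seeding the recursion with the first observation is a common convention rather than something the paper specifies.

```python
def exp_smooth_predictions(returns, alpha):
    """One-step-ahead exponential smoothing per Eq. (2):
    r_hat[t+1] = r_hat[t] + alpha * (r[t] - r_hat[t])."""
    r_hat = [returns[0]]                  # seed with the first observation
    for r in returns[:-1]:
        r_hat.append(r_hat[-1] + alpha * (r - r_hat[-1]))
    return r_hat                          # r_hat[t] is the prediction for returns[t]

# Made-up daily return series and smoothing factor:
preds = exp_smooth_predictions([0.01, -0.02, 0.015, 0.0, 0.01], alpha=0.5)
```

With $\alpha = 0.5$ each prediction moves halfway toward the latest observed return, which is the exponentially decaying weighting described above.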

2.2. Autoregressive Moving Average Model. The autoregressive moving average model was introduced by Box and Jenkins [31] for time series prediction and was inspired by the early work of Yule [32] and Wold [33]. The model consists of an autoregressive part, AR($p$), and a moving average part, MA($q$), which is why it is referred to as the ARMA($p$, $q$) model. The AR($p$) process can be generally expressed as

$$r_t = \phi_1 r_{t-1} + \phi_2 r_{t-2} + \cdots + \phi_p r_{t-p} + \varepsilon_t. \qquad (3)$$

Similarly, the MA($q$) process can be generally expressed as

$$r_t = \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q}. \qquad (4)$$

Hence, the ARMA($p$, $q$) model can be expressed as

$$r_t = \phi_1 r_{t-1} + \cdots + \phi_p r_{t-p} + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \cdots - \theta_q \varepsilon_{t-q}, \qquad (5)$$

where $p$ and $q$ represent the orders of the autoregressive and moving average parts, respectively; $\phi$ and $\theta$ are coefficients that satisfy stationarity of the series; and $\varepsilon_t$ is random error (white noise) at time $t$, with zero mean and variance $\sigma_\varepsilon^2$.
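With known coefficients, the one-step ARMA($p$, $q$) prediction implied by (5) can be sketched as below, setting the unknown future shock $\varepsilon_t$ to its zero mean. The coefficients and series here are invented for illustration; in practice $\phi$ and $\theta$ are estimated from the data.

```python
def arma_one_step(returns, errors, phi, theta):
    """One-step ARMA(p, q) prediction per Eq. (5), with the unknown
    future shock epsilon_t replaced by its mean (zero).
    'returns' and 'errors' are ordered oldest to newest."""
    p, q = len(phi), len(theta)
    ar = sum(phi[i] * returns[-1 - i] for i in range(p))
    ma = -sum(theta[j] * errors[-1 - j] for j in range(q))
    return ar + ma

# Hypothetical ARMA(2, 1): r_t = 0.5 r_{t-1} - 0.2 r_{t-2} + eps_t - 0.3 eps_{t-1}
pred = arma_one_step(returns=[0.02, 0.01], errors=[0.005],
                     phi=[0.5, -0.2], theta=[0.3])
```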

2.3. Autoregressive Moving Reference Regression Model. This is a regression model proposed by Freitas et al. [9] and is adopted in this work as well; the same model has been used in [34, 35] along with different ANN models.

Consider time series data with $T$ past returns, as shown in the following equation:

$$r' = (r_{t-(T-1)}, \ldots, r_{t-1}, r_t). \qquad (6)$$

Based on the available historical data, predicting the future return of a stock can be defined as the process in which the elements of the past returns are used to obtain an estimate of the future return $\hat{r}_{t+l}$, where $l \ge 1$. The value of $l$ directly affects the choice of the adopted prediction method.

This model generates input-output pairs that are given to an ANN in a supervised manner; the resulting network can thus be called an autoregressive moving reference neural network, AR-MRNN($p$, $k$), where $p$ is the order of regression and $k$ is the delay from the point of reference.


Using an autoregressive predictor, say $\mathcal{R}$, AR-MRNN($p$, $k$) implements the prediction system shown in the following equation:

$$\hat{r}_{t+1} - z = \mathcal{R}\left(r_{t-(p-1)} - z, \ldots, r_{t-1} - z, r_t - z\right), \qquad (7)$$

where $\hat{r}_{t+1} - z$ is the prediction for $r_{t+1} - z$ obtained at time $t$ from the information available in the historical series $r' = (r_{t-(T-1)}, \ldots, r_{t-1}, r_t)$, $p$ is the order of regression, and $z$ is the moving reference given in the following equation:

$$z = r_{t-(p-1)-k}. \qquad (8)$$

After training the neural network, the prediction $\hat{r}_{t+1}$ is obtained as shown in the following equation:

$$\hat{r}_{t+1} = (\hat{r}_{t+1} - z) + z. \qquad (9)$$

According to (7), inputs to the neural network are given as differences rather than original values; the network thus requires smaller weights, which improves its ability to generalize. The output obtained from the neural network is not the final prediction; the final predictions are calculated by adding the value of $z$ back to the output.
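The construction of AR-MRNN($p$, $k$) input-output pairs in (7)-(9) can be sketched as follows. This is an illustrative sketch with a made-up series, not the author's implementation.

```python
def armrnn_pairs(series, p, k):
    """Build AR-MRNN(p, k) training pairs per Eqs. (7)-(9): inputs and
    target are both shifted by the moving reference z (Eq. (8))."""
    pairs = []
    for t in range(p - 1 + k, len(series) - 1):   # t indexes the last input
        z = series[t - (p - 1) - k]               # moving reference, Eq. (8)
        inputs = [series[i] - z for i in range(t - (p - 1), t + 1)]
        target = series[t + 1] - z                # network learns r_{t+1} - z
        pairs.append((inputs, target, z))
    return pairs

# Made-up return series, p = 3, k = 1:
series = [0.01, 0.03, -0.02, 0.04, 0.00, 0.02, -0.01]
pairs = armrnn_pairs(series, p=3, k=1)
inputs, target, z = pairs[0]
# Adding z back recovers the original next value, as in Eq. (9):
assert abs((target + z) - series[4]) < 1e-12
```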

3. The Proposed Hybrid Intelligent Prediction Model

This section discusses the proposed hybrid intelligent prediction model (HIPM) in detail. Consider the actual returns of a time series given by $r_t$ $(t = 1, \ldots, T)$, and let the predictions obtained by any linear model be denoted by $\hat{r}_t$ $(t = 1, \ldots, T)$. The difference between the actual series $r_t$ and the predicted series $\hat{r}_t$ is known as the prediction error, or simply error, calculated here as $\varepsilon_t = r_t - \hat{r}_t$.

In HIPM, the final predictions are obtained via two methods, the summation method and the multiplicative method; these methods are defined below.

3.1. Summation Method. In the summation method, the actual data equals the predicted linear data plus the error terms, as shown in the following equation:

$$r_t = \hat{r}_t + \varepsilon_t. \qquad (10)$$

The error terms are thus calculated as

$$\varepsilon_t = r_t - \hat{r}_t. \qquad (11)$$

3.2. Multiplicative Method. In the multiplicative method, the actual data equals the predicted linear data multiplied by the error terms, as shown in the following equation:

$$r_t = \hat{r}_t \times \varepsilon_t. \qquad (12)$$

The error terms are multiplied back onto the linear predictions because they were calculated as

$$\varepsilon_t = \frac{r_t}{\hat{r}_t}. \qquad (13)$$
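The two error definitions, (11) and (13), can be computed side by side. This is an illustrative sketch with invented numbers, not the paper's code.

```python
def error_series(actual, predicted, method):
    """Prediction-error series per Eq. (11) ('summation') or
    Eq. (13) ('multiplicative')."""
    if method == "summation":
        return [r - rh for r, rh in zip(actual, predicted)]
    if method == "multiplicative":
        return [r / rh for r, rh in zip(actual, predicted)]
    raise ValueError("method must be 'summation' or 'multiplicative'")

actual    = [0.020, -0.010, 0.030]   # invented actual returns
predicted = [0.015, -0.020, 0.025]   # invented linear predictions
add_err = error_series(actual, predicted, "summation")       # r_t - r_hat_t
mul_err = error_series(actual, predicted, "multiplicative")  # r_t / r_hat_t
```

Note that (13) implicitly assumes the linear predictions are nonzero; predicted returns close to zero would make the multiplicative error series very large.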

Figure 1: Work flow of HIPM (actual returns $r_t$ enter the linear model, producing predictions $\hat{r}_t$; the errors $\varepsilon_t$ are fed to the ANN; the minimized errors yield final predictions via the summation method, $r_t = \hat{r}_t + \varepsilon_t$, and the multiplicative method, $r_t = \hat{r}_t \times \varepsilon_t$).

The series of errors obtained by the above two methods is given to the ANN by means of AR-MRNN($p$, $k$) with $k = 1$. In terms of AR-MRNN, (7) to (9) are thus modified as given below:

$$\hat{\varepsilon}_{t+1} - z = \mathcal{R}\left(\varepsilon_{t-(p-1)} - z, \ldots, \varepsilon_{t-1} - z, \varepsilon_t - z\right), \qquad (14)$$

where $\hat{\varepsilon}_{t+1} - z$ is the estimate of $\varepsilon_{t+1} - z$ obtained at time $t$ from the previously obtained error series $\varepsilon' = (\varepsilon_{t-(T-1)}, \ldots, \varepsilon_{t-1}, \varepsilon_t)$, $p$ is the order of regression, and $z$ is the moving reference given in the following equation:

$$z = \varepsilon_{t-(p-1)-k}. \qquad (15)$$

After training the neural network, $\hat{\varepsilon}_{t+1}$ is obtained as shown in (16); these are the minimized errors:

$$\hat{\varepsilon}_{t+1} = (\hat{\varepsilon}_{t+1} - z) + z. \qquad (16)$$

These minimized errors replace the original errors in (10) and (12). Hence, the final predictions from HIPM are obtained by the summation method and the multiplicative method as shown in the following two equations, respectively:

$$y_t = \hat{r}_t + \hat{\varepsilon}_t, \qquad (17)$$

$$y_t = \hat{r}_t \times \hat{\varepsilon}_t. \qquad (18)$$

Figure 1 shows the complete work flow of HIPM. Initially, the actual returns $r_t$ are given as input to the linear prediction model, through which predictions $\hat{r}_t$ are obtained. In the next step, the errors $\varepsilon_t$ are calculated. These errors are fed into the ANN, which performs nonlinear processing. The minimized errors $\hat{\varepsilon}_t$ obtained from the ANN are used to calculate the final predictions via the two methods, that is, the summation method and the multiplicative method.
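Once the ANN has produced the minimized error series, the final combination in (17) and (18) is a one-liner per method. The error values below are invented stand-ins for the ANN output, purely for illustration.

```python
linear_preds = [0.015, -0.020, 0.025]   # r_hat_t from the linear model (invented)
err_sum      = [0.004,  0.009, 0.006]   # minimized summation errors (invented)
err_mul      = [1.30,   0.48,  1.18]    # minimized multiplicative errors (invented)

final_sum = [rh + eh for rh, eh in zip(linear_preds, err_sum)]   # Eq. (17)
final_mul = [rh * eh for rh, eh in zip(linear_preds, err_mul)]   # Eq. (18)
```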

3.3. Recurrent Neural Network with Autoregressive Moving Reference. The importance of a recurrent neural network (RNN) is that it responds to the same input pattern differently at different times. Figure 2 shows the recurrent neural network with one hidden layer that is used in this work. In this type of network model, the input layer is not only fed to the hidden layer but also fed back into the input layer.


Figure 2: Recurrent neural network used in HIPM (inputs $\varepsilon_t - z$, $\varepsilon_{t-1} - z$, ..., $\varepsilon_{t-(p-1)} - z$ into $p$ input neurons; hidden neurons $h_{11}, h_{12}, \ldots, h_{1n}$; output neuron $h_{21}$ producing $\hat{\varepsilon}_{t+1} - z$; a long-term memory feeds back at the input layer).

The network is a supervised AR-MRNN with inputs $(\varepsilon_t - z, \varepsilon_{t-1} - z, \ldots, \varepsilon_{t-(p-1)} - z)$; the desired output is $\hat{\varepsilon}_{t+1} - z$. As shown, the receiving end of the input layer is a long-term memory for the network. The function of this memory is to hold the data and pass it to the hidden layer immediately after each pattern is sent from the input layer. In this way the network is able to see the knowledge it had about previous inputs: the long-term memory remembers the new input data and uses it when the next pattern is processed. The disadvantage of this network is that it takes longer to train.

The input layer has a number of input neurons equal to the chosen regression order $p$, each possessing a linear activation function. The hidden layer contains $n$ neurons, $h_{11}, \ldots, h_{1n}$, each possessing some activation function, and the output neuron $h_{21}$ also possesses some activation function. The network is thus able to learn to predict complex nonlinear patterns in the data. This network is also known as a Jordan-Elman neural network, and it trains itself using the backpropagation algorithm [36, 37].
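A single forward pass of such a network can be sketched as follows. This is a minimal Elman-style illustration with random weights, where the context memory stores the previous hidden state; it is not the trained network of the paper, whose memory sits at the input layer, but it demonstrates the key property that the same input can produce different outputs at different times.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def elman_forward(inputs, context, w_in, w_ctx, w_out):
    """One step of an Elman-style RNN: hidden units see the current inputs
    plus the context (previous hidden state), which then gets updated."""
    n_hidden = len(w_out)
    hidden = [
        sigmoid(sum(w * x for w, x in zip(w_in[j], inputs)) +
                sum(w * c for w, c in zip(w_ctx[j], context)))
        for j in range(n_hidden)
    ]
    output = sigmoid(sum(w * h for w, h in zip(w_out, hidden)))
    return output, hidden   # 'hidden' becomes the next step's context

random.seed(0)
p, n_hidden = 6, 16   # regression order and hidden size used in the paper
w_in  = [[random.uniform(-0.5, 0.5) for _ in range(p)] for _ in range(n_hidden)]
w_ctx = [[random.uniform(-0.5, 0.5) for _ in range(n_hidden)] for _ in range(n_hidden)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(n_hidden)]

context = [0.0] * n_hidden
out1, context = elman_forward([0.01] * p, context, w_in, w_ctx, w_out)
out2, context = elman_forward([0.01] * p, context, w_in, w_ctx, w_out)
# Same input, different output, because the context memory changed.
```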

4. Error Metrics

The performance of HIPM is assessed here using two error metrics, mean square error and mean absolute error. A brief discussion of these metrics is given in the following subsections.

4.1. Mean Square Error (MSE). MSE is the arithmetic mean of the sum of the squared forecast errors. It is a standard metric for comparing the differences between two time series and is defined as shown in the following equation:

$$\mathrm{MSE} = \frac{1}{T}\sum_{t=1}^{T}\left(r_t - \hat{r}_t\right)^2, \qquad (19)$$

where $r_t$ and $\hat{r}_t$ are the actual and predicted returns, respectively, and $T$ is the length of the series.

4.2. Mean Absolute Error (MAE). MAE is a quantity used to measure how close forecasts are to the eventual outcomes: it measures the average magnitude of the errors in a set of forecasts, without considering their direction. The mean absolute error is given in the following equation:

$$\mathrm{MAE} = \frac{1}{T}\sum_{t=1}^{T}\left|r_t - \hat{r}_t\right|, \qquad (20)$$

where $r_t$ and $\hat{r}_t$ are the actual and predicted returns, respectively, and $T$ is the length of the series.

Table 1: Company list.

  Stock 1: Tech Mahindra
  Stock 2: HCL
  Stock 3: Mindtree
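The two metrics in (19) and (20) translate directly into code; the sample series below is invented for illustration.

```python
def mse(actual, predicted):
    """Mean square error, Eq. (19)."""
    return sum((r - rh) ** 2 for r, rh in zip(actual, predicted)) / len(actual)

def mae(actual, predicted):
    """Mean absolute error, Eq. (20)."""
    return sum(abs(r - rh) for r, rh in zip(actual, predicted)) / len(actual)

actual    = [0.02, -0.01, 0.03, 0.00]   # invented actual returns
predicted = [0.01, -0.02, 0.02, 0.01]   # invented predictions, each off by 0.01
```

Because every prediction here is off by 0.01, MAE is 0.01 while MSE, which squares the errors, is 0.0001; squaring penalizes occasional large errors much more heavily than MAE does.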

5. Application of HIPM on Stock Market Data

In order to verify the performance of the HIPM predictor, real-world stock data has been used for the experiments: the stock data of the three information technology companies given in Table 1.

The stock data of the above three companies has been obtained from the Bombay Stock Exchange, India (http://www.bseindia.com). Daily adjusted closing prices of the three stocks have been taken from 14-05-2013 to 30-12-2013, and the returns of 164 days (15-05-2013 to 30-12-2013) are calculated using (1). Predictions were then obtained using ESM and RNN.

5.1. Predictions Using ESM. Initially, predictions are obtained using a linear prediction model; here ESM has been chosen for the purpose. The value of the smoothing factor $\alpha$ in ESM has been obtained using the following optimization model:

$$\min \; \frac{1}{T}\sum_{t=1}^{T}\left(r_t - \hat{r}_{\mathrm{ESM}}(t)\right)^2 \quad \text{s.t.} \quad 0 \le \alpha \le 1, \qquad (21)$$

where $r_t$ is the actual return, $\hat{r}_{\mathrm{ESM}}(t)$ is the prediction obtained from ESM, $\alpha$ is the smoothing factor, and $T$ is the length of the historical series. The smoothing factor $\alpha$ enters through the term $\hat{r}_{\mathrm{ESM}}(t)$, as shown in (2); thus (21) is an objective function that minimizes the MSE of the predictions obtained from exponential smoothing, and its constraint guarantees that the value of $\alpha$ lies between 0 and 1. Since ESM is a linear prediction model, it did not by itself produce satisfactory predictions, resulting in high prediction error.

Figure 3: Multiplicative method output of HIPM for stock 2 (actual returns versus predicted returns $\hat{r}_{t+1}$ over the 82-day test period).
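One simple way to solve (21) is a grid search over candidate values of $\alpha$, scoring each by the MSE of its one-step predictions. This is an illustrative sketch: the paper does not specify the solver, and dividing by the number of one-step errors ($T-1$, since the first observation seeds the recursion) is an assumption.

```python
def esm_mse(returns, alpha):
    """MSE of one-step exponential smoothing predictions (Eqs. (2), (21))."""
    pred, total = returns[0], 0.0          # seed the prediction with the first value
    for r in returns[1:]:
        total += (r - pred) ** 2
        pred = pred + alpha * (r - pred)   # Eq. (2) update
    return total / (len(returns) - 1)

def best_alpha(returns):
    """Grid search for the smoothing factor solving Eq. (21)."""
    grid = [i / 100.0 for i in range(101)]  # candidate alphas 0.00 ... 1.00
    return min(grid, key=lambda a: esm_mse(returns, a))
```

On a strongly trending series the search pushes $\alpha$ toward 1, since the most recent observation then carries the most predictive weight.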

5.2. Predictions Using RNN. After obtaining predictions using ESM and calculating the series of errors, these errors were given to the RNN by means of AR-MRNN($p$, $k$) ((14) to (16)); AR-MRNN(6, 1) was used to obtain the stock predictions. The regression order $p = 6$ was chosen by trial and error, as it was observed that with this particular order the RNN produced the least prediction error. For each stock, the data was divided into two equal parts (50:50): out of 164 returns, 50%, that is, the 82 returns between 15-05-2013 and 05-09-2013, was kept for training the RNN, and the remaining 50%, that is, the 82 returns between 06-09-2013 and 30-12-2013, for testing. Sliding windows, each of 82 returns, were created, and for each window input-output pairs were calculated using the AR-MRNN(6, 1) method, resulting in 76 input-output pairs per window. 83 sliding windows were formed, each giving a prediction for a future period, the initial window giving the prediction for period $r_{t+1}$; by combining the 83 sliding windows, 6308 input-output pairs were obtained. These sliding windows were given to the RNN and trained in a supervised manner, and the procedure was repeated for all stocks.

In the chosen RNN model there are 16 neurons in the hidden layer, each possessing a sigmoid activation function, and one output neuron in the output layer, also possessing a sigmoid activation function. An error threshold of MSE = 0.0002 was preset for the RNN, meaning the RNN converged only once the average error fell below this threshold; reaching it took over 10,000 epochs for each stock.

Figures 3 and 4 show the prediction output of HIPM (between 06-09-2013 and 30-12-2013) via the multiplicative method and the summation method for stock 2 and stock 3, respectively. Actual returns are shown by a blue solid line, whereas predictions are shown by an orange dotted line. The HIPM predictor is able to capture the nonlinear patterns of the data very well: as shown, actual and predicted returns are very close to each other, which implies that the predictions are satisfactory. Six similar figures were obtained in total, of which only two are displayed here.

Figure 4: Summation method output of HIPM for stock 3 (actual returns versus predicted returns $\hat{r}_{t+1}$ over the 82-day test period).

Table 2: Values of error metrics for HIPM.

                          Stock 1   Stock 2   Stock 3
  Summation method
    MSE                   0.0005    0.0008    0.0025
    MAE                   0.0176    0.0205    0.0362
  Multiplicative method
    MSE                   0.0004    0.0005    0.0007
    MAE                   0.0172    0.0175    0.0200

The performance of the HIPM predictor can be better judged from Table 2, which displays the values of the error metrics obtained. As seen, the multiplicative method outperforms the summation method, producing lower prediction error.

6. Conclusions

A new and promising approach for the prediction of stock returns is presented in this paper. A hybrid intelligent prediction model is developed by combining predictions obtained from a linear prediction model and a nonlinear model: the linear model chosen is the exponential smoothing model, while an autoregressive moving reference neural network is chosen as the nonlinear model. This is a new approach in which prediction errors are fed into a neural network so as to obtain minimized errors. Initially, predictions of stock returns are obtained using the exponential smoothing model and the prediction errors are calculated. The autoregressive moving reference method is then used to construct input-output pairs from these errors, which are fed into a recurrent neural network; the network learns using the backpropagation algorithm in a supervised manner. Finally, the predictions of the stocks are calculated via two methods, the summation method and the multiplicative method. Based on the results, it is observed that the proposed model detects the nonlinear patterns of the data very well and the results are satisfactory. The input to the neural network is given as differences rather than original values; the network thus needs to find only smaller weights, which increases its prediction performance. The performance of the proposed hybrid model can be further improved, and the model can be applied in other areas too; this is certainly an important avenue for future research.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

References

[1] G. B. Durham, "SV mixture models with application to S&P 500 index returns," Journal of Financial Economics, vol. 85, no. 3, pp. 822-856, 2007.

[2] C. H. Chen, "Neural networks for financial market prediction," in Proceedings of the 1994 IEEE International Conference on Neural Networks, pp. 1199-1202, Orlando, Fla, USA, June 1994.

[3] C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121-167, 1998.

[4] R. Majhi, G. Panda, G. Sahoo, A. Panda, and A. Choubey, "Prediction of S&P 500 and DJIA stock indices using particle swarm optimization technique," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1276-1282, Hong Kong, China, June 2008.

[5] S. Haykin, Neural Networks: A Comprehensive Foundation, MacMillan College, New York, NY, USA, 1994.

[6] G. Armano, M. Marchesi, and A. Murru, "A hybrid genetic-neural architecture for stock indexes forecasting," Information Sciences, vol. 170, no. 1, pp. 3-33, 2005.

[7] K.-J. Kim and I. Han, "Genetic algorithms approach to feature discretization in artificial neural networks for the prediction of stock price index," Expert Systems with Applications, vol. 19, no. 2, pp. 125-132, 2000.

[8] W. Shen, X. Guo, C. Wu, and D. Wu, "Forecasting stock indices using radial basis function neural networks optimized by artificial fish swarm algorithm," Knowledge-Based Systems, vol. 24, no. 3, pp. 378-385, 2011.

[9] F. D. Freitas, A. F. de Souza, and A. R. de Almeida, "Prediction-based portfolio optimization model using neural networks," Neurocomputing, vol. 72, no. 10-12, pp. 2155-2170, 2009.

[10] J.-J. Wang, J.-Z. Wang, Z.-G. Zhang, and S.-P. Guo, "Stock index forecasting based on a hybrid model," Omega, vol. 40, no. 6, pp. 758-766, 2012.

[11] M. Khashei and M. Bijari, "An artificial neural network (p, d, q) model for time series forecasting," Expert Systems with Applications, vol. 37, no. 1, pp. 479-489, 2010.

[12] A.-S. Chen, M. T. Leung, and H. Daouk, "Application of neural networks to an emerging financial market: forecasting and trading the Taiwan Stock Index," Computers & Operations Research, vol. 30, no. 6, pp. 901-923, 2003.

[13] A. Jain and A. M. Kumar, "Hybrid neural network models for hydrologic time series forecasting," Applied Soft Computing Journal, vol. 7, no. 2, pp. 585-592, 2007.

[14] J.-Z. Wang, J.-J. Wang, Z.-G. Zhang, and S.-P. Guo, "Forecasting stock indices with back propagation neural network," Expert Systems with Applications, vol. 38, no. 11, pp. 14346-14355, 2011.

[15] G. S. da S. Gomes, T. B. Ludermir, and L. M. M. R. Lima, "Comparison of new activation functions in neural network for forecasting financial time series," Neural Computing and Applications, vol. 20, no. 3, pp. 417-439, 2011.

[16] C. Hamzacebi, "Improving artificial neural networks' performance in seasonal time series forecasting," Information Sciences, vol. 178, no. 23, pp. 4550-4559, 2008.

[17] M. M. Mostafa, "Forecasting stock exchange movements using neural networks: empirical evidence from Kuwait," Expert Systems with Applications, vol. 37, no. 9, pp. 6302-6309, 2010.

[18] S. Samarasinghe, Neural Networks for Applied Sciences and Engineering, Auerbach Publications, Taylor & Francis, New York, NY, USA, 2007.

[19] K.-J. Kim and H. Ahn, "Simultaneous optimization of artificial neural networks for financial forecasting," Applied Intelligence, vol. 36, no. 4, pp. 887-898, 2012.

[20] Y.-K. Kwon and B.-R. Moon, "A hybrid neurogenetic approach for stock forecasting," IEEE Transactions on Neural Networks, vol. 18, no. 3, pp. 851-864, 2007.

[21] P. G. Zhang, "Time series forecasting using a hybrid ARIMA and neural network model," Neurocomputing, vol. 50, pp. 159-175, 2003.

[22] T.-J. Hsieh, H.-F. Hsiao, and W.-C. Yeh, "Forecasting stock markets using wavelet transforms and recurrent neural networks: an integrated system based on artificial bee colony algorithm," Applied Soft Computing Journal, vol. 11, no. 2, pp. 2510-2525, 2011.

[23] M.-Y. Chen, D.-R. Chen, M.-H. Fan, and T.-Y. Huang, "International transmission of stock market movements: an adaptive neuro-fuzzy inference system for analysis of TAIEX forecasting," Neural Computing and Applications, vol. 23, pp. 369-378, 2013.

[24] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651-671, 2013.

[25] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441-1449, 2014.

[26] G. Sermpinis, K. Theofilatos, A. Karathanasopoulos, E. F. Georgopoulos, and C. Dunis, "Forecasting foreign exchange rates with adaptive neural networks using radial-basis functions and particle swarm optimization," European Journal of Operational Research, vol. 225, no. 3, pp. 528-540, 2013.

[27] H. Ni and H. Yin, "Exchange rate prediction using hybrid neural networks and trading indicators," Neurocomputing, vol. 72, no. 13-15, pp. 2815-2823, 2009.

[28] M.-Y. Chen, "A hybrid ANFIS model for business failure prediction utilizing particle swarm optimization and subtractive clustering," Information Sciences, vol. 220, pp. 180-195, 2013.

[29] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University—Computer and Information Sciences, vol. 26, pp. 7-18, 2014.

[30] R. Brown, Smoothing, Forecasting and Prediction of Discrete Time Series, Courier Dover Publications, 2004.

[31] G. E. P. Box and G. M. Jenkins, Time Series Analysis: Forecasting and Control, Holden-Day, San Francisco, Calif, USA, 1970.

[32] G. U. Yule, "Why do we sometimes get nonsense correlations between time series? A study in sampling and the nature of time series," Journal of the Royal Statistical Society, vol. 89, pp. 30-41, 1926.

[33] H. O. Wold, A Study in the Analysis of Stationary Time Series, Almqvist & Wiksell, Stockholm, Sweden, 1938.

[34] A. M. Rather, "A prediction based approach for stock returns using autoregressive neural networks," in Proceedings of the World Congress on Information and Communication Technologies (WICT '11), pp. 1271-1275, Mumbai, India, December 2011.

[35] A. M. Rather, "Optimization of predicted portfolio using various autoregressive neural networks," in Proceedings of the International Conference on Communication Systems and Network Technologies (CSNT '12), pp. 265-269, IEEE, May 2012.

[36] D. Rumelhart and J. McClelland, Parallel Distributed Processing, MIT Press, Cambridge, Mass, USA, 1986.

[37] S. Seker, E. Ayaz, and E. Turkcan, "Elman's recurrent neural network applications to condition monitoring in nuclear power plant and rotating machinery," Engineering Applications of Artificial Intelligence, vol. 16, no. 7-8, pp. 647-656, 2003.


Advances in Artificial Neural Systems 3

Using an autoregressive predictor, say $\mathcal{R}$, AR-MRNN($p$, $k$) implements the prediction system given in the following equation:

$$\hat{r}_{t+1} - z = \mathcal{R}\left(r_{t-(p-1)} - z, \ldots, r_{t-1} - z, r_t - z\right) \quad (7)$$

where $\hat{r}_{t+1} - z$ is the prediction of $r_{t+1} - z$ obtained at time $t$ from the information available in the historical series $r' = (r_{t-(T-1)}, \ldots, r_{t-1}, r_t)$, $p$ is the order of regression, and $z$ is the moving reference given in the following equation:

$$z = r_{t-(p-1)-k} \quad (8)$$

After training the neural network, the prediction $\hat{r}_{t+1}$ is obtained as shown in the following equation:

$$\hat{r}_{t+1} = \left(\hat{r}_{t+1} - z\right) + z \quad (9)$$

According to (7), inputs to the neural network are given as differences rather than original values; the network thus requires smaller weight values, which improves its ability to generalize. The output obtained from the neural network is not the final prediction; final predictions are calculated by adding the value $z$ back to the output.
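The moving-reference construction of (7)-(9) can be sketched in code. This is an illustrative reconstruction, not the paper's implementation: the function names and the example values of $p$ and $k$ are mine, and a 1-D NumPy array of returns is assumed.

```python
import numpy as np

def ar_mrnn_patterns(series, p=4, k=1):
    """Build AR-MRNN(p, k) input-output pairs from a 1-D series.

    For each target position t+1, the moving reference is
    z = series[t-(p-1)-k]; the inputs are the p most recent values
    minus z, and the target is series[t+1] - z (cf. (7) and (8)).
    """
    series = np.asarray(series, dtype=float)
    X, y, zs = [], [], []
    # earliest usable target index: needs p inputs plus k extra points for z
    for t1 in range(p + k, len(series)):
        z = series[t1 - p - k]           # z = r_{t-(p-1)-k}, with t = t1 - 1
        X.append(series[t1 - p:t1] - z)  # r_{t-(p-1)} - z, ..., r_t - z
        y.append(series[t1] - z)         # target r_{t+1} - z
        zs.append(z)
    return np.array(X), np.array(y), np.array(zs)

def recover_prediction(net_output, z):
    """Equation (9): add the moving reference back to the network output."""
    return net_output + z
```

Feeding differences against a moving reference, rather than raw values, is what keeps the network weights small.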

3. The Proposed Hybrid Intelligent Prediction Model

This section discusses the proposed hybrid intelligent prediction model (HIPM) in detail. Consider the actual returns of a time series, $r_t$ $(t = 1, \ldots, T)$, and let the predictions obtained by any linear model be denoted by $\hat{r}_t$ $(t = 1, \ldots, T)$. The difference between the actual series $r_t$ and the predicted series $\hat{r}_t$ is known as the prediction error, or simply the error, calculated as $\varepsilon_t = r_t - \hat{r}_t$.

In HIPM, predictions are obtained via two methods, the summation method and the multiplicative method, defined below.

3.1. Summation Method. According to the summation method, the actual data equals the predicted linear data plus the error terms, as shown in the following equation:

$$r_t = \hat{r}_t + \varepsilon_t \quad (10)$$

The error terms are thus calculated as shown in the following equation:

$$\varepsilon_t = r_t - \hat{r}_t \quad (11)$$

3.2. Multiplicative Method. For the multiplicative method, the actual data equals the predicted linear data multiplied by the error terms, as shown in the following equation:

$$r_t = \hat{r}_t \times \varepsilon_t \quad (12)$$

The error terms are multiplied back into the linear predictions because they are calculated as shown in the following equation:

$$\varepsilon_t = \frac{r_t}{\hat{r}_t} \quad (13)$$
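The two error definitions, (11) and (13), translate directly into code. A minimal sketch (the function name is mine), assuming NumPy arrays of actual and linearly predicted returns, and nonzero predictions when the multiplicative form is used:

```python
import numpy as np

def error_series(actual, linear_pred, method="summation"):
    """Prediction-error series later fed to the network.

    summation:       eps_t = r_t - rhat_t   (equation (11))
    multiplicative:  eps_t = r_t / rhat_t   (equation (13))
    """
    actual = np.asarray(actual, dtype=float)
    linear_pred = np.asarray(linear_pred, dtype=float)
    if method == "summation":
        return actual - linear_pred
    if method == "multiplicative":
        # assumes rhat_t != 0; in practice returns may need shifting first
        return actual / linear_pred
    raise ValueError(method)
```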

Figure 1: Work flow of HIPM. Actual returns $r_t$ enter the linear model, which yields predictions $\hat{r}_t$; errors $\varepsilon_t = r_t - \hat{r}_t$ are fed to the ANN, whose minimized errors produce the final predictions via the summation method ($\hat{r}_t + \hat{\varepsilon}_t$) and the multiplicative method ($\hat{r}_t \times \hat{\varepsilon}_t$).

The series of errors obtained by the above two methods is given to the ANN by means of AR-MRNN($p$, $k$) with $k = 1$. In terms of AR-MRNN, (7) to (9) are thus modified as given below:

$$\hat{\varepsilon}_{t+1} - z = \mathcal{R}\left(\varepsilon_{t-(p-1)} - z, \ldots, \varepsilon_{t-1} - z, \varepsilon_t - z\right) \quad (14)$$

where $\hat{\varepsilon}_{t+1} - z$ is the estimate of $\varepsilon_{t+1} - z$ obtained at time $t$ from the previously obtained error series $\varepsilon' = (\varepsilon_{t-(T-1)}, \ldots, \varepsilon_{t-1}, \varepsilon_t)$, $p$ is the order of regression, and $z$ is the moving reference given in the following equation:

$$z = \varepsilon_{t-(p-1)-k} \quad (15)$$

After training the neural network, $\hat{\varepsilon}_{t+1}$ is obtained as shown in (16); thus minimized errors are obtained:

$$\hat{\varepsilon}_{t+1} = \left(\hat{\varepsilon}_{t+1} - z\right) + z \quad (16)$$

These minimized errors replace the original errors in (10) and (12). Hence, the final predictions from HIPM are obtained by the summation method and the multiplicative method as shown in the following two equations, respectively:

$$y_t = \hat{r}_t + \hat{\varepsilon}_t \quad (17)$$

$$y_t = \hat{r}_t \times \hat{\varepsilon}_t \quad (18)$$

Figure 1 shows the complete work flow of HIPM. Initially, actual returns $r_t$ are given as input to the linear prediction model, through which predictions $\hat{r}_t$ are obtained. In the next step, errors $\varepsilon_t$ are calculated. These errors are fed into the ANN, which performs nonlinear processing. The minimized errors $\hat{\varepsilon}_t$ obtained from the ANN are used to calculate the final predictions via the two methods, that is, the summation method and the multiplicative method.
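The workflow above can be condensed into one function. This is a sketch under the assumption that `refine` stands in for the trained AR-MRNN: any callable mapping the raw error series to a refined (minimized) error series of equal length.

```python
import numpy as np

def hipm_predict(actual, linear_pred, refine, method="summation"):
    """End-to-end HIPM flow of Figure 1 (sketch).

    `refine` is a stand-in for the AR-MRNN error model.
    """
    actual = np.asarray(actual, dtype=float)
    linear_pred = np.asarray(linear_pred, dtype=float)
    if method == "summation":
        eps = actual - linear_pred        # equation (11)
        return linear_pred + refine(eps)  # equation (17)
    eps = actual / linear_pred            # equation (13)
    return linear_pred * refine(eps)      # equation (18)
```

With a perfect error model (`refine` the identity), both methods recover the actual series exactly; the network's job is to approximate that mapping one step ahead.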

3.3. Recurrent Neural Network with Autoregressive Moving Reference. The importance of a recurrent neural network (RNN) is that it responds to the same input pattern differently at different times. Figure 2 shows the recurrent neural network with one hidden layer which is used in this work. In this type of network model, the input layer is not only fed to the hidden layer but also fed back into the input layer.

Figure 2: Recurrent neural network used in HIPM. Inputs $\varepsilon_{t-(p-1)} - z, \ldots, \varepsilon_{t-1} - z, \varepsilon_t - z$ feed hidden neurons $h_{11}, \ldots, h_{1n}$ and output neuron $h_{21}$, with a long-term memory at the input layer; the desired output is $\varepsilon_{t+1} - z$.

The network is a supervised AR-MRNN with inputs $(\varepsilon_t - z, \varepsilon_{t-1} - z, \ldots, \varepsilon_{t-(p-1)} - z)$; the desired output is $\varepsilon_{t+1} - z$. As shown, the receiving end of the input layer is a long-term memory for the network. The function of this memory is to hold the data and pass it to the hidden layer immediately after each pattern is sent from the input layer. In this way the network is able to see the knowledge it had about previous inputs. Thus the long-term memory remembers the new input data and uses it when the next pattern is processed. The disadvantage of this network is that it takes longer to train.

The input layer has a number of input neurons equal to the chosen regression order $p$, possessing a linear activation function. The hidden layer contains $n$ neurons, $h_{11}, \ldots, h_{1n}$, each possessing a nonlinear activation function. The output neuron, $h_{21}$, also possesses a nonlinear activation function. Thus the network is able to learn to predict complex nonlinear patterns in the data. This network is also known as a Jordan-Elman neural network, and it trains itself using the backpropagation algorithm [36, 37].
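The recurrent behavior described above, a context memory that makes the network respond differently to the same pattern at different times, can be illustrated with a minimal Elman-style forward pass. This is a sketch with randomly initialized weights, not the paper's trained Jordan-Elman network, and backpropagation training is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ElmanRNN:
    """Minimal Elman-style network: one hidden layer plus a context
    (long-term memory) copy of the previous hidden state."""

    def __init__(self, n_in, n_hidden):
        self.Wx = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input weights
        self.Wc = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context weights
        self.bh = np.zeros(n_hidden)
        self.Wo = rng.normal(scale=0.1, size=n_hidden)              # output weights
        self.bo = 0.0
        self.context = np.zeros(n_hidden)

    def step(self, x):
        # the hidden state mixes the current pattern with the stored context
        h = sigmoid(self.Wx @ x + self.Wc @ self.context + self.bh)
        self.context = h  # memory reused when the next pattern is processed
        return float(sigmoid(self.Wo @ h + self.bo))
```

Presenting the same input twice in a row yields two different outputs, because the context vector has changed in between: exactly the property the section describes.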

4. Error Metrics

The performance of HIPM is evaluated using two error metrics, the mean square error and the mean absolute error. A brief discussion of these metrics is given in the following subsections.

4.1. Mean Square Error (MSE). MSE is the arithmetic mean of the squared forecast errors. It is a standard metric for comparing the differences between two time series and is defined as shown in the following equation:

$$\mathrm{MSE} = \frac{1}{T}\sum_{t=1}^{T}\left(r_t - \hat{r}_t\right)^2 \quad (19)$$

where $r_t$ and $\hat{r}_t$ are the actual and predicted returns, respectively, and $T$ is the length of the series.

4.2. Mean Absolute Error (MAE). MAE is a quantity used to measure how close forecasts are to the eventual outcomes. It measures the average magnitude of the errors in a set of forecasts, without considering their direction. The mean absolute error is given in the following equation:

$$\mathrm{MAE} = \frac{1}{T}\sum_{t=1}^{T}\left|r_t - \hat{r}_t\right| \quad (20)$$

where $r_t$ and $\hat{r}_t$ are the actual and predicted returns, respectively, and $T$ is the length of the series.

Table 1: Company list.

1: Tech Mahindra    2: HCL    3: Mindtree
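Equations (19) and (20) translate directly into code; a minimal sketch (function names are mine):

```python
import numpy as np

def mse(actual, predicted):
    """Mean square error, equation (19)."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return float(np.mean((actual - predicted) ** 2))

def mae(actual, predicted):
    """Mean absolute error, equation (20)."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return float(np.mean(np.abs(actual - predicted)))
```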

5. Application of HIPM on Stock Market Data

In order to verify the performance of the HIPM predictor, real-world stock data has been used in the experiments. The stock data of three information technology companies, listed in Table 1, has been used.

The stock data of the above three companies was obtained from the Bombay Stock Exchange, India (http://www.bseindia.com). Daily adjusted closing prices of the three stocks were taken from 14-05-2013 to 30-12-2013, and returns for 164 days (15-05-2013 to 30-12-2013) were calculated using (1). Predictions were then obtained using ESM and RNN.

5.1. Predictions Using ESM. Initially, predictions are obtained using a linear prediction model; ESM has been chosen for this purpose. The value of the smoothing factor $\alpha$ in ESM has been obtained using the following optimization model:

$$\mathrm{Min} \quad \frac{1}{T}\sum_{t=1}^{T}\left(r_t - \hat{r}_{\mathrm{ESM}}(t)\right)^2 \qquad \text{s.t.} \quad 0 \le \alpha \le 1 \quad (21)$$

where $r_t$ is the actual return, $\hat{r}_{\mathrm{ESM}}(t)$ is the prediction obtained from ESM, $\alpha$ is the smoothing factor, and $T$ is the length of the historical series.

Figure 3: Multiplicative method output of HIPM for stock 2 (actual returns, solid line, versus predicted returns $\hat{r}_{t+1}$, dotted line).

The smoothing factor $\alpha$ is associated with the term $\hat{r}_{\mathrm{ESM}}(t)$ as shown in (2). Thus (21) is an objective function which minimizes the MSE of the predictions obtained from the exponential smoothing technique, and its constraint guarantees that the smoothing factor $\alpha$ lies between 0 and 1. Since ESM is a linear prediction model, it did not, as expected, produce satisfactory predictions, resulting in high prediction error.
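One practical way to solve (21) is a grid search over $\alpha$. This sketch assumes simple (single) exponential smoothing with the previous smoothed value as the one-step forecast; the initialization with the first observation and the grid resolution are my choices, not stated in the paper.

```python
import numpy as np

def esm_predictions(returns, alpha):
    """One-step-ahead simple exponential smoothing:
    rhat_{t+1} = alpha * r_t + (1 - alpha) * rhat_t."""
    returns = np.asarray(returns, dtype=float)
    pred = np.empty_like(returns)
    pred[0] = returns[0]  # common initialization choice
    for t in range(1, len(returns)):
        pred[t] = alpha * returns[t - 1] + (1 - alpha) * pred[t - 1]
    return pred

def best_alpha(returns, grid=np.linspace(0.0, 1.0, 101)):
    """Solve (21) approximately: pick the alpha in [0, 1] minimizing MSE."""
    returns = np.asarray(returns, dtype=float)
    scores = [np.mean((returns - esm_predictions(returns, a)) ** 2) for a in grid]
    return float(grid[int(np.argmin(scores))])
```

Note that $\alpha = 1$ reduces ESM to the naive forecast $\hat{r}_{t+1} = r_t$, which is a useful sanity check.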

5.2. Predictions Using RNN. After obtaining predictions using ESM and calculating the series of errors, these errors were given to the RNN by means of AR-MRNN($p$, $k$) ((14) to (16)). AR-MRNN(6, 1) was used to obtain the stock predictions; the regression order $p = 6$ was chosen after trial and error, as it was observed that with this regression order the RNN produced the least prediction error. For each stock, the data was divided into two equal parts (50:50). Out of 164 returns, 50% of the data, that is, 82 returns between 15-05-2013 and 05-09-2013, was kept for training the RNN, and the remaining 50%, that is, 82 returns between 06-09-2013 and 30-12-2013, for testing. Sliding windows, each of 82 returns, were created. For each window, input-output pairs were calculated using the AR-MRNN(6, 1) method, resulting in 76 input-output pairs per window. 83 sliding windows were formed, each giving a prediction for a future period; thus the 83 windows give 83 future predictions, while the initial window gives the prediction for period $r_{t+1}$. By combining the 83 sliding windows, 6308 input-output pairs were obtained. These sliding windows were given to the RNN and trained in a supervised manner, and the procedure was repeated for all stocks.

In the chosen RNN model there are 16 neurons in the hidden layer, possessing a sigmoid activation function, and one output neuron in the output layer, also possessing a sigmoid activation function. An error threshold of MSE = 0.0002 was preset for the RNN, meaning the RNN was considered converged only after the average error fell below this threshold. The RNN took over 10,000 epochs per stock to reach the preset error.
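The windowing step above (164 returns, windows of length 82, hence 83 overlapping windows) can be sketched as follows; the AR-MRNN(6, 1) pair construction inside each window follows Section 3, and the exact per-window pair count depends on the indexing convention used.

```python
import numpy as np

def sliding_windows(series, window=82):
    """Overlapping windows of fixed length, each shifted by one step."""
    series = np.asarray(series, dtype=float)
    n = len(series) - window + 1  # 164 - 82 + 1 = 83 windows
    return [series[i:i + window] for i in range(n)]
```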

Figures 3 and 4 show the prediction output of HIPM (between 06-09-2013 and 31-12-2013) via the multiplicative method and the summation method for stock 2 and stock 3, respectively. Actual returns are shown by a blue solid line, whereas predictions are shown by an orange dotted line. The HIPM predictor is able to capture the nonlinear patterns of the data very well. As shown, the actual and predicted returns are very close to each other, which implies that the predictions are satisfactory. Six such figures were obtained in total, of which only two are displayed here.

Figure 4: Summation method output of HIPM for stock 3 (actual returns, solid line, versus predicted returns $\hat{r}_{t+1}$, dotted line).

Table 2: Values of error metrics for HIPM.

                         Stock 1   Stock 2   Stock 3
Summation method
    MSE                   0.0005    0.0008    0.0025
    MAE                   0.0176    0.0205    0.0362
Multiplicative method
    MSE                   0.0004    0.0005    0.0007
    MAE                   0.0172    0.0175    0.0200

The performance of the HIPM predictor can be better judged from Table 2, which displays the values of the error metrics obtained. As seen, the multiplicative method outperforms the summation method, yielding lower prediction error.

6. Conclusions

A new and promising approach for the prediction of stock returns is presented in this paper. A hybrid intelligent prediction model is developed by combining predictions obtained from a linear prediction model and a nonlinear model. The linear model chosen is the exponential smoothing model, while an autoregressive moving reference neural network is chosen as the nonlinear model. This is a new approach wherein errors are fed into a neural network so as to obtain minimized errors. Initially, predictions of stock returns are obtained using the exponential smoothing model, and the prediction errors are calculated. The autoregressive moving reference method is used to construct input-output pairs from the errors just obtained. These errors are fed into a recurrent neural network, and the network learns using the backpropagation algorithm in a supervised manner. Finally, the prediction of stocks is calculated via two methods, the summation method and the multiplicative method. Based on the results, it is observed that the proposed model is able to detect the nonlinear patterns of the data very well, and the results are satisfactory. Input to the neural network is given as differences rather than original values; the network thus needs to find smaller weights, which increases its prediction performance. The performance of the proposed hybrid model can be further improved and applied in other areas too; this is certainly an important avenue for future research.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

References

[1] G. B. Durham, "SV mixture models with application to S&P 500 index returns," Journal of Financial Economics, vol. 85, no. 3, pp. 822–856, 2007.

[2] C. H. Chen, "Neural networks for financial market prediction," in Proceedings of the 1994 IEEE International Conference on Neural Networks, pp. 1199–1202, Orlando, Fla, USA, June 1994.

[3] C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121–167, 1998.

[4] R. Majhi, G. Panda, G. Sahoo, A. Panda, and A. Choubey, "Prediction of S&P 500 and DJIA stock indices using particle swarm optimization technique," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1276–1282, Hong Kong, China, June 2008.

[5] S. Haykin, Neural Networks: A Comprehensive Foundation, MacMillan College, New York, NY, USA, 1994.

[6] G. Armano, M. Marchesi, and A. Murru, "A hybrid genetic-neural architecture for stock indexes forecasting," Information Sciences, vol. 170, no. 1, pp. 3–33, 2005.

[7] K.-J. Kim and I. Han, "Genetic algorithms approach to feature discretization in artificial neural networks for the prediction of stock price index," Expert Systems with Applications, vol. 19, no. 2, pp. 125–132, 2000.

[8] W. Shen, X. Guo, C. Wu, and D. Wu, "Forecasting stock indices using radial basis function neural networks optimized by artificial fish swarm algorithm," Knowledge-Based Systems, vol. 24, no. 3, pp. 378–385, 2011.

[9] F. D. Freitas, A. F. de Souza, and A. R. de Almeida, "Prediction-based portfolio optimization model using neural networks," Neurocomputing, vol. 72, no. 10–12, pp. 2155–2170, 2009.

[10] J.-J. Wang, J.-Z. Wang, Z.-G. Zhang, and S.-P. Guo, "Stock index forecasting based on a hybrid model," Omega, vol. 40, no. 6, pp. 758–766, 2012.

[11] M. Khashei and M. Bijari, "An artificial neural network (p, d, q) model for timeseries forecasting," Expert Systems with Applications, vol. 37, no. 1, pp. 479–489, 2010.

[12] A.-S. Chen, M. T. Leung, and H. Daouk, "Application of neural networks to an emerging financial market: forecasting and trading the Taiwan Stock Index," Computers & Operations Research, vol. 30, no. 6, pp. 901–923, 2003.

[13] A. Jain and A. M. Kumar, "Hybrid neural network models for hydrologic time series forecasting," Applied Soft Computing Journal, vol. 7, no. 2, pp. 585–592, 2007.

[14] J.-Z. Wang, J.-J. Wang, Z.-G. Zhang, and S.-P. Guo, "Forecasting stock indices with back propagation neural network," Expert Systems with Applications, vol. 38, no. 11, pp. 14346–14355, 2011.

[15] G. S. da S. Gomes, T. B. Ludermir, and L. M. M. R. Lima, "Comparison of new activation functions in neural network for forecasting financial time series," Neural Computing and Applications, vol. 20, no. 3, pp. 417–439, 2011.

[16] C. Hamzacebi, "Improving artificial neural networks' performance in seasonal time series forecasting," Information Sciences, vol. 178, no. 23, pp. 4550–4559, 2008.

[17] M. M. Mostafa, "Forecasting stock exchange movements using neural networks: empirical evidence from Kuwait," Expert Systems with Applications, vol. 37, no. 9, pp. 6302–6309, 2010.

[18] S. Samarasinghe, Neural Networks for Applied Sciences and Engineering, Auerbach Publications, Taylor & Francis, New York, NY, USA, 2007.

[19] K.-J. Kim and H. Ahn, "Simultaneous optimization of artificial neural networks for financial forecasting," Applied Intelligence, vol. 36, no. 4, pp. 887–898, 2012.

[20] Y.-K. Kwon and B.-R. Moon, "A hybrid neurogenetic approach for stock forecasting," IEEE Transactions on Neural Networks, vol. 18, no. 3, pp. 851–864, 2007.

[21] P. G. Zhang, "Time series forecasting using a hybrid ARIMA and neural network model," Neurocomputing, vol. 50, pp. 159–175, 2003.

[22] T.-J. Hsieh, H.-F. Hsiao, and W.-C. Yeh, "Forecasting stock markets using wavelet transforms and recurrent neural networks: an integrated system based on artificial bee colony algorithm," Applied Soft Computing Journal, vol. 11, no. 2, pp. 2510–2525, 2011.

[23] M.-Y. Chen, D.-R. Chen, M.-H. Fan, and T.-Y. Huang, "International transmission of stock market movements: an adaptive neuro-fuzzy inference system for analysis of TAIEX forecasting," Neural Computing and Applications, vol. 23, pp. 369–378, 2013.

[24] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[25] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[26] G. Sermpinis, K. Theofilatos, A. Karathanasopoulos, E. F. Georgopoulos, and C. Dunis, "Forecasting foreign exchange rates with adaptive neural networks using radial-basis functions and particle swarm optimization," European Journal of Operational Research, vol. 225, no. 3, pp. 528–540, 2013.

[27] H. Ni and H. Yin, "Exchange rate prediction using hybrid neural networks and trading indicators," Neurocomputing, vol. 72, no. 13–15, pp. 2815–2823, 2009.

[28] M.-Y. Chen, "A hybrid ANFIS model for business failure prediction utilizing particle swarm optimization and subtractive clustering," Information Sciences, vol. 220, pp. 180–195, 2013.

[29] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, pp. 7–18, 2014.

[30] R. Brown, Smoothing, Forecasting and Prediction of Discrete Time Series, Courier Dover Publications, 2004.

[31] G. E. P. Box and G. M. Jenkins, Time Series Analysis: Forecasting and Control, Holden-Day, San Francisco, Calif, USA, 1970.

[32] G. U. Yule, "Why do we sometimes get nonsense correlations between time series? A study in sampling and the nature of time series," Journal of the Royal Statistical Society, vol. 89, pp. 30–41, 1926.

[33] H. O. Wold, A Study in the Analysis of Stationary Time Series, Almqvist & Wiksell, Stockholm, Sweden, 1938.

[34] A. M. Rather, "A prediction based approach for stock returns using autoregressive neural networks," in Proceedings of the World Congress on Information and Communication Technologies (WICT '11), pp. 1271–1275, Mumbai, India, December 2011.

[35] A. M. Rather, "Optimization of predicted portfolio using various autoregressive neural networks," in Proceedings of the International Conference on Communication Systems and Network Technologies (CSNT '12), pp. 265–269, IEEE, May 2012.

[36] D. Rumelhart and J. McClelland, Parallel Distributed Processing, MIT Press, Cambridge, Mass, USA, 1986.

[37] S. Seker, E. Ayaz, and E. Turkcan, "Elman's recurrent neural network applications to condition monitoring in nuclear power plant and rotating machinery," Engineering Applications of Artificial Intelligence, vol. 16, no. 7-8, pp. 647–656, 2003.



[9] F D Freitas A F de Souza and A R de Almeida ldquoPrediction-based portfolio optimization model using neural networksrdquoNeurocomputing vol 72 no 10ndash12 pp 2155ndash2170 2009

[10] J-JWang J-ZWang Z-G Zhang and S-P Guo ldquoStock indexforecasting based on a hybrid modelrdquoOmega vol 40 no 6 pp758ndash766 2012

[11] M Khashei and M Bijari ldquoAn artificial neural network (pd q) model for timeseries forecastingrdquo Expert Systems withApplications vol 37 no 1 pp 479ndash489 2010

[12] A-S Chen M T Leung and H Daouk ldquoApplication of neuralnetworks to an emerging financial market forecasting andtrading the Taiwan Stock Indexrdquo Computers amp OperationsResearch vol 30 no 6 pp 901ndash923 2003

[13] A Jain and A M Kumar ldquoHybrid neural network modelsfor hydrologic time series forecastingrdquo Applied Soft ComputingJournal vol 7 no 2 pp 585ndash592 2007

[14] J-ZWang J-JWang Z-G Zhang and S-P Guo ldquoForecastingstock indices with back propagation neural networkrdquo ExpertSystems with Applications vol 38 no 11 pp 14346ndash14355 2011

[15] G S da SGomes T B Ludermir and L M M R LimaldquoComparison of new activation functions in neural networkfor forecasting financial time seriesrdquo Neural Computing andApplications vol 20 no 3 pp 417ndash439 2011

[16] C Hamzacebi ldquoImproving artificial neural networksrsquo perfor-mance in seasonal time series forecastingrdquo Information Sciencesvol 178 no 23 pp 4550ndash4559 2008

[17] M M Mostafa ldquoForecasting stock exchange movements usingneural networks empirical evidence from Kuwaitrdquo ExpertSystems with Applications vol 37 no 9 pp 6302ndash6309 2010

[18] S Samarasinghe Neural Networks for Applied Sciences andEngineering Auerbach Publications Taylor amp Francis NewYork NY USA 2007

[19] K-J Kim and H Ahn ldquoSimultaneous optimization of artificialneural networks for financial forecastingrdquo Applied Intelligencevol 36 no 4 pp 887ndash898 2012

[20] Y-K Kwon and B-R Moon ldquoA hybrid neurogenetic approachfor stock forecastingrdquo IEEE Transactions on Neural Networksvol 18 no 3 pp 851ndash864 2007

[21] P G Zhang ldquoTime series forecasting using a hybrid ARIMAand neural network modelrdquo Neurocomputing vol 50 pp 159ndash175 2003

[22] T-J Hsieh H-F Hsiao andW-C Yeh ldquoForecasting stock mar-kets using wavelet transforms and recurrent neural networksan integrated system based on artificial bee colony algorithmrdquoApplied Soft Computing Journal vol 11 no 2 pp 2510ndash25252011

[23] M-Y Chen D-R Chen M-H Fan and T-Y Huang ldquoInter-national transmission of stock market movements an adaptiveneuro-fuzzy inference system for analysis of TAIEX forecast-ingrdquo Neural Computing and Applications vol 23 pp 369ndash3782013

[24] C-M Hsu ldquoA hybrid procedure with feature selection forresolving stockfutures price forecasting problemsrdquo NeuralComputing and Applications vol 22 no 3-4 pp 651ndash671 2013

[25] R Adhikari and R K Agrawal ldquoA combination of artificialneural network and random walk models for financial timeseries forecastingrdquo Neural Computing and Applications vol 24no 6 pp 1441ndash1449 2014

[26] G Sermpinis KTheofilatos A Karathanasopoulos E F Geor-gopoulos and C Dunis ldquoForecasting foreign exchange rateswith adaptive neural networks using radial-basis functions andparticle swarm optimizationrdquo European Journal of OperationalResearch vol 225 no 3 pp 528ndash540 2013

[27] HNi andH Yin ldquoExchange rate prediction using hybrid neuralnetworks and trading indicatorsrdquo Neurocomputing vol 72 no13ndash15 pp 2815ndash2823 2009

[28] M-Y Chen ldquoA hybrid ANFIS model for business failure pre-diction utilizing particle swarm optimization and subtractiveclusteringrdquo Information Sciences vol 220 pp 180ndash195 2013

[29] M Rout B Majhi R Majhi and G Panda ldquoForecas ting ofcurrency exchange rates using an adaptive arma model withdifferential evolution based trainingrdquo Journal of King SaudUniversitymdashComputer and Information Sciences vol 26 pp 7ndash18 2014

[30] R Brown Smoothing Forecasting and Prediction of DiscreteTime Series Courier Dover Publications 2004

Advances in Artificial Neural Systems 7

[31] G E P Box and G M Jenkins Time Series Analysis Forecastingand Control Holden-day San Francisco Calif USA 1970

[32] G U Yule ldquoWhy do we sometimes get nonsense correlationsbetween time series A study in sampling and the nature of timeseriesrdquo Journal of the Royal Statistical Society vol 89 pp 30ndash411926

[33] H O Wold A Study in the Analysis of Stationary Time SeriesAlmgrist amp Wiksell Stockholm Sweden 1938

[34] A M Rather ldquoA prediction based approach for stock returnsusing autoregressive neural networksrdquo in Proceedings of theWorld Congress on Information and Communication Technolo-gies (WICT rsquo11) pp 1271ndash1275 Mumbai India December 2011

[35] A M Rather ldquoOptimization of predicted portfolio usingvarious autoregressive neural networksrdquo in Proceedings of theInternational Conference on Communication Systems and Net-work Technologies (CSNT rsquo12) pp 265ndash269 IEEE May 2012

[36] D Rumelhart and J McClelland Parallel Distributed ProcessingMIT Press Cambridge Mass USA 1986

[37] S Seker E Ayaz and E Turkcan ldquoElmanrsquos recurrent neuralnetwork applications to condition monitoring in nuclear powerplant and rotating machineryrdquo Engineering Applications ofArtificial Intelligence vol 16 no 7-8 pp 647ndash656 2003


Advances in Artificial Neural Systems 5

Figure 3: Multiplicative method output of HIPM for stock 2 (actual returns versus predicted returns, r_{t+1}).

series. The smoothing factor α is associated with the term r_ESM(t), as shown in (2). Thus (21) is an objective function which minimizes the MSE of the predictions obtained from the exponential smoothing technique; its constraint guarantees that the value of the smoothing factor α lies between 0 and 1. Since ESM is a linear prediction model, it did not produce satisfactory predictions on its own, resulting in high prediction error.
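The smoothing-and-fit step can be sketched as follows. The grid search over α is an assumed stand-in for whatever optimizer was actually used to solve (21), and `returns` is toy data:

```python
# Simple exponential smoothing (ESM) with the smoothing factor alpha chosen
# to minimize in-sample MSE, echoing objective (21) and its 0 < alpha < 1
# constraint. The grid search is an assumed stand-in for the actual optimizer.
import numpy as np

def esm_predict(returns, alpha):
    """One-step-ahead ESM predictions: s_t = alpha*r_{t-1} + (1-alpha)*s_{t-1}."""
    preds = np.empty(len(returns))
    preds[0] = returns[0]                      # initialize with first observation
    for t in range(1, len(returns)):
        preds[t] = alpha * returns[t - 1] + (1 - alpha) * preds[t - 1]
    return preds

def fit_alpha(returns, grid=None):
    """Pick alpha in (0, 1) that minimizes the MSE of the one-step predictions."""
    grid = np.linspace(0.01, 0.99, 99) if grid is None else grid
    mses = [np.mean((returns - esm_predict(returns, a)) ** 2) for a in grid]
    return float(grid[int(np.argmin(mses))])

returns = np.array([0.010, -0.020, 0.015, 0.005, -0.010, 0.020, 0.000, 0.012])
alpha = fit_alpha(returns)
errors = returns - esm_predict(returns, alpha)  # these errors feed the RNN stage
```

The residual `errors` series is exactly what the next stage consumes.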

5.2. Predictions Using RNN. After obtaining predictions using ESM, the series of errors was calculated, and these errors were given to the RNN by means of AR-MRNN(p, k) ((14) to (16)). AR-MRNN(6, 1) was used to obtain stock predictions. Regression order p = 6 was chosen by trial and error, as it was observed that with this particular regression order the RNN produced the least prediction error. For each stock, the data was divided into two equal parts (50:50). Out of 164 returns, 50% of the data, that is, 82 returns between 15-05-2013 and 05-09-2013, was kept for training the RNN, and the remaining 50%, that is, 82 returns between 06-09-2013 and 30-12-2013, for testing. Sliding windows, each of 82 returns, were created. For each window, input-output pairs were calculated using the AR-MRNN(6, 1) method, which results in 76 input-output pairs per window. 83 sliding windows were formed, each giving a prediction for a future period; thus 83 windows give 83 future predictions, while the initial window gives the prediction for the period r_{t+1}. By combining the 83 sliding windows, 6308 input-output pairs were obtained. These sliding windows were given to the RNN and trained in a supervised manner; the procedure was repeated for all stocks.
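The windowing arithmetic (82 returns, p = 6, hence 76 pairs per window) can be checked with a short sketch. The moving-reference differencing of AR-MRNN ((14)-(16)) is not reproduced here; this shows only the windowing step on a stand-in error series:

```python
# Build sliding-window input-output pairs from an error series, as in
# Section 5.2: with regression order p = 6, a window of 82 values yields
# 82 - 6 = 76 pairs. AR-MRNN's moving-reference differencing is omitted.
import numpy as np

def windowed_pairs(series, p=6):
    X = np.array([series[t - p:t] for t in range(p, len(series))])  # p lags
    y = np.array([series[t] for t in range(p, len(series))])        # next value
    return X, y

errors = np.random.default_rng(0).normal(scale=0.02, size=82)  # stand-in errors
X, y = windowed_pairs(errors, p=6)
```

Here `X` has shape (76, 6) and `y` has shape (76,), matching the 76 pairs per window stated above.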

In the chosen RNN model, there are 16 neurons in the hidden layer possessing a sigmoid activation function, and one output neuron in the output layer, also possessing a sigmoid activation function. An error threshold of MSE = 0.0002 was preset for the RNN; this means the RNN was considered converged only after the average error fell below the threshold. The RNN took over 10,000 epochs for each stock to reach the preset error.
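A minimal sketch of such a recurrent network (16 sigmoid hidden units, one sigmoid output, trained by gradient descent on MSE) might look as below. This is a simplified assumption-laden illustration, not the paper's implementation: gradients are not unrolled back through time, and the data is synthetic.

```python
# Elman-style recurrent network: 16 sigmoid hidden neurons and one sigmoid
# output neuron, trained by gradient descent on MSE. Simplified sketch:
# truncated backpropagation (no unrolling through time), synthetic data.
import numpy as np

rng = np.random.default_rng(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

n_in, n_hid = 6, 16
Wx = rng.normal(scale=0.3, size=(n_hid, n_in))   # input -> hidden
Wh = rng.normal(scale=0.3, size=(n_hid, n_hid))  # hidden -> hidden (recurrent)
Wo = rng.normal(scale=0.3, size=(1, n_hid))      # hidden -> output

X = rng.normal(size=(76, n_in))                  # stand-in input-output pairs
y = sig(X @ rng.normal(size=n_in))               # synthetic targets in (0, 1)

def run_epoch(lr=0.1):
    """One pass over the data; updates weights and returns the epoch MSE."""
    global Wx, Wh, Wo
    h = np.zeros(n_hid)
    total = 0.0
    for x, target in zip(X, y):
        h_new = sig(Wx @ x + Wh @ h)
        out = sig(Wo @ h_new)[0]
        err = out - target
        total += err ** 2
        d_out = err * out * (1 - out)                  # output-layer delta
        d_hid = (Wo[0] * d_out) * h_new * (1 - h_new)  # hidden-layer delta
        Wo -= lr * d_out * h_new[None, :]
        Wx -= lr * np.outer(d_hid, x)
        Wh -= lr * np.outer(d_hid, h)
        h = h_new
    return total / len(X)

losses = [run_epoch() for _ in range(200)]  # MSE per epoch
```

In the paper's setup, training would continue until the epoch MSE dropped below the preset 0.0002 threshold rather than for a fixed epoch count.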

Figures 3 and 4 show the prediction output of HIPM (between 06-09-2013 and 31-12-2013) via the multiplicative method and the summation method for stock 2 and stock 3,

Figure 4: Summation method output of HIPM for stock 3 (actual returns versus predicted returns, r_{t+1}).

Table 2: Values of error metrics for HIPM.

                          Stock 1   Stock 2   Stock 3
Summation method
    MSE                   0.0005    0.0008    0.0025
    MAE                   0.0176    0.0205    0.0362
Multiplicative method
    MSE                   0.0004    0.0005    0.0007
    MAE                   0.0172    0.0175    0.0200

respectively. Actual returns are shown by a blue solid line, whereas predictions are shown by an orange dotted line. The HIPM predictor is able to capture the nonlinear patterns of the data very well. As shown, actual and predicted returns are very close to each other, which implies that the predictions are satisfactory. Six such figures were obtained in total, of which only two are displayed above.

The performance of the HIPM predictor can be better judged from Table 2, which displays the values of the error metrics obtained. As seen, the multiplicative method outperforms the summation method, yielding lower prediction error.
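The two error metrics reported in Table 2 are the standard definitions, which can be computed as:

```python
# MSE and MAE as used for Table 2: mean squared error and mean absolute
# error between actual and predicted return series.
import numpy as np

def mse(actual, predicted):
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean((a - p) ** 2))

def mae(actual, predicted):
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs(a - p)))

# Toy usage with illustrative (not the paper's) return values:
actual = [0.010, -0.020, 0.015]
predicted = [0.012, -0.018, 0.011]
errors = (mse(actual, predicted), mae(actual, predicted))
```

Lower values on both metrics indicate predictions closer to the actual series, which is how Table 2 ranks the two combination methods.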

6. Conclusions

A new and promising approach for the prediction of stock returns is presented in this paper. A hybrid intelligent prediction model is developed by combining predictions obtained from a linear prediction model and a nonlinear model. The linear model chosen is the exponential smoothing model, while an autoregressive moving reference neural network is chosen as the nonlinear model. This is a new approach wherein errors are fed into a neural network so as to obtain minimized errors. Initially, predictions of stock returns are obtained using the exponential smoothing model, and the prediction errors are calculated. The autoregressive moving reference method is used to calculate input-output pairs for the errors just obtained. These errors are fed into a recurrent neural network, and the network learns using the backpropagation algorithm in a supervised manner. Finally, the prediction of stocks is calculated via two methods: the summation method and the multiplicative


method. Based on the results, it is observed that the proposed model is able to detect the nonlinear patterns of the data very well, and the results are satisfactory. Input to the neural network is given as differences rather than original values; the network thus needs to find smaller weights, increasing its prediction performance. The performance of the proposed hybrid model can be further improved and applied in other areas too; this is certainly an important avenue for future research.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

References

[1] G. B. Durham, "SV mixture models with application to S&P 500 index returns," Journal of Financial Economics, vol. 85, no. 3, pp. 822–856, 2007.

[2] C. H. Chen, "Neural networks for financial market prediction," in Proceedings of the 1994 IEEE International Conference on Neural Networks, pp. 1199–1202, Orlando, Fla, USA, June 1994.

[3] C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121–167, 1998.

[4] R. Majhi, G. Panda, G. Sahoo, A. Panda, and A. Choubey, "Prediction of S&P 500 and DJIA stock indices using particle swarm optimization technique," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1276–1282, Hong Kong, China, June 2008.

[5] S. Haykin, Neural Networks: A Comprehensive Foundation, Macmillan College, New York, NY, USA, 1994.

[6] G. Armano, M. Marchesi, and A. Murru, "A hybrid genetic-neural architecture for stock indexes forecasting," Information Sciences, vol. 170, no. 1, pp. 3–33, 2005.

[7] K.-J. Kim and I. Han, "Genetic algorithms approach to feature discretization in artificial neural networks for the prediction of stock price index," Expert Systems with Applications, vol. 19, no. 2, pp. 125–132, 2000.

[8] W. Shen, X. Guo, C. Wu, and D. Wu, "Forecasting stock indices using radial basis function neural networks optimized by artificial fish swarm algorithm," Knowledge-Based Systems, vol. 24, no. 3, pp. 378–385, 2011.

[9] F. D. Freitas, A. F. de Souza, and A. R. de Almeida, "Prediction-based portfolio optimization model using neural networks," Neurocomputing, vol. 72, no. 10–12, pp. 2155–2170, 2009.

[10] J.-J. Wang, J.-Z. Wang, Z.-G. Zhang, and S.-P. Guo, "Stock index forecasting based on a hybrid model," Omega, vol. 40, no. 6, pp. 758–766, 2012.

[11] M. Khashei and M. Bijari, "An artificial neural network (p, d, q) model for timeseries forecasting," Expert Systems with Applications, vol. 37, no. 1, pp. 479–489, 2010.

[12] A.-S. Chen, M. T. Leung, and H. Daouk, "Application of neural networks to an emerging financial market: forecasting and trading the Taiwan Stock Index," Computers & Operations Research, vol. 30, no. 6, pp. 901–923, 2003.

[13] A. Jain and A. M. Kumar, "Hybrid neural network models for hydrologic time series forecasting," Applied Soft Computing Journal, vol. 7, no. 2, pp. 585–592, 2007.

[14] J.-Z. Wang, J.-J. Wang, Z.-G. Zhang, and S.-P. Guo, "Forecasting stock indices with back propagation neural network," Expert Systems with Applications, vol. 38, no. 11, pp. 14346–14355, 2011.

[15] G. S. da S. Gomes, T. B. Ludermir, and L. M. M. R. Lima, "Comparison of new activation functions in neural network for forecasting financial time series," Neural Computing and Applications, vol. 20, no. 3, pp. 417–439, 2011.

[16] C. Hamzacebi, "Improving artificial neural networks' performance in seasonal time series forecasting," Information Sciences, vol. 178, no. 23, pp. 4550–4559, 2008.

[17] M. M. Mostafa, "Forecasting stock exchange movements using neural networks: empirical evidence from Kuwait," Expert Systems with Applications, vol. 37, no. 9, pp. 6302–6309, 2010.

[18] S. Samarasinghe, Neural Networks for Applied Sciences and Engineering, Auerbach Publications, Taylor & Francis, New York, NY, USA, 2007.

[19] K.-J. Kim and H. Ahn, "Simultaneous optimization of artificial neural networks for financial forecasting," Applied Intelligence, vol. 36, no. 4, pp. 887–898, 2012.

[20] Y.-K. Kwon and B.-R. Moon, "A hybrid neurogenetic approach for stock forecasting," IEEE Transactions on Neural Networks, vol. 18, no. 3, pp. 851–864, 2007.

[21] P. G. Zhang, "Time series forecasting using a hybrid ARIMA and neural network model," Neurocomputing, vol. 50, pp. 159–175, 2003.

[22] T.-J. Hsieh, H.-F. Hsiao, and W.-C. Yeh, "Forecasting stock markets using wavelet transforms and recurrent neural networks: an integrated system based on artificial bee colony algorithm," Applied Soft Computing Journal, vol. 11, no. 2, pp. 2510–2525, 2011.

[23] M.-Y. Chen, D.-R. Chen, M.-H. Fan, and T.-Y. Huang, "International transmission of stock market movements: an adaptive neuro-fuzzy inference system for analysis of TAIEX forecasting," Neural Computing and Applications, vol. 23, pp. 369–378, 2013.

[24] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[25] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[26] G. Sermpinis, K. Theofilatos, A. Karathanasopoulos, E. F. Georgopoulos, and C. Dunis, "Forecasting foreign exchange rates with adaptive neural networks using radial-basis functions and particle swarm optimization," European Journal of Operational Research, vol. 225, no. 3, pp. 528–540, 2013.

[27] H. Ni and H. Yin, "Exchange rate prediction using hybrid neural networks and trading indicators," Neurocomputing, vol. 72, no. 13–15, pp. 2815–2823, 2009.

[28] M.-Y. Chen, "A hybrid ANFIS model for business failure prediction utilizing particle swarm optimization and subtractive clustering," Information Sciences, vol. 220, pp. 180–195, 2013.

[29] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, pp. 7–18, 2014.

[30] R. Brown, Smoothing, Forecasting and Prediction of Discrete Time Series, Courier Dover Publications, 2004.

[31] G. E. P. Box and G. M. Jenkins, Time Series Analysis: Forecasting and Control, Holden-Day, San Francisco, Calif, USA, 1970.

[32] G. U. Yule, "Why do we sometimes get nonsense correlations between time series? A study in sampling and the nature of time series," Journal of the Royal Statistical Society, vol. 89, pp. 30–41, 1926.

[33] H. O. Wold, A Study in the Analysis of Stationary Time Series, Almqvist & Wiksell, Stockholm, Sweden, 1938.

[34] A. M. Rather, "A prediction based approach for stock returns using autoregressive neural networks," in Proceedings of the World Congress on Information and Communication Technologies (WICT '11), pp. 1271–1275, Mumbai, India, December 2011.

[35] A. M. Rather, "Optimization of predicted portfolio using various autoregressive neural networks," in Proceedings of the International Conference on Communication Systems and Network Technologies (CSNT '12), pp. 265–269, IEEE, May 2012.

[36] D. Rumelhart and J. McClelland, Parallel Distributed Processing, MIT Press, Cambridge, Mass, USA, 1986.

[37] S. Seker, E. Ayaz, and E. Turkcan, "Elman's recurrent neural network applications to condition monitoring in nuclear power plant and rotating machinery," Engineering Applications of Artificial Intelligence, vol. 16, no. 7-8, pp. 647–656, 2003.



Advances in Artificial Neural Systems
