Electric Power Systems Research 74 (2005) 417–425

Forecasting regional electricity load based on recurrent support vector machines with genetic algorithms

Ping-Feng Pai a,∗, Wei-Chiang Hong b

a Department of Information Management, National Chi Nan University, 1 University Rd., Puli, Nantou, Taiwan 545, ROC
b School of Management, Da-Yeh University, 112 Shan-Jiau Road, Da-Tsuen, Changhua, Taiwan 51505, ROC

Received 10 October 2004; received in revised form 14 January 2005; accepted 15 January 2005
Available online 8 April 2005

Abstract

Accompanying the deregulation of the electricity industry, accurate load forecasting of the future electricity demand has been the most important role in regional or national power system strategy management. Electricity load forecasting is complex to conduct due to the nonlinearity of its influence factors. Support vector machines (SVMs) have been successfully employed to solve nonlinear regression and time series problems. However, their application to load forecasting is rare. In this study, a recurrent support vector machines with genetic algorithms model (RSVMG) is proposed to forecast electricity load. In addition, genetic algorithms (GAs) are used to determine the free parameters of the support vector machines. Subsequently, examples of electricity load data from Taiwan are used to illustrate the performance of the proposed RSVMG model. The empirical results reveal that the proposed model outperforms the SVM model, artificial neural network (ANN) model and regression model. Consequently, the RSVMG model provides a promising alternative for forecasting electricity load in the power industry.
© 2005 Elsevier B.V. All rights reserved.

Keywords: Recurrent neural networks (RNNs); Support vector machines (SVMs); Recurrent support vector machines (RSVM); Genetic algorithms (GAs); Electricity load forecasting

∗ Corresponding author. Tel.: +886 4 85 11 890; fax: +886 4 92 91 5205.
E-mail address: [email protected] (P.-F. Pai).

0378-7796/$ – see front matter © 2005 Elsevier B.V. All rights reserved.
doi:10.1016/j.epsr.2005.01.006

1. Introduction

With the introduction of deregulation into the electricity industry, accurate load forecasting of the future electricity demand has been the most important role regarding the areas of distribution system investments, electricity load planning and management strategies in regional or national systems. Inaccurate load forecasting may increase operating costs [1,2]. Bunn and Farmer [1] pointed out that a 1% increase in forecasting error implied a £10 million increase in operating costs. Therefore, overestimation of future load results in an unnecessary spinning reserve, and the excess supply is also unwelcome to international energy networks. On the contrary, underestimation of future load causes failure in providing sufficient reserve and implies high costs per peaking unit. It is necessary for international electricity production cooperation that every member is able to forecast its demands accurately.

Load forecasting approaches are generally classified into time series [3–5], state space and Kalman filtering technology [5], regression models [1,5,6], artificial intelligence techniques [7,8] and fuzzy logic methods [9]. The time series model, known as the Box–Jenkins ARIMA model, uses historical load data to infer the future electricity load. Time series approaches are convenient for modeling, especially when only the electricity load data are available. On the other hand, the disadvantage of the time series model is its ignorance of other factors that influence electricity loads. State space and Kalman filtering technology treats the periodic component of load as a random process and uses 3–10 historical data to establish the periodic load variation for estimating the dependent variables (load or temperature) of the power system. The regression model establishes the cause–effect relationships between electricity load and independent variables such as climate factors, social activities and seasonal factors. Knowledge-based expert systems (KBES) and artificial neural networks (ANNs)
are the popular representatives of artificial intelligence techniques for load forecasting in the recent decade. The KBES model forms new rules based on received information, including daily temperature, day types and load from the previous day. Artificial intelligence techniques for load forecasting are superior to traditional forecasting approaches. However, the training procedure of an artificial intelligence model is time consuming. Therefore, some approaches were proposed to accelerate the speed of convergence [8]. The fuzzy logic model is useful in forecasting electricity load, particularly when the historical data are represented by linguistic terms.

Support vector machines (SVMs) are based on the principle of structural risk minimization (SRM) rather than the principle of empirical risk minimization, which is conducted by most traditional neural network models. With the introduction of Vapnik's ε-insensitive loss function [10], SVMs have been extended to solve nonlinear regression estimation problems in financial time series forecasting, air quality prediction, production value forecasting of the machinery industry, engine reliability prediction, etc. Recurrent neural networks (RNNs) are based on the main concept in which every unit is considered as an output of the network and the provision of adjusted information as input in a training process. RNNs are extensively applied in long-term load time series forecasting [11] and can be classified into three types: Jordan networks [12], Elman networks [13], and Williams and Zipser networks [14]. Both Jordan and Elman networks mainly use past information to capture detailed information. Williams and Zipser networks take much more information from the hidden layer and feed it back into themselves. Therefore, Williams and Zipser networks are sensitive when models are implemented (Tsoi and Back [15]). Jordan and Elman networks are suited to time series forecasting (Jhee and Lee [16]). In this investigation, the Jordan network is used as the basis for the proposed RSVMG model. Traditionally, RNNs are trained by back-propagation algorithms. In this work, SVMs with genetic algorithms are used to determine the weights between nodes. Finally, the proposed RSVMG model is applied to forecast electricity load. A numerical example in the literature [7] is employed to demonstrate the forecasting accuracy of the proposed model.

2. Recurrent support vector machines with genetic algorithms

2.1. Support vector machines with genetic algorithms

The basic concept of SVM regression is to map nonlinearly the original data x into a higher dimensional feature space. Hence, given a set of data G = {(x_i, a_i)}, i = 1, ..., N (where x_i is the input vector, a_i the actual value and N is the total number of data patterns), the SVM regression function is:

f = g(x) = Σ_i w_i φ_i(x) + b    (1)

where φ_i(x) is the feature of the inputs, and both w_i and b are coefficients. The coefficients (w_i and b) are estimated by minimizing the following regularized risk function:

r(C) = C (1/N) Σ_{i=1}^{N} Θ_ε(a_i, f_i) + (1/2)‖w‖²    (2)

Θ_ε(a, f) = { 0, if |a − f| ≤ ε;  |a − f| − ε, otherwise }    (3)

where C and ε are prescribed parameters. In Eq. (2), Θ_ε(a, f) is called the ε-insensitive loss function. The loss equals zero if the forecasted value is within the ε-tube (Eq. (3) and Fig. 1). The second term, (1/2)‖w‖², measures the flatness of the function. Therefore, C is considered to specify the trade-off between the empirical risk and the model flatness. Both C and ε are user-determined parameters. Two positive slack variables ζ and ζ*, which represent the distance from the actual values to the corresponding boundary values of the ε-tube (Fig. 1), are introduced. Then, Eq. (2) is transformed into the following constrained form:

Minimize r(w, ζ, ζ*) = (1/2)‖w‖² + C Σ_{i=1}^{N} (ζ_i + ζ_i*)    (4)

with the constraints,

w_i φ(x_i) + b − a_i ≤ ε + ζ_i,  i = 1, 2, ..., N
a_i − w_i φ(x_i) − b ≤ ε + ζ_i*,  i = 1, 2, ..., N
ζ_i, ζ_i* ≥ 0,  i = 1, 2, ..., N

This constrained optimization problem is solved using the following primal Lagrangian form:

L(w, b, ζ, ζ*, α_i, α_i*, β_i, β_i*)
  = (1/2)‖w‖² + C Σ_{i=1}^{N} (ζ_i + ζ_i*)
  − Σ_{i=1}^{N} α_i [w_i φ(x_i) + b − a_i + ε + ζ_i]
  − Σ_{i=1}^{N} α_i* [a_i − w_i φ(x_i) − b + ε + ζ_i*]
  − Σ_{i=1}^{N} (β_i ζ_i + β_i* ζ_i*)    (5)

Eq. (5) is minimized with respect to the primal variables w, b, ζ and ζ*, and maximized with respect to the nonnegative Lagrangian multipliers α_i, α_i*, β_i and β_i*. Finally, the Karush–Kuhn–Tucker conditions are applied to the regression.
Fig. 1. Parameters used in support vector regression [17].
Eq. (4) thus yields the dual Lagrangian,

Φ(α, α*) = Σ_{i=1}^{N} a_i (α_i − α_i*) − ε Σ_{i=1}^{N} (α_i + α_i*)
  − (1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} (α_i − α_i*)(α_j − α_j*) K(x_i, x_j)    (6)

subject to the constraints,

Σ_{i=1}^{N} (α_i − α_i*) = 0
0 ≤ α_i ≤ C,  i = 1, 2, ..., N
0 ≤ α_i* ≤ C,  i = 1, 2, ..., N

The Lagrange multipliers in Eq. (6) satisfy the equality α_i α_i* = 0. The Lagrange multipliers α_i and α_i* are calculated, and the optimal desired weight vector of the regression hyperplane is,

w = Σ_{i=1}^{N} (α_i − α_i*) K(x, x_i)    (7)

Hence, the regression function is Eq. (8):

g(x, α, α*) = Σ_{i=1}^{N} (α_i − α_i*) K(x, x_i) + b    (8)

Here, K(x_i, x_j) is called the kernel function. The value of the kernel equals the inner product of two vectors, x_i and x_j, in the feature space φ(x_i) and φ(x_j); that is, K(x_i, x_j) = φ(x_i) · φ(x_j). Any function that meets Mercer's condition [10] can be used as the kernel function. In this work, the Gaussian function, exp(−(1/2)(‖x_i − x_j‖/σ)²), is used in the SVMs.

The selection of the three parameters, ε, σ and C, of a SVM model is important to the accuracy of forecasting. However, structural methods for efficiently confirming the selection of these parameters are lacking. Therefore, GAs are used in the proposed SVM model to optimize the parameter selection. Holland first proposed genetic algorithms [18]. Such algorithms are based on the survival principle of the fittest member in a population, which retains genetic information by passing it from generation to generation. Fig. 2 presents the framework of the proposed SVMG model. GAs are used to yield a smaller MAPE by searching for better combinations of the three parameters in the SVMs. Fig. 3 depicts the operation of a GA, which is described below.

Fig. 2. The architecture of a SVMG model.

Fig. 3. The procedure of genetic algorithms.
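The prediction side of Eqs. (7) and (8) is simply a weighted kernel expansion over the training points. The following NumPy sketch illustrates this with the Gaussian kernel; the multiplier differences α_i − α_i* and the bias are illustrative values only, not ones obtained by actually solving the dual problem (6):

```python
import numpy as np

def gaussian_kernel(xi, xj, sigma):
    # Gaussian kernel of Section 2.1: exp(-(1/2) * (||xi - xj|| / sigma)^2)
    return np.exp(-0.5 * (np.linalg.norm(xi - xj) / sigma) ** 2)

def svr_predict(x, X_train, alpha_diff, b, sigma):
    # Eq. (8): g(x) = sum_i (alpha_i - alpha_i*) K(x, x_i) + b
    return sum(a * gaussian_kernel(x, xi, sigma)
               for a, xi in zip(alpha_diff, X_train)) + b

# Toy setup: three "support vectors" with assumed multiplier differences
# (in practice the alpha_i, alpha_i* come from solving the dual problem (6)).
X_train = np.array([[0.0], [1.0], [2.0]])
alpha_diff = np.array([0.5, -0.2, 0.7])   # illustrative alpha_i - alpha_i*
b, sigma = 0.1, 1.0

print(svr_predict(np.array([1.5]), X_train, alpha_diff, b, sigma))  # ~0.7036
```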
Step 1 (Initialization). Generate randomly an initial population of chromosomes. The three free parameters, σ, C and ε, are encoded in a binary format and represented by a chromosome.

Step 2 (Evaluating fitness). In this study, the negative mean absolute percentage error (MAPE) is used as the fitness function. The MAPE is as follows:

MAPE = (1/N) Σ_{i=1}^{N} |(a_i − f_i)/a_i| × 100%    (9)

where a_i and f_i represent the actual and forecast values and N is the number of forecasting periods.

Step 3 (Selection). Based on the fitness functions, chromosomes with higher fitness values are more likely to yield offspring in the next generation. The roulette wheel selection principle (Holland [18]) is applied to choose chromosomes for reproduction.

Step 4 (Crossover and mutation). Mutations are performed randomly by converting a "1" bit into a "0" bit or a "0" bit into a "1" bit. The single-point-crossover principle is employed: segments of paired chromosomes between two determined break-points are swapped. The rates of crossover and mutation are probabilistically determined. In this study, the probabilities of crossover and mutation are set to 0.5 and 0.1, respectively.

Step 5 (Next generation). Form a population for the next generation.

Step 6 (Stop conditions). If the number of generations equals a given scale, then the best chromosomes are presented as a solution; otherwise, go back to Step 2.

2.2. Recurrent support vector machines with genetic algorithms

In this work, the Jordan network is specified as the recurrent neural network framework. All neurons in a layer, except those in the context layer, are connected with all neurons in the next layer. A context layer is a special hidden layer. Interactions only occur between neurons in the hidden layer and those in the context layer. Fig. 4 shows the architecture of a Jordan network. For a Jordan network with p inputs, q hidden and r output neurons, the output of the nth neuron, f_n(t), is [19]:

f_n(t) = Σ_{i=1}^{q} W_i φ_i(t) + b_i(t)    (10)

where W_i is the weight between the hidden and output layer, and φ_i(t) is the output function of the hidden neurons, which is:

φ_i(t) = g( Σ_{j=1}^{p} v_{ij} x_j(t) + Σ_{k=1}^{s} Σ_{v=1}^{r} w_{ikv} f_v(t − k) + b_i(t) )    (11)

where v_{ij} are the weights between the input and the hidden layer; w_{ikv} are the weights between the context and the hidden layer with k delay periods, and s is the total number of context layers. In the proposed RSVMG model, there is only one context layer (i.e., s = 1) due to only one output neuron (i.e., r = 1).

Back-propagation yields gradients for adapting the weights of a neural network. The back-propagation algorithm is presented as follows. First, the output of the nth neuron in Eq. (11) is rewritten as:

f_n(t) = h(xᵀ(t) θ(t))    (12)

where h(·) is the nonlinearity function relating xᵀ(t) and f_n(t); x(t) = [x_1(t), ..., x_P(t)]ᵀ is the input vector; θ(t) = [θ_1(t), ..., θ_P(t)]ᵀ is the weight vector. A cost function is then presented as the instantaneous performance index,

J(θ(t)) = (1/2)[d(t) − f_n(t)]² = (1/2)[d(t) − h(xᵀ(t) θ(t))]²    (13)

where d(t) = [d_1(t), ..., d_P(t)]ᵀ is the desired output. The instantaneous output error at the output neuron and the revised weight vector at the next moment are given by Eqs. (14) and (15), respectively:

e(t) = d(t) − f_n(t) = d(t) − h(xᵀ(t) θ(t))    (14)

θ(t + 1) = θ(t) − η ∇J(θ(t))    (15)

where η is the learning rate. Third, the gradient ∇J(θ(t)) can be calculated as:

∇J(θ(t)) = ∂J(θ(t))/∂θ(t) = e(t) ∂e(t)/∂θ(t) = −e(t) h′(xᵀ(t) θ(t)) x(t)    (16)

where h′(·) is the first derivative of the nonlinearity h(·). Finally, the weight is revised as:

θ(t + 1) = θ(t) + η e(t) h′(xᵀ(t) θ(t)) x(t)    (17)

Fig. 4. The architecture of Jordan networks [12].
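Steps 1–6 of the GA described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the chromosome encodes (σ, C, ε) in binary, the fitness is the negative MAPE of Eq. (9), and the crossover and mutation probabilities are the 0.5 and 0.1 stated in Step 4. The forecaster inside the fitness function is a toy stand-in that depends on σ only, not a trained SVM, and the parameter ranges are assumptions:

```python
import random

random.seed(0)

BITS = 8                        # bits per encoded parameter
N_PARAMS = 3                    # sigma, C, epsilon
P_CROSS, P_MUT = 0.5, 0.1       # probabilities stated in Step 4
RANGES = [(0.1, 10.0), (1.0, 1000.0), (0.01, 5.0)]  # assumed search ranges

def decode(chrom):
    # Map each 8-bit slice of the chromosome onto its parameter range.
    params = []
    for i, (lo, hi) in enumerate(RANGES):
        bits = chrom[i * BITS:(i + 1) * BITS]
        frac = int("".join(map(str, bits)), 2) / (2 ** BITS - 1)
        params.append(lo + frac * (hi - lo))
    return params

def mape(actual, forecast):
    # Eq. (9): mean absolute percentage error
    return 100.0 / len(actual) * sum(abs((a - f) / a)
                                     for a, f in zip(actual, forecast))

def fitness(chrom, actual):
    sigma, C, eps = decode(chrom)
    # Toy stand-in forecaster: only sigma matters here; in the real SVMG
    # loop an SVM trained with (sigma, C, eps) would produce the forecasts.
    forecast = [a * (1 + 0.01 * (sigma - 5.0)) for a in actual]
    return -mape(actual, forecast)          # Step 2: negative MAPE

def roulette(pop, fits):
    # Step 3: shift fitnesses positive, then sample proportionally (copies).
    lo = min(fits)
    weights = [f - lo + 1e-9 for f in fits]
    return [c[:] for c in random.choices(pop, weights=weights, k=len(pop))]

def evolve(actual, pop_size=20, generations=30):
    pop = [[random.randint(0, 1) for _ in range(BITS * N_PARAMS)]
           for _ in range(pop_size)]                       # Step 1
    for _ in range(generations):
        fits = [fitness(c, actual) for c in pop]
        pop = roulette(pop, fits)                          # Step 3
        for i in range(0, pop_size - 1, 2):                # Step 4: crossover
            if random.random() < P_CROSS:
                cut = random.randrange(1, BITS * N_PARAMS)
                pop[i][cut:], pop[i + 1][cut:] = pop[i + 1][cut:], pop[i][cut:]
        for c in pop:                                      # Step 4: mutation
            for j in range(len(c)):
                if random.random() < P_MUT:
                    c[j] ^= 1
    return decode(max(pop, key=lambda c: fitness(c, actual)))  # Step 6

sigma, C, eps = evolve([3388, 3523, 3752, 4296])   # toy load series
```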
Table 1
Taiwan regional electricity load (from 1981 to 2000) and forecasting results of RSVMG, SVMG, ANN and regression models (unit: 10^6 Wh). Within each region the five columns are: Actual, RSVMG, SVMG, ANN, Regression.

Year  Northern region                        Central region                     Southern region                    Eastern region
1981  3,388 3,288 2,988 3,424 3,430          1,663 1,615 1,713 1,833 1,867      2,272 2,172 2,192 2,235 2,227      122 109 110 124 124
1982  3,523 3,623 3,392 3,491 3,494          1,829 1,839 1,872 1,864 1,893      2,346 2,383 2,399 2,269 2,263      127 125 126 127 126
1983  3,752 3,852 3,645 3,926 3,933          2,157 2,066 2,034 2,079 2,098      2,494 2,542 2,565 2,494 2,488      148 141 142 143 141
1984  4,296 4,079 3,896 4,263 4,277          2,219 2,295 2,207 2,257 2,256      2,686 2,685 2,718 2,697 2,697      142 157 158 153 153
1985  4,250 4,427 4,258 4,398 4,395          2,190 2,525 2,398 2,323 2,289      2,829 2,853 2,886 2,786 2,796      143 173 174 157 156
1986  5,013 4,962 4,754 4,993 4,986          2,638 2,755 2,613 2,602 2,564      3,172 3,072 3,092 3,113 3,126      176 189 191 175 175
1987  5,745 5,645 5,345 5,607 5,594          2,812 2,986 2,858 2,868 2,858      3,351 3,341 3,340 3,405 3,409      206 206 207 194 195
1988  6,320 6,348 5,993 6,287 6,238          3,265 3,214 3,130 3,143 3,145      3,655 3,636 3,617 3,705 3,701      227 222 224 216 216
1989  6,844 6,944 6,648 6,769 6,753          3,376 3,441 3,426 3,369 3,424      3,823 3,923 3,903 3,989 3,979      236 238 240 232 234
1990  7,613 7,397 7,213 7,311 7,292          3,655 3,665 3,734 3,593 3,685      4,256 4,187 4,185 4,279 4,267      243 255 257 248 251
1991  7,551 7,788 7,610 7,788 7,736          4,043 3,885 4,040 3,864 3,804      4,548 4,448 4,468 4,550 4,551      264 271 274 259 265
1992  8,352 8,252 7,952 8,318 8,345          4,425 4,101 4,324 4,134 4,150      4,803 4,747 4,772 4,894 4,887      292 287 291 284 288
1993  8,781 8,853 8,531 8,958 8,917          4,594 4,311 4,568 4,364 4,355      5,192 5,100 5,112 5,132 5,120      307 303 307 307 305
1994  9,400 9,500 9,467 9,470 9,419          4,771 4,515 4,752 4,614 4,532      5,352 5,452 5,467 5,419 5,418      325 319 324 325 321
1995  10,254 9,956 10,334 10,091 10,073      4,483 4,712 4,862 4,894 4,831      5,797 5,670 5,769 5,794 5,805      343 335 341 346 343
1996  10,719 10,956 10,319 10,838 10,921     4,935 4,700 4,885 5,197 5,307      6,369 6,279 5,916 6,206 6,208      363 336 357 371 373
1997  11,222 11,252 11,213 10,991 11,262     5,061 5,065 5,060 5,112 5,361      6,336 6,200 6,265 6,305 6,493      358 367 358 378 380
1998  11,642 11,644 11,747 11,643 12,162     5,246 5,231 5,203 5,301 5,711      6,318 6,156 6,389 6,476 6,868      397 381 373 403 407
1999  11,981 12,219 12,173 11,804 12,395     5,233 5,385 5,230 5,350 5,780      6,259 6,261 6,346 6,537 7,013      401 401 397 410 413
2000  12,924 12,826 12,543 12,834 13,122     5,633 5,522 5,297 5,572 6,131      6,804 6,661 6,513 6,672 7,481      420 416 408 435 440

MAPE  –      0.7498 1.3981 1.0600 2.4500     –     1.3026 1.8146 1.7300 8.52    –     1.7530 2.0243 2.4800 8.2900  –   1.8955 2.6475 3.6200 4.1000
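As a quick consistency check on Table 1, Eq. (9) can be applied to the northern-region testing years (1997–2000). Recomputing from the rounded table entries gives a value close to the reported 0.7498; the small gap is attributable to rounding of the tabulated loads:

```python
# Northern region, testing period 1997-2000 (unit: 10^6 Wh), from Table 1.
actual = [11222, 11642, 11981, 12924]
rsvmg  = [11252, 11644, 12219, 12826]

# Eq. (9): MAPE = (1/N) * sum |(a_i - f_i) / a_i| * 100%
mape = 100.0 / len(actual) * sum(abs(a - f) / a for a, f in zip(actual, rsvmg))
print(round(mape, 4))   # 0.7573, close to the 0.7498 reported in Table 1
```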
3. A numerical example and experimental results

This study employed Taiwan regional electricity load data to compare the forecasting performance of the RSVMG models with that of the ANN model and regression model proposed by Hsu and Chen [7]. The total load values from 1981 to 2000 serve as the experimental data. In total, 20 load data per region are available, as listed in Table 1. To conduct the performance comparison on the same basis, it is necessary to divide the total data into the same subsets. Therefore, the data are divided into three data sets: the training data set (12 years, from 1981 to 1992), the validation data set (4 years, from 1993 to 1996) and the testing data set (4 years, from 1997 to 2000) [7]. The three data sets are listed in Table 2. The forecasting accuracy is measured by the mean absolute percentage error (MAPE), as given by Eq. (9).

Fig. 5 shows the architecture of the proposed RSVMG model. The output of RSVMG, f(t), is

f(t) = Σ_{i=1}^{P} Wᵀ φ(xᵀ(t)) + b(t)    (18)

Then, Eq. (18) replaces Eq. (1) in the SVMG algorithms to run the SVMG loop in the search for the values of the three parameters. Finally, the forecast values f(t) are calculated using Eq. (18).

Fig. 5. Architecture of RSVMG model.

Table 2
Training and testing data sets of the proposed model

Data sets        RSVMG model   ANN model
Training data    1981–1992     1981–1996
Validation data  1993–1996     –
Testing data     1997–2000     1997–2000

Table 3
Forecasting results and parameters of SVMG model and RSVMG model

SVMG parameters
Region    σ     C            ε     MAPE of testing (%)
Northern  0.30  2.10×10^10   400   1.3981
Central   0.90  1.85×10^10   50    1.8146
Southern  0.50  1.00×10^10   80    2.0243
Eastern   7.00  0.600×10^10  1     2.6475

RSVMG parameters
Region    σ     C            ε     MAPE of testing (%)
Northern  0.50  2.50×10^10   100   0.7498
Central   4.10  1.95×10^10   10    1.3026
Southern  0.47  1.35×10^10   100   1.7530
Eastern   8.00  0.60×10^10   5     1.8955

Table 4
Wilcoxon signed-rank test

Region / Comparison        α = 0.025, W = 0   α = 0.05, W = 0
Northern region
  RSVMG vs. SVMG           1                  1
  RSVMG vs. ANN            1                  1
  RSVMG vs. regression     0                  0
Southern region
  RSVMG vs. SVMG           1                  1
  RSVMG vs. ANN            0                  0
  RSVMG vs. regression     0                  0
Central region
  RSVMG vs. SVMG           0                  0
  RSVMG vs. ANN            0                  0
  RSVMG vs. regression     0                  0
Eastern region
  RSVMG vs. SVMG           0                  0
  RSVMG vs. ANN            0                  0
  RSVMG vs. regression     0                  0

In the training stage, the training data set of each region (12 load data in total) is fed into the RSVMG model, and the structural risk minimization principle is employed to minimize the training error.
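The recurrent structure (one context layer, s = 1, feeding the single output back into the inputs) amounts to a forecasting loop in which each new input vector contains the previous forecast. The sketch below illustrates only that feedback structure of Fig. 5 and Eq. (18); the predictor is an assumed toy linear map, not a trained SVR, and the three-value lag window is an arbitrary choice:

```python
def recurrent_forecast(history, steps, predict):
    """Roll the model forward; the previous output f(t-1) re-enters the
    input vector, mimicking the single context layer (s = 1, r = 1)."""
    window = list(history)
    outputs = []
    for _ in range(steps):
        f_prev = outputs[-1] if outputs else window[-1]
        x = window[-3:] + [f_prev]        # lagged loads plus fed-back output
        f_t = predict(x)
        outputs.append(f_t)
        window.append(f_t)
    return outputs

# Stand-in predictor: an assumed linear toy model playing the role of the
# trained SVR of Eq. (18); the coefficients are illustrative only.
toy_predict = lambda x: 0.6 * x[-1] + 0.5 * x[-2]

# Four-steps-ahead policy, as used for the 1997-2000 testing period.
forecasts = recurrent_forecast([3388, 3523, 3752], steps=4, predict=toy_predict)
print(forecasts)
```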
Fig. 6. Forecasting values for different models (northern region).
While the training error improves, the three kernel parameters, σ, C and ε, of the RSVMG model, adjusted by the GAs, are employed to calculate the validation error. Then, the adjusted parameters with minimum validation error are selected as the most appropriate parameters. Finally, a four-steps-ahead policy is used to forecast the electricity load in each region. Note that the testing data sets are not used for modeling but for examining the accuracy of the forecasting model. The kernel parameters, σ, C and ε, of the RSVMG model with the smallest testing MAPE value are used as the most suitable model for this example. The forecasting results and the suitable parameters for the different regional SVMG models and RSVMG models are listed in Table 3.

Table 1 also lists the MAPE values of the various forecasting models. For electricity load forecasting in each region, over the same forecasting period, the proposed RSVMG model has smaller MAPE values than the SVMG, ANN and regression models (the latter two were proposed by Hsu and Chen [7]), particularly for the southern and eastern regions. The ANN model failed to capture the decreasing load trend from 1997 to 1998 in the southern region; similarly, the ANN model also failed to capture the increasing load rate from 1998 to 2000 in the eastern region.

Fig. 7. Forecasting values for different models (central region).
Figs. 6–9 illustrate the real values and forecasting values of the different models for each region.

To verify the significance of the accuracy improvement of RSVMG, a statistical test, namely the Wilcoxon signed-rank test, was conducted. The test was performed at the 0.025 and 0.05 significance levels in one-tail tests. The test results (Table 4) showed that the RSVMG model yields improved forecast results and significantly outperforms the other three forecasting models, except versus the SVMG (in the northern and southern regions) and ANN (northern region) models.

Fig. 8. Forecasting values for different models (southern region).

Fig. 9. Forecasting values for different models (eastern region).

4. Conclusions

Accurate load forecasting is crucial for an energy-limited economy, like Taiwan. The historical electricity load data of each region in Taiwan show a strong growth trend, particularly in the northern region. Although this is a common phenomenon in developing countries, overproduction or underproduction of electricity strongly influences the sustainable development of the economy. This study introduced a novel forecasting technique, RSVMG, and investigated its feasibility in forecasting annual regional electricity loads in Taiwan.
The experimental results indicate that the RSVMG model outperformed the ANN and regression models in terms of forecasting accuracy. The superior performance of the RSVMG model has several causes. First, the RSVMG model has nonlinear mapping capabilities and thus can more easily capture electricity load data patterns than the ANN and regression models. Second, improper determination of the three free parameters (σ, C and ε) causes either over-fitting or under-fitting of a SVM model; in this work, the GAs determine suitable parameters for forecasting electricity load. Third, the RSVMG model performs structural risk minimization rather than minimizing the training errors. Minimizing the upper bound on the generalization error improves the generalization performance compared to the ANN and regression models. Finally, the Jordan recurrent network can continually capture data patterns by feeding past values of the output layer back into the hidden layer.
This investigation is the first to apply the recurrent neural network and SVM model with GAs to electricity load forecasting. The empirical results obtained in this study demonstrate that the proposed model offers a valid alternative for application in the electricity industry. In the future, regional climate factors, social activities and seasonal factors can be included in the RSVMG model for forecasting electricity load. In addition, other advanced searching techniques for suitable parameter selection can be combined with RSVM to forecast electricity load.

Acknowledgements

This research was conducted with the support of the National Science Council (NSC 93-2213-E-212-001 & NSC 93-2745-H-212-001-URD). Mr. Chih-Shen Lin helped with the data analysis.

References

[1] D.W. Bunn, E.D. Farmer, Comparative Models for Electrical Load Forecasting, John Wiley & Sons, New York, 1985.
[2] D.W. Bunn, Forecasting loads and prices in competitive power markets, Proc. IEEE 88 (2000) 163–169.
[3] G.E.P. Box, G.M. Jenkins, Time Series Analysis, Forecasting and Control, Holden-Day, San Francisco, 1970.
[4] S. Saab, E. Badr, G. Nasr, Univariate modeling and forecasting of energy consumption: the case of electricity in Lebanon, Energy 26 (2001) 1–14.
[5] J.H. Park, Y.M. Park, K.Y. Lee, Composite modeling for adaptive short-term load forecasting, IEEE Trans. Power Syst. 6 (1991) 450–457.
[6] J.W. Taylor, R. Buizza, Using weather ensemble predictions in electricity demand forecasting, Int. J. Forecasting 19 (2003) 57–70.
[7] C.C. Hsu, C.Y. Chen, Regional load forecasting in Taiwan: applications of artificial neural networks, Energy Convers. Manage. 44 (2003) 1941–1949.
[8] B. Novak, Superfast autoconfiguring artificial neural networks and their application to power systems, Electr. Power Syst. Res. 35 (1995) 11–16.
[9] A.M. Al-Kandari, S.A. Soliman, M.E. El-Hawary, Fuzzy short-term electric load forecasting, Electr. Power Energy Syst. 26 (2004) 111–122.
[10] V. Vapnik, S. Golowich, A. Smola, Support vector machine for function approximation, regression estimation, and signal processing, Adv. Neural Inf. Process. Syst. 9 (1996) 281–287.
[11] B. Kermanshahi, Recurrent neural network for forecasting next 10 years loads of nine Japanese utilities, Neurocomputing 23 (1998) 125–133.
[12] M.I. Jordan, Attractor dynamics and parallelism in a connectionist sequential machine, in: Proceedings of the 8th Annual Conference of the Cognitive Science Society, Hillsdale, 1987, pp. 531–546.
[13] J.L. Elman, Finding structure in time, Cogn. Sci. 14 (1990) 179–211.
[14] R.J. Williams, D. Zipser, A learning algorithm for continually running fully recurrent neural networks, Neural Comput. 1 (1989) 270–280.
[15] A.C. Tsoi, A.D. Back, Locally recurrent globally feedforward networks: a critical review of architectures, IEEE Trans. Neural Netw. 5 (1994) 229–239.
[16] W.C. Jhee, J.K. Lee, Performance of neural networks in managerial forecasting, Int. J. Intell. Syst. Accounting Finance Manage. 2 (1993) 55–71.
[17] V. Kecman, Learning and Soft Computing: Support Vector Machines, Neural Networks and Fuzzy Logic Models, The MIT Press, Massachusetts, 2001.
[18] J. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, 1975.
[19] E. Ayaz, S. Seker, B. Barutcu, E. Turkcan, Comparisons between the various types of neural networks with the data of wide range operational conditions of the Borssele NPP, Prog. Nucl. Energy 43 (2003) 381–387.