System Identification
Control Engineering B.Sc., 3rd year
Technical University of Cluj-Napoca, Romania
Lecturer: Lucian Busoniu
Part VIII
Recursive identification methods
Recursive identification: Idea
Recursive methods can be seen as working online, while the system is evolving.
At step $k$, compute a new parameter estimate $\hat{\theta}(k)$, based on the previous estimate $\hat{\theta}(k-1)$ and the newly available data $u(k)$, $y(k)$.
Remark: To contrast them with recursive identification, the previous methods, which used the whole data set, will be called batch identification.
Features
Recursive methods:
Require less memory, and less computation for each update, than the whole batch algorithm.
Easier to apply in real time.
When properly modified, can deal with time-varying systems.
If the model is used to tune a controller, we obtain adaptive control.
Disadvantage
Recursive methods are usually approximations of batch techniques $\Rightarrow$ guarantees are more difficult to obtain.
Table of contents
1 Motivating example: Estimating a scalar
2 Recursive least-squares and ARX
3 Recursive instrumental variables
Recall: Estimating a scalar
Recall scalar estimation. Model:
$$y(k) = b + e(k) = 1 \cdot b + e(k) = \varphi(k)\theta + e(k)$$
where $\varphi(k) = 1\ \forall k$, $\theta = b$.
For the data points up to and including $k$:
$$y(1) = \varphi(1)\theta = 1 \cdot b, \quad \ldots, \quad y(k) = \varphi(k)\theta = 1 \cdot b$$
After calculation, the solution of this system is:
$$\hat{\theta}(k) = \frac{1}{k}\left[y(1) + \ldots + y(k)\right]$$
Intuition: The estimate is the average of all measurements, filtering out the noise.
Estimating a scalar: Recursive formulation
Rewrite the formula:
$$\begin{aligned}
\hat{\theta}(k) &= \frac{1}{k}\left[y(1) + \ldots + y(k)\right] \\
&= \frac{1}{k}\left[(k-1)\,\frac{1}{k-1}\bigl(y(1) + \ldots + y(k-1)\bigr) + y(k)\right] \\
&= \frac{1}{k}\left[(k-1)\,\hat{\theta}(k-1) + y(k)\right] \\
&= \frac{1}{k}\left[k\,\hat{\theta}(k-1) + y(k) - \hat{\theta}(k-1)\right] \\
&= \hat{\theta}(k-1) + \frac{1}{k}\left[y(k) - \hat{\theta}(k-1)\right]
\end{aligned}$$
Estimating a scalar: Discussion
$$\hat{\theta}(k) = \hat{\theta}(k-1) + \frac{1}{k}\left[y(k) - \hat{\theta}(k-1)\right]$$
Recursive formula: the new estimate $\hat{\theta}(k)$ is computed based on the previous estimate $\hat{\theta}(k-1)$ and the new data $y(k)$.
$[y(k) - \hat{\theta}(k-1)]$ is a prediction error $\varepsilon(k)$, since $\hat{\theta}(k-1) = \hat{b} = \hat{y}(k)$, a one-step-ahead prediction of the output.
The update applies a correction proportional to $\varepsilon(k)$, weighted by $\frac{1}{k}$:
When the error $\varepsilon(k)$ is large (or small), a large (or small) adjustment is made.
The weight $\frac{1}{k}$ decreases with time $k$, so adjustments get smaller as $\hat{b}$ becomes better.
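A minimal Matlab sketch of this recursion (the true value and noise level below are hypothetical, not from the lecture):

% Estimate a constant b from y(k) = b + e(k) with the recursive average.
b = 2.5;  N = 200;               % hypothetical true parameter and data length
y = b + 0.3 * randn(N, 1);       % noisy measurements
theta = 0;                       % theta_hat(0)
for k = 1:N
    epsk  = y(k) - theta;        % prediction error eps(k)
    theta = theta + epsk / k;    % theta_hat(k) = theta_hat(k-1) + (1/k) eps(k)
end
% theta now equals mean(y(1:N)), the batch average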
Table of contents
1 Motivating example: Estimating a scalar
2 Recursive least-squares and ARX
General recursive least-squares
Recursive ARX
Matlab example
3 Recursive instrumental variables
Recall: Least-squares regression
Model: $y(k) = \varphi^\top(k)\theta + e(k)$
The dataset up to $k$ gives a linear system of equations:
$$y(1) = \varphi^\top(1)\theta, \quad y(2) = \varphi^\top(2)\theta, \quad \ldots, \quad y(k) = \varphi^\top(k)\theta$$
After some linear algebra and calculus, the least-squares solution is:
$$\hat{\theta}(k) = \left[\sum_{j=1}^{k}\varphi(j)\varphi^\top(j)\right]^{-1}\sum_{j=1}^{k}\varphi(j)y(j)$$
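A minimal Matlab sketch of this batch solution, assuming the regressors and outputs have already been stacked into matrices Phi (k-by-n, with rows phi(j)') and Y (k-by-1); these names are ours:

theta_hat = (Phi' * Phi) \ (Phi' * Y);   % = [sum phi(j) phi'(j)]^{-1} sum phi(j) y(j)
% numerically preferable equivalent: theta_hat = Phi \ Y;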
Least-squares: Recursive formula
With the notation $P(k) = \sum_{j=1}^{k}\varphi(j)\varphi^\top(j)$:
$$\begin{aligned}
\hat{\theta}(k) &= P^{-1}(k)\sum_{j=1}^{k}\varphi(j)y(j) \\
&= P^{-1}(k)\left[\sum_{j=1}^{k-1}\varphi(j)y(j) + \varphi(k)y(k)\right] \\
&= P^{-1}(k)\left[P(k-1)\hat{\theta}(k-1) + \varphi(k)y(k)\right] \\
&= P^{-1}(k)\left[\bigl(P(k) - \varphi(k)\varphi^\top(k)\bigr)\hat{\theta}(k-1) + \varphi(k)y(k)\right] \\
&= \hat{\theta}(k-1) + P^{-1}(k)\left[-\varphi(k)\varphi^\top(k)\hat{\theta}(k-1) + \varphi(k)y(k)\right] \\
&= \hat{\theta}(k-1) + P^{-1}(k)\varphi(k)\left[y(k) - \varphi^\top(k)\hat{\theta}(k-1)\right]
\end{aligned}$$
Least-squares: Discussion
$$\hat{\theta}(k) = \hat{\theta}(k-1) + P^{-1}(k)\varphi(k)\left[y(k) - \varphi^\top(k)\hat{\theta}(k-1)\right] = \hat{\theta}(k-1) + W(k)\varepsilon(k)$$
Recursive formula: the new estimate $\hat{\theta}(k)$ is computed based on the previous estimate $\hat{\theta}(k-1)$ and the new data $y(k)$.
$y(k) - \varphi^\top(k)\hat{\theta}(k-1)$ is a prediction error $\varepsilon(k)$, since $\varphi^\top(k)\hat{\theta}(k-1) = \hat{y}_{k-1}(k)$ is the one-step-ahead prediction using the previous parameter vector.
$W(k) = P^{-1}(k)\varphi(k)$ is a weighting vector: the elements of $P$ grow large for large $k$, therefore $W(k)$ decreases.
Recursive matrix inversion
The previous formula requires the matrix inverse $P^{-1}(k)$, which is computationally costly.
For $P$, an easy recursion exists: $P(k) = P(k-1) + \varphi(k)\varphi^\top(k)$, but this does not help; the matrix must still be inverted.
The Sherman-Morrison formula gives a recursion for the inverse $P^{-1}$:
$$P^{-1}(k) = P^{-1}(k-1) - \frac{P^{-1}(k-1)\varphi(k)\varphi^\top(k)P^{-1}(k-1)}{1 + \varphi^\top(k)P^{-1}(k-1)\varphi(k)}$$
Exercise: Prove the Sherman-Morrison formula! (linear algebra)
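The exercise asks for a proof; a quick numerical check of the identity in Matlab (random test data, not a proof) looks like this:

n    = 3;
Pold = rand(n); Pold = Pold * Pold' + eye(n);   % a positive definite P(k-1)
phi  = rand(n, 1);                              % regressor phi(k)
PinvOld = inv(Pold);
PinvNew = PinvOld - (PinvOld * (phi * phi') * PinvOld) / (1 + phi' * PinvOld * phi);
norm(PinvNew - inv(Pold + phi * phi'))          % should be near machine precision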
Overall algorithm
Recursive least-squares (RLS)
initialize $\hat{\theta}(0)$, $P^{-1}(0)$
loop at every step $k = 1, 2, \ldots$
    measure $y(k)$, form the regressor vector $\varphi(k)$
    find the prediction error $\varepsilon(k) = y(k) - \varphi^\top(k)\hat{\theta}(k-1)$
    update the inverse $P^{-1}(k) = P^{-1}(k-1) - \frac{P^{-1}(k-1)\varphi(k)\varphi^\top(k)P^{-1}(k-1)}{1 + \varphi^\top(k)P^{-1}(k-1)\varphi(k)}$
    compute the weights $W(k) = P^{-1}(k)\varphi(k)$
    update the parameters $\hat{\theta}(k) = \hat{\theta}(k-1) + W(k)\varepsilon(k)$
end loop
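A minimal Matlab sketch of one RLS update, following the box above; the function name rls_step and its calling convention are ours, not part of the lecture:

function [theta, Pinv] = rls_step(theta, Pinv, phi, y)
% One recursive least-squares update.
%   theta : previous estimate theta_hat(k-1), n-by-1
%   Pinv  : previous inverse P^{-1}(k-1), n-by-n
%   phi   : regressor phi(k), n-by-1
%   y     : new measurement y(k)
epsk  = y - phi' * theta;                                              % prediction error eps(k)
Pinv  = Pinv - (Pinv * (phi * phi') * Pinv) / (1 + phi' * Pinv * phi); % Sherman-Morrison
W     = Pinv * phi;                                                    % weighting vector W(k)
theta = theta + W * epsk;                                              % parameter update
end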
Initialization
The RLS algorithm requires an initial parameter vector $\hat{\theta}(0)$ and an initial inverse $P^{-1}(0)$.
Typical choices without prior knowledge:
$\hat{\theta}(0) = [0, \ldots, 0]^\top$, a vector of $n$ zeros.
$P^{-1}(0) = \frac{1}{\delta} I$, with $\delta$ a small number such as $10^{-3}$. An equivalent initial value of $P$ would be $P(0) = \delta I$.
Intuition: a small $P$ corresponds to a low confidence in the initial parameters. This leads to a large $P^{-1}(0)$, so initially the weights $W(k)$ are large and large updates are applied to $\hat{\theta}$.
If a prior value $\theta_0$ is available, $\hat{\theta}(0)$ can be initialized to it, and $\delta$ correspondingly increased. This means better confidence in the initial estimate, and smaller initial updates.
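In Matlab, these default choices translate to the lines below ($n$ is the number of parameters; the value used here is only an example):

n     = 4;                    % number of parameters (example value)
delta = 1e-3;
theta = zeros(n, 1);          % theta_hat(0)
Pinv  = (1/delta) * eye(n);   % P^{-1}(0) = (1/delta) I, i.e. P(0) = delta I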
Back to the scalar case
$y(k) = b + e(k) = \varphi(k)\theta + e(k)$, where $\varphi(k) = 1$, $\theta = b$.
Take $\hat{\theta}(0) = 0$, $P^{-1}(0) \to \infty$. With Sherman-Morrison we have:
$$P^{-1}(k) = P^{-1}(k-1) - \frac{\bigl(P^{-1}(k-1)\bigr)^2}{1 + P^{-1}(k-1)} = \frac{P^{-1}(k-1)}{1 + P^{-1}(k-1)}$$
$$P^{-1}(1) = 1, \quad P^{-1}(2) = \frac{1}{2}, \quad \ldots, \quad P^{-1}(k) = \frac{1}{k}$$
Also, $\varepsilon(k) = y(k) - \hat{\theta}(k-1)$ and $W(k) = P^{-1}(k) = \frac{1}{k}$, leading to:
$$\hat{\theta}(k) = \hat{\theta}(k-1) + W(k)\varepsilon(k) = \hat{\theta}(k-1) + \frac{1}{k}\left[y(k) - \hat{\theta}(k-1)\right]$$
so the formula for the scalar case was recovered.
Table of contents
1 Motivating example: Estimating a scalar
2 Recursive least-squares and ARX
General recursive least-squares
Recursive ARX
Matlab example
3 Recursive instrumental variables
Recall ARX model
$$A(q^{-1})y(k) = B(q^{-1})u(k) + e(k)$$
$$(1 + a_1 q^{-1} + \cdots + a_{n_a} q^{-n_a})\,y(k) = (b_1 q^{-1} + \cdots + b_{n_b} q^{-n_b})\,u(k) + e(k)$$
In explicit form:
$$y(k) + a_1 y(k-1) + a_2 y(k-2) + \ldots + a_{n_a} y(k-n_a) = b_1 u(k-1) + b_2 u(k-2) + \ldots + b_{n_b} u(k-n_b) + e(k)$$
where the model parameters are $a_1, a_2, \ldots, a_{n_a}$ and $b_1, b_2, \ldots, b_{n_b}$.
Linear regression representation
$$\begin{aligned}
y(k) &= -a_1 y(k-1) - a_2 y(k-2) - \ldots - a_{n_a} y(k-n_a) \\
&\quad + b_1 u(k-1) + b_2 u(k-2) + \ldots + b_{n_b} u(k-n_b) + e(k) \\
&= \bigl[-y(k-1)\ \cdots\ -y(k-n_a)\ \ u(k-1)\ \cdots\ u(k-n_b)\bigr]
\begin{bmatrix} a_1 \\ \vdots \\ a_{n_a} \\ b_1 \\ \vdots \\ b_{n_b} \end{bmatrix} + e(k) \\
&=: \varphi^\top(k)\theta + e(k)
\end{aligned}$$
Regressor vector $\varphi(k) \in \mathbb{R}^{n_a + n_b}$: previous output and input values.
Parameter vector $\theta \in \mathbb{R}^{n_a + n_b}$: polynomial coefficients.
Recursive ARX
With this representation, recursive ARX is just an instantiation of the RLS template.
Recursive ARX
initialize $\hat{\theta}(0)$, $P^{-1}(0)$
loop at every step $k = 1, 2, \ldots$
    measure $u(k)$, $y(k)$
    form the regressor vector $\varphi(k) = [-y(k-1), \ldots, -y(k-n_a), u(k-1), \ldots, u(k-n_b)]^\top$
    find the prediction error $\varepsilon(k) = y(k) - \varphi^\top(k)\hat{\theta}(k-1)$
    update the inverse $P^{-1}(k) = P^{-1}(k-1) - \frac{P^{-1}(k-1)\varphi(k)\varphi^\top(k)P^{-1}(k-1)}{1 + \varphi^\top(k)P^{-1}(k-1)\varphi(k)}$
    compute the weights $W(k) = P^{-1}(k)\varphi(k)$
    update the parameters $\hat{\theta}(k) = \hat{\theta}(k-1) + W(k)\varepsilon(k)$
end loop
Remark: Outputs more than $n_a$ steps ago, and inputs more than $n_b$ steps ago, can be forgotten.
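A minimal Matlab sketch of this loop for $n_a = n_b = 1$, reusing the hypothetical rls_step function from the RLS slide and assuming the data vectors u and y are already available:

% Recursive ARX for na = nb = 1 (sketch; u, y are N-by-1 data vectors).
na = 1; nb = 1; delta = 1e-3;
theta  = zeros(na + nb, 1);             % theta_hat(0)
Pinv   = (1/delta) * eye(na + nb);      % P^{-1}(0)
thhist = zeros(length(y), na + nb);     % estimate history
for k = 2:length(y)                     % start at 2 so y(k-1), u(k-1) exist
    phi = [-y(k-1); u(k-1)];            % ARX regressor
    [theta, Pinv] = rls_step(theta, Pinv, phi, y(k));
    thhist(k, :) = theta';              % row k holds [a1, b1] after sample k
end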
Guarantees idea
With $\hat{\theta}(0) = 0$, $P^{-1}(0) = \frac{1}{\delta} I$ and $\delta \to 0$, recursive ARX at step $k$ is equivalent to running batch ARX on the dataset $u(1), y(1), \ldots, u(k), y(k)$.
$\Rightarrow$ the same guarantees as batch ARX:
Theorem
Under appropriate assumptions (including the existence of true parameters $\theta_0$), ARX identification is consistent: the estimated parameters $\hat{\theta}$ tend to the true parameters $\theta_0$ as $k \to \infty$.
The condition on $P$ is just to ensure the inverses are the same as in the offline case.
Table of contents
1 Motivating example: Estimating a scalar
2 Recursive least-squares and ARX
General recursive least-squares
Recursive ARX
Matlab example
3 Recursive instrumental variables
System
To illustrate recursive ARX, we take a known system:
$$y(k) + a\,y(k-1) = b\,u(k-1) + e(k), \qquad a = -0.9,\ b = 1$$
(Söderström & Stoica)

system = idpoly([1 -0.9], [0 1]);

Identification data obtained in simulation: sim(system, u, noise);
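The toolbox call above requires the System Identification Toolbox; a hedged plain-Matlab alternative for generating comparable data (the input and noise choices below are ours, not from the lecture) is:

% Simulate y(k) - 0.9 y(k-1) = u(k-1) + e(k) without the toolbox.
N = 1000;
u = sign(randn(N, 1));                   % simple binary excitation (assumed choice)
e = 0.1 * randn(N, 1);                   % white noise (assumed variance)
A = [1 -0.9];  B = [0 1];
y = filter(B, A, u) + filter(1, A, e);   % A(q^{-1}) y = B(q^{-1}) u + e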
Recursive ARX
model = rarx(id, [na, nb, nk], 'ff', 1, th0, Pinv0);

Arguments:
1 Identification data.
2 Array containing the orders of $A$ and $B$ and the delay $n_k$.
3 'ff', 1 selects the algorithm variant presented in this lecture.
4 th0 is the initial parameter value.
5 Pinv0 is the initial inverse $P^{-1}(0)$.
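A hedged usage sketch of this call, continuing from the simulated data above; the variable names follow the slide, but argument conventions (e.g. th0 as a row vector) may differ between toolbox versions, so check help rarx:

na = 1; nb = 1; nk = 1; delta = 1e-3;
id    = iddata(y, u);                    % identification data object
th0   = zeros(1, na + nb);               % initial parameters theta_hat(0)
Pinv0 = (1/delta) * eye(na + nb);        % initial inverse P^{-1}(0)
thm   = rarx(id, [na nb nk], 'ff', 1, th0, Pinv0);
% thm(k, :) holds the parameter estimates after processing sample k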
Results
$n_a = n_b = n_k = 1$, $P^{-1}(0) = \frac{1}{\delta} I$; left plot: $\delta = 10^{-3}$, right plot: $\delta = 10^{0}$.
(Plots of the estimated parameters over time omitted.)
Conclusions:
Convergence to the true parameter values...
...slower when $\delta$ is larger (good confidence in the initial solution).
System outside model class
Take now an output-error system (so it is not ARX with white noise):
$$y(k) = \frac{b q^{-1}}{1 + f q^{-1}}\,u(k) + e(k), \qquad f = -0.9,\ b = 1$$
(Söderström & Stoica)

system = idpoly([], [0 1], [], [], [1 -.9]);

Identification data obtained in simulation, as before (plot omitted).
Results
$n_a = n_b = n_k = 1$, so we attempt the ARX model:
$$y(k) + f\,y(k-1) = b\,u(k-1) + e(k)$$
Also, $P^{-1}(0) = \frac{1}{\delta} I$, $\delta = 10^{-3}$.
(Plot of the estimated parameters over time omitted.)
Conclusion:
Convergence to the true values is not attained! This is as expected, due to the colored noise.
Other recursive algorithms
Other recursive identification methods are available in Matlab, e.g.:
ARMAX, rarmax
output-error, roe
generic prediction error method, rpem
Table of contents
1 Motivating example: Estimating a scalar
2 Recursive least-squares and ARX
3 Recursive instrumental variables
Recall: Instrumental variable method
IV methods find the parameter vector using:
$$\hat{\theta} = \left[\frac{1}{N}\sum_{k=1}^{N} Z(k)\varphi^\top(k)\right]^{-1} \frac{1}{N}\sum_{k=1}^{N} Z(k)y(k) \qquad (8.1)$$
which is the solution to the system of equations:
$$\frac{1}{N}\sum_{k=1}^{N} Z(k)\left[\varphi^\top(k)\hat{\theta} - y(k)\right] = 0 \qquad (8.2)$$
The regressor vector: $\varphi(k) = [-y(k-1), \ldots, -y(k-n_a), u(k-1), \ldots, u(k-n_b)]^\top$
Motivation: an instrument vector $Z(k)$ uncorrelated with the noise $\Rightarrow$ can handle colored noise.
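A minimal Matlab sketch of the batch IV estimate (8.1), assuming the stacked matrices Z and Phi (N-by-n, with rows Z(k)' and phi(k)') and the output vector Y (N-by-1) have already been built; the 1/N factors cancel:

theta_hat = (Z' * Phi) \ (Z' * Y);   % = [sum Z(k) phi'(k)]^{-1} sum Z(k) y(k)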
Recursive formulation
Rewrite the equation for the recursive case:
$$\hat{\theta} = P^{-1}\sum_{j=1}^{k} Z(j)y(j) \qquad (8.3)$$
where $P = \sum_{j=1}^{k} Z(j)\varphi^\top(j)$.
Note that the averaging by the number of data points has been removed; it was needed in batch IV since summations over many data points could grow very large, leading to numerical problems.
Recursive IV will add terms one by one, and will therefore be numerically stable.
Recursive formulation (2)
With this definition of $P$, the recursive update is found similarly to RLS:
$$\begin{aligned}
\hat{\theta}(k) &= P^{-1}(k)\sum_{j=1}^{k} Z(j)y(j) \\
&= P^{-1}(k)\left[\sum_{j=1}^{k-1} Z(j)y(j) + Z(k)y(k)\right] \\
&= P^{-1}(k)\left[P(k-1)\hat{\theta}(k-1) + Z(k)y(k)\right] \\
&= P^{-1}(k)\left[\bigl(P(k) - Z(k)\varphi^\top(k)\bigr)\hat{\theta}(k-1) + Z(k)y(k)\right] \\
&= \hat{\theta}(k-1) + P^{-1}(k)\left[-Z(k)\varphi^\top(k)\hat{\theta}(k-1) + Z(k)y(k)\right] \\
&= \hat{\theta}(k-1) + P^{-1}(k)Z(k)\left[y(k) - \varphi^\top(k)\hat{\theta}(k-1)\right]
\end{aligned}$$
Recursive formulation (3)
Final formula:
$$\hat{\theta}(k) = \hat{\theta}(k-1) + P^{-1}(k)Z(k)\left[y(k) - \varphi^\top(k)\hat{\theta}(k-1)\right] = \hat{\theta}(k-1) + W(k)\varepsilon(k)$$
with the Sherman-Morrison update of the inverse:
$$P^{-1}(k) = P^{-1}(k-1) - \frac{P^{-1}(k-1)Z(k)\varphi^\top(k)P^{-1}(k-1)}{1 + \varphi^\top(k)P^{-1}(k-1)Z(k)}$$
Recursive IV: Overall algorithm
Recursive IV
initialize $\hat{\theta}(0)$, $P^{-1}(0)$
loop at every step $k = 1, 2, \ldots$
    measure $y(k)$, form the regressor vector $\varphi(k)$ and the instrument vector $Z(k)$
    find the prediction error $\varepsilon(k) = y(k) - \varphi^\top(k)\hat{\theta}(k-1)$
    update the inverse $P^{-1}(k) = P^{-1}(k-1) - \frac{P^{-1}(k-1)Z(k)\varphi^\top(k)P^{-1}(k-1)}{1 + \varphi^\top(k)P^{-1}(k-1)Z(k)}$
    compute the weights $W(k) = P^{-1}(k)Z(k)$
    update the parameters $\hat{\theta}(k) = \hat{\theta}(k-1) + W(k)\varepsilon(k)$
end loop
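A minimal Matlab sketch of one recursive IV update, mirroring the rls_step sketch from the RLS part; the function name riv_step and its interface are ours, not part of the lecture:

function [theta, Pinv] = riv_step(theta, Pinv, phi, z, y)
% One recursive instrumental-variable update.
%   phi : regressor phi(k), n-by-1;  z : instrument Z(k), n-by-1
epsk  = y - phi' * theta;                                          % prediction error eps(k)
Pinv  = Pinv - (Pinv * (z * phi') * Pinv) / (1 + phi' * Pinv * z); % Sherman-Morrison
W     = Pinv * z;                                                  % weighting vector W(k)
theta = theta + W * epsk;                                          % parameter update
end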