SYSTEMS Identification
Ali Karimpour, Assistant Professor, Ferdowsi University of Mashhad
Reference: "System Identification: Theory for the User", Lennart Ljung




Ali Karimpour Nov 2009

Lecture 11

Recursive estimation methods

Topics to be covered include:

• Introduction.

• The Recursive Least-Squares Algorithm.

• The Recursive IV Method.

• Recursive Prediction-Error Methods.

• Recursive Pseudolinear Regressions.

• The Choice of Updating Step.

• Implementation.



Introduction


In many cases it is necessary, or useful, to have a model of the system available on-line, for instance to answer questions such as:

• Which input should be applied at the next sampling instant?

• How should the parameters of a matched filter be tuned?

• What are the best predictions of the next few outputs?

• Has a failure occurred and, if so, of what type?

These problems are all adaptive in nature: adaptive control, adaptive filtering, adaptive signal processing, adaptive prediction.


The on-line computation of the model must be completed during one sampling interval. Identification techniques that comply with this requirement are called:

• Recursive identification methods (the term used in this reference).

• On-line identification.

• Real-time identification.

• Adaptive parameter estimation.

• Sequential parameter estimation.


Algorithm format

A general identification method maps the data record to an estimate:

θ̂_t = F(t, Z^t)

This form cannot be used in a recursive algorithm, since its evaluation cannot in general be completed within one sampling interval. Instead, a recursive algorithm must comply with the format

X(t) = H(t, X(t−1), y(t), u(t))
θ̂(t) = h(X(t))

where X(t) is an information state of fixed dimension, and h is, for example, the minimizing argument of some function.

Since the information in the latest pair of measurements { y(t), u(t) } normally is small compared to the previously accumulated information, there is a more suitable form:

X(t) = X(t−1) + μ(t) Q(t, X(t−1), y(t), u(t))

where μ(t) is a sequence of small numbers reflecting the relative information value in the latest measurement.


The Recursive Least-Squares Algorithm


Weighted LS Criterion

The estimate for the weighted least squares is:

Where
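The formulas on this slide were images and did not survive extraction; in Ljung's notation, the weighted criterion, the estimate, and the quantities it involves are presumably:

```latex
\hat{\theta}_t
  = \arg\min_{\theta} \sum_{k=1}^{t} \beta(t,k)\left[y(k)-\varphi^{T}(k)\,\theta\right]^{2}
  = \bar{R}^{-1}(t)\, f(t),
\qquad
\bar{R}(t) = \sum_{k=1}^{t} \beta(t,k)\,\varphi(k)\varphi^{T}(k),
\qquad
f(t) = \sum_{k=1}^{t} \beta(t,k)\,\varphi(k)\, y(k)
```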


Recursive algorithm

Suppose the weighting sequence has the following property:

Now
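The equations here were also lost with the slide images; the assumed weighting property and the recursion that follows from it are presumably:

```latex
\beta(t,k) = \lambda(t)\,\beta(t-1,k),\quad 0 \le k \le t-1,
\qquad \beta(t,t) = 1
```

```latex
\bar{R}(t) = \lambda(t)\,\bar{R}(t-1) + \varphi(t)\varphi^{T}(t),
\qquad
\hat{\theta}(t) = \hat{\theta}(t-1)
  + \bar{R}^{-1}(t)\,\varphi(t)\left[y(t)-\varphi^{T}(t)\,\hat{\theta}(t-1)\right]
```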



Version with Efficient Matrix Inversion

Remember the matrix inversion lemma:

(A + BCD)⁻¹ = A⁻¹ − A⁻¹B(C⁻¹ + DA⁻¹B)⁻¹DA⁻¹

To avoid inverting R̄(t) at each step, introduce P(t) = R̄⁻¹(t).


Moreover, the gain can be written L(t) = P(t)φ(t). We can summarize this version of the algorithm as:
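The summary equations themselves were images in the original slides. The following is a minimal sketch of the matrix-inversion-lemma form of RLS with a forgetting factor; the function and variable names (`rls_step`, `lam`, etc.) are mine, not Ljung's.

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=1.0):
    """One recursive least-squares update (matrix-inversion-lemma form).

    theta : current estimate (d,)
    P     : current "covariance" matrix (d, d)
    phi   : regressor (d,)
    y     : new output sample
    lam   : forgetting factor (1.0 = no forgetting)
    """
    denom = lam + phi @ P @ phi              # scalar: lam + phi^T P phi
    L = P @ phi / denom                      # gain vector L(t)
    eps = y - phi @ theta                    # prediction error
    theta = theta + L * eps
    P = (P - np.outer(P @ phi, phi @ P) / denom) / lam
    return theta, P

# Usage: identify y(t) = 2*u(t) - 1.5*u(t-1) from noisy data
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
theta_hat = np.zeros(2)
P = 1e4 * np.eye(2)                          # large P0: weak prior
for t in range(1, 500):
    phi = np.array([u[t], u[t - 1]])
    y = 2.0 * u[t] - 1.5 * u[t - 1] + 0.01 * rng.standard_normal()
    theta_hat, P = rls_step(theta_hat, P, phi, y)
print(theta_hat)  # close to [2.0, -1.5]
```

Note that only a scalar (`denom`) is inverted per step, which is the whole point of this version.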


Normalized Gain Version

The size of the matrix R(t) will depend on λ(t).


Initial Condition

One possibility is to initialize the recursion only at a time instant t0, using the off-line LS method on the first t0 data points. Clearly, if P0 is large or t is large, the resulting estimate is the same as the off-line weighted LS estimate.


Asymptotic Properties of the Estimate


Multivariable case

Remember SISO

Now for MIMO


Kalman Filter Interpretation

The Kalman filter estimates the state of the system

x(t+1) = F(t)x(t) + w(t)
y(t) = H(t)x(t) + v(t)

The linear regression model can be cast into this form by taking the parameter vector as a constant state:

x(t+1) = x(t) = θ,    y(t) = φᵀ(t)x(t) + v(t)

Exercise: Derive the Kalman filter for the above system, and show that it is exactly the same as the recursive least-squares algorithm for the multivariable case.
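The claim in the exercise can at least be checked numerically. Below is a sketch (all names mine) comparing the Kalman measurement update for the constant-state model against the standard RLS step with λ = 1: the two recursions are the same algebra under different names, so the iterates coincide.

```python
import numpy as np

def kalman_update(x, P, phi, y, r2=1.0):
    """Measurement update for x(t+1) = x(t), y(t) = phi^T x(t) + v(t), Var v = r2."""
    s = phi @ P @ phi + r2
    K = P @ phi / s                       # Kalman gain
    x = x + K * (y - phi @ x)
    P = P - np.outer(P @ phi, phi @ P) / s
    return x, P

def rls_update(theta, P, phi, y):
    """Standard RLS step with lambda = 1 (same algebra, different names)."""
    s = 1.0 + phi @ P @ phi
    L = P @ phi / s                       # RLS gain
    theta = theta + L * (y - phi @ theta)
    P = P - np.outer(P @ phi, phi @ P) / s
    return theta, P

rng = np.random.default_rng(1)
x = np.zeros(3); Px = 10.0 * np.eye(3)    # Kalman state and covariance
th = np.zeros(3); Pt = 10.0 * np.eye(3)   # RLS estimate and P matrix
for _ in range(50):
    phi = rng.standard_normal(3)
    y = phi @ np.array([1.0, -0.5, 0.25]) + 0.1 * rng.standard_normal()
    x, Px = kalman_update(x, Px, phi, y)
    th, Pt = rls_update(th, Pt, phi, y)
print(np.max(np.abs(x - th)))             # the two recursions coincide
```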


The Kalman filter interpretation gives important information, as well as some practical hints.


Coping with Time-varying Systems

An important reason for using adaptive methods and recursive identification in practice is that:

• The properties of the system may be time-varying.

• We want the identification algorithm to track the variation.

This is handled by the weighted criterion, which assigns less weight to older measurements.


These choices have the natural effect that in the recursive algorithms the step size will not decrease to zero.


Another, more formal alternative for dealing with time-varying parameters is to assume that the true parameter vector varies like a random walk:

θ₀(t+1) = θ₀(t) + w(t),    E w(t)wᵀ(t) = R₁(t)

Exercise: Derive the Kalman filter for the above system, and show that it is exactly the same as the recursive least-squares algorithm for the multivariable case.

Note: The additive term R₁(t) in the P(t) recursion prevents the gain L(t) from tending to zero.


The Recursive IV Method


Recall the weighted LS criterion and its estimate. The corresponding IV estimate for the instrumental-variable method is:

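The IV formulas were equation images lost in extraction. By analogy with the RLS recursion, with the instrument vector ζ(t) taking the place of φ(t) where the regressor multiplies the data, the recursive IV update presumably reads:

```latex
\hat{\theta}(t) = \hat{\theta}(t-1) + L(t)\left[y(t)-\varphi^{T}(t)\,\hat{\theta}(t-1)\right],
\qquad
L(t) = \frac{P(t-1)\,\zeta(t)}{\lambda(t)+\varphi^{T}(t)\,P(t-1)\,\zeta(t)},
```

```latex
P(t) = \frac{1}{\lambda(t)}
\left[P(t-1)-\frac{P(t-1)\,\zeta(t)\,\varphi^{T}(t)\,P(t-1)}
                   {\lambda(t)+\varphi^{T}(t)\,P(t-1)\,\zeta(t)}\right]
```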


Recursive Prediction-Error Methods


Analogous to the weighted LS case, let us consider a weighted quadratic prediction-error criterion

Where

so we have

the gradient with respect to θ is


Remember the general search algorithm developed for PEM as:

For each iteration i, we collect one more data point, so

now define

As an approximation let:


With the above approximation and taking μ(t) = 1, we thus arrive at the algorithm. These terms must be recursive too.
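The algorithm equations were images in the original slides; in Ljung's notation, the Gauss-Newton RPEM updates presumably take the form:

```latex
\varepsilon(t) = y(t) - \hat{y}(t),
\qquad
\hat{\theta}(t) = \hat{\theta}(t-1) + \gamma(t)\,R^{-1}(t)\,\psi(t)\,\varepsilon(t),
```

```latex
R(t) = R(t-1) + \gamma(t)\left[\psi(t)\psi^{T}(t) - R(t-1)\right]
```

Here ψ(t) is the gradient of the prediction ŷ(t) with respect to θ, which itself must be computed recursively.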


Family of recursive prediction-error methods

A wide family of methods, which we shall call RPEM, is obtained:

• according to the model structure,

• according to the choice of R(t).

For example, for the linear regression model, the Gauss-Newton choice of R(t) gives the recursive least-squares method, while the choice R(t) = I gives a gradient scheme whose gain can be normalized by the regressor energy. The gradient scheme has been widely used under the name least mean squares (LMS).
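A minimal sketch of the normalized-gain gradient scheme (NLMS); the function name, step size `mu`, and regularizer `eps` are mine.

```python
import numpy as np

def nlms_step(theta, phi, y, mu=0.5, eps=1e-8):
    """One normalized LMS step: gradient direction (R(t) = I),
    with the step normalized by the regressor energy |phi|^2."""
    e = y - phi @ theta                       # prediction error
    theta = theta + mu * phi * e / (phi @ phi + eps)
    return theta

# Usage: track y(t) = 0.7*phi1 - 0.3*phi2 plus noise
rng = np.random.default_rng(2)
theta_hat = np.zeros(2)
for t in range(2000):
    phi = rng.standard_normal(2)
    y = phi @ np.array([0.7, -0.3]) + 0.01 * rng.standard_normal()
    theta_hat = nlms_step(theta_hat, phi, y)
print(theta_hat)  # near [0.7, -0.3]
```

The normalization makes the update step invariant to the scaling of the regressor, at the cost of one extra inner product per step.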


Example 11.1 Recursive Maximum Likelihood

Consider the ARMAX model

A(q)y(t) = B(q)u(t) + C(q)e(t)

where

θ = [a₁ … a_na  b₁ … b_nb  c₁ … c_nc]ᵀ

and

φ(t,θ) = [−y(t−1) … −y(t−na)  u(t−1) … u(t−nb)  ε(t−1,θ) … ε(t−nc,θ)]ᵀ

Remember Chapter 10. By rule (11.41) the recursive update follows. This scheme is known as recursive maximum likelihood (RML).


Projection into D_M

The model structure is well defined only for θ ∈ D_M, i.e. for parameters giving stable predictors. In off-line minimization this must be kept in mind as a constraint. The same is true for the recursive minimization.


Asymptotic Properties

The recursive prediction-error method is designed to update θ in a direction that "on the average" is a modified negative gradient of the underlying criterion.


Moreover (see Appendix 11A), for the Gauss-Newton RPEM with γ(t) = 1/t, it can be shown that θ̂(t) has an asymptotic normal distribution, which coincides with that of the corresponding off-line estimate.


Recursive Pseudolinear Regressions


Consider the pseudolinear representation of the prediction

ŷ(t|θ) = φᵀ(t,θ)θ

and recall that this model structure contains, among other models, the general linear SISO model. A bootstrap method for estimating θ was given by (10.64) in Chapter 10, solved by the Newton-Raphson method.



Family of RPLRs

The RPLR scheme represents a family of well-known algorithms when applied to different special cases of the model structure.

The ARMAX case is perhaps the best known of these. If we choose the ARMAX model structure, the resulting scheme is known as extended least squares (ELS).
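A minimal simulation sketch of ELS for a first-order ARMAX model: the pseudolinear regressor simply uses the latest prediction error in place of the unobserved noise. The model orders, signal names, and initializations are mine, chosen only for illustration.

```python
import numpy as np

# Simulate ARMAX: y(t) = -a*y(t-1) + b*u(t-1) + e(t) + c*e(t-1)
rng = np.random.default_rng(3)
a, b, c = -0.8, 1.0, 0.5
N = 5000
u = rng.standard_normal(N)
e = 0.1 * rng.standard_normal(N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = -a * y[t - 1] + b * u[t - 1] + e[t] + c * e[t - 1]

theta = np.zeros(3)          # estimates of [a, b, c]
P = 1e3 * np.eye(3)
eps_prev = 0.0               # latest residual, standing in for e(t-1)
for t in range(1, N):
    phi = np.array([-y[t - 1], u[t - 1], eps_prev])   # pseudolinear regressor
    denom = 1.0 + phi @ P @ phi
    L = P @ phi / denom                               # RLS gain
    eps = y[t] - phi @ theta                          # prediction error
    theta = theta + L * eps
    P = P - np.outer(P @ phi, phi @ P) / denom
    eps_prev = eps
print(theta)  # approaches [a, b, c] = [-0.8, 1.0, 0.5]
```

Structurally this is plain RLS; only the regressor is extended with estimated noise terms, which is exactly what makes it a pseudolinear regression.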

Other special cases are displayed in the following table:

Page 43: SYSTEMS Identification

43

lecture 11

Ali Karimpour Nov 2009

Recursive Pseudolinear Regressions

Other special cases are displayed in following table:


The Choice of Updating Step


Recursive prediction-error methods are based on the prediction-error approach; recursive pseudolinear regressions are based on the correlation approach.


Recursive Prediction-Error Methods (RPEM) Recursive Pseudolinear Regressions (RPLR)

The difference between the prediction-error approach and the correlation approach lies in the update vector: ψ(t) for RPEM versus φ(t) for RPLR.

We now turn to the factor γ(t)R⁻¹(t), which modifies the update direction and determines the length of the update step. We discuss only RPEM; for RPLR everything is the same with ψ(t) replaced by φ(t).


Update direction

There are two basic choices of update direction:

• A Gauss-Newton direction, with R(t) an approximation of the Hessian: better convergence rate.

• A gradient direction, with R(t) = I: easier computation.


Update Step: Adaptation gain

An important aspect of recursive algorithms is their ability to cope with time-varying systems. There are two different ways of achieving this:


In either case, the choice of update step is a trade-off between:

• Tracking ability

• Noise sensitivity

A high gain means that the algorithm is alert in tracking parameter changes, but at the same time sensitive to disturbances in the data.


Choice of forgetting factor

The choice of forgetting profile β(t,k) is conceptually simple. For a system that changes gradually and in a stationary manner, the most common choice is a constant forgetting factor:

β(t,k) = λ^(t−k)

The constant λ is always chosen slightly less than 1, so that T₀ = 1/(1−λ) acts as a memory time constant: measurements older than T₀ samples are included in the criterion with a weight that is e⁻¹ ≈ 0.36 times that of the most recent measurement.

So we could select λ such that 1/(1−λ) reflects the ratio between the time constant of the variations in the dynamics and that of the dynamics itself. Typical choices of λ are in the range 0.98 to 0.995. For a system that undergoes sudden changes, rather than steady and slow ones, it is suitable to decrease λ(t) to a small value and then increase it back toward 1 again.
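The memory-time-constant rule above is easy to verify numerically; a two-line check over the typical λ values quoted in the text:

```python
# Memory time constant T0 = 1/(1 - lambda): a measurement T0 samples old
# enters the criterion with weight lambda**T0, which is close to exp(-1).
for lam in (0.98, 0.99, 0.995):
    T0 = 1.0 / (1.0 - lam)
    print(lam, T0, lam ** T0)   # weight at age T0 is close to 0.37
```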


Choice of Gain γ(t)


Including a model of parameter changes

Kalman Filter Interpretation

Remember

Now let


In the case of a linear regression model, this algorithm does give the optimal trade-off between tracking ability and noise sensitivity, in terms of the parameter-error covariance matrix.

The case where the parameters are subject to variations that are themselves nonstationary [i.e., R₁(t) varies with t] needs a parallel algorithm (see Anderson, 1985).


Constant systems


Asymptotic behavior in the time-varying case


Implementation


The basic, general Gauss-Newton algorithm was given for RPEM and RPLR. We shall discuss some aspects of how best to implement the recursive algorithm.

The inverse form is not suited for direct implementation, since the d×d matrix R would have to be inverted at each step. By using the matrix inversion lemma, the recursion can be rewritten so that only a p×p matrix has to be inverted. Here η (a d×p matrix) represents either φ or ψ, depending on the approach.


Unfortunately, the P-recursion, which is in fact a Riccati equation, is not numerically sound: the equation is sensitive to round-off errors that can accumulate and make P(t) indefinite.

Using factorization

It is useful to represent the data matrices in factorized form, so as to work with better-conditioned matrices:

• Cholesky decomposition, in which P(t) = Q(t)Qᵀ(t) with Q(t) triangular.

• UD-decomposition, in which P(t) = U(t)D(t)Uᵀ(t) with U(t) triangular and D(t) diagonal.

Here we shall give some details of a related algorithm, based directly on Householder transformations (Problem 10T.1), due to Morf and Kailath.


• Step 1: Let Q(t−1) be the current square-root factor and form the (p+d)×(p+d) matrix L(t−1).

• Step 2: Apply an orthogonal (p+d)×(p+d) transformation T (TᵀT = I) to L(t−1) so that TL(t−1) becomes an upper triangular matrix (use QR-factorization). Partition TL(t−1) as:
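The partitioned arrays were shown as images in the original slides. The following is a numerical sketch of the whole square-root update for the scalar-output RLS case (p = 1, λ = 1), in the spirit of the Morf-Kailath array algorithm; the function and variable names are mine.

```python
import numpy as np

def sqrt_rls_step(theta, Q, phi, y):
    """Square-root RLS measurement update via QR, with P = Q Q^T and lambda = 1."""
    d = len(theta)
    # Pre-array: [[1, phi^T Q], [0, Q]], a (1+d) x (1+d) matrix
    M = np.zeros((1 + d, 1 + d))
    M[0, 0] = 1.0
    M[0, 1:] = phi @ Q
    M[1:, 1:] = Q
    # Orthogonal transformation: QR of M^T triangularizes M from the right,
    # so R^T is a lower-triangular square root of M M^T.
    _, R = np.linalg.qr(M.T)
    pi_sqrt = R[0, 0]                 # sqrt(1 + phi^T P phi), up to sign
    L = R[0, 1:] / pi_sqrt            # gain  P phi / (1 + phi^T P phi)
    Q_new = R[1:, 1:].T               # updated factor: P(t) = Q_new Q_new^T
    theta = theta + L * (y - phi @ theta)
    return theta, Q_new

# Usage: identify a two-parameter linear regression
rng = np.random.default_rng(4)
theta_hat = np.zeros(2)
Q = 100.0 * np.eye(2)                 # P0 = 1e4 * I
for t in range(300):
    phi = rng.standard_normal(2)
    y = phi @ np.array([1.0, 2.0]) + 0.01 * rng.standard_normal()
    theta_hat, Q = sqrt_rls_step(theta_hat, Q, phi, y)
print(theta_hat)  # close to [1.0, 2.0]
```

P(t) is never formed explicitly, only its square root, so positive semidefiniteness is preserved by construction.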


• Step 3: Now L(t) and P(t) are:


Using factorization in summary


There are several advantages with this particular way of performing the computations:

• The only essential computation is the triangularization step.

• This step gives the new Q and the gain L after simple additional calculations.

• Π(t) is a triangular p×p matrix, so it is easy to invert.

• The condition number of the matrix L(t−1) is much better than that of P.