An Introduction to Adaptive Filtering & Its Applications
By Asst. Prof. Dr. Thamer M. Jamel
Department of Electrical Engineering
University of Technology, Baghdad, Iraq
Introduction
Linear filters: the filter output is a linear function of the filter input.
Design methods:
1. The classical approach: frequency-selective filters such as low-pass, band-pass, and notch filters.
2. Optimal filter design: mostly based on minimizing the mean-square value of the error signal.
Wiener Filter
Based on the work of Wiener in 1942 and Kolmogorov in 1939.
It relies on a priori statistical information.
When such a priori information is not available, which is usually the case, it is not possible to design a Wiener filter in the first place.
Adaptive Filter
The signal and/or noise characteristics are often nonstationary, and the statistical parameters vary with time.
An adaptive filter has an adaptation algorithm that monitors the environment and varies the filter transfer function accordingly.
Based on the actual signals received, it attempts to find the optimum filter design.
Adaptive filterThe basic operation now involves two processes :
1. a filtering process, which produces an output signal in response to a given input signal.
2. an adaptation process, which aims to adjust the filter parameters (the filter transfer function) to the (possibly time-varying) environment. Often, the (average) square value of the error signal is used as the optimization criterion.
Adaptive Filter
Because of the complexity of the optimizing algorithms, most adaptive filters are digital filters that perform digital signal processing.
When processing analog signals, the adaptive filter is preceded by an A/D converter and followed by a D/A converter.
Adaptive Filter
The generalization to adaptive IIR filters leads to stability problems.
It is therefore common to use an FIR digital filter with adjustable coefficients.
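As an illustration, such a direct-form FIR filter with adjustable coefficients can be sketched in NumPy; the function name and the test signal below are illustrative, not from the slides:

```python
import numpy as np

def fir_filter(weights, x):
    """Filter signal x with an FIR filter defined by `weights`.

    y(n) = sum_k w[k] * x(n - k); samples before the start of the
    signal are taken as zero (zero initial conditions).
    """
    M = len(weights)
    y = np.zeros(len(x))
    for n in range(len(x)):
        # Tap-input vector [x(n), x(n-1), ..., x(n-M+1)], zero-padded.
        taps = [x[n - k] if n - k >= 0 else 0.0 for k in range(M)]
        y[n] = np.dot(weights, taps)
    return y

# A 3-tap moving-average filter (all coefficients equal).
y = fir_filter(np.array([1/3, 1/3, 1/3]), np.array([3.0, 3.0, 3.0, 3.0]))
# -> [1.0, 2.0, 3.0, 3.0]: the output ramps up as the taps fill.
```

An adaptive filter keeps this same structure but lets an adaptation algorithm change `weights` at every sample.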
Applications of Adaptive Filters: Identification
Used to provide a linear model of an unknown plant. Applications: system identification.
Applications of Adaptive Filters: Inverse Modeling
Used to provide an inverse model of an unknown plant. Applications: equalization (communication channels).
Applications of Adaptive Filters: Prediction
Used to provide a prediction of the present value of a random signal. Applications: linear predictive coding.
Applications of Adaptive Filters: Interference Cancellation
Used to cancel unknown interference from a primary signal. Applications: echo/noise cancellation (hands-free car phones, aircraft headphones, etc.).
Example: Acoustic Echo Cancellation
LMS Algorithm
The most popular adaptation algorithm is LMS. The cost function is defined as the mean-squared error.
It is based on the method of steepest descent: move towards the minimum on the error surface using the gradient of the error surface, estimated at every iteration.
LMS Adaptive Algorithm
Introduced by Widrow & Hoff in 1959.
Simple; no matrix calculations are involved in the adaptation.
In the family of stochastic gradient algorithms; an approximation of the steepest descent method.
Based on the MMSE (minimum mean-square error) criterion.
The adaptive process contains two input signals:
1. the filtering process, producing the output signal;
2. the desired signal (training sequence).
Adaptive process: recursive adjustment of the filter tap weights.
LMS Algorithm Steps
Filter output: y(n) = w^T(n) x(n)
Estimation error: e(n) = d(n) - y(n)
Tap-weight adaptation: w(n+1) = w(n) + μ x(n) e(n)
where x(n) is the tap-input vector, w(n) the tap-weight vector, d(n) the desired response, and μ the step size.
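The three steps above (filter output, estimation error, tap-weight update) can be sketched directly in NumPy. The one-step sinusoid prediction task and all signal parameters below are illustrative assumptions, not from the slides:

```python
import numpy as np

def lms(x, d, M, mu):
    """LMS adaptive filter: returns final weights, output y, and error e."""
    N = len(x)
    w = np.zeros(M)
    y = np.zeros(N)
    e = np.zeros(N)
    for n in range(N):
        # Tap-input vector [x(n), x(n-1), ..., x(n-M+1)], zero-padded.
        u = np.array([x[n - k] if n - k >= 0 else 0.0 for k in range(M)])
        y[n] = np.dot(w, u)        # filter output:    y(n) = w^T(n) x(n)
        e[n] = d[n] - y[n]         # estimation error: e(n) = d(n) - y(n)
        w = w + mu * u * e[n]      # tap-weight adaptation
    return w, y, e

# One-step linear prediction of a sinusoid: predict s(n) from past samples.
n = np.arange(4000)
s = np.sin(2 * np.pi * 0.05 * n)
x = np.concatenate(([0.0], s[:-1]))   # delayed input to the predictor
w, y, e = lms(x, s, M=2, mu=0.1)      # the error power shrinks as w adapts
```

A two-tap predictor suffices here because any single sinusoid obeys a second-order recursion, so the prediction error can be driven toward zero.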
Stability of LMS
The LMS algorithm is convergent in the mean square if and only if the step-size parameter satisfies 0 < μ < 2/λ_max, where λ_max is the largest eigenvalue of the correlation matrix of the input data.
A more practical test for stability is 0 < μ < 2/(tap-input power), where the tap-input power is the sum of the mean-square values of all the tap inputs.
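Both bounds can be checked numerically. The white-noise input and the 8-tap filter length below are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)   # white input with unit variance (assumed)
M = 8                               # number of filter taps (assumed)

# Estimate the autocorrelation r(k) and build the M x M correlation matrix R.
r = np.array([np.mean(x[k:] * x[:len(x) - k]) for k in range(M)])
R = np.array([[r[abs(i - j)] for j in range(M)] for i in range(M)])

lam_max = np.linalg.eigvalsh(R).max()
bound_eig = 2.0 / lam_max                 # theoretical bound: 2 / lambda_max
bound_power = 2.0 / (M * np.mean(x**2))   # practical bound: 2 / (tap-input power)
# The power-based bound is always the more conservative (smaller) one,
# because the tap-input power equals the sum of all eigenvalues of R.
```

For white input, λ_max is close to the input variance, so the eigenvalue bound is near 2 while the power bound is near 2/M.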
Larger values of the step size increase the adaptation rate (faster adaptation) but also increase the residual mean-squared error.
Given a quadratic cost function of the coefficient vector [C1, C2], we need to obtain the vector that gives the absolute minimum; for this function the minimizing vector is obvious by inspection. (The figure shows the quadratic error function, a "quadratic bowl".)
STEEPEST DESCENT EXAMPLE
Now let's find the solution by the steepest descent method.
We start by assuming (C1 = 5, C2 = 7). We select the step-size constant μ: if it is too big, we overshoot the minimum; if it is too small, it takes a long time to reach the minimum. Here we select μ = 0.1. The gradient vector of the cost function is evaluated at each iterate.
So our iterative equation is: C(k+1) = C(k) - μ ∇J(C(k))
As we can see, the vector [C1, C2] converges to the value which yields the function minimum, and the speed of this convergence depends on μ.
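The iteration above can be sketched as follows. The original cost function is not reproduced in this text, so a representative quadratic bowl with its minimum at (1, 2) is assumed; the starting point (5, 7) and μ = 0.1 match the example:

```python
import numpy as np

def steepest_descent(grad, c0, mu, iters):
    """Iterate c <- c - mu * grad(c) and record the trajectory."""
    c = np.asarray(c0, dtype=float)
    path = [c.copy()]
    for _ in range(iters):
        c = c - mu * grad(c)
        path.append(c.copy())
    return np.array(path)

# Assumed quadratic bowl J(c) = (c1 - 1)^2 + (c2 - 2)^2, minimum at (1, 2).
grad = lambda c: np.array([2 * (c[0] - 1), 2 * (c[1] - 2)])

path = steepest_descent(grad, c0=(5.0, 7.0), mu=0.1, iters=100)
# path[-1] is very close to the true minimum (1, 2).
```

With μ = 0.1 each coordinate error shrinks by the factor (1 - 2μ) = 0.8 per iteration, so convergence is geometric.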
[Figure: contour plot of the error surface, showing the path from the initial guess to the minimum]
LMS CONVERGENCE GRAPH
This graph illustrates the LMS algorithm. First we start by guessing the tap weights. Then we repeatedly step in the direction opposite to the gradient vector to calculate the next taps, and so on, until we reach the MMSE, meaning the MSE is 0 or very close to it. (In practice we cannot get an error of exactly 0, because the noise is a random process; we can only decrease the error below a desired minimum.)
Example for an unknown channel of 2nd order: the tap weights converge to the desired combination of taps.
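The 2nd-order channel identification just described can be sketched as follows; the channel taps h = [1.0, 0.5] are assumed values for illustration, since the slide's actual values are not in the extracted text:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(3000)            # white training input
h = np.array([1.0, 0.5])                 # unknown 2nd-order channel (assumed)
d = np.convolve(x, h)[:len(x)]           # channel output serves as desired signal

mu, M = 0.05, 2
w = np.zeros(M)
for n in range(len(x)):
    u = np.array([x[n - k] if n - k >= 0 else 0.0 for k in range(M)])
    e = d[n] - np.dot(w, u)              # error between plant and model output
    w += mu * u * e                      # LMS tap-weight update
# w has converged to the unknown channel taps h.
```

Because the adaptive filter and the plant see the same input and there is no measurement noise, the weights settle essentially exactly on h.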
Adaptive Array Antennas (Smart Antennas)
[Figure: adaptive array with a linear combiner used to suppress interference]
Applications are many:
- Digital communications (OFDM, MIMO, CDMA, and RFID)
- Channel equalisation
- Adaptive noise cancellation
- Adaptive echo cancellation
- System identification
- Smart antenna systems
- Blind system equalisation
- and many, many others

Adaptive Equalization
Introduction
Wireless communication is one of the most interesting fields of communication these days because it supports mobility (mobile users). However, many wireless applications now require high-speed communication (high data rates).
What is ISI?
Inter-symbol interference (ISI) takes place when a given transmitted symbol is distorted by other transmitted symbols.
Cause of ISI: ISI arises from the band-limiting effect of a practical channel, and also from multi-path effects (delay spread).
Definition of the equalizer: the equalizer is a digital filter that provides an approximate inverse of the channel frequency response.
Need for equalization: to mitigate the effects of ISI and so decrease the probability of error that would occur without suppression of ISI; this reduction of ISI has to be balanced against the prevention of noise-power enhancement.
Types of equalization techniques
Linear equalization techniques are simple to implement but greatly enhance the noise power, because they work by inverting the channel frequency response.
Non-linear equalization techniques are more complex to implement but have much less noise enhancement than linear equalizers.
Equalization Techniques
Fig. 3: Classification of equalizers.
[Figure: linear equalizer with N taps and (N-1) delay elements]
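The N-tap linear equalizer just described can be trained with LMS. A minimal sketch, assuming a hypothetical two-tap minimum-phase channel h = [1.0, 0.4] and a known BPSK training sequence:

```python
import numpy as np

rng = np.random.default_rng(3)
s = rng.choice([-1.0, 1.0], size=10_000)   # BPSK training symbols
h = np.array([1.0, 0.4])                   # assumed ISI channel (minimum phase)
r = np.convolve(s, h)[:len(s)]             # received, ISI-distorted signal

mu, M = 0.02, 8                            # step size and equalizer tap count
w = np.zeros(M)
y = np.zeros(len(s))
for n in range(len(s)):
    u = np.array([r[n - k] if n - k >= 0 else 0.0 for k in range(M)])
    y[n] = np.dot(w, u)                    # equalizer output
    e = s[n] - y[n]                        # training: desired = sent symbol
    w += mu * u * e
# After training, y tracks s and w approximates the channel inverse
# 1 / (1 + 0.4 z^-1) = 1 - 0.4 z^-1 + 0.16 z^-2 - ...
```

Because the assumed channel is minimum phase, a causal FIR equalizer with no decision delay works; channels with deeper spectral nulls would show the noise-enhancement problem the slides mention.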
Table of various algorithms and their trade-offs:

Algorithm    | Multiplying operations (complexity) | Convergence | Tracking
LMS          | Low                                 | Slow        | Poor
MMSE         | Very high                           | Fast        | Good
RLS          | High                                | Fast        | Good
Fast Kalman  | Fairly low                          | Fast        | Good
RLS-DFE      | High                                | Fast        | Good
Adaptive Noise Cancellation
[Figure: adaptive filter block diagram]
The LMS Equation
The least mean squares (LMS) algorithm updates each coefficient on a sample-by-sample basis, driven by the error e(n). This update equation minimises the power in the error e(n).
The Least Mean Squares Algorithm
The value of μ (mu) is critical:
- If μ is too small, the filter reacts slowly.
- If μ is too large, the residual error is large (poor filter resolution).
The selected value of μ is a compromise.
LMS Convergence vs. μ
Audio Noise Reduction
A popular application of acoustic noise reduction is headsets for pilots. This uses two microphones: a primary microphone that picks up the voice plus cockpit noise, and a reference microphone that picks up the noise alone.
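The two-microphone arrangement can be sketched as follows. The "speech" sinusoid, the noise statistics, and the acoustic path g are all assumed for illustration; in the adaptive noise canceller the error signal, not the filter output, is the cleaned speech:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 20_000
speech = np.sin(2 * np.pi * 0.01 * np.arange(N))   # stand-in for the pilot's voice
noise = rng.standard_normal(N)                      # cockpit noise at the reference mic
g = np.array([0.6, 0.3])                            # assumed acoustic path to primary mic
primary = speech + np.convolve(noise, g)[:N]        # primary mic: voice + filtered noise

mu, M = 0.01, 4
w = np.zeros(M)
e = np.zeros(N)
for n in range(N):
    u = np.array([noise[n - k] if n - k >= 0 else 0.0 for k in range(M)])
    y = np.dot(w, u)            # estimate of the noise reaching the primary mic
    e[n] = primary[n] - y       # error = cleaned speech estimate
    w += mu * u * e[n]
```

The filter converges toward the acoustic path g, so the noise component of the primary signal is cancelled and e(n) approaches the speech; the speech itself, being uncorrelated with the reference noise, survives in the error.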
The Simulink Model
Setting the step size (μ): the rate of convergence of the LMS algorithm is controlled by the step size μ. This is the critical variable.
Trace of the input to the model: input = signal + noise.
Trace of the LMS filter output: the output starts at zero and grows.
Trace of the LMS filter error: the error contains the noise.
Typical C6713 DSK Setup
[Figure: DSK board connected via USB to a PC, with a +5 V supply, headphones, and a microphone]
Adaptive Echo Cancellation
Acoustic Echo Canceller
New Trends in Adaptive Filtering
- Partial updating of weights
- Sub-band adaptive filtering
- Adaptive Kalman filtering
- The affine projection method
- Space-time adaptive processing
- Non-linear adaptive filtering: neural networks, the Volterra series algorithm, genetic & fuzzy methods
- Blind adaptive filtering
Thank You