App.A_Detection and estimation in additive Gaussian noise.pdf


  • 7/27/2019 App.A_Detection and estimation in additive Gaussian noise.pdf

    1/55

    Appendix A. Detection and Estimation in Additive Gaussian Noise

    Kyungchun Lee

    Dept. of EIE, SeoulTech

    2013 Fall

    Information and Communication Engineering


    Outline

    Gaussian random variables

    Detection in Gaussian noise

    Estimation in Gaussian noise (Brief introduction)



    [Review] Probability and Random Variables

    Random experiment

    On any trial of the experiment, the outcome is unpredictable. Over a large number of trials, however, the outcomes exhibit statistical regularity: a definite average pattern of outcomes

    E.g., tossing of a coin: possible outcomes are heads and tails

    Introduction to Analog & Digital Communications, S. Haykin and M. Moher


    [Review] Relative-Frequency Approach

    Suppose that in n trials of the experiment, event A occurs n_A times. Then we say that the relative frequency of the event A is n_A/n.

    The relative frequency is a nonnegative real number less than or equal to one:

    0 ≤ n_A/n ≤ 1

    The experiment exhibits statistical regularity if, for any sequence of n trials, the relative frequency n_A/n converges to a limit as n becomes large. We define this limit as the probability of event A:

    P[A] = lim_{n→∞} n_A/n
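    This limiting relative frequency can be sketched in a short simulation, assuming a fair coin (p = 0.5); the function name and parameters are illustrative:

    ```python
    import random

    def relative_frequency(n_trials, p=0.5, seed=0):
        """Estimate P[A] as n_A / n by simulating n tosses of a coin
        that shows heads with probability p."""
        rng = random.Random(seed)
        n_a = sum(1 for _ in range(n_trials) if rng.random() < p)
        return n_a / n_trials

    # Statistical regularity: the relative frequency settles near p as n grows.
    for n in (100, 10_000, 1_000_000):
        print(n, relative_frequency(n))
    ```
    
    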


    [Review] Sample Space

    With each possible outcome of the experiment, we associate a point called the sample point. The totality of all sample points is called the sample space S.

    The entire sample space S is called the sure event.

    The null set is called the null or impossible event.

    A single sample point is called an elementary event.



    [Review] Sample Space

    Example) Throw of a die

    Six possible outcomes: 1, 2, 3, 4, 5, 6. Sample space: {1, 2, 3, 4, 5, 6}

    The elementary event "a six shows" corresponds to the sample point {6}.

    The event "an even number shows" corresponds to the subset {2, 4, 6} of the sample space.


    [Review] A formal definition of probability

    A probability system consists of the triple:

    1. A sample space S of elementary events (outcomes).
    2. A class of events that are subsets of S.
    3. A probability measure P[A] assigned to each event A in the class, which has the following properties (axioms of probability):

    P[S] = 1

    0 ≤ P[A] ≤ 1

    If A ∪ B is the union of two mutually exclusive events in the class, then P[A ∪ B] = P[A] + P[B]


    [Review] Random Variables

    Random Variables

    A function whose domain is a sample space and whose range is a set of real numbers is called a random variable of the experiment.

    E.g., mapping a head to 1 and a tail to 0


    [Review] Random Variables

    If the outcome of the experiment is s, we denote the random variable as X(s) or just X. We denote a particular outcome of a random experiment by x; that is, X(s_k) = x.

    Random variables may be discrete and take only a finite number of values.

    Alternatively, random variables may be continuous.

    E.g., the amplitude of a noise voltage at a particular instant in time


    [Review] Probability Distribution Function

    Probability Distribution Function

    The probability that the random variable X takes any value less than or equal to x:

    F_X(x) = P[X ≤ x]    (8.7)

    More often called the Cumulative Distribution Function (CDF)

    Properties of the Probability Distribution Function

    The distribution function F_X(x) is bounded between zero and one.

    The distribution function is a monotone nondecreasing function of x; that is,

    F_X(x_1) ≤ F_X(x_2)  if  x_1 < x_2


    [Review] Probability density function

    If X is a continuous-valued random variable and F_X(x) is differentiable with respect to x, we can define the probability density function as

    f_X(x) = d F_X(x) / dx

    Properties of the Probability Density Function

    Since the distribution function is monotone nondecreasing, the density function is nonnegative for all values of x.

    The distribution function may be recovered from the density function by integration:

    F_X(x) = ∫_{−∞}^{x} f_X(s) ds

    The above property implies that the total area under the curve of the density function is unity.



    [Review] Independent

    The two random variables X and Y are statistically independent if the outcome of X does not affect the outcome of Y:

    P[X ∈ A, Y ∈ B] = P[X ∈ A] P[Y ∈ B]

    By setting A = (−∞, x], B = (−∞, y],

    F_{X,Y}(x, y) = F_X(x) F_Y(y)

    Simple notation: P[X, Y] = P[X] P[Y]


    [Review] Multiple Random Variables

    Joint Distribution Function

    The probability that the random variable X is less than or equal to a specified value x and that the random variable Y is less than or equal to a specified value y:

    F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]

    More often called the Joint Cumulative Distribution Function (Joint CDF)

    Suppose F_{X,Y}(x, y) is continuous everywhere; then we obtain the joint probability density function

    f_{X,Y}(x, y) = ∂² F_{X,Y}(x, y) / (∂x ∂y)

    Nonnegative everywhere.

    The total volume under the joint density is unity.



    [Review] Mean

    Mean (statistical average or expectation)

    For a discrete random variable X, the mean μ_X is the weighted sum of the possible outcomes:

    μ_X = E[X] = Σ_x x·P[X = x]

    For a continuous random variable, the analogous definition of the expected value is

    E[X] = ∫_{−∞}^{∞} x f_X(x) dx


    [Review] Covariance

    Covariance

    The covariance of two random variables X and Y:

    Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E[XY] − μ_X μ_Y = E[XY] − E[X]E[Y]

    For two continuous random variables,

    E[XY] = ∫∫ x·y f_{X,Y}(x, y) dx dy


    [Review] Covariance

    If the two random variables happen to be independent,

    E[XY] = ∫∫ x·y f_X(x) f_Y(y) dx dy = ( ∫ x f_X(x) dx )( ∫ y f_Y(y) dy ) = E[X]E[Y]

    The covariance of independent random variables is therefore zero.

    In this case, we say that the random variables are uncorrelated.

    However, zero covariance does not, in general, imply independence.
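    The last point can be sketched numerically. As a standard counterexample (not from the slides): let X be uniform on [−1, 1] and Y = X², so Y is fully determined by X, yet Cov(X, Y) = E[X³] − E[X]E[X²] = 0 by symmetry.

    ```python
    import random

    # Zero covariance does not imply independence:
    # Y = X**2 depends deterministically on X, but Cov(X, Y) = 0.
    rng = random.Random(1)
    n = 100_000
    xs = [rng.uniform(-1, 1) for _ in range(n)]
    ys = [x * x for x in xs]

    mean = lambda v: sum(v) / len(v)
    cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
    print(cov)  # close to 0 even though Y is a function of X
    ```
    
    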


    A.1 Gaussian random variables
    A.1.1 Scalar real Gaussian random variables

    A standard Gaussian random variable w ~ N(0, 1) has the density

    f(w) = (1/√(2π)) exp(−w²/2)

    A (general) Gaussian random variable x = σw + μ, written x ~ N(μ, σ²), has the density

    f(x) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²))


    Gaussian random variables

    Q-function

    The tail probability of the standard Gaussian random variable:

    Q(a) = P[w > a] = ∫_a^∞ (1/√(2π)) exp(−w²/2) dw
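    The Q-function has no closed form, but as a sketch it can be evaluated through the complementary error function, using the identity Q(a) = ½·erfc(a/√2):

    ```python
    import math

    def q_function(a):
        """Tail probability Q(a) = P[w > a] for standard Gaussian w,
        via the complementary error function: Q(a) = 0.5 * erfc(a / sqrt(2))."""
        return 0.5 * math.erfc(a / math.sqrt(2))

    print(q_function(0))  # 0.5: half the mass lies above the mean
    print(q_function(3))  # ~1.35e-3: the Gaussian tail decays rapidly
    ```
    
    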


    A.1.2 Gaussian random variables

    Linear combinations of independent Gaussian random variables are still Gaussian: if x_1, …, x_n are independent and x_i ~ N(μ_i, σ_i²), then

    c_1 x_1 + ⋯ + c_n x_n ~ N( Σ_i c_i μ_i, Σ_i c_i² σ_i² )    (A.6)


    Real Gaussian random vectors

    A standard Gaussian random vector w = (w_1, …, w_n)^t has i.i.d. N(0, 1) components, and its density is

    f(w) = (2π)^{−n/2} exp(−‖w‖²/2)

    where ‖w‖² = w_1² + ⋯ + w_n².


    Real Gaussian random vectors

    Property) An orthogonal transformation O (i.e., O^t O = I) preserves the magnitude of a vector. If w is standard Gaussian, then Ow is also standard Gaussian. (Isotropic property)
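    A minimal sketch of the magnitude-preservation part of this property, using a 2-D rotation as the orthogonal transformation (the angle is arbitrary):

    ```python
    import math, random

    # An orthogonal transformation preserves vector magnitudes, so Ow has
    # the same norm-dependent density as w. Here O is a rotation by theta.
    theta = 0.7
    O = [[math.cos(theta), -math.sin(theta)],
         [math.sin(theta),  math.cos(theta)]]

    rng = random.Random(0)
    w = [rng.gauss(0, 1), rng.gauss(0, 1)]
    Ow = [O[0][0] * w[0] + O[0][1] * w[1],
          O[1][0] * w[0] + O[1][1] * w[1]]

    norm = lambda v: math.hypot(v[0], v[1])
    print(norm(w), norm(Ow))  # identical magnitudes (up to float rounding)
    ```
    
    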


    Real Gaussian random vectors

    Gaussian random vector

    A linear transformation of a standard Gaussian random vector plus a constant vector:

    x = Aw + c    (A.10)

    where w is standard Gaussian.

    Property 1) A standard Gaussian random vector is also Gaussian (with A = I and c = 0).

    Property 2) Any linear combination of the elements of a Gaussian random vector is a Gaussian random variable. This directly follows from (A.6).


    Real Gaussian random vectors

    Property 3) If A is invertible, then the probability density function of x is

    f(x) = 1 / ((2π)^{n/2} |det A|) · exp( −(x − c)^t (AA^t)^{−1} (x − c) / 2 )

    (Proof omitted.)

    Covariance matrix of x: K = E[(x − c)(x − c)^t] = AA^t
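    The covariance relation K = AA^t can be checked empirically by sampling x = Aw + c; the particular A and c below are illustrative:

    ```python
    import random

    # For x = A w + c with w standard Gaussian, Cov(x) = A A^t.
    # Here A A^t = [[4, 2], [2, 10]].
    A = [[2.0, 0.0],
         [1.0, 3.0]]
    c = [1.0, -1.0]

    rng = random.Random(42)
    n = 100_000
    samples = []
    for _ in range(n):
        w = [rng.gauss(0, 1), rng.gauss(0, 1)]
        samples.append([A[0][0] * w[0] + A[0][1] * w[1] + c[0],
                        A[1][0] * w[0] + A[1][1] * w[1] + c[1]])

    mean = [sum(s[i] for s in samples) / n for i in range(2)]
    K_emp = [[sum((s[i] - mean[i]) * (s[j] - mean[j]) for s in samples) / n
              for j in range(2)] for i in range(2)]
    print(K_emp)  # close to [[4, 2], [2, 10]]
    ```
    
    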


    Real Gaussian random vectors

    A few inferences from Property 3

    Consider two matrices A and AO used to define two Gaussian random vectors as in (A.10). When O is orthogonal, the covariance matrices of both these random vectors are the same, equal to AA^t (since (AO)(AO)^t = AOO^tA^t = AA^t); so the two random vectors must be distributed identically.

    A Gaussian random vector is composed of independent Gaussian random variables exactly when the covariance matrix K is diagonal, i.e., when the component random variables are uncorrelated (zero covariance). Such a random vector is also called a white Gaussian random vector.


    A.1.3 Complex Gaussian random vectors

    Complex random vector

    x = x_R + j·x_I, where x_R and x_I are real random vectors.

    Complex Gaussian random vector

    x is complex Gaussian if [x_R^t, x_I^t]^t is a real Gaussian random vector.

    The mean and covariance of the complex random vector:

    μ = E[x],  K = E[(x − μ)(x − μ)*]

    (·)*: Hermitian transpose (transpose of a matrix with each element replaced by its complex conjugate)


    Complex Gaussian random vectors

    In wireless communication we are almost exclusively interested in complex random vectors that have the circular symmetry property:

    x is circular symmetric if e^{jθ}·x has the same distribution as x for any θ.

    For a circular symmetric random vector, E[x] = 0 and E[xx^t] = 0, so the covariance matrix K = E[xx*] fully specifies the first- and second-order statistics.

    A circular symmetric Gaussian random vector with covariance matrix K is denoted CN(0, K).


    Complex Gaussian random vectors

    Property 1) A complex Gaussian random variable w with independent and identically distributed (i.i.d.) zero-mean Gaussian real and imaginary components is circular symmetric.

    The phase of w is uniform over the range [0, 2π) and independent of the magnitude |w|, which has a Rayleigh distribution.
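    A sketch of this property, assuming CN(0, 1) samples are generated as (x + jy)/√2 with x, y i.i.d. N(0, 1): the average power is then 1, and the phase averages to zero because it is uniform on (−π, π].

    ```python
    import cmath, math, random

    # w = (x + jy)/sqrt(2) with x, y i.i.d. N(0, 1) gives CN(0, 1):
    # uniform phase, Rayleigh magnitude with E[|w|^2] = 1.
    rng = random.Random(7)
    n = 100_000
    ws = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) / math.sqrt(2)
          for _ in range(n)]

    mean_power = sum(abs(w) ** 2 for w in ws) / n
    mean_phase = sum(cmath.phase(w) for w in ws) / n
    print(mean_power)  # close to 1
    print(mean_phase)  # close to 0 (phase uniform on (-pi, pi])
    ```
    
    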


    Complex Gaussian random vectors

    A standard circular symmetric Gaussian random vector w, denoted CN(0, I), has the density function

    f(w) = π^{−n} exp(−‖w‖²)

    If w is CN(0, I) and A is a complex matrix, then x = Aw is also circular symmetric Gaussian, with covariance matrix K = AA*, i.e., x ~ CN(0, K).

    If A is invertible, the density function of x can be expressed as

    f(x) = 1 / (π^n det K) · exp(−x* K^{−1} x)


    Complex Gaussian random vectors

    Property 2) For a standard circular symmetric Gaussian random vector w, Uw is also standard circular symmetric Gaussian when U is a complex orthogonal matrix (called a unitary matrix, characterized by the property U*U = I). (Isotropic property)


    A.2 Detection in Gaussian noise
    A.2.1 Scalar detection

    The received signal

    y = u + w

    where w is real Gaussian noise, w ~ N(0, σ²), and u ∈ {u_A, u_B}.

    The detection problem

    Making a decision on whether u_A or u_B was transmitted, based on the observation y.

    What is the optimal detector?

    It provides the least probability of making an erroneous decision: it chooses the symbol that is most likely to have been transmitted given the received signal y, i.e., u_A is chosen if

    P[u = u_A | y] ≥ P[u = u_B | y]


    [Review] Conditional Probability

    Conditional Probability

    Example) Tossing two dice.
    X: the number showing on the first die
    Y: the sum of the two dice

    Conditional probability of Y given X

    The probability mass function of Y given that X has occurred:

    P[Y | X] = P[X, Y] / P[X]

    where P[X, Y] is the joint probability of the two random variables.

    Bayes' rule:

    P[X, Y] = P[Y | X]·P[X] = P[X | Y]·P[Y]

    P[Y | X] = P[X | Y]·P[Y] / P[X]


    [Review] Conditional Probability

    Example)

    A blood test is 95% effective in detecting a viral infection when it is, in fact, present:

    P[positive | I] = 0.95    (I: infection)

    However, the test also yields a false positive result for 1% of the healthy persons tested:

    P[positive | no I] = 0.01

    0.5% of the population has the infection.

    If a person tests positive, would you decide that he has the infection?


    [Review] Conditional Probability

    The probability that a person has the infection, given that his test result is positive:

    P[I | positive] = P[I, positive] / P[positive]
                    = P[positive | I]·P[I] / ( P[positive | I]·P[I] + P[positive | no I]·P[no I] )
                    = (0.95)(0.005) / ( (0.95)(0.005) + (0.01)(0.995) )
                    ≈ 0.323 < 0.5

    So even given a positive test, "no infection" remains the more likely hypothesis.
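    The Bayes-rule computation above can be reproduced in a few lines (the variable names are illustrative):

    ```python
    # Blood-test example: posterior probability of infection given a positive test.
    p_pos_given_inf = 0.95      # P[positive | infection]
    p_pos_given_healthy = 0.01  # P[positive | no infection] (false-positive rate)
    p_inf = 0.005               # prior P[infection]

    # Total probability of a positive test, then Bayes' rule.
    p_pos = p_pos_given_inf * p_inf + p_pos_given_healthy * (1 - p_inf)
    p_inf_given_pos = p_pos_given_inf * p_inf / p_pos
    print(round(p_inf_given_pos, 3))  # 0.323: below 0.5, so "no infection" is more likely
    ```
    
    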


    Scalar detection

    Since the two symbols u_A, u_B are equally likely to have been transmitted, Bayes' rule lets us simplify this to the maximum likelihood (ML) receiver:

    Choose the transmit symbol that makes the observation y most likely.

    ML decision rule

    We choose u_A if

    f(y | u_A) ≥ f(y | u_B)    (A.30)

    and u_B otherwise.


    Scalar detection

    The ML rule in (A.30) further simplifies to choosing the nearest neighboring transmit symbol: u_A is chosen if

    |y − u_A| ≤ |y − u_B|

    and u_B otherwise.
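    A minimal sketch of this nearest-neighbor rule for two illustrative symbols, with an empirical error rate that should land near Q(|u_A − u_B|/(2σ)) = Q(2) ≈ 0.0228 for the chosen noise level:

    ```python
    import random

    # Scalar ML detection of equally likely symbols in additive Gaussian noise:
    # decide for the symbol nearest the observation y.
    u_a, u_b = -1.0, 1.0
    sigma = 0.5

    def ml_detect(y):
        """Nearest-neighbor (ML) decision between u_a and u_b."""
        return u_a if abs(y - u_a) <= abs(y - u_b) else u_b

    rng = random.Random(0)
    n = 100_000
    errors = 0
    for _ in range(n):
        u = rng.choice((u_a, u_b))
        y = u + rng.gauss(0, sigma)
        if ml_detect(y) != u:
            errors += 1
    print(errors / n)  # close to Q(|u_a - u_b| / (2*sigma)) = Q(2)
    ```
    
    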


    Scalar detection

    The error probability

    p_e = Q( |u_A − u_B| / (2σ) )

    depends only on the distance between the two transmit symbols u_A and u_B, relative to the noise standard deviation σ.


    Detection in a vector space

    Consider detecting the transmit vector u, equally likely to be u_A or u_B (both elements of ℝ^n).

    Received vector

    y = u + w

    where w ~ N(0, σ²I).

    ML decision rule

    We choose u_A if

    ‖y − u_A‖ ≤ ‖y − u_B‖    (A.35)

    i.e., we choose the transmit vector nearest to y.



    Detection in a vector space

    Suppose u_A is transmitted, so y = u_A + w.

    Then an error occurs when the event in (A.35) does not occur, i.e., when y is closer to u_B than to u_A.

    Therefore, the error probability is

    p_e = P[ ‖y − u_B‖ < ‖y − u_A‖ ]

    Since only the noise component along the direction u_B − u_A affects this comparison, and that component is N(0, σ²), the error probability can be rewritten as

    p_e = Q( ‖u_A − u_B‖ / (2σ) )


    Detection in a vector space: An alternative view

    We can write the transmit vector as

    u = x·v + (u_A + u_B)/2

    where the information is in the scalar x = ±‖u_A − u_B‖/2 and v = (u_A − u_B)/‖u_A − u_B‖ is the unit signal direction.

    We observe that the transmit symbol (a scalar x) lies only in a specific direction v:

    The components of the received vector y in the directions orthogonal to v contain purely noise, irrelevant for detection.


    Detection in a vector space: An alternative view

    Projecting the received vector along the signal direction v provides all the necessary information for detection:

    ỹ = v^t y = x + v^t (u_A + u_B)/2 + v^t w

    After subtracting the known constant v^t (u_A + u_B)/2, ỹ is a sufficient statistic, obtained by a matched filter; the residual noise v^t w is N(0, σ²).
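    The projection receiver can be sketched as follows, for illustrative antipodal vectors u_A, u_B: project y on v and compare against the known constant v^t (u_A + u_B)/2, which implements the nearest-neighbor ML rule.

    ```python
    import math, random

    # Matched-filter (projection) detection in a real vector space.
    u_a = [1.0, 1.0]
    u_b = [-1.0, -1.0]

    diff = [a - b for a, b in zip(u_a, u_b)]
    norm = math.sqrt(sum(d * d for d in diff))
    v = [d / norm for d in diff]                       # unit signal direction
    threshold = sum(vi * (a + b) / 2                   # v^t (u_a + u_b)/2
                    for vi, a, b in zip(v, u_a, u_b))

    def detect(y):
        """Project y on v and compare to the threshold (ML decision)."""
        stat = sum(vi * yi for vi, yi in zip(v, y))
        return "A" if stat >= threshold else "B"

    rng = random.Random(3)
    y = [ua + rng.gauss(0, 0.3) for ua in u_a]  # u_A transmitted plus noise
    print(detect(y))
    ```
    
    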


    A.2.3 Detection in a complex vector space

    The transmit symbol u is equally likely to be one of two complex vectors u_A, u_B.

    The received signal model

    y = u + w

    where w is circular symmetric complex Gaussian noise.

    As in the real case, we write u = x·v + (u_A + u_B)/2, with x = ±‖u_A − u_B‖/2.

    The signal direction is v = (u_A − u_B)/‖u_A − u_B‖.


    Detection in a complex vector space

    Decision statistic

    Projecting along the signal direction gives ỹ = v*y = x + v*w (after removing the known constant v*(u_A + u_B)/2), where v*w is circular symmetric complex Gaussian.

    Since x is real, we can further extract a sufficient statistic by looking only at the real component:

    ℜ(ỹ) = x + ℜ(v*w)

    where the real noise ℜ(v*w) has half the variance of v*w.

    The error probability is again

    p_e = Q( ‖u_A − u_B‖ / (2σ) )

    with σ² the variance of the real noise component ℜ(v*w).


    A.3 Estimation in Gaussian noise
    A.3.1 Scalar estimation

    Consider a zero-mean real signal x embedded in independent additive real Gaussian noise:

    y = x + w,  w ~ N(0, σ²)

    We wish to come up with an estimate x̂ of x.

    Mean Squared Error (MSE):

    MSE = E[(x − x̂)²]


    Scalar estimation

    The estimate that yields the smallest MSE is the classical conditional mean:

    x̂ = E[x | y]

    Orthogonality principle: the estimation error x − x̂ is orthogonal to (uncorrelated with) any function of the observation:

    E[(x − E[x | y])·φ(y)] = 0 for every function φ(y)


    Linear estimator

    To simplify the analysis, one studies the restricted class of linear estimates that minimize the MSE. When x is a Gaussian random variable with zero mean, the conditional mean operator is actually linear.

    Assuming that x is zero mean, linear estimates are of the form x̂ = a·y.

    By the orthogonality principle, E[(x − a·y)·y] = 0, which gives

    a = E[xy] / E[y²] = σ_x² / (σ_x² + σ²)

    The corresponding minimum MSE (MMSE) is

    MMSE = σ_x² σ² / (σ_x² + σ²)
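    The linear MMSE estimator can be sketched numerically; the signal and noise variances below are illustrative, and the empirical MSE should match σ_x² σ² / (σ_x² + σ²):

    ```python
    import random

    # Scalar linear MMSE estimation: xhat = a*y for y = x + w,
    # x ~ N(0, var_x), w ~ N(0, var_w) independent.
    var_x, var_w = 4.0, 1.0
    a = var_x / (var_x + var_w)                    # MMSE coefficient: 0.8
    mmse_theory = var_x * var_w / (var_x + var_w)  # 0.8

    rng = random.Random(11)
    n = 100_000
    sq_err = 0.0
    for _ in range(n):
        x = rng.gauss(0, var_x ** 0.5)
        y = x + rng.gauss(0, var_w ** 0.5)
        sq_err += (x - a * y) ** 2
    print(sq_err / n)  # close to mmse_theory
    ```
    
    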

    Summary