8/19/2019 Chapter 3- Multiple Random Variables-Updated
CEN 343 Chapter 3: Multiple random variables
1. Vector Random Variables
• If two random variables X with value x and Y with value y are defined on a sample space S, then the random point (x, y) is a random vector in the xy plane.
• In the general case where N random variables X1, X2, …, XN are defined on a sample space S, they form an N-dimensional random vector (N-dimensional random variable).
2. Joint distribution
• Let A = {X ≤ x} and B = {Y ≤ y} be two events (X and Y two random variables) with probability distribution functions F_X(x) and F_Y(y), respectively:

F_X(x) = P(X ≤ x)    (1)
F_Y(y) = P(Y ≤ y)    (2)

• The probability of the joint event {X ≤ x, Y ≤ y}, which is a function of the numbers x and y, is called the joint probability distribution function: P{X ≤ x, Y ≤ y} = P(A ∩ B). It is given as follows:

F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y)    (3)
• If X and Y are two discrete random variables, where X has N possible values x_n and Y has M possible values y_m, then:

F_{X,Y}(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) u(x − x_n) u(y − y_m)    (4)

where P(x_n, y_m) is the probability of the joint event {X = x_n, Y = y_m} and u(·) is the unit step function.

• If X and Y are two continuous random variables, then:
F_{X,Y}(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_{X,Y}(ξ₁, ξ₂) dξ₁ dξ₂    (5)
Example 1-a:
Sol: a-
b-
Example 1-b:
Let X and Y be two discrete random variables. Let us assume that the joint space has only three
possible elements (1, 1), (2, 1) and (3, 3). The probabilities of these events are:
P(1, 1) = 0.2, P(2, 1) = 0.3, P(3, 3) = 0.5. Find and plot F_{X,Y}(x, y).
Solution:
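As a quick numerical sketch, Eq. (4) can be evaluated directly for the three-point pmf of this example (the function names below are illustrative, not from the notes):

```python
# Evaluate the discrete joint CDF of Eq. (4) for Example 1-b:
# F_{X,Y}(x, y) = sum_n sum_m P(x_n, y_m) u(x - x_n) u(y - y_m)
pmf = {(1, 1): 0.2, (2, 1): 0.3, (3, 3): 0.5}

def u(t):
    """Unit step: u(t) = 1 for t >= 0, else 0."""
    return 1.0 if t >= 0 else 0.0

def F_XY(x, y):
    """Joint distribution function built from the pmf via Eq. (4)."""
    return sum(p * u(x - xn) * u(y - ym) for (xn, ym), p in pmf.items())
```

For instance, F_XY(2, 1) sums the mass at (1, 1) and (2, 1), giving 0.5, and F_XY for any point at or beyond (3, 3) returns the full mass 1.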
Properties of the joint distribution F_{X,Y}(x, y):
1. F_{X,Y}(−∞, −∞) = F_{X,Y}(−∞, y) = F_{X,Y}(x, −∞) = 0
2. F_{X,Y}(∞, ∞) = 1
3. 0 ≤ F_{X,Y}(x, y) ≤ 1
4. P{x₁ < X ≤ x₂, y₁ < Y ≤ y₂} = F_{X,Y}(x₂, y₂) − F_{X,Y}(x₁, y₂) − F_{X,Y}(x₂, y₁) + F_{X,Y}(x₁, y₁) ≥ 0
5. F_{X,Y}(x, y) is a non-decreasing function of both x and y.

Marginal distributions:
• The marginal distribution function of one random variable is obtained by setting the other argument of the joint distribution to ∞:

F_X(x) = F_{X,Y}(x, ∞)
F_Y(y) = F_{X,Y}(∞, y)
Example 1-c
Suppose that we have two random variables X and Y with joint density function:
Sol:
=13/160
Example 2:
S = {(1, 1), (2, 1), (3, 3)}
P(1, 1) = 0.2, P(2, 1) = 0.3, P(3, 3) = 0.5. Find F_{X,Y}(x, y) and the marginal distributions F_X(x) and F_Y(y) of Example 1.
Solution:
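A minimal sketch of the marginals for this pmf, using F_X(x) = F_{X,Y}(x, ∞) and F_Y(y) = F_{X,Y}(∞, y), which for a discrete pmf reduce to summing all mass at or below the argument:

```python
# Marginal distribution functions of the three-point pmf of Example 2:
# F_X(x) sums all mass with x_n <= x; F_Y(y) sums all mass with y_m <= y.
pmf = {(1, 1): 0.2, (2, 1): 0.3, (3, 3): 0.5}

def F_X(x):
    return sum(p for (xn, _), p in pmf.items() if xn <= x)

def F_Y(y):
    return sum(p for (_, ym), p in pmf.items() if ym <= y)
```

F_X steps by 0.2, 0.3, 0.5 at x = 1, 2, 3, while F_Y jumps to 0.5 at y = 1 and to 1 at y = 3.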
3. Joint density
• For two continuous random variables X and Y, the joint probability density function is given by:

f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x ∂y    (6)
• If X and Y are two discrete random variables, then the joint probability density function is given by:

f_{X,Y}(x, y) = Σ_{n=1}^{N} Σ_{m=1}^{M} P(x_n, y_m) δ(x − x_n) δ(y − y_m)    (7)
Properties of the joint density function f_{X,Y}(x, y):
1. f_{X,Y}(x, y) ≥ 0
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1
3. F_{X,Y}(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_{X,Y}(ξ₁, ξ₂) dξ₁ dξ₂
4. F_X(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_{X,Y}(ξ, y) dy dξ,  F_Y(y) = ∫_{−∞}^{y} ∫_{−∞}^{∞} f_{X,Y}(x, ξ) dx dξ
5. P{x₁ < X ≤ x₂, y₁ < Y ≤ y₂} = ∫_{y₁}^{y₂} ∫_{x₁}^{x₂} f_{X,Y}(x, y) dx dy
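The normalization property and the marginal-density idea can be checked numerically. The separable density f(x, y) = e^{−x−y} for x, y ≥ 0 used here is a hypothetical example chosen for illustration, not one from the notes:

```python
import math

# Hypothetical joint density: f(x, y) = e^{-x} e^{-y} for x, y >= 0.
def f_xy(x, y):
    return math.exp(-x - y) if x >= 0 and y >= 0 else 0.0

def integrate2d(f, a, b, c, d, n=300):
    """Midpoint-rule double integral of f over [a, b] x [c, d]."""
    hx, hy = (b - a) / n, (d - c) / n
    return sum(f(a + (i + 0.5) * hx, c + (j + 0.5) * hy)
               for i in range(n) for j in range(n)) * hx * hy

# Normalization: the total mass should be ~1 (the tail beyond 25 is negligible).
total = integrate2d(f_xy, 0, 25, 0, 25)

# Marginal density at x = 1: integrate f(1, y) over y; should be ~e^{-1}.
h = 25 / 5000
f_X_at_1 = sum(f_xy(1.0, (j + 0.5) * h) for j in range(5000)) * h
```

For this separable density the marginal at x = 1 comes out as e^{−1}, matching the analytic factorization.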
In tabular form, the marginal distributions of X and Y are:
Example:
Note that g(x) stands for f_X(x) and h(y) for f_Y(y). By definition, we have:
Note that, we have:
Example:
• For discrete random variables:
Suppose we have:

f_Y(y) = Σ_{j=1}^{M} P(y_j) δ(y − y_j)    (12)

f_{X,Y}(x, y) = Σ_{i=1}^{N} Σ_{j=1}^{M} P(x_i, y_j) δ(x − x_i) δ(y − y_j)    (13)

Assume y = y_k is the specific value of y; then:

f_X(x | Y = y_k) = Σ_{i=1}^{N} [P(x_i, y_k) / P(y_k)] δ(x − x_i)    (14)

and

f_Y(y | X = x_k) = Σ_{j=1}^{M} [P(x_k, y_j) / P(x_k)] δ(y − y_j)    (15)

Example 4:
Let us consider the joint pdf: f_{X,Y}(x, y) = u(x) u(y) x e^{−x(y+1)}
Find f_Y(y|x) if the marginal pdf of X is given by: f_X(x) = u(x) e^{−x}
Solution:

f_Y(y|x) = f_{X,Y}(x, y) / f_X(x) = u(y) x e^{−xy}

f_{X,Y}(x, y) and f_X(x) are nonzero only for x > 0 and y > 0, so f_Y(y|x) is nonzero only for y > 0 (with x > 0); therefore we keep the unit step.

Calculate the probability P(Y ≤ a | X = x):

P(Y ≤ a | X = x) = ∫_0^a f_Y(y | X = x) dy = ∫_0^a x e^{−xy} dy = 1 − e^{−ax}
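A numeric check of this conditional density; the bound a = 2 and conditioning value x = 1 below are chosen only for illustration:

```python
import math

# Conditional density of Example 4: f_Y(y|x) = x e^{-x y} for y >= 0.
def f_y_given_x(y, x):
    return x * math.exp(-x * y) if y >= 0 else 0.0

def prob_Y_le(a, x, n=20000):
    """P(Y <= a | X = x) by midpoint-rule integration of f_Y(y|x)."""
    h = a / n
    return sum(f_y_given_x((i + 0.5) * h, x) for i in range(n)) * h

p = prob_Y_le(2.0, 1.0)  # closed form: 1 - e^{-2} ~ 0.8647
```

The numerical value agrees with the closed form 1 − e^{−ax} evaluated at a = 2, x = 1.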
Example 5:
What does the corresponding figure represent?
Find (| = 3)
Solution:
Example:
Example:
Sol:
4.2 Interval conditioning: the conditioning event is now B = {y_a < Y ≤ y_b}, with y_a < y_b and P(B) ≠ 0. Then:

F_X(x | y_a < Y ≤ y_b) = [F_{X,Y}(x, y_b) − F_{X,Y}(x, y_a)] / [F_Y(y_b) − F_Y(y_a)]    (16)

f_X(x | y_a < Y ≤ y_b) = ∫_{y_a}^{y_b} f_{X,Y}(x, y) dy / ∫_{y_a}^{y_b} f_Y(y) dy    (17)
Show that:

∫_{y_a}^{y_b} ∫_{−∞}^{x} f_{X,Y}(ξ, y) dξ dy = F_{X,Y}(x, y_b) − F_{X,Y}(x, y_a)
Example 6:
Let f_{X,Y}(x, y) = u(x) u(y) x e^{−x(y+1)}    (18)
with marginals f_X(x) = u(x) e^{−x} and f_Y(y) = u(y) / (y + 1)².
Find f_X(x | Y ≤ y_b)?
Solution 6:

Numerical application:
5. Statistical independence
• Note that if X and Y are independent, then the conditional densities reduce to the marginals:

f_X(x | y) = f_X(x)
f_Y(y | x) = f_Y(y)    (21)
Example:
Example 7:
In example 6, are X and Y independent?
Solution:
f_{X,Y}(x, y) = u(x) u(y) x e^{−x(y+1)} ≠ f_X(x) f_Y(y) ⟹ X and Y are not independent.
Example 8:
Let f_{X,Y}(x, y) = (1/12) u(x) u(y) e^{−x/4 − y/3}. Are X and Y independent?
Solution
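Assuming the density of Example 8 reads f_{X,Y}(x, y) = (1/12) e^{−x/4 − y/3} for x, y ≥ 0 (the exponents are a reconstruction; the source is partly garbled), independence follows because it factors into (1/4)e^{−x/4} times (1/3)e^{−y/3}. A quick numeric sketch of that factorization:

```python
import math

# Assumed joint density of Example 8 (exponents reconstructed, not verbatim):
def f_xy(x, y):
    return (1 / 12) * math.exp(-x / 4 - y / 3) if x >= 0 and y >= 0 else 0.0

# Candidate marginals; their product should equal f_xy at every point.
def f_x(x):
    return 0.25 * math.exp(-x / 4) if x >= 0 else 0.0

def f_y(y):
    return (1 / 3) * math.exp(-y / 3) if y >= 0 else 0.0

factorizes = all(abs(f_xy(x, y) - f_x(x) * f_y(y)) < 1e-12
                 for x in (0.0, 0.5, 2.0, 7.0)
                 for y in (0.0, 1.0, 3.0, 10.0))
```

Since the joint density equals the product of the two marginals everywhere, X and Y are independent under this reading.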
Example:
Sol:
6. Sum of two random variables
• Here the problem is to determine the probability density function of the sum of two independent random variables X and Y:

W = X + Y    (22)

• The resulting probability density function of W can be shown to be the convolution of the density functions of X and Y:

f_W(w) = f_X(w) ∗ f_Y(w) = ∫_{−∞}^{∞} f_X(w − y) f_Y(y) dy = ∫_{−∞}^{∞} f_Y(w − x) f_X(x) dx    (23)
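Eq. (23) has a discrete counterpart: the pmf of W = X + Y for independent X and Y is the convolution of the individual pmfs. A sketch with two hypothetical fair four-sided dice:

```python
# Discrete convolution: pmf of W = X + Y for independent X and Y.
px = {k: 0.25 for k in (1, 2, 3, 4)}   # fair 4-sided die
py = {k: 0.25 for k in (1, 2, 3, 4)}   # independent second die

pw = {}
for x, p in px.items():
    for y, q in py.items():
        pw[x + y] = pw.get(x + y, 0.0) + p * q
```

The result is the familiar triangular distribution on {2, …, 8}, peaking at pw[5] = 4/16.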
Proof:
Example 9:
Let us consider two independent random variables X and Y with the following pdfs:
f_X(x) = (1/a) [u(x) − u(x − a)]
f_Y(y) = (1/b) [u(y) − u(y − b)], where 0 < a < b
7. Central Limit Theorem
The probability distribution function of the sum of a large number of random variables
approaches a Gaussian distribution. More details will be given in Chapter 5.
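A simulation sketch of this statement (the sample sizes below are arbitrary): standardize the sum of N uniform variables and check that roughly 68% of outcomes fall within one standard deviation, as a Gaussian predicts.

```python
import random
random.seed(0)  # fixed seed so the run is reproducible

N, TRIALS = 30, 20000
MEAN1, VAR1 = 0.5, 1 / 12   # mean and variance of a single Uniform(0, 1)

within = 0
for _ in range(TRIALS):
    s = sum(random.random() for _ in range(N))
    z = (s - N * MEAN1) / (N * VAR1) ** 0.5   # standardized sum
    within += abs(z) <= 1

frac_1sigma = within / TRIALS   # Gaussian value: ~0.683
```

Even with N = 30 uniform summands, the fraction inside one sigma is already very close to the Gaussian value 0.683.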
Figure: f_W(w) for Example 9 is trapezoidal, rising over [0, a], flat over [a, b], and decreasing to zero at w = a + b.
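The trapezoidal density of Example 9 can be reproduced numerically; the constants a = 1, b = 2 below are hypothetical, chosen only to satisfy 0 < a < b:

```python
# Numerically convolve the two uniform densities of Example 9
# (hypothetical constants a = 1, b = 2) and sample the trapezoid.
a, b = 1.0, 2.0

def f_x(x):
    return 1 / a if 0 <= x <= a else 0.0

def f_y(y):
    return 1 / b if 0 <= y <= b else 0.0

def f_w(w, n=40000):
    """f_W(w) = integral of f_X(w - y) f_Y(y) dy, midpoint rule over [0, b]
    (the support of f_Y)."""
    h = b / n
    return sum(f_x(w - (i + 0.5) * h) * f_y((i + 0.5) * h) for i in range(n)) * h
```

Sampling f_w shows the plateau height 1/b on [a, b] and zero density beyond w = a + b.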
8. Expectations and Correlations
• If g(X, Y) is a function of two random variables X and Y with joint probability density function f_{X,Y}(x, y), then the expected value of g(X, Y) is given by:

E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f_{X,Y}(x, y) dx dy    (24)
Example 10
Let g(X, Y) = X + Y; find E[g(X, Y)]?
Solution:
• If we have n functions of the random variables, g₁(X, Y), g₂(X, Y), …, g_n(X, Y):

E[g₁(X, Y) + g₂(X, Y) + … + g_n(X, Y)] = E[g₁(X, Y)] + E[g₂(X, Y)] + … + E[g_n(X, Y)]
Which means that the expected value of the sum of the functions is equal to the sum of the
expected values of the functions.
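This linearity can be verified exactly on the small discrete pmf used earlier in the chapter:

```python
# E[X + Y] versus E[X] + E[Y] for the three-point pmf of Example 1-b.
pmf = {(1, 1): 0.2, (2, 1): 0.3, (3, 3): 0.5}

E_sum = sum(p * (x + y) for (x, y), p in pmf.items())
E_X = sum(p * x for (x, y), p in pmf.items())
E_Y = sum(p * y for (x, y), p in pmf.items())
```

Here E[X] = 2.3 and E[Y] = 2.0, and E[X + Y] indeed equals their sum, 4.3.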
8.1 Joint Moments about the origin
m_{nk} = E[Xⁿ Yᵏ] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xⁿ yᵏ f_{X,Y}(x, y) dx dy    (25)

We have the following properties:
m_{n0} = E[Xⁿ] are the moments of X
m_{0k} = E[Yᵏ] are the moments of Y
The sum n + k is called the order of the moments (m_{20}, m_{02} and m_{11} are called second-order moments).
m_{10} = E[X] and m_{01} = E[Y] are called first-order moments.
8.2 Correlation
• The second-order joint moment m_{11} is called the correlation of X and Y.
• The correlation is a very important statistic and is denoted R_{XY}:
R_{XY} = E[XY] = m_{11} = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f_{X,Y}(x, y) dx dy    (26)

• If R_{XY} = E[X] E[Y], then X and Y are said to be uncorrelated.
• If X and Y are independent, then f_{X,Y}(x, y) = f_X(x) f_Y(y) and

R_{XY} = ∫_{−∞}^{∞} x f_X(x) dx ∫_{−∞}^{∞} y f_Y(y) dy = E[X] E[Y]    (27)
• Therefore, if X and Y are independent, then they are uncorrelated.
• However, if X and Y are uncorrelated, they are not necessarily independent.
• If R_{XY} = 0, then X and Y are called orthogonal.
Example 11
Let E[X] = 3 and σ_X² = 2, and let Y = −6X + 22. Find R_{XY}. Are X and Y orthogonal?
Are X and Y uncorrelated?
Solution:
So X and Y are correlated.
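Assuming the linear relation in Example 11 is Y = −6X + 22 (the sign is unclear in the source), the moments follow directly from linearity of expectation:

```python
# Example 11, assuming Y = -6X + 22 with E[X] = 3 and var(X) = 2.
E_X, var_X = 3.0, 2.0
E_X2 = var_X + E_X ** 2            # E[X^2] = sigma_X^2 + E[X]^2 = 11

E_Y = -6 * E_X + 22                # E[Y] = 4
R_XY = -6 * E_X2 + 22 * E_X        # E[XY] = -6 E[X^2] + 22 E[X] = 0
C_XY = R_XY - E_X * E_Y            # covariance: 0 - 12 = -12
```

Under this reading, R_XY = 0 makes X and Y orthogonal, while R_XY ≠ E[X]E[Y] (equivalently C_XY ≠ 0) makes them correlated, illustrating that the two notions are distinct.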
Example:
Find the marginal pdf f_X(x), the conditional pdf f_{Y|X}(y|x), and E[XY].
Solution:
1. f_X(x) = (1/√(2π)) e^{−x²/2}
2. f_{Y|X}(y|x) = e^{−(y−x)}, for y ≥ x
3. E[XY] = ∫_{−∞}^{∞} x (x + 1) (1/√(2π)) e^{−x²/2} dx = 1
8.3 Joint central moments
• The joint central moment is defined as:

μ_{nk} = E[(X − X̄)ⁿ (Y − Ȳ)ᵏ] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − X̄)ⁿ (y − Ȳ)ᵏ f_{X,Y}(x, y) dx dy    (28)

We note that:
μ_{20} = E[(X − X̄)²] = σ_X² is the variance of X.
μ_{02} = E[(Y − Ȳ)²] = σ_Y² is the variance of Y.
8.4 Covariance
• The second-order joint central moment μ_{11} is called the covariance of X and Y and is denoted C_{XY}:

C_{XY} = μ_{11} = E[(X − X̄)(Y − Ȳ)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − X̄)(y − Ȳ) f_{X,Y}(x, y) dx dy    (29)

We have:

C_{XY} = E[XY] − X̄ Ȳ = R_{XY} − E[X] E[Y]    (30)

• If X and Y are uncorrelated (or independent), then C_{XY} = 0.
• If X and Y are orthogonal, then C_{XY} = −X̄ Ȳ = −E[X] E[Y].
• If X and Y are correlated, then the correlation coefficient ρ measures the degree of correlation between X and Y:

ρ = μ_{11} / √(μ_{20} μ_{02}) = C_{XY} / (σ_X σ_Y) = E[((X − X̄)/σ_X) ((Y − Ȳ)/σ_Y)]    (31)

It is important to note that: −1 ≤ ρ ≤ 1.
Example 12
Let W = X + Y; find σ_W² when X and Y are uncorrelated.
Solution:
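A Monte-Carlo sketch of the result σ_W² = σ_X² + σ_Y² for uncorrelated variables; the distributions and parameters below are arbitrary choices for illustration:

```python
import random
random.seed(1)  # reproducible run

N = 100000
xs = [random.gauss(0, 2) for _ in range(N)]  # sigma_X = 2
ys = [random.gauss(5, 3) for _ in range(N)]  # sigma_Y = 3, drawn independently

def var(v):
    m = sum(v) / len(v)
    return sum((t - m) ** 2 for t in v) / len(v)

var_w = var([x + y for x, y in zip(xs, ys)])  # should be near 4 + 9 = 13
```

The sample variance of W lands close to σ_X² + σ_Y² = 13, with no cross term because the draws are independent (hence uncorrelated).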
9. Joint characteristic functions
• The joint characteristic function of two random variables is defined by:

Φ_{X,Y}(ω₁, ω₂) = E[e^{jω₁X + jω₂Y}] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) e^{jω₁x + jω₂y} dx dy    (32)

where ω₁ and ω₂ are real numbers.
• By setting ω₂ = 0 (or ω₁ = 0), we obtain the marginal characteristic functions:

Φ_X(ω₁) = Φ_{X,Y}(ω₁, 0)
Φ_Y(ω₂) = Φ_{X,Y}(0, ω₂)    (33)
• The joint moments are obtained as :
m_{nk} = (−j)^{n+k} ∂^{n+k} Φ_{X,Y}(ω₁, ω₂) / (∂ω₁ⁿ ∂ω₂ᵏ) |_{ω₁=0, ω₂=0}    (34)
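Eq. (34) can be tried numerically. The joint characteristic function below, for X and Y independent Exponential(1) variables, is a hypothetical example: Φ(ω₁, ω₂) = 1/((1 − jω₁)(1 − jω₂)), so m₁₀ = E[X] should come out as 1.

```python
# Recover m_10 = E[X] from a joint characteristic function via Eq. (34),
# approximating the partial derivative at the origin by a central difference.
def phi(w1, w2):
    # CF of two independent Exponential(1) variables (hypothetical example).
    return 1 / ((1 - 1j * w1) * (1 - 1j * w2))

h = 1e-5
dphi_dw1 = (phi(h, 0) - phi(-h, 0)) / (2 * h)  # dPhi/dw1 at (0, 0)
m10 = ((-1j) * dphi_dw1).real                  # E[X]; exact value is 1
```

The imaginary derivative at the origin, multiplied by −j, lands on the first moment, in agreement with the exponential mean of 1.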
Example 13
Let us consider the joint characteristic function: Φ_{X,Y}(ω₁, ω₂) = e^{−ω₁² − ω₂²}
Find E[X] and E[Y].
Are X and Y uncorrelated?
Solution:
For ω₁ = ω₂ = ω, the joint characteristic function gives the characteristic function of Z = X + Y:

Φ_Z(ω) = E[e^{jω(X+Y)}] = Φ_{X,Y}(ω, ω)
Problem