Lecture 9
Properties of Point Estimators and Methods of
Estimation
Relative efficiency: If we have two unbiased estimators of a
parameter θ, say θ̂₁ and θ̂₂, we say that θ̂₁ is relatively more efficient
than θ̂₂ if V(θ̂₁) < V(θ̂₂).
Definition: Given two unbiased estimators θ̂₁ and θ̂₂ of θ, the
efficiency of θ̂₁ relative to θ̂₂, denoted eff(θ̂₁, θ̂₂), is given by
eff(θ̂₁, θ̂₂) = V(θ̂₂) / V(θ̂₁).
Example: Let Y₁, Y₂, ..., Yₙ be a random sample of size n from a
population with mean µ and variance σ². Consider the estimators
θ̂₁ = …,
θ̂₂ = …,
θ̂₃ = ….
Find eff(θ̂₁, θ̂₂) and eff(θ̂₁, θ̂₃).
Solution:
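The concrete estimator definitions are not legible in this copy, so as an illustrative stand-in the sketch below compares two unbiased estimators of µ: θ̂₁ = Y₁ (a single observation, variance σ²) and θ̂₂ = Ȳ (the sample mean, variance σ²/n), for which eff(θ̂₁, θ̂₂) = V(θ̂₂)/V(θ̂₁) = 1/n.

```python
import random

# Illustrative simulation (the estimators here are assumptions, not from the
# source): thetahat1 = Y1 and thetahat2 = Ybar, both unbiased for mu.
random.seed(0)
n, reps = 10, 20000
mu, sigma = 5.0, 2.0

est1, est2 = [], []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    est1.append(sample[0])          # thetahat1 = Y1
    est2.append(sum(sample) / n)    # thetahat2 = Ybar

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

eff = var(est2) / var(est1)   # should be near 1/n = 0.1
print(eff)
```

With n = 10 the simulated efficiency comes out close to 1/10, matching the formula.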
Consistency: We toss a coin n times. The probability of having
heads is p. Tosses are independent. Let Y = # of heads.
Definition: An estimator θ̂ₙ is a consistent estimator of θ if, for every ε > 0,
lim(n→∞) P(|θ̂ₙ − θ| ≤ ε) = 1, i.e., if θ̂ₙ converges in probability to θ.
Theorem: An unbiased estimator θ̂ₙ for θ is consistent if
lim(n→∞) V(θ̂ₙ) = 0.
Proof: omitted.
Example: Let Y₁, Y₂, ..., Yₙ be a random sample of size n from a
population with mean µ and variance σ². Show that
Ȳₙ = (1/n) Σᵢ₌₁ⁿ Yᵢ
is a consistent estimator of µ.
Solution:
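The claim can also be checked by simulation: for a fixed ε, the fraction of runs in which Ȳₙ misses µ by more than ε shrinks toward 0 as n grows. A minimal sketch (the normal population and the specific values of µ, σ, ε are assumptions for illustration):

```python
import random
random.seed(1)

# Consistency in action: estimate P(|Ybar_n - mu| > eps) by Monte Carlo
# for increasing n; the miss rate should decrease toward 0.
mu, sigma, eps, reps = 3.0, 4.0, 0.5, 2000
rates = []
for n in (10, 100, 1000):
    misses = sum(
        abs(sum(random.gauss(mu, sigma) for _ in range(n)) / n - mu) > eps
        for _ in range(reps)
    )
    rates.append(misses / reps)
print(rates)  # decreasing; last entry near 0
```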
Sufficiency:
Example: Consider the outcomes of n trials of a binomial
experiment, Y₁, Y₂, ..., Yₙ.
Definition: Let Y₁, Y₂, ..., Yₙ denote a random sample from a
probability distribution with unknown parameter θ. Then the
statistic U = g(Y₁, Y₂, ..., Yₙ) is sufficient for θ if the conditional
distribution of Y₁, Y₂, ..., Yₙ, given U, does not depend on θ.
How to find it?
Definition: Let y₁, y₂, ..., yₙ be sample observations taken on
the corresponding random variables Y₁, Y₂, ..., Yₙ whose distribution
depends on θ. Then, if Y₁, Y₂, ..., Yₙ are discrete (continuous)
random variables, the likelihood of the sample,
L(y₁, y₂, ..., yₙ | θ), is defined to be the joint probability (density)
function of Y₁, Y₂, ..., Yₙ.
Theorem (Factorization Criterion): Let U be a statistic based on the
random sample Y₁, Y₂, ..., Yₙ. Then U is a sufficient statistic for the
estimation of θ if and only if the likelihood factors as
L(y₁, y₂, ..., yₙ | θ) = g(u, θ) · h(y₁, y₂, ..., yₙ),
where g(u, θ) is a function only of u and θ, and h(y₁, y₂, ..., yₙ) does not depend on θ.
Proof: omitted.
Example: Let Y₁, Y₂, ..., Yₙ be a random sample of Bernoulli trials, with
p(y | p) = p,      y = 1,
p(y | p) = 1 − p,  y = 0.
Show that U = Σᵢ₌₁ⁿ Yᵢ is a sufficient statistic for p.
Solution:
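For Bernoulli trials the definition can be verified exactly (a numerical check, not the textbook derivation): the conditional probability of a particular sequence given U = Σyᵢ = u is p^u(1−p)^(n−u) / [C(n,u) p^u(1−p)^(n−u)] = 1/C(n,u), which is free of p.

```python
from math import comb

# Conditional probability of a specific Bernoulli sequence given its sum:
# joint pmf divided by the pmf of U = sum(Y_i) ~ Binomial(n, p).
def cond_prob(ys, p):
    n, u = len(ys), sum(ys)
    joint = p**u * (1 - p)**(n - u)               # P(Y1=y1, ..., Yn=yn)
    marginal = comb(n, u) * p**u * (1 - p)**(n - u)  # P(U = u)
    return joint / marginal

seq = [1, 0, 1, 1, 0]
# Same answer, 1/C(5,3) = 0.1, for very different values of p:
print(cond_prob(seq, 0.2), cond_prob(seq, 0.9))
```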
Example (#9.49): Let Y₁, Y₂, ..., Yₙ be a random sample from
U(0, θ). Show that Y₍ₙ₎ = max(Y₁, Y₂, ..., Yₙ) is sufficient for
θ.
Solution:
How to find estimators?
There are two main methods for finding estimators:
1) Method of moments.
2) The method of Maximum likelihood.
Method of Moments (MoM)
The method of moments is a very simple procedure for finding an
estimator for one or more parameters of a statistical model. It is
one of the oldest methods for deriving point estimators.
Recall: the k-th moment of a random variable Y is µ′ₖ = E(Y^k).
The corresponding k-th sample moment is m′ₖ = (1/n) Σᵢ₌₁ⁿ Yᵢ^k.
The estimators based on the method of moments are the solutions
of the equations µ′ₖ = m′ₖ, k = 1, 2, ..., t, where t is the number of parameters to be estimated.
Example: Let Y₁, Y₂, ..., Yₙ be a random sample from a distribution with unknown parameter θ. Use MoM to estimate θ.
Solution:
Example: Let Y₁, Y₂, ..., Yₙ be a random sample from a distribution with mean µ and variance σ². Find MoM estimators of µ
and σ².
Solution:
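A minimal sketch of the two-parameter case, assuming for illustration that the Yᵢ are normal with mean µ and variance σ² (the specific model is an assumption): equating the first two moments, µ = m′₁ and σ² + µ² = m′₂, gives µ̂ = m′₁ and σ̂² = m′₂ − (m′₁)².

```python
import random
random.seed(2)

# MoM for (mu, sigma^2), assuming Y_i ~ N(10, 9) as a worked illustration:
# solve mu = m1 and sigma^2 + mu^2 = m2 for the two parameters.
data = [random.gauss(10.0, 3.0) for _ in range(5000)]
m1 = sum(data) / len(data)                 # first sample moment
m2 = sum(x * x for x in data) / len(data)  # second sample moment
mu_mom = m1
var_mom = m2 - m1 ** 2
print(mu_mom, var_mom)  # close to the true values 10 and 9
```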
Maximum Likelihood Estimators (MLEs)
Suppose the likelihood function depends on k parameters
θ₁, θ₂, ..., θₖ. Choose as estimates those values of the parameters that
maximize the likelihood L(y₁, y₂, ..., yₙ | θ₁, θ₂, ..., θₖ).
l(θ) = ln L(θ) is the log-likelihood function. Both the likelihood
function and the log-likelihood function attain their maxima at
the same value of θ, and it is often easier to maximize l(θ).
Example: A binomial experiment consisting of n trials resulted in
observations y₁, y₂, ..., yₙ, where yᵢ = 1 if the i-th trial was a success
and yᵢ = 0 otherwise. Find the MLE of p, the probability of a
success.
Solution:
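The closed-form answer is p̂ = Σyᵢ/n. A quick numerical check on made-up trial outcomes: maximizing l(p) = (Σyᵢ) ln p + (n − Σyᵢ) ln(1 − p) over a fine grid lands on the same value.

```python
from math import log

x = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # made-up outcomes of n = 10 trials
n, s = len(x), sum(x)

def loglik(p):
    # log-likelihood of independent Bernoulli(p) observations
    return s * log(p) + (n - s) * log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]
p_grid = max(grid, key=loglik)
print(p_grid, s / n)  # grid maximizer agrees with the closed form 0.7
```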
Example: Let Y₁, Y₂, ..., Yₙ ~ N(µ, σ²). Find the MLEs of µ and σ².
Solution:
More Examples...
Example 1: Suppose that X is a discrete random variable with the
following probability mass function:
X      0      1      2          3
P(X)   2θ/3   θ/3    2(1−θ)/3   (1−θ)/3
where 0 ≤ θ ≤ 1 is a parameter. The following 10 independent
observations
3, 0, 2, 1, 3, 2, 1, 0, 2, 1
were taken from such a distribution. What is the maximum
likelihood estimate of θ?
Solution:
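Assuming the pmf P(0) = 2θ/3, P(1) = θ/3, P(2) = 2(1−θ)/3, P(3) = (1−θ)/3 (the standard version of this exercise), each observation of 0 or 1 contributes a factor θ and each 2 or 3 a factor (1−θ), so the likelihood for these data is proportional to θ⁵(1−θ)⁵ and the MLE is θ̂ = 1/2. A grid search confirms:

```python
from math import log

data = [3, 0, 2, 1, 3, 2, 1, 0, 2, 1]

def loglik(theta):
    # pmf assumed as in the table above
    probs = {0: 2 * theta / 3, 1: theta / 3,
             2: 2 * (1 - theta) / 3, 3: (1 - theta) / 3}
    return sum(log(probs[x]) for x in data)

grid = [i / 1000 for i in range(1, 1000)]
theta_hat = max(grid, key=loglik)
print(theta_hat)  # 0.5, since L(θ) ∝ θ^5 (1-θ)^5
```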
Example 2: The Pareto distribution has probability density
function
f(x) = θ α^θ / x^(θ+1),  for x ≥ α, θ > 1,
where α and θ are positive parameters of the distribution. Assume
that α is known and that X₁, X₂, ..., Xₙ is a random sample of size n.
a) Find the method of moments estimator for θ.
b) Find the maximum likelihood estimator for θ. Does this
estimator differ from that found in part (a)?
c) Estimate θ based on these data:
3, 5, 2, 3, 4, 1, 4, 3, 3, 3.
Solution:
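A sketch for part (c). The known value of α is not stated in this copy, so the code assumes α = 1 (the smallest observation in the data); the standard answers are the MoM estimator θ̃ = x̄/(x̄ − α), from E[X] = θα/(θ − 1), and the MLE θ̂ = n / Σ ln(xᵢ/α).

```python
from math import log

data = [3, 5, 2, 3, 4, 1, 4, 3, 3, 3]
alpha = 1.0                      # assumed value of the known parameter
xbar = sum(data) / len(data)     # sample mean, 3.1 for these data

theta_mom = xbar / (xbar - alpha)                       # MoM estimate
theta_mle = len(data) / sum(log(x / alpha) for x in data)  # MLE
print(theta_mom, theta_mle)
```

Note that the two estimates differ noticeably here, which is the point of part (b): MoM and MLE need not coincide.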
Example 3: Suppose that X₁, ..., Xₙ form a random sample from a
uniform distribution on the interval (0, θ), where the parameter θ > 0
is unknown. Find the MLE of θ.
Solution:
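The likelihood is L(θ) = θ^(−n) when θ ≥ max(xᵢ) and 0 otherwise, so it is maximized at the smallest admissible value: θ̂ = max(X₁, ..., Xₙ). A small simulation sketch (the true θ = 7 is an assumption for illustration):

```python
import random
random.seed(3)

theta = 7.0
sample = [random.uniform(0, theta) for _ in range(50)]
theta_hat = max(sample)   # MLE for U(0, theta)
# The MLE can never exceed the true theta, and approaches it from below.
print(theta_hat <= theta, theta_hat)
```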
Example 4: Suppose that X₁, ..., Xₙ form a random sample from a
uniform distribution on the interval (θ, θ + 1), where the value of
the parameter θ is unknown. What is the MLE for
θ?
Solution:
Example 5: Let X₁, ..., Xₙ be an i.i.d. collection of Poisson(λ)
random variables, λ > 0. Find the MLE for λ.
Solution:
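The log-likelihood is l(λ) = −nλ + (Σxᵢ) ln λ − Σ ln(xᵢ!), and setting l′(λ) = 0 gives λ̂ = x̄. A grid check on made-up counts:

```python
from math import log, factorial

x = [2, 0, 3, 1, 1, 4, 2, 2, 0, 5]   # made-up Poisson counts
n, s = len(x), sum(x)

def loglik(lam):
    # Poisson log-likelihood, including the constant factorial term
    return -n * lam + s * log(lam) - sum(log(factorial(k)) for k in x)

grid = [i / 100 for i in range(1, 1001)]
lam_grid = max(grid, key=loglik)
print(lam_grid, s / n)  # grid maximizer agrees with xbar = 2.0
```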
Example 6: Let Y₁, ..., Yₙ be a random sample from a geometric
distribution with
p(y) = p(1 − p)^(y−1),  y = 1, 2, 3, ....
Find the MLE of p.
Solution:
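Assuming the pmf p(y) = p(1 − p)^(y−1), the log-likelihood is l(p) = n ln p + (Σyᵢ − n) ln(1 − p), which is maximized at p̂ = n / Σyᵢ = 1/ȳ. A grid check on made-up data:

```python
from math import log

y = [1, 3, 2, 1, 5, 2, 1, 1, 4, 2]   # made-up counts of trials to first success
n, s = len(y), sum(y)

def loglik(p):
    # geometric log-likelihood with support y = 1, 2, 3, ...
    return n * log(p) + (s - n) * log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]
p_grid = max(grid, key=loglik)
print(p_grid, n / s)  # grid maximizer close to 1/ybar
```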