The Likelihood Function - Introduction
Recall: a statistical model for some data is a set of distributions {f_θ : θ ∈ Ω}, one of which corresponds to the true unknown distribution that produced the data.
The distribution f_θ can be either a probability density function or a probability mass function.
The joint probability density function or probability mass function of iid random variables X1, …, Xn is
f_θ(x1, …, xn) = f_θ(x1) · f_θ(x2) ··· f_θ(xn) = ∏_{i=1}^n f_θ(xi).
week 4
The Likelihood Function
Let x1, …, xn be sample observations taken on corresponding random variables X1, …, Xn whose distribution depends on a parameter θ. The likelihood function, defined on the parameter space Ω, is given by
L(θ | x1, …, xn) = f_θ(x1, …, xn).
Note that for the likelihood function we are fixing the data, x1, …, xn, and varying the value of the parameter θ.
The value L(θ | x1, …, xn) is called the likelihood of θ. It is the probability of observing the data values we observed given that θ is the true value of the parameter. It is not the probability of θ given that we observed x1, …, xn.
Examples
Suppose we toss a coin n = 10 times and observe 4 heads. With no knowledge whatsoever about the probability of getting a head on a single toss, the appropriate statistical model for the data is the Binomial(10, θ) model. The likelihood function is given by
L(θ | 4) = C(10, 4) θ^4 (1 − θ)^6, for θ ∈ [0, 1].
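This likelihood can be evaluated numerically to see how the fixed data assign different likelihoods to different θ values. A minimal sketch (the function name and the θ values chosen are illustrative):

```python
from math import comb

def binom_likelihood(theta, n=10, x=4):
    """Likelihood of theta given x heads in n tosses: C(n, x) * theta^x * (1 - theta)^(n - x)."""
    return comb(n, x) * theta ** x * (1 - theta) ** (n - x)

# The data (4 heads in 10 tosses) stay fixed; theta varies over [0, 1].
for theta in (0.2, 0.4, 0.6):
    print(f"L({theta}) = {binom_likelihood(theta):.4f}")
```

Note that θ = 0.4, the observed proportion of heads, attaches a higher likelihood to the data than the other two values.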
Suppose X1, …, Xn is a random sample from an Exponential(θ) distribution with density f_θ(x) = θe^{−θx} for x > 0 (rate parameterization). The likelihood function is
L(θ | x1, …, xn) = ∏_{i=1}^n θe^{−θxi} = θ^n e^{−θ Σ xi}, for θ > 0.
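The exponential case can be sketched the same way, assuming the rate parameterization above (function name and sample values are illustrative):

```python
import math

def exp_likelihood(theta, xs):
    """L(theta | xs) = theta^n * exp(-theta * sum(xs)) for theta > 0."""
    return theta ** len(xs) * math.exp(-theta * sum(xs))

xs = [1.0, 2.0, 3.0]          # hypothetical observed sample, n = 3
for theta in (0.25, 0.5, 1.0):
    print(f"L({theta}) = {exp_likelihood(theta, xs):.6f}")
```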
Maximum Likelihood Estimators
In the likelihood function, different values of θ will attach different probabilities to a particular observed sample.
The likelihood function, L(θ | x1, …, xn), can be maximized over θ to give the parameter value that attaches the highest possible probability to the observed sample.
We can maximize the likelihood function to find an estimator of θ. This estimator is a statistic: it is a function of the sample data. It is denoted by θ̂.
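For the coin example above, the maximization can be sketched with a crude grid search (the grid resolution is arbitrary; for this model the MLE is known analytically to be x/n):

```python
from math import comb

def binom_likelihood(theta, n=10, x=4):
    return comb(n, x) * theta ** x * (1 - theta) ** (n - x)

# Grid search over the parameter space [0, 1]
grid = [i / 1000 for i in range(1001)]
theta_hat = max(grid, key=binom_likelihood)
print(theta_hat)  # 0.4, the sample proportion x/n
```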
The Log Likelihood Function
l(θ) = ln(L(θ)) is the log likelihood function.
Both the likelihood function and the log likelihood function have their maximums at the same value of θ, because ln is a strictly increasing function.
It is often easier to maximize l(θ).
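A sketch confirming that L and l peak at the same θ, again using the coin example (grid-based; names are illustrative):

```python
import math
from math import comb

def likelihood(theta, n=10, x=4):
    return comb(n, x) * theta ** x * (1 - theta) ** (n - x)

def log_likelihood(theta, n=10, x=4):
    # l(theta) = ln C(n, x) + x ln(theta) + (n - x) ln(1 - theta)
    return math.log(comb(n, x)) + x * math.log(theta) + (n - x) * math.log(1 - theta)

grid = [i / 1000 for i in range(1, 1000)]  # open interval (0, 1): avoid log(0)
print(max(grid, key=likelihood))           # 0.4
print(max(grid, key=log_likelihood))       # 0.4 -- same maximizer
```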
Examples
Properties of MLE
The MLE is invariant, i.e., the MLE of g(θ) is equal to the function g evaluated at the MLE of θ: the MLE of g(θ) is g(θ̂).
Proof:
Examples:
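One numerical illustration of invariance, using the coin example and the odds g(θ) = θ/(1 − θ) as the transformed parameter (all names are illustrative):

```python
from math import comb

def binom_likelihood(theta, n=10, x=4):
    return comb(n, x) * theta ** x * (1 - theta) ** (n - x)

grid = [i / 1000 for i in range(1001)]
theta_hat = max(grid, key=binom_likelihood)        # 0.4

# Re-parameterize in terms of the odds psi = theta / (1 - theta)
# and maximize the likelihood over psi directly.
def odds_likelihood(psi):
    theta = psi / (1 + psi)                        # inverse map g^{-1}
    return binom_likelihood(theta)

psi_grid = [i / 1000 for i in range(1, 5001)]
psi_hat = max(psi_grid, key=odds_likelihood)

# Invariance: maximizing over psi directly agrees (up to grid resolution)
# with simply transforming theta_hat: g(theta_hat) = 0.4 / 0.6 = 2/3.
print(psi_hat, theta_hat / (1 - theta_hat))
```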
Important Comment
Some MLEs cannot be determined using calculus. This occurs whenever the support of the distribution is a function of the parameter θ, so the likelihood is not differentiable at the point where it is maximized.
These are best solved by inspecting or graphing the likelihood function.
Example:
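The slide does not specify which example was intended; a standard instance is Uniform(0, θ), where the support [0, θ] depends on θ. The likelihood is θ^{−n} for θ ≥ max(xi) and 0 otherwise, so it jumps to its maximum at θ̂ = max(xi), a point calculus cannot find. A sketch with hypothetical data:

```python
def unif_likelihood(theta, xs):
    # f_theta(x) = 1/theta on [0, theta], so L(theta) = theta^(-n)
    # when theta >= max(xs), and 0 otherwise (the support depends on theta).
    if theta < max(xs):
        return 0.0
    return theta ** (-len(xs))

xs = [1.2, 0.7, 2.5, 1.9]                          # hypothetical sample
grid = [i / 100 for i in range(1, 501)]            # theta from 0.01 to 5.00
theta_hat = max(grid, key=lambda t: unif_likelihood(t, xs))
print(theta_hat)  # 2.5 = max(xs)
```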