P1
- Let $(X_0, X_1, \dots, X_{N-1})$ be a random sample of a Bernoulli random variable $X$ with probability mass function
$$f(x;p) = p^x (1-p)^{1-x}, \quad x \in \{0,1\},$$
where $0 \le p \le 1$ is unknown
- Find the maximum-likelihood estimator (MLE) of $p$
Solution
$$\bar{x} = [X_0, X_1, \dots, X_{N-1}]$$
$$\text{likelihood}(\bar{x};p) = \prod_{n=0}^{N-1} p^{X_n}(1-p)^{1-X_n}$$
$$\ln \text{likelihood}(\bar{x};p) = \sum_{n=0}^{N-1}\left(X_n \ln p + (1-X_n)\ln(1-p)\right)$$
- Let $Y = \sum_{n=0}^{N-1} X_n$, so that
$$\ln \text{likelihood}(\bar{x};p) = Y \ln p + (N-Y)\ln(1-p)$$
$$\frac{\partial \ln p(x;p)}{\partial p} = \frac{Y}{p} - \frac{N-Y}{1-p}$$
Setting the derivative to zero:
$$\frac{Y}{\hat{p}} - \frac{N-Y}{1-\hat{p}} = 0 \;\Rightarrow\; Y(1-\hat{p}) - (N-Y)\hat{p} = 0 \;\Rightarrow\; \hat{p} = \frac{Y}{N} = \frac{1}{N}\sum_{n=0}^{N-1} X_n$$
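The result can be checked numerically: for Bernoulli data the MLE is the sample mean, so the estimate should land near the true parameter. A minimal sketch (the true $p$ and sample size below are arbitrary choices, not from the problem):

```python
import random

# Illustrative check: the Bernoulli MLE p_hat = (1/N) * sum(X_n)
# should be close to the (assumed) true p for large N.
random.seed(0)
p_true = 0.3      # arbitrary choice for the demonstration
N = 100_000

samples = [1 if random.random() < p_true else 0 for _ in range(N)]
p_hat = sum(samples) / N  # MLE derived above
print(p_hat)
```

With this many samples the estimate typically falls within about $\pm 0.005$ of the true value.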
P2
- Let (X0,X1,…,XN−1) be a random sample of a binomial random variable X with parameters (n,p), where n is assumed to be known and p is unknown
- Determine the maximum-likelihood estimator (MLE) of p
- Show that the MLE of p is unbiased
Solution 1
$$\text{likelihood} = \prod_{i=0}^{N-1} \binom{n}{x_i} p^{x_i}(1-p)^{n-x_i} = \prod_{i=0}^{N-1} \frac{n!}{x_i!\,(n-x_i)!}\, p^{x_i}(1-p)^{n-x_i}$$
$$\text{log-likelihood} = \sum_{i=0}^{N-1}\left(\ln\frac{n!}{X_i!\,(n-X_i)!} + X_i \ln p + (n-X_i)\ln(1-p)\right)$$
$$\frac{\partial \ln p(x;p)}{\partial p} = \sum_{i=0}^{N-1}\left(\frac{X_i}{p} - \frac{n-X_i}{1-p}\right) = \frac{Y}{p} - \frac{Nn-Y}{1-p}, \quad Y = \sum_{i=0}^{N-1} X_i$$
Setting the derivative to zero:
$$\frac{Y}{\hat{p}} - \frac{Nn-Y}{1-\hat{p}} = 0 \;\Rightarrow\; Y(1-\hat{p}) - (Nn-Y)\hat{p} = 0 \;\Rightarrow\; \hat{p} = \frac{Y}{Nn} = \frac{1}{Nn}\sum_{i=0}^{N-1} X_i$$
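As a numerical sanity check of $\hat{p} = \frac{1}{Nn}\sum X_i$, one can simulate binomial draws and apply the formula; a sketch, with $n$, the true $p$, and $N$ chosen arbitrarily for illustration:

```python
import random

# Illustrative check: binomial MLE p_hat = sum(X_i) / (N * n).
random.seed(1)
n, p_true, N = 10, 0.4, 50_000  # arbitrary demonstration values

# Draw each binomial X_i as a sum of n Bernoulli trials.
samples = [sum(1 for _ in range(n) if random.random() < p_true)
           for _ in range(N)]

p_hat = sum(samples) / (N * n)  # MLE derived above
print(p_hat)
```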
Solution 2
$$E[\hat{p}] = E\left[\frac{1}{Nn}\sum_{i=0}^{N-1} X_i\right] = \frac{1}{Nn}\sum_{i=0}^{N-1} E[X_i] = \frac{1}{Nn}\cdot N \cdot np = p \quad \therefore \text{ unbiased}$$
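Unbiasedness can also be seen empirically: averaging the MLE over many independent experiments should approach the true $p$. A sketch with arbitrary illustrative parameters:

```python
import random

# Illustrative check of unbiasedness: the average of p_hat over many
# repeated experiments should converge to the (assumed) true p.
random.seed(2)
n, p_true, N, trials = 5, 0.25, 100, 1_000  # arbitrary demonstration values

estimates = []
for _ in range(trials):
    xs = [sum(1 for _ in range(n) if random.random() < p_true)
          for _ in range(N)]
    estimates.append(sum(xs) / (N * n))  # MLE for one experiment

mean_est = sum(estimates) / trials
print(mean_est)
```

Any single estimate fluctuates around $p$, but the average across experiments settles close to it, as the expectation computation predicts.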
P3
- We observe $N$ IID samples from the following PDFs:
- Gaussian
$$p(x;\mu) = \frac{1}{\sqrt{2\pi}} \exp\left[-\frac{1}{2}(x-\mu)^2\right]$$
- Exponential
$$p(x;\lambda) = \begin{cases} \lambda \exp(-\lambda x) & x > 0 \\ 0 & x < 0 \end{cases}$$
- In each case, find the MLE of the unknown parameter and verify that it indeed maximizes the likelihood function
- Do the estimators make sense?
Solution
- $p(x;\mu) = \frac{1}{\sqrt{2\pi}} \exp\left[-\frac{1}{2}(x-\mu)^2\right]$
- Log-Likelihood Function
$$\frac{N}{2}\log\left(\frac{1}{2\pi}\right) - \frac{1}{2}\sum_{i=0}^{N-1}(X_i-\mu)^2$$
- 1st Derivative of Log-Likelihood Function
$$\frac{\partial \ln p(x;\mu)}{\partial \mu} = \sum_{i=0}^{N-1}(X_i - \mu) = Y - N\mu, \quad Y = \sum_{i=0}^{N-1} X_i \;\Rightarrow\; \hat{\mu} = \frac{Y}{N}$$
- 2nd Derivative of Log-Likelihood Function
$$\frac{\partial^2 \ln p(x;\mu)}{\partial \mu^2} = -N$$
- Conclusion: the second derivative is negative, so $\hat{\mu}$ is a maximum
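The Gaussian result can be checked numerically: the MLE of $\mu$ is the sample mean, so it should track the true mean. A sketch, with the true $\mu$ and sample size chosen arbitrarily for illustration:

```python
import random

# Illustrative check: for unit-variance Gaussian samples, the MLE of
# the mean is the sample mean, mu_hat = Y / N.
random.seed(3)
mu_true, N = 1.5, 100_000  # arbitrary demonstration values

samples = [random.gauss(mu_true, 1.0) for _ in range(N)]
mu_hat = sum(samples) / N  # MLE derived above
print(mu_hat)
```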
- $p(x;\lambda) = \begin{cases} \lambda \exp(-\lambda x) & x > 0 \\ 0 & x < 0 \end{cases}$
- Log-Likelihood Function
$$\text{likelihood} = \prod_{i=0}^{N-1} \lambda \exp(-\lambda X_i)$$
$$\text{log-likelihood} = N \log\lambda - \lambda \sum_{i=0}^{N-1} X_i$$
- 1st Derivative of Log-likelihood Function
$$\frac{\partial \ln p(x;\lambda)}{\partial \lambda} = \frac{N}{\lambda} - \sum_{i=0}^{N-1} X_i \;\Rightarrow\; \hat{\lambda} = \frac{N}{\sum_{i=0}^{N-1} X_i}$$
- 2nd Derivative of Log-Likelihood Function
$$\frac{\partial^2 \ln p(x;\lambda)}{\partial \lambda^2} = -\frac{N}{\lambda^2}$$
The second derivative is negative, so $\hat{\lambda}$ is a maximum
- Exponential distribution
- Mean $= \frac{1}{\lambda}$
- Conclusion
$$\hat{\lambda} = \frac{N}{\sum_{i=0}^{N-1} X_i}, \qquad \text{mean} = \frac{1}{\lambda} \approx \frac{\sum X_i}{N} = \frac{1}{\hat{\lambda}}$$
- The MLE is the reciprocal of the sample mean; since the mean of the exponential distribution is $1/\lambda$, the estimator makes sense
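The exponential case can likewise be checked numerically: the MLE is the reciprocal of the sample mean. A sketch, with the true $\lambda$ and sample size chosen arbitrarily for illustration:

```python
import random

# Illustrative check: exponential MLE lambda_hat = N / sum(X_i),
# i.e. the reciprocal of the sample mean.
random.seed(4)
lam_true, N = 2.0, 100_000  # arbitrary demonstration values

samples = [random.expovariate(lam_true) for _ in range(N)]
lam_hat = N / sum(samples)  # MLE derived above
print(lam_hat)
```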