[DetnEst] Assignment 4

KBC · October 25, 2024

Detection and Estimation


P1

  • Suppose that $X_1$ and $X_2$ are independent Poisson random variables, each with parameter $\lambda$
  • Let the parameter $\theta$ be
    $$\theta = e^{-\lambda}$$
  1. Show that $X_1 + X_2$ is a sufficient statistic for $\theta$
     (Assume $\lambda$ ranges over $(0, \infty)$)
  2. Show that $X_1 + X_2$ is also complete
     (Note that the sum of independent Poisson random variables is a Poisson random variable.
     In addition, if the sum is a power series and is zero, then all the coefficients are zero.)
  3. Define an estimate $\hat\theta$ by
     $$\hat\theta(\mathbf{x}) = \frac{1}{2}\left(f(x_1) + f(x_2)\right)$$
     where $f$ is defined by
     $$f(x) = \begin{cases} 1, & \text{if } x = 0 \\ 0, & \text{if } x \neq 0 \end{cases}$$
     Show that $\hat\theta$ is an unbiased estimate of $\theta$
  4. Find an MVUE of $\theta$

Solution 1

  • Let $Y = X_1 + X_2$

    $$\begin{aligned} p(X_1 = x_1, X_2 = x_2 \mid Y = y;\theta) &= \frac{p(X_1 = x_1, X_2 = x_2, Y = y;\theta)}{p(Y = y;\theta)} \\ &= \frac{p(X_1 = x_1, X_2 = x_2;\theta)\,\mathbb{1}[y = x_1 + x_2]}{p(Y = y;\theta)} \end{aligned}$$

    Since $Y$ is Poisson-distributed with parameter $2\lambda = -2\ln\theta$, for $x_1 + x_2 = y$ this becomes

    $$\frac{\dfrac{(-\ln\theta)^{x_1}\theta}{x_1!} \cdot \dfrac{(-\ln\theta)^{x_2}\theta}{x_2!}}{\dfrac{(-2\ln\theta)^{y}\theta^{2}}{y!}} = \left(\frac{1}{2}\right)^{x_1 + x_2} \frac{(x_1 + x_2)!}{x_1!\,x_2!}$$

    which does not depend on $\theta$, so $Y = X_1 + X_2$ is a sufficient statistic.
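A quick Monte Carlo sanity check (my own sketch, not part of the assignment, assuming NumPy is available): conditioned on $Y = y$, the count $X_1$ should follow the Binomial$(y, 1/2)$ law derived above, whose mean $y/2$ is the same for every $\lambda$ — exactly what sufficiency predicts.

```python
import numpy as np

rng = np.random.default_rng(0)

def cond_mean_given_y(lam, y=4, n=200_000):
    """Empirical E[X1 | X1 + X2 = y] for X1, X2 iid Poisson(lam)."""
    x1 = rng.poisson(lam, n)
    x2 = rng.poisson(lam, n)
    sel = (x1 + x2) == y
    return x1[sel].mean()

# Binomial(4, 1/2) has mean 2, regardless of lambda
for lam in (0.5, 1.0, 2.0):
    assert abs(cond_mean_given_y(lam) - 2.0) < 0.1
```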

Solution 2

$$\sum_{y=0}^{\infty} v(y)\,p(y;\theta) = \sum_{y=0}^{\infty} \frac{v(y)}{y!}\,(-2\ln\theta)^{y}\,\theta^{2} = 0 \quad \text{for all } \theta \in (0,1)$$

Dividing by $\theta^2$, the left side is a power series in $-2\ln\theta$ that vanishes identically, so every coefficient $v(y)/y!$ must be zero. Hence $v(y) = 0$ for all $y$, and $Y$ is complete.

Solution 3

$$\begin{aligned} E[\hat\theta(\mathbf{x})] &= E\left[\tfrac{1}{2}f(X_1) + \tfrac{1}{2}f(X_2)\right] = 2 \times \tfrac{1}{2}\,E[f(X)] \qquad \left(p(x;\theta) = \frac{(-\ln\theta)^{x}\theta}{x!}\right) \\ &= 1 \times p(X = 0;\theta) = \theta \qquad \text{(unbiased estimate)} \end{aligned}$$

Solution 4

Since $Y$ is sufficient and complete, by Rao–Blackwell $E\left[\tfrac{1}{2}(f(X_1) + f(X_2)) \mid Y\right]$ is the MVUE of $\theta$:

$$\begin{aligned} E\left[\tfrac{1}{2}(f(X_1) + f(X_2)) \mid Y = y\right] &= \tfrac{1}{2}\,P(X_1 = 0 \mid Y = y) + \tfrac{1}{2}\,P(X_2 = 0 \mid Y = y) \\ &= \tfrac{1}{2} \cdot \frac{P(X_1 = 0, X_2 = y)}{P(Y = y)} + \tfrac{1}{2} \cdot \frac{P(X_2 = 0, X_1 = y)}{P(Y = y)} \\ &= 2 \times \tfrac{1}{2} \cdot \frac{(-\ln\theta)^{y}\theta^{2}/y!}{(-2\ln\theta)^{y}\theta^{2}/y!} = \left(\frac{1}{2}\right)^{y} \end{aligned}$$

so the MVUE is $\hat\theta_{\text{MVUE}} = \left(\tfrac{1}{2}\right)^{Y} = 2^{-(X_1 + X_2)}$.


P2

  • Assume that $x[n]$ is the result of a Bernoulli trial (a coin toss) with
    $$\Pr\{x[n] = 1\} = \theta, \qquad \Pr\{x[n] = 0\} = 1 - \theta$$
  • And that $N$ IID observations have been made
  • Assuming that the Neyman–Fisher factorization theorem holds for discrete random variables
  • Find a sufficient statistic for $\theta$
  • Then, assuming completeness, find the MVUE for $\theta$

Solution

$$\begin{aligned} H &= \sum_{n=0}^{N-1} x[n] \quad\Rightarrow\quad p(\mathbf{x};\theta) = \theta^{H}(1-\theta)^{N-H} = \left(\frac{\theta}{1-\theta}\right)^{H}(1-\theta)^{N} \cdot 1 \\ &= g(H(\mathbf{x}), \theta)\,h(\mathbf{x}), \qquad h(\mathbf{x}) = 1 \end{aligned}$$

so $H = H(\mathbf{x})$ is a sufficient statistic. Since

$$E\left[\frac{H(\mathbf{x})}{N}\right] = \frac{1}{N} \cdot N\theta = \theta \quad \text{(unbiased)},$$

the MVUE is

$$\hat\theta = \frac{1}{N}\sum_{n=0}^{N-1} x[n].$$
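As a closing check (my own sketch assuming NumPy; $\theta = 0.3$ and $N = 20$ are arbitrary), the sample mean of $N$ Bernoulli trials is unbiased and its variance matches $\theta(1-\theta)/N$:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, N, trials = 0.3, 20, 100_000   # arbitrary test values

x = rng.random((trials, N)) < theta   # N iid Bernoulli(theta) per trial
theta_hat = x.mean(axis=1)            # H(x)/N for each trial

assert abs(theta_hat.mean() - theta) < 5e-3               # unbiased
assert abs(theta_hat.var() - theta*(1-theta)/N) < 1e-3    # var = theta(1-theta)/N
```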