[DetnEst] Assignment 2

KBC · October 25, 2024

Detection and Estimation


P1

  • Let $X_0, X_1, X_2,\dots,X_{N-1}$ be a random sample of an exponential random variable $X$ with an unknown parameter $\alpha$, corresponding to the mean of $X$:
    $$f_X(x;\alpha)=\frac{1}{\alpha}e^{-x/\alpha}$$
    When $\alpha$ is to be estimated, find the Cramer-Rao lower bound (CRLB).

Solution

$$\text{CRLB}=\frac{1}{I(\alpha)}$$
  • Log-Likelihood
    $$\ln p(\mathbf{x};\alpha)=\sum_{i=0}^{N-1}\ln f_X(X_i;\alpha)=\sum_{i=0}^{N-1}\ln\left(\frac{1}{\alpha}e^{-X_i/\alpha}\right)=\sum_{i=0}^{N-1}\left(-\ln\alpha-\frac{X_i}{\alpha}\right)=-N\ln\alpha-\frac{1}{\alpha}\sum_{i=0}^{N-1} X_i$$
  • 1st Derivative of Log-Likelihood
    $$\frac{\partial\ln p(\mathbf{x};\alpha)}{\partial\alpha}=-\frac{N}{\alpha}+\frac{1}{\alpha^2}\sum_{i=0}^{N-1}X_i$$
  • Fisher Information
    $$I(\alpha)=-E\left[\frac{\partial^2 \ln p(\mathbf{x};\alpha)}{\partial\alpha^2}\right],\qquad \frac{\partial^2\ln p(\mathbf{x};\alpha)}{\partial\alpha^2}=\frac{N}{\alpha^2}-\frac{2}{\alpha^3}\sum_{i=0}^{N-1} X_i$$
  • Since $E[X_i]=\alpha$,
    $$I(\alpha)=\frac{N}{\alpha^2}\qquad\therefore\ \text{CRLB}=\frac{\alpha^2}{N}$$
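As a quick numerical sanity check of this bound (the values $\alpha=2$, $N=50$, and the trial count are arbitrary choices of mine, not part of the problem): for exponential data the sample mean is the MLE of $\alpha$, and its variance should sit right at the CRLB $\alpha^2/N$.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, N, trials = 2.0, 50, 20000

# Each row is one random sample of size N; the row mean estimates alpha.
samples = rng.exponential(scale=alpha, size=(trials, N))
est = samples.mean(axis=1)

crlb = alpha**2 / N                 # = 0.08 for these values
print(est.var(), crlb)              # empirical variance ≈ CRLB
```

The printed empirical variance should agree with $\alpha^2/N$ up to Monte Carlo error.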


P2

  • Let $Y_0,Y_1,Y_2,\dots,Y_{N-1}$ be a random sample of Gaussian random variables with mean $\alpha+\beta x_i$ and variance $1$, where the constants $x_0,x_1,\dots,x_{N-1}$ are known, whereas $\alpha$ and $\beta$ are unknown parameters. Derive the Fisher information matrix for the CRLB of $\theta=[\alpha\;\beta]^T$.

Solution

  • PDF

    • mean of $Y_i$ : $\alpha + \beta x_i$
    • variance of $Y_i$ : $1$
      $$f_Y(y_i;\alpha,\beta)=\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{1}{2}(y_i-\alpha-\beta x_i)^2\right)$$
  • Log-Likelihood Function

    $$\ln p(\mathbf{y};\alpha,\beta)=\sum_{i=0}^{N-1}\ln f_Y(y_i;\alpha,\beta)=-\frac{N}{2}\ln(2\pi)-\frac{1}{2}\sum_{i=0}^{N-1}(y_i-\alpha-\beta x_i)^2$$
  • 1st Derivative with respect to $\alpha$

    $$\frac{\partial \ln p(\mathbf{y};\alpha,\beta)}{\partial\alpha}=\sum_{i=0}^{N-1}(y_i-\alpha-\beta x_i)$$
  • 1st Derivative with respect to $\beta$

    $$\frac{\partial \ln p(\mathbf{y};\alpha, \beta)}{\partial\beta}=\sum_{i=0}^{N-1}(y_i-\alpha-\beta x_i)x_i$$
  • 2nd Derivative with respect to $\alpha$

    $$\frac{\partial^2 \ln p(\mathbf{y};\alpha,\beta)}{\partial\alpha^2}=-N$$
  • 2nd Derivative with respect to $\beta$

    $$\frac{\partial^2 \ln p(\mathbf{y};\alpha,\beta)}{\partial\beta^2}=-\sum_{i=0}^{N-1}x_i^2$$
  • Mixed 2nd Derivative in $\alpha$, $\beta$

    $$\frac{\partial^2 \ln p(\mathbf{y};\alpha, \beta)}{\partial\alpha\,\partial\beta}=-\sum_{i=0}^{N-1}x_i$$
  • Fisher Information Matrix: since $I(\theta)=-E\left[\partial^2\ln p/\partial\theta\,\partial\theta^T\right]$, taking the negative expectation flips the signs of the second derivatives, giving

    $$I(\theta) = \begin{bmatrix} N & \sum_{i=0}^{N-1} x_i \\ \sum_{i=0}^{N-1} x_i & \sum_{i=0}^{N-1} x_i^2 \end{bmatrix}$$
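A numerical check of this matrix (the choices $N=30$, $\alpha=1$, $\beta=0.5$, and $x_i$ on a uniform grid are mine): with $I(\theta)=-E[\partial^2\ln p/\partial\theta\,\partial\theta^T]$ the entries are positive, and since the least-squares estimate is the MLE here, its Monte Carlo covariance should match $I(\theta)^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(1)
N, alpha, beta = 30, 1.0, 0.5        # assumed example values
x = np.linspace(0.0, 1.0, N)         # known constants x_i

# Fisher information matrix and CRLB matrix for theta = [alpha, beta]
I = np.array([[N, x.sum()], [x.sum(), (x**2).sum()]])
crlb = np.linalg.inv(I)

# Monte Carlo: least-squares (= ML, unit variance) estimates of theta
trials = 20000
H = np.column_stack([np.ones(N), x])
Y = alpha + beta * x + rng.standard_normal((trials, N))
theta_hat = Y @ H @ np.linalg.inv(H.T @ H)   # one (alpha, beta) per row
print(np.cov(theta_hat.T))                   # ≈ crlb
```

Note that with unit noise variance $I(\theta)=H^TH$, so the match is exact up to sampling error.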


P3

The data $x[n]=Ar^n +w[n]$ for $n=0,1,\dots,N-1$ are observed, where $w[n]$ is WGN with variance $\sigma^2$ and $r>0$ is known. Find the CRLB for $A$. Show that an efficient estimator exists and find its variance. What happens to the variance as $N\rightarrow\infty$ for various values of $r$?


Solution

  • PDF

    $$f(x[n];A)=\frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{(x[n]-Ar^n)^2}{2\sigma^2}\right)$$
  • Log-Likelihood Function

    $$\ln p(\mathbf{x};A)=\sum_{n=0}^{N-1}\ln f(x[n];A)=-\frac{N}{2}\ln(2\pi\sigma^2)-\frac{1}{2\sigma^2}\sum_{n=0}^{N-1}(x[n]-Ar^n)^2$$
  • 1st Derivative of Log-Likelihood Function

    $$\frac{\partial\ln p(\mathbf{x};A)}{\partial A}=\frac{1}{\sigma^2}\sum_{n=0}^{N-1}(x[n]-Ar^n)r^n$$
  • 2nd Derivative of Log-Likelihood Function

    $$\frac{\partial^2 \ln p(\mathbf{x};A)}{\partial A^2}=-\frac{1}{\sigma^2}\sum_{n=0}^{N-1}r^{2n}$$
  • Fisher Information

    $$I(A)=\frac{1}{\sigma^2}\sum_{n=0}^{N-1}r^{2n}$$
  • CRLB

    $$\text{var}(\hat A) \geq \frac{1}{I(A)}=\frac{\sigma^2}{\sum_{n=0}^{N-1}r^{2n}}$$
  • Efficient Estimator and Variance

    Setting the first derivative of the log-likelihood to zero gives the MLE. Moreover, the score factors as $\frac{\partial\ln p}{\partial A}=I(A)(\hat A-A)$, which is exactly the condition for an efficient estimator:

    $$\hat A = \frac{\sum_{n=0}^{N-1}x[n]r^n}{\sum_{n=0}^{N-1}r^{2n}},\qquad \text{var}(\hat A)=\frac{\sigma^2}{\sum_{n=0}^{N-1}r^{2n}}=\text{CRLB}$$
  • What happens as $N\rightarrow\infty$

    $$\sum_{n=0}^{N-1}r^{2n}=\frac{1-r^{2N}}{1-r^2}\ \ (r\neq1),\qquad \sum_{n=0}^{N-1}r^{2n}=N\ \ (r=1)$$
    • For $r<1$: $\text{var}(\hat A)=\frac{\sigma^2(1-r^2)}{1-r^{2N}}\rightarrow\sigma^2(1-r^2)>0$, so the variance hits a floor and never goes to zero, since the signal $Ar^n$ decays away while the noise does not.
    • For $r=1$: $\text{var}(\hat A)=\frac{\sigma^2}{N}\rightarrow0$.
    • For $r>1$: $\text{var}(\hat A)=\frac{\sigma^2(r^2-1)}{r^{2N}-1}\rightarrow0$ exponentially fast, since later samples carry ever-growing signal energy.
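A Monte Carlo sketch of these limits (`check_crlb` and the values $A=1$, $\sigma=1$, $N=100$ are my own illustrative choices): the estimator $\hat A$ above should attain the CRLB, with a variance floor of $\sigma^2(1-r^2)$ for $r<1$ and $\sigma^2/N$ for $r=1$.

```python
import numpy as np

def check_crlb(r, A=1.0, sigma=1.0, N=100, trials=20000, seed=2):
    # Compare the empirical variance of the efficient estimator to the CRLB.
    rng = np.random.default_rng(seed)
    n = np.arange(N)
    energy = np.sum(r ** (2 * n))            # sum of r^{2n}
    crlb = sigma**2 / energy
    x = A * r**n + sigma * rng.standard_normal((trials, N))
    A_hat = x @ (r**n) / energy              # \hat A from the derivation above
    return A_hat.var(), crlb

print(check_crlb(0.9))   # CRLB near the floor sigma^2 (1 - r^2) = 0.19
print(check_crlb(1.0))   # CRLB = sigma^2 / N = 0.01
```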


P4

Prove that

$$\frac{1}{N}\sum_{n=0}^{N-1}\cos(4\pi f_0 n+2\phi)\approx0$$

What conditions on $f_0$ are required for this to hold? Note that

$$\sum_{n=0}^{N-1}\cos(\alpha n+\beta)=\text{Re}\left(\sum_{n=0}^{N-1}\exp[j(\alpha n+\beta)]\right)$$

and use the geometric progression sum formula.


Solution

  1. Complex representation of the cosine
    $$\cos(\theta)=\text{Re}(e^{j\theta})\ \Rightarrow\ \sum_{n=0}^{N-1}\cos(4\pi f_0 n+2\phi)=\text{Re}\left(\sum_{n=0}^{N-1}e^{j(4\pi f_0 n+2\phi)}\right)=\text{Re}\left(e^{j2\phi}\sum_{n=0}^{N-1}e^{j4\pi f_0 n}\right)$$
    The problem reduces to evaluating $\sum_{n=0}^{N-1}e^{j4\pi f_0 n}$.
  2. Applying the geometric progression sum formula
    $$\sum_{n=0}^{N-1}r^n=\frac{1-r^N}{1-r}\ \ (r\neq1),\qquad r=e^{j4\pi f_0}\ \Rightarrow\ \sum_{n=0}^{N-1}e^{j4\pi f_0n}=\frac{1-e^{j4\pi f_0N}}{1-e^{j4\pi f_0}}$$
  3. Deriving the condition
    Provided $e^{j4\pi f_0}\neq1$, the numerator has magnitude at most $2$, so the average is bounded:
    $$\left|\frac{1}{N}\sum_{n=0}^{N-1}\cos(4\pi f_0 n+2\phi)\right|\leq\frac{1}{N}\cdot\frac{2}{|1-e^{j4\pi f_0}|}\rightarrow0$$
    Hence the approximation holds as long as $f_0$ is not a multiple of $1/2$, and for finite $N$ as long as $f_0$ is not too close to $0$ or $1/2$, so that $|1-e^{j4\pi f_0}|$ is not small compared with $1/N$. (If $f_0$ is a multiple of $1/2$, every term equals $\cos(2\phi)$ and the average is $\cos(2\phi)$, not $0$.)
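The condition is easy to see numerically (the helper `avg_cos` and the values $\phi=0.3$, $N=1000$, $f_0=0.123$ are my own choices):

```python
import numpy as np

def avg_cos(f0, phi=0.3, N=1000):
    # (1/N) * sum over n of cos(4*pi*f0*n + 2*phi)
    n = np.arange(N)
    return np.mean(np.cos(4 * np.pi * f0 * n + 2 * phi))

print(avg_cos(0.123))   # f0 away from 0 and 1/2 -> near 0
print(avg_cos(0.5))     # f0 a multiple of 1/2 -> cos(2*phi), not 0
```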


P5

  • We observe two samples of a DC level in correlated Gaussian noise
    $$x[0]=A+w[0],\qquad x[1]=A+w[1]$$
    where $\mathbf{w}=[w[0]\;w[1]]^T$ is zero mean with covariance matrix
    $$\mathbf{C} = \sigma^2 \begin{bmatrix} 1 & \rho \\ \rho & 1 \end{bmatrix}$$
  • The parameter $\rho$ is the correlation coefficient between $w[0]$ and $w[1]$
  • Compute the CRLB for $A$ and compare it to the case when $w[n]$ is WGN, or $\rho=0$
  • Also, explain what happens when $\rho \rightarrow \pm1$
  • Finally, comment on the additivity property of the Fisher information for nonindependent observations

Solution

  • CRLB
    $$I(A)=\frac{\partial\mu^T}{\partial A}\mathbf{C}^{-1}\frac{\partial\mu}{\partial A},\qquad \mu=[A\;A]^T\ \Rightarrow\ \frac{\partial\mu}{\partial A}=[1\;1]^T$$
    $$\mathbf{C}^{-1}=\frac{1}{\sigma^2(1-\rho^2)}\begin{bmatrix} 1 & -\rho \\ -\rho & 1 \end{bmatrix}$$
    $$I(A)=[1\;1]\,\frac{1}{\sigma^2(1-\rho^2)}\begin{bmatrix} 1 & -\rho \\ -\rho & 1 \end{bmatrix}[1\;1]^T=\frac{2-2\rho}{\sigma^2(1-\rho^2)}=\frac{2}{\sigma^2(1+\rho)}$$
    $$\text{Var}(\hat A)\geq\frac{1}{I(A)}=\frac{\sigma^2(1+\rho)}{2}=\text{CRLB}$$
  • $\rho=0$ (WGN)
    $$I(A)=\frac{2}{\sigma^2},\qquad \text{Var}(\hat A) \geq \frac{\sigma^2}{2}=\text{CRLB}$$
    This recovers the usual WGN result: each independent observation contributes $1/\sigma^2$ of Fisher information.
  • $\rho \rightarrow \pm 1$
    • If $\rho \rightarrow 1$: $I(A)\rightarrow\frac{1}{\sigma^2}$ and $\text{CRLB}\rightarrow\sigma^2$. The second sample becomes a duplicate of the first, so two observations carry no more information than one.
    • If $\rho \rightarrow -1$: $I(A)\rightarrow\infty$ and $\text{CRLB}\rightarrow0$. Since $w[1]\rightarrow-w[0]$, the noise cancels in $\frac{x[0]+x[1]}{2}=A+\frac{w[0]+w[1]}{2}$, and $A$ can be estimated perfectly.
  • Conclusion
    • For independent observations, Fisher information is additive: $I(A)=2/\sigma^2$.
    • For correlated observations, $I(A)=\frac{2}{\sigma^2(1+\rho)}$ is no longer the sum of the per-sample informations: positive correlation makes samples redundant and reduces the total information, while negative correlation increases it.
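The behavior of the bound as a function of $\rho$ can be sketched directly from the Gaussian CRLB formula (the helper name `crlb_dc` and $\sigma=1$ are my own choices):

```python
import numpy as np

def crlb_dc(rho, sigma=1.0):
    # CRLB for A from two correlated samples: 1 / (h^T C^{-1} h), h = dmu/dA
    C = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])
    h = np.array([1.0, 1.0])
    return 1.0 / (h @ np.linalg.inv(C) @ h)

print(crlb_dc(0.0))    # sigma^2 / 2
print(crlb_dc(0.99))   # approaches sigma^2 as rho -> 1
print(crlb_dc(-0.99))  # approaches 0 as rho -> -1
```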


P6

  • Consider a generalization of the line fitting problem as described in Problem 4, termed polynomial or curve fitting
  • The data model is
    $$x[n]=\sum_{k=0}^{p-1}A_kn^k+w[n]$$
    for $n=0,1,\dots,N-1$
  • As before, $w[n]$ is WGN with variance $\sigma^2$
  • It is desired to estimate $\{A_0,A_1,\dots,A_{p-1}\}$
  • Find the Fisher information matrix for this problem

Solution

$$\mu[n]=\sum_{k=0}^{p-1}A_kn^k,\qquad \frac{\partial\mu[n]}{\partial A_k}=n^k$$
$$I(\mathbf{A})=\frac{1}{\sigma^2}\sum_{n=0}^{N-1}\frac{\partial\mu[n]}{\partial \mathbf{A}}\left(\frac{\partial\mu[n]}{\partial \mathbf{A}}\right)^T$$
$$I(\mathbf{A}) = \frac{1}{\sigma^2} \sum_{n=0}^{N-1} \begin{bmatrix} 1 & n & n^2 & n^3 & \cdots & n^{p-1} \\ n & n^2 & n^3 & n^4 & \cdots & n^p \\ n^2 & n^3 & n^4 & n^5 & \cdots & n^{p+1} \\ \vdots & \vdots & \vdots & \vdots & \ddots & \vdots \\ n^{p-1} & n^p & n^{p+1} & n^{p+2} & \cdots & n^{2p-2} \end{bmatrix}$$
$$[I(\mathbf{A})]_{i,j}=\frac{1}{\sigma^2}\sum_{n=0}^{N-1}n^{i+j},\qquad i,j=0,1,\dots,p-1$$
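This matrix is compactly $H^TH/\sigma^2$ for a Vandermonde matrix $H$ whose row $n$ is $[1, n, \dots, n^{p-1}]$, which gives a one-line construction (the helper name `fisher_poly` and the values $N=5$, $p=3$ are illustrative choices of mine):

```python
import numpy as np

def fisher_poly(N, p, sigma=1.0):
    # [I]_{i,j} = (1/sigma^2) * sum_{n=0}^{N-1} n^{i+j}, for i, j = 0..p-1
    n = np.arange(N)
    V = np.vander(n, p, increasing=True)   # row n = [1, n, n^2, ..., n^{p-1}]
    return V.T @ V / sigma**2

I = fisher_poly(N=5, p=3)
print(I)
# Spot check: I[1, 2] = sum of n^3 for n = 0..4 = 100
```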


P7

  • It is desired to estimate the total power $P_0$ of a WSS random process, whose PSD is given as
    $$P_{xx}(f)=P_0Q(f)$$
    where
    $$\int_{-\frac{1}{2}}^{\frac{1}{2}}Q(f)\,df=1$$
    and $Q(f)$ is known
  • If $N$ observations are available, find the CRLB for the total power using the exact form as well as the asymptotic approximation, and compare

Solution

$$P_0=\int_{-\frac{1}{2}}^{\frac{1}{2}}P_{xx}(f)\,df=P_0\int_{-1/2}^{1/2}Q(f)\,df=P_0\cdot 1$$
  • Exact form: the covariance matrix of the $N$ observations scales linearly with the parameter, $\mathbf{C}(P_0)=P_0\tilde{\mathbf{C}}$, where $\tilde{\mathbf{C}}$ (determined by $Q(f)$) is known. Using the exact Fisher information for a zero-mean Gaussian process with the parameter in the covariance,
    $$I(P_0)=\frac{1}{2}\text{tr}\left[\left(\mathbf{C}^{-1}\frac{\partial \mathbf{C}}{\partial P_0}\right)^2\right]=\frac{1}{2}\text{tr}\left[\frac{1}{P_0^2}\mathbf{I}\right]=\frac{N}{2P_0^2}$$
  • Asymptotic approximation: with $\frac{\partial \ln P_{xx}(f)}{\partial P_0}=\frac{1}{P_0}$,
    $$I(P_0)=\frac{N}{2}\int_{-1/2}^{1/2}\left(\frac{\partial \ln P_{xx}(f)}{\partial P_0}\right)^2 df=\frac{N}{2}\int_{-1/2}^{1/2}\frac{1}{P_0^2}\,df=\frac{N}{2P_0^2}$$
  • CRLB: the exact and asymptotic forms coincide,
    $$\text{Var}(\hat P_0)\geq\frac{1}{I(P_0)}=\frac{2P_0^2}{N}=\text{CRLB}$$
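A small numerical sketch of the exact form (the values $N=8$, $P_0=2.5$ and the randomly generated shape matrix are my own assumptions): because $\mathbf{C}^{-1}\,\partial\mathbf{C}/\partial P_0=(1/P_0)\mathbf{I}$, the trace formula gives $N/(2P_0^2)$ for any positive-definite $\tilde{\mathbf{C}}$.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P0 = 8, 2.5                       # assumed example values

# Any known positive-definite "shape" covariance C_tilde, with C = P0 * C_tilde
B = rng.standard_normal((N, N))
C_tilde = B @ B.T + N * np.eye(N)

C = P0 * C_tilde
dC = C_tilde                         # dC/dP0
M = np.linalg.inv(C) @ dC            # equals (1/P0) * identity
I_exact = 0.5 * np.trace(M @ M)      # exact Gaussian Fisher information

print(I_exact, N / (2 * P0**2))      # identical, independent of C_tilde
```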

