Expectation

deejayosamu · January 17, 2025

Basic Statistics Concepts


Expectation of a R.V.

Def)
$$E(X)=\begin{cases} \int_{-\infty}^{\infty} x f_X(x)\,dx & :\text{continuous} \\ \sum_{x} x P_X(x) & :\text{discrete} \end{cases}$$

provided $\int_{-\infty}^{\infty}|x|f_X(x)\,dx<\infty$ (or $\sum_{x}|x|P_X(x)<\infty$)
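As a quick sanity check, both cases of the definition can be evaluated numerically. The fair-die pmf and the Exponential(1) density below are assumed examples, not from the text:

```python
import math

# Discrete case: E(X) = sum_x x * P_X(x). Fair six-sided die (assumed example).
pmf = {x: 1 / 6 for x in range(1, 7)}
mean_die = sum(x * p for x, p in pmf.items())  # 21/6 = 3.5

# Continuous case: E(X) = integral of x * f_X(x) dx.
# Exponential(1) density f(x) = e^{-x} on x >= 0 (assumed example);
# its true mean is 1. Approximate with a midpoint Riemann sum on [0, 50].
def f(x):
    return math.exp(-x)

n, a, b = 200_000, 0.0, 50.0
h = (b - a) / n
mean_exp = sum((a + (i + 0.5) * h) * f(a + (i + 0.5) * h) * h for i in range(n))
```

The truncation at $x=50$ is harmless here because the integrand's tail mass beyond that point is on the order of $e^{-50}$.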

Theorem)
If $E(g_1(X))$ and $E(g_2(X))$ exist, then for any constants $k_1$ and $k_2$,
$E[k_1g_1(X)+k_2g_2(X)]=k_1E(g_1(X))+k_2E(g_2(X))$
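Linearity can be verified exactly for a small discrete distribution. The fair die and the particular choices of $g_1$, $g_2$, $k_1$, $k_2$ below are assumed for illustration:

```python
from fractions import Fraction

# Exact check of linearity: E[k1*g1(X) + k2*g2(X)] = k1*E(g1(X)) + k2*E(g2(X)).
pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die (assumed example)

def E(g):
    return sum(g(x) * p for x, p in pmf.items())

g1 = lambda x: x        # g1(X) = X
g2 = lambda x: x * x    # g2(X) = X^2
k1, k2 = Fraction(2), Fraction(-5)

lhs = E(lambda x: k1 * g1(x) + k2 * g2(x))
rhs = k1 * E(g1) + k2 * E(g2)
# Fraction arithmetic makes the comparison exact: lhs == rhs
```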

Some special Expectations

Mean(First Moment)

The mean of $X$: $\mu=E(X)$

Variance(Second Central Moment)

The variance of $X$: $\sigma^2=\mathrm{Var}(X)=E[(X-E(X))^2]=E(X^2)-[E(X)]^2$
pf)
Writing $\mu=E(X)$ and using linearity of expectation:
$E[(X-\mu)^2]=E[X^2-2\mu X+\mu^2]=E(X^2)-2\mu E(X)+\mu^2=E(X^2)-\mu^2=E(X^2)-[E(X)]^2$

Theorem)
$\mathrm{Var}(k_1g(X)+k_2)=k_1^2\mathrm{Var}(g(X))$
pf)
Let $Y=k_1g(X)+k_2$. By linearity, $E(Y)=k_1E(g(X))+k_2$, so
$\mathrm{Var}(Y)=E[(Y-E(Y))^2]=E[(k_1g(X)-k_1E(g(X)))^2]=k_1^2E[(g(X)-E(g(X)))^2]=k_1^2\mathrm{Var}(g(X))$
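The affine-transformation rule for variance can also be checked exactly. The die pmf and the constants $k_1=3$, $k_2=7$ below are assumptions chosen for the example:

```python
from fractions import Fraction

# Exact check of Var(k1*g(X) + k2) = k1^2 * Var(g(X)).
pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die (assumed example)

def E(g):
    return sum(g(x) * p for x, p in pmf.items())

def Var(g):
    mu = E(g)
    return E(lambda x: (g(x) - mu) ** 2)

g = lambda x: x * x       # any function of X; x^2 chosen arbitrarily
k1, k2 = 3, 7             # arbitrary constants

lhs = Var(lambda x: k1 * g(x) + k2)
rhs = k1 ** 2 * Var(g)
# The additive constant k2 drops out; the scale k1 enters squared.
```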

MGF(Moment Generating Function)

Def)
For a random variable $X$, if $E(e^{tX})$ exists for $-h<t<h$ (for some $h>0$), then
$M_X(t)=E(e^{tX})$

Uses)
① Differentiating the mgf $n$ times at $t=0$ generates the $n$-th moment of $X$:
$E(X^n)=\left.\dfrac{d^n}{dt^n}M_X(t)\right|_{t=0}$
② For a given distribution, the mgf is unique.
⇒ used to check whether two random variables follow the same distribution
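Use ① can be illustrated numerically: differentiate the mgf at $t=0$ with finite differences and compare against the known moments. The fair-die example is an assumption; for it, $E(X)=3.5$ and $E(X^2)=91/6$:

```python
import math

# Moments from the mgf, M_X(t) = E(e^{tX}), for a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}

def M(t):
    return sum(p * math.exp(t * x) for x, p in pmf.items())

# Central finite differences approximate the derivatives at t = 0.
h = 1e-4
first = (M(h) - M(-h)) / (2 * h)              # ~ M'(0)  = E(X)
second = (M(h) - 2 * M(0) + M(-h)) / h ** 2   # ~ M''(0) = E(X^2)
```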

✔︎ Note
$E(X^n)\ (n=1,2,\dots)$: the $n$-th moment of $X$
$E((X-\mu)^n)\ (n=1,2,\dots)$: the $n$-th central moment of $X$

Multivariate case

Consider the bivariate case, $\underline{X}=(X_1,X_2)$.

$$E(g(X_1,X_2))=\begin{cases} \sum_{x_1}\sum_{x_2} g(x_1,x_2)\,P_{X_1,X_2}(x_1,x_2) & :\text{discrete} \\ \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x_1,x_2)\,f_{X_1,X_2}(x_1,x_2)\,dx_1\,dx_2 & :\text{continuous} \end{cases}$$
provided $E(|g(X_1,X_2)|)<\infty$
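The discrete case is just a double sum over the joint pmf. The 2×2 joint pmf below is an assumed toy example; under it $E(X_1+X_2)=1.3$ and $E(X_1X_2)=0.4$:

```python
# Discrete bivariate expectation (assumed toy joint pmf):
# E(g(X1, X2)) = sum over (x1, x2) of g(x1, x2) * P_{X1,X2}(x1, x2).
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def E(g):
    return sum(g(x1, x2) * p for (x1, x2), p in joint.items())

e_sum = E(lambda x1, x2: x1 + x2)    # E(X1 + X2)
e_prod = E(lambda x1, x2: x1 * x2)   # E(X1 * X2)
```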

Theorem)
If $E(g_1(X_1,X_2))$ and $E(g_2(X_1,X_2))$ exist,
$E[k_1g_1(X_1,X_2)+k_2g_2(X_1,X_2)]=k_1E(g_1(X_1,X_2))+k_2E(g_2(X_1,X_2))$ for any constants $k_1$ and $k_2$

$M_{\underline{X}}(\underline{t})=M_{X_1,\dots,X_n}(t_1,\dots,t_n)=E(e^{\underline{t}^T\underline{X}})=E(e^{\sum_{i=1}^{n}t_iX_i})$

$E(X_1^{m_1}\cdots X_n^{m_n})=\left.\dfrac{\partial^{m_1+\cdots+m_n}}{\partial t_1^{m_1}\cdots\partial t_n^{m_n}}M_{\underline{X}}(\underline{t})\right|_{\underline{t}=0}$

bivariate case)
$M_{X_1,X_2}(t_1,t_2)=E(e^{t_1X_1+t_2X_2})$
$E(X_1^{m_1}X_2^{m_2})=\left.\dfrac{\partial^{m_1+m_2}}{\partial t_1^{m_1}\partial t_2^{m_2}}M_{X_1,X_2}(t_1,t_2)\right|_{t_1=t_2=0}$
$M_{X_1,X_2}(t_1,0)=E(e^{t_1X_1+0\cdot X_2})=E(e^{t_1X_1})=M_{X_1}(t_1)$
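Both bivariate facts can be checked numerically on a small joint pmf. The 2×2 pmf below is an assumed toy example; its marginal for $X_1$ is $P(X_1{=}0)=0.3$, $P(X_1{=}1)=0.7$, and $E(X_1X_2)=0.4$:

```python
import math

# Joint mgf of a toy discrete pair: M(t1, t2) = E(e^{t1*X1 + t2*X2}).
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def M(t1, t2):
    return sum(p * math.exp(t1 * x1 + t2 * x2) for (x1, x2), p in joint.items())

def M1(t1):
    """Marginal mgf of X1, computed directly from X1's marginal pmf."""
    marg = {0: 0.3, 1: 0.7}
    return sum(p * math.exp(t1 * x1) for x1, p in marg.items())

# Setting t2 = 0 in the joint mgf recovers the marginal mgf of X1.
same = abs(M(0.7, 0.0) - M1(0.7)) < 1e-12

# The mixed partial at (0, 0), approximated by finite differences,
# recovers the joint moment E(X1 * X2).
h = 1e-4
mixed = (M(h, h) - M(h, -h) - M(-h, h) + M(-h, -h)) / (4 * h * h)
```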
