[Linear Algebra] Linear Independence

Jason Lee · August 8, 2022

Uniqueness of Solution for $A\textbf{x} = \textbf{b}$

e.g.

$$A\textbf{x} = \textbf{b}$$

$$\begin{bmatrix} 2 & 2 & 1 \\ 3 & 1 & 0 \\ 1 & 3 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 6 \\ 14 \\ 18 \end{bmatrix}$$

$$\textbf{a}_1 x_1 + \textbf{a}_2 x_2 + \textbf{a}_3 x_3 = \textbf{b}$$

$$\begin{bmatrix} 2 \\ 3 \\ 1 \end{bmatrix} x_1 + \begin{bmatrix} 2 \\ 1 \\ 3 \end{bmatrix} x_2 + \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} x_3 = \begin{bmatrix} 6 \\ 14 \\ 18 \end{bmatrix}$$

  • A solution exists only when $\textbf{b} \in \textrm{Span}\{\textbf{a}_1, \textbf{a}_2, \textbf{a}_3\}$
  • If $\textbf{a}_1, \textbf{a}_2,$ and $\textbf{a}_3$ are linearly independent, then a unique solution exists
  • If $\textbf{a}_1, \textbf{a}_2,$ and $\textbf{a}_3$ are linearly dependent, then infinitely many solutions exist whenever any solution exists at all
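The example above can be checked numerically. This is a minimal sketch, assuming NumPy is available: the three columns are linearly independent (full rank), so `np.linalg.solve` returns the unique solution.

```python
import numpy as np

# Coefficient matrix and right-hand side from the example above
A = np.array([[2.0, 2.0, 1.0],
              [3.0, 1.0, 0.0],
              [1.0, 3.0, 1.0]])
b = np.array([6.0, 14.0, 18.0])

# The columns a1, a2, a3 are linearly independent iff rank(A) = 3;
# then A is invertible and the solution is unique.
rank = np.linalg.matrix_rank(A)
x = np.linalg.solve(A, b)
```

Here `rank` comes out as 3, confirming independence, and `x` satisfies $A\textbf{x} = \textbf{b}$ exactly.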

Linear Independence

Formal Definition

Consider $x_1 \textbf{v}_1 + x_2 \textbf{v}_2 + \cdots + x_p \textbf{v}_p = \textbf{0}$

Obviously, one solution is $\textbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_p \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}$, which we call the trivial solution

  • If the trivial solution is the only solution, $\textbf{v}_1, \cdots, \textbf{v}_p$ are linearly independent
  • If a non-trivial solution exists, $\textbf{v}_1, \cdots, \textbf{v}_p$ are linearly dependent
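The formal definition translates directly into a rank test: stacking the vectors as columns of a matrix $V$, the homogeneous system $V\textbf{x} = \textbf{0}$ has only the trivial solution exactly when $V$ has full column rank. A sketch (the function name is my own):

```python
import numpy as np

def is_linearly_independent(vectors):
    """True iff x1*v1 + ... + xp*vp = 0 has only the trivial solution,
    i.e. the matrix with the vectors as columns has full column rank."""
    V = np.column_stack(vectors)
    return np.linalg.matrix_rank(V) == V.shape[1]
```

For instance, the standard basis vectors of $\mathbb{R}^3$ pass the test, while $[1, 2]$ and $[2, 4]$ fail it.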

Practical Definition

Given a set of vectors $\textbf{v}_1, \cdots, \textbf{v}_p \in \mathbb{R}^{n}$, check whether $\textbf{v}_j$ can be represented as a linear combination of the previous vectors $\{\textbf{v}_1, \textbf{v}_2, \cdots, \textbf{v}_{j-1}\}$ for $j = 1, \cdots, p$ (for $j = 1$ the previous set is empty, so $\textbf{v}_1$ qualifies only if it is the zero vector)

$$\textbf{v}_j \in \textrm{Span}\{\textbf{v}_1, \textbf{v}_2, \cdots, \textbf{v}_{j-1}\} \quad \text{for some } j = 1, \cdots, p$$

  • If at least one such $\textbf{v}_j$ is found, then $\{\textbf{v}_1, \cdots, \textbf{v}_{p}\}$ is linearly dependent
  • If no such $\textbf{v}_j$ is found, then $\{\textbf{v}_1, \cdots, \textbf{v}_{p}\}$ is linearly independent
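The sequential check can be sketched the same way: $\textbf{v}_j$ lies in the span of the previous vectors exactly when appending it does not increase the rank. Again the function name is my own:

```python
import numpy as np

def first_dependent_index(vectors):
    """Return the first index j (0-based) with v_j in Span{v_0, ..., v_{j-1}},
    or None if the set is linearly independent."""
    rank = 0
    for j in range(len(vectors)):
        new_rank = np.linalg.matrix_rank(np.column_stack(vectors[: j + 1]))
        if new_rank == rank:  # rank did not grow: v_j adds nothing new
            return j
        rank = new_rank
    return None
```

For $[1,0], [0,1], [1,1]$ the third vector is the first one caught, since $[1,1] = [1,0] + [0,1]$; note this also handles the $j = 1$ zero-vector edge case, because a zero first column leaves the rank at 0.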

Two Definitions are Equivalent

If $\textbf{v}_1, \cdots, \textbf{v}_p$ are linearly dependent, consider a non-trivial solution

Let $j$ denote the last index; then

$$x_j \textbf{v}_j = - x_1 \textbf{v}_1 - \cdots - x_{j-1} \textbf{v}_{j-1}$$

If $x_j \ne 0$, then $\textbf{v}_j$ can be represented as a linear combination of the previous vectors

$$\textbf{v}_j = - \frac{x_1}{x_j} \textbf{v}_1 - \cdots - \frac{x_{j-1}}{x_j} \textbf{v}_{j-1} \in \textrm{Span}\{\textbf{v}_1, \textbf{v}_2, \cdots, \textbf{v}_{j-1}\}$$

If $x_j = 0$, then simply drop the last term and iterate with

$$x_1 \textbf{v}_1 + \cdots + x_{j-1} \textbf{v}_{j-1} = \textbf{0}$$

Since the solution is non-trivial, a nonzero coefficient is eventually reached, so some vector does lie in the span of its predecessors.
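The argument can also be replayed numerically: take a non-trivial solution $\textbf{x}$ of $V\textbf{x} = \textbf{0}$, pick the last index $j$ with $x_j \ne 0$, and reconstruct $\textbf{v}_j$ from the previous vectors. A sketch with illustrative numbers (not from the post), using the SVD to obtain a null-space vector:

```python
import numpy as np

# Dependent columns: v3 = v1 + v2 (illustrative numbers)
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])

# A non-trivial solution of Vx = 0: the right singular vector
# belonging to the (near-)zero singular value.
_, _, Vt = np.linalg.svd(V)
x = Vt[-1]

# Last index with x_j != 0, then
# v_j = -(x_1/x_j) v_1 - ... - (x_{j-1}/x_j) v_{j-1}
j = int(np.max(np.nonzero(~np.isclose(x, 0.0))[0]))
v_j = V[:, :j] @ (-x[:j] / x[j])
```

Here `v_j` matches the third column of `V`, exactly as the derivation above predicts.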
