[Linear Algebra] Inner Product

Jason Lee · September 16, 2022

Over-determined Linear Systems

  • Over-determined linear systems
    • number of equations >> number of variables
    • data amount >> feature dimension
    • usually no solution exists

      e.g.

      | Feature A | Feature B | Feature C | Label |
      | --- | --- | --- | --- |
      | $A_1$ | $B_1$ | $C_1$ | $L_1$ |
      | $A_2$ | $B_2$ | $C_2$ | $L_2$ |
      | $\vdots$ | $\vdots$ | $\vdots$ | $\vdots$ |
      | $A_{100}$ | $B_{100}$ | $C_{100}$ | $L_{100}$ |

      Number of data points: 100
      Feature dimension: 3
      → an over-determined linear system

Vector Equation Perspective

  • Vector equation form
    • $\begin{bmatrix} A_1 \\ A_2 \\ \vdots \\ A_{100} \end{bmatrix} x_1 + \begin{bmatrix} B_1 \\ B_2 \\ \vdots \\ B_{100} \end{bmatrix} x_2 + \begin{bmatrix} C_1 \\ C_2 \\ \vdots \\ C_{100} \end{bmatrix} x_3 = \begin{bmatrix} L_1 \\ L_2 \\ \vdots \\ L_{100} \end{bmatrix}$
    • $\textbf{a}_1 x_1 + \textbf{a}_2 x_2 + \textbf{a}_3 x_3 = \textbf{b}$
    • Compared to the original space $\mathbb{R}^{100}$ ($\textbf{a}_1, \textbf{a}_2, \textbf{a}_3, \textbf{b} \in \mathbb{R}^{100}$), $\textrm{Span}\{\textbf{a}_1, \textbf{a}_2, \textbf{a}_3\}$ is only a thin hyperplane (at most 3-dimensional)
    • So it is likely that $\textbf{b} \notin \textrm{Span}\{\textbf{a}_1, \textbf{a}_2, \textbf{a}_3\}$, which means no solution exists
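The "b is almost surely outside the span" claim can be checked numerically: if $\textbf{b}$ were in $\textrm{Span}\{\textbf{a}_1, \textbf{a}_2, \textbf{a}_3\}$, appending it as a fourth column would not increase the matrix rank. A minimal NumPy sketch with randomly generated data (the data is made up for illustration; it is not from the table above):

```python
import numpy as np

rng = np.random.default_rng(0)

# 100 data points, 3 features: the columns a1, a2, a3 span at most a
# 3-dimensional subspace of R^100.
A = rng.standard_normal((100, 3))
b = rng.standard_normal(100)  # a generic label vector in R^100

rank_A = np.linalg.matrix_rank(A)                        # rank of [a1 a2 a3]
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))  # rank of [a1 a2 a3 b]

# If b were in Span{a1, a2, a3}, the rank would not increase.
print(rank_A)   # 3
print(rank_Ab)  # 4 -> b lies outside the span, so Ax = b has no solution
```

For generic (e.g. noisy) data the augmented rank is strictly larger, confirming that no exact solution exists.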

Motivation for Least Squares

  • Even if no exact solution exists, we still want to obtain an approximate solution to the over-determined system
  • Then, how can we define the best approximate solution?
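As a preview of the answer developed in later posts, one standard choice is least squares: take the $\textbf{x}$ that minimizes the residual norm $\lVert A\textbf{x} - \textbf{b} \rVert$. A minimal sketch using NumPy's `np.linalg.lstsq` on synthetic data (the data is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 3))  # over-determined: 100 equations, 3 unknowns
b = rng.standard_normal(100)

# lstsq minimizes ||Ax - b|| over x, returning the best approximate
# solution even though Ax = b itself has no exact solution.
x_hat, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)

print(x_hat.shape)  # (3,)
print(residuals)    # nonzero: b is not exactly reachable from the columns of A
```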

Inner Product

  • Given $\textbf{u}, \textbf{v} \in \mathbb{R}^n$, we can regard $\textbf{u}$ and $\textbf{v}$ as $n \times 1$ matrices
  • The transpose $\textbf{u}^T$ is a $1 \times n$ matrix, and the matrix product $\textbf{u}^T \textbf{v}$ is a $1 \times 1$ matrix, which we write as a scalar without brackets
  • The number $\textbf{u}^T \textbf{v}$ is called the inner product or dot product of $\textbf{u}$ and $\textbf{v}$, and it is written as $\textbf{u} \cdot \textbf{v}$
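The "$1 \times 1$ matrix read as a scalar" view can be made concrete in NumPy (the vectors here are arbitrary examples):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -5.0, 6.0])

# Treating u, v as n x 1 matrices, u^T v is a 1 x 1 matrix; dropping the
# brackets gives the same scalar as the dot product u . v.
as_matrix = u.reshape(-1, 1).T @ v.reshape(-1, 1)  # shape (1, 1)
as_scalar = np.dot(u, v)

print(as_matrix.item())  # 12.0  (1*4 + 2*(-5) + 3*6)
print(as_scalar)         # 12.0
```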

Properties of Inner Product

  • Theorem : let $\textbf{u}$, $\textbf{v}$, and $\textbf{w}$ be vectors in $\mathbb{R}^n$, and let $c$ be a scalar

    • a ) $\textbf{u} \cdot \textbf{v} = \textbf{v} \cdot \textbf{u}$
    • b ) $(\textbf{u} + \textbf{v}) \cdot \textbf{w} = \textbf{u} \cdot \textbf{w} + \textbf{v} \cdot \textbf{w}$
    • c ) $(c \textbf{u}) \cdot \textbf{v} = c (\textbf{u} \cdot \textbf{v}) = \textbf{u} \cdot (c \textbf{v})$
    • d ) $\textbf{u} \cdot \textbf{u} \geq 0$, and $\textbf{u} \cdot \textbf{u} = 0$ if and only if $\textbf{u} = \textbf{0}$
  • Properties (b) and (c) can be combined to produce the following rule

    • $(c_1 \textbf{u}_1 + \cdots + c_p \textbf{u}_p) \cdot \textbf{w} = c_1 (\textbf{u}_1 \cdot \textbf{w}) + \cdots + c_p (\textbf{u}_p \cdot \textbf{w})$
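All four properties can be spot-checked numerically; a quick sketch on random vectors (using `np.isclose` for floating-point comparisons, since scalar multiplication is only associative up to rounding):

```python
import numpy as np

rng = np.random.default_rng(2)
u, v, w = rng.standard_normal((3, 4))  # three random vectors in R^4
c = 2.5

# (a) symmetry
assert np.isclose(u @ v, v @ u)
# (b) additivity in the first argument
assert np.isclose((u + v) @ w, u @ w + v @ w)
# (c) homogeneity: the scalar can move freely
assert np.isclose((c * u) @ v, c * (u @ v))
assert np.isclose((c * u) @ v, u @ (c * v))
# (d) positivity: u . u >= 0, with equality only for the zero vector
assert u @ u >= 0
assert np.zeros(4) @ np.zeros(4) == 0

print("all inner-product properties hold")
```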