$$\mathbf{y} = A\mathbf{x}$$

$A \in \mathbb{R}^{m \times n}$ : weight matrix of the fully-connected layer
$\mathbf{x} \in \mathbb{R}^{n}$ : input vector
$\mathbf{y} \in \mathbb{R}^{m}$ : output vector
$$\mathbf{y} = A\mathbf{x} + \mathbf{b}$$

$A \in \mathbb{R}^{m \times n}$ : weight matrix of the fully-connected layer
$\mathbf{b} \in \mathbb{R}^{m}$ : bias vector of the fully-connected layer
$\mathbf{x} \in \mathbb{R}^{n}$ : input vector
$\mathbf{y} \in \mathbb{R}^{m}$ : output vector
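The affine map above can be sketched in a few lines of NumPy; the dimensions ($m = 3$, $n = 4$) and the values of $A$, $\mathbf{b}$, and $\mathbf{x}$ are illustrative placeholders, not taken from the text:

```python
import numpy as np

m, n = 3, 4  # output and input dimensions (illustrative)

A = np.ones((m, n))            # weight matrix, A in R^{m x n}
b = np.arange(m, dtype=float)  # bias vector,   b in R^m
x = np.ones(n)                 # input vector,  x in R^n

y = A @ x + b                  # y = Ax + b, so y in R^m

print(y.shape)  # (3,)
print(y)        # [4. 5. 6.]
```

Each output component $y_i$ is the dot product of row $i$ of $A$ with $\mathbf{x}$, plus $b_i$; here every row of ones dotted with a vector of ones gives $4$, and adding the bias $[0, 1, 2]$ yields $[4, 5, 6]$.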