Gradient Descent at a Glance

The basic update moves the weights a step of size $\alpha$ (the learning rate) against the gradient of the loss:

$$ \begin{equation} \mathbf{w}^{t+1}=\mathbf{w}^t-\alpha \nabla L\left(\mathbf{w}^t\right) \end{equation} $$
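A minimal sketch of this update on a toy one-dimensional loss (the loss function and learning rate here are illustrative choices, not from the notes):

```python
# Gradient descent w^{t+1} = w^t - alpha * grad L(w^t)
# on the toy loss L(w) = (w - 3)^2, whose gradient is 2(w - 3).

def grad_L(w):
    return 2.0 * (w - 3.0)

w = 0.0          # initial guess
alpha = 0.1      # learning rate (illustrative choice)
for t in range(100):
    w = w - alpha * grad_L(w)

print(w)  # approaches the minimizer w* = 3
```

Each step shrinks the distance to the minimizer by a factor of $|1 - 2\alpha|$, so with $\alpha = 0.1$ the iterates converge geometrically to $w^* = 3$.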

For a training set of $n$ examples, the loss is the average of per-example losses, so the full-batch update averages the per-example gradients:

$$ \begin{equation} \mathbf{w}^{t+1}=\mathbf{w}^t-\frac{\alpha}{n} \sum_{i=1}^n \nabla L_i\left(\mathbf{w}^t, \mathbf{x}_i, y_i\right) \end{equation} $$

For linear least squares with $L_i=\frac{1}{2}\left(y_i-\mathbf{x}_i^T \mathbf{w}\right)^2$, the averaged gradient is $-\frac{1}{n} \mathbf{X}^T\left(\mathbf{y}-\mathbf{X} \mathbf{w}\right)$, so the minus sign in the update flips to a plus:

$$ \begin{equation} \mathbf{w}^{t+1}=\mathbf{w}^t+\frac{\alpha}{n} \mathbf{X}^T\left(\mathbf{y}-\mathbf{X} \mathbf{w}^t\right) \end{equation} $$
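The vectorized least-squares update can be sketched with NumPy; the data below is synthetic and noiseless purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true                      # noiseless targets for illustration

w = np.zeros(d)
alpha = 0.1
for t in range(1000):
    # w^{t+1} = w^t + (alpha/n) X^T (y - X w^t)
    w = w + (alpha / n) * X.T @ (y - X @ w)

print(w)  # approaches w_true
```

Because the features are standard normal, the eigenvalues of $\frac{1}{n}\mathbf{X}^T\mathbf{X}$ are close to 1, and this step size converges comfortably; for general data $\alpha$ must be tuned to the curvature of the loss.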

Error Decomposition and the Bias-Variance Tradeoff

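The standard decomposition behind this section: for a target $y=f(x)+\varepsilon$ with noise variance $\operatorname{Var}(\varepsilon)=\sigma^2$, and a predictor $\hat{f}$ trained on a random dataset, the expected squared error at a point $x$ splits into three terms:

$$ \begin{equation} \mathbb{E}\left[\left(y-\hat{f}(x)\right)^2\right]=\underbrace{\left(f(x)-\mathbb{E}[\hat{f}(x)]\right)^2}_{\text{Bias}^2}+\underbrace{\mathbb{E}\left[\left(\hat{f}(x)-\mathbb{E}[\hat{f}(x)]\right)^2\right]}_{\text{Variance}}+\underbrace{\sigma^2}_{\text{Irreducible noise}} \end{equation} $$

Bias measures systematic error from a model class that is too simple, variance measures sensitivity to the particular training sample, and the noise term is a floor no model can beat.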