Table of Contents

IV. Linear Regression with Multiple Variables

4.1 - Multiple Features

\[h_\theta(x) = \theta_0 + \theta_1x_1 + \theta_2x_2 + \dotsb + \theta_nx_n\]
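With the convention \(x_0 = 1\), the hypothesis is the inner product \(\theta^T x\). A minimal sketch with assumed toy values:

```python
import numpy as np

theta = np.array([1.0, 2.0, 3.0])  # theta_0, theta_1, theta_2 (illustrative values)
x = np.array([1.0, 4.0, 5.0])      # x_0 = 1 by convention, then x_1, x_2

# h_theta(x) = theta_0*x_0 + theta_1*x_1 + ... + theta_n*x_n = theta^T x
h = theta @ x
print(h)  # 1*1 + 2*4 + 3*5 = 24.0
```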

4.2 - Gradient descent for Multiple Variables

\[\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0,\theta_1,\dotsc,\theta_n)\]
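For the linear-regression cost, the partial derivative works out to \(\frac{1}{m}\sum_{i=1}^{m}(h_\theta(x^{(i)}) - y^{(i)})x_j^{(i)}\), and all \(\theta_j\) are updated simultaneously. A vectorized sketch of one step, on assumed toy data with an assumed \(\alpha\):

```python
import numpy as np

def gradient_descent_step(theta, X, y, alpha):
    """One simultaneous update of every theta_j for the linear-regression cost J."""
    m = len(y)
    grad = (X.T @ (X @ theta - y)) / m  # vectorized partial J / partial theta_j
    return theta - alpha * grad

# Toy data: first column of X is the x_0 = 1 feature; y = x fits exactly.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 2.0, 3.0])
theta = np.zeros(2)
for _ in range(1000):
    theta = gradient_descent_step(theta, X, y, alpha=0.1)
print(theta)  # approaches [0, 1]
```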

4.3 - Gradient descent in Practice 1 - Feature Scaling
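Feature scaling with mean normalization, \(x_j := (x_j - \mu_j)/s_j\), puts features on a similar range so gradient descent converges faster. A sketch on assumed toy data:

```python
import numpy as np

# Toy feature matrix (no x_0 column): house size and number of bedrooms.
X = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 4.0]])
mu = X.mean(axis=0)
s = X.std(axis=0)  # the range (max - min) is an equally valid choice for s_j
X_scaled = (X - mu) / s
print(X_scaled.mean(axis=0))  # each column now has mean ~0
```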

4.4 - Gradient descent in Practice 2 - Learning rate

4.5 - Features and Polynomial Regression
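Polynomial regression reuses the linear machinery by treating powers of an input as extra features, e.g. \(x_1 = x,\ x_2 = x^2,\ x_3 = x^3\); feature scaling then becomes important because the ranges differ wildly. A sketch with assumed data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])            # single raw input (illustrative)
X_poly = np.column_stack([x, x**2, x**3])     # derived features x, x^2, x^3
X_poly = (X_poly - X_poly.mean(axis=0)) / X_poly.std(axis=0)  # scale each power
X = np.column_stack([np.ones(len(x)), X_poly])                # prepend x_0 = 1
print(X.shape)  # (4, 4): ready for the usual linear-regression update
```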

4.6 - Normal Equation
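The normal equation solves for \(\theta\) in closed form, with no iteration or learning rate:

\[\theta = (X^TX)^{-1}X^Ty\]

A minimal NumPy sketch on assumed toy data; `np.linalg.solve` is used rather than forming the inverse explicitly, for numerical stability:

```python
import numpy as np

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # first column is x_0 = 1
y = np.array([1.0, 2.0, 3.0])

# Solve (X^T X) theta = X^T y instead of computing the inverse directly.
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # [0., 1.] since y = x fits the data exactly
```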

4.7 - Normal Equation Noninvertibility
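When \(X^TX\) is noninvertible (commonly from redundant features or more features than examples), the pseudoinverse still produces a usable \(\theta\). A sketch where the third column deliberately duplicates the second, making \(X^TX\) singular:

```python
import numpy as np

X = np.array([[1.0, 2.0, 2.0], [1.0, 3.0, 3.0], [1.0, 4.0, 4.0]])
y = np.array([1.0, 2.0, 3.0])

# Minimum-norm least-squares solution; equivalent to pinv(X' * X) * X' * y in Octave.
theta = np.linalg.pinv(X) @ y
print(X @ theta)  # reproduces y even though X^T X is singular
```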