Supervised Learning - Regression & Regularization (Week 3)
Supervised learning, particularly regression, is explored through linear and polynomial relationships. Key concepts include the mathematical frameworks of simple and multiple linear regression, gradient descent for optimization, and the importance of evaluation metrics like MSE and R². A significant focus is placed on understanding the Bias-Variance Trade-off, which is critical for model generalization.
Sections
Navigate through the learning materials and practice exercises.
What we have learnt
- Linear regression models the relationship between a target variable and predictor variables by fitting a line (or, with multiple predictors, a hyperplane) to the observed data.
- Gradient Descent is an optimization algorithm used to minimize the cost function by iteratively adjusting model parameters.
- Evaluation metrics such as MSE, RMSE, and R-squared are essential for assessing regression model performance.
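The ideas above can be tied together in a short sketch: gradient descent minimizing the MSE cost of a simple linear regression on toy data. The data, learning rate, and iteration count here are illustrative choices, not part of the course material.

```python
import numpy as np

# Toy data: y ≈ 2x + 1 with a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=50)
y = 2.0 * X + 1.0 + rng.normal(0, 0.5, size=50)

# Parameters of the fitted line y_hat = w * x + b
w, b = 0.0, 0.0
lr = 0.01  # learning rate (step size)
n = len(X)

for _ in range(2000):
    y_hat = w * X + b
    error = y_hat - y
    # Gradients of MSE = mean((y_hat - y)^2) with respect to w and b
    grad_w = (2.0 / n) * np.dot(error, X)
    grad_b = (2.0 / n) * error.sum()
    # Step in the direction of the negative gradient
    w -= lr * grad_w
    b -= lr * grad_b

mse = np.mean((w * X + b - y) ** 2)
print(f"w={w:.2f}, b={b:.2f}, MSE={mse:.3f}")
```

The recovered slope and intercept should land close to the true values (2 and 1), and the final MSE should be near the noise variance, illustrating how iterative parameter updates drive the cost function down.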
Key Concepts
- Linear Regression: A statistical method for modeling the relationship between a dependent variable and one or more independent variables by fitting a linear equation.
- Gradient Descent: An iterative optimization algorithm used to minimize a function by taking steps proportional to the negative of the gradient.
- Bias-Variance Trade-off: The balance between error due to bias, which comes from overly simplistic models, and error due to variance, which comes from overly complex models that are too sensitive to the training data.
- Mean Squared Error (MSE): The average of the squared differences between predicted and actual values.
- Polynomial Regression: An extension of linear regression that models non-linear relationships by incorporating polynomial terms.
Additional Learning Materials
Supplementary resources to enhance your learning experience.