📄️ Multivariable Functions and Partial Derivatives
An overview of functions of several variables and of partial derivatives, which measure how a function changes as one input varies while the others are held fixed.
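For reference, the partial derivative with respect to $x$ is the ordinary derivative taken with every other variable held constant:

$$
\frac{\partial f}{\partial x}(x, y) = \lim_{h \to 0} \frac{f(x + h,\, y) - f(x, y)}{h}
$$

For example, if $f(x, y) = x^2 y$, then $\frac{\partial f}{\partial x} = 2xy$ and $\frac{\partial f}{\partial y} = x^2$.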
📄️ Gradient and Multidimensional Optimization
Defines the gradient as the vector of a function's partial derivatives, the central tool for optimization in several dimensions.
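For a function $f(x_1, \dots, x_n)$, the gradient stacks all of its partial derivatives into a single vector, which points in the direction of steepest ascent:

$$
\nabla f = \left( \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \dots, \frac{\partial f}{\partial x_n} \right)
$$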
📄️ Optimization using Gradient Descent in one variable
Gradient descent is an iterative method widely used to find a minimum of a function (its mirror image, gradient ascent, finds a maximum). It is foundational in optimization problems where exact solutions are hard to derive analytically, and the one-variable case introduces the core idea: repeatedly step against the derivative until the iterates settle at a local minimum.
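A minimal sketch of the one-variable update rule $x \leftarrow x - \alpha \, f'(x)$; the example function $f(x) = (x - 3)^2$, the starting point, the learning rate, and the iteration count are all illustrative choices:

```python
def gradient_descent_1d(df, x0, learning_rate=0.1, num_iters=100):
    """Minimize a one-variable function via gradient descent.

    df: derivative of the function to minimize
    x0: starting point
    """
    x = x0
    for _ in range(num_iters):
        x -= learning_rate * df(x)  # step against the derivative
    return x

# Example: minimize f(x) = (x - 3)^2, whose derivative is 2(x - 3).
minimum = gradient_descent_1d(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # approaches 3.0
```

The learning rate controls the step size: too large and the iterates diverge, too small and convergence is slow.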
📄️ Optimization using Gradient Descent in two variables
Gradient descent extends to functions of several variables by replacing the derivative with the gradient: at each step, move a small distance in the direction opposite to the gradient at the current point, repeating until a local minimum is reached.
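The same loop in two variables, sketched with NumPy; the test function $f(x, y) = x^2 + 2y^2$, its hand-coded gradient, and the hyperparameters are illustrative:

```python
import numpy as np

def gradient_descent_2d(grad_f, p0, learning_rate=0.1, num_iters=200):
    """Minimize a two-variable function by stepping against its gradient."""
    p = np.asarray(p0, dtype=float)
    for _ in range(num_iters):
        p -= learning_rate * grad_f(p)  # move opposite the gradient
    return p

# Gradient of f(x, y) = x^2 + 2y^2 is (2x, 4y); the minimum is at (0, 0).
grad_f = lambda p: np.array([2 * p[0], 4 * p[1]])
print(gradient_descent_2d(grad_f, p0=[3.0, -2.0]))  # approaches [0, 0]
```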
📄️ Optimization using Gradient Descent - Least squares
Linear regression is a statistical method that models the relationship between a dependent variable and one or more independent variables by fitting a linear equation to observed data. With one independent variable the equation is $y = mx + b$, where $m$ is the slope and $b$ is the y-intercept; least squares chooses $m$ and $b$ to minimize the sum of squared residuals, a minimization that gradient descent can carry out. The method is widely used in predictive modeling and quantitative forecasting.
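A minimal sketch combining the two topics: fitting $y = mx + b$ by running gradient descent on the mean squared error. The toy data and hyperparameters below are made up for illustration:

```python
import numpy as np

def fit_line(x, y, learning_rate=0.01, num_iters=5000):
    """Fit y = m*x + b by gradient descent on the mean squared error."""
    m, b = 0.0, 0.0
    n = len(x)
    for _ in range(num_iters):
        residuals = m * x + b - y
        grad_m = (2 / n) * np.sum(residuals * x)  # dMSE/dm
        grad_b = (2 / n) * np.sum(residuals)      # dMSE/db
        m -= learning_rate * grad_m
        b -= learning_rate * grad_b
    return m, b

# Toy data generated from y = 2x + 1 with a little noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
m, b = fit_line(x, y)
print(m, b)  # close to 2 and 1
```

The least-squares line also has a closed-form solution, but the gradient-descent version generalizes directly to models where no closed form exists.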