This repository contains implementations of numerical optimization algorithms used in machine learning (ML) and deep learning (DL). Each notebook includes explanations, mathematical formulas, Python code, and visualizations.
| File / Notebook | Description |
|---|---|
| 1.1_Single_Variable_Gradient_Descent.ipynb | Gradient descent for single-variable linear regression. |
| 1.2_Multivariables_Gradient_Descent.ipynb | Gradient descent for multi-variable linear regression. |
| 1.3_Batch_Mini-Batch_and_Stochastic_Gradient_Descent.ipynb | Batch, mini-batch, and stochastic gradient descent implementations. |
| 2_Momentum_and_NAG_Gradient_Descent.ipynb | Momentum and Nesterov Accelerated Gradient (NAG) implementations. |
| 3_Adagrad_and_RMSProp.ipynb | Adagrad and RMSProp optimizer implementations. |
| 4_Adam_Optimizer_Implementation.ipynb | Adam optimizer implementation. |
Algorithms covered across the notebooks:
- Single-variable gradient descent
- Multi-variable gradient descent
- Batch gradient descent
- Mini-batch gradient descent
- Stochastic gradient descent
- Momentum
- Nesterov Accelerated Gradient (NAG)
- Adagrad
- RMSProp
- Adam
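To show at a glance how these update rules differ, here is a minimal sketch of plain gradient descent, momentum, and Adam on a toy one-dimensional quadratic loss. The function names, hyperparameter values, and the loss itself are illustrative choices for this README, not code taken from the notebooks:

```python
import math

def grad(w):
    # Gradient of the toy loss f(w) = (w - 3)^2, which is minimized at w = 3.
    return 2.0 * (w - 3.0)

def gradient_descent(w, lr=0.1, steps=100):
    # Vanilla update: w <- w - lr * grad(w).
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def momentum(w, lr=0.1, beta=0.9, steps=100):
    # Momentum accumulates a velocity from past gradients and steps along it.
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad(w)
        w -= lr * v
    return w

def adam(w, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=100):
    # Adam keeps bias-corrected first- and second-moment estimates of the
    # gradient and scales each step by them.
    m, v = 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

# Each optimizer should drive w toward the minimum at 3.0.
print(gradient_descent(0.0), momentum(0.0), adam(0.0))
```

The notebooks develop each of these updates in full, with derivations, Python code, and plots.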
To run the notebooks, you need:
- Python 3.x
- NumPy
- Pandas
- Matplotlib
- Jupyter Notebook or JupyterLab
Install the dependencies using:
```bash
pip install numpy pandas matplotlib jupyter
```
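Then launch Jupyter from the repository root and open any notebook:

```bash
jupyter notebook
```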