Implementations of Machine Learning (ML) and Deep Learning (DL) Optimizers

This repository contains implementations of numerical optimization algorithms used in machine learning (ML) and deep learning (DL). Each notebook includes explanations, mathematical formulas, Python code, and visualizations.

Repository Structure

| File / Notebook | Description |
| --- | --- |
| 1.1_Single_Variable_Gradient_Descent.ipynb | Gradient descent for single-variable linear regression. |
| 1.2_Multivariables_Gradient_Descent.ipynb | Gradient descent for multi-variable linear regression. |
| 1.3_Batch_Mini-Batch_and_Stochastic_Gradient_Descent.ipynb | Batch, mini-batch, and stochastic gradient descent. |
| 2_Momentum_and_NAG_Gradient_Descent.ipynb | Momentum and Nesterov Accelerated Gradient (NAG). |
| 3_Adagrad_and_RMSProp.ipynb | Adagrad and RMSProp optimizers. |
| 4_Adam_Optimizer_Implementation.ipynb | Adam optimizer. |
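
To give a flavor of the style of the first notebooks, here is a minimal sketch of single-variable gradient descent for linear regression. The data, variable names, and hyperparameters are illustrative, not taken from the notebooks:

```python
import numpy as np

# Illustrative data: y is roughly 3x plus noise (values made up for this sketch)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 100)
y = 3.0 * x + rng.normal(0.0, 0.1, 100)

w = 0.0   # initial weight
lr = 0.1  # learning rate
for _ in range(500):
    grad = 2.0 * np.mean((w * x - y) * x)  # gradient of the MSE loss w.r.t. w
    w -= lr * grad                         # gradient descent step

print(w)  # converges toward 3
```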

Covered Topics

  • Single-variable gradient descent
  • Multi-variable gradient descent
  • Batch gradient descent
  • Mini-batch gradient descent
  • Stochastic gradient descent
  • Momentum
  • Nesterov Accelerated Gradient (NAG)
  • Adagrad
  • RMSProp
  • Adam (combines momentum and RMSProp; see the sketch after this list)
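
As a sense of where the series ends up, below is a minimal sketch of a single Adam update, following the update rule from the original Adam paper. The function name and structure are illustrative, not the notebooks' exact code:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; m and v are running first/second moment estimates."""
    m = beta1 * m + (1 - beta1) * grad           # first moment (momentum term)
    v = beta2 * v + (1 - beta2) * grad**2        # second moment (RMSProp term)
    m_hat = m / (1 - beta1**t)                   # bias correction, t starts at 1
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # parameter update
    return w, m, v
```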

Requirements

To run the notebooks, you need:

  • Python 3.x
  • NumPy
  • Pandas
  • Matplotlib
  • Jupyter Notebook or JupyterLab

Install the dependencies using:

```
pip install numpy pandas matplotlib jupyter
```
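
Then launch Jupyter and open any of the notebooks:

```
jupyter notebook
```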
