# Optimization Problems

This project gathers solutions to optimization problems at the core of Machine Learning (ML), which, in turn, serve as the driving force behind broader concepts in Artificial Intelligence (AI). The repository is organized into modules, each addressing a distinct optimization topic and providing implementations in Python.

---

<p align="center">
  <img src="https://github.com/user-attachments/assets/bee6d38b-a495-43ce-b195-a8d163dbb672" alt="OP"/>
</p>

---

## Table of Contents
1. [Module 1 – Linear Programming (LP)](#module-1--linear-programming-lp)
2. [Module 2 – Isoperimetric Optimization](#module-2--isoperimetric-optimization)
3. [Module 3 – Piecewise Constant Signal Reconstruction](#module-3--piecewise-constant-signal-reconstruction)
4. [Module 4 – Basis Pursuit](#module-4--basis-pursuit)
5. [Module 5 – Backtracking Line Search](#module-5--backtracking-line-search)
6. [Module 6 – Newton's Method](#module-6--newtons-method)
7. [Module 7 – Levenberg–Marquardt Parameter Estimation](#module-7--levenbergmarquardt-parameter-estimation)
8. [Module 8 – Nonlinear Constrained Least Squares Optimization](#module-8--nonlinear-constrained-least-squares-optimization)
9. [Module 9 – Quasi-Newton Method](#module-9--quasi-newton-method)
10. [Module 10 – Linear Programming (LP) Solutions using Sequential Barrier Method (SBM)](#module-10--linear-programming-lp-solutions-using-sequential-barrier-method-sbm)
11. [Bibliography](#bibliography)

---

## Module 1 – Linear Programming (LP)

Examples of solving linear programming optimization problems using the **CVXPY** library in Python. The examples demonstrate different modeling techniques and solution approaches (see the sketch after this list):
- Scalar vs. vector-based formulations
- Alternative transformations of constraints
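
A minimal vector-form sketch in CVXPY; the data below is illustrative, not taken from the module's scripts:

```python
import numpy as np
import cvxpy as cp

# maximize 3x + 4y  subject to  x + 2y <= 14,  3x - y >= 0,  x - y <= 2
c = np.array([3.0, 4.0])
A = np.array([[1.0, 2.0],
              [-3.0, 1.0],   # 3x - y >= 0 rewritten as -3x + y <= 0
              [1.0, -1.0]])
b = np.array([14.0, 0.0, 2.0])

x = cp.Variable(2)
prob = cp.Problem(cp.Maximize(c @ x), [A @ x <= b])
prob.solve()
print(prob.value, x.value)   # expected: 34.0 at (6, 4)
```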

---

## Module 2 – Isoperimetric Optimization

Implementation of an **isoperimetric optimization problem** using convex programming techniques. The objective is to determine a function `f(x)` that maximizes the area under the curve (the integral of `f(x)`) while satisfying the constraints below (see the discretized sketch after this list):
- A specified total length of the curve
- A maximum curvature constraint
- Passing through given fixed points
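
One way to make this concrete is to discretize `f` on a grid and express length, curvature, and area with finite differences. A minimal CVXPY sketch, where the length budget `L`, curvature bound `C_max`, and fixed endpoints are assumed values rather than the module's actual data:

```python
import numpy as np
import cvxpy as cp

N, L, C_max = 64, 1.6, 40.0          # grid size, length budget, curvature bound (assumptions)
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]
f = cp.Variable(N)

segments = cp.vstack([np.full(N - 1, dx), cp.diff(f)])       # (dx, df) for each segment
constraints = [
    cp.sum(cp.norm(segments, 2, axis=0)) <= L,               # total arc length
    cp.abs(cp.diff(f, 2)) / dx**2 <= C_max,                  # curvature via 2nd differences
    f[0] == 0, f[-1] == 0,                                   # fixed points (here: endpoints)
]
prob = cp.Problem(cp.Maximize(cp.sum(f) * dx), constraints)  # area under the curve
prob.solve()
print(prob.value)
```

Note that the length is imposed as an inequality: since a larger budget can only increase the attainable area, the constraint is tight at the optimum, which keeps the problem convex.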

---

## Module 3 – Piecewise Constant Signal Reconstruction

Two scripts that reconstruct piecewise constant signals from noisy measurements (a minimal sketch of the first approach follows the list):

1. **LASSO-based Optimization (`zad1.py`):**
   - Uses the Least Absolute Shrinkage and Selection Operator (LASSO).
   - Minimizes the `L2` norm of measurement errors with an `L1` norm constraint on the signal's gradient to promote sparsity.

2. **Linear Programming-based Optimization (`zad2.py`):**
   - Reformulates the signal reconstruction as a Linear Programming task.
   - Minimizes discrepancies between the measured noisy signal and the estimated signal, enforcing piecewise constant behavior via linear constraints.
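
A minimal CVXPY sketch of the LASSO-style formulation; the synthetic signal and the weight `lam` are assumptions, not the script's actual setup:

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
true = np.concatenate([np.zeros(40), 2.0 * np.ones(30), 0.5 * np.ones(30)])
y = true + 0.1 * rng.standard_normal(true.size)    # noisy measurements

x = cp.Variable(y.size)
lam = 1.0                                          # regularization weight (hand-picked)
objective = cp.sum_squares(x - y) + lam * cp.norm1(cp.diff(x))
cp.Problem(cp.Minimize(objective)).solve()
print(np.round(x.value[35:45], 2))                 # the jump near index 40 is preserved
```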
| 38 | +[](https://github.com/Aderivaldii/optimization-problems/releases) |
56 | 39 |
|

---

## Module 4 – Basis Pursuit

Implementation of a **Basis Pursuit** problem using an overcomplete dictionary of Gabor basis functions (see the sketch after this list):
1. A synthetic signal is generated with varying amplitude and phase.
2. An overcomplete dictionary of Gabor functions is constructed.
3. **L1 regularization (Lasso)** selects a sparse subset of these functions that best represents the signal.
4. A refined approximation is obtained through a **least-squares fit** on the selected basis elements.
5. The code evaluates the reconstruction quality via metrics (e.g., mean squared error, relative error) and visualizes the time-frequency distribution of both the original and reconstructed signals.
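
A condensed sketch of steps 1–4; the toy signal, dictionary grid, window widths, and `alpha` are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import Lasso

t = np.linspace(0.0, 1.0, 256)
signal = np.exp(-((t - 0.5) ** 2) / 0.02) * np.cos(2 * np.pi * 30 * t)  # toy signal

# Overcomplete Gabor dictionary: Gaussian windows modulated by cosines
freqs = np.arange(5, 60, 5)
centers = np.linspace(0.0, 1.0, 32)
atoms = [np.exp(-((t - c) ** 2) / 0.01) * np.cos(2 * np.pi * f * t)
         for f in freqs for c in centers]
D = np.column_stack(atoms)

lasso = Lasso(alpha=1e-3, max_iter=50_000).fit(D, signal)       # L1 step: sparse selection
support = np.flatnonzero(lasso.coef_)
coef, *_ = np.linalg.lstsq(D[:, support], signal, rcond=None)   # least-squares refit
recon = D[:, support] @ coef
print(support.size, np.mean((recon - signal) ** 2))             # sparsity and MSE
```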

---

## Module 5 – Backtracking Line Search

Implementation of the **Backtracking Line Search** algorithm, a common method for finding suitable step lengths in iterative optimization. It ensures a sufficient decrease in the objective function by checking the **Armijo condition**:

`φ(s) ≤ φ(0) + α * s * φ'(0)`

where:
- `φ(s)` is the objective function at step length `s`,
- `α ∈ (0, 1)` controls the sufficient decrease condition,
- `φ'(0)` is the derivative of the objective function at `s = 0`.
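
A minimal sketch of the routine (function and variable names are illustrative):

```python
def backtracking(phi, dphi0, alpha=0.3, beta=0.5, s=1.0):
    """Shrink s until the Armijo condition phi(s) <= phi(0) + alpha*s*phi'(0) holds."""
    phi0 = phi(0.0)
    while phi(s) > phi0 + alpha * s * dphi0:
        s *= beta                  # reduce the step geometrically
    return s

# Example: line search for f(x) = x^2 at x = 3 along the descent direction d = -f'(x)
f = lambda x: x * x
x, d = 3.0, -6.0
phi = lambda s: f(x + s * d)       # objective restricted to the search ray
dphi0 = (2 * x) * d                # phi'(0) = f'(x) * d
print(backtracking(phi, dphi0))    # 0.5, which here lands exactly on the minimum
```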

---

## Module 6 – Newton's Method

Comparison and implementation of two variations of Newton's method for nonlinear optimization problems (see the sketch after this list):

1. **Classic Newton's Method**
   - Uses the gradient and Hessian matrix to iteratively minimize a function.

2. **Damped Newton's Method**
   - Enhances the classic approach by incorporating a **backtracking line search** for better convergence properties.
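
A minimal sketch of the damped variant on the Rosenbrock function; the test function, starting point, and tolerances are illustrative assumptions:

```python
import numpy as np

def f(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                     200 * (x[1] - x[0]**2)])

def hess(x):
    return np.array([[2 - 400 * x[1] + 1200 * x[0]**2, -400 * x[0]],
                     [-400 * x[0], 200.0]])

x = np.array([-1.2, 1.0])
for _ in range(100):
    g, H = grad(x), hess(x)
    d = np.linalg.solve(H, -g)       # Newton direction
    s = 1.0                          # damping: backtrack until Armijo holds (guarded)
    while s > 1e-10 and f(x + s * d) > f(x) + 1e-4 * s * (g @ d):
        s *= 0.5
    x = x + s * d
    if np.linalg.norm(g) < 1e-10:
        break
print(x)   # converges to (1, 1)
```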

---

## Module 7 – Levenberg–Marquardt Parameter Estimation

Implementation of the **Levenberg–Marquardt (LM)** algorithm to estimate parameters of various models (a fitting sketch follows the list):

1. **Sinusoidal Function (`zad1.py`)**<br>
   `y(t) = A * sin(ω * t + φ)`<br>
   Estimated Parameters: Amplitude `A`, Angular frequency `ω`, Phase shift `φ`

2. **Damped Sinusoidal Function (`zad2.py`)**<br>
   `y(t) = A * exp(-a * t) * sin(ω * t + φ)`<br>
   Estimated Parameters: Amplitude `A`, Damping coefficient `a`, Angular frequency `ω`, Phase shift `φ`

3. **First-Order Inertia (`zad3.py`)**<br>
   `y(t) = k * (1 - exp(-t / T))`<br>
   Estimated Parameters: Gain `k`, Time constant `T`

4. **Double Inertia (`zad4.py`)**<br>
   `y(t) = k * [1 - (1 / (T1 - T2)) * (T1 * exp(-t / T1) - T2 * exp(-t / T2))]`<br>
   Estimated Parameters: Gain `k`, Time constants `T1`, `T2`

5. **Second-Order Oscillatory System (`zad5.py`)**<br>
   `y(t) = k * [1 - exp(-γ * t) * (cos(β * t) + (γ / β) * sin(β * t))]`<br>
   Estimated Parameters: Gain `k`, Damping factor `γ`, Oscillation frequency `β`
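
A minimal sketch of the first fit using SciPy's LM implementation rather than the module's own solver; the synthetic data and starting point are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(p, t, y):
    A, w, phi = p
    return A * np.sin(w * t + phi) - y

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 200)
y = 2.0 * np.sin(1.5 * t + 0.3) + 0.05 * rng.standard_normal(t.size)

# LM is a local method, so a starting frequency near the truth helps
fit = least_squares(residuals, x0=[1.0, 1.4, 0.0], args=(t, y), method="lm")
print(fit.x)   # approximately [2.0, 1.5, 0.3]
```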

---

## Module 8 – Nonlinear Constrained Least Squares Optimization

Nonlinear constrained optimization algorithms using the **Augmented Lagrangian Algorithm (ALA)** combined with the **Levenberg–Marquardt (LM)** method (a minimal ALA loop is sketched after this list):

- **`zad1.py`**
Solves a 2D nonlinear least squares problem with a single nonlinear constraint using ALA and LM. Includes residual visualization.

- **`zad2.py`**
Compares the Augmented Lagrangian Algorithm (ALA) and the penalty method for a 3D constrained optimization problem, visualizing residual convergence and parameter evolution.

- **`zad3.py`**
A **Boolean Least Squares** problem minimizing `||Ax - b||^2` with elements of `x` restricted to +1 or -1. Compares brute-force and ALA solutions.
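
A minimal sketch of the ALA outer loop, folding the multiplier and penalty terms into an extended residual so each inner subproblem is a plain least-squares solve (here via SciPy's LM). The toy problem, finding the point on the unit circle closest to `(1, 2)`, is an assumption, not the data of `zad1.py`:

```python
import numpy as np
from scipy.optimize import least_squares

def f(z):                      # unconstrained residuals: distance to the point (1, 2)
    return np.array([z[0] - 1.0, z[1] - 2.0])

def g(z):                      # equality constraint: the solution must lie on the unit circle
    return z[0] ** 2 + z[1] ** 2 - 1.0

x, lam, mu = np.array([1.0, 0.0]), 0.0, 1.0
for _ in range(15):
    # extended residual: ||r_aug||^2 = ||f||^2 + lam*g + (mu/2)*g^2 + constant
    def r_aug(z):
        return np.concatenate([f(z), [np.sqrt(mu / 2.0) * g(z) + lam / np.sqrt(2.0 * mu)]])
    x = least_squares(r_aug, x, method="lm").x   # inner Levenberg-Marquardt solve
    lam += mu * g(x)                             # multiplier update
    mu *= 2.0                                    # gradually stiffen the penalty
print(x)   # approximately (1, 2) / sqrt(5), the closest point on the circle
```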

---

## Module 9 – Quasi-Newton Method

Scripts to find the minimum of a two-variable function using **quasi-Newton** optimization methods (a BFGS sketch follows the list):

- **SR1 (Symmetric Rank One)**
- **DFP (Davidon–Fletcher–Powell)**
- **BFGS (Broyden–Fletcher–Goldfarb–Shanno)**
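
A minimal BFGS sketch with an Armijo backtracking search; the test function and starting point are illustrative, not the module's actual setup:

```python
import numpy as np

def f(x):
    return np.exp(x[0] + x[1] - 1.0) + x[0] ** 2 + x[1] ** 2

def grad(x):
    e = np.exp(x[0] + x[1] - 1.0)
    return np.array([e + 2 * x[0], e + 2 * x[1]])

x = np.array([1.0, 1.0])
H = np.eye(2)                       # inverse-Hessian approximation
g = grad(x)
for _ in range(200):
    p = -H @ g                      # quasi-Newton search direction
    s = 1.0
    while f(x + s * p) > f(x) + 1e-4 * s * (g @ p):
        s *= 0.5                    # Armijo backtracking
    x_new = x + s * p
    g_new = grad(x_new)
    sk, yk = x_new - x, g_new - g
    if sk @ yk > 1e-12:             # curvature condition, otherwise skip the update
        rho = 1.0 / (sk @ yk)
        I = np.eye(2)
        H = (I - rho * np.outer(sk, yk)) @ H @ (I - rho * np.outer(yk, sk)) \
            + rho * np.outer(sk, sk)
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-8:
        break
print(x)   # approximately (-0.14, -0.14), the unique minimizer
```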

---

## Module 10 – Linear Programming (LP) Solutions using Sequential Barrier Method (SBM)

Python scripts demonstrating solutions to Linear Programming problems via the **Sequential Barrier Method (SBM)**. Results are compared with standard LP solvers (e.g., `linprog` from SciPy), as in the sketch below.
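
A compact sketch of the idea: solve a sequence of log-barrier subproblems with Newton steps while tightening the barrier parameter, then compare against `linprog`. The LP data, barrier schedule, and the feasibility-only damping are simplifying assumptions:

```python
import numpy as np
from scipy.optimize import linprog

# minimize c @ x subject to A @ x <= b (x >= 0 is encoded in A, b)
c = np.array([-1.0, -2.0])
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 0.0])

def centering_step(x, t, iters=50):
    """Newton's method on the barrier objective t*(c @ x) - sum(log(b - A @ x))."""
    for _ in range(iters):
        s = b - A @ x                                  # slacks, kept strictly positive
        g = t * c + A.T @ (1.0 / s)                    # gradient of the barrier objective
        H = A.T @ np.diag(1.0 / s ** 2) @ A            # Hessian of the barrier objective
        dx = np.linalg.solve(H, -g)
        step = 1.0
        while np.any(b - A @ (x + step * dx) <= 0):    # damp only to stay strictly feasible
            step *= 0.5
        x = x + step * dx
        if -g @ dx < 1e-10:                            # Newton decrement stopping test
            break
    return x

x, t = np.array([0.25, 0.25]), 1.0                     # strictly feasible starting point
for _ in range(8):                                     # sequential barrier (outer) loop
    x = centering_step(x, t)
    t *= 10.0                                          # tighten the barrier

print(x, linprog(c, A_ub=A, b_ub=b).x)                 # both approximately (0, 1)
```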

---

## Bibliography

1. A. Ben-Tal and A. Nemirovski. **Lectures on Modern Convex Optimization.** SIAM, 2001.

2. Stephen Boyd and Lieven Vandenberghe. **Convex Optimization.** Cambridge University Press, New York, NY, USA, 2004. Available online: [http://web.stanford.edu/~boyd/cvxbook/](http://web.stanford.edu/~boyd/cvxbook/)

3. Stephen Boyd and Lieven Vandenberghe. **Additional Exercises for Convex Optimization,** 2004. [https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook_extra_exercises.pdf](https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook_extra_exercises.pdf)

4. G.C. Calafiore and L. El Ghaoui. **Optimization Models.** Cambridge University Press, 2014.

5. E.K.P. Chong and S.H. Zak. **An Introduction to Optimization.** Wiley, 2004.

6. Ulrich Münz, Amer Mešanović, Michael Metzger, and Philipp Wolfrum. **Robust optimal dispatch, secondary, and primary reserve allocation for power systems with uncertain load and generation.** IEEE Transactions on Control Systems Technology, 26(2):475–485, 2018.

7. Course materials from the Optimization Methods course in the Master's program in Computer Science at WUT.