
Commit 4ebc3ca

Commit message: test
1 parent e20fd3b commit 4ebc3ca

File tree

2 files changed: +62 −14 lines


content/blog/basics-of-supervised-learning-linear-regression.md

Lines changed: 61 additions & 13 deletions
@@ -1,6 +1,6 @@
---
author: Kishore Kumar
-date: 2025-08-18 06:04:35+0530
+date: 2025-08-18 06:20:02+0530
doc: 2025-08-18 06:03:53+0530
title: Basics Of Supervised Learning - Linear Regression
topics: []
@@ -106,23 +106,71 @@ Finding the best set of parameters $\theta$ now just means finding the best valu

Consider the function $y = (x-3)^2 + 2$. We can find *a* minimum by differentiating it and setting $\frac{dy}{dx} = 0$. This gives us $\frac{dy}{dx} = \frac{d(x^2 - 6x + 9 + 2)}{dx} = 2x - 6 = 0 \implies x = 3$. At $x = 3$, $y = 2$. Taking the second derivative, we get $\frac{d^2y}{dx^2} = 2 \gt 0$, which means it is a minimum. Since this curve is concave up, it has just one minimum, and hence it is the *global* minimum.
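As a quick sanity check (a minimal sketch, not part of the original post), the analytic result above can be confirmed numerically with a central-difference derivative:

```python
# Minimal sketch: numerically confirm that y = (x-3)^2 + 2
# has a critical point at x = 3 with y = 2, and that it is a minimum.

def y(x):
    return (x - 3) ** 2 + 2

def dydx(x, h=1e-6):
    # Central-difference approximation of dy/dx
    return (y(x + h) - y(x - h)) / (2 * h)

# Slope is ~0 at x = 3, negative to the left, positive to the right,
# consistent with a minimum at (3, 2).
print(dydx(3.0), y(3.0))
print(dydx(2.0) < 0, dydx(4.0) > 0)
```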

-```desmos-graph
-y = (x-3)^2 + 2
-A = (3,2)
-```
+<div id="desmos-8a6985bc" style="width: 100%; height: 400px; margin: 20px 0; border: 1px solid #ccc;"></div>
+<script>
+(function() {
+  if (typeof Desmos === 'undefined') {
+    console.error('Desmos API not loaded');
+    return;
+  }
+  var calculator = Desmos.GraphingCalculator(
+    document.getElementById('desmos-8a6985bc'),
+    {
+      expressions: false,
+      settingsMenu: false,
+      zoomFit: true,
+      showGrid: true,
+      showXAxis: true,
+      showYAxis: true
+    }
+  );
+
+  // Set expressions from parsed content
+  calculator.setExpression({"id": "expr_0", "latex": "y = (x-3)^2 + 2", "color": "#c74440"});
+  calculator.setExpression({"id": "expr_1", "latex": "A = (3,2)", "color": "#2d70b3"});
+
+  calculator.setMathBounds({
+    left: -10, right: 10,
+    bottom: -10, top: 10
+  });
+})();
+</script>

However, this same approach isn't very feasible for more complicated functions. Sometimes solving for all roots of $\frac{dy}{dx} = 0$ is difficult (or impossible). For complex functions, the second-derivative test may be inconclusive, forcing us to examine higher-order derivatives or fall back on numerical methods. And when we're dealing with multiple variables and higher-dimensional functions, the computations can get extremely complex. So instead, people rely on iterative numerical optimization algorithms.

**Gradient Descent** is one such iterative optimization algorithm used to find the minimum of a function. We start with some random initial values for $\theta$ and repeatedly update them by taking small steps in the direction of the steepest descent of the cost function. Consider this more complex function below:
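Written out, the update step described above is the standard gradient-descent rule (standard notation rather than anything specific to this post: $\alpha$ is the learning rate and $J$ the cost function):

$$\theta_j := \theta_j - \alpha \frac{\partial J(\theta)}{\partial \theta_j} \quad \text{(updating all } j \text{ simultaneously)}$$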

-```desmos-graph
-C(w) = 0.1w^5 - 0.3w^4 - 1.2w^3 + 0.8w^2 + 1.5w
-
-w_0 = 0.5
-(w_0, C(w_0))
-
-T(w) = C(w_0) + (0.5w_0^4 - 1.2w_0^3 - 3.6w_0^2 + 1.6w_0 + 1.5)(w - w_0)
-```
+<div id="desmos-48dbb6d3" style="width: 100%; height: 400px; margin: 20px 0; border: 1px solid #ccc;"></div>
+<script>
+(function() {
+  if (typeof Desmos === 'undefined') {
+    console.error('Desmos API not loaded');
+    return;
+  }
+  var calculator = Desmos.GraphingCalculator(
+    document.getElementById('desmos-48dbb6d3'),
+    {
+      expressions: false,
+      settingsMenu: false,
+      zoomFit: true,
+      showGrid: true,
+      showXAxis: true,
+      showYAxis: true
+    }
+  );
+
+  // Set expressions from parsed content
+  calculator.setExpression({"id": "expr_0", "latex": "C(w) = 0.1w^5 - 0.3w^4 - 1.2w^3 + 0.8w^2 + 1.5w", "color": "#c74440"});
+  calculator.setExpression({"id": "expr_1", "latex": "w_0 = 0.5", "color": "#2d70b3"});
+  calculator.setExpression({"id": "point_2", "latex": "(w_0, C(w_0))"});
+  calculator.setExpression({"id": "expr_3", "latex": "T(w) = C(w_0) + (0.5w_0^4 - 1.2w_0^3 - 3.6w_0^2 + 1.6w_0 + 1.5)(w - w_0)", "color": "#fa7e19"});
+
+  calculator.setMathBounds({
+    left: -10, right: 10,
+    bottom: -10, top: 10
+  });
+})();
+</script>

Given any point $w_0$, we can answer *"Which direction should I move in to reduce the value of the function?"* by computing the derivative (slope) of the function at that point $w_0$. If the slope is positive, we should move left to reduce the value of the function; if it's negative, we should move right. If we do this repeatedly, we'll eventually converge to some **local minimum** of the function. The visualization that really sells this idea is that of a ball rolling down 2D hills (the curves generated by the function). If we generate *enough* random initial points (or balls) and perform this procedure, we should eventually hit a very good local minimum.
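The procedure above can be sketched in a few lines (a minimal illustration on the example cost $C(w)$ from the graph; the learning rate and step count are arbitrary choices, not from the post):

```python
# Minimal gradient-descent sketch on the example cost C(w) from the graph.
# Learning rate and iteration count are illustrative choices.

def C(w):
    return 0.1 * w**5 - 0.3 * w**4 - 1.2 * w**3 + 0.8 * w**2 + 1.5 * w

def dC(w):
    # Analytic derivative of C, the same slope used for the tangent line T(w)
    return 0.5 * w**4 - 1.2 * w**3 - 3.6 * w**2 + 1.6 * w + 1.5

def gradient_descent(w0, lr=0.01, steps=2000):
    w = w0
    for _ in range(steps):
        w -= lr * dC(w)  # positive slope -> step left, negative slope -> step right
    return w

# Different starting points ("balls") can roll into different local minima.
for w0 in (0.5, 3.0):
    w = gradient_descent(w0)
    print(f"start {w0:+.1f} -> w = {w:.3f}, C(w) = {C(w):.3f}")
```

Note that on an unbounded quintic like this, a start far enough to the left would roll off to $-\infty$, which is why the starting points matter.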

hugo.toml

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@ defaultMarkdownHandler = "goldmark"
description = "If we keep holding onto yesterday, what are we going to be tomorrow?"
keywords = ["kishore", "kumar", "akcube", "iiit", "iiith", "iiit-h", "international institute of information technology", "ICPC", "competitive programming", "optimization", "performance", "algorithms", "finance", "math", "c++", "cpp", "research", "quant", "finance", "ml", "deep learning", "reinforcement learning", "theory"]
customCSS = []
-customJS = []
+customJS = ["https://www.desmos.com/api/v1.11/calculator.js"]
dateFormat = "2 January 2006"
emphasisWithDots = true
since = "2024"

0 commit comments