source/docs/programming_and_computer_usage/complexity.md
Lines changed: 22 additions & 1 deletion
@@ -175,7 +175,7 @@ The reason why worst case is generally preferred is because:
#### Linear search algorithm
- The [linear search algorithm](https://princomp.github.io/lectures/data/search#finding-a-particular-value)look for a particular value in an array by inspecting the values one after the other:
+ The [linear search algorithm](./lectures/data/search#finding-a-particular-value) looks for a particular value in an array by inspecting the values one after the other:
```{download="./code/projects/LinearSearch.zip"}
!include`snippetStart="// Example value",snippetEnd="// We can optimize this algorithm"` code/projects/LinearSearch/LinearSearch/Program.cs
@@ -205,6 +205,27 @@ Similarly, considering that the array is of size $n$, and counting how many time
Note that the space usage of both algorithms is $O(c)$, as they require only one variable if we do not copy the array.
Note, also, that both algorithms have the same worst case and average case complexity, which are the cases we are actually interested in.
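For reference, here is a minimal stand-alone sketch of a linear search (an illustration only; the project's actual `Program.cs`, included above, may differ in its details):

```csharp
using System;

// A minimal sketch of linear search (illustration only, not the included project code).
int[] values = { 12, 5, 7, 9, 1 };
Console.WriteLine(LinearSearch(values, 7));   // prints 2 (index of 7)
Console.WriteLine(LinearSearch(values, 42));  // prints -1 (not in the array)

// Returns the index of `target` in `data`, or -1 if it is absent.
// In the worst case all n cells are inspected, and only the counter `i`
// is needed as extra storage.
static int LinearSearch(int[] data, int target)
{
    for (int i = 0; i < data.Length; i++)
    {
        if (data[i] == target)
        {
            return i;
        }
    }
    return -1;
}
```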
#### Binary search algorithm
The [binary search algorithm](./lectures/data/search#binary-search) looks for a particular value in a *sorted* array by leveraging this additional information: it "jumps" to the middle of the array; if the value there is the target, it terminates; if that value is less than the target value, it keeps looking in the right half of the array, and it keeps looking in the left half of the array otherwise.
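A minimal sketch of this idea, assuming an `int[]` sorted in increasing order (the names and the example array below are only illustrative, not necessarily the version presented in the linked lecture):

```csharp
using System;

// A sketch of iterative binary search on a *sorted* int array.
int[] numbers = { 1, 3, 7, 9, 12, 20 };
Console.WriteLine(BinarySearch(numbers, 9));   // prints 3
Console.WriteLine(BinarySearch(numbers, 5));   // prints -1

// Returns the index of `target` in `sorted`, or -1 if it is absent.
static int BinarySearch(int[] sorted, int target)
{
    int low = 0;
    int high = sorted.Length - 1;
    while (low <= high)
    {
        int middle = low + (high - low) / 2; // middle of the remaining slice
        if (sorted[middle] == target)
        {
            return middle;        // found: terminate
        }
        if (sorted[middle] < target)
        {
            low = middle + 1;     // keep looking in the right half
        }
        else
        {
            high = middle - 1;    // keep looking in the left half
        }
    }
    return -1; // the slice is empty: the value is not in the array
}
```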
What is the time complexity of such an algorithm? It halves the array at every step, and we know that if the array is of size $1$, then it will terminate (either because the value was found, or because it was not in the array).
That means that, if the array is of size $n$, in the worst case:

- after $1$ step, we have an array of size $n / 2$ left to explore,
- after $2$ steps, we have an array of size $n / 4$ left to explore,
- after $3$ steps, we have an array of size $n / 8$ left to explore,
- … after $k$ steps, we have an array of size $n / 2^k$ left to explore.

Hence, we need to determine a $k$ such that $n / 2^k \leqslant 1$ (since we terminate when the array is of size $1$).
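For completeness, a sketch of how this inequality can be solved (a standard derivation, added here as an illustration):

$$\frac{n}{2^k} \leqslant 1 \iff n \leqslant 2^k \iff \log_2(n) \leqslant k,$$

so $k = \lceil \log_2(n) \rceil$ steps always suffice: in the worst case, binary search performs on the order of $\log_2(n)$ steps, that is, its time complexity is $O(\log n)$, compared to the $O(n)$ worst case of linear search.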
#### Matrix Multiplication
Consider the ["schoolbook algorithm for multiplication"](https://en.wikipedia.org/wiki/Computational_complexity_of_matrix_multiplication#Schoolbook_algorithm)