Lecture 3:
Computational Complexity & Searching
Sookyung Kim
Algorithms & Complexity
What is an Algorithm?
Efficiency of an Algorithm
Would you rather have a fast, efficient computer program or a slow one?
Of course, we always expect our computers to do their jobs as efficiently as possible!
Computational Complexity
Measuring Time Complexity
Elementary School Algorithm
Example: Integer multiplication
   5678
×  1234
-------
  22712
 17034
11356
5678
-------
7006652
Elementary School Algorithm
Example: Integer multiplication of two N-digit numbers
For each row: N digit multiplications and (up to) N-1 additions.
There are N rows, so producing all the rows takes N(2N-1) operations.
Summing the rows takes (up to) N² more additions.
Total operations: N(2N-1) + N² ≤ 3N²
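The operation count above can be checked empirically. Below is a minimal sketch (my own illustration, not from the lecture) of grade-school multiplication that counts digit multiplications; for two N-digit numbers it performs exactly N² of them, consistent with the ≤ 3N² bound once the row additions are included.

```python
def long_multiply(a, b):
    """Grade-school multiplication of non-negative integers.

    Returns (product, digit_multiplications). Row additions are
    left to Python's +, so only multiplications are counted here.
    """
    ops = 0
    total = 0
    # Walk the digits least-significant first for both operands.
    for shift, d in enumerate(int(x) for x in reversed(str(b))):
        row = 0
        for pos, ad in enumerate(int(x) for x in reversed(str(a))):
            row += ad * d * 10 ** pos   # one digit multiplication
            ops += 1
        total += row * 10 ** shift      # shift the row and accumulate
    return total, ops

product, ops = long_multiply(5678, 1234)
print(product, ops)  # 7006652 16 -> N^2 = 4^2 digit multiplications
```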
Software Engineer’s Example
Let’s denote len(list) as N.

def linear_search(list, value):
    for i in range(len(list)):
        if list[i] == value:
            return i
    return -1

Operation | Count
==        | 1 to N

def selection_sort(list):
    for i in range(len(list)):
        smallest = i
        for j in range(i+1, len(list)):
            if list[j] < list[smallest]:
                smallest = j
        list[i], list[smallest] = list[smallest], list[i]

Operation     | Count
smallest = i  | N
<             | (N² - N)/2
smallest = j  | 0 to (N² - N)/2
swap          | N
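To verify the comparison count in the table, here is a hypothetical instrumented variant of selection_sort (my addition, not part of the slides) that tallies the < comparisons; for any input of length N it performs exactly (N² - N)/2 of them, regardless of the input order.

```python
def selection_sort_counted(lst):
    """Selection sort that also returns the number of `<` comparisons."""
    comparisons = 0
    for i in range(len(lst)):
        smallest = i
        for j in range(i + 1, len(lst)):
            comparisons += 1            # one `<` comparison per inner step
            if lst[j] < lst[smallest]:
                smallest = j
        lst[i], lst[smallest] = lst[smallest], lst[i]
    return comparisons

data = [5, 3, 8, 1, 9, 2, 7, 4, 6, 0]   # N = 10
count = selection_sort_counted(data)
print(data, count)  # sorted list, 45 == (10*10 - 10) / 2
```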
Big O Notation
How can we characterize time complexity more formally and simply?
Simplification of Time Complexity
1. We care only about the worst-case performance!
← because we do not know what input data we will get in advance.
Operation     | Count
smallest = i  | N
<             | (N² - N)/2
smallest = j  | 0 to (N² - N)/2
swap          | N
Simplification of Time Complexity
2. Focus only on a single operation with the highest order of growth (=most expensive).
Operation     | Count
smallest = i  | N
<             | (N² - N)/2
smallest = j  | (N² - N)/2
swap          | N
Simplification of Time Complexity
3. Remove lower order terms.
← They do not really matter, since the higher order one dominates.
Operation     | Count
smallest = i  | Ignored
<             | Ignored
smallest = j  | (N² - N)/2
swap          | Ignored
Simplification of Time Complexity
4. Remove constants.
← We have already thrown away information at step 2. At this stage, constants are not meaningful.
Worst-case order of growth: N²
Operation     | Count
smallest = i  | Ignored
<             | Ignored
smallest = j  | N²/2
swap          | Ignored
Formal Definition
Example
Example (cont'd)
Exercise (cont'd)
O(N²)
O(1)
O(N²)
O(N)
O(2ᴺ)
What is Important for Asymptotic Analysis?
500N
2N²
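The point of comparing 500N with 2N² is that constants stop mattering once N is large enough. A quick numeric check (my own illustration, not from the slides) shows the crossover at N = 250:

```python
# 500N grows with a big constant; 2N^2 grows with a small one.
f = lambda n: 500 * n      # linear, large constant
g = lambda n: 2 * n * n    # quadratic, small constant

print(f(100), g(100))    # 50000 20000   -> 500N is larger for small N
print(f(250), g(250))    # 125000 125000 -> crossover at N = 250
print(f(1000), g(1000))  # 500000 2000000 -> 2N^2 dominates from here on
```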
Asymptotic Analysis
Homework – Problem 1
Homework – Problem 2
Searching
Searching Problem
Given a list of objects and a single target, locate it if it exists.
Example:
Where is 4? → At index 6.
Where is 17? → It does not exist.
How to solve it?
What is the most efficient way?
Index | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
Value | -7 | 15 | 2 | 6 | -1 | 5 | 4 | 10 | -4 | 21 |
Linear Search
Time complexity?
Can we do better?
def linear_search(list, value):
    for i in range(len(list)):
        if list[i] == value:
            return i
    return -1
O(N)
Searching Problem
Now, let’s consider an additional condition:
Given a sorted list of objects and a single target, locate it if it exists.
Example:
Where is 4? → At index 4.
Where is 17? → It does not exist.
Can we use the linear search method to solve this version?
Can we solve it more efficiently?
Index | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
Value | -7 | -4 | -1 | 2 | 4 | 5 | 6 | 10 | 15 | 21 |
Yes!
Analogy to Dictionary Search
N/2, where N is the number of words in the dictionary!
Binary Search
15 should appear on the right side of this!
→ We can safely ignore all values on the left.
Index | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
Value | -7 | -4 | -1 | 2 | 4 | 5 | 6 | 10 | 15 | 21 |
Binary Search
Again, 15 should appear on the right side of this!
→ We can safely ignore all values on the left.
Okay, we found it!
Index | 5 | 6 | 7 | 8 | 9 |
Value | 5 | 6 | 10 | 15 | 21 |
Index | 8 | 9 |
Value | 15 | 21 |
Binary Search
14 should be on the left of this.
→ We can safely ignore all values on the right.
Index | 8 | 9 |
Value | 15 | 21 |
Index | (empty)
Value | (empty)
→ The remaining range is empty, so 14 is not in the list.
Binary Search
first, last, and mid mark the current search range.
Example: value = 7? The loop runs for 5 iterations and then returns -1: 7 is not in the list.
def binary_search(list, value):
    first = 0
    last = len(list) - 1
    while first <= last:
        mid = (first + last) // 2
        if list[mid] == value:
            return mid
        elif list[mid] < value:
            first = mid + 1
        else:
            last = mid - 1
    return -1
Index | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
Value | -7 | -4 | -1 | 2 | 4 | 5 | 6 | 10 | 15 | 21 |
Time Complexity
O(log N)
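One way to see the O(log N) bound concretely is to count loop iterations. This instrumented variant (my addition, not in the slides) should never iterate more than floor(log₂ N) + 1 times, because each iteration at least halves the range:

```python
import math

def binary_search_steps(lst, value):
    """Binary search that also reports how many loop iterations ran."""
    first, last, steps = 0, len(lst) - 1, 0
    while first <= last:
        steps += 1
        mid = (first + last) // 2
        if lst[mid] == value:
            return mid, steps
        elif lst[mid] < value:
            first = mid + 1
        else:
            last = mid - 1
    return -1, steps

lst = list(range(1024))                    # N = 1024, log2(N) = 10
idx, steps = binary_search_steps(lst, -1)  # worst case: value absent
print(idx, steps, steps <= math.floor(math.log2(len(lst))) + 1)
```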
Additional Problems
Recursion
Recursive Algorithm
Key Components in Recursive Algorithm
Binary Search: Recursive Version
Each recursive call searches the “same problem of different size.”
Up to O(log N) recursive calls.
Each call takes O(1): computing mid, comparison with value, return output.
def binary_search(list, value, first, last):
    if first > last:
        return -1
    else:
        mid = (first + last) // 2
        if list[mid] == value:
            return mid
        elif list[mid] < value:
            return binary_search(list, value, mid+1, last)
        else:
            return binary_search(list, value, first, mid-1)

Initial call: binary_search(list, value, 0, len(list) - 1)
Time complexity?
O(log N)
Reversing a String
Non-recursive version:

def reverse(str):
    output = ""
    for i in range(len(str)):
        output += str[-i-1]
    return output

Recursive version:

def reverse(str):
    if not str:
        return str
    else:
        return str[-1] + reverse(str[:-1])
        # equivalent alternative: return reverse(str[1:]) + str[0]
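A quick usage check of the recursive version (restated here so the snippet runs on its own): each call peels off one character, so reversing an N-character string makes N recursive calls. Note that each slice s[:-1] copies the remaining characters, so this sketch actually performs O(N²) character copies overall.

```python
def reverse(s):
    # Base case: the empty string reverses to itself.
    if not s:
        return s
    # Last character first, then recursively reverse the rest.
    return s[-1] + reverse(s[:-1])

print(reverse("recursion"))                       # noisrucer
print(reverse("recursion") == "recursion"[::-1])  # True
```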
Combination
def comb(n, r):
    if r == 0 or r == n:
        return 1
    elif r > n:
        return 0
    else:
        return comb(n-1, r-1) + comb(n-1, r)
This implementation is not efficient. Why?
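The inefficiency comes from overlapping subproblems: comb(n-1, r-1) and comb(n-1, r) recompute the same smaller calls over and over, so the plain recursion takes exponential time. One standard fix (a sketch, not necessarily the lecture's intended answer) is to memoize the recursion with functools.lru_cache:

```python
from functools import lru_cache
import math

@lru_cache(maxsize=None)
def comb(n, r):
    # Same recurrence as before; the cache ensures each (n, r) pair
    # is computed only once, giving O(n * r) time instead of exponential.
    if r == 0 or r == n:
        return 1
    elif r > n:
        return 0
    return comb(n - 1, r - 1) + comb(n - 1, r)

print(comb(20, 10))                       # 184756
print(comb(20, 10) == math.comb(20, 10))  # True
```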
Recursion Summary
Homework 1 – Binary Search
Homework 2 – Binary Search
Homework 3 – Binary Search
Homework 4 – Recursion
Homework 5 – Recursion
Homework 6 – Recursion