CSCI100: Fall 2021
Lecture 18:
Context
Tourist Walks
Which of the 4 algorithms produced the best path?
Depends on what you care about!
Evaluating Program Runtime
Three approaches we can take to measure runtime:
1. Time the program as it runs
2. Count the number of operations it performs
3. Use an abstract notion of order of growth
total = 0 => 1 op
range => loop x times
every time increment => 1 op
inside for:
total + i => 1 op
store into total => 1 op
return => 1 op
total: 1 + 3x + 1
def mysum(x):
    total = 0
    for i in range(x):
        total = total + i
    return total
Evaluating Program Runtime
Three approaches we can take to measure runtime:
Today we focus on #3...
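For contrast, approach #1 (timing the program directly) might look like the sketch below, using the `mysum` function from the operation-counting slide. Timing results vary machine to machine and run to run, which is one reason to prefer order of growth:

```python
import time

# mysum from the operation-counting slide
def mysum(x):
    total = 0
    for i in range(x):
        total = total + i
    return total

start = time.perf_counter()
result = mysum(100_000)
elapsed = time.perf_counter() - start
print(f"mysum(100000) = {result}, took {elapsed:.6f} s")
```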
Big O Notation
Counting Operation Cons
Counting operations seems nice, but…
Is there some way we can simplify the analysis?
Big O Notation: O()
Slight tweak to counting operations... we will leave out any multiplicative constants and lower-order additive terms. This is the “order of growth” of the function.
We often call this “Big O notation” of the runtime. Measures upper bound on order of growth
Used to describe worst case
Order of Growth
Dropping all multiplicative constants and lower order terms, these are all O(n).
y = n
y = n + 1
y = 2n
y = 0.5n + 1
Lower-Order Terms
The “order” of a term is how quickly it grows. The most common ones in order from lower order to higher order
Name | Function | Examples |
constant | 1 | 1, 5, 10 |
logarithmic | log(n) | log(2n), 3 log(n) |
linear | n | 4n, 0.5n, n |
log-linear | n log(n) | 3n log(n), n log(0.5n) |
polynomial | n^c | 2n^2, 5n^3 |
exponential | c^n | 2^n, 5^(n+1) |
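The ordering in this table can be sanity-checked by evaluating one representative from each class at growing n (a sketch; the exact constants don't matter, only which column pulls ahead):

```python
import math

# One representative from each growth class, evaluated at growing n.
# The larger n gets, the more the table's ordering dominates.
for n in [10, 100, 1000]:
    print(f"n={n:>4}  log2(n)={math.log2(n):6.1f}  n={n:>5}  "
          f"n*log2(n)={n * math.log2(n):9.1f}  n^2={n**2:>8}  "
          f"2^n has {len(str(2**n))} digits")
```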
Lower “order”
Higher “order”
Orders of Growth
Simplification Examples
Drop lower order terms and multiplicative factors
Focus on dominant term: term that will increase the fastest
2n^2 + 2n + 2 => dominant term 2n^2 => O(n^2)
10^1000 + 10n^3 + 100n => dominant term 10n^3 => O(n^3)
log(n) + n + 4 => dominant term n => O(n)
0.0001 * n * log(n) + 300n => dominant term 0.0001 * n * log(n) => O(n log n)
2n^30 + 3^n => dominant term 3^n => O(3^n)
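Plugging a large n into the first example shows why only the dominant term survives the simplification:

```python
# At large n the dominant term accounts for almost the entire value,
# which is why Big O keeps only that term.
n = 10_000
full = 2 * n**2 + 2 * n + 2      # the first example above
dominant = 2 * n**2
print(dominant / full)           # ratio is very close to 1
```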
Orders of Growth
The “Order of Growth” of a program looks at the largest factors in the runtime (which part contributes the most to the runtime when input size gets very big).
Doesn’t need to be precise: “order of”, not “exact”, growth
Properties:
Want upper bound (worst case) on growth as function of input size
Exact Steps vs O()
num1 = n * 2 => 2 ops
num2 = n + 6 => 2 ops
num1 + num2 => 1 op
return => 1 op
total: 6
(Exact) Number of Operations
def mystery(n):
    num1 = n * 2
    num2 = n + 6
    return num1 + num2
Take the number of operations, and drop multiplicative constants and lower order terms
O(6) = O(1)
Constant runtime
Big O Runtime
Law of Addition
Sequential statements, add
O(f(n)) + O(g(n)) = O(f(n) + g(n))
Example:
def printing(items, digits):
for item in items:
print(item)
for digit in digits:
print(digit)
O(n)
O(n)
O(n) + O(n)
= O(n + n)
= O(2n)
= O(n)
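The additive count can be confirmed by tallying loop passes. The sketch below is a hypothetical instrumented copy of printing() that counts loop bodies instead of printing, to expose the n + n = 2n pattern:

```python
def count_prints(items, digits):
    # Hypothetical instrumentation: count each loop body that would
    # have called print(), one sequential loop after the other.
    count = 0
    for item in items:
        count += 1
    for digit in digits:
        count += 1
    return count

n = 50
print(count_prints(range(n), range(n)))  # 2n = 100 passes -> O(n)
```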
Exact Steps vs O()
answer = 1 => 1 op
while => loop n times
n > 1 => 1 op
inside while:
answer * n => 1 op
store into answer => 1 op
n - 1 => 1 op
store into n => 1 op
return => 1 op
total: 1 + 5n + 1
(Exact) Number of Operations
def fact_iter(n):
    answer = 1
    while n > 1:
        answer = answer * n
        n = n - 1
    return answer
Take the number of operations, and drop multiplicative constants and lower order terms
O(1 + 5n + 1) = O(5n + 2) = O(n)
Linear runtime
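A quick check that the loop body really runs about n times (a sketch: an instrumented copy of the slide's fact_iter that counts loop passes rather than individual operations):

```python
import math

def fact_iter_counted(n):
    # Instrumented copy of fact_iter: also counts loop passes.
    answer = 1
    iterations = 0
    while n > 1:
        answer = answer * n
        n = n - 1
        iterations += 1
    return answer, iterations

value, iters = fact_iter_counted(10)
print(value, iters)  # 10! after n - 1 = 9 passes -> linear in n
```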
Big O Runtime
Law of Multiplication
Used with nested loops
O(f(n)) * O(g(n)) = O(f(n) * g(n))
Example:
def printing(grid):
    for row in range(len(grid)):
        for col in range(len(grid[row])):
            print(grid[row][col])
O(n)
O(n) * O(n)
= O(n * n)
= O(n^2)
n loops, each O(n)
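The multiplication law can be verified the same way: count how many times the inner loop body runs. The counter function below is hypothetical instrumentation, not part of the slide code:

```python
def count_cells(n):
    # Each of the n outer passes triggers an O(n) inner loop,
    # so the body runs n * n times in total.
    count = 0
    for row in range(n):
        for col in range(n):
            count += 1
    return count

print(count_cells(7))  # 49 = 7 * 7 -> O(n^2)
```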
Exact Steps vs O()
for i in range(n) => n times
for j in range(n) => n times
print(str(i) + str(j)) => 4 ops
==> 4n^2 ops
print(“---”) => 1 op
for k in range(n) => n times
print(k + 2) => 2 ops
==> 2n ops
total: 4n^2 + 1 + 2n
(Exact) Number of Operations
def mystery(n):
    for i in range(n):
        for j in range(n):
            print(str(i) + str(j))
    print("------")
    for k in range(n):
        print(k + 2)
Take the number of operations, and drop multiplicative constants and lower order terms
O(4n^2 + 1 + 2n) = O(n^2 + 1 + n)
= O(n^2)
Quadratic runtime
Big O Runtime
What’s O() Measuring?
Practical Use of Big O
Computing Big O Runtime
Steps:
Notice that because you drop multiplicative constants, you don’t need to worry about whether to count something as 1 or 2 or 5 operations. 1 == 2 == 5: they are all O(1) (constant time)
Comparing Efficiency of Two Algorithms
Steps:
Which Input to Evaluate Function Efficiency?
Express efficiency in terms of input size, so need to decide what input is
def search_for_element(lst, element):
    for item in lst:
        if item == element:
            return True
    return False
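For this search, the worst case is the element not being in the list at all, so every one of the n items gets checked before returning False:

```python
def search_for_element(lst, element):
    for item in lst:
        if item == element:
            return True
    return False

nums = list(range(1000))
print(search_for_element(nums, 0))    # best case: found on the 1st check
print(search_for_element(nums, -1))   # worst case: all n items checked -> O(n)
```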
Terminology Sidenote
Sometimes, you’ll hear Big O / Order of Growth also called “asymptotic runtime”.
An asymptote is a line that approaches a curve but doesn’t meet it.
This is because the Big O runtime of a program is the curve that most closely approaches the actual runtime curve as N gets very large.
Asymptote: a- (not) + sym- (together) + ptote (fall) => “not falling together”, i.e. not meeting
Common Operation Runtimes
Concatenate +: O(n + k)
sentence = 'I think'
sentence = sentence + ' therefore I am'
The + operation is O(n + k), where n is the size of the original string and k is the size of the string added to it
Why is + operation O(n + k)?
Computer memory allocates enough space for both strings
Original string gets copied over -> O(n)
String to concatenate gets written -> O(k)
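Because each + copies both strings, building a long string by repeated concatenation pays that O(n + k) cost on every step; str.join sizes the result once and copies each piece once. A small sketch of the two approaches:

```python
words = ['ab'] * 5

# Repeated +: each step copies the whole string built so far
# (O(n + k) per concatenation), so the copying cost adds up.
built = ''
for w in words:
    built = built + w

# str.join computes the total size once and copies each piece once.
joined = ''.join(words)
print(built == joined, joined)
```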
Space Complexity
Space Complexity
The same way we’ve been thinking about time complexity, we can analyze the space complexity of a particular solution.
Space Complexity refers to how much additional space in memory a program uses when it runs.
For time complexity, our unit was “operations”. For space complexity, we think about the number of single “units” written to memory.
Space Complexity - Big O
We talk about space complexity in terms of Big O as well.
def mystery(n):
    num1 = n * 2
    num2 = n + 6
    return num1 + num2
Creates 2 integers in memory
O(2) = O(1), so constant space complexity
Space Complexity - Big O
We talk about space complexity in terms of Big O as well.
def mystery1(nums, item):
    length = len(nums)
    new_list = []
    for num in nums:
        if num == item:
            break
        new_list.append(num)
    return new_list
In the worst case, creates a new list with n items in it (where n is the length of the given list `nums`), and 1 new integer variable in memory.
O(n + 1) = O(n), linear space complexity
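The worst case for mystery1 is `item` never matching, so every element of `nums` is copied into the new list; the best case breaks immediately and copies nothing:

```python
def mystery1(nums, item):
    length = len(nums)       # the 1 extra integer in memory
    new_list = []
    for num in nums:
        if num == item:
            break
        new_list.append(num)
    return new_list

nums = [1, 2, 3, 4, 5]
print(len(mystery1(nums, 99)))  # worst case: all 5 copied -> O(n) space
print(len(mystery1(nums, 1)))   # best case: empty list
```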
Complexity Practice
We’ll see much more complexity practice in classwork and lab this week.