1 of 36

CSCI100: Fall 2021

Lecture 18:

  • Time Complexity
    • Orders of Growth
    • Big O Notation
  • Space Complexity

2 of 36

Context

3 of 36

Tourist Walks

Which of the 4 algorithms produced the best path?

Depends on what you care about!

4 of 36

Evaluating Program Runtime

Three approaches we can take to measure runtime:

  • Measure with a timer
  • Count number of operations
  • Order of growth: abstractly describe behavior of algorithm

5 of 36

Evaluating Program Runtime

Three approaches we can take to measure runtime:

  • Measure with a timer
  • Time the program using an accurate stopwatch.
  • Compare how long 2 different algorithms take to run.
  • Problems:
    • Depends on the hardware the code is running on
    • Varies from run-to-run (based on available memory, etc.)
  • Count number of operations
  • Order of growth: abstractly describe behavior of algorithm
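The timer approach can be sketched with Python's `time.perf_counter` (a minimal sketch; `mysum` here is just a stand-in workload):

```python
import time

def mysum(x):
    """Stand-in workload: sum the integers 0..x-1 with a loop."""
    total = 0
    for i in range(x):
        total = total + i
    return total

# Stopwatch-style measurement: record the clock before and after the call.
start = time.perf_counter()
mysum(100000)
elapsed = time.perf_counter() - start

print(f"mysum(100000) took {elapsed:.6f} seconds")
```

Running this twice rarely prints the same number, which is exactly the run-to-run variation noted above.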

6 of 36

Evaluating Program Runtime

Three approaches we can take to measure runtime:

  • Measure with a timer
  • Count number of operations
  • Assume each “operation” takes 1 unit of time, and count the number of operations
  • The count is expressed in terms of a variable “n”, the size of the input
  • Pros: takes into account the size of the input
  • Cons: What counts as an “operation”? Answers get unruly, with many terms
  • Order of growth: abstractly describe behavior of algorithm

total = 0 => 1 op

range => loop x times

every time increment => 1 op

inside for:

total + i => 1 op

store into total => 1 op

return => 1 op

total: 1 + 3x + 1

def mysum(x):
    total = 0
    for i in range(x):
        total = total + i
    return total
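One way to sanity-check the 1 + 3x + 1 count is to instrument the function with an explicit operation counter (a sketch; what counts as one “operation” is a modeling choice):

```python
def mysum_counted(x):
    """mysum with a counter for the "operations" tallied above."""
    ops = 0
    total = 0
    ops += 1          # total = 0
    for i in range(x):
        ops += 1      # loop variable increments
        ops += 1      # total + i
        ops += 1      # store into total
        total = total + i
    ops += 1          # return
    return total, ops

total, ops = mysum_counted(10)
print(total, ops)  # 45 32, i.e. ops = 1 + 3*10 + 1
```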

7 of 36

Evaluating Program Runtime

Three approaches we can take to measure runtime:

  • Measure with a timer
  • Count number of operations
  • Order of growth: abstractly describe behavior of algorithm

Today we focus on #3...

8 of 36

Big O Notation

9 of 36

Counting Operation Cons

Counting operations seems nice, but…

  • We end up getting long, complicated functions (300x^2 + 10x + 15)
    • We really only care about what happens when the input is large
  • What should we count as an operation?

Is there some way we can:

  • Count operations but not worry about small variations
  • Focus on performance when the problem size gets arbitrarily large

10 of 36

Big O Notation: O()

Slight tweak to counting operations...we will leave out any multiplicative or lower-order additive terms. This is the “order of growth” of the function.

We often call this the “Big O notation” of the runtime. It measures an upper bound on the order of growth.

Used to describe worst case

  • Occurs often and is bottleneck when program runs
  • Express rate of program growth relative to input size
  • Evaluate algorithm, not machine / implementation

11 of 36

Order of Growth

y = n

y = n + 1

y = 2n

y = 0.5n + 1

Multiplicative Constant

12 of 36

Order of Growth

y = n

y = n + 1

y = 2n

y = 0.5n + 1

Lower Order Terms

13 of 36

Order of Growth

Dropping all multiplicative constants and lower order terms, these are all O(n).

y = n

y = n + 1

y = 2n

y = 0.5n + 1

14 of 36

Lower-Order Terms

The “order” of a term is how quickly it grows. The most common ones, in order from lower order to higher order:

Name          Function    Examples
constant      1           1, 5, 10
logarithmic   log(n)      log(2n), 3log(n)
linear        n           4n, 0.5n, n
log-linear    n log(n)    3n log(n), n log(0.5n)
polynomial    n^c         2n^2, 5n^3
exponential   c^n         2^n, 5^n + 1

(Lower “order” at the top, higher “order” at the bottom.)

16 of 36

Orders of Growth

17 of 36

Simplification Examples

Drop lower order terms and multiplicative factors

Focus on dominant term: term that will increase the fastest

2n^2 + 2n + 2 => dominant term 2n^2 => O(n^2)

10^1000 + 10n^3 + 100n => dominant term 10n^3 => O(n^3)

log(n) + n + 4 => dominant term n => O(n)

0.0001 * n * log(n) + 300n => dominant term 0.0001 * n * log(n) => O(n log n)

2n^30 + 3^n => dominant term 3^n => O(3^n)

18 of 36

Orders of Growth

The “Order of Growth” of a program looks at the largest factors in the runtime (which part contributes the most to the runtime when input size gets very big).

Doesn’t need to be precise: “order of”, not “exact”, growth

Properties:

  • Evaluate program’s efficiency when input is very big
  • Express growth of program’s runtime as input size grows
  • Put upper bound on growth

Want upper bound (worst case) on growth as function of input size

19 of 36

Exact Steps vs O()

num1 = n * 2 => 2 ops

num2 = n + 6 => 2 ops

num1 + num2 => 1 op

return => 1 op

total: 6

(Exact) Number of Operations

def mystery(n):
    num1 = n * 2
    num2 = n + 6
    return num1 + num2

Take the number of operations, and drop multiplicative constants and lower order terms

O(6) = O(1)

Constant runtime

Big O Runtime

20 of 36

Law of Addition

Sequential statements, add

O(f(n)) + O(g(n)) = O(f(n) + g(n))

Example:

def printing(items, digits):
    for item in items:
        print(item)
    for digit in digits:
        print(digit)

Each loop is O(n), so the total is:

O(n) + O(n) = O(n + n) = O(2n) = O(n)
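Counting iterations directly confirms the addition law (`printing_counted` is a hypothetical helper mirroring `printing`, with the prints replaced by a counter):

```python
def printing_counted(items, digits):
    """Count the iterations of two sequential loops."""
    iterations = 0
    for item in items:     # first loop: len(items) iterations
        iterations += 1
    for digit in digits:   # second loop: len(digits) iterations
        iterations += 1
    return iterations

# With both inputs of size n = 5, total work is n + n = 2n iterations.
print(printing_counted(range(5), range(5)))  # 10
```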

21 of 36

Exact Steps vs O()

answer = 1 => 1 op

while => loop n times

n > 1 => 1 op

inside while:

answer * n => 1 op

store into answer => 1 op

n - 1 => 1 op

store into n => 1 op

return => 1 op

total: 1 + 5n + 1

(Exact) Number of Operations

def fact_iter(n):
    answer = 1
    while n > 1:
        answer = answer * n
        n = n - 1
    return answer

Take the number of operations, and drop multiplicative constants and lower order terms

O(1 + 5n + 1) = O(5n + 2) = O(n)

Linear runtime

Big O Runtime

22 of 36

Law of Multiplication

Used with nested loops

O(f(n)) * O(g(n)) = O(f(n) * g(n))

Example:

def printing(grid):
    for row in range(len(grid)):
        for col in range(len(grid[row])):
            print(grid[row][col])

The outer loop runs n times, and each iteration does O(n) work, so the total is:

O(n) * O(n) = O(n * n) = O(n^2)

n loops, each O(n)
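The multiplication law can be checked the same way, with a counter in place of the print (a sketch; `nested_counted` is a hypothetical helper):

```python
def nested_counted(grid):
    """Count iterations of the nested loop over a grid."""
    iterations = 0
    for row in range(len(grid)):
        for col in range(len(grid[row])):
            iterations += 1
    return iterations

n = 4
grid = [[0] * n for _ in range(n)]
print(nested_counted(grid))  # 16, i.e. n * n
```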

23 of 36

Exact Steps vs O()

for i in range(n) => n times

for j in range(n) => n times

print(str(i), str(j)) => 4 ops

==> 4n^2 ops

print(“---”) => 1 op

for k in range(n) => n times

print(k + 2) => 2 ops

==> 2n ops

total: 4n^2 + 1 + 2n

(Exact) Number of Operations

def mystery(n):
    for i in range(n):
        for j in range(n):
            print(str(i), str(j))
    print("------")
    for k in range(n):
        print(k + 2)

Take the number of operations, and drop multiplicative constants and lower order terms

O(4n^2 + 1 + 2n) = O(n^2 + 1 + n)

= O(n^2)

Quadratic runtime

Big O Runtime

24 of 36

What’s O() Measuring?

  • Amount of time needed grows as size of input, n, to problem grows
  • Want to know asymptotic behavior as size of problem gets large
  • Focus on term that grows most rapidly in sum of terms
  • Ignore multiplicative and additive constants

25 of 36

Practical Use of Big O

26 of 36

Computing Big O Runtime

Steps:

  • Count the number of operations
  • Drop all multiplicative constants and keep only highest order term.

Notice that because you drop multiplicative constants, you don’t need to worry about whether to count something as 1 or 2 or 5 operations: 1, 2, and 5 are all O(1) (constant time).
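One rough empirical check of an order of growth is a doubling experiment: for a linear algorithm, doubling n should roughly double the runtime (a sketch; real timings are noisy, so treat the ratio loosely):

```python
import time

def mysum(x):
    total = 0
    for i in range(x):
        total = total + i
    return total

def time_call(f, n):
    """Return how long f(n) takes, in seconds."""
    start = time.perf_counter()
    f(n)
    return time.perf_counter() - start

t1 = time_call(mysum, 200_000)
t2 = time_call(mysum, 400_000)
print(f"ratio: {t2 / t1:.2f}")  # roughly 2 for a linear algorithm
```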

27 of 36

Comparing Efficiency of Two Algorithms

Steps:

  • Compute Big O Runtime of both algorithms
  • Compare the orders: the algorithm with the higher-order Big O runtime is slower for large inputs.

28 of 36

Which Input to Evaluate Function Efficiency?

We express efficiency in terms of input size, so we need to decide what the input is:

  • Could be integer: my_sum(x)
  • Could be length of list: sum_list(lst)
  • Decide when multiple parameters: search_for_element(lst, element)

def search_for_element(lst, element):
    for item in lst:
        if item == element:
            return True
    return False
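Since Big O describes the worst case, what matters for `search_for_element` is the run where the element is absent and every item gets checked. A counter-instrumented sketch (`search_counted` is a hypothetical helper):

```python
def search_counted(lst, element):
    """search_for_element with a counter for comparisons made."""
    checks = 0
    for item in lst:
        checks += 1
        if item == element:
            return True, checks
    return False, checks

print(search_counted([3, 1, 4, 1, 5], 3))  # (True, 1): best case, found first
print(search_counted([3, 1, 4, 1, 5], 9))  # (False, 5): worst case, n checks
```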

29 of 36

Terminology Sidenote

Sometimes, you’ll hear Big O / Order of Growth also called “asymptotic runtime”.

An asymptote is a line that approaches a curve but doesn’t meet it.

This is because the Big O runtime of a program is the curve that most closely approaches the actual runtime curve as n gets very large.

Asymptote: a- (not) + sym- (together) + ptote (falling) = “not falling together”, i.e. not meeting.

30 of 36

Common Operation Runtimes

  • List or string contains check `item in sequence`:
    • O(n), Because we have to look through each item in the list/string
  • list.append(item)
    • O(1). Constant time to add one more item to the end of a list
  • Indexing: `sequence[index]`:
    • O(1), Constant time to read the value at particular index in a sequence
  • Slicing: `sequence[:k]`:
    • O(k), Linear because slicing makes a new copy in Python
  • Getting length of something with len(sequence):
    • O(1), The length is a property of the sequence, quick to look up
  • Comparing two lists:
    • O(n), Linear because you have to compare each pair of corresponding items
    • n here is the length of the lists
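The claim that slicing makes a new copy can be observed directly (a minimal sketch):

```python
nums = [10, 20, 30, 40, 50]

prefix = nums[:3]   # O(k): copies the first k items into a new list
prefix[0] = 999     # mutating the slice...

print(nums[0])      # 10 -- ...leaves the original list untouched
print(prefix)       # [999, 20, 30]
```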

31 of 36

Concatenate +: O(n + k)

sentence = 'I think'

sentence = sentence + 'therefore I am' is O(n + k), where n is size of original string and k is size of string added to original string

Why is + operation O(n + k)?

Computer memory allocates enough space for both strings

Original string gets copied over -> O(n)

String to concatenate gets written -> O(k)
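A consequence of `+` being O(n + k): building a string by repeated concatenation in a loop re-copies everything accumulated so far on every step, while `str.join` builds the result in one pass (the `join` comparison is an addition to the slide, not from it):

```python
words = ['I', 'think', 'therefore', 'I', 'am']

# Repeated +: each step copies the whole string built so far, then appends.
sentence = ''
for w in words:
    sentence = sentence + ' ' + w

# ' '.join copies each character only once.
joined = ' ' + ' '.join(words)

print(sentence == joined)  # True
```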

32 of 36

Space Complexity

33 of 36

Space Complexity

The same way we’ve been thinking about time complexity, we can analyze the space complexity of a particular solution.

Space Complexity refers to how much additional space in memory a program uses when it runs.

For time complexity, our unit was “operations”. For space complexity, we think about number of single “units” written to memory:

  • Integers
  • Booleans
  • Characters
  • etc.

34 of 36

Space Complexity - Big O

We talk about space complexity in terms of Big O as well.

def mystery(n):
    num1 = n * 2
    num2 = n + 6
    return num1 + num2

Creates 2 integers in memory

O(2) = O(1), so constant space complexity

35 of 36

Space Complexity - Big O

We talk about space complexity in terms of Big O as well.

def mystery1(nums, item):
    length = len(nums)
    new_list = []
    for num in nums:
        if num == item:
            break
        new_list.append(num)
    return new_list

In the worst case, creates a new list with n items in it (where n is the length of the given list `nums`), and 1 new integer variable in memory.

O(n + 1) = O(n), linear space complexity
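A quick check of `mystery1`'s behavior: the worst case for both time and space is when `item` never appears, so all n elements are copied into `new_list`:

```python
def mystery1(nums, item):
    length = len(nums)
    new_list = []
    for num in nums:
        if num == item:
            break
        new_list.append(num)
    return new_list

print(mystery1([1, 2, 3, 4], 3))  # [1, 2] -- stops copying at the match
print(mystery1([1, 2, 3, 4], 9))  # [1, 2, 3, 4] -- worst case: copies all n
```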

36 of 36

Complexity Practice

We’ll see much more complexity practice in classwork and lab this week.