1 of 29

Lecture 5: Asymptotic Analysis II

CSE 373: Data Structures and Algorithms


Please log onto PollEv.com/champk to answer the daily lecture participation question

Please log onto https://edstem.org/us/courses/21257/discussion/1361663 to submit live lecture questions

2 of 29

Warm Up


public void mystery2(ArrayList<String> list) {
    for (int i = 0; i < list.size(); i++) {
        for (int j = 0; j < list.size(); j++) {
            System.out.println(list.get(0));
        }
    }
}

+2 for each inner loop body; n(2) for the inner loop; n(n(2)) for the outer loop.

Possible answer: T(n) = 4n²

Construct a mathematical function modeling the runtime of the following code

Approach

-> start with basic operations, work inside out for control structures (see the worked count below)

  • Each basic operation = +1
  • Conditionals = cost of the test + cost of the appropriate branch
  • Loop = (number of iterations) × (cost of the loop body)
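One way to arrive at the possible answer above, counting each call, comparison, and increment as +1 (a sketch; other reasonable counting conventions give different constants):

  inner loop body:     list.get(0) + println               = 2
  one inner iteration: 2 (body) + 1 (j < size) + 1 (j++)   = 4
  inner loop:          n iterations × 4                    = 4n
  outer loop:          n iterations × 4n                   = 4n²

so T(n) = 4n², plus lower-order terms for the outer loop's own test and increment.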

Please fill out the Poll at- PollEv.com/champk

3 of 29

Announcements

  • Proj 0 – 143 Review Project Due Tonight 11:59pm PST
  • Proj 1 Releases Tonight
    • Partner Project!
    • Due Wednesday April 13th

Partners

    • Yes, 3 person groups are allowed
    • Default is working alone
    • Define your own partnerships and groups via Gradescope

Kasey's Office Hours posted

    • Zoom Drop in Wednesdays 6-8pm
    • Calendly Schedule http://calendly.com/kasey-champion
      • Wednesdays 4-6
      • Fridays 12-1


4 of 29

Questions?


5 of 29

Big O


6 of 29

Code to Big-Oh

  • 143 general patterns: “O(1) constant is no loops, O(n) is one loop, O(n²) is nested loops”
    • This is still useful! (A minimal sketch of each pattern appears below.)
    • But in 373 we’ll go much more in depth: we can explain more about why, and how to handle more complex cases when they arise (which they will!)
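As a minimal Java sketch of those three patterns (the methods below are illustrative examples, not taken from the slides):

class GrowthPatterns {
    // O(1): no loops, a constant number of operations
    static int first(int[] a) {
        return a[0];
    }

    // O(n): one loop over the input
    static int sum(int[] a) {
        int total = 0;
        for (int i = 0; i < a.length; i++) {
            total += a[i];
        }
        return total;
    }

    // O(n²): nested loops over the input
    static int pairCount(int[] a) {
        int count = 0;
        for (int i = 0; i < a.length; i++) {
            for (int j = 0; j < a.length; j++) {
                count++;
            }
        }
        return count;
    }
}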


CODE:

for (i = 0; i < n; i++) {
    a[i] = 1;
    b[i] = 2;
}

BIG-OH: O(n)   (but how do we get from one to the other?)

7 of 29

Meet Algorithmic Analysis

  • Algorithmic Analysis: The overall process of characterizing code with a complexity class, consisting of:
    • Code Modeling: Code 🡪 Function describing code’s runtime
    • Asymptotic Analysis: Function 🡪 Complexity class describing asymptotic behavior


CODE → (1) Code Modeling → RUNTIME FUNCTION → (2) Asymptotic Analysis → COMPLEXITY CLASS

For example:

for (i = 0; i < n; i++) {
    a[i] = 1;
    b[i] = 2;
}

models to the runtime function f(n) = 2n, which asymptotic analysis places in the complexity class O(n).

8 of 29

Where are we?

  • We just turned a piece of code into a function!
    • We’ll look at better alternatives for code modeling later
  • Now to focus on step 2, asymptotic analysis


[Same pipeline diagram as above: CODE → (1) Code Modeling → RUNTIME FUNCTION (f(n) = 2n) → (2) Asymptotic Analysis → COMPLEXITY CLASS (O(n)).]

9 of 29

Finding a Big Oh


f(n) = (9n + 3)n + 3
     = 9n² + 3n + 3
     ≈ 9n²
     ≈ n²

so f(n) is O(n²)


We have an expression for f(n). How do we get the O() that we’ve been talking about?

1. Find the “dominating term” and delete all others.
   • The “dominating” term is the one that is largest as n gets bigger. In this class, often the largest power of n.
2. Remove any constant factors.
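For instance, applying the same two steps to an illustrative function (not one from the slides): f(n) = 5n³ + 2n² + 40 → keep the dominating term 5n³ → remove the constant factor → n³, so f(n) is O(n³).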

10 of 29

Can we really throw away all that info?

  • Big-Oh is like the “significant digits” of computer science
  • Asymptotic Analysis: Analysis of function behavior as its input approaches infinity
    • We only care about what happens when n approaches infinity
    • For small inputs, doesn’t really matter: all code is “fast enough”
    • Since we’re dealing with infinity, constants and lower-order terms don’t meaningfully add to the final result. The highest-order term is what drives growth!


Our goals:

  • Simple: we don’t care about tiny differences in implementation, we want the big picture result
  • Decisive: produce a clear comparison indicating which code takes “longer”

11 of 29

Function growth


Imagine you have three possible algorithms to choose between. Each has already been reduced to its mathematical model: f(n), g(n), and h(n).

  • The growth rates of f(n) and g(n) look very different for small inputs…
  • …but since both are linear, they eventually look similar at large input sizes, whereas h(n) has a distinctly different growth rate.
  • For very small input values, h(n) actually has a slower growth rate than either f(n) or g(n).

[Plot comparing the growth of f(n), g(n), and h(n).]

12 of 29

Definition: Big-O


 

Big-O

f(n) is O(g(n)) if there exist positive constants c and n₀ such that f(n) ≤ c · g(n) for all n ≥ n₀.

We wanted to find an upper bound on our algorithm’s running time, but:

  • We don’t want to care about constant factors (that’s what the constant c handles).
  • We only care about what happens as n gets large (that’s what n₀ handles).

13 of 29

Applying Big O Definition


 

 

Show that f(n) is O(g(n)):

  • Apply the definition term by term.
  • Add up all your truths: combine the per-term inequalities into a single bound of the form f(n) ≤ c · g(n) for all n ≥ n₀.
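For example, with an illustrative f(n) (chosen here just to show the mechanics): to show f(n) = 5n² + 3n + 6 is O(n²), bound each term for n ≥ 1:

  5n² ≤ 5n²   for all n ≥ 1
  3n  ≤ 3n²   for all n ≥ 1
  6   ≤ 6n²   for all n ≥ 1

Adding up all the truths: f(n) ≤ 14n² for all n ≥ 1, so c = 14 and n₀ = 1 satisfy the definition and f(n) is O(n²).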

 

 

14 of 29

Exercise: Proving Big O


15 of 29

Writing Big-O Proofs


16 of 29

Writing Big-O Proofs
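A minimal sketch of the usual structure of such a proof, using the Big-O definition from earlier:

1. Choose and state constants c and n₀.
2. Show that f(n) ≤ c · g(n) for all n ≥ n₀, typically by bounding each term of f(n) by a multiple of g(n) once n ≥ n₀ and summing the bounds.
3. Conclude that f(n) is O(g(n)) directly from the definition.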


17 of 29

Note: Big-O definition is just an upper-bound, not always an exact bound


18 of 29

Note: Big-O definition is just an upper-bound, not always an exact bound (plots)

What do we want to look for on a plot to determine if one function is in the big-O of the other?

You can sanity-check that your g(n) (the dominating function) overtakes or equals your f(n) after some point, and keeps that greater-than-or-equal relationship all the way out towards infinity.


If we want the most informative upper bound, we’ll ask you for a simplified, tight big-O bound.

O(n²) is the tight bound for the function f(n) = 10n² + 15n. See the graph below – the tight big-O bound is the smallest upper bound that still satisfies the definition of big-O.

If you zoom out a bunch, your tight bound and your function will be overlapping compared to the other complexity classes.

 

 

[Plots: f(n) = 10n² + 15n shown against the looser bounds n³, n⁴, and n⁵, and against its tight bound n².]

19 of 29

Questions?


20 of 29

Uncharted Waters: a different type of code model


Find a model f(n) for the running time of this code on input n. What’s the Big-O?

boolean isPrime(int n) {
    int toTest = 2;
    while (toTest < n) {
        if (n % toTest == 0) {   // found a divisor of n, so n is not prime
            return false;
        } else {
            toTest++;
        }
    }
    return true;                 // no divisor found, so n is prime
}

Number of iterations?

-Smallest divisor of n
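As a rough model of the loop above: the number of iterations is roughly the smallest divisor of n, so an even n exits after a single iteration, while a prime n runs about n iterations before returning true. The worst case is therefore O(n).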

21 of 29

Prime Checking Runtime


This is why we have definitions!

 

22 of 29


23 of 29

Big-O isn’t everything


Our prime-finding code is O(n). But so is, for example, printing all the elements of a list.

Your experience running these two pieces of code is going to be very different.

It’s disappointing that the O() bounds are the same – that’s not very precise.

Could we have some way of pointing out that the list code always takes AT LEAST n operations?

24 of 29

 


The formal definition of Big-Omega is the flipped version of Big-Oh.

When we make Big-Oh statements about a function and say f(n) is O(g(n)), we’re saying that f(n) grows at most as fast as g(n).

But with Big-Omega statements like f(n) is Ω(g(n)), we’re saying that f(n) grows at least as fast as g(n).

Visually: what is the lower limit of this function? What is it bounded on the bottom by?
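Written out, flipping the inequality in the Big-O definition: f(n) is Ω(g(n)) if there exist positive constants c and n₀ such that f(n) ≥ c · g(n) for all n ≥ n₀.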

25 of 29

Big-Omega definition Plots


 

 

[Plot: the curves 2n³, n³, n², n, and 1 on shared axes.]

26 of 29

 


Note: the right graph’s tight O bound is O(n) and its tight Omega bound is Ω(n). This is what most of the functions we’ll deal with will look like, but there exists some code that would produce runtime functions like the one on the left.

[Plots: left, the prime-checking runtime function; right, f(n) = n.]

27 of 29

O, and Omega, and Theta [oh my?]

Big Theta is “equal to”

    • My code takes “exactly”* this long to run
    • *Except for constant factors and lower order terms


 

 

 

[Plot: f(n) = n with its tight O(n) and Ω(n) bounds overlapping.]

To establish a big-Theta bound, you expect the tight big-Oh and tight big-Omega bounds to be touching on the graph (meaning they’re the same complexity class).
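Formally, this combines the two bounds: f(n) is Θ(g(n)) exactly when f(n) is both O(g(n)) and Ω(g(n)).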

28 of 29

O, and Omega, and Theta [oh my?]

  • Big-O is an upper bound
    • My code takes at most this long to run

  • Big-Omega is a lower bound
    • My code takes at least this long to run

  • Big Theta is “equal to”
    • My code takes “exactly”* this long to run
    • *Except for constant factors and lower order terms


29 of 29

Examples

  • Is 4n² Ω(1)? true
  • Is 4n² Ω(n)? true
  • Is 4n² Ω(n²)? true
  • Is 4n² Ω(n³)? false
  • Is 4n² Ω(n⁴)? false


  • Is 4n² O(1)? false
  • Is 4n² O(n)? false
  • Is 4n² O(n²)? true
  • Is 4n² O(n³)? true
  • Is 4n² O(n⁴)? true