CSE 373 WI25 Section 2
Algorithm Analysis
Agenda
Announcements
Week of 1/13:
- Tues (1/14): P0 due @ 11:59 pm
- Fri (1/17): P0 late due @ 11:59 pm

Week of 1/20:
- Tues (1/21): EX1 due @ 11:59 pm
- Thurs (1/23): P1 due @ 11:59 pm
- Fri (1/24): EX1 late due @ 11:59 pm
- Sun (1/26): P1 late due @ 11:59 pm
MicroTeach: Basic Big O
Worse Runtime
Some Basic Rules
- Exponential dominates a polynomial.
- Higher-order polynomial dominates a lower-order one.
- Any polynomial dominates a log.
- Log dominates constant.

Where does this come from? We'll show how to prove "dominations" later.
Important!
In asymptotic analysis, we don't care about leading constants or "smaller" terms.

Why? All of this assumes n → ∞.
Problem 1a, 1b: Comparing growth rates
1a
Simplify each of the following functions to a tight big-O bound in terms of n. Then order them from the fastest growing to slowest growing (“fastest function” meaning the one that increases the most rapidly as n increases).
We want to find the complexity class of this overall function, but the bases of these two logarithms are different. Are these two logarithms in the same complexity class? Does one dominate the other?
Using the Change of Base formula, we can convert a logarithm's base to any number we want!

Change of Base Formula: log_b(n) = log_c(n) / log_c(b)
Ex: log4(n) = log2(n) / log2(4)
Note that log2(4) = 2. It’s just a constant and we don’t really care about constants when doing asymptotic analysis (since we assume n → ∞).
Ex: log4(n) ≈ log2(n)
All logarithms that differ only by a constant base are in the same complexity class! That’s why you’ll often see us omitting the base altogether.
log4(n) + log2(n) = log2(n)/2 + log2(n) = 3/2 * log2(n)
∈ O(log(n))
Drop constants!
≈ n ∈ O(n)
Drop constants!
≈ 2^n ∈ O(2^n)
∈ O(1)
Remember that when doing asymptotic analysis, we’re only concerned with what happens to the runtime as n → ∞. Here, our runtime is a constant, which means that it doesn’t depend on our input size n. We use O(1) to represent the family/set of constant functions.
Higher-order polynomials dominate lower-order ones. 4n^2 dominates 8n as n → ∞, so that's the term that determines the complexity class of the overall function.
∈ O(n^2)
Ordered from fastest growing to slowest:
O(2^n) ≥ O(n^2) ≥ O(n) ≥ O(log(n)) ≥ O(1)
Note: Code in O(2^n) takes longer to execute than code in any of the other classes!
1b
State a tight big-O bound for each of the following functions and order them from the fastest growing to slowest growing (“fastest function” meaning the one that increases the most rapidly as n increases).
Tricky! Constant multipliers don’t matter in big-O notation, but they do in an exponent. Why? Because it’s not really a constant factor.
∈ O(2^n)
Ex: 2^(2n) = 2^n ⋅ 2^n
The 2 in the exponent actually means multiplying by 2^n, which we can't ignore!
Multiplying exponents: x^(ab) = (x^a)^b
This is a different complexity class entirely!
Note: x^n grows at a strictly faster rate than any y^n where x > y.
∈ O(3^n)
Same thing. It's its own complexity class.
∈ O(2^n)
Ordered from fastest growing to slowest:
O(3^n) ≥ O(2^n) ≥ O(2^(n/2))
MicroTeach: O, Omega, Theta
The Three Bounds
A Picture!
O(n log n)
Θ(n)
Θ(log n), Θ(1)
Θ(n^2)
Θ(2^n)
Θ(n!)
Ω(n log n)
Θ(n)
Θ(log n), Θ(1)
Θ(n^2)
Θ(2^n)
Θ(n!)
O(n log n)
Ω(n log n)
Asymptotically “Less than or equal to”
Asymptotically “Greater than or equal to”
How to get Θ?
f must be both O(g) and Ω(g) to be Θ(g)!
Think of this as “squeezing” the function from above and below.
Problems 2c, 2d, 2e: True or false?
2c
If a function is in Ω(n), then it could also be in O(n²)
True: It's possible for a function to live both "below" n^2 and "above" n. Example: n^1.5
2d
If a function is in Θ(n), then it could also be in O(n²)
True: Any function that lives "on" the n line must also live "below" n^2.
2e
If a function is in Ω(n), then it is always in O(n)
False: An Ω(n) function such as n^2 lives "above" n and can never be considered O(n).
Problems 4A+B: Code Modelling
Problem 4A
int x = 0;                                 // constant work
for (int i = 0; i < n; i++) {              // runs n times
    for (int j = 0; j < n * n / 3; j++) {  // runs n^2/3 times per outer iteration
        x += j;                            // constant work
    }
}
POSSIBLE RUNTIME:
T(N) = 1 + 1 * (N^2/3) * N = 1 + N^3/3

WORST CASE RUNTIME:
O(N^3)

N^3/3 = (1/3) * N^3, so drop the constant!
Problem 4B
int x = 0;                          // +1
for (int i = n; i >= 0; i -= 1) {   // +1 per iteration
    if (i % 3 == 0) {               // +1 per iteration
        break;
    } else {
        x += n;                     // +2 per iteration
    }
}                                   // loop body: +4 per iteration

What about the for loop?
At most it can only run 3 times! One of i = n, n - 1, n - 2 must be divisible by 3, so the break happens within the first 3 iterations.
So the entire for loop is equivalent to 3 * the inside of the loop body: 3 * 4 = +12.
POSSIBLE RUNTIME:
T(N) = 1 + 3 * 4 = 13

WORST CASE RUNTIME:
O(1)
MicroTeach: Mathematical Definitions
Strict Big-Oh definition
Harder Problems?
Instead of proving:

3n + 4 is O(5n^2) all at once,

Try proving big-O (so producing a c and n0) for each subproblem first!

a) 3n is O(5n^2)
b) 4 is O(5n^2)
What is a case?
A case is when there are different branches (if statements), not determined just by the size of n, that can still occur as n approaches infinity.
You can analyze the runtime of cases separately from each other!
5. Case and Asymptotic Analysis
Question 1
There’s an O(n^2) runtime when n is even, right?
And an O(n) runtime when n is odd!
Question 2
Question 4