CSE 373 Section 2
Algorithm Analysis
Summer 2022
(Grab a handout as you enter)
Big-O notation, or something…
nOtatiOn
Agenda for Today
Announcements
Questions?
IntelliJ debugger + checkstyle + group work
Review + Connection to Big-O
ADT
Data Structures
Queue
Array
Nodes
Stack
Array
Nodes
(Concepts, behaviors)
(Implementation)
Array: fast random access; downside: resizing/extra space
Nodes: constant-time resizing (no copying); downside: slow access to the end
Etc. — different characteristics with different data structures…
Questions?
Basic Big-O Notation
Worst-Case Runtime
...
“Steps” vs. “ms” as the unit of time
Some Basic Rules
Exponential dominates a polynomial.
A higher-order polynomial dominates a lower-order one.
Any polynomial dominates a log.
A log dominates a constant.

Where does this come from?
We’ll show how to prove these “dominations” later.
c^n > n^c
n^(c+1) > n^c
n^c > log_c(n)
log_c(n) > c
Important!
In asymptotic analysis, we don’t care about leading constants or “smaller” terms.

Why? All of this assumes n → ∞.
Check using graphs!
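You can also check numerically: the ratio between a dominating function and a dominated one should blow up as n grows. A quick Java sketch of that check (ours, not from the handout):

```java
public class GrowthCheck {
    public static void main(String[] args) {
        // Exponential (2^n) vs. polynomial (n^3): the ratio should explode.
        for (int n : new int[]{5, 10, 20, 30}) {
            double ratio = Math.pow(2, n) / Math.pow(n, 3);
            System.out.println("n=" + n + "  2^n / n^3 = " + ratio);
        }
        // By n = 30 the ratio is already in the tens of thousands:
        // the exponential dominates, even though n^3 wins for small n.
    }
}
```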
Problem 1a, 1b: Comparing growth rates
1a
Simplify each of the following functions to a tight big-O bound in terms of n. Then order them from the fastest growing to slowest growing (“fastest function” meaning the one that increases the most rapidly as n increases).
5 min
Talk to each other!!
1a
We want to find the complexity class of this overall function but the bases of these two logarithms are different. Are these two logarithms in the same complexity class? Does one dominate the other?
1a
Using the Change of Base formula, we can convert a logarithm’s base to any number we want!
Change of Base Formula: log_b(n) = log_a(n) / log_a(b)
1a
Using the Change of Base formula, we can convert a logarithm’s base to any number we want!
Ex: log_4(n) = log_2(n) / log_2(4)
Note that log_2(4) = 2. It’s just a constant, and we don’t really care about constants when doing asymptotic analysis (since we assume n → ∞).
So log_4(n) = log_2(n) / 2 = ½ log_2(n)
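To double-check the change-of-base arithmetic, a tiny Java sketch (n = 1024 chosen so the logs come out to whole numbers):

```java
public class ChangeOfBase {
    public static void main(String[] args) {
        double n = 1024;
        double log2n = Math.log(n) / Math.log(2);   // log_2(1024) = 10
        double log4n = Math.log(n) / Math.log(4);   // log_4(1024) = 5
        // Change of base: log_4(n) = log_2(n) / log_2(4) = log_2(n) / 2
        System.out.println(log4n + " == " + (0.5 * log2n));
    }
}
```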
1a
Using the Change of Base formula, we can convert a logarithm’s base to any number we want!
Ex: log_4(n) ≈ log_2(n) (up to a constant factor)
All logarithms that differ only by a constant base are in the same complexity class! That’s why you’ll often see us omit the base altogether.
1a
= log_2(n)/2 + log_2(n) = (3/2) log_2(n)
∈ O(log(n))
1a
Drop constants!
≈ n ∈ O(n)
1a
Drop constants!
≈ 2^n ∈ O(2^n)
1a
??
1a
∈ O(1)
Remember that when doing asymptotic analysis, we’re only concerned with what happens to the runtime as n → ∞. Here, our runtime is a constant, which means that it doesn’t depend on our input size n. We use O(1) to represent the family/set of constant functions.
1a
Higher-order polynomials dominate lower-order ones. 4n^2 dominates 8n as n → ∞, so that’s the term that determines the complexity class of the overall function.
∈ O(n^2)
1a
Ordered from fastest growing to slowest:
O(2^n)
≥ O(n^2)
≥ O(n)
≥ O(log(n))
≥ O(1)
Note: code in the O(2^n) complexity class takes longer to execute than all the others!
1b
State a tight big-O bound for each of the following functions and order them from the fastest growing to slowest growing (“fastest function” meaning the one that increases the most rapidly as n increases).
2 min
1b
Tricky! Constant multipliers don’t matter in big-O notation, but they do in an exponent. Why? Because a constant in the exponent isn’t really a constant factor.
∈ O(2^n)
1b
Tricky! Constant multipliers don’t matter in big-O notation, but they do in an exponent. Why? Because a constant in the exponent isn’t really a constant factor.
Ex: 2^(2n) = 2^n ⋅ 2^n
The 2 in the exponent actually means multiplying by an extra factor of 2^n, which we can’t ignore!
Multiplying exponents: x^(ab) = (x^a)^b, so 2^(2n) = (2^n)^2 = 2^n ⋅ 2^n
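To make the point concrete, a quick check (our own sketch) that 2^(2n) pulls away from 2^n by a factor of 2^n:

```java
public class ExponentConstants {
    public static void main(String[] args) {
        for (int n : new int[]{1, 5, 10, 20}) {
            double ratio = Math.pow(2, 2 * n) / Math.pow(2, n);
            // 2^(2n) / 2^n = 2^n, which grows without bound,
            // so 2^(2n) is NOT O(2^n).
            System.out.println("n=" + n + "  ratio = " + ratio);
        }
    }
}
```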
1b
This is a different complexity class entirely!
Note: x^n grows at a strictly faster rate than any y^n where x > y.
∈ O(3^n)
1b
Same thing: it’s its own complexity class.
∈ O(2^(n/2))
1b
Ordered from fastest growing to slowest:
O(3^n) ≥ O(2^n) ≥ O(2^(n/2))
Questions?
The three notations: O, Omega, Theta
What’s the difference?
The Three Bounds
A Picture!
(Diagram: the complexity classes Θ(1), Θ(log n), Θ(n), Θ(n log n), Θ(n^2), Θ(2^n), Θ(n!) stacked from slowest-growing to fastest-growing, with O(n log n) covering everything at or below n log n and Ω(n log n) covering everything at or above it.)
O(n log n): asymptotically “less than or equal to”
Ω(n log n): asymptotically “greater than or equal to”
If f is both O(g) and Ω(g), what is it?
How to get Θ?
f must be both O(g) and Ω(g) to be Θ(g)!
Think of this as “squeezing” the function from above and below.
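As a concrete (hypothetical) example of the squeeze, take f(n) = 3n + 4: it satisfies n ≤ 3n + 4 ≤ 7n for all n ≥ 1, so it is both Ω(n) and O(n), hence Θ(n). A quick numeric check:

```java
public class SqueezeCheck {
    public static void main(String[] args) {
        for (int n = 1; n <= 1000; n++) {
            long f = 3L * n + 4;
            // Lower bound (Ω): f(n) >= 1 * n; upper bound (O): f(n) <= 7 * n.
            if (f < n || f > 7L * n) {
                throw new AssertionError("squeeze fails at n = " + n);
            }
        }
        System.out.println("n <= 3n + 4 <= 7n for all n in [1, 1000]");
    }
}
```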
Problems 2c, 2d, 2e: True or false?
7 min
2c
If a function is in Ω(n), then it could also be in O(n²)
True: It’s possible for a function to live “below” n² but “above” n.
Example: n^1.5
2d
If a function is in Θ(n), then it could also be in O(n²)
True: Any function that lives “on” the n line must also live “below” n².
2e
If a function is in Ω(n), then it is always in O(n)
False: An Ω(n) function such as n² lives at or above the n line, but it can never be considered O(n).
Problems 4a+b: Code Modelling
10 min
Problem 4a
int x = 0;
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n * n / 3; j++) {
        x += j;
    }
}
Problem 4a
int x = 0;                                 // constant work
for (int i = 0; i < n; i++) {              // runs N times
    for (int j = 0; j < n * n / 3; j++) {  // runs (N^2)/3 times
        x += j;                            // constant work
    }
}
Problem 4a
POSSIBLE RUNTIME:
T(N) = 1 + 1 × (N^2/3) × N = 1 + (N^3)/3
WORST-CASE RUNTIME:
O(N^3)
(N^3)/3 = ⅓ × (N^3)
DROP THE CONSTANT!
Tip: work inside out — treat each loop body as a parenthesized cost multiplied by the iteration count of the enclosing loop.
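To sanity-check the model T(N) = 1 + N^3/3, we can count the inner-body executions directly (a quick sketch; n chosen divisible by 3 so the integer division is exact):

```java
public class CountOps4a {
    public static void main(String[] args) {
        int n = 30;
        long count = 0;
        int x = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n * n / 3; j++) {
                x += j;
                count++;  // one unit of work per inner-body execution
            }
        }
        // Inner body runs N * (N^2 / 3) = N^3 / 3 times.
        System.out.println(count + " == " + (long) n * n * n / 3);  // 9000 == 9000
    }
}
```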
Questions?
Problem 4b
int x = 0;
for (int i = n; i >= 0; i -= 1) {
if (i % 3 == 0) {
break;
} else {
x += n;
}
}
int x = 0;                        // +1
for (int i = n; i >= 0; i -= 1) {
    if (i % 3 == 0) {             // +1
        break;                    // +1
    } else {
        x += n;                   // +2
    }
}
What about the for loop? At most it can only run 3 times! Among any three consecutive values of i, one is divisible by 3, so the break is reached within 3 iterations.
Loop body: 1 + 1 + 2 = +4 per iteration
So the entire for loop is equivalent to 3 × the inside of the loop body: 3 × 4 = +12, plus +1 for the initialization.
Problem 4b
POSSIBLE RUNTIME:
T(N) = 13
WORST-CASE RUNTIME:
O(1)
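We can confirm the early exit empirically: the iteration count stays at most 3 no matter how large n gets (a quick sketch):

```java
public class CountOps4b {
    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 10000, 1000000}) {
            int x = 0;
            int iterations = 0;
            for (int i = n; i >= 0; i -= 1) {
                iterations++;
                if (i % 3 == 0) {
                    break;  // one of i, i-1, i-2 is divisible by 3
                } else {
                    x += n;
                }
            }
            System.out.println("n=" + n + "  iterations=" + iterations);
        }
    }
}
```

The iteration count does not scale with n at all, which is exactly what O(1) means.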
Code Modeling → Function
Function → Asymptotic Analysis → Big-O
Mathematical Definitions
Behind the notation
Strict Big-O definition: f(n) is O(g(n)) iff there exist positive constants c and n0 such that f(n) ≤ c ⨉ g(n) for all n ≥ n0.
Harder Problems?
Instead of proving:

3n + 4 is O(5n^2) all at once,

try proving big-O (producing a c and n0) for each subproblem first!

a) 3n is O(5n^2)
b) 4 is O(5n^2)
Problem 5a: Big-O Proof
Harder Problems?
Instead of proving:

3n + 4 is O(5n^2) all at once,

try proving big-O (producing a c and n0) for each subproblem first!

a) 3n is O(5n^2)
b) 4 is O(5n^2)
First Subproblem
Want to show: 3n is O(5n^2).

Start from this inequality:
3n ≤ c ⨉ 5n^2

We need to pick two magic numbers.
First, c: some constant to scale the right-hand side (5n^2) by.
Second, n0: some value where, once n ≥ n0, the inequality above is always true (for our chosen c).
First Subproblem
First, c: some constant to scale the right-hand side (5n^2) by.

Idea: If you’re not sure what to pick, try making the left and right look “similar”.

3n ≤ c ⨉ 5n^2
c: ???
n0: ???
First Subproblem
First, c: some constant to scale the right-hand side (5n^2) by.

Idea: If you’re not sure what to pick, try making the left and right look “similar”.

3n ≤ c ⨉ 5n^2
Try: c = ⅗.
c: ???
n0: ???
First Subproblem
First, c: some constant to scale the right-hand side (5n^2) by.

Idea: If you’re not sure what to pick, try making the left and right look “similar”.

3n ≤ ⅗ ⨉ 5n^2
Try: c = ⅗.
c: ???
n0: ???
First Subproblem
First, c: some constant to scale the right-hand side (5n^2) by.

Idea: If you’re not sure what to pick, try making the left and right look “similar”.

3n ≤ 3n^2
Try: c = ⅗.
c: ⅗
n0: ???
First Subproblem
Second, n0: some value where, once n ≥ n0, the inequality we “locked in” by picking c is always true.

3n ≤ 3n^2
c: ⅗
n0: ???
Make this statement true
First Subproblem
Second, n0: some value where, once n ≥ n0, the inequality we “locked in” by picking c is always true.

3n ≤ 3n^2

This is obviously true when n ≥ 1!

So pick n0 = 1.
c: ⅗
n0: 1
Where We Are
Instead of proving:

3n + 4 is O(5n^2) all at once,

try proving big-O (producing a c and n0) for each subproblem first!

a) 3n is O(5n^2) ✅ c: ⅗, n0: 1
b) 4 is O(5n^2)
Second Subproblem
Want to show: 4 is O(5n^2).

Start from this inequality:
4 ≤ c ⨉ 5n^2
c: ???
n0: ???
Second Subproblem
Want to show: 4 is O(5n^2).

Start from this inequality:
4 ≤ c ⨉ 5n^2

Try: c = ⅘.
c: ???
n0: ???
Second Subproblem
Want to show: 4 is O(5n^2).

Start from this inequality:
4 ≤ ⅘ ⨉ 5n^2

Try: c = ⅘.
c: ???
n0: ???
Second Subproblem
Want to show: 4 is O(5n^2).

Start from this inequality:
4 ≤ 4n^2

Try: c = ⅘.
c: ⅘
n0: ???
Second Subproblem
Want to show: 4 is O(5n^2).

When is this inequality true?
4 ≤ 4n^2
c: ⅘
n0: ???
Second Subproblem
Want to show: 4 is O(5n^2).

When is this inequality true?
4 ≤ 4n^2

Clearly when n ≥ 1.

So n0 = 1.
c: ⅘
n0: 1
Where We Are
Instead of proving:

3n + 4 is O(5n^2) all at once,

try proving big-O (producing a c and n0) for each subproblem first!

a) 3n is O(5n^2) ✅ c: ⅗, n0: 1
b) 4 is O(5n^2) ✅ c: ⅘, n0: 1
Now What?
Try to combine these sub-results for the complete proof!

a) 3n is O(5n^2) ✅ c: ⅗, n0: 1
b) 4 is O(5n^2) ✅ c: ⅘, n0: 1

a) 3n ≤ ⅗ ⨉ 5n^2 when n ≥ 1
b) 4 ≤ ⅘ ⨉ 5n^2 when n ≥ 1

3n + 4 ≤ ⅗ ⨉ 5n^2 + ⅘ ⨉ 5n^2 when n ≥ 1
Combine the left sides to get the original function;
combine the right sides to get the answer.
Now What?
Try to combine these sub-results for the complete proof!

a) 3n is O(5n^2) ✅ c: ⅗, n0: 1
b) 4 is O(5n^2) ✅ c: ⅘, n0: 1

a) 3n ≤ ⅗ ⨉ 5n^2 when n ≥ 1
b) 4 ≤ ⅘ ⨉ 5n^2 when n ≥ 1

3n + 4 ≤ (⅗ + ⅘) ⨉ 5n^2 when n ≥ 1

So 3n + 4 is O(5n^2) ✅ with c = 7/5 and n0 = 1.
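The chosen constants can also be sanity-checked numerically: with c = 7/5 and n0 = 1, the inequality 3n + 4 ≤ c ⨉ 5n^2 = 7n^2 should hold for every n ≥ 1 (a quick sketch of that check):

```java
public class BigOProofCheck {
    public static void main(String[] args) {
        double c = 7.0 / 5.0;
        for (int n = 1; n <= 10000; n++) {
            double lhs = 3.0 * n + 4;
            double rhs = c * 5.0 * n * n;  // c * 5n^2 = 7n^2
            if (lhs > rhs) {
                throw new AssertionError("inequality fails at n = " + n);
            }
        }
        // Tight at n = 1: 3(1) + 4 = 7 = 7(1)^2; the gap only widens after that.
        System.out.println("3n + 4 <= 7n^2 for all n in [1, 10000]");
    }
}
```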
Questions?
Summary