04/21/23 1
CS 3343: Analysis of Algorithms
Lecture 2: Asymptotic Notations
Outline
• Review of last lecture
• Order of growth
• Asymptotic notations: big O, big Ω, Θ
How to express algorithms?
Natural language (e.g. English)
Pseudocode
Real programming languages
(Moving down the list: increasing precision, decreasing ease of expression.)
Describe the ideas of an algorithm in natural language. Use pseudocode to clarify sufficiently tricky details of the algorithm.
Insertion Sort

InsertionSort(A, n) {
  for j = 2 to n {
    ▷ Precondition: A[1..j-1] is sorted
    1. Find position i in A[1..j-1] such that A[i] ≤ A[j] < A[i+1]
    2. Insert A[j] between A[i] and A[i+1]
    ▷ Postcondition: A[1..j] is sorted
  }
}
Insertion Sort

InsertionSort(A, n) {
  for j = 2 to n {
    key = A[j]
    i = j - 1
    while (i > 0) and (A[i] > key) {
      A[i+1] = A[i]
      i = i - 1
    }
    A[i+1] = key
  }
}
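The pseudocode above translates almost line for line into runnable Python; this is a sketch using 0-based indexing in place of the slides' 1-based arrays:

```python
def insertion_sort(a):
    """Sort the list a in place and return it (mirrors the pseudocode, 0-based)."""
    for j in range(1, len(a)):        # corresponds to "for j = 2 to n"
        key = a[j]
        i = j - 1
        # Shift elements of the sorted prefix that are greater than key.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                # insert key into its final position
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # → [1, 2, 3, 4, 5, 6]
```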
Use loop invariants to prove the correctness of loops
• A loop invariant (LI) is a formal statement about the variables in your program which holds true throughout the loop
• Proof by induction
– Initialization: the LI is true prior to the 1st iteration
– Maintenance: if the LI is true before the jth iteration, it remains true before the (j+1)th iteration
– Termination: when the loop terminates, the invariant shows the correctness of the algorithm
Loop invariants and correctness of insertion sort
• Claim: at the start of each iteration of the for loop, the subarray A[1..j-1] consists of the elements originally in A[1..j-1] but in sorted order.
• Proof: by induction
Prove correctness using loop invariants

InsertionSort(A, n) {
  for j = 2 to n {
    key = A[j]
    i = j - 1
    ▷ Insert A[j] into the sorted sequence A[1..j-1]
    while (i > 0) and (A[i] > key) {
      A[i+1] = A[i]
      i = i - 1
    }
    A[i+1] = key
  }
}

Loop invariant: at the start of each iteration of the for loop, the subarray A[1..j-1] consists of the elements originally in A[1..j-1] but in sorted order.
Initialization
Before the first iteration (j = 2), the subarray A[1..j-1] = A[1] contains a single element and is trivially sorted. So the loop invariant is true before the loop starts.
Maintenance
Assume the loop invariant is true prior to iteration j: A[1..j-1] holds the elements originally in A[1..j-1] in sorted order. The while loop shifts each element of A[1..j-1] that is greater than key one position to the right, then places key = A[j] into the resulting gap, so A[1..j] holds the elements originally in A[1..j] in sorted order. Thus the loop invariant is true before iteration j+1.
Termination
The for loop terminates when j = n+1. By the loop invariant, A[1..j-1] = A[1..n] contains all the original elements of A in sorted order.
The algorithm is correct!
Efficiency
• Correctness alone is not sufficient
– Brute-force algorithms exist for most problems
– E.g. use permutations to sort
– Problem: too slow!
• How to measure efficiency?
– Actual running time is not a good measure
– It depends on input, computer, implementation, etc.
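The "use permutations to sort" brute-force idea can be sketched as follows (a deliberately naive illustration; it inspects up to n! orderings, so it is correct but hopelessly slow):

```python
from itertools import permutations

def permutation_sort(a):
    """Brute force: try every permutation of a until a sorted one is found."""
    for p in permutations(a):
        # Check whether this ordering is non-decreasing.
        if all(p[i] <= p[i + 1] for i in range(len(p) - 1)):
            return list(p)

print(permutation_sort([3, 1, 2]))  # → [1, 2, 3]
```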
Machine-independent
• A generic uniprocessor random-access machine (RAM) model
– No concurrent operations
– Each simple operation (e.g. +, -, =, *, if, for) takes 1 step
• Loops and subroutine calls are not simple operations
– All memory is equally expensive to access
• Constant word size
• Unless we are explicitly manipulating bits
Analysis of insertion sort

InsertionSort(A, n) {
  for j = 2 to n {
    key = A[j]
    i = j - 1
    while (i > 0) and (A[i] > key) {
      A[i+1] = A[i]
      i = i - 1
    }
    A[i+1] = key
  }
}

How many times will each line execute?
Analysis of insertion sort

Statement                              cost   times
InsertionSort(A, n) {
  for j = 2 to n {                      c1     n
    key = A[j]                          c2     n-1
    i = j - 1                           c3     n-1
    while (i > 0) and (A[i] > key) {    c4     S
      A[i+1] = A[i]                     c5     S-(n-1)
      i = i - 1                         c6     S-(n-1)
    }                                          0
    A[i+1] = key                        c7     n-1
  }                                            0
}

S = t2 + t3 + … + tn, where tj is the number of times the while-loop condition is evaluated in the jth iteration of the for loop
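The quantity S can be measured directly by instrumenting the sort; in this sketch (the function name is ours) s counts every evaluation of the while condition, including the final failing test in each iteration:

```python
def insertion_sort_count(a):
    """Return S = t_2 + ... + t_n, the total number of while-test evaluations."""
    s = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while True:
            s += 1                     # one evaluation of the while condition
            if not (i >= 0 and a[i] > key):
                break
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return s

print(insertion_sort_count([1, 2, 3, 4, 5]))   # sorted input: S = n - 1 = 4
print(insertion_sort_count([5, 4, 3, 2, 1]))   # reversed: S = n(n+1)/2 - 1 = 14
```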
Analyzing insertion sort
• T(n) = c1·n + c2(n-1) + c3(n-1) + c4·S + c5(S-(n-1)) + c6(S-(n-1)) + c7(n-1)
       = c8·S + c9·n + c10
• What can S be?
– Best case: inner loop body never executed
  • tj = 1 for all j ⟹ S = n-1
  • T(n) = an + b is a linear function ⟹ Θ(n)
– Worst case: inner loop body executed for all previous elements
  • tj = j ⟹ S = 2 + 3 + … + n = n(n+1)/2 - 1
  • T(n) = an² + bn + c is a quadratic function ⟹ Θ(n²)
– Average case: on average, we have to insert A[j] into the middle of A[1..j-1]
  • tj ≈ j/2 ⟹ S ≈ n(n+1)/4
  • T(n) is still a quadratic function ⟹ Θ(n²)
Analysis of insertion sort
What are the basic operations (most executed lines)? In the cost table, the while-loop test (cost c4, executed S times) and the two lines in its body dominate the running time.
What can S be?
• S = Σj=2..n tj
• Best case? Worst case? Average case?
• The inner loop stops when A[i] ≤ key, or i = 0
Best case
• Array already sorted
• The inner loop stops immediately: A[i] ≤ key on the first test
• tj = 1 for all j
• S = Σj=2..n tj = n-1 ⟹ T(n) = Θ(n)
Worst case
• Array originally in reverse sorted order
• The inner loop stops only when i = 0
• tj = j
• S = Σj=2..n j = 2 + 3 + … + n = (n-1)(n+2)/2 = Θ(n²)
Average case
• Array in random order
• tj = j/2 on average
• S = Σj=2..n j/2 = ½ Σj=2..n j = (n-1)(n+2)/4 = Θ(n²)
• What if we use binary search to find the insertion point? Answer: still Θ(n²). Binary search reduces the comparisons, but the same number of elements must be shifted.
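The binary-search remark can be made concrete with Python's standard bisect module: binary search cuts the comparisons to O(log j) per insertion, but the elements still have to be shifted, so the overall worst case stays Θ(n²). A sketch:

```python
from bisect import bisect_right

def binary_insertion_sort(a):
    """Insertion sort that finds the insertion point by binary search.
    Fewer comparisons, but shifting the slice still costs up to j moves."""
    for j in range(1, len(a)):
        key = a[j]
        pos = bisect_right(a, key, 0, j)   # O(log j) comparisons in sorted prefix
        a[pos + 1:j + 1] = a[pos:j]        # shifting is still Θ(j - pos) moves
        a[pos] = key
    return a

print(binary_insertion_sort([5, 2, 4, 6, 1, 3]))  # → [1, 2, 3, 4, 5, 6]
```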
Asymptotic Analysis
• Running time depends on the size of the input
– A larger array takes more time to sort
– T(n): the time taken on input of size n
– Look at the growth of T(n) as n → ∞: “asymptotic analysis”
• Size of input is generally defined as the number of input elements
– In some cases this may be tricky
Asymptotic Analysis
• Ignore actual and abstract statement costs
• Order of growth is the interesting measure:
– The highest-order term is what counts
• As the input size grows larger, it is the high-order term that dominates
[Plot: T(n) versus n for 100·n and n², 0 ≤ n ≤ 200; the n² curve overtakes 100·n at n = 100]
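The plot's point, that the highest-order term eventually dominates, can be checked numerically; crossover below is an illustrative helper, not from the slides:

```python
def crossover(f, g, n_max=10_000):
    """Smallest n in [1, n_max] where g(n) exceeds f(n), or None if none found."""
    for n in range(1, n_max + 1):
        if g(n) > f(n):
            return n
    return None

# n² overtakes 100·n just past their crossing point at n = 100.
print(crossover(lambda n: 100 * n, lambda n: n * n))  # → 101
```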
Comparison of functions

n      log₂n   n      n·log₂n   n²      n³      2ⁿ       n!
10     3.3     10     33        10²     10³     10³      10⁶
10²    6.6     10²    660       10⁴     10⁶     10³⁰     10¹⁵⁸
10³    10      10³    10⁴       10⁶     10⁹
10⁴    13      10⁴    10⁵       10⁸     10¹²
10⁵    17      10⁵    10⁶       10¹⁰    10¹⁵
10⁶    20      10⁶    10⁷       10¹²    10¹⁸

For a supercomputer that does 1 trillion operations per second, 10³⁰ operations would take longer than 1 billion years.
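The "longer than 1 billion years" claim refers to the 2ⁿ column: 2¹⁰⁰ ≈ 10³⁰ operations at 10¹² operations per second works out to roughly 4 × 10¹⁰ years, which a quick computation confirms:

```python
ops = 2 ** 100                         # about 1.27e30 operations
seconds = ops / 1e12                   # machine doing 1 trillion ops/second
years = seconds / (365 * 24 * 3600)
print(f"{years:.2e} years")            # on the order of 4e10 years
assert years > 1e9                     # indeed longer than 1 billion years
```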
Order of growth
1 << log₂n << n << n·log₂n << n² << n³ << 2ⁿ << n!
(We are slightly abusing the “<<” sign: here it means a strictly smaller order of growth.)
Exact analysis is hard!
• Worst-case and average-case running times are difficult to pin down precisely, because the details are very complicated
• It may be easier to talk about upper and lower bounds of the function
Asymptotic notations
• O: Big-Oh
• Ω: Big-Omega
• Θ: Theta
• o: Small-oh
• ω: Small-omega
Big O
• Informally, O(g(n)) is the set of all functions with a smaller or equal order of growth as g(n), within a constant multiple
• If we say f(n) is in O(g(n)), it means that g(n) is an asymptotic upper bound of f(n)
– Intuitively, it is like f(n) ≤ g(n)
• What is O(n²)?
– The set of all functions that grow more slowly than or in the same order as n²
So:   n ∈ O(n²)
      n² ∈ O(n²)
      1000n ∈ O(n²)
      n² + n ∈ O(n²)
      100n² + n ∈ O(n²)
But:  1/1000 n³ ∉ O(n²)
Intuitively, O is like ≤
Small o
• Informally, o(g(n)) is the set of all functions with a strictly smaller order of growth than g(n), within a constant multiple
• What is o(n²)?
– The set of all functions that grow more slowly than n²
So:   1000n ∈ o(n²)
But:  n² ∉ o(n²)
Intuitively, o is like <
Big Ω
• Informally, Ω(g(n)) is the set of all functions with a larger or equal order of growth as g(n), within a constant multiple
• f(n) ∈ Ω(g(n)) means g(n) is an asymptotic lower bound of f(n)
– Intuitively, it is like g(n) ≤ f(n)
So:   n² ∈ Ω(n), 1/1000 n² ∈ Ω(n)
But:  1000n ∉ Ω(n²)
Intuitively, Ω is like ≥
Small ω
• Informally, ω(g(n)) is the set of all functions with a strictly larger order of growth than g(n), within a constant multiple
So:   n² ∈ ω(n)
      1/1000 n² ∈ ω(n)
But:  n² ∉ ω(n²)
Intuitively, ω is like >
Theta (Θ)
• Informally, Θ(g(n)) is the set of all functions with the same order of growth as g(n), within a constant multiple
• f(n) ∈ Θ(g(n)) means g(n) is an asymptotically tight bound of f(n)
– Intuitively, it is like f(n) = g(n)
• What is Θ(n²)?
– The set of all functions that grow in the same order as n²
So:   n² ∈ Θ(n²)
      n² + n ∈ Θ(n²)
      100n² + n ∈ Θ(n²)
      100n² + log₂n ∈ Θ(n²)
But:  n·log₂n ∉ Θ(n²)
      1000n ∉ Θ(n²)
      1/1000 n³ ∉ Θ(n²)
Intuitively, Θ is like =
Tricky cases
• How about √n and log₂n?
• How about log₂n and log₁₀n?
• How about 2ⁿ and 3ⁿ?
• How about 3ⁿ and n!?
Big-Oh
• Definition: O(g(n)) = {f(n): ∃ positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n₀}
• Equivalently: lim n→∞ g(n)/f(n) > 0 (if the limit exists)
• Abuse of notation (for convenience): f(n) = O(g(n)) actually means f(n) ∈ O(g(n))
Big-Oh
• Claim: f(n) = 3n² + 10n + 5 ∈ O(n²)
• Proof by definition
(Hint: to prove this claim by definition, we need to find positive constants c and n₀ such that f(n) ≤ c·n² for all n ≥ n₀.)
(Note: you just need to find one concrete pair c and n₀ satisfying the condition, but it needs to be correct for all n ≥ n₀. So do not try to plug in a concrete value of n and show the inequality holds.)
Proof:
3n² + 10n + 5 ≤ 3n² + 10n² + 5,    n ≥ 1
              ≤ 3n² + 10n² + 5n²,  n ≥ 1
              ≤ 18n²,              n ≥ 1
If we let c = 18 and n₀ = 1, we have f(n) ≤ c·n² ∀ n ≥ n₀. Therefore by definition, f(n) = O(n²).
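The chain of inequalities can be sanity-checked numerically over a range of sample points (an illustration of the bound, not a substitute for the proof, which must hold for all n ≥ n₀):

```python
def f(n):
    return 3 * n**2 + 10 * n + 5

c, n0 = 18, 1
# Spot-check f(n) <= c*n^2 for every sampled n >= n0.
assert all(f(n) <= c * n**2 for n in range(n0, 10_000))
print("f(n) <= 18*n^2 holds on all sampled n >= 1")
```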
Big-Omega
• Definition: Ω(g(n)) = {f(n): ∃ positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n₀}
• Equivalently: lim n→∞ f(n)/g(n) > 0 (if the limit exists)
• Abuse of notation (for convenience): f(n) = Ω(g(n)) actually means f(n) ∈ Ω(g(n))
Big-Omega
• Claim: f(n) = n²/10 = Ω(n)
• Proof by definition: f(n) = n²/10, g(n) = n
Need to find a c and an n₀ to satisfy the definition of f(n) ∈ Ω(g(n)), i.e., f(n) ≥ c·g(n) for all n ≥ n₀
Proof:
n ≤ n²/10 when n ≥ 10
If we let c = 1 and n₀ = 10, we have f(n) ≥ c·n ∀ n ≥ n₀. Therefore, by definition, n²/10 = Ω(n).
Theta
• Definition: Θ(g(n)) = {f(n): ∃ positive constants c₁, c₂, and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) ∀ n ≥ n₀}
• f(n) ∈ Θ(g(n)) ⟺ f(n) = O(g(n)) and f(n) = Ω(g(n))
• Abuse of notation (for convenience): f(n) = Θ(g(n)) actually means f(n) ∈ Θ(g(n))
• Θ(1) means constant time
Theta
• Claim: f(n) = 2n² + n = Θ(n²)
• Proof by definition:
– Need to find three constants c₁, c₂, and n₀ such that
  c₁n² ≤ 2n² + n ≤ c₂n² for all n ≥ n₀
– A simple solution is c₁ = 2, c₂ = 3, and n₀ = 1
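Again the constants can be spot-checked numerically (illustration only; the algebraic argument is what proves the bound for all n ≥ n₀):

```python
def f(n):
    return 2 * n**2 + n

c1, c2, n0 = 2, 3, 1
# Spot-check the two-sided bound c1*n^2 <= f(n) <= c2*n^2 for sampled n >= n0.
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
print("2*n^2 <= 2*n^2 + n <= 3*n^2 holds on all sampled n >= 1")
```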
More Examples
• Prove n² + 3n + lg n is in O(n²)
• Need to find c and n₀ such that n² + 3n + lg n ≤ c·n² for all n ≥ n₀
• Proof:
n² + 3n + lg n ≤ n² + 3n² + n   for n ≥ 1
               ≤ n² + 3n² + n²  for n ≥ 1
               ≤ 5n²            for n ≥ 1
Therefore by definition, n² + 3n + lg n ∈ O(n²).
(Alternatively: n² + 3n + lg n ≤ n² + n² + n² = 3n² for n ≥ 10)
More Examples
• Prove n² + 3n + lg n is in Ω(n²)
• Want to find c and n₀ such that n² + 3n + lg n ≥ c·n² for all n ≥ n₀
• n² + 3n + lg n ≥ n² for n ≥ 1, so c = 1 and n₀ = 1 work
• n² + 3n + lg n = O(n²) and n² + 3n + lg n = Ω(n²)
  ⟹ n² + 3n + lg n = Θ(n²)
O, Ω, and Θ
The definitions imply a constant n₀ beyond which they are satisfied. We do not care about small values of n.
Using limits to compare orders of growth

lim n→∞ f(n)/g(n) = 0      ⟹ f(n) ∈ o(g(n)) ⊆ O(g(n))
lim n→∞ f(n)/g(n) = c > 0  ⟹ f(n) ∈ Θ(g(n)), so f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
lim n→∞ f(n)/g(n) = ∞      ⟹ f(n) ∈ ω(g(n)) ⊆ Ω(g(n))
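The limit test can be imitated numerically by evaluating f(n)/g(n) at a large n (a heuristic sketch; ratio_at is our name, and a single large sample is suggestive, not a proof):

```python
import math

def ratio_at(f, g, n=10**6):
    """Evaluate f(n)/g(n) at one large n as a crude stand-in for the limit."""
    return f(n) / g(n)

# n·log n vs n²: the ratio is tiny, suggesting n·log n ∈ o(n²).
print(ratio_at(lambda n: n * math.log(n), lambda n: n * n))
# 5n² vs n²: the ratio is the constant 5 > 0, suggesting Θ(n²).
print(ratio_at(lambda n: 5 * n * n, lambda n: n * n))
```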
Logarithms
• Compare log₂n and log₁₀n
• log_a b = log_c b / log_c a
• log₂n = log₁₀n / log₁₀2 ≈ 3.3 log₁₀n
• Therefore lim n→∞ (log₂n / log₁₀n) = 3.3
• log₂n = Θ(log₁₀n)
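The base-change identity is easy to verify with the math module; the ratio log₂n / log₁₀n is the same constant for every n > 1:

```python
import math

for n in (10.0, 1000.0, 1e9):
    ratio = math.log2(n) / math.log10(n)
    print(n, ratio)            # ratio ≈ 3.3219 every time

# The constant is 1 / log10(2) = log2(10) ≈ 3.3219.
assert abs(math.log2(1000) / math.log10(1000) - 1 / math.log10(2)) < 1e-9
```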
• Compare 2ⁿ and 3ⁿ
• lim n→∞ 2ⁿ/3ⁿ = lim n→∞ (2/3)ⁿ = 0
• Therefore, 2ⁿ ∈ o(3ⁿ), and 3ⁿ ∈ ω(2ⁿ)
• How about 2ⁿ and 2ⁿ⁺¹?
  2ⁿ / 2ⁿ⁺¹ = ½, therefore 2ⁿ = Θ(2ⁿ⁺¹)
L'Hôpital's rule
Condition: if both lim n→∞ f(n) and lim n→∞ g(n) are ∞, or both are 0, then
lim n→∞ f(n)/g(n) = lim n→∞ f′(n)/g′(n)
• You can apply this transformation as many times as you want, as long as the condition holds
• Compare √n and log n
• lim n→∞ √n / log n = ?
• (√n)′ = ½ n^(-1/2)
• (log n)′ = 1/n
• By L'Hôpital: lim n→∞ (½ n^(-1/2)) / (1/n) = lim n→∞ ½ √n = ∞
• Therefore, log n ∈ o(√n)
• In fact, log n ∈ o(n^ε) for any ε > 0
Stirling's formula
n! ≈ √(2πn) (n/e)ⁿ
i.e., n! = c · n^(n+1/2) · e^(-n) for a constant c = √(2π)
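Stirling's approximation can be checked against math.factorial: the ratio n! / (n^(n+1/2) · e^(-n)) settles toward the constant √(2π) ≈ 2.5066 as n grows:

```python
import math

def stirling_ratio(n):
    """n! divided by n^(n + 1/2) · e^(-n); approaches sqrt(2*pi) as n grows."""
    return math.factorial(n) / (n ** (n + 0.5) * math.exp(-n))

for n in (5, 20, 50):
    print(n, stirling_ratio(n))   # creeps down toward 2.5066...
```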
• Compare 2ⁿ and n!
  lim n→∞ 2ⁿ/n! = lim n→∞ 2ⁿ / (c · n^(n+1/2) e^(-n)) = lim n→∞ (1/c) (2e/n)ⁿ / √n = 0
• Therefore, 2ⁿ = o(n!)
• Compare nⁿ and n!
  lim n→∞ nⁿ/n! = lim n→∞ nⁿ / (c · n^(n+1/2) e^(-n)) = lim n→∞ eⁿ / (c·√n) = ∞
• Therefore, nⁿ = ω(n!)
• How about log(n!)?
log(n!) = log(c · n^(n+1/2) · e^(-n))
        = log c + (n + ½) log n − n·log e
        = Θ(n log n)
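The conclusion log(n!) = Θ(n log n) can be checked with math.lgamma, since lgamma(n+1) = ln(n!); the ratio ln(n!)/(n·ln n) climbs toward 1:

```python
import math

def lnfact_over_nlogn(n):
    """ln(n!) / (n · ln n); tends to 1, consistent with log(n!) = Θ(n log n)."""
    return math.lgamma(n + 1) / (n * math.log(n))

for n in (10, 10**3, 10**6):
    print(n, lnfact_over_nlogn(n))   # approaches 1 (slowly: the gap is the -n term)
```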
More advanced dominance ranking
Asymptotic notations
• O: big-Oh
• Ω: big-Omega
• Θ: Theta
• o: small-oh
• ω: small-omega
• Intuitively:
O is like ≤    o is like <
Ω is like ≥    ω is like >
Θ is like =