# Asymptotic Analysis

Bart Massey

### Performance of Algorithms:

• Algorithms run on instances

• Performance varies with instance size, "difficulty"

• Key performance measures: time, space (memory/storage)
• "big-O" notation family captures gross change with size
• Difficulty of problem is worst-case difficulty of instances

• Difficulty of instance is performance of best-possible-performing algorithm

### Big-O

• Upper bound on a function

• f(n) in O(g(n)) === exists c, n0 . forall n > n0 . 0 <= f(n) <= c * g(n)

• Example:

7*n^2 + n + 1 in O(n^2) because when n > 10 we find that 7*n^2 + n + 1 <= 8*n^2

• Note the restrictions, in particular "for sufficiently large n"

• Note "in" vs "=": the literature often writes f(n) = O(g(n)), but O(g(n)) is really a set of functions, so "in" is the more honest notation

• Common classes: O(1), O(lg n), O(n), O(n lg n), O(n^2), O(n^k), O(2^n), O(n^n)
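The definition can be spot-checked numerically. A minimal sketch using the example's witnesses c = 8 and n0 = 10:

```python
# Check the Big-O witness pair from the example:
# f(n) = 7n^2 + n + 1 is in O(n^2) with c = 8, n0 = 10.
def f(n):
    return 7 * n**2 + n + 1

def g(n):
    return n**2

c, n0 = 8, 10

# Spot-check the definition 0 <= f(n) <= c * g(n) for n > n0.
for n in range(n0 + 1, 1000):
    assert 0 <= f(n) <= c * g(n)
```

This samples only finitely many n, so it can refute a claimed bound but never prove one; the proof is the algebra above.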

### Big-O captures approximate scaling

• Assume f in O(n^2) and f(5) = 37

• Then f(10) = 37 * 2^2 = 148 ?

• No! Big-O gives no exact values, but 148 is a reasonable upper-bound estimate
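The scaling heuristic above can be written out. A sketch; `scale_bound` is an illustrative name, and the measured point is the slide's hypothetical value:

```python
# Heuristic: if f is in O(n^2) and we measured f(5) = 37, then
# doubling n should at most quadruple the cost, so f(10) is
# roughly bounded by 37 * (10/5)**2 = 148.
def scale_bound(measured, old_n, new_n, k=2):
    """Upper-bound estimate for an O(n^k) algorithm."""
    return measured * (new_n / old_n) ** k

estimate = scale_bound(37, 5, 10)   # 148.0
# This is not an equality: Big-O hides constants and lower-order
# terms, so the true f(10) may be well below this estimate.
```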

### Big-O and tractability

One of many tables of this kind of thing:

```
  n   lg n   n lg n     n^2     n^3      2^n       n!
  2      1        2       4       8        2        2
  4      2        8      16      64       16       24
  8      3       24      64     512      256   40,320
 16      4       64     256    4096    65536   2.1e13
 32      5      160    1024   32768    4.3e9   2.6e35
 64      6      384    4096   2.6e5   1.8e19   1.3e89
128      7      896   16384   2.0e6   3.4e38  3.8e215
```
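Tables like this are easy to regenerate. A sketch using Python's `math` module; the formatting choices are mine:

```python
import math

# Regenerate the growth table for n = 2, 4, ..., 128.
cols = ["n", "lg n", "n lg n", "n^2", "n^3", "2^n", "n!"]
print("  ".join(f"{c:>8}" for c in cols))
for i in range(1, 8):
    n = 2 ** i
    # lg(2^i) = i, so the lg column is just the loop index.
    row = [n, i, n * i, n ** 2, n ** 3, 2 ** n, math.factorial(n)]
    print("  ".join(f"{x:>8}" if x < 10 ** 6 else f"{x:>8.1e}"
                    for x in row))
```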

### The Assumptions Of Big-O

• The big-O of an algorithm is independent of the machine it is implemented on

• The scaling at sizes of interest is the big-O scaling

• At least some representative instances scale like big-O

• Lower-order factors don't matter

• Roughly, f(n) in O(n^k) for some fixed k is tractable; others are not

### big-Ω

• Lower bound on a function

• f(n) in Ω(g(n)) === exists c, n0 . forall n > n0 . 0 <= g(n) <= c * f(n)

### big-Θ

• Two-sided bound on a function

• f(n) in Θ(g(n)) === f(n) in O(g(n)) and f(n) in Ω(g(n))
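A numeric sanity check of a two-sided bound. The function and witnesses here are my own example, not from the slides: f(n) = n^2/2 + n is in Θ(n^2).

```python
def f(n):
    return n * n // 2 + n

def g(n):
    return n * n

# Upper bound (Big-O): f(n) <= c_o * g(n) with c_o = 1, n0 = 2.
# Lower bound (Big-Omega, in the form above): g(n) <= c_w * f(n)
# with c_w = 2.
c_o, c_w, n0 = 1, 2, 2
for n in range(n0 + 1, 1000):
    assert 0 <= f(n) <= c_o * g(n)   # f in O(n^2)
    assert 0 <= g(n) <= c_w * f(n)   # f in Omega(n^2)
# Both bounds hold, so f is in Theta(n^2).
```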

### Asymptotic sums

• Sum is maximum: f1 in O(g1) and f2 in O(g2) => f1 + f2 in O(max(g1, g2))

• Also true for Ω, Θ
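For instance, if f1 is in O(n^2) and f2 is in O(n lg n), their sum is in O(n^2): past some point the larger class absorbs the smaller. A quick numeric illustration; the constants are my choice:

```python
import math

def f1(n):            # some O(n^2) cost
    return 3 * n * n

def f2(n):            # some O(n lg n) cost
    return 5 * n * math.log2(n)

# Show that f1 + f2 <= c * n^2 for large n: the n lg n term
# is absorbed, so the sum scales like the larger class alone.
c = 4
for n in range(64, 10000):
    assert f1(n) + f2(n) <= c * n * n
```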

### What is "n"?

• Measure of instance size (info content)

• Example: squaring

• O(1): "numbers are atomic and CPUs multiply"
• O(|n|^2) = O((lg n)^2): "numbers are in binary, multiplication uses constant-time additions, grade-school algorithm"
• O(n*|n|) = O(n lg n): "numbers are in binary, repeated-addition algorithm"
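The last two cost models can be made concrete by counting additions. A sketch; `square_repeated_add` and `square_shift_add` are my illustrative names:

```python
def square_repeated_add(n):
    """Square n by adding n to itself n times: Theta(n) additions."""
    total, adds = 0, 0
    for _ in range(n):
        total += n
        adds += 1
    return total, adds

def square_shift_add(n):
    """Grade-school binary multiplication: one addition per set bit
    of n, so O(|n|) = O(lg n) additions (each on |n|-bit numbers)."""
    total, adds, shifted, m = 0, 0, n, n
    while m:
        if m & 1:
            total += shifted
            adds += 1
        shifted <<= 1
        m >>= 1
    return total, adds
```

For n = 1000 the first does 1000 additions; the second does one per set bit, at most |n| = 10.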

### Worst case, best case, average case

• When analyzing algorithms

• Best case is not usually useful

• Average case is hard to compute, and only useful if you know what instances to average over

• Worst case is the thing
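Linear search illustrates all three cases. A sketch; comparison counts are for a list of length n:

```python
def linear_search(xs, target):
    """Return (index, comparisons) for target in xs, or (None, n)."""
    comparisons = 0
    for i, x in enumerate(xs):
        comparisons += 1
        if x == target:
            return i, comparisons
    return None, comparisons

xs = list(range(100))
# Best case: target is first -- 1 comparison, O(1). Rarely useful.
# Worst case: target is last or absent -- n comparisons, O(n).
# Average case: about n/2 comparisons, but only if instances
# really are uniform over positions.
```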