# Recurrences and Recursive Algorithms

Bart Massey

### Extending The Previous Example

UNIQUE ELEMENTS IN ORDER

Given: An array a

Find: An array containing the distinct elements of a, each exactly once, in order of first occurrence in a

``````
ALGORITHM COPY-IN

To find the unique elements of a in order:
    i ← 1
    j ← 2
    while j ≤ |a|
        for k in 1..i
            if a[j] = a[k]
                j ← j + 1
                continue while loop
        i ← i + 1
        a[i] ← a[j]
        j ← j + 1
    return a[1..i]
``````

``````
#!/usr/bin/python3

from random import randrange

# Produce the minimal in-order unique sublist of an input list.
def nub(a):
    i = 1
    j = 1
    def duplicate():
        for k in range(i):
            if a[j] == a[k]:
                return True
        return False
    while j < len(a):
        if not duplicate():
            a[i] = a[j]
            i += 1
        j += 1
    return a[:i]

# Produce a random list of small integers.
def test_case():
    n = randrange(10)
    return [randrange(5) for _ in range(n)]

for _ in range(10):
    a = test_case()
    print(a, end=" ")
    a = nub(a)
    print(a)
``````

### High Hierarchy

• It would be easy to argue about whether O(n^n) ≠ O(n!) or O(2^n) ≠ O(3^n). I did that once, not so long ago.

• The definition is pretty clear. Which is sad, because it is kind of useless for these larger classes.

• Thus, instead of using these complexity classes, we usually define EXPTIME = O(2^p(n)) where p(n) is some polynomial, that is, some function in O(n^k).

### Recurrence Relations

[Much of this material is fairly liberally borrowed from Daniel Leblanc's lecture notes.]

• Describe the next element of a sequence in terms of preceding elements

• For example

``````
a[0] = 0; a[i+1] = a[i] + 1
a[0] = 1; a[i+1] = 2 a[i]
a[0] = 0; a[i+1] = a[i] + i + 1
a[0] = 1; a[1] = 1; a[i+2] = a[i+1] + a[i]
``````
• Can also write these backward:

``````
a[0] = 0; a[i] = a[i-1] + 1
a[0] = 1; a[i] = 2 a[i-1]
a[0] = 0; a[i] = a[i-1] + i
a[0] = 1; a[1] = 1; a[i] = a[i-1] + a[i-2]
``````
• Can also write explicit form:

``````
a[i] = i
a[i] = 2^i
a[i] = sum(j=0..i, j) = i (i + 1) / 2
a[i] = fibonacci(i)
``````
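These recurrences are easy to check numerically. The sketch below (the function name is mine) unrolls each one from its base cases and compares the result against the explicit form:

``````
# Iterate a recurrence from its base cases: step(i, a) computes term i
# from the earlier terms in a. Returns the first n terms.
def unroll(bases, step, n):
    a = list(bases)
    while len(a) < n:
        a.append(step(len(a), a))
    return a[:n]

n = 10

# a[i] = a[i-1] + 1  =>  a[i] = i
assert unroll([0], lambda i, a: a[i-1] + 1, n) == [i for i in range(n)]

# a[i] = 2 a[i-1]  =>  a[i] = 2^i
assert unroll([1], lambda i, a: 2 * a[i-1], n) == [2**i for i in range(n)]

# a[i] = a[i-1] + i  =>  a[i] = i (i + 1) / 2
assert unroll([0], lambda i, a: a[i-1] + i, n) == [i * (i + 1) // 2 for i in range(n)]

# a[i] = a[i-1] + a[i-2]  =>  Fibonacci
assert unroll([1, 1], lambda i, a: a[i-1] + a[i-2], n)[:6] == [1, 1, 2, 3, 5, 8]
``````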

### Recurrence Relations Reveal Recursive Runtimes

• Consider the following recursive algorithm for summing the weight of leaves of a tree.

``````
ALGORITHM TOTAL LEAF WEIGHT

leaf-weight(t):
    if t is a leaf
        return t.weight
    s ← 0
    if t.left exists
        s ← leaf-weight(t.left)
    if t.right exists
        s ← s + leaf-weight(t.right)
    return s
``````
• To find the worst-case runtime of this algorithm for a tree of depth d, note that in the worst case

``````
T[1] = 0
T[d] = 2 T[d-1] + 1
``````
• Solve this recurrence to get an explicit form

``````
T[d+1] = 2^d - 1
``````
• There's the worst-case complexity: O(2^d)
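The closed form can be checked by unrolling the recurrence directly (a small sketch; the helper name is mine):

``````
# Unroll the worst-case recurrence T[1] = 0, T[d] = 2 T[d-1] + 1
# and compare against the closed form T[d+1] = 2^d - 1.
def t(d):
    return 0 if d == 1 else 2 * t(d - 1) + 1

for d in range(1, 16):
    assert t(d + 1) == 2**d - 1
``````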

### Example: Binary Search

``````
ALGORITHM BINARY SEARCH

binary-search(a, k, l, h):
    if h < l
        fail
    m ← (l + h) div 2
    if k < a[m]
        return binary-search(a, k, l, m-1)
    if k > a[m]
        return binary-search(a, k, m+1, h)
    return m

T[0] = 0
T[n] = T[n div 2] + 1

T[n] = lg (n+1)
``````
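The pseudocode translates almost directly to Python; in this sketch (names are mine) `fail` becomes returning None:

``````
# Search sorted array a for key k within a[l..h] inclusive; return an
# index where k occurs, or None if it is absent.
def binary_search(a, k, l, h):
    if h < l:
        return None
    m = (l + h) // 2
    if k < a[m]:
        return binary_search(a, k, l, m - 1)
    if k > a[m]:
        return binary_search(a, k, m + 1, h)
    return m

a = [1, 3, 5, 7, 9, 11]
assert a[binary_search(a, 7, 0, len(a) - 1)] == 7
assert binary_search(a, 4, 0, len(a) - 1) is None
``````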

### Example: Towers of Hanoi

• T[1] = 1; T[n] = 2 T[n-1] + 1

• T[n] = 2^n - 1
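Counting the moves made by the usual recursive solution confirms the closed form 2^n − 1 (a sketch; the function name is mine):

``````
# Count the moves needed to shift n disks in Towers of Hanoi;
# the recursion mirrors T[1] = 1, T[n] = 2 T[n-1] + 1.
def hanoi_moves(n):
    if n == 1:
        return 1
    return 2 * hanoi_moves(n - 1) + 1

for n in range(1, 16):
    assert hanoi_moves(n) == 2**n - 1
``````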

### The Master Theorem

• Describes a "general" case for solving recurrences.

• Skiena doesn't mess with it, so why should we? You can look it up if you decide you care. It's not too hard to apply.

• Main result: O(n) work plus even splitting gives O(n lg n).

### Example: Merge Sort

``````
ALGORITHM MERGE SORT

merge-sort(a):
    if |a| ≤ 1
        return a
    a1, a2 ← split(a)
    a1 ← merge-sort(a1)
    a2 ← merge-sort(a2)
    return merge(a1, a2)

T[1] = 0; T[n] = 2 T[n/2] + Θ(n)
``````
• T[n] in Θ(n lg n)
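Here is one way the pseudocode might look in Python (a sketch; the pseudocode leaves split and merge abstract, so they are inlined here):

``````
# Sort a list by splitting it at the midpoint, sorting each half
# recursively, and merging the sorted halves.
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    a1 = merge_sort(a[:mid])
    a2 = merge_sort(a[mid:])
    # Merge: repeatedly take the smaller front element.
    result = []
    i = j = 0
    while i < len(a1) and j < len(a2):
        if a1[i] <= a2[j]:
            result.append(a1[i])
            i += 1
        else:
            result.append(a2[j])
            j += 1
    return result + a1[i:] + a2[j:]

assert merge_sort([5, 2, 9, 1, 5, 6]) == [1, 2, 5, 5, 6, 9]
``````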

### Example: GCD

• Consider Dijkstra's GCD Algorithm

``````
ALGORITHM GCD

gcd(m, n):
    if m == n then return m
    else if m > n then return gcd(m-n, n)
    else return gcd(m, n-m)
``````
• Does this algorithm always terminate?

• What is a worst-case running time for this algorithm?

• How could this algorithm be rewritten iteratively?
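As a hint at the last question: each recursive call is a tail call that just shrinks m or n, so one possible iterative sketch is:

``````
# Subtraction-based GCD with the recursion converted to a loop:
# each tail call merely rebinds m or n, so a while loop suffices.
def gcd(m, n):
    while m != n:
        if m > n:
            m -= n
        else:
            n -= m
    return m

assert gcd(12, 18) == 6
assert gcd(7, 13) == 1
``````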