Overview
A recurrence relation specifies a sequence by expressing each term as a function of earlier terms. In algorithms, recurrences model running time, work, or number of subproblems as input size changes (e.g., halving in divide-and-conquer or subtracting a constant in iterative algorithms). Solving or bounding recurrences yields asymptotic costs like O(n log n) for merge sort or O(log n) for binary search.
This note surveys common recurrence forms, solution techniques (recursion trees, substitution/induction, characteristic equations), and practical guidance for using them to analyze algorithms.
Note
There are two broad flavors in CS notes:
- Cost recurrences (e.g., T(n) = a T(n/b) + f(n)): model time or work.
- Value recurrences (e.g., F(n) = F(n−1) + F(n−2)): define combinatorial sequences or DP states.
Techniques overlap but have different standard tools.
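The distinction can be made concrete with a small sketch: computing the value F(n) while counting recursive calls shows the value recurrence and the cost recurrence side by side (fib_counted and the mutable counter are illustrative names, not from the note):

```python
def fib_counted(n, counter):
    """Naive Fibonacci. The return value follows F(n) = F(n-1) + F(n-2);
    counter[0] follows the cost recurrence T(n) = T(n-1) + T(n-2) + 1."""
    counter[0] += 1
    if n < 2:
        return n
    return fib_counted(n - 1, counter) + fib_counted(n - 2, counter)

calls = [0]
value = fib_counted(10, calls)
print(value, calls[0])  # F(10) = 55, but 177 calls were made
```

The value grows like φ^n / √5, while the call count solves T(n) = T(n−1) + T(n−2) + 1: both exponential, but they are different sequences.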
Motivation
When an algorithm reduces a problem into subproblems and combines solutions, its cost naturally depends on the cost of those subproblems. Recurrences encode this relationship:
- Divide-and-conquer: split size n into a subproblems each of size n/b, do combine work f(n).
- Incremental: reduce n by a constant (n → n−1), add a per-step cost.
- Exponential branching: make choices at each step (backtracking/DFS), yielding T(n) = T(n−1) + T(n−2) or similar.
They also appear in dynamic programming, where the value of a state depends on previously computed states.
Definition and Formalism
A recurrence for a sequence {T(n)} is an equation defining each term from earlier terms. Common forms:
- Linear, homogeneous with constant coefficients: T(n) = c1 T(n−1) + c2 T(n−2) + … + ck T(n−k), with T(0), …, T(k−1) given.
- Non-homogeneous (adds a function): T(n) = c1 T(n−1) + … + ck T(n−k) + g(n).
- Divide-and-conquer (multiplicative + additive): T(n) = a T(⌈n/b⌉) + f(n) for integers a ≥ 1, b > 1.
Base cases pin down small inputs: T(1), T(0), or T(n0) for a threshold n0.
Tip
Always state base cases and domains (e.g., “assume n is a power of b” or “solve for n ≥ 1 and extend with floors/ceilings”). Missing base cases cause incorrect constants or off-by-one mistakes.
Example or Illustration
Classic algorithms → recurrences
- Merge sort: T(n) = 2T(n/2) + Θ(n) → T(n) = Θ(n log n).
- Binary search: T(n) = T(n/2) + Θ(1) → T(n) = Θ(log n).
- Strassen (matrix multiply): T(n) = 7T(n/2) + Θ(n^2) → T(n) = Θ(n^{log₂ 7}) ≈ Θ(n^{2.807}).
- Insertion sort (worst case): T(n) = T(n−1) + Θ(n) → T(n) = Θ(n^2) by summation.
- Balanced tree height: H(n) = H(⌊n/2⌋) + 1 → H(n) = Θ(log n).
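The merge-sort recurrence above can be checked directly: summing the per-merge work C(n) = 2 C(n/2) + n gives exactly n log₂ n for powers of two (merge_cost is an illustrative helper, not a real merge sort):

```python
import math

def merge_cost(n):
    """C(n) = 2*C(n // 2) + n with C(1) = 0: total elements
    merged across all levels of merge sort."""
    if n <= 1:
        return 0
    return 2 * merge_cost(n // 2) + n

for n in [2, 8, 64, 1024]:
    # Exact for powers of two: C(n) = n * log2(n).
    assert merge_cost(n) == n * math.log2(n)
```

For sizes that are not powers of two, floors/ceilings perturb this by lower-order terms, which is why the asymptotic statement is Θ(n log n) rather than an exact formula.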
Properties and Relationships
Three canonical solution methods
- Recursion tree / expansion: repeatedly expand the recurrence to visualize the work per level and sum the resulting series. Great for intuition and quick upper/lower bounds.
- Substitution (a.k.a. induction) method: guess a bound T(n) ≤ c · h(n) and prove it by induction, choosing constants that make the inequality hold. Use for tight proofs, especially with awkward f(n) or floors/ceilings.
- Master / Akra–Bazzi theorems: provide ready-made asymptotic bounds for T(n) = ∑ a_i T(n/b_i) + g(n) under regularity conditions. See Recurrences — Master Theorem for the common a T(n/b) + f(n) form and brief notes on Akra–Bazzi (handles multiple subproblem sizes and additive perturbations).
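The recursion-tree method can be sketched numerically: expand T(n) = a T(n/b) + f(n) level by level and record the work at each level (level_sums is a hypothetical helper; it assumes exact division for clarity):

```python
def level_sums(n, a, b, f):
    """Per-level work in the recursion tree of T(n) = a T(n/b) + f(n):
    level i has a**i subproblems of size n / b**i."""
    sums, size, count = [], n, 1
    while size >= 1:
        sums.append(count * f(size))  # work done at this level
        if size == 1:
            break
        size //= b
        count *= a
    return sums

# Merge-sort shape: every level does n total work, with log2(n) + 1 levels.
print(level_sums(8, 2, 2, lambda m: m))  # [8, 8, 8, 8]
```

Summing the list recovers the familiar picture: equal levels give n log n, geometrically decreasing levels are dominated by the root, and geometrically increasing levels are dominated by the leaves.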
Linear recurrences with constant coefficients
For T(n) = α T(n−1) + β T(n−2):
- Solve the characteristic equation x^2 − αx − β = 0 with roots r1, r2.
- General solution: T(n) = A r1^n + B r2^n (or T(n) = (A + Bn) r^n for a repeated root r).
- Determine A, B from the base cases.
This covers Fibonacci-like growth, geometric decays/growths, and many DP value recurrences.
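The three steps above can be sketched in code for the distinct-real-root case (solve_linear_recurrence is an illustrative name; it does not handle complex or repeated roots):

```python
import math

def solve_linear_recurrence(alpha, beta, t0, t1):
    """Closed form for T(n) = alpha*T(n-1) + beta*T(n-2) with T(0)=t0, T(1)=t1.
    Assumes the characteristic equation x^2 - alpha*x - beta = 0 has
    distinct real roots."""
    disc = math.sqrt(alpha * alpha + 4 * beta)
    r1, r2 = (alpha + disc) / 2, (alpha - disc) / 2
    # Fit A + B = t0 and A*r1 + B*r2 = t1 from the base cases.
    B = (t1 - t0 * r1) / (r2 - r1)
    A = t0 - B
    return lambda n: A * r1 ** n + B * r2 ** n

fib = solve_linear_recurrence(1, 1, 0, 1)  # Fibonacci: roots φ and 1 - φ
assert round(fib(10)) == 55
```

For Fibonacci this reproduces Binet's formula; rounding is needed only because floating-point powers of the roots are inexact.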
Typical growth regimes for a T(n/b) + f(n)
Let p satisfy a · (1/b)^p = 1, i.e., p = log_b a. Compare f(n) with n^p:
- Subcritical, f(n) = O(n^{p−ε}): total dominated by leaves → T(n) = Θ(n^p).
- Critical, f(n) = Θ(n^p log^k n): extra logarithmic factor → T(n) = Θ(n^p log^{k+1} n).
- Supercritical, f(n) = Ω(n^{p+ε}) and regular: combine step dominates → T(n) = Θ(f(n)).
Note
The “regularity condition” usually requires a f(n/b) ≤ c f(n) for some c < 1 and all large n, to prevent pathological oscillations. See the Master/Akra–Bazzi page for precise statements.
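The subcritical and supercritical regimes can be observed empirically by evaluating the recurrence exactly (a toy evaluator with a = 4, b = 2, so p = log₂ 4 = 2; the setup is illustrative):

```python
def T(n, a, b, f):
    """Exact value of T(n) = a*T(n // b) + f(n) with T(1) = f(1)."""
    return f(1) if n <= 1 else a * T(n // b, a, b, f) + f(n)

n = 1 << 12  # power of two, so division is exact

# Subcritical: f(n) = n is O(n^{2-ε}) → leaves dominate, T(n) = Θ(n²).
sub = T(n, 4, 2, lambda m: m)       # solves exactly to 2n² - n

# Supercritical: f(n) = n³ is Ω(n^{2+ε}) and regular → T(n) = Θ(f(n)).
sup = T(n, 4, 2, lambda m: m ** 3)  # solves exactly to 2n³ - n²

print(sub / n**2, sup / n**3)  # both ratios settle near the constant 2
```

In the first case the n^p = n² leaf term dominates; in the second, f(n) = n³ at the root dominates, matching the classification above.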
Implementation or Practical Context
Turning a recurrence into code
- Cutoff thresholds: many real sorts (quick/merge/introsort) cut over to Insertion Sort below a small k to reduce overhead; this changes base cases and constants but not the leading asymptotics.
- Memoization vs naive recursion: for value recurrences like F(n) = F(n−1) + F(n−2), naive recursion is Θ(φ^n), but memoization or bottom-up DP solves it in Θ(n). The recurrence doesn’t fix the algorithm’s complexity; implementation choices do.
- Parallelism: if the a subproblems are independent, a span (critical-path) recurrence S(n) = S(n/b) + polylog(n) may be much smaller than the total work T(n). Analyze work and span separately for parallel algorithms.
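The memoization point is a one-liner in practice: the same value recurrence, evaluated with each state cached, runs in Θ(n) time:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """F(n) = F(n-1) + F(n-2), F(0) = 0, F(1) = 1.
    The cache ensures each state is computed exactly once: Θ(n) time."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025 — infeasible for the naive Θ(φ^n) version
```

Bottom-up DP with two rolling variables achieves the same Θ(n) time in Θ(1) space.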
Workflow for analyzing a new algorithm
- Write the clean recurrence (state base cases, assume smooth sizes).
- Sanity-check bounds by plugging in extremes:
  - If all combine work vanished (f(n) = 0), does T(n) drop to the leaf cost Θ(n^{log_b a})?
  - If there were no subproblems (a = 0), does T(n) equal pure f(n)?
- Pick a method:
  - recursion tree for a quick picture,
  - Master/Akra–Bazzi for standard forms,
  - substitution for custom f(n) or when regularity is unclear.
- Repair floors/ceilings with slack (replace n/b by ⌈n/b⌉ and absorb constants).
- Document assumptions (e.g., n a power of b, f monotone), then remove them with padding or monotonicity arguments.
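The sanity-check step can be automated with a small harness (check_bound and the exact evaluator are hypothetical helpers): evaluate the recurrence exactly, including ceilings, and watch the ratio against a guessed bound.

```python
import math

def T(n, a, b, f):
    """Exact value of T(n) = a*T(ceil(n/b)) + f(n) with T(1) = f(1)."""
    return f(1) if n <= 1 else a * T(-(-n // b), a, b, f) + f(n)

def check_bound(a, b, f, guess, sizes):
    """Ratios T(n)/guess(n); a bounded, non-growing ratio supports the guess."""
    return [T(n, a, b, f) / guess(n) for n in sizes]

# Merge-sort-like: T(n) = 2T(⌈n/2⌉) + n against the guess n*log2(n) + n.
ratios = check_bound(2, 2, lambda m: m,
                     lambda m: m * math.log2(m) + m,
                     [10, 100, 1000])
print(ratios)  # stays bounded → consistent with T(n) = Θ(n log n)
```

This does not replace a proof, but it catches wrong guesses (a ratio drifting to 0 or ∞) before you invest in a substitution argument.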
Tip
In substitution proofs, include a negative slack term like −c′n in your guess to absorb rounding and base-case noise: e.g., guess T(n) ≤ c n log n − c′n. This often makes the inequality go through.
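The induction step for that strengthened guess might look like this (a sketch for T(n) = 2T(⌊n/2⌋) + n, using ⌊n/2⌋ ≤ n/2 and monotonicity of the guess):

```latex
% Guess T(n) \le c\,n \log_2 n - c' n. Inductive step:
\begin{aligned}
T(n) &\le 2\left(c\,\tfrac{n}{2} \log_2 \tfrac{n}{2} - c'\tfrac{n}{2}\right) + n \\
     &= c\,n \log_2 n - c\,n - c' n + n \\
     &\le c\,n \log_2 n - c' n \qquad \text{provided } c \ge 1.
\end{aligned}
```

Without the −c′n slack, the leftover +n term has nowhere to be absorbed and the naive guess T(n) ≤ c n log n fails at the inductive step near small n.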
Common Misunderstandings
Warning
Confusing value vs cost recurrences. F(n) = F(n−1) + F(n−2) describes a value. The time for naive recursion follows a different recurrence, T(n) = T(n−1) + T(n−2) + O(1) (exponential). Don’t conflate them.
Warning
Ignoring base cases. Choosing T(1) = Θ(1) vs T(2) = Θ(1) changes constants and can shift small-n behavior, affecting where a hybrid algorithm should switch strategies.
Warning
Overusing the Master Theorem. It doesn’t apply to every f(n) (e.g., non-polynomial oscillations, negative terms) or to multiple distinct subproblem sizes unless generalized (Akra–Bazzi).
Warning
Forgetting floors/ceilings. Proofs assuming n divisible by b must be patched; otherwise the bound may fail at boundary sizes.
Warning
Assuming tightness from a single bound. A recursion tree can give an upper bound; you still need a matching lower bound (or a theorem guaranteeing tightness) for Θ(·).
Broader Implications
Recurrences tie algorithm structure to asymptotic behavior. Better splits or cheaper combine steps translate into provable improvements through a, b, and the shape of f(n). They also clarify engineering trade-offs:
- Changing pivot selection in quicksort alters the distribution of subproblem sizes (affecting the expected recurrence).
- Introducing cutoffs and cache-aware merges modifies f(n) and base cases, explaining empirical speedups without changing the Θ class.
- For parallel algorithms, distinguishing work vs span recurrences guides depth-first scheduling and grain size.
Summary
Recurrence relations are the lingua franca for analyzing recursive and incremental algorithms. For a T(n/b) + f(n), compare f(n) with n^{log_b a} to classify regimes; use recursion trees for intuition, substitution for rigorous bounds, and Master/Akra–Bazzi when applicable. For linear constant-coefficient value recurrences, solve via characteristic equations. Always state base cases, handle floors/ceilings, and document assumptions. With these tools, you can translate algorithm structure directly into asymptotic performance guarantees.