Master Theorem

In the study of Divide and Conquer Algorithms, a recurrence tree can be used to determine the runtime complexity. These notes focus on the master theorem, a blueprint for solving recurrences of the form

\[ T(n) = aT(n/b) + f(n). \]

  • \(n\) is the size of the problem,
  • \(a \geq 1\) is the number of subproblems,
  • \(b > 1\) is the factor by which the problem size is reduced, and
  • \(f(n)\) is the cost of the work done outside of the recursive calls.

Each subproblem is solved in \(T(n/b)\) time, and \(f(n)\) includes the cost of dividing the problem and combining the results of the subproblems. The full theorem from Introduction to Algorithms is restated below (Cormen et al. 2022).

Master Theorem

Let \(a > 0\) and \(b > 1\) be constants, and let \(f(n)\) be a driving function that is defined and nonnegative on all sufficiently large reals. Define the recurrence \(T(n)\) on \(n \in \mathbb{N}\) by

\[ T(n) = aT(n/b) + f(n), \]

where \(aT(n/b)\) actually means \(a'T(\lfloor n/b \rfloor) + a''T(\lceil n/b \rceil)\) for some constants \(a' \geq 0\) and \(a'' \geq 0\) such that \(a = a' + a''\). Then \(T(n)\) has the following asymptotic bounds:

  1. If \(f(n) = O(n^{\log_b a - \epsilon})\) for some constant \(\epsilon > 0\), then \(T(n) = \Theta(n^{\log_b a})\).
  2. If \(f(n) = \Theta(n^{\log_b a} \log^k n)\) for some constant \(k \geq 0\), then \(T(n) = \Theta(n^{\log_b a} \log^{k+1} n)\).
  3. If \(f(n) = \Omega(n^{\log_b a + \epsilon})\) for some constant \(\epsilon > 0\), and if \(a f(n/b) \leq c f(n)\) for some constant \(c < 1\) and all sufficiently large \(n\), then \(T(n) = \Theta(f(n))\).

Theorem Breakdown

The common function \(n^{\log_b a}\) is called the watershed function. The driving function \(f(n)\) is compared to it to determine which case applies. If the watershed function grows at a faster rate than \(f(n)\), case 1 applies. If they grow at the same rate, case 2 applies. If \(f(n)\) grows at a faster rate, case 3 applies.
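
For instance, in the recurrence \(T(n) = 4T(n/2) + \Theta(n)\) the watershed function is \(n^{\log_2 4} = n^2\), which grows faster than \(f(n) = \Theta(n)\), so case 1 applies and

\[ T(n) = \Theta(n^2). \]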

In case 1, the watershed function must grow faster than \(f(n)\) by at least a factor of \(n^\epsilon\) for some \(\epsilon > 0\). In case 2, the driving function grows at least as fast as the watershed function, but only faster by a factor of \(\Theta(\log^k n)\) for some \(k \geq 0\). You can think of the extra \(\log^k n\) as an augmentation to the watershed function so that the two grow at the same rate. In the most common case, \(k = 0\), which results in \(T(n) = \Theta(n^{\log_b a} \log n)\).
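
As an illustration of case 2 with \(k > 0\), the recurrence \(T(n) = 2T(n/2) + \Theta(n \log n)\) has watershed function \(n^{\log_2 2} = n\) and driving function \(\Theta(n \log^1 n)\), so \(k = 1\) and

\[ T(n) = \Theta(n \log^2 n). \]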

Since case 2 already covers driving functions that grow polylogarithmically faster than the watershed function, case 3 requires that \(f(n)\) grow at least polynomially faster: by a factor of at least \(\Theta(n^\epsilon)\) for some \(\epsilon > 0\). Additionally, the driving function must satisfy the regularity condition \(a f(n/b) \leq c f(n)\) for some constant \(c < 1\) and all sufficiently large \(n\). This condition ensures that the cost of the driving function shrinks by a constant factor at each level of the recursion, so the work at the root dominates.
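
As a quick check of the regularity condition, consider the illustrative recurrence \(T(n) = 2T(n/2) + \Theta(n^2)\). Here \(f(n) = \Theta(n^2) = \Omega(n^{1 + \epsilon})\) with \(\epsilon = 1\), and taking \(f(n) = n^2\),

\[ a f(n/b) = 2 \left(\tfrac{n}{2}\right)^2 = \tfrac{1}{2} n^2 \leq c f(n) \quad \text{for } c = \tfrac{1}{2}, \]

so case 3 gives \(T(n) = \Theta(n^2)\).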

Application of the Master Method

In most cases, the master method can be applied by looking at the recurrence and applying the relevant case. If the driving and watershed functions are not immediately obvious, you can use a different method as discussed in Divide and Conquer Algorithms.
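
When the recurrence does fit the required form, the case analysis can be mechanized. The sketch below is a hypothetical helper (the name `master_theorem` and its signature are my own, not part of these notes) that classifies recurrences whose driving functions have the form \(\Theta(n^d \log^k n)\):

```python
import math

def master_theorem(a: float, b: float, d: float, k: float = 0.0) -> str:
    """Classify T(n) = a*T(n/b) + Theta(n^d log^k n) via the master theorem.

    Only driving functions of this polynomial-polylog form are handled;
    the returned string describes the asymptotic bound on T(n).
    """
    crit = math.log(a, b)  # exponent of the watershed function n^(log_b a)
    if math.isclose(d, crit):
        # Case 2: f(n) matches the watershed function up to a log^k factor.
        return f"Theta(n^{crit:g} log^{k + 1:g} n)"
    if d < crit:
        # Case 1: the watershed function grows polynomially faster.
        return f"Theta(n^{crit:g})"
    # Case 3: f(n) grows polynomially faster; for this form of f the
    # regularity condition a*f(n/b) <= c*f(n) holds automatically.
    return f"Theta(n^{d:g} log^{k:g} n)" if k > 0 else f"Theta(n^{d:g})"

print(master_theorem(2, 2, 1))  # merge sort: Theta(n^1 log^1 n)
print(master_theorem(8, 2, 0))  # matrix multiplication: Theta(n^3)
```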

When this can be applied, it is much simpler than the other methods. Let’s revisit some of the main problems we’ve explored before discussing applications for which the master method could not be used.

Example: Merge Sort

Merge Sort has a recurrence of the form \(T(n) = 2T(n/2) + \Theta(n)\). The driving function is \(f(n) = \Theta(n)\). The constants \(a\) and \(b\) are both 2, so the watershed function is \(n^{\log_2 2}\), which is \(n\). Since \(f(n)\) grows at the same rate as the watershed function, case 2 applies with \(k = 0\). Therefore, \(T(n) = \Theta(n \log n)\).
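
The recurrence is easiest to see next to a sketch of the algorithm (an illustration of my own, not code from the notes): the two recursive calls on halves contribute \(2T(n/2)\), and the merge contributes \(\Theta(n)\).

```python
def merge_sort(xs):
    """Sort a list: two half-size recursive calls plus a Theta(n) merge."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])    # T(n/2)
    right = merge_sort(xs[mid:])   # T(n/2)
    # Merge the two sorted halves: Theta(n) work outside the recursion.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```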

Example: Matrix Multiplication

The recurrence of the divide and conquer version of matrix multiplication for square matrices is \(T(n) = 8T(n/2) + \Theta(1)\). Given \(a = 8\) and \(b = 2\), the complexity comes from the recursive calls rather than the driving function. The watershed function is \(n^{\log_2 8}\), which is \(n^3\). This grows at a faster rate than \(\Theta(1)\), so case 1 applies. Therefore, \(T(n) = \Theta(n^3)\).
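
The following NumPy sketch (an illustration, not from the text; it assumes square matrices whose dimension is a power of two) partitions by index using slices, which are views, so the divide step costs \(\Theta(1)\) while the eight half-size products give \(a = 8\), \(b = 2\):

```python
import numpy as np

def matmul_rec(A, B, C):
    """Accumulate C += A @ B recursively; n must be a power of two."""
    n = A.shape[0]
    if n == 1:
        C[0, 0] += A[0, 0] * B[0, 0]
        return
    h = n // 2
    # Slicing creates views, so partitioning the matrices costs Theta(1).
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    C11, C12, C21, C22 = C[:h, :h], C[:h, h:], C[h:, :h], C[h:, h:]
    # Eight half-size products: a = 8, b = 2.
    matmul_rec(A11, B11, C11); matmul_rec(A12, B21, C11)
    matmul_rec(A11, B12, C12); matmul_rec(A12, B22, C12)
    matmul_rec(A21, B11, C21); matmul_rec(A22, B21, C21)
    matmul_rec(A21, B12, C22); matmul_rec(A22, B22, C22)
```

Calling `C = np.zeros_like(A)` followed by `matmul_rec(A, B, C)` leaves the product in `C`.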

Example: Median Finding

Median finding has a recurrence of the form \(T(n) = T(n/5) + T(7n/10) + \Theta(n)\). With two subproblems of different sizes, the recurrence does not fit the form \(aT(n/b) + f(n)\), so the master theorem cannot be applied in this case. We could use the substitution method, recurrence trees, or the Akra-Bazzi theorem to solve this one.
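
One way to see the answer with the substitution method (a sketch, not worked in the original notes): guess \(T(n) \leq cn\) and suppose the \(\Theta(n)\) term is at most \(dn\). Then

\[ T(n) \leq \tfrac{cn}{5} + \tfrac{7cn}{10} + dn = \tfrac{9cn}{10} + dn \leq cn \]

whenever \(c \geq 10d\), giving \(T(n) = O(n)\); the \(\Theta(n)\) driving function supplies the matching lower bound, so \(T(n) = \Theta(n)\).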

Example: Cormen et al. Exercise 4.5-2

In this exercise from Introduction to Algorithms, we are asked to find the largest integer \(a\) such that an algorithm with the recurrence \(T(n) = aT(n/4) + \Theta(n^2)\) is asymptotically faster than \(\Theta(n^{\log_2 7})\). Since \(b = 4\), case 1 gives \(T(n) = \Theta(n^{\log_4 a})\) for large enough \(a\), so we need the largest integer \(a\) such that \(\log_4 a < \log_2 7\). Solving the inequality, as shown below, gives \(a = 48\).
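
Spelling out the inequality:

\[ \log_4 a < \log_2 7 \iff a < 4^{\log_2 7} = \left(2^{\log_2 7}\right)^2 = 7^2 = 49, \]

so the largest admissible integer is \(a = 48\). Case 1 then gives \(T(n) = \Theta(n^{\log_4 48})\), and \(\log_4 48 \approx 2.79 < \log_2 7 \approx 2.81\), confirming that the algorithm is asymptotically faster.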

References

Cormen, Thomas H., Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. 2022. Introduction to Algorithms. 4th ed. MIT Press. http://mitpress.mit.edu/9780262046305/introduction-to-algorithms/.