Consider a recurrence relation of the following structure.

$$T(n) \le aT(n/b) + f(n)$$

This relation and its coefficients tell us a lot about the algorithm. Each letter represents a part of how the algorithm works.

- $a$ tells us the number of subproblems we divide into.
- $n/b$ is the size of each subproblem. We can have multiple $T(n/b)$ terms in a single relation.
- $f(n)$ is the time it takes to divide and/or recombine the subproblems. We assume $f(n)$ is polynomial, so we can write $f(n)=c_{1}n^{d}$ for some constants $c_{1}$ and $d$.

Finally, $T(n)$ then tells us the overall running time for an input of size $n$.
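As a concrete example (my own, not from the text), merge sort fits this template with $a=2$, $b=2$, and $f(n)=O(n)$, giving $T(n) \le 2T(n/2) + c_{1}n$:

```python
def merge_sort(xs):
    """Merge sort: splits into a = 2 subproblems of size n/b = n/2,
    with an f(n) = O(n) merge step, so T(n) <= 2*T(n/2) + c*n."""
    if len(xs) <= 1:               # base case: constant time
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])    # T(n/2)
    right = merge_sort(xs[mid:])   # T(n/2)
    # Merge step: the f(n) = O(n) recombination cost.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```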

## Intuition and results

The master theorem captures the balance between the total work across the subproblems (how many there are, and how big each one is) and the time it takes to recombine them. The two compete to dominate the runtime.

- $f(n)=n^{d}≪n^{\log_{b}a}$, i.e. $d<\log_{b}a$. In this case, the cost to recombine is cheap compared to the work done across the subproblems, so the subproblems dominate. Then, $T(n)=O(n^{\log_{b}a})$.
- $f(n)=n^{d}≫n^{\log_{b}a}$, i.e. $d>\log_{b}a$. Here, the cost to recombine is expensive, and will dominate. Therefore, we have $T(n)=O(n^{d})$.
- $f(n)=n^{d}=n^{\log_{b}a}$, i.e. $d=\log_{b}a$. Here, the cost of recombining and the work in the subproblems have the “same weight” and neither dominates the other. Therefore, we can’t ignore either, and $T(n)=O(n^{d}\log n)$.
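The three cases above can be sketched as a small classifier, comparing $d$ against the critical exponent $\log_{b}a$ (a minimal illustration; the function name is my own):

```python
from math import log

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) by comparing d with log_b(a)."""
    crit = log(a, b)                   # critical exponent log_b(a)
    if d < crit:
        return f"O(n^{crit:.3g})"      # subproblems dominate
    elif d > crit:
        return f"O(n^{d})"             # recombination dominates
    else:
        return f"O(n^{d} log n)"       # balanced: extra log factor

print(master_theorem(2, 2, 1))  # merge sort → O(n^1 log n)
print(master_theorem(8, 2, 2))  # → O(n^3)
print(master_theorem(2, 2, 2))  # → O(n^2)
```

Note that comparing `d` with a floating-point `log(a, b)` is fine for illustration, but exact-equality checks like this can misfire for values where the logarithm isn't representable exactly.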

## Proof

Draw a tree. At level $0$, we have one node of size $n$. At level $1$, we have $a$ nodes of size $n/b$. At level $2$, we have $a^{2}$ nodes of size $n/b^{2}$, and so on.

The height $h$ of this tree satisfies the equation $n/b^{h}=1$. Solving this, we have $h=\log_{b}n$. We get the total cost by summing the cost of each level, over all nodes:

$$T(n)=\sum_{i=0}^{h} a^{i}\cdot\left(\frac{n}{b^{i}}\right)^{d}=n^{d}\sum_{i=0}^{h}\left(\frac{a}{b^{d}}\right)^{i}$$

Once again, we can consider the three cases: $a<b^{d}$, $a=b^{d}$, and $a>b^{d}$. (These correspond to the cases above, since $a<b^{d}$ is equivalent to $\log_{b}a<d$.) The other important part to realize is that the summation we have (after pulling out $n^{d}$) is a *geometric series*. This tells us how the summation will evaluate, and which term in the summation will dominate the others.
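As a sanity check (my own, for $n$ a power of $b$ and base case $T(1)=1$), we can unroll the recurrence directly and compare it against the level-by-level sum:

```python
def T(n, a, b, d):
    """Unroll T(n) = a*T(n/b) + n^d directly, with T(1) = 1."""
    if n <= 1:
        return 1
    return a * T(n // b, a, b, d) + n ** d

def level_sum(n, a, b, d):
    """Sum the tree level by level: sum_{i=0}^{h} a^i * (n/b^i)^d.
    Exact when n is a power of b, so integer division never truncates."""
    total, size, i = 0, n, 0
    while size >= 1:
        total += (a ** i) * size ** d  # a^i nodes, each costing size^d
        size //= b
        i += 1
    return total

# Both views agree, e.g. for merge sort's parameters (a=2, b=2, d=1):
print(T(8, 2, 2, 1), level_sum(8, 2, 2, 1))  # → 32 32
```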