\documentclass[11pt]{article}
\pagestyle{empty}
\newcommand{\sig}{\sigma}
\newcommand{\ra}{\rightarrow}
\newcommand{\eps}{\epsilon}
\newcommand{\ah}{\alpha}
\newcommand{\ben}{\begin{enumerate}}
\newcommand{\een}{\end{enumerate}}
\newcommand{\hsa}{\hspace*{1cm}}
\newcommand{\hsb}{\hspace*{2cm}}
\newcommand{\NIL}{{\tt NIL}}
\newcommand{\So}{\\ {\tt Solution:}}
\begin{document}
\begin{center} {\bf Comments on the Master Theorem of \S 4.5}
\end{center}
Let $a,b$ be positive integers and consider the recursion
\[ T(n)=bT(n/a) \]
with initial value $T(1)=1$. Let's only look at powers of $a$.
So $T(a)=bT(1)=b$, $T(a^2)=bT(a)=b^2$ and, in general, $T(a^i)=b^i$.
But now -- how do we write $T(n)$ as a function of $n$?
Let's take the case $a=2$, $b=4$, so $T(2)=4$, $T(4)=16$, $T(8)=64$,
and we see that the function is $T(n)=n^2$.
For general $a,b$ we do some manipulation.
Setting \[ n = a^i \]
we have
\[ T(n)=b^i = (a^{\log_ab})^i = (a^i)^{\log_ab} = n^{\log_ab} \]
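As a quick check against the numbers above, take $a=2$, $b=4$. Then
\[ n^{\log_24} = n^2, \]
so, for instance, $T(8)=8^2=64$, agreeing with the values computed earlier.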
(Here let's brush up on basic facts of logarithms. For any positive $a,b$
with $a\neq 1$,
\[ \log_ab = \frac{\ln b}{\ln a}, \]
but in many important cases this is an integer because $b$ is a
power of $a$; for example, $\log_28=3$, $\log_3 81=4$ and, a frequent
example, $\log_aa=1$ for any $a$. In other cases, like $a=2$, $b=7$,
the value $\log_27$ is a constant that can easily be found on your
calculator by the above formula.)
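The change-of-base formula also handles cases where the answer is not an
integer. For instance,
\[ \log_48 = \frac{\ln 8}{\ln 4} = \frac{3\ln 2}{2\ln 2} = \frac{3}{2}. \]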
This goes a long way to understanding the Master Theorem.
We look at the recursion (the initial conditions don't
affect things asymptotically)
\[ T(n)=bT(n/a) + f(n) \]
where we call $f(n)$ the {\em overhead}. Then, basically,
the Master Theorem has three cases. Which case you are in
depends on how the overhead compares to what $T(n)$ would
be (namely, $n^{\log_ab}$) if there were no overhead.
\ben
\item {\tt Low Overhead}. Suppose $f(n)$ is substantially
smaller than $n^{\log_ab}$. Then the overhead does not
asymptotically affect the recursion and $T(n)=\Theta(n^{\log_ab})$.
\item {\tt Just Right Overhead}. Suppose $f(n)=\Theta(n^{\log_ab})$.
(Perhaps surprisingly, this comes up in many important cases.)
Then the overhead has a logarithmic
effect and $T(n)=\Theta(n^{\log_ab}\lg n)$.
\item {\tt High Overhead}. Suppose $f(n)$ is substantially
higher than $n^{\log_ab}$. Note then that $T(n)\geq f(n)$
from the recursion and so $T(n)$ could not possibly be
$\Theta(n^{\log_ab})$. In this case the overhead is the
main contributor to $T(n)$ and the result is $T(n)=\Theta(f(n))$.
\een
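A classic instance of the Just Right Overhead case is Merge Sort (a standard
example, not worked out in these notes): there
\[ T(n)=2T(n/2)+\Theta(n), \]
so $a=b=2$, the no-overhead solution is $n^{\log_22}=n$, the overhead
$\Theta(n)$ matches it exactly, and the theorem gives $T(n)=\Theta(n\lg n)$.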
Comment: The word ``substantially'' in describing Low and High
Overhead has a technical meaning which is given in the precise
statement of the Master Theorem 4.1 on Page 73. But we don't
want to get that deeply into it in this class. Basically, anytime
$f(n)$ is asymptotically higher (lower) than $n^{\log_ab}$
then it will be (in this class) substantially higher (lower).
But be sure to interpret asymptotically higher in Theta-land.
Suppose, for example,
\[ T(n) = 8T(n/2) + 7n^3+234n^2 \]
Here $\log_28=3$, so the no-overhead solution would be $n^3$. In Theta-Land,
$f(n)=7n^3+234n^2=\Theta(n^3)$, so we are in the Just Right Overhead case
and $T(n)=\Theta(n^3\lg n)$.
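For contrast, two similar recursions land in the other two cases. With
\[ T(n) = 8T(n/2) + n \]
we again have $n^{\log_28}=n^3$, but now $f(n)=n$ is substantially lower,
so this is Low Overhead and $T(n)=\Theta(n^3)$. With
\[ T(n) = 2T(n/2) + n^2 \]
we have $n^{\log_22}=n$ and $f(n)=n^2$ is substantially higher, so this is
High Overhead and $T(n)=\Theta(f(n))=\Theta(n^2)$.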
\end{document}