## Fundamental Algorithms, G22.1170, Fall 2005, Yap

```
The following reading recommendations will grow with
the weekly lectures and will be updated based on recitation feedback.

A first reading does not require full understanding, but gives you
a general orientation.  Just read as far as you can, starting from the
introduction (even 10 minutes of reading just before class would
be helpful).  Even learning the few buzzwords on the topic will
give you a "heads up" for understanding the lectures.

We encourage you to do the exercises on your own:
paper and pencil are important accessories when reading our notes!
We would be happy to discuss your solutions or difficulties
in person or in recitation.

CHAPTER 1: INTRODUCTION

Most of this chapter is for general orientation.

However, the asymptotic notation in Section 8 must be mastered.

Also, the Comparison Tree Model in Section 3 and the
related Information Theoretic Bound in Section 5 are required.

Use Appendix A as a general reference for this course.  But we suggest
looking over the Appendix once, to familiarize yourself with
some basic mathematical notations and concepts that we will use.

CHAPTER 2: RECURRENCES

1. Simplification 	(easy)
2. Karatsuba Multiplication (fun)
3. Rote Method 		(easy)
4. Real Induction	(tough going)
5. Basic Sum		(use as reference)
6. Standard Form	(very useful)
10.2 Master Theorem	(moderate)

The proof of the Master Theorem uses two
techniques from the following sections:

7. Domain Transform 	(easy)
8. Range Transform 	(easy)

CHAPTER 3: BALANCED SEARCH TREES

(updated Oct 8)
1. Read Sections 1 and 2 casually.
Be familiar with the definitions of
Dictionary and Priority Queues (p. 4).
2. Read Section 3: this section is essential, but
it is mostly a review of basic Binary Search Trees (BST).
3. Read Section 4: AVL Trees.

CHAPTER 4: PURE GRAPH ALGORITHMS

(updated Oct 13)
2. Read Section 2 for an overview.  Some of the
concepts are not needed right away, so use it for later reference.
(The Appendix in CHAPTER 1 also has some graph notations.)
3. Section 3 is easy background on graph representation.
4. Sections 4 and 5 are the heart of this chapter, on BFS and DFS.
5. Sections 6 and 7 are slightly more advanced, and we
will cover a selection of topics from them.

CHAPTER 5: GREEDY METHOD

(updated Oct 26)
Topics: linear bin packing, activity selection,
the Huffman code algorithm,
the dynamic Huffman code algorithm, and minimum spanning trees.
You may skip Section 5 (on matroids).

CHAPTER 6: AMORTIZATION METHOD

(updated Nov 16)
Read Sections 1, 2, 4-7, and 9.
In particular, you do not need to read the
analysis of Splay Trees.  But we would like you to
understand its application to convex hulls.

For Fibonacci heaps, you
need to know the amortized complexity bounds
when they are applied to Prim's algorithm.

CHAPTER 7: DYNAMIC PROGRAMMING

(preliminary guide)
Study Sections 1, 2 and 4.
We recommend reading Section 3 (more lightly).

CHAPTER 14: MINIMUM COST PATHS

Just read Sections 3 (Dijkstra) and 6 (Floyd-Warshall).

CHAPTER 30: NP-COMPLETENESS

Read all of it.  Focus on how to carry out reductions.

```
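The Master Theorem referenced in Chapter 2, Section 10.2 is commonly stated as follows; this is one standard formulation, and the notes' exact statement may differ in details.

```
T(n) = a·T(n/b) + f(n),  a >= 1, b > 1,  and write e = log_b(a):

  1. f(n) = O(n^(e - eps)) for some eps > 0    =>  T(n) = Theta(n^e)
  2. f(n) = Theta(n^e)                         =>  T(n) = Theta(n^e · log n)
  3. f(n) = Omega(n^(e + eps)) for some eps > 0,
     and a·f(n/b) <= c·f(n) for some c < 1     =>  T(n) = Theta(f(n))
```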
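Chapter 2 lists Karatsuba multiplication among the recurrence examples. As a rough illustration (my own Python sketch, not the notes' presentation): the trick is to replace four half-size multiplications with three, giving the recurrence T(n) = 3T(n/2) + O(n).

```python
def karatsuba(x, y):
    """Multiply two non-negative integers using three recursive
    multiplications instead of four: T(n) = 3T(n/2) + O(n)."""
    if x < 10 or y < 10:                      # base case: a factor is tiny
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    hi_x, lo_x = x >> m, x & ((1 << m) - 1)   # split x = hi_x*2^m + lo_x
    hi_y, lo_y = y >> m, y & ((1 << m) - 1)   # split y = hi_y*2^m + lo_y
    a = karatsuba(hi_x, hi_y)                 # high parts
    b = karatsuba(lo_x, lo_y)                 # low parts
    c = karatsuba(hi_x + lo_x, hi_y + lo_y)   # combined parts
    # c - a - b = hi_x*lo_y + lo_x*hi_y, the middle coefficient
    return (a << (2 * m)) + ((c - a - b) << m) + b
```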
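Chapter 3, Section 4 covers AVL Trees, whose basic rebalancing primitive is a rotation. Here is a minimal Python sketch of a left rotation (an illustration under my own naming, not the notes' code):

```python
class Node:
    """A bare BST node; AVL implementations also track heights."""
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_left(x):
    """Left rotation at x: pulls x.right up while preserving BST order.
    AVL trees use this (and its mirror) to restore height balance
    after insertions and deletions."""
    y = x.right
    x.right, y.left = y.left, x
    return y                      # y is the new subtree root
```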
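Sections 4 and 5 of Chapter 4 center on BFS and DFS. As a small illustration (my own Python sketch; the graph representation is an assumption, not the notes'), BFS from a source vertex computes shortest distances in edge counts:

```python
from collections import deque

def bfs_distances(adj, s):
    """Breadth-first search from s on a graph given as an adjacency
    list {vertex: [neighbors]}; returns shortest edge-count distances."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in dist:          # first visit gives the shortest distance
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist
```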
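For the Huffman code algorithm of Chapter 5, a minimal greedy sketch (my own illustration; the symbol/frequency representation is an assumption): repeatedly merge the two least frequent subtrees.

```python
import heapq

def huffman_code(freq):
    """Greedy Huffman construction: repeatedly merge the two least
    frequent trees; returns {symbol: bitstring}."""
    if len(freq) == 1:                          # a lone symbol gets code "0"
        return {next(iter(freq)): "0"}
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    tick = len(heap)                            # tie-breaker: dicts never compared
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tick, merged))
        tick += 1
    return heap[0][2]
```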
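For Dijkstra's algorithm in Chapter 14, Section 3, a standard heap-based sketch (my own Python illustration, assuming non-negative edge weights and an adjacency-list representation):

```python
import heapq

def dijkstra(adj, s):
    """Single-source shortest paths on a graph given as
    {u: [(v, weight), ...]} with non-negative weights."""
    dist = {s: 0}
    heap = [(0, s)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                     # stale queue entry; skip it
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd             # relax the edge (u, v)
                heapq.heappush(heap, (nd, v))
    return dist
```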
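For Floyd-Warshall in Chapter 14, Section 6, the dynamic-programming formulation over intermediate vertices can be sketched as follows (again my own illustration; vertices are assumed numbered 0..n-1):

```python
def floyd_warshall(n, edges):
    """All-pairs shortest paths; edges is a list of (u, v, weight)
    triples on vertices 0..n-1."""
    INF = float("inf")
    dist = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        dist[u][v] = min(dist[u][v], w)
    for k in range(n):                  # allow k as an intermediate vertex
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```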