## Final Exam: Outline

The final exam will be given Thursday, May 7, from 7:00 to 9:00 in
Warren Weaver, room 317; and Monday, May 11, from 7:00 to 9:00 in room 202. It is
closed book and closed notes.

### Topics covered

- Search
  - Blind search -- R&N chap. 3 through 3.5.
  - Informed search -- R&N sec. 4.3.
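
The blind-search methods above can be illustrated with a short sketch. Below is a minimal Python version of iterative deepening (repeated depth-limited depth-first search); the example graph, start, and goal are invented for illustration, not taken from the readings.

```python
# Iterative deepening: depth-limited DFS with limits 0, 1, 2, ...
# The graph below is an invented toy example.

def depth_limited(graph, node, goal, limit, path):
    """Depth-first search down to a fixed depth limit."""
    if node == goal:
        return path
    if limit == 0:
        return None
    for child in graph.get(node, []):
        if child not in path:  # avoid cycles along the current path
            found = depth_limited(graph, child, goal, limit - 1, path + [child])
            if found:
                return found
    return None

def iterative_deepening(graph, start, goal, max_depth=20):
    """Deepen the limit until the goal is found or max_depth is exhausted."""
    for limit in range(max_depth + 1):
        result = depth_limited(graph, start, goal, limit, [start])
        if result:
            return result
    return None

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D', 'E'], 'D': [], 'E': ['G'], 'G': []}
print(iterative_deepening(graph, 'A', 'G'))  # shallowest path: ['A', 'C', 'E', 'G']
```

Like breadth-first search, this finds a shallowest goal node, but it uses only depth-first amounts of memory.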

- Game playing -- R&N chapter 6 through 6.4.
- Automated reasoning -- R&N chapter 7 through 7.6; chapter 8 through
  8.3; chapter 9; handouts.
  - Propositional Calculus
  - Davis-Putnam algorithm
  - Predicate Calculus
  - Resolution theorem proving
  - Horn theories; backward and forward chaining.
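
Game-tree evaluation with alpha-beta pruning, from the game-playing reading above, can be sketched on an explicit tree. The sketch below represents a tree as nested lists (an integer is a leaf value); the particular tree is an invented example.

```python
import math

# Minimax with alpha-beta pruning over an explicit game tree.
# An int is a leaf value; a list is an internal node. The tree is invented.

def alphabeta(node, alpha, beta, maximizing):
    if isinstance(node, int):
        return node
    if maximizing:
        value = -math.inf
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # beta cutoff: MIN would never allow this line of play
        return value
    else:
        value = math.inf
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:
                break  # alpha cutoff: MAX already has something better
        return value

tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
print(alphabeta(tree, -math.inf, math.inf, True))  # 3
```

Pruning never changes the minimax value; it only skips subtrees that cannot affect the result.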

- Probabilistic reasoning -- R&N chap. 13.
- Machine learning:
  - Overview -- R&N chap. 18 through 18.2.
  - 1R algorithm -- handout.
  - Nearest neighbors -- R&N sec. 20.4 except "Kernel methods," pp. 733-735.
  - Naive Bayes.
  - Decision trees and ID3 -- R&N sec. 18.3, pp. 653-660; handout.
  - Linear classifiers.
  - Neural networks: perceptron and back-propagation -- R&N sec. 20.5,
    pp. 730-748.
  - Evaluation of classification algorithms: training and test sets --
    R&N pp. 660-661.
  - Clustering.
  - Minimum description length learning -- handout.
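
As one example from the learning topics above, here is a minimal Python sketch of the 1R idea as usually described (the handout's exact formulation may differ): for each attribute, map each of its values to the most common class, and keep the attribute whose one-attribute rule makes the fewest training errors. The toy weather-style data set is invented.

```python
from collections import Counter

def one_r(examples, labels):
    """Return (attribute index, value -> class rule, training errors)."""
    n_attrs = len(examples[0])
    best = None
    for a in range(n_attrs):
        # count classes for each value of attribute a
        by_value = {}
        for x, y in zip(examples, labels):
            by_value.setdefault(x[a], Counter())[y] += 1
        # the rule predicts the most common class for each value
        rule = {v: counts.most_common(1)[0][0] for v, counts in by_value.items()}
        errors = sum(rule[x[a]] != y for x, y in zip(examples, labels))
        if best is None or errors < best[2]:
            best = (a, rule, errors)
    return best

# Invented toy data: attribute 0 is outlook, attribute 1 is temperature.
X = [('sunny', 'hot'), ('sunny', 'mild'), ('rainy', 'mild'), ('rainy', 'hot')]
y = ['no', 'no', 'yes', 'yes']
attr, rule, errors = one_r(X, y)
print(attr, rule, errors)  # attribute 0 alone predicts perfectly: 0 errors
```

Despite its simplicity, a single-attribute rule is a useful baseline against which to judge more elaborate classifiers.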

You should know the following algorithms well enough to carry them out:
depth-first search; breadth-first search; iterative deepening; hill-climbing;
game tree evaluation with alpha-beta pruning; the Davis-Putnam algorithm;
conversion to clausal form for propositional calculus and predicate calculus;
resolution theorem proving for predicate calculus; backward chaining and
forward chaining over Horn clauses; nearest neighbors learning;
Naive Bayes learning; 1R learning; ID3 learning (though
I will not give you any problem that involves computing entropies); and k-means
clustering.
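
To review the Davis-Putnam algorithm from that list, here is a minimal Python sketch of its modern DPLL form (unit propagation plus splitting on a literal); the formula at the bottom is an invented toy example. Clauses are sets of signed literals: a positive integer is an atom, a negative integer its negation.

```python
# DPLL-style satisfiability sketch: unit propagation, then branch on a literal.
# The example formula below is invented.

def dpll(clauses, assignment=()):
    clauses = [set(c) for c in clauses]
    assignment = list(assignment)
    changed = True
    while changed:                       # unit propagation to a fixed point
        changed = False
        units = [next(iter(c)) for c in clauses if len(c) == 1]
        for lit in units:
            assignment.append(lit)
            new = []
            for c in clauses:
                if lit in c:
                    continue             # clause satisfied: drop it
                if -lit in c:
                    c = c - {-lit}       # literal falsified: shrink the clause
                    if not c:
                        return None      # empty clause: contradiction
                new.append(c)
            clauses = new
            changed = True
            break
    if not clauses:
        return assignment                # every clause satisfied
    lit = next(iter(clauses[0]))         # split: try the literal both ways
    for choice in (lit, -lit):
        result = dpll(clauses + [{choice}], assignment)
        if result is not None:
            return result
    return None

# (p or q) and (not p or q) and (not q or r)
model = dpll([{1, 2}, {-1, 2}, {-2, 3}])
print(model is not None)  # True: satisfiable
```

The returned list is the set of literals made true; a `None` result means the clause set is unsatisfiable.
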
You should understand the following algorithms well, though I would not
ask you to carry them out on an exam: hill-climbing with sideways motion
and/or random restart; simulated annealing; perceptron learning and
back-propagation.
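
For perceptron learning, a minimal Python sketch on a linearly separable toy problem (the AND function) may help; the data set, learning rate, and epoch count are arbitrary illustrative choices.

```python
# Perceptron learning rule on the AND function (an invented toy problem).
# A constant 1 is appended to each input as a bias feature.

def perceptron(data, epochs=20, lr=1.0):
    w = [0.0, 0.0, 0.0]  # weights for x1, x2, and the bias input
    for _ in range(epochs):
        for x, target in data:
            x = x + (1.0,)
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            # nudge each weight by the error times its input
            for i in range(3):
                w[i] += lr * (target - out) * x[i]
    return w

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = perceptron(and_data)
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + w[2] > 0 else 0
print([predict(x) for x, _ in and_data])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches a separating weight vector; back-propagation generalizes the same error-driven update to multi-layer networks.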