If you will be unable to take the exam at the scheduled time, or if you have two or more other exams scheduled for the same day, please let me know as soon as possible. If you feel you need to take an incomplete in the class, please consult with me as soon as possible, and no later than the last class meeting on Tue. December 12. If you do not let me know in advance, and you do not show up to the exam, I will allow you to take a make-up only if you missed the exam due to a medical problem or dire family emergency. In such a case you will be expected to take the make-up as soon as possible. Under no circumstances will I grant an extended incomplete in the course if you do not arrange it with me prior to December 12.

The solutions to the exam will be posted on the class web site immediately after the exam. The exams will be graded and the course grades computed within a few days of the exam. Send me an email if you want to know your grade.

Topics on the final exam include:

- State space search: Russell and Norvig chap. 3 through sec. 3.5.
- Hill-climbing. R&N sec. 4.3, pp. 110-115.
- Adversarial game playing, game trees, alpha-beta pruning. R&N chap. 6 through sec. 6.4.
- Logic:
  - Propositional logic: Syntax, semantics, Davis-Putnam algorithm, use for combinatorial problems. R&N chap. 7 through 7.4, 7.6, handouts.
  - Predicate calculus (first-order logic): Syntax, use in expressing sentences, backward and forward chaining inference in Datalog. R&N chap. 8 through section 8.2, section 9.3 and the beginning of 9.4 (pp. 280-288). Handouts.

- Probabilistic reasoning: Foundations, Bayes law, Bayesian networks. R&N chap. 13, chap. 14 to p. 498.
- Natural language processing: General structure, parsing, semantics, ambiguity, probabilistic tagging and the K-gram model. R&N chap. 22, sections 22.1-22.3, 22.5, 22.6; sec. 23.1, pp. 834-836; handouts.
- Machine learning:
  - Overview. R&N chap. 18 through 18.2.
  - 1R algorithm. Handout.
  - Nearest neighbors. R&N sec. 20.4 except "Kernel methods," pp. 733-735.
  - Naive Bayes. Xerox from Mitchell.
  - Neural networks: Perceptrons and back-propagation. R&N sec. 20.5, pp. 730-748.
  - Decision trees and ID3. R&N sec. 18.3, pp. 653-654. Handout.
  - Evaluation of classification algorithms: Training and test sets. R&N pp. 660-661.
  - Minimum description length learning. Handout.

You should know the following algorithms well and be able to carry them out on the exam: Depth-first search, breadth-first search, iterative deepening, hill-climbing, alpha-beta pruning, conversion of propositional sentences to CNF, Davis-Putnam, forward and backward chaining in Datalog, chart parsing, 1R, nearest neighbors, Naive Bayes, classification using perceptrons and feed-forward networks.
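To give a sense of the level of detail expected, here is a sketch of one of the simpler algorithms on the list, the 1R classifier. This is my own illustration, not the code from the handout; the function and variable names are made up for the example. 1R builds, for each attribute, a rule mapping each attribute value to its majority class, and keeps the single attribute whose rule makes the fewest errors on the training data.

```python
from collections import Counter, defaultdict

def one_r(examples, target):
    """1R: for each attribute, map each value to its majority class,
    then keep the one attribute whose rule makes the fewest errors.
    `examples` is a list of dicts; `target` names the class attribute."""
    best_attr, best_rule, best_errors = None, None, None
    attributes = [a for a in examples[0] if a != target]
    for attr in attributes:
        # Tally class counts for each value of this attribute.
        counts = defaultdict(Counter)
        for ex in examples:
            counts[ex[attr]][ex[target]] += 1
        # Rule: each value predicts its majority class;
        # errors are the examples not in the majority.
        rule = {v: c.most_common(1)[0][0] for v, c in counts.items()}
        errors = sum(sum(c.values()) - c.most_common(1)[0][1]
                     for c in counts.values())
        if best_errors is None or errors < best_errors:
            best_attr, best_rule, best_errors = attr, rule, errors
    return best_attr, best_rule

# Tiny illustrative dataset (not from the handout):
examples = [
    {"outlook": "sunny", "windy": "no",  "play": "no"},
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "rain",  "windy": "no",  "play": "yes"},
    {"outlook": "rain",  "windy": "yes", "play": "yes"},
    {"outlook": "rain",  "windy": "no",  "play": "no"},
]
attr, rule = one_r(examples, "play")
```

On this data the "outlook" rule makes one error while "windy" makes two, so 1R keeps "outlook" with the rule sunny → no, rain → yes.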

You should know the ID3 algorithm in detail and be able to carry out any aspect of the algorithm that does not depend on actually calculating entropies.

You should know the general structure and issues involved in the following algorithms, but you need not memorize the fine details: perceptron learning and back-propagation learning.
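As a reminder of that general structure, perceptron learning can be sketched as follows. This is an illustrative sketch, not R&N's code; names and the choice of a +1/-1 threshold output are my own assumptions. The key idea is that the weights are nudged toward each misclassified example and left alone otherwise.

```python
def perceptron_train(data, epochs=20, rate=1.0):
    """Perceptron learning rule: on each misclassified example,
    move the weights in the direction of the correct label.
    `data` is a list of (inputs, label) pairs with label +1 or -1;
    a constant bias input of 1 is appended to every example."""
    n = len(data[0][0])
    w = [0.0] * (n + 1)              # last weight is the bias
    for _ in range(epochs):
        for x, label in data:
            x = list(x) + [1.0]      # bias term
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1
            if out != label:         # update only on mistakes
                w = [wi + rate * label * xi for wi, xi in zip(w, x)]
    return w

def perceptron_predict(w, x):
    x = list(x) + [1.0]
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1

# The AND function is linearly separable, so training converges:
and_data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w = perceptron_train(and_data)
```

Back-propagation has the same outer structure (repeated passes over the training data, error-driven weight updates) but propagates the error backward through the hidden layers using the chain rule rather than a single threshold comparison.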

The following topics were discussed or will be discussed in class, but will not be on the final exam: Entropy, information theory, clustering, any material on planning that I discuss in the last two lectures.