If you will be unable to take the exam at the scheduled time, or if you have two or more other exams scheduled for the same day, please let me know as soon as possible. If you feel you need to take an incomplete in the class, please consult with me as soon as possible, and no later than the last class meeting on Thursday, May 8. If you do not let me know in advance and do not show up to the exam, I will allow you to take a make-up only if you missed the exam because of a medical problem or a dire family emergency. In that case, you will be expected to take the make-up as soon as possible. Under no circumstances will I grant an extended incomplete in the course if you have not arranged it with me before May 8.
The solutions to the exam will be posted on the class web site immediately after the exam. The exams will be graded and the course grades computed within a few days of the exam.
Topics on the final exam include:
Natural language processing: Context-free grammars and probabilistic CFGs.
CYK algorithm. Compositional semantics. Ambiguity resolution.
(Russell & Norvig chapter 23 sections 23.1 (all), 23.2 (the beginning section, not 23.2.1 or 23.2.2), plus handouts.)
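To review the CYK algorithm, here is a minimal recognizer sketch in Python; the function name, grammar encoding, and toy grammar are my own, not taken from the textbook. It assumes the grammar is already in Chomsky normal form.

```python
def cyk(words, grammar, start="S"):
    """CYK recognizer for a grammar in Chomsky normal form.

    grammar is a list of (lhs, rhs) rules where rhs is either a
    1-tuple holding a terminal word or a 2-tuple of nonterminals.
    """
    n = len(words)
    # table[i][j] = set of nonterminals that derive words[i..j]
    table = [[set() for _ in range(n)] for _ in range(n)]
    # Base case: fill the diagonal from the lexical rules.
    for i, w in enumerate(words):
        for lhs, rhs in grammar:
            if rhs == (w,):
                table[i][i].add(lhs)
    # Combine adjacent spans, shortest spans first.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for lhs, rhs in grammar:
                    if (len(rhs) == 2 and rhs[0] in table[i][k]
                            and rhs[1] in table[k + 1][j]):
                        table[i][j].add(lhs)
    return start in table[0][n - 1]

# Toy grammar: S -> NP VP, VP -> V NP, NP -> "she" | "apples", V -> "eats"
g = [("S", ("NP", "VP")), ("VP", ("V", "NP")),
     ("NP", ("she",)), ("NP", ("apples",)), ("V", ("eats",))]
print(cyk(["she", "eats", "apples"], g))  # True
```

A probabilistic CYK parser has the same table structure but stores the best probability (and a backpointer) per nonterminal instead of a set.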
State space search. State spaces. Depth-first search, breadth-first
search, iterative deepening, simple hill-climbing, sideways motion, random restarts.
(R&N Chap 3 through 3.4.1, 3.4.3-3.4.5; Chapter 4 through 4.1.1.)
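As a refresher on how depth-limited search and iterative deepening fit together, here is a minimal sketch; the graph representation and all names are mine, not from the text.

```python
def depth_limited(graph, node, goal, limit, path=None):
    """Depth-limited DFS; returns a path to goal or None.

    The goal test happens before the cutoff, so `limit` counts edges.
    """
    path = (path or []) + [node]
    if node == goal:
        return path
    if limit == 0:
        return None
    for nbr in graph.get(node, []):
        if nbr not in path:  # avoid cycles along the current path
            found = depth_limited(graph, nbr, goal, limit - 1, path)
            if found:
                return found
    return None

def iterative_deepening(graph, start, goal, max_depth=10):
    """Rerun depth-limited search with growing limits: this gives
    BFS's shallowest-solution guarantee with DFS's memory footprint."""
    for limit in range(max_depth + 1):
        result = depth_limited(graph, start, goal, limit)
        if result:
            return result
    return None

g = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"], "D": ["F"], "E": ["F"]}
print(iterative_deepening(g, "A", "F"))  # ['A', 'B', 'D', 'F']
```

Note that the repeated re-expansion of shallow nodes is cheap when the branching factor is above 1, which is why iterative deepening is usually the preferred uninformed method.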
Game playing. AND/OR trees, MIN/MAX trees, alpha-beta pruning.
(R&N chapter 5 through 5.3 except 5.2.2)
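For alpha-beta practice, here is a compact sketch you can trace by hand. The nested-list tree encoding is my own convention; the leaf values follow the standard three-ply MIN/MAX example.

```python
def alphabeta(node, alpha, beta, maximizing):
    """Minimax with alpha-beta pruning. The game tree is given as
    nested lists; leaves are numeric payoffs for the MAX player."""
    if not isinstance(node, list):
        return node  # leaf: return its payoff
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:  # beta cutoff: MIN above never allows this
                break
        return value
    value = float("inf")
    for child in node:
        value = min(value, alphabeta(child, alpha, beta, True))
        beta = min(beta, value)
        if beta <= alpha:      # alpha cutoff: MAX above never allows this
            break
    return value

# MAX at the root, MIN nodes below, then leaves.
tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
print(alphabeta(tree, float("-inf"), float("inf"), True))  # 3
```

On this tree, pruning skips the leaves 4 and 6 in the second MIN node: once the 2 is seen, MIN can already force a value below the 3 MAX has in hand.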
Propositional logic. CNF. Translating sentences into CNF.
Davis-Putnam algorithm. Compiling problems into
satisfiability. Predicate calculus.
(R&N chapter 7 through 7.4; 7.6 through 7.6.1. Skim 7.7. R&N chapter 8 through 8.2. Handouts)
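For Davis-Putnam practice, here is a small DPLL-style satisfiability sketch over clause sets. The encoding (literals as strings, negation marked with a leading '-') and all function names are my own conventions.

```python
def negate(lit):
    """Flip a literal's sign: 'P' <-> '-P'."""
    return lit[1:] if lit.startswith("-") else "-" + lit

def assign(clauses, lit, assignment):
    """Make lit true: drop satisfied clauses, shrink the rest.
    Returns (None, None) if an empty clause (contradiction) appears."""
    assignment = dict(assignment, **{lit.lstrip("-"): not lit.startswith("-")})
    out = []
    for c in clauses:
        if lit in c:
            continue                     # clause satisfied
        reduced = c - {negate(lit)}      # falsified literal removed
        if not reduced:
            return None, None
        out.append(reduced)
    return out, assignment

def dpll(clauses, assignment=None):
    """Davis-Putnam-Logemann-Loveland: unit propagation plus
    branching. Returns a satisfying assignment (dict) or None."""
    assignment = dict(assignment or {})
    while True:                          # unit propagation
        units = [c for c in clauses if len(c) == 1]
        if not units:
            break
        clauses, assignment = assign(clauses, next(iter(units[0])), assignment)
        if clauses is None:
            return None
    if not clauses:
        return assignment                # all clauses satisfied
    lit = next(iter(clauses[0]))         # branch on some literal
    for choice in (lit, negate(lit)):
        new_clauses, new_assign = assign(clauses, choice, assignment)
        if new_clauses is not None:
            result = dpll(new_clauses, new_assign)
            if result is not None:
                return result
    return None

# (P v Q) & (-P v R) & (-Q): unit propagation alone settles it.
cnf = [frozenset(c) for c in [{"P", "Q"}, {"-P", "R"}, {"-Q"}]]
print(dpll(cnf))  # a satisfying assignment: P true, Q false, R true
```

On the exam you should be able to run this propagate-then-split loop by hand, including the CNF conversion step that produces the clause set in the first place.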
Probabilistic reasoning: Foundations, independence,
Bayes' law, random variables,
expected value, maximum expected utility, decision trees (in the decision-analysis sense).
(R&N chapter 13; chapters from the Davis textbook on NYU Classes.)
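Here is a worked Bayes'-law and expected-value example of the kind you should be able to do by hand; the disease-test numbers are made up purely for illustration.

```python
def bayes(prior_h, p_e_given_h, p_e_given_not_h):
    """Bayes' law: P(H|e) = P(e|H) P(H) / P(e), with P(e)
    computed by the law of total probability."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Hypothetical test: 1% prevalence, 90% sensitivity, 5% false positives.
posterior = bayes(0.01, 0.9, 0.05)
print(round(posterior, 3))  # 0.154 -- most positives are false positives

def expected_value(dist):
    """Expected value of a discrete random variable, given as a
    dict mapping value -> probability; the core of MEU decisions."""
    return sum(p * x for x, p in dist.items())

# A lottery paying 10 with probability 0.2, else 0.
print(expected_value({10: 0.2, 0: 0.8}))  # 2.0
```

A maximum-expected-utility decision just computes this expected value for each available action and picks the largest, which is exactly what rolling back a decision tree does.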
You should know the following algorithms well and be able to carry them out on the exam: CYK parsing, depth-first search, breadth-first search, iterative deepening, hill-climbing, game tree evaluation with alpha-beta pruning, conversion of propositional sentences to CNF, Davis-Putnam, backward chaining in Datalog, 1R, nearest neighbors, Naive Bayes, k-means clustering, agglomerative clustering.
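As one example from that list, here is a bare-bones k-means sketch on toy 2-D points; all names and data are mine, and the random initialization is seeded only to make the demo repeatable.

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two point tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    """Coordinate-wise mean of a nonempty list of point tuples."""
    n = len(pts)
    return tuple(sum(xs) / n for xs in zip(*pts))

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: alternate assignment and update steps."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda i: dist2(p, centers[i]))
            clusters[i].append(p)
        # Update step: move each center to its cluster's mean
        # (an empty cluster keeps its old center).
        centers = [mean(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

pts = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.8)]
centers, clusters = kmeans(pts, 2)
print(sorted(centers))  # two centers near (0.05, 0.1) and (5.1, 4.9)
```

Agglomerative clustering, by contrast, needs no initialization or iteration count: it repeatedly merges the closest pair of clusters, and you should be able to trace both by hand.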
You should know the ID3 algorithm in detail and be able to carry out any aspect of it that does not depend on actually calculating the "importance" of an attribute.
You should know the general structure and issues involved in the EM algorithm, but you need not memorize the fine details.
Any material introduced in class later than May 1 will not be on the final exam.