Study sheet for final exam

The final exam will be given on Tuesday, December 18, from 12:00 to 1:50 in WWH 1013. NOTE CHANGE OF ROOM. The exam will be open book and open notes. It will be a cumulative exam, covering the entire semester, but it will emphasize the topics that were not covered on the mid-term. I will not ask about any material that I did not cover in class.

If you will be unable to take the exam at the scheduled time, or if you have two or more other exams scheduled for the same day, please let me know as soon as possible. If you feel you need to take an incomplete in the class, please consult with me as soon as possible, and no later than the last class meeting on Tuesday, December 11. If you do not let me know in advance and you do not show up for the exam, I will allow you to take a make-up only if you missed the exam due to a medical problem or a dire family emergency. In such a case you will be expected to take the make-up as soon as possible. Under no circumstances will I grant an extended incomplete in the course if you do not arrange it with me prior to December 11.

The solutions to the exam will be posted on the class web site immediately after the exam. The exams will be graded and the course grades computed within a few days. Send me an email if you want to know your grade.

Topics on the final exam include:

You should know the following algorithms well and be able to carry them out on the exam: recursive descent parsing, chart parsing, conversion of propositional sentences to CNF, Davis-Putnam, forward and backward chaining in Datalog, 1R, nearest neighbors, Naive Bayes, classification using perceptrons, feed-forward, k-means clustering, and hierarchical agglomerative clustering.
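As a small illustration of the kind of hand computation some of these involve, here is a sketch of classifying one input with an already-trained perceptron. The function name, weights, and input below are made-up values for illustration, not from any course assignment:

    # Sketch: classifying a single input with a trained perceptron.
    # All names and numbers here are illustrative, not course-supplied.

    def perceptron_classify(weights, bias, x):
        """Return 1 if the weighted sum of the inputs plus the bias exceeds 0, else 0."""
        activation = sum(w * xi for w, xi in zip(weights, x)) + bias
        return 1 if activation > 0 else 0

    weights = [0.5, -1.0, 0.25]   # one weight per input feature
    bias = 0.1
    x = [1, 0, 1]                 # a single input vector
    print(perceptron_classify(weights, bias, x))  # 0.5 + 0.25 + 0.1 = 0.85 > 0, so 1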

You should know the ID3 algorithm in detail and be able to carry out any aspect of the algorithm that does not depend on actually calculating entropies.
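For study purposes, the recursive skeleton of ID3 can be sketched as below, with the entropy-based attribute selection deliberately left abstract. This is a minimal sketch, assuming examples are (feature-dict, label) pairs; id3 and choose_best_attribute are hypothetical names used only for illustration:

    # Sketch of the recursive structure of ID3 (illustrative only).
    # choose_best_attribute stands in for the entropy / information-gain step.

    def id3(examples, attributes, choose_best_attribute):
        labels = [label for _, label in examples]
        if len(set(labels)) == 1:       # all examples agree: return a leaf
            return labels[0]
        if not attributes:              # no attributes left: majority label
            return max(set(labels), key=labels.count)
        best = choose_best_attribute(examples, attributes)
        tree = {best: {}}
        for v in {features[best] for features, _ in examples}:
            subset = [(f, l) for f, l in examples if f[best] == v]
            remaining = [a for a in attributes if a != best]
            tree[best][v] = id3(subset, remaining, choose_best_attribute)
        return tree

For hand-tracing, choose_best_attribute can be replaced by any selection rule (e.g. lambda exs, attrs: attrs[0]), since the entropy calculation itself is exactly the part you will not be asked to carry out.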

You should know the general structure and issues involved in the following algorithms, but you need not memorize the fine details: perceptron learning, back-propagation learning, and support vector machines. What I want you to know about support vector machines is (a) that they compute the linear separator with maximal margin, which (generally) gives better predictions than any other linear separator; and (b) that they can be used effectively in very high-dimensional spaces.
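For reference, point (a) can be stated precisely: with class labels y_i = +1 or -1, the maximal-margin separator is the hyperplane w . x + b = 0 that minimizes ||w||^2 / 2 subject to y_i (w . x_i + b) >= 1 for every training example (x_i, y_i); the geometric margin it achieves is 2 / ||w||.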

Any material on planning that I will discuss in the remaining lectures will not be on the final exam.