Sample Exam Problems from 2nd Half of Course
The final exam will be given Monday, May 7 from 10:00 to 11:50 in
Warren Weaver, room 101. It is
closed book and closed notes.
The exam is cumulative, covering the material from the entire course.
Roughly 1/3 of the exam will be drawn from the first half of the
course, and the other 2/3 will be drawn from the second half.
Some sample problems from the second half of the course are given below.
Topics covered in the second half of the course:
- Game playing -- R+N secs. 5.1-5.4
- Machine Learning
- 1R algorithm --- handout
- Naive Bayes --- handout
- Decision trees --- R+N sec 18.3
- Perceptrons/Back propagation networks --- R+N secs 19.1-19.4
- Evaluation --- handout
- Minimum description length learning --- handout
- Automated reasoning
- Datalog -- handout. The textbook's coverage of automated logical inference
is very thorough (chaps 6-10); however, it has no self-contained section
that corresponds to what I've taught.
- Knowledge representation -- Look lightly over chap 8 of the textbook.
The exam will not contain any questions about detailed specifics here, only
general issues, along the lines of problem 2, problem set 7.
- Vision. R+N, chap 24
What is the result of doing alpha-beta pruning in the game tree shown below?
Name three conditions that a game must satisfy for the technique of MIN-MAX
game-tree evaluation to be applicable.
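As a refresher for these two problems, here is a minimal sketch of minimax with alpha-beta pruning over a game tree given as nested lists (the tree and its leaf values below are hypothetical, not taken from any problem set):

```python
def alphabeta(node, alpha=float('-inf'), beta=float('inf'), maximizing=True):
    """Minimax value of a game tree with alpha-beta pruning.
    A leaf is an int (utility for MAX); an internal node is a list of children."""
    if isinstance(node, int):
        return node
    if maximizing:
        value = float('-inf')
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # beta cutoff: MIN already has a better option elsewhere
    else:
        value = float('inf')
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:
                break  # alpha cutoff: MAX already has a better option elsewhere
    return value

# Hypothetical 2-ply tree: MAX moves first, then MIN picks within each branch.
tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
print(alphabeta(tree))  # 3
```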
A. Give an example of a decision tree with two internal nodes (including
the root), and explain how it classifies an example.
B. Give a high-level description of the ID3 algorithm to construct decision
trees from training data. You need not give the definition of
entropy or expected entropy.
C. What kinds of techniques can be used to counter the problem of over-fitting
in decision trees?
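For part B, the high-level shape of ID3 can be sketched as follows (my own illustration in Python, not the course's code; examples are tuples whose last field is the class label):

```python
import math
from collections import Counter

def entropy(examples):
    """Entropy of the class labels (the last field of each example)."""
    counts = Counter(e[-1] for e in examples)
    n = len(examples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def id3(examples, attrs):
    """High-level ID3: if the examples are pure, return a leaf; if no
    attributes remain, return the majority label; otherwise split on the
    attribute with the lowest expected entropy (i.e. the highest
    information gain) and recurse on each subset."""
    labels = {e[-1] for e in examples}
    if len(labels) == 1:
        return labels.pop()
    if not attrs:
        return Counter(e[-1] for e in examples).most_common(1)[0][0]
    def expected_entropy(a):
        subsets = [[e for e in examples if e[a] == v]
                   for v in {e[a] for e in examples}]
        return sum(len(s) / len(examples) * entropy(s) for s in subsets)
    best = min(attrs, key=expected_entropy)
    branches = {v: id3([e for e in examples if e[best] == v],
                       [a for a in attrs if a != best])
                for v in {e[best] for e in examples}}
    return (best, branches)

def classify(tree, example):
    """Walk the tree: an internal node is an (attribute, branches) pair."""
    while isinstance(tree, tuple):
        attr, branches = tree
        tree = branches[example[attr]]
    return tree
```

This addresses part A as well: each internal node tests one attribute, and classifying an example means following the branch for that example's value until a leaf label is reached.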
Consider the following data set with three Boolean predictive attributes,
W, X, Y, and a Boolean classification C.
W X Y C
T T T T
T F T F
T F F F
F T T F
F F F T
We now encounter a new example: W=F, X=T, Y=F. If we apply the Naive
Bayes method, what probability is assigned to the two values of C?
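A sketch of the basic Naive Bayes calculation on this table (empirical frequencies with no smoothing; if the handout's method applies smoothing, the numbers will differ):

```python
from fractions import Fraction

# Training data from the table above: (W, X, Y, C) as booleans.
data = [
    (True,  True,  True,  True),
    (True,  False, True,  False),
    (True,  False, False, False),
    (False, True,  True,  False),
    (False, False, False, True),
]

def naive_bayes(query):
    """P(C=c | W, X, Y) under the Naive Bayes independence assumption:
    prior P(C=c) times the product of P(attr=v | C=c), then normalized."""
    scores = {}
    for c in (True, False):
        rows = [r for r in data if r[3] == c]
        score = Fraction(len(rows), len(data))       # prior P(C=c)
        for i, v in enumerate(query):                # likelihood per attribute
            score *= Fraction(sum(1 for r in rows if r[i] == v), len(rows))
        scores[c] = score
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

probs = naive_bayes((False, True, False))            # W=F, X=T, Y=F
print(probs[True], probs[False])                     # 9/13 4/13
```

Unnormalized, C=T scores 2/5 · (1/2)³ = 1/20 and C=F scores 3/5 · (1/3)³ = 1/45; normalizing gives 9/13 and 4/13.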
"Local minima can cause difficulties for a feed-forward, back-propagation
neural network." Explain. Local minima of what function of what arguments?
Why do they create difficulties?
Which of the following describes the process of task execution
(classifying input signal) in a feed-forward, back-propagation neural network?
Which describe the process of learning? (One answer is correct for each.)
- a. Activation levels are propagated from the inputs through the hidden
layers to the outputs.
- b. Activation levels are propagated from the outputs through the hidden
layers to the inputs.
- c. Weights on the links are modified based on messages propagated
from input to output.
- d. Weights on the links are modified based on messages propagated
from output to input.
- e. Connections in the network are modified, gradually shortening
the path from input to output.
- f. Weights at the input level are compared to the weights at the
output level, and modified to reduce the discrepancy.
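As a refresher on the learning side, here is a single perceptron trained with the perceptron learning rule, a simpler relative of back-propagation (the learning rate, epoch count, and AND task are arbitrary choices for illustration):

```python
def perceptron_train(data, epochs=10, lr=1.0):
    """Perceptron learning rule on linearly separable Boolean data:
    after each misclassified example, nudge the weights and bias
    in the direction that reduces the error on that example."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x0, x1), target in data:
            out = 1 if w[0] * x0 + w[1] * x1 + b > 0 else 0
            err = target - out          # 0 if correct, +/-1 if wrong
            w[0] += lr * err * x0
            w[1] += lr * err * x1
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = perceptron_train(AND)
print([1 if w[0] * x + w[1] * y + b > 0 else 0 for (x, y), _ in AND])  # [0, 0, 0, 1]
```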
Explain briefly (2 or 3 sentences) the use of a training
set and a test set in evaluating learning programs.
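A minimal sketch of the idea: hold out part of the data, train only on the rest, and estimate accuracy on the held-out part, so the estimate comes from examples the learner never saw (the split fraction and seed below are arbitrary):

```python
import random

def train_test_split(examples, test_fraction=0.25, seed=0):
    """Randomly partition the examples into a training set and a
    disjoint held-out test set."""
    rng = random.Random(seed)
    shuffled = examples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

train, test = train_test_split(list(range(8)))
print(len(train), len(test))  # 6 2
```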
Explain how the minimum description length (MDL) learning theory
justifies the conjecture of
A. perfect classification hypotheses (i.e. classification
hypotheses that always give the correct classification, given the
values of the predictive attributes) for nominal classifications.
B. imperfect classification hypotheses (i.e. hypotheses that do better
than chance) for nominal classifications.
C. approximate classification hypotheses (i.e. hypotheses that give
answers that are nearly correct) for numeric classifications.
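One way to make the MDL tradeoff concrete for part B (the bit counts here are hypothetical, and a real MDL analysis must also fix a particular encoding scheme for hypotheses):

```python
import math

def description_length(hyp_bits, n_examples, n_errors):
    """MDL cost sketch for a Boolean classification: bits to state the
    hypothesis, plus log2 C(n, k) bits to say which k of the n examples
    it misclassifies (their labels are then just flipped)."""
    exceptions = math.log2(math.comb(n_examples, n_errors)) if n_errors else 0.0
    return hyp_bits + exceptions

# A hypothetical 50-bit rule with 3 errors on 100 examples, vs. sending
# all 100 one-bit labels verbatim (100 bits):
print(description_length(50, 100, 3) < 100)  # True: the imperfect rule compresses
```

If the hypothesis plus its exception list is shorter than the raw labels, MDL says the regularity is real rather than chance; this is the sense in which even an imperfect hypothesis can be justified.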
Give an example of a Datalog program and a query that never returns an answer
if the interpreter uses a depth-first search, but that will
return an answer if the interpreter is modified to use an iterative
deepening search. (Hint: there is an example that uses one rule and one fact.)
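One classic answer along the lines of the hint is the program consisting of the rule p :- p together with the fact p, with the interpreter trying the rule before the fact. A small Python simulation of the search behavior (my own sketch, not a real Datalog interpreter):

```python
import itertools

# Program: rule p :- p (tried first), fact p.
# Unbounded depth-first search expands p -> p -> p -> ... forever down the
# rule; a depth-limited search gives up on that path and finds the fact.

def dfs_proves(goal, depth_limit):
    """Depth-limited proof search that tries the rule before the fact."""
    if depth_limit < 0:
        return False
    if goal == 'p':
        # Clause 1, the rule p :- p: reduces the goal to itself.
        if dfs_proves('p', depth_limit - 1):
            return True
        # Clause 2, the fact p: closes the proof immediately.
        return True
    return False

def iterative_deepening_proves(goal):
    """Run depth-limited search with limits 0, 1, 2, ...; return the first
    limit at which a proof is found."""
    for limit in itertools.count():
        if dfs_proves(goal, limit):
            return limit

print(iterative_deepening_proves('p'))  # 0
```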
Describe the use of edge detection and of
thresholding in low-level computer vision.
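A toy illustration combining both ideas on a single scanline: compute the intensity gradient by finite differences, then threshold its magnitude to mark edge locations (the pixel values and threshold are made up):

```python
def detect_edges(row, threshold):
    """1-D edge detection: positions where the magnitude of the intensity
    gradient (finite difference between neighbors) exceeds a threshold."""
    grads = [abs(row[i + 1] - row[i]) for i in range(len(row) - 1)]
    return [i for i, g in enumerate(grads) if g > threshold]

row = [10, 10, 11, 50, 52, 51, 9, 10]   # hypothetical scanline: two sharp jumps
print(detect_edges(row, 20))            # [2, 5]
```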
Describe briefly two ways in which texture can be used in vision.