## Sample Exam Problems

### Problem 1:

Consider the following collection of sentences in propositional logic.
1. P => (Q <=> R)

2. Q <=> ~(R ^ W)

3. Q => (P ^ W)

A. Convert this set to CNF. (You need not show the intermediate steps.)
B. Show how the Davis-Putnam algorithm finds a satisfying assignment.
(Assume that, when a branch point is reached, the algorithm chooses the
first atom alphabetically and tries TRUE before FALSE.)
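
For reference, here is a minimal sketch of the branching rule described in
part B. The clause representation (sets of literal strings such as "P" and
"~P") and the toy CNF at the end are assumptions made for illustration, and
unit propagation and pure-literal elimination are omitted.

```python
# Minimal DPLL-style branching sketch: pick the alphabetically first
# unassigned atom and try TRUE before FALSE.

def simplify(clauses, literal):
    """Assume `literal` is true: drop satisfied clauses, shrink the rest."""
    neg = literal[1:] if literal.startswith("~") else "~" + literal
    result = []
    for clause in clauses:
        if literal in clause:
            continue                      # clause satisfied, drop it
        result.append(clause - {neg})     # remove the falsified literal
    return result

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    if not clauses:
        return assignment                 # all clauses satisfied
    if any(len(c) == 0 for c in clauses):
        return None                       # empty clause: contradiction
    atoms = sorted({lit.lstrip("~") for c in clauses for lit in c})
    atom = atoms[0]                       # first atom alphabetically
    for value, literal in [(True, atom), (False, "~" + atom)]:  # TRUE first
        result = dpll(simplify(clauses, literal), {**assignment, atom: value})
        if result is not None:
            return result
    return None

# Toy CNF {P v Q, ~P v Q}, not the CNF from part A:
print(dpll([{"P", "Q"}, {"~P", "Q"}]))
```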

### Problem 2:

What is the result of doing alpha-beta pruning in the game tree shown
below?
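
For reference, here is a minimal alpha-beta sketch; the nested-list tree
representation and the leaf values in the example are assumptions for
illustration, not the game tree of this problem.

```python
# Minimal alpha-beta sketch, assuming a game tree given as nested lists
# whose leaves are static evaluation values (maximizer moves first).

def alphabeta(node, maximizing=True, alpha=float("-inf"), beta=float("inf")):
    if not isinstance(node, list):        # leaf: return its value
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:             # cutoff: prune remaining children
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:             # cutoff
                break
        return value

# Example: the depth-2 tree [[3, 5], [2, 9]] evaluates to 3 for the
# maximizer, and the leaf 9 is pruned.
print(alphabeta([[3, 5], [2, 9]]))
```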

### Problem 3:

Consider a domain where the individuals are people and languages.
Let L be the first-order language with the following primitives:
- s(X,L) --- Person X speaks language L.
- c(X,Y) --- Persons X and Y can communicate.
- i(W,X,Y) --- Person W can serve as an interpreter between persons X and Y.
- j, p, e, f --- Constants: Joe, Pierre, English, and French, respectively.

A. Express the following statements in L:
- i. Joe speaks English.
- ii. Pierre speaks French.
- iii. If X and Y both speak L, then X and Y can communicate.
- iv. If W can communicate both with X and with Y, then
W can serve as an interpreter between X and Y.
- v. For any two languages L and M, there is someone who speaks both
L and M.
- vi. There is someone who can interpret between Joe and Pierre.

B. Show how (vi) can be proven from (i)---(v) using backward-chaining
resolution. You must show the Skolemized form of each statement, and
every resolution that is used in the final proof. You need not show the
intermediate stages of Skolemization, or show resolutions that are
not used in the final proof.
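
As a reminder, Skolemization replaces each existentially quantified variable
by a new constant, or by a new function of the universally quantified
variables in whose scope it lies. A generic illustration (not one of the
statements above) is:

```latex
% Generic Skolemization examples, not statements (i)-(vi):
% a top-level existential becomes a Skolem constant,
% an existential inside a universal becomes a Skolem function.
\exists y\, Q(y) \;\leadsto\; Q(c)

\forall x\, \exists y\, R(x, y) \;\leadsto\; \forall x\, R(x, f(x))
```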

### Problem 4:

Let A, B, C be Boolean random variables. Assume that
- Prob(A=T) = 0.8
- Prob(B=T | A=T) = 0.5
- Prob(B=T | A=F) = 1
- Prob(C=T | B=T) = 0.1
- Prob(C=T | B=F) = 0.5
- A and C are conditionally independent given B.

Evaluate the following terms. (If you wish, you may give your answers as
arithmetic expressions such as "0.8*0.5 / (0.8*1 + 0.5*0.1)".)
- i. Prob(B=T)
- ii. Prob(A=T | B=T)
- iii. Prob(C=T | A=F)
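
For reference, these are the standard identities the three parts rely on
(total probability, Bayes' rule, and marginalization over B using the stated
conditional independence); the symbols match the problem, but no numbers are
substituted.

```latex
% General identities behind parts (i)-(iii); numbers are not substituted.
% Part (iii) uses the conditional independence of A and C given B.
P(B{=}T) = P(B{=}T \mid A{=}T)\,P(A{=}T) + P(B{=}T \mid A{=}F)\,P(A{=}F)

P(A{=}T \mid B{=}T) = \frac{P(B{=}T \mid A{=}T)\,P(A{=}T)}{P(B{=}T)}

P(C{=}T \mid A{=}F) = P(C{=}T \mid B{=}T)\,P(B{=}T \mid A{=}F)
                    + P(C{=}T \mid B{=}F)\,P(B{=}F \mid A{=}F)
```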

### Problem 5:

Consider the following data set with three Boolean predictive attributes
X, Y, Z and a Boolean classification C.
| X | Y | Z | C | Number of instances |
|---|---|---|---|---------------------|
| T | T | T | T | 7 |
| T | T | F | F | 1 |
| T | F | T | T | 2 |
| F | T | T | F | 1 |
| F | F | T | F | 8 |
| F | F | F | T | 1 |

A. What classifier does the 1R algorithm output?
B. What classification does Naive Bayes predict for a test instance with
X=F, Y=T, Z=F, and what probability does it give to that prediction?
(A sketch of the Naive Bayes computation appears after this problem.)
C. If the ID3 algorithm is run on this data set, the top-level node of the
decision tree will be a test of X. Show the entire decision tree. (The
remaining entropy comparisons are trivial.)
D. Give an argument that no linear discriminator can perfectly classify
this data set. (Hint: Compare the effect of attribute Z on C in going from
the first row to the second row against its effect in going from the
fifth row to the sixth row.)
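
For reference, a minimal sketch of the Naive Bayes computation in part B,
assuming instances are supplied as weighted attribute dictionaries; the
representation and the two-row example at the end are illustrative and are
not the table above.

```python
# Minimal Naive Bayes sketch for weighted Boolean data.  Each row is
# (attribute dict, class value, number of instances); no smoothing is used.
from collections import defaultdict

def naive_bayes_predict(rows, test):
    class_counts = defaultdict(float)
    cond_counts = defaultdict(float)        # (class, attribute, value) -> count
    for attrs, c, n in rows:
        class_counts[c] += n
        for a, v in attrs.items():
            cond_counts[(c, a, v)] += n
    total = sum(class_counts.values())
    scores = {}
    for c, cc in class_counts.items():
        score = cc / total                  # prior P(C = c)
        for a, v in test.items():
            score *= cond_counts[(c, a, v)] / cc   # P(attribute = v | C = c)
        scores[c] = score
    best = max(scores, key=scores.get)
    return best, scores[best] / sum(scores.values())   # prediction, probability

# Hypothetical two-row data set (not the table above):
rows = [({"X": True, "Y": True}, True, 3),
        ({"X": False, "Y": True}, False, 1)]
print(naive_bayes_predict(rows, {"X": True, "Y": True}))
```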

### Problem 6:

Explain briefly (2 or 3 sentences) the use of a training
set, a validation set and a test set in evaluating
learning programs.
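
A minimal sketch of how the three sets are typically used; the split
fractions and function name are assumptions for illustration.

```python
# Minimal train / validation / test split, assuming `data` is a list of
# labeled examples.
import random

def three_way_split(data, seed=0, frac_train=0.6, frac_valid=0.2):
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    n_train = int(frac_train * len(shuffled))
    n_valid = int(frac_valid * len(shuffled))
    train_set = shuffled[:n_train]
    valid_set = shuffled[n_train:n_train + n_valid]
    test_set = shuffled[n_train + n_valid:]
    return train_set, valid_set, test_set

# Typical use: fit candidate models on the training set, choose among them
# (e.g. a parameter setting) by their scores on the validation set, and
# report the chosen model's accuracy on the untouched test set only once,
# at the very end.
```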