Consider the following set of sentences in propositional logic:

1. P => (Q <=> R).
2. Q <=> ~(R^W).
3. Q => (P^W).

A. Convert this set to CNF. (You need not show the intermediate steps.)
B. Show how the Davis-Putnam algorithm finds a satisfying assignment. (Assume that, when a branch point is reached, the algorithm chooses the first atom alphabetically and tries TRUE before FALSE.)
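For reference while working part B, here is a minimal sketch of the Davis-Putnam (DPLL-style) procedure: unit propagation plus splitting, using the problem's branching convention of first atom alphabetically, TRUE before FALSE. The clause encoding -- literals as (atom, sign) pairs, e.g. ("P", True) for P and ("P", False) for ~P -- is an assumption of this sketch, and pure-literal elimination is omitted for brevity.

def dpll(clauses, assignment):
    # Simplify: drop satisfied clauses, remove falsified literals.
    simplified = []
    for clause in clauses:
        if any(assignment.get(atom) == sign for atom, sign in clause):
            continue  # clause already satisfied
        rest = [(atom, sign) for atom, sign in clause if atom not in assignment]
        if not rest:
            return None  # empty clause: contradiction on this branch
        simplified.append(rest)
    if not simplified:
        return assignment  # every clause satisfied

    # Unit propagation: a one-literal clause forces its atom's value.
    for clause in simplified:
        if len(clause) == 1:
            atom, sign = clause[0]
            return dpll(simplified, {**assignment, atom: sign})

    # Branch: first unassigned atom alphabetically, TRUE before FALSE.
    atom = min(a for clause in simplified for a, _ in clause)
    for value in (True, False):
        result = dpll(simplified, {**assignment, atom: value})
        if result is not None:
            return result
    return None

# Called as dpll(cnf, {}), where cnf is the clause list from part A.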
Consider the first-order language L with the following primitives:

s(X,L) --- Person X speaks language L.
c(X,Y) --- Persons X and Y can communicate.
i(W,X,Y) --- Person W can serve as an interpreter between persons X and Y.
j,p,e,f --- Constants: Joe, Pierre, English, and French respectively.

A. Express the following statements in L:
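As an illustration of the intended readings of these predicates, here is a toy Python model one might test candidate formalizations against. The definitions of c and i used below (communication means sharing a language; an interpreter shares a language with each party) and the extra constant "marie" are assumptions for illustration, not given by the problem.

speaks = {("j", "e"), ("p", "f"), ("marie", "e"), ("marie", "f")}  # hypothetical facts

def s(x, l):
    return (x, l) in speaks

def c(x, y):
    # Assumed reading: X and Y can communicate iff some language is spoken by both.
    return any(s(x, l) and s(y, l) for _, l in speaks)

def i(w, x, y):
    # Assumed reading: W can interpret between X and Y iff W shares a
    # language with X and (possibly a different) language with Y.
    return c(w, x) and c(w, y)

print(c("j", "p"))           # False: Joe and Pierre share no language here
print(i("marie", "j", "p"))  # True: Marie speaks both English and French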
Let A, B, and C be Boolean random variables with:

Prob(A=T) = 0.8.
Prob(B=T | A=T) = 0.5.
Prob(B=T | A=F) = 1.
Prob(C=T | B=T) = 0.1.
Prob(C=T | B=F) = 0.5.
A and C are conditionally independent given B.

Evaluate the following terms. (If you wish, you can give your answer as an arithmetic expression, such as "0.8*0.5 / (0.8*1 + 0.5*0.1)".)
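As a sketch of how such terms can be checked mechanically, the following enumerates the joint distribution implied by the numbers above, using the factorization P(A,B,C) = P(A) * P(B|A) * P(C|B), which encodes the conditional independence of A and C given B. The two queries at the end are illustrative examples, not necessarily the terms the problem asks for.

from itertools import product

pA = {True: 0.8, False: 0.2}
pB_given_A = {True: 0.5, False: 1.0}   # P(B=T | A=a)
pC_given_B = {True: 0.1, False: 0.5}   # P(C=T | B=b)

# Build the full joint P(A,B,C) over the eight truth assignments.
joint = {}
for a, b, c in product([True, False], repeat=3):
    pb = pB_given_A[a] if b else 1 - pB_given_A[a]
    pc = pC_given_B[b] if c else 1 - pC_given_B[b]
    joint[(a, b, c)] = pA[a] * pb * pc

def prob(pred):
    """Probability of the event picked out by pred(a, b, c)."""
    return sum(p for (a, b, c), p in joint.items() if pred(a, b, c))

print(prob(lambda a, b, c: b))                                  # P(B=T)
print(prob(lambda a, b, c: a and c) / prob(lambda a, b, c: c))  # P(A=T | C=T)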
Consider the following data set, where X, Y, Z are the attributes and C is the classification:

X Y Z C   number of instances
-----------------------------
T T T T   7
T T F F   1
T F T T   2
F T T F   1
F F T F   8
F F F T   1

A. What classifier does the 1R algorithm output?
B. What classification does Naive Bayes predict for a test instance with X=F, Y=T, Z=F, and what probability does it give to that prediction? (A sketch for checking this numerically follows part D.)
C. If the ID3 algorithm is run on this data set, the top-level node of the decision tree will be a test of X. Show the entire decision tree. (The remaining entropy comparisons are trivial.)
D. Give an argument that no linear discriminator can perfectly classify this data set. (Hint: Compare the effect of attribute Z on C in going from the first line to the second line against its effect in going from the fifth line to the sixth line.)
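The following minimal Naive Bayes sketch over the count table can be used to check part B numerically; it is not a substitute for the hand derivation. The tuple encoding of the rows is an assumption of this sketch, and probabilities are estimated by raw relative frequencies (no smoothing), matching the usual hand calculation.

from collections import defaultdict

# Rows are ((X, Y, Z), C, count); True/False stand for T/F.
rows = [
    ((True,  True,  True ), True,  7),
    ((True,  True,  False), False, 1),
    ((True,  False, True ), True,  2),
    ((False, True,  True ), False, 1),
    ((False, False, True ), False, 8),
    ((False, False, False), True,  1),
]

def naive_bayes(test):
    # Score each class as P(C) * prod_i P(attr_i | C), estimated by counts.
    class_total = defaultdict(int)
    for _, c, n in rows:
        class_total[c] += n
    total = sum(class_total.values())

    scores = {}
    for c in class_total:
        score = class_total[c] / total
        for i, value in enumerate(test):
            match = sum(n for attrs, cl, n in rows
                        if cl == c and attrs[i] == value)
            score *= match / class_total[c]
        scores[c] = score

    best = max(scores, key=scores.get)
    return best, scores[best] / sum(scores.values())  # normalized probability

print(naive_bayes((False, True, False)))  # test instance X=F, Y=T, Z=F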