Sample Exam Problems

Problem 1:

Name three conditions that must hold on a game for the technique of MIN-MAX game-tree evaluation to be applicable.

Problem 2:

What is the result of doing alpha-beta pruning in the game tree shown below?

Problem 3:

Consider a domain where the individuals are people and languages. Let L be the first-order language with the following primitives:
s(X,L) --- Person X speaks language L. 
c(X,Y) --- Persons X and Y can communicate. 
i(W,X,Y) --- Person W can serve as an interpreter between persons X and Y. 
j,p,e,f --- Constants: Joe, Pierre, English, and French respectively.
A. Express the following statements in L:

B. Show how (vi) can be proven from (i)---(v) using backward-chaining resolution. You must show the Skolemized form of each statement and every resolution used in the final proof. You need not show the intermediate stages of Skolemization, or resolutions that are not used in the final proof.

Problem 4:

Let A, B, C be Boolean random variables. Assume that
Prob(A=T) = 0.8
Prob(B=T | A=T) = 0.5
Prob(B=T | A=F) = 1
Prob(C=T | B=T) = 0.1
Prob(C=T | B=F) = 0.5
A and C are conditionally independent given B.
Evaluate the following terms. (If you wish, you can give your answer as an arithmetic expression such as "0.8*0.5 / (0.8*1 + 0.5*0.1)")
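The specific terms to evaluate are not listed here, but the kind of calculation involved can be sketched as follows (a sanity-check sketch, not the official answers): under the conditional independence assumption, marginals follow from the chain rule, and posteriors from Bayes' rule.

```python
p_a = 0.8                                # P(A=T)
p_b_given_a = {True: 0.5, False: 1.0}    # P(B=T | A)
p_c_given_b = {True: 0.1, False: 0.5}    # P(C=T | B)

# P(B=T) = P(B=T|A=T)P(A=T) + P(B=T|A=F)P(A=F)
p_b = p_b_given_a[True] * p_a + p_b_given_a[False] * (1 - p_a)

# P(C=T) = P(C=T|B=T)P(B=T) + P(C=T|B=F)P(B=F)
# (valid because C is conditionally independent of A given B)
p_c = p_c_given_b[True] * p_b + p_c_given_b[False] * (1 - p_b)

# Bayes' rule: P(A=T | B=T) = P(B=T|A=T)P(A=T) / P(B=T)
p_a_given_b = p_b_given_a[True] * p_a / p_b

print(p_b)          # 0.5*0.8 + 1*0.2   = 0.6
print(p_c)          # 0.1*0.6 + 0.5*0.4 = 0.26
print(p_a_given_b)  # 0.4 / 0.6         = 2/3
```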

Problem 5:

A. Give an example of a decision tree with two internal nodes (including the root), and explain how it classifies an example.
B. Describe the ID3 algorithm to construct decision trees from training data.
C. What is the entropy of a classification C in table T? What is the expected entropy of classification C if table T is split on predictive attribute A?
D. What kinds of techniques can be used to counter the problem of over-fitting in decision trees?
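The entropy quantities asked about in part C can be checked with a short sketch (the table representation and attribute names here are illustrative, not part of the problem):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """H(C) = -sum over class values v of p(v) * log2 p(v)."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def expected_entropy(rows, attr, cls):
    """Expected entropy of classification cls after splitting table rows
    on attribute attr: sum over values a of (|T_a|/|T|) * H(C over T_a)."""
    n = len(rows)
    total = 0.0
    for value in {r[attr] for r in rows}:
        subset = [r[cls] for r in rows if r[attr] == value]
        total += (len(subset) / n) * entropy(subset)
    return total

# Example: 2 positive, 3 negative examples
print(entropy(['T', 'T', 'F', 'F', 'F']))   # about 0.971 bits
```

ID3 chooses, at each node, the attribute whose split minimizes this expected entropy (equivalently, maximizes information gain).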

Problem 6:

Consider the following data set with three Boolean predictive attributes W, X, and Y, and a Boolean classification C.
  W   X   Y   C
----------------
  T   T   T   T
  T   F   T   F
  T   F   F   F
  F   T   T   F
  F   F   F   T
We now encounter a new example: W=F, X=T, Y=F. If we apply the Naive Bayes method, what probability is assigned to the two values of C?
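The Naive Bayes calculation on this table can be sketched as follows (a sketch, not the official solution; it uses unsmoothed relative-frequency estimates for the priors and likelihoods):

```python
from collections import Counter

# The five training examples from the table above: (W, X, Y, C)
data = [
    ('T', 'T', 'T', 'T'),
    ('T', 'F', 'T', 'F'),
    ('T', 'F', 'F', 'F'),
    ('F', 'T', 'T', 'F'),
    ('F', 'F', 'F', 'T'),
]
query = ('F', 'T', 'F')   # the new example: W=F, X=T, Y=F

def naive_bayes(data, query):
    classes = Counter(row[-1] for row in data)
    n = len(data)
    scores = {}
    for c, count in classes.items():
        rows = [r for r in data if r[-1] == c]
        score = count / n                       # prior P(C=c)
        for i, v in enumerate(query):           # likelihood P(attr_i=v | C=c),
            score *= sum(r[i] == v for r in rows) / count  # attrs independent given C
        scores[c] = score
    z = sum(scores.values())                    # normalize over the two classes
    return {c: s / z for c, s in scores.items()}

print(naive_bayes(data, query))
# P(C=T) proportional to 2/5 * (1/2)^3 = 0.05
# P(C=F) proportional to 3/5 * (1/3)^3 = 1/45
# Normalized: P(C=T) = 9/13 (about 0.692), P(C=F) = 4/13 (about 0.308)
```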

Problem 7:

"Local minima can cause difficulties for a feed-forward, back-propagation neural network." Explain. Local minima of what function of what arguments? Why do they create difficulties?
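The difficulty can be illustrated with plain gradient descent, the same update rule back-propagation applies to the network weights, on a one-dimensional stand-in for the error function (the function here is a toy example chosen for illustration): which minimum the descent reaches depends entirely on where it starts, and a start in the wrong basin gets stuck at the shallower minimum.

```python
def f(x):
    """Toy 'error' function with a shallow local minimum near x = 0.98
    and a deeper (global) minimum near x = -1.02."""
    return (x * x - 1) ** 2 + 0.2 * x

def df(x):
    """Derivative of f."""
    return 4 * x * (x * x - 1) + 0.2

def descend(x, lr=0.01, steps=2000):
    """Plain gradient descent from starting point x."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

x_right = descend(2.0)    # ends in the shallow local minimum (x near 0.98)
x_left = descend(-2.0)    # ends in the deeper global minimum (x near -1.02)
print(x_right, f(x_right))
print(x_left, f(x_left))
```

In a neural network the function being minimized is the total error on the training set, as a function of all the weights; gradient descent from an unlucky initial weight vector can converge to weights with much higher error than the best achievable.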

Problem 8:

Which of the following describes the process of task execution (classifying an input signal) in a feed-forward, back-propagation neural network? Which describes the process of learning? (One answer is correct for each.)

Problem 9:

Explain briefly (2 or 3 sentences) the use of a training set and a test set in evaluating learning programs.
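The idea can be made concrete with a small sketch (the data and the trivial "learner" here are illustrative assumptions): fit on the training set only, then measure accuracy on the held-out test set, which the learner never saw.

```python
import random
from collections import Counter

random.seed(0)
# Illustrative labeled data: label equals the feature, with 10% noise
data = []
for _ in range(200):
    x = random.randint(0, 1)
    y = x if random.random() < 0.9 else 1 - x
    data.append((x, y))

random.shuffle(data)
train, test = data[:150], data[150:]   # hold out 50 examples for testing

def fit(train):
    """'Learn' from the training set only: for each feature value,
    predict the majority label seen in training."""
    by_value = {}
    for x, y in train:
        by_value.setdefault(x, []).append(y)
    return {x: Counter(ys).most_common(1)[0][0] for x, ys in by_value.items()}

def accuracy(rule, examples):
    return sum(rule.get(x, 0) == y for x, y in examples) / len(examples)

rule = fit(train)
print("train accuracy:", accuracy(rule, train))  # optimistic estimate
print("test accuracy:", accuracy(rule, test))    # honest estimate of generalization
```

Training accuracy overstates performance because the rule was chosen to fit those very examples; the test-set accuracy is the unbiased estimate of how the learner will do on unseen data.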