Sample problems from 2nd half of course

Let me emphasize that this is just a collection of sample problems, not a sample final exam.

Multiple choice problems

Problem 1

Consider the following context-free grammar (CFG):
S -> NP VP
NP -> NG | NG "and" NG
NG -> pronoun | noun
VP -> verb | verb NP | VP "and" VP

Lexicon:
I : pronoun
cook : noun, verb
eggs : noun
fish : noun, verb
Which of the following parse trees are correct?
 
i.  S ---> NP ---> NG ---> pronoun ---> I
       |
       |-> VP ---> verb ---> cook
               |
               |-> NP ---> NG ---> noun ---> eggs
                       |
                       |-> "and"
                       |
                       |-> NG ---> noun ---> fish


ii. S ---> NP ---> NG ---> pronoun ---> I
       |
       |-> VP ---> verb ---> cook
               |
               |-> NP ---> NG ---> noun ---> eggs
                       |
                       |-> "and"
                       |
                       |-> VP ---> verb ---> fish


iii.S ---> NP ---> NG ---> pronoun ---> I
       |
       |-> VP ---> VP ---> verb ---> cook
               |       |       
               |       |-> NP ---> NG ---> noun ---> eggs
               |               
               |-> "and"
               |
               |-> VP ---> verb ---> fish


iv. S ---> NP ---> NG ---> pronoun ---> I
       |
       |-> VP ---> verb ---> cook
               |       
               |-> NP ---> NG ---> noun ---> eggs
               |       
               |-> "and"
               |
               |-> VP ---> verb ---> fish
A. All four.
B. Only (i).
C. (i), (iii), and (iv).
D. (i) and (iii).
E. (i) and (iv).
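
For illustration only: one way to check mechanically which trees this grammar licenses is to encode the grammar and lexicon in a parsing toolkit and enumerate every parse of "I cook eggs and fish". The sketch below uses Python and NLTK, which are my own choices and not part of the problem.

import nltk

# Grammar and lexicon transcribed from the problem statement.
grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> NG | NG 'and' NG
NG -> pronoun | noun
VP -> verb | verb NP | VP 'and' VP
pronoun -> 'I'
noun -> 'cook' | 'eggs' | 'fish'
verb -> 'cook' | 'fish'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("I cook eggs and fish".split()):
    print(tree)   # prints every parse tree the grammar actually allows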

Problem 2

In a chart parser, the "EXTENDER" module could combine edge [2,4,VP -> VG * NP] with
A. edge [2,4,VG -> modal verb *] to create edge [2,4,VP -> VG NP *]
B. edge [4,6,VG -> modal verb *] to create edge [2,6,VP -> VG NP *]
C. edge [2,6,VG -> modal verb *] to create edge [2,6,VP -> VG * NP]
D. edge[2,4,NP -> determiner noun *] to create edge [2,4,VP -> VG NP *]
E. edge[4,6,NP -> determiner noun *] to create edge [2,6,VP -> VG NP *]
F. edge[2,6,NP -> determiner noun *] to create edge [2,6,VP -> VG * NP]
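
The EXTENDER implements what is often called the fundamental rule of chart parsing: an active edge whose dot precedes some category B combines with a completed edge for B that begins exactly where the active edge ends. Below is a minimal Python sketch of that rule, using a tuple representation I chose for illustration; the example edges are deliberately not taken from the answer choices.

from collections import namedtuple

# An edge covers words start..end and carries a dotted rule:
# lhs -> rhs[0] ... rhs[dot-1] * rhs[dot] ... rhs[-1]
Edge = namedtuple("Edge", ["start", "end", "lhs", "rhs", "dot"])

def extend(active, completed):
    """Combine the two edges if the fundamental rule applies; otherwise return None."""
    if active.dot >= len(active.rhs):
        return None                  # first edge must still be active (dot not at the end)
    if completed.dot != len(completed.rhs):
        return None                  # second edge must be completed (dot at the end)
    if active.rhs[active.dot] != completed.lhs:
        return None                  # category after the dot must match the completed edge
    if active.end != completed.start:
        return None                  # the two spans must be adjacent
    return Edge(active.start, completed.end, active.lhs, active.rhs, active.dot + 1)

# Neutral example: [0,2, S -> NP * VP] plus [2,5, VP -> verb NP *] gives [0,5, S -> NP VP *].
active = Edge(0, 2, "S", ("NP", "VP"), 1)
completed = Edge(2, 5, "VP", ("verb", "NP"), 2)
print(extend(active, completed))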

Problem 3

Compositional semantics is

Problem 4

Bayes' Law states that

Problem 5

In a feed-forward, back-propagation network, learning proceeds by

Long Answer Problems

Problem 6

Consider a domain where the individuals are people and languages. Let Z be the first-order language with the following primitives:
s(X,L) --- Person X speaks language L. 
c(X,Y) --- Persons X and Y can communicate.
i(W,X,Y) --- Person W can serve as an interpreter between persons X and Y.
j,p,m,e,f --- Constants: Joe, Pierre, Marie, English, and French respectively.

A. Express the following statements in Z:

B. Show how sentences (i), (ii), (iii), (v), and (vi) can be expressed in Datalog. (Hint: Sentences (i) and (v) each turn into two facts in Datalog.)

C. Explain why sentence (iv) cannot be expressed in Datalog.

D. Show how (vi) can be proven from (i), (ii), (iii) and (v) using forward chaining.

E. Show how (vi) can be proven from (i), (ii), (iii) and (v) using backward chaining.
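
For parts B, D, and E, the Python sketch below shows the mechanics of naive forward chaining over Datalog-style facts and rules. The facts and rules in it are hypothetical stand-ins of my own, since sentences (i) through (vi) are not reproduced above; only the predicates s, c, i and the constants j, p, m, e, f come from the problem.

def is_var(t):
    # Variables are capitalized strings (Datalog convention); constants are lowercase.
    return isinstance(t, str) and t[0].isupper()

def match(pattern, fact, binding):
    """Try to extend `binding` so that `pattern` matches `fact`; return None on failure."""
    if pattern[0] != fact[0] or len(pattern) != len(fact):
        return None
    binding = dict(binding)
    for p, f in zip(pattern[1:], fact[1:]):
        if is_var(p):
            if p in binding and binding[p] != f:
                return None
            binding[p] = f
        elif p != f:
            return None
    return binding

def forward_chain(facts, rules):
    """Repeatedly fire rules until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            # Find every way of matching the body literals against the known facts.
            bindings = [{}]
            for literal in body:
                bindings = [b2 for b in bindings for f in facts
                            if (b2 := match(literal, f, b)) is not None]
            for b in bindings:
                new = (head[0],) + tuple(b.get(t, t) for t in head[1:])
                if new not in facts:
                    facts.add(new)
                    changed = True
    return facts

# Hypothetical knowledge base in the problem's vocabulary (an assumption, not the real sentences):
facts = [("s", "j", "e"), ("s", "p", "f"), ("s", "m", "e"), ("s", "m", "f")]
rules = [
    # Hypothetical rule: X and Y can communicate if they share some language L.
    (("c", "X", "Y"), [("s", "X", "L"), ("s", "Y", "L")]),
    # Hypothetical rule: W can interpret between X and Y if W can communicate with both.
    (("i", "W", "X", "Y"), [("c", "W", "X"), ("c", "W", "Y")]),
]
print(("i", "m", "j", "p") in forward_chain(facts, rules))   # True under these assumptions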

Problem 7

A. What conditional probabilities are recorded in the Bayesian network?

B. For each of the following statements, say whether it is true or false in the above network:

  • B and C are independent absolutely.
  • B and C are independent given A.
  • B and C are independent given D.
  • A and D are independent absolutely.
  • A and D are independent given B.
  • A and D are independent given B and C.

C. Show how Prob(B=T) can be calculated in terms of the probabilities recorded in the above network.

Problem 8

Consider the following pair of sentences:
A. Joe wore a wool suit. ("suit" = pants and jacket)
B. The suit is in the court. ("suit" = lawsuit)
Explain how the disambiguation techniques of selectional restriction and frequency in context can be applied to these two sentences.

Problem 9

List the major modules of a natural language interpretation system and explain their function.

Problem 10

Consider the sentence "Hammers are for driving nails into surfaces." Name two words in this sentence that are lexically ambiguous. (There are at least four.) For each of these two words, describe a disambiguation technique which will choose the right interpretation over at least one of the wrong interpretations. Be specific.

Problem 11

In this problem and in problem 12, we consider a data set with three Boolean predictive attributes, A, B, C, and a Boolean classification, Z.

A. Suppose that your data is completely characterized by the following rules:

Construct a decision tree whose predictions correspond to these rules.

B. True or false: Given any consistent set of rules like those above, it is possible to construct a decision tree that executes that set of rules. By "consistent", I mean that there are no examples where two different rules give different answers.
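
For illustration only: a consistent rule set over Boolean attributes can be written directly as nested tests on one attribute at a time, which is all a decision tree is. The rule set in the Python sketch below is a hypothetical one of my own, since the actual rules are not reproduced above.

def classify(a: bool, b: bool, c: bool) -> bool:
    # Hypothetical rules: "if A then Z=T; if not A and B then Z=F; otherwise Z=T".
    if a:                # test A at the root
        return True
    elif b:              # A is false: test B next
        return False
    else:
        return True      # A and B both false (C is never needed for these rules)

print(classify(True, False, True), classify(False, True, False), classify(False, False, False))
# -> True False True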

Problem 12

Which of the following expresses the independence assumption that is used in deriving the formula for Naive Bayesian learning, for the classification problem in problem 11?

Problem 13

Consider the following data set T. A and B are numerical attributes and Z is a Boolean classification.
      A   B   Z
      1   2   T
      2   1   F
      3   2   T
      1   1   F

Find a set of weights and a threshold that categorizes all this data correctly. (Hint: Sketch a graph of the instances in the plane where the coordinates are A and B.)
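
For illustration only: one way to sanity-check a proposed answer is to evaluate the linear threshold unit on each row. The particular weights and threshold in the Python sketch below are one candidate chosen by inspecting the data, not necessarily the intended answer.

# Rows of (A, B, Z) from the table above.
data = [(1, 2, True), (2, 1, False), (3, 2, True), (1, 1, False)]

# Hypothetical candidate: only B matters; classify as T exactly when w_a*A + w_b*B >= threshold.
w_a, w_b, threshold = 0.0, 1.0, 1.5

for a, b, z in data:
    predicted = (w_a * a + w_b * b) >= threshold
    print(a, b, z, predicted, "ok" if predicted == z else "WRONG")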