Practice Midterm Exam for Artificial Intelligence
Midterm: Thursday, October 22. Closed book, closed notes.
To study:
Overview of AI: Russell and Norvig, chaps 1,2
Prolog: Bratko chaps 1-3, 5; Prolog handout
Natural Language Processing:
Russell and Norvig, chaps 22, 23, section 24.7
Bratko chap 17 (for the general idea, not for implementation details)
Handouts on Prolog syntactic parser, semantic processor, and ambiguity.
Note: This is just a sample, to suggest the subjects to be covered and
the probable length and difficulty of the problems. The format of
the actual test may be entirely different.
Part I. Multiple choice problems: 1 correct answer in each part.
(5 points each)
Problem 1.
The predicate ``length(L,N)'' is supposed to bind N to be the length
of L. Thus, ``length([a,b,c,d],N)'' should succeed with N bound to 4.
Which of the following definitions is correct? (Only one is.)
A. length([],0).
length([X|L],N) :- length(L,N) + 1.
B. length([],0).
length([X|L],N) :- length(L,N), N is N+1.
C. length([],0).
length([X|L], N+1) :- length(L,N).
D. length([],0).
length([X|L], N) :- N1 is N-1, length(L,N1).
E. length([],0).
length([X|L],N) :- length(L,N1), N is N1+1.
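(Study note, not part of the exam.) The distinction being tested is between unification and arithmetic evaluation: only `is/2` evaluates an expression such as `N1+1`. A quick refresher at the Prolog prompt:

```prolog
% is/2 evaluates its right-hand argument arithmetically
% and unifies the result with its left-hand argument.
?- N is 3 + 1.
% N = 4.

% Plain unification does not evaluate arithmetic;
% here N is bound to the unevaluated compound term +(3,1).
?- N = 3 + 1.
% N = 3+1.
```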
Problem 2.
The Prolog query ?- [X,Y | L] = [a,b,c].
A. Succeeds once with X=a, Y=b, L=c.
B. Succeeds once with X=a, Y=b, L=[c].
C. Succeeds once with X=[a], Y=[b], L=[c].
D. Succeeds many times: First with X=[], Y=[], L=[a,b,c]; next, on
backtracking with X=[], Y=[a], L=[b,c]; and so on through the
various divisions of the list [a,b,c].
E. Fails.
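(Study note.) Recall how a head/tail pattern unifies against a concrete list; the tail is itself always a list, possibly empty:

```prolog
?- [X | L] = [a, b, c].
% X = a, L = [b, c].

?- [X | L] = [a].
% X = a, L = [].
```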
Problem 3.
Compositional semantics is
A. The principle that the meaning of a sentence is derived by combining
the meanings of the words in a mode indicated by the syntactic structure.
B. A technique for applying world knowledge to semantic interpretation.
C. The problem of giving an interpretation to a text of many sentences.
D. A method of disambiguation.
E. The decomposition of a word into a root and its inflections, prefixes
and suffixes.
Problem 4.
In a Markov model
A. Node N is labelled with the probability of being at N.
B. Node N is labelled with the probability that you will start at N.
C. The arc from node N to M is labelled with the probability of going
to M, given that you are now at N.
D. The arc from node N to M is labelled with the probability that you
just came from N, given that you are now at M.
E. The arc from node N to M is labelled with the probability that your
last transition was from N to M.
Part II. (20 points each)
Problem 5.
Suppose that you have the following small family tree in Prolog:
parent(philip, charles).
parent(philip,anne).
parent(charles, william).
Further, you have the following recursive definition of ``ancestor'':
ancestor(X,X).
ancestor(X,Z) :- parent(X,Y), ancestor(Y,Z).
Now a user comes and issues the query
?- ancestor(A,B).
Each time that Prolog returns an answer, the user inputs `;' to ask
it to look for another answer.
What answers does Prolog return, and in what order?
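(Study note.) Remember that on `;` Prolog backtracks, retrying facts and clauses in the order they appear in the program. With the facts above, a simpler query behaves like this:

```prolog
?- parent(philip, C).
% C = charles ;
% C = anne ;
% false.
```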
Problem 6.
List the major modules of a natural language interpretation system and
explain their function.
Problem 7.
Consider the grammar defined by the following BNF:
S -> NP VP
NP -> <name> | <det> <noun> PP* | NP "and" NP
VP -> <verb> { NP } { NP } PP*
PP -> <prep> NP
Vocabulary:
"John" : <name>
"Mary" : <name>
"rose" : <noun>, <verb>
"saw" : <noun>, <verb>
"morning" : <noun>
"gave" : <verb>
"the" : <det>
"to" : <prep>
"in" : <prep>
A. For each of the following sentences, draw all the parse trees, if any,
in this grammar for the sentence.
1. John gave Mary the rose.
2. Mary gave the rose to John.
3. John and Mary saw the rose.
4. John rose in the morning.
5. John rose and saw the morning.
B. Give an example of a meaningless sentence in this grammar with this
vocabulary.
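(Study note.) A grammar of this style can be transcribed almost directly into Prolog's DCG notation, as in the syntactic-parser handout. The sketch below is a deliberately cut-down fragment (coordination, PPs, and part of the vocabulary are omitted), not the handout's parser:

```prolog
s  --> np, vp.
np --> name.
np --> det, noun.
vp --> verb, np.
vp --> verb.

name --> [john].   name --> [mary].
det  --> [the].
noun --> [rose].   noun --> [morning].
verb --> [rose].   verb --> [saw].   verb --> [gave].
```

With this fragment, `?- phrase(s, [john, rose]).` succeeds ("rose" used as a verb) -- the same lexical ambiguity that sentences 4 and 5 above exploit.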
Problem 8.
Give an example of a sentence or pair of sentences in which
selectional restrictions can be used to resolve a potential
anaphoric ambiguity. Explain the ambiguity and the selectional
restriction used.