Practice Midterm Exam for Artificial Intelligence

Midterm: Wednesday, March 7. Closed book, closed notes.

To study:
  Overview of AI: Russell and Norvig, chaps. 1, 2
  Search: Russell and Norvig, chap. 3, section 4.4, appendix B.2 (p. 855)
  Natural Language Processing: Russell and Norvig, chaps. 22, 23, section 24.7
  Handouts on non-deterministic algorithms, parsing algorithms, ambiguity

Note: This is just a sample, to suggest the subjects to be covered and the
probable length and difficulty of the problems. The format of the actual test
may be entirely different.

Part I. Multiple choice problems: one correct answer in each part. (5 points each)

Problem 1. The following non-deterministic algorithm for the N-queens problem
has two keywords omitted.

N-QUEENS (in N : integer; out B : board)
begin
  B := empty N x N board;
  while not every column in B contains a queen do
    KEYWORD.A C := an empty column in B;
    KEYWORD.B R := a row such that square (R, C) is not attacked by any queen in B;
    place a queen in square (R, C)
  endwhile;
  return(B)
end

A. KEYWORD.A and KEYWORD.B are both "pick".
B. KEYWORD.A is "pick" and KEYWORD.B is "choose".
C. KEYWORD.A is "choose" and KEYWORD.B is "pick".
D. KEYWORD.A and KEYWORD.B are both "choose".

Problem 2. The time requirement for an iterative deepening search is roughly
proportional to

A. The number of states in the state space.
B. The depth of the state space.
C. The depth of the shallowest goal state in the state space.
D. The branching factor.
E. The number of states no deeper than the shallowest goal state.
F. The number of states to the left of the leftmost goal state.

Problem 3. Compositional semantics is

A. The principle that the meaning of a sentence is derived by combining the
meanings of the words in a mode indicated by the syntactic structure.
B. A technique for applying world knowledge to semantic interpretation.
C. The problem of giving an interpretation to a text of many sentences.
D. A method of disambiguation.
E.
The decomposition of a word into a root and its inflections, prefixes, and
suffixes.

Problem 4. Bayes' Law states that

A. Prob(P|Q) = Prob(P) / Prob(Q).
B. Prob(P|Q) = Prob(Q|P).
C. Prob(P|Q) = Prob(Q|P) / Prob(Q).
D. Prob(P|Q) = Prob(P) * Prob(Q|P) / Prob(Q).
E. Prob(P|Q) = Prob(Q) * Prob(Q|P) / Prob(P).

Part II.

Problem 5. (20 points) List the major modules of a natural language
interpretation system and explain their functions.

Problem 6. (30 points) Consider the grammar defined by the following BNF:

S -> NP VP
NP -> <noun> | <det> <noun> PP* | NP "and" NP
VP -> <verb> { NP } { NP } PP*
PP -> <prep> NP

Vocabulary:
"John" : <noun>
"Mary" : <noun>
"rose" : <noun>, <verb>
"saw" : <noun>, <verb>
"morning" : <noun>
"gave" : <verb>
"the" : <det>
"to" : <prep>
"in" : <prep>

A. For each of the following sentences, draw all the parse trees, if any, in
this grammar for the sentence.
1. John gave Mary the rose.
2. Mary gave the rose to John.
3. John and Mary saw the rose.
4. John rose in the morning.
5. John rose and saw the morning.

B. Give an example of a meaningless sentence in this grammar with this
vocabulary.

C. Suppose we use a bottom-up parser to parse sentence 1. Show the state of
the partial parse after the first three steps. (I am not very concerned here
about the exact order in which things are done, nor about exactly how you
count steps. What I mean is "Show the state of the program at an early stage
of processing.")

Problem 7. (30 points)
A. Give an example of a sentence or pair of sentences in which selectional
restrictions can be used to disambiguate a potential anaphoric ambiguity.
Explain the ambiguity and the selectional restriction used.
B. Give an example of a sentence or pair of sentences in which there is a
potential anaphoric ambiguity that cannot be disambiguated using selectional
restrictions. Explain why not. Give a method for carrying out the
disambiguation.
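Study sketch for Problem 1: a non-deterministic "choose" is simulated
deterministically by backtracking (each "choose" becomes a loop over
candidates that is undone on failure), while "pick" is any fixed arbitrary
selection. The Python sketch below is one illustrative realization of the
N-QUEENS algorithm under this reading; the function and variable names are
not from the handout.

```python
def n_queens(n):
    """Backtracking simulation of the non-deterministic N-QUEENS algorithm.

    Columns are filled left to right ("pick": any fixed order works);
    rows are searched exhaustively with undo ("choose": a backtrack point).
    """
    board = []  # board[c] = row of the queen in column c

    def attacked(row, col):
        # A square is attacked if it shares a row or a diagonal with a queen
        # already on the board (distinct columns are guaranteed by placement).
        return any(r == row or abs(r - row) == abs(c - col)
                   for c, r in enumerate(board))

    def place(col):
        if col == n:                      # every column contains a queen
            return True
        for row in range(n):              # "choose" R: try every row
            if not attacked(row, col):
                board.append(row)
                if place(col + 1):
                    return True
                board.pop()               # undo the choice, try the next row
        return False                      # no row works: fail back to caller

    return board if place(0) else None
```

For n = 4 this returns [1, 3, 0, 2]; for n = 2 or n = 3 it returns None,
since no solution exists at those sizes.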
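Study sketch for Problem 4: the correct form (answer D) can be verified
numerically. Starting from a made-up joint distribution over two binary
events P and Q, Prob(P|Q) computed directly equals
Prob(P) * Prob(Q|P) / Prob(Q).

```python
# Hypothetical joint distribution over two binary events P and Q.
joint = {(True, True): 0.12, (True, False): 0.18,
         (False, True): 0.28, (False, False): 0.42}

prob_p = sum(v for (p, q), v in joint.items() if p)   # marginal Prob(P) = 0.30
prob_q = sum(v for (p, q), v in joint.items() if q)   # marginal Prob(Q) = 0.40
prob_p_and_q = joint[(True, True)]                    # Prob(P and Q) = 0.12

prob_p_given_q = prob_p_and_q / prob_q                # direct: 0.30
prob_q_given_p = prob_p_and_q / prob_p                # 0.40
bayes = prob_p * prob_q_given_p / prob_q              # answer D: also 0.30
```

The two routes to Prob(P|Q) agree, while the other answer choices give
different values on this distribution.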
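Study sketch for Problem 6C: one standard bottom-up method is shift-reduce
parsing — shift the next word onto a stack under one of its lexical
categories, and reduce whenever the top of the stack matches the right-hand
side of a rule. The rule set below is an illustrative finite expansion of a
plausible reading of the Problem 6 grammar (starred and optional constituents
expanded up to one PP); it is a sketch, not the exam's official grammar or
algorithm.

```python
# Finite expansion of the grammar: (left-hand side, right-hand-side tuple).
RULES = [
    ("S",  ("NP", "VP")),
    ("NP", ("noun",)),
    ("NP", ("det", "noun")),
    ("NP", ("det", "noun", "PP")),
    ("NP", ("NP", "and", "NP")),
    ("VP", ("verb",)),
    ("VP", ("verb", "NP")),
    ("VP", ("verb", "NP", "NP")),
    ("VP", ("verb", "PP")),
    ("VP", ("verb", "NP", "PP")),
    ("PP", ("prep", "NP")),
]
LEXICON = {"John": ["noun"], "Mary": ["noun"], "rose": ["noun", "verb"],
           "saw": ["noun", "verb"], "morning": ["noun"], "gave": ["verb"],
           "the": ["det"], "to": ["prep"], "in": ["prep"], "and": ["and"]}

def parse(words):
    """Backtracking shift-reduce recognizer: True iff words form an S."""
    def step(stack, rest):
        if stack == ["S"] and not rest:
            return True
        # Try reducing any rule whose right-hand side tops the stack.
        for lhs, rhs in RULES:
            if tuple(stack[-len(rhs):]) == rhs:
                if step(stack[:-len(rhs)] + [lhs], rest):
                    return True
        # Try shifting the next word under each of its categories.
        if rest:
            for cat in LEXICON[rest[0]]:
                if step(stack + [cat], rest[1:]):
                    return True
        return False
    return step([], list(words))
```

For sentence 1 an early state, after three steps of this kind, would be a
stack like NP("John") verb("gave") with "Mary the rose" still unread — which
is the sort of snapshot the problem asks for.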