## Introduction to Artificial Intelligence: Solution Set 5

Assigned: Feb. 26

Due: Mar. 5

### Problem 1:

Consider the following problem: I get to work, but I can't find the keys
to my office in my pockets. Call this event K. There are three possible
explanations:
- The keys are actually in my pants pocket, where they're supposed to be,
but buried under other junk. (Event P)
- The keys are actually in my coat pocket,
but buried under other junk. (Event C)
- I left the keys at home. (Event H)

We estimate the relevant probabilities as follows:
Prob(P) = 0.8
Prob(C) = 0.19
Prob(H) = 0.01
Prob(K | P) = 0.1
Prob(K | C) = 0.2
Prob(K | H) = 1.0.

A. Evaluate Prob(P | K), Prob(C | K), and Prob(H | K).
**Answer:**
First, Prob(K) =
Prob(K|P) Prob(P) +
Prob(K|C) Prob(C) +
Prob(K|H) Prob(H) = 0.1 * 0.8 + 0.2 * 0.19 + 1.0 * 0.01 = 0.08 + 0.038 + 0.01
= 0.128.

Prob(P | K) = (By Bayes' rule) Prob(K|P) Prob(P) / Prob(K) = 0.1 * 0.8 / 0.128
= 0.625

Prob(C | K) = (By Bayes' rule) Prob(K|C) Prob(C) / Prob(K) = 0.2 * 0.19 / 0.128
= 0.297

Prob(H | K) = (By Bayes' rule) Prob(K|H) Prob(H) / Prob(K) = 1.0 * 0.01 / 0.128
= 0.078
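The Part A arithmetic can be verified with a short script (illustrative only, not part of the original solution): compute Prob(K) by total probability, then apply Bayes' rule to each explanation.

```python
# Illustrative check of the Part A posteriors (not part of the original solution).
priors = {"P": 0.8, "C": 0.19, "H": 0.01}     # Prob(P), Prob(C), Prob(H)
likelihood = {"P": 0.1, "C": 0.2, "H": 1.0}   # Prob(K | .)

# Total probability: Prob(K) = sum over explanations E of Prob(K|E) Prob(E)
prob_k = sum(likelihood[e] * priors[e] for e in priors)            # ~0.128

# Bayes' rule: Prob(E | K) = Prob(K|E) Prob(E) / Prob(K)
posterior = {e: likelihood[e] * priors[e] / prob_k for e in priors}
# posterior ~ {'P': 0.625, 'C': 0.297, 'H': 0.078}
```

Note that the three posteriors sum to 1, as they must, since P, C, and H are the only explanations considered.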

B. I check my pockets again, and again don't find the keys. Suppose that
the two checks of my pockets are independent and identically distributed,
and let M be the event that I miss my keys on both checks. We suppose that

Prob(M | P) = (Prob(K | P))^{2} = 0.01
Prob(M | C) = (Prob(K | C))^{2} = 0.04
Prob(M | H) = (Prob(K | H))^{2} = 1.0.

Evaluate Prob(P | M), Prob(C | M), and Prob(H | M).
**Answer:**
First, Prob(M) =
Prob(M|P) Prob(P) +
Prob(M|C) Prob(C) +
Prob(M|H) Prob(H) = 0.01 * 0.8 + 0.04 * 0.19 + 1.0 * 0.01 =
0.008 + 0.0076 + 0.01 = 0.0256

Prob(P | M) = (By Bayes' rule) Prob(M|P) Prob(P) / Prob(M) = 0.01 * 0.8 / 0.0256
= 0.3125

Prob(C | M) = (By Bayes' rule) Prob(M|C) Prob(C) / Prob(M) = 0.04 * 0.19 / 0.0256
= 0.297

Prob(H | M) = (By Bayes' rule) Prob(M|H) Prob(H) / Prob(M) = 1.0 * 0.01 / 0.0256
= 0.391
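The Part B computation is the same Bayes' rule calculation with each single-check miss probability squared; a quick script confirms the numbers (illustrative only, not part of the original solution):

```python
# Illustrative check of the Part B posteriors (not part of the original solution).
# Missing on two independent, identical checks squares each miss likelihood.
priors = {"P": 0.8, "C": 0.19, "H": 0.01}        # Prob(P), Prob(C), Prob(H)
miss_once = {"P": 0.1, "C": 0.2, "H": 1.0}       # Prob(K | .)
miss_twice = {e: p ** 2 for e, p in miss_once.items()}   # Prob(M | .)

# Total probability, then Bayes' rule, exactly as in Part A.
prob_m = sum(miss_twice[e] * priors[e] for e in priors)            # ~0.0256
posterior = {e: miss_twice[e] * priors[e] / prob_m for e in priors}
# posterior ~ {'P': 0.313, 'C': 0.297, 'H': 0.391}
```

Each repeated miss shifts probability mass away from P (the explanation under which a miss is least likely) and toward H (under which a miss is certain).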

C. Estimate how many times I have to check my pockets before I am 90% sure
that I have left my keys at home.
**Answer:** 4. (Three checks give only about 81% certainty; four give about 96%.)
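Part C can be worked by brute force (a sketch, not part of the original solution): after n independent checks, each miss likelihood is raised to the n-th power, and we stop as soon as the posterior on H reaches 0.9.

```python
# Brute-force sketch for Part C (not part of the original solution).
priors = {"P": 0.8, "C": 0.19, "H": 0.01}
miss = {"P": 0.1, "C": 0.2, "H": 1.0}    # single-check miss probability

n = 0
posterior_h = priors["H"]
while posterior_h < 0.9:
    n += 1
    # Prob(n misses) by total probability, then Bayes' rule for H.
    evidence = sum(miss[e] ** n * priors[e] for e in priors)
    posterior_h = miss["H"] ** n * priors["H"] / evidence
# Loop exits at n = 4: three checks give posterior ~0.81, four give ~0.96.
```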

### Problem 2

The overall syntax of a sentence often constrains the parts of speech of
particular words in ways that are not captured by the k-gram model
of part-of-speech tagging. For instance, the sentence "I can fish and
tomatoes" can be given a syntactic parse only if "can" is labelled
a verb and "fish" is labelled a noun. However, the trigram model will
probably decide that the most likely tagging is that "can" is a modal and "fish"
is a verb.
A. Give an example to show that

P(Tag[5]=noun | Tag[3]=verb, Tag[4]=conjunction)

is not zero. (All I'm asking for is one sentence with those
three elements in a row.)

B. Propose a method that will allow you to combine probabilistic information
from the trigram model with syntactic constraints. Explain how this would
fix this problem.