G22.2590 - Natural Language Processing - Spring 2005 Prof. Grishman
Lecture 13 Outline
May 2, 2005
Review Assignment #10 ... quantification, event reification.
Announce NLP course for fall: Machine Translation.
Discuss final exam outline.
Solicit feedback on course and Jet (please send comments by email).
Discourse Analysis: Planning
One approach to the analysis of narrative is through the use of plans.
We presume that the narrative describes a rational agent attempting to
achieve a goal, and executing a sequence of actions to achieve that goal.
However, the narrative only describes a selected subset of the actions.
Our objective in discourse analysis is to reconstruct the implicit actions
and goals (and in so doing to resolve any ambiguities of syntax, logical
form, or reference).
The problem is one of plan inference: to infer a plan from some subset
of its steps and goals.
This is somewhat different from the typical planning problem in AI,
where we start with an explicit goal and seek a plan (typically, the best
plan) which satisfies the goal, although in both cases we are searching
a space of possible plans.
In any planning problem, we have a set of predicates which describe
the state of the system, and a set of actions, which affect the
state of the system (making some predicates true or false). Some actions
may be compound actions: they represent sequences of simpler actions.
Actions have preconditions: predicates which must be true in order
for the action to apply, and effects: predicates which become true
when the action is performed.
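This precondition/effect view of actions can be sketched in a small
STRIPS-like representation. The sketch below is illustrative only; the
action "buy-ticket" and its predicates are invented for the example,
not taken from the lecture.

```python
# Minimal STRIPS-style action representation (illustrative sketch).
# A state is a frozenset of predicates currently true; an action's
# effects add or delete predicates from the state.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    preconditions: frozenset              # predicates that must hold
    add_effects: frozenset                # predicates made true
    del_effects: frozenset = frozenset()  # predicates made false

    def applicable(self, state):
        # The action applies only if all preconditions hold.
        return self.preconditions <= state

    def apply(self, state):
        assert self.applicable(state)
        return (state - self.del_effects) | self.add_effects

# A hypothetical action from a travel domain:
buy_ticket = Action("buy-ticket",
                    preconditions=frozenset({"at(station)", "have(money)"}),
                    add_effects=frozenset({"have(ticket)"}),
                    del_effects=frozenset({"have(money)"}))

state = frozenset({"at(station)", "have(money)"})
state = buy_ticket.apply(state)
```

Performing the action makes "have(ticket)" true and "have(money)" false,
while leaving unrelated predicates such as "at(station)" unchanged.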
We can represent a plan by a tree, in which a goal dominates an action
which achieves that goal, and an action dominates its preconditions
and (if it is a compound action) its constituent actions.
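As a sketch, such a plan tree can be built from goal and action nodes;
the node labels below (a train-travel plan) are invented for illustration.

```python
# A plan tree: a goal node dominates an action that achieves it; an
# action dominates its precondition subgoals and, if compound, its
# constituent sub-actions.

class Node:
    def __init__(self, label, kind, children=None):
        self.label = label
        self.kind = kind            # "goal" or "action"
        self.children = children or []

# Goal: be in Rochester, achieved by the compound action go-by-train,
# whose step board-train has the precondition (subgoal) have(ticket).
plan = Node("at(Rochester)", "goal", [
    Node("go-by-train(Rochester)", "action", [
        Node("have(ticket)", "goal", [Node("buy-ticket", "action")]),
        Node("board-train", "action"),
    ])
])

def leaves(node):
    # The frontier of the tree: the elementary actions actually executed.
    if not node.children:
        return [node.label]
    return [l for c in node.children for l in leaves(c)]
```

The frontier of this tree is the executed action sequence
["buy-ticket", "board-train"].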
Given a discourse, we ideally seek to create a plan (tree) in which
the root (initial goal) is either an explicitly stated goal or is known
to be a "plausible goal", and in which each sentence of the discourse can
be tied to some action in the goal tree. (In reality we will normally not
be able to connect all assertions in the discourse to the plan,
but will prefer analyses which have the maximal connection to a plan.)
Michelin Guide (from Wilensky): "Willa was hungry. She grabbed the
Michelin Guide and got in her car."
Trains domain (Allen, p. 484):
"Jack needed to be in Rochester by noon. He
bought a ticket at the station."
"Sue bought a ticket to Rochester. She boarded
the train at 4PM."
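A toy version of plan inference over examples like these might score
each candidate plan by how many actions observed in the discourse match
its steps, preferring the plan with maximal connection to the discourse.
The plan library and step names below are invented for illustration.

```python
# Toy plan inference: given actions observed in the discourse, choose
# the candidate plan that accounts for the most of them.

PLANS = {
    "travel-by-train": ["go-to-station", "buy-ticket", "board-train"],
    "dine-out":        ["consult-guide", "drive-to-restaurant", "eat"],
}

def infer_goal(observed):
    def score(steps):
        return sum(1 for action in observed if action in steps)
    return max(PLANS, key=lambda p: score(PLANS[p]))

# "Sue bought a ticket to Rochester. She boarded the train at 4PM."
goal = infer_goal(["buy-ticket", "board-train"])   # -> "travel-by-train"
```

A real system would of course need to map sentences to actions, handle
ties, and chain through subgoals; this only illustrates the idea of
matching observed steps against a library of plans.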
Equipment failure reports:
Dialog Analysis and Speech Acts (J&M)
Pure "user initiative" systems maintain no model of dialog state;
prior discourse is only maintained to resolve anaphora and analyze ellipsis.
(The other simple organization is pure "system initiative", where the
system asks the questions and only accepts direct answers … in effect a fancy
form-filling interface.)
For mixed-initiative systems, we need to maintain some
representation of dialog goals in a goal stack. There will be both
system (task) goals and user goals.
For example, in an information gathering task, the system will have
goals corresponding to the information it needs to gather (e.g., slots
to fill in a form). If the top goal is such a system goal, the system will
ask a question (to fill one slot in the form). The input will be interpreted
with respect to this top goal:
A direct answer will be used to fill this slot.
Other information in the answer may be used to fill other slots.
Another response (e.g., a question) introduces a user goal.
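A minimal sketch of such a goal stack (the slot names and goal labels
are hypothetical, not from the lecture):

```python
# Sketch of a mixed-initiative goal stack: the system pushes its
# slot-filling goals; a user question pushes a user goal on top, which
# the system addresses before resuming its own questions.

goal_stack = []

def push(goal):
    goal_stack.append(goal)

def top():
    return goal_stack[-1] if goal_stack else None

# System (task) goals: slots to fill in a form.
for slot in ["destination", "travel-date"]:
    push(("system", "fill-" + slot))

# The system asks about its top goal; instead of answering directly,
# the user asks a question, introducing a user goal on top of the stack.
push(("user", "answer-question:train-times"))
assert top() == ("user", "answer-question:train-times")

goal_stack.pop()             # system answers the user's question ...
assert top()[0] == "system"  # ... then resumes its own top goal
```

The stack discipline is what lets the system return to its own pending
question after servicing the user's interruption.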
Simple dialog systems process questions at "face value". This can lead
to "stonewalling" behavior such as (from Waltz):
User: Are there summaries for January?
System: Yes.
User: Could I have the January summaries?
System: Yes.
User: I would like the January summaries.
System: I understand.
User: Where are the January summaries?
System: On my disk.
User: Can you give me the January summaries?
System: Yes, I already told you that.
To understand why this is peculiar (and inappropriate) we have to go beyond
the literal meaning of an utterance and see the utterance as a communicative
act. Like other actions, communicative acts will have preconditions and
effects.
For example, in a communicative act of the Inform class a speaker
may assert a proposition P with the effect that the hearer then believes
P (J&M p. 736). In an act of the Request class, a speaker requests
that the hearer perform some action A, with the effect that the hearer
then intends to perform A.
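The effects of Inform and Request on the hearer's mental state can be
sketched directly (a toy model; the proposition and action names are
invented for the example):

```python
# Sketch of Inform and Request as communicative acts with effects on
# the hearer's mental state, per the precondition/effect view of actions.

hearer = {"believes": set(), "intends": set()}

def inform(hearer, proposition):
    # Effect of Inform: the hearer comes to believe P.
    hearer["believes"].add(proposition)

def request(hearer, action):
    # Effect of Request: the hearer comes to intend to perform A.
    hearer["intends"].add(action)

inform(hearer, "train-leaves-at-3:15")
request(hearer, "board-train")
```

A fuller model would also represent preconditions (e.g., that the
speaker believes P before asserting it), which this sketch omits.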
Given such an interpretation of communicative acts, the task for the hearer
is then to identify a plausible goal of the speaker from the communicative
act he/she performs, and then to be responsive to that goal.
For example, in the stonewalling above, the literal interpretation does
not correspond to a plausible goal, so a cooperative system would seek
a more likely goal (actually getting the summaries) and respond to that goal.
If you ask a train agent "When does the train to Jamaica leave?" and
he answers "3:15, track 27", it’s because he inferred that you wanted to
get on that train and therefore needed to know where as well as when it
leaves.