G22.2590 – Natural Language Processing – Spring 2006
The final exam
- Worth 30 points towards final grade
- Given Tuesday, May 9th, 2006, 5:00 – 6:50 (usual class time)
- Open book and notes (you will need your book!). If you have a
simple calculator, it may also be helpful if we have a question on HMM
or PCFG probabilities.
- Five to seven questions
The questions will be taken from the following list of question types.
Most of these correspond directly to questions asked for homework.
I may also ask one or two short (1-blue-book-page) essay questions on
issues we have discussed in the lectures.
- English sentence structure: Label the constituents (NP,
VP, PP, etc.) of an English sentence based on the grammar given in class
(and summarized in the handout for homework #2). If the sentence is ambiguous,
show its multiple parses. If the sentence violates some grammatical constraint,
describe the constraint. (homework #2)
- Context-free grammar: Extend the context-free grammar to handle
an additional construct, or to capture a grammatical constraint.
- Parsing: Given a very small context-free grammar, step through
the operation of, or count the number of operations performed by, a
backtracking parser, a bottom-up parser, or a chart parser (homework).
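
As a study aid, counting chart-parser operations can be sketched in Python. This is a toy CKY recognizer, not the parser used in the homework; the CNF grammar and the sentence are invented for illustration:

```python
from collections import defaultdict

# Toy grammar in Chomsky normal form: binary rules A -> B C and a lexicon A -> w.
# (These rules are illustrative, not the course grammar.)
binary = {("NP", "VP"): "S", ("Det", "N"): "NP", ("V", "NP"): "VP"}
lexicon = {"the": "Det", "dog": "N", "cat": "N", "saw": "V"}

def cky_recognize(words):
    n = len(words)
    chart = defaultdict(set)           # chart[(i, j)] = labels spanning words[i:j]
    ops = 0                            # number of rule-combination operations tried
    for i, w in enumerate(words):      # seed the chart with lexical entries
        chart[(i, i + 1)].add(lexicon[w])
    for length in range(2, n + 1):     # build longer spans from shorter ones
        for i in range(0, n - length + 1):
            j = i + length
            for k in range(i + 1, j):  # try every split point
                for b in chart[(i, k)]:
                    for c in chart[(k, j)]:
                        ops += 1
                        if (b, c) in binary:
                            chart[(i, j)].add(binary[(b, c)])
    return "S" in chart[(0, n)], ops

ok, ops = cky_recognize("the dog saw the cat".split())
```

Tracing `chart` span by span, as the exam question asks, shows why the chart parser never repeats work: each constituent is entered once and reused at every split point that touches it.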
- POS tagging: Tag a sentence using the Penn POS tags.
- HMMs and the Viterbi decoder: Describe how POS tagging can be
performed using a probabilistic model (J&M sec. 8.5; lecture #4).
Create an HMM from some POS-tagged training data. Trace the operation of
a Viterbi decoder. Compute the likelihood of a given tag sequence and the
likelihood of generating a given sentence from an HMM (homework #4).
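
To see what tracing a Viterbi decoder involves, here is a minimal sketch over a toy HMM; the tag set, transition, and emission probabilities are invented for illustration (they are not the homework #4 data):

```python
def viterbi(words, tags, start_p, trans_p, emit_p):
    # v[t][tag] = probability of the best tag sequence for words[:t+1] ending in tag
    v = [{}]
    back = [{}]
    for tag in tags:
        v[0][tag] = start_p.get(tag, 0.0) * emit_p[tag].get(words[0], 0.0)
        back[0][tag] = None
    for t in range(1, len(words)):
        v.append({}); back.append({})
        for tag in tags:
            best_prev, best_p = max(
                ((prev, v[t - 1][prev] * trans_p[prev].get(tag, 0.0)) for prev in tags),
                key=lambda x: x[1])
            v[t][tag] = best_p * emit_p[tag].get(words[t], 0.0)
            back[t][tag] = best_prev
    last = max(tags, key=lambda tag: v[-1][tag])   # best final state
    seq = [last]                                   # follow backpointers
    for t in range(len(words) - 1, 0, -1):
        seq.append(back[t][seq[-1]])
    return list(reversed(seq)), v[-1][last]

# Toy model (invented numbers):
tags = ["DT", "NN", "VB"]
start_p = {"DT": 0.8, "NN": 0.1, "VB": 0.1}
trans_p = {"DT": {"NN": 0.9, "VB": 0.1},
           "NN": {"VB": 0.8, "NN": 0.1, "DT": 0.1},
           "VB": {"DT": 0.6, "NN": 0.4}}
emit_p = {"DT": {"the": 1.0},
          "NN": {"dog": 0.6, "barks": 0.1},
          "VB": {"barks": 0.7, "dog": 0.1}}
path, p = viterbi(["the", "dog", "barks"], tags, start_p, trans_p, emit_p)
# path == ["DT", "NN", "VB"], p == 0.8*1.0 * 0.9*0.6 * 0.8*0.7 = 0.24192
```

The likelihood of a *given* tag sequence (the other exam task) is the same product of transition and emission probabilities, just without the max over previous tags.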
- Feature grammar: Augment a context-free grammar using the
formalism of J&M 11.3 to capture a grammatical constraint (homework).
- Chunkers and name taggers: Explain how BIO tags can be
used to reduce chunking or name identification to a token-tagging task. Explain
how chunking can be evaluated (lecture #7). Explain how a
maximum-entropy model can be used for tagging or chunking (lecture and homework).
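
A small sketch of both ideas: BIO tags turn a chunking into one tag per token, and evaluation compares whole recovered chunks against gold chunks by precision and recall. The tag sequences below are invented examples:

```python
def bio_to_chunks(tags):
    """Convert a BIO tag sequence to a set of (start, end, type) chunks."""
    chunks, start, ctype = set(), None, None
    for i, tag in enumerate(tags + ["O"]):       # sentinel "O" flushes the last chunk
        inside = tag.startswith("I-") and tag[2:] == ctype and start is not None
        if not inside and start is not None:     # the current chunk ends here
            chunks.add((start, i, ctype))
            start = None
        if tag.startswith("B-") or (tag.startswith("I-") and not inside):
            start, ctype = i, tag[2:]            # a new chunk begins
    return chunks

# Invented gold and system taggings of a 7-token sentence:
gold = ["B-NP", "I-NP", "O", "B-NP", "O",    "B-NP", "I-NP"]
pred = ["B-NP", "I-NP", "O", "B-NP", "I-NP", "B-NP", "I-NP"]
g, p = bio_to_chunks(gold), bio_to_chunks(pred)
correct = len(g & p)                  # chunks right in both boundaries and type
precision = correct / len(p)
recall = correct / len(g)
f1 = 2 * precision * recall / (precision + recall)
```

Note that scoring whole chunks is stricter than token accuracy: the system above tags 6 of 7 tokens correctly, but the one wrong `I-NP` costs it an entire chunk in both precision and recall.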
- Probabilistic CFG: Train a probabilistic CFG from some
parsed training data; apply this PCFG to disambiguate a sentence. Explain
how this PCFG can be extended to capture lexical information. Compute parse
probabilities. (homework #9)
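
The training and probability computations can be sketched in a few lines: each rule's probability is its count divided by the count of its left-hand side, and a parse's probability is the product of its rule probabilities. The toy treebank counts below are invented, not the homework #9 data:

```python
from collections import Counter

# Rule occurrences observed in a toy treebank (one entry per occurrence).
observed = [
    ("S", ("NP", "VP")), ("S", ("NP", "VP")),
    ("NP", ("Det", "N")), ("NP", ("Det", "N")), ("NP", ("Pronoun",)),
    ("VP", ("V", "NP")), ("VP", ("V",)),
]

# Maximum-likelihood estimate: P(A -> beta) = count(A -> beta) / count(A).
rule_counts = Counter(observed)
lhs_counts = Counter(lhs for lhs, _ in observed)
prob = {rule: c / lhs_counts[rule[0]] for rule, c in rule_counts.items()}
# e.g. prob[("NP", ("Det", "N"))] == 2/3, prob[("VP", ("V",))] == 1/2

# Probability of one parse = product of the probabilities of the rules it uses.
parse_rules = [("S", ("NP", "VP")), ("NP", ("Pronoun",)), ("VP", ("V",))]
p_parse = 1.0
for r in parse_rules:
    p_parse *= prob[r]
# p_parse == 1.0 * 1/3 * 1/2 == 1/6
```

Disambiguation then just means computing this product for each candidate parse of the sentence and picking the largest.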
- Logical form: Write the logical form of an English sentence,
with or without event reification (J&M chap. 14 and 15.1; lectures).
- Jet: Be able to extend, or trace the operation of, one of the
Jet pattern sets we have distributed and discussed (for noun and verb groups,
and for appointment events). Analyze and correct a shortcoming in
the appointment patterns (homework #10).