Surface Realization
by
Drew Jacobs
What does it do?


Derives a human-readable sentence from a
discourse plan.
The discourse plan does not give syntax, only
functional information. The surface realizer
adds syntactic information and ensures that
the sentence complies with lexical and
grammatical constraints.
What doesn’t it do?


Will not verify the correctness of the data
provided by the discourse planner, or check that
the information makes sense.
Does not deal with more than one sentence
at a time. If the plan calls for many
sentences, the surface realizer will be called
once for each sentence required.
Simple Surface Realization Tools

Canned Text Systems
- Takes a given input and matches it directly
to a pre-made sentence.
- Commonly used in simple systems such as
error messages or warnings.
- Has no flexibility whatsoever.
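As a rough illustration (the message keys and sentences below are invented, not taken from any particular system), a canned text system is essentially a lookup table:

    # A minimal sketch of a canned text system: each input condition maps
    # directly to one pre-made sentence, with no variation possible.
    CANNED_MESSAGES = {
        "disk_full": "The disk is full.",
        "file_not_found": "The requested file could not be found.",
        "save_complete": "The document was saved successfully.",
    }

    def realize_canned(message_key):
        # "Realization" is just retrieval; unknown inputs cannot be handled.
        return CANNED_MESSAGES[message_key]

    print(realize_canned("save_complete"))
    # -> The document was saved successfully.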
Simple Surface Realization (cont.)

Template Systems
- The idea of a template is that there are pre-made
sentences with fill-in-the-blank slots that are filled in
by the input.
- These systems work well for form letters and
slightly more advanced error or warning messages.
- They are still very inflexible, but better than canned
text systems.
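A minimal sketch of the template idea (the template text and slot names are invented for illustration): the sentence frame is fixed and only the blanks vary with the input.

    # A minimal sketch of a template system: pre-made sentences with
    # fill-in-the-blank slots that are filled in by the input.
    TEMPLATES = {
        "save_warning": "The {actor} will save the {object} in {seconds} seconds.",
        "form_letter": "Dear {name}, your order {order_id} has shipped.",
    }

    def realize_template(template_key, **slots):
        return TEMPLATES[template_key].format(**slots)

    print(realize_template("save_warning",
                           actor="system", object="document", seconds=30))
    # -> The system will save the document in 30 seconds.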
Two More Advanced Approaches
(Feature Based Systems)

Systemic Grammar – Representation of
sentences as collections of functions. Rules
allow mapping from functions to grammatical
forms. (Halliday, 1985)

Functional Unification Grammar –
Represents sentences as feature structures
that can be combined and altered to produce
sentences. (Kay, 1979)
Systemic Grammar



A systemic grammar comes from a branch of
linguistics called Systemic-Functional
Linguistics.
Described by M.A.K. Halliday in the book “An
Introduction to Functional Grammar” (1985)
Systemic grammars work by applying rules
to functions in layers to generate actual
sentences.
Mood Layer

Also known as the interpersonal
meta-function.

It describes the interaction between the sentence
writer and the reader.

Examples would be whether the writer is telling
the reader something or asking a question.
Transitivity Layer

Also known as the ideational
meta-function.

It identifies items such as who the actors are
and what the goals are for the sentence.

Also identifies the type of process being
performed.
Theme Layer

Also known as the textual meta-function.

This layer tries to fit the expression with a
given theme and reference.
Systemic Grammar Rules


Within the layers, there exist realization statements
that allow for transition from functional form to
syntactic form.
These statements contain operators for adding
syntactic objects to the sentence. The operators take
the following forms (sketched in code below):
+X for inserting a function X
X = Y for making X and Y equivalent
X > Y for requiring that X come before Y
X / Y for assigning a syntactic expression Y to a
function X.
Figure 20-2 From “Speech and Language Processing”, Daniel Jurafsky & James H. Martin, Prentice Hall, 2000.
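The sketch below is not the textbook's formalism, only an illustration of what the four operators do: each realization statement adds a constraint, and together the constraints turn a set of functions into an ordered, syntactically labelled structure. The function names and fillers are borrowed from the "The system will save the document" example on the next slide.

    # A rough sketch of realization statements as operations on a growing
    # sentence description (functions, conflations, ordering, syntax).
    functions = set()        # functions inserted so far
    conflations = {}         # X = Y : X and Y are the same element
    ordering = []            # X > Y : X must come before Y
    syntax = {}              # X / Y : function X is expressed by syntax Y

    def insert(x):                 # +X
        functions.add(x)

    def conflate(x, y):            # X = Y
        conflations[x] = y

    def order(x, y):               # X > Y
        ordering.append((x, y))

    def realize_as(x, y):          # X / Y
        syntax[x] = y

    # Hypothetical rule applications for a declarative, future-tense clause:
    for f in ("Subject", "Finite", "Predicator", "Goal"):
        insert(f)
    conflate("Subject", "Actor")
    order("Subject", "Finite")
    order("Finite", "Predicator")
    order("Predicator", "Goal")
    realize_as("Subject", "NP(the system)")
    realize_as("Finite", "will")
    realize_as("Predicator", "save")
    realize_as("Goal", "NP(the document)")

    # Linearizing the ordered functions via their syntactic expressions
    # would yield: "The system will save the document".
    print(ordering)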
Example Input


“The System Will Save the Document”
Input to a systemic grammar would take this form:
:process    save
:actor      system
:goal       document
:speechact  assertion
:tense      future
From “Speech and Language Processing”, Daniel Jurafsky & James H. Martin, Prentice Hall, 2000.
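For reference, the same input can be written as a plain Python dictionary; a systemic realizer would consult these features layer by layer (:speechact in the mood layer, :actor, :process and :goal in the transitivity layer, and so on).

    # The slide's example input, written as a plain dictionary.
    example_input = {
        "process":   "save",
        "actor":     "system",
        "goal":      "document",
        "speechact": "assertion",
        "tense":     "future",
    }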
Example Sentence
Here we can see how each layer corresponds to the
text.
From “Speech and Language Processing”, Daniel Jurafsky & James H. Martin, Prentice Hall, 2000.
Functional Unification Grammar



This method of NLG works using the
principles of unification grammars, which
were introduced in Chapter 11 as a way to
describe parsing.
It works by building a feature structure and a
list of potential alternations for the structure.
It then matches (unifies) the feature structure of the
input with the feature structures in the
grammar, as sketched below.
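A minimal sketch of the core matching step, assuming feature structures are represented as nested Python dictionaries (real FUG implementations also handle alternations and paths, which are omitted here, and the feature names below are only illustrative):

    # Unify two feature structures: merge compatible features, fail (return
    # None) when atomic values clash.
    def unify(a, b):
        if a is None:
            return b
        if b is None:
            return a
        if isinstance(a, dict) and isinstance(b, dict):
            result = dict(a)
            for key, value in b.items():
                merged = unify(result.get(key), value)
                if merged is None:
                    return None       # a feature clashed somewhere below
                result[key] = merged
            return result
        return a if a == b else None  # atomic values must match exactly

    input_fd   = {"cat": "S", "actor": {"lex": "system"}, "process": {"lex": "save"}}
    grammar_fd = {"cat": "S", "pattern": ("actor", "process", "goal")}
    print(unify(input_fd, grammar_fd))
    # -> the input enriched with the grammar's pattern feature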
Categories, Elements, and Patterns



The Category in a feature structure represents what
the feature structure corresponds to. For example S
is a sentence, and NP is a noun phrase.
Each of the elements in a feature structure is a
component that makes up the feature being
represented. So a Process or a Goal would be an
element of a Sentence, and so forth.
Patterns describe the ordering of elements in a
feature structure. So, for example, a Noun Phrase
could have the pattern (Det, PrNoun).
Figure 20-3 From “Speech and Language Processing”, Daniel Jurafsky & James H. Martin, Prentice Hall, 2000.
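As a small follow-on sketch (the lexical values here are placeholders, not taken from the figure), a pattern can be read as instructions for linearizing a feature structure's elements:

    # An NP feature structure following the slide's example pattern
    # (Det, PrNoun), plus a naive linearizer that walks the pattern.
    np = {
        "cat": "NP",
        "det":    {"lex": "the"},
        "prnoun": {"lex": "system"},
        "pattern": ("det", "prnoun"),
    }

    def linearize(fs):
        # Emit each element's lexical item in pattern order.
        return " ".join(fs[element]["lex"] for element in fs["pattern"])

    print(linearize(np))   # -> the system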
An Example Unification

An Example Input (Functional Description)
From “Speech and Language Processing”, Daniel Jurafsky & James H. Martin, Prentice Hall, 2000.
Figure 20-4 From “Speech and Language Processing”, Daniel Jurafsky & James H. Martin, Prentice Hall, 2000.
Natural Language Generation
Programs

KPML – a text generation system based on
the earlier Penman system. It uses
Systemic-Functional Linguistics principles.
http://www.fb10.uni-bremen.de/anglistik/langpro/kpml/README.html

FUF/SURGE – a text generation system and
English grammar using Functional
Unification.
http://www.cs.bgu.ac.il/research/projects/surge/index.htm
Additional Resources

http://registry.dfki.de/
- A comprehensive listing of current NLP tools.

http://www.dynamicmultimedia.com.au/siggen/
- The website for the Special Interest Group on Generation.

http://www.fb10.uni-bremen.de/anglistik/langpro/NLG-table/NLGtable-root.htm
- A website with a full listing of all programs specifically used for
Natural Language Generation.
References

Halliday, M.A.K. “An Introduction to Functional Grammar”. Edward
Arnold, London 1985.

Hovy, Eduard. “Language Generation.”
http://www.lt-world.org/HLT_Survey/ltw-chapter4-all.pdf

Jurafsky, Daniel & Martin, James H. “Speech and Language
Processing”. Prentice Hall, New York 2000.