
Weights

The weights that are associated with the edges of the graph can be sensitive to the following factors.

• Acoustic score. Obviously, the acoustic score present in the word-graph is an important factor. The acoustic scores are derived from probabilities by taking the negative logarithm. For this reason we aim to minimize this score. If edges are combined, then we have to sum the corresponding acoustic scores.
• Number of `skips'. We want to minimize the number of skips, in order to obtain a preference for the maximal projections found by the parser. Each time we select a skip edge, the number of skips is increased by 1.
• Number of maximal projections. We want to minimize the number of such maximal projections, in order to obtain a preference for more extended linguistic analyses over a series of smaller ones. Each time we select a category edge, this number is increased by 1.
• Quality of the QLF in relation to the context. We are experimenting with evaluating the quality of a given QLF in relation to the dialogue context, in particular the question previously asked by the system [26].
• Ngram statistics. We have experimented with bigrams and trigrams. Ngram scores are expressed as negative logarithms of probabilities. This implies that combining Ngram scores requires addition, and that lower scores imply higher probability.
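Several of the components above (the acoustic score and the Ngram scores) are negative logarithms of probabilities, so multiplying probabilities corresponds to adding scores, and lower scores imply higher probability. A minimal sketch of this correspondence, with invented probabilities for illustration:

```python
import math

# Hypothetical probabilities for two consecutive edges in a word-graph.
p_edge1 = 0.8
p_edge2 = 0.5

# Scores are negative logarithms of probabilities, so lower is better.
score1 = -math.log(p_edge1)
score2 = -math.log(p_edge2)

# Combining two edges: multiplying the probabilities corresponds to
# adding their negative-log scores.
combined_score = score1 + score2
combined_prob = math.exp(-combined_score)

assert abs(combined_prob - p_edge1 * p_edge2) < 1e-12
```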

The only requirement we impose to ensure that efficient graph searching algorithms are applicable is that weights are uniform. This means that the weight of an edge leaving a vertex vi is independent of how vertex vi was reached.
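Because weights are uniform (and, for the additive components above, non-negative), a standard best-first graph search such as Dijkstra's algorithm applies: once a vertex is popped from the priority queue, its cost is final. A minimal sketch with a single additive cost component; the toy graph is invented for illustration:

```python
import heapq

def shortest_path(edges, start, goal):
    """Dijkstra's algorithm over a word-graph.

    `edges` maps a vertex to a list of (next_vertex, weight) pairs.
    Uniformity means each weight depends only on the edge itself, not
    on how its source vertex was reached, so the first time a vertex
    is popped from the queue its cost is final.
    """
    queue = [(0.0, start)]
    best = {}
    while queue:
        cost, v = heapq.heappop(queue)
        if v in best:
            continue
        best[v] = cost
        if v == goal:
            return cost
        for w, weight in edges.get(v, []):
            if w not in best:
                heapq.heappush(queue, (cost + weight, w))
    return None

# Toy word-graph: the direct edge 0 -> 2 is more expensive than
# going through vertex 1.
graph = {0: [(1, 1.0), (2, 4.0)], 1: [(2, 1.0)]}
print(shortest_path(graph, 0, 2))  # -> 2.0
```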

In order to be able to compute with such multidimensional weights, we express weights as tuples ⟨c1, …, cn⟩. For each cost component ci we specify an initial weight, and for each edge we specify the weight of each cost component. To specify how weights are updated when a path is extended, we use a function f that maps a pair of a multidimensional weight and an edge a to a new multidimensional weight. Moreover, we need to define an ordering on such tuples. Because we want to experiment with different implementations of this idea, we refer to such a collection of specifications as a method. Summarizing, such a weight method is a triple ⟨w0, f, <⟩ where

1. w0 is the initial weight;
2. f is the update weight function;
3. < is an ordering on weights.
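A weight method of this kind can be sketched directly in code. The sketch below is illustrative only: the edge fields (an acoustic score and an edge kind) and the particular lexicographic ordering (skips first, then maximal projections, then acoustic score) are assumptions, not the paper's definitive method:

```python
from dataclasses import dataclass
from typing import Callable

# A weight method as a triple: initial weight, update function, and
# an ordering on weights ("is w1 better than w2?").
@dataclass
class WeightMethod:
    initial: tuple
    update: Callable   # (weight, edge) -> weight
    better: Callable   # (w1, w2) -> bool

# Hypothetical edge representation: an acoustic score plus a kind
# ("skip" or "category").
def update(weight, edge):
    acoustic, skips, projections = weight
    return (acoustic + edge["acoustic"],
            skips + (1 if edge["kind"] == "skip" else 0),
            projections + (1 if edge["kind"] == "category" else 0))

# One possible ordering among many: minimize skips first, then the
# number of maximal projections, then the acoustic score.
def better(w1, w2):
    a1, s1, p1 = w1
    a2, s2, p2 = w2
    return (s1, p1, a1) < (s2, p2, a2)

method = WeightMethod(initial=(0.0, 0, 0), update=update, better=better)

# Extending a path over a skip edge and then a category edge.
w = method.update(method.initial, {"acoustic": 3.0, "kind": "skip"})
w = method.update(w, {"acoustic": 2.0, "kind": "category"})
print(w)  # -> (5.0, 1, 1)
```

Packaging the three components together makes it easy to swap in alternative orderings or extra cost components without touching the search algorithm itself.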



2000-07-10