In this section we extend the nlp_speech method to take Ngram probabilities into account. An obvious approach is to extend the weight tuple with a further component expressing the accumulated weight according to the Ngram probabilities. However, a problem arises. If we extend a given path using a transition labelled w, we want to take into account the probability of reading this word w given the previous N-1 words; but note that in the definition of the annotated word-graph given above these words are not readily available. Even if we make sure that each path is associated with the last words seen so far, we must be careful that weights remain uniform.
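The difficulty can be made concrete with a small sketch. The names below (extend, bigram_logprob, the toy probabilities) are illustrative assumptions, not part of the original definition; the point is only that extending the weight tuple requires the previous word, which the weight tuple itself does not record.

```python
import math

# Toy bigram log-probabilities P(w2 | w1); purely illustrative.
BIGRAMS = {("<s>", "please"): 0.5, ("please", "stop"): 0.4}

def bigram_logprob(prev, word):
    # Unseen bigrams get a small floor probability.
    return math.log(BIGRAMS.get((prev, word), 1e-6))

def extend(weight, prev_word, word, acoustic):
    """Extend a weight tuple (acoustic_sum, ngram_sum) along a
    transition labelled `word`.  Note that `prev_word` is required as
    an extra argument: it cannot be recovered from `weight` alone,
    which is exactly the problem described in the text."""
    ac, ng = weight
    return (ac + acoustic, ng + bigram_logprob(prev_word, word))
```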
The solution to this problem is to alter the definition of the graph in such a way that the relevant N-1 words are part of the vertex. If we want to take Ngram probabilities into account, then the vertices of the graph are tuples (v, w1 ... wN-1), where v is a state of the word-graph as before, and w1 ... wN-1 are the previous N-1 words. For example, in the case of bigrams (N=2), vertices are pairs consisting of a word-graph state and a word (the previous word). A number of special symbols y1 ... yN-1 is introduced as beginning-of-sentence markers. The start vertex is now (vs, y1 ... yN-1), where vs is the start state of the word-graph. The notation x:k is used to refer to the last k elements of x.
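For the bigram case, the construction can be sketched as follows. This is a sketch under stated assumptions, not the paper's implementation: the edge format (from_state, to_state, word, weight), the function name, and the assumption that transitions are topologically ordered (word-graph transitions run forward in time) are all illustrative.

```python
def expand_bigram(edges, start, bos="<s>"):
    """Expand a word-graph for bigrams (N=2): each new vertex pairs a
    state with the previous word, and the start vertex becomes
    (start, bos), with bos a beginning-of-sentence marker.

    edges: list of (from_state, to_state, word, weight), assumed to be
    topologically ordered."""
    new_edges = []
    # For each state, the set of previous words with which it can be reached.
    history = {start: {bos}}
    for src, dst, word, weight in edges:
        for prev in history.get(src, set()):
            # The previous word is now part of the vertex, so a bigram
            # weight for (prev, word) could be attached here.
            new_edges.append(((src, prev), (dst, word), word, weight))
        history.setdefault(dst, set()).add(word)
    return (start, bos), new_edges
```

Note that a state reachable via several different previous words is split into several vertices, one per history; this is what makes the Ngram weight of each transition well defined.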
The annotated word-graph for Ngrams is a weighted graph (V,E), for some fixed N, such that: