For example, the word hawk maps to bird through the is_a relationship, and duck links to bird through the same relationship. By sliding along these relationships, NLPWin uses the knowledge stored in MindNet to identify the meaning of words in relation to other words.
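
To make the idea concrete, the following Python sketch models a MindNet-style store as a simple dictionary of typed edges and walks its is_a links. The class name, its methods, and the dictionary-of-edges representation are illustrative assumptions only; the real MindNet is a far richer structure.

    # A minimal sketch of sliding along is_a relations in a MindNet-style
    # semantic network. The representation is an assumption for illustration.
    from collections import defaultdict

    class SemanticNetwork:
        def __init__(self):
            self.edges = defaultdict(list)   # word -> [(relation, other_word)]

        def add(self, word, relation, other):
            self.edges[word].append((relation, other))

        def related(self, word, relation):
            """Return every word reachable from `word` along `relation` edges."""
            seen, frontier = set(), [word]
            while frontier:
                for rel, other in self.edges[frontier.pop()]:
                    if rel == relation and other not in seen:
                        seen.add(other)
                        frontier.append(other)
            return seen

    net = SemanticNetwork()
    net.add("hawk", "is_a", "bird")
    net.add("duck", "is_a", "bird")
    net.add("bird", "is_a", "animal")
    print(net.related("hawk", "is_a"))   # {'bird', 'animal'}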

Discourse Component

The Discourse module, pioneered by Simon Corston-Oliver, takes the data passed up from previous components and summarizes it. For instance, it can summarize the essence of a book, similar to CliffsNotes, presenting the book's key points.
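
The article does not describe how the Discourse module selects its key points, so the sketch below uses a much simpler stand-in, frequency-based extractive summarization, purely to illustrate the summarize-the-essence idea; it is not NLPWin's actual algorithm.

    # Toy extractive summarizer, illustration only: sentences are scored by the
    # average frequency of the words they contain, and the top-scoring ones are
    # returned in their original order. This is a stand-in, not NLPWin's method.
    import re
    from collections import Counter

    def summarize(text, num_sentences=2):
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        freq = Counter(re.findall(r"[a-z']+", text.lower()))

        def score(sentence):
            tokens = re.findall(r"[a-z']+", sentence.lower())
            return sum(freq[t] for t in tokens) / (len(tokens) or 1)

        top = set(sorted(sentences, key=score, reverse=True)[:num_sentences])
        return " ".join(s for s in sentences if s in top)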

Meaning Representation Component

At the top of the NLP arch, the Meaning Representation component represents the Holy Grail of computational linguistics: true language understanding. Once NLPWin reaches this state, it has finished the increasingly abstract parsing of the original text and stored the information in MindNet, and it becomes possible to reverse the entire process to produce meaningful responses. In other words, the Generation component converts the abstract, or logical, forms taken directly from NL Text back into NL Text. By first dissecting and digesting the text fed into it and then synthesizing meaningful responses, the system can engage humans in conversation (dialogue). While many of the previous attempts at this type of system have focused on narrow vocabularies, the NLP Group's ambition is to enable broad coverage of entire natural languages, such as English, Spanish, and Japanese.
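
The article does not give the format of NLPWin's logical forms, so the sketch below invents a minimal one, a predicate with labeled deep-role arguments, and shows how a naive generator could turn it back into surface text. The representation, the role labels, and the realize function are all assumptions for illustration.

    # Illustration only: a logical form is assumed here to be a predicate plus a
    # dictionary of deep-role arguments, and "generation" is a naive surface
    # realization. NLPWin's real logical forms and generator are far richer.

    def realize(logical_form):
        """Turn a (predicate, {role: filler}) pair back into a plain sentence."""
        predicate, args = logical_form
        words = [args.get("Dsub", "something"), predicate]   # deep subject first
        if "Dobj" in args:                                    # then deep object
            words.append(args["Dobj"])
        return " ".join(words).capitalize() + "."

    lf = ("eats", {"Dsub": "the duck", "Dobj": "bread"})
    print(realize(lf))   # The duck eats bread.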

Applying NLPWin to Machine Translation
Although the research linguists at Microsoft have made groundbreaking strides in developing the initial components of NLPWin (with the Word grammar checker perhaps the most notable milestone), teaching computers to actually understand language remains a distant goal. Because the Generation module appears to depend on the Meaning Representation component, and because each NLPWin component builds on the ones before it, true language translation remains beyond the current capabilities of the system.
Fortunately for the field of Machine Translation (MT), the NLP Group has found a method to short-circuit the process. Once the system reaches the Logical Form stage, it is possible to match or map these highly abstract constructs, stored in MindNet, to their counterparts in another language. Thus, the system could perform MT without the machine truly understanding the meaning of the words.
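
A rough sketch of that short-circuit, reusing the toy logical-form shape from the earlier example: each predicate and argument filler is mapped through a bilingual transfer dictionary, and the result would then be handed to the target-language generator. The tiny dictionary and the transfer function are assumptions, not NLPWin's actual mechanism.

    # Sketch of logical-form transfer: map each node of a source-language
    # logical form to its target-language counterpart. The English-Spanish
    # dictionary and the function below are assumptions for illustration.
    TRANSFER_EN_ES = {
        "eats": "come",
        "the duck": "el pato",
        "bread": "pan",
    }

    def transfer(logical_form, mapping):
        predicate, args = logical_form
        return (mapping.get(predicate, predicate),
                {role: mapping.get(filler, filler) for role, filler in args.items()})

    lf_en = ("eats", {"Dsub": "the duck", "Dobj": "bread"})
    lf_es = transfer(lf_en, TRANSFER_EN_ES)
    # lf_es == ("come", {"Dsub": "el pato", "Dobj": "pan"})
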
The creation of the NLPWin Machine Translation
system takes place in two stages: training and runtime.

Training

Figure 3 presents an overview of the MT training
process. The system begins with a pair of equivalent sample sentences from a database.
Figure 3: An overview of the MT training process.
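
Continuing the same toy representation, the sketch below shows what the training step might boil down to: given pairs of equivalent sentences that have already been parsed to logical forms, record which source-language predicates and fillers line up with which target-language ones. The pre-parsed input and the alignment-by-role heuristic are assumptions; the article only says that training starts from a database of equivalent sentence pairs.

    # Sketch of the training stage under the same toy logical-form shape: build
    # a source-to-target transfer dictionary from aligned sentence pairs that
    # are assumed to be parsed to logical forms already. Illustration only.
    def train(aligned_logical_forms):
        mapping = {}
        for (src_pred, src_args), (tgt_pred, tgt_args) in aligned_logical_forms:
            mapping[src_pred] = tgt_pred
            for role, filler in src_args.items():
                if role in tgt_args:                 # align fillers by shared role
                    mapping[filler] = tgt_args[role]
        return mapping

    examples = [
        (("eats", {"Dsub": "the duck", "Dobj": "bread"}),
         ("come", {"Dsub": "el pato", "Dobj": "pan"})),
    ]
    print(train(examples))
    # {'eats': 'come', 'the duck': 'el pato', 'bread': 'pan'}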
