NL2KR demo

Slide 0

In the NL2KR directory when we give the command

./NL2KR-L-GUI.sh &

we get the NL2KR GUI.

In that GUI there are six buttons at the top:

Lambda, Inverse, Generalization, CCG Parser, NL2KR-L, NL2KR-T

In the coming slides we will show the working of each of these six buttons. The first four are development tools, while NL2KR-L and NL2KR-T are the main modes of operation of the NL2KR system: NL2KR-L is used to learn the meaning of new words from example sentences with their meanings/translations, starting from an initial dictionary, and NL2KR-T is used to translate sentences using the given and learned meanings of words.

Slide 1

Here we show the working of the Lambda button.

When one clicks on the Lambda button, the GUI shows two text inputs: Function and Argument.

In this slide we give the Function and Argument as shown above.

(Since we do not have the Greek lambda symbol on our keyboard, we use the symbol # to represent lambda.)

We then click on Run Lambda.

The Result is shown in the black output area of the GUI.

In addition another window pops up showing a parse tree illustrating the lambda application.
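The lambda application performed by this button can be sketched in a few lines of Python. Here apply_lambda is a hypothetical helper that does naive textual beta reduction (substituting the argument for the bound variable, with no alpha renaming), using # for lambda just as the GUI does:

```python
import re

def apply_lambda(func, arg):
    """Apply a '#x.BODY' lambda expression to arg by substituting arg
    for the bound variable in BODY (naive beta reduction, no renaming)."""
    var, body = func[1:].split('.', 1)  # strip leading '#', split off the variable
    return re.sub(r'\b%s\b' % re.escape(var), arg, body)

print(apply_lambda('#x.loves(vincent,x)', 'mia'))  # loves(vincent,mia)
```

This is only a toy sketch; the actual NL2KR implementation works on parsed lambda terms rather than raw strings.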

Slide 2

Here we show another example of Lambda application.

Slide 3

Here we show another example of Lambda application.

In this example, we use the symbol A to denote “for all” and the symbol > to denote “implication”.

The symbol @ represents functional application. Thus if F is a lambda function and G is an input to the lambda function F, then F(G) is written as F@G.
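Treating @ as ordinary (left-associative) function application, the notation can be mirrored with Python closures. The curried loves meaning below is purely illustrative:

```python
def app(f, g):
    """F@G: apply the lambda function f to the argument g."""
    return f(g)

# The meaning #y.#x.loves(x,y) written as a curried Python closure:
loves = lambda y: lambda x: 'loves(%s,%s)' % (x, y)

# loves@mia@vincent reads left-associatively as (loves@mia)@vincent:
print(app(app(loves, 'mia'), 'vincent'))  # loves(vincent,mia)
```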

Slide 4

Here we show another example of Lambda application.

Slide 4.1

The key innovation of our approach is Inverse Lambda. Using it, we are able to learn the meaning of new words when we know the meaning of a sentence or a phrase and the meanings of some words of that phrase/sentence.

In this example, we are given the meaning/translation of the sentence
“Vincent loves Mia” and the meaning of the words “Vincent” and “Mia”.

By using Inverse Lambda we will be able to find the meaning of the phrase “loves Mia” and the word “loves”.

The following two slides show the use of the button Run Inverse with respect to this example.

Slide 5

Here we show the Application of Inverse Lambda.

As we mentioned, this is the key innovation of our approach. It allows us to come up with lambda expressions of words whose meaning we may not have known before.

In other words, given F and G such that F = G@H,
Inverse Lambda determines H.

Similarly, given F and H such that F = G@H,
Inverse Lambda determines G.

In the GUI, F is denoted by Parent, G by Left and H by Right.

In the above example, Given Parent and Right, by clicking on Run Inverse we obtain Left.

In other words, knowing the meaning of the sentence “Vincent loves Mia” and the meaning of the word “Vincent”, Run Inverse gives us the meaning of the phrase “loves Mia”.

Slide 6

Here we show the continuation of the previous example of using Inverse.

Here, knowing the meaning of the phrase “loves Mia” and the meaning of the word “Mia”, Run Inverse gives us the meaning of the word “loves”.
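The two Run Inverse steps of this and the previous slide can be imitated with a toy textual sketch. The inverse_left helper below is hypothetical and handles only the simple case of abstracting a known constant; the real Inverse Lambda algorithm is far more general:

```python
import re

def inverse_left(parent, right, var):
    """Given parent = left@right, recover 'left' by abstracting the known
    argument 'right' out of 'parent' as the bound variable 'var'.
    Purely textual sketch: no alpha renaming, simple constants only."""
    body = re.sub(r'\b%s\b' % re.escape(right), var, parent)
    return '#%s.%s' % (var, body)

# "loves Mia" from the sentence meaning and "Vincent":
loves_mia = inverse_left('loves(vincent,mia)', 'vincent', 'x')
# "loves" from the phrase meaning and "Mia":
loves = inverse_left(loves_mia, 'mia', 'y')
print(loves_mia)  # #x.loves(x,mia)
print(loves)      # #y.#x.loves(x,y)
```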

Slide 6.1

We now give a different Inverse Lambda example.

Here we know the meaning of the sentence “Every boxer walks” and the meaning of the words “walks” and “boxer”.

In the following two slides we show how, using the Run Inverse button, we can learn the meaning of the phrase “every boxer” and the word “every”.

Slide 7

Here we show the example of using Run Inverse to learn the meaning of the phrase “every boxer” when we know the meaning of the sentence “Every boxer walks” and the meaning of the word “walks”.

Note that we use the symbol A to represent “for all” and the symbol > to represent “implication”.

Slide 8

Here we show the example of using Run Inverse to learn the meaning of the word “every” when we know the meaning of the phrase “every boxer” and the meaning of the word “boxer”.

Slide 9

In this slide we illustrate Generalization.

Given a lexicon of words, their CCG categories and their lambda expressions, when we give a new word and its category, the “Generalization” module looks for words in the lexicon that belong to the same category and uses their meanings, given as lambda expressions, to construct a meaning for the new word.

In this example, Generalization finds the meaning of the new word “Mia” based on its category N.

It finds that the word “Vincent” in the lexicon is of category N, and it generalizes the meaning of “Vincent” to give a meaning to the word “Mia”.

Slide 10

This shows that Generalization obtains the meaning of the new word “eats” (not in the lexicon) by using the meaning of the word “takes”, which is of the same category (S\NP)/NP.
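The idea can be sketched as follows. The lexicon layout and the generalize helper are hypothetical, but the substitution mirrors what Generalization does in this example:

```python
def generalize(lexicon, new_word, category):
    """Toy sketch of Generalization: find a known word of the same CCG
    category and reuse its lambda expression with that word's constant
    replaced by the new word."""
    for word, (cat, sem) in lexicon.items():
        if cat == category:
            return sem.replace(word, new_word)
    return None  # no word of this category in the lexicon

lexicon = {'takes': ('(S\\NP)/NP', '#y.#x.takes(x,y)')}
print(generalize(lexicon, 'eats', '(S\\NP)/NP'))  # #y.#x.eats(x,y)
```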

Slide 11

This shows that Generalization obtains the meaning of the new word “walks” (not in the lexicon) by using the meaning of the word “fights”, which is of the same category S\NP.

Slide 12

The CCG Parser button gives a CCG parse of a sentence. It uses an in-built dictionary based on the C&C and Stanford parsers.

If sentences do not parse correctly with this button, or do not parse at all, one can add additional syntax (consisting of words and their categories) in a file to override the in-built syntax.
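The two basic CCG combination rules such a parser builds on can be sketched as follows. Categories are kept as plain strings and parenthesized categories are not handled; this is an illustrative sketch, not NL2KR's parser:

```python
def combine(left, right):
    """Basic CCG application rules on string categories (toy sketch):
    forward  application: X/Y  Y    =>  X
    backward application: Y    X\\Y  =>  X
    """
    if left.endswith('/' + right):
        return left[:-(len(right) + 1)]
    if right.endswith('\\' + left):
        return right[:-(len(left) + 1)]
    return None  # the two categories do not combine by application

# e.g. an NP subject combining with a verb phrase of category S\NP:
print(combine('NP', 'S\\NP'))  # S
```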

Slide 13

Now we come to the main two modules of the NL2KR system: NL2KR-L and NL2KR-T.

The NL2KR-L module takes as input (a) a dictionary (lexicon) of words, their CCG categories and their meanings/translations given as lambda expressions; (b) training examples of sentences and their meanings/translations; and (c) a syntax file (containing words and their categories) to override the in-built syntax. It produces an enhanced lexicon in which new entries (words, their CCG categories and lambda expressions) are added.

In addition, the enhanced lexicon assigns a weight to each triple of word, CCG category and lambda expression, so as to address ambiguity. The weights are assigned so that probabilistic parsing of the training sentences maximizes the probability that the sentences in the training set are translated to their corresponding meanings.
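The role of the weights can be illustrated with a minimal sketch. Assuming, purely for illustration, a softmax over the weights of competing entries for the same word (NL2KR's actual estimation procedure may differ), raising an entry's weight raises the probability of derivations that use it:

```python
import math

def entry_probs(weights):
    """Softmax over the weights of competing lexicon entries for one word
    (an illustrative model, not NL2KR's actual parameter estimation)."""
    z = sum(math.exp(w) for w in weights)
    return [math.exp(w) / z for w in weights]

probs = entry_probs([0.5, 1.5])
print(probs)  # the higher-weighted entry gets the higher probability
```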

The NL2KR-T module takes a set of sentences and the enhanced lexicon mentioned above and translates the sentences. Currently, NL2KR-T also integrates correctness evaluation, and hence it takes both sentences and their expected translations so as to evaluate whether the computed translations match the expected translations.
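This integrated evaluation amounts to comparing each computed translation with its expected one. A minimal sketch, where translate is a stand-in for the NL2KR-T translation pipeline:

```python
def evaluate(pairs, translate):
    """Fraction of test sentences whose computed translation matches the
    expected translation (toy sketch of NL2KR-T's evaluation step)."""
    correct = sum(1 for sentence, expected in pairs
                  if translate(sentence) == expected)
    return correct / len(pairs)

# Stand-in translator with one known sentence (illustrative data only):
table = {'Mary drinks water': 'drinks(mary,water)'}
pairs = [('Mary drinks water', 'drinks(mary,water)')]
print(evaluate(pairs, lambda s: table.get(s)))  # 1.0
```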

In the next several slides we will first show how the NL2KR-L module takes a lexicon with only the meaning of the word “John”, a training set of two sentences and their meanings, and an empty syntax file (for overriding), and learns the meaning of the words “eats” and “rice” using a combination of Inverse Lambda and Generalization. These new meanings are added to the lexicon and an enhanced lexicon is obtained.

We then show how NL2KR-T takes this enhanced lexicon and correctly translates the sentences “Mary drinks water” and “Mary eats”. During translation it again uses Generalization, as the words “Mary”, “drinks” and “water” do not appear in the lexicon that was given to NL2KR-T as input.

Slide 13.1

This slide shows the content of:

* Lexicon given as the file dictionary.txt;

* Overriding syntax given as the file syntax.txt; and

* Training examples given as the file train.txt.

It also shows that dictfinal.txt is removed.

In several following slides we will show how NL2KR-L works on such input.

Slide 14

Given the meaning of the word “John” and the meanings of the sentences “John eats rice” and “John eats”, NL2KR-L first parses the sentence “John eats rice” and, knowing the meaning of “John”, uses Inverse Lambda to find the meaning of the phrase “eats rice”.

But it cannot immediately process “eats rice” further, as it does not know the meaning of either word.

It then uses Generalization to obtain the meaning of “rice”.

It then proceeds to the second example sentence “John eats”.

Further processing is shown in the next slide.

Slide 15

Slide 16

Now, when processing “John eats rice”, NL2KR-L knows one meaning each of “John”, “rice” and “eats”, but these meanings do not combine to give the expected meaning of the sentence; so it uses Inverse Lambda to find another meaning of “eats”.

Further processing is shown in the next slide.

Slide 17

In this slide parameter estimation is taking place.

Slide 18

Here parameter estimation is completed and the learned lexicon is evaluated against the training set. The evaluation shows that, using the learned lexicon, correct predictions are made for all the items in the training set.

Slide 19

This slide shows that after NL2KR-L is run, dictfinal.txt is created, containing 4 entries, each a 4-tuple of a word, its CCG category, a lambda expression and a weight.

It also shows the test set in the file test.txt that we will use in the NL2KR-T module shown in the next slide.

Slide 20

This slide shows the running of NL2KR-T, which uses the overriding syntax file syntax.txt (which, as shown in the previous slide, is empty), the enhanced lexicon in the file dictfinal.txt (whose content is shown in the previous slide), and the test set given as the file test.txt.

This slide shows that NL2KR-T correctly translates the sentence “Mary drinks water”; here the translation is matched with the expected translation given as part of the test.txt file.

Slide 21
