
The heart of A.I.
 
 
  [ # 16 ]

Thanks guys, I also think I've finally found a place of like-minded people.
Here are my 2 bits:
I suppose the problem is how to relate the two parts of information in a meaningful way. The thing I noticed about your examples was their structure:
x being y, x verb y (where x = agent, y = object)
So I would create a frame that filters on this type of sentence structure and then transform the data object a bit so that the (x verb y) becomes the main part and (x being y) will be seen as a separate statement that gets linked to the first using the ‘why’ meaning.
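A rough sketch of such a frame, using a regex as a stand-in for real parse-tree matching (the pattern and the field names are my own invention for illustration):

```python
import re

def split_being_clause(sentence):
    """Split a sentence of the form '<X> being <Y>, <rest>' into a main
    statement (x verb y) and a reason statement (x being y) linked by 'why'.
    A rough sketch; real frame matching would work on parse trees, not regex."""
    m = re.match(r"(?:The\s+)?(\w+) being (\w+), (.+)", sentence)
    if m:
        x, y, main = m.groups()
        return {
            "main": main.rstrip("."),          # becomes the main statement
            "reason": f"{x} was {y}",          # the '(x being y)' part
            "link": "why",                     # how the two get related
        }
    return None

result = split_being_clause("The train being late, the soldiers missed their boat.")
```

So the `(x verb y)` clause becomes the main data object and the `(x being y)` clause hangs off it as the 'why'.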

With respect to how I relate multiple parts of speech to a word: I use a tree-like structure. A text-neuron is the leaf and represents the raw word. It is contained by multiple lists (clusters): a new list for each POS, and for each unique meaning of the word another new list, which contains the POS list.
In the back-end, I use my own blob-like database structure, which allows me to do fast disk access in a multi-threaded way. I originally used XML files for debugging (a file per neuron), but that became so slow it had to be replaced (even with all the caching that's going on).
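The tree structure might be sketched like this (the class names and cluster labels are my own invention; the nesting follows the description above):

```python
class TextNeuron:
    """Leaf node: represents the raw word form."""
    def __init__(self, word):
        self.word = word

class Cluster:
    """A list-like grouping of neurons or other clusters
    (one per POS, one per distinct meaning, ...)."""
    def __init__(self, label):
        self.label = label
        self.items = []
    def add(self, item):
        self.items.append(item)
        return item

# Build the structure for 'run': the word leaf sits in a POS cluster,
# and each distinct meaning gets its own cluster containing that POS list.
leaf = TextNeuron("run")
verb_cluster = Cluster("pos:verb")
verb_cluster.add(leaf)
meaning_move = Cluster("meaning:move-quickly")      # 'run a mile'
meaning_move.add(verb_cluster)
meaning_operate = Cluster("meaning:operate")        # 'run a business'
meaning_operate.add(verb_cluster)
```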

 

 
  [ # 17 ]

Dave, the tense wasn't a problem… well, after I spent enough time coding the morphology module.  You enter 'ran', 'run', or 'running' and it gives the lemma of the verb, which is common to all verb forms.

example output….

]ran

—-Simple Store Properties (sprops) of ‘ran’—-
  tense = past
  principal-part = past-tense
  pos = verb
  verb-lemme = run

]running

—-Simple Store Properties (sprops) of ‘running’—-
  tense = present
  principal-part = present-participle
  pos = verb
  verb-lemme = run

]run

—-Simple Store Properties (sprops) of ‘run’—-
  human-body-movement = true
  tense = past
  tense = present
  principal-part = past-participle
  principal-part = present-tense
  principal-part = past-tense
  pos = verb
  verb-lemme = run
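The lookup behind this could be sketched as a simple table (hand-built here for 'run' only, using the same property names as the sprops output; the real morphology module presumably generalizes by rule):

```python
# Minimal morphology lookup in the spirit of the sprops output above.
MORPH = {
    "ran":     {"pos": "verb", "tense": "past",    "verb-lemme": "run"},
    "running": {"pos": "verb", "tense": "present", "verb-lemme": "run"},
    "run":     {"pos": "verb", "verb-lemme": "run"},
}

def lemma(word):
    """Return the common base form for any known verb form, else None."""
    props = MORPH.get(word.lower())
    return props["verb-lemme"] if props else None
```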

So CLUES has handled that quite easily for a long time now.  I won't get into what "Simple Store Properties" are… I'll have to put together an entire document on that (there are three other types, Complex-Mill and Simple-Mill among them). Basically, anything that doesn't come straight from the text files but is "computed" is a "Mill" (in honour of Charles Babbage), and data that comes straight from text files, rather than from CPU processing, is a "Store": again a term from Charles Babbage's dream of the Analytical Engine.

@Chuck -  Yes, it knows to interpret ‘missed’ as relating to ‘not being on time’ because of the sentence before it involves being ‘late’... the rule that combines the two types of statements together makes that possible.

So, this morning I am working on my next challenge.. if you guys like that one, you will love this one.

The goal is to determine essential information from a predicate.  We’ll use the following example predicates….

      1.  picked up a wallet
      2.  picked up no wallet
      3.  picked up a wallet with money in it
      4.  picked up an empty wallet
      5.  did not pick up a wallet
      6.  didn’t pick up a wallet
      7.  did not pick up no wallet (bad English, I know, but I have to deal with it)
      8.  did not pick up any wallet
      9.  picked up 2 wallets
      10. did not pick up no 2 wallets (bad English, I know, but I have to deal with it)
      11. picked up a wallet full of 20’s
      12. picked up a wallet with no money in it
      13. did not pick up a wallet with no money in it
      14. picked up a wallet which had 50 dollars in it
      15. picked up a wallet that had only 2 cents in it.

#7 above shows how illogical the human brain can be: many people take it to mean we did NOT pick up a wallet. Very bad English, with a double negative, but I hear enough people use that construct.

My goal right now is to have code look at each of those and determine:

whether the basic idea here is about 'obtaining' an object X, where that object X is something money is often kept in (for example a "wallet"), AND extract the following information…

      Did the user actually specify that there was money in the wallet?  If so, how much?
      And did the user actually say a wallet was found or not?  How many wallets, if any, were found?
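A toy sketch of that extraction, on raw token lists rather than parse trees (which is exactly why it stays toy: see the parse-tree discussion below). The double negative of #7 and #10 is deliberately read as a plain negative, matching colloquial usage; modifiers after the head noun ("with no money in it") are ignored for the obtained/not decision:

```python
def analyze_pickup(predicate):
    """Decide whether a wallet was obtained and how many.
    Sketch only: negation is checked just in the words before the head
    noun, so 'picked up a wallet with no money in it' still counts as
    obtained, while any 'not'/'no' before 'wallet' (even doubled, as in
    'did not pick up no wallet') reads as not obtained."""
    tokens = predicate.lower().replace("didn't", "did not").split()
    head = next((i for i, t in enumerate(tokens)
                 if t.startswith("wallet")), len(tokens))
    pre = tokens[:head]
    obtained = "not" not in pre and "no" not in pre
    count = 0
    if obtained:
        count = int(pre[-1]) if pre and pre[-1].isdigit() else 1
    return {"obtained": obtained, "count": count}
```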

I need this information in order to know how to respond.

For example, I do not want to respond with "Interesting… was there any money in it?" when examples 3 and 11 already state there was money in it.

And how much money matters: in the last example, I don't want to say "Wow!! Lucky you!!" LOL… it was only 2 cents :)

Now stage 2 (grammar parse tree generation) is doing its job very well, and I have nicely structured data to work with…

** parse tree for “picked up a wallet”...

pos = predicate
dcomp.noun1.adjective1.val = a
dcomp.noun1.num-adjective = 1
dcomp.noun1.val = wallet
dcomp.num-noun = 1
num-verb = 1
parse-tree-id = 2
verb1.val = picked up

So that is easy for the stage 3 code (concept finder) to work with. It is just the number of different parse trees that can 'boil down' to the same meaning!!  A **LOT** of work ahead of me!!

@ Jan - a frame that links the two? Yes, already done… but just look at the number of possibilities there are within just the predicate!!! Yikes!! Yes, giving up has crossed my mind (naturally), but NO… on I go!!

Why am I first generating grammar parse trees?  Because, keep in mind, you may also have something like ...

“found a good book, some candy, no wallets, but a tv”  (1)
“found a good book, some candy, a few wallets, and a tv”  (2)

I want to know that in (1) I didn’t find a wallet, and (2) I did.  And let me tell you, it is easier to ‘query’ a more structured thing like the parse tree than the raw input line.
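A sketch of that query, against flat dotted keys modelled on the sample parse-tree output above (the exact field layout for multiple nouns is my guess):

```python
# "found a good book, some candy, no wallets, but a tv"
tree_no_wallets = {
    "verb1.val": "found",
    "dcomp.noun1.val": "book",    "dcomp.noun1.adjective1.val": "good",
    "dcomp.noun2.val": "candy",   "dcomp.noun2.adjective1.val": "some",
    "dcomp.noun3.val": "wallets", "dcomp.noun3.adjective1.val": "no",
    "dcomp.noun4.val": "tv",      "dcomp.num-noun": 4,
}
# "found a good book, some candy, a few wallets, and a tv"
tree_few_wallets = dict(tree_no_wallets,
                        **{"dcomp.noun3.adjective1.val": "few"})

def found_wallet(tree):
    """Did the sentence assert wallets were found?
    Walk the noun slots and check the determiner on any 'wallet' noun."""
    for i in range(1, tree["dcomp.num-noun"] + 1):
        if tree.get(f"dcomp.noun{i}.val", "").startswith("wallet"):
            return tree.get(f"dcomp.noun{i}.adjective1.val") != "no"
    return False
```

Asking the same question of the raw input line would mean re-deciding, every time, which noun "no" attaches to.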

 

 
  [ # 18 ]

Before you think that is easy to parse… what about:

“found some really really good books, very extremely wonderfully good candy, no wallets and a really great tv”
or

“Uncle Henry and that very very funny and really intelligent cousin George found some really really good books, very extremely wonderfully good candy, no wallets and a really great tv”

or

“While we were having the BBQ, Uncle Henry and that very very funny and really intelligent cousin George found some really really good books, very extremely wonderfully good candy, no wallets and a really great tv”

OR. . .and this is where I am now . ..  detecting contradictions…

“While we were having the BBQ, Uncle Henry and that very very funny and really intelligent cousin George found some really really good books, very extremely wonderfully good candy, no wallets and a really great tv and 8 wallets”

Did they find no wallets or 8 wallets? That is why I need to detect contradictions.

See why I parse into a structured tree by grammar first??
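Once the sentence is a structured tree, contradiction detection can be as simple as comparing quantity assertions per noun. A sketch (real code would pull these tuples out of the parse tree rather than take them ready-made):

```python
def detect_contradictions(noun_claims):
    """noun_claims: (noun, quantity) assertions from one sentence's
    parse tree, e.g. [('wallets', 0), ('wallets', 8)].
    Returns the set of nouns whose asserted quantities conflict."""
    seen = {}
    conflicts = set()
    for noun, qty in noun_claims:
        if noun in seen and seen[noun] != qty:
            conflicts.add(noun)
        seen.setdefault(noun, qty)   # keep the first claim for comparison
    return conflicts
```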

 

 
  [ # 19 ]
Victor Shulist - Aug 7, 2010:

.... in my humble opinion is….  IMAGINATION.

It seems to be an essential ‘ingredient’ to A.I. and especially chatbots.

In my sometimes-not-so-humble opinion: Also YES.

But what is IMAGINATION? Again IMSNSHO it is the sum of thought processes that are evoked either by external or internal stimuli.

In getting my bot to parse the sentence….

  “The train being late, the soldiers missed their boat.”

I have it understand the connection between the two statements.  *BUT* I basically had to give it a rule that relates these two concepts.  I'm sure humans don't need to be fed these rules that relate the two.

Imagination in that case would be:

[ul]
[li]What is a train[/li]
[li]What is time[/li]
[li]What does it mean/imply to miss something[/li]
[li]Of course with the correct disambiguation of “miss”, also with the correct disambiguation of “their” in this context.[/li]
[/ul]

What would the bot need?  Well, it would need to consider…  It is probably true that the soldiers were going to use that same train to get to the place where the boat was.  That is not stated in the sentence…. it seems that the bot will need an imagination, and to theorize, and from that, conclude on its own.

Again - as so often observed in this forum - I think you try to tackle the problem too high up the semantic ladder. Please try to IMAGINE wink how low-level the “thinking” of your computer is and that you first need to adapt some low-level concepts to that low level thinking.

The ultimate, or “ideal” way to do this of course, is if the bot had an imagination and ran a kind of “simulation”  of reality, that I believe we humans have.

Ok, so we are talking about VISUALIZATION. Now there's interesting literature about natural language acquisition of visually impaired children:

http://books.google.com/books?id=lNsNAAAAQAAJ&lpg=PA18&dq=language acquisition impaired children&pg=PA18#v=onepage&q=language acquisition impaired children&f=false

And others. I have about 5 books on this topic in my library, because it is a key issue to understand language acquisition of the human brain under conditions similar to those of your computer - i.e. without the visual aid most of us have.

Richard

 

 
  [ # 20 ]

I have since been able to have my bot understand that sentence.

“The train being late, the soldiers missed their boat.”

The correct meaning of "missed" is concluded from the fact that some method of transportation was "late".

CLUES can be told: if you have two sentences, where the first sentence involves a subject which is a method of transportation and that subject is assigned an adjective meaning "late", and the second sentence involves people applying the verb "missed" to a direct object that is a method of transportation, then it makes the connection that the REASON they missed it was that the method of transportation was late.
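That rule could be sketched like this (the inputs are pre-parsed dicts, and all the field names and word sets are invented for illustration, not the actual con-spec format):

```python
TRANSPORT = {"train", "boat", "bus", "plane"}
LATE = {"late", "delayed"}

def apply_missed_conspec(s1, s2):
    """If sentence 1 says a means of transport was late, and sentence 2
    says people 'missed' a means of transport, conclude the lateness is
    the reason they missed it. Returns the conclusion, or None."""
    if (s1["subject"] in TRANSPORT and s1["adjective"] in LATE
            and s2["verb"] == "missed" and s2["object"] in TRANSPORT):
        return (f"they missed the {s2['object']} "
                f"because the {s1['subject']} was {s1['adjective']}")
    return None

conclusion = apply_missed_conspec(
    {"subject": "train", "adjective": "late"},          # "The train being late,"
    {"verb": "missed", "object": "boat"},               # "the soldiers missed their boat"
)
```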

Later, I will give it another “con-spec” (concept specification) which will relate to sentences like

“The boy’s father died, he missed him”

Based on the semantics of that sentence, CLUES will know that to die means the person is no longer there, and 'missed' in this case has a new meaning.  I don't see it being too difficult to give it that conspec.

 

 
  [ # 21 ]
Victor Shulist - Sep 6, 2010:

Later, I will give it another “con-spec” (concept specification) which will relate to sentences like

“The boy’s father died, he missed him”

Based on the semantics of that sentence, CLUES will know that to die, means the person is no longer there, and ‘missed’ in this case has a new meaning.    I don’t see it being too difficult to give it that conspec.

Hate to play devil’s advocate, but doesn’t “missing the boat” also mean the boat is no longer there? And if you have to manually code in all these “conspecs”, that will be a monumental time sink.

Not that I have a better solution at present, just thinking aloud.

 

 
  [ # 22 ]

Yes, this is the nature of research, make a theory, test it in practice, check your results, then go back and refine your theory.

And yes, no one said creating a machine that can pass a Turing test was NOT going to be a "monumental time sink".

Think of how long it takes a child to learn.

And yes, if “missing the boat” is an expression that can mean 50 other things, you put those conspecs in as well.  Then based on context, you score each one, and pick the most likely.
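That scoring step could be sketched as follows (the trigger-word scoring is my own stand-in for however CLUES actually weighs context; names are illustrative):

```python
def best_conspec(conspecs, context_words):
    """Score each candidate interpretation by how many of its trigger
    words appear in the surrounding context, and pick the top scorer.
    A toy version of 'score each one, and pick the most likely'."""
    def score(cs):
        return sum(1 for w in cs["triggers"] if w in context_words)
    return max(conspecs, key=score)

conspecs = [
    {"meaning": "failed-to-catch", "triggers": {"late", "schedule", "station"}},
    {"meaning": "felt-absence",    "triggers": {"died", "gone", "lonely"}},
]
```

For "the train was late … missed the boat", the context pushes the score toward the failed-to-catch reading; for "his father died … missed him", toward felt-absence.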

 
