AI Zone Admin Forum

GRACE/CLUES
 
 
  [ # 31 ]

Dealing with nested and/or/but statements bothered me for a while, but I think I’ve developed a pretty elegant solution. Not really using nested lists, but by organizing the way parse trees are linked together. I really should start a thread one of these days wink

 

 
  [ # 32 ]
C R Hunt - Feb 13, 2011:

Dealing with nested and/or/but statements bothered me for a while, but I think I’ve developed a pretty elegant solution. Not really using nested lists, but by organizing the way parse trees are linked together. I really should start a thread one of these days wink

I was pulling my hair out over this for a bit, but that is long past (I actually figured it out in Feb ‘09).

CLUES now handles things like the following nicely:

(adjective-1) (noun-1), (noun-2), (noun-3), (adverb-1), (adverb-2) (adjective-3) (noun-4), (adjective-4) (adjective-5) (noun-5)

so it knows, for example, that noun-4 is modified by adjective-3 (which in turn is modified by adverbs 1 and 2).

And a compound noun like that can serve as:

* subject noun
* predicate noun
* indirect object of verb
* direct object of verb
* object of preposition
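A minimal sketch of the idea described above — linking word nodes so that modifiers attach to the words they modify, rather than nesting lists. This is my own illustration, not the actual CLUES code; the `Node` class and its fields are hypothetical:

```python
# Hypothetical sketch: each word node keeps a list of the nodes that modify it.
class Node:
    def __init__(self, word, pos):
        self.word = word
        self.pos = pos          # "noun", "adjective", or "adverb"
        self.modifiers = []     # nodes that modify this one

    def modified_by(self, other):
        self.modifiers.append(other)
        return self             # allow chaining

# Mirroring the pattern above: adverbs 1 and 2 modify adjective-3,
# which in turn modifies noun-4.
adv1 = Node("very", "adverb")
adv2 = Node("extremely", "adverb")
adj3 = Node("red", "adjective").modified_by(adv1).modified_by(adv2)
noun4 = Node("ball", "noun").modified_by(adj3)

assert adj3 in noun4.modifiers
assert adv1 in adj3.modifiers and adv2 in adj3.modifiers
```

Because a compound noun built this way is just a node with links, the same structure can then fill any of the grammatical roles listed above.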

 

 
  [ # 33 ]

Dropping a small update.

So this past weekend I finally got Grace to understand how to handle anaphoric resolution.  The backward case is handled first; there will be other forms of anaphoric resolution to handle later.

Basic ambiguity detection is handled.  In a complex sentence, CLUES knows that if there is only a single subject noun in the subordinate clause, and it is a singular pronoun (‘he’ or ‘she’, for example), and the subject of the main clause is also a single singular proper noun, it can assume that that same subject (of the main clause) also performed the predicate(s) of the subordinate clause.  When more than one person is named in the subject of the main clause, it knows it has ambiguity and may ask (later it will take hints from the conversation state). 

On the other hand, even if two subjects are named in the main clause but there is a single pronoun as the subject of the subordinate clause, it can still do a resolution if the genders differ.  For example, given “Bob called Cathy because he wanted to ask her out”, the engine knows it can deduce he = Bob (and likewise her = Cathy).

If it encounters something like “John was unhappy because she didn’t make it to the dance”, knowing that John is male, it would know it cannot resolve ‘she’ and thus must ask ‘Who is “she”?’ 

Again, later it will take information from the conversation state to resolve ‘she’ (the last-mentioned female in the conversation, perhaps).
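The gender heuristic described above can be sketched in a few lines. This is a hypothetical illustration of the logic, not Victor's actual engine code; the function and candidate format are my own assumptions:

```python
# Hypothetical sketch of the gender-matching heuristic for pronoun resolution.
PRONOUN_GENDER = {"he": "male", "him": "male", "she": "female", "her": "female"}

def resolve_pronoun(pronoun, candidates):
    """candidates: list of (name, gender) pairs from the main clause.
    Returns the unique antecedent matching the pronoun's gender,
    or None when the reference is ambiguous or unresolvable."""
    gender = PRONOUN_GENDER.get(pronoun.lower())
    matches = [name for name, g in candidates if g == gender]
    return matches[0] if len(matches) == 1 else None

# "Bob called Cathy because he wanted to ask her out" -> her = Cathy
print(resolve_pronoun("her", [("Bob", "male"), ("Cathy", "female")]))  # Cathy

# "John was unhappy because she didn't make it to the dance"
# -> no female candidate, so the engine must ask 'Who is "she"?'
print(resolve_pronoun("she", [("John", "male")]))  # None
```

The `None` result is exactly the ambiguity signal: the engine either asks the user or, later, falls back on the conversation state (e.g. the last-mentioned female).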

 

 
  [ # 34 ]

You might find this interesting:
An Algorithm for Pronominal Anaphora Resolution
http://acl.ldc.upenn.edu/J/J94/J94-4002.pdf

 

 
  [ # 35 ]

Merlin,

Many thanks; this document seems, so far, to be a very good reference.

Even if I don’t use the same algorithm, it will provide a good set of test cases for my existing one.
Anaphoric resolution is, after all, only one of many milestones to reach.  Thanks again!

 

 
  [ # 36 ]

Some of the design considerations of Grace and her engine (CLUES) match up fairly closely with what is discussed here:

http://plato.stanford.edu/entries/chinese-room/

I also gave this link in the “Chinese room” thread I created.

Two main approaches have developed that explain meaning in terms of causal connections. The internalist approaches, such as Schank’s conceptual representation approach, and Conceptual Role Semantics, hold that a state of a physical system gets its semantics from causal connections to other states of the system. Thus a state of a computer might represent “kiwi” because it is connected to “bird” and “flightless” nodes, and perhaps also to images of prototypical kiwis. The state that represents the property of being “flightless” might get its content from a Negation-operator modifying a representation of “capable of airborne self-propulsion”, and so forth, to form a vast connected conceptual network, a kind of mental dictionary.
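The quoted “mental dictionary” idea is easy to make concrete. Below is a toy illustration of my own (the node names and link types are assumptions, not anything from Schank or CLUES): a concept gets its content purely from its causal/structural links to other concepts.

```python
# Toy conceptual network: a concept's "meaning" is just its links to other nodes.
network = {
    "kiwi": {"isa": ["bird"], "property": ["flightless"]},
    "flightless": {"negation_of": ["capable of airborne self-propulsion"]},
    "bird": {"isa": ["animal"]},
}

def related(concept):
    """Return every concept directly linked to `concept`, across all link types."""
    return [c for links in network.get(concept, {}).values() for c in links]

print(related("kiwi"))         # ['bird', 'flightless']
print(related("flightless"))   # ['capable of airborne self-propulsion']
```

Here “kiwi” has no meaning in isolation; it only represents anything by virtue of its connections to “bird” and “flightless”, exactly as the internalist approaches in the quote describe.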

 

 

 
  [ # 37 ]

With Grace starting her life out with full ability to deal with completely free form NL, the definition of those state nodes will be very rich smile

 

 
  [ # 38 ]
Victor Shulist - Mar 18, 2011:

...to form a vast connected conceptual network, a kind of mental dictionary.

Oh wait, where did we hear that before…... smile

 

 
  [ # 39 ]

Now combine THAT with a solution for ‘symbol grounding’ and you have the solution for the ‘hard problem’ (re: David Chalmers).

 

 
  [ # 40 ]
Hans Peter Willems - Mar 18, 2011:

Now combine THAT with a solution for ‘symbol grounding’ and you have the solution for the ‘hard problem’ (re: David Chalmers).

I’m not 100% convinced of the need for grounding - for a text-based chatbot, that is. 

You’re going for the Robotic TT, so I see your point of view.  I’m going to see how far the “semantic network” idea, as mentioned in that paper takes me.  If true external world grounding is needed, I can ‘cross that bridge when I get to it’ smile

Hans Peter Willems - Mar 18, 2011:
Victor Shulist - Mar 18, 2011:

...to form a vast connected conceptual network, a kind of mental dictionary.

Oh wait, where did we hear that before…... smile

smile

 

 
  [ # 41 ]

Victor, I think you misunderstand the ‘symbol grounding problem’ as it has very little to do with Robotics or even ‘physical phenomena’. It goes towards instating ‘consciousness’ in a machine. I suggest you read up on the ‘Hard problem’, ‘cognitive phenomena’ and ‘Qualia’. David Chalmers has done a lot of work in this area, some of the links I posted elsewhere are pointing to his work.

 

 
  [ # 42 ]

No, I know it has nothing to do with robotics.

I’m just thinking that perhaps you believe it is necessary for all of an AI’s symbols, words, whatever, to be grounded in order to achieve your goal of consciousness, and also for the purposes of an AI mind controlling a robot, because you have mentioned that you intend to use your AI for robotics.

 

 
  [ # 43 ]

..... or rather you see consciousness as a prerequisite for symbol grounding to exist, and symbol grounding as a prerequisite for understanding.

 

 
  [ # 44 ]

...... why on Earth would I think symbol grounding has to do with robotics or physical phenomena??!

 

 
  [ # 45 ]
Victor Shulist - Mar 19, 2011:

I’m just thinking that perhaps you believe it is necessary for all of an AI’s symbols, words, whatever, to be grounded in order to achieve your goal of consciousness

It seems that almost every important researcher in the field of AI consciousness agrees that this IS the ‘hard problem’. I just tend to agree with that.

Victor Shulist - Mar 19, 2011:

... and also for purposes of an AI mind to control a robot.

Again this strange jump. NO, it is not necessary to control a robot. YES, it IS necessary to control a CONSCIOUS robot. My aim is to build a conscious AI-mind (i.e. solving the ‘hard problem’, also known as the ‘holy grail in AI research’).

Victor Shulist - Mar 19, 2011:

Because you have mentioned that you intend on using your AI for robotics.

That conscious AI-mind COULD be built into a robot (to give it a body), but this is not a prerequisite for attaining conscious AI in the first place. What I HAVE said, however, is that sensory input (which is NOT the same as ‘robotics’) is important for consciousness to exist (again, most researchers do agree on this).

 
