
Anaphoric (backward reference)
 
 
  [ # 16 ]
Patti Roberts - Feb 13, 2011:

...
I am always amused when someone asks my bot a question like “What is String Theory?” After my bot’s response they say
“your stupid”
On a good day he will remind them it’s “you’re” wink

Patti, you’re a GEM! I’m going to add that to Morti’s responses; something along the lines of, “And you use the phrase ‘your stupid’? At least I know how to SPELL!”

 

 
  [ # 17 ]
Hans Peter Willems - Feb 13, 2011:

He also didn’t have ‘every information’ available inside his brain and on several occasions had to ‘download’ information for processing smile

I also liked his ‘desire’ to ‘become more than I am’.

More than ‘the sum of my parts’; a reference, perhaps, to emergent semantics.

Or… more than just his “positronic relay” ... I laugh every time I hear that. It sounds so funny because relays are so old; even the ENIAC didn’t have them (it had vacuum tubes). So to juxtapose ‘relay’, something from the 1950s, with ‘positronic’, something we don’t even have yet... that’s amusing.

I hated the fact that he thought himself inferior. I mean, they make it clear that Lore, being completely ‘human’, is nasty! If I had made Star Trek, he would be proud to be a perfect android. But in one of the first episodes he *did* say, “I am better sir, in many ways.”

 

 
  [ # 18 ]

That particular quote, from the episode “Encounter at Farpoint”, always struck me as slightly out of character, because he referred to himself in an emotional context, expressing something he stated many times that he did not possess. The actual quote was, “I am superior, sir, in many ways. But I would gladly give it up to be human.” Perhaps it’s just a turn of phrase, but not one that Data would normally make; at least, not until he got his emotion chip installed. raspberry

Ok, enough nit-picking. Back to the discussion at hand. Move along, folks. Nothing to see here.

 

 
  [ # 19 ]

No no no no, there is no stopping me when the talk turns to Star Trek, especially TNG… hey… hey… let go of me… let go of me….

Actually, I asked Erwin to create an Entertainment section on the forum. These last few posts will be moved there, under, of course, a Star Trek TNG topic.

Ok….  You got me started on Data and Emotions. 

Yes, this makes no sense... or does it? It reminds me of a quote from Futurama, where Bender said…

              “You know, I have no human emotions… and that makes me very sad!”

Now, when Data says “I’d give it all up to be human”, well, that troubles me (like I mentioned, I would have hard-coded him to be proud of being a purely logical android, but that’s not the point).

Now, we could explain it away by saying he was “simulating emotions”, and that Dr. Noonien Soong, his creator, hard-coded him to simulate wanting to be human. I don’t like that explanation either, though.

NOW - the very very very best conversation Data ever had was in the episode “The Schizoid Man” (season 2), where he was talking with Dr. Ira Graves:

Ira: Do you know what desire is, Data?

Data: “Desire: to long for, to crave; a wish, a request…”

Ira: DO YOU **know** **WHAT** desire *IS*?!

Data: (very sadly) “No… I guess I will never really know.”

Now… what’s wrong with that picture?

Think about this for a bit, from both points of view.

First off, what if Data could feel emotions (in his own way) but didn’t express them like a human?

Just because he explained “desire” like a dictionary definition doesn’t mean he *really* didn’t know the meaning.

But Ira assumed, because of this rather robotic response (I guess), that he didn’t know.

And the real kicker is: how did Data know to reply “no, not really”?

I mean, logically I would’ve thought he would say (in his usual polite way), “Sir, I have just stated what desire is.” Perhaps it was the fact that Ira shouted at him that told him to say “no, not really”? It doesn’t make sense, though.

So… how will Morti reply if you ask him, “Do you know what desire is, Morti?”

Don’t shout at him if he gives a dictionary definition smile

 

 
  [ # 20 ]

The real issue in all this is that if we can encode ‘experience’, then we can also encode ‘feelings’. I have specifically included ‘feelings’ in my model (using the PAD model). So, looking at TNG, it almost always bothered me that Data wasn’t able (per the story) to express feelings. Lore, on the other hand, showed feelings as if having them means acting like an ‘out of control’ spoiled kid. I guess that making the character of Data believable as an android meant picturing him as a ‘machine-like human’.
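As a rough illustration of the idea (the class, the blend function and the numeric values below are invented for this example; they are not my actual model), a PAD state is just three values that get nudged around by experiences:

from dataclasses import dataclass

@dataclass
class PADState:
    # Hypothetical emotional state on the three PAD axes, each in [-1.0, 1.0].
    pleasure: float    # displeasure .. pleasure
    arousal: float     # calm .. excited
    dominance: float   # submissive .. dominant

def blend(current, event, weight=0.3):
    # Nudge the current state toward the state suggested by a new experience.
    mix = lambda a, b: a + weight * (b - a)
    return PADState(mix(current.pleasure, event.pleasure),
                    mix(current.arousal, event.arousal),
                    mix(current.dominance, event.dominance))

# Example: a neutral agent experiences something frightening (illustrative values only).
neutral = PADState(0.0, 0.0, 0.0)
fright = PADState(-0.6, 0.6, -0.4)
print(blend(neutral, fright))   # PADState(pleasure=-0.18, arousal=0.18, dominance=-0.12)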

I’d say that Sonny in I, Robot was much more believable. But in that case it was clear to the viewer that he’s a robot, so they could make him as ‘human’ as possible.

 

 
  [ # 21 ]

Thanks for the inspiration, Victor! smile Morti’ll have something cogent to say on the subject very soon. And it won’t be the dictionary definition, OR “no, not really”. raspberry

The idea of an entertainment section sounds like a good one. I’m all for it, but that’s an “Erwin thing”. I’m (sadly) not part of the design team.

 

 
  [ # 22 ]

I recently found this anaphora resolver: http://www.bart-coref.org/

I’ve implemented a simple agent to interface with it. Right now the interface is raw, but it can be improved to handle more natural queries like “who is…”

The program as it stands makes a lot of mistakes. But it’s a start, and it gives me an idea of what the statistical NLP researchers are doing. Where it makes mistakes, I want to program the agent wrapper to accept corrections and remember them…
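As a sketch of that idea (query_bart below is just a stand-in for however the agent actually invokes BART; it is not BART’s real API), the correction memory would sit in front of the resolver:

# Sketch of a correction memory in front of the resolver. query_bart() is a
# placeholder for whatever mechanism the agent uses to call BART.
corrections = {}   # sentence -> corrected list of (mention, chain_id) pairs

def query_bart(sentence):
    # Placeholder: invoke the real BART coreference resolver here.
    raise NotImplementedError

def find_corefs(sentence):
    # Prefer a remembered correction over BART's raw output.
    if sentence in corrections:
        return corrections[sentence]
    return query_bart(sentence)

def correct(sentence, mentions):
    # Store the user's corrected coreference chains for this sentence.
    corrections[sentence] = mentions

# Usage: after BART wrongly links "the bananas" with "they", the user overrides it.
correct("We gave the monkeys the bananas because they were hungry",
        [("the monkeys", 0), ("they", 0)])
print(find_corefs("We gave the monkeys the bananas because they were hungry"))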

The following dialog shows how the BART agent handles the examples cited. The numbers in parentheses are the coreference chain IDs (for cases where there is more than one coreference chain).

> find corefs: “We gave the monkeys the bananas because they were hungry”
the bananas (0), they (0)

[So it gets this one wrong.]

> find corefs: “We gave the monkeys the bananas because they would have gone to waste anyway”
the bananas (0), they (0)

> find corefs: Jack stopped by because he owes me money.
Jack (0), he (0)

> find corefs: Jack called Tom because he owes me money.
Tom (0), he (0)

- - -

Other examples from the Wikipedia page:

> find corefs: The monkey took the banana and ate it.
the banana (0), it (0)

> find corefs: Pam went home because she felt sick.
Pam (0), she (0)

> find corefs: The dog ate the bird and it died.
the bird (0), it (0)

> find corefs: The Prime Minister of New Zealand visited us yesterday. The visit was the first time she had come to New York since 1998.
The Prime Minister (0), she (0)

> find corefs: We gave the bananas to the monkeys because they were hungry.
the monkeys (0), they (0)

[Compare to the first example…]

- - -

> find corefs: So, BART is an interesting attempt, a tool I want to make use of, but it is not the complete answer. Where I would like to improve it (as mentioned) is to have it learn from feedback that corrects it when it makes mistakes.
a tool (1), I (0), it (1), I (0), it (1), it (1), it (1), it (1)

 

 
  [ # 23 ]

Neat find, Robert. It looks like BART consistently chooses the last noun phrase before the pronoun. The only exception is the Prime Minister example, but in that case the pronoun was specifically a person (“she”). Something like this would be a good default pronoun selector in cases where the bot has no contextual information.

Augmenting such a statistical scheme with contextual weighting could produce a powerful coreference tool. For example, a bot that knew monkeys could be hungry would be able to correct the first example. It wouldn’t even need to know that bananas can’t be hungry, although that would boost the bot’s confidence in its assignment choice.
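A rough sketch of what I mean (the candidate list and the plausibility scores are supplied by hand here; a real bot would get them from a parser and its knowledge base):

# Sketch: pick an antecedent for a pronoun from candidate noun phrases.
# Default is the nearest preceding noun phrase (BART-like); optional
# plausibility scores from world knowledge can override that default.
def resolve_pronoun(candidates, plausibility=None):
    # candidates: noun phrases in order of appearance, nearest last.
    # plausibility: optional dict mapping a candidate to how likely it is
    # to satisfy the predicate (e.g. "could be hungry").
    best, best_score = None, float("-inf")
    for distance, np in enumerate(reversed(candidates)):
        score = -distance                                 # prefer the nearest NP
        if plausibility:
            score += 10 * plausibility.get(np, 0.0)       # world knowledge wins
        if score > best_score:
            best, best_score = np, score
    return best

# "We gave the monkeys the bananas because they were hungry."
candidates = ["the monkeys", "the bananas"]
print(resolve_pronoun(candidates))                         # "the bananas" (default, like BART)
print(resolve_pronoun(candidates, {"the monkeys": 1.0}))   # "the monkeys" (knows monkeys get hungry)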

 

 
  [ # 24 ]

> find corefs: Jack called Tom because he owes me money.
Tom (0), he (0)
> find corefs: The dog ate the bird and it died.
the bird (0), it (0)

I’d prefer to ask a question in these cases, because I am not certain of these selections.

PS: Are you going to enter the CBC?

 
