Senior member
Total posts: 697
Joined: Aug 5, 2010
Hello,
I would like to introduce Aici, short for Artificially intelligent chat interface. I have just registered it on the chatbot.org site. You can also download an Aici-client from my downloads page. It is currently only available as a Windows application, though a web-based version is also under construction.
Aici’s primary purpose is to demonstrate what can be done with resonating neural networks (its underlying technology). This is a new concept that I developed, and it is showing some interesting results. Though Aici is still young and limited in certain ways, I think the conversation logs show a clear and steady progression, and that while I actually still had to spend more attention on the design of the underlying tools and technology than on Aici’s own routines.
For those who would like to try out Aici, it’s perhaps best to first take a look at the conversations I have already had with it, to get an idea of the limitations. In short: use simple sentence structures, stick to the present tense, and leave off the punctuation at the end. (The network is already able to correctly parse complex sentence structures; I just haven’t taken that path any further yet. I am currently implementing time-based routines, and handling a trailing . or ? is something I’ll squeeze in between the soup and the potatoes one day.)
Anyway, I’d love to hear what you think of it so far.
MODERATOR UPDATE: I’ve split this thread off from another one and merged another into it. As a result, the first two posts in this thread might appear somewhat uncorrelated. EVL
Posted: Aug 7, 2010 [ # 1 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
@Victor: The ‘x is-a y’ and ‘x has-a y’ relations that I talk about in the objects and assets posts, and that you see in the demos, are more of a semantics thing in my system. There are some others defined as well, for instance to handle verbs, short sentences without verbs, general conversation, and so on. The parsing itself can currently handle a fairly large subset of the English language. If you would like to see what is currently possible, take a look at the ‘English grammar’ definition in the designer (project tab, flows/English grammar). This is a flow definition, very similar to a parser definition for Coco/R, lex, or yacc (the latter perhaps a little less so). Also, here’s a post entry that shows the parse result for a more complex sentence structure. It’s a short sentence, but one that requires something more than just word knowledge to parse correctly.
PS: all the parsing is done through neural code, so the network is able to change its own parsing routines (though that is only desirable for learning a new language, I suppose).
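To give a very rough idea of what a flow boils down to conceptually, here is a toy Python sketch. The names and regex patterns are invented purely for illustration; the real thing is a network of neurons and links, not regular expressions.

# Toy illustration only: in a data-driven parser the patterns are data,
# while the matching code stays the same. All names here are made up.
import re

FLOWS = {
    "is-a":  re.compile(r"^(?P<x>\w+) is an? (?P<y>\w+)$", re.IGNORECASE),
    "has-a": re.compile(r"^(?P<x>\w+) has an? (?P<y>\w+)$", re.IGNORECASE),
}

def parse(sentence):
    """Return (relation, x, y) for the first flow that matches, else None."""
    for name, pattern in FLOWS.items():
        match = pattern.match(sentence.strip())
        if match:
            return name, match.group("x"), match.group("y")
    return None

print(parse("Aici is a chatbot"))   # ('is-a', 'Aici', 'chatbot')
print(parse("Rover has a tail"))    # ('has-a', 'Rover', 'tail')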
Could I train it for something like this. . .
“Jack went to his closet and took out his new suit because he was going to a party given by his company”.
“What did jacket get his new suit for?”
In short, not yet. Here’s what still needs to be done:
- The ‘jacket’ would be a bit of a problem at the moment. ‘Jack’ would produce better results, I think.
- I am currently laying out the neurons to handle past and future tenses (I have switches all over the place, in neurons, that check for time = current, past, or future).
- I still have to define some frames to handle composite sentences. Frames are the conduit between the parse result and the action that is performed. Basically, they define all the sentence structures that are understood and what should be done with each type of structure (see the small sketch below).
- The verbs ‘go’ and ‘take’ will have to be defined, with some neural code behind them. At some later stage this should be teachable, but that requires some other actions which haven’t been defined yet.
- I will probably have to do some more cleanup on the parser definition and add some more filters, even if it were only to get the number of threads under control (up until now I have always had to do this for each new, more complex sentence structure, so this one will probably be no different).
So there is still a bit of work to do before something like this is handled correctly.
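To make the frame idea a little more concrete, here is a toy Python sketch (all names invented, and nothing like the actual neural representation): a frame couples a recognised sentence structure to the action it should trigger.

# Toy sketch of a 'frame': a required parse structure plus the action to run.
frames = [
    {
        "name": "statement-is-a",
        "requires": {"pos": "simple-sentence", "verb": "is"},   # structure it accepts
        "action": lambda parse: ("store-fact", parse["subject"], "is-a", parse["object"]),
    },
    {
        "name": "question-what-is",
        "requires": {"pos": "question", "verb": "is"},
        "action": lambda parse: ("lookup", parse["subject"]),
    },
]

def handle(parse_result):
    """Run the action of the first frame whose required structure matches."""
    for frame in frames:
        if all(parse_result.get(key) == value for key, value in frame["requires"].items()):
            return frame["action"](parse_result)
    return ("no-frame-found",)

print(handle({"pos": "simple-sentence", "verb": "is",
              "subject": "Jack", "object": "employee"}))
# -> ('store-fact', 'Jack', 'is-a', 'employee')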
About Linux: the current version is written in .NET/WPF. Mono should be able to run the core, but not the client apps, since Mono doesn’t yet support WPF. So it should be possible to put a small client like the Aici-client together in Mono. I also have the beginnings of a web-based interface, in the form of a WCF service and a WCF client web app. The thing is that I am not a web developer. Luckily, such a website is not much more than the demo tutorial that is given for WCF, so I was following that but stopped halfway, which was a bit stupid of me because now I’ll have to do the whole tutorial again.
Ouch, what a lengthy post.
Posted: Aug 7, 2010 [ # 2 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
I went to that post, regarding “I feel you and you feel me”.
This is what my bot produced for that:
num-simple-sentence = 2
parse-tree-id = 2
pos = multi-simple-sentence
sentence-list-type = space
simple-sentence1.num-predicate = 1
simple-sentence1.predicate1.num-verb = 1
simple-sentence1.predicate1.verb1.val = feel
simple-sentence1.subject.noun1.val = i
simple-sentence1.subject.num-noun = 1
simple-sentence2.num-predicate = 1
simple-sentence2.predicate1.dcomp.noun1.val = me
simple-sentence2.predicate1.dcomp.num-noun = 1
simple-sentence2.predicate1.num-verb = 1
simple-sentence2.predicate1.verb1.val = feel
simple-sentence2.subject.noun-list-type = and
simple-sentence2.subject.noun1.val = you
simple-sentence2.subject.noun2.val = you
simple-sentence2.subject.num-noun = 2
so this parse tree (2) says they are 2 different and isolated sentences.
but..
num-predicate = 1
parse-tree-id = 1
pos = simple-sentence
predicate1.dcomp.noun-clause1.num-predicate = 1
predicate1.dcomp.noun-clause1.predicate1.dcomp.noun1.val = me
predicate1.dcomp.noun-clause1.predicate1.dcomp.num-noun = 1
predicate1.dcomp.noun-clause1.predicate1.num-verb = 1
predicate1.dcomp.noun-clause1.predicate1.verb1.val = feel
predicate1.dcomp.noun-clause1.subject.noun-list-type = and
predicate1.dcomp.noun-clause1.subject.noun1.val = you
predicate1.dcomp.noun-clause1.subject.noun2.val = you
predicate1.dcomp.noun-clause1.subject.num-noun = 2
predicate1.dcomp.num-noun-clause = 1
predicate1.num-verb = 1
predicate1.verb1.val = feel
subject.noun1.val = i
subject.num-noun = 1
this parse tree thinks of it as [i feel] *THAT* [you and you feel me]
so it also considers that.
parse tree id 1 had a merit of 0, and parse tree id 2 above had merit of -1.
And it is correct: if we want 2 sentences, most people who know basic English are going to write it as “I feel. You and you feel me.” *IF* that is what they meant. But since that period was not there, it gave it a merit of -1.
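Roughly, the ranking step behind those merits works like this; a much simplified Python sketch, not my actual rule engine, and the penalty values are made up:

# Simplified sketch of merit-based ranking of candidate parse trees.
# The only rule shown: reading the input as multiple sentences without a
# separating period costs one merit point.
def merit(candidate, original_text):
    score = 0
    if candidate["pos"] == "multi-simple-sentence" and "." not in original_text:
        score -= 1
    return score

candidates = [
    {"parse-tree-id": 1, "pos": "simple-sentence"},
    {"parse-tree-id": 2, "pos": "multi-simple-sentence"},
]

text = "i feel you and you feel me"
for candidate in sorted(candidates, key=lambda c: merit(c, text), reverse=True):
    print(candidate["parse-tree-id"], merit(candidate, text))
# parse tree 1 keeps a merit of 0, parse tree 2 gets a merit of -1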
I’ll have to check my grammar rule base; I probably don’t have a parse rule that tries to evaluate it as a compound sentence. I’ll check that; in fact, I’m fairly certain that rule is not in there yet. But see “I feel you you feel me” below - I must have it in for certain cases.
I also tried…
“I feel you you feel me”.
First off, by the rules of grammar that should not be considered two sentences, because a period should separate them.
Yes, yes, I know, users are going to be sloppy! Thus, my bot will also consider that maybe the user *did* mean 2 sentences.
My bot generated the following for “i feel you you feel me”:
*** Possibility # 1 ****
num-simple-sentence = 2
parse-tree-id = 2
pos = multi-simple-sentence
sentence-list-type = space
simple-sentence1.num-predicate = 1
simple-sentence1.predicate1.dcomp.noun1.val = you
simple-sentence1.predicate1.dcomp.num-noun = 1
simple-sentence1.predicate1.num-verb = 1
simple-sentence1.predicate1.verb1.val = feel
simple-sentence1.subject.noun1.val = i
simple-sentence1.subject.num-noun = 1
simple-sentence2.num-predicate = 1
simple-sentence2.predicate1.dcomp.noun1.val = me
simple-sentence2.predicate1.dcomp.num-noun = 1
simple-sentence2.predicate1.num-verb = 1
simple-sentence2.predicate1.verb1.val = feel
simple-sentence2.subject.noun1.val = you
simple-sentence2.subject.num-noun = 1
so that means it thought of it as 2 independent sentences, [i feel you] and [you feel me]
It also had this parse tree for another thing it could mean…
*** Possibility # 2 ****
pos = multi-simple-sentence
sentence-list-type = space
simple-sentence1.num-predicate = 1
simple-sentence1.predicate1.num-verb = 1
simple-sentence1.predicate1.verb1.val = feel
simple-sentence1.subject.noun1.val = i
simple-sentence1.subject.num-noun = 1
simple-sentence2.num-predicate = 1
simple-sentence2.predicate1.dcomp.noun1.val = me
simple-sentence2.predicate1.dcomp.num-noun = 1
simple-sentence2.predicate1.num-verb = 1
simple-sentence2.predicate1.verb1.val = feel
simple-sentence2.subject.noun-list-type = space
simple-sentence2.subject.noun1.val = you
simple-sentence2.subject.noun2.val = you
simple-sentence2.subject.num-noun = 2
So it also considers the other possibility that you mentioned in your post…
[i feel] -and-
[you you feel me]
as 2 completely independent sentences.
notice the “simple-sentence2.subject.num-noun = 2” above.
Now, it also considers…
*** Possibility # 3 ****
which you didn’t consider in your post, and which I didn’t even think of…
pos = simple-sentence
predicate1.dcomp.noun-clause1.num-predicate = 1
predicate1.dcomp.noun-clause1.predicate1.dcomp.noun1.val = me
predicate1.dcomp.noun-clause1.predicate1.dcomp.num-noun = 1
predicate1.dcomp.noun-clause1.predicate1.num-verb = 1
predicate1.dcomp.noun-clause1.predicate1.verb1.val = feel
predicate1.dcomp.noun-clause1.subject.noun-list-type = space
predicate1.dcomp.noun-clause1.subject.noun1.val = you
predicate1.dcomp.noun-clause1.subject.noun2.val = you
predicate1.dcomp.noun-clause1.subject.num-noun = 2
predicate1.dcomp.num-noun-clause = 1
predicate1.num-verb = 1
predicate1.verb1.val = feel
subject.noun1.val = i
subject.num-noun = 1
So with this one, it thinks perhaps you mean
[I feel] *THAT* [you you feel me]
Unlike the parse tree above, where it considered [i feel] and [you you feel me] as 2 independent sentences, here it is considering that the direct complement noun of the verb ‘feel’ is not simply the noun ‘me’, but the noun clause ‘you you feel me’.
well…. perhaps you did mean that !!
Posted: Aug 7, 2010 [ # 3 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
btw: there is no such thing as a “simple sentence without a verb”. By definition, for a sequence of words to be a sentence, it must have at least one subject noun and at least one predicate, which must contain at least one verb and, optionally, one or more direct complement nouns or direct complement adjectives.
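In code terms, that definition amounts to something like this (a toy Python sketch, not my bot’s internal representation):

# Toy sketch of the definition above: a sentence needs at least one subject
# noun and at least one predicate, and each predicate needs at least one verb.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Predicate:
    verbs: List[str]                                                 # at least one required
    complement_nouns: List[str] = field(default_factory=list)        # optional
    complement_adjectives: List[str] = field(default_factory=list)   # optional

@dataclass
class Sentence:
    subject_nouns: List[str]        # at least one required
    predicates: List[Predicate]     # at least one required

def is_valid_sentence(s: Sentence) -> bool:
    return (len(s.subject_nouns) >= 1
            and len(s.predicates) >= 1
            and all(len(p.verbs) >= 1 for p in s.predicates))

print(is_valid_sentence(Sentence(["i"], [Predicate(["feel"], ["you"])])))  # True
print(is_valid_sentence(Sentence([], [Predicate(["feel"])])))              # False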
Posted: Aug 7, 2010 [ # 4 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
Victor Shulist - Aug 7, 2010: I went to that post, regarding
well…. perhaps you did mean that !!
I was trying to show that the parser can figure out how to correctly group words together, at the level of the subject, the sentence, and so on.
The point is that all of this is done with only neurons and links. When using traditional parser generators, you have to add custom code to get it right. This makes it very difficult to debug, but more importantly very difficult to auto-generate from code, which is eventually the goal. Basically, it makes the parser static while this type of parser remains dynamic: you can easily generate new flow definitions while using the same parsing code.
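Put very crudely in ordinary code, the difference looks like this; a toy Python sketch, where in reality the flow is neurons and links rather than a dictionary:

# Crude illustration of 'the parsing code stays the same, only the flow
# definition changes'. The flow is plain data, so a new one can be generated
# or edited at runtime without touching the parser. All names are invented.
def parse_with(flow, tokens):
    """Label each token using whatever flow definition is passed in."""
    return [(token, flow.get(token.lower(), "unknown")) for token in tokens]

english_flow = {"i": "pronoun", "you": "pronoun", "feel": "verb", "blue": "adjective"}
print(parse_with(english_flow, "I feel blue".split()))
# [('I', 'pronoun'), ('feel', 'verb'), ('blue', 'adjective')]

# Generating a new flow definition later does not require new parsing code:
dutch_flow = {"ik": "pronoun", "voel": "verb", "blauw": "adjective"}
print(parse_with(dutch_flow, "ik voel blauw".split()))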
Posted: Aug 7, 2010 [ # 5 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
Victor Shulist - Aug 7, 2010: btw: there is no such thing as a “simple sentence without a verb”. By definition, for a sequence of words to be a sentence, it must have at least one subject noun and at least one predicate, which must contain at least one verb and, optionally, one or more direct complement nouns or direct complement adjectives.
Ok.
I was referring to short statement types like the previous one. What do you call them, then?
Posted: Aug 7, 2010 [ # 6 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
Hi Jan… you’re on here right now at the same time… I made some changes to the post above, if you want to look at it again.
So ‘frames’ are where you define grammar rules for your bot?
I wonder if it is similar to my bot’s rules. I basically give it a ‘high level’ goal, for example: prove this is a sentence, where a sentence is made up of a noun and a predicate. Then it has a sub-goal: prove that this sub-string of the input is a predicate, and on it goes like that. Is that roughly how your frames work?
A group of words, unless it has both a subject noun and a predicate verb, is only a phrase.
A phrase cannot be a sentence, and also cannot be a statement, since a statement is a type of sentence, alongside imperative, interrogative, etc.
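Stripped of every detail, the goal / sub-goal idea above looks roughly like this (a toy Python sketch, not my engine, and with an absurdly small grammar):

# Toy sketch of goal-driven parsing: prove the main goal ('sentence') by
# proving its sub-goals against sub-strings of the input.
GRAMMAR = {
    # goal            -> sequences of sub-goals that satisfy it
    "sentence":  [["subject", "predicate"]],
    "subject":   [["noun"]],
    "predicate": [["verb"], ["verb", "noun"]],
}
WORDS = {"i": "noun", "you": "noun", "me": "noun", "feel": "verb"}

def prove(goal, tokens):
    """Return True if the whole token list can be proven to be `goal`."""
    if goal in WORDS.values():                      # terminal goal: a part of speech
        return len(tokens) == 1 and WORDS.get(tokens[0]) == goal
    return any(satisfy(subgoals, tokens) for subgoals in GRAMMAR.get(goal, []))

def satisfy(subgoals, tokens):
    """Try every split of the tokens over the list of sub-goals."""
    if not subgoals:
        return not tokens
    head, rest = subgoals[0], subgoals[1:]
    return any(prove(head, tokens[:i]) and satisfy(rest, tokens[i:])
               for i in range(1, len(tokens) + 1))

print(prove("sentence", "i feel you".split()))   # True
print(prove("sentence", "feel me".split()))      # False (no subject noun)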
Posted: Aug 7, 2010 [ # 7 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
Jan Bogaerts - Aug 7, 2010:
Basically, it makes the parser static while this type of parser remains dynamic: you can easily generate new flow definitions while using the same parsing code.
Excellent, very similar to my design. I can add a rule at any level, and the system will automatically use that at any level, whether it is main goal, sub-goal, sub-sub-goal, etc. Basically, it finds a use for that new rule in the chain of operations it needs to do to prove the main goal.
Posted: Aug 7, 2010 [ # 8 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
Victor Shulist - Aug 7, 2010: Hi Jan… you’re on here right now at the same time… I made some changes to the post above, if you want to look at it again.
So ‘frames’ are where you define grammar rules for your bot?
I wonder if it is similar to my bot’s rules. I basically give it a ‘high level’ goal, for example: prove this is a sentence, where a sentence is made up of a noun and a predicate. Then it has a sub-goal: prove that this sub-string of the input is a predicate, and on it goes like that. Is that roughly how your frames work?
A group of words, unless it has both a subject noun and a predicate verb, is only a phrase.
A phrase cannot be a sentence, and also cannot be a statement, since a statement is a type of sentence, alongside imperative, interrogative, etc.
I don’t think we are doing things in a similar way. I don’t really have ‘rules’ as you describe them, at least not in one specific place; they are more scattered all around.
Also, I process in exactly the opposite direction: instead of top down, I go bottom up. I think the best way to explain it is this:
All the network tries to do is change the neurons it has received for processing into other neurons. At its core, that is all that happens, over and over and over again. So at first it starts with a number neuron (representing a letter); this is transformed into a neuron that indicates digit, alpha, capital, sign,... Those are transformed into flow elements (the items that make up a scan definition). This is repeated until there is only one resulting neuron left, which triggers the entire process again (after it has been split up again), but this time for the English-language parse: flow items are transformed into other flow items. At the end, there is again one neuron left, which triggers a search through all the available frames (after it has been split up again), and so on…
So basically, my rules are mostly in the flows (letter/word orders) and in little code clusters attached to various things like the parse definition items or words. The frames come at the end of the parsing stage and can probably also be seen as a sort of rule. After that come some more rules in the actions (those perform the content of the statement). These rules deal mostly with the ‘path-walking’ of object, subject and value (things like ‘my brother’s wife’, ‘the pretty but large house’,...) and also some more rules to extract or convert the ‘what’ of a sentence (if you say ‘it is blue’, the system needs to work out whether you are talking about color, emotion,...).
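Very schematically, and ignoring everything that makes it a neural network, that repeated transform loop has this shape (a toy Python sketch; the stages and the tiny lexicon are invented):

# Schematic sketch of the repeated transform-until-one-item-is-left loop.
# Real processing happens in neurons and links; these stages are made up
# purely to show the shape of the process.
def transform_chars(chars):
    """Stage 1: characters -> word tokens."""
    return "".join(chars).split()

def transform_words(words):
    """Stage 2: word tokens -> flow items (here: crude part-of-speech tags)."""
    lexicon = {"i": "pronoun", "feel": "verb", "you": "pronoun"}
    return [lexicon.get(word.lower(), "unknown") for word in words]

def transform_flow(items):
    """Stage 3: flow items -> a single result that would trigger the frames."""
    if items == ["pronoun", "verb", "pronoun"]:
        return ["simple-sentence"]
    return ["unparsed"]

def process(text):
    data = list(text)                      # start from individual characters
    for stage in (transform_chars, transform_words, transform_flow):
        data = stage(data)                 # transform, over and over
        if len(data) == 1:                 # one item left: hand over to the frames
            break
    return data

print(process("I feel you"))               # ['simple-sentence']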
Posted: Aug 7, 2010 [ # 9 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
Victor Shulist - Aug 7, 2010:
I can add a rule at any level, and the system will automatically use that at any level, whether it is main goal, sub-goal, sub-sub-goal, etc. Basically, it finds a use for that new rule in the chain of operations it needs to do to prove the main goal.
Can the system also add its own rules?
Posted: Aug 7, 2010 [ # 10 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
So far I haven’t needed it to; having it combine existing rules into more complex rules on its own seems to suffice. There are only so many English grammar rules to put into the system, and so far I am finding it much easier and faster to simply define a rule than to ‘teach’ it. If it becomes necessary, it has crossed my mind to explain new rules to it via natural language itself, once the system is fully ‘bootstrapped’. Similar to the way we start with assembly language, then write C in assembly, and then write Java or Perl in C, and on we go.
For processing order, I sort of go bottom up and top down at the same time, and the engine tries to find the connecting point.
As for installing your client, I am going to look into installing Mono to get .NET on this machine. I could also install a virtual machine with Windows.
Posted: Aug 7, 2010 [ # 11 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
Victor Shulist - Aug 7, 2010: Once the system is fully ‘bootstrapped’. Similar to the way we start with assembly language, then write C in assembly, and then write Java or Perl in C, and on we go.
That’s pretty cool.
I was also thinking in terms of letting it learn other languages. Or, at an earlier stage, of easily defining other languages.
Victor Shulist - Aug 7, 2010: As for installing your client, I am going to look into installing Mono to get .NET on this machine. I could also install a virtual machine with Windows.
I wouldn’t bother with Mono; there’s no point. The client currently uses WPF as its front end, and Mono doesn’t support WPF and isn’t planning to, so at the moment I think it’s best to try a virtual machine. Of course, that means a virtual machine on top of a JIT on top of another virtual machine. Swell.
Posted: Aug 7, 2010 [ # 12 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
True, that is a lot of layers of software abstraction. Hmm, maybe I have a spare desktop system kicking around in the basement to install Windows on!
Posted: Aug 7, 2010 [ # 13 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
If you have the spare time. For a quick look-see, though, it’s probably easier to get a virtual machine. Processing speed is fairly OK with the last release.
Posted: Aug 7, 2010 [ # 14 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
Yes, I’m looking forward to trying out your bot. Mainly I’m interested in how you provide these ‘flows’. I’m guessing you are not providing it with formal grammar rules, but simply letting it see many, many examples of which words appear together and, based on statistical analysis, determining the correct parsing?
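Something along these lines is what I’m imagining; a toy Python sketch of the statistical idea I’m guessing at, not anything taken from your system:

# Toy sketch of the guess above: count which word pairs appear next to each
# other in example text, then score a candidate word grouping by how familiar
# its adjacent pairs are.
from collections import Counter

examples = [
    "i feel you",
    "you feel me",
    "i feel happy",
]

def adjacent_pairs(words):
    """All pairs of words that sit next to each other."""
    return list(zip(words, words[1:]))

pair_counts = Counter()
for sentence in examples:
    pair_counts.update(adjacent_pairs(sentence.split()))

def score(grouping):
    """Sum how often each adjacent pair inside each group was seen before."""
    return sum(pair_counts[pair] for group in grouping for pair in adjacent_pairs(group))

# Two ways to group "i feel you you feel me":
split_as_two = [["i", "feel", "you"], ["you", "feel", "me"]]
split_oddly  = [["i", "feel"], ["you", "you", "feel", "me"]]
print(score(split_as_two), score(split_oddly))   # 5 vs 4: the familiar grouping wins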
Posted: Aug 8, 2010 [ # 15 ]
Senior member
Total posts: 257
Joined: Jan 2, 2010
Jan,
Your work is very interesting. I read your white paper tonight, referenced in your first post. I got to “Putting It All Together”... and it didn’t come together for me. =) Do you have another reference that puts it together in the form of a drawing? I saw your screenshot of the N²D program and the graphics… but it showed only a small diagram.
Welcome to the forums. I hope to follow your project in the coming months.
Regards,
Chuck