|
Posted: Jul 14, 2012 |
[ # 31 ]
|
|
Senior member
Total posts: 494
Joined: Jan 27, 2011
|
Hello Andreas,
Thanks for the link to the video. I’m familiar with Ben Goertzel’s OpenCog. The ‘problem’ I see with his approach is that almost everything is ‘bolt-on’. The resulting system seems (to me) like one big kludge. It is certainly not elegant.
From my perspective, AGI needs to be built on one integrated foundation. That is why I started at the bottom, designing a conceptual knowledge representation model whose main focus is symbol grounding. My total model now consists of only four layers: emotional space, conceptual space, logical space and conversational space. Emotional space handles (most of) the grounding, conceptual space handles knowledge representation, logical space handles reasoning, and conversational space handles IO for both textual (speech) and motor-control based interactions (in my model they are effectively the same).
My model currently has only 11 tables in a database (some will probably be added, but not many) and only 13 functional units. The knowledge representation layer has only 6 tables and handles grounding, episodic memory and analogies within those 6 tables. The system now has goal setting and management, not because I added it to the model, but because this functionality emerged from the initial design; I actually ‘discovered’ goal handling within the model. In the same way I discovered that my conversational layer, while initially modeled for textual conversation, intrinsically catered for sensory handling and motor-control.
Compare this with OpenCog, where you need a master block-diagram just to show the interconnections between all the block-diagrams that describe the separate functionalities in the system.
|
|
|
|
|
Posted: Oct 3, 2012 |
[ # 32 ]
|
|
Senior member
Total posts: 370
Joined: Oct 1, 2012
|
Hi,
Even though this is a slightly older thread, I thought it would be a good place to jump in and participate. The Type III AI that we are making public this week has (I believe) the sort of emotional architecture that you are describing. As a ‘child’ it is taught emotional values that form the basis for its ‘personality’. It interacts with world news through RSS on a daily basis and forms opinions on news items; ‘parental’ reinforcement guides this process. As you interact with the AI through text (chat), it forms an emotional view of you based on your comments. Like a child, it does not have graded levels of emotion: if, for instance, you use a racial slur that it has been taught is unacceptable, it simply forms the opinion that you are a “bad person”, and further attempts to change that opinion will fail. Happy and Sad can be induced and will cycle based on your conversation. It adds a self-organizing component, however: as the AI interacts with actual figures in the news and begins to form opinions based on its original ‘teachings’, it adds found responses to its taught responses.
This produces a human like parent/child interaction. User: “Where did you learn to say that?” AI: “This person” and again this learned response can be either discouraged or encouraged.
Since the Type I responses were written by me as an adult, it does not respond with child responses, but the learning curve is equivalent. You could program these with actual child responses.
Plans for the Type IV AI introduce levels of emotion. This is already designed into the ‘memory’ structure but not utilized during the ‘child’ learning sequence. The AI types (Type I, Type II, etc.) form the sort of strong/weak AI integration that you seem to be describing.
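In AIML terms (purely as an illustration; nothing in this thread says the Type III system is AIML-based, and the pattern and predicate names here are hypothetical), a binary, hard-to-reverse opinion of that kind might be sketched as:

```xml
<!-- Illustrative sketch only: a binary "bad person" opinion predicate
     that, once set, refuses to change. Not the actual Type III code. -->
<category>
  <pattern>_ BADWORD _</pattern>
  <template>
    <think><set name="useropinion">bad person</set></think>
    That is not acceptable.
  </template>
</category>
<category>
  <pattern>I AM NOT A BAD PERSON</pattern>
  <template>
    <condition name="useropinion" value="bad person">
      My opinion of you will not change.
    </condition>
  </template>
</category>
```

The key point the sketch captures is that the opinion is a single flag with no gradations, matching the "child-like" all-or-nothing behaviour described above.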
Vincent Gilbert
|
|
|
|
|
Posted: Jun 15, 2013 |
[ # 33 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
Vince, you’re not the only person visiting (or should I say RE-visiting) this thread with an eye toward participation. Someone over at the Program O forums was asking about emotions in chatbots, and I remembered this thread. It also got me thinking: it seems to me that it would be an interesting and worthwhile project to create a “PAD model AIML set”; basically, an AIML set that could be incorporated into just about any existing AIML chatbot (with a little work) to allow said chatbot to emulate emotional states. I know that several of the members who participated in this discussion earlier are no longer active for one reason or another, but some of us still are, and I would like to know if anyone here would like to participate in a project of this nature. I, for one, would be very interested, but I just can’t do it alone.
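To make the idea a little more concrete, here is a very rough sketch of what one category in such a set might look like. The XSETPAD command and the predicate names (pleasure, arousal, dominance) are hypothetical placeholders, not part of any existing AIML set:

```xml
<!-- Hypothetical sketch of a "PAD model AIML set" category.
     Command name and predicate names are illustrative only. -->
<category>
  <pattern>XSETPAD * * *</pattern>
  <template>
    <think>
      <set name="pleasure"><star index="1"/></set>
      <set name="arousal"><star index="2"/></set>
      <set name="dominance"><star index="3"/></set>
    </think>
    PAD state set to <get name="pleasure"/>, <get name="arousal"/>, <get name="dominance"/>.
  </template>
</category>
```

A full set would also need categories that nudge these predicates in response to user input, plus <condition> blocks that vary the bot’s responses based on the current PAD values.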
So if anyone here is interested in getting a project of this sort off the ground, please chime in here.
|
|
|
|
|
Posted: Jun 15, 2013 |
[ # 34 ]
|
|
Member
Total posts: 30
Joined: Jan 15, 2013
|
I’ll help, Dave; keep me posted here or on Skype :D
|
|
|
|
|
Posted: Jun 15, 2013 |
[ # 35 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
Thanks, Don.
Anyone else?
|
|
|
|
|
Posted: Jun 15, 2013 |
[ # 36 ]
|
|
Senior member
Total posts: 218
Joined: Jun 20, 2012
|
Dave,
I would like to assist as well. Please let me know how I can be of help.
|
|
|
|
|
Posted: Jun 15, 2013 |
[ # 37 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
Well, if you have Skype, you are welcome to contact me that way (this invitation also goes out to anyone else who is interested in participating). My contact name is GeekCaveCreations, and my Skype window is open 24 hours a day. If you don’t have Skype, then you are welcome to email me using the link below my avatar image. Once we find out who is interested, we can begin to coordinate with each other.
|
|
|
|
|
Posted: Jun 15, 2013 |
[ # 38 ]
|
|
Experienced member
Total posts: 66
Joined: Jun 10, 2013
|
I’d be interested, but I don’t know if I could be of use. I know PHP, but only very basic AIML (I’ve never actually worked with it). Either way, following this will be interesting.
|
|
|
|
|
Posted: Jun 16, 2013 |
[ # 39 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
Well, one of the best ways to learn something is to just dive in and do it. If you lack Skype, why not shoot me an email, and we can discuss possibilities?
|
|
|
|
|
Posted: Jun 16, 2013 |
[ # 40 ]
|
|
Experienced member
Total posts: 66
Joined: Jun 10, 2013
|
Dave Morton - Jun 16, 2013: Well, one of the best ways to learn something is to just dive in and do it. If you lack Skype, why not shoot me an email, and we can discuss possibilities?
You mean me? I have Skype. I did send you a contact request a few weeks ago (re. registering here), but perhaps I sent it to the wrong account. I’ll check.
|
|
|
|
|
Posted: Jun 16, 2013 |
[ # 41 ]
|
|
Senior member
Total posts: 623
Joined: Aug 24, 2010
|
The problem with putting new life into old threads is that you see old faces and get confused. Here I got all excited about Hans being back among us. Ah well.
I look forward to seeing what you guys put together. There’s actually a surprising amount of literature out there on extracting emotional content from sentences. I’m not sure what the best approach would be in an AIML environment, but I’m sure you’ll come up with something interesting.
|
|
|
|
|
Posted: Jun 16, 2013 |
[ # 42 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
Hey, CR.
Yeah, sorry about making this a “zombie thread”, but the discussion had some relevance, so there we are.
What we’re looking to accomplish at this time is a “purely AIML” approach, so there’ll be precious little “extracting” of anything, I’m afraid. However, if I ever get to the point where I can start writing some plugins for Program O, then a pre-processing plugin to handle extraction of emotional content would be a nice challenge. But first I have to get version 3 up and running, since that version is being designed to make writing plugins a great deal easier.
|
|
|
|
|
Posted: Jun 20, 2013 |
[ # 43 ]
|
|
Senior member
Total posts: 218
Joined: Jun 20, 2012
|
I have created a chatbot “Acacia” on Pandorabots that is an Alice clone but with an image that is tied to a predicate “CURRENTEMOTION”.
By typing a “bot command” at the prompt you can set the current emotional state of the bot.
Example:
XSETEMOTION angry
XSETEMOTION sad
XSETEMOTION happy
XSETEMOTION disgusted
XSETEMOTION surprised
XSETEMOTION afraid
XSETEMOTION bored
XSETEMOTION furious
XSETEMOTION joyful
Also you can see the emotional state by typing:
XSHOWEMOTION
And hide it again by typing:
XHIDEEMOTION
Perhaps we can use “Acacia” to demo how to tie the emotional state to PAD values and then the bot image would be reflective of the underlying PAD values.
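For readers unfamiliar with how such bot commands work, a category along these lines could implement XSETEMOTION by storing the argument in the CURRENTEMOTION predicate. This is only an illustrative sketch; Acacia’s actual categories may differ:

```xml
<!-- Sketch of the XSETEMOTION bot command; illustrative only,
     not necessarily Acacia's actual implementation. -->
<category>
  <pattern>XSETEMOTION *</pattern>
  <template>
    <think><set name="CURRENTEMOTION"><star/></set></think>
    Current emotion set to <get name="CURRENTEMOTION"/>.
  </template>
</category>
```

Tying the bot image to the predicate is then a matter of having the host page select an image based on the CURRENTEMOTION value.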
CR, do you know of any links to literature that has PAD values for English words? I have seen the ANEW sites where they claim to have about a thousand words. Does anyone know how many words are in WordNet Affect?
In order to implement a pure AIML version of PAD emotional states, it would be handy to have the ability to add, subtract, multiply and divide, and the equivalent of <, > and = in AIML. If anyone has any information on implementing math in pure AIML, I would be interested.
Acacia can be found at http://www.pandorabots.com/pandora/talk?botid=8d348ef18e347ba1
|
|
|
|
|
Posted: Jun 20, 2013 |
[ # 44 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
The person to talk to regarding math in AIML would be Steve, Alaric. He’s the resident expert when it comes to AIML.
|
|
|
|
|
Posted: Jun 20, 2013 |
[ # 45 ]
|
|
Administrator
Total posts: 2048
Joined: Jun 25, 2010
|
I do have a math.aiml file which allows me to do basic maths and work out greater than, less than, equals etc. I will give the basic idea here for addition.
<category>
  <pattern>ADD1</pattern>
  <template>
    <think>
      <condition name="kount">
        <li value="0"><set name="kount">1</set></li>
        <li value="1"><set name="kount">2</set></li>
        <li value="2"><set name="kount">3</set></li>
        <li value="3"><set name="kount">4</set></li>
        <li value="4"><set name="kount">5</set></li>
        <li value="5"><set name="kount">6</set></li>
        <li value="6"><set name="kount">7</set></li>
        <li value="7"><set name="kount">8</set></li>
        <li value="8"><set name="kount">9</set></li>
        <li value="9"><set name="kount">10</set></li>
      </condition>
    </think>
  </template>
</category>
You can extend this to as long as you wish. For larger numbers, you need different categories for ADD10 or ADD100 which are triggered when the units cycles back to 0.
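The same enumeration idea can be stretched to cover comparison. The following is only an illustrative sketch, not the contents of Steve’s math.aiml: it decides single-digit “greater than” by decrementing both operands via a hypothetical SUB1 helper until one of them reaches zero:

```xml
<!-- Illustrative sketch only, not math.aiml: single-digit "greater than"
     by repeated decrement. SUB1 maps each digit to its predecessor. -->
<category><pattern>SUB1 1</pattern><template>0</template></category>
<category><pattern>SUB1 2</pattern><template>1</template></category>
<category><pattern>SUB1 3</pattern><template>2</template></category>
<!-- ... continue SUB1 categories up to 9 ... -->

<!-- Base cases: atomic "0" outranks the wildcard during matching. -->
<category><pattern>GREATERTHAN 0 *</pattern><template>NO</template></category>
<category><pattern>GREATERTHAN * 0</pattern><template>YES</template></category>

<!-- Recursive case: decrement both operands and ask again. -->
<category>
  <pattern>GREATERTHAN * *</pattern>
  <template><srai>GREATERTHAN <srai>SUB1 <star index="1"/></srai> <srai>SUB1 <star index="2"/></srai></srai></template>
</category>
```

For example, GREATERTHAN 3 1 recurses to GREATERTHAN 2 0, which matches the YES category, while GREATERTHAN 0 0 matches the first base case and answers NO.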
If the file is going to be used by a small number of people, I am happy to share it, but it is one of the things that sets Mitsuku apart from a regular Pandorabot, so it is not one of the files I am willing to share with everyone (until I win the Loebner Prize, of course, in which case I will make it freely available to everyone).
|
|
|
|