|
|
Experienced member
Total posts: 38
Joined: Oct 7, 2014
|
Since the <that> value expires on Pandorabots, how could I allow a client to continue the conversation with that <that> by storing it externally?
For example:
Bot: That’s great. When do you want to meet?
…time passes, and the <that>WHEN DO YOU WANT TO MEET</that> expires from Pandorabots.
So…I store the “WHEN DO YOU WANT TO MEET” and the <topic> externally.
The client responds:
Client: tomorrow at 10am
How do I re-instantiate the value of <that> with the stored value above, so that Pandorabots can match the <input> against both the associated <that> and <topic>?
Currently, without this feature, I get NO MATCH, because I have no pattern that matches that input without a <topic> or a <that> to provide context.
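For what it’s worth, here is a rough sketch of the kind of external store I have in mind. This is purely illustrative — the `ThatCache` class, its field names, and the 30-minute figure are my own assumptions, not anything Pandorabots provides:

```python
import time


class ThatCache:
    """Client-side store for the last <that> and <topic>, with a timestamp
    so we can tell whether the bot's own session has likely expired."""

    SESSION_TIMEOUT = 30 * 60  # assume the bot drops context after ~30 minutes

    def __init__(self):
        self.last_that = None
        self.last_topic = None
        self.saved_at = None

    def save(self, that, topic):
        """Record the last line of the bot's reply and the current topic."""
        self.last_that = that
        self.last_topic = topic
        self.saved_at = time.time()

    def bot_has_expired(self):
        """True once enough time has passed that the bot will have lost <that>."""
        return (self.saved_at is not None and
                time.time() - self.saved_at > self.SESSION_TIMEOUT)
```

After each bot reply, the client would call `save()` with the reply’s last sentence and the current topic, then consult `bot_has_expired()` before sending the user’s next input.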
|
|
|
|
|
Posted: Jan 13, 2015 |
[ # 1 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
I’m not quite certain that I follow, unless you’re saying that the “time” that passes is more than 30 minutes without input. Or, if this is not the case, perhaps a bit more detail would help, as I seem to be missing something.
|
|
|
|
|
Posted: Jan 13, 2015 |
[ # 2 ]
|
|
Experienced member
Total posts: 38
Joined: Oct 7, 2014
|
The time that passes is without input, so the value of <that> has been erased from Pandorabots.
If that’s the case, then a response would have no context unless there were a way to re-instantiate it. Does that help?
|
|
|
|
|
Posted: Jan 13, 2015 |
[ # 3 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
It does, in that I understand the question now. Sadly, that doesn’t help me to formulate an answer that will similarly help, as (to the best of my knowledge) there’s currently no way to “reach back into Pandorabots” and re-initialize the <that> value once it’s “expired”.
I could point out that Program O doesn’t have this problem, since Predicates, topics, and other “variables” persist for as long as your browser is open, but I won’t do that.
|
|
|
|
|
Posted: Jan 13, 2015 |
[ # 4 ]
|
|
Experienced member
Total posts: 38
Joined: Oct 7, 2014
|
What is Program O? (How could I not know of it?)
|
|
|
|
|
Posted: Jan 13, 2015 |
[ # 5 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
Program O is a chatbot engine/AIML interpreter that works essentially the same as Program Z (the engine behind Pandorabots). The primary difference (of many) is that Program O is not a chatbot hosting service. As such, it needs to be installed on an existing web server, and that server has to run PHP (for all intents and purposes, they all do, so that’s not a problem). Program O also requires a MySQL database, but again, most web servers provide access to one of those, too. Lastly, since Program O needs to be installed, certain skills and experience are required (namely with PHP/MySQL, and some web server administration), so there is a bit of a “learning curve”. I don’t want to “steal” you away from Pandorabots, but if none of what I just described has scared you off, I invite you to check out http://www.program-o.com
Currently, there are technical issues with the Program O forums, but that’s something that Liz Perreau (Program O’s creator) has control over, and she’s been hip deep in baby dragons for a while now, so she’s not available to fix the issue. Still, as lead developer for Program O, I’m more than capable of providing support in these forums until the Program O forums get fixed.
BTW, you can also learn a bit about Program O at its GitHub page.
|
|
|
|
|
Posted: Jan 13, 2015 |
[ # 6 ]
|
|
Administrator
Total posts: 2048
Joined: Jun 25, 2010
|
You can’t do that in Pandorabots. If the user hasn’t said anything for 30-40 minutes, the bot assumes they no longer want to talk. This seems reasonable to me. In real life, if someone said they were buying a DVD and 40 minutes later you asked, “Which one?”, the original chatter would wonder what you meant. It’s the same with the bots.
|
|
|
|
|
Posted: Jan 18, 2015 |
[ # 7 ]
|
|
Experienced member
Total posts: 38
Joined: Oct 7, 2014
|
@Steve—
Hi, in the DVD case, where you’re talking to a customer service rep, that makes sense.
My use case actually expects a delayed response, and that seems more reasonable than not. It’s like texting a person: if a human would still have a reasonable response after the delay, that’s what I would expect the bot to enable.
I am thinking of enabling a cache to record the last <that>, but I would still need a way to re-initiate it so that Pandorabots knows the user is responding to a particular context…would there be a way to do that if I stored it in a cache?
If I received a text from someone, responded 30 minutes later, and they then had no idea what I was texting them about, that wouldn’t make sense in most cases. I text people all the time, and they get back to me after 30 minutes and pick up from where the conversation left off; as it works with Pandorabots, though, the bot has no context for how to continue.
So this might be a different use case, but I’m hoping for a work-around, since I could probably find a way to store the last response…not ideal, though…
@Dave—
I read the documents and am interested. Quick questions:
* Is AIML 2.0 supported?
* Are releases run through CI tests before being released? (There are lots of good free services for open source.)
* Does documentation for the API exist? (Something like Swagger, as Pandorabots has, or, even better, a free hosted version for open source, would be ideal.)
Thanks to both of you—I’m intrigued to learn more.
|
|
|
|
|
Posted: Jan 18, 2015 |
[ # 8 ]
|
|
Administrator
Total posts: 2048
Joined: Jun 25, 2010
|
Your situation makes sense, but I can’t think of an easy way to do this. You could maybe get the AIML to check for the existence of a <that> and, if one isn’t there, pass the cached one through from the client side? Messy, though.
Let’s say <that> should be “WHEN DO YOU WANT TO MEET”, but <that> is currently blank.
You could get the client side to send something like XSETUPTHAT WHEN DO YOU WANT TO MEET to a category like:
<category>
  <pattern>XSETUPTHAT *</pattern>
  <template><star/></template>
</category>
but somehow hide the output client side.
This would set up <that> with your cached value. As I say, messy, though.
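Client side, the priming step might look something like this. Just a sketch — `send_to_bot` is a placeholder for whatever call your client makes to the talk API, and XSETUPTHAT is the category above:

```python
def build_primer(cached_that):
    """Build the hidden priming input for the XSETUPTHAT category.
    The category echoes <star/>, and that echo becomes the bot's new <that>."""
    return "XSETUPTHAT " + cached_that.upper()


def prime_then_ask(send_to_bot, cached_that, user_input):
    """Send the primer first (discarding its reply so the user never sees it),
    then the user's real input, which can now match against the restored <that>."""
    send_to_bot(build_primer(cached_that))  # reply deliberately ignored client side
    return send_to_bot(user_input)
```

The one wrinkle, as I say, is making sure the primer’s echoed reply never reaches the user’s screen — here it is simply discarded.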
|
|
|
|
|
Posted: Jan 18, 2015 |
[ # 9 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
The current version of Program O does not yet support AIML 2.0, but I’m working on a complete re-design of the project that not only supports the new standard, but will also have a new storage/search algorithm that should greatly improve performance and reliability. I’m still a long way from releasing this new version, though.
I’m not quite certain what “CI” is, to be honest (unless you mean CodeIgniter, but that’s a “version 3” thing). All testing of releases is performed by me before pushing them to GitHub, but that’s about it right now. For all intents and purposes, every version of Program O is more or less a “beta version”.
Liz Perreau, the creator of Program O, has assigned herself the task of creating the user documentation for the project (since I just plain suck at documentation of a non-programming nature). I do put in some internal documentation that explains what some of the functions do, but the internal documentation isn’t very useful to the average botmaster. Sadly, Liz is currently on sabbatical from the project for the foreseeable future, so new user documentation isn’t on the immediate horizon. If you need to know about using the Program O API, I could certainly answer questions and provide advice and suggestions.
|
|
|
|
|
Posted: Jan 19, 2015 |
[ # 10 ]
|
|
Experienced member
Total posts: 38
Joined: Oct 7, 2014
|
@Steve—
Thanks…so would I need to make two separate calls…the first to set up the <that>, and then a second with the actual client’s response?
So I guess the process would have to go something like:
a) receive the client response
b) check what the <that> is in the bot_response
c) if it is empty, then go to the cache
d) initiate the <that>
e) then re-run the client response now that the bot has a <that>
Is that the way?
|
|
|
|
|
Posted: Jan 19, 2015 |
[ # 11 ]
|
|
Experienced member
Total posts: 38
Joined: Oct 7, 2014
|
@Dave—
CI would be continuous integration, so that anything pushed to the repository would have been run through tests, with the pass/fail status displayed in the repository, so we don’t pull down a non-working version.
I’d like to keep up to date. Everything I have written so far uses the AIML 2.0 spec, so that would be tough, but I think as things progress out of beta, I’d prefer to have our own server, since we’ve had issues with Pandorabots from time to time that our own server would eventually solve (particularly retaining sessions, predicates, and such)…
Thanks…
|
|
|
|
|
Posted: Jan 19, 2015 |
[ # 12 ]
|
|
Administrator
Total posts: 3111
Joined: Jun 14, 2010
|
How are your PHP skills? One sure-fire way to get the AIML 2.0 standard into Program O would be to help develop the project.
(Please note that I generally don’t openly recruit people whose skills and experience are unknown to me, but as I’ve been “flying solo” on this project for a while, it would be nice to have a little extra help.)
|
|
|
|
|
Posted: Jan 19, 2015 |
[ # 13 ]
|
|
Administrator
Total posts: 2048
Joined: Jun 25, 2010
|
Tim Fong - Jan 19, 2015: @Steve—
Thanks…so would I need to make two separate calls…the first to create the <that> set up…and then a second response with the actual client’s response?
So I guess the process would have to actually go something like:
a) receive the client response
b) check to see what the <that> is in the bot_response
c) if it is empty, then go to the cache
d) initiate the <that>
e) then re-run the client response now that the bot has a <that>
Is that the way?
Well, that’s how I would approach it. Not ideal, but it should get the job done.
|
|
|
|
|
Posted: Jan 20, 2015 |
[ # 14 ]
|
|
Experienced member
Total posts: 38
Joined: Oct 7, 2014
|
@Dave—I don’t develop in PHP.
@Steve—thanks…not super-ideal…AIML should actually just natively support this, I would think. Being able to set the <that> value seems like a reasonable option to have… :(
|
|
|
|