First off, in normal conversation across the net a 10-15 second delay per volley is common, so if it were truly only going to take several seconds it wouldn't matter. Second… no, there is no way to issue a message at the start and another message at the end.
The engine is structured around 2 fixed threads plus a varying number of user threads. The flow is as follows (see the sketch after this list):
1. The server waits in a thread to accept someone's connection.
2. A user connection comes in, and the request is read in on a new user thread.
3. The chatbot engine thread begins processing the request.
4. The chatbot engine thread completes the request.
5. The chatbot engine thread hands the answer back to that user thread and goes on to someone else's request.
6. The user thread sends the answer back to the user (meanwhile another user thread may be handing a request to the chatbot engine thread).
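For concreteness, here is a minimal C++ sketch of that flow. It is not the actual server code; the Request struct, the queue, and the names engineLoop and handleUser are illustrative stand-ins I made up. What it shows is that the only synchronization point is the lock on the shared request queue, and each user thread blocks until the engine hands back one complete answer.

```cpp
// Minimal sketch of the described flow, assuming a request queue between
// user threads and the single engine thread. Not the real server code.
#include <condition_variable>
#include <future>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

struct Request {
    std::string input;                  // the user's message
    std::promise<std::string> reply;    // fulfilled by the engine thread
};

std::queue<Request*> pending;           // requests waiting for the engine
std::mutex queueLock;                   // the only shared synchronization point
std::condition_variable queueReady;

// The single fixed chatbot engine thread: pull one request, compute the full
// answer, hand it back (step 5), move on to someone else's request.
void engineLoop() {
    for (;;) {
        std::unique_lock<std::mutex> lk(queueLock);
        queueReady.wait(lk, [] { return !pending.empty(); });
        Request* req = pending.front();
        pending.pop();
        lk.unlock();

        std::string answer = "reply to: " + req->input;  // stand-in for real processing
        req->reply.set_value(answer);                    // hand answer to that user thread
    }
}

// One user thread per incoming connection (steps 2 and 6).
void handleUser(std::string input /*, socket would go here */) {
    Request req{std::move(input), {}};
    std::future<std::string> answer = req.reply.get_future();
    {
        std::lock_guard<std::mutex> lk(queueLock);
        pending.push(&req);
    }
    queueReady.notify_one();
    std::string reply = answer.get();   // blocks until the engine finishes this request
    // send reply back to the user over the socket here
}

int main() {
    std::thread engine(engineLoop);     // fixed engine thread
    // The other fixed thread would be an accept() loop spawning user threads;
    // two fake "connections" stand in for it here.
    std::thread u1(handleUser, "hello");
    std::thread u2(handleUser, "how are you");
    u1.join();
    u2.join();
    engine.detach();                    // sketch only; a real server shuts down cleanly
    return 0;
}
```

In this shape the engine thread never touches a user's socket; it fulfills one answer and moves on, which is what keeps synchronization minimal.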
While it is technically possible for me to change this flow, the server is designed as a mass-production server, keeping thread synchronization to a minimum. Providing a flow that allowed multiple messages back to the user would, I think, significantly slow down the server process by increasing synchronization.