Good points, Oliver. Also, someone earlier (I don't feel like scrolling back right now, but I think it was either Andrew or Jan) pointed out that a 4th component probably belongs in the list, and I strongly agree: knowledge acquisition routines.
Yes, 3 is a subset of 1. In fact, dealing with bad grammar just adds layers of permutations: more pre-processing, which costs more time, energy, and CPU cycles. So, for a more efficient and faster response from an AI, try to minimize your spelling and grammar errors. And of course, the other end of the extreme will probably require a time-out. For example, if someone enters a 200-word sentence where the grammar makes no sense and every single word is misspelled, imagine the astronomical number of combinations: with several plausible corrections per word, that's (similar-words-each-could-be)^200 of them! Yes, another crazy extreme example, but it illustrates the point: how will your engine know when to say 'the hell with it'? Perhaps just a user-defined time-out for processing.
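To make the time-out idea concrete, here's a rough Python sketch. The difflib candidate generator is just a stand-in for a real spell checker, and the "makes sense" test is a placeholder, so treat it as an illustration of bounding the search, not anyone's actual engine:

```python
import difflib
import itertools
import time

def correction_candidates(word, dictionary):
    """Stand-in spell checker: the word itself if known, otherwise
    the closest dictionary words (difflib is a crude stand-in)."""
    if word in dictionary:
        return [word]
    return difflib.get_close_matches(word, dictionary, n=3) or [word]

def correct_sentence(words, dictionary, timeout_seconds=2.0):
    """Walk candidate combinations (k^n in the worst case) until a
    deadline; returning None is the 'the hell with it' case."""
    deadline = time.monotonic() + timeout_seconds
    candidates = [correction_candidates(w, dictionary) for w in words]
    for combo in itertools.product(*candidates):
        if time.monotonic() > deadline:
            return None  # user-defined time-out: stop grinding
        if all(w in dictionary for w in combo):  # placeholder for a real "does this parse" check
            return list(combo)
    return None

dictionary = ["hello", "world", "how", "are", "you"]
print(correct_sentence("helo wrld".split(), dictionary))  # ['hello', 'world']
```

The key point is that the clock is checked inside the loop, so even a (similar-words-each-could-be)^200 blow-up gets cut off at the deadline instead of running forever.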
Of course, as we tweak our algorithms and Intel/AMD make their processors even faster, we can worry a bit less about the cost of these extra spell-check and grammar-check wrappers, with all their permutations and preprocessing.
It brings up a good question: what is your bot's level of tolerance? I haven't really defined mine, but perhaps 5% of the words misspelled is tolerant enough? After that, a dialog box appears: "Your input, after <x> minutes of processing, isn't making any sense to me. Continue trying spell-check permutations?"
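As a sketch of that tolerance check (the 5% threshold is just the number floated above, and the dictionary lookup stands in for a real spell checker):

```python
def misspelled_ratio(words, dictionary):
    """Fraction of input words not found in the dictionary."""
    unknown = sum(1 for w in words if w.lower() not in dictionary)
    return unknown / len(words) if words else 0.0

def should_ask_user(words, dictionary, tolerance=0.05):
    """True when the input is too mangled to process silently."""
    return misspelled_ratio(words, dictionary) > tolerance

words = "thiss is a tst sentense".split()
dictionary = {"this", "is", "a", "test", "sentence"}
print(should_ask_user(words, dictionary))  # True: 3 of 5 words are unknown
```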
For a 'real life' production system that would be useful, but for a Turing Test it would be bad, or rather a 'bust'... perhaps after a certain timeout it gives up, goes into 'Eliza-mode', picks any word it does know, and responds with some mapping.
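A minimal sketch of that Eliza-mode fallback; the keyword-to-response mapping is my guess at what 'some mapping' would look like:

```python
import random

# Keyword -> canned response; these entries are the words the bot "does know".
ELIZA_MAP = {
    "you": "We were talking about you, not me.",
    "think": "What makes you think that?",
    "feel": "Tell me more about how you feel.",
}
ELIZA_DEFAULT = ["Go on.", "I see.", "Why do you say that?"]

def eliza_fallback(words):
    """After a time-out, pick the first word we recognize and use its
    canned response; otherwise reply with a content-free prompt."""
    for w in words:
        if w.lower() in ELIZA_MAP:
            return ELIZA_MAP[w.lower()]
    return random.choice(ELIZA_DEFAULT)

print(eliza_fallback("I thnk you are brkn".split()))  # picks up on "you"
```

In a Turing Test setting, this keeps the conversation moving without ever admitting that the parse failed.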