Thanks for the feedback from both Vincent and Victor. As for Harri’s brain surgery, I took him as far as recognizing different word types and about 15% of the way into subject–predicate analysis, but then faced the issue of dumping someone else’s grammar core into him for brain inception. That did not seem like the best approach, so I have commented out his parsing for the moment and gone to work on the SQL interface first.
Now, about data access speed: I agree with Jan, just as you do. I can’t rely on SQL for RESPONSIVENESS, but I can for STABILITY. I have completed the conversion process whereby the SQL tables are copied out into JSON files and saved conveniently in the site’s directory tree. As you can see if you click the bottom UI button and then the GO! button, I worked out two synonymous AJAX calls (and the framework for 100 more as needed) that bring the SQL table data into the JS environment and combine elements of each file, just to get things started. What you see there is redundant data: the driller into SQL is still alive up top, while the driller into the same data in flat-file format is also alive and active. Since the JSON files auto-generate with each click of the Encode JSON button, Harri’s JSON brain is always up to date with his SQL brain.
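To make the flat-file step concrete, here is a minimal sketch of the merge that happens once two exported JSON files are in the JS environment. All the table names, fields, and file paths below are hypothetical placeholders, not Harri’s actual schema:

```javascript
// Two exported table snapshots, as the Encode JSON button might write
// them into the site's directory tree (sample data, not the real tables).
const words = [
  { word_id: 1, text: "hello" },
  { word_id: 2, text: "world" },
];
const wordTypes = [
  { word_id: 1, type: "interjection" },
  { word_id: 2, type: "noun" },
];

// Join the two flat files on a shared key, mimicking what the two AJAX
// calls do after pulling each JSON file into the JS environment.
function mergeTables(a, b, key) {
  const index = new Map(b.map((row) => [row[key], row]));
  return a.map((row) => ({ ...row, ...(index.get(row[key]) || {}) }));
}

const merged = mergeTables(words, wordTypes, "word_id");
console.log(merged);

// In the live page the data would arrive via the two AJAX calls, e.g.:
// Promise.all([fetch("data/words.json"), fetch("data/word_types.json")])
//   .then((rs) => Promise.all(rs.map((r) => r.json())))
//   .then(([a, b]) => mergeTables(a, b, "word_id"));
```

The `Map` index keeps the join linear in the size of the two files, which matters once the framework scales to those 100-odd additional calls.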
Granted, the window into JSON is not clear as of today, but check back in another few days to see it come along. I also want an XML conversion for the information aspects better suited to that format, and potentially CSV as well. Having the core data in an RDB gives me all the integrity advantages of normalization, and I don’t have to compromise on speed either, because all of the info will be available in flat form too.
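Since CSV is on the wish list, a JSON-to-CSV pass is a small step once the flat files exist. A minimal sketch, with the caveat that the column ordering and RFC 4180-style quoting rules here are my assumptions:

```javascript
// Convert an array of uniform row objects (as read from one of the
// exported JSON files) into a CSV string. Column order follows the
// first row's keys; fields containing commas, quotes, or newlines
// are wrapped in double quotes with embedded quotes doubled.
function toCsv(rows) {
  if (rows.length === 0) return "";
  const cols = Object.keys(rows[0]);
  const esc = (v) => {
    const s = String(v);
    return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
  };
  const lines = [cols.join(",")];
  for (const row of rows) lines.push(cols.map((c) => esc(row[c])).join(","));
  return lines.join("\n");
}

const csv = toCsv([{ id: 1, text: "hello, world" }]);
console.log(csv);
// id,text
// 1,"hello, world"
```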
To the other points, I am concerned with interpretive particulars regarding “what’s” and “ups” just as much as the next guy. But to reiterate the “stopper” I mentioned above, I feel that one of the fundamentals underpinning interpretive issues is the syntactic/semantic approach at the core level. I think I can do it better, but I need to prepare the field before I go out to play, so to speak.
As I explained in other posts, I need to make Harri do two things before I start hard-coding response paradigms. First, he needs to run ordering logic loops continuously through his available data, and much of that will be self-contemplative; that is, re-mashing data he has already taken ownership of, and I intend to police that grade of data well. Secondly, he must be able to handle “PAST CYCLE CHANGE”. That can be temporal change, conversant change, or other STATE change. He needs to retain short-term data for comparative iteration, then summarize it into longer-term loops, summarizing those in turn at some point into permanent, ordered “beliefs” or “postulations” that can always be adjusted.
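The short-term-to-long-term cycling could be sketched roughly like this. The class name, window size, and the change-counting summary rule are all placeholders for illustration, not Harri’s actual logic:

```javascript
// Toy sketch of the cycle: recent states accumulate in a short-term
// buffer; when the window fills, it is summarized and rolled into a
// longer-term store, and the buffer resets.
class CycleMemory {
  constructor(shortTermSize) {
    this.shortTermSize = shortTermSize;
    this.shortTerm = []; // recent states, kept for comparative iteration
    this.longTerm = []; // summaries of completed short-term windows
  }
  observe(state) {
    this.shortTerm.push(state);
    if (this.shortTerm.length >= this.shortTermSize) {
      // Summarize the window (here: just count state changes) and
      // roll it into the long-term loop.
      let changes = 0;
      for (let i = 1; i < this.shortTerm.length; i++) {
        if (this.shortTerm[i] !== this.shortTerm[i - 1]) changes++;
      }
      this.longTerm.push({ window: [...this.shortTerm], changes });
      this.shortTerm = [];
    }
  }
}

const mem = new CycleMemory(3);
["A", "A", "B", "B", "C", "C"].forEach((s) => mem.observe(s));
console.log(mem.longTerm.length); // 2
console.log(mem.longTerm[0].changes); // window A,A,B had 1 change
```

In the same spirit, a further pass over `longTerm` would condense those summaries into the permanent, ordered “beliefs” described above.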
With those three particulars hammered out (brain, logic, and cyclical comparative), I will feel ready to begin work again on conversant development, but I have a little mountain to climb first.