It's crazy, but some techniques for loading a KB remain pretty much the same as when I took AI at SUNY UB years and years ago. 35 years ago, I used Prolog to load up PC-IQ's vocabulary, when NLP was a dream and Eliza was the best, yet only syntactically clever. What I am doing now is eerily similar. So here is a shot for those that used Lisp or Prolog... This representation replaces the AIML that I once used, and the random phrases that I currently use in my plethora of real-world-modeled Java objects. I first used positioning, but today decided that I had to encode the knowledge that I load up. This should correlate with the inputs coming in, hopefully.

It currently works as I prototype along; RoboJango does answer my requests based upon this knowledge, and the Natural Language Generation is coming along, as I've embedded the main Adverbs, Determiners, Adjectives, Prepositions, Transitive rules, etc. into the engine so it can speak properly :-)
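For the Lisp/Prolog crowd, here is a rough sketch of how that embedded word-category knowledge could look in Java. The class and field names are my illustration only, not the actual engine:

```java
// Illustrative sketch of encoding word categories as Java objects.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

enum PartOfSpeech { NOUN, VERB, ADJECTIVE, ADVERB, DETERMINER, PREPOSITION }

class LexicalEntry {
    final String word;
    final PartOfSpeech pos;
    final boolean transitive;   // only meaningful for verbs

    LexicalEntry(String word, PartOfSpeech pos, boolean transitive) {
        this.word = word.toLowerCase();
        this.pos = pos;
        this.transitive = transitive;
    }
}

class Lexicon {
    private final Map<String, List<LexicalEntry>> entries = new HashMap<>();

    void add(LexicalEntry e) {
        entries.computeIfAbsent(e.word, k -> new ArrayList<>()).add(e);
    }

    // A word can carry several entries, e.g. "run" as noun and verb.
    List<LexicalEntry> lookup(String word) {
        return entries.getOrDefault(word.toLowerCase(), List.of());
    }
}
```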

I currently run through the current set of Java objects. These encompass all 3500 lines of AIML that I once used. I do not see degradation just yet. I figure I will load only the knowledge about what one is talking about, keep context, and when the topic changes, load that set of knowledge. I believe as I prototype along, I can keep it efficient.
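A minimal sketch of that topic-scoped loading, assuming a simple map from topic to object set (the names here are hypothetical):

```java
// Hypothetical sketch: only the objects for the active topic stay in
// the working set, and a topic change swaps the set out.
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class KnowledgeBase {
    private final Map<String, List<Object>> objectsByTopic = new HashMap<>();
    private String activeTopic;
    private List<Object> workingSet = List.of();

    void register(String topic, List<Object> objects) {
        objectsByTopic.put(topic, objects);
    }

    // Swap in a new working set only when the conversation topic changes.
    List<Object> contextFor(String topic) {
        if (!topic.equals(activeTopic)) {
            activeTopic = topic;
            workingSet = objectsByTopic.getOrDefault(topic, List.of());
        }
        return workingSet;
    }
}
```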

The JIMMIE brain already responds intelligently using request/response, but I desire real dialog, and I want to be surprised by what RoboJango says. He will do this by selecting antonyms, synonyms, verb phrases, conjunction junction, what's your function, etc....
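Synonym selection could be as simple as the sketch below; the word lists are toy examples, not the real vocabulary:

```java
// Toy sketch of synonym substitution to vary generated responses.
import java.util.List;
import java.util.Map;
import java.util.Random;

class Variator {
    private final Map<String, List<String>> synonyms = Map.of(
        "happy", List.of("glad", "cheerful", "content"),
        "big",   List.of("large", "huge", "sizable"));
    private final Random random = new Random();

    // Replace each word with a randomly chosen synonym when one exists.
    String vary(String sentence) {
        StringBuilder out = new StringBuilder();
        for (String word : sentence.split("\\s+")) {
            List<String> options = synonyms.get(word.toLowerCase());
            out.append(options == null ? word
                       : options.get(random.nextInt(options.size())))
               .append(' ');
        }
        return out.toString().trim();
    }
}
```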

Next, I plan to have him dynamically create and compile new Java objects into his realm so I don't have to code all this by hand. ImageNet will be a nice start for him to learn and associate his objects to photos and details and properties and the like....
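The JDK's built-in javax.tools compiler is one standard route for that kind of runtime compilation; a minimal sketch (the factory class itself is hypothetical):

```java
// Sketch: write generated source to disk, compile it with the JDK's
// javax.tools API, and load the resulting class.
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.nio.file.Files;
import java.nio.file.Path;

class DynamicObjectFactory {
    static Class<?> compileAndLoad(String className, String source) throws Exception {
        Path dir = Files.createTempDirectory("robojango");
        Path file = dir.resolve(className + ".java");
        Files.writeString(file, source);

        // Requires a JDK; getSystemJavaCompiler() returns null on a bare JRE.
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        int result = compiler.run(null, null, null, file.toString());
        if (result != 0) throw new IllegalStateException("compile failed");

        var loader = new java.net.URLClassLoader(
                new java.net.URL[] { dir.toUri().toURL() });
        return loader.loadClass(className);
    }
}
```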

Right now, I have a good sampling of Java objects to represent the Earth, Sky, Dogs, Cats, Hamsters, Emotions, Activities, Occupations, Teens, Roads, Cars, Trucks, etc. Each one has methods that fire off to determine the probability of its being referenced in the request. For instance, each responds to "what are you?" The object encompasses the physical-world properties, so it can answer for itself.
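In sketch form, that pattern might look like this; the method names and keyword scoring are my guesses, not the actual JIMMIE internals:

```java
// Hypothetical base class: each world-model object scores its own
// relevance to a request and can describe itself.
abstract class WorldObject {
    abstract String name();
    abstract String[] keywords();

    // Crude probability estimate: fraction of this object's keywords
    // that appear in the request.
    double relevance(String request) {
        String lower = request.toLowerCase();
        int hits = 0;
        for (String k : keywords()) {
            if (lower.contains(k)) hits++;
        }
        return keywords().length == 0 ? 0.0 : (double) hits / keywords().length;
    }

    // Every object can answer "what are you?" from its own properties.
    abstract String whatAreYou();
}

class Dog extends WorldObject {
    String name() { return "dog"; }
    String[] keywords() { return new String[] { "dog", "puppy", "canine", "bark" }; }
    String whatAreYou() { return "I am a dog, a four-legged domesticated animal."; }
}
```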

The associations, which I detailed above, lay an association layer over the physical realm of Java objects: "Jim loves his dog Luna". The app has a Java object for each word in the sentence. Now they are bound through an association map of Love. We map all our connections to emotions, basically.
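A minimal sketch of such an association map, using the "Jim loves his dog Luna" example; the types are stand-ins, not the real word objects:

```java
// Sketch: an association map keyed by relation ("Love"), binding the
// subject's and object's Java objects together.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

record Thing(String name) {}   // stand-in for a word's Java object

class AssociationMap {
    // relation name (e.g. "Love") -> list of (subject, object) bindings
    private final Map<String, List<Thing[]>> bindings = new HashMap<>();

    void bind(String relation, Thing subject, Thing object) {
        bindings.computeIfAbsent(relation, k -> new ArrayList<>())
                .add(new Thing[] { subject, object });
    }

    List<Thing[]> lookup(String relation) {
        return bindings.getOrDefault(relation, List.of());
    }
}

// Usage: map.bind("Love", new Thing("Jim"), new Thing("Luna"));
```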

I'm starting simple, Noun + Verb + Direct Object, and then will add more cases, and more cases. Hope it works well :-)
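The starting grammar is small enough to show in full; this toy version assumes a bare three-word S-V-O sentence:

```java
// Toy sketch of the starting grammar, Noun + Verb + Direct Object.
// A real parser would consult the lexicon's POS tags.
record Clause(String subject, String verb, String directObject) {}

class SimpleParser {
    // Assumes a three-word sentence in S-V-O order, e.g. "Jim loves Luna".
    static Clause parse(String sentence) {
        String[] words = sentence.replaceAll("[.?!]", "").split("\\s+");
        if (words.length != 3) {
            throw new IllegalArgumentException("only N+V+DO handled so far");
        }
        return new Clause(words[0], words[1], words[2]);
    }
}
```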

SurferJim

4 years 7 months ago


The loaded data is now more English-like. RoboJango's sentences are getting more complex, and he parses them into the proper positions for language generation. It's like AIML x 2: the NLP tags are applied to the output, so I know what I am dealing with and can parse it properly. Soon antonym and synonym replacement will occur, really mixing up the generation of random responses. Please note that each tagged word is a real Java object in JIMMIE's brain, so we are operating on objects rather than just words.
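Conceptually, the tagged output might be modeled like this, with each generated word keeping its NLP tag and a reference to the object it came from (names are illustrative):

```java
// Sketch: tagged output words that stay linked to their source objects,
// so later passes (synonym/antonym replacement, reordering) can operate
// on objects rather than raw strings.
import java.util.ArrayList;
import java.util.List;

record TaggedWord(String surface, String tag, Object source) {}

class TaggedSentence {
    final List<TaggedWord> words = new ArrayList<>();

    String render() {
        StringBuilder sb = new StringBuilder();
        for (TaggedWord w : words) sb.append(w.surface()).append(' ');
        return sb.toString().trim();
    }
}
```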

I plan to somehow arrange these objects into a self-structuring NN, so the objects fire each other, and each object has weight and function, and they aggregate to 'reason'. I will, however, need to construct the vector spaces in which to package them up, and those thoughts are still brewing :-)
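Purely speculative, but the "objects fire each other" idea could start as simply as weighted activation passing between neighbors:

```java
// Speculative sketch: each node wraps a world-object, holds a weight,
// and propagates activation to its neighbors when fired.
import java.util.ArrayList;
import java.util.List;

class ObjectNode {
    final Object wrapped;   // the underlying Java world-object
    final List<ObjectNode> neighbors = new ArrayList<>();
    double weight;
    double activation;

    ObjectNode(Object wrapped, double weight) {
        this.wrapped = wrapped;
        this.weight = weight;
    }

    // Fire: pass a share of this node's weighted activation to neighbors.
    void fire() {
        double signal = activation * weight / Math.max(1, neighbors.size());
        for (ObjectNode n : neighbors) n.activation += signal;
    }
}
```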