Small update: the 'bindings' feature in the new programming language is almost working; I'm currently ironing out the bugs. I've also written the initial code snippet responsible for decoupling input from output and putting the 'creativity' in between, though the actual layers in between haven't been written yet.
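To make that decoupling idea a bit more concrete, here's a minimal sketch in Python. This is purely illustrative; the real code is written in the bot's own language (see the listing below), and every name here is made up by me, not part of the actual system.

# Hypothetical sketch of the input/creativity/output decoupling.
def parse_input(text):
    # input side: turn raw text into some internal representation
    return {"text": text}

def creative_layers(meaning):
    # the 'creativity' in between; the actual layers still have to be written,
    # so for now this is just a pass-through placeholder
    return meaning

def render_output(meaning):
    # output side: turn the internal representation back into text
    return meaning["text"]

def handle(user_input):
    return render_output(creative_layers(parse_input(user_input)))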
Here's what the output-creation algorithm looks like at the moment (not yet tested):
/*
Activates a series of functions to modify the asset defined in CallbackIn.
Then looks up a pattern based on the value of 'CallbackIn'. CallbackIn should be an asset; the attributes of all its children together make up the pattern name.
Finally, activates the 'Output' section of the pattern that was found (which can be a topic or a rule) and returns the result that was rendered. When the input was a topic, the 'questions' section will be activated; if it was a rule, the 'conditionals/output' section will be rendered.
You can optionally specify an argument with value 'true' or 'false' to indicate whether the do-patterns also need to be executed.
Execution is done in another thread, to make certain that any variable changes made by the rendering don't affect any other parts.
*/
Cluster CreateOutput
{
   this()
   {
      var iRenderers = ^Bot.TextRenderers.*;   //get all the callbacks that need to be called before getting a pattern to render.
      foreach(var iRenderer in iRenderers)
      {
         PushValue(CallbackIn);   //need to manually add the function arguments for the callback.
         PushValue(^(iRenderer).Args);
         call(^(iRenderer).Callback);   //a renderer is an asset with children: 'Callback' (value = the cluster to call) and 'Args' (possible extra values for the callback, supplied by the user). The callback usually modifies the cluster found in CallbackIn, so pass that along as well.
      }
      var iPath;
      foreach(var iChild in CallbackIn)   //collect all the attribute values so we can build the name of the pattern to activate.
         Add(ref(iPath), GetFirstOut(iChild, Statics.Attribute));
      iPath = GetCommonParentWithMeaning(Statics.CompoundWord, iPath);   //get the compound word formed by the collected attributes; this links to the pattern.
      if(Count(iPath) > 0)
      {
         var iPattern = GetFirstIn(iPath, Statics.NameOfMember);
         if(Count(iPattern) > 0)
         {
            var iExec = New(neuroncluster);
            var iCallResult = New(neuroncluster);
            if(Count(callbackargs) > 0)
               AddChild(iExec, callbackargs);
            AddLink(iExec, iCallResult, Render);
            AddInfo(iExec, iCallResult, Render, OutputSin, PatternMatcher.SplitResults, iPattern);
            BlockedSolve(iExec);   //render on another thread (see header comment) and wait for it to finish.
            callbackout = Union(callbackout, GetChildren(iCallResult));
            Add(ref(PatternMatcher.ItemsToFreeze), GetChildren(iExec));
            Delete(iExec, iCallResult);
            return;   //it worked, let's get out of here so that no error is rendered.
         }
      }
      Error("Can't map the path '" iPath "' to a pattern");
   }
}
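For readers who don't know the language: here is a rough transliteration of that control flow into Python. It's a sketch only; the dictionaries and function names stand in for the real engine objects (renderer assets, the compound-word lookup, the threaded BlockedSolve call) and are not part of the actual API.

# Hypothetical Python rendering of CreateOutput's control flow.
def create_output(callback_in, renderers, patterns, callback_args=None):
    # 1. Let every registered renderer callback modify the input asset first.
    for renderer in renderers:
        renderer["callback"](callback_in, renderer["args"])

    # 2. Build the pattern name from the attributes of the asset's children.
    path = tuple(child["attribute"] for child in callback_in["children"])

    # 3. Look up the pattern; in the real system this goes through the
    #    compound-word network rather than a plain dictionary.
    pattern = patterns.get(path)
    if pattern is None:
        raise LookupError("Can't map the path %r to a pattern" % (path,))

    # 4. Render the pattern's output section and return the result; the real
    #    code does this on a separate thread so variable changes can't leak out.
    return pattern["render"](callback_in, callback_args)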
I'm a bit in limbo about what to do next. I need to finish these layers, add more pattern definitions (input + output), and I still need to fix the Android version.
BUT, I just found this Kaggle thing, which is soooo tempting. There are a couple of contests on there that I think I can do (within the timeframe of the contest). Some are actually fairly similar to what I did for the QA1WATT bot (which was written in something like a day, maybe two). And others are in areas that I have been preparing for for a long time (visual pattern matching, for instance).
Furthermore, with Google and Nuance getting into the personal-assistant market, I'm feeling seriously outclassed and will always have to play catch-up with them, starting with getting rid of my startup bugs and ending with a proper speech-to-text (STT) engine (that alone would take me years).
So, I think I'm first going to try to enter this competition and focus on that for a little bit (once the binding bugs are done).