General Project Discussion / Re: Reasoner.js: a framework for generalized theory synthesis
« Last post by ivan.moony on Today at 12:04:13 am »
I've been experimenting lately with symbolic algorithm synthesis. Given an input and an output, I had to figure out how to brute-force combine the given rules to transform the input into the output. It works well for a small number of steps, but eventually a combinatorial explosion happens, and I get stuck in an almost endless search that could take centuries to finally produce an answer.
And that is where neural networks seem to do a very good job: they quickly learn how to connect inputs to outputs, memorizing weights for future construction of outputs. Of course, neural networks carry an approximation error, but given enough training examples, they can work like a charm.
Basically, the two technologies should do the same thing, but with different advantages and drawbacks. The symbolic one brings certainty but is slow, while the neural one is fast but brings randomness. Is combining them really the next big thing the world is waiting for?
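To make the brute-force idea concrete, here is a minimal sketch of the kind of search I mean: a breadth-first enumeration of rule applications until the input state reaches the output state. The rules and states here are hypothetical toy examples (integer transformations), not the actual rules from Reasoner.js; the point is just to show where the combinatorial explosion comes from, since the frontier grows roughly as (number of rules)^depth.

```python
from collections import deque

def synthesize(start, goal, rules, max_depth=6):
    """Brute-force BFS over rule applications.

    rules: list of (name, fn) pairs; each fn maps a state to a new state.
    Returns a shortest list of rule names transforming start into goal,
    or None if no sequence is found within max_depth steps.
    Illustrative sketch only; the state space and rules are hypothetical.
    """
    queue = deque([(start, [])])
    seen = {start}  # avoid revisiting states, which tames (but does not cure) the blowup
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        if len(path) >= max_depth:
            continue
        for name, fn in rules:
            nxt = fn(state)
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [name]))
    return None

# Hypothetical toy rules over integers:
rules = [("double", lambda x: x * 2), ("inc", lambda x: x + 1)]
print(synthesize(3, 14, rules))  # -> ['double', 'inc', 'double']
```

With only two rules the search stays manageable, but each extra rule multiplies the branching factor, which is exactly the endless-loop behavior described above.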