Ai Dreams Forum

Member's Experiments & Projects => AI Programming => Topic started by: Gustavo6046 on December 04, 2017, 04:21:16 pm

Title: Neural networks and Markov: a new potential, with a single problem.
Post by: Gustavo6046 on December 04, 2017, 04:21:16 pm
I have a new idea: Markov chains build the sentences, but the transitions between nodes are chosen by neural networks. The maximum number of normalizable connections for each Markov node is 128. The problem is how to form the reply, where the input is the sentence that came *before*. To do that, I need a neural network that picks the next node from the current reply node AND the input sequence. Seq2seq networks are not an option.

My possible solution would be to convert a phrase into a number using an encoder neural network. E.g. with the phrase "Hello, my dear son!" we start from the inputs ["Hello", 0] → A, where A is the output of the neural network. Then we do it again, but with the word "my", so that ["my", A] → A + B. And so on, until we convert the whole phrase to A + B + C + D, where the plus sign isn't a sum, but some sort of joining that goes on inside the neural network.
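The folding idea above can be sketched in a few lines. The `join` function here is a toy hash-based stand-in for the encoder network (the real joining would be learned weights, not a hash), so only the shape of the loop matters:

```python
# Toy sketch of the encoder: fold a phrase into one number by repeatedly
# combining the next word with the running state, as in
# ["Hello", 0] -> A, ["my", A] -> A + B, and so on.

def join(word, state):
    """Stand-in for the encoder network's learned joining operation."""
    return (state * 31 + sum(ord(c) for c in word)) % (2 ** 16)

def encode(phrase):
    state = 0  # the first input pair is [first_word, 0]
    for word in phrase.split():
        state = join(word, state)  # [word, state] -> new state
    return state

code = encode("Hello my dear son")
```

In a real system `join` would be one step of a recurrent network and `state` a vector rather than a single integer; the loop structure is the same.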

That number is then passed into a decoder neural network, such that [0, A + B + C + D] → [N₁, A + B + C + D], then [N₁, A + B + C + D] → [N₂, A + B + C + D], and so on until the output is [0, A + B + C + D] again. Each Nₙ is denormalized into the word that corresponds to the nth node following the node Nₙ₋₁.
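The decoder loop can be sketched the same way. Here `next_node` is a stand-in for the trained decoder network, driven by an explicit transition table purely for illustration; 0 is read as the stop marker, matching the [0, A + B + C + D] terminator above:

```python
# Toy sketch of the decoder loop: starting at node 0, repeatedly map
# [current_node, phrase_code] to the next node until the stop marker (0)
# comes back. next_node stands in for the trained decoder network.

def make_next_node(transitions):
    """Build a stand-in decoder from an explicit transition table."""
    def next_node(current, code):
        return transitions.get((current, code), 0)  # 0 = stop
    return next_node

def decode(code, next_node, max_len=128):
    nodes = []
    current = 0
    for _ in range(max_len):
        current = next_node(current, code)  # [N, code] -> next N
        if current == 0:                    # stop marker reached
            break
        nodes.append(current)
    return nodes

# Example: a tiny table that yields nodes 1 -> 2 -> 3 for phrase code 42.
table = {(0, 42): 1, (1, 42): 2, (2, 42): 3, (3, 42): 0}
nodes = decode(42, make_next_node(table))  # -> [1, 2, 3]
```

Each returned node index would then be denormalized back into its word via the Markov node it names.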

What about you? Any better solutions or suggestions? :)
Title: Re: Neural networks and Markov: a new potential, with a single problem.
Post by: Korrelan on December 06, 2017, 11:59:44 am
Welcome Gustavo

The conversion of words to symbols/indexes could be done with a simple dictionary database.  Each word is just substituted for its numerical index/position in the dictionary and then fed into the decoder.
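That dictionary substitution fits in a few lines. This is a minimal sketch, with index 0 reserved as a stop/padding marker (my assumption, to match the 0 used elsewhere in the thread):

```python
# Sketch of the dictionary substitution: each word maps to its position
# in a vocabulary, so a sentence becomes a list of indices and back.

def build_dictionary(corpus):
    vocab = {}
    for word in corpus.split():
        if word not in vocab:
            vocab[word] = len(vocab) + 1  # 0 is reserved as a stop marker
    return vocab

def to_indices(sentence, vocab):
    return [vocab[w] for w in sentence.split() if w in vocab]

def to_words(indices, vocab):
    reverse = {i: w for w, i in vocab.items()}
    return " ".join(reverse[i] for i in indices)

vocab = build_dictionary("hello my dear son hello world")
```

Reversing the sequence for output, as described below, is then just `to_words` over the decoder's indices.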

I presume the object of the decoder is to reduce the complexity of the sentence prior to processing?

For the output of the system you would have to reverse the sequence.  The output of the processing is fed into an encoder that produces the relative dictionary indexes to form the output sentence.

Am I close lol?

 :)
Title: Re: Neural networks and Markov: a new potential, with a single problem.
Post by: Gustavo6046 on December 06, 2017, 03:01:07 pm
The thing is, the Markov chain would be a helpful guide to avoid words that wouldn't make sense in a sequence - the chain would only allow sentences that make sense.
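This "Markov as a guide" idea can be sketched as a filter over next-word candidates. The chain, trained on example text, limits which words are valid; the `score` argument stands in for the neural network that would choose among them (names and scorer are illustrative, not from the original post):

```python
import random
from collections import defaultdict

# Sketch of the Markov chain as a guide: the chain limits the next-word
# candidates, and a scorer (standing in for the neural network) picks
# among the allowed words.

def train_chain(sentences):
    chain = defaultdict(set)
    for s in sentences:
        words = s.split()
        for a, b in zip(words, words[1:]):
            chain[a].add(b)
    return chain

def next_word(current, chain, score=None):
    candidates = sorted(chain.get(current, ()))
    if not candidates:
        return None
    if score is None:                  # no network: pick at random
        return random.choice(candidates)
    return max(candidates, key=score)  # network chooses among valid words

chain = train_chain(["the cat sat", "the dog ran"])
```

Because the chain never offers an invalid transition, the network can only ever produce word sequences the chain has seen make sense.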

Anyway, what you described is a seq2seq neural network, which I unfortunately don't know how to build.
Title: Re: Neural networks and Markov: a new potential, with a single problem.
Post by: Korrelan on December 06, 2017, 03:12:42 pm
Quote
The thing is, Markov would be a helpful guide to avoid words that wouldn't make sense in a sequence - Markov would parse sentences that make sense.

Very true, it would cut out having to run the sentence through a lookup list.

It sounds like a cool idea.  If you code it, post your progress and results; I would be interested in following your progress.

 :)
Title: Re: Neural networks and Markov: a new potential, with a single problem.
Post by: 8pla.net on December 08, 2017, 10:58:26 am
I have an idea in which neural networks process the states from which Markov chains build sentences.
Title: Re: Neural networks and Markov: a new potential, with a single problem.
Post by: Korrelan on December 09, 2017, 07:21:44 pm
Well? Come on... Spill the theory...

 :)
Title: Re: Neural networks and Markov: a new potential, with a single problem.
Post by: Gustavo6046 on December 11, 2017, 11:50:31 pm
It did not go very well: the LSTMs always get addicted to one word, repeating it indefinitely ("no no no no", "argh argh argh", "mediocre mediocre mediocre mediocre", etc.). So I decided to try a slightly less generative method.

Let's say we train four sentences.


These are stored in a chain, not of words, but of sentences.

To get an answer from this bot,


That should be okay for this case, but I did not test for other cases.
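The sentence-level chain described above can be sketched as follows. The original training sentences and lookup rule were not preserved in this post, so the four prompt/reply pairs and the word-overlap fallback here are placeholders of my own, only the chain-of-sentences structure is from the post:

```python
from collections import defaultdict

# Sketch of the sentence-level chain: training sentences are linked as
# prompt -> reply pairs, and the bot answers with the reply stored for
# the best-matching prompt. The four pairs are illustrative placeholders.

def train(pairs):
    chain = defaultdict(list)
    for prompt, reply in pairs:
        chain[prompt.lower()].append(reply)
    return chain

def respond(sentence, chain):
    key = sentence.lower()
    if key in chain:
        return chain[key][0]
    # crude fallback: prompt with the most word overlap with the input
    words = set(key.split())
    best = max(chain, key=lambda p: len(words & set(p.split())), default=None)
    return chain[best][0] if best else None

chain = train([
    ("hello", "hi there"),
    ("how are you", "fine, thanks"),
    ("what is your name", "I am a bot"),
    ("bye", "see you"),
])
```

Unlike the word-level LSTM approach, this cannot loop on a single word, since whole stored sentences are retrieved rather than generated.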