Neural networks and Markov: a new potential, with a single problem.


Gustavo6046

I have a new idea: Markov chains build sentences, but the connections between nodes are chosen by neural networks. The maximum number of normalizable connections per Markov node is 128. The problem is how to form the reply when the input is the sentence that came *before*. For that, I need a neural network that predicts the next node from both the current reply node AND the input sequence. Seq2seq networks are not an option.
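In rough Python, the idea would be something like this - a minimal sketch, where score() just stands in for the neural network and all the names are invented:

Code:
import numpy as np

MAX_CONNECTIONS = 128  # max normalizable connections per Markov node

class MarkovNode:
    def __init__(self, word):
        self.word = word
        self.connections = []  # outgoing nodes, capped at MAX_CONNECTIONS

    def connect(self, node):
        if node not in self.connections and len(self.connections) < MAX_CONNECTIONS:
            self.connections.append(node)

def pick_next(node, score):
    # score(candidate) stands in for the neural network; its outputs are
    # softmax-normalized over the node's connections to pick the next word.
    raw = np.array([score(c) for c in node.connections])
    probs = np.exp(raw - raw.max())
    probs /= probs.sum()
    return np.random.choice(node.connections, p=probs)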

My possible solution would be to convert a phrase into a number using an encoder neural network. E.g., with the phrase "Hello, my dear son!" we start from the inputs ["Hello,", 0] → A, where A is the output of the neural network. Then we do it again with the word "my", so that ["my", A] → A + B. And so on, until we have converted the phrase to A + B + C + D, where the plus sign isn't a sum but some sort of joining that happens inside the neural network.
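As a runnable toy version of that folding (toy_join is a dummy stand-in; in the real thing the joining would be the encoder network itself):

Code:
def encode(words, join):
    # Fold a phrase into one number: ["Hello,", 0] -> A, ["my", A] -> A + B, ...
    # join() stands in for the encoder network; the "+" happens inside it.
    state = 0
    for word in words:
        state = join(word, state)
    return state

# Dummy join, just to make the sketch run; any deterministic mixer works here.
toy_join = lambda word, state: hash((word, state)) % 10**6

code = encode("Hello, my dear son!".split(), toy_join)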

That number is then passed into a decoder neural network, such that [0, A + B + C + D] → [N₁, A + B + C + D], then [N₁, A + B + C + D] → [N₂, A + B + C + D], and so on until it returns to [0, A + B + C + D]. Each Nₙ is denormalized into the word corresponding to the nth node that follows the node Nₙ₋₁.
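The decoding loop would then look roughly like this (next_node and denormalize are stand-ins for the decoder network and the node-to-word lookup):

Code:
def decode(code, next_node, denormalize, max_len=50):
    # Unroll [0, code] -> [N1, code] -> [N2, code] -> ... -> [0, code],
    # turning each node into its word until the stop state 0 comes back.
    node, words = 0, []
    for _ in range(max_len):
        node = next_node(node, code)  # stands in for the decoder network
        if node == 0:                 # [0, code] again means end of reply
            break
        words.append(denormalize(node))
    return " ".join(words)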

What about you? Any better solutions or suggestions? :)


korrelan

Re: Neural networks and Markov: a new potential, with a single problem.
« Reply #1 on: December 06, 2017, 11:59:44 am »
Welcome, Gustavo.

The conversion of words to symbols/indexes could be done with a simple dictionary database. Each word is just substituted for its numerical index/position in the dictionary and then fed into the decoder.

I presume the object of the decoder is to reduce the complexity of the sentence prior to processing?

For the output of the system you would have to reverse the sequence. The output of the processing is fed into an encoder that produces the relative dictionary indexes to form the output sentence.
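Both directions are then just table lookups; something like this (plain Python, dictionary contents invented):

Code:
dictionary = ["hello", "how", "are", "you", "great", "thanks"]  # invented entries
index_of = {word: i for i, word in enumerate(dictionary)}

def to_indexes(sentence):
    # Substitute each word for its numerical position in the dictionary.
    return [index_of[w] for w in sentence.lower().split()]

def to_words(indexes):
    # Reverse lookup: indexes back to an output sentence.
    return " ".join(dictionary[i] for i in indexes)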

Am I close lol?

 :)


Gustavo6046

Re: Neural networks and Markov: a new potential, with a single problem.
« Reply #2 on: December 06, 2017, 03:01:07 pm »
The thing is, the Markov chain would be a helpful guide to avoid words that wouldn't make sense in a sequence - Markov would only form sentences that make sense.
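I.e. something like this sketch, where the chain proposes the candidates and the network only ranks them (names invented):

Code:
def next_word(chain, score, current, context):
    # The Markov chain only offers words that have ever followed `current`,
    # so the network cannot pick something nonsensical; it just ranks them.
    candidates = chain[current]  # e.g. chain["how"] == ["are", "about"]
    return max(candidates, key=lambda w: score(w, context))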

Anyway, what you described there is a seq2seq neural network, which I unfortunately don't know how to build.


korrelan

Re: Neural networks and Markov: a new potential, with a single problem.
« Reply #3 on: December 06, 2017, 03:12:42 pm »
Quote
The thing is, the Markov chain would be a helpful guide to avoid words that wouldn't make sense in a sequence - Markov would only form sentences that make sense.

Very true, it would cut out having to run the sentence through a lookup list.

It sounds like a cool idea. If you code it, post your results; I would be interested in following your progress.

 :)


8pla.net

Re: Neural networks and Markov: a new potential, with a single problem.
« Reply #4 on: December 08, 2017, 10:58:26 am »
I have an idea in which neural networks process the states from which Markov chains build sentences.


korrelan

Re: Neural networks and Markov: a new potential, with a single problem.
« Reply #5 on: December 09, 2017, 07:21:44 pm »
Well? Come on... Spill the theory...

 :)


Gustavo6046

Re: Neural networks and Markov: a new potential, with a single problem.
« Reply #6 on: December 11, 2017, 11:50:31 pm »
It did not go very well: the LSTMs always got addicted to one word, repeating it indefinitely ("no no no no", "argh argh argh", "mediocre mediocre mediocre mediocre", etc.). So I decided to try a slightly less generative method.

Let's say we train on four sentences.

  • A: Hello, how are you?
  • B: Great, thanks!
  • C: Oh, great then! How about you?
  • D: Great too, thanks!

These are stored in a chain, not of words, but of sentences.

To get an answer from this bot,

  • Find the stored sentences whose most important words intersect the query the most. (In this case, A and C.)
  • Starting with the largest intersection (A), whose reply sentence is B, take the words of B with the highest TF-IDF scores, then take ALL the words between the first and the last of those important words. ("Great,")
  • Repeat for C -> D. ("Thanks!")
  • Join the fragments in the order they were processed (or in any order you find better). ("Great, thanks!")

That should be okay for this case, but I haven't tested it on other cases.
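In rough Python, the whole retrieval looks something like this (the IDF here is computed over just the four stored sentences and only stands in for a proper TF-IDF):

Code:
import math
import re

# The trained sentence pairs: query -> reply (A -> B, C -> D).
pairs = {
    "Hello, how are you?": "Great, thanks!",
    "Oh, great then! How about you?": "Great too, thanks!",
}

def tokens(s):
    return re.findall(r"[a-z']+", s.lower())

docs = [set(tokens(s)) for pair in pairs.items() for s in pair]

def idf(word):
    # Crude importance score over the stored sentences.
    return math.log(len(docs) / (1 + sum(word in d for d in docs)))

def reply(query):
    q = set(tokens(query))
    parts = []
    # Biggest word intersection first.
    for sent, resp in sorted(pairs.items(),
                             key=lambda p: -len(q & set(tokens(p[0])))):
        if not q & set(tokens(sent)):
            continue
        words = tokens(resp)
        hot = [i for i, w in enumerate(words) if idf(w) > 0]
        if hot:
            # Everything between the first and last important word.
            parts.append(" ".join(words[hot[0]:hot[-1] + 1]))
    return " ".join(parts)

print(reply("Hello! How about you?"))  # joins the salvaged fragments in order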

 

