Working on a pattern NLP


MikeB

  • Roomba
  • 10
Working on a pattern NLP
« on: September 19, 2019, 09:10:29 AM »
Hi,

My name is Mike. I like GOFAI / hard-coded, minimalistic AI. I think ML is good for visual/sound pattern detection (text to speech, visual shapes) but not for text processing...

I am working on a pattern-based NLP which converts English (and other-language) words directly into grammar symbols, which are then picked up in matched sentences. The grammar interpretation is completely unique: it encodes user-perspective (precise, astute, optimistic, holding-carrying, explanation, vivid) together with the type of word it is (person, object, moving/doing, logical, and others), for maximum intention.

Memory is symbolic, but instead of "this is this", it is "this is this about this". And "is" has a past/present/future encoding.
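For what it's worth, that kind of memory fits in a few lines. This is my own toy layout (the `Fact` class, `recall`, and the example facts are all invented for illustration, not MikeB's actual code):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fact:
    subject: str
    relation: str  # "is" with tense folded in: "was" / "is" / "will_be"
    obj: str
    topic: str     # the "about this" part

# invented example facts
memory = [
    Fact("jane", "was", "a student", "dance class"),
    Fact("jane", "is", "a dancer", "her career"),
]

def recall(subject, topic):
    # "this is this about this": look up what a subject is, on a given topic
    return [f for f in memory if f.subject == subject and f.topic == topic]
```

So `recall("jane", "her career")` returns the single fact that she *is* (present tense) a dancer, while the dance-class fact carries "was".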

I have been programming it since January. It is the Michelangelo bot on Pandora. It has no knowledge and no learning, but also no guessing, and it will generally get what you mean (but keep your expectations low).

I can post all sorts of unique insights on GOFAI, but nothing on ML...


LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • Sentinel
  • 3533
  • First it wiggles, then it is rewarded.
Re: Working on a pattern NLP
« Reply #1 on: September 19, 2019, 10:09:03 AM »
I sort of agree on the text = non-NN point, but GloVe also discovers cat = dog, and it requires a 300+ dimensional space for the amount of relational closeness, and that requires nets. And GPT-2 uses nets; try it here > https://talktotransformer.com/
GPT-2 is real. I've verified it, and you can try it yourself by making it generate novel stories about 5 things, i.e. aliens landed in a government castle using nail clippers. The issue with GloVe is you've got to store the relations to get at the next one; you can't do it on the fly. However, I have yet to verify that.
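To make the cat = dog point concrete, here's a toy cosine-similarity check on made-up 4-dimensional vectors (the numbers are invented; real GloVe vectors have 50-300 dimensions, which is the point about needing a big space):

```python
import math

# invented toy embeddings, NOT real GloVe values
vecs = {
    "cat": [0.9, 0.8, 0.1, 0.0],
    "dog": [0.8, 0.9, 0.2, 0.1],
    "car": [0.1, 0.0, 0.9, 0.8],
}

def cosine(a, b):
    # cosine similarity: dot product over the product of the vector lengths
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)
```

With these numbers, cosine(cat, dog) comes out near 1 while cosine(cat, car) is near 0, which is the kind of closeness GloVe spreads across its dimensions.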

WoM on this forum seems to do that too: the past/present/future hard codings, etc. I mean, what is all that for anyway? I have to re-read WoM's project Acuitas again. Most of it does seem needed, but some of it is a bit inflexible (maybe). You've got to be careful to let 'it' do the solution, instead of us doing it.
Emergent


LOCKSUIT
Re: Working on a pattern NLP
« Reply #2 on: September 19, 2019, 10:17:12 AM »
also

I'm currently checking out the OpenCog chainer; it seems similar to my project, and to GPT-2, for AGI creation.
Emergent


AndyGoode

  • Trusty Member
  • Mechanical Turk
  • 192
Re: Working on a pattern NLP
« Reply #3 on: September 19, 2019, 09:41:22 PM »
Quote from: MikeB
I think ML is good for visual/sound pattern (text to speech, visual shapes) detection but not for text processing ...

Correct:

Quote
(p. 36)
Classical AI
techniques are best suited for natural language processing, planning, or explicit reasoning,
whereas neural networks are best suited for lower-level perceptual processes, pattern
matching, and associative memories.

Haykin, Simon. 1994. Neural Networks: A Comprehensive Foundation. New York, New York: Macmillan College Publishing Company.

Quote from: MikeB
I am working on a pattern based NLP which converts english/other language words directly into grammar symbols, which are then picked up in matched sentences. The grammar interpretation is completely unique, and encodes user-perspective (precise, astute, optimistic, holding-carrying, explanation, vivid) with the type of word it is (person, object, moving/doing, logical, and others), for max intention.

I don't understand your description. Anyway, are you aware that there exist exactly 10 grammatical patterns in English? (One site says 9, but all other sources I've found list 10.)

https://www.globalenglishcpm.com/the-10-sentence-patterns.html
http://www.uobabylon.edu.iq/eprints/publication_10_9890_948.pdf

If I were tackling the problem of recognizing natural language, I would definitely incorporate knowledge of those 10 patterns into the system.
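As a rough illustration of what "incorporating those patterns" might look like (my own toy code; the role labels and names here are assumptions, and neither linked source specifies any implementation):

```python
# A few of the classic English sentence patterns, keyed by grammatical-role
# sequence. S = subject, V = verb, LV = linking verb, IO/DO = indirect/direct
# object, C = complement. (Toy subset, not the full list of 10.)
PATTERNS = {
    ("S", "V"): "S-V",
    ("S", "V", "O"): "S-V-O",
    ("S", "V", "IO", "DO"): "S-V-IO-DO",
    ("S", "LV", "C"): "S-LV-C",
}

def classify(roles):
    """Map a sequence of role tags to a named sentence pattern."""
    return PATTERNS.get(tuple(roles), "unknown")
```

E.g. "jane (S) stepped (V) forward" classifies as S-V; the hard part, of course, is producing the role tags in the first place.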
« Last Edit: September 20, 2019, 01:22:21 AM by AndyGoode »


goaty

  • Trusty Member
  • Starship Trooper
  • 426
Re: Working on a pattern NLP
« Reply #4 on: September 20, 2019, 03:22:23 AM »
Saying neural nets stop at classification is just more pre-disbelief in what's possible in AI... but this isn't even the point.
I can guarantee you that any form of program, be it symbolic language or whatever, can fit into a feedforward neural network; even Frogger will.
So it's just another form of not generalizing enough, and of disbelief that one can extend an idea further, when that's probably not true.


Hopefully Something

  • Trusty Member
  • Replicant
  • 646
  • whatever works
Re: Working on a pattern NLP
« Reply #5 on: September 20, 2019, 05:28:50 AM »
Yup. I believe it can be done with anything. But the most efficient way would be combining various technologies so that they complement each other. We humans are stuck with neural nets, which are better than CPUs at some things and worse at others. AI would do best with access to both, and more.


MikeB
Re: Working on a pattern NLP
« Reply #6 on: September 25, 2019, 07:14:17 AM »
Neural nets are still guessing, though? Both algorithmic and ML/NN approaches still try to match the input sentence to a known sentence?

The first step in a traditional NLP pipeline is to strip the word of all "ing/ed/s" add-ons... but this throws away the context the user embedded in the word. "ing" = travelling, moving, completing. "ed" = travelled, moved, completed.
- Travel = present tense
- Travelling = future tense
- Travelled = past tense
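A minimal sketch of that suffix-to-tense step, keeping the tense instead of discarding it (my own naive code, assuming the tense scheme above; real stemming needs many more rules and irregular-verb handling):

```python
# Suffix -> tense, per the scheme above: -ing = future, -ed = past,
# bare stem (or plain -s) = present. Naive toy rules only.
TENSE_SUFFIXES = [("ing", "future"), ("ed", "past")]

def strip_tense(word):
    """Return (stem, tense) instead of throwing the suffix away."""
    for suffix, tense in TENSE_SUFFIXES:
        # length check avoids mangling short words like "red"
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            stem = word[: -len(suffix)]
            # undo a doubled final consonant: "travell" -> "travel"
            if len(stem) >= 2 and stem[-1] == stem[-2]:
                stem = stem[:-1]
            return stem, tense
    if word.endswith("s") and len(word) > 2:
        word = word[:-1]  # crude plural/3rd-person strip
    return word, "present"
```

So "travelling" comes back as ("travel", "future") and "travelled" as ("travel", "past"), which is exactly the context a plain stemmer would have thrown away.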

Quote from: AndyGoode
I don't understand your description. Anyway, are you aware that there exist exactly 10 grammatical patterns in English? (One site says 9, but all other sources I've found list 10.)

I wasn't aware of that specific list... but short sentences like "jane stepped forward" don't have much meaning and context... The goal of my pattern NLP is basically context encoding. The same sentence with full context would be something like "jane stepped forward, during dance class, for her career", encoding the environment and her goal in life...

The other goal of the NLP is to be as compact as possible and use as little CPU as possible: no searching, no guessing, no algorithms; one-instruction memory. So everything has to be in full context, extracted from the input, and fed through a 100% correct path to a fixed output. The ultimate compact bot is one that can understand anything in the world but only outputs "yes" or "no".


MikeB
Re: Working on a pattern NLP
« Reply #7 on: September 25, 2019, 07:35:20 AM »
The "grammar perspective" sentence types are an invention to extract more context: basically, assigning all non-nouns to Precise/Astute/Optimistic/Holding-Carrying/Explanation/Vivid categories.

('Question-type' words in brackets):
Precise = present point. (Who/What/When/Where/Which)
Astute = inquisitive. (Is/Are/Am)
Optimistic = future. (Could/Would/Should/May/Can)
Holding-Carrying = an issue. (Have/Has/Was/Were)
Explanation = past story. (How/Why/If)
Vivid = present situation.

Basically, all non-noun words are fed through these "perspectives", and then sentences are made from that. E.g. "How are you" = QUESTION_EXPLAIN, DEFINE_ASTUTE, PERSON_VIVID. Now there is very little error in understanding the meaning. If more "resolution" is needed (e.g. is it a how, a why, or an if?), then the original word behind each symbol can be remembered and extracted... Super fast compared to an algorithm, but there is still a large number of symbolic sentences...
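A toy version of that lookup, built straight from the category list above (my own guess at a data layout; MikeB's real symbols like QUESTION_EXPLAIN and PERSON_VIVID carry word-type information this sketch doesn't):

```python
# Question-type words -> perspective category, per the list above.
PERSPECTIVE = {}
for words, cat in [
    (["who", "what", "when", "where", "which"], "PRECISE"),
    (["is", "are", "am"], "ASTUTE"),
    (["could", "would", "should", "may", "can"], "OPTIMISTIC"),
    (["have", "has", "was", "were"], "HOLDING"),
    (["how", "why", "if"], "EXPLAIN"),
]:
    for w in words:
        PERSPECTIVE[w] = cat

def encode(sentence):
    # Replace each known word with its perspective symbol; keep the rest
    # upper-cased, so the original word can still be recovered when more
    # "resolution" is needed.
    return [PERSPECTIVE.get(w, w.upper()) for w in sentence.lower().split()]
```

So `encode("How are you")` gives ["EXPLAIN", "ASTUTE", "YOU"]: a plain dictionary lookup per word, with no searching or guessing, which matches the compactness goal.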

 

