Anyone know of a parser like this?

*

LOCKSUIT

Re: Anyone know of a parser like this?
« Reply #15 on: July 16, 2018, 02:24:59 am »
But more importantly, does your program prune away uncommon features like "then were" in "they ate and then were all happy"?

*

spydaz

Re: Anyone know of a parser like this?
« Reply #16 on: July 16, 2018, 02:25:21 am »
The numbers refer to the node level (depth)... if I expand those nodes you will see the "STOPCHAR"; this is the end-of-sentence marker, so the path from the root node to a "StopChar" node is a complete sentence (or a "SUFFIX": all other prefixes are contained within their own suffixes).
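
A minimal sketch of that idea in Python (a dict-of-dicts word trie; STOPCHAR and add_sentence are illustrative names, not the actual Spydaz code): node depth counts words from the root, and any path that reaches a STOPCHAR node is a complete sentence.

STOPCHAR = "<STOP>"                       # end-of-sentence marker node

def add_sentence(trie, sentence):
    # Insert one sentence into a word-level prefix tree (dict of dicts).
    node = trie
    for word in sentence.lower().split():
        node = node.setdefault(word, {})  # descend, creating nodes as needed
    node[STOPCHAR] = {}                   # mark a complete sentence

trie = {}
add_sentence(trie, "the cat sat on the mat")
add_sentence(trie, "the cat chased the dog")
# trie["the"]["cat"] now branches into "sat" and "chased"; each branch ends
# at a STOPCHAR node, so root -> ... -> STOPCHAR is a full sentence.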

*

spydaz

Re: Anyone know of a parser like this?
« Reply #17 on: July 16, 2018, 02:27:58 am »
But more importantly, does your program prune away uncommon features like "then were" in "they ate and then were all happy"?

If a node's StopChar is at level 1/2 then obviously it's just a term which has been mentioned in some string...
But if a node is 2/3 nodes deep then it must contain some structural data... if a branch has many sub-branches then that branch's data must all be structurally related... and so forth,
as with the cat node containing all the different actions the cat took... and the dog node the actions the dog took... the single nodes had no sub-nodes, so they must be some subject which is being referenced by an object (note the single-item determiner "the" contains the dog and cat nodes) (noun phrase detection).
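
Continuing the dict-based trie sketch above (STOPCHAR and add_sentence as defined there), here is one hedged way to read that off the tree in Python; the depth threshold of 2 is an assumption for illustration, not Spydaz's rule:

def paths(trie, prefix=(), depth=0):
    # Walk every node, yielding its word path and depth from the root.
    for word, child in trie.items():
        yield prefix + (word,), depth + 1
        yield from paths(child, prefix + (word,), depth + 1)

trie = {}
for s in ["the cat sat", "the cat slept", "the dog barked", "gold"]:
    add_sentence(trie, s)

# StopChar at depth <= 2: a bare term mentioned in some string ("gold").
# Deeper StopChars: branches carrying structural data, e.g. everything the
# determiner "the" contains ("cat", "dog") plus the actions each one took.
bare_terms = [p for p, d in paths(trie) if p[-1] == STOPCHAR and d <= 2]
structural = [p for p, d in paths(trie) if p[-1] == STOPCHAR and d > 2]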

*

LOCKSUIT

Re: Anyone know of a parser like this?
« Reply #18 on: July 16, 2018, 02:30:39 am »
Can it connect the top node "the cat was" to one far below by a link...? Like this:

Otherwise it can't re-use them to make one bigger thing.

*

spydaz

Re: Anyone know of a parser like this?
« Reply #19 on: July 16, 2018, 02:41:12 am »
But more importantly, does your program prune away uncommon features like "then were" in "they ate and then were all happy"?

If a node's StopChar is at level 1/2 then obviously it's just a term which has been mentioned in some string...
But if a node is 2/3 nodes deep then it must contain some structural data... if a branch has many sub-branches then that branch's data must all be structurally related... and so forth,
as with the cat node containing all the different actions the cat took... and the dog node the actions the dog took... the single nodes had no sub-nodes, so they must be some subject which is being referenced by an object (note the single-item determiner "the" contains the dog and cat nodes) (noun phrase detection).

All actions under the ALL determiner would group under "all"... this is why it's useful for "merging sentences", so now a paragraph of text can have defined structure and meaningful n-grams, not brute-force n-gram creation... Remember,
we are building techniques which an AI can utilise for unsupervised learning (without training)... but which maybe can also correct itself (clean bad data) using success/failure (reinforcement learning) from self-built historical data. See the intelligent algorithm: it collects (unsupervised), scores successes and failures (probabilistic/statistical/Bayesian methods), cleans and refines its collection, and learns. Self-supervised learning? Is that algorithm maybe aware of its own errors and failures and attempting to correct itself, thereby learning and improving itself based on the experience it has collected and analysed (thinking)... deciding... Hmm, very intelligent for a simple process? Maybe?
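
A rough Python sketch of that contrast (the min_count threshold and the function name are illustrative assumptions): keep only word sequences that recur across the collected sentences, so meaningful n-grams survive while one-off snippets like "then were" fall away on their own — a crude form of the self-cleaning described above.

from collections import Counter

def frequent_ngrams(sentences, n=2, min_count=2):
    # Count every n-gram in the corpus; keep only those seen min_count+ times.
    counts = Counter()
    for s in sentences:
        words = s.lower().split()
        counts.update(tuple(words[i:i + n]) for i in range(len(words) - n + 1))
    return {gram: c for gram, c in counts.items() if c >= min_count}

corpus = ["the cat sat on the mat",
          "the cat slept on the mat",
          "they ate and then were all happy"]
print(frequent_ngrams(corpus))
# {('the', 'cat'): 2, ('on', 'the'): 2, ('the', 'mat'): 2} -- "then were" is pruned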

*

LOCKSUIT

Re: Anyone know of a parser like this?
« Reply #20 on: July 16, 2018, 02:46:41 am »
Yeah, I learnt that yesterday: nodes with deeper structure mean it isn't stuff like "were then"... it can't combine it to make "were then is like"! But wait, what if it makes "were then all happy"? Uh oh, then it thinks the lower, smaller snippet "then were" is a good thing!...

I mean, if it was "were then the gold" in "they ate but were then the gold winner and they didn't want to be", then it'd link together two bad, uncommon pieces forever... and can it go past 2/3 nodes and be a nuisance? "were then the gold winner and"... yes?...

*

spydaz

Re: Anyone know of a parser like this?
« Reply #21 on: July 16, 2018, 02:47:04 am »
Can it connect the top node "the cat was" to one far below by a link...? Like this:

Otherwise it can't re-use them to make one bigger thing.

Now, when designing an algorithm you have to set your RULES for the algorithm... when presented with your case data, it is tested for MATCHES based on your STRATEGY, then a selected action can be taken (old-time game strategy).

The inference engine:

If that is what your rules define as a "legal" move then yes you can... construct a new branch by adding the sub-node to the other sub-node at the desired point; nodes can always be copied or moved... if that is what your rules denote, then you can write a function to perform your action.
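
Read as code, that is just a rule table plus actions — a minimal Python sketch, again on the dict trie from the earlier sketches (the particular rule and the link_nodes helper are hypothetical, not Spydaz_Web internals):

def link_nodes(src, dst, key):
    # "Legal move": attach an existing sub-tree under another node by reference
    # (use dict(src) instead to copy the branch rather than share it).
    dst[key] = src

RULES = [
    # (condition on the two nodes, action to perform when it matches)
    (lambda src, dst: len(src) > 0 and STOPCHAR not in dst, link_nodes),
]

def apply_rules(src, dst, key):
    # The inference engine: test the case data for matches, then act.
    for condition, action in RULES:
        if condition(src, dst):
            action(src, dst, key)

# e.g. connect the branch under "the cat was" to a node far below it:
# apply_rules(trie["the"]["cat"]["was"], some_deeper_node, "was")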

*

LOCKSUIT

Re: Anyone know of a parser like this?
« Reply #22 on: July 16, 2018, 02:49:27 am »
oh just a fix on my pic of your pic:

*

LOCKSUIT

Re: Anyone know of a parser like this?
« Reply #23 on: July 16, 2018, 02:50:06 am »
But does *your* program combine 2 distant snippets like the red line I drew shows??

*

spydaz

Re: Anyone know of a parser like this?
« Reply #24 on: July 16, 2018, 02:54:29 am »
But does *your* program combine 2 distant snippets like the red line I drew shows??

All-seeing, all-doing, remember!

Don't forget to read the posts... diagrams can't help you... right now...

*

LOCKSUIT

Re: Anyone know of a parser like this?
« Reply #25 on: July 16, 2018, 02:56:01 am »
So what do you do if your program reads:

"We were then all tired and sat in the old lobby at the old school and then were all too cold to move."

and saves

"lobby at the old school and"

like this in your format (see image):

*

LOCKSUIT

Re: Anyone know of a parser like this?
« Reply #26 on: July 16, 2018, 02:58:10 am »
(see above)

In that image, it does have more than 2/3 nodes, a structure, yet it is a bad take from the sentence: it'll never be re-used. And yes, it got re-used by the word parts that followed, but that's all it'll ever get! It'll add tons of clutter and false real-world knowledge... slow searching... etc.!! For an AGI...

*

LOCKSUIT

Re: Anyone know of a parser like this?
« Reply #27 on: July 16, 2018, 03:02:37 am »
Oh, and is your tree saving them like the one on the left or the right?

I'd say the one on the left is really bad, but hmm, wait, thinking... if it scans many sentences and learns structure like that, um, it may learn "lobby at the", "at the old", "the old school" in its lifetime, and, and... no, this is horrible clutter...

*

LOCKSUIT

Re: Anyone know of a parser like this?
« Reply #28 on: July 16, 2018, 03:06:29 am »
Quote
In that image, it does have more than 2/3 nodes, a structure, yet it is a bad take from the sentence: it'll never be re-used. And yes, it got re-used by the word parts that followed, but that's all it'll ever get! It'll add tons of clutter and false real-world knowledge... slow searching... etc.!! For an AGI...

Also, that's bad for re-use; again, it'll never be reusing those features...

- because it's doing the left-side example, not the one on the right I show.

*

spydaz

Re: Anyone know of a parser like this?
« Reply #29 on: July 16, 2018, 03:09:19 am »
So what do you do if your program reads:

"We were then all tired and sat in the old lobby at the old school and then were all too cold to move."

and saves

"lobby at the old school and"

like this in your format (see image):

There are a few valid shapes in there, but the sentence is an irregular sentence and would need pre-processing, such as changing the "WE" to a valid noun or object/subject, as well as the "and"s being replaced by full stops... As I say, "bad data in, bad data out": there is no real data in that sentence, only snippets... perhaps pre-processing would give better results... even this irregular sentence has "missing data", so its contents could not be pre-processed without the related sentences from the paragraph... this is why a lexer / tokenizer / rule-set is required: to root out the bad data and block its collection... intelligence... smart code!
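
A hedged Python sketch of that kind of pre-processing (the pronoun substitution and the split-on-"and" rules are assumptions about what "changing the WE" and "replacing the and with full stops" might mean; the referent would have to come from the surrounding paragraph):

import re

def preprocess(sentence, referent=None):
    # Substitute "we" with a known referent, split run-on "and" clauses into
    # separate sentences, and drop clauses too short to carry any real data.
    if referent:
        sentence = re.sub(r"\bwe\b", referent, sentence, flags=re.IGNORECASE)
    clauses = [c.strip(" .") for c in re.split(r"\band\b", sentence) if c.strip()]
    return [c for c in clauses if len(c.split()) > 2]   # crude rule-set / blocker

print(preprocess("We were then all tired and sat in the old lobby at the old school "
                 "and then were all too cold to move.", referent="the students"))
# ['the students were then all tired', 'sat in the old lobby at the old school',
#  'then were all too cold to move']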

 

