anyone want to have a chat or a stab at NLP with me?

  • 40 Replies
  • 15693 Views
*

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Trusty Member
  • **********************
  • Colossus
  • *
  • 5865
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #15 on: April 19, 2015, 11:14:55 pm »
Been away for a few days, but thought I'd toss this in.

There's a difference between Translating and Understanding...a big difference.

While a blind person might not be able to "see" or have the full sensory experience of exactly what an apple is, he or she can certainly experience a very large part of it: its relative size, shape, texture, smell, taste, firmness, crunchiness, sweetness, the thick (a bit difficult to chew) skin on the outside, a firmer center core with small seeds, etc.

He or she could be told that some apples are red and some yellow depending on variety, but the color issue alone would not necessarily stop the non-sighted person from experiencing or enjoying an apple that can't be seen.

Does usage perhaps play a greater part than any "understanding" or translating?

As long as proper connections are or can be made, the rest of these "concerns" become moot.

Google's voice-enabled search usually gets it right, and there have been times I've accidentally slurred my speech and it still got it right.

I'm sure it will get better in time as well but for now, it's pretty spot on! It can also tell you the definition, usage, origin, etc., and can translate as well.

Would you expect any less of your own robot? ;) JK

In the world of AI, it's the thought that counts!

*

Don Patrick

  • Trusty Member
  • ********
  • Replicant
  • *
  • 633
    • AI / robot merchandise
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #16 on: April 20, 2015, 06:27:36 pm »
I see nobody bothered reading my last reply so I will be very short this time...

An intelligent bot should be aware of both the meaning of the sentence and its syntax and grammar properties.
No, they shouldn't (or at least, they don't have to be). Many people can talk, but they don't necessarily know how to write, or how to break down what they said into grammatical or morphological units. In fact, people were talking for a long time before grammar was invented.

Edit: This may also contribute to me not liking NLP and NLU focusing on strict grammatical rules.
You don't need to consciously know how to break a sentence down: you are nevertheless following many linguistic rules, whether you are aware of it or not. At some point you learned them, by example or in school. Where AI is concerned you have the same options: program the rules as we already know them, or expect the AI to figure out a foreign language without taking classes. I haven't seen any success with the latter yet. *edit*: except Watson, to a degree.

Your earlier post pretty much described the symbol grounding problem, a popular theory which I personally do not consider a problem. A blind person may have no concept of "green", but then do you really have a sensory understanding of "meerkat", "dark matter" or the "Higgs boson"? And does that make you incapable of thinking about these subjects intelligently?
The definition of understanding used by "symbol grounding" supporters seems to be "to have the exact same forms of knowledge representation as a human", but wouldn't that definition make it impossible for dolphins and aliens and binary circuitry to "understand" anything (like a human does)? Might as well not even try, then.

NLP is not the same as grammar, though. Context-free grammar is a relatively simple form of NLP that was popular up to the '80s, but many of the modern techniques have ditched the grammar and work with statistical properties of text instead. It's still processing of language.
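To make the contrast concrete, here is a toy sketch (my own example, not from the thread) of the two approaches: a hand-written context-free grammar licenses or rejects a sentence by rule, while the statistical approach just scores it against bigram counts from a tiny corpus.

```python
# Minimal sketch contrasting rule-based and statistical NLP (toy data).

# 1. Context-free grammar: hand-written rules, parsed recursively.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["cat"], ["mat"]],
    "V":  [["sat"], ["saw"]],
}

def parse(symbol, words):
    """Return the leftover words for every way `symbol` can match a prefix."""
    if symbol not in GRAMMAR:                      # terminal word
        return [words[1:]] if words and words[0] == symbol else []
    results = []
    for production in GRAMMAR[symbol]:
        partials = [words]
        for part in production:
            partials = [rest for p in partials for rest in parse(part, p)]
        results.extend(partials)
    return results

def grammatical(sentence):
    return any(rest == [] for rest in parse("S", sentence.split()))

# 2. Statistical: score a sentence by bigram counts from a tiny "corpus".
from collections import Counter
corpus = "the cat sat . the cat saw the mat .".split()
bigrams = Counter(zip(corpus, corpus[1:]))

def plausibility(sentence):
    words = sentence.split()
    return sum(bigrams[b] for b in zip(words, words[1:]))

print(grammatical("the cat sat"))   # True  (rule-licensed)
print(grammatical("cat the sat"))   # False
print(plausibility("the cat sat"))  # scores higher than a scrambled order
```

Neither toy scales, of course; the point is only that the second one never consults a grammar rule at all.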
CO2 retains heat. More CO2 in the air = hotter climate.

*

ranch vermin

  • Not much time left.
  • Terminator
  • *********
  • 947
  • Its nearly time!
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #17 on: April 23, 2015, 04:30:31 pm »
Symbol grounding is 'reducing difference', or you could say 'slowing it down', because your retina is always changing and your response is changing more slowly.
I claim I can ground symbols by tracking, then connecting the points together with edges, then comparing edge topologies between different things purely algorithmically, start to finish. It could be partially 3D, and it's a vector-based approach.

That I'm fairly confident in, in theory, but if you actually manage to put it together you can be a top competitor these days, because it's currently what most AI guys are working on. 2015.

NLP is more than that. I don't know how the hell you make your robot 'LITERATE', but I'm still turning over in my head what it even is to do it, even just generally.

I'm wondering if the whole sensor is actually language. Who's to say what doesn't symbolize something, even if it's internal to the robot's head, for more "self-conscious type algorithms"? Then it's members and operations on members, if you think of English as symbolic logical paradigm code, and maths as factorial code.

I'm sort of thinking about taking in commands from the sensor, but what if it's not a direct command, but something where the reaction is variable?

I actually stopped thinking about this and went back to symbol grounding, because I haven't implemented it yet, and it would put me quite far up there if I had already.

*

Snowman

  • Trusty Member
  • ****
  • Electric Dreamer
  • *
  • 145
    • minervaai.com
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #18 on: April 23, 2015, 07:24:26 pm »
Here are some random thoughts I have about this subject. Not really too deep, but it was still fun to create.
It was getting very late here, so I was starting to mumble a bit later on in the vid. Sorry about that.



*

spydaz

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 322
  • Developing Conversational AI (Natural Language/ML)
    • Spydaz_Web
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #19 on: April 23, 2015, 10:08:47 pm »
I just got back into it after a working break.
Been working on the problem, yet still, after tagging the sentence and extracting predicates, subjects, objects, articles...
the problem remains:

detecting the sentence type,
regardless of whether it is complex, compound, compound-complex, etc.

Each sentence or complete thought (which could be a paragraph)

is either a
reason, question, implication, opinion, description, advice, report, data, etc.

Detecting which type of sentence also determines which reply pathway to choose. This understanding allows for a response or no response, as sometimes people ramble on and switch subjects mid-sentence. They can be telling you some information or just making an exclamation: "I like...", or "a cat sat on the mat", a description of an event which contains data, yet not necessarily data to be saved; then again, in this case it could be a witness statement, so completely relevant. By understanding the purpose of the sentence, the data in the sentence gets a specific placement in the knowledge storage process.

Should we diagram the sentence? Or should we categorise the sentence as a thought process, or...

Questions and ideas; yet detecting sentence type or purpose is important for understanding that "the green furry cat jumps over the moon" is a gibberish sentence, and although a description, is utter "crap".
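As a rough illustration of the sentence-type detection discussed above, a rule-based classifier might start out like this (the rules and keyword lists are my own invention, not spydaz's actual code):

```python
# A crude rule-based sentence-type detector (illustrative rules only).
QUESTION_WORDS = {"who", "what", "why", "when", "where", "how"}
IMPERATIVE_STARTS = {"please", "stop", "go", "tell", "give"}

def sentence_type(sentence):
    s = sentence.strip()
    words = s.rstrip("?!.").lower().split()
    if not words:
        return "empty"
    if s.endswith("?") or words[0] in QUESTION_WORDS:
        return "question"
    if s.endswith("!"):
        return "exclamation"
    if words[0] in IMPERATIVE_STARTS:
        return "command"            # imperative: verb-first, crude word list
    if "because" in words or "therefore" in words:
        return "reason"
    return "statement"              # default: declarative / description

print(sentence_type("What are cats?"))         # question
print(sentence_type("A cat sat on the mat."))  # statement
print(sentence_type("Daddy stop!"))            # exclamation
```

A detector like this chooses the reply pathway, but it cannot by itself tell that "the green furry cat jumps over the moon" is gibberish; that needs world knowledge on top.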

*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1723
    • mind-child
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #20 on: April 23, 2015, 11:09:43 pm »
@Spydaz
Sentence types can be recognized by their syntax elements. The context in which a sentence is embedded also carries important information. For example, if you say "Ask me some question, I'm curious what I will hear from you", the answer we get is a structure that doesn't need to be answered back.

@AiDreams
If we had a natural language processing system (recognizing all subjects, predicates, actors, patients and other elements), then deriving implicit knowledge wouldn't be that hard; we would just have to implement a logical deduction mechanism. And that is something I'd like to see in chatbots: the ability to get answers as logical consequences of the inputted data. After deduction, there follows abduction (which is just inverted deduction) and induction, also not very difficult to implement once an NLP system is programmed. With all the goodies that go hand in hand with NLP, one could ask a chatbot questions like "How can artificial food be synthesized?", and if or when there is enough data in the knowledge base, we would get the answer in the form of a recipe (which is simply a chain of logical causes and consequences leading from starting objects to resulting objects). I'd also like to see an automatic learner of new languages, but English NLP is a nice start that, once solved, could provide basic material for understanding how to build a higher-level automatic universal language learner.
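The deduction step described here can be sketched as a tiny forward-chaining engine: facts plus if-then rules, applied repeatedly until nothing new follows. The "recipe" facts below are invented purely for illustration.

```python
# Minimal forward-chaining deduction: derive a chain of consequences
# from starting facts, in the spirit of the "recipe" answer above.
facts = {"have_flour", "have_water", "have_heat"}
rules = [
    ({"have_flour", "have_water"}, "have_dough"),   # if premises, then conclusion
    ({"have_dough", "have_heat"}, "have_bread"),
]

def deduce(facts, rules):
    facts = set(facts)
    changed = True
    while changed:                  # loop until a fixed point is reached
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print("have_bread" in deduce(facts, rules))  # True
```

The hard part, as later replies note, is not this loop but getting a knowledge representation good enough to feed it.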

Part of my current work involves investigating structures by which any natural or synthetic language could be described, and so far I'm satisfied with the results.

*

Don Patrick

  • Trusty Member
  • ********
  • Replicant
  • *
  • 633
    • AI / robot merchandise
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #21 on: April 23, 2015, 11:35:54 pm »
Of course purposes are very important if you want to make conversation on top of exchanging information. As Snowman says in his video, and as any grammar book will tell you, you can distinguish questions by the order "verb-subject" and statements by the order "subject-verb". Question marks, question words and intonation are also not to be ignored, as these can override the conventions of syntax.
Other sentence purposes can be distinguished by link words ("because", "for example", "although") or by the use of reporting verbs. And then there are some purposes that you cannot recognise from syntax at all, but only from context. "My car broke down." becomes an explanation only when it is preceded by "I can't come to work.". And that requires more than just linguistics: it requires the AI to investigate causes and other correlations.
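The verb-subject ordering heuristic could be sketched like this. The POS tags are hand-supplied for the example; a real system would get them from a tagger, and the tag set here is my own simplification.

```python
# Sketch of the word-order heuristic: verb before subject -> question,
# subject before verb -> statement; punctuation overrides the order.
def purpose(tagged):
    """tagged: list of (word, tag) pairs, tags like 'VERB', 'AUX', 'NOUN', 'PRON'."""
    first_verb = next((i for i, (_, t) in enumerate(tagged)
                       if t in ("VERB", "AUX")), None)
    first_subj = next((i for i, (_, t) in enumerate(tagged)
                       if t in ("NOUN", "PRON")), None)
    if tagged and tagged[-1][0] == "?":
        return "question"           # a question mark overrides syntax
    if first_verb is not None and first_subj is not None and first_verb < first_subj:
        return "question"
    return "statement"

print(purpose([("is", "AUX"), ("he", "PRON"), ("late", "ADJ")]))  # question
print(purpose([("he", "PRON"), ("is", "AUX"), ("late", "ADJ")]))  # statement
```

Link words and reporting verbs would need extra rules on top, and the context-dependent purposes ("My car broke down.") cannot be decided by this function at all.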

If you're just checking for what not to remember, then I'd suggest only storing things that come up repeatedly, e.g. earlier-mentioned subjects that are referred to or elaborated on over the span of several sentences. You could separate temporary and permanent memory. Syntax will only tell you that "the cat jumps" is a present-tense statement about a specific cat that should therefore be easy to locate, yet is nowhere to be found in context, and that this is therefore probably a theoretical statement. Knowledge and inference will tell you that cats cannot jump that distance nor breathe in space, and that the statement is therefore a lie. And psychology will tell you what the user actually wants from you.

Ivan: Remove "easily", "simply" and "not very difficult" from your description and I might agree with you. NLP and inference engines are two very different components, and in between them you also need a very good knowledge representation for them to work well together.

Here's something of an overview of all NLP techniques, in video explanations: https://class.coursera.org/nlp/lecture
« Last Edit: April 24, 2015, 08:31:37 am by Don Patrick »
CO2 retains heat. More CO2 in the air = hotter climate.

*

Snowman

  • Trusty Member
  • ****
  • Electric Dreamer
  • *
  • 145
    • minervaai.com
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #22 on: April 24, 2015, 08:31:18 am »
Here are some more thoughts. Not especially insightful but at least I'm trying :)


Entire texts should be evaluated and not just single sentences. Spydaz has been attempting to implement this type of evaluation with his NLP programming. Just as a sentence has many rules of grammar to dictate the appropriate conveyance of information, groups of sentences have their own style of grammar. And just like sentence structures, the order of multiple sentences can indicate the type of information being presented. For instance, "Why was I at school today? It was because my parents made me go." This example would not make any sense in the reverse order.

On a larger scale of text evaluation you can examine essays and even books. Large texts often follow the introduction/body/conclusion pattern. There are also a variety of styles of speeches, thesis statements, poetry, sonatas, arguments, etc.

As for evaluating the specific type of information being conveyed (i.e. reason, question, implication, opinion, etc.), sometimes we humans have to be told expressly what type of information is being presented, or perhaps we have to be expecting only a certain type. Occasionally sentence types are obvious, indicated by special words like "because". If I ask for a reason why my friend arrived late, I expect an explanation. If I ask for your opinion, that is what I expect. However, if I was expecting a fact but instead received your opinion, I may not be able to distinguish the two. Basically, labeling the type of information being presented is sometimes faulty; it can be difficult to determine the type of information being presented in any conversation.

Lots and lots of people are poor communicators... can I get an Amen! Understanding poorly presented information can be difficult or even impossible, even for a communication expert like Art O0. Basically, you shouldn't set your sights too high when creating an NLP or an inference/deduction machine. Some things are only understood through various forms of telepathy ;) , or by truly knowing the converser's personality and personal history. Just do your best and try not to be a perfectionist when making an NLP. It takes a child a very short time to learn the basics of any language, but it takes a person years or even decades to fully understand some higher forms of communication.

So we start off with simple words, then progress to basic forms of grammar. As children we have very limited context, because we understand so little about our world. Things like inference and deduction (a.k.a. logic) are barely even working. We probably spend most of our time crying or throwing tantrums because we cannot constructively deal with our emotions. This, I think, is the best place to start with a chatbot: at the beginning of human language development.

We can convey lots of information through a simple subject-verb sentence. Children are demanding: "Daddy stop." Possessive: "My toy." Insightful: "Daddy funny." We can demonstrate some conversational logic: fire is hot; hot is bad; (therefore) fire is bad. Yes, this is overly simplified, but it can be built upon. Try to convey every form of child logic and simple sentence structure to emulate a child, then attempt to build from there... see where it takes you. Try to make a child-like AI that could pass a Turing test. Anyway, it's just a thought.
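The "fire is hot, hot is bad, therefore fire is bad" chain can be emulated with a transitive lookup over a toy set of links (the data here is just the example from the post):

```python
# Child-level syllogism: follow "X is Y" links transitively.
links = {"fire": "hot", "hot": "bad"}

def implies(a, b, links):
    """True if following links from a eventually reaches b."""
    seen = set()
    while a in links and a not in seen:   # `seen` guards against cycles
        seen.add(a)
        a = links[a]
        if a == b:
            return True
    return False

print(implies("fire", "bad", links))  # True: fire -> hot -> bad
```

Real child reasoning is of course messier (properties aren't always transitive: "fire is hot" and "hot is bad" use "is" in two different senses), but as a starting point for a child-like bot the mechanism is this small.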

*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1723
    • mind-child
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #23 on: April 24, 2015, 03:17:23 pm »
Ivan: Remove "easily", "simply" and "not very difficult" from your description and I might agree with you. NLP and inference engines are two very different components, and inbetween them you also need a very good knowledge representation for them to work well together.

Yeah, that's me, I'm known for always being too optimistic. It's a feature, not a bug :)

*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1723
    • mind-child
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #24 on: April 24, 2015, 03:40:25 pm »
I follow the OpenCog group, and here and there something interesting pops up. This is one of those cuties:

https://drive.google.com/file/d/0BzvP-g2cuUrMUUZhZzhpTURWbjJ1b2RubTlBTXdERGNOWjlv/view?usp=sharing

In short, the authors describe 42 behavioral patterns in verbal communication. Maybe someone will find it interesting.

*

Don Patrick

  • Trusty Member
  • ********
  • Replicant
  • *
  • 633
    • AI / robot merchandise
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #25 on: April 24, 2015, 06:01:18 pm »
A paper on speech acts, I see: the descriptive approach of discourse analysis. I haven't found that field of research useful, but it's worth a look just to think about.

Yeah, that's me, I'm known for always being too optimistic. It's a feature, not a bug :)
Ah well, as long as it's not contagious, I can live with that.
CO2 retains heat. More CO2 in the air = hotter climate.

*

ranch vermin

  • Not much time left.
  • Terminator
  • *********
  • 947
  • Its nearly time!
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #26 on: April 24, 2015, 06:17:58 pm »
Ivan, that looks like something a Markov chain would do better with than just the letters themselves!

So, apart from that "type of statement" as a larger scope over the detail of the statement:

I just had a blast back to my old theory. Here, cop this (very silly because it's easy to come up with, and a bit of a throwback, but somehow interesting to me) idea:

What if you had an ordinary Markov chain, and it's storing its random bursts in response to what you say to it?

So now, what if you run a heap of trials using the Markov chain (which is a simulator, the most simple 1D sim you can make, and actually the only one you really see)?
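A bare-bones character-level Markov chain of the kind described might look like this (my own sketch): record which letter follows each short letter window in training text, then emit random "letter grunts" from those statistics.

```python
# Character-level Markov chain: the simplest 1D text simulator.
import random

def train(text, order=2):
    """Map each `order`-character window to the letters seen after it."""
    model = {}
    for i in range(len(text) - order):
        key = text[i:i + order]
        model.setdefault(key, []).append(text[i + order])
    return model

def babble(model, length=40, seed=None):
    """Generate gibberish that locally resembles the training text."""
    random.seed(seed)
    key = random.choice(list(model))
    out = key
    for _ in range(length):
        followers = model.get(out[-len(key):])
        if not followers:
            break                     # dead end: no recorded continuation
        out += random.choice(followers)
    return out

model = train("the cat sat on the mat and the cat ran")
print(babble(model, seed=1))
```

With order 2, every three-letter window of the output occurs somewhere in the training text, which is exactly why it locally "sounds right" but globally never has to make sense; the reward/punishment trials would have to be layered on top.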

You've got 400 or so gigahertz from your video card to trial a shitload of letter grunts out of it, plus the virtual simulation of what you're saying to it (even though it's only its gibberish version of what you said), to cycle at the phenomenal rate you need for these isotropic searches.

But would it ever start making sense?

It's a lot like being a (really stupid) dog trainer.

I guess you have to give it a present or a punishment, and it'll predict all the way to the present or the punishment (it has to sim a long way until it gets to one or the other), given so much text.

It's fully shaped by its surroundings (its surroundings being the chain database). I'm very interested to try it out.

(I'd use something like a space-barring vector machine to do it, but you have to think in letter strings; it's the same idea.)

It seems like a deficiency, not a benefit, because of course if you hand-trained it, it would be smarter. But you know what an NLP could be good for? Auto-supervising this thing!

Because NLP is so weak (right?), it'll be easy to confuse the triggers the NLP is giving it. But it could be fun, it could definitely tell people apart from each other, and lots of other things; it's actually a fully extendable idea, but just text.

I should know; I've been thinking about video sims FOR ages! And they are freaking hard, and no one's done it before. Only text has been done: http://thinkzone.wlonk.com/Gibber/GibGen.htm
« Last Edit: April 24, 2015, 10:44:43 pm by ranch vermin »

*

ranch vermin

  • Not much time left.
  • Terminator
  • *********
  • 947
  • Its nearly time!
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #27 on: April 25, 2015, 02:41:25 pm »
Here, Snowman: thanks for the video lesson, it was good and really helped me.

This thing doesn't learn from you at all, except that it's supervised with some text-group labelling system I haven't ironed out yet. It will be extremely simple and artificial, and will include what you said in your video, which was great.

Just see how braindead I am at trying to pull out an explanation. :)




[EDIT] Because I'm brute-forcing what it says, its performance gets over the problem of raw data specificity. So I was thinking, for the opponent side (what you say to it), maybe I have to do a little generalization work on the playback: instead of linking the raw words, I actually chain the intent. [/EDIT]
« Last Edit: April 25, 2015, 03:47:20 pm by ranch vermin »

*

Korrelan

  • Trusty Member
  • ***********
  • Eve
  • *
  • 1454
  • Look into my eyes! WOAH!
    • YouTube
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #28 on: April 25, 2015, 03:11:09 pm »
I’ve never tried to write an NLP before, but I would probably try something simple like this…
 
A dictionary would have to convert each word (preferably each letter, or the pixels that make up each letter) into a unique number.

The input sentences/knowledge would be added to a stack and given a unique index.
A routine would continually search the stack looking for groups/sequences of numbers that occur more than once (duplicates); when found, the combination would be added to the stack with a new index and the originally found locations replaced with that index. This method would compact the data and build a hierarchical data structure that could cover very large/complex conversation topics and sentence structures.

Indx   Words->
001 - 012 009 786 455 000 000 000 000 000
002 - 344 122 898 000 000 000 000 000 000
003 - 001 786 334 000 000 000 000 000 000 <- Note 001 refers to the top entry etc

I would then set it loose reading as much text as possible and over time each commonly used phrase would gain a unique index and be moved to the top of the stack (for speed). 
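This compaction scheme closely resembles byte-pair encoding: repeatedly replace the most frequent adjacent pair of tokens with a fresh index. Here is a sketch using word tokens in place of Korrelan's numeric dictionary (my own illustration, not his code):

```python
# BPE-style compaction: merge the most frequent adjacent pair into a
# new index, repeatedly, building a hierarchical rule table.
from collections import Counter

def compact(tokens, max_rules=10):
    rules = {}                          # new index -> the pair it stands for
    next_index = 0
    while len(rules) < max_rules:
        pairs = Counter(zip(tokens, tokens[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:
            break                       # nothing occurs more than once
        name = f"#{next_index}"
        rules[name] = pair
        next_index += 1
        merged, i = [], 0
        while i < len(tokens):          # rewrite the stack using the new index
            if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
                merged.append(name)
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens, rules

tokens, rules = compact("the cat sat on the cat mat".split())
print(tokens)   # ['#0', 'sat', 'on', '#0', 'mat']
print(rules)    # {'#0': ('the', 'cat')}
```

Because merged indexes can themselves be merged in later passes, frequent phrases end up as single indexes, which is the hierarchical structure the post describes.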

Real-world data would also be recorded in the stack, and if a blackboard (% scoring) technique was used to score the data based on the current conversation stream, a rough gist of the whole topic could be generated by feeding the triggered indexes back into the stack, so they in turn trigger other related data, etc. (like a train of thought), eventually spitting out just one index no matter how complex the conversation topic. It also provides a kind of holographic feature, where one index included in multiple lines can relate a massive amount of relevant data. Sensory data could also easily be added so it's included in the results. So could heuristics, to guide it.

I suppose each entry/line could be taken as a neuron, and the word/letter/pixel numbers as synapses.

Oh! Any rarely used or unused entries would be deleted over time.

Seems to make sense in my head… I’ll give it some more thought.
It thunk... therefore it is!...    /    Project Page    /    KorrTecx Website

*

spydaz

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 322
  • Developing Conversational AI (Natural Language/ML)
    • Spydaz_Web
Re: anyone want to have a chat or a stab at NLP with me?
« Reply #29 on: April 25, 2015, 03:26:49 pm »
The use of the Markov chain: really interesting. The Markov principle is useful, but mainly I can see that its real purpose is prediction: predicting the conversation, yet not good for determining responses (probably a wild statement).


A computer cannot know what a "thing" is, yet the ConceptNet database attempts this: knowing the who, what, why, when of something, as well as its basic descriptions and possible actions, can build a truth table about that object. This can give the computer an understanding of that object. As we know, different objects have different characteristics; in this way we can develop truths about these objects/nouns. These objects will have some sort of hierarchy: animal / vegetable / mineral / person / place...

When thinking along the lines of NLP, information needs to be collected along the way. The subject-verb-object structure gives us some of the data elements in the sentence, as well as helping to break the sentence into meaningful clauses.

"His girlfriend pats a dog" would make sense; "His girlfriend Pat's a dog." does not, as the punctuation which denotes the clause structure is incorrect. With an extra comma there would be two clauses instead of one, so the apostrophe would be considered a mistake, unless a set of rules handles corrections (such as: a pronoun begins a new clause), in which case two clauses would be detected or corrected. These extra complexities are not to be considered in initial models; only after multi-level testing would these corrections be added and tailored.

Clause structure breaks a single thought into single declaratives (information).

The logic (predicate or inference) is another consideration of the sentence-analysis process. The logic of a sentence (if/then, "ALL CATS are animals"...) gives reasoning capabilities to the sentence-analysis process, as does the "SUBJECT", "PREDICATE", "OBJECT" structure. Predicates can also be categorized to their counterparts in the hierarchy.
<person> <Born in> <Location> helps us to do a loose yet targeted data extraction (Snowball algorithm).
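A loose, Snowball-style extraction over the <person> <Born in> <Location> template could be sketched like this (the regex and sentences are my own toy examples, far simpler than the real Snowball algorithm, which bootstraps its patterns from seed pairs):

```python
# Pattern-based relation extraction over a fixed template (toy version).
import re

# Capitalised-word sequences stand in for real named-entity recognition.
PATTERN = re.compile(
    r"(?P<person>[A-Z][a-z]+(?: [A-Z][a-z]+)*) was born in (?P<location>[A-Z][a-z]+)"
)

def extract(text):
    """Return (person, relation, location) triples matched in the text."""
    return [(m["person"], "born_in", m["location"])
            for m in PATTERN.finditer(text)]

print(extract("Ada Lovelace was born in London. Alan Turing was born in London."))
# [('Ada Lovelace', 'born_in', 'London'), ('Alan Turing', 'born_in', 'London')]
```

The "loose yet targeted" quality comes from the template: it misses paraphrases ("London is the birthplace of...") but rarely extracts garbage.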

Motive and emotive analysis could also be defined as sentiment analysis, yet here it is suggested for understanding the action of the sentence, and also the tense: past, present or future. Emotion has no true part in an AI, yet for the purposes of understanding the emotion of the user, these considerations should be understood. Again, individual clauses within an individual thought can give an understanding of the sentiment of the overall text, and of whether the text refers to the future or the past.

A grammar created by programmers should be based on a simple set of programmable rules.
 
Sentence purpose enables understanding of where to file the data received, as well as knowing where to retrieve it from. Questions asked about each of the question types (who, what, why, when) also have accompanying declarative types.
 
After these multiple stages of analysis, a lot of information will have been gathered about the sentence or thought sent by the user. The correct response...

The correct response should be an answer to what the user sent, or even just a gentle confirmation of understanding what has been sent by the user:

returning the data in a newly constructed sentence based on the user's input.

What are cats?
 Cats are ......

or

What are cats? <<<< Seeking a description of a cat, or a definition:

Cats: feline, <dictionary response>
Has: 4 legs, offspring, two eyes <WordNet>
Is living, therefore can die. <logical implications>
Has location, etc.
Can do the actions: <sleep> <run> <jump> <eat>
And the list learned about the cat can go on and on, based upon the truths created about that object/noun.
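The layered answer above could be stitched together from several toy knowledge sources like this. All data here is invented for illustration; it is not an actual dictionary, WordNet or ConceptNet lookup.

```python
# Sketch: build a layered description from toy knowledge sources.
KB = {
    "cat": {
        "definition": "feline",                      # "dictionary" layer
        "has": ["4 legs", "two eyes", "offspring"],  # "WordNet"-style layer
        "is": ["living"],
        "can": ["sleep", "run", "jump", "eat"],
    },
}
IMPLICATIONS = {"living": "can die"}                 # logical-implication layer

def describe(noun):
    entry = KB.get(noun)
    if not entry:
        return f"I know nothing about {noun}."
    lines = [
        f"{noun.capitalize()}: {entry['definition']}",
        "Has: " + ", ".join(entry["has"]),
        "Can do: " + ", ".join(entry["can"]),
    ]
    for prop in entry["is"]:                         # apply implications
        if prop in IMPLICATIONS:
            lines.append(f"Is {prop}, therefore {IMPLICATIONS[prop]}.")
    return "\n".join(lines)

print(describe("cat"))
```

Each layer is independent, so new sources (locations, sensory data, learned truths) can be appended to the reply without touching the others.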




« Last Edit: April 25, 2015, 04:53:00 pm by spydaz »

 

