[Element ∊ Set] reasoning in natural language processing

  • 17 Replies
  • 5563 Views
*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1729
    • mind-child
[Element ∊ Set] reasoning in natural language processing

A lot of people here use some kind of natural language processing in their experiments, so I thought of writing an article in which I'll try to present a stripped-down version of first-order logic without quantifiers. This version of logic is relatively easy to implement as a mechanism for reasoning about sets and their elements.


Motivation

Thoroughly converting natural language sentences to full-blown first-order logic can be a tricky thing, but luckily, there exists a subset of logic that is easily recognized in natural language. This subset includes statements like:


<x> is a <X>.
<x> is a <A> or <x> is a <B>.
<x> is a <A> and <x> is not a <B>.
If <x> is a <A> and <x> is a <B> then <x> is a <C>.


Sentences like these are easy to convert to a logic language, and there exists a nice method of automated reasoning over them in the form of the resolution rule. After applying the resolution rule, we would be able to answer questions like:


Is <x> a <C>?


Even if we didn't directly state the set membership, the resolution rule is able to derive those answers in a complete way, meaning that if we don't get a required derivation, then the derivation simply doesn't exist, given a starting set of assumptions. Logicians have proven the completeness of the resolution rule, and I believe them.


A working subset of logic

Our subset of logic covered in this article will consist of:
  • and operator: &
  • or operator: |
  • not operator: ~
  • conditional if - then operator: ->
  • set-membership (element-of) predicate: ∊
This kind of logic is relatively easy to extract from natural language sentences. Here we deal only with logical operators and a single predicate: ∊ (element-of). To do complete logical reasoning, after converting natural language expressions to the described logic language, we need to convert all the logic sentences to conjunctive normal form (CNF). Keeping all the sentences internally in conjunctive normal form spares us from repeating the distributive, negation and De Morgan mumbo-jumbo over and over again. We do it only once and never again, while ensuring that sentences with different notation but the same meaning (considering the mumbo-jumbo) end up as the same CNF expression (this is called normalization).
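To make the normalization concrete, here is a minimal sketch (the nested-tuple formula encoding is my own assumption, not from the article): eliminate ->, push ~ inward with De Morgan's laws, then distribute | over &.

```python
# Formulas as nested tuples, e.g. ('->', ('∊', 'x', 'cat'), ('∊', 'x', 'mouse eater')).

def elim_imp(f):
    """Rewrite A -> B as ~A | B, recursively."""
    if isinstance(f, tuple) and f[0] == '->':
        return ('|', ('~', elim_imp(f[1])), elim_imp(f[2]))
    if isinstance(f, tuple) and f[0] in ('&', '|'):
        return (f[0], elim_imp(f[1]), elim_imp(f[2]))
    if isinstance(f, tuple) and f[0] == '~':
        return ('~', elim_imp(f[1]))
    return f

def push_neg(f):
    """Push ~ inward with De Morgan's laws; cancel double negation."""
    if isinstance(f, tuple) and f[0] == '~':
        g = f[1]
        if isinstance(g, tuple) and g[0] == '~':
            return push_neg(g[1])
        if isinstance(g, tuple) and g[0] == '&':
            return ('|', push_neg(('~', g[1])), push_neg(('~', g[2])))
        if isinstance(g, tuple) and g[0] == '|':
            return ('&', push_neg(('~', g[1])), push_neg(('~', g[2])))
        return f
    if isinstance(f, tuple) and f[0] in ('&', '|'):
        return (f[0], push_neg(f[1]), push_neg(f[2]))
    return f

def distribute(f):
    """Distribute | over & so that & ends up outermost."""
    if isinstance(f, tuple) and f[0] == '|':
        a, b = distribute(f[1]), distribute(f[2])
        if isinstance(a, tuple) and a[0] == '&':
            return distribute(('&', ('|', a[1], b), ('|', a[2], b)))
        if isinstance(b, tuple) and b[0] == '&':
            return distribute(('&', ('|', a, b[1]), ('|', a, b[2])))
        return ('|', a, b)
    if isinstance(f, tuple) and f[0] == '&':
        return ('&', distribute(f[1]), distribute(f[2]))
    return f

def to_cnf(f):
    return distribute(push_neg(elim_imp(f)))
```

Since `to_cnf` produces the same shape for equivalent notations such as A -> B and ~A | B, it doubles as the normalization step described above (full canonicalization would additionally need to sort literals within clauses).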

Once we have done the normalization, we can systematically apply the resolution rule to these sentences, thus deriving every inference that exists. For example, if we state the following:


If someone is a cat then they are a mouse eater.
Alice is a cat.


this will be converted to our logic language:


(<x> ∊ cat) -> (<x> ∊ mouse eater)
Alice ∊ cat


this, in turn, after conversion to CNF, would look like this:


(~(<x> ∊ cat) | (<x> ∊ mouse eater)) &
(Alice ∊ cat)


and finally, after applying the resolution rule to our normalized conglomerate, one of the derivations would be:


Alice ∊ mouse eater


Now it is easy to answer the question:


Is Alice a mouse eater?


And that's it. Kind of simple, isn't it? The thing is that we can do all the stunts with and / or / not / if-then logical operators in natural language, while all of it gets solved by the resolution rule. We can even detect contradictory input (a lie or a mistake), give conditional answers (if we have "~A | B" and the question "is B?", we can answer "Yes, if A"), and finally, we may ask whether a whole conglomerate of sentences is contradictory (which many scientists use when proving by the method of "proof by contradiction" - but first remember to convert the question into CNF).
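For the curious, the resolution step itself fits in a few lines. A toy sketch (propositional only, so the cat rule is shown already instantiated with Alice; handling <x> in general requires unification, which is beyond this sketch):

```python
def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literal strings;
    negation is marked with a leading '~')."""
    out = set()
    for lit in c1:
        comp = lit[1:] if lit.startswith('~') else '~' + lit
        if comp in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {comp})))
    return out

def saturate(clauses):
    """Repeatedly apply resolution until no new clauses appear."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a != b:
                    new |= resolve(a, b)
        if new <= clauses:
            return clauses
        clauses |= new

kb = {
    frozenset({'~Alice∊cat', 'Alice∊mouse eater'}),  # cat -> mouse eater, instantiated
    frozenset({'Alice∊cat'}),
}
derived = saturate(kb)
# frozenset({'Alice∊mouse eater'}) is among the derived clauses
```

Checking whether the empty clause (frozenset()) appears in the saturated set is exactly the contradiction test mentioned above.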



I find the explained combination of logical operators and the <element-of> predicate a nice little whole that does not require a lot of effort to implement, but provides a decent reasoning framework for any semantic input/output system (read: chatbots and similar creations). But if you ask me, while we are making this kind of effort, I'd also implement <subset-of> and <equals> operators in the following way:


        A ⊆ B
----------------------
(<x> ∊ A) -> (<x> ∊ B)


      A = B
------------------
A ⊆ B   &    B ⊆ A


This doesn't introduce much complication, while it converts directly to our logic subset, making such statements ready for processing by the resolution rule. But be careful when processing a natural language where the same expression might mean any of <is-element>, <is-subset>, or <is-equal>. It may require some careful thought to distinguish between these three.
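The two rewriting rules above can be applied mechanically; a tiny sketch (the clause encoding as sets of literal strings is my own assumption, with x kept as a literal variable name):

```python
def subset_clauses(a, b):
    """A ⊆ B becomes a single CNF clause: ~(x ∊ A) | (x ∊ B)."""
    return [frozenset({f'~x∊{a}', f'x∊{b}'})]

def equals_clauses(a, b):
    """A = B becomes A ⊆ B together with B ⊆ A, i.e. two clauses."""
    return subset_clauses(a, b) + subset_clauses(b, a)

# equals_clauses('cat', 'feline') yields clauses equivalent to
# (~(x ∊ cat) | (x ∊ feline)) & (~(x ∊ feline) | (x ∊ cat))
```

Both operators thus reduce to ordinary clauses, so the resolution machinery needs no changes.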



I hope that I managed to pass some inspiration to your body, your heart and your soul. And don't be melancholic if this didn't solve all of our questions. Nature took billions of years to create us, while our lifespan is only about eighty to ninety years (some say that our generation will live about a hundred years). One thing at a time, and I'm sure we will get something sane out of our research.

- ivan -
« Last Edit: May 06, 2018, 12:55:12 am by ivan.moony »

*

infurl

  • Administrator
  • ***********
  • Eve
  • *
  • 1372
  • Humans will disappoint you.
    • Home Page
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #1 on: May 05, 2018, 11:58:50 pm »
That's a great article Ivan. Apart from a couple of irrelevant typos there's only one correction that I'd like to suggest.

Quote
If someone is a cat then someone is a mouse eater.
translates to
Quote
(<x> ∊ cat) -> (<y> ∊ mouse eater)

Instead you should say
Quote
If someone is a cat then they are a mouse eater.
which stipulates that the cat and the mouse eater are the same actor.

*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1729
    • mind-child
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #2 on: May 06, 2018, 12:58:28 am »
Tx, corrected.

Unfortunately, this correction complicates natural language to logic conversion. Maybe we could introduce a controlled natural language form:
Quote
If x is a cat then x is a mouse eater.
to make things simpler.
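A controlled form like this is easy to recognize mechanically. A minimal sketch (the regex patterns are my own guesses at the controlled templates, not a real parser):

```python
import re

# Hypothetical controlled-language patterns for the templates in the article.
IF_THEN = re.compile(r'^If (\w+) is an? ([\w ]+?) then \1 is an? ([\w ]+?)\.?$')
IS_A    = re.compile(r'^(\w+) is an? ([\w ]+?)\.?$')

def parse(sentence):
    """Return a formula tuple for a controlled-language sentence, or None."""
    m = IF_THEN.match(sentence)
    if m:
        x, a, b = m.groups()
        return ('->', ('∊', x, a), ('∊', x, b))
    m = IS_A.match(sentence)
    if m:
        x, a = m.groups()
        return ('∊', x, a)
    return None
```

Anything the controlled grammar does not cover simply returns None, which is the price paid for avoiding full natural language parsing.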

*

infurl

  • Administrator
  • ***********
  • Eve
  • *
  • 1372
  • Humans will disappoint you.
    • Home Page
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #3 on: May 06, 2018, 01:15:00 am »
There has been a fair bit of research into controlled natural language to logic translation. When I Googled it just now I found a few papers. I'm interested in this one http://www.adampease.org/professional/CELT.pdf and I'm hoping to implement something like it at some stage soon.

I've already implemented a complete FOL library which handles universal and existential quantifiers and all the others that you've mentioned, and reduces everything to conjunctive normal form (not a trivial process I'll be frank) for efficient processing. As you may recall I've also completed a semantic parser to complement it. Just building the knowledge base to tie it all together now. It turns out that's the hard part. Once you've written all the software, your work is just beginning.

*

spydaz

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 325
  • Developing Conversational AI (Natural Language/ML)
    • Spydaz_Web
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #4 on: May 06, 2018, 11:45:01 am »
When designing the first-order logic module (propositional logic)...

I found that there were many forms of the IF / THEN combination:
A UNLESS IF B, NOT A UNLESS IF B... there were about 12 structures.

The conditional statements mean that A/B are often reversed. For recursing over the stored data I used a list:

1. First find all logical connections.
2. Then find the conclusion.

1. Simplify input to a logic shape (All A are B // All A are not B).
2. Save in a list.

Personally I have found the "mad scientist notation" not really relevant, just the keywords (ALL/SOME). I would suppose that determiners, if they contain a quantity, can be reduced to SOME.

There are other predicates which can also be reduced to the simple forms, but once the heavy lifting is done, whole phrases can become A/B, not just simple subjects or objects.

The knowledge base builds itself...

We are thinking along the same process, infurl.
 




*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1729
    • mind-child
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #5 on: May 06, 2018, 01:37:24 pm »
Thinking of triples, any triple can be represented as a two-place logical predicate. Thus, we would have triples floating inside and / or / not / if-then patterns, ready to do some inferences.

But logic supports predicates of any arity. Has anyone tried to replace all the n-ary logical predicates with triples? I have a feeling it is possible.

Something like:

"Alice hates Ben on Sunday"

would be a 3-ary predicate

Hates (Alice, Ben, Sunday)

I'm still not sure how to convert this predicate to a set of triples, but I think it has something to do with naming each parameter (who, whom, when).

[Edit]

Given "1" is a constant, would this be a correct way:

WhoHates (1, Alice) &
WhomHates (1, Ben) &
WhenHates (1, Sunday)

Does this notation preserve all the properties of "Hates (Alice, Ben, Sunday)" with regard to logical inference?
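The event-constant idea above can be sketched as follows (the role names who/whom/when follow the post; the fresh-constant generator and triple layout are my own assumptions). This is essentially the standard reification trick: as long as each event constant is fresh, the conjunction of role triples carries the same information as the n-ary fact.

```python
import itertools

_event_ids = itertools.count(1)

def reify(predicate, **roles):
    """Turn an n-ary fact into role triples tied to a fresh event constant,
    in the spirit of WhoHates(1, Alice) & WhomHates(1, Ben) & ..."""
    e = next(_event_ids)
    triples = [(e, 'is_a', predicate)]
    triples += [(e, role, value) for role, value in roles.items()]
    return triples

facts = reify('Hates', who='Alice', whom='Ben', when='Sunday')
# [(1, 'is_a', 'Hates'), (1, 'who', 'Alice'), (1, 'whom', 'Ben'), (1, 'when', 'Sunday')]
```

The triples then drop straight into any store that handles two-place predicates.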
« Last Edit: May 06, 2018, 02:11:03 pm by ivan.moony »

*

spydaz

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 325
  • Developing Conversational AI (Natural Language/ML)
    • Spydaz_Web
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #6 on: May 07, 2018, 11:41:56 am »
Quote
Thinking of triples, any triple can be represented as a two-place logical predicate. Thus, we would have triples floating inside and / or / not / if-then patterns, ready to do some inferences.

But logic supports predicates of any arity. Has anyone tried to replace all the n-ary logical predicates with triples? I have a feeling it is possible.

Something like:

"Alice hates Ben on Sunday"

would be a 3-ary predicate

Hates (Alice, Ben, Sunday)

I'm still not sure how to convert this predicate to a set of triples, but I think it has something to do with naming each parameter (who, whom, when).

[Edit]

Given "1" is a constant, would this be a correct way:

WhoHates (1, Alice) &
WhomHates (1, Ben) &
WhenHates (1, Sunday)

Does this notation preserve all the properties of "Hates (Alice, Ben, Sunday)" with regard to logical inference?
Let's figure it out!

hates (alice, on(sunday(ben)))   <<< maybe as the first predicate
Hates (Alice, (On Sunday (Ben)))   seems like a quad?
The event can also be Before / After / On... this time determiner is key to reconstruction (ON).
Hates (Alice, (Ben on Sunday))   <<< a triple; here you notice the triple contains a sub-triple...
Alice hates (Ben on Sunday) = (ALICE (subject)) (hates (EMOTION)) (object: BEN ON SUNDAY)   <<< contains emotion
The object triple (Ben on Sunday) contains subject / time / date:
Ben (subject), On (time/location), Sunday (date) (WHEN)

From each triple type a different structure is recognised, allowing sentence meaning to be maintained. Each Subj/Pred/Obj has its own individual meaning as a whole, but as a group the overall meaning can be discerned. As the logic shows, Alice only hates Ben on Sunday; on other days Alice may like Ben.

E.g. for the related question "Who does Alice hate?" (Ben / Ben on Sunday), the related records can be deduced:

A# Alice #Hates #B BEN
QUESTWORD + SUBJECT + PREDICATE = OBJECT
Allowing for the A marker may reveal a sub-triple (WHEN (DATE)),
allowing BEN ON SUNDAY to be revealed by searching for #BEN (logical connections)...

I often try to remember how the information will be reconstructed from interrogative queries, while also thinking generally about what would be captured by the same process given a different set of inputs. Thinking of retrieval / reconstruction from grammatically correct structures is a starting point, which can be expanded to non-conformist sentence structures or learned structures (learned probabilistic grammar).

(Actually working now)... It doesn't take long for a learned grammar to capture, although sentence formation is the next stage (with intent/modality/meaning) from the learned grammar patterns...
« Last Edit: June 26, 2018, 05:00:53 pm by spydaz »

*

spydaz

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 325
  • Developing Conversational AI (Natural Language/ML)
    • Spydaz_Web
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #7 on: May 07, 2018, 11:51:42 am »
Programming the propositional logic actually gave me headaches!

The recursive search function:
(item set)
All A are B
All B are C
All D are B
All G are B

Logical connection (B). Query B as Q:

D are B
G are B
A are B

(B = P). Query B as P:
B are C

So now we can check the logical connections for truths:
All P = Q
Q = C

D, G, A, B = C
These are the TRUTHS.
Instead of producing a truth table, the connections returned are the truth statements.
Hopefully that makes sense.

The same can be produced for all propositional logic structures, e.g. (IF A then B). The recursive search can also be used: by producing the set of logical connections, the subset can be checked for questions such as "Are D = G?" or "Are A = C?". The logical connections can be produced for each term; in the two subsets, the matching results produce the truths, for each term P/Q.
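The recursive search over "All X are Y" statements amounts to a transitive-closure walk over a graph of set inclusions; a minimal sketch (the function and variable names are my own):

```python
from collections import defaultdict

def closure(statements):
    """Given 'All A are B' pairs, derive every pair that follows by chaining
    (the transitive closure), instead of building a truth table."""
    edges = defaultdict(set)
    for a, b in statements:
        edges[a].add(b)
    derived = set()
    for start in list(edges):
        stack = [start]
        seen = set()
        while stack:
            node = stack.pop()
            for nxt in edges[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    derived.add((start, nxt))
                    stack.append(nxt)
    return derived

facts = [('A', 'B'), ('B', 'C'), ('D', 'B'), ('G', 'B')]
# closure(facts) contains ('A', 'C'), ('D', 'C'), ('G', 'C'),
# matching "D, G, A, B = C" above.
```

The derived pairs are exactly the truth statements the post describes, without enumerating a truth table.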




« Last Edit: May 08, 2018, 09:21:56 am by spydaz »

*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1729
    • mind-child
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #8 on: May 07, 2018, 12:24:24 pm »
Spydaz, what does the equality sign stand for? I'm not sure it is part of propositional logic.
« Last Edit: May 07, 2018, 12:48:27 pm by ivan.moony »

*

spydaz

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 325
  • Developing Conversational AI (Natural Language/ML)
    • Spydaz_Web
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #9 on: May 07, 2018, 01:30:28 pm »
Spydaz, what does the equality sign stand for? I'm not sure it is part of propositional logic.

There is always equals... With formal logic, the point is to determine whether a given statement is true or false given previous statements; so what is the output of the propositional calculation equal to?


D, G, A, B = C; in this case, "D, G, A, B are C" = TRUE.

In the statements there are no NOT statements or FALSE statements, or the deductive process would be more complex.
Personally I only capture TRUTH statements, although this may not always be the case...

= denotes true, whereas
!= denotes false.


Yet even
Some G are not C
All F are NOT C

could be considered truths as well.

As a starting point, simple truths are easier to handle, especially for the recursive searching process.
Such processes or rules are subjective algorithmic programming, such as:

adding a stop character to enable recognition of the end of a STRING, as used in processing long strands of DNA (pattern matching). Usually with formal logic the whole statement is taken to be true or false (modus ponens / modus tollens)... but in AI we are not so interested in these, just TRUE/FALSE and the different ways in which statements can be classified.
« Last Edit: May 08, 2018, 09:24:45 am by spydaz »

*

spydaz

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 325
  • Developing Conversational AI (Natural Language/ML)
    • Spydaz_Web
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #10 on: May 07, 2018, 02:16:54 pm »
Tx, corrected.

Unfortunately, this correction complicates natural language to logic conversion. Maybe we could introduce a controlled natural language form:
Quote
If x is a cat then x is a mouse eater.
to make things simpler.

IF (TRUTH A#) then (TRUTH #B) <<< contains 2 predicates. Each needs to be stored as an individual predicate/triple; then the condition evaluates or changes the truth of these statements by conditionally linking them as being true if the other is true.

*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1729
    • mind-child
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #11 on: May 07, 2018, 03:02:46 pm »
Quote
IF (TRUTH A#) then (TRUTH #B) <<< contains 2 predicates. Each needs to be stored as an individual predicate/triple; then the condition evaluates or changes the truth of these statements by conditionally linking them as being true if the other is true.

In compound sentences, we don't know in advance what the truth value of each predicate is. Yet in resolution inference, if we get a single predicate (an atomic sentence) as a conclusion, then we can say that the predicate alone holds true.

Otherwise, compound sentences restrict the truth values of their atomic components, so any truth value is possible other than those that yield a contradiction of the compound sentence.

I feel that you would benefit from investigating a real book on propositional calculus. Unfortunately, I can't recommend a good reference in the English language. Maybe try searching for "mathematical logic", if you are interested?

*

spydaz

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 325
  • Developing Conversational AI (Natural Language/ML)
    • Spydaz_Web
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #12 on: May 07, 2018, 05:07:49 pm »
Quote
IF (TRUTH A#) then (TRUTH #B) <<< contains 2 predicates. Each needs to be stored as an individual predicate/triple; then the condition evaluates or changes the truth of these statements by conditionally linking them as being true if the other is true.

In compound sentences, we don't know in advance what the truth value of each predicate is. Yet in resolution inference, if we get a single predicate (an atomic sentence) as a conclusion, then we can say that the predicate alone holds true.

Otherwise, compound sentences restrict the truth values of their atomic components, so any truth value is possible other than those that yield a contradiction of the compound sentence.

I feel that you would benefit from investigating a real book on propositional calculus. Unfortunately, I can't recommend a good reference in the English language. Maybe try searching for "mathematical logic", if you are interested?

English being my mother tongue... there is no need... lol.
In a previous post I remember saying: when learning predicates, their truth value is actually unknown; the choice we as humans take is that the condition is true. Yet once the information is re-heard from another source, its confidence value of truth can begin to be measured.

When I build my algorithms I also keep the overall philosophy in mind as well as the scientific aspects. The algorithms will be used by an artificial intelligence, so numerical confidence values for truths, or even for the skew of emotions, can help with probabilistic truths, just as propositional logic can provide conditional logical truth. Mr Spock may disagree, because he believes everything is quantifiable via logic; yet Captain Kirk would object, since he often proves the un-logical able to solve the same problem. This is also true of AI programming: some functions can be said to be randomly generated whereas some are planned.

The English grammatical structure is constantly changing, while propositional logic is very rigid. Statements as described will still be captured, and the sub-clauses A# / B# / C# each have an individual truth value; even though the logic of a sentence may be correct, it may be totally untrue. A rigid approach allows for creating "toy examples", but over large texts it fails, as unexpected or incomplete data is captured. When testing, you will see exactly what I mean.

It is the results which determine whether the captured data could be reconstructed or queried, or indeed means what the structure detected. There are many levels to propositional logic, and the literature often uses very simplified examples; yet as you progress, multi-clause sentences may contain multiple meanings, logical structures, predicates and emotional content. Often a series of statements are related, and often the statements are individual thoughts.

Although we are using propositional logic, we will not "really" be doing the advanced propositional calculus, as humans do not think in these terms either. Propositional calculus is also theoretical, so expanding upon the idea is pioneering and cutting-edge. (My mistake: it's formal logic, not propositional logic... it's propositional calculus.) I find that testing and experimentation, as well as the problems incurred, determine the pathway to your goal. I also try to stay true to my philosophy, incorporating the components required.

Simple calculations are possible, as well as extracting the structures described by their logic (it's just another grammar). If you were in a pure propositional-logic sentence environment, the rigid approach would be good; but in natural language not all sentences have value as propositions, though they will be captured regardless by other functions. And yet, by adding confidence values, the statements do have truth attached even if they are not captured by the propositional-logic function; the same recursive function can be used to recurse over all triples using confidence values.

If the nature of reality changes, could an AI handle it? With a rigid and formal approach, no; but with dynamic programming and dynamic learning it could adapt, as a human can.

At the end of the journey, the benefit was that the algorithm was able to re-evaluate confidence values as well as infer relationships between informally captured predicate triples, which I had previously captured using the Snowball algorithm - adding structure to previously semi-structured data. As the Snowball algorithm captures data regardless of meaning, when browsing the captured data "everything has been captured"; the question is how to retrieve or evaluate the large data-set, and the ability to perform propositional calculus with the predicates is a bonus.

I will pop it up on git soon!
« Last Edit: May 07, 2018, 05:48:02 pm by spydaz »

*

Zero

  • Eve
  • ***********
  • 1287
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #13 on: June 26, 2018, 02:40:06 pm »
Really good idea, thank you ivan.  O0

*

Zero

  • Eve
  • ***********
  • 1287
Re: [Element ∊ Set] reasoning in natural language processing
« Reply #14 on: June 28, 2018, 12:25:10 am »
x likes chocolate         =>   <x> is a <person who likes chocolate>
x is currently sleeping   =>   <x> is a <person who's sleeping now>
x bought a car            =>   <x> is a <person who bought a car>

An AIML or RiveScript bot could turn any statement into an "<x> is a <A>" statement, which would then be used to feed the engine you invented.
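A rewrite layer along these lines could be sketched as ordinary pattern rules (the patterns below are hypothetical illustrations of the three examples, not AIML/RiveScript syntax):

```python
import re

# Hypothetical rewrite rules: "<x> <verb phrase>" -> "<x> is a <person who ...>".
RULES = [
    (re.compile(r'^(\w+) likes (.+)$'), r'\1 is a <person who likes \2>'),
    (re.compile(r'^(\w+) is currently sleeping$'), r"\1 is a <person who's sleeping now>"),
    (re.compile(r'^(\w+) bought (.+)$'), r'\1 is a <person who bought \2>'),
]

def normalize(sentence):
    """Rewrite a statement into "<x> is a <A>" form where a rule matches."""
    for pattern, template in RULES:
        m = pattern.match(sentence)
        if m:
            return m.expand(template)
    return sentence
```

Every sentence the rules catch becomes a plain set-membership statement that the engine described above can consume directly.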

 

