How to convert "and" sequence to an implication...

  • 9 Replies
  • 1511 Views
*

ivan.moony

  • Trusty Member
  • ********
  • Replicant
  • *
  • 711
  • look, a star is falling
How to convert "and" sequence to an implication...
« on: May 13, 2016, 02:07:54 pm »
This might be interesting for those interested in logic. I read something similar some twenty years ago in a Croatian book and haven't seen it since, so I suppose it is fairly rare knowledge.

Here is how to extract an implication from a conjunction. Suppose we have the following first-order logic conjunction sequence:

Code: [Select]
seagull(a) &
beak(a) &
wings(a) &
bird(a) &

parrot(b) &
beak(b) &
wings(b) &
bird(b) &

kiwi(c) &
beak(c) &
bird(c) &

husky(d) &
dog(d)

Now we can apply the following rules:

1. If A always comes with B, and the absence of A always comes with the absence of B, then we can write "A <-> B", so:
Code: [Select]
bird(x) <-> beak(x)
2. If A always comes with B, but B sometimes comes without A, then we can write "A -> B", so:
Code: [Select]
wings(x) -> bird(x)
Actually, this idea follows the intuition behind the truth tables of the logic operators "->" and "<->". It should be obvious, but until someone explicitly points it out, one would probably not see it. I don't know why Wikipedia doesn't present this.
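The two rules above can be sketched in code. Here is a minimal sketch; the set-based representation of the facts (a map from each object to the set of predicates that hold for it) is my own assumption, not part of the original notation:

```python
# Facts from the conjunction above: each object maps to the
# predicates that hold for it.
facts = {
    "a": {"seagull", "beak", "wings", "bird"},
    "b": {"parrot", "beak", "wings", "bird"},
    "c": {"kiwi", "beak", "bird"},
    "d": {"husky", "dog"},
}

def implications(facts):
    """Apply rule 1 (equal extensions -> '<->') and
    rule 2 (strict subset -> '->') to every predicate pair."""
    preds = set().union(*facts.values())
    rules = []
    for p in sorted(preds):
        for q in sorted(preds):
            if p >= q:
                continue  # visit each unordered pair once
            has_p = {o for o, ps in facts.items() if p in ps}
            has_q = {o for o, ps in facts.items() if q in ps}
            if has_p == has_q:          # rule 1: same objects
                rules.append(f"{p}(x) <-> {q}(x)")
            elif has_p < has_q:         # rule 2: p implies q
                rules.append(f"{p}(x) -> {q}(x)")
            elif has_q < has_p:         # rule 2: q implies p
                rules.append(f"{q}(x) -> {p}(x)")
    return rules

for rule in implications(facts):
    print(rule)
```

With the facts above this prints, among others, "beak(x) <-> bird(x)" (every object with a beak is a bird and vice versa) and "wings(x) -> bird(x)" (the kiwi is a bird without wings, so the arrow only goes one way).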
Wherever you see a nice spot, plant another knowledge tree :favicon:

*

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Global Moderator
  • ******************
  • Hal 4000
  • *
  • 4406
Re: How to convert "and" sequence to an implication...
« Reply #1 on: May 13, 2016, 03:19:27 pm »
It would seem that B would therefore never go anywhere without A being with it.

Think of two people (A & B), then follow your logic.

The word "Always" means forever as the sun Always rises.

"Sometimes", might allow this to happen in your example.
In the world of AI, it's the thought that counts!

*

ivan.moony

  • Trusty Member
  • ********
  • Replicant
  • *
  • 711
  • look, a star is falling
Re: How to convert "and" sequence to an implication...
« Reply #2 on: May 13, 2016, 04:07:32 pm »
Yes, I agree it's a bit of a messy subject. We are talking about "what we are talking about", right? I'm already dizzy; logic seems to be a natural narcotic :D

The understanding of implication is a bit subtle:

"A ↔ B" is sometimes read "A if and only if B", and it is true in only two cases: (1) A and B are both true; (2) A and B are both false. So it is not "sometimes", but "either both or neither".

"A → B" has a slightly more complicated truth table, but basically: if A holds, then B has to hold. However, you can also have B without A. It is false in only one of the four combinations: A true and B false.

Also, we can write "A ↔ B" as "(A → B) & (B → A)".
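The truth tables, and the equivalence "A ↔ B" ≡ "(A → B) & (B → A)", can be checked mechanically over all four combinations:

```python
def implies(a, b):
    # "A -> B" is false only in the single case: A true, B false
    return (not a) or b

def iff(a, b):
    # "A <-> B" is true when both are true or both are false
    return a == b

# Verify the equivalence on every row of the truth table.
for a in (False, True):
    for b in (False, True):
        assert iff(a, b) == (implies(a, b) and implies(b, a))
        print(a, b, implies(a, b), iff(a, b))
```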

Hehe, now I understand what the expression "mad scientist" means. I'm not sure I understand myself either :) But if you give me a chance, you might get to know something new.
« Last Edit: May 13, 2016, 05:10:22 pm by ivan.moony »
Wherever you see a nice spot, plant another knowledge tree :favicon:

*

Don Patrick

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 386
    • Artificial Detective
Re: How to convert "and" sequence to an implication...
« Reply #3 on: May 13, 2016, 07:45:48 pm »
It sounds like you're trying to have a computer automatically figure out the hierarchy between objects: whether one object is of the same class as another object (e.g. parrot and seagull) or subordinate to it (parrot and beak). To build on your example: "wings" will be mentioned in texts more often than any individual type of bird mentioned alongside them. You might then presume that wings are a subcomponent of seagulls and parrots, rather than parrots being a subcomponent of wings, or parrots and wings being the same kind of thing.

In real life, though, there is no "always", so you could try a statistical approach, looking for often-coinciding words. That said, it would probably be very tricky to apply this logic to text, because writers tend not to repeat word combinations. It might work better with images, where objects coincide more consistently.
Does that sound like what you have in mind?
Personal project: NLP -> learning -> knowledge -> logical inference -> A.I.

*

ivan.moony

  • Trusty Member
  • ********
  • Replicant
  • *
  • 711
  • look, a star is falling
Re: How to convert "and" sequence to an implication...
« Reply #4 on: May 13, 2016, 09:07:26 pm »
The implication can be used for numerous purposes.

* There is a superset-subset relationship between elements.

* There is also an object-property relationship.

* Then there are function-definition uses, which could say "function → ((parameter1 & parameter2) → result)".

* Not to forget the natural-language use, when we say "if this then that"...

I don't know how many other uses logic implication can have, nor do I know exactly how to distinguish between these uses. But I think the key is to recognize some sort of hierarchy in a one-dimensional list of elements, which is what a conjunction is. When dealing with sound, we have a one-dimensional stream. When dealing with vision, we could also bring it down to a somewhat controlled eye focus that likewise streams a one-dimensional array all the time.

A statistical approach for recognizing different uses of implication sounds nice to me, and I would love to see it developed to some more formal extent. Still, I don't know where to begin...

I guess some general ontology should be maintained, starting from the day the bot is born. The more the bot knows, the easier it is to recognize things and to see how to gather new knowledge (learned constraints should leave fewer things to check). Like a kind of singularity: the more you know, the easier it is to gather new knowledge. And when the bot learns to ask the right questions, and to know where to find answers, it could gather knowledge at singularity speed.

So, how to structure this bot's ontology? Maybe I have some answers on how to begin (http://agidevlog.atspace.eu/flatboard/view.php/forum/2016-04-25161217680df). But it's just a beginning. What's missing is the statistical engine that recognizes new structures in a plain stream. Once those new structures are recognized, I think I know how to use them to recognize existing objects or actions (http://aidreams.co.uk/forum/index.php?topic=10470.0), but gathering new knowledge about unknown stuff still remains a mystery to me.
Wherever you see a nice spot, plant another knowledge tree :favicon:

*

Zero

  • Trusty Member
  • *******
  • Starship Trooper
  • *
  • 332
  • Fictional character
    • SYN CON DEV LOG
Re: How to convert "and" sequence to an implication...
« Reply #5 on: May 16, 2016, 06:17:13 pm »
Hi,

I think you could use Bayesian networks. I mean, it's not exactly about logic; it looks more like it's about what you think at time T and what you think at time T+1.

You'll never see every "bit" of a man at the same time, but you're still able to recognize a man.

I have to admit, I don't believe in ontologies, and I don't believe in pure logic. I believe in thought-trains.

When you ask a child "what is a penguin?", does he perform some kind of tree-search on his private top-level ontology? No, he just says "it's an animal".

Then ask the same child "What is a dove?", and he would probably say "it's a bird". If you look closely, there's something wrong here. The child should answer "an animal" to both questions, or "a bird" to both questions. After all, a penguin is a bird. But the child says "an animal" because the penguin doesn't feel like a bird. This is not logic, and it is no ontology. It's a thought train: what does this question lead to? What do I think at time T, and what do I think at time T+1? If time T is "what's a penguin", then time T+1 is "an animal". If time T is "what's a dove", then time T+1 is "a bird".

Obviously this is not enough. This question-answer mechanism should be embeddable in long thought sequences, where the thinker asks and answers several things without even noticing it...

So happy to be back :)

*

8pla.net

  • Trusty Member
  • *********
  • Terminator
  • *
  • 792
    • 8pla.net
Re: How to convert "and" sequence to an implication...
« Reply #6 on: May 17, 2016, 09:15:09 pm »
Code: [Select]
what (a) &
is (a) &
a (a) &
sea(a) &
gull(a) &
beak(a)&
wing(a) &
bird(a) &


Oh!  I got it!
If "sea" always comes with "gull" then we have a "seagull".
________________________________________________________________
« Last Edit: May 18, 2016, 12:59:21 am by 8pla.net »
My Very Enormous Monster Just Stopped Using Nine

*

ivan.moony

  • Trusty Member
  • ********
  • Replicant
  • *
  • 711
  • look, a star is falling
Re: How to convert "and" sequence to an implication...
« Reply #7 on: May 18, 2016, 12:17:06 pm »
8pla.net, it's about predicate logic. It's a pretty long read, but basically, predicates are bound to a number of variables, as in the following examples:

a is a seagull: Seagull (a)
a has a beak: Beak (a)
a has wings: Wings (a)
a is a bird: Bird (a)

You can say that in these examples, variables are a sort of objects and predicates are a sort of classes to which objects belong.

Further, you can even say:

a is a father of b: Father (a, b)

Now the understanding of predicates goes beyond classes. When linked with logic operators, it is possible to derive conclusions from a number of statements (implicitly contained knowledge), and that is what logic is about.

Just to get a taste of the mystery of predicate logic, here is a pretty common example of a statement:

Father (a, b) & Father (b, c) -> Grandfather (a, c)

You can express a lot of things, but there is no one-to-one relationship between natural-language sentences and predicate-logic sentences.
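The grandfather rule above can be run as a tiny forward-chaining step: join the Father facts on the shared middle variable. The names are hypothetical, purely for illustration:

```python
# Known facts: Father(a, b) means "a is the father of b".
father = {("tom", "bob"), ("bob", "ann")}

# Apply Father(a, b) & Father(b, c) -> Grandfather(a, c)
# by joining the fact set with itself on the middle person.
grandfather = {(a, c)
               for (a, b1) in father
               for (b2, c) in father
               if b1 == b2}

print(grandfather)  # {('tom', 'ann')}
```

This join over shared variables is exactly what logic-programming engines such as Prolog or Datalog do when evaluating a rule like this one.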
Wherever you see a nice spot, plant another knowledge tree :favicon:

*

8pla.net

  • Trusty Member
  • *********
  • Terminator
  • *
  • 792
    • 8pla.net
Re: How to convert "and" sequence to an implication...
« Reply #8 on: May 18, 2016, 01:01:35 pm »
Thank you ivan,

You said "a is a father of b: Father (a, b)"
So then, "AI" is a father of "ML": Father (AI, ML).

In my opinion, it seems logic may become permanent as new compound words.

Logic becomes a compound, ivan. We have many compound words which also exist as separate words.

Like "sea" and "gull" become "seagull", or "be" and "long" become "belong".

Now let's try something else:

X is natural language: (NL)
X has predicate logic: (PL)
X has artificial intelligence (AI)

NL is a father of PL: Father(NL,PL)

Father(NL,PL) & Father(PL,AI) -> Grandfather(NL,AI)

_______________________________________________________________________

« Last Edit: May 18, 2016, 01:27:08 pm by 8pla.net »
My Very Enormous Monster Just Stopped Using Nine

*

spydaz

  • Trusty Member
  • ****
  • Electric Dreamer
  • *
  • 106
How to convert "and" sequence to an implication...
« Reply #9 on: September 14, 2016, 04:06:34 am »
Mr. Moony, always interesting!

Sounds like basic linguistics.

On investigation, predicate logic has begun to explain, and give us the ability to store, information in a much more detailed way.
The notation used in linguistics is actually verbose compared to modern predicate logic:
is-a / has-a / part-of,
etc. That itself is also becoming "old", as it's just an extension of WordNet, which is evolving into description logic and OWL ontologies... All interesting concepts.

Yet for the PhD I'm leaning more towards syllogisms and formal logic... and yet it's still predicate logic...

PS: thought trains could be thought of as mini (disposable) ontologies.
Once a conclusion has been discovered, the temporary ontology should be able to be merged with the internal knowledge-base ontology...

Stats and frequency may be used to verify learned data: knowing whether the knowledge is true or false...

Comparing new information with currently held knowledge, negating or reinforcing that knowledge (as Zero has said).
Perhaps?


Sent from my iPhone using Tapatalk

 

