Symbolic AI vs Machine Learning

  • 23 Replies
  • 28888 Views
*

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • *******************
  • Prometheus
  • *
  • 4659
  • First it wiggles, then it is rewarded.
    • Main Project Thread
Re: Symbolic AI vs Machine Learning
« Reply #15 on: August 27, 2020, 12:26:30 am »
Oh, one more thing: we are talking about discovering the next word when you don't know what it is, ex. "To cure cancer, we will _. Tada!". So how can your method, franky, improve prediction? Frequency, semantics, etc. improve accuracy, and the same goes if used on vision. Can you add to that? Your 'real way' to do NLP doesn't help / exist. Explain how it tells us 'bark' usually follows 'dog' more often than 'sleep' does.

?? For example, bark is a verb, as WoM would say, so it is much more likely to follow dog than 'ghostly' would;
dog barked
dog ghostly
.....Do note, this is semantics though, we are translating.....
Emergent          https://openai.com/blog/

*

frankinstien

  • Replicant
  • ********
  • 653
    • Knowledgeable Machines
Re: Symbolic AI vs Machine Learning
« Reply #16 on: August 27, 2020, 05:29:28 am »
Quote
Oh, one more thing: we are talking about discovering the next word when you don't know what it is, ex. "To cure cancer, we will _. Tada!". So how can your method, franky, improve prediction? Frequency, semantics, etc. improve accuracy, and the same goes if used on vision. Can you add to that? Your 'real way' to do NLP doesn't help / exist. Explain how it tells us 'bark' usually follows 'dog' more often than 'sleep' does.

?? For example, bark is a verb, as WoM would say, so it is much more likely to follow dog than 'ghostly' would;
dog barked
dog ghostly
.....Do note, this is semantics though, we are translating.....

With a descriptor model it doesn't guess at what a dog can do; it knows it, or you teach it. So the dog has a descriptor with a class vector of abilities that resolves to a generalized type, which is bark, tada! The classes and their generalized types can be modified as the system learns more knowledge. The classes and generalized types are also enumerated, which makes them more easily consumable by other algorithms. If you stated something like "To cure cancer, we will _", a good system should ask questions rather than guess. So with descriptor models we can validate whether a sentence's logic is sound by looking at the conjunctions and relationships with other words, which should relate to the various properties listed in a word's descriptor profile.
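
The descriptor idea above can be sketched in a few lines of Python. This is only an illustrative toy: the `Descriptor` class, its fields, and the example abilities are my own assumptions, not Knowledgeable Machines' actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Descriptor:
    """A word's descriptor: enumerated classes of abilities/properties."""
    word: str
    abilities: set = field(default_factory=set)   # generalized types, e.g. "bark"
    properties: dict = field(default_factory=dict)

# Taught knowledge, not statistics: the dog descriptor lists what a dog can do.
dog = Descriptor("dog", abilities={"bark", "run", "sleep"})

def can_follow(subject: Descriptor, verb: str) -> bool:
    """The system doesn't guess; it looks the ability up in the descriptor."""
    return verb in subject.abilities

print(can_follow(dog, "bark"))     # True  -> "dog barked" is sound
print(can_follow(dog, "ghostly"))  # False -> "dog ghostly" is rejected
```

The point of the sketch is that validity comes from a lookup against taught knowledge rather than from corpus frequencies.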

To anticipate outcomes of live events I am developing a contingency model. I got the idea from a simulator engine I developed for some very high-profile weapon systems, the Abrams tank and the Global Hawk; oops, I just might have given myself away.  :-[  Not to give too much more away, but when dealing with complex systems and all their physics, computationally it's better not to do all the physics and literally resolve everything to events. Believe it or not, "events" are binary: they either happen or they don't. So, again, with this kind of approach there is no need to experience a gazillion events to learn something; in fact this approach can learn from one experience, just as nature does...

*

LOCKSUIT

Re: Symbolic AI vs Machine Learning
« Reply #17 on: August 27, 2020, 06:54:23 am »
It's not as simple as "it knows bark entails dog"; a brain knows what the probability is as well. If it sees "dog bark" 6 times and "dog sleep" 4 times, then it's 60% likely that bark follows and 40% for sleep.
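
The count-based prediction described above is just a normalized frequency table; a minimal sketch, using the 6/4 example from the post:

```python
from collections import Counter

# Observed counts of what followed "dog" in text
follows_dog = Counter({"bark": 6, "sleep": 4})

def next_word_probs(counts: Counter) -> dict:
    """Turn raw follow-counts into next-word probabilities."""
    total = sum(counts.values())
    return {word: c / total for word, c in counts.items()}

probs = next_word_probs(follows_dog)
print(probs["bark"])   # 0.6
print(probs["sleep"])  # 0.4
```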

You use words like classes and types, but word2vec / seq2vec can do relationships.

Hmm you say binary events, and 1 example.
1) Well, the brain can tell you how many times something occurs even if it didn't count them all. The more active, the higher the number, ex. 20, 100, 1,000, they are just puppets to shoehorn.
2) And it can remember an event that only occurs 1 time if it is frequent, related, recent, loved, short, etc.
3) It can tell you, no matter if occurred 999 times, if it did at least once, or never, the activity needed here is super low to shoehorn the dummy prop.
....When predicting the next word or visual feature, which is the main concern for AGI, it is not a binary event; there are many possible futures with different probabilities, and the one it thinks is most likely true is the highest-probability one. How many problems are binary events? Do dogs bark? Have we got to space? Did the ball go under or over the hoop? Yes, yes, yes, true, true, under, etc...... Well, the yes/no or activated word nodes are primed; then, to finally choose one, you use the corresponding activity, which makes, ex., the yes node more active than the no node!

*

frankinstien

Re: Symbolic AI vs Machine Learning
« Reply #18 on: August 27, 2020, 07:34:48 am »
Quote
It's not as simple as "it knows bark entails dog"; a brain knows what the probability is as well. If it sees "dog bark" 6 times and "dog sleep" 4 times, then it's 60% likely that bark follows and 40% for sleep.

Nobody but nobody goes around thinking: "Oh, that dog is probably barking and not sleeping, because it mostly barks 60% of the time." Or sees the word dog and thinks: "The best word that should follow is bark." Nope, most would ask: "Dog, what about dogs? Do you have one? I have several pets." THAT IS WHAT A HUMAN BEING WOULD DO, if they owned a dog(s). ::)

Quote
You use words like classes and types, but word2vec / seq2vec can do relationships.

No, word2vec deals with semantic similarities and seq2vec tries to predict sequences. You also missed the point in my post about the ontological framework that is used as well, and that is all about relationships; that is, if you know how to use one. Again, no need to mine through a gazillion examples; do it once and you're done!

Quote
Hmm you say binary events, and 1 example.
1) Well, the brain can tell you how many times something occurs even if it didn't count them all. The more active, the higher the number, ex. 20, 100, 1,000, they are just puppets to shoehorn

What? That makes no sense... :2funny:

Quote
2) And it can remember an event that only occurs 1 time if it is frequent, related, recent, loved, short, etc.

Again, you're not making sense. How can something that occurs one time still be frequent?

Quote
3) It can tell you, no matter if occurred 999 times, if it did at least once, or never, the activity needed here is super low to shoehorn the dummy prop.

So what? How is that not possible with plain old computation?

Quote
....When predicting the next word or visual feature, which is the main concern for AGI, it is not a binary event; there are many possible futures with different probabilities, and the one it thinks is most likely true is the highest-probability one. How many problems are binary events? Do dogs bark? Have we got to space? Did the ball go under or over the hoop? Yes, yes, yes, true, true, under, etc...... Well, the yes/no or activated word nodes are primed; then, to finally choose one, you use the corresponding activity, which makes, ex., the yes node more active than the no node!

You're not getting it: events are BINARY, they either happen or they don't. The door was closed or it wasn't. The dog barked or it didn't. You confuse the ability to anticipate potential outcomes from causes and effects with the actual event!

*

LOCKSUIT

Re: Symbolic AI vs Machine Learning
« Reply #19 on: August 27, 2020, 07:58:06 am »
"Nobody but nobody goes around thinking: "Oh, that dog is probably barking and not sleeping, because it mostly barks 60% of the time." Or sees the word dog and thinks: "The best word that should follow is bark." Nope, most would ask: "Dog, what about dogs? Do you have one? I have several pets." THAT IS WHAT A HUMAN BEING WOULD DO, if they owned a dog(s). ::)"

It's natural; you don't think about it, it just decides whether you say dogs sleep or dogs bark..... hehe


"Again, no need to mine through a gazillion examples; do it once and you're done!"

Huh? Where do you get your proof for this? Give me an example. Ex.: apples were sold, nails were sold -- DETECTION -- BOTH HAVE "WERE SOLD" NEXT TO THEM, SO THEY ARE A BIT SIMILAR.


"Again, you're not making sense. How can something that occurs one time still be frequent?"

I simply listed off how to keep a memory; the frequent one just didn't apply there, if you only saw something 1 time.


"You're not getting it, events are BINARY, they either happen or they don't. The door was closed or it wasn't. The dog barked or it didn't.  You confuse the ability to anticipate the potential outcomes from causes and effects with the actual event!"

OK, so maybe you are talking about the actual event, not predicting it, because it already happened. Fine. Well, this is just storing observations lol. Even the AI I made already does this. It has memory. I build a tree of events like this (each node has the number of times it occurred):

                         > ......
          30 > cat
                         > ......
88 > The           
                         > ......
          58 > dog
                         > ......
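
The tree above, where each node stores how many times its prefix occurred, can be sketched as a simple count trie. This is an illustrative toy using the post's own numbers, not the actual AI mentioned:

```python
class Node:
    """One word in the event tree, with an occurrence count and children."""
    def __init__(self):
        self.count = 0
        self.children = {}

def add_sequence(root: Node, words: list):
    """Walk the tree along the word sequence, counting each node visited."""
    node = root
    for w in words:
        node = node.children.setdefault(w, Node())
        node.count += 1

root = Node()
for _ in range(30):
    add_sequence(root, ["The", "cat"])
for _ in range(58):
    add_sequence(root, ["The", "dog"])

the = root.children["The"]
print(the.count)                   # 88 (30 + 58)
print(the.children["dog"].count)   # 58
print(the.children["cat"].count)   # 30
```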

*

frankinstien

Re: Symbolic AI vs Machine Learning
« Reply #20 on: August 27, 2020, 08:14:18 am »
Quote
Huh? Where do you get your proof for this? Give me an example. Ex.: apples were sold, nails were sold -- DETECTION -- BOTH HAVE "WERE SOLD" NEXT TO THEM, SO THEY ARE A BIT SIMILAR.

With an ontological framework, the concepts of Commerce, Trade, Economics, Business, Sales, Retail, etc. would be related to the word "SOLD". Bada bing bada boom, baby!
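
As a rough illustration of that lookup (the concept sets here are invented for the example, not drawn from any real ontology):

```python
# Hypothetical word-to-concept ontology
ontology = {
    "sold":   {"Commerce", "Trade", "Economics", "Business", "Sales", "Retail"},
    "apples": {"Food", "Fruit", "Agriculture"},
    "nails":  {"Hardware", "Construction"},
}

def shared_concepts(phrase_a: list, phrase_b: list) -> set:
    """Concepts both phrases relate to, via any of their words."""
    concepts = lambda phrase: set().union(*(ontology.get(w, set()) for w in phrase))
    return concepts(phrase_a) & concepts(phrase_b)

# "apples were sold" and "nails were sold" meet through the commerce concepts
print(shared_concepts(["apples", "sold"], ["nails", "sold"]))
```

A single ontology lookup relates the two phrases, rather than mining many co-occurrence examples.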

Do you get it NOW?   :D

*

LOCKSUIT

Re: Symbolic AI vs Machine Learning
« Reply #21 on: August 27, 2020, 08:24:33 am »
That can be done using a Neural Network.

A NN can have a representation domain node that is triggered when any of those words are triggered. Further, this concept node can link to another node, ex. gumball: representation+gumball.

*

frankinstien

Re: Symbolic AI vs Machine Learning
« Reply #22 on: August 27, 2020, 08:48:37 am »
Quote
That can be done using a Neural Network.

A NN can have a representation domain node that is triggered when any of those words are triggered. Further, this concept node can link to another node, ex. gumball: representation+gumball.

Ontological frameworks are hierarchies of nodes which also relate across domains. So you can either find a concept that directly relates to the subject, or connect to a very different concept that's linked through a common parent from a common context, which has other contexts that don't relate to the current subject matter. So we could start out talking about dogs, but because dogs are made of molecules (described in the dog's descriptor profile), the subject of the conversation could switch to quantum mechanics! Do that with an NN, HA!  :)
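
That cross-domain hop through a common parent can be sketched as an ancestor-set intersection. The hierarchy below is a made-up toy, not an actual ontological framework:

```python
# Each concept lists its parents; cross-domain hops happen through shared ancestors.
parents = {
    "dog": ["animal", "molecules"],   # the dog's descriptor says it's made of molecules
    "animal": ["biology"],
    "molecules": ["chemistry", "physics"],
    "quantum mechanics": ["physics"],
}

def ancestors(concept: str, seen=None) -> set:
    """Collect every ancestor reachable by walking parent links."""
    seen = set() if seen is None else seen
    for p in parents.get(concept, []):
        if p not in seen:
            seen.add(p)
            ancestors(p, seen)
    return seen

# "dog" and "quantum mechanics" meet at the shared ancestor "physics",
# so a conversation about dogs can legitimately hop to quantum mechanics.
print(ancestors("dog") & ancestors("quantum mechanics"))  # {'physics'}
```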

*

LOCKSUIT

Re: Symbolic AI vs Machine Learning
« Reply #23 on: August 27, 2020, 10:21:18 pm »
Quote
So we could start out talking about dogs, but because dogs are made of molecules(described in its descriptor profile) the subject of the conversation could switch to quantum mechanics! Do that with an NN, HA!  :)

Hmm, only if it follows or is it. Ex.: we zoomed in on dogs and found atoms, or took some of their atoms / dogs are atoms.

follows: a>b
is it: a=b

 

