Conceptualizing General Intelligence

  • 23 Replies
  • 771 Views
*

Hopefully Something

  • Trusty Member
  • Autobot
  • 243
  • So where are these cookies?
Conceptualizing General Intelligence
« on: January 09, 2019, 06:10:05 pm »
General intelligence is a model of reality on a perpetual fall into the future: a self-predicting system which operates within its own blurry sphere of influence, with the "self" being perceived as the extent of that sphere of influence.

Yes? No? Got a different take on it?
« Last Edit: January 09, 2019, 07:51:50 pm by Hopefully Something »

*

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • Deep Blue
  • 2656
  • First it wiggles, then it is rewarded.
Re: Conceptualizing General Intelligence
« Reply #1 on: January 10, 2019, 01:36:00 am »
Yes, except for the 'self' part. We are machines; we do self-imitation (prediction) and like to think we are more, and surely the 'sensor' likes it when the robot says it is happy. But there is no self, unless you mean thinking in text/vision knowledge; knowledge can describe any concept. Text is general.

General intelligence is the ability to do what humans can do across a wide spectrum of domains, not narrow single tasks like playing Snake. General intelligence, like humans, must change Earth, and that requires the GI to know about the Earth, have goals, and generate good plans quickly without trying every possible combination of words, then carry them out by acting (informing us, or moving its body).

To change Earth, you have got to have knowledge, you have got to have that desire/flame, and then the idea burns bright in the mind of man, and you see it come to fruition.
Emergent

*

Korrelan

  • Trusty Member
  • Millennium Man
  • 1235
  • Look into my eyes! WOAH!
    • YouTube
Re: Conceptualizing General Intelligence
« Reply #2 on: January 11, 2019, 05:07:20 pm »
@HS, nice definition… my first attempt would be…

The ability to adapt/predict an internal/external pattern based on the recognition of internal/external patterns in general terms, in order to solve a problem relative to its environment and/or survival.

 :)
It thunk... therefore it is!... my project page.  WEB SITE

*

ivan.moony

  • Trusty Member
  • Millennium Man
  • 1193
    • Some of my projects
Re: Conceptualizing General Intelligence
« Reply #3 on: January 11, 2019, 05:19:23 pm »
Artificial intelligence has nothing to do without living beings. Without them it could only predict some future states of the Universe or its parts, but what would it really do? Only with living beings come problems to solve: reaching some set of states that is more desirable than other setups of the Universe.
Dream big. The bigger the dream is, the more beautiful place the world becomes.

*

Hopefully Something

Re: Conceptualizing General Intelligence
« Reply #4 on: January 11, 2019, 06:21:14 pm »
Quote
@HS, nice definition… my first attempt would be…

The ability to adapt/predict an internal/external pattern based on the recognition of internal/external patterns in general terms, in order to solve a problem relative to its environment and/or survival.

 :)

That's a very Korr definition :) Very defined/precise.

Quote
Artificial intelligence has nothing to do without living beings. Without them it could only predict some future states of the Universe or its parts, but what would it really do without living beings? Only with living beings come problems to solve, reaching some set of states that is more desirable than some other Universe setups.

Who would have guessed that problems make life worth living? Nice thing to realize.

Edit: Actually, I noticed this back when I used to get bored out of my mind on summer vacations during my school years.

*

Korrelan

Re: Conceptualizing General Intelligence
« Reply #5 on: January 11, 2019, 09:01:08 pm »
Quote
Artificial intelligence has nothing to do without living beings.

But what if the AGI itself is self-aware, conscious and... alive?

 :)

*

ivan.moony

Re: Conceptualizing General Intelligence
« Reply #6 on: January 11, 2019, 09:24:59 pm »
Quote
Artificial intelligence has nothing to do without living beings.

But what if the AGI itself is self-aware, conscious and... alive.

 :)

Self-aware, conscious and... alive? You mean it would feel real emotions, not virtual byte representations? If we poked it, it wouldn't just say "ouch", it would also feel real pain, like we do?

Then it would make sense for such an AGI to be occupied with survival.

To make such an AGI, we'd need some answers about the phenomenon of life first. But how do we get those answers without experimenting on innocent beings? I wouldn't dare to conduct such experiments without some explicit answers from the very God himself. And you know how hard he tries to hide himself, whatever his reasons.

Maybe we are not mature enough as a species. Maybe we can hurt him by being irresponsible. And maybe he is just not up to having real children with us.

But maybe we can gain some faith by creating some decent simulations. If the concept proves worthy of being inhabited by real life, who knows, maybe he'll answer our prayers, and we'll finally hear a word or two from him.

A long way to go, anyway. I doubt we will witness it in our lifetimes. And what if it finally takes a death to create a life? Would we be up to it?
« Last Edit: January 11, 2019, 10:13:45 pm by ivan.moony »

*

LOCKSUIT

Re: Conceptualizing General Intelligence
« Reply #7 on: January 11, 2019, 10:07:51 pm »
Survival = seeking immortality, like Locksuit. Locksuit not dumb, see! :D ...

So, when I eat food, that's a problem I enjoy solving? Cool! Problems are good! ... But I don't want burglar problems!


"The ability to adapt/predict an internal/external pattern based on the recognition of internal/external patterns in general terms, in order to solve a problem relative to its environment and/or survival."

Yes, AGI will be installed with goals/desires, like its Master: to stop death & pain first. It will be taught knowledge through a medium like Text, as a child is, learning all about the world from DNA to code. It will have an internet connection 24/7, predict sequences of action, and talk to us...

*

LOCKSUIT

Re: Conceptualizing General Intelligence
« Reply #8 on: January 11, 2019, 10:16:06 pm »
AGI has to figure out the future.

It has to do Sequence Prediction.

And tell us the answers.
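LOCKSUIT's "sequence prediction" can be made concrete with a toy model. A minimal sketch, assuming a simple bigram (Markov-chain) predictor over words; the corpus and function names are my own illustration, not anyone's actual project:

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count, for each token, which tokens follow it and how often."""
    model = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, token):
    """Return the most frequent continuation of `token`, or None if unseen."""
    followers = model.get(token)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

corpus = "the cat sat on the mat and the cat slept".split()
model = train_bigram(corpus)
print(predict_next(model, "the"))  # 'cat' ('cat' follows 'the' twice, 'mat' once)
```

Real sequence predictors are vastly more capable, but the shape is the same: learn statistics of the past, emit the likely next step.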

*

LOCKSUIT

Re: Conceptualizing General Intelligence
« Reply #9 on: January 11, 2019, 10:20:01 pm »
AGI is going to be a little researcher nut making discoveries. Putting 1 and 2 together and seeing the connections.

*

ivan.moony

Re: Conceptualizing General Intelligence
« Reply #10 on: January 11, 2019, 10:22:28 pm »
Quote
AGI is going to be a little researcher nut making discoveries. Putting 1 and 2 together and seeing the connections.

That sounds like you, Lock  :D

*

LOCKSUIT

Re: Conceptualizing General Intelligence
« Reply #11 on: January 11, 2019, 10:23:11 pm »
I agree, I too notice a little paradox here, a loop... my job is going to be taken!

*

LOCKSUIT

Re: Conceptualizing General Intelligence
« Reply #12 on: January 11, 2019, 10:27:05 pm »
Sequence prediction requires a Generator & a Validator.

To validate things it is told, or things it generates, it has to reflect the input against what it knows to be true. Old knowledge is used as the validator. Convincing.

Old knowledge is also used as the generator.
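As a toy illustration of that split (the 'knowledge base' here is entirely made up by me): old knowledge generates candidate statements, and the same store validates statements the system is told.

```python
# Hypothetical mini knowledge base of known-true facts.
KNOWN_FACTS = {
    ("fire", "is", "hot"),
    ("ice", "is", "cold"),
    ("water", "is", "wet"),
}

def generate(subject):
    """Generator: propose statements about `subject` from old knowledge."""
    return [fact for fact in KNOWN_FACTS if fact[0] == subject]

def validate(statement):
    """Validator: accept a statement only if it agrees with old knowledge."""
    return statement in KNOWN_FACTS

print(generate("fire"))                 # [('fire', 'is', 'hot')]
print(validate(("ice", "is", "cold")))  # True
print(validate(("ice", "is", "hot")))   # False
```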

*

ivan.moony

  • Trusty Member
  • **********
  • Millennium Man
  • *
  • 1193
    • Some of my projects
Re: Conceptualizing General Intelligence
« Reply #13 on: January 11, 2019, 10:33:55 pm »
Quote
Sequence Prediction requires a Generator & Validator.

To Validate things it is Told or that it Generates, it has to reflect the input against what it knows to be True. Old knowledge is used as Validators. Convincing.

Old knowledge is also used as Generators.

Funny you should say that right now. I'm currently researching the human-generation / computer-validation issue of a formal language for representing general knowledge. For generating, the knowledge is required not to be contradictory (translated: true in at least one interpretation). For validating, the knowledge's negation is required to be contradictory (translated: the knowledge is true in every interpretation).
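That satisfiable-vs-valid duality can be brute-force checked for propositional formulas. A sketch under my own framing (ivan.moony's formal language is not shown here), treating a formula as a Python function of its variables:

```python
from itertools import product

def satisfiable(formula, nvars):
    """True in at least one interpretation, i.e. not contradictory."""
    return any(formula(*vals) for vals in product([False, True], repeat=nvars))

def valid(formula, nvars):
    """True in every interpretation; equivalently, its negation is contradictory."""
    return all(formula(*vals) for vals in product([False, True], repeat=nvars))

print(valid(lambda p: p or not p, 1))         # True  (tautology)
print(satisfiable(lambda p: p and not p, 1))  # False (contradiction)

# valid(f) holds exactly when not-f is unsatisfiable:
implies = lambda p, q: (not p) or q
print(valid(implies, 2) == (not satisfiable(lambda p, q: not implies(p, q), 2)))  # True
```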

*

LOCKSUIT

Re: Conceptualizing General Intelligence
« Reply #14 on: January 11, 2019, 11:06:37 pm »
You may think: hmm, but what if AGI is just reinforcement learning using motors? Put a baby on the ground in a simulated universe, give it rewards, and soon it will learn how to crawl, turn away from walls, run, put food in its mouth, and build rockets and houses for further reward, e.g. protection against (stopping) destructive storms? Sounds simple. It learns the actions by rewards! Especially if Lock has some sort of desire to make it work...
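That reward-driven picture is essentially reinforcement learning. A minimal tabular Q-learning sketch on a toy one-dimensional world (the world, reward layout, and hyperparameters are all my own illustrative choices):

```python
import random

random.seed(0)
N_STATES = 5          # positions 0..4; the "food" (reward) sits at position 4
ACTIONS = [-1, +1]    # crawl left or crawl right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.3   # learning rate, discount, exploration

for _ in range(500):  # episodes
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0   # reward only at the food
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2

# The learned policy is "crawl right toward the food" from every state.
print(all(Q[(s, +1)] > Q[(s, -1)] for s in range(N_STATES - 1)))
```

This works for crawling toward food; the catch, as the rest of the post argues, is that the raw action space for "build a rocket" is astronomically larger.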

and then you get Lock's adventure from a year ago, uploaded to YouTube in HD on Dec 4, 2017!

https://www.youtube.com/watch?v=gFAhM0BYdJI

https://www.youtube.com/watch?v=fdMr8mAAfHM

So why did I stop? Because with some further knowledge I picked up, partially because of a friend of mine, I realized a lot of things in a fast cascade. The idea that reward = builds rockets is 'sound', but actually doing it requires the baby's mind to know all about the world, and to know the relations between things, for general discovery. It has to have goals like 'rewards', change its goals, and have mini-goals; it has to know to look for metal or rock, mine it, use tools to make sheets, engines, etc., make a control system, and so on, to build a rocket. At every step of the way it would use motor actions, or think 'text sequences', to do the things it discovers.

If there's no plan discovered, and it's hoping random movement will do the trick, then there's a huge search space of combinations. So it has to have goal rewarders, make plans, and take sequences it already knows and tune them to the task, like catching a ball at an unseen new angle. Only this way can it come up with future sequences / action plans that are likely answers to carry out, narrowing down its search space instead of randomly jiggling its body or crawling in the hope of building a home to stop wolves attacking.
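The "narrow down the search space" point is the core of planning. A toy contrast, with a made-up action set: breadth-first search finds a short action plan directly, where random jiggling would have to stumble on it among exponentially many sequences:

```python
from collections import deque

# Hypothetical toy world: the state is a number; the goal is to reach 10 from 0.
ACTIONS = {
    "step":   lambda s: s + 1,
    "jump":   lambda s: s + 3,
    "double": lambda s: s * 2,
}

def plan(start, goal):
    """Breadth-first search over action sequences: shortest plan to the goal."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        for name, act in ACTIONS.items():
            nxt = act(state)
            if 0 <= nxt <= goal and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [name]))
    return None

print(plan(0, 10))  # a 4-action plan, instead of blindly sampling 3**n sequences
```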

 

