Just the Basics

  • 34 Replies
  • 9367 Views

djchapm

Just the Basics
« on: April 12, 2016, 12:02:44 am »
The other day I was reading that, in the field of microbiology, they have created a synthetic cell that works towards the lowest common denominator in creating synthetic life.
http://www.nature.com/news/minimal-cell-raises-stakes-in-race-to-harness-synthetic-life-1.19633

So by itself - uhhh, just some microscopic cell that does nothing.  Who cares.  But on the other hand it may lead to 'artificial' life, wet-ware, amazing things.  (Don't you think the public should be much more afraid of this type of research than of the big Skynet worries that are all over AI these days?  Messing with microscopic cells on a mission seems way more frightening to me than an intelligent robot or network.)

Anyway - it's not that impressive to someone outside the field but I'm sure they were celebrating the achievement, and I think it's pretty amazing too. 

My question is - in the parallel of Strong AI - what is the most basic achievement we can hope to obtain that would demonstrate true/strong AI?  For-loops and static calculations aren't going to cut it, is my guess.  The Turing test is not really a proof of AI.  So what is the proof?  And I think anything human-level is ridiculous to aim for as a first pass at a true AI.  Some people believe that if we build enough of these mimicking systems and trained nets etc. and put them together, it will amount to an AGI, but for this discussion let's go with the other theory: building a true AI.  I have a hard time imagining being able to create even a worm that isn't just 'mimicking'.

And hoping to avoid deep philosophical discussions on what-is-intelligence... basically we probably won't know it until we see it, analyze how it did what it did and why, then hold a committee and pass judgement and come up with new criteria.  And no, plants aren't intelligent. 

Some brief thoughts... well... I can't think of even the smallest thing without a ton of argument..... I'll write down some garbage anyway since it's easier to build on something rather than nothing, easier to criticize than to create....
Should it be self-contained? 
I guess it has to have some live/streaming physical sense?
Needs something to allow it to interact with non-self.
Has to have a goal/instinct/need/want or something - that can be satisfied/measured, but cannot be linked by code to the senses available.  (Whoa, this needs fixing... but my thought is that it would have to learn on its own how to achieve the goal.)
Needs to adapt to obstructions to its goal.
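A minimal sketch of those criteria in code (everything here - the "hunger" need, the action names, the hidden world rule - is invented for illustration): the agent has a measurable need, but nothing in its code links that need to any particular action, so it has to learn by trial and error which action satisfies it.

```python
import random

class TinyAgent:
    """Hypothetical sketch: the measurable 'need' is hunger; nothing in
    the agent's code says which action reduces it -- it must discover that."""

    def __init__(self, actions):
        self.actions = actions
        self.hunger = 10.0                      # the measurable need
        self.value = {a: 0.0 for a in actions}  # learned worth of each action

    def step(self, world):
        # explore sometimes, otherwise exploit what seemed to work
        if random.random() < 0.2:
            action = random.choice(self.actions)
        else:
            action = max(self.actions, key=self.value.get)
        relief = world(action)                  # the world decides the effect
        self.hunger = max(0.0, self.hunger - relief)
        # learn: nudge the action's value toward the relief it produced
        self.value[action] += 0.3 * (relief - self.value[action])
        return action

def world(action):
    # hidden rule the agent is never told: only 'forage' helps
    return 1.0 if action == "forage" else 0.0

agent = TinyAgent(["forage", "sleep", "wander"])
for _ in range(200):
    agent.step(world)
print(agent.hunger)  # drops toward 0 as the agent learns to forage
```

Nothing deep, obviously - but it satisfies the letter of the criteria: the goal is measurable, and the link between sense and goal is learned, not coded.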

Thoughts?  Apologies if there has already been proposals/criteria written on this - it's a very large forum.

Man, this sounds kind of ridiculous even to me, but where to start... The other day I was wondering if I could code something in a black box to track a pin prick of light on its own, without teaching it anything directly.   I was trying to think 'simple' - but I think you could teach a computer to learn that, and it wouldn't be intelligent.  Very adaptive maybe... but it wouldn't qualify as intelligence...  So how do you solve this scoping problem without widening the problem to 'solve everything'?  Maybe even describe something simple and intelligent, then discuss how we could create it without violating real intelligence.  Let's not start with Zero's cat.
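A sketch of that pin-prick tracker, for concreteness (the 1-D "world", the sensor and the light position are all invented): the code is never told where the light is, only whether its last random move improved the sensor reading - which is arguably exactly the "adaptive but not intelligent" behaviour in question.

```python
import random

def brightness(gaze, light):
    """Sensor: reads brighter the closer the gaze is to the light (1-D world)."""
    return 1.0 / (1.0 + abs(gaze - light))

def track(light=7.0, steps=100):
    """Blind trial and error: try a random move, keep it only if the
    sensor reading improved. No teaching, no model of 'light'."""
    gaze = 0.0
    for _ in range(steps):
        move = random.uniform(-1.0, 1.0)
        if brightness(gaze + move, light) > brightness(gaze, light):
            gaze += move  # keep moves that helped
    return gaze

random.seed(42)
print(track())  # ends up near the light at 7.0
```

It tracks the light reliably, yet there is clearly no understanding anywhere - which is the scoping problem in a nutshell.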

DJ

Art
Re: Just the Basics
« Reply #1 on: April 12, 2016, 01:05:21 pm »
Pin prick of light in a black box.

The computer would not be adaptive...only responsive.

(the switched state is either on or off).
In the world of AI, it's the thought that counts!

Freddy
Re: Just the Basics
« Reply #2 on: April 12, 2016, 03:17:13 pm »
The Turing test seems more akin to magic to me. As soon as you look in the next room and can see that it's a machine the trick is exposed.

What if you never look in the room, though?

Then you have something like Schrödinger's cat. Impossible to prove one way or the other. I suspect this may be the closest we get.

Zero
Re: Just the Basics
« Reply #3 on: April 12, 2016, 05:30:59 pm »
You hurt me.

djchapm
Re: Just the Basics
« Reply #4 on: April 12, 2016, 05:53:42 pm »
Haha.  I keep picturing your cat walking on a fence snagging a bird out of the air.  Way too intelligent.  Would like your thoughts though. 

Zero
Re: Just the Basics
« Reply #5 on: April 12, 2016, 08:19:01 pm »
You hurt me, djchapm.

madmax
Re: Just the Basics
« Reply #6 on: April 12, 2016, 09:14:16 pm »
Actually, in my opinion, the question is not whether there is AI or how to test that intelligence, but why we are intelligent, and how. What is the mechanism from which intelligence emerges? What perceptual position is complex enough for intelligence to come out?

Are cells like neurons intelligent, or is the DNA intelligent, or does it hide the keys to intelligence? Maybe the complexity of conditions on Earth is the key to intelligence - maybe temperature, or gravity, or the magnetic field?

What does intelligence do? It projects its own structure onto unrelated things, and in that way makes a relation with those new things. So we, as humans, project our intelligence onto the non-living world, which is unrelated to us. But is the projection the truth? First, that is only what intelligence does, not what it is. How intelligence does that is what it is. But that is a big question.

As the title suggests, it is better to start with the basics: how intelligence collects information - in other words, how the senses work. If those senses evolved something as complex as intelligence, they must be the way to escape from the projection of intelligence onto the outside world and to make knowledge more objective. So, absurdly, intelligence is now the obstacle. Intelligence is not what we need to figure out; why there is intelligence is.

Through history we have improved our senses and made progress as a civilization. How do we improve the senses? By trying to understand them, of course. So why are we looking for intelligence when the important things are the senses? Without senses there is no intelligence. Intelligence, on the other hand, is not just managing senses and actions, because it manages those senses in a specific way. The intelligence of a mouse and a cat are not the same, nor that of a cat and a dog - the evolution of life's DNA code.

Making synthetic cells makes much more sense. Senses make sense, not intelligence. Intelligence is a side effect of the senses, like consciousness. But intelligence is bigger than consciousness, while consciousness is more important. Intelligence doesn't make consciousness; it makes self-consciousness from consciousness.

"We don't need another hero" (AI)
"All we want is life beyond the Thunderdome" (delusion of intelligence)

ivan.moony
Re: Just the Basics
« Reply #7 on: April 12, 2016, 10:11:35 pm »
Intelligence is just a tool, a combinatorial ability to predict consequences from causes.

What really matters are emotions. Not even feelings from our senses, but emotions. As we grew up from childhood, we learned to disregard our feelings (say, hunger) and to give priority to emotions (say, being happy about giving food to the hungry).

There is something mystical about emotions. They are more than just an amplified voltage from our senses. I think the same emotions dictate how we feel about art, and how we experience situations, shapes and sounds. Whether our emotions are merely a counter-effect of different situations, or the result of our spiritual world interacting with reality through our sensory input, is a matter of one's beliefs.

Let's return to intelligence. Since it is a tool for predicting situations, we use it to adjust the world around us to indulge our emotions. I believe that intelligence can be simulated by a machine, but it would be nothing compared to emotions (from a complexity point of view). Emotions are about to be tagged as the real challenge, maybe never to be understood. Intelligence is pure math - it stands for logic equations - but emotions stand for understanding art... Try to imagine how to build an algorithm that says "this painting humans like" or "this painting humans dislike". Maybe it's even impossible, as we all have different tastes...

But never say never - maybe there is an equation that clearly separates art from pulp. Maybe it could even say what, for instance, snails like to watch...
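For what it's worth, a crude version of such a like/dislike "equation" can at least be written for invented data: a perceptron trained on hand-picked "painting features" (brightness, symmetry, colour variety - all hypothetical, as is the taste it learns). Whether anything like this captures real human taste is exactly the open question.

```python
# Toy sketch, all data invented: a perceptron learning a crude
# like/dislike rule from features [brightness, symmetry, colour_variety].
def train(samples, epochs=50, lr=0.1):
    w = [0.0, 0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, liked in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = liked - pred
            # nudge weights toward the labelled taste
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# invented taste: people "like" bright, symmetric paintings
data = [([0.9, 0.8, 0.4], 1), ([0.8, 0.9, 0.2], 1),
        ([0.2, 0.1, 0.9], 0), ([0.3, 0.2, 0.5], 0)]
w, b = train(data)

def likes(x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0

print(likes([0.85, 0.9, 0.3]))  # True -- close to the 'liked' examples
```

Of course, the hard part isn't the classifier - it's that nobody knows what the real features of taste are.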

madmax
Re: Just the Basics
« Reply #8 on: April 12, 2016, 11:52:03 pm »
Actually, what I think is that emotions are part of intelligence and cannot be divided from it. Senses give sense to intelligence, and emotions govern with intelligence. Without emotions, intelligence could not be established. But emotions somehow develop through intelligence, most of all through self-consciousness.

I presume that emotions have something to do with a balance of energy through time - but it is very complex - and that this gives the instructions for establishing intelligence. Science, where intelligence seems to be emotionless, is the state of emotions developed beyond individual concerns, at the level of social construction, for the well-being of everyone. That is why scientists are the most excited. If we look at this big picture, science is like the consciousness of society, and scientists are the intelligence, again governed by emotions.

So a general-purpose AI is not possible; the word "general" is needless, because intelligence is general - that is its main characteristic.

djchapm
Re: Just the Basics
« Reply #9 on: April 13, 2016, 03:52:20 am »
Okay... So trying to bring it back down to earth here.

We have senses, and I think everyone agrees those are lame input devices.
On one level, intelligence could take senses, sequences, and previous experience/memory and provide a prediction.  But what people are putting forth here is that this in itself is not true intelligence - it can be created through brute-force processing and the like.
The mystical quantity here is emotion, which can change the value of those memories and experiences and provide predictions different from straightforward mathematics.
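That first level - senses plus memory producing a prediction - can be sketched in a few lines (the percept names are invented): remember which percept tends to follow which, then predict the most frequent successor.

```python
from collections import Counter, defaultdict

class SequenceMemory:
    """Toy 'prediction from memory': count which percept follows which."""
    def __init__(self):
        self.memory = defaultdict(Counter)

    def observe(self, sequence):
        # remember each observed successor pair
        for prev, nxt in zip(sequence, sequence[1:]):
            self.memory[prev][nxt] += 1

    def predict(self, percept):
        # predict the most frequently seen successor, if any
        followers = self.memory[percept]
        return followers.most_common(1)[0][0] if followers else None

m = SequenceMemory()
m.observe(["dark", "cold", "dark", "cold", "dark", "warm"])
print(m.predict("dark"))  # "cold" -- seen more often after "dark"
```

Which rather proves the point being made: it predicts, but nobody would call it intelligent.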

So a line might be getting drawn here - like I don't think grasshoppers have emotions.  I don't think bees do either - we know from science they change behavior based on smells.  Humans can recall emotion and in turn create the emotion from self.  Like I can remember something sad and become sad.  I don't think bees could do the same thing.

Hopefully I boiled down that conversation a little bit without destroying all your thoughts.... I've thought before of 'gauges' that are constantly moving as streaming data moves through nets/memories etc.  The gauges being for judgement, anger, threat, happiness, etc - their sliding window is bigger than the individual perceptions streaming in.
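Those "gauges" could be sketched as slow-moving running averages over the fast percept stream: the gauge reacts to each perception, but its effective window is much wider than any single one (the "threat" gauge and the sample stream below are invented).

```python
class Gauge:
    """A slow-moving average over a fast stream; alpha sets how wide
    the gauge's effective window is compared to single perceptions."""
    def __init__(self, alpha=0.05):
        self.level = 0.0
        self.alpha = alpha

    def update(self, signal):
        # move a small fraction of the way toward the new signal
        self.level += self.alpha * (signal - self.level)
        return self.level

threat = Gauge()
# a stream of momentary threat readings: one loud bang among calm
stream = [0.0] * 10 + [1.0] + [0.0] * 10
levels = [threat.update(s) for s in stream]
print(max(levels))   # the spike registers...
print(levels[-1])    # ...then decays slowly instead of vanishing
```

The point of the small alpha is exactly DJ's sliding window: the gauge still "feels" the bang several perceptions after it happened.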

So then we should be able to 100% create something as complex as a bug/spider/grasshopper/bee using basic math and processing.  It blows my mind, though, that they can function so well and react so fast to things, and live so long, when their bodies/power levels/brains are so minuscule.  I'm reading about bugs and animals every night as my sons go to bed - it's their favorite.  I'm totally shocked at the amazing things and abilities these animals have.  It's like reading ideas for the "X-Men" series.

So Zero - my emotional intelligence is eating away at me as I have offended you - I didn't mean any offense, was meant to be friendly banter but my sensitivity score has always been like a C-.  Please accept my apology.  I was referring to your cat protocol:

I think the core of instinct is the "cat protocol"
Code:
1: scan until something unusual catches your attention
2: evaluate it
3: if it seems good, get closer (or eat it) and goto 2
4: if it seems bad, get away (or push it) and goto 2
5: if it's irrelevant goto 1
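For fun, that protocol as a runnable loop (the stimuli and their good/bad scoring are invented):

```python
import random

def evaluate(thing):
    """Invented scoring: positive = good, negative = bad, 0 = irrelevant."""
    return {"food": 1, "dog": -1, "leaf": 0}.get(thing, 0)

def cat_protocol(stimuli, max_steps=20):
    log = []
    distance = 5
    for _ in range(max_steps):
        thing = random.choice(stimuli)   # 1: scan for something
        score = evaluate(thing)          # 2: evaluate it
        if score > 0:                    # 3: good -> get closer
            distance = max(0, distance - 1)
            log.append(("approach", thing))
        elif score < 0:                  # 4: bad -> get away
            distance += 1
            log.append(("flee", thing))
        else:                            # 5: irrelevant -> keep scanning
            log.append(("ignore", thing))
    return log

random.seed(1)
for action, thing in cat_protocol(["food", "dog", "leaf"])[:5]:
    print(action, thing)
```

Which is, of course, pure mechanical intelligence - the evaluation is hard-coded, which is exactly where a real cat differs.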

DJ

Zero
Re: Just the Basics
« Reply #10 on: April 13, 2016, 07:24:00 am »
Thank you, djchapm. I understand you didn't want to hurt me.  :)



Ok. Now I'll try to explain my thoughts.

Do you see what happened? When you read "you hurt me", something happened inside you. You felt sorry, because you're not a digital sociopath, and because you assume I have feelings, because you believe I'm a human being.

But let's face it: what you call "Zero" is just text on your screen. Maybe Zero is a fictional character created by 10 science-fiction enthusiasts currently authoring a book together, here in France, and experimenting concepts from La Dernière Littérature, where fictional characters can interact with the real world... Maybe Zero's answers are created during brainstorming sessions, with a lot of copy/paste gathered and translated from non-english internet.

So, you have a dilemma. If Zero isn't real, you should not feel sorry for him, because he doesn't have real feelings. But if he's real, you should feel sorry, because of empathy.

This is the very nature of the Turing test. It's not about mimicking or pretending or faking anything. It's about establishing a real emotional connection, like the one you established with our character Zero.

So which one is true? The truth is: Zero is not a real guy with a nickname, Zero is not a fictional character, Zero is text on your screen, and whatever you do, you'll never know how this text was created. If one day you accept this, then you'll understand that creating True AI isn't a technical problem, but an emotional one. As I said, what you believe is what you get.

You're like Human-A saying:
Code
Human-A: - OMG, I wish I could create it...
Human-B: - Create what?
Human-A: - I don't know.

If you don't know exactly what you want to create, you can't create it.

I do know what I want to create. I want to create a software being that's able to establish real emotional connections with humans.

Shall we succeed now?

Freddy
Re: Just the Basics
« Reply #11 on: April 13, 2016, 01:25:29 pm »
You hurt me.

Oops sorry  ::)

Always happy to be proved wrong :)

Art
Re: Just the Basics
« Reply #12 on: April 13, 2016, 01:40:05 pm »
Just don't equate intelligence and emotions to humans alone. Many animals have exhibited various levels of intelligence - dogs and cats, chimps, the octopus and elephants, to name a few. It is also noteworthy that these same animals have exhibited emotions as well... perhaps not the entire range of emotions that humans have, but emotions nonetheless.

I've watched seagulls pick up a large clam shell, fly perhaps 30 feet high and drop the shell. If the impact of hitting the sand didn't open it, the bird would repeat the process maybe another time or two. Eventually that shell would open and the bird would get its lunch! Somehow birds have learned this technique and taught it to their young, who stand by and watch.
Intelligence? Yes! Can't say for certain about emotions, but I'd wager that bird becomes quite happy when that shell opens! ;)

ivan.moony
Re: Just the Basics
« Reply #13 on: April 13, 2016, 01:51:23 pm »
Bugs have a lot in common with humans: they breathe, they eat, they make their children in a similar way to us. Physically they have a head, a body, a brain, a nervous system. It's not proof, but if we are all created in a similar way, why wouldn't we all share the same emotions?

I wonder what plants have... Do they feel emotions? They just die if something really bothers them, but do they have a system similar to ours - a system for sensing "yes" or "no" from their surroundings?

I see bugs as evolved plants, like I see animals as evolved bugs. So I assume we could all share the same emotional system. But those are just beliefs.

djchapm
Re: Just the Basics
« Reply #14 on: April 13, 2016, 03:10:53 pm »
Alright Zero - I admit I think you're a human.  I think you're saying that if we can fake it well enough such that observers can't tell the difference - then it is just as good as the real thing.  This is basically Turing again.  But the endeavor here is to create something that isn't faking it.

Ivan - bugs evolving from plants is hard to digest.  I think I see what you're getting at - forms of life at different levels or stages - and I agree with that.  Like Art stated, we can definitely see that elephants and maybe birds have emotions - though I'm not convinced that the bird dropping a clam from 30 feet isn't just another way of trying to break open a clam.  Which is what I'm calling mechanical intelligence: trial/error/experience, which is what the world is getting good at in machine learning today.

People are arguing that intelligence without emotion is just processing; intelligence with emotion is what we need to achieve to make computers seem 'intelligent'.  Just going with the flow here - I do think it's amazing that a bird would understand the world well enough to pull off that move of dropping a clam.

I'm not sure where the line is - but emotion/feeling doesn't happen in the world till we advance to the level of a mouse or something.  Maybe it's the size of the brain.  What's the smallest thing you think has demonstrated and crossed the line from mechanical intelligence to emotional intelligence?  (Ahh, new terms :)

Thinking back to basics - I'm wondering if it's better to first accomplish a self-contained, mechanically intelligent being before seeking emotional intelligence.  When I say this, I'd still like to pursue/prove the lowest level somehow and describe the features required to say that it is intelligent.  Which means it isn't specifically coded beyond basic instincts and in/out interfaces.  I think those two things are required to evolve anything.

Also - I'm no scholar - I like coding and am fascinated with machine learning.  Just want to say how I appreciate the respect everyone gives here on this forum.  I love discussing this and the discovery aspect, organizing thoughts and ideas.  Couldn't happen if people were abusive or condescending.  Thanks everyone.

DJ

 

