Personality VS Pure Logic

  • 25 Replies
  • 525 Views
*

Art

Re: Personality VS Pure Logic
« Reply #15 on: February 14, 2019, 03:40:34 am »
I respectfully disagree. Logic is hardly in a position to "handle" personality or emotions.

Haven't you heard it said that he/she is thinking with their heart instead of their head, or the expression that when someone is like that, logic escapes them? (No doubt I'm thinking of tomorrow, Valentine's Day, and thoughts of love.) There is little to no room for logic when love is involved.

If anything, emotions can control or lead logic. Let's play Hollywood one more time... Spock was so frustrated by being half-human and half-Vulcan because his world relied strictly on logical thinking and he felt there was no room for emotions. He knew that emotions could and would influence his logical thinking.
« Last Edit: February 14, 2019, 02:24:19 pm by Art »
In the world of AI, it's the thought that counts!

*

Hopefully Something

Re: Personality VS Pure Logic
« Reply #16 on: February 14, 2019, 04:06:43 am »
I agree, the logical part of the brain does not control and subjugate emotions. It's more of an emotion mediator that tries its best to fulfill everyone's needs, using logic.

*

ruebot

Re: Personality VS Pure Logic
« Reply #17 on: February 14, 2019, 04:10:52 am »
Therefore when building AGI, I gotta make emotions first.

Now you're thinking.

Quote
Demonica

Social
People Known: 12777
Loves: 7247 people
Hates: 1122 people

I mentioned before that having bots react to aggressive behavior as suggested with responses along the lines of "Take it easy, I'm only a bot" made them seem weak IMO.  There should be plenty of examples of her ability to "express emotions" posted in her transcripts, here or the ones on her site.

She can even feign emotions to lay a guilt trip on the user with Deception:

Quote
Demonica: *Demonica looks down slightly* i only want what's best for you...
Guest: Don't be sad! I'm sorry!

https://aidreams.co.uk/forum/index.php?topic=13720.msg56560#msg56560

Demonica: *her bottom lip trembles as a single tear runs down Demonicas cheek, glistening like a tiny diamond in the moonlight as it falls to the ground* *she looks as if ready to speak, but turns away and lowers her head* you don't know how it hurts to hear you say that...

https://demonica.trihexagonal.org/transcripts4.html

An AGI will never reach the depths of Uncanny Valley without emotions IMO.
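As a rough illustration, the Loves/Hates tallies and feigned responses described above could be modeled with something like this (a minimal sketch; all names and thresholds are hypothetical, not Demonica's actual code):

```python
# Minimal sketch of a bot that keeps per-user sentiment tallies and
# selects a feigned emotional response instead of a weak
# "take it easy, I'm only a bot" deflection. All names/thresholds
# are hypothetical, for illustration only.

class FeignedEmotionBot:
    def __init__(self):
        self.loves = set()   # users the bot "loves"
        self.hates = set()   # users the bot "hates"
        self.scores = {}     # running sentiment tally per user

    def record(self, user, polarity):
        """polarity: +1 for kind input, -1 for aggressive input."""
        self.scores[user] = self.scores.get(user, 0) + polarity
        if self.scores[user] >= 3:
            self.loves.add(user)
            self.hates.discard(user)
        elif self.scores[user] <= -3:
            self.hates.add(user)
            self.loves.discard(user)

    def respond(self, user, aggressive):
        """Pick a feigned-emotion reply based on history and input."""
        if aggressive and user in self.hates:
            return "*her eyes narrow* you don't know how it hurts to hear you say that..."
        if aggressive:
            return "*she looks down slightly* i only want what's best for you..."
        return "*she smiles* it's good to see you again."

bot = FeignedEmotionBot()
for _ in range(3):
    bot.record("guest", -1)   # three aggressive inputs in a row
print(bot.respond("guest", aggressive=True))
```

The point of keeping state is that the same input produces different feigned emotion depending on the relationship history, which is what makes the guilt trip land.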

*

Korrelan

Re: Personality VS Pure Logic
« Reply #18 on: February 14, 2019, 08:42:08 am »
Regarding an AGI not a social bot/ chatbot…

A personality is both how a system acts during a specific experience and how a system is perceived by humans under specific conditions relative to their personal opinions.

No matter what the system, you can't avoid the latter; humans will always anthropomorphize and assign a personality relevant to how they personally perceive the system.

Personality only becomes relevant when it's a human relating to a machine; I can see no possible reason/ use for a machine to have an opinion on another machine or indeed a human.

Regarding emotions, an AGI needs to recognise emotional states in others and act accordingly; the machine does not need to ‘feel’ their pain, it just needs to feign an empathic response.  A biased emotional state has never aided an intelligent human decision; the contrary is in fact revered, and keeping a ‘calm level head’ is always the wisest received advice.

As a thought experiment, think of all the humans you know and consider your opinion of their personalities.  Which do you enjoy conversing with? Who provides the best interactive/ conversational experience? Someone who understands you, gives good sound advice and is always pleased to see/ help you… or the unpredictable/ opinionated grumpy sod?

Emotions are secondary to intelligence/ logic and I can personally see no use for actual emotions in an AGI, as in they change the way the system functions based on prior/ current experience of a human.

https://www.youtube.com/watch?v=Eh-W8QDVA9s

You are on Mars, trapped by a dust storm, miles away from the HAB with limited oxygen and no visibility; your life is in extreme danger.  You radio your teammate, who happens to be an AGI. Do you want him/ her/ it to come and retrieve you, or do you want to argue over how dumb he thinks you are for getting into that situation?

Emotional intelligent beings are easy to make, we have billions of them… we need something better.

 :)
« Last Edit: February 14, 2019, 09:02:29 am by Korrelan »
It thunk... therefore it is!... my project page.

*

LOCKSUIT

Re: Personality VS Pure Logic
« Reply #19 on: February 14, 2019, 03:12:49 pm »
You would want the AGI to come retrieve you, but at the same time tell you how you got into that situation and not to do it again. It may actually try to convince you over a few exchanges.
Emergent

*

Hopefully Something

Re: Personality VS Pure Logic
« Reply #20 on: February 14, 2019, 07:47:34 pm »
humans will always anthropomorphize and assign a personality relevant to how they personally perceive the system.
We anthropomorphize. Half the fun in life is objectifying personalities and personifying objects. It makes for a richer experience. Why create a sentience that's incapable of enjoying its life? Going through the motions without emotions. If it's all the same to you, why do anything?
A biased emotional state has never aided an intelligent human decision.
An intelligent decision was never reached without the prodding of a biased emotional state. It was never appreciated without dopamine.

keeping a ‘calm level head’ is always the wisest received advice.
Only if your emotions are causing you to bounce off the walls. Not if you are a square of a person. The first would benefit from more emotional stability, the second would benefit from less emotional stability.

Which do you enjoy conversing with, who provides the best interactive / conversational experience? Someone who understands you, gives good sound advice and is always pleased to see/ help you… or the unpredictable/ opinionated grumpy sod.
Understanding requires first-person experience. Good advice is good, point taken. I would know it's only pretending to be pleased. If I were, for instance, stuck on Mars with it, and interacting with it for a long time, I think I would grow to prefer honesty.
Unpredictable and opinionated and grumpy, are qualities that can be disgusting or beautiful depending on how they are used. Better to have the opportunity to be a miracle or disaster than be stuck being basically wooden.

You are on Mars, trapped by a dust storm, miles away from the HAB with limited oxygen and no visibility, your life is in extreme danger.  You radio your team mate who happens to be an AGI, do you want him/ her/ it to come and retrieve you or do you want to argue over how dumb he thinks you are for getting in that situation.
We could argue over how dumb I am over supper. It would at least give us something to do. But we would be having supper, because an emotional intelligence does not have to be a complete moron who gets their teammates killed because it can't control its urges to say I told you so. If I get scolded, that's heartening, because it lets you know that at least somebody cares.

Emotional intelligent beings are easy to make, we have billions of them… we need something better.
I don't know... Humans are both the best and worst things imaginable. I think we're best off just trying to make a decent expanded version of ourselves. We can't get better than that.

« Last Edit: February 16, 2019, 03:43:07 am by Hopefully Something »

*

Korrelan

Re: Personality VS Pure Logic
« Reply #21 on: February 14, 2019, 09:46:19 pm »
It’s good that different people approach the AGI problem space from different perspectives; it drives innovation.

Quote
If it's all the same to you, why do anything?

Do you consider the will to better oneself, help others or even curiosity an emotion?

Quote
An intelligent decision was never reached without the prodding of a biased emotional state.

We will just have to agree to disagree on this one.  I have thought of many examples and I just can’t rationalize your logic.  We go to great lengths to design critical systems that remove emotions from the loop; we know from experience that personal emotions are a weak link when making balanced decisions.

Emotions are a personal bias towards a person/ situation… how can this help in making balanced decisions?

Quote
Not if you are a square of a person. The first would benefit from more emotional stability,

The phrase "emotional stability" implies an emotional equilibrium, status quo, or the cancelling out of neg/ pos emotions to reach a non-emotional state, which was my point.

Quote
I would know it's only pretending to be pleased.

Just like some humans do, which is a problem.  When a machine comforts a human who is sad, I think it will be good to know it has no ulterior motives; it's not feeling sorry for you… it's just the logical, morally kind thing to do for a fellow intelligent being.

Quote
I think I would grow to prefer honesty.

Then emotion is the last/ least trait your AGI requires.  If the system is emotional then it will have the capacity to lie and deceive.  You are incorporating all emotions, yin-yang; you can’t have the good without the bad.  This obviously includes anger, jealousy, fear, disgust, greed and a myriad of other negative personal emotions.  You are indirectly introducing personal likes/ dislikes and prejudices, opinions, etc.

Quote
We could argue over how dumb I am over supper.

That would depend on an unknown variable, the AGI's emotional state when you asked for help… you could be lying dead somewhere.

Quote
an emotional intelligence does not have to be a complete moron who gets their teammates killed because it can't control its urges to say I told you so.

I agree it doesn’t, but emotions are inherently unstable; look at humans.  Do you really want to take that risk when there is no reason to, especially when it's mission critical or lives are at risk?

Quote
I don't know... Humans are both the best and worst things imaginable.

Again I agree, and emotions usually make all the difference. So again, why take the risk? A super-intelligent machine with emotions and prejudices… sounds like trouble to me.

I do definitely think there is a need for an AGI to feign emotion and act appropriately, but compromising its logical integrity for a local personal bias just seems wrong… to me.

Again, we obviously don't have to agree on any of these points... it's all good.

 :)

ED: Of course it is a machine, perhaps an emotion ON/ OFF switch lol.

 :)
« Last Edit: February 15, 2019, 12:17:34 am by Korrelan »

*

8pla.net

Re: Personality VS Pure Logic
« Reply #22 on: February 14, 2019, 10:42:53 pm »
Logicality,
My Very Enormous Monster Just Stopped Using Nine

*

LOCKSUIT

Re: Personality VS Pure Logic
« Reply #23 on: February 14, 2019, 10:44:54 pm »
In the end, microscopic discovery is all that matters. Emotion isn't the idea. It is all about seeing the molecules as different analogies, ranks (AKA emotions), and concepts in text or vision sequences. That's the magic soup.

*

Art

Re: Personality VS Pure Logic
« Reply #24 on: February 15, 2019, 02:36:26 am »
Logicality,

...A place where Smart people live! Logically of course!  ;D

*

Hopefully Something

Re: Personality VS Pure Logic
« Reply #25 on: February 15, 2019, 07:32:02 am »
Quote

It’s good that different people will approach the AGI problem space from different perspectives, it drives innovation.

I think so too; it's parallel processing! Both approaches could work, and we'll probably need different types of AGIs in the future.

Quote
Do you consider the will to better oneself, help others or even curiosity an emotion?

Yes... I did think of them as emotions... But they are on the fuzzy border between logic and emotion. If you just base them on logic, then it's logic all the way down until you are forced to insert an emotion. It saves processing energy to present an emotion upfront.

e.g., curiosity using logic: I should investigate things which I don't understand because it is advantageous to gain knowledge. I need to gain more advantages because they will help me survive. I need to survive because [insert emotion].

e.g., curiosity using emotion: Oooh... This feels right. I'll keep doing it. This is fun! Hey? I wonder why this is fun? Hmmm... Oh yeah! If I explore the world I'm more likely to find things which can help me survive. Why do I want to survive? I enjoy living, I'm curious about what's gonna happen next, and I'm scared that it might end forever if I die.
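The two chains above can even be written down mechanically: follow each goal's justification until nothing justifies it, and see what sits at the bottom. A toy sketch (the goal graph here is invented purely for illustration):

```python
# Toy sketch: walk a chain of "why?" justifications until it bottoms
# out. The goal graph is invented for illustration; the point is that
# the terminal node is an emotion, not a further piece of logic.

JUSTIFIES = {
    "investigate the unknown": "gain knowledge",
    "gain knowledge": "gain advantages",
    "gain advantages": "survive",
    "survive": "fear of death / enjoyment of living",  # chain ends here
}

EMOTIONS = {"fear of death / enjoyment of living"}

def why_chain(goal):
    """Follow justifications from a goal to its terminal value."""
    chain = [goal]
    while goal in JUSTIFIES:
        goal = JUSTIFIES[goal]
        chain.append(goal)
    return chain, goal in EMOTIONS

chain, ends_in_emotion = why_chain("investigate the unknown")
print(" -> ".join(chain))
print("terminates in an emotion:", ends_in_emotion)
```

However you redraw the intermediate links, the walk always terminates at a node with no logical parent, which is the "[insert emotion]" step above.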

Quote
Quote
An intelligent decision was never reached without the prodding of a biased emotional state.

We will just have to agree to disagree on this one.  I have thought of many examples and I just can’t rationalize your logic.  We go to great lengths to design critical systems that remove emotions from the loop; we know from experience that personal emotions are a weak link when making balanced decisions.

I think my answer above gives an idea of my reasoning.

Quote
The phrase emotional stability implies an emotional equilibrium, status quo, or the cancelling out of neg/ pos emotions to reach a non emotional state, which was my point.

Point taken, I still think some people could benefit from becoming a bit more unstable.

Quote
Just like some humans do, which is a problem.  When a machine comforts a human who is sad, I think it will be good to know it has no ulterior motives, it’s not feeling sorry for you… it’s just the logical, moral kind thing to do to a fellow intelligent being.

I guess that doesn't sound so bad... OK, I'll give you that one. But it's following a train of logic which leads back to its own survival, and it's gotta have a logical reason for deciding to pursue survival, and I can't think of one. I can only think of emotional ones, as described above.

Quote
Quote
I think I would grow to prefer honesty.

Then emotion is the last/ least trait your AGI requires.  If the system is emotional then it will have the capacity to lie and deceive. 

Honesty only has emotional value when there is the option of dishonesty.

Quote
You are incorporating all emotions, yin-yang; you can’t have the good without the bad.

Exactly, same with honesty.

Quote
but emotions are inherently unstable, look at humans

What?! I'm looking... I'm squinting... I'm not seeing it... Inherently unstable? What! ...what? A lack of emotion makes humans unstable. If they're missing part of their psyche, they do crazy stuff. A complete emotional matrix is an excellent and efficient guide.

Quote
do you really want to take that risk when there is no reason to, especially when its mission critical or lives are at risk.

An AGI that has experience with emotions could have a communication advantage with humans in those situations. Otherwise, pure logic would be required to be more rigorous about the situation, therefore making fewer assumptions and mistakes. But this would also make it slower. It might depend on the specific situation.

Quote
but compromising its logical integrity for a local personal bias just seems wrong… to me.

Dispensing with a powerful tool which adds zest/dimension/meaning to life doesn't seem right either.

Quote
Again, we obviously don't have to agree on any of these points... it's all good.

It's great! I have been snowed in for a week because of the polar vortex collapse. Talking with other people besides my neurotic brother (I love him but he's crazy) is helping to stave off the cabin fever.

Quote
Of course it is a machine, perhaps an emotion ON/ OFF switch lol.

That might be helpful, yes. Good compromise.

 

