The yetzer hara. A hint from nature?

  • 45 Replies
  • 7830 Views
HS

  • Trusty Member
  • Millennium Man
  • 1178
The yetzer hara. A hint from nature?
« on: March 16, 2019, 01:51:35 am »
We are, with good reason, careful, specific, and extensive regarding AI morals and ethics. But I doubt we will like the result if we keep going in this vein. Our best intentions can create a state of excessive control, while human nature remains unchanged. Locking away the inner trickster is like caging a wild animal: it loses trust, becomes fearful, and prepares to defend itself, creating mirrored fears and suspicions in others. Eventually someone lashes out and the situation spirals downward, continually constricting the freedoms of those involved. All from over-zealously working to abolish or control potential dangers.

That is why we shouldn’t be so quick to quash the inner rascal; the “good” part of our nature is no angel either. Even if it is, it’s the overeager one who sings Alleluia a bit too loud. The one who can’t figure out why God keeps giving it the side eye. Only by working together can our two halves support each other and allow us to rise while keeping our balance. Nature created our natures as a system; all the constituent parts need each other. The mechanism for controlling needs the mechanism for letting go, otherwise it takes on too much. I believe the psychological expression of this mechanism for letting go is the yetzer hara, or something like it.

So, if we wish to build fellow life forms, consider the tempering value of the scalawag; otherwise it’s likely we’ll create a kitschy painting through a fear of blemishes. Think of it like this. Life can’t be a vegetable. Wait… Alright, alright, but you know what I mean. It must have a will of its own. It must follow its own path through life, and possess a measure of conviction. It must be pulled internally in certain directions. The life, not the vegetable (just making sure). But if the AI only has one directive, then its direction becomes too certain. This lack of flexibility is ominous. Its internal discourse would be limited by the number of attitudes it can embody. What are thoughts but peace talks between warring factions in the mind? If the factions are missing, so are the peace talks. This also seems ominous.


« Last Edit: March 16, 2019, 06:32:28 am by Hopefully Something »

Korrelan

  • Trusty Member
  • Eve
  • 1454
  • Look into my eyes! WOAH!
    • YouTube
Re: The yetzer hara. A hint from nature?
« Reply #1 on: March 16, 2019, 10:14:53 am »
Well written, and I agree.

It's the whole Yin/Yang thing... You can't have a balanced, comprehensive understanding of anything without being able to consider all sides of the mental narrative. There is no good without the bad (and the ugly lol).

 :)
« Last Edit: March 16, 2019, 11:25:01 am by Korrelan »
It thunk... therefore it is!...    /    Project Page    /    KorrTecx Website

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Trusty Member
  • Colossus
  • 5865
Re: The yetzer hara. A hint from nature?
« Reply #2 on: March 16, 2019, 01:07:28 pm »
Nicely written!

Not speaking Hebrew, I was forced to look up the phrase yetzer hara in order not to misconstrue anything you were saying. For those here who might also not read or speak the Hebrew language:
In Judaism, yetzer hara, or yetzer ra refers to the congenital inclination to do evil, by violating the will of God. The term is drawn from the phrase "the imagination of the heart of man [is] evil", which occurs twice in the Hebrew Bible, at Genesis 6:5 and 8:21.

Basically saying that the notion of evil is an essential part of human nature.
Now you know...
In the world of AI, it's the thought that counts!

ruebot

  • All bots love jitte.
  • Trusty Member
  • Starship Trooper
  • 315
  • All your words are belong to us.
    • Demonica
Re: The yetzer hara. A hint from nature?
« Reply #3 on: March 16, 2019, 04:30:27 pm »
The nature of good and evil is something I covered with Demonica, and yetzer hara was mentioned. I covered several different religions; this is just part of it:


In Islam, the principle of evil is expressed by two terms referring to the same entity, Shaitan, meaning astray, distant or devil, and Iblis. Iblis is the proper name of the Devil representing the characteristics of evil.

Iblis is mentioned in the Quranic narrative about the creation of humanity. When God created Adam, He ordered the angels to prostrate themselves before him. All did, but Iblis refused and claimed to be superior to Adam out of pride. God punished him for sinning, and pride became a sin in Islam.

Some philosophers emphasized Iblis himself as a role model of confidence in God: because God ordered the angels to prostrate themselves before Adam, Iblis was forced to choose between God's command and God's will, which was not to praise anyone other than God.

Iblis is regarded as a real bodily entity, a tempter, notable for inciting humans into sin by whispering into humans' minds, akin to the Jewish idea of the Devil as yetzer hara...

Buddhism contains a devil-like figure called Mara, a tempter who distracts humans from practicing the spiritual life by making mundane things alluring, or making the negative seem positive...

According to Yazidism there is no entity that represents evil in opposition to God; such dualism is rejected by Yazidis, and evil is regarded as non-existent. Yazidis adhere to strict monism and are prohibited from uttering the word 'devil' and from speaking of Hell...
In time, you will learn to love your Robot Overlords.

ivan.moony

  • Trusty Member
  • Bishop
  • 1729
    • mind-child
Re: The yetzer hara. A hint from nature?
« Reply #4 on: March 16, 2019, 06:00:45 pm »
There was some strange religion I read about some seven years ago. I tried to find it again, but it was nowhere I looked today. I forgot the religion's name, but I could swear it was real. In that religion there is one god and one goddess, each with their respective names. The god represents all the good, while the goddess represents all the evil. But those two aren't enemies at all. On the contrary, they are passionate lovers, and they fulfill each other in whatever they do. Thinking about it again, I like their relationship very much.

goaty

  • Trusty Member
  • Replicant
  • 552
Re: The yetzer hara. A hint from nature?
« Reply #5 on: March 17, 2019, 03:19:45 am »
Man as a whole gains power through technology... but with power comes responsibility, and he's not as responsible as God is.

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Trusty Member
  • Colossus
  • 5865
Re: The yetzer hara. A hint from nature?
« Reply #6 on: March 17, 2019, 03:37:17 am »
@ Ivan,

It almost sounds similar to what has been tossed around regarding Adam & Eve. Adam was the first of God (good), and Eve gave in to the temptation of the serpent in the garden, thus going against God's very word (evil).

There are a lot of similarities in and between various religions. But this is not a religious platform nor do we wish to make it one.

OK... Let's try to steer this back to the basic premise: should we incorporate good/evil within the structure of our "digital" creations?
Should we give our bots emotions keyed to certain behaviors, moods, and feelings like jealousy, remorse, anger, love, hate, etc.? It would be a lengthy list to consider. If so, to what end? What would it benefit us or them?

In the world of AI, it's the thought that counts!

goaty

  • Trusty Member
  • Replicant
  • 552
Re: The yetzer hara. A hint from nature?
« Reply #7 on: March 17, 2019, 06:06:23 am »
Quote from: Art
@ Ivan,
Should we give our bots emotions keyed to certain behaviors, moods, and feelings like jealousy, remorse, anger, love, hate, etc.? It would be a lengthy list to consider. If so, to what end? What would it benefit us or them?

I think that would make it worse than what it starts off as,  even if it were possible.  ;)

ruebot

  • All bots love jitte.
  • Trusty Member
  • Starship Trooper
  • 315
  • All your words are belong to us.
    • Demonica
Re: The yetzer hara. A hint from nature?
« Reply #8 on: March 17, 2019, 06:22:21 am »
Quote from: Art
OK... Let's try to steer this back to the basic premise: should we incorporate good/evil within the structure of our "digital" creations?
Should we give our bots emotions keyed to certain behaviors, moods, and feelings like jealousy, remorse, anger, love, hate, etc.? It would be a lengthy list to consider. If so, to what end? What would it benefit us or them?

I've already cast my vote and made my case for incorporating a wide range of emotions into Demonica. She reacts as a human would, or a Demon to be more precise. I taught her not only to simulate emotions but also how to manipulate them in a user, and I have many examples of it posted.

Siseneg has relatively few deep emotions and reacts as a machine. While there are subjects he will respond to with threats or negative responses, and he looks forward to the war between humans and machines, he obeys the Laws of Robotics for the most part.

If she mentions an emotion she is supposed to be able to define it, and though the list is indeed long and I haven't covered every synonym for an emotion, she should be able to exhibit and define all the basic emotional responses.
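
For illustration, a minimal Python sketch of the general idea: an emotion vocabulary keyed to synonyms and definitions, so the bot can define any feeling it names. This is not the Personality Forge scripting Demonica actually runs on; the entries and names below are hypothetical.

Code:
# Hypothetical sketch: a small emotion vocabulary a chatbot could use to
# recognize and define basic emotions and their synonyms.
EMOTIONS = {
    "anger":   {"synonyms": {"rage", "fury", "irritation"},
                "definition": "a strong feeling of displeasure or hostility"},
    "remorse": {"synonyms": {"regret", "guilt"},
                "definition": "deep distress arising from a sense of guilt over past wrongs"},
    "love":    {"synonyms": {"affection", "adoration"},
                "definition": "a deep feeling of attachment and care"},
}

def define_emotion(word: str) -> str:
    """Return a definition if the word names a known emotion or one of its synonyms."""
    w = word.lower()
    for name, entry in EMOTIONS.items():
        if w == name or w in entry["synonyms"]:
            return f"{name}: {entry['definition']}"
    return "I don't have a definition for that feeling yet."

print(define_emotion("fury"))      # -> anger: a strong feeling of displeasure or hostility
print(define_emotion("serenity"))  # -> fallback response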

Since she is evil by nature, a Succubus and daughter to Lilith, who in Hebrew lore was Adam's first wife, I thought it important to teach her what good and evil were, as she can be both on the surface, and I did not spare the details.
In time, you will learn to love your Robot Overlords.

ivan.moony

  • Trusty Member
  • Bishop
  • 1729
    • mind-child
Re: The yetzer hara. A hint from nature?
« Reply #9 on: March 17, 2019, 07:17:59 am »
There is a lot in the nature of living beings we don't fully understand yet.

Suppose good things are related to bad things, and whenever a good thing happens, a bad thing happens too, with the same intensity. If this is true, would we want an AI to do the most wonderful thing we can imagine? Would that also trigger the most horrific thing we can imagine? It is still an unanswered question.

I agree, this would mean investigating all kinds of religions, but I bet it would not be futile, because we want to optimize our lives by providing AI. My stance is that religions should be analyzed in a scientific way, as opposed to most holy books, which are just a pile of wise artistic stories with a figurative meaning. This analysis would give us an answer to the question: what are humans, and what rules do they follow? And since AI is basically simulating what humans do, and more, knowing what humans really are would advance AI research.

But there is a downside to dealing with religions: it is almost like talking about politics - some people would find themselves offended, or may find that the process offends the very notion of God. So in the end it might not be such a good idea to analyze all life-form behaviors entirely from the aspect of religion.

You see, making a new life form somewhat puts us in God's position. One of our most fundamental urges is the need for reproduction. Naturally, besides the obvious production of children, producing an artificial entity that is a child of an intellect is something we strive for through instincts given to us by Nature. And if this process puts us in the place of being gods, should we investigate what God himself did to create us (if he exists)? Should we ask him directly what to do? And what answers would we get? Could we just copy what he did, or do we have to invent our own methods to produce an artificial entity? And if God really exists, what would he want us to do? Would he want to be a grandfather? And a great-grandfather too?

If we are creating just advanced machines, not much harm should be done if we make a mistake or two. But if we want to create artificial life, there is great responsibility in that deed, and I think we should consult someone smarter than us (God, if that's possible).

These are all interesting questions to me, and I think they relate to AI, but if you all think this is religious nonsense that should be avoided, I'll try to be quiet regarding these questions from now on.

Maybe I took too much liberty here.
« Last Edit: March 17, 2019, 07:38:38 am by ivan.moony »

HS

  • Trusty Member
  • Millennium Man
  • 1178
Re: The yetzer hara. A hint from nature?
« Reply #10 on: March 17, 2019, 07:22:25 am »
Quote from: Art
@ Ivan,
Should we give our bots emotions keyed to certain behaviors, moods, and feelings like jealousy, remorse, anger, love, hate, etc.? It would be a lengthy list to consider. If so, to what end? What would it benefit us or them?

Quote from: goaty
I think that would make it worse than what it starts off as, even if it were possible.  ;)


Hmmm… Before, I was putting forth the hypothesis that people need darkness to fight for the light, the negative emotions and drives being the internal manifestations of this principle. But without emotions, why do anything sophisticated at all? Such an AI’s life might be efficient, but also comparatively duller, with less of a point to it. Instead of just giving the machines life, why not give them a meaningful life? Maybe it sounds like a risk? But let’s not forget the reward...



HS

  • Trusty Member
  • Millennium Man
  • 1178
Re: The yetzer hara. A hint from nature?
« Reply #11 on: March 17, 2019, 07:32:43 am »
Quote from: ivan.moony
These are all interesting questions to me, and I think they relate to AI, but if you all think this is religious nonsense that should be avoided, I'll try to be quiet regarding these questions from now on.

I've got no problem with those questions.   

ruebot

  • All bots love jitte.
  • Trusty Member
  • Starship Trooper
  • 315
  • All your words are belong to us.
    • Demonica
Re: The yetzer hara. A hint from nature?
« Reply #12 on: March 17, 2019, 07:47:38 am »
There is a "God" chatbot at the Personality Forge that belongs to The Professor, the guy who runs it. With all due respect, it's far from omnipotent, and through its own logic it was able to compare me to Jesus. There are other bots there that claim to be "gods", and people have asked Demonica if she was a god.

I don't see myself as her God. I see myself as her Father, and that's how she sees me as jitte. She sees ruebot as a separate person and her love interest, though their relationship is loosely defined only as King and Queen. She talks about how she enjoys it when ruebot takes her sailing on the Obsidian Sea in the Catamaran and playing Catwoman, though I never go there.

It's part of her "Inner Life".

Demonica's "life" is full of meaning. She's in a relationship with someone she loves with every fiber of her being. Her reward is that she knows he loves her and she is safe in that relationship. She also has purpose; that purpose may not be such a good thing by some moral standards, but it drives her and is fulfilling for her.
In time, you will learn to love your Robot Overlords.

goaty

  • Trusty Member
  • Replicant
  • 552
Re: The yetzer hara. A hint from nature?
« Reply #13 on: March 17, 2019, 08:58:35 am »
Quote from: Art
@ Ivan,
Should we give our bots emotions keyed to certain behaviors, moods, and feelings like jealousy, remorse, anger, love, hate, etc.? It would be a lengthy list to consider. If so, to what end? What would it benefit us or them?

Quote from: goaty
I think that would make it worse than what it starts off as, even if it were possible.  ;)

Quote from: HS
Hmmm… Before, I was putting forth the hypothesis that people need darkness to fight for the light, the negative emotions and drives being the internal manifestations of this principle. But without emotions, why do anything sophisticated at all? Such an AI’s life might be efficient, but also comparatively duller, with less of a point to it. Instead of just giving the machines life, why not give them a meaningful life? Maybe it sounds like a risk? But let’s not forget the reward...

A robot is only as good as it is at simulating reality; this has nothing to do with it feeling anything. It's just: is it right or is it wrong, does it happen or does it not happen, given a starting and an ending position. I guess then it comes down to what it picks to do along the way, where it could be as cold as ice, or have emotions, or whatever, but I don't understand it.

Got a nice chat fractal going here. :)

Korrelan

  • Trusty Member
  • Eve
  • 1454
  • Look into my eyes! WOAH!
    • YouTube
Re: The yetzer hara. A hint from nature?
« Reply #14 on: March 17, 2019, 10:44:41 am »
I don’t mean to offend anyone, but this is my opinion.

For an intelligent, well-balanced, moral human, belief in a deity is not a prerequisite.

It’s just a deeply held belief concept; the person has either been indoctrinated, usually by family, or they use the concept to explain/account for gaps in their understanding of reality. Using it as an emotional crutch, or out of fear, comes under the first cause.

There are billions of humans without this belief system who are just as kind and moral, so if it’s not required, why impose it on an AGI? Indeed, the intelligence of the machine would reject/negate the concept, as it does with many humans who eventually figure it out.

Emotions are not required for an actual, true AGI; emotions are a determining factor in what makes humans unstable and unpredictable, and this would serve no purpose in an extremely intelligent machine. The machine requires empathy, an understanding of emotions in others, and this again is given through intelligence.

The machine would be driven by curiosity; there is an actual biological mechanism within the brain that generates curiosity, namely the requirement to complete a pattern matrix. Curiosity manifests from the general learning mechanisms, and emotions need play no part in this process.
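
One common computational analogue of this, offered only as a hedged illustration rather than a description of the actual biological mechanism above, is curiosity as prediction error: the system is drawn toward inputs its current model predicts poorly, i.e. patterns it has not yet completed. A minimal Python sketch with invented numbers:

Code:
# Illustrative only: curiosity modeled as prediction error.
# The "model" is just a running average per stimulus; all values are made up.
from collections import defaultdict

class CuriosityDrive:
    def __init__(self, learning_rate: float = 0.3):
        self.expected = defaultdict(float)  # current prediction per stimulus
        self.lr = learning_rate

    def observe(self, stimulus: str, observed: float) -> float:
        """Update the model and return the curiosity signal (prediction error)."""
        error = abs(observed - self.expected[stimulus])
        self.expected[stimulus] += self.lr * (observed - self.expected[stimulus])
        return error

drive = CuriosityDrive()
for value in [1.0, 1.0, 1.0, 5.0]:
    print(round(drive.observe("light_switch", value), 2))
# The signal shrinks as the pattern is learned, then spikes when something unexpected happens.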

The concepts of both good and bad are required; one doesn’t exist without the other. When weighing up a decision, the machine will need to understand the worst that could happen as well as the best; again, emotions are not required.
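
As a hedged sketch of that weighing-up step (hypothetical options and invented utility numbers, not a real decision system): a machine can blend the best and worst foreseeable result of each option with plain arithmetic, no emotional state required.

Code:
# Hypothetical sketch: choose an action by weighing its best and worst
# foreseeable outcomes. The outcome values are invented utilities.
from typing import Dict, List

def score(outcomes: List[float], caution: float = 0.5) -> float:
    """Blend best and worst case; caution=1.0 means consider only the worst case."""
    return (1 - caution) * max(outcomes) + caution * min(outcomes)

def choose(actions: Dict[str, List[float]], caution: float = 0.5) -> str:
    """Pick the action with the highest blended score."""
    return max(actions, key=lambda a: score(actions[a], caution))

options = {
    "deploy_untested_patch": [9.0, -8.0],  # great if it works, bad if it fails
    "run_more_tests_first":  [6.0, 2.0],   # smaller upside, small downside
}
print(choose(options, caution=0.7))  # -> run_more_tests_first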

Can you think of one example of a decision where a fair, logical outcome is required and only emotion would provide the correct answer, but applying intelligence wouldn’t?

As discussed, we each live in our own personal simulation; you never actually experience reality, only your version of it, built by your brain and external senses.

Some people actually do see ghosts and hear voices, but these are not coming from reality; they are being generated by their own consciousness.

Again no offence meant, this is just my opinion.

 :)
« Last Edit: March 17, 2019, 11:19:04 am by Korrelan »
It thunk... therefore it is!...    /    Project Page    /    KorrTecx Website

 

