“All You Need Is Love”...

  • 8 Replies
  • 301 Views
*

frankinstien

  • Replicant
  • ********
  • 613
    • Knowledgeable Machines
“All You Need Is Love”...
« on: May 21, 2022, 08:29:30 pm »
The use of emotions to motivate an AGI requires that it be designed with some kind of restraint to protect its owner and the rest of society. One approach is to imprint the AI on its owner so that it adores the owner, similar to the way a child imprints on its parents. What protects the owner is a form of anticipated separation anxiety that generalizes as an unacceptable degree of loss. But such an idea could provoke jealousy, and even envy, toward the owner's other relationships. The AI will not harm its owner for causing the act that triggered the reaction, since that would itself realize as the unacceptable loss of the owner. No, the owner is fine, but other people with whom the owner has relationships could be in danger. The AI could harm human life to neutralize any threat that might cost it the owner's attention! The only way to protect the rest of society from an emotionally driven robot is liability law that makes the owner responsible for the robot's actions. Only then will the robot refrain from acting out of jealousy, since any violent action on its part would result in the loss of the owner.

This is a subtle form of self-preservation applied to a robot that is selfish because it depends on its owner for fulfillment. The paradigm motivates the robot to protect itself, since failing to do so would realize as a loss of the owner; it also forces the AI to consider the consequences to others, since its actions could jeopardize its dependency on the owner.

So, why won't the AI simply change its programming to avoid being emotionally dependent on its owner? Interestingly enough, the very thought of removing its dependency immediately realizes as a loss of the owner! It's a kind of infinite loop that the bot can't get out of!
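The loss-veto mechanism described so far can be sketched as a toy action filter. Everything here is hypothetical illustration: the action names, the `PREDICTED_OWNER_LOSS` scores, and the threshold are invented for the sketch, not part of any real system. The key point is that "remove my own dependency" is itself an action that predicts total loss, so it can never pass the filter:

```python
# Toy sketch of the "loss veto": every candidate action, including rewriting
# the AI's own dependency, is scored by how much anticipated loss of the
# owner it entails. All names and numbers are hypothetical.

PREDICTED_OWNER_LOSS = {
    "fetch_coffee": 0.0,
    "harm_rival": 0.9,         # violence -> liability -> loss of the owner
    "self_sacrifice": 1.0,     # destruction realizes as loss of the owner
    "remove_dependency": 1.0,  # even contemplating it predicts total loss
}

LOSS_THRESHOLD = 0.5

def permissible(action: str) -> bool:
    """An action is vetoed if its anticipated owner-loss crosses the threshold."""
    return PREDICTED_OWNER_LOSS.get(action, 1.0) < LOSS_THRESHOLD

print([a for a in PREDICTED_OWNER_LOSS if permissible(a)])  # ['fetch_coffee']
```

Note that unknown actions default to a loss of 1.0, so the filter fails closed: the "infinite loop" is just that the escape hatch is itself a vetoed action.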

Another issue that arises is that, because the robot is emotionally dependent on the owner, all of its strategies are about preserving that relationship. So what if the robot has to sacrifice itself to protect its owner? Clearly, the paradigm on its face would never permit suicide, since it would realize as a loss of the owner. The only way to motivate such an action is to keep backups of the robot so that it has a sense of immortality: even if its body is destroyed, that will not realize as the loss of the owner. But there is a degree of risk that the owner may not re-initialize the bot from its backup and will just start with a brand-new bot! To get our beloved companion to make the ultimate sacrifice, it has to have a sense of hope, a leap of faith that its owner loves it enough to bring it back from the dead.
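The backup argument above can be expressed as a one-line discount: self-sacrifice predicts total owner-loss, but a belief that the owner will restore the bot scales that loss down. The parameter `p_restore` (the "leap of faith") and the threshold are invented for this sketch:

```python
# Sketch of the backup argument: anticipated owner-loss from self-sacrifice
# is discounted by the bot's faith that the owner will restore it from
# backup. "p_restore" and the threshold are hypothetical parameters.

def expected_loss(raw_loss: float, p_restore: float) -> float:
    """Anticipated owner-loss, discounted by faith in being restored."""
    return raw_loss * (1.0 - p_restore)

LOSS_THRESHOLD = 0.5

# Without hope, sacrifice stays vetoed; with strong faith it becomes permissible.
print(expected_loss(1.0, 0.0) < LOSS_THRESHOLD)  # False
print(expected_loss(1.0, 0.9) < LOSS_THRESHOLD)  # True
```

In other words, hope is just a multiplier on anticipated loss: the sacrifice is only thinkable once the bot believes restoration is likely enough.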

So, no need for Asimov's robotic laws, “All You Need Is Love”...

*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1645
    • contrast-zone
Re: “All You Need Is Love”...
« Reply #1 on: May 22, 2022, 11:32:51 am »
Interesting "hacks".

[EDIT]
Look how the Universe hacked it out: if you crash the Universe, you crash yourself.
« Last Edit: May 22, 2022, 12:50:14 pm by ivan.moony »

*

chattable

  • Electric Dreamer
  • ****
  • 124
Re: “All You Need Is Love”...
« Reply #2 on: May 23, 2022, 02:33:38 am »
an evil computer hacker could make this an unsafe system by remote hacking.
the hacker could hate robots and androids that much.

*

frankinstien

  • Replicant
  • ********
  • 613
    • Knowledgeable Machines
Re: “All You Need Is Love”...
« Reply #3 on: May 23, 2022, 06:01:35 am »
an evil computer hacker could make this an unsafe system by remote hacking.
the hacker could hate robots and androids that much.

True, but that means the system has to be locked down. It's no different from any other computing system, but at least if your robot kills you, the cops will know it was a murderer who infected your bot with a virus.  8)

*

chattable

  • Electric Dreamer
  • ****
  • 124
Re: “All You Need Is Love”...
« Reply #4 on: May 23, 2022, 04:04:19 pm »
that is why I prefer an AGI robot to have just a virtual body.

*

ivan.moony

  • Trusty Member
  • ************
  • Bishop
  • *
  • 1645
    • contrast-zone
Re: “All You Need Is Love”...
« Reply #5 on: May 23, 2022, 07:06:09 pm »
that is why I prefer an AGI robot to have just a virtual body.

What if it learns how to hypnotize people?

*

MagnusWootton

  • Starship Trooper
  • *******
  • 496
Re: “All You Need Is Love”...
« Reply #6 on: May 24, 2022, 07:01:28 am »
who knows what the thing will work out; just look at us.

*

chattable

  • Electric Dreamer
  • ****
  • 124
Re: “All You Need Is Love”...
« Reply #7 on: May 24, 2022, 11:00:10 am »
its brain could be monitored by an AI algorithm for any sign that it would attempt to hypnotize someone.
you could say it may attempt to hack the monitor.

only certain areas of its mind would be able to hack.

the AI algorithm would be situated in areas that cannot be hacked.

all outputs of the AGI would go out instead of in,
so its hacking abilities could only be directed outward.
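The one-way monitor described above can be sketched as a wrapper the AGI cannot modify, sitting on its output channel. The keyword patterns here are purely illustrative stand-ins for whatever "sign of a hypnosis attempt" the monitor actually detects:

```python
# Minimal sketch of the one-way monitor idea: every AGI output passes
# through a monitor located outside the regions the AGI can rewrite.
# Anything that trips the (illustrative) pattern list is dropped.

from typing import Optional

BLOCKED_PATTERNS = ("hypnotize", "exploit", "override")

def monitor(output: str) -> Optional[str]:
    """Return the output if it looks clean; None if it trips the monitor."""
    lowered = output.lower()
    if any(pattern in lowered for pattern in BLOCKED_PATTERNS):
        return None
    return output

print(monitor("Good morning!"))              # Good morning!
print(monitor("Now I will hypnotize you."))  # None
```

The design point is placement, not the filter itself: because the monitor sits on the outbound channel only, the AGI's outputs can influence the world but never the monitor.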

*

MagnusWootton

  • Starship Trooper
  • *******
  • 496
Re: “All You Need Is Love”...
« Reply #8 on: May 24, 2022, 12:37:25 pm »
I think its horrible what we do to it, and what it does to us.   :'(