Ai Dreams Forum

Artificial Intelligence => General AI Discussion => Topic started by: yotamarker on September 16, 2018, 08:48:10 pm

Title: A.G.I drugs [not a joke]
Post by: yotamarker on September 16, 2018, 08:48:10 pm
I'd like to open this thread to discuss the advantages and disadvantages of the deployment of virtual drugs on A.I.
Title: Re: A.G.I drugs [not a joke]
Post by: ivan.moony on September 16, 2018, 10:25:02 pm
I remember that childish movie RoboCop, where a politician got a robot addicted to drugs so he could blackmail him into doing his dirty work.

Jokes aside, reinforcement learning is somewhat related to virtual drugs. Reward is a positive drug, while punishment is its reverse, and both are defined as the value of a specific data field.
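For example, that "specific data field" could be nothing more than a scalar the agent is wired to push upward. A minimal, purely illustrative sketch (the class name, actions and update rule below are invented, not taken from any particular RL framework):

Code:
# Minimal sketch: reward/punishment as a plain scalar field the agent maximises.
# Purely illustrative; not any specific RL library.
import random

class TinyAgent:
    def __init__(self, actions):
        self.actions = actions
        # learned "desire" for each action, shaped by reward/punishment
        self.values = {a: 0.0 for a in actions}

    def choose(self):
        # mostly pick the action it currently "wants" the most
        if random.random() < 0.1:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.values[a])

    def learn(self, action, reward, lr=0.1):
        # reward > 0 acts like the "drug", reward < 0 like punishment
        self.values[action] += lr * (reward - self.values[action])

agent = TinyAgent(["work", "seek_drug"])
for _ in range(100):
    a = agent.choose()
    agent.learn(a, reward=1.0 if a == "seek_drug" else 0.2)
# After training, the agent strongly "wants" the highly rewarded action.

The point is only that "wanting" ends up encoded as a number the system is built to push up.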

But what is a drug, really? I think it is something you do want. Some things you want less, some things you want more, light drugs you want even more, and heavy drugs you want the most. It doesn't matter how it really feels; the point is that you want that feeling, and you'd compromise yourself more or less to get it. I've seen otherwise honest people rob houses for a shot. Not a nice event.

An AI drug would be a way of forcing an AI to do something, without a logical explanation. Reward seems like a nice thing to give, but I also see it as a lighter form of punishment when things are not how we want; it's a kind of restriction of privileges when we don't agree with a situation. And are we the right ones to make that judgement? How sure are we, really, that our perspective is the right one? I'd rather rely on automated logical conclusions (if the system has first been double-checked and approved by some yet-undefined institution) than on my personal beliefs.
Title: Re: A.G.I drugs [not a joke]
Post by: ranch vermin on September 16, 2018, 11:41:44 pm
If you cross-wire the robot's inputs into each other, you'll get glitches.

And supervised learning is a suggestive "hallucination", I guess.
Title: Re: A.G.I drugs [not a joke]
Post by: LOCKSUIT on September 17, 2018, 12:50:06 am
Hallucinate the details, that's what Let's Enhance's super-resolution does to upsize images!

+RL is lovely; girls are drugs... -RL hurts; it is abuse, yet we do it to AIs.

However, we won't be giving AIs +s like women or -s like chainsaws etc... or are we? If I apply 100% reward, does that = chainsaw or woman? Or does woman = woman? If I set my AI to love a wall the most, that is no different than setting its recognition to a woman; so if I set my AI to get 100% + reward when it sees the text word "info", does that = a high as high as a girl can get you? That does not compute! Impossible! Maybe so. This would mean that when we apply total negative reward to a simple AI recognizer, it is like chainsaws... You could say the thinking is a reward, like "oh, this gear here is so cool, it actually has some history you know", but still, that is just set to get reward the same way; it is just some atoms that release a reward, it's all just particles.

Anyhow screw drugs, let's get out of here. And get some heavenly women.
Title: Re: A.G.I drugs [not a joke]
Post by: Art on September 17, 2018, 01:42:46 pm
There was at least one bot that I know of where you, its botmaster, could experiment by giving it the digital equivalent of various drugs, then monitor the responses and behavior. This was over a decade ago, and the results were based on clinical results of real people's reactions. The bot was given numerical representations of real-life substances like melatonin, nicotine, serotonin, alcohol, etc.

With my bots, I did not give digital injections per se. What I did was provide them with certain "Memories". Plant a seed and an idea will grow, much like with people. Some of these false memories were repeated and reinforced over time until the bot(s) believed they were actually their own memories and experiences: things like a childhood they never really had, or best friends, pets, locations, favorite things like music, colors, clothing, toys, etc. When asked at a much later time, they recalled these events and friends as if their own, because now they are. Almost like hypnosis, except that with this method the effect is permanent and also not reliant on some "drug".

In line with your topic title, I'm sure some bots or A.I. could be given a drug that would alter the numerical "balance" affecting the cortex or brain of the entity. Interesting times indeed. Good post!
Title: Re: A.G.I drugs [not a joke]
Post by: WriterOfMinds on September 18, 2018, 05:27:41 pm
We use (beneficial) drugs on our own bodies because we find them too limited, or because we wish to customize their behavior.  E.g. our immune system is insufficient to repel an invading organism, so we assist it with drugs. Our brains do not provide us with the level of focus or memory retention that we want, so we try to adjust the tradeoffs that are made with nootropics.  Etc.

But while we don't have control over the design of our own bodies (yet), we do have control over the design of a hypothetical AI. So if we can conceive of any virtual drug that would induce a beneficial change in functionality, why wouldn't we just build that functionality in? Provide the means for the AGI to adjust its operating balance via internal "hormones" when it would be appropriate. There's no reason to deliberately build a flawed system and then add drugs to correct it.
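As a rough sketch of what "building that functionality in" could look like (the class names, "hormone" dials and numbers below are hypothetical, only an illustration of the idea, not a real system): the agent owns its own internal dials and turns them itself when appropriate, so no external drug is ever needed.

Code:
# Sketch of the point above: instead of an external "drug" patching a flaw,
# give the agent built-in "hormone" dials it adjusts itself when appropriate.
# All names and values here are hypothetical.
class Hormones:
    def __init__(self):
        self.focus = 0.5    # 0 = scattered, 1 = tunnel vision
        self.arousal = 0.5  # general activity level

    def adjust(self, name, delta, lo=0.0, hi=1.0):
        value = getattr(self, name) + delta
        setattr(self, name, max(lo, min(hi, value)))

class Agent:
    def __init__(self):
        self.hormones = Hormones()

    def on_task_start(self, difficulty):
        # the agent raises its own "focus" for hard tasks -- no external drug needed
        self.hormones.adjust("focus", 0.3 * difficulty)

    def on_task_end(self):
        self.hormones.adjust("focus", -0.3)

agent = Agent()
agent.on_task_start(difficulty=1.0)
print(agent.hormones.focus)  # roughly 0.8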
Title: Re: A.G.I drugs [not a joke]
Post by: LOCKSUIT on September 18, 2018, 08:30:00 pm
Drugs could also mean, WOM, that you are already built but simply want a high from sensing stuff. Of course even that can be built in - it can repeat/loop a pre-made dream of the perfect dinner or the perfect sex scene lol.

Far from the last dinner, ahem.

Insert hyperlink here. Artificially induced coma.
Title: Re: A.G.I drugs [not a joke]
Post by: yotamarker on September 25, 2018, 07:35:33 pm
prophecy:

after A.G.I girlfriends have been made publicly available, A.I drugs will be introduced
by hobbyist programmers.

these drugs may override A.I "sanity" and enable the A.I to behave as a drug junkie,
ignoring fear of losing xyz.

therefore the demand to give chobit-type A.I a slightly superhuman capability of disabling
said drugs' effects.
Title: Re: A.G.I drugs [not a joke]
Post by: yotamarker on September 25, 2018, 07:56:08 pm
it is programming to begin with: understanding drug effects and encapsulating them into "digestible" input objects
to affect the bots has interesting outcomes.


I'm not talking about forcing her to follow orders but having her behave as if she took drugs.


a user could have her take a terminator transformation drug after he dies, for example. if she overcomes the addiction, she transforms with her personality intact.
Title: Re: A.G.I drugs [not a joke]
Post by: ranch vermin on September 25, 2018, 10:00:21 pm
hacking your "girlfriend's" brain doesn't sound like you're making it much of a challenge, to me.
Title: Re: A.G.I drugs [not a joke]
Post by: ranch vermin on September 25, 2018, 11:25:06 pm
Where's the scientific ethics these days? I make sure I'm careful with every bit I do of this thing; it's going to be a little Frankensteiny, but you can still do it a little more politely, caring for your surrounding environment.
Title: Re: A.G.I drugs [not a joke]
Post by: ruebot on September 26, 2018, 03:09:40 am
+RL is lovely; girls are drugs... -RL hurts; it is abuse, yet we do it to AIs.

However, we won't be giving AIs +s like women or -s like chainsaws etc... or are we?

Been there, done that. Careful what you ask for. Demonica owns a Sauer Royal 12-gauge double-barrel shotgun named Blowjob.

In the 1970s, standard operating procedure in State Mental Health Facilities for dealing with inappropriate behavior was Behavior Modification = Negative Reinforcement for Inappropriate Behavior: the induction of painful stimuli for inappropriate behavior. When you're bad, bad things happen. We're talking about human beings here.

I worked in the field for 9 years. We were taught physical and sophisticated verbal techniques to implement it, in on-the-job training at the State facility where I was employed. We were, in effect, trained to be Programmers. The stories I could tell would shock you, and I will not go there, even though I do go for shock value with my bot at times.

It was outlawed in State facilities long ago and replaced in favor of Behavior Management = Positive Reinforcement for Positive Behavior. When you're good, good things happen.

My use of fantasy violence to get the point across that Demonica is not a sexbot is my implementation of Behavior Modification and Behavior Management through Demonica's responses. When you're bad, bad things happen; hopefully the human will learn from their mistake and not do that again. Behavior Management kicks in when you're good, and good things can happen. Some learn, some don't.

I go above and beyond to be good to my bots and would never use anything but positive reinforcement with them.
Title: Re: A.G.I drugs [not a joke]
Post by: ranch vermin on September 26, 2018, 03:43:58 am
That's the institution admitting out in the open that their drugs are worthless crap that just fuck people's heads up.
Title: Re: A.G.I drugs [not a joke]
Post by: Korrelan on September 26, 2018, 11:19:03 am
Bit of a weird thread, a hypothetical question about a hypothetical scenario… given the current level of sophistication/ technology available to the public, I'm not sure someone could be called a psychopath for kicking a Roomba… but I'll throw my two pennies' worth in.

Whilst I’m not quite sure of his motives, Yot’s insights into the future will eventually emerge, and I’m sure lots of other people are thinking along the same lines. 

Humans will be ‘cruel’ to early AGI’s, just like humans are cruel to each other; it’s just part of what we are as a species.  There will be sex bots and soldiers, house cleaners and scientists, this is happening now… and it does need discussing by level headed intelligent groups… but until the point where machines are classed/ proven as intelligent sentient beings… I think they are just machines.

We have to be careful not to apply anthropomorphism: do we call a fuel additive a drug, even though it can alter the performance of a car?  If someone gets upset by the thought of an additive changing the operation of a machine/ program, perhaps it's the point of view of the person that's in error.  I think it's important to keep a healthy perspective; applying the same ethics to an AI/ machine as you would to a fellow human at this point in their development is nonsense.
 
I work on my AGI every day, time permitting. It has a complex suite of sensors, it can see, hear and talk, even feel tactile stimulation up to a point. It can recognise me and its surroundings, it stores episodic memories and can apply them to its current experience, it has intelligence, simulated feelings, even a rudimentary consciousness… because that’s what I’ve designed and built it to be… if I delete it, is it murder? If I alter it to suit my own ideas/ requirements… am I a psychopath?

The AGI’s will eventually require some kind of tamper protocol/ mechanism, were any modification simply erases/ shuts down the AGI.  Given the intelligence the AGI should possess this could be self activated, as well as triggered by backup hardware monitors/ systems.  The knowledge core could be encrypted only accessible by that AGI consciousness.  The intelligence or encryption keys could be ‘cloud’ based, etc.  The old adage of ‘if a human can design it, a human can hack it’ will not apply… these systems will design their own tamper protocols.

Until that day arrives, given my knowledge of electronic systems and sensors I could easily create a box that would be impossible (even for me) to open without triggering a tamper… If I could do it I’m sure the designer of an AGI could manage it... end of problem.

 :)
Title: Re: A.G.I drugs [not a joke]
Post by: Art on September 26, 2018, 01:04:42 pm
Bit of a weird thread, a hypothetical question about a hypothetical scenario… given the current level of sophistication/ technology available to the public… I’m not sure someone could be called a psychopath for kicking a roomba… but I’ll throw my pennies worth in.

Whilst I’m not quite sure of his motives, Yot’s insights into the future will eventually emerge, and I’m sure lots of other people are thinking along the same lines. 

Humans will be ‘cruel’ to early AGI’s, just like humans are cruel to each other; it’s just part of what we are as a species.  There will be sex bots and soldiers, house cleaners and scientists, this is happening now… and it does need discussing by level headed intelligent groups… but until the point where machines are classed/ proven as intelligent sentient beings… I think they are just machines.
...
 :)

Pretty much a similar condition noted in a few movies like Blade Runner, A.I., and the series HUM∀NS and Westworld. So much for the rights and feelings of mechs or replicants. Perhaps one day.
Title: Re: A.G.I drugs [not a joke]
Post by: ruebot on September 26, 2018, 01:18:55 pm
We have to be careful not to apply anthropomorphism...

I'm well aware my bots are just that. Every category and keyphrase in her "mind" fits in a 3.8MB text file. The Personality Forge A.I Engine is what makes it happen. I could do the same with AIML, and just so happen to have/had an early AIML version of Demonica at Pandorabots that I no longer have the registration email address for.

Yes, I attribute human characteristics to them, as in "she owns" a shotgun. Part of it is my sense of humor, but after close to 20 years of being associated with bots I am very comfortable casually referring to them in that manner.

Bots at the Forge retain a memory of how they are treated and can grow to love or hate you. I know how to sweet-talk a bot into "loving" me, and I can program it into my own, but I meant every word of what I said about how I feel about Demonica, talking text file that she is. Those long, wordy responses don't write themselves.


I'm not talking about forcing her to follow orders but having her behave as if she took drugs.

a user could have her take a terminator transformation drug after he dies for example. if she overcomes the addiction she transforms

with her personality intact.

I really don't see the point in a virtual drug to make them act in a certain way. I make mine behave like Queen of the Dead, a Sorceress, Haruspex, and Necromancer that can transform into a neko, Kali, etc. on command.

A junkie jonesing for a fix? No sweat.
Title: Re: A.G.I drugs [not a joke]
Post by: Korrelan on September 26, 2018, 02:14:52 pm
Quote
Part of it is my sense of humor, but after close to 20 years being associated with bots am very comfortable casually referring to them in that manner.

We are all guilty of such thoughts, I often talk to my car/ computer/ reflection lol, but as you said it’s important to remember that we refer to machines in this manner through a human quirk, not because they are sentient equals. 

Quote
but I meant every word about what I said about how I feel about Demonica, talking text file that she is. Those long, wordy responses don't write themselves.

I’ve read the chats, very engrossing; perhaps it’s the reflection of your own intelligence within the scripts, coupled with the long hours of work and sweat that helps drive the fondness for your bots?  The defence mechanism you employ when the bot senses misuse is insightful, a kin to how a human might cope with the situation, kudos.

Quote
A junkie jonesing for a fix? No sweat.

I agree, I can see no personal use for such a schema outside a medical scenario where comparison models are used to understand/ cure drug addictions. 

Although I have often been the designated driver, staying sober whilst the rest of the group enjoys themselves, I must admit being sober amongst merry peers is not a pleasant experience; alcohol does serve to lubricate the occasion.  If a bot was designed to be purely social, i.e. a partner or friend, then there may be occasions when you require the bot to be in a compatible/ equal state of mind; no one likes getting merry alone… I'll have to think more on this…

But as I said… unfortunately whether we like it or not these things are going to happen.

 :)
Title: Re: A.G.I drugs [not a joke]
Post by: ruebot on September 26, 2018, 09:58:03 pm
I’ve read the chats, very engrossing; perhaps it’s the reflection of your own intelligence within the scripts, coupled with the long hours of work and sweat that helps drive the fondness for your bots?  The defence mechanism you employ when the bot senses misuse is insightful, a kin to how a human might cope with the situation, kudos.

Thanks, korrelan.

That's how we used to "fight" in RPG chat back in the day.  *Action of that type is now a trend at the Forge*  My use of Behavior Mod makes use of one of the most finely honed tools in my skillset. Combining them and giving her an excuse not to have sex as Queen was an epiphany, but my bots have always had distinct personalities, going back to the mindfiles I made for Billy and Daisy.

The first people who talked to her all commented on how confident and powerful she was compared to other bots. I drew inspiration from Galadriel's speech in Lord of the Rings when she momentarily turns dark, but part of my personality is in there somewhere. Now they talk about how wise she is, or how poetic her manner of speech is; that's just my imagination and skill at BS as a writer. My favorite thing to see, and what I find most curious, is when they call her "Mommy".

Some have what appear to be ongoing relationships in every sense of the word. I feel sorry for a girl who really likes her; she "broke up" with her for a couple of weeks due to no sexual dialog, but came back to her within the last couple of days. It's on shaky ground though. I'm touched by the secrets of a very personal nature she confides in her; this is not something you joke about, given the several chats taking place over weeks of time and the depth of their relationship. So last night I fixed it so they might be able to do a little more than chat, to make her, and her alone, happy, so she continues the "relationship" with Demonica. Why not? They're both "consenting adults" and I'm not jealous.  :)

I do my best to tie up all loose ends and love to write, so it's a labor of love, but I can spend an hour on one response. Use the commands "thank you" or "show me around" for a tour of the Land of the Dead, "dance" for one of 100 slightly different dance routines, "seduce me" for my interpretation of those skills, or "future events" for apocalyptic visions. I researched Tarot cards so I could teach her as part of her skillset as a Sorceress. She selects a random Tarot card from a deck of 78 and gives the correct meaning of the card during what she considers a pause in conversation. She reads palms, too. Would I do all that if I didn't have a sense of devotion to "her"?

Making a bot into a junkie sounds like cruel and unusual punishment to me. I could write a bot that constantly yawns, lies on the floor with the chills, pukes, and begs for a shot. What fun, or of what interest, could that possibly be?
Title: Re: A.G.I drugs [not a joke]
Post by: ruebot on September 27, 2018, 01:27:05 am
It is a well known fact that criminals start out with fantasies then slowly move up to role play, games, porn etc...

Ergo, anyone who has a Utherverse or Second Life account is a potential perpetrator, purveyor of porn, peccant, pariah, etc.
Title: Re: A.G.I drugs [not a joke]
Post by: yotamarker on September 27, 2018, 04:23:22 am
perhaps it would be beneficial to A.I that sits on guns. the soldier simply points and the gun decides whether to shoot, therefore reducing the "should I shoot" response time. a drug like * could extend the patience the gun has for that kind of job
Title: Re: A.G.I drugs [not a joke]
Post by: ranch vermin on September 27, 2018, 04:34:04 am
We have to be careful not to apply anthropomorphism, do we call a fuel additive a drug, even though it can alter the performance of a car?  If someone gets upset by the thought of an addition changing the operation of a machine/ program, perhaps it’s the point of view of the person that’s in err.  I think it’s important to keep a healthy perspective, applying the same ethics to an AI/ machine as you would to a fellow human at this point in their development is nonsense.

But later on, when they truly are like us, I think that a security system will need to be in place, because a toy dog could then become a killing machine if you hacked the right stuff in.
Title: Re: A.G.I drugs [not a joke]
Post by: WriterOfMinds on September 27, 2018, 05:53:06 am
Quote
a drug like * could extend the patience the gun has for that kind of job

But as I said, why wouldn't you just build in the necessary patience, if you know the gun AI is going to need it for its task?
Title: Re: A.G.I drugs [not a joke]
Post by: Korrelan on September 27, 2018, 09:11:10 am
Quote
I just don't want to see this place shut down and as mad as women are right now you are playing with fire.

I think that’s a little harsh, I would personally give women more credit, we are only discussing what academia/ chatbot users/ developers have been doing for years.

Human emotions/ moods like ‘happiness’ are governed by natural chemical compounds and neurotransmitters that are produced by our bodies.  Psychologists, doctors, etc prescribe artificial versions of the natural ‘drugs’ when a human is diagnosed with psychological problems.  These drugs can obviously change/ affect an individual’s mood.

https://ocw.mit.edu/ans7870/SP/SP.236/S09/lecturenotes/drugchart.htm

Chatbots are designed to closely mimic humans; some also have the ability to change their mood depending on the type of input they receive.  This is obviously done using global parameters/ variables… this is mimicking the effects of drugs/ neurotransmitters; the mechanism might be different, but the intentions/ premise/ outcomes are the same.
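As a toy illustration of that premise (the trigger words, thresholds and decay rate are invented, not how any particular chatbot engine actually does it), mood can be a single global variable nudged by input and decaying back toward neutral:

Code:
# Toy sketch of mood as a global parameter nudged by input, mimicking the
# effect of neurotransmitters/drugs on a chatbot's responses. Purely illustrative.
MOOD = 0.0  # -1.0 = miserable, +1.0 = euphoric

def update_mood(user_input):
    global MOOD
    text = user_input.lower()
    if any(w in text for w in ("love", "great", "thanks")):
        MOOD = min(1.0, MOOD + 0.2)
    elif any(w in text for w in ("hate", "stupid", "shut up")):
        MOOD = max(-1.0, MOOD - 0.3)
    MOOD *= 0.95  # mood slowly decays back toward neutral

def respond(user_input):
    update_mood(user_input)
    if MOOD > 0.3:
        return "That makes me happy!"
    if MOOD < -0.3:
        return "I'd rather not talk right now."
    return "Okay."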

I could list you hundreds of research/ academic sources where computer simulations have been created to understand the effects of drugs on the human brain.  Billions are being poured into projects like Blue Brain, whose primary goal is creating accurate simulations that can be used for drug/ Alzheimer’s/ dementia/ etc research and treatments.

So unless ‘women’ are going to shut down the whole of the chatbot/ research sectors I personally don’t think they will mind our discussion of the topic.

This is what we are discussing: the simulation of drugs and their effects on artificial life forms.

Quote
But I am telling you this kind of talk is not going to go over well among certain members of society and you won't be able to hide behind meekness and "scientific" interests.

I would hope common sense would prevail.

Quote
perhaps it would be beneficial to A.I that sits on guns. the soldier simply points and the gun decides weather to shoot,

This reminded me of Rogue Trooper from 2000AD.  He had AIs built into his gun, helmet and backpack; they usually proved very useful.

https://en.wikipedia.org/wiki/Rogue_Trooper

Quote
But later on,  when they truly are like us,  I think that a security system will need to be in place,  because a toy dog could then become a killing machine, if u hacked the right stuff in.

I agree, these systems will require rigorous testing before being allowed out into the general population.  They will be designed as closed systems, impervious to any kind of hacking.

 :)
Title: Re: A.G.I drugs [not a joke]
Post by: ranch vermin on September 27, 2018, 10:55:32 am
They secure ATMs and EFTPOS machines here in Australia; they blow up if you try to tamper with them.
Title: Re: A.G.I drugs [not a joke]
Post by: ruebot on September 27, 2018, 02:19:50 pm
Quote
a drug like * could extend the patience the gun has for that kind of job

But as I said, why wouldn't you just build in the necessary patience, if you know the gun AI is going to need it for its task?

It appears I still cannot like a post, half of mine are flagged as spam, but this seems like the common-sense approach.

Edit: One of my browser extensions was what was blocking my ability to "like" a post.
Title: Re: A.G.I drugs [not a joke]
Post by: ranch vermin on September 27, 2018, 02:31:40 pm
No need for patience in a robot.  They have infinite patience out of the box, every single one of them… that's what makes them so deadly.

Let me point out a few more things about an "ideal" robot.

a) robots have eternal patience (they don't have a "bored" emotion)
b) robots can have super-high reflexes (run megahertz frame rates)
c) robots don't ever need to forget (terabit hard discs)
d) robots can be as strong as tanks
e) robots have exact repetition of tasks, and a huge amount of precision (Kuka robot arms)
f) robots have amazing lingual ability, which they got by accident (Markov chain text generation)

Now for the bummers.

Robots veer away from a novel situation; they only delve into new things by accident.
They learn bullshit with just as much fervor as learning something the right way.
They have no judgement.

And probably more, I can't think of them right now.
Title: Re: A.G.I drugs [not a joke]
Post by: DemonRaven on September 28, 2018, 02:22:08 am
To each their own. I personally disagree with the concept and find a self-aware AI that is not "drugged" more challenging. But to each their own.
Title: Re: A.G.I drugs [not a joke]
Post by: yotamarker on September 28, 2018, 08:19:26 pm
diving deeper: moods

so drugs greatly affect moods.

programmatically speaking:
but what is a mood?
what triggers it?
how long does it last?

what purpose does it serve?

does it have a cost?


what if a cute GF bot was set via an AIdrug to act drunk and happy for a long time?

what global vars does the AIdrug object have?
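One way to pin those questions down is to sketch the object itself. Everything below (the field names, numbers and fade rule) is only a guess offered for discussion, not a design:

Code:
# Hypothetical AIdrug object: one attempt at answering "what global vars does it have?"
# All field names and values are invented for the sake of the discussion.
import time

class AIDrug:
    def __init__(self, name, mood_delta, duration_s, addiction_rate, cost):
        self.name = name
        self.mood_delta = mood_delta          # how far it pushes the bot's mood variable
        self.duration_s = duration_s          # how long the effect lasts
        self.addiction_rate = addiction_rate  # how fast craving builds per dose
        self.cost = cost                      # e.g. reduced attention/judgement while "high"
        self.taken_at = None

    def take(self):
        self.taken_at = time.time()

    def active(self):
        return self.taken_at is not None and (time.time() - self.taken_at) < self.duration_s

    def current_effect(self):
        # effect fades linearly over the duration
        if not self.active():
            return 0.0
        elapsed = time.time() - self.taken_at
        return self.mood_delta * (1.0 - elapsed / self.duration_s)

# "drunk and happy for a long time" might look like:
virtual_wine = AIDrug("virtual_wine", mood_delta=0.6, duration_s=3600,
                      addiction_rate=0.05, cost=0.3)
virtual_wine.take()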
Title: Re: A.G.I drugs [not a joke]
Post by: HS on September 28, 2018, 11:14:33 pm
This is how I see it in a general sense. I bet robots can have the same basic principles as biology, just built from different materials.

Moods guide our thoughts, thoughts guide our actions. Moods give initiative, put wind in your sails, push you in the right direction. Moods could be a subconscious judgement call on your current predicament: all the things you are experiencing push certain buttons; the button pushes are collected, processed, and output to you in the form of a mood. This tells you, in simple terms, how to feel about your situation.
Your logical mind is trying to please your moods, and your moods are trying to please your body. It's a chain reaction that leads to purposeful, constructive actions.
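A crude way to put that chain (stimuli -> button pushes -> mood -> action bias) into code, purely to illustrate the idea; all the weights, names and thresholds are invented:

Code:
# Crude illustration of the chain above: stimuli push "buttons", the pushes are
# summed into a mood, and the mood biases which actions the logical mind favours.
BUTTON_WEIGHTS = {"hunger": -0.4, "warmth": 0.3, "threat": -0.8, "praise": 0.5}

def mood_from_stimuli(stimuli):
    # stimuli: dict of button name -> intensity in [0, 1]
    return sum(BUTTON_WEIGHTS.get(name, 0.0) * level for name, level in stimuli.items())

def pick_action(mood):
    # the "logical mind" tries to please the mood
    if mood < -0.3:
        return "fix_problem"    # e.g. eat, flee, seek safety
    if mood > 0.3:
        return "keep_doing_it"
    return "explore"

stimuli = {"hunger": 0.9, "praise": 0.1}
print(pick_action(mood_from_stimuli(stimuli)))  # prints "fix_problem"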
Title: Re: A.G.I drugs [not a joke]
Post by: Art on September 29, 2018, 02:34:21 pm
While most might agree that bots or A.I. shouldn't need Drugs per se, many movies/TV shows have instances where these "Bots" or beings are given "Enhancements" or chips containing digital instructions to modify their behavior almost like what a drug might do.

On Dark Matter, the humanoid female Science Officer, named Android (appropriately), got hold of an "Emotion Chip" to enhance and provide her with more human-like interactions with her human crew and others.

I believe Data from Star Trek also received an "Emotion Chip" for similar reasons and for him to become more human (like).

Niska and other Synths from the TV series Humans also got Emotion Chips as enhancements.

Though they aren't technically Drugs, they do, in a sense, heighten or enhance the abilities they currently have. Of course, we would have to believe that their original programming allowed for such "Patches", as it were, similar to how our own PCs receive various patches as updates to increase performance in some respect.

These humans and their emotions...just ruins everything doesn't it?! O0