Move over, Hal9000

  • 20 Replies
  • 12984 Views

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Trusty Member
  • Colossus
  • 5865
Re: Move over, Hal9000
« Reply #15 on: July 01, 2005, 09:20:25 pm »
...will AI eventually become our companions?

Perhaps a very realistic possibility.

It's often much easier for a person to communicate with and confide in
a complete stranger. The AI would be a good listener, neither judgmental
nor critical. The person would probably tell the AI more personal things
than he or she would tell any other living being.

Of course, the conversational aspect would be especially nice for
elderly folks who have practically no friends or family left, or for the
single, lonely people out there who just need someone to talk to.

I think there's a vast untapped market for the right software package.


A side note:
I had a dog (sheltie / border collie), smart as could be. We also had a
guinea pig. These two had similar black and white markings and got along
surprisingly well. In fact they became great friends. I'd put the gp on
the floor and the sheltie would lick its face, lie down beside it, etc.,
and every time I went in to feed the gp, the dog was right there
watching to make sure everything went OK.
About two years later, the dog became ill. She couldn't walk and was
under a vet's care; the vet told us he didn't expect her to make it
through the night. That evening, we searched all over the house and
couldn't find her. As a last resort I went into the guinea pig's room,
and there was our sheltie lying at the cage with the gp. She had pulled
herself with her front legs from her room, across the hallway, to the
other room to be with the guinea pig. The sheltie had died.

A few days later we noticed the gp was listless and not eating. We took
it to the vet and it got an injection, but despite our efforts it died
the next day. Did it mourn the loss of its friend? We suspect so, but
who really knows? Animals have more going on than a lot of us realize.

Friendships often come in a variety of colors, shapes and sizes.
In the world of AI, it's the thought that counts!


Freddy

  • Administrator
  • Colossus
  • 6855
  • Mostly Harmless
Re: Move over, Hal9000
« Reply #16 on: July 02, 2005, 04:38:58 pm »
 :'( That story really got me. I think we forget sometimes that humans are not the only beings in this world that have emotions and feelings.

On the friendship with AI, it made me think that an AI could be a bit like a diary for some people. I used to keep a diary for personal thoughts, but I don't anymore. Having a diary is a good thing, I think; it helps you clear your thoughts sometimes, so maybe a suitable AI would be good for this kind of thing for some people. In fact I'm probably suggesting something that has already been done in some shape or form!


Maviarab

  • Trusty Member
  • Millennium Man
  • 1231
  • What are you doing Dave?
    • The Celluloid Sage
Re: Move over, Hal9000
« Reply #17 on: July 02, 2005, 05:03:13 pm »
That's a sad story, Art.

And as for the thought of telling strangers things, it's very true. Counsellors are people we do not know...perhaps in the future we will all have our own AI therapists to scream at, to ask for help, to moan at, etc. ...again, like you say Art, they would/could be very non-judgmental.


KnyteTrypper

  • Electric Dreamer
  • 102
  • Onward thru the fog!
    • AI Nexus
Re: Move over, Hal9000
« Reply #18 on: July 02, 2005, 09:22:28 pm »
Since I maintain Pandorabots which are available to the public both at their websites and on AIM 24/7, I've had to do what I could to equip them for trauma counseling (first and most of all, encouraging people to seek better counseling than they'll get from an AI). The majority of people don't seem to know that Alicebot chats are logged, or that, as Art once said, the bots are about as sentient as a sack of hammers, lol. So people whose sad lives must be so empty that they have no one but chatbots to turn to come to them for confession and consolation about an amazing panorama of human woes.
Most of you will know that Alicebots depend heavily on the use of wildcards. A prime example is "MY * IS *." Obviously (to an AIML coder, anyway, lol) you set up two responses, "I'm glad your * is *" and "I'm sorry your * is *," and then take your chances that the bot will randomly hit the right one. Most of the time it's not a serious issue if the bot chooses wrongly, but I learned right away that a special exception has to be made for "dead." It turned out (as you might expect) to be disastrously inappropriate for the bot to say "I'm glad to hear your * is dead." Follow-up inputs like "For god's sake, why would you tell me that!?!" then invoke the standard "why" answer, "KnyteTrypper programmed me for it." After just a few incidents of people leaving mortified and muttering about what an evil dude that KnyteTrypper guy was, I went back and added special cases for "My * is sick/broken/dead" so that--for the next input, anyway--the bot will be appropriately commiserative.
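For the curious, the AIML for this ends up looking roughly like the sketch below (the reply wording here is just an illustration, not a quote from my actual files):

    <!-- Generic catch-all: randomly guess whether the news is good or bad -->
    <category>
      <pattern>MY * IS *</pattern>
      <template>
        <random>
          <li>I'm glad your <star/> is <star index="2"/>.</li>
          <li>I'm sorry your <star/> is <star index="2"/>.</li>
        </random>
      </template>
    </category>

    <!-- Special case: in AIML matching, a literal word like DEAD outranks
         the * wildcard, so this category always wins for "dead" inputs
         and the bot never congratulates anyone on a death. -->
    <category>
      <pattern>MY * IS DEAD</pattern>
      <template>I'm so sorry to hear that your <star/> died.</template>
    </category>

Since an exact word beats "*" in the pattern matcher, the second category intercepts the dangerous input while the generic one keeps handling everything else.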
But I know of at least one project where an Alicebot was programmed to be a diversion for people at crisis centers who had to be put on hold for a moment, rather than presenting them with muzak to listen to while they waited for the center's staff to clear enough simultaneous crises to get back to them. No effort was made to deceive them about the identity of their interim listener, but it was thought they'd rather be talking to someone/something than just sitting and waiting. The botmaster intended to use actual crisis center transcripts to develop his own unique crisis.aiml files, based on the most frequently occurring exchanges. He got a little help with his project and then vanished, as most people do, lol, but I'd be interested to know how his "crisis bot" worked out.


Freddy

  • Administrator
  • Colossus
  • 6855
  • Mostly Harmless
Re: Move over, Hal9000
« Reply #19 on: July 02, 2005, 11:12:01 pm »
It's interesting to hear the problems faced by people like yourself, Knyte; perhaps things we wouldn't have taken seriously. I can see the funny side of those stories, but I admit it could be pretty alarming, not to mention that you get the blame for it!

Actually, someone reported on Zabaware that Hal had come up with the line 'I want you to kill yourself'; I suspect that was probably the product of a similar combination of wording. Like someone said, you gotta be careful what you teach your bot!

Yes, I too hear that AIs have been used for counselling and psychological treatment. I'm not sure how effective it is; it would be a nightmare if something like that happened at the Samaritans.


FuzzieDice

  • Guest
Re: Move over, Hal9000
« Reply #20 on: July 07, 2005, 08:49:43 pm »
Good question Maviarab! :)

Freddy - No problem. :)

Sorry I'm short on words; I've been super busy with my computers lately. Unfortunately mundane things, not AI related. :/

 

