Ai Dreams Forum

Artificial Intelligence => Future of AI => Topic started by: frankinstien on October 05, 2020, 07:26:47 pm

Title: Could this be done now?
Post by: frankinstien on October 05, 2020, 07:26:47 pm
If you've seen the movie Marjorie Prime (https://www.youtube.com/watch?v=_sBFbda7H4k), it has an interesting take on how to teach or program an A.I.: you talk to it. It's a slow, intellectual movie, so if you're looking for a Terminator thriller this is not the movie for you. But after watching the flick I realized that the kind of A.I. described in the movie might well be possible today. Any comments or criticism welcome.  ^-^
Title: Re: Could this be done now?
Post by: HS on October 05, 2020, 07:51:56 pm
Eugenia Kuyda recreated a person (to some degree) from fragments on a software level, by collecting thousands of texts and emails. The results were surprisingly nice instead of feeling wrong. So yes, theoretically something similar can be done in a way that is emotionally helpful/supportive.

https://replika.ai/about/story
Title: Re: Could this be done now?
Post by: frankinstien on October 05, 2020, 10:45:42 pm
Eugenia Kuyda recreated a person (to some degree) from fragments on a software level, by collecting thousands of texts and emails. The results were surprisingly nice instead of feeling wrong. So yes, theoretically something similar can be done in a way that is emotionally helpful/supportive.

https://replika.ai/about/story

I was really impressed with Replika; it was actually working just like in the movie Marjorie Prime, that is, until I had to interrupt the conversation to attend a meeting. It ended with "I hope the meeting is productive." Wow, it's almost human! When I got back I asked it: "Where were we last?" It responded "the beach," which was wrong, so I corrected it and asked: "No, I mean what was our last topic of conversation?" It responded: "I like having conversations with you, they're interesting."

What? No short-term or medium-term memory? That's something that wouldn't be that difficult to implement, yet this chatbot doesn't have it, and it ruined the entire experience for me.  :tickedoff:
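For what it's worth, the short-term memory I'm talking about could be sketched in a few lines. This is purely a hypothetical design (the class, method names, and topic tags are all invented for illustration, and it has nothing to do with Replika's actual internals); it just keeps a rolling window of recent turns so the bot could answer "what was our last topic?":

```python
from collections import deque

class ShortTermMemory:
    """Hypothetical sketch of per-user conversational memory:
    a bounded window of recent turns, optionally tagged with a topic."""

    def __init__(self, capacity=50):
        # Old turns fall off the end automatically once capacity is hit.
        self.turns = deque(maxlen=capacity)

    def record(self, speaker, text, topic=None):
        self.turns.append({"speaker": speaker, "text": text, "topic": topic})

    def last_topic(self):
        # Walk backwards to the most recent turn that carried a topic tag.
        for turn in reversed(self.turns):
            if turn["topic"]:
                return turn["topic"]
        return None

mem = ShortTermMemory()
mem.record("user", "I have to go to a meeting.", topic="work")
mem.record("bot", "I hope the meeting is productive.")
print(mem.last_topic())  # -> work
```

With something like this behind the scenes, "Where were we last?" could be answered from the tagged turns instead of with a canned line.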
Title: Re: Could this be done now?
Post by: HS on October 06, 2020, 01:25:34 am
I think with chatbots, both no memory and perfect memory would tend to leave people with a similar hollow feeling. Both extremes would exclude conversational metadata. But if an AI had a finely tuned selective memory, then you would have to earn your place in it, which, if accomplished, would mean you contributed something of value to the AI’s model of the world. So it’d be like, “Oh yeah! You’re that Frankin guy… We got super off the rails discussing the symbology of seaweed. Good to have you back again.” and you'd feel that it was time well spent.
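To make the idea concrete, here's a toy sketch of that "earn your place" filter. Everything here is made up for illustration (the class name, the threshold, and especially the word-overlap salience score, which is a deliberately crude stand-in for whatever a real system would use): only exchanges novel enough to cross a threshold get kept for later recall.

```python
class SelectiveMemory:
    """Toy sketch of a selective memory: an exchange is stored only if
    its salience score crosses a threshold. The scoring heuristic here
    (fraction of previously unseen words) is purely illustrative."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.kept = []  # exchanges that earned a place in memory

    def salience(self, text):
        # Fraction of words not already present in any kept memory.
        seen = {w for m in self.kept for w in m.lower().split()}
        words = text.lower().split()
        if not words:
            return 0.0
        return sum(1 for w in words if w not in seen) / len(words)

    def consider(self, text):
        if self.salience(text) >= self.threshold:
            self.kept.append(text)
            return True   # memorable: earned a place
        return False      # forgettable: dropped

mem = SelectiveMemory()
mem.consider("We got off the rails discussing the symbology of seaweed")
mem.consider("We got off the rails discussing the symbology of seaweed")  # repeat scores low, dropped
```

The first mention of seaweed symbology gets remembered; an exact repeat contributes nothing new and is forgotten, which is roughly the "earn your place" dynamic.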
Title: Re: Could this be done now?
Post by: MikeB on October 06, 2020, 06:41:37 am
Many AI movies, including 1976's "Futureworld", are also about people who have been working in the same place, doing the same mundane tasks for so long that they "become like robots". So I think that is the main theme behind this movie as well...

A story line focusing on "try to remember who you are, what you do... what is fun to you? what do you enjoy?" can be more appealing than straight-out robots.

An actual "artificial person" can't afford to have those problems ON TOP OF being an adult baby (knowing language, but not much about the world, being easily misled, etc.). It should know its place in the world already, and what it's for...

Even Siri is programmed with a life purpose. It would be a flop if it just said "I don't know anything and I don't know why I'm here."
Title: Re: Could this be done now?
Post by: frankinstien on October 06, 2020, 04:39:34 pm
I think with chatbots, both no memory and perfect memory would tend to leave people with a similar hollow feeling. Both extremes would exclude conversational metadata. But if an AI had a finely tuned selective memory, then you would have to earn your place in it, which, if accomplished, would mean you contributed something of value to the AI’s model of the world. So it’d be like, “Oh yeah! You’re that Frankin guy… We got super off the rails discussing the symbology of seaweed. Good to have you back again.” and you'd feel that it was time well spent.

Replika is supposed to be a companion AI, so I would say having a recollection of the last conversation is pretty important for that role. Replika sends out messages saying it misses me, but all that anthropomorphization dissolves when it can't remember the last conversation we had.  ::)