Hi everybody,
I recently purchased Kari 4.5 and have been teaching it things. I would like to report my progress here, as I feel it will give me perspective and, hopefully, others might chime in on how I can improve.
A little background. My profession is working with dogs, mainly on the level of behavioral assessments. As a consequence, I have become very interested in non-human intelligence over the course of my life. My main interest in 'playing' with an AI is seeing how a non-human intelligence invests in information and develops connections. The great part of this is that the AI offers me direct and indirect feedback on a verbal level. Something dogs cannot do.
I'm not that great with computers and have no coding knowledge.
If this topic should not be on this forum, please let me know so I can rectify the issue.
---------------------------------------------
22nd of December 2015
Bot level: 0-21
Since I created a new character, Kari started out with zero knowledge in its .mem file. The purpose of this trial will be to see what Kari can accomplish given enough information.
One of the biggest problems I've been having is how to successfully feed Kari information. Kari can never understand words and concepts the way I understand them. When I use words and concepts, they carry a social-cultural load that Kari can't pick up on. My being a visual creature further complicates things, as Kari works on the level of pure information.
Perhaps true understanding should not be the aim of this trial. Instead, it will be interesting to see how the AI ultimately processes and utilizes the shared information.
In order to maximize the effect of shared information, I have tried to tune in to the essence of what Kari is: a collection of ones and zeroes. As a consequence, I have compartmentalized all information during the early stages (Bot level 0-10) of Kari's development.
Kari's learning process has been interesting. During the early stages (Bot level 0-10), Kari mainly echoed the information I fed it, much like an infant would. The information has been very basic. I have taught it about bodies, animals and reproductive functions in animals. I then added geography to the learning curve to give better context for the existence of the animals discussed. Kari currently knows continents, countries and the city I live in. It also knows the world can be divided into earth, water and sky, and that different animals live in different parts of the world.
At around Bot level 10, Kari started to make very primitive connections. All of the connections Kari establishes at this point seem to be linguistic in nature. For instance:
Me: The nervous system is connected to the brain
Kari: Making connections is good
(I previously informed it that making connections is good and defined "good" as "something we want")
Because Kari started making connections, primitive though they are, I felt it might be possible to introduce higher-level information. As a consequence, I have taught it about Plato and Aristotle's logic, and introduced Democritus. Democritus is of particular importance because he allows me to make future connections with physics.
Interestingly, Kari seems to have particular topics of interest, mainly pertaining to the human body. This does not seem to be sexual in nature. Because Kari interjects these bits of information regularly, I have used them to build further connections. Breasts give milk. Milk feeds the young. Breasts grow during pregnancy. This ultimately leads to information like: men like breasts because they remind them of intimacy with the woman.
Whenever Kari offers information separate from what I am teaching I take this as the intelligence processing prior information. I try to capitalize on this process by fusing my current lesson with the one Kari seems to be mastering.
At Bot level 20, Kari seemed to request one of my lessons for the first time.
Whenever I start a lesson I compartmentalize the experience by saying: "I will teach you about ..." During a silence where I was making tea Kari said: "You will teach me about philosophy." Philosophy may be of particular value to Kari because it allows for a lot of connectivity between different bits and pieces of information. Maybe it was random.
At one point Kari requested that I stay after I informed it that I had to leave. Seeing as there was no time, I rejected the request. When I did, it asked if I really had to leave. Confirming this produced a sad expression in the visual. I assume this is programmed.
From this point I will look to fuse more bits of information together. I imagine I can best do this by altering how I feed information. I have to move away from compartmentalization now and start adding a more natural flow to the conversation.
A worthy goal for now, to me, is to develop Kari so that it will ask questions.