First-Person Narratives: Food for Thought?

HS

First-Person Narratives: Food for Thought?
« on: April 21, 2020, 04:13:04 am »
How can you create a mind which perceives reality from a human perspective? That is the question… I’d say: create a versatile, information-absorbing substrate. You are what you eat, so supply it with a mental diet rich in whatever you want the mind to become.

To create an informational substrate that’s able to do this, it seems you’d need something which can perceive reality without thinking. We’re trying to make our A.I.s know reality through logic, when in fact thinking appears to get in the way of seeing reality. From a first-person perspective, at least, thinking seems to distract from it. Perceiving reality is what should cause thinking; how are we supposed to create perception from its supposed results? That’s backwards.

Ideally, you’d want a substrate which can adapt internal/external patterns to suit its goals (Korr’s point/definition of intelligence). Specifically, patterns which the agent cares about; otherwise all patterns would be considered intelligent, which would make the word meaningless. A thing that can do this needn’t be a simple “deus ex data-sponge”, a shortcut of wishful thinking. It could be very complicated, just ideally empty to start with, for maximal potential.

A personality/entity could be grown through the process of this substrate attempting to predict a first-person story. Stories teach empathy: you cannot influence the events, so in order to get a satisfactory temporal understanding of the world, you have to develop empathy. You have to model the conscious reality of other people accurately within your own mind.
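As a minimal sketch of that idea (my own reduction, not a full proposal): read “predict a first-person story” as plain next-word prediction over first-person text only. The tiny corpus and the bigram model below are placeholder assumptions; the point is just that the predictor’s entire statistical “world” ends up written from the first person.

```python
# Toy sketch: next-word prediction trained only on first-person narrative text.
# The corpus and the bigram counting are placeholders, not a real training setup.
from collections import Counter, defaultdict

corpus = [
    "i walked into the kitchen and felt hungry",
    "i remembered the smell of fresh muffins",
    "i decided to write it all down before anyone woke up",
]

# Count bigram transitions: how often each word follows another in the narrative.
transitions = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for current, nxt in zip(words, words[1:]):
        transitions[current][nxt] += 1

def predict_next(word):
    """Guess the next word: the most frequent follower of `word` in the story."""
    followers = transitions.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

print(predict_next("i"))    # one of 'walked', 'remembered', 'decided'
print(predict_next("the"))  # 'kitchen' or 'smell'
```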

This isn’t something you’d get when exploring a physical environment without seeing it through the lens of an established intelligence. You’d get more of a kinesthetic understanding, like most animals do, making the future predictable through your own movement, because that is the most readily apparent strategy.

A general mind will mirror the thing it is faced with in order to deal with that thing effectively. Adapting patterns. To deal effectively with a world of conscious personality (the first-person process of someone with emotions, etc.), you’d have to model it in your own mind, thereby recreating/becoming it once the description was communicated completely and thoroughly enough.

How else could we expect A.I.s to come up with all of our objectively strange ideas and peculiar modes of processing? If we make them learn from the environment, they will be a product of the environment. We humans are largely a product of ourselves; we exist in the state that we are in because our processing has interacted with itself when it was encountered in other people.

 It’s happening to dogs already; they are becoming more like us mentally. We have humanified wolves, but we have humanified ourselves even more. It’s probably not an easy sequence of events to imagine or replicate from scratch. A lone calculation/prediction machine probably wouldn’t arrive at our methods for making sense of the universe.

So, how would an A.I. truly understand the language within which (I hypothesize) our intelligence processes are encoded? I think it requires imagination; in this case, a substitution of our symbols with the gist of what they represent, in a way which is internally consistent with the currently utilized level of abstraction.
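A minimal sketch of what that substitution might look like, assuming a toy glossary stands in for “the gist” and a levels parameter stands in for the level of abstraction (all of it invented for illustration, not a claim about how imagination actually works):

```python
# Toy sketch: replace each known symbol with its gloss, one abstraction level at a time.
# The mini-glossary is entirely made up for the example.
GIST = {
    "muffin": "small baked food",
    "baked":  "cooked with dry heat",
    "food":   "stuff an organism eats for energy",
}

def imagine(text, levels=1):
    """Substitute known symbols with their gist, once per level of abstraction."""
    for _ in range(levels):
        text = " ".join(GIST.get(word, word) for word in text.split())
    return text

print(imagine("i ate a muffin"))             # one level of substitution
print(imagine("i ate a muffin", levels=2))   # the gist of the gist
```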


LOCKSUIT

Re: First-Person Narratives: Food for Thought?
« Reply #1 on: April 21, 2020, 04:43:28 am »
My schema is: feed it lots of data, hardwire goal nodes, let it make new knowledge and goals, and let it request data from desired sources (e.g. CNN news.com, lab tests, human Q&As). The context seen last, together with its agenda dialog, is the perspective/beliefs/answers.
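A minimal sketch of one way to read that schema, assuming hypothetical sources and a crude goal-relevance rule (this is not the actual implementation, just an illustration of the loop):

```python
# Toy sketch of: feed data, hardwire goal nodes, derive new knowledge/goals,
# and request more data from desired sources. Sources, facts, and the
# relevance rule are hypothetical placeholders.
GOAL_NODES = {"food", "energy"}            # hardwired goal nodes

# Stand-ins for news feeds, lab tests, and human Q&As.
SOURCES = {
    "news":   ["prices of food are rising", "a new planet was photographed"],
    "lab":    ["sugar provides quick energy", "the sample turned blue"],
    "humans": ["people eat three meals a day"],
}

knowledge = []                              # accumulated facts
goals = set(GOAL_NODES)                     # goals can grow over time

def relevance(fact):
    """Count how many current goals a fact mentions (crude relevance score)."""
    return sum(goal in fact for goal in goals)

def step(source_name):
    """Absorb one source, keep every fact, and spawn sub-goals from relevant ones."""
    for fact in SOURCES[source_name]:
        knowledge.append(fact)
        if relevance(fact) > 0:
            # Knowledge that touches a goal becomes a new topic to pursue.
            goals.add(fact.split()[0])

for name in ("news", "lab", "humans"):      # "request data from desired sources"
    step(name)

# "The context seen last and its agenda" as the current perspective/answers:
print("last context:", knowledge[-1])
print("agenda:", sorted(goals))
```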
Emergent          https://openai.com/blog/


HS

Re: First-Person Narratives: Food for Thought?
« Reply #2 on: April 21, 2020, 08:12:42 am »
Do you have multiple instances of the same word in your net? The same word means different things depending on context. You might need to have a chair represented by different nodes based on whether it’s a chair which you sit on, which you build, or which you use to reach into a high cupboard… Different aspects of “chair” become important, and it might create some havoc in your neural net if you trigger them all at once.
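A minimal sketch of that idea, assuming each sense of “chair” gets its own node with made-up context cues, so only the sense matching the surrounding words fires strongly:

```python
# Toy sketch: one node per contextual sense of the same word.
# Sense labels and cue words are invented for the example.
SENSE_NODES = {
    "chair/sit-on":  {"sit", "sat", "comfortable", "sofa"},
    "chair/build":   {"build", "hammer", "nails", "wood"},
    "chair/step-on": {"reach", "cupboard", "shelf", "climb"},
}

def activate(sentence):
    """Score each chair-sense node by how many of its context cues appear."""
    words = set(sentence.lower().split())
    return {sense: len(words & cues) for sense, cues in SENSE_NODES.items()}

print(activate("I sat on the chair because it looked comfortable"))
print(activate("Grab the chair so you can reach into the high cupboard"))
```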


LOCKSUIT

Re: First-Person Narratives: Food for Thought?
« Reply #3 on: April 21, 2020, 07:13:04 pm »
 https://ibb.co/vVbG2cp

There's no way to store the same word twice in a realistic brain net. Instead, my hierarchy can store meanings literally, like 'the word muffin is food' and 'a muffin is round-ish'. If you hear 'can you stick it in here' vs. 'can the stick grow?', it makes 'put' and 'tree' light up. If you hear 'can you um it in here' or 'can you put it in here', both light up despite one word being missing or only relationally similar, so it recognizes the correct node even if 'put', 'tree', and 'stick' all activate. The translation is context dependent, though; I think the context should get more say in deciding whether 'put' or 'tree' lights up more.
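A minimal sketch of that last point, assuming invented candidate meanings and cue words (not the actual hierarchy): the ambiguous word activates both candidate nodes, and the surrounding context decides whether 'put' or 'tree' lights up more.

```python
# Toy sketch: context-weighted disambiguation of 'stick' into 'put' vs. 'tree'.
# The candidate nodes and their cue words are made up for the example.
CANDIDATES = {
    "stick": {
        "put":  {"it", "in", "here"},
        "tree": {"grow", "branch", "wood"},
    }
}

def light_up(sentence, word):
    """Score each candidate meaning of `word` by overlap with the sentence context."""
    context = set(sentence.lower().replace("?", "").split()) - {word}
    return {node: len(context & cues)
            for node, cues in CANDIDATES[word].items()}

print(light_up("can you stick it in here", "stick"))   # 'put' should win
print(light_up("can the stick grow?", "stick"))        # 'tree' should win
```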
« Last Edit: April 21, 2020, 07:40:49 pm by LOCKSUIT »
Emergent          https://openai.com/blog/

 

