AI-native source code utilization and thoughts on OpenCog technology

ivan.moony

About the original discussion

Years ago I had a discussion with a friend about an AI knowledge base composed of a number of smaller programs, with the AI interfacing with the outer world on its own. The idea was to make this possible with Java technology. Java even has a reflection API that lets a program inspect itself, which seemed to make it suitable for a kind of self-awareness. At the time I thought it was, generally, a good idea, but that Java was too business-centric to be used that way, and I wasn't aware of any language that could serve as a backbone for the idea.
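The reflection idea can be illustrated in any language with runtime introspection. Here is a minimal sketch in Python (Java's `java.lang.reflect` offers the equivalent capability); the `Agent` class and its method names are hypothetical, just to show a program enumerating its own capabilities:

```python
import inspect

class Agent:
    """A toy agent that can enumerate its own capabilities at runtime."""
    def perceive(self): pass
    def decide(self): pass
    def act(self): pass

    def self_describe(self):
        # Reflection: the object inspects its own public methods.
        return [name for name, _ in inspect.getmembers(self, inspect.ismethod)
                if not name.startswith("_")]

print(Agent().self_describe())
# ['act', 'decide', 'perceive', 'self_describe']
```

A knowledge base built from code fragments could use exactly this kind of introspection to discover and reason about its own parts.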

A lot of time has passed, and some new languages have been invented. The more time passed, the more I liked the general idea from that conversation. I learned some programming theory and deepened my knowledge of logic. At this point, I think it would be great to have a programming language that copes equally well with business, scientific, and the even more abstract notions we normally use in our everyday decisions, and that such a language would perfectly suit the general idea of AI knowledge being composed of source code fragments.

About OpenCog

More recently, I examined more thoroughly the idea behind Atomspace, the knowledge base language behind the OpenCog project, an AGI attempt initiated by a group of PhDs, master's graduates, and AI enthusiasts, and I was pleasantly surprised! Although Atomspace is a somewhat bloated experiment (almost 70 projects are included in the OpenCog GitHub repository), I found a few very inspiring veins in Atomspace regarding the initial knowledge base structure and the querying of its stored data. It has some data mechanisms I haven't encountered anywhere else, which could be very useful for querying knowledge.

The entire Atomspace defines a whole plethora of built-in atoms, which kept me at a safe distance because I thought they were an unnecessary complication. But among all of them, a few (exactly four) of the most basic atoms, used to query data and to populate the knowledge base with new conclusions, got me very interested, I must say. These atoms stand for: (1) finding data that match a given pattern, and (2) applying parameters to data that match a given pattern. And they seem to work in both directions, as forward and backward chainers: finding a result that fits a query, and finding a query that fits a result.
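To make the two operations concrete, here is a toy Python analogue of those pattern-query atoms. This is not the real OpenCog API (Atomese queries are normally written in Scheme); the knowledge base contents, function names `get` and `bind`, and the tuple representation are all my own illustrative assumptions:

```python
# Toy analogue of Atomspace-style pattern queries (not the real OpenCog API).
# Atoms are nested tuples; variables are strings starting with "$".

KB = [
    ("inherits", "cat", "animal"),
    ("inherits", "dog", "animal"),
    ("inherits", "animal", "living-thing"),
]

def match(pattern, atom, bindings):
    """Unify one pattern against one atom; extend bindings or return None."""
    if isinstance(pattern, str) and pattern.startswith("$"):
        if pattern in bindings:
            return bindings if bindings[pattern] == atom else None
        return {**bindings, pattern: atom}
    if isinstance(pattern, tuple) and isinstance(atom, tuple) \
            and len(pattern) == len(atom):
        for p, a in zip(pattern, atom):
            bindings = match(p, a, bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if pattern == atom else None

def get(pattern):
    """Operation (1): find all variable bindings that satisfy the pattern."""
    return [b for atom in KB
            if (b := match(pattern, atom, {})) is not None]

def bind(pattern, rewrite):
    """Operation (2): instantiate a rewrite template for every match."""
    def subst(term, b):
        if isinstance(term, tuple):
            return tuple(subst(x, b) for x in term)
        return b.get(term, term)
    return [subst(rewrite, b) for b in get(pattern)]

print(get(("inherits", "$x", "animal")))
# [{'$x': 'cat'}, {'$x': 'dog'}]
print(bind(("inherits", "$x", "animal"), ("is-a", "$x", "animal")))
# [('is-a', 'cat', 'animal'), ('is-a', 'dog', 'animal')]
```

The same matching machinery serves both directions: run it over stored facts to answer a query (forward), or over query templates to find which one produces a given result (backward).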

Although I managed to build only a small percentage of the entire OpenCog package (I have to buy more RAM to finish this task; I have only 4 GB for now), and I haven't yet tested the functionality of interest myself, what I read on the OpenCog Wiki pages looks very promising to me.

Drawing a parallel

To return to the beginning of the story, a specific subset of Atomspace seems perfect for representing, in one place, both a knowledge base and the algorithms that run an AI's memorizing, decisions, and actions.

Is anyone willing to share thoughts in this direction?
« Last Edit: October 12, 2021, 11:09:46 pm by ivan.moony »
