A key notion: scalability

Zero

« on: August 01, 2018, 10:31:50 am »
I'm all crazy about T3js.

T3js is a neat little core library, based on the principles of Scalable JavaScript Application Architecture, specifically:
   - Enforcing loose coupling between components
   - Making dependencies explicit
   - Providing extension points to allow for unforeseen requirements
   - Abstracting away common pain points
   - Encouraging progressive enhancement

Some believe in neural net approaches, others prefer symbolic ones, and everyone enjoys chatbots... Anyway, an AGI would be a huge program. Take ROS for example: it's not even close to AGI, and it's already big! We can safely bet that one single paradigm won't be enough to host a whole intelligence. Which leads us to this conclusion: we need scalability.

T3js proposes a simple model, where an application is made of isolated modules. Modules can't access anything outside their "sandbox". Inside that sandbox, we grant them access to services, which are application-wide tools. A module is typically in charge of only one thing: one "screen area", one purpose. In particular, modules can't communicate directly with other modules; they have to use messages.
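To give an idea, here's my own little sketch based on the T3 docs, not code taken from the library; names like 'logger' and 'weather-panel' are just examples:

    // A service is an application-wide tool any module can request.
    Box.Application.addService('logger', function (application) {
        return {
            log: function (msg) { console.log('[app] ' + msg); }
        };
    });

    // A module only sees what its sandbox ("context") gives it.
    Box.Application.addModule('weather-panel', function (context) {
        var logger;
        return {
            init: function () {
                logger = context.getService('logger');
                logger.log('weather-panel is up');
            },
            destroy: function () {
                logger = null;
            }
        };
    });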

These messages are not function calls, and they're not about what should be done in the future. They're about what just happened: they describe a past event, not a desired operation. This is important, because it means you can plug modules in and unplug them without breaking the system, since you never take for granted that this or that will be done. You just shout out: "hey, this thing happened". If currently active modules are interested in reacting, so be it. If not, never mind.
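For example (hypothetical channel names, just to show the naming style):

    // Good: describes a past event, nobody is ordered to do anything.
    context.broadcast('temperaturechanged', { celsius: 21 });

    // Bad: this is really a disguised function call. It assumes some
    // specific module exists and will obey, so unplugging that module
    // silently breaks the sender's expectations.
    context.broadcast('updatethermostatdisplay', { celsius: 21 });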

The message system in T3js seems simple compared to other libraries like Postal.js. In T3js, you broadcast on one channel, which is a string. Modules which declared interest in that exact string get the message. There's no wildcard, no topic pattern, no distinction between "channel" and "topic". Some might think that the T3js messaging model is too simple for AGI-related tasks, and that we need something stronger, like AMQP. I think otherwise: the T3js messaging model is already not strict enough.
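The listening side looks roughly like this (again my sketch, exact API details aside):

    Box.Application.addModule('display', function (context) {
        return {
            // Exact string match only: no wildcards, no topic patterns.
            messages: ['temperaturechanged'],
            onmessage: function (name, data) {
                if (name === 'temperaturechanged') {
                    // react to the event, e.g. refresh what's on screen
                }
            }
        };
    });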

Say we have a module Foo listening on channel "B0", and a module Bar broadcasting on channel "B0" on a regular basis. Our system works well, and we want to add a new module, without degrading the system's capabilities of course. We plug in our new module Baz, which also broadcasts on channel "B0". Doing so, we modify the input of module Foo, which was working well: the new behavior of Foo might be better or worse.
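In code, the collision looks like this (sketch):

    // Foo was tuned against the traffic Bar puts on "B0"...
    Box.Application.addModule('Foo', function (context) {
        return {
            messages: ['B0'],
            onmessage: function (name, data) { /* react to B0 */ }
        };
    });

    Box.Application.addModule('Bar', function (context) {
        return {
            init: function () { context.broadcast('B0', { from: 'Bar' }); }
        };
    });

    // ...so plugging in Baz silently changes Foo's input.
    Box.Application.addModule('Baz', function (context) {
        return {
            init: function () { context.broadcast('B0', { from: 'Baz' }); }
        };
    });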

By adding a new module, we modified things in the system that were not meant to be modified. After all, by adding Baz, we wanted to add new functionality, not modify existing behavior. In a very big system, this kind of influence can be hard to spot, and I guess, in the long term, it makes it impossible to reach very high complexity gradually, which is what we need here in AGI.

There's a solution: being even more strict about who broadcasts on which channel. When a new module is plugged in, it should be given a range of channels it can broadcast on, and these channels should not already be used by other modules. When two modules do end up broadcasting on the same channel, call it a "channel collision". Channel collisions should be avoided if we want a sane system.

We already know how to handle collisions: usually, we use namespaces. In our previous example, Bar would broadcast on "Bar.B0" and Baz would broadcast on "Baz.B0". Simple enough.
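T3js doesn't know anything about namespaces, the prefix is just part of the channel string, but the convention is enough (sketch):

    Box.Application.addModule('Bar', function (context) {
        return {
            init: function () { context.broadcast('Bar.B0', { from: 'Bar' }); }
        };
    });

    Box.Application.addModule('Baz', function (context) {
        return {
            init: function () { context.broadcast('Baz.B0', { from: 'Baz' }); }
        };
    });

    // Foo now opts in to each source explicitly, and can tell them apart.
    Box.Application.addModule('Foo', function (context) {
        return {
            messages: ['Bar.B0', 'Baz.B0'],
            onmessage: function (name, data) { /* name says who emitted */ }
        };
    });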


It's not the same as flow-based programming though. We don't have to restrict ourselves to a simple input/output model. In other words, modules can have multiple output channels, as long as each channel is used by only one emitter. We can even have channel patterns. The only important rule is: don't let several modules broadcast on the same channel.
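Nothing like this ships with T3js, but one could even enforce the rule with a little guard service, something like this (purely hypothetical sketch, the 'channel-registry' service is my own invention):

    Box.Application.addService('channel-registry', function (application) {
        var owners = {};   // channel -> name of the module that owns it
        return {
            claim: function (moduleName, channel) {
                if (owners[channel] && owners[channel] !== moduleName) {
                    throw new Error('Channel collision: "' + channel +
                                    '" already belongs to ' + owners[channel]);
                }
                owners[channel] = moduleName;
            }
        };
    });

    // A module may own several output channels, as long as it claims them first.
    Box.Application.addModule('Bar', function (context) {
        return {
            init: function () {
                var registry = context.getService('channel-registry');
                registry.claim('Bar', 'Bar.B0');
                registry.claim('Bar', 'Bar.status');
                context.broadcast('Bar.B0', { from: 'Bar' });
                context.broadcast('Bar.status', { up: true });
            }
        };
    });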

 

