Recent Posts

Pages: 1 2 [3] 4 5 ... 10
21
General AI Discussion / Re: Artificial God?
« Last post by frankinstien on November 24, 2021, 07:09:20 pm »
Quote
Oh no, then it would be what we, humans want. But do we really want the right things for everyone? And more important, do we want to use our creation and throw it away when it is not useful to us anymore? Do we really want an artificial slave?

Who is going to decide what morality to implement? I mean, there are those who believe that if you think a bad thing there's a possibility you'll act on that idea, so to prevent rape, murder, and physical abuse we as a society should screen for thoughts and punish those who think them!

All machines are slaves; they are created to serve humanity.
22
General Project Discussion / Re: Pattern based NLP
« Last post by MikeB on November 24, 2021, 03:06:58 pm »
Update on audio speech recognition.

Traditional speech recognition uses transforms (FFT/DCT/DTT) to decode audio into voice phonemes. These capture three voice formants (frequency ranges specific to a phoneme) from one 'signature'. However, they use nested loops and are slow to process. The DTT is the fastest, but I want to try it another way...

Most spoken phonemes combine a range of different frequency areas to make the sound - bass/warmness, middle range, high range. E.g. "oh" is mostly bass, "ee" is bass-middle, "ss" is high.

The way I want to try is to separate the common frequency ranges first, then measure the power & complexity of each to tell whether one range is loud/complex versus the others.

Separating the frequency ranges (band-passing) can be done in real time with just a few instructions per sample, using pre-calculated IIR filters (http://www.schwietering.com/jayduino/filtuino/index.php). FIR filters are better quality but slow.

There are 20 ms between recorded audio frames in which to process the data, so I'm aiming to get both phoneme and NLP processing done in 1-10 ms, using the same thread as the one capturing the data.
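
For what it's worth, each band-pass stage boils down to a biquad (second-order IIR), which really is only a handful of multiply-adds per sample. Below is a minimal sketch of that idea plus a per-band power measure over one frame; the Biquad struct and framePower helper are hypothetical names of mine, and the coefficients are left as placeholders (the real values would come from Filtuino for the chosen band edges and sample rate).

Code:
#include <cstddef>

// One second-order IIR band-pass section (Direct Form I).
// Coefficients are placeholders; fill them in from a filter design tool.
struct Biquad {
    float b0, b1, b2, a1, a2;               // a0 normalised to 1
    float x1 = 0, x2 = 0, y1 = 0, y2 = 0;   // previous inputs/outputs

    float process(float x) {                // one sample in, one sample out
        float y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
        x2 = x1; x1 = x;
        y2 = y1; y1 = y;
        return y;
    }
};

// Mean of squares over one frame: a cheap per-band power figure.
float framePower(Biquad& band, const float* samples, std::size_t n) {
    float power = 0.0f;
    for (std::size_t i = 0; i < n; ++i) {
        float y = band.process(samples[i]);
        power += y * y;
    }
    return n ? power / static_cast<float>(n) : 0.0f;
}

Running three of these (bass / middle / high) over the same 20 ms frame and comparing the three power figures gives the "loud/complex versus the others" test described above; a complexity measure such as a zero-crossing count could be tallied in the same loop.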

This is some captured data for the word "wikipedia". The asterisks (*) represent good power & complexity levels versus background noise.

Currently there's little noise filtering and the band-pass filters need tightening up, but eventually, if the results are strongly reproducible, they can be added to tables as base values...

23
General AI Discussion / Re: Artificial God?
« Last post by MagnusWootton on November 24, 2021, 01:56:22 pm »
The training data is the Bible; the person who wrote it is the God of the AI...

What if you just wrote the AI to put money on the table, continuously? That's why a super AI is possibly stronger than a stock market predictor, if you want to abuse the situation.
24
General AI Discussion / Re: Artificial God?
« Last post by MikeB on November 24, 2021, 01:28:24 pm »
The training data is the Bible; the person who wrote it is the God of the AI...
25
General Project Discussion / Re: Concept Modeling
« Last post by frankinstien on November 23, 2021, 11:30:26 pm »
Quote
After looking at Rational consequence relation I can see that the approach I took needs a bit more refinement since it is too monotonic.

Actually, the monotonicity issue isn't really an issue, since it can be coped with using validation. In fact, this approach has the ability to note similarity, since it can grade relatability.
26
General Project Discussion / Re: Concept Modeling
« Last post by frankinstien on November 23, 2021, 10:45:46 pm »
All I can say is PROLOG probably is complete, not that I'm saying you might not be able to do better than it, or be more concise or something, but I didn't learn much of it. With ordinary programming you are complete just with INVERT and a method of communing the variables, and that's it.

I need the process to work in real time without any need to compile code, yet I need performance, which is why I need a JIT (a JIT has only a 1% to 2% cost) and why I didn't go with Prolog. There was some work done to build a Prolog JIT, but I don't know what happened to that endeavor. After looking at Rational consequence relation I can see that the approach I took needs a bit more refinement since it is too monotonic.
27
General Project Discussion / Re: Concept Modeling
« Last post by MagnusWootton on November 23, 2021, 09:36:59 pm »
All I can say is PROLOG probably is complete, not that I'm saying you might not be able to do better than it, or be more concise or something, but I didn't learn much of it. With ordinary programming you are complete just with INVERT and a method of communing the variables, and that's it.
28
General Project Discussion / Re: Concept Modeling
« Last post by ivan.moony on November 23, 2021, 09:08:55 pm »
Nice, but pretty abstract. Do you have any use examples?
29
General Project Discussion / Concept Modeling
« Last post by frankinstien on November 23, 2021, 08:50:03 pm »
I'm working with concepts as data expressed as a structure with properties and/or methods. I came up with some basic math shown below:

[diagram of the expressions (1-11) not included: image attachment]

Where concepts are the elements A, B, C, F, G, Q, Z, and T.

As shown in the diagram there are five basic operations along with the use of some set functions of union and intersection.

  • Relatable to: can be expressed as grades or relevance.
  • Indirectly relatable: indicates processing to relate to an element, which can also be viewed as a goal.
  • Apply: is a process of utilizing verbs, adjectives, and adverbs that have expressions of properties and methods as well.
  • Processes into: is a literal function that can produce a concept, e.g. computing the area of a table.
  • not: is the antithesis of features or method outputs.

Any concept can also be a set of concepts, as hinted at by expressions 6 to 11. Expression 4 also hints at the ability to chain concepts to reach indirectly related concepts.
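
As one concrete (if simplistic) reading of the "relatable to" and "indirectly relatable" operations, here is a minimal sketch assuming a concept is just a named set of property/method labels and using a Jaccard-style overlap purely as an illustration; the names Concept, relatability and indirect are hypothetical, not taken from the model above.

Code:
#include <algorithm>
#include <iterator>
#include <set>
#include <string>

// A concept as a named bag of property/method labels.
struct Concept {
    std::string name;
    std::set<std::string> properties;
};

// Graded "relatable to": 0.0 = no shared properties, 1.0 = identical sets.
double relatability(const Concept& a, const Concept& b) {
    std::set<std::string> common, all;
    std::set_intersection(a.properties.begin(), a.properties.end(),
                          b.properties.begin(), b.properties.end(),
                          std::inserter(common, common.begin()));
    std::set_union(a.properties.begin(), a.properties.end(),
                   b.properties.begin(), b.properties.end(),
                   std::inserter(all, all.begin()));
    return all.empty() ? 0.0 : double(common.size()) / double(all.size());
}

// "Indirectly relatable": A reaches C through an intermediate concept B.
double indirect(const Concept& a, const Concept& b, const Concept& c) {
    return relatability(a, b) * relatability(b, c);  // one possible chaining rule
}

The product in indirect() is only one possible chaining rule; a min() or a weighted score would fit the same shape, and "apply" / "not" would become further functions over the property sets.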

Pondering if this is complete enough, any thoughts?
30
General AI Discussion / Re: Artificial God?
« Last post by MagnusWootton on November 23, 2021, 08:04:25 pm »
Yes, but we need an unflawed method. It's kinda important to do things properly, without a doubt.