Possibly a breakthrough programming language?

*

LOCKSUIT

« Reply #75 on: March 19, 2020, 10:27:52 am »
New results:

20,000 bytes in, 6,737 bytes compressed.

200,000 bytes in, 59,457 bytes compressed.

*

krayvonk

« Reply #76 on: March 19, 2020, 01:59:27 pm »
Are those actual compression results or just speculations?

Here are some speculations from me to you ->
If it were somehow true that you developed an environmental model off a huge amount of text information by copying it, then it actually hasn't generated anything new from it; by my theory it is exactly what was put into it (just like the world is exactly what was put into us, if... we were just copying it).

You have to read this paper -> https://worldmodels.github.io/ (that's actually the graphics generating out of its system, running in the browser; it's the best thing on the internet for me). This goes a step further than just copying, I think, and actually converts the information into something else.

If you want to do that, you need another policy, other than just "copy the data", which is a policy of "I must be 100% truthful to it." That's what this compression system is doing. Do you remember Marcus Hutter talking about Occam's razor? That is another policy, "I must be the MAXIMUM compression", and it's another attempt at forming the data so it becomes a procedural generation of the data, instead of a direct frame spit-out (which logically we think must be larger than the procedural generation).
If you're doing the compression without that in mind, you're doing back propagation instead of evolution search or gradient descent; it seems the same, but I don't think it is as creative a learning system!


I'm not doing that anymore. I think you need another kind of policy, and I think it needs to be a goal-based system, where the robot wants to do something inside the world; the idea goes: the model doesn't just copy, it transmutes the data into a form that is more useful to the robot AS it is COPYING, to stay 100% truthful to the laws and behaviour around it.

There's a big problem already stumping me, and it's the fact that this goal code has to be hand-written, interpretable stuff, involving only basic word- and symbol-type knowledge, because that's all we have detectors for! There's no such thing as a lie detector, there's no such thing as a "hurt someone's feelings" detector, without it being a quite shallow, bloated, hit-and-miss version of what it actually is to have a real detector for it, which is impossible to write.

But think: if it grew some kind of facility around this primitive goal which transmuted the data into instrumental goals and the variable and symbol conversions you kind of fantasize about, and it really worked like magic and could actually communicate and act, and you asked it to "please go cure cancer for us?", then how are you supposed to drive the robot to do it with only basic detectors? Even if the whole shebang worked, it would be hit and miss whether it even felt like doing it. IT'S ONLY IF THE ROBOT FELT LIKE IT!!! So big problems there.

As in, you used some certain clever policies, but what the robot finishes up at, YOU'VE GOT NO IDEA ABOUT!!!

*

LOCKSUIT

« Reply #77 on: March 19, 2020, 03:05:54 pm »
Those are actual test results, and decompression worked - an exact file match. I used 200,000 bytes from the enwik8 dataset used in the Hutter Prize. They use enwik9 now as of this year.

My algorithm is a text generator: a letter predictor using probabilities, learned online as it talks to itself, storing what it says to itself. It mixes 13 context models and then does arithmetic coding. It uses the past text to predict/generate the future until it reaches the end of the file. It builds a tree from scratch, and during mixing it uses a global model weight, weight-threshold squashing, and an adaptive threshold based on how many channels are open to it. If I turn off its losslessness, it generates mostly silly ramblings that were not in the dataset but somewhat make sense - but I will get it as good as the great GPT-2 yet. This works because my algorithm handles future uncertainty - missing data it hasn't seen yet, i.e. no single very likely answer - by mixing models to get more virtual data. My tree stores frequencies for everything from phrases down to single-letter models. And I haven't even started: there's energy, translation, etc. still to do that others have already paved the way on, and I'm building on that soon. I'm on the way to building REAL AGI here; I'm telling you this is the way to get started. The models are in the tree; it stores text and frequency and can store semantics in the future. The tree is basically the neural network, and that's where pruning and storing only good-enough nodes will result in a robust distributed net that is small, fast, and can recognize long unseen strings.
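
Here's a rough sketch of the general idea in Python - this is an illustration only, not my actual 120-line program; the constant 50 and the weighting are just made up for the example, and instead of a real arithmetic coder it sums -log2(p) to estimate the size such a coder could reach.

Code:
import math
from collections import defaultdict

ORDER = 13  # longest context window, like the 13 context models described above

def estimate_compressed_bytes(text):
    # context string -> {next letter: count}, learned online as we go
    counts = defaultdict(lambda: defaultdict(int))
    alphabet = sorted(set(text))
    total_bits = 0.0
    for i, ch in enumerate(text):
        # Mix predictions from contexts of length 1..ORDER.
        mixed = defaultdict(float)
        for k in range(1, ORDER + 1):
            if i < k:
                break
            nxt = counts[text[i - k:i]]
            total = sum(nxt.values())
            if total == 0:
                continue
            # Longer contexts and better-fed models get more say (illustrative weighting).
            weight = k * min(total, 50) / 50.0
            for letter, c in nxt.items():
                mixed[letter] += weight * c / total
        # Smooth so the true letter never gets zero probability.
        norm = sum(mixed.values()) + 1e-6 * len(alphabet)
        p = (mixed[ch] + 1e-6) / norm
        total_bits += -math.log2(p)  # ideal cost an arithmetic coder would pay
        # Online learning: update every context with the letter just seen.
        for k in range(1, min(i, ORDER) + 1):
            counts[text[i - k:i]][ch] += 1
    return total_bits / 8

print(estimate_compressed_bytes("the cat sat on the mat. " * 400))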

I coded it all from scratch, with no reuse of others' code.

As for the copy-while-mutating thing in evolution: AGI must answer questions never seen before, i.e. it uses the surrounding context atoms to predict/make the future ground/sentence (babies in the womb, text generators...). With lossless compression / arithmetic coding disabled, it can generate unseen futures that are likely given the past context. Sounds like physics. And that mutates the future correctly while cloning the sentence topic, staying on topic and generating content.

As for motivation: we are working on immortality, and AGI is meant for immortality, indirectly! Same for hard drives, etc. Humans seek food and sex for survival, and that leads us to homes, hard drives, AI, cars... Now, some organisms are lazy, but AIs can cheat that easily. So all good.

*

krayvonk

« Reply #78 on: March 19, 2020, 03:45:11 pm »
Shit! Coding from scratch now! CONGRATULATIONS ON GETTING YOUR SHIT TOGETHER.
You're sounding a lot better at it than before. How long have you been coding for - not long? I can't believe you're kicking butt so much.

You know, there are a couple of terms I don't know ->
      * global model weight
      * weight threshold squashing
      * (edit) virtual data? - is that generated off the original data? Wouldn't that be illogical?

I gotta watch out when your lingo takes over mine and you steal my bloody job.


But I have some critiques for you ->

- What is a context model, and why have more than one of them? Doesn't it just become one in the end? Or do you choose exclusively from among them as you go?

- Using frequency sounds interesting, but that's running on an illogical policy; it's just "the most common thing is the answer", and that's a correlation!! It doesn't matter how many people believe bullshit, it doesn't turn it into chocolate cake.

- If your unseen futures are likely based upon the past context, that's all any intelligent creature can do, and it's all I'm planning on doing AS WELL! But that's NOT what happens if it just COPIES the data! If you want an unseen future that came from its past, that's a step up from just regurgitating it exactly again; I think you have to transfer the semantics from text to another representation, based upon ANOTHER POLICY, and you have to mutate it AT LEAST. Then it might happen - but my theory could be wrong too... of course. I'm only just coming up with an idea a day.

I coded 2000 lines this week though bud, how about you?

EDIT -> is your system a chat bot?

*

LOCKSUIT

« Reply #79 on: March 19, 2020, 04:24:09 pm »
I only code from scratch. I don't want others' shit in my code!

I started coding when I started using Blockly - you can see at the START of this very thread when I began my rabbit journey! Before then I never wrote anything, but I did try to understand programming languages before, and I used EV3 blocks about 3 years ago, so it was easier for me.

      * global model weight
      * weight threshold squashing
      * (edit) virtual data? - is that generated off the original data? Wouldn't that be illogical?
My alg takes the last 13 windows of previous text and searches for them in the tree - "...the cat", "he cat", "e cat", and so on - and gets next-letter candidates for each. I mix the 13 sets of predictions together. A model gets weight based on its total count compared to a wanted roof, to make sure it has seen enough stats; the roof adjusts based on how many unique symbols are being predicted; and the total is logistically adjusted so small and large totals are treated a bit differently. As for virtual data, I just meant the past context activating similar nodes in the tree, though mine doesn't do much of that yet - just the context mixing of models, i.e. the prediction sets from the 13 searches. So yes, it gets some extra data to help its probability rise correctly, and it adds what it predicts to the tree and gets more real data (one day it will get generated data not in the dataset), and during prediction it could, but doesn't yet, do other tricks to get extra data.
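
To give a rough idea of what I mean by the roof, here's a toy guess at the shape of such a weight - this is illustration only, not my exact formula; the names and numbers are made up.

Code:
def model_weight(total_count, unique_symbols, base_roof=40):
    # A model only gets its full vote once it has seen enough statistics;
    # the bar ("roof") rises when its counts are spread over many symbols.
    roof = base_roof * max(1, unique_symbols)
    return min(total_count, roof) / roof  # saturates at 1.0

# a context seen 12 times over 3 symbols vs. one seen 500 times over 3 symbols
print(model_weight(12, 3), model_weight(500, 3))   # 0.1 and 1.0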

Mine isn't a chatbot... I explained it a few times now, and just above again in my last post... but text generators ARE chatbots, however!

I can write 10 lines in 10 minutes, depending on how patterny it is too. This alg took me at least 3 weeks to code though, and you can see the time by looking at this very thread! The code is about 120 lines; I could refactor a lot to make it smaller!

Yeah, my 13 context models could be 1 model in the end, yup, but for now I take variable search lengths from the tree and get the children predictions that follow those windows.

As for frequency being based on lots of dumb peeps saying something: well, the enwik8 dataset for example is really 'correct' technically, and many people are, and you can see the truth inside even if most peeps say something! Even I must do this - this is how evolution works! It doesn't have the answers of the future, yet it gets them from the rubbish past it does know. Yup, the old toaster becomes a fancy toaster all on its own!

Lastly, I will say it again - in my alg you just shut off the arithmetic coding, so the prediction isn't swerved by sometimes taking what it thinks is not the best prediction, and instead you make it pick a top-n prediction, and it will generate unseen data not in the dataset! And it will be based on high probabilities, sounding like the dataset, but not in it. The predictions I jump to in arithmetic coding to store it losslessly are not always the highest-probability candidates.
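
Here's a toy sketch of the difference between the two modes - illustration only, not my real code, and the arithmetic coder itself isn't shown.

Code:
import random

def next_letter(mixed_probs, actual=None, top_n=3):
    # mixed_probs: {letter: probability} from the mixed context models.
    if actual is not None:
        # Lossless/compression mode: follow the file; the arithmetic coder
        # would simply pay -log2(p) bits for this letter, likely or not.
        return actual
    # Generation mode: pick among the top-n candidates, weighted by probability,
    # so the output sounds like the dataset without being copied from it.
    ranked = sorted(mixed_probs, key=mixed_probs.get, reverse=True)[:top_n]
    weights = [mixed_probs[c] for c in ranked]
    return random.choices(ranked, weights=weights)[0]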

*

krayvonk

« Reply #80 on: March 19, 2020, 04:44:00 pm »
Does it go fast? I imagine it might, being only a 120-line power pocket.
Well done Lock. Good luck with the A.I. dream, I gotta hit the hay.

*

LOCKSUIT

« Reply #81 on: March 19, 2020, 04:52:01 pm »
It's not that fast, no - Python is slower as well. 20,000 bytes takes 12 seconds, and 200,000 takes 10x longer.
But this is my first real program, and I'm 24.

*

krayvonk

« Reply #82 on: March 20, 2020, 08:31:13 am »
Be proud of yourself, 'cause you're going to be doing this a long time, and you should print it out, frame it, and put it on the wall; then when people see it they'll know you can code.
And coding is NOT EASIER, NOR RESPECTED ANY LESS, than doing electronics. They are both simple; if you get good at code, your home-made computer circuit and nanosecond custom logic are on the way.   :D

*

Korrelan

« Reply #83 on: March 20, 2020, 11:05:02 am »
It pleases me that you are finally coding, Lock.

 :)

*

LOCKSUIT

« Reply #84 on: March 20, 2020, 11:51:30 am »
I didn't even (kinda) need Blockly - you all could have just filmed a video tutorial of Python showing me the flow. It's not that hard to code...

*

Art

« Reply #85 on: March 20, 2020, 03:00:54 pm »
I'm actually quite proud of you Lock!!  You have come a long way in the past 5 years! Just imagine your future!!  O0

*

LOCKSUIT

« Reply #86 on: March 21, 2020, 09:44:23 pm »
For the first 1,000,000 bytes of enwik8:
Green gets it to 253,800 bytes
Mine gets it to 251,235 bytes

For the first 10,000,000 bytes of enwik8:
Green gets it to 2,331,508 bytes
Mine gets it to 2,319,337 bytes

Note my parameters can still be tweaked to get that a bit lower.

7zip can't even get it that low lol. ;) Those are ordinary programs.
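
For scale, those figures work out to roughly 2 bits per character - a quick check:

Code:
for raw_bytes, packed_bytes in [(1_000_000, 251_235), (10_000_000, 2_319_337)]:
    print(round(packed_bytes / raw_bytes * 8, 2), "bits per character")
# prints 2.01 and 1.86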

*

LOCKSUIT

« Reply #87 on: March 23, 2020, 01:40:14 pm »
New result: 1,000,000 bytes now compresses to 251,180 bytes.

*

LOCKSUIT

« Reply #88 on: March 24, 2020, 01:23:46 pm »
So my refined conclusion is that perplexity is worse than lossless compression, because lossless compression forces you to learn online, etc., which was amazing for me to code. A perplexity test dataset is fine in that it's different from the training data, but it can still be quite similar in certain areas of the dataset. And LC understands the very data it compresses - perplexity works on datasets of different topics, but it loses relatedness if they are too different, which is bad even if only a bit different. And if we want to train on 98% of the internet's data, we need test data that isn't duplicated in the training data.
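
They are two views of the same number though: an ideal arithmetic coder spends -log2 p(x) bits per symbol, so the compressed size is the model's total cross-entropy on that exact data, and perplexity is just 2 raised to the per-symbol average. A tiny sketch (illustration only):

Code:
import math

def bits_and_perplexity(probs):
    # probs: probability the model assigned to each actual symbol, in order.
    bits = sum(-math.log2(p) for p in probs)       # ideal compressed size in bits
    perplexity = 2 ** (bits / len(probs))          # average effective branching factor
    return bits, perplexity

print(bits_and_perplexity([0.5, 0.25, 0.9, 0.1]))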

*

LOCKSUIT

« Reply #89 on: March 31, 2020, 05:17:29 pm »
Same code, new record (old record is above):
2,308,479 bytes for the first 10,000,000 bytes of enwik8.

 

