Possibly a breakthrough programming language?

  • 96 Replies
  • 20539 Views

LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #60 on: February 24, 2020, 06:53:59 am »
It's working! Almost. The decoder works, but then it screws up partway through, lol.

LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #61 on: February 24, 2020, 07:48:47 am »
It worked, it decompressed.

Now time to scale it up to the real input.

LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #62 on: February 25, 2020, 12:17:57 am »
Look at that! Less than 100 lines of code, with just this text and a window of 4 letters!

Input was 6,224 bits:

i was at the school and was very happy that my new dog would be there and yes i love dogs and cats so much so i knew he was going to be loved by others as well and so i kept walking around the school and my cat also was there at the school and the dog seen the cat chase the others and was so happy that we could all be there yes i was very happy to see my cat and dogs all there with my friends and so we wanted to do something special and the best thing we could do was go to the store near the school and the cats were there ready to buy them and the seller was happy to sell them also so we baught some more cats indeed yes we did and brought them to the school with our other cats and dogs where we could party more and see each other have more fun with the cats and the sc


Output was 2,293 bits:

1000100000110110011101011101100000100111000101000100110001000110110000111100011011001001000101000001011111000010010101010110000111011010001100100111000101010110001111100000110110111100111010101000110110110001010110111111100110111100100110110101111000110110111111110111101000000110000010011001110011110000100011011001001001101111011011111011011000011110011101001101100100110011000010001110110001110100101100111111101100101010000100000100101110000001011100000110110111010101110101001111101001011100111001000010111111101100001101110010001000110111111100011110011011001010010101010001111111001100111110101010010000000110101111010000100011000101000100000011110100110110110000011111001011000110010010101001101010011000101111010000010100010110101000110100011010101100000010001010101111100010111011101110111001011110010100100001011101100111010111100010011000100011010101000100000101111001011100010101100111000000101000010100100000001011011011111001100101110110011100111011110010100111111111000100011101111101111100011010010111111101110001110010000000000110000110001101101111110100110001001100100001111000111011011011000010111111001111001100110000111000110111111011010011101010100111101110110011101110000001100100101011000111101011111011001000111001110001101000010101011001010001000001000011110101001100010011101100001011001001101011000001100000011010011100000111110100010001011110010010001010111100001111101101000011011101111000001010001101110011111101001010010101110111100101010001101101011001110011110101010100100001101001000111111001000011001001000110011000111011000011110110001110010101000110000101110111100000011100100000010101101000101100111000011011101001110000100111110111101000111111001010010101001001110011111101100110010011101010001011110011011111010011111110111000000011000101000010011110011100001111110100000011000100000001000001000101100111010110000010011001000010011011110000101001110001111110111101001101101101110101000110000110100000111011010000110110110010000110010011001110010100011100100100100011111110000110011001000111011001101001101010110101110100011101000100111100000011100011010111001100101100010110001001101000011111100111110001000001001000001000110110101000100101100111101101111101101001110001100101001010110001011010111000101100111111111101100000000110110000111101000001100

The decompressor regenerated it back perfectly.
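
Here's roughly what that looks like as a minimal Python sketch (an illustration of the idea, not my Blockly code): an adaptive order-4 character model, updated online, where summing -log2(p) per character gives the size an ideal arithmetic coder driven by the same model would approach. The 256-symbol alphabet and the +1 smoothing are just assumptions for the sketch.

Code:
from collections import defaultdict
from math import log2

def ideal_compressed_bits(text, order=4):
    # Adaptive order-4 character model: predict each character from the
    # previous `order` characters, updating counts as we go (online learning).
    counts = defaultdict(lambda: defaultdict(int))
    bits = 0.0
    for i, ch in enumerate(text):
        ctx = text[max(0, i - order):i]
        seen = counts[ctx]
        p = (seen[ch] + 1) / (sum(seen.values()) + 256)  # +1 smoothing, 256-symbol alphabet
        bits += -log2(p)   # ideal arithmetic-coding cost of this character
        seen[ch] += 1      # update after coding, so a decoder can mirror it exactly
    return bits

sample = "i was at the school and was very happy that my new dog would be there " * 10
print(len(sample) * 8, "raw bits ->", round(ideal_compressed_bits(sample)), "ideal coded bits")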

WriterOfMinds

Re: Possibly a breakthrough programming language?
« Reply #63 on: February 25, 2020, 01:20:34 am »
That may not be a numerically impressive compression percentage yet, but I'm glad you coded your own algorithm.  Good job.


LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #64 on: February 25, 2020, 08:49:48 am »
So far: 266,731 bytes of input compressed to approximately 77,000 bytes, about 3.4x compression. I haven't finished the mixing algorithm yet =)

I will show the code shortly.

I made the code fully in Blockly, from scratch, tree and all, in just under 100 lines of code.

It does online learning, arithmetic coding, and context mixing.
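
Roughly what I mean by context mixing, as a Python sketch (an illustration, not the Blockly code): a few context models of different orders each give a probability for the next character, and the predictions get blended with weights that drift toward whichever model has been predicting best. The weight-update rule here is a deliberately crude stand-in for a real mixer.

Code:
from collections import defaultdict
from math import log2

class OrderModel:
    # Adaptive order-N character model: counts of next char given the last N chars.
    def __init__(self, order):
        self.order = order
        self.counts = defaultdict(lambda: defaultdict(int))

    def prob(self, history, ch):
        seen = self.counts[history[len(history) - self.order:]]
        return (seen[ch] + 1) / (sum(seen.values()) + 256)

    def update(self, history, ch):
        self.counts[history[len(history) - self.order:]][ch] += 1

def mixed_bits(text, orders=(0, 1, 2, 4)):
    # Blend predictions from several orders; weights drift toward whichever
    # model assigned more probability to the characters that actually occurred.
    models = [OrderModel(o) for o in orders]
    weights = [1.0 / len(models)] * len(models)
    bits = 0.0
    for i, ch in enumerate(text):
        history = text[:i]
        probs = [m.prob(history, ch) for m in models]
        p_mix = sum(w * p for w, p in zip(weights, probs))
        bits += -log2(p_mix)
        weights = [w * (0.9 + 0.2 * p) for w, p in zip(weights, probs)]  # crude mixer update
        total = sum(weights)
        weights = [w / total for w in weights]
        for m in models:
            m.update(history, ch)
    return bits

text = "the cat sat on the mat and the cat ate the rat " * 20
print(round(mixed_bits(text)), "mixed-model bits vs", len(text) * 8, "raw bits")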

Freddy

Re: Possibly a breakthrough programming language?
« Reply #65 on: February 25, 2020, 09:35:00 am »
Well done Locksuit - so you are enjoying the programming at last then?  8)


LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #66 on: February 25, 2020, 10:12:44 am »
Yeah, it's super easy and fun now with Blockly; I'm a pro programmer now.

Art

Re: Possibly a breakthrough programming language?
« Reply #67 on: February 25, 2020, 02:40:23 pm »
Yep...Bits...Bytes...what's the difference... O0
In the world of AI, it's the thought that counts!


LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #68 on: February 27, 2020, 09:20:12 am »
Ok hot shots, time to release my open-source code. I will probably have to limit my future releases to follow modern safety practices and profit (OpenAI joke). I made this from scratch in Blockly: tree, online learning, searching, context mixing, arithmetic encoding, and decompressor. It can compress 100MB down to ~23MB. I skipped refactoring for now; the code is still small.

https://blockly-demo.appspot.com/static/demos/code/index.html#

Attached is the big-input version; try that one. The Blockly link above just shows the toy version. It turns 266,700 bytes into 74,959 bytes; for comparison, someone else's compressor that gets 100MB down to 21.8MB turns the same 266,700 bytes into 70,069 bytes.

To switch to decompression, put 'no' at the top, make the input hold only the first 15 letters (not all 266,700), and paste the encoding it produced into the decode input at the top, e.g. 0.[here]. You can use https://www.rapidtables.com/convert/number/binary-to-decimal.html to convert the code into bits, although I just divide the length of the encoding by 3, count one third of the digits at 4 bits each and the other two thirds at 3 bits each, which gives approximately the same bit length. E.g. you create 0.487454848, which is 9 digits long, so 9 / 3 = 3, 3 x 4 = 12, 6 x 3 = 18, and 18 + 12 = 30, so 0.487454848 is about 30 bits long!
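
A quick Python check of that thirds rule (the function names are just made up for the illustration): each decimal digit carries log2(10) ≈ 3.32 bits, and the thirds trick works out to about 3.33 bits per digit, so the two agree to within a fraction of a bit.

Code:
from math import log2

def exact_bits(n_digits):
    # Information content of an n-digit decimal fraction: n * log2(10) bits.
    return n_digits * log2(10)

def thirds_rule(n_digits):
    # Rule of thumb from the post: one third of the digits at 4 bits each,
    # the other two thirds at 3 bits each (~3.33 bits per digit).
    return (n_digits / 3) * 4 + (2 * n_digits / 3) * 3

print(round(exact_bits(9), 1), thirds_rule(9))  # 0.487454848 has 9 digits -> ~29.9 vs 30.0 bits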

LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #69 on: February 29, 2020, 11:14:20 pm »
In case the link didn't show the blocks, here they are. Did it work?

LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #70 on: March 01, 2020, 04:29:34 am »
Btw, that XML goes into Blockly; the XML tab is at the top of the Blockly page.

LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #71 on: March 07, 2020, 10:51:29 am »
Cross-posting this:

I personally use lossless compression as the true evaluation for finding AGI, which is the goal behind the Hutter Prize. Of course it doesn't cover everything: AI minds need bodies/nanobots, rewards, the ability to surf the web, etc. Yet so far the best compressors talk to themselves, do online learning, rearrange articles more neatly, group similar words, and so on. So it seems like a million-dollar idea.

It makes you wonder: is there any other evaluation we need, if my goal is to massively lower the probability of death/pain on Earth? (That's where Earth is going with evolution and the Hutter Prize: it repairs missing data using prediction to make Earth a neater pattern fractal and save wasted energy, adds data using online learning, then uses its larger context tree to add new data even faster, exponentially growing in scale.) If I invent true AGI, it uses more intelligent means (faster than brute force) to make/predict the future and form the settled state ('utopia') that physics will settle us into. So does it need a body (do we need another evaluation) if it knows enough about the real world that it can extract any unseen text/image it would have seen through a body? It has less need to gather real data. So in a sense, to invent AGI is to just predict well and smash the Hutter Prize.

As for the rest of the evaluation for achieving immortality: we do still need bodies to carry out tasks, intelligence doesn't 'do it all', but we already have humanoid bodies with no real brain (and humans, etc.), so all we need to focus on is AGI. And big data. The other evaluation, part of the intelligence evaluation, is data/brain size: scale. Scale and prediction are the only things we need to work on. Note that your prediction can be good but slow or need a lot of RAM; that is less important, and we can feel that effect, i.e. it tells us the cure for cancer but we had to wait a year = a big thank-you still. As with scale, the more workers on Earth, the faster we advance as well.

The first thing you notice is that it is easy to get the enwiki8 100MB down to 25MB, but exponentially harder the lower you go. So is intelligence solved? No. But people are already getting the data compression they need and won't benefit much more now, or so it seems. If a company compresses DNA data down from 100MB to 15MB, why would they/we work on the Hutter Prize so hard to get to 13MB? Here's why: what if they had not 100MB but 100GB? The amount cut off is now not 2MB but 2GB! Also, the more data fed in, the more patterns there are and the more it can compress, so not 2GB but 4GB; 100GB becomes 11GB. Mind you, it is funny that our AGI prize is being used to compress data as a tool, though AI can think up any invention itself, so it doesn't seem odd exactly.

Now, seeing that we get a better compression ratio the more data we feed in, this means that if we make the AI predictor more intelligent at finding/creating patterns, it will result in a huge improvement, not a diminishing 15MB > 14MB > 13.5MB > 13.45MB > 13.44MB. However, I'm unsure that makes sense; maybe it is indeed harder the lower you go, if we look at e.g. a 100TB input where the limit is e.g. 2TB (and currently we'd only reach e.g. 4TB). Perhaps it is harder the lower you go because there are fewer hints and higher uncertainty in some problems that require lots of knowledge. So instead it is more honourable to lower the compression by just a little bit once it is already low. Of course the probability gets worse for problems that are hard to predict well on; consider the question below for predicting the likely answer:

"Those witches who were spotted on the house left in a hurry to see the monk in the cave near the canyon and there was the pot of gold they left and when they returned back they knew where to go if they wanted it back. They knew the keeper now owned it and if they waited too long then he would forever own it for now on."
> Who owns what?
> Possible Answers: Witches own monk/witches own canyon/monk owns gold/monk owns house/monk owns cave/cave owns pot/there was pot/he owns it

You can't help that hard problems have a lower probability of being predicted, but you can still improve it. These problems are hard because they haven't been seen much; there is no match. Simply put, entailment and translation are used to see the frequency of the answer. How do I know this? Think about it: if real-world data humans wrote says the turtle sat in water and yelped, you can be fairly sure it will yelp; you can be fairly sure a rolling ball will fall off a table, a molecule x will twist like x, etc. And if the words are unseen but similar, e.g. 'the cat ate the ?' and you have seen lots of 'this dog digested our bread', then you know the probability of what follows (and what matches what: cat=dog, using contexts seen previously). This works for rearranged words and letters, typos, missing or added words, and so on. So simply doing prediction will discover the true answers with the highest probabilities.

Of course, it may not be exactly as simple as "a match is all that's needed and then you get the entailing answer (or translation)", e.g. 'the cure to cancer is '. At least it doesn't seem like it, but it is. This requires you to take the word cancer, look into what entails IT (pain, rashes), what those are also similar to, what entails THOSE, and repeat; commonsense reasoning like BERT-style translation here, giving you more virtual generated data. So you are finding the entailment to 'the cure to cancer is ', except the prediction is not directly tied to those very words, if you know what I mean.

Now, if our predictor were an image predictor or movie generator, and also used text context (a multi-sensory context predictor), we would have AGI 'talking' to itself, learning new data online. It could share its visual thoughts and talk to us using vision too. We also use Byte-Pair-Encoded segmentation, like 'sounds' (words...), to activate certain images; it is more efficient. Images aren't as abstract in that way, but they can be much more detailed at a low level. We will need both later to get better prediction. And yes, images are words: words describe image objects, and they are inversely the same thing.
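
Since I keep leaning on Byte Pair Encoding for segmentation, here is a minimal Python sketch of the standard merge loop (just an illustration, not code from my compressor): repeatedly fuse the most frequent adjacent pair of symbols into a new single symbol, so frequent chunks like words end up as single tokens.

Code:
from collections import Counter

def bpe(text, n_merges=10):
    # Byte Pair Encoding: start from single characters and repeatedly merge
    # the most frequent adjacent pair of symbols into one new symbol.
    seq = list(text)
    merges = []
    for _ in range(n_merges):
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        (a, b), _count = pairs.most_common(1)[0]
        merges.append(a + b)
        merged, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and seq[i] == a and seq[i + 1] == b:
                merged.append(a + b)
                i += 2
            else:
                merged.append(seq[i])
                i += 1
        seq = merged
    return merges, seq

merges, tokens = bpe("the cat sat on the mat and the cat ate", 6)
print(merges)   # frequently seen chunks become single tokens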

LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #72 on: March 13, 2020, 02:00:59 pm »
Last compressor above: 266,700 bytes into 74,959 bytes.
New, slightly better compressor: 266,700 bytes into 72,584 bytes.

LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #73 on: March 16, 2020, 09:04:41 am »
Input: 100,000 enwiki8 bytes
Green: 31,518
Mine: 31,447

Input: 200,000 enwiki8 bytes
Green: 60,321
Mine: 60,394

Going to get it lower yet, just you wait.

Starting to write in the Python editor now, it's getting more natural lol... saves time, heh.

LOCKSUIT

Re: Possibly a breakthrough programming language?
« Reply #74 on: March 17, 2020, 08:30:44 am »
Keep in mind that Green gets the 100MB down to 21.8MB.

20,000 enwiki8 bytes input:
Green: 6,849
My last: 6,829
My new: 6,811

100,000 enwiki8 bytes input:
Green: 31,518
My last: 31,363
My new: 31,266

200,000 enwiki8 bytes input:
Green: 60,321
Mine: 60,057