How does GPT-2 or LSTMs store 40GB in just 800MB?


LOCKSUIT

« on: July 22, 2019, 12:36:12 pm »
GPT-2, a Transformer, was trained on 40GB of text; LSTMs are another type of net. Meanwhile, my Trie can't even hold 1GB of text (after eliminating repetitious words and phrases, that might shrink to, say, 0.2GB) without ballooning to 20x the size, around 20GB in RAM, when it should be 0.2GB. So GPT-2 "knows" 40GB of text but only raises my RAM usage by about 0.8GB, and GPT-2 354M raises it by about twice that, 1.7GB. I'm seeing the opposite effect: my project costs 20x more memory than I put in, not less. The parameters hold the data, but what kind of Trie is this?? How can I emulate such compression?
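
A rough back-of-the-envelope sketch in Python (the byte sizes and parameter counts below are my assumptions, not measurements of GPT-2 or of my Trie):

```python
import sys

# 1) Why a naive trie balloons: with one Python dict per node, every stored
#    character drags a whole dict object along, so overhead dwarfs the data.
print(f"one empty trie node: ~{sys.getsizeof({})} bytes")  # ~64 bytes in CPython

# 2) Why GPT-2 stays small: it stores weights, not the training text. The
#    footprint is parameter count x bytes per parameter, whatever the corpus size.
def model_size_gb(n_params, bytes_per_param=4):
    """Approximate weight memory, assuming float32 (4 bytes per parameter)."""
    return n_params * bytes_per_param / 1e9

print(f"~117M-param GPT-2: ~{model_size_gb(117_000_000):.1f} GB")  # near the 0.8GB I saw
print(f"~345M-param GPT-2: ~{model_size_gb(345_000_000):.1f} GB")  # near the 1.7GB I saw
```

If that math is in the right ballpark, the 40GB of text isn't stored at all; it gets boiled down, lossily, into a few hundred million shared weights, so the network is a statistical summary rather than a Trie that keeps every string.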

My guess is it's the layer-1 nodes: certain ones make the next layer light up, and so on up the stack, so features get shared instead of stored separately. Sneaky alien storage compression, correct? Maybe I already know the answer then... or is it something else?
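
If that guess is right, the compression may just be combinatorics: n shared units can take on exponentially many on/off patterns while costing only linear storage. A toy count (illustrative only, nothing GPT-2-specific about it):

```python
# Distinct binary activation patterns over n shared units: storage grows
# linearly in n, but representable patterns grow as 2**n -- features get
# combined and reused across layers rather than stored once per item.
for n in (10, 100, 1000):
    print(f"{n} units -> {float(2**n):.3e} possible on/off patterns")
```

Whether a real network exploits anything close to that capacity is another question, but it shows why a distributed code can beat one-node-per-string storage.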
Emergent https://openai.com/blog/

 

