Towards GPT-4

  • 6 Replies
  • 27822 Views

infurl

Towards GPT-4
« on: August 08, 2020, 12:03:03 am »


Lex Fridman just released an interesting short video where he estimates the cost of training a hypothetical GPT-4 model based on current trends. At ten times the size of GPT-3, it would cost about ten times as much to train at today's prices, but in ten years it would cost about the same.
« Last Edit: August 08, 2020, 12:26:32 am by infurl »
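A back-of-envelope version of that scaling argument, with every number an assumption rather than anything quoted from the video (a minimal sketch in Python):

Code:
# Back-of-envelope sketch of the scaling argument; all figures are assumptions.
GPT3_TRAIN_COST_USD = 5_000_000    # assumed order-of-magnitude cost to train GPT-3
SCALE_FACTOR = 10                  # hypothetical GPT-4 at ~10x the size of GPT-3
COST_DROP_PER_DECADE = 10          # assumed improvement in price/performance over ten years

cost_today = GPT3_TRAIN_COST_USD * SCALE_FACTOR
cost_in_ten_years = cost_today / COST_DROP_PER_DECADE

print(f"Hypothetical GPT-4 at today's prices: ${cost_today:,.0f}")
print(f"Hypothetical GPT-4 in ten years:      ${cost_in_ten_years:,.0f}")

If the price of compute really does fall about tenfold per decade, the ten-times-larger model lands back at roughly today's GPT-3 price.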


frankinstien

Re: Towards GPT-4
« Reply #1 on: August 08, 2020, 12:31:52 am »
The human brain only has 86 billion neurons, but each neuron can have anywhere from 10,000 to 100,000 connections! A synapse is a junction between two nerve cells, consisting of a minute gap across which impulses pass by diffusion of a neurotransmitter, so it is the dendrite-to-axon connections that add the complexity to biological systems.

Code-wise, symbolically representing 86 billion neurons is not a problem with today's technology. An RTX 2080 Ti can reach about 228 TOPS in INT8 mode, which works out to roughly 2,650 connection updates per second for each of 86 billion simulated neurons. And when citing the human brain's connectivity, those 100 trillion synapses include the encoding of raw data like audio and video, which we don't have to encode in a neural network since we can store data digitally far more efficiently than biological systems can. That leaves function, not data, as what a neural network has to be responsible for. So I think we're either at the point of making a true AGI or at least very close; it's a matter of figuring out adept digital or silicon solutions to the type of processing biology has evolved to do, IMHO.
« Last Edit: August 08, 2020, 01:01:24 am by frankinstien »
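The arithmetic behind that 2,650 figure, as a quick sketch (the TOPS number is the quoted spec, and the one-operation-per-connection assumption is a big simplification):

Code:
# Rough ops-per-neuron budget; figures are approximate assumptions.
INT8_OPS_PER_SEC = 228e12    # ~228 TOPS quoted for an RTX 2080 Ti in INT8 mode
NEURON_COUNT = 86e9          # ~86 billion neurons in the human brain

# Assuming one INT8 operation per connection per update pass, once per second:
connections_per_neuron = INT8_OPS_PER_SEC / NEURON_COUNT
print(f"~{connections_per_neuron:,.0f} connection updates per neuron per second")  # ~2,651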


infurl

Re: Towards GPT-4
« Reply #2 on: August 08, 2020, 01:28:42 am »
Each neuron is comparable to a complete computer with processing and storage taking place on a molecular level rather than just a unit of storage. That would make the human brain more like a supercomputer with almost a hundred billion nodes. The most powerful supercomputer in the world currently has fewer than two hundred thousand nodes, so we have a way to go yet. Then there is the question of efficiency; that supercomputer needs a lot more than 25 watts to run.
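Roughly quantifying that gap (a sketch; the supercomputer power figure is an assumed order of magnitude, not any specific machine's spec):

Code:
# Order-of-magnitude comparison; all figures are rough assumptions.
BRAIN_NODES = 86e9            # each neuron treated as a "node"
SUPERCOMPUTER_NODES = 2e5     # leading machines have fewer than ~200,000 nodes
BRAIN_WATTS = 25              # commonly cited power budget of the brain
SUPERCOMPUTER_WATTS = 2e7     # assumed ~20 MW for a top supercomputer

print(f"Node gap:  ~{BRAIN_NODES / SUPERCOMPUTER_NODES:,.0f}x")    # ~430,000x
print(f"Power gap: ~{SUPERCOMPUTER_WATTS / BRAIN_WATTS:,.0f}x")    # ~800,000x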


frankinstien

Re: Towards GPT-4
« Reply #3 on: August 08, 2020, 02:11:12 am »
Quote from: infurl on August 08, 2020, 01:28:42 am
Each neuron is comparable to a complete computer with processing and storage taking place on a molecular level rather than just a unit of storage.

Ah... that's a bit of a stretch. Why? A neuron's output, meaning its integration of inputs to produce an output, maxes out at a spike train of about 300 Hz, i.e. cycles per second, and the effective information packet of a neuron is on the scale of milliseconds, so only a narrow range of the spike train is effectively meaningful. Don't confuse the cellular operations of a neuron, which manage metabolism, transmitter transport, membrane pores, etc., with the computational process of the neuron. I often analogize neurons to that old board game "Mouse Trap": the game has all kinds of contraptions and processes to do one thing, which is drop the cage over the mouse. So too the neuron's infrastructure is complex, but it simply resolves to a single spike or a spike train...
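A crude upper bound on what such a spike train could carry, treating each millisecond as a binary spike/no-spike slot with the rate capped at 300 Hz (a deliberately generous simplification):

Code:
import math

# Crude upper bound on one neuron's information rate under simplified assumptions:
# 1 ms bins, each bin either contains a spike or not, mean rate capped at 300 Hz.
BIN_SECONDS = 0.001
MAX_RATE_HZ = 300.0

p = MAX_RATE_HZ * BIN_SECONDS                                      # spike probability per bin (0.3)
bits_per_bin = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))    # binary entropy
bits_per_second = bits_per_bin / BIN_SECONDS

print(f"~{bits_per_second:.0f} bits/s upper bound per neuron")     # ~881 bits/s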


frankinstien

Re: Towards GPT-4
« Reply #4 on: August 08, 2020, 02:47:59 am »
In comparison, a neuron's signal spectrum runs from 0 to about 300 Hz:

[image: neuron spike-rate spectrum]

Now look at an NTSC signal:

[image: NTSC composite video signal spectrum]

The bandwidth, or information carrying capability, of a neuron is very low, at best 300 Hz, compared to an NTSC signal that uses amplitude modulation for the basic black-and-white video, frequency modulation for the audio, and a combination of phase and amplitude modulation for the color. That's three types of information coding superimposed on a single carrier wave.

So neurons are nowhere near the information encoding capability of even old NTSC video! The bandwidth in neurology just isn't there to justify calling each neuron a computer unto itself...
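For scale, a crude comparison of the two channels (both numbers are simplifications that ignore how the information is actually coded):

Code:
# Crude bandwidth comparison; ignores modulation and coding details.
NTSC_CHANNEL_HZ = 6e6        # an NTSC broadcast channel occupies about 6 MHz
NEURON_MAX_RATE_HZ = 300     # approximate ceiling on a neuron's firing rate

ratio = NTSC_CHANNEL_HZ / NEURON_MAX_RATE_HZ
print(f"NTSC channel is roughly {ratio:,.0f}x wider than a neuron's spike-rate range")  # ~20,000x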


LOCKSUIT

Re: Towards GPT-4
« Reply #5 on: August 08, 2020, 03:05:41 am »
:) The brain comes pre-built at birth with "extra" synapses/connections, and it is constantly losing neurons every day. Even as the brain fills up its storage space, it never uses all of it; there is always extra head-room. Also, the brain has a ton of connections to and from many neurons just so that it can pick and choose which ones to use and strengthen. It doesn't depend on all of them.


yotamarker

Re: Towards GPT-4
« Reply #6 on: September 03, 2020, 06:44:46 am »
TL;DR:
GPT is an AI model from OpenAI (the lab co-founded by Elon Musk).
It uses prediction and a huge number of parameters to predict text word by word,
making it sound coherent whether it spits out truths or fantasy.
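The prediction loop itself, as a minimal sketch (the toy lookup table below is a stand-in for the model; a real GPT learns a neural network that scores every token in a large vocabulary, but the generation loop looks the same):

Code:
import random

# Toy stand-in for a language model: maps a word to possible next words with
# probabilities. A real GPT does this over tokens with billions of parameters.
toy_model = {
    "the": [("cat", 0.5), ("dog", 0.3), ("moon", 0.2)],
    "cat": [("sat", 0.7), ("ran", 0.3)],
    "sat": [("down", 1.0)],
}

def generate(start, steps=3):
    word, text = start, [start]
    for _ in range(steps):
        choices = toy_model.get(word)
        if not choices:
            break
        words, probs = zip(*choices)
        word = random.choices(words, weights=probs)[0]   # sample the next word
        text.append(word)
    return " ".join(text)

print(generate("the"))   # e.g. "the cat sat down" -- fluent whether or not it's true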

 

