Cool AI stuff

  • 4 Replies
  • 943 Views

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • Prometheus
  • 4659
  • First it wiggles, then it is rewarded.
    • Main Project Thread
Cool AI stuff
« on: November 30, 2018, 10:18:30 am »
https://www.quora.com/What-is-the-highest-number-of-neurons-weve-modeled-in-neural-network-code
160B nodes

https://ai.stackexchange.com/questions/2330/when-will-the-number-of-neurons-in-ai-systems-equal-the-human-brain
Judging by the link above, around 2015 a lot of algorithms were using roughly 50 million neurons.

Korr/anyone: if you consider a range of typical CNNs or GANs (or Korr's), or non-visual models like LSTMs, roughly how many nodes/connections do they have right now (visual vs. non-visual)? And how much RAM do they use? I know that team got 160B, but ours aren't that big.
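
(A rough back-of-envelope sketch of the kind of numbers involved, assuming a generic VGG-style CNN with made-up layer sizes and float32 weights; this isn't any particular published model, just an illustration of how the counts and RAM are estimated.)

Code:
# Rough estimate of parameter count and RAM for a generic VGG-style CNN.
# Layer shapes are illustrative assumptions, not any specific published model.

def conv_params(in_ch, out_ch, k=3):
    return in_ch * out_ch * k * k + out_ch          # weights + biases

def dense_params(in_units, out_units):
    return in_units * out_units + out_units

layers = [
    conv_params(3, 64), conv_params(64, 128), conv_params(128, 256),
    conv_params(256, 512),
    dense_params(512 * 7 * 7, 4096),                # flattened feature map
    dense_params(4096, 1000),                       # classifier head
]

total = sum(layers)
print(f"parameters (connections): {total:,}")       # ~100M for this sketch
print(f"float32 weights alone:    {total * 4 / 1e9:.2f} GB RAM")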



One timeline:
https://en.wikipedia.org/wiki/Timeline_of_machine_learning



Source: http://www.human-memory.net/processes_consolidation.html

Quote:
"It should be remembered that each neuron makes thousands of connections with other neurons, and memories and neural connections are mutually interconnected in extremely complex ways. Unlike the functioning of a computer, each memory is embedded in many connections, and each connection is involved in several memories. Thus, multiple memories may be encoded within a single neural network, by different patterns of synaptic connections. Conversely, a single memory may involve simultaneously activating several different groups of neurons in completely different parts of the brain."
---> Comment: Exactly as I understood it. Different faces still activate the same "face" neuron. But for text? N/A.
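
(To see the "multiple memories in one network" idea concretely, here is a toy Hopfield-style sketch in numpy; it's my own illustration, not something from the article. Two patterns are stored in the same weight matrix, every weight helps store both, and each pattern can still be recovered from a noisy cue.)

Code:
import numpy as np

# Toy Hopfield-style network: two memories share one weight matrix,
# and every weight participates in storing both patterns.
rng = np.random.default_rng(0)
N = 64
patterns = np.sign(rng.standard_normal((2, N)))       # two random +/-1 memories

W = sum(np.outer(p, p) for p in patterns) / N         # Hebbian storage
np.fill_diagonal(W, 0)

def recall(cue, steps=10):
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

for p in patterns:
    noisy = p.copy()
    flip = rng.choice(N, size=10, replace=False)       # corrupt 10 of 64 bits
    noisy[flip] *= -1
    print("recovered:", np.array_equal(recall(noisy), p))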

Quote:
"The inverse of long-term potentiation, known as long-term depression, can also take place, whereby the neural networks involved in erroneous movements are inhibited by the silencing of their synaptic connections. This can occur in the cerebellum, which is located towards the back of the brain, in order to correct our motor procedures when learning how to perform a task (procedural memory), but also in the synapses of the cortex, the hippocampus, the striatum and other memory-related structures."
---> Comment: Bad grammar structures get pruned: their synapses are silenced.
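
(A loose artificial-network analogy to that silencing, purely illustrative: magnitude pruning, where the weakest "synapses" are zeroed so they no longer contribute.)

Code:
import numpy as np

# Loose ANN analogy to long-term depression: "silence" (zero out) the weakest
# synapses so they no longer contribute. Magnitude pruning, purely illustrative.
rng = np.random.default_rng(1)
W = rng.standard_normal((8, 8))

threshold = np.quantile(np.abs(W), 0.5)        # silence the weakest half
W_pruned = np.where(np.abs(W) < threshold, 0.0, W)

print("active synapses before:", np.count_nonzero(W))
print("active synapses after: ", np.count_nonzero(W_pruned))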



OMG, now I understand what episodic vs. semantic memory is; I see it this time:
http://www.human-memory.net/types_episodic.html
« Last Edit: November 30, 2018, 11:28:04 am by LOCKSUIT »
Emergent          https://openai.com/blog/

Korrelan

  • Trusty Member
  • Eve
  • 1454
  • Look into my eyes! WOAH!
    • YouTube
Re: Cool AI stuff
« Reply #1 on: December 01, 2018, 10:52:58 am »
The number of ‘neurons’ used in a model does not directly represent the effectiveness of the model.

There are so many different interpretations of how a neuron functions and how they should be wired together that it’s impossible to use the quantities as a metric. 

You also have to consider the trade-off with computing power and real-time simulation. It's easy to generate a few billion neurons of a specific design in memory, but can the processors run them at a useful speed?

Having 86+ billion simulated neurons means nothing if they can't achieve anything.
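
(A quick back-of-envelope check of that point; every number below is an assumption plugged in for illustration, not a measurement.)

Code:
# Back-of-envelope: can a given machine update N simulated neurons in real time?
# Every number below is an assumption for illustration.
neurons          = 86e9      # human-scale neuron count
synapses_each    = 1e4       # assumed synapses per neuron
flops_per_syn    = 10        # assumed FLOPs per synaptic update
updates_per_sec  = 1000      # 1 ms simulation timestep
hardware_flops   = 1e13      # ~10 TFLOPS, a decent single GPU

required = neurons * synapses_each * flops_per_syn * updates_per_sec
print(f"required: {required:.2e} FLOPS, available: {hardware_flops:.2e}")
print(f"slowdown vs real time: ~{required / hardware_flops:,.0f}x")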

 :)

It thunk... therefore it is!...    /    Project Page    /    KorrTecx Website

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • Prometheus
  • 4659
  • First it wiggles, then it is rewarded.
    • Main Project Thread
Re: Cool AI stuff
« Reply #2 on: December 01, 2018, 11:57:45 am »
True. You can cram things into fewer nodes via RNN representations, slowly blurring them until they effectively cease to exist. Eventually 12 nodes won't store it all; surely 4 won't.

Existence is needed.
Emergent          https://openai.com/blog/

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • Prometheus
  • 4659
  • First it wiggles, then it is rewarded.
    • Main Project Thread
Re: Cool AI stuff
« Reply #3 on: December 02, 2018, 04:03:07 am »
LSTMs are weighted RNNs and don't "store" sequences. They work probabilistically, which lets them reconstruct sequences. Correct!?

Seeing that it can reconstruct it, then of course it is, in effect, 'storing' "Romeo oh romeo where is thou child.".

This job can be done in other hierarchical ways.
Emergent          https://openai.com/blog/

Korrelan

  • Trusty Member
  • Eve
  • 1454
  • Look into my eyes! WOAH!
    • YouTube
Re: Cool AI stuff
« Reply #4 on: December 02, 2018, 10:06:46 pm »
No... as a general rule, LSTMs don't 'store' sequences... long-term...

After training, the LSTM 'is' the sequence... the sequence is 'embedded' within the LSTM; its structure has been altered to 'filter' and recognise the sequence.
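
(A minimal sketch of that idea, assuming PyTorch; a toy for illustration only, not anyone's actual model. A tiny character-level LSTM is trained on one sentence; nothing stores the text verbatim, yet after training the sequence is embedded in the weights and the network regenerates it one character at a time.)

Code:
import torch
import torch.nn as nn

# Toy: train a tiny char-level LSTM on one sentence, then regenerate it.
text = "Romeo oh romeo where is thou child."
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
x = torch.tensor([stoi[c] for c in text[:-1]])
y = torch.tensor([stoi[c] for c in text[1:]])

class CharLSTM(nn.Module):
    def __init__(self, vocab, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, 16)
        self.lstm = nn.LSTM(16, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)
    def forward(self, idx, state=None):
        h, state = self.lstm(self.emb(idx), state)
        return self.out(h), state

model = CharLSTM(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for step in range(300):
    logits, _ = model(x.unsqueeze(0))
    loss = nn.functional.cross_entropy(logits.squeeze(0), y)
    opt.zero_grad(); loss.backward(); opt.step()

# Regenerate from the first character: the 'memory' lives in the learned filter.
idx, state, out = x[:1].unsqueeze(0), None, text[0]
for _ in range(len(text) - 1):
    logits, state = model(idx, state)
    nxt = logits[0, -1].argmax().item()
    out += chars[nxt]
    idx = torch.tensor([[nxt]])
print(out)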

 :)
It thunk... therefore it is!...    /    Project Page    /    KorrTecx Website

 

