https://www.quora.com/What-is-the-highest-number-of-neurons-weve-modeled-in-neural-network-code
160B nodes.
https://ai.stackexchange.com/questions/2330/when-will-the-number-of-neurons-in-ai-systems-equal-the-human-brain
By the looks of the above link, around 2015 a lot of algorithms were using ~50 million neurons.
Korr/anyone: if you consider a range of typical CNNs or GANs (or Korr's), or non-visual nets like LSTMs, roughly how many nodes/connections do they have right now (visual vs. non-visual), and how much RAM do they take? I know that team got to 160B, but ours isn't anywhere near that big.
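For a ballpark answer, here's a back-of-envelope sketch (not Korr's actual architecture; every layer size below is an assumption picked just for illustration) of how many parameters a smallish CNN and a single-layer LSTM carry, and what that costs in RAM with fp32 weights:

```python
# Back-of-envelope parameter / RAM estimate for a "typical" small CNN and LSTM.
# All layer sizes here are illustrative assumptions, not any specific model.

def conv_params(in_ch, out_ch, k):
    """Weights + biases for one conv layer with a k x k kernel."""
    return in_ch * out_ch * k * k + out_ch

def lstm_params(input_size, hidden_size):
    """4 gates, each with input weights, recurrent weights, and a bias."""
    return 4 * (input_size * hidden_size + hidden_size * hidden_size + hidden_size)

# A VGG-ish toy CNN: 3 -> 64 -> 128 -> 256 conv layers plus one dense head
# on an assumed 8x8 final feature map, classifying into 1000 classes.
cnn = (conv_params(3, 64, 3) + conv_params(64, 128, 3) + conv_params(128, 256, 3)
       + 256 * 8 * 8 * 1000 + 1000)

# A single-layer LSTM with 512-dim inputs and 1024 hidden units.
lstm = lstm_params(512, 1024)

for name, n in [("CNN", cnn), ("LSTM", lstm)]:
    # fp32 -> 4 bytes per parameter; training roughly triples/quadruples this
    # once you add gradients and optimizer state.
    print(f"{name}: {n:,} params, ~{n * 4 / 2**20:.1f} MiB of weights (fp32)")
```

So the inference-time weight footprint for nets of this size is tens of MiB, not GiB; the 160B-node figure above is a very different regime.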
One timeline:
https://en.wikipedia.org/wiki/Timeline_of_machine_learning

Source:
http://www.human-memory.net/processes_consolidation.html

Quote:
"It should be remembered that each neuron makes thousands of connections with other neurons, and memories and neural connections are mutually interconnected in extremely complex ways. Unlike the functioning of a computer, each memory is embedded in many connections, and each connection is involved in several memories. Thus, multiple memories may be encoded within a single neural network, by different patterns of synaptic connections. Conversely, a single memory may involve simultaneously activating several different groups of neurons in completely different parts of the brain."
---> Comment: Exactly as I understood it. Different faces still map onto the same "face" neurons. How that carries over to text, I don't know yet.
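To make the "multiple memories in one set of connections" idea concrete, here's a minimal Hopfield-style sketch (purely illustrative, not how our nets are built): several patterns are superimposed in a single weight matrix via Hebbian outer products, so every connection participates in every memory, yet a corrupted pattern can still be recalled.

```python
import numpy as np

# Tiny Hopfield-style associative memory: three "memories" share ONE weight
# matrix, i.e. each connection is involved in every memory and each memory
# is spread across all connections.
rng = np.random.default_rng(0)
n = 64                                        # number of "neurons"
patterns = rng.choice([-1, 1], size=(3, n))   # three memories to store

# Hebbian storage: sum of outer products, no self-connections.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

# Recall: start from a corrupted copy of memory 0 and let the net settle.
state = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
state[flip] *= -1                             # corrupt ~15% of the bits

for _ in range(5):                            # a few update sweeps
    state = np.sign(W @ state)
    state[state == 0] = 1

print("overlap with stored memory:", int(state @ patterns[0]), "/", n)
```

With only 3 patterns in 64 units the net is well under capacity, so the corrupted input snaps back to the stored memory.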
Quote:
"The inverse of long-term potentiation, known as long-term depression, can also take place, whereby the neural networks involved in erroneous movements are inhibited by the silencing of their synaptic connections. This can occur in the cerebellum, which is located towards the back of the brain, in order to correct our motor procedures when learning how to perform a task (procedural memory), but also in the synapses of the cortex, the hippocampus, the striatum and other memory-related structures."
----> Comment: So badly-formed grammar structures get pruned the same way: their synaptic connections are silenced.
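If it helps, a loose software analogue of that LTD-style silencing (the threshold and the "error" mask below are made-up numbers, just for illustration) is to weaken and eventually zero out the connections that keep contributing to wrong outputs:

```python
import numpy as np

# Loose analogue of long-term depression: connections that fired during
# "erroneous" output are weakened, and once weak enough they are silenced
# (set to zero) instead of strengthened.
rng = np.random.default_rng(1)
W = rng.normal(0, 1, size=(8, 8))         # synaptic weights
error_mask = rng.random((8, 8)) > 0.7     # connections active during mistakes

W[error_mask] *= 0.5                      # depress the offending synapses
W[np.abs(W) < 0.2] = 0.0                  # silence connections that have decayed

print(f"{int((W == 0).sum())} of {W.size} connections silenced")
```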
omg, now I understand what episodic vs. semantic memory is, I see it this time:
http://www.human-memory.net/types_episodic.html