Cool AI stuff


LOCKSUIT

Cool AI stuff
« on: November 30, 2018, 10:18:30 am »
https://www.quora.com/What-is-the-highest-number-of-neurons-weve-modeled-in-neural-network-code
160B nodes

https://ai.stackexchange.com/questions/2330/when-will-the-number-of-neurons-in-ai-systems-equal-the-human-brain
Judging by the above link, around 2015 many algorithms were using roughly 50 million neurons.

Korr/anyone: if you consider a range of typical CNNs or GANs (or Korr's nets), or non-visual models like LSTMs, right now, around how many nodes/connections do they have (visual vs. non-visual)? And how much RAM do they need? I know that team got 160B, but ours aren't that big.
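For a sense of scale, here is a back-of-envelope parameter count for a small, hypothetical CNN. Every layer size below is my own illustrative assumption (a LeNet-ish layout), not a measurement of any particular published model:

```python
# Back-of-envelope parameter count for a small, hypothetical CNN.
# All layer sizes are illustrative assumptions.

def conv_params(in_ch, out_ch, k):
    """Weights + biases for a k x k convolution layer."""
    return in_ch * out_ch * k * k + out_ch

def dense_params(n_in, n_out):
    """Weights + biases for a fully connected layer."""
    return n_in * n_out + n_out

layers = [
    conv_params(3, 32, 3),          # conv1: 3 -> 32 channels, 3x3 kernels
    conv_params(32, 64, 3),         # conv2: 32 -> 64 channels
    dense_params(64 * 8 * 8, 256),  # fc1 (assumes 8x8 feature maps here)
    dense_params(256, 10),          # fc2: 10-way classifier
]

total = sum(layers)
print(f"total parameters: {total:,}")
print(f"RAM at 4 bytes/param (float32): {total * 4 / 1e6:.1f} MB")
```

Even this toy network lands around a million parameters, dominated by the first fully connected layer — which is why typical vision models sit in the millions-to-hundreds-of-millions range, nowhere near 160B.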



One timeline:
https://en.wikipedia.org/wiki/Timeline_of_machine_learning



Source: http://www.human-memory.net/processes_consolidation.html

Quote:
"It should be remembered that each neuron makes thousands of connections with other neurons, and memories and neural connections are mutually interconnected in extremely complex ways. Unlike the functioning of a computer, each memory is embedded in many connections, and each connection is involved in several memories. Thus, multiple memories may be encoded within a single neural network, by different patterns of synaptic connections. Conversely, a single memory may involve simultaneously activating several different groups of neurons in completely different parts of the brain."
---> Comment: Exactly as I understood it. Different faces still activate the same "face" neurons. But how this maps to text? N/A.

Quote:
"The inverse of long-term potentiation, known as long-term depression, can also take place, whereby the neural networks involved in erroneous movements are inhibited by the silencing of their synaptic connections. This can occur in the cerebellum, which is located towards the back of the brain, in order to correct our motor procedures when learning how to perform a task (procedural memory), but also in the synapses of the cortex, the hippocampus, the striatum and other memory-related structures."
----> Comment: So badly-formed structures get pruned — their synapses silenced.



OMG, now I understand the difference between episodic and semantic memory — I see it this time:
http://www.human-memory.net/types_episodic.html
« Last Edit: November 30, 2018, 11:28:04 am by LOCKSUIT »
Emergent


Korrelan

Re: Cool AI stuff
« Reply #1 on: December 01, 2018, 10:52:58 am »
The number of ‘neurons’ used in a model does not directly represent the effectiveness of the model.

There are so many different interpretations of how a neuron functions and how they should be wired together that it’s impossible to use the quantities as a metric. 

You also have to consider the trade-off between computing power and real-time simulation. It's easy to generate a few billion neurons of a specific design in memory, but can the processors run them at a useful speed?

Having 86+ billion simulated neurons means nothing if they can't achieve anything.
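To put rough numbers on that trade-off — a sketch only; every figure below (synapses per neuron, update rate, bytes per weight) is an illustrative assumption, not a measurement:

```python
# Rough cost estimate for simulating N point neurons in real time.
# Illustrative assumptions: ~1,000 synapses per neuron, 4 bytes per
# synaptic weight (float32), and one multiply-add per synapse per update.

neurons = 86_000_000_000      # "human scale", as discussed above
synapses_per_neuron = 1_000   # biological neurons average ~1k-10k
bytes_per_weight = 4          # float32
updates_per_second = 100      # ~100 Hz update rate

weights = neurons * synapses_per_neuron
ram_tb = weights * bytes_per_weight / 1e12
flops = weights * 2 * updates_per_second  # multiply + add per synapse

print(f"synaptic weights: {weights:.2e}")
print(f"RAM needed: {ram_tb:.0f} TB")
print(f"compute: {flops / 1e15:.1f} PFLOP/s")
```

Hundreds of terabytes of weights and petaflops of sustained throughput, just for a stripped-down point-neuron model — which is the point: raw neuron count says nothing about whether the thing can run, let alone achieve anything.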

 :)

It thunk... therefore it is!...    /    Project Page    /    KorrTecx Website


LOCKSUIT

Re: Cool AI stuff
« Reply #2 on: December 01, 2018, 11:57:45 am »
True. You can compress the representation into fewer nodes, as RNNs do, slowly blurring it until it's gone. Eventually 12 nodes won't store it all, and surely 4 won't.

Existence is needed.


LOCKSUIT

Re: Cool AI stuff
« Reply #3 on: December 02, 2018, 04:03:07 am »
LSTMs are weighted RNNs and don't "store" sequences — they work probabilistically, which lets them reconstruct sequences. Correct!?

Seeing that it can reconstruct it, then in a sense it is 'storing' "Romeo oh romeo where is thou child.".

This job can be done in other hierarchical ways.


Korrelan

Re: Cool AI stuff
« Reply #4 on: December 02, 2018, 10:06:46 pm »
No... as a general rule LSTMs don't 'store' sequences... long-term...

After training the LSTM 'is' the sequence... the sequence is 'embedded' within the LSTM, the LSTM's structure is altered to 'filter' and recognise the sequence.
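A toy way to see "the model is the sequence": below I use a first-order Markov (bigram) model as a hypothetical stand-in for an LSTM — far simpler, but the principle is analogous. No copy of the string is kept anywhere; the sequence is embedded in the transition counts, and the model gives the trained-on sequence a much higher probability than a rearranged one:

```python
# Toy "the model IS the sequence" demo: a character bigram model
# (a simplistic stand-in for an LSTM) embeds a sequence in its
# transition counts rather than storing the string itself.
import math
from collections import Counter, defaultdict

text = "romeo oh romeo where art thou"

# "Training": count which character follows which.
transitions = defaultdict(Counter)
for a, b in zip(text, text[1:]):
    transitions[a][b] += 1

def log_prob(s):
    """Log-probability of s under the bigram model, with add-one
    smoothing over the observed alphabet."""
    vocab = set(text)
    lp = 0.0
    for a, b in zip(s, s[1:]):
        counts = transitions[a]
        total = sum(counts.values()) + len(vocab)
        lp += math.log((counts[b] + 1) / total)
    return lp

print(log_prob(text))        # the trained-on sequence
print(log_prob(text[::-1]))  # reversed: scores far lower
```

The trained-on line scores far above its reversal because its transitions were 'filtered into' the model — exactly the sense in which an LSTM's altered structure recognises the sequence without storing it verbatim.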

 :)

 

