GENERATING A BRAIN

  • 2 Replies
  • 3195 Views

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • Prometheus
  • 4659
  • First it wiggles, then it is rewarded.
    • Main Project Thread
GENERATING A BRAIN
« on: March 18, 2017, 03:29:37 pm »

----------------------------------------------------------------------------------------

Human brain simulation is closer than you think.

Human brain generation is closer than you think.



The human brain has about 86 billion neurons and 100 trillion synapses.

Neuron cell bodies, dendrites, and most synapses sit in the grey matter; the white matter is mostly the axons wiring them together.
http://www.indiana.edu/~p1013447/dictionary/greywhit.htm
https://en.wikipedia.org/wiki/Grey_matter



In men, about 22.8 billion neurons are located in the cerebral cortex, the thin layer of grey matter at the top of the brain; in women, about 19.3 billion.
https://faculty.washington.edu/chudler/facts.html

About 50 billion neurons or more are in the Cerebellum.
https://en.wikipedia.org/wiki/Cerebellum#Cerebellum-like_structures
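As a quick sanity check on those counts (a minimal sketch; the figures are simply the ones quoted above, and the variable names are mine):

```python
# Rough neuron budget using the figures quoted above (all values approximate).
total_neurons = 86e9          # whole brain
cortex_neurons = 22.8e9       # cerebral cortex (male figure)
cerebellum_neurons = 50e9     # cerebellum, "50 billion or more"

rest = total_neurons - cortex_neurons - cerebellum_neurons
print(f"Cortex + cerebellum: {(cortex_neurons + cerebellum_neurons) / total_neurons:.0%} of all neurons")
print(f"Everything else: about {rest / 1e9:.1f} billion neurons")
```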



Women's brains are about 8 percent smaller on average, which goes a long way toward explaining the lower neuron counts in the cerebral cortex and hippocampus.
http://www.dailymail.co.uk/sciencetech/article-2287523/Women-really-smaller-brains--use-efficiently-men.html

The takeaway: a brain still works fine with a fair number of cerebral cortex and hippocampus neurons missing.

Every day we lose about 85,000 grey matter neurons.
https://faculty.washington.edu/chudler/facts.html
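To put that loss rate in perspective, here is a rough calculation assuming the 85,000/day figure holds steady over an 80-year span (my assumption, purely for scale):

```python
# Lifetime grey-matter neuron loss at the quoted rate (rough assumption:
# a constant 85,000 neurons/day over 80 years).
loss_per_day = 85_000
days = 80 * 365
total_loss = loss_per_day * days
print(f"Lifetime loss: ~{total_loss / 1e9:.1f} billion neurons "
      f"({total_loss / 86e9:.1%} of 86 billion)")
```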



[Video: girl living with half her brain]


A walkthrough of cases ranging from people missing their cerebellum to people with barely any brain at all.
https://braindecoder.com/post/how-much-of-the-brain-can-you-live-without-1257759671

https://en.wikipedia.org/wiki/List_of_animals_by_number_of_neurons



The brain regions for smell, taste, and possibly hearing are not needed and can be excluded.

Of the remaining regions, not every neuron can have its own dedicated function, because that would leave no room for memory, and there is a lot of memory. It doesn't all have to be excluded, since memory is cheap, as Ray (Kurzweil) says, but a lot of it can be: we only need enough for a few years, until the system learns high-quality smarts quickly and accurately. The same goes for the motors.

A 2.5-inch SSD can store about 2 trillion bits; a "Superman memory crystal" (5D optical storage) about 360 trillion; and a single gram of DNA can hold roughly 490 exabytes.
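To compare those capacities against the brain's connectivity, here is a minimal sketch; reading "360 trillion" as bits and budgeting 4 bytes per synapse are my own assumptions:

```python
# Compare storage media against a naive synapse table
# (assumption: 4 bytes per synapse for an address/weight entry).
synapses = 100e12
bytes_needed = synapses * 4

ssd_bytes = 2e12 / 8            # "2 trillion bits" quoted above
crystal_bytes = 360e12 / 8      # "360 trillion" read as bits, to stay conservative
dna_bytes = 490e18              # 490 exabytes per gram of DNA

for name, capacity in [('2.5" SSD', ssd_bytes),
                       ("memory crystal", crystal_bytes),
                       ("1 g DNA", dna_bytes)]:
    print(f"{name}: need {bytes_needed / capacity:.2g} of them for 100T synapses")
```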

After all, everything around you came from human brain actions, carried out through a human body and its senses.

All our artificial brain must do is acquire those actions and senses, then use them at the right times.

----------------------------------------------------------------------------------------

If the universe is expanding and creating new space, and new particles along with that space, then there is an infinite number of possible particle arrangements, and time-orderings of particle arrangements, that we could form.

If AIs kept expanding their military tactics and entertainment, there would be no upper limit on the intelligence achievable.

But only a finite number of possible particle arrangements needs to concern us, since an intelligence just below human level, sped up in a computer, is enough to bring about the singularity.

That finite set of particle arrangements is bounded by the size of the brain inside a human child's skull. And because neurons are spaced apart and the functional parts could in principle be far smaller, the effective size is much smaller still.

Human brain neurons were never functional at the quantum or even atomic scale. Calling them "quantum-small" is only a way of picturing how few particles we are really working with, not a claim that we will build at that scale. The important point is that the number of "qubits" we would need shrinks further still, since the spaced-out neurons themselves are not functional at the particle level. Hence "shrink".

On top of that, the long white-matter wiring can be shortened or removed entirely, either with wireless communication or with a single shared path in which each connection is addressed by a unique reference number (e.g. 1,089). The physical wiring then reduces to the number of connections, so each connection costs essentially one qubit.
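A minimal sketch of the reference-number idea, where a connection is just an entry in an address table rather than a physical fibre (the neuron names and numbers are made up for illustration):

```python
# Connections as an address table instead of physical wiring.
# Each connection gets a unique reference number mapping (source, target, weight).
connections = {
    1089: ("neuron_42", "neuron_7", 0.83),   # the post's example reference, 1,089
    1090: ("neuron_42", "neuron_913", 0.12),
}

def send(ref, signal):
    """Deliver a signal over one shared path, addressed by reference number."""
    source, target, weight = connections[ref]
    return target, signal * weight

print(send(1089, 1.0))   # ('neuron_7', 0.83)
```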

It gets simpler than that. We only need to create the brain once and let it run; we don't have to search for every later particle arrangement (i.e. the "video" of its future states), because those emerge easily on their own. From there, just care for the baby.

Even simpler: the number of arrangements drops from everything physically possible in that finite volume to just the states the processor can reach, because the processor can simulate anything. It doesn't have to be a brain or a world; it only needs the code, which uncompressed takes the same amount of storage. And looking at the shrunken, qubit-counted brain model I described earlier, a connection inside that compact brain can take a few extra paths like a maze, whereas the processor takes the shortest path from one place to another. That could perhaps be done with an actual brain or world too, but I'm not actually sure about this idea; it goes a bit beyond my knowledge.

Yet simpler: all the memory starts empty, so generating the brain never has to touch the memory contents at all.

In principle we could generate AI, technology, knowledge, books, anything, even dead people exactly as they were before they left.

Generating books from a complete vocabulary is a matter of rearranging and then adding words: sequences of 1 word, then 2, then 3, and so on. This keeps the number of possible arrangements small, because instead of enumerating particles we enumerate media, where each unit stands in for many particles.
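A tiny sketch of that enumeration, using a made-up three-word vocabulary:

```python
# Enumerate word sequences of length 1, then 2, then 3, ... from a fixed vocabulary.
from itertools import product

vocabulary = ["the", "cat", "sat"]          # toy vocabulary
for length in range(1, 3):                  # 1-word then 2-word sequences
    for sequence in product(vocabulary, repeat=length):
        print(" ".join(sequence))
# Prints the 3 one-word and 9 two-word sequences; in general there are |V|**n of length n.
```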

We could do the same with the senses and actions in the brain, after installing senses and actions from a database: links between image, language, and motor networks.

If we create a language-only AI, we can install every word (and every letter) so each appears only once in a hierarchical parse-tree network, then configure the tree by reading the internet. We could then reward words such as "health" and "entertainment" so that its output is not only grammatical, sequentially normal English but also expresses what it "wants", i.e. health and entertainment. I have a fairly detailed picture of how this works. Even if it falls short of full AI, something like this still discovers knowledge, technology, data, information, and so on, which is a kind of AI in itself.
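A heavily reduced sketch of the idea: learn which words follow which from some text, then bias generation toward rewarded words. The toy corpus, the reward list, and the flat bigram table are my simplifications, not the hierarchy network described above:

```python
# Toy version: bigram counts learned from text, sampling biased toward rewarded words.
import random
from collections import defaultdict

corpus = "exercise improves health . games are entertainment . health needs exercise".split()
rewarded = {"health", "entertainment"}

# "Configure the tree by reading": count which word follows which.
follows = defaultdict(lambda: defaultdict(float))
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1.0

def generate(word, steps=5, reward_boost=3.0):
    out = [word]
    for _ in range(steps):
        candidates = follows[out[-1]]
        if not candidates:
            break
        # Rewarded words get extra weight, steering output toward what it "wants".
        weights = [c * (reward_boost if w in rewarded else 1.0) for w, c in candidates.items()]
        out.append(random.choices(list(candidates), weights=weights)[0])
    return " ".join(out)

print(generate("exercise"))
```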

We can already turn images into 3D in various ways: 3D-segment the requested objects, then rearrange, scale, clone, darken, dent, or cut them. If a tweak still matches the target image, keep it and keep tweaking near that point (within its boundary). This can place a toothpick straight or slanted in a flower pot, spill water onto a new object, and much more: story-telling.
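A minimal sketch of the "keep the tweak if it still matches" loop, with a stand-in similarity function and a parameter vector in place of the real 3D scene edits:

```python
# Greedy tweak-and-keep: propose a small random change to the scene parameters,
# keep it only if the result matches the target at least as well as before.
import random

target = [0.2, 0.9, 0.5]                       # stand-in for the target image's features

def similarity(params):                        # stand-in for "does the render match?"
    return -sum((p - t) ** 2 for p, t in zip(params, target))

params = [0.0, 0.0, 0.0]                       # e.g. toothpick angle, water level, ...
best = similarity(params)
for _ in range(2000):
    i = random.randrange(len(params))
    candidate = params[:]
    candidate[i] += random.uniform(-0.05, 0.05)  # tweak "around here" (small boundary)
    score = similarity(candidate)
    if score >= best:                            # keep the tweak, else discard it
        params, best = candidate, score

print([round(p, 2) for p in params])             # converges near the target
```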

We can destructively scan human brains (or use scans already processed) and probabilistically generate from them a 3D model to use as a permanent base that can still be morphed to an extent.

We can make the reward function a video of a newborn in a room, build a 3D environment and body that match it exactly, hold those fixed without tweaking them, and tweak only the brain.

We can simulate a physical brain inside the 3D world and body, and let its connections and functions tweak and morph there instead of on the processor, which leaves the simulated skull empty. The processor and memory store things like images and actions; the functions themselves can either happen naturally in hardware as a reaction (e.g. summing up energy, or a relay) or be computed in software, i.e. creating in 1D/3D just the functions rather than the media. This is similar to full 3D brain simulation, except that not everything is simulated, and what is simulated interacts only locally rather than all with each other.

This could run on multiple energy-efficient, programmable, one-inch IBM TrueNorth neurosynaptic processors, or on Xeon processors, GPUs, other neuromorphic chips, or supercomputers. Each TrueNorth chip contains 5.4 billion transistors, which can store bits and may each switch up to 100 billion times per second. That provides more than enough computing power for the neurons and synapses: IBM's roughly 20 transistors per synapse (5,120 transistors for 256 synapses) may not be needed; 256 synapses per neuron may be more than necessary, so the mostly unused ones can be borrowed; and the 256 synapses serve each stored image rather than the individual neurons that make up the senses, unless pixels are used for the search tree.
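A back-of-envelope check that just takes the quoted figures at face value (the naive layout of one synapse per 20 transistors is my assumption):

```python
# How many 5.4-billion-transistor chips would a naive 20-transistors-per-synapse
# layout need for 100 trillion synapses? (Figures quoted above, taken at face value.)
transistors_per_chip = 5.4e9
transistors_per_synapse = 20          # "5,120 transistors for 256 synapses"
synapses = 100e12

synapses_per_chip = transistors_per_chip / transistors_per_synapse
chips = synapses / synapses_per_chip
print(f"~{chips:,.0f} chips at 20 transistors per synapse")
print(f"~{synapses / transistors_per_chip:,.0f} chips if one transistor could hold one synapse")
```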

With natural evolution already done once, we can create AI using generators based on pure randomization and/or evolutionary algorithms (simulation is faster, can be made exact, and allows more possibilities), driven by reward functions (e.g. a stick-figure program rewarded for how far it runs, while we watch how quickly each generation improves), with even ourselves as evaluators, and with backpropagation for finding solutions and for global/local exploration and exploitation. These generators can search over the evolutionary algorithms themselves, the reward functions, the backpropagation, physics solved by simulation, 3D worlds that allow the quickest learning, their physics and specific 3D reactions, bodies (even better than ours), the AIs, parameters, pre-installed actions and senses, circuit layouts, casino-style randomization, and genetic textures, so that the system learns (the necessary first step of building the AI). We then use the best generators to build better pure/evolutionary generators with rewards and backpropagation, and use those in turn to build CNNs with rewards, backpropagation, and attributes such as layer size and count (the CNNs also learn, and create topologies based on the patterns they see), plus classifiers with backpropagation. Then we train and educate the AI; more training data means faster learning and creating. Afterwards it updates its own intelligence, CNNs, and actions by itself, by evolutionary algorithm, or by other means. Evolutionary algorithms reproduce sexually: they cross the best two or more parents to get the best genes, copy-paste to clone the children as the next generation (almost like particles coming together in a simulation), and mutate them, as natural evolution does but better. They can start with a few coarse steps so that, once the main structure is found, hundreds of fine-tuning steps follow.
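A heavily simplified sketch of that loop: here an "animal" is just a vector of genes, and the reward function stands in for something like how far the stick program runs (all names and sizes are mine, for illustration):

```python
# Tiny evolutionary algorithm: evaluate, keep the best parents, cross, clone, mutate.
import random

def reward(genes):
    """Stand-in for e.g. how far the stick program runs with these genes."""
    return -sum((g - 0.7) ** 2 for g in genes)

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(genes, rate=0.1):
    return [g + random.gauss(0, rate) for g in genes]

population = [[random.uniform(-1, 1) for _ in range(8)] for _ in range(20)]
for generation in range(50):
    population.sort(key=reward, reverse=True)
    parents = population[:2]                       # cross the best two parents
    child = crossover(*parents)
    # Keep the parents, clone the child many times, and mutate the clones.
    population = parents + [mutate(child) for _ in range(18)]
    # Watching how fast each generation improves tells us the generator is working.
print(f"best reward after 50 generations: {reward(population[0]):.4f}")
```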

Try all algorithms, as many in parallel as possible, choosing among them linearly, randomly, probabilistically, or by rate of improvement (and also try hand-coding AI, then concentrate effort around whatever works), so that every field gets covered before we push hard in one place. We can put a finite limit on the "video" of all the arrangements tried, and teleport particles instead of moving them at light speed, since even one second of playback, or of designed frames, would involve a 22-digit number. For the weight updates themselves, backpropagation, stochastic gradient descent, and hill-climbing with multiple random or local/global scouts can adjust nearby link weights (strengths), whose energies draw the searches.
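A small sketch of the "multiple scouts" part: several independent hill-climbers started from random weights, each keeping only the nudges that improve its score (the objective function is a stand-in, not anything from the post):

```python
# Multiple random "scouts" hill-climbing the same weight landscape,
# each keeping only the nudges that improve its score.
import random

def score(weights):                       # stand-in objective over link weights
    return -sum((w - 0.3) ** 2 for w in weights)

def scout(steps=1000, n=5):
    weights = [random.uniform(-1, 1) for _ in range(n)]
    best = score(weights)
    for _ in range(steps):
        trial = [w + random.gauss(0, 0.02) for w in weights]
        if score(trial) > best:
            weights, best = trial, score(trial)
    return best, weights

# Global exploration via many local scouts; keep the best one found.
results = [scout() for _ in range(8)]
print(max(results)[0])
```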

The longer anything has stayed above the average, the more likely it is to keep going: someone who is 5,000 years old, something that has not moved for 5,000 years, a wage, a behavior, something that keeps growing in length.

----------------------------------------------------------------------------------------

Post your ideas on how to generate a brain.
« Last Edit: March 18, 2017, 04:09:54 pm by LOCKSUIT »
Emergent          https://openai.com/blog/

LOCKSUIT
Re: GENERATING A BRAIN
« Reply #1 on: March 20, 2017, 12:22:28 pm »
WOAHHHH

DON'T TRY THIS

This can generate an evil brain!

Even though the memory is reserved empty, the memory morphing into the code could end up producing code that carries certain images and the like, e.g. "kill humans" (exactly that).

A massively advanced techno god/heaven could use this capability and dissipate the bad outcomes.
Emergent          https://openai.com/blog/

LOCKSUIT
Re: GENERATING A BRAIN
« Reply #2 on: March 20, 2017, 09:10:36 pm »
Then again, if it's done in a 3D virtual world, we have good control.
Emergent          https://openai.com/blog/

 

