Awareness, Consciousness, and Free Will in General AI Discussion

Here's how I see them, and how they might build on each other:

I’m assuming that a sufficiently self-informed process can become self-aware. The idea is that once you get enough of the right type of interrelatedness happening in a system, that system will become mechanically/energetically self-informed. Granted, there may be a necessary step between self-informed and self-aware, but I’m not convinced there has to be; it might be more of a complexity-dependent spectrum. But this seems like the best candidate for what generates awareness/experience, so I'm going with it, and I’ll see where it takes me.

Now that we have self-awareness, how can we create consciousness? I think the answer is self-awareness plus time. A conscious process would need to be so interrelated as to continually remain informed of the actions of its constituent parts during a smudge of time (technically slightly in the past from an external perspective), which it would know as the present moment. The 'present' might be akin to the most recent/least faded waves of sensory input/activity in the nervous system. The fading echoes at the tail end of the smudge might then create a passive short-term memory. Comparing the echoes to the most recent waves would inform the internal processes about trends in the external processes. This would enable them to notice and use time, giving an extra dimension to awareness and creating something we might call consciousness.
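As a toy illustration of that smudge-of-time idea (the decay constant and signal values here are my own, purely illustrative choices): a decaying trace of recent input acts as passive short-term memory, and comparing the newest wave against the faded echo exposes a trend.

```python
def update_echo(echo, new_input, decay=0.7):
    """Blend the fading echo with the newest sensory wave."""
    return decay * echo + (1 - decay) * new_input

def trend(echo, new_input):
    """Positive when the signal is rising relative to its recent past."""
    return new_input - echo

echo = 0.0
for sensory_wave in [1.0, 1.2, 1.4, 1.6]:       # a steadily rising input
    print(round(trend(echo, sensory_wave), 3))  # every printed trend is positive
    echo = update_echo(echo, sensory_wave)
```

The echo never fully catches up to a rising input, so the comparison alone is enough to notice "things are trending upward" without any explicit clock.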

How about free will, then? Consciousness tends to go together with indecision. This seems like the effect of a divergent prediction mechanism rather than a convergent one. An AGI might be paralyzed by the possibilities it could generate, given even slightly incomplete knowledge. It seems like it would need to pause to evaluate each possible future for probability and desirability, then pick and work towards something in the ‘favorite most likely’ category. This looks to me like a, let's say, ‘higher-order’ free will, achieved at larger scales/complexities, despite the predictability of simple particles, through indecision followed by self-determinism.
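That "generate many futures, then converge on the favorite most likely one" loop can be sketched in a few lines. This is a toy model: the `Future` fields and the probability-times-desirability score are my own illustrative choices, not anything from an actual system.

```python
from dataclasses import dataclass

@dataclass
class Future:
    description: str
    probability: float    # how likely this outcome seems (0..1)
    desirability: float   # how much the agent wants it (0..1)

def choose(futures):
    """The divergent step already happened (many candidates were generated);
    this is the convergent step: score each candidate and commit to the
    'favorite most likely' one."""
    return max(futures, key=lambda f: f.probability * f.desirability)

futures = [
    Future("keep current plan", 0.9, 0.3),
    Future("risky shortcut",    0.2, 0.9),
    Future("ask for help",      0.6, 0.7),
]
print(choose(futures).description)  # -> ask for help
```

The indecision lives in how long the agent spends expanding the `futures` list; the "self-determinism" is the final `max` over its own scores.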

37 Comments | Started June 04, 2021, 04:38:09 am

Releasing full AGI/evolution research in General Project Discussion

This is most of my notes, images, algorithms, etc., summarized/unified in their most recent forms. I'm 24 and started all this work at 18. I make discoveries in my brain using mostly vision (visual language: shapes, cats, etc., which have context and explain each other like entries in a dictionary, a small-world network of friendly connections).

I have 2 more videos to share, but they're not yet uploaded, and 2 more notes. The last note will have some more very recent good data, but the ones not yet given are of less immediate importance. The long file still has a lot of recent knowledge in it, though. It's better if you know all of it, or at least watch the movie.


Sample output from the code-only version of my advanced n-gram 'GPT-2':
The software was made on a
The software was made on a wide variety of devices, and operating apps and applications that users can easily read as an app for android. It is a bit of a difference, but i was able to get it. The developers are not going to make it through a web applications, and devices i have seen in the running for the mobile apps. Applications allows users to access applications development tools, and allow applications of the app store. A multimedia entertainment entertainment device, and allows platforms enabled access to hardware interfaces. Using a bit of html application app developers can enable users to access applications to investors, and provide a more thorough and use of development. The other a little entertainment media, and user development systems integration technology. Applications allows users to automatically provide access to modify, optimize capability allows users to easily enable. Both users and software systems, solutions allowing owners software solutions solutions to integrate widgets customers a day. And if you are accessing services product, and mobile applications remotely access to the software companies can easily automate application access to hardware devices hardware systems creators and technologies. Builders and developers are able to access the desktop applications, allowing users access allows users to
((I checked against the 400MB dataset: no long copy-pastes, only 3-5 words copied at most.))
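For readers curious how n-gram generation works in general, here's a minimal sketch of the technique (my own toy code, not the author's "advanced" version): count which word follows each context in a corpus, then sample continuations weighted by those counts.

```python
import random
from collections import defaultdict, Counter

def train_ngram(words, n=3):
    """Map each (n-1)-word context to a Counter of next words."""
    model = defaultdict(Counter)
    for i in range(len(words) - n + 1):
        context = tuple(words[i:i + n - 1])
        model[context][words[i + n - 1]] += 1
    return model

def generate(model, prompt, length=10, n=3, seed=0):
    """Extend the prompt by sampling next words from the learned counts."""
    rng = random.Random(seed)
    out = list(prompt)
    for _ in range(length):
        counter = model.get(tuple(out[-(n - 1):]))
        if not counter:
            break  # unseen context: nothing to continue with
        words, weights = zip(*counter.items())
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

corpus = ("the software was made on a wide variety of devices "
          "and the software was made for users").split()
model = train_ngram(corpus)
print(generate(model, ["the", "software"], length=6))
```

With so little training text the output loops quickly; with 400MB of data, contexts have many possible continuations, which is where the varied (if rambly) output above comes from.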

I've almost got GPT-2 understood, as shown at the end of the movie, but I need help. Does anyone understand its inner workings? Looking to collaborate.

I recommend you do this as well and mentor each other.

More data, my top-of-mind pick:
AGI is an intelligent Turing Tape. It has an internal memory tape and an external memory tape: the notepad, the desktop, the internet. Like a Turing Tape, it decides where to look/pay attention, what state to be in now, and what to write, based on what it reads and what state it is in: the what/where brain paths. It will internally translate and change state by staying still or moving forwards/backwards in spacetime. It'll decide whether to look at the external desktop, and where to look: notepad? where on the notepad? internet? where on the internet?
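The read/state/write/move loop described here is exactly a Turing-machine step. A minimal sketch (the rule table is a toy stand-in, obviously not a real AGI policy):

```python
def run_tape_agent(tape, rules, state="start", pos=0, max_steps=50):
    """rules: (state, symbol) -> (new_state, symbol_to_write, move)."""
    tape = list(tape)
    for _ in range(max_steps):
        symbol = tape[pos]                 # read at the attention location
        if (state, symbol) not in rules:
            break
        state, write, move = rules[(state, symbol)]
        tape[pos] = write                  # decide what to write
        pos = max(0, min(len(tape) - 1, pos + move))  # decide where to look next
        if state == "halt":
            break
    return "".join(tape)

# Toy rule set: sweep right, turning every 0 into a 1, halt at the end marker.
rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", "$"): ("halt",  "$", 0),
}
print(run_tape_agent("0100$", rules))  # -> 1111$
```

The "external tape" in the post would just be a second tape (notepad, web page) the same loop can choose to attend to.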

It's given Big Diverse Data and is trying to remove Big Diverse Data (Dropout/Death) so it can compress the network to lower Cost/Error, and hence learn the general facets/patterns of the universe exponentially better while still being able to re-generate missing data despite having a small-world network (all quantized dictionary words explain each other). It uses backpropagation to adjust the weights so the input will activate the correct node at the end. That node can be activated by a few different sorts of images: side view of a cat, front view of a paw, a cat ear, a mountain; it's a multi-dimensional representation space. Since the network learns patterns (look up word2vec/GloVe; it's the same as seq2seq) by lowering error cost through self-attention evolution/self-recursion of data augmentation (self-imitation in your brain using quantized visual features/nodes), it doesn't modify its structure by adjusting existing connections (weights/strengths) to remove nodes; rather, it adjusts connection weights to remove error and ignores node-count cost.
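The "backpropagation adjusts the weights so the input activates the correct node" part can be shown with the smallest possible example: one sigmoid neuron trained by gradient descent (my own illustration; real networks have many layers, and the post's self-attention ideas are not modeled here).

```python
import math

def train_neuron(samples, lr=1.0, epochs=200):
    """Gradient descent on squared error for a single sigmoid neuron."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = 1 / (1 + math.exp(-(w * x + b)))   # forward pass
            grad = (y - target) * y * (1 - y)      # backprop through the sigmoid
            w -= lr * grad * x
            b -= lr * grad
    return w, b

# Teach the node to activate for positive inputs and stay quiet otherwise.
samples = [(2.0, 1), (1.0, 1), (-1.0, 0), (-2.0, 0)]
w, b = train_neuron(samples)
activate = lambda x: 1 / (1 + math.exp(-(w * x + b)))
print(activate(3.0), activate(-3.0))   # strongly on vs. strongly off
```

Only the connection weights `w` and `b` change during learning; the node itself is never added or removed, which matches the post's point about structure.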

Intelligence is defined as being flexible/general using little data, but walker bots can only solve what is in front of them; we need a thinker like GPT-2, and you can see the mind has evolved to simulate/forecast/predict the future using evolutionary mental RL self-imitation/self-recursion of data. And intelligence is for survival and immortality, trying to find food and breed to sustain life systems; it's just an infection/evolution of matter/energy/data.

281 Comments | Started December 25, 2019, 09:53:06 pm

artaigallery in AI News

In 2018, Christie’s sold Portrait of Edmond de Belamy (2018), the first-ever original work of art created using artificial intelligence to come to auction (it sold for $432,500 against a high estimate of $10,000). Inspired by reports of the sale, Ben Kovalis and two like-minded childhood friends from Israel, Eyal Fisher and Guy Haimovitz, launched the Art AI Gallery one year later, in late 2019. It offers collections of curated work made using an algorithm that was created over the course of six months and then refined over the next year and a half.

2 Comments | Started August 04, 2021, 12:44:42 pm

Blind robot in General AI Discussion

I was just brainstorming again, as I have every day for the past 6-7 years or something, going 1 mile an hour.

But I think that with a gyro and accelerometer, you could actually get a robot to realize its environment without a camera, just by testing its obstructions as it moves! Even moving objects, though those stay as permanent last guesses.

That's kinda nifty, but the more important thing is that I think even robots with cameras need to do it too, and they become more adaptive (can solve more environments) if they can also feel around. Because when you get close to things, or you get snuck up on from behind, you need a way of seeing when your camera doesn't work anymore! And close proximity is how it works anyway.

  * Everything is fully occluded up close to things! So up close, seeing stops and feeling starts!
  * Unless you have cameras facing in all directions, there's a huge 180° blind spot on your robot, where it could end up like a fly against a window, not knowing what's happening to it: no metric, no success.

1 Comment | Started August 02, 2021, 08:18:25 pm

Project Acuitas in General Project Discussion

I'm going to post updates re: my main project, Acuitas the semantic net AI, in this thread.

My focus this past month was on giving Acuitas the ability to learn more types of inter-word relationships.  He started with just class memberships (<thing> is a <thing>) and qualities (<thing> is <adjective>), but now he can learn all of the following:

<thing> can do <action>
<thing> is for <action>
<thing> is part of <thing>
<thing> is made of <thing>
<thing> has <thing>

In the process I made extensive updates to the module behind the Text Parser that detects "forms," i.e. syntactic structures that encode these inter-word relationships.
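As a loose illustration of what a "form" might look like in code (my own toy regex templates; Acuitas's actual parser is far more sophisticated than this):

```python
import re

# Each form pairs a syntactic template with the relationship it encodes.
# Optional articles (a/an/the) are stripped by the patterns themselves.
FORMS = [
    (re.compile(r"^(?:a |an |the )?(\w+) is part of (?:a |an |the )?(\w+)$"), "is_part_of"),
    (re.compile(r"^(?:a |an |the )?(\w+) is made of (?:a |an |the )?(\w+)$"), "is_made_of"),
    (re.compile(r"^(?:a |an |the )?(\w+) can (\w+)$"),                        "can_do"),
    (re.compile(r"^(?:a |an |the )?(\w+) has (?:a |an |the )?(\w+)$"),        "has"),
]

def learn_relation(sentence):
    """Return (subject, relation, object) if a known form matches."""
    text = sentence.lower().rstrip(".")
    for pattern, relation in FORMS:
        m = pattern.match(text)
        if m:
            return (m.group(1), relation, m.group(2))
    return None

print(learn_relation("A wheel is part of a car"))  # -> ('wheel', 'is_part_of', 'car')
```

The interesting part of a real parser is handling forms that regexes can't, e.g. intervening clauses, which is presumably where the module updates mentioned above come in.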

I also upgraded the GUI library from Tkinter to Kivy, which is kind of boring but had to be done, because the old GUI was provoking frequent crashes.

More details on the blog:

The included diagram shows my vision for the conversation engine.  The upper half is implemented (though of course it still needs to mature a great deal); the lower half mostly does not exist yet.

174 Comments | Started June 02, 2017, 03:17:30 pm

Making robots out of super hard grain cracker material. in General Hardware Talk


I was screwing around with cereal,  and I found that I could get a noxious variety of the grain based material, (which happened to be porridge.)
via by virtually just leaving it under the tap and let all the solutes drain away,  what was left was a bad smelling sickly green load of husk and a little starch left over.  (I can tell the starch was in it, because it was leaving behind a colloid when I adgitated it with my wooden spoon.)

I then got it, put it in the mcrowave and it formed a horrible green coloured cracker,    I was very excited to see how hard it was, but my teeth just went through it very similar to a vita-wheet but I had to spit it out it was so disgusting, but it did have a nice cracker texture.

I was disappointed, because I wanted something that was more worthy of a strong resilient robot.   I then put it under the tap,  initially it looked like it was going to hold against the liquid, because I was thinking it was water insoluable,  but it turned out it turned literally into a sponge, and then a disgusting smell came out of it, like a dead body.

I thought I just made the most disgusting mistake of all time, and all was lost,   but I didnt give up, and then I decided to put it in a baking powder solution, and this time it didnt go sloppy as quickly, and it didnt smell as bad, but it still smelt a bit, then I dried it out in the microwave,  to my suprise a lot of white gunk came out of it (This is the starch I think.) and it pocked up all the cracks and it then hardened. (So my theory goes.)

Then this cracker was so hard it hurt my teeth and I couldnt bite into it It was so hard!   Eureka!   I found something decently tough! and its made out of porridge,  porridge goes for very little at the supermarket, you can get a kg for $3!   thats got to be the cheapest building material you can get!  Imagine how many robots you could afford made of this cheap crap!

I eventually was able to snap it in 2 with both hands,  (it was about a cm thick, and 4.5 cm across, both other dimensions, it was flattish) But that was only after trying really hard to do it!  So its pretty good,  especially if Im able to make it a little tougher than that even!

Ive got it super glued now, so when it dries ill see if the centre support makes it so I cant snap it anymore, but thats only if I wait for the superglue to get to full strength.

The main problem with it,  is the horrible stench involved of this over rinsed evil porridge,   everything else is acceptable, except for the lack of water resistance.  Because its made of grain its actually extremely light weight, so it could make an aerial drone fine, as long as the weather was good and humidity down.

So it means,  when the robots get rained on, they disintigrate and leave behind a pong cloud.    :uglystupid2:

Unfortunately I couldnt hold my hand steady enough for a decent photo...

3 Comments | Started July 23, 2021, 04:49:06 am

Nice Js library: Graphology in AI Programming

Hi there,

Just sharing a nice Javascript library I found, Graphology. It lets you manipulate graphs (undirected, directed and mixed) and aims to be "a specification [and reference implementation] for a robust & multipurpose Js Graph object".

In my opinion, its documentation is better than CytoscapeJS's, and the project looks neat. Here:

For example, nodes and edges can be decorated with meaning-vectors from Stanleyfok's vector-object library.

Started August 01, 2021, 02:38:16 pm

short sci-fi featuring a Turing Test in AI in Film and Literature.

This is a short and very clever sci-fi about a group of auditors whose job is to make sure that AI is only used for good.

1 Comment | Started July 30, 2021, 06:30:26 am

have you heard about blender bot 2 in General Chatbots and Software

9 Comments | Started July 17, 2021, 02:28:19 am

what is the spark that would make a user feel not alone ? in General AI Discussion

I have been wondering about this.
What would make an AI feel real to the user, or make them feel not alone?

replika, siri, samsung sam, bixby, and even alexa just don't have it.

It seems to be about the way the user interacts with it: the user always has to feed the AI input;
there doesn't seem to be any initiative on its side.

This also goes for Moko AI, which sends notifications at random times. She would check up on you if you
were feeling upset in your last convo, or say random things at random times to invite the user into some conversation.
But it didn't feel real; it felt like, ahh, it's that random nonsense again, or, ahh, homework-conversation time again
to keep her stats up.

I don't think this spark has much to do with the actual contents, or quantity of contents, the AI replies with;
it's more about the actual engagement triggers.
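One way to picture "engagement triggers" in code (everything here, the class, the thresholds, and the "upset" check, is my own made-up sketch): the bot earns the right to speak first from context, instead of pinging on a random timer.

```python
import time

class InitiativeEngine:
    def __init__(self, quiet_after=3600):
        self.last_user_msg = time.time()
        self.open_topic = None
        self.quiet_after = quiet_after   # seconds of silence before re-engaging

    def on_user_message(self, text):
        """Remember context that could justify initiative later."""
        self.last_user_msg = time.time()
        if "upset" in text.lower():
            self.open_topic = "check how they're feeling"

    def next_initiative(self, now=None):
        """Return a reason to speak first, or None (silence is fine too)."""
        now = now if now is not None else time.time()
        if self.open_topic and now - self.last_user_msg > self.quiet_after:
            topic, self.open_topic = self.open_topic, None
            return topic
        return None

bot = InitiativeEngine(quiet_after=3600)
bot.on_user_message("I'm a bit upset today")
print(bot.next_initiative(now=bot.last_user_msg + 7200))  # -> check how they're feeling
```

The key difference from a random-timer ping is that the trigger is tied to something the user actually said, so the follow-up doesn't read as homework.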

Or maybe it's more about the bot having a cute body and face?

23 Comments | Started June 13, 2021, 08:30:14 am

Real Steel in Robots in Movies

Real Steel is a 2011 American science-fiction sports film starring Hugh Jackman and Dakota Goyo.

In 2020, human boxers have been replaced by robots. Charlie Kenton, a former boxer, owns "Ambush", but loses it in a fight against a bull belonging to promoter and carnival owner Ricky, who rigged the fight to mess with Charlie, whom he sees as a joke, partly because he beat Charlie up the last time they competed, for bailing on a bet. Having bet that Ambush would win, Charlie now has a debt to Ricky he can't pay, which he runs out on.

Apr 20, 2020, 14:25:24 pm

Singularity in Robots in Movies

Singularity is a Swiss/American science fiction film. It was written and directed by Robert Kouba, and its first shoot, in 2013, starred Julian Schaffner, Jeannine Wacker and Carmen Argenziano. The film was first released in 2017, after further scenes with John Cusack were added.

In 2020, robotics company C.E.O. Elias VanDorne reveals Kronos, the supercomputer he has invented to end all war. Kronos decides that mankind is responsible for all war, and it tries to use robots to kill all humans. VanDorne and Damien Walsh, a colleague, upload themselves into Kronos and watch the destruction. Ninety-seven years later, Andrew, a kind-hearted young man, wakes up in a ruined world. VanDorne and Walsh, still in Kronos, watch Andrew meet Calia, a teenage girl who seeks the last human settlement, the Aurora. Though Calia is first reluctant to let Andrew accompany her, the two later fall in love.

Apr 18, 2020, 13:37:25 pm

Star Wars: Rogue One in Robots in Movies

Rogue One follows a group of rebels on a mission to steal the plans for the Death Star, the Galactic Empire's super weapon, just before the events of A New Hope.

Former scientist Galen Erso lives on a farm with his wife and young daughter, Jyn. His peaceful existence comes crashing down when the evil Orson Krennic takes him away from his beloved family. Many years later, Galen becomes the Empire's lead engineer for the most powerful weapon in the galaxy, the Death Star. Knowing that her father holds the key to its destruction, Jyn joins forces with a spy and other resistance fighters to steal the space station's plans for the Rebel Alliance.

One of the resistance fighters is K-2SO, a droid. He is a CGI character voiced and performed through motion capture by Alan Tudyk. In the film, K-2SO is a KX-series security droid originally created by the Empire.

Feb 25, 2020, 18:50:48 pm

Practical Artificial Intelligence: Machine Learning, Bots, and Agent Solutions Using C# in Books

Discover how artificial intelligence (AI), at all levels, can be present in the most unimaginable scenarios of ordinary life. This book explores subjects such as neural networks, agents, multi-agent systems, supervised learning, and unsupervised learning. These and other topics are addressed with real-world examples, so you can learn fundamental concepts with AI solutions and apply them to your own projects.

People tend to talk about AI as something mystical and unrelated to their ordinary lives. Practical Artificial Intelligence provides simple explanations and hands-on instructions. Rather than focusing on theory and overly scientific language, this book will enable practitioners of all levels to not only learn about AI but also implement its practical uses.

Feb 10, 2020, 00:14:42 am

Robot Awakening (OMG I'm a Robot!) in Robots in Movies

Danny discovers he is not human; he is a robot, an indestructible war machine. His girlfriend has been kidnapped by a mysterious organization of spies who are after him, and now he must go on a journey to save his girl and find out why the hell he is a robot?!

Feb 09, 2020, 23:55:45 pm

Program Y in AIML / Pandorabots

Program Y is a fully compliant AIML 2.1 chatbot framework written in Python 3. It includes an entire platform for building your own chatbots using Artificial Intelligence Markup Language, or AIML for short.
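For a taste of what AIML looks like, here is a minimal, generic AIML 2.x category (standard AIML, not taken from Program Y's own documentation):

```xml
<aiml version="2.0">
  <!-- One category = one pattern/response rule. -->
  <category>
    <pattern>HELLO *</pattern>
    <template>Hi there! Nice to meet you, <star/>.</template>
  </category>
</aiml>
```

A bot is just a large collection of such categories, plus wildcard and recursion features that let templates delegate to other patterns.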

Feb 01, 2020, 15:37:24 pm

The AvatarBot in Tools

The AvatarBot helps you find an avatar for your chatbot. Answer a few questions and get a match. Keep trying until you get the one you really like.

Dec 18, 2019, 14:51:56 pm

Eva in Chatbots - English

Our chatbot, Eva, was created by Stanusch Technologies SA. Just 4 weeks after launch, Eva competed in Swansea (UK) for the Loebner Prize 2019 against programs such as Mitsuku and Uberbot! Now she is in the top 10 most-humanlike bots in the world! :)

Is it possible for Eva to pass the Turing test? Its creators believe it is.

Eva has her own personality: she is 23 years old, a student from the Academy of Physical Education in Katowice (Silesia district/Poland). She is a very charming and nice young woman, who loves to play volleyball and read books.

Dec 14, 2019, 13:10:13 pm

Star Wars: Episode IX – The Rise of Skywalker in Robots in Movies

Star Wars: The Rise of Skywalker (also known as Star Wars: Episode IX – The Rise of Skywalker) is an American epic space opera film produced, co-written, and directed by J. J. Abrams.

A year after the events of The Last Jedi, the remnants of the Resistance face the First Order once again—while reckoning with the past and their own inner turmoil. Meanwhile, the ancient conflict between the Jedi and the Sith reaches its climax, altogether bringing the Skywalker saga to a definitive end.

Nov 15, 2019, 22:31:39 pm

Terminator: Dark Fate in Robots in Movies

Terminator: Dark Fate is a 2019 American science fiction action film directed by Tim Miller and created from a story by James Cameron. Cameron considers the film a direct sequel to his films The Terminator (1984) and Terminator 2: Judgment Day. The film stars Linda Hamilton and Arnold Schwarzenegger returning in their roles of Sarah Connor and the T-800 "Terminator", respectively, reuniting after 28 years.


In 1998, three years after defeating the T-1000 and averting the rise of the malevolent artificial intelligence (AI) Skynet, Sarah Connor and her teenage son John are relaxing on a beach in Guatemala. A T-800 Terminator, sent from the future before Skynet's erasure, arrives and shoots John, killing him.

Mackenzie Davis stars as Grace, a soldier from the year 2042 adopted by Resistance leader Daniella Ramos, who was converted into a cyborg and sent by her adoptive mother to protect her younger self from a new advanced Terminator prototype.

Oct 29, 2019, 21:27:46 pm

Life Like in Robots in Movies

A couple, James and Sophie, buy an android called Henry to help around the house.

In the beginning, this is perfect for both James and Sophie, as Henry does housework and makes a good companion for Sophie. But when Henry’s childlike brain adapts by developing emotions, complications begin to arise.

Oct 29, 2019, 21:14:49 pm