Would EMP-gunning a robot make you feel sad? in General Chat

I was looking it up, and I can't seem to find footage of someone doing it. Maybe it looks like a fly getting sprayed, with its legs randomly jutting everywhere. It doesn't have a soul, so it just **looks** like it's dying, because robots don't die; they are just like animated rocks at best.

11 Comments | Started April 07, 2020, 02:44:30 PM

Researching from home: Science stays social, even at a distance in Robotics News

Researching from home: Science stays social, even at a distance
7 April 2020, 7:00 pm

With all but a skeleton crew staying home from each lab to minimize the spread of Covid-19, scores of Picower Institute researchers are immersing themselves in the considerable amount of scientific work that can be done away from the bench. With piles of data to analyze; plenty of manuscripts to write; new skills to acquire; and fresh ideas to conceive, share, and refine for the future, neuroscientists have full plates, even when they are away from their, well, plates. They are proving that science can remain social, even if socially distant.

Ever since the mandatory ramp down of on-campus research took hold March 20, for example, teams of researchers in the lab of Troy Littleton, the Menicon Professor of Neuroscience, have sharpened their focus on two data-analysis projects that are every bit as essential to their science as acquiring the data in the lab in the first place. Research scientist Yulia Akbergenova and graduate student Karen Cunningham, for example, are poring over a huge amount of imaging data showing how the strength of connections between neurons, or synapses, matures and how that depends on the molecular components at the site. Another team, composed of Picower postdoc Suresh Jetti and graduate students Andres Crane and Nicole Aponte-Santiago, is analyzing another large dataset, this time of gene transcription, to learn what distinguishes two subclasses of motor neurons that form synapses of characteristically different strength.

Work is similarly continuing among researchers in the lab of Elly Nedivi, the William R. (1964) and Linda R. Young Professor of Neuroscience. Since heading home, Senior Research Support Associate Kendyll Burnell has been looking at microscope images tracking how inhibitory interneurons innervate the visual cortex of mice throughout their development. By studying the maturation of inhibition, the lab hopes to improve understanding of the role of inhibitory circuitry in the experience-dependent changes, or plasticity, and development of the visual cortex, she says. As she’s worked, her poodle Soma (named for the central body structure of a neuron) has been by her side.

Despite extra time with comforts of home, though, it’s clear that nobody wanted this current mode of socially distant science. For every lab, it’s tremendously disruptive and costly. But labs are finding many ways to make progress nonetheless.

“Although we are certainly hurting because our lab work is at a standstill, the Miller lab is fortunate to have a large library of multiple-electrode neurophysiological data,” says Picower Professor Earl Miller. “The datasets are very rich. As our hypotheses and analytical tools develop, we can keep going back to old data to ask new questions. We are taking advantage of the wet lab downtime to analyze data and write papers. We have three under review and are writing at least three more right now.”

Miller is inviting new collaborations regardless of the physical impediment of social distancing. A recent lab meeting held via the videoconferencing app Zoom included MIT Department of Brain and Cognitive Sciences Associate Professor Ila Fiete and her graduate student, Mikail Khona. The Miller lab has begun studying how neural rhythms move around the cortex and what that means for brain function. Khona presented models of how timing relationships affect those waves. While this kind of an interaction between labs of the Picower Institute and the McGovern Institute for Brain Research would normally have taken place in person in MIT’s Building 46, neither lab let the pandemic get in the way.

Similarly, the lab of Li-Huei Tsai, Picower Professor and director of the Picower Institute, has teamed up with that of Manolis Kellis, professor in the MIT Computer Science and Artificial Intelligence Laboratory. They’re forming several small squads of experimenters and computational experts to launch analyses of gene expression and other data to illuminate the fate of individual cell types like interneurons or microglia in the context of the Alzheimer’s disease-afflicted brain. Other teams are focusing on analyses of questions such as how pathology varies in brain samples carrying different degrees of genetic risk factors. These analyses will prove useful for stages all along the scientific process, Tsai says, from forming new hypotheses to wrapping up papers that are well underway.

Remote collaboration and communication are proving crucial to researchers in other ways, too, proving that online interactions, though distant, can be quite personally fulfilling.

Nicholas DiNapoli, a research engineer in the lab of Associate Professor Kwanghun Chung, is making the best of time away from the bench by learning about the lab’s computational pipeline for processing the enormous amounts of imaging data it generates. He’s also taking advantage of a new program within the lab in which Senior Computer Scientist Lee Kamentsky is teaching Python computer programming principles to anyone in the lab who wants to learn. The training occurs via Zoom two days a week.

As part of a crowded calendar of Zoom meetings, or “Zeetings” as the lab has begun to call them, Newton Professor Mriganka Sur says he makes sure to have one-to-one meetings with everyone in the lab. The team also has organized into small subgroups around different themes of the lab’s research.

The lab has also maintained its cohesion by banding together informally to create novel work and social experiences.

Graduate student Ning Leow, for example, used Zoom to create a co-working session in which participants kept a video connection open for hours at a time, just to be in each other’s virtual presence while they worked. Among a group of Sur lab friends, she read a paper related to her thesis and did a substantial amount of data analysis. She also advised a colleague on an analysis technique via the connection.

“I’ve got to say that it worked out really well for me personally because I managed to get whatever I wanted to complete on my list done,” she says, “and there was also a sense of healthy accountability along with the sense of community.”

Whether in person or via an officially imposed distance, science is social. In that spirit, graduate student K. Guadalupe "Lupe" Cruz organized a collaborative art event via Zoom for female scientists in brain and cognitive sciences at MIT. She took a photo of Rosalind Franklin, the scientist whose work was essential for resolving the structure of DNA, and divided it into nine squares to distribute to the event attendees. Without knowing the full picture, everyone drew just their section, talking all the while about how the strange circumstances of Covid-19 have changed their lives. At the end, they stitched their squares together to reconstruct the image.

Examples abound of how Picower scientists, though mostly separate and apart, are still coming together to advance their research and to maintain the fabric of their shared experiences.

Source: MIT News - CSAIL - Robotics - Computer Science and Artificial Intelligence Laboratory (CSAIL) - Robots - Artificial intelligence

Reprinted with permission of MIT News : MIT News homepage


Started Today at 12:00:11 PM

check out this anime bot design in Bot Conversations


4 Comments | Started April 07, 2020, 11:13:41 PM

Releasing full AGI/evolution research in General Project Discussion

This is most of all my notes, images, algorithms, etc., summarized/unified in their recent forms. I am 24 and started all this work at 18. I make discoveries in my brain using mostly vision: visual language (shapes, cats, etc.), which has context, each item explaining the others like entries in a dictionary, a small-world network of friendly connections.


I have 2 more videos to share, not yet uploaded, and 2 more notes. The last note will have some very recent good data, but the parts not yet given are of less immediate importance. The long file still does have a lot of recent knowledge in it. It's better if you know all of it, or at least watch the movie.


Output from the code-only version of my advanced n-gram 'GPT-2':
The software was made on a
The software was made on a wide variety of devices, and operating apps and applications that users can easily read as an app for android. It is a bit of a difference, but i was able to get it. The developers are not going to make it through a web applications, and devices i have seen in the running for the mobile apps. Applications allows users to access applications development tools, and allow applications of the app store. A multimedia entertainment entertainment device, and allows platforms enabled access to hardware interfaces. Using a bit of html application app developers can enable users to access applications to investors, and provide a more thorough and use of development. The other a little entertainment media, and user development systems integration technology. Applications allows users to automatically provide access to modify, optimize capability allows users to easily enable. Both users and software systems, solutions allowing owners software solutions solutions to integrate widgets customers a day. And if you are accessing services product, and mobile applications remotely access to the software companies can easily automate application access to hardware devices hardware systems creators and technologies. Builders and developers are able to access the desktop applications, allowing users access allows users to
((I checked against the 400MB training data; there are no long copy-pastes, only 3-5 words at most))
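As a rough illustration of the kind of n-gram generation described above (this is a minimal textbook sketch, not the poster's 400MB system; all names are made up), a word-level model can be built by mapping each (n-1)-word prefix to the words seen after it:

```python
import random
from collections import defaultdict

def build_ngram_model(text, n=3):
    """Map each (n-1)-word prefix to the list of words observed after it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - n + 1):
        model[tuple(words[i:i + n - 1])].append(words[i + n - 1])
    return model

def generate(model, seed, n=3, length=20):
    """Extend the seed by repeatedly sampling a word that followed
    the current (n-1)-word suffix in the training text."""
    out = list(seed)
    for _ in range(length):
        candidates = model.get(tuple(out[-(n - 1):]))
        if not candidates:
            break  # unseen prefix: stop rather than invent words
        out.append(random.choice(candidates))
    return " ".join(out)
```

Because each step conditions on only a couple of words, long verbatim copies of the training text are unlikely, which matches the "only 3-5 words at most" observation.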

I almost got GPT-2 understood, as shown at the end of the movie, but need help. Does anyone understand its inner workings? Looking to collaborate.

I recommend you do this as well and mentor each other.

more data by my top-of-mind pick:
AGI is an intelligent Turing Tape. It has an internal memory tape and an external memory tape: the notepad, the desktop, the internet. Like a Turing Tape, it decides where to look/pay attention, what state to be in now, and what to write, based on what it reads and what state it is in (the what/where brain paths). It will internally translate and change state by staying still or moving forwards/backwards in spacetime. It'll decide whether to look at the external desktop, and where to look: the notepad? Where on the notepad? The internet? Where on the internet?
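The Turing-tape analogy can be made concrete with a minimal sketch. This is not the poster's system, just a textbook Turing machine: a rule table maps (state, symbol read) to (next state, symbol to write, head move), exactly the read/state/write loop described above.

```python
def run_turing_machine(rules, tape, state="start", max_steps=1000):
    """Minimal Turing machine. rules maps (state, symbol) to
    (next_state, symbol_to_write, head_move 'L' or 'R')."""
    cells = dict(enumerate(tape))  # sparse tape; '_' is the blank symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example rule table: flip every bit, then halt at the first blank.
flip_rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
```

The "external tape" in the analogy would just be a second rule-driven tape (notepad, desktop, internet) read and written the same way.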

It's given Big Diverse Data and is trying to remove Big Diverse Data (Dropout/Death) so it can compress the network to lower Cost/Error and hence learn the general facets/patterns of the universe exponentially better, while still being able to re-generate missing data despite having a small-world network (all quantized dictionary words explain each other). It uses backpropagation to adjust the weights so the input will activate the correct node at the end. That node can be activated by a few different sorts of images: side view of a cat, front view of a paw, a cat ear, a mountain; it's a multi-dimensional representation space. Since the network learns patterns (look up word2vec/GloVe; it's the same idea as seq2seq) by lowering error cost via self-attention and the evolution/self-recursion of data augmentation (self-imitation in your brain using quantized visual features/nodes), it doesn't modify its structure by removing nodes through existing connections (weights/strengths); rather, it adjusts connection weights to remove error and ignores node-count Cost.

Intelligence is defined as being flexible/general using little data, but walker bots can only solve what is in front of them; we need a thinker like GPT-2. You can see the mind has evolved to simulate/forecast/predict the future using evolutionary mental RL, self-imitation, and self-recursion of data. And intelligence is for survival and immortality, trying to find food and breed to sustain life systems; it's just an infection/evolution of matter/energy, a data evolution.

45 Comments | Started December 25, 2019, 09:53:06 PM

Terra Sentia - Robot in Robotics News

A robot with the task of roving through rows of corn to help provide farmers with better data for increased production, etc.


10 Comments | Started April 05, 2020, 05:55:26 AM

Has anyone read Wolfram's "A New Kind of Science"? in General Chat


I'm about to read Stephen Wolfram's 2002 book A New Kind of Science. Has anyone read it?

37 Comments | Started April 06, 2020, 07:42:11 AM

XKCD Comic : Homemade Masks in XKCD Comic

Homemade Masks
6 April 2020, 5:00 am

I'm going to change the sign so the pole is horizontal and the sign is mounted on the front like a plunger, so I can carry it around like a lance to gently push people back if they try to approach.

Source: xkcd.com

Started April 07, 2020, 12:01:42 PM

analogue feedback hitting petahz in General AI Discussion

If you can somehow get rid of the forward power in its feedback line, your speed is the rate of going through the logic gates; it's actually the Hz of the recursion.
So you don't need a clock to make a computer, and if the logic chain (just think of AND gates) can make it back before a nanometre, you get a petahertz.
It also counts as having memory without registers.

Should work for perceptrons too. Hopefully you'll try it. Homemade style, better than nuthin'.
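Logic gates whose output feeds back into their input with no clock is essentially a ring oscillator, so the achievable frequency can be estimated with the standard formula (this is a back-of-the-envelope sketch using textbook physics, not anything from the post):

```python
def ring_oscillator_freq_hz(n_inverters, gate_delay_s):
    """An odd-length ring of inverters oscillates with period
    2 * n_inverters * gate_delay_s: each node toggles once per
    half-period as the edge travels around the loop."""
    if n_inverters % 2 == 0:
        raise ValueError("need an odd number of inverters to oscillate")
    return 1.0 / (2 * n_inverters * gate_delay_s)
```

With three 1 ns gates this gives roughly 167 MHz; reaching a petahertz would require the whole loop to be traversed in about half a femtosecond, far below any real gate delay, which is why the "petahertz" claim should be treated as a thought experiment.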

1 Comment | Started April 07, 2020, 09:10:42 AM

Understanding Mice. in AI News


Mice have feelings too and now there is artificial intelligence software that can distinguish the facial expressions of mice to reveal how they're feeling.

Coming up, the study of mental illness in rats.

1 Comment | Started April 06, 2020, 11:56:30 PM

Arrows keyboard shortcuts in General Chat

Did you know that on Windows, ALT-26 produces a right arrow character?

ALT-24  ↑
ALT-25  ↓
ALT-26  →
ALT-27  ←

If you like syntax as I do, you'll probably find it very useful!
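The glyphs behind ALT-24 through ALT-27 come from the old IBM code page 437, but the same arrows exist as Unicode code points, so they can be produced portably in any language, not just via Windows ALT codes. A small sketch:

```python
# Arrow glyphs by Unicode code point (same characters the
# legacy Windows ALT codes 24-27 produce).
ARROWS = {
    24: "\u2191",  # ↑ UPWARDS ARROW
    25: "\u2193",  # ↓ DOWNWARDS ARROW
    26: "\u2192",  # → RIGHTWARDS ARROW
    27: "\u2190",  # ← LEFTWARDS ARROW
}

for alt_code, glyph in ARROWS.items():
    print(f"ALT-{alt_code}: {glyph}")
```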

2 Comments | Started April 06, 2020, 10:38:04 AM
Star Wars: Rogue One

Star Wars: Rogue One in Robots in Movies

Rogue One follows a group of rebels on a mission to steal the plans for the Death Star, the Galactic Empire's super weapon, just before the events of A New Hope.

Former scientist Galen Erso lives on a farm with his wife and young daughter, Jyn. His peaceful existence comes crashing down when the evil Orson Krennic takes him away from his beloved family. Many years later, Galen becomes the Empire's lead engineer for the most powerful weapon in the galaxy, the Death Star. Knowing that her father holds the key to its destruction, Jyn joins forces with a spy and other resistance fighters to steal the space station's plans for the Rebel Alliance.

One of the resistance fighters is K-2SO, a droid. He is a CGI character voiced and performed through motion capture by Alan Tudyk. In the film, K-2SO is a KX-series security droid originally created by the Empire.

Feb 25, 2020, 18:50:48 pm
Practical Artificial Intelligence: Machine Learning, Bots, and Agent Solutions Using C#

Practical Artificial Intelligence: Machine Learning, Bots, and Agent Solutions Using C# in Books

Discover how Artificial Intelligence (AI) at all levels can be present in the most unimaginable scenarios of ordinary life. This book explores subjects such as neural networks, agents, multi-agent systems, supervised learning, and unsupervised learning. These and other topics are addressed with real-world examples, so you can learn fundamental concepts with AI solutions and apply them to your own projects.

People tend to talk about AI as something mystical and unrelated to their ordinary lives. Practical Artificial Intelligence provides simple explanations and hands-on instructions. Rather than focusing on theory and overly scientific language, this book enables practitioners of all levels to not only learn about AI but also implement its practical uses.

Feb 10, 2020, 00:14:42 am
Robot Awakening (OMG I'm a Robot!)

Robot Awakening (OMG I'm a Robot!) in Robots in Movies

Danny discovers he is not human: he is a robot, an indestructible war machine. His girlfriend has been kidnapped by a mysterious organization of spies who are after him, and now he must go on a journey to save his girl and find out why the hell he is a robot?!

Feb 09, 2020, 23:55:45 pm
Program Y

Program Y in AIML / Pandorabots

Program Y is a fully compliant AIML 2.1 chatbot framework written in Python 3. It includes an entire platform for building your own chat bots using Artificial Intelligence Markup Language, or AIML for short. 
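To give a flavor of how AIML-style matching works, a pattern with `*` wildcards can be compiled to an anchored regular expression. This is a toy sketch, not Program Y's actual API; the rule table and function names here are made up for illustration (real Program Y loads rules from `.aiml` files).

```python
import re

def pattern_to_regex(pattern):
    """Compile an AIML-style pattern, where '*' matches one or more
    words, into an anchored case-insensitive regex."""
    parts = [r"(\S+(?: \S+)*)" if tok == "*" else re.escape(tok)
             for tok in pattern.split()]
    return re.compile("^" + " ".join(parts) + "$", re.IGNORECASE)

# Hypothetical rule table mapping patterns to response templates.
RULES = {
    "HELLO *": "Hi there! You said: {0}",
    "WHAT IS AIML": "Artificial Intelligence Markup Language.",
}

def respond(text):
    """Return the first rule whose pattern matches the input."""
    for pattern, template in RULES.items():
        m = pattern_to_regex(pattern).match(text.strip())
        if m:
            return template.format(*m.groups())
    return "I have no answer for that."
```

Real AIML adds pattern priorities, `<srai>` recursion, and per-user state on top of this basic wildcard matching.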

Feb 01, 2020, 15:37:24 pm
The AvatarBot

The AvatarBot in Tools

The AvatarBot helps you in finding an Avatar for your Chatbot. Answer a few questions and get a match. Keep trying to get the one you really like.

Dec 18, 2019, 14:51:56 pm

Eva in Chatbots - English

Our chatbot - Eva - was created by Stanusch Technologies SA. Eva, just 4 weeks after launch, competed in Swansea (UK) for the Loebner Prize 2019 with programs such as Mitsuku and Uberbot! Now, she is in the top 10 most-humanlike bots in the world! :)

Is it possible for Eva to pass the Turing test? Its creators believe it is.

Eva has her own personality: she is 23 years old, a student at the Academy of Physical Education in Katowice (Lower Silesia district/Poland). She is a very charming and nice young woman who loves to play volleyball and read books.

Dec 14, 2019, 13:10:13 pm
Star Wars: Episode IX – The Rise of Skywalker

Star Wars: Episode IX – The Rise of Skywalker in Robots in Movies

Star Wars: The Rise of Skywalker (also known as Star Wars: Episode IX – The Rise of Skywalker) is an American epic space opera film produced, co-written, and directed by J. J. Abrams.

A year after the events of The Last Jedi, the remnants of the Resistance face the First Order once again—while reckoning with the past and their own inner turmoil. Meanwhile, the ancient conflict between the Jedi and the Sith reaches its climax, altogether bringing the Skywalker saga to a definitive end.

Nov 15, 2019, 22:31:39 pm
Terminator: Dark Fate

Terminator: Dark Fate in Robots in Movies

Terminator: Dark Fate is a 2019 American science fiction action film directed by Tim Miller and created from a story by James Cameron. Cameron considers the film a direct sequel to his films The Terminator (1984) and Terminator 2: Judgment Day. The film stars Linda Hamilton and Arnold Schwarzenegger returning in their roles of Sarah Connor and the T-800 "Terminator", respectively, reuniting after 28 years.


In 1998, three years after defeating the T-1000 and averting the rise of the malevolent artificial intelligence (AI) Skynet, Sarah Connor and her teenage son John are relaxing on a beach in Guatemala. A T-800 Terminator, sent from the future before Skynet's erasure, arrives and shoots John, killing him.

Mackenzie Davis stars as Grace: a soldier from the year 2042 who was converted into a cyborg and sent back by her adoptive mother, Resistance leader Daniella Ramos, to protect Ramos's younger self from a new, advanced Terminator prototype.

Oct 29, 2019, 21:27:46 pm
Life Like

Life Like in Robots in Movies

A couple, James and Sophie, buy an android called Henry to help around the house.

In the beginning, this is perfect for both James and Sophie, as Henry does housework and makes a good companion for Sophie. But when Henry's childlike brain adapts by developing emotions, complications begin to arise.

Oct 29, 2019, 21:14:49 pm
I Am Mother

I Am Mother in Robots in Movies

I Am Mother is a 2019 Australian science fiction thriller film directed by Grant Sputore, from a screenplay by Michael Lloyd Green. Starring Clara Rugaard, Luke Hawker, Rose Byrne, and Hilary Swank, the film follows Daughter, a girl in a post-apocalyptic bunker, being raised by Mother, an android supposed to aid in the repopulation of Earth.

Sep 30, 2019, 21:39:16 pm
Mitsuku wins 2019 Loebner Prize

Mitsuku wins 2019 Loebner Prize in Articles

For the fourth consecutive year, Steve Worswick’s Mitsuku has won the Loebner Prize for the most humanlike chatbot entry to the contest. This is the fifth time that Steve has won the Loebner Prize. The Loebner Prize is the world’s longest running Turing-Test competition and has been organised by AISB, the world’s oldest AI society, since 2014.

Sep 30, 2019, 21:18:50 pm