Ai Dreams Forum
Robotics => General Robotics Talk => Topic started by: Freddy on July 26, 2017, 08:26:25 pm
-
Japan has a unique fascination with androids and the quest to make robots more like humans. One of the country’s most original thinkers in this area is Professor Takashi Ikegami of the University of Tokyo. He has created androids filled with sensors and artificial intelligence software. The technology allows them to perceive the outside world and react to it as they see fit. Hello World host Ashlee Vance traveled to Tokyo to meet with Professor Ikegami and see his latest android creation. The robot they encounter flails about and makes strange gurgling noises as it responds to their movements and conversation. While it all looks rudimentary today, the technology is the precursor of what Ikegami predicts will be a new robotic life form that has its own culture, language, and desires. What could go wrong?
https://www.youtube.com/watch?v=qjmMHGJUFX4
-
I think it would be interesting if they placed a duplicate robot in front of the existing one, so the two could see and hear each other and perhaps reach each other's hands and arms. No safety worries there, since their hands don't open and close.
Would the robots learn and adapt faster if they could see and interact with a representation of themselves? If so, how, and how quickly? Would they learn to communicate with each other?