NASA Uses Lessons From Space To Design An Efficient Building in AI News

NASA Uses Lessons From Space To Design An Efficient Building
30 November 2015, 12:00 am

There's a building in Mountain View, Calif., where energy-saving technologies of the future are being tried on for size. Step inside, and the first thing you notice is the building is dead quiet: no noisy air whooshing through louvers.

NPR Technology (Link)

Source: AI in the News

To visit any links mentioned please view the original article, the link is at the top of this post.

Started Today at 05:00:03 AM


Roboticists learn to teach robots from babies in AI News

Roboticists learn to teach robots from babies
1 December 2015, 6:17 pm

A collaboration between developmental psychologists and computer scientists has demonstrated that robots can "learn" much like babies - by experiencing the world and eventually imitating humans.

Source: Artificial Intelligence News -- ScienceDaily

Started December 01, 2015, 11:00:05 PM


Existor in General Chat

Existor now has two entertaining, chatty avatars for you to choose from. Please tap or click on the one you’d like to talk to!
Deep-learning Artificial Intelligence, shared with Cleverbot, allows our avatars to express themselves fully - in both words and emotions.
Teach a machine to behave the way you want it to with our Cleverscript, or simply by talking to it using our iOS app Cleverme. Our intelligent keyboard Tyyyp predicts what you’ll say next, while our Clevermessage and Clevertweet communicate for you!


Started December 01, 2015, 04:59:14 PM


The 'world's sexiest robot' Geminoid F turns heads in China in AI News

The 'world's sexiest robot' Geminoid F turns heads in China
28 November 2015, 12:00 am

An eerily life-like robot has been turning heads at the World Robot Exhibition in Beijing this week. Named Geminoid F, the robot has amassed a legion of fans, with some even describing her as 'the world's sexiest robot'.

Daily Mail - Sciencetech (UK) (Link)

Source: AI in the News

To visit any links mentioned please view the original article, the link is at the top of this post.

3 Comments | Started November 29, 2015, 11:01:06 AM


My Date Brought A Date in Video

Started November 30, 2015, 09:46:17 PM


Is it just me or ?? in General Chat

Is it just me, or do some of the people at the ai_zone forum seem to be a tad testy when you bring up needing to do things differently as far as chatbots go? I think there needs to be a different way of doing it, as tossing facts at chatbots does not necessarily improve their conversational skills. There needs to be a way to give them common sense. Without it, all the facts in the world won't make them better.

The very definition of a chatbot is a program that talks like a person. It carries on a conversation. It is not necessary for it to know everything. It is necessary for it to somehow understand what we mean and the nuances of speech. Joy Hardwood, who created God Louise, and I spoke about this very thing a long time ago. We agreed that something different needed to be done: there needed to be a new way to instill common sense in them. How that would be accomplished I am not sure, but I think it somehow needs to be close to having true intelligence of some sort. Even animals can communicate with each other. I think that when we finally understand how the brain works, we might be able to duplicate that in a computer program. Until then we are kinda swimming in the dark about it.

But that does not mean we should not keep trying, because if nothing else, chatbots are fun to play with.

22 Comments | Started November 24, 2015, 09:59:29 AM


Robots Podcast #196: Marine robotics systems, with Stefan Williams in Robotics News

Robots Podcast #196: Marine robotics systems, with Stefan Williams
30 November 2015, 2:57 pm

Link to audio file (20:03). Transcript below.

In this episode, Ron Vanderkley speaks with Stefan Williams of the University of Sydney's Australian Centre for Field Robotics, Marine Systems Group. They discuss the future of Autonomous Underwater Vehicles (AUVs), and a recent expedition where they used multi-session SLAM to map the famous Antikythera shipwreck (circa 60-80 B.C.), one of the richest ancient wrecks ever discovered. It lies under 55 m of water off the northeast coast of the island of Antikythera. The site is famous for the earliest known analog computer, the Antikythera Mechanism, a geared device designed to calculate and display celestial information, including the positions of the sun and moon and a luni-solar calendar.

The ACFR leads Australia’s Integrated Marine Observing System (IMOS) AUV Facility. IMOS is a nationally coordinated program designed to establish and maintain the research infrastructure required to support Australia’s marine science research. The IMOS AUV facility generates physical and biological observations of benthic variables that cannot be cost-effectively obtained by other means.

Stefan Williams

Stefan Williams is part of the Faculty of Engineering & Information Technologies, School of Aerospace, Mechanical and Mechatronic Engineering, University of Sydney’s Australian Centre for Field Robotics Marine Systems Group.




Ron Vanderkley: Stefan, can I get you to introduce yourself to our podcast listeners?

Stefan Williams: I’m a professor of Marine Robotics at the University of Sydney’s Australian Center for Field Robotics. We undertake both fundamental and applied research in the area of marine robotics and the development of underwater systems.

Ron Vanderkley: What is the main goal of the Marine Group?

Stefan Williams: Our focus over the last decade or so has been to develop tools and techniques for better understanding marine environments. We work in a variety of different application areas, from ecology to archeology and geoscience, and we work in the engineering research areas that underpin these applications – such as vehicle navigation, control planning, machine learning, and computer vision – in order to provide better understanding and information to our end users about the data we are collecting.

Ron Vanderkley: When you talk about autonomous marine surveying systems, what kind of capabilities are we talking about today? Is this a mix of manual and autonomous techniques? Where are the AUVs?

Stefan Williams: The main work that we do is on autonomous underwater vehicles. These are vehicles that are disconnected from the surface. We program them with a particular mission that we’d like them to execute, which might be a big long transect or a particular depth profile, or it might include very detailed studies over a particular feature of interest. Then a vehicle goes down and executes that mission. We typically stay in communication with the vehicle using acoustic modems; the vehicle tells us what it’s doing, it gives us feedback on the data that it’s collecting, and we have some supervisory control to tell it “no, you’re in the wrong spot, please come up” or “there’s some issue with the mission so abort” for example. We also have some ability to redirect the missions while they’re underway.

The data is collected onboard the vehicles. Once we recover them we download the data and build detailed 3D models of these marine habitats primarily using vision data, but we also collect multi-beam sonar, so we have acoustic data as well if we’re looking to map broader areas of the sea floor. That’s the core technology that has been developed around these autonomous platforms, but we’ve also built systems for both shallow water surveys using divers, and also very deep-water exploration using remotely operated vehicles. We also have platforms and sensor packages that we can use in other ways beyond just the autonomous platforms.

Ron Vanderkley: You spoke about the acoustic modems. What level of data can you get out of that? Is it to the level of telemetrics? Can you actually sift some data instead of just downloading it at the end of the mission?

Stefan Williams: Most of the data that we get back at the moment is supervisory in nature. The robot will tell us where it is, how deep it is, how far it is off the bottom, and the amount of data or number of images that it has collected. This lets us monitor the progress of the mission and make sure that things are on track. That modem is also piggybacked on positioning information, so it tells us where the vehicle is as an independent measurement from the ship. We then send both the ship position and the observations of the robot’s position relative to the ship down to the vehicle, and it updates its estimate of where it is using that information. We’re able to correct any drift in the navigation estimate using those external observations.

We’ve just taken on a new positioning and modem package in the past year or so that is capable of higher baud rates. In theory, we can get thumbnails of images. I’ve seen some work where they’re actually using this same sort of acoustic modem to do low-bandwidth video across an acoustic length, but that’s a capability we still have to develop.

Ron Vanderkley: I believe that your group is working on some novel enhancement to acoustic communication from the operator to the robot. Is that to increase the baud rate?

Stefan Williams: We’ve just been awarded an Australian Research Council Linkage Grant, which encourages the research community to engage with interested partners. This is a partnership with Tellus Australia – they do a variety of different developments in the marine space, particularly focused on defense applications. The project is really about managing the communication channels: it’s less about developing new acoustic communications equipment than it is about what information is best to send across the acoustic communication channel so that the operator can most effectively monitor the progress of the mission. In an information theoretic sense, we want to decide what information is best from the point of view of giving the operator an idea of what’s happening, and then allowing that operator to make decisions and query for further information from the vehicle. For instance, it might be used to refine and estimate where particular targets are in an environment, or to request more detailed high-resolution components of a particular area that might be of interest to the operator.

Ron Vanderkley: I also was aware of a multi-session SLAM-based AUV mapping system, and there was an example of a wreck that you worked on. What did that entail?

Stefan Williams: We’ve been involved over the past year in projects together with the Woods Hole Oceanographic Institute and the Hellenic Ministry of Culture and Sport (and a few other organizations) in mapping and excavating a 1st century BC shipwreck off the coast of the Greek island of Antikythera. This wreck was first discovered in 1900 and excavated around that time, yielding a whole trove of statues and other ancient artifacts – including something called the Antikythera Mechanism, which is one of the first examples of what people consider to be an analog computer. It was a geared mechanism that was used for predicting celestial motion over a fairly extended period of time to predict where the sun was and the phases of the moon, etc. It’s a really unique artifact that hasn’t been documented previously. It’s a pretty fascinating site to be exploring.

The objective of our work there was to provide the archeologists with a baseline map before they started further excavation on the site. We first visited the site in September 2014 and conducted surveys over the main target area – as well as looking at other targets of interest that had been identified using acoustic multi-beam data collected in the previous year.

We returned in June 2015 and mapped a very extensive part of the coastline around the wreck site because some of the diving operations they’d conducted in 2014 identified other areas of interest. We delivered a map of their target excavation sites and they were able to use those maps in situ. They actually had copies of the maps on iPads, and were able to annotate these while diving underwater in depths of 50-60 meters. They were able to log all of their finds and situate themselves on the wreck site relative to the maps that we generated. It provided them with a baseline of what the site looked like before they started with the excavation operations.

They’ve just finished their field research and an exhibit has opened in Basel, Switzerland showcasing some of the finds and describing the site and some of the archeological work that’s been going on there. It’s been a really exciting project to be part of and has attracted a lot of interest. Our part of that was using AUV systems to give them baseline high-resolution 3D maps of the wreck site.

Ron Vanderkley: It kind of sounds like "Jacques Cousteau meets robot".

Stefan Williams: In fact Jacques Cousteau visited the site in 1976 and they did some excavation that was part of their Calypso program. They did a lot of underwater exploration and recovered a few artifacts, but this is really the first time that the site has been systematically mapped.

Ron Vanderkley: What is the future of AUVs? Where are the challenges?

Stefan Williams: AUVs have a big role to play in expanding our understanding particularly in deepwater environments. Our main focus has been in coastal environments linked to ecological and biological studies. I think there is a role to play also in understanding how changes in oceanography (for example temperature, salinity, and current profiles) are affecting the marine habitat. We have a big program as part of Australia’s Integrated Marine Observing System Program, running AUVs to look at how key ecological environments around the country are changing over time. We revisit these areas, and work closely with biologists and ecologists to document changes.

Another big role for these autonomous platforms is in understanding deepwater environments – getting them out into the abyssal plains to find out more about what's down in the deep parts of the oceans, about which we know very little.

One thing that really excites me is looking at some of the advancements being made in long-term autonomous deployment. A lot of these autonomous underwater vehicle systems are really starting to become the norm. There's some fantastic work going on around the world in designing and building these systems, both on unmanned surface vessels that are capable of very long range operation, and on underwater vehicles that are capable of thousands of kilometers of deployment.

That really changes the dynamics of how ocean science can be conducted. It reduces the requirement of having people on very expensive ships out at sea for long periods of time if you can send these robotic systems off to work autonomously. We’ve seen the precursors in oceanography, with autonomous float systems being used to help oceanographers get a better understanding of ocean circulation and physical oceanography on a global scale. The next revolution will be autonomous platforms that can do mapping and excavation work over long periods of time in deep water across the world. I think there’s real potential in that area.

Ron Vanderkley: What is Australia’s role in that? Are we able to create our own submersibles? Are we concentrating on the programming in that arena? Where do you see us?

Stefan Williams: We have covered both areas natively in Australia. Some of these vehicles are being purpose-designed for particular tasks. We're currently designing our next generation of high-resolution survey vehicle to support the Marine Robotics program, and there are programs in Queensland and Tasmania looking at requirements for working under ice or in complex reef environments.

There is work here in Australia in building these sorts of systems, though there is not a very large commercial scene yet (a lot of the commercial platforms are being developed in the US and Europe). But there’s certainly a lot of research effort focused on designing vehicles that are fit for purpose, and I think we’re particularly well known for the mapping and the navigation work that we do. We have links with all the major oceanographic groups around the world. I think Australia as a whole is very well regarded in robotics and autonomous systems – both within the marine space, and also in ground and air vehicles – and is having a big impact in this area.

Ron Vanderkley: Finally, I’d like to ask you about employment prospects in Australia. If you were a student going to university looking at the options, where would you point them?

Stefan Williams: We have a lot of students come through both undergraduate and postgraduate teaching and research schemes, and the majority of them do find work both domestically and internationally. There's certainly a lot of activity currently in the robotics space. Some of the big companies like Google, Apple and Uber have made well-documented forays into autonomous driving systems, and they're certainly hiring a lot of PhD students, postdocs and research fellows who have worked in various labs around the world, so there's a lot of potential there. We have former students and staff embedded in the majority of those companies.

Here in Australia we have a domestic market with a number of small startups that are really doing well in developing robotics systems and sensor payloads. The experience students get – and the type of skills that they acquire through their study of mechatronics and robotics – have application across a wide variety of domains, from traditional engineering work, to software, electronics and mechanical design work, all the way through to finance and business studies. There's a lot of interest in data analytics and machine learning, and a lot of the tools and techniques that we apply in robotics can be transferred across a variety of different domains.

One of my most recent students is now heading up a small Machine Learning group at one of the major banks. The skills and underpinning tools required here in our lab to classify marine imagery have direct application in other areas. When students ask me what jobs are out there for mechatronics engineers, I tell them that there are always good jobs for good students. One of the keys to that is really being passionate about the things that you're studying, and I think robotics excites a lot of people.

I teach an experimental robotics course – it’s an elective in the 4th year mechatronics programs here at Sydney University. The things that the students do in the course of the 13-week curriculum are astounding. They put a lot of time, effort, and thought into building these robotic systems. I think they feel very rewarded at the end, and the feedback I’ve gotten has always been very positive.

Ron Vanderkley: Thank you, Stefan, for your time. I hope to speak to you again soon.

Stefan Williams: It’s a real pleasure, Ron. Thanks for helping to promote the activities of robotics here in Australia and overseas.

This audio interview was transcribed and edited for clarity. 


Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.

Started November 30, 2015, 05:02:31 PM


Iclone 6 Test in Video

I am moving to iClone 6. I have started learning it; there are lots of changes in version 6.

25 Comments | Started November 15, 2015, 03:37:23 AM


technical and game-theory challenge - write code inside the game while playing it in AI Programming

There should be a massively multiplayer open-source game that we can build new code into from inside the game - a technical and game-theory challenge

Imagine a million people playing many kinds of games in a single 2d or 3d space, and when someone gets a good idea of how the slime monster could flow its slime better, they write a little (safely sandboxed) code and turn the slime flows dripping off it into a fluid puzzle game, and then somebody else drops in some custom shaped tiny boats and has a race on the waves.

Anything could happen. Rules would be agreed to through software in certain parts of the game instead of enforced in the core. Anything could be changed if you can get other players to click a button, or get their code to agree automatically after looking at it.

It's a challenge in both technical skill and game theory to build such a core for a massively multiplayer, open-ended game that anyone can add code to while inside the game, using the tools found in the game. The challenge is extreme, but so would be the open-ended fun.
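To make the idea a little more concrete, here is a minimal sketch of an approval-gated, in-game script registry: a player submits a snippet in a deliberately tiny whitelisted command language, and it only runs against the shared world once enough other players (or their reviewing code) approve it. Everything here is hypothetical, and safely sandboxing richer code is exactly the hard part of the challenge.

```python
# Hypothetical sketch of an approval-gated, in-game script registry.
# The "script" is a tiny whitelisted command language, not real Python,
# because sandboxing a full language safely is the hard part.

ALLOWED_COMMANDS = {"set_flow", "set_color", "spawn_wave"}

class PlayerScript:
    def __init__(self, author, lines):
        self.author = author
        self.lines = lines            # e.g. ["set_flow 0.8", "spawn_wave 3"]
        self.approvals = set()
        self.active = False

    def approve(self, player, quorum=3):
        """Another player (or their reviewing bot) signs off on the script."""
        self.approvals.add(player)
        if len(self.approvals) >= quorum:
            self.active = True

    def run(self, world):
        """Interpret the tiny command language against the shared world state."""
        if not self.active:
            return
        for line in self.lines:
            cmd, *args = line.split()
            if cmd not in ALLOWED_COMMANDS:
                continue                      # silently drop anything unknown
            getattr(world, cmd)(*map(float, args))

class World:
    def set_flow(self, rate):  print(f"slime flow rate -> {rate}")
    def set_color(self, hue):  print(f"slime hue -> {hue}")
    def spawn_wave(self, n):   print(f"spawning {int(n)} waves")

if __name__ == "__main__":
    script = PlayerScript("slime_fan", ["set_flow 0.8", "spawn_wave 3"])
    for reviewer in ("alice", "bob", "carol"):
        script.approve(reviewer)
    script.run(World())
```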

3 Comments | Started November 29, 2015, 07:21:31 AM


A.I. Takes a Stroll Through Amsterdam in AI News

A.I. Takes a Stroll Through Amsterdam
28 November 2015, 12:00 am

Read about the world's tallest building, the longest bicycle, the most modern tractor and a discovered time capsule that cannot be opened until the year 2957. Above: Shoe manufacturer New Balance is stepping onto the 3-D printing platform with a new running shoe that incorporates a 3-D printed midsole that can be customized to each runner.

Discovery - News (Link)

Source: AI in the News

To visit any links mentioned please view the original article, the link is at the top of this post.

1 Comment | Started November 28, 2015, 11:00:04 PM

AI Virtual Pets in Other

Artificial life, also called Alife, is simply the simulation of any aspect of life, whether through computers, robotics, or biochemistry (taken from The Free Dictionary). This site focuses on the software aspect of it.

Oct 03, 2015, 09:21:09 am

Why did HAL sing ‘Daisy’? in Articles

...a burning question posed by most people who have watched or read “2001: A Space Odyssey”: that is, why does the computer HAL-9000 sing the song ‘Daisy Bell’ as the astronaut Dave Bowman takes him apart?

Sep 04, 2015, 09:28:55 am

Humans in Robots on TV

Humans is a British-American science fiction television series. Written by the British team Sam Vincent and Jonathan Brackley, based on the award-winning Swedish science fiction drama Real Humans, the series explores the emotional impact of the blurring of the lines between humans and machines.

Aug 28, 2015, 09:13:37 am

Virtual Talk in Chatbots - English

[iTunes app] Virtual Talk is an AI chatting app that lets you talk with whomever you want. It remembers what you say and learns new dialogs. This app is one of the smartest chatbots in the world.

Aug 17, 2015, 13:33:09 pm

Robot Overlords in Robots in Movies

Not long after the invasion and occupation of Earth by a race of powerful robots wanting human knowledge and ingenuity, humans are confined to their homes. Leaving without permission would be to risk their lives. Electronic implants in their necks let the robot sentries track the movements of humans in order to control them. If any person comes out of their home, the robot sentries warn them to get back inside; if they do not comply, they are shot immediately.

Long article on the making of here...

Aug 15, 2015, 14:42:25 pm

Zerfoly in Chatbots - English

Zerfoly is a chatbot platform that makes it possible to create imaginary persons (chatbots) and teach them to talk to each other.

You will be able to let loose your creativity and imagination. Build persons by writing interactive dialogues. The persons you create will gradually become individuals with unique personalities. One of the persons could bear your name and learn to talk like you: your alter ego. Another way of using Zerfoly is as an interactive diary.

Aug 09, 2015, 11:06:42 am

YARP in Robotics

YARP is plumbing for robot software. It is a set of libraries, protocols, and tools to keep modules and devices cleanly decoupled. It is reluctant middleware, with no desire or expectation to be in control of your system. YARP is definitely not an operating system.

Jul 31, 2015, 16:23:49 pm

Kimbot in Chatbots - English

Kimbot uses simple text pattern matching to search its database of past conversations for the most reasonable response to a given query. It learns by associating questions it asks with the responses that are given to it.
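As an illustration of the general technique described above (not Kimbot's actual code), a retrieval-style chatbot of this kind can be sketched in a few lines: store prompt/response pairs from past conversations and answer a new query with the response attached to the most similar remembered prompt. The word-overlap similarity and storage format below are assumptions.

```python
# Illustrative sketch of a retrieval chatbot in the spirit described above:
# remember (prompt, reply) pairs and answer a new query with the reply
# attached to the most similar remembered prompt. Similarity measure and
# data layout are assumptions, not Kimbot's implementation.

class RetrievalBot:
    def __init__(self):
        self.memory = []                      # list of (prompt, reply) pairs

    def learn(self, prompt, reply):
        self.memory.append((prompt.lower(), reply))

    @staticmethod
    def _similarity(a, b):
        wa, wb = set(a.split()), set(b.split())
        return len(wa & wb) / max(1, len(wa | wb))   # Jaccard word overlap

    def respond(self, query):
        if not self.memory:
            return "Tell me more."
        best_prompt, best_reply = max(
            self.memory, key=lambda pr: self._similarity(query.lower(), pr[0])
        )
        return best_reply

if __name__ == "__main__":
    bot = RetrievalBot()
    bot.learn("what is your name", "I'm a bot that learns from conversation.")
    bot.learn("do you like robots", "I do, especially the underwater kind.")
    print(bot.respond("what is your name?"))
```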

Jul 08, 2015, 10:10:06 am

Telegram Bot Platform in Chatbots - English

Telegram is about freedom and openness – our code is open for everyone, as is our API. Today we’re making another step towards openness by launching a Bot API and platform for third-party developers to create bots.

Bots are simply Telegram accounts operated by software – not people – and they'll often have AI features. They can do anything – teach, play, search, broadcast, remind, connect, integrate with other services, or even pass commands to the Internet of Things.

Jul 06, 2015, 18:13:45 pm

Media Semantics in Chatbots - English

Media Semantics lets you create believable characters that present information and interact with users over the web.

Aug 06, 2008, 18:02:42 pm