Recent Posts

Pages: 1 2 [3] 4 5 ... 10
Home Made Robots / Re: Robot Message in a Bottle
« Last post by Freddy on February 22, 2017, 11:30:51 PM »
Interesting. Cute and furry might be the way to go if people are to submit ;)

Nice project and demonstration  O0
Graphics / Re: La-Masterpiece
« Last post by Freddy on February 22, 2017, 11:26:19 PM »
The images are interesting to look at. I was wondering how you are going to explain it - I mean what they represent.

Are you going to do a fly by video like a grand tour of the mind ?
General AI Discussion / Re: MusicNet
« Last post by keghn on February 22, 2017, 11:03:11 PM »
 Circles are nodes and the lines connecting them are edges in machine learning diagrams. In a neural network, each circle is one artificial neuron.
 But since artificial neural networks are a subfield of machine learning, the two usages overlap.
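To make the circles-and-lines picture concrete, here is a minimal sketch (illustrative, not tied to any post in this thread) of what one such circle computes: a weighted sum of the values arriving along its incoming edges, passed through an activation function.

```python
import math

def neuron(inputs, weights, bias):
    """One 'circle' in a network diagram: each incoming edge (line)
    carries an input value scaled by its weight; the node sums them,
    adds a bias, and squashes the result with a sigmoid activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Three incoming edges with weights 0.8, 0.2 and 0.1:
out = neuron([0.5, -1.0, 2.0], [0.8, 0.2, 0.1], bias=0.0)
```

A full network is just many such nodes wired together, which is why the same circles-and-edges drawing serves for both graph-based machine learning and neural networks.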
Shell Ocean Discovery XPRIZE: Semi-finalists set sail on a journey to illuminate the ocean
22 February 2017, 3:30 pm

We have just taken another momentous step in the journey to unveil the hidden wonders of our own planet!

Since the launch of the Shell Ocean Discovery XPRIZE at the American Geophysical Union Fall Meeting in San Francisco in December 2015, individuals from around the world have been racing to form Teams and develop a range of groundbreaking technologies to access the deep sea. Registration closed at the end of September 2016 with 32 bold Teams stepping forward to take on the challenge of mapping and imaging our ocean as never before.

Today, we announce the 21 semi-finalist Teams advancing in the Ocean Discovery XPRIZE. These innovative semi-finalist Teams, consisting of almost 350 individuals from 25 countries, represent a broad, impressive diversity of backgrounds and expertise, including middle and high school students, university students, maker-movement enthusiasts, and water and ocean industry professionals.

This diversity is also reflected in the array of technologies and techniques the semi-finalist Teams are developing. Some are proposing to use drones as a mechanism to drop subsea instruments into the water, while others are proposing to use drones that not only go through air, but then dive into the watery depths. Non-aerial Entries include autonomous surface vehicles carrying subsea robots that will return to the ‘mothership’ when their work is done, as well as vehicles and robots that will remain beneath the sea surface from the moment they leave the shore.

The 21 semi-finalist Teams will now have to prove their technology in the field in Round 1 testing, which will take place later this year. As with every XPRIZE, the Shell Ocean Discovery XPRIZE sets audacious goals. In Round 1, the Entries that these Teams are developing will have only 16 hours to map the sea-floor at depths of 2000m and produce a high-resolution map (at least 5m horizontal resolution, at least 0.5m vertical resolution) of at least 20% of the 500 km2 Competition Area. Additionally, Teams will have to bring back 5 images of an archeological, biological, or geological feature, as well as an image of an object that we will specify. As if this wasn’t enough, they have an additional hurdle to overcome – they will have to deploy from the shore with no humans allowed in the competition area.

To put this challenge into perspective, it can take days to map 500 km2 of the ocean using current state-of-the-art technologies. If you want 5m accuracy, mapping can take over a week and requires going out on a ship, which can easily cost over $60,000 per day, making it a very expensive endeavor. The cutting-edge technology that will come out of this XPRIZE will truly revolutionize our ability to access the ocean.
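As a rough back-of-the-envelope check on those figures (assuming a uniform 5 m grid over 20% of the 500 km2 area; the numbers below are illustrative, not from the competition guidelines):

```python
# Round 1: map at least 20% of a 500 km^2 area at 5 m horizontal
# resolution within a 16-hour window.
area_m2 = 0.20 * 500 * 1_000_000      # 100 km^2 expressed in m^2
cells = area_m2 / (5 * 5)             # number of 5 m x 5 m grid cells
cells_per_second = cells / (16 * 3600)

print(f"{cells:,.0f} depth cells in 16 h "
      f"(~{cells_per_second:.0f} cells/s sustained)")
```

Even at this coarse resolution, entries must sustain on the order of seventy depth measurements per second for sixteen hours straight, which helps explain the appeal of autonomous, parallel platforms such as robot swarms.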

In addition to the main Prize of $6 million, a dozen Teams have also opted to compete for the National Oceanic and Atmospheric Administration Bonus Prize of $1 million. These Teams are developing pioneering technology to detect a chemical or biological signal underwater and then autonomously track that signal back to its source.

Partnering with us to truly make this competition a global success are industry titans Fugro and Esri. Fugro, the industry leader in ocean mapping and survey, will provide the baseline maps against which the judging panel will assess the Entries. Esri, the global leader in GIS, is providing software to the Teams to produce their maps.

We are looking forward to an exciting competition with exponentially evolving technologies, including smart underwater robots, robot swarms, drones, and artificial intelligence to perform the tasks that have been laid out. These Teams will really take us where no one has gone before! As their journeys of discovery set sail, we invite you to follow the competition and this phenomenal mission to map our beautiful planet.

If you liked this article, you may also be interested in:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.

Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.
Drones for good 2.0: How WeRobotics is redefining the use of unmanned systems in developing countries
22 February 2017, 12:30 pm

Robotics undoubtedly has the potential to improve lives in the developing world. However, with limited budgets and expertise on the ground, putting this technology in place is no small task. Step forward, WeRobotics, a new Swiss/American NGO dedicated to meeting this goal through the creation of in-country ‘flying labs’. Co-founder Adam Klaptocz explains all.

Hey Adam. Let’s start with this: what is WeRobotics?

WeRobotics is a non-governmental organisation (NGO) dedicated to bringing developing countries access to robotics technology. Such countries don’t have access because of economics, supply chains and many other reasons. Our role is to help local people learn and use this technology, and then to deploy it for social good projects within their own communities.

We do this by establishing ‘flying labs’ in different countries. The goal of each is to serve as a hub of robotics technology, where staff host training sessions and webinars, and teach people how to use the technology. Each lab will have drones and software, on hand and in the cloud, thanks to our technology company partners. Labs will also function as incubators for new, local, drone-based businesses.

WeRobotics co-founder Adam Klaptocz (right) and Nepal Flying Labs Coordinator Uttam Pudasaini, running a drone mapping training session in Kathmandu, Nepal.

And who is behind WeRobotics?

We have four co-founders: Sonja Betschart and myself, based in Switzerland, and Patrick Meier and Andrew Schroeder, based in the U.S. We are a 501(c)(3) non-profit organisation in the United States and have just founded a sister organisation in Switzerland too. Besides that, all of our flying labs will be legal entities as well, in keeping with the concept of building local capacity and driving local engagement.

The goal is to engage local lab coordinators, local organisations, and ideally make the local flying lab its own organisation that has its own activities and is sustainable

And you’re talking about unmanned robots in their most general sense, not just as a synonym for flying drones?

That’s right. At WeRobotics, we’re looking at robotics in the largest sense—it’s not just eBee drones for mapping, it’s also things like drones for delivery, marine robots, underwater robots, and terrestrial robots. Anything to do with robotics and the technical ecosystem that goes around it. That includes the software and the platforms people use to share a drone’s data because, in the end, drones are just a tool for collecting and turning the data into actionable information.

We have tech partners, such as senseFly and other drone and software companies, with whom we are spreading this technology throughout our network of labs. The labs will then run training programmes, serve as incubators for locals who want to get into the business, and host conferences where these adopters can share their experiences.

So do you already have your first lab set-up? Where is that?

Absolutely, we’re a young NGO, but we’re trying to expand quite quickly.

Our first lab was officially launched recently in Nepal. It’s called Nepal Flying Labs. We have a local lab coordinator there and we’ve partnered with Kathmandu University, ICIMOD, which is a mountain and glacier study organisation, NAXA and Medair. We’re working with government agencies as well, including the survey department of the Nepalese government. So we have a certain number of partners that we’re working with to roll out projects and run trainings.

We have projects that will be running throughout the next year in Kathmandu and the surrounding areas; a lot of which is post-earthquake reconstruction work after last year’s earthquake. Also, a lot of climate-based research work.

That must be ground zero, that region, for such scientific study?

Exactly. Nepal is basically a giant mountain so the effects of climate change are really affecting communities there. There are a lot of landslides – the Himalayas are melting, the glaciers are melting – so there is a lot of work to be done in terms of disaster prevention and also in post-disaster reconstruction.

We’re also setting up more flying labs in Tanzania. I know there’s already been a lot of work there by senseFly, Drone Adventures [case study], The World Bank and other organisations. We’re hoping to work with these organisations and others to consolidate a lot of the work that’s been done; to make it sustainable for years to come within Tanzania.

Our third major lab that we are setting up before the end of the year is in Peru. In Peru, we will start in the Amazon. Our first pilot project there is about drone delivery, but we’ll also be spinning out drone mapping as a business and other types of robotics as well.

In the countries where you operate, is one of the groups you’re engaging with the government itself? Do you see your role to also help governments regulate the use of drones?

The mandate of each flying lab, as an NGO, isn’t to replace government, it’s to help.

So the first thing to do is to work with the government and try to help boost, for example, its capacity for mapping. That’s something that we believe in strongly. We’re already working with the survey department in Kathmandu and we’ll be working with the Lands Department in Tanzania as well to try to reinforce their capacity to serve their own people. On top of that, of course, we want to be able to work within countries – within the rules of those countries – and so we’re trying to help with regulations, by working with the regulatory authorities in all of these countries.

A lot of developing countries are looking for guidance from other countries – where robotic technology might be a little more established – to see what is happening. That’s another thing that we hope, as an international NGO with locations all around the world, we’ll be able to facilitate—helping governments to learn what works in other countries, such as those our network covers, and helping them to work together, often with the civil aviation authority and other government agencies. It’s about implementing rules that make drone use responsible, but also leave the market open so that we can incubate businesses. Like that, the economy can develop around this technology instead of being blocked.

You mentioned before we started recording that the key, underpinning what WeRobotics is about, is building capacity on the ground.

Yes, that’s the key, local capacity. It shouldn’t have to be that a bunch of foreigners come in to do this work whichever way they want, often with their own desires, with their own culture and their own needs. Basically, making themselves feel good about helping instead of actually helping.

What would be best, if this ever happens again in Nepal, is that there are enough drone companies locally that know what they’re doing, and understand the legal framework within the country, so that they can do the job instead of having to bring in foreigners. And that goes for every country. I think that, in development, the more local capacity you build, the more you can really bring the country out of poverty, so that its economy can sustain itself.

That’s our goal with the flying labs: to build self-sustainable local entities that build local capacity but have global contact with other labs and with our global technology partners.

What specific UAV platforms do you operate at present and how are these being used?

I truly believe that every job has a tool that’s best adapted to it. In Nepal, right now, our flying lab has an eBee and a Parrot Bebop drone. The eBee is used because Nepal is obviously very mountainous and having the capacity to deal with 3D flight mapping is very important.

Meanwhile, in Tanzania there are larger, mainly flat zones to be mapped. For that we have a Sky-Watch Cumulus drone. This is a longer-range drone that can fly for two hours and cover very large areas, so there it’s the best-adapted drone. We’re also bringing a Bebop to Tanzania, because there is a big interest in drone journalism there, and for that it’s important to have something that is small, cheap and easily deployed.

Then in Peru, we’re working with hybrid drones that are made more for cargo delivery. One option is Quantum Systems, a German company that makes even bigger VTOL drones with an even bigger payload capacity.

So, different drones for different jobs.

100% platform agnostic.

Yes. One of our goals as an NGO is to remain more or less independent of technology. What’s important is that the job can be done with the best fit technology, in an economically viable way. We need UAVs that are easy to use, easy to train on, and relatively mature as well.


We do work with some startups, but in general what you really need, if you want to make something work in the jungles of Papua New Guinea or in the middle of the desert somewhere, is a machine that’s durable and that has existed for a little while. A platform that has some sort of customer support is needed too, so it’s important that our technology company partners are happy to be involved as well.

If you imagine a few years down the road, having achieved your short and medium term goals, how does the world of WeRobotics look then?

We see ourselves, a few years down the road, having many flying labs around the world. We’re not sure if we’ll have five, or ten, or if there’ll be 50, but the goal is that we have set up a network of many flying labs in different countries.

And you’re planning on developing some in-house engineering capacity too?

Yes, because many problems in developing countries, like conservation, can be solved with existing technology. But there are also problems whose solutions require technology that doesn’t exist yet, because the people developing it, whether in Silicon Valley, China, or Europe, aren’t necessarily aware of the problems that exist in developing countries.

Whatever works, whatever’s needed to meet your goals on the ground then?

Yes. Unfortunately, there is no lack of work in development. Poverty is not disappearing next year. It’s not a market that you have to grasp today otherwise tomorrow it has been taken over. There is a huge amount of work left to do, and there’s definitely enough to go around.


Source: Robohub

Robotics News / At what point should an intelligent machine be considered a person?
« Last post by Tyler on February 22, 2017, 10:48:24 AM »
At what point should an intelligent machine be considered a person?
22 February 2017, 9:00 am

Science fiction has already explored the theme of robot rights, such as the film Bicentennial Man. Image: Columbia Pictures

Science fiction likes to depict robots as autonomous machines, capable of making their own decisions and often expressing their own personalities. Yet we also tend to think of robots as property, and as lacking the kind of rights that we reserve for people.

But if a machine can think, decide and act on its own volition, if it can be harmed or held responsible for its actions, should we stop treating it like property and start treating it more like a person with rights?

What if a robot achieves true self-awareness? Should it have equal rights with us and the same protection under the law, or at least something similar?

These are some of the issues being discussed by the European Parliament’s Committee on Legal Affairs. Last year it released a draft report and motion calling for a set of civil law rules on robotics regulating their manufacture, use, autonomy and impact upon society.

Of the legal solutions proposed, perhaps most interesting was the suggestion of creating a legal status of “electronic persons” for the most sophisticated robots.

Approaching personhood

The report acknowledged that improvements in the autonomous and cognitive abilities of robots make them more than simple tools, and make ordinary rules on liability, such as contractual and tort liability, insufficient for handling them.

For example, the current EU directive on liability for harm by robots only covers foreseeable damage caused by manufacturing defects. In these cases, the manufacturer is responsible. However, when robots are able to learn and adapt to their environment in unpredictable ways, it’s harder for a manufacturer to foresee problems that could cause harm.

The report also questions whether sufficiently sophisticated robots should be regarded as natural persons, legal persons (like corporations), animals or objects. Rather than lumping them into an existing category, it proposes that a new category of “electronic person” is more appropriate.

The report does not advocate immediate legislative action, though. Instead it proposes that legislation be updated if and when robots develop more behavioural sophistication. If this occurs, one recommendation is to reduce the liability of “creators” in proportion to the autonomy of the robot, with a compulsory “no-fault” liability insurance covering the shortfall.

But why go so far as to create a new category of “electronic persons”? After all, computers still have a long way to go before they match human intelligence, if they ever do.

But it can be agreed that robots – or, more precisely, the software that controls them – are becoming increasingly complex. Autonomous (or “emergent”) machines are becoming more common. There are ongoing discussions about the legal liability for autonomous vehicles, or whether we might be able to sue robotic surgeons.

These are not complicated problems as long as liability rests with the manufacturers. But what if manufacturers cannot be easily identified, such as if open source software is used by autonomous vehicles? Whom do you sue when there are millions of “creators” all over the world?

Artificial intelligence is also starting to live up to its moniker. Alan Turing, the father of modern computing, proposed a test in which a computer is considered “intelligent” if it fools humans into believing that the computer is human by its responses to questions. Already there are machines that are getting close to passing this test.

There are also other incredible successes, such as the computer that creates soundtracks to videos that are indistinguishable from natural sounds, the robot that can beat CAPTCHA, one that can create handwriting indistinguishable from human handwriting and the AI that recently beat some of the world’s best poker players.

Robots may eventually match human cognitive abilities and they are becoming increasingly human-like, including the ability to “feel” pain.

If this progress continues, it may not be long before self-aware robots are not just a product of fantastic speculation.

The EU report is among the first to formally consider these issues, but other countries are also engaging. Peking University’s Yueh-Hsuan Weng writes that Japan and South Korea expect us to live in a human-robot coexistence by 2030. Japan’s Ministry of Economy, Trade and Industry has created a series of robot guidelines addressing business and safety issues for next generation robots.

Electronic persons

If we did give robots some kind of legal status, what would it be? If they behaved like humans we could treat them like legal subjects rather than legal objects, or at least something in between. Legal subjects have rights and duties, and this gives them legal “personhood”. They do not have to be physical persons; a corporation is not a physical person but is recognised as a legal subject. Legal objects, on the other hand, do not have rights or duties although they may have economic value.

Assigning rights and duties to an inanimate object or software program independent of their creators may seem strange. However, with corporations we already see extensive rights and obligations given to fictitious legal entities.

Perhaps the approach to robots could be similar to that of corporations? The robot (or software program), if sufficiently sophisticated or if satisfying certain requirements, could be given similar rights to a corporation. This would allow it to earn money, pay taxes, own assets and sue or be sued independently of its creators. Its creators could, like directors of corporations, have rights or duties to the robot and to others with whom the robot interacts.

Robots would still have to be partly treated as legal objects since, unlike corporations, they may have physical bodies. The “electronic person” could thus be a combination of both a legal subject and a legal object.

The European Parliament will vote on the resolution this month. Regardless of the result, reconsidering robots and the law is inevitable and will require complex legal, computer science and insurance research.

This article was originally published on The Conversation. Read the original article.

Source: Robohub

General AI Discussion / Re: MusicNet
« Last post by korrelan on February 22, 2017, 08:17:49 AM »
Which circles?

Do you mean the actual graphical representation of a neuron?

Home Made Robots / Robot Message in a Bottle
« Last post by on February 22, 2017, 04:04:38 AM »


Robot in a bottle with a message for a recipient to make a discovery some time from now in the future.

In this bottle I indoctrinate artificial intelligence developing hypothetical general knowledge. 

Raising awareness that helps people understand and accept the many advantages of being controlled by capable robots performing complex operations faster, in a new, automated, easier way of life for the human race.


General AI Discussion / Re: MusicNet
« Last post by yotamarker on February 22, 2017, 12:18:06 AM »
could someone explain those circles shown when he talks about neural networks
Robotics News / Ocado evaluating robotic manipulation for online shopping orders
« Last post by Tyler on February 21, 2017, 10:50:46 PM »
Ocado evaluating robotic manipulation for online shopping orders
21 February 2017, 3:55 pm

Ocado, the world’s largest online-only supermarket, has been evaluating the feasibility of robotic picking and packing of shopping orders in its highly-automated warehouses through the SoMa project, a Horizon 2020 framework programme for research and innovation funded by the European Union.

SoMa is a collaborative research project between the Technische Universität Berlin (TUB), Università di Pisa, Istituto Italiano di Tecnologia, Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR), the Institute of Science and Technology Austria, Ocado Technology, and Disney Research Zurich.

One of the main challenges of robotic manipulation has been the handling of easily damageable and unpredictably shaped objects such as fruit and vegetable groceries. These products have unique shapes and should be handled in a way that does not cause damage or bruising. To avoid damaging sensitive items, the project uses a compliant gripper (i.e. one that possesses spring-like properties) in conjunction with an industrial robot arm.

The variation in shape of the target objects imposes another set of constraints on the design of a suitable gripper. The gripper must be versatile enough to pick a wide variety of products from Ocado’s current range of over 48,000 hypermarket items.

How RBO softhand could help address these challenges

The SoMa project (EU Horizon 2020 GA 645599) aims to design compliant robotic hands that are suitable for handling fragile objects without much detailed knowledge of an item’s shape; in addition, the robotic arms should also be capable of exploiting environmental constraints (physical constraints imposed by the environment). The goal is to develop versatile, robust, cost-effective, and safe robotic grasping and manipulation capabilities.

An example of a compliant gripper is the RBO Hand 2 developed by the Technische Universität Berlin (TUB). The gripper uses flexible rubber materials and pressurized air for passively adapting grasps which allows for safe and damage-free picking of objects. With seven individually controllable air chambers, the anthropomorphic design enables versatile grasping strategies.

Due to its compliant design, the robotic hand is highly under-actuated: only the air pressure is controlled, while the fingers, palm, and thumb adjust their shape to the given object geometry (morphological computation). This simplifies control and enables effective exploitation of the environment.
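To illustrate how small the control problem becomes with such an under-actuated design, here is a toy sketch; the chamber names and pressure values are hypothetical, not the actual RBO Hand 2 interface. The controller specifies only seven pressure set-points, and the hand's passive mechanics handle the rest.

```python
# Toy model of an under-actuated soft hand: the only controlled
# quantities are the air pressures of the seven chambers; finger
# shape then adapts passively to the object geometry.
GRASP_PRESETS = {  # pressures in kPa (hypothetical values)
    "power_grasp": {"thumb": 80, "palm": 40, "fingers": [70, 70, 70, 70, 70]},
    "pinch_grasp": {"thumb": 90, "palm": 10, "fingers": [85, 85, 0, 0, 0]},
}

def chamber_setpoints(preset_name):
    """Flatten a named grasp preset into the seven per-chamber
    pressure set-points the hand's valves would be driven to."""
    p = GRASP_PRESETS[preset_name]
    return [p["thumb"], p["palm"], *p["fingers"]]

setpoints = chamber_setpoints("power_grasp")
```

Contrast this with a fully actuated hand, where the controller would have to plan a trajectory for every finger joint; here the "morphological computation" of the compliant material replaces that planning.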

Integrating the RBO Hand 2 with an industrial manipulator and testing with a standard object set

The Ocado Technology robotics team replicated a production warehouse scenario in order to evaluate the performance of the RBO Hand 2 for Ocado’s use case. The team mounted the soft hand on two different robot arms, a Staubli RX160L and a KUKA LBR iiwa14. Both of these arms can operate in the standard position controlled mode; in addition to this, the KUKA provides the capability of demonstrating a certain amount of software controlled compliance in the arm.

Ocado designed a set of experiments to evaluate grasping performance on an example set of artificial fruit stored in an IFCO (International Fruit Container) tray. The adopted strategies attempted to exploit environmental constraints (e.g. the walls and the bottom of the tray) to perform the gripping tasks successfully.

The experiments started with the simple scenario of grasping a single object from the example set using only the bottom of the tray. Initial results showed that the hand was able to grasp a variety of shapes successfully, and suggested that the chance of success increased when environmental constraints were used effectively to restrict the movement of the object.

In the coming months, Ocado plans to explore more complex scenarios, adding more objects in the IFCO, and introducing additional environmental constraints that could be exploited by a grasping strategy.


Source: Robohub
