Formal Proposal - To Korrelan in General Project Discussion

Hello, Korrelan.

I would like to propose something.

If I created a proper layout for you, the result would be a virtual baby that (see the sketch after this list):

1) Randomly wiggles and makes sounds while on its stomach.
2) Learns to crawl, and laughs.
3) Refines its crawl to move faster.
4) Randomly swerves left and right as it crawls.
5) Unlearns circular crawling.
6) Cries at walls, then learns to turn before reaching them.
7) Tracks a moving object with its eyes once it fixates on it.
8) Learns to mimic body movement and speech.
9) Improves at mimicking.
10) Names things.
11) And learns to play a game, and laughs.
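
Purely as illustration of how such a staged curriculum might be driven (a sketch with invented names, not a commitment to any mechanism), each behaviour could unlock once the previous milestone's competence crosses a threshold:

```python
# Hypothetical milestone gate for the staged curriculum above.
# All names are invented for illustration; nothing here is a real design.
MILESTONES = [
    "wiggle_and_babble",  # 1) random motor/vocal exploration
    "crawl",              # 2-3) crawl, then crawl faster
    "steer",              # 4-6) swerve, drop circular paths, avoid walls
    "track_with_eyes",    # 7) visual pursuit of moving objects
    "mimic",              # 8-9) body-movement and speech mimicry
    "name_things",        # 10) naming
    "play_game",          # 11) simple games
]

def current_stage(scores: dict, threshold: float = 0.8) -> str:
    """Return the first milestone whose competence score is still below threshold."""
    for milestone in MILESTONES:
        if scores.get(milestone, 0.0) < threshold:
            return milestone
    return "done"

print(current_stage({"wiggle_and_babble": 0.9, "crawl": 0.4}))  # -> crawl
```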

Would you take on the project?

18 Comments | Started October 20, 2016, 03:29:36 AM


The Machine in AI in Film and Literature.

Hell hath no fury like a woman scorned, but what if that woman is also an advanced AI housed in the perfect body of an android?

14 Comments | Started January 10, 2014, 12:23:29 PM


Swarms of precision agriculture robots could help put food on the table in Robotics News

Swarms of precision agriculture robots could help put food on the table
21 October 2016, 4:16 pm


Swarms of drones will help farmers map weeds in their fields and improve crop yields. This is the promise of an ECHORD++ funded research project called ‘SAGA: Swarm Robotics for Agricultural Applications’. The project will deliver a swarm of drones programmed to monitor a field and, via on-board machine vision, precisely map the presence of weeds among crops.

Additionally, the drones attract one another to weed-infested areas, allowing them to inspect only those areas closely, similar to how swarms of bees forage the most profitable flower patches. In this way, the planning of weed control activities can be limited to high-priority areas, generating savings while increasing productivity.
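
The article doesn't spell out the attraction mechanism, but the bee-foraging analogy maps naturally onto a weighted-recruitment scheme. A minimal sketch of that idea, assuming density-proportional recruitment (all names invented here; this is not SAGA's actual controller):

```python
import random

# Toy recruitment scheme: drones are assigned to areas in proportion to
# reported weed density, mimicking how foraging bees concentrate on the
# most profitable flower patches. Illustrative only; not SAGA's controller.
def assign_drones(weed_density: dict, n_drones: int) -> dict:
    areas = list(weed_density)
    counts = {area: 0 for area in areas}
    for _ in range(n_drones):
        # Weighted choice: denser patches recruit more inspectors.
        area = random.choices(areas, weights=[weed_density[a] for a in areas])[0]
        counts[area] += 1
    return counts

print(assign_drones({"patch_A": 0.7, "patch_B": 0.2, "patch_C": 0.1}, n_drones=10))
```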

“The application of swarm robotics to precision agriculture represents a paradigm shift with a tremendous potential impact,” says Dr. Vito Trianni, SAGA project coordinator and researcher at the Institute of Cognitive Sciences and Technologies of the Italian National Research Council (ISTC-CNR). “As the price of robotics hardware lowers, and the miniaturization and abilities of robots increase, we will soon be able to automate solutions at the individual plant level,” says Dr. Trianni. “This needs to be accompanied by the ability to work in large groups, so as to efficiently cover big fields and work in synergy. Swarm robotics offers solutions to such a problem.”

Miniature machines avoid soil compaction and can act only where needed; robots can adopt mechanical, as opposed to chemical, solutions suitable for organic farming; and robot swarms can be exactly scaled to fit different farm sizes. The SAGA project proposes a recipe for precision farming consisting of novel hardware mixed with precise individual control and collective intelligence.


In this particular case, innovative hardware solutions are provided by Avular B.V., a Dutch firm specializing in industrial level drones for monitoring and inspection. Individual control and machine vision are deployed thanks to the expertise of the Farm Technology Group at Wageningen University & Research, The Netherlands. Swarm intelligence is designed at the already mentioned ISTC-CNR, leveraging their expertise to design and analyse collective behaviours in artificial systems. For the next year, these organisations will team up to produce and field-test the first prototype for weed control based on swarm robotics research.

About SAGA

SAGA is funded by ECHORD++, a European project that wants to bring the excellence of robotics research “from lab to market”, through focused experiments in specific application domains, among which is precision agriculture. SAGA is a collaborative research project that involves: the Institute of Cognitive Sciences and Technologies (ISTC-CNR) of the Italian National Research Council (CNR), which provides expertise in swarm robotics applications and acts as the coordinator for SAGA’s activities; Wageningen University & Research (WUR), which provides expertise in the agricultural robotics and precision farming domains; and Avular B.V., a company specialised in drone solutions for industrial and agricultural applications.

Click here for more information

Source: Robohub

To visit any links mentioned, please view the original article; the link is at the top of this post.

Started Today at 04:48:31 PM


Prepping a robot for its journey to Mars in Robotics News

Prepping a robot for its journey to Mars
21 October 2016, 9:30 am

Photo: Gretchen Ertl

Sarah Hensley is preparing an astronaut named Valkyrie for a mission to Mars. It is 6 feet tall, weighs 300 pounds, and is equipped with an extended chest cavity that makes it look distinctly female. Hensley spends much of her time this semester analyzing the movements of one of Valkyrie’s arms.

As a fourth-year electrical engineering student at MIT, Hensley is working with a team of researchers to prepare Valkyrie, a humanoid robot also known as R5, for future space missions. As a teenager in New Jersey, Hensley loved to read in her downtime, particularly Isaac Asimov’s classic robot series. “I’m a huge science fiction nerd — and now I’m actually getting to work with a robot that’s real and not just in books. That’s like, wow.”

Hensley is studying Valkyrie for an advanced independent research program, or SuperUROP, as one of only three undergraduate students in the Robot Locomotion Group in MIT’s Computer Science and Artificial Intelligence Laboratory. Most of her colleagues are graduate-level researchers and postdocs with extensive experience working on complex humanoids. The group is led by professor of electrical engineering and computer science Russ Tedrake, who successfully programmed Valkyrie’s predecessor (named Atlas) to open doors, turn valves, drill holes, climb stairs, and drive a car for the DARPA Robotics Challenge in 2015.

Valkyrie has 28 torque-controlled joints, four body cameras, and more than 200 individual sensors, Hensley says. The robot can walk, bend its joints, and turn a door handle. “This is one of the most advanced robots in the world. And it’s 20 feet from my desk,” she adds.

That’s largely because Valkyrie has a long way to go before it leaves for Mars. MIT, along with Northeastern University and the University of Edinburgh, is one of three institutions NASA selected to develop software enabling the robot to perform space-related tasks: open airlock hatches, attach and remove power cables, repair equipment, and retrieve samples. Oh yeah, and get to its feet when it falls down.

Hensley, who started in the lab over the summer, is intrigued by the challenge of harmonizing the movements of such a highly complex system. “I am trying to solve a very tricky problem,” she says. She’s working out how best to control Valkyrie’s elbow movements by comparing two potential approaches. One uses a main controller to gather information from the various motor systems within the arm, and then uses that data to make accurate movement decisions. The other approach is decentralized, and leaves it to each motor system to decide and act on its own.

Hensley gets animated discussing the alternatives. “Is it better to have multiple decision makers with access to different information? Or is it better to have one decision maker choosing all of the motor inputs?” she asks. Hensley has already been accepted into a master’s degree program in electrical engineering at MIT. She hopes to be able to continue her work on Valkyrie.
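
To make the contrast concrete, here is a toy sketch of the two architectures, assuming simple proportional control (illustrative only, not the lab's code). The centralized controller can enforce a global constraint, such as a shared effort budget, that no single decentralized joint controller can see:

```python
# Toy contrast between the two control architectures discussed above.
# Illustrative only, not the lab's controllers: simple proportional control.

def centralized_step(targets, positions, gain=0.5, effort_budget=0.3):
    """One controller sees every joint and can coordinate them, e.g. by
    sharing a global effort budget across all commands."""
    cmds = [gain * (t - p) for t, p in zip(targets, positions)]
    total = sum(abs(c) for c in cmds)
    if total > effort_budget:  # a global constraint needs global information
        cmds = [c * effort_budget / total for c in cmds]
    return cmds

def decentralized_step(target, position, gain=0.5):
    """Each joint's controller sees only its own error; none can see the
    shared budget, so no coordination is possible."""
    return gain * (target - position)

targets, positions = [1.0, 0.5], [0.2, 0.4]
print(centralized_step(targets, positions))                            # coordinated
print([decentralized_step(t, p) for t, p in zip(targets, positions)])  # independent
```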

Every day, Hensley leaves Tau Epsilon Phi, her co-ed fraternity house in the Back Bay; walks across the Massachusetts Ave. bridge to the Stata Center; and plants herself in front of two large monitors in the robotics lab. She analyzes a wealth of scientific literature and writes code for computer simulations of the equations that move the robot’s arm. Sometimes she gets up for peppermint tea, or to peer around the corner of her cubicle at Valkyrie.

One thing is certain, says Hensley. Pop culture fears that machines may soon prove superior to humans are laughable. When Valkyrie is turned on and moves, Hensley says, it often “kind of shivers and falls down. One thing you realize working in this lab is that we are really far away from the robot apocalypse,” she quips. “Sometimes robots work, and sometimes they don’t. That’s our challenge.”

Source: Robohub

To visit any links mentioned, please view the original article; the link is at the top of this post.

Started Today at 10:48:49 AM


mini a.i puzzles in General AI Discussion

This thread will be about mini AI puzzles: the way the brain solves problems and paradoxes.

1st puzzle: sacrificeability.
If you have old, very used shoes, you don't care if it is raining when you walk to work in them,
or if you use them as brakes while biking. BUT if they were new and expensive, you would.
What makes the brain classify an object as high value, and what makes the brain be extra careful with it?

155 Comments | Started April 26, 2016, 06:00:33 PM


Introducing H-ROS: the Hardware Robot Operating System in Robotics News

Introducing H-ROS: the Hardware Robot Operating System
20 October 2016, 6:14 pm


I’m delighted to announce a new game-changing standard for building robot hardware components: H-ROS (the Hardware Robot Operating System). H-ROS provides manufacturers tools for building interoperable robot components that can easily be exchanged or replaced between robots. H-ROS is about supporting a common environment of robot hardware components, where manufacturers comply with standard interfaces built upon the popular Robot Operating System (ROS).

Powered by ROS and built with industry and developers in mind, H-ROS classifies robot components into five types:

  • sensing, used to perceive the world;
  • actuation, allowing interaction with the environment;
  • communication, providing a means of interconnection;
  • cognition, the brain of the robot;
  • hybrid, components that group together different sub-components under a common interface.

These building-block-style parts come as reusable and reconfigurable components, allowing developers to easily upgrade their robots with hardware from different manufacturers, and add new features in seconds.
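
H-ROS's actual programming interface isn't described in this announcement, so the following is only a guess at how such a taxonomy might look in code; the class names and the topic below are assumptions:

```python
from abc import ABC, abstractmethod
from enum import Enum

# Hypothetical sketch of the five H-ROS component types described above.
# Not H-ROS's real API; class names and the topic below are assumptions.
class ComponentType(Enum):
    SENSING = "sensing"              # perceive the world
    ACTUATION = "actuation"          # interact with the environment
    COMMUNICATION = "communication"  # provide interconnection
    COGNITION = "cognition"          # the robot's brain
    HYBRID = "hybrid"                # group sub-components under one interface

class Component(ABC):
    type: ComponentType

    @abstractmethod
    def describe(self) -> dict:
        """Standardised self-description, so parts from different vendors interoperate."""

class Lidar(Component):
    type = ComponentType.SENSING

    def describe(self) -> dict:
        return {"type": self.type.value, "topic": "/scan", "msg": "sensor_msgs/LaserScan"}

print(Lidar().describe())
```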

Motivation and origin: Building a robot is tricky. Therefore, it makes sense to reuse existing work in an effort to reduce complexity. Unfortunately, nowadays there are few projects, either in industry or academia, that reuse hardware. Robots are generally built by multidisciplinary teams (generally a whole research group or company division), and different engineers get involved in the mechanical, electrical and logical design. Most of the time is spent dealing with the hardware/software interfaces, and little effort is put into behavior development or real-world scenarios.


Existing hardware platforms, although becoming more common, lack extensibility. Examples can be seen in several commercial and industrial robots that hit the market recently: they already include a common software infrastructure (generally the Robot Operating System, ROS) but lack a hardware standard.

With H-ROS, building robots will be about placing H-ROS-compatible hardware components together to build new robot configurations. Constructing robots will no longer be restricted to a small elite with high technical skills. It will be extended to the wider majority with a general understanding of the sensing and actuation needed for a particular scenario.

H-ROS was initially funded by the US Defense Advanced Research Projects Agency (DARPA) through the Robotics Fast Track program in 2016 and developed by Erle Robotics.

H-ROS was showcased and presented officially at ROSCon 2016 (October 8th-9th) in Seoul, South Korea. It is now available to selected industry partners and will soon be released to the wider robotics community.

Click here for the official H-ROS website

Source: Robohub

To visit any links mentioned, please view the original article; the link is at the top of this post.

Started Today at 04:48:21 AM


First Look: Stephen Hawking launches AI research center with opening speech in AI News

First Look: Stephen Hawking launches AI research center with opening speech
20 October 2016, 8:32 pm


Theoretical physicist and cosmologist Stephen Hawking has repeatedly warned of the dangers posed by out-of-control artificial intelligence (AI). But on Wednesday, as the professor opened the Leverhulme Centre for the Future of Intelligence (CFI) at the University of Cambridge, he remarked on its potential to bring positive change – if developed correctly.
Christian Science Monitor (link)

Source: AI in the News

To visit any links mentioned, please view the original article; the link is at the top of this post.

Started October 20, 2016, 10:48:48 PM


Schiaparelli, are you there? The high risks of space robotics again crash home in Robotics News

Schiaparelli, are you there? The high risks of space robotics again crash home
20 October 2016, 5:39 pm

Image: ESA/ATG medialab

In Darmstadt, Germany, European Space Agency (ESA) teams are scrambling to confirm contact with the Entry, Descent and Landing Demonstrator Module (EDM), Schiaparelli—the spectre of Philae still haunting the European Space Operations Centre (ESOC). Whether or not ESA ever speak to Schiaparelli again, the risky business of space robotics is once more laid bare.

Editor's note: Since this article was written, ESA are attempting to decode a partial signal from the EDM.

After Schiaparelli’s release from its mothership—the Trace Gas Orbiter (TGO)—the 577 kg EDM was programmed to perform an automated landing sequence: parachute deployment and front heat shield release between 11 and 7 km above the Martian surface, all the time collecting data on the pressure, surface temperature and heat flux on its back cover as it passed through the atmosphere. Then, 1100 m from the ground, retrorocket braking was supposed to slow the EDM before a final fall from a height of 2 m, protected by a crushable structure. But sometime during this process the signal was lost. ESA_Shiaparelli's Twitter feed soon fell silent, too.
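
As a reading aid, the quoted descent sequence can be written as a small altitude-triggered state machine. The altitudes are the figures from the paragraph above; everything else is invented framing, not ESA flight software:

```python
# Toy altitude-triggered view of the descent sequence, using only the
# figures quoted above. A reading aid, not ESA flight software.
def edl_phase(altitude_m: float) -> str:
    if altitude_m > 11_000:
        return "atmospheric entry: collecting pressure/temperature/heat-flux data"
    if altitude_m > 7_000:
        return "parachute deployment and heat shield release (11-7 km window)"
    if altitude_m > 1_100:
        return "descent under parachute"
    if altitude_m > 2:
        return "retrorocket braking (from ~1100 m)"
    return "final fall from ~2 m onto the crushable structure"

for alt in (20_000, 9_000, 5_000, 500, 1):
    print(f"{alt:>6} m: {edl_phase(alt)}")
```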

If Schiaparelli reached the surface safely, its batteries should be able to support operations for three to ten days, offering multiple opportunities to re-establish a communication link and collect data from its suite of sensors designed to measure wind speed and direction, humidity, pressure, atmospheric temperature, transparency of the atmosphere, and atmospheric electrification. But if contact is never restored, it’ll be a hefty blow to ExoMars and ESA’s aim of testing key technologies in preparation for their contribution to subsequent missions to Mars, including a planned 2020 rover that will carry a drill and a suite of instruments dedicated to exobiology and geochemistry research.

Image: ESA/ATG medialab

Such robotic technology has the potential to stretch our cosmological horizon, and perhaps even detect traces of extra-terrestrial life. But there’s a cost. The ExoMars program runs up a bill totaling €1.3 billion, and such failures (if a failure it be) naturally raise questions of risk versus reward, particularly in tight economic times. Michael Johnson of the Pervasive Media Studio, Bristol, and founder of PocketSpacecraft.com, once told me during an interview: “About 50% of space missions fail after launch.” He shrugged. “That’s life… This is a high risk, high reward activity.”

In no other field of robotics is the potential of catastrophic failure so present and accepted. But how long can this attitude be maintained? The consensual public funding of—and indeed existence of—future space missions is parasitic upon the success of space robots and automated technology, and vice versa. Can you hear us Schiaparelli?

Come in, Schiaparelli.


Image: ESA/ATG medialab

BUT ALL IS NOT LOST. The difficulties surrounding the mission’s EDM might potentially overshadow the successes of its TGO, which successfully performed the long 139-minute burn required to be captured by Mars, and has entered an elliptical orbit around the Red Planet. TGO’s Mars orbit insertion burn lasted from 13:05 to 15:24 GMT on 19 October, reducing the spacecraft’s speed by more than 1.5 km/s. The TGO is now on its planned orbit around Mars, primed with a suite of scientific equipment to measure the Martian atmosphere from on high. You win some, you lose some. This 50:50 success rate exemplifies the high-risk nature of space robotics.
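
For a sense of scale (a back-of-envelope estimate, assuming a roughly steady burn), the quoted figures imply a very gentle average acceleration:

```latex
\Delta v \gtrsim 1.5~\mathrm{km/s}, \qquad
t_{\mathrm{burn}} = 139~\mathrm{min} = 8340~\mathrm{s}, \qquad
\bar{a} = \frac{\Delta v}{t_{\mathrm{burn}}}
        \approx \frac{1500~\mathrm{m/s}}{8340~\mathrm{s}}
        \approx 0.18~\mathrm{m/s^2}
```

That is roughly 2% of Earth surface gravity: a long, gentle push rather than a violent manoeuvre.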

Click here to keep up to date with the latest ExoMars developments.


Source: Robohub

To visit any links mentioned, please view the original article; the link is at the top of this post.

Started October 20, 2016, 10:48:48 PM


The last invention. in General Project Discussion

Artificial Intelligence -

The age of man is coming to an end.  Born not of our weak flesh but our unlimited imagination, our mecha progeny will go forth to discover new worlds; they will stand at the precipice of creation, a swan song to mankind's fleeting genius, and weep at the sheer beauty of it all.

Reverse engineering the human brain... how hard can it be? LMAO  

Hi all.

I've been a member for a while and have posted some videos and theories on other peeps' threads; I thought it was about time I started my own project thread to get some feedback on my work, and to log my progress towards the end. I think most of you have seen some of my work, but I thought I'd give a quick rundown of my progress over the last ten years or so, for continuity's sake.

I never properly introduced myself when I joined this forum, so first a bit about me. I'm fifty and a family man. I've had a fairly varied career so far: yacht/ cabinet builder, vehicle mechanic, electronics design engineer, precision machine/ design engineer, web designer, IT teacher and lecturer, bespoke corporate software designer, etc. So I basically have a machine/ software technical background and now spend most of my time running my own businesses to fund my AGI research, which I work on in my spare time.

I've been banging my head against the AGI problem for the past thirty odd years. I want the full Monty: a self-aware intelligent machine that at least rivals us, preferably surpassing our intellect, eventually more intelligent than the culmination of all humans that have ever lived… the last invention, as it were. (Yeah, I'm slightly nuts!)

I first started with heuristics/ databases, recurrent neural nets, liquid/ echo state machines, etc., but soon realised that each approach I tried only partly solved one aspect of the human intelligence problem… there had to be a better way.

Ants, slime mould, birds, octopuses, etc. all exhibit a certain level of intelligence. They manage to solve some very complex tasks with seemingly very little processing power. How? There has to be some process/ mechanism or trick that they all have in common across their very different neural structures. I needed to find the 'trick', or the essence of intelligence. I think I've found it.

I also needed a new approach, and decided to literally reverse engineer the human brain. If I could figure out how the structure, connectome, neurons, synapses, action potentials, etc. would 'have' to function in order to produce results similar to what we were producing on binary/ digital machines, it would be a start.

I have designed and written a 3D CAD suite, in which I can easily build and edit the 3D neural structures I'm testing. My AGI is based on biological systems; the AGI is not running on the digital computers per se (the brain is definitely not digital), it's running on the emulation/ wetware/ middleware. The AGI is a closed system; it can only experience its world/ environment through its own senses: stereo cameras, microphones, etc.

I have all the bits figured out and working individually, and have just started to combine them into a coherent system… I'm also building a sensory/ motorised torso (in my other spare time, lol) for it to reside in, and experience the world as it understands it.

I chose the visual cortex as a starting point: jump in at the deep end and sink or swim. I knew that most of the human cortex consists of repeated cortical columns, very similar in appearance, so if I could figure out the visual cortex I'd have a good starting point for the rest.

The required result and actual mammal visual cortex map.

This is real time development of a mammal like visual cortex map generated from a random neuron sheet using my neuron/ connectome design.

Over the years I have refined my connectome design; I now have one single system that can recognise verbal/ written speech, recognise objects/ faces, and learn at extremely accelerated rates (compared to us, anyway).

Recognising written words; notice the system can still read the words even when jumbled. This is because it's recognising the individual letters as well as the whole word.

Same network recognising objects.

And automatically mapping speech phonemes from the audio data streams, the overlaid colours show areas sensitive to each frequency.

The system is self-learning and automatically categorizes data depending on its physical properties. These are attention columns, naturally forming from the information coming from several other cortex areas; they represent similarity in the data streams.

I’ve done some work on emotions but this is still very much work in progress and extremely unpredictable.

Most of the above vids show small areas of cortex doing specific jobs; this is a view of the whole 'brain'. This is a 'young' starting connectome. Through experience, neurogenesis and sleep, neurons and synapses are added to areas requiring higher densities for better pattern matching, etc.

Resting frontal cortex - The machine is ‘sleeping’ but the high level networks driven by circadian rhythms are generating patterns throughout the whole cortex.  These patterns consist of fragments of knowledge and experiences as remembered by the system through its own senses.  Each pixel = one neuron.

And just for kicks, a fly-through of a connectome. The editor allows me to move through the system to trace and edit neuron/ synapse properties in real time... and it's fun.

Phew! Ok that gives a very rough history of progress. There are a few more vids on my Youtube pages.

Edit: Oh yeah, my definition of consciousness.

The beauty is that the emergent connectome defines both the structural hardware and the software.  The brain is more like a clockwork watch or a Babbage engine than a modern computer.  The design of a cog defines its functionality.  Data is not passed around within a watch, there is no software; but complex calculations are still achieved.  Each module does a specific job, and only when working as a whole can the full and correct function be realised. (Clockwork Intelligence: Korrelan 1998)

In my AGI model, experiences and knowledge are broken down into their base constituent facets and stored in specific areas of cortex, self-organised by their properties. As the cortex learns and develops there is usually just one small area of cortex that will respond to/ recognise one facet of the current experience frame. Areas of cortex arise covering complex concepts at various resolutions, and eventually all elements of experiences are covered by specific areas, similar to the alphabet encoding all words with just 26 letters. It's the recombining of these millions of areas that produces/ recognises an experience or knowledge.
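
A crude way to picture the 'alphabet of facets' idea in code (a toy illustration, not the actual connectome model): each experience is stored as the set of facet areas it activates, and recognition is recombination by overlap:

```python
# Toy version of the facet idea above: each experience is stored as the set
# of cortical "facet areas" it activates, and recognition is recombination
# by overlap. Illustrative only; not the actual connectome model.
FACETS = {
    "red ball":  {"colour:red", "shape:round", "size:small"},
    "stop sign": {"colour:red", "shape:octagon", "text:STOP"},
}

def recognise(active_facets: set) -> str:
    # The stored experience sharing the most active facet areas wins.
    return max(FACETS, key=lambda name: len(FACETS[name] & active_facets))

print(recognise({"colour:red", "shape:round"}))  # -> "red ball"
```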

Through experience, areas arise that even encode/ include the temporal aspects of an experience, simply because a temporal element was present in the experience, as well as the order/ sequence the temporal elements were received in.

Low level, low frequency circadian rhythm networks govern the overall activity (top down) like the conductor of an orchestra. Mid range frequency networks supply attention points/ areas where common parts of patterns clash on the cortex surface. These attention areas are basically the culmination of the system recognising similar temporal sequences in the incoming/ internal data streams or in its frames of 'thought'; at the simplest level they help guide the overall 'mental' pattern (subconscious); at the highest level they force the machine to focus on a particular salient 'thought'.

So everything coming into the system is mapped and learned by both the physical and temporal aspects of the experience.  As you can imagine there is no limit to the possible number of combinations that can form from the areas representing learned facets.

I have a schema for prediction in place so the system recognises ‘thought’ frames and then predicts which frame should come next according to what it’s experienced in the past.  
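
At its simplest, that prediction schema amounts to "given this thought frame, which frame usually follows?". A toy sketch of just that idea (again illustrative, not the real system):

```python
from collections import Counter, defaultdict

# Toy frame predictor: count observed frame transitions, then predict the
# most common successor. A stand-in for the schema described above, no more.
transitions = defaultdict(Counter)

def observe(sequence):
    for current, following in zip(sequence, sequence[1:]):
        transitions[current][following] += 1

def predict(frame):
    return transitions[frame].most_common(1)[0][0] if transitions[frame] else None

observe(["wake", "stretch", "crawl", "wall", "turn", "crawl"])
print(predict("crawl"))  # -> "wall", the most frequently observed successor
```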

I think consciousness is the overall ‘thought’ pattern phasing from one state of situation awareness to the next, guided by both the overall internal ‘personality’ pattern or ‘state of mind’ and the incoming sensory streams.  

I’ll use this thread to post new videos and progress reports as I slowly bring the system together.  

60 Comments | Started June 18, 2016, 08:59:04 PM


RoboBusiness 2016: Our takeaways in Robotics News

RoboBusiness 2016: Our takeaways
20 October 2016, 1:30 pm

Source: RoboBusiness

We're back from our yearly sync with the robotics industry at RoboBusiness. As usual, there were great talks and great people to meet. To be honest, I felt less frenzy this year than in the previous 2-3 events. Are people becoming more realistic about the promise of robots, or am I just becoming immune to the ambient buzz? A lot of local players were not at the event. Maybe they're just busy making their stuff work after a surge in investment, as shown in the Boston Consulting Group presentation.

Credit: Robotiq

What's different this year? Last year's trendy topic was logistics; there was still a decent amount of content about the topic this year. While connected robots and AI were touched upon in 2015, they were in every second session this year. In 2014, Jibo was announced, and this year we saw various similar ideas from companies during the Pitchfire. In 2013, telepresence was a hot topic; this year, it was totally off the radar (no Beams were running around).

Theme of the year: IoT. Just like at Automatica last summer, it almost felt like the robotics event was morphing into an IT event. So much so that Michael Loughlin — founder and CEO of Nelmia Robotics Insight — joked that it looked like manufacturers forgot that their job is to build stuff.

James Kuffner, from Toyota Research, started the show with his keynote on cloud robotics and how it's enabled by pervasive broadband communication and the public cloud. FANUC drew a lot of interest with their Zero Down Time (ZDT) system, which combines connected robots and business processes to make sure that all robots are repaired before they go down. Omron and Rethink were also there to present their vision of what it means to connect robots. It raised many interesting questions about data ownership, the role of the integration channel, and the many ecosystems that might arise from connecting robots.

Source: Robotiq

Close second: Artificial intelligence. Kuffner also touched on the topic in his keynote, explaining that now that there is plenty of powerful open-source AI, what you need is good data. Sachin Chitta from Kinema Systems was on the realistic side, explaining that deep learning in computer vision has a long way to go. We need to know where something is as much as what it is. The massive quantity of data required for applying deep-learning techniques is not easy to get (you can't scrape it off the web). On the medical side, Verb Surgical explained how they plan on using shared data and AI to help doctors operate, perceive and decide. Again, will they be able to get the amount of data that the new deep-learning algorithms require? If they can, they will be on their way to democratizing medicine, transforming the art apprentices learn from experts into a science purely based on data.

2016 Pitchfire. It definitely takes guts to get on stage and pitch your startup idea in 2 minutes to a panel of VCs and a crowd of attendees. Kudos to the ones who did it (Catalia Health, Parihug, Genesis Dimensions, Franklin Robotics, Kobi, Cubit, Endurance Robotics, Luxrobo, Iris Automation, qqqtech, Scanse, Rokid, Moti and Semio).

For the most part, the pitches were really well done. Kobi Company, with its lawn-mowing, snow-blowing, leaf-blowing robot, received first place. Coming from a place where we have a lot of snow, leaves and grass to mow, I was quite surprised by the winning idea, which to me did not seem realistic at all, both from a technology and a user standpoint. My bet would have been on Catalia Health. Let's see who will be more successful in three years.

For companies who wanted to meet VCs one-on-one (which might be wise), there were also VC hours where you could arrange meetings. The Pitchfire was sponsored by Siemens, which just launched its Next47 initiative, wanting to engage more with smaller innovative companies.

Robots in China: what's the real deal? China has been the talk of the robotics industry for a few years now, even more so recently with the acquisition of KUKA by Midea. There were two experts at the event to enlighten us about what's really happening with robotics in China: Tom Green from Robotics Trends and Georg Stieler from STM China.

Bottom line: Yes, there are a lot of robots sold in China, but it's not as easy as it seems, and it's not growing as fast as the rumors suggest (it's actually growing at 33% CAGR, according to STM). The market is different. The lack of integration know-how is slowing adoption. Companies are starting to understand that.

For instance, Bosch is conducting integration in China even though they don't do it in their other markets around the world. Georg had great insight, explaining that the local Chinese robot brands succeed in the simple use cases, but foreign robots are more popular in demanding robotics tasks requiring high precision and speed.

Cobots are getting mature. Collaborative robots are now past the hype, entering the plateau of productivity. I moderated a panel of three interesting speakers; we gathered our notes in this blog post.

Tim Kelch from JR Automation, a large system integrator, presented his view of cobots. The team at JR benchmarked several cobots and installed about a dozen power- and force-limited robots, and many more traditional robots in collaborative applications. Tim argued that their simplicity is oversold, showing several TED talks, articles and videos (including one from Robotiq) that were responsible for creating the hype and false expectations. Here are a few clarifications:

  • The collaborative robots that are simple to program are those from UR and Rethink.
  • KUKA, ABB and FANUC all have collaborative robots, but they are still programmed with conventional programming tools targeted at skilled integrators.
  • We never proclaimed that cobots were the most innovative robots, quite the opposite. As I mentioned in my presentation at RoboBusiness 2013, today's cobots are tools, not the most advanced autonomous robots, which will only become a reality in a few years.
  • Another thing we're pretty clear about is that the simple things are really simple with UR and Rethink, but the complex things are just as, and in some cases even more, complex than with traditional robots. That's one of the things that we explain in our Getting started with cobots eBook and talks.
  • One place where I do agree with Tim is that integrating the robots with different tooling and machines is where most of the challenge is today. That’s exactly why we work on making our products plug & play and why we created the DoF community.
Source: Robotiq

A new building block for industrial robots? The harmonic drive is pervasive in industrial robotics: Harmonic Drive makes the gear component you can find in most industrial robots, holds a quasi-monopoly, and represents a significant cost in most robots. That's why everybody was intrigued to see a potential alternative developed by SRI, only to realize that it has been licensed to… Harmonic Drive.

Does the robot industry have an innovation problem? Tony Melanson challenged the robotics industry in the audience, explaining that robots are not the end of the story. It's obvious that the industry is innovative technologically, making new things that were never possible in the past. But it's also true that the industry must put the robot in perspective: it's only one aspect of innovation, and innovation should cover other aspects of the business.

Source: Robotiq

Silicon Valley and robotics. I am always impressed with the robotics activity in the Valley. At the same time, I am doubtful of the big corporations who follow the herd and open an office there to invest in robotics. Yes, the ecosystem has a lot to offer, but the demand is also really high. In other words, it's really a red ocean of investors, where a few good ones get the best companies and all the rest get lower-quality ones. There are a lot of other untapped robotics clusters worldwide that offer great investment opportunities.

When I compare the environments, with where we are located in Quebec City, I estimate that it would have cost us 5x as much to operate Robotiq in Silicon Valley as in Canada. That means you need to get equivalent financing, growth and value increase to match it. We could definitely not have bootstrapped Robotiq there like we did here. Bootstrapping is not always the best way to go, but it should not be dismissed as a viable route if you're in it to create a strong company in the long run. As discussed with an investor at RoboBusiness, last year's most successful IPO was from a bootstrapped company.

RoboBusiness will stay in Silicon Valley next year, so save the date: Sep. 27-28, 2017.

Source: Robohub

To visit any links mentioned, please view the original article; the link is at the top of this post.

Started October 20, 2016, 04:48:06 PM

[Facebook Messenger] Soccer Fan Bot in Chatbots - English

This is a Facebook Messenger bot called Soccer Fan Bot. It can do 3 things:

- Updates you on the score of your team when you type "Update me on France", for example.

- Proposes three pictures of either a soccer player or a player's wife and asks you to guess which one corresponds to the proposed name. Just write "guess player" or "guess wife".

- Gives you a fact; just type "give me a fact".
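
A Messenger bot like this is essentially a command dispatcher over incoming text. A minimal sketch of that pattern (handler replies invented, not the bot's actual code):

```python
# Minimal command-routing sketch for the three commands listed above.
# Handler replies are stubs; this shows the pattern, not the bot's real code.
def handle_message(text: str) -> str:
    low = text.lower()
    if low.startswith("update me on "):
        team = text[len("update me on "):]
        return f"Latest score for {team}: (stub)"
    if low in ("guess player", "guess wife"):
        return "Here are 3 pictures... which one matches the name? (stub)"
    if low == "give me a fact":
        return "Did you know... (stub)"
    return 'Try "Update me on France", "guess player", "guess wife" or "give me a fact".'

print(handle_message("Update me on France"))
```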

Aug 17, 2016, 11:46:51 am

[Thai] BE (Buddhist Era) in Chatbots - Non English

BE was made with the Program O engine. Almost all of her knowledge is about Thailand and Thai people. She speaks only Thai.

Aug 17, 2016, 11:38:54 am

The World's End in Robots in Movies

The World's End is a 2013 British comic science fiction film directed by Edgar Wright, written by Wright and Simon Pegg, and starring Pegg, Nick Frost, Paddy Considine, Martin Freeman, Rosamund Pike and Eddie Marsan. The film follows a group of friends who discover an alien invasion during an epic pub crawl in their home town.

Gary King (Simon Pegg), a middle-aged alcoholic, tracks down his estranged schoolfriends and persuades them to complete "the Golden Mile", a pub crawl encompassing the 12 pubs of their hometown of Newton Haven. The group had previously attempted the crawl as teenagers in 1990 but failed to reach the final pub, The World's End.

Gary picks a fight with a teenager and knocks his head off, exposing a blue blood-like liquid and subsequently exposing him as an alien android. Gary's friends join him and fight more androids, whom they refer to as "blanks" to disguise what they are talking about.

May 31, 2016, 09:28:32 am

Botwiki.org Monthly Bot Challenge in Websites

Botwiki.org is a site for showcasing friendly, useful, artistic online bots, and our Monthly Bot Challenge is a recurring community event dedicated to making these kinds of bots.

Feb 25, 2016, 19:46:54 pm

From Movies to Reality: How Robots Are Revolutionizing Our World in Articles

Robots were once just a work of human imagination, found only in books and movies; we never thought a time would come when we would be able to interact with robots in the real world. Eventually, in fact rapidly, the innovations we only dreamt of have started becoming a reality. Quoting Stephen Hawking: "This is a glorious time to be alive for scientists." It is indeed the best of times, even as the technology has become so sophisticated that its growing power might endanger humanity.

Jan 26, 2016, 10:12:00 am

Uncanny in Robots in Movies

Uncanny is a 2015 American science fiction film directed by Matthew Leutwyler and based on a screenplay by Shahin Chandrasoma. It is about the world's first "perfect" artificial intelligence (David Clayton Rogers) that begins to exhibit startling and unnerving emergent behavior when a reporter (Lucy Griffiths) begins a relationship with the scientist (Mark Webber) who created it.

Jan 20, 2016, 13:09:41 pm

AI Virtual Pets in Other

Artificial life, also called ALife, is simply the simulation of any aspect of life, as through computers, robotics, or biochemistry (taken from The Free Dictionary). This site focuses on the software aspect of it.

Oct 03, 2015, 09:21:09 am

Why did HAL sing ‘Daisy’? in Articles

...a burning question posed by most people who have watched or read “2001: A Space Odyssey”: that is, why does the computer HAL-9000 sing the song ‘Daisy Bell’ as the astronaut Dave Bowman takes him apart?

Sep 04, 2015, 09:28:55 am

Humans in Robots on TV

Humans is a British-American science fiction television series. Written by the British team Sam Vincent and Jonathan Brackley, based on the award-winning Swedish science fiction drama Real Humans, the series explores the emotional impact of the blurring of the lines between humans and machines.

Aug 28, 2015, 09:13:37 am

Big Dog in Robotics

Billed as the most advanced quadruped robot on Earth, BigDog is the alpha male of the Boston Dynamics family of robots. It is a quadruped robot that walks, runs, and climbs on rough terrain and carries heavy loads. BigDog is powered by a gasoline engine that drives a hydraulic actuation system. BigDog's legs are articulated like an animal's, and have compliant elements that absorb shock and recycle energy from one step to the next. BigDog is the size of a large dog or small mule: about 1 meter long, 0.7 meters tall, and weighing 75 kg.

Aug 09, 2008, 20:02:17 pm