Recent Posts

General Robotics Talk / Re: Robotic / tech bandwagons
« Last post by Freddy on Today at 04:56:45 PM »
Yes, the majority of these devices wouldn't be there if it wasn't for crowdfunding - which is a good and bad thing. I'm guessing the developers have gone the traditional route of funding without success. Does this let them escape things like being ready for market, and a peer review of whether the product is actually going to sell? Again, could be good or bad.

Seems to me a lot of crowdfunded projects fail. I've supported two; one was for a year's development costs and, five years or more later, it has still not hit the shelf. The other, a VR product, was successful and I was pleased with it.

I agree that for me these assistants and the like really don't fill any gap that I can't do myself at the moment.
Robotics News / CES 2017, part one: Robocar technology and concept cars
« Last post by Tyler on Today at 04:48:17 PM »
CES 2017, part one: Robocar technology and concept cars
17 January 2017, 6:45 pm


CES is the big event for major car makers to show off robocar technology. Most of the north hall, and a giant parking lot next to it, were devoted to car technology and self-driving demos.


Gallery of CES comments

Earlier I posted about many of the pre-CES announcements, and it turns out there were not too many extra events during the show. I went to visit many of the booths and demos and prepared some photo galleries. The first is my gallery on cars. Each picture in the gallery has a caption, so you need to page through them to see the commentary below each photo. Just 3 of the many photos are in this post.

BMW’s concept car starts to express the idea of an ultimate non-driving machine. Inside, you see that the back seat has a bookshelf in it. Chances are you will just use your eReader, but this expresses an important message — that the car of the future will be more like a living, playing or working space than a transportation space.

Nissan

The main announcement during the show was from Nissan, which outlined its plans and revealed some concept cars you will see in the gallery. The primary demo involved integration of technology worked on by the leader of Nissan’s Silicon Valley lab, Maarten Sierhuis, in his prior role at NASA. Nissan is located close to NASA Ames (I myself work at Singularity University on the NASA grounds) and did testing there.

Their demo showed an ability to ask a remote control center to assist a car with a situation it doesn’t understand. When the car sees something it can’t handle, it stops or pulls over, and people in the remote call center can draw a path on their console to tell the car where to go instead. For example, an operator can draw how to get around an obstacle, take a detour, or obey somebody directing traffic. If the same problem happens again and the path has been approved, the next car can use it as long as it remains clear.
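The escalate-then-cache flow described above can be sketched in a few lines. This is purely illustrative — the names (`cache`, `request_operator_path`, `path_is_clear`) are invented for the sketch and are not Nissan's API: a car that hits a scene it can't handle pulls over, asks a human operator for a drawn path, and approved paths are cached so following cars can reuse them while they remain clear.

```python
# Hypothetical sketch of a remote-assist fallback for an autonomous car.
# All names are illustrative, not any vendor's actual interface.

cache = {}  # scene fingerprint -> previously approved operator-drawn path

def handle_unrecognized_scene(scene_id, request_operator_path, path_is_clear):
    """Return a safe path for a scene the onboard planner could not handle."""
    path = cache.get(scene_id)
    if path is not None and path_is_clear(path):
        return path                          # reuse the approved workaround
    path = request_operator_path(scene_id)   # human operator draws a detour
    cache[scene_id] = path                   # keep it for the next car
    return path
```

The key property is that the (slow, human) operator is only consulted once per recurring obstruction; subsequent cars take the cached path after a cheap validity check.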

I have seen this technology a number of places before, including of course the Mars rovers, and we use something like it at Starship Technologies for our delivery robots. This is the first deployment by a major automaker.

Nissan also committed to deployment in early 2020 as they have before — but now it’s closer.


You can also see Nissan’s more unusual concepts, with tiny sensor pods instead of side-view mirrors, and steering wheels that fold up.

Startups

Several startups were present. One is AIMotive, from Hungary. They gave me a demo ride in their test car. They are building a complete software suite, primarily using cameras and radar but also able to use LIDAR. They are working to sell it to automotive OEMs and already work with Volvo on DriveMe. The system uses neural networks for perception, but more traditional coding for path planning and other functions. It wasn’t too fond of Las Vegas roads, because lane markers are not painted there — lanes are divided only with Botts’ dots. But it was still able to drive by finding the edge of the road. They claim they now have 120 engineers working on self-driving systems in Hungary.

Startup NAuto showed their dashcam system, currently sold for monitoring drivers. They plan to use the data gathered by it to help people train self-driving systems.

RobotTuner in the Netherlands has built a simulated environment for testing and development of cars. They offered to let people race it like a video game. The sample car was a neural network model — you can’t really train such models in simulation, or you will train on the artifacts of the simulation — but you can use them to test out your simulation.

Civil Maps, only recently funded, was out pushing their new mapping database, built by using neural network techniques and people’s laser and video scans of the road.

Navya continues to push and deploy in more places in the shuttle market. They gave rides as well.

I didn’t take most of the rides. Turns out the rides are getting dull. If the system works well, the ride is boring. I see the wheel move and I see a screen where the car is drawing boxes around the cars and other obstacles I see. If they have designed the demo well it is not making too many mistakes.


Faraday Future finally showed a production car. Press coverage has been mixed — it’s not shipping yet, and it has impressive specs, but they don’t seem to be that much better than the Tesla’s. One cute feature was a pop-up 360-degree LIDAR in the hood. Frankly, the hood is not a great place to put such a LIDAR; it only sees forward and to the sides. But it looks sleek, as though looking sleek is important in first-generation cars.

FF has talked about some interesting self-driving plans, but so far that has taken a backseat (!) to their push as a high-end EV vendor to compete with Tesla. Let’s hope we hear more in the future of Faraday Future.

There were several people promoting new LIDARs at low prices. These LIDARs use MEMS mirrors — the same tool used in DLP projectors — and will thus have no large moving parts. These vendors claimed prices and capabilities to compete with Quanergy’s solid-state units. Two of the vendors claimed 200m of range. That’s either revolutionary technology or a lie. LIDARs in the near-infrared just can’t emit enough light to be eye safe, have it bounce off a black car 200m away, and come back to the sensor bright enough to be visible against sunlight. You would need a whole new level of sensitivity — and today’s LIDARs already use single-photon detectors — or a way to tune out noise. Perhaps they have one, but it would be big news. You can see a white car at 200m, which may be what they are promoting, but a LIDAR that only sees white cars is not really good enough.

But the world does need to see 200m. Today’s LIDARs don’t do that, and highway speeds demand being able to see stalled objects (which radar is not good at detecting) 200m out.
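A back-of-the-envelope calculation shows why the 200m claim deserves skepticism. For a diffuse (Lambertian) target that fills the beam, received power falls off roughly as reflectivity divided by range squared. The reflectivity figures below (~80% for white paint, ~5% for black paint) are textbook assumptions for this sketch, not measurements of any vendor's unit:

```python
# First-order LIDAR link budget: relative return signal versus a
# baseline of a white car (80% reflective) seen at 100 m.
# Assumes a Lambertian target filling the beam; illustrative only.

def relative_return(reflectivity, range_m,
                    ref_reflectivity=0.80, ref_range=100.0):
    """Received power relative to the white-car-at-100m baseline."""
    return (reflectivity / ref_reflectivity) * (ref_range / range_m) ** 2

white_at_200 = relative_return(0.80, 200.0)  # 0.25x the baseline signal
black_at_200 = relative_return(0.05, 200.0)  # 0.015625x the baseline

print(white_at_200, black_at_200)
```

Under these assumptions the black-car return at 200m is 16 times weaker than the white-car return at the same range, which is why a range spec without a reflectivity caveat tells you very little.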

Delphi

I was impressed with Delphi’s work and attitudes. Their new test cars include even more sensors, including MobilEye. The OEMs that fail to produce their own internal self-driving systems will be going to companies like Delphi to get such systems.

Delphi believes, as I do, that the big market will be in cars for robotaxi (mobility on demand) services, rather than high-end luxury cars for individual owners that feature self-drive systems. Certainly, with services like Uber being as hot as they are, robotaxi services will be accessible to a much larger segment of the population. Most car companies focus on making cars with added self-driving at the high end, and such cars will sell — but the robotaxis will drive more miles and thus sell more.

Concepts

There were a lot of concept cars. As usual, most of them were silly — a lot of money spent to make the company look cool with something that will never actually ship.

Popular choices were very sparse and open interiors. Several makers also showed steering wheels that could fold away or hide. I think that’s a good thing to make, though it won’t be the fancy multi-motor pop-outs we see here. Once you stop driving except in special situations, you might be fine with a much simpler driving interface, like handlebars (which can pop out of a dashboard more easily) or even a wheel you pull out of a shelf and plug in. Eventually it won’t have any mechanical linkage; it will have a drive-by-wire linkage that can bypass the self-drive systems in the event of total system failure.

Also super popular — see the BMW — are panels to hide the wheels. This can reduce drag but there are a lot of problems with doing it, especially on real roads. It is done to look futuristic.

Many concepts showed small sticks with sensors instead of side-view mirrors. Today, the law mandates side-view mirrors, but soon it will allow the use of cameras which show on a screen inside the car. Car makers want to do this because it reduces drag and they have to meet their fuel efficiency targets. Everybody also loves unusual doors, which either open upwards or are in the suicide door configuration.

I suspect all these fancy add-ons may appear in high-end cars, but not in robotaxis, where you want low costs. The convenience of electric doors (like in minivans today) may be popular for taxis, so we’ll probably see that. And there’s no need for individual doors: one-person vehicles will be meant for one person, and 4-person vehicles will be meant for 3 or 4, who all get in at the same time.

For more photos and commentary, check out my gallery on cars and look to a later posting of my gallery on the hype over “smart” and “connected” devices in the internet-o-thingies.

Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.
General Chat / Re: Sinclair Computer
« Last post by Freddy on Today at 04:33:15 PM »
I still have two of the originals with various peripherals - they were still working the last time I tried them.
General Chat / Re: Commuter traffic have you at your wits end?
« Last post by Art on Today at 02:27:57 PM »
Yeah, funny... look how much everything was impacted once those "drones" (remote-controlled quadcopters) hit the mass marketplace. All of a sudden, everyone who bought one abandoned common sense, flying them over public places, homes, stadiums, airports and government facilities. Yes, stupid!

Those actions basically ruined things for everyone else who had been flying legitimately for years at publicly sanctioned RC fields. The FAA got involved, having to establish tighter controls and guidelines along with severe fines and penalties for violators. They also established an RC registry for anyone flying an RC device weighing more than 1/2 pound.

Just imagine what would happen if those same "happy to have one" people purchased and began piloting a flying car!
Who knows what crazy stunts they'd pull, like flying over the Super Bowl or Wimbledon to snap a selfie above the venue. Not to mention the airborne fender benders and a newly established AAA (Automobile Aeronautics Administration) or something equally weird.

It only gets crazier from here on. O0
General Chat / Re: Sinclair Computer
« Last post by korrelan on Today at 02:07:31 PM »
Looks cool, I had a Speccy too.  I'm at a loss as to why they're bringing it out though.

There are loads of good Spectrum emulators for PCs and Android phones/tablets.

General AI Discussion / Re: Existential threat by AI
« Last post by Data on Today at 01:28:03 PM »
Because you'll have to wait till the proof is ready.

Looking forward to that, but please keep it real, true and factual.
Catalia Health uses social robots to improve health outcomes
17 January 2017, 10:18 am

Credit: Catalia Health

Catalia Health is leading the surge in social robotics with Mabu, their patient care management system. Catalia Health likes to be seen primarily as a health company that utilizes robots, rather than a robotics company. This focus on solving real-world problems while shipping a product has seen Catalia attract both customers and investors, and recently close their Series A round.


Interview with Cory Kidd, Founder & CEO of Catalia Health

(edited for clarity)

What is Catalia Health?

Catalia Health is a patient care management company. We focus on helping patients adhere to their treatment, whether that be taking medication, or managing chronic disease over the long term. That’s the focus of what we do, and part of how we deliver this to patients is through a cute little robot called Mabu who engages with patients through conversation. She’s a little over a foot tall, and can sit wherever you want to put her … on a countertop or bedside table … and she has big eyes that make eye contact with you while you’re talking to her. Conversations with her might last a minute or two, or maybe five or ten minutes; it really depends on the individual patient and what they want to talk about.


Mabu has a touch screen on the front that she can use to display information, but our overall focus is to create an engaging relationship between the technology and the patient. The reason that we use the robot — as opposed to just delivering this through a phone screen or a tablet or PC — is about psychology and not about technology. When we are in front of a robot that has eyes that can look at us and blink, we tend to be more engaged, and we find the robot to be more credible and informative than if the same information were delivered to us through an app. While we have a lot of healthcare applications that we’re looking to build, the core of this is really just basic psychology: how can we create engagement that lasts for a long time? Psychologists have studied the benefits of face-to-face communication for decades.

Is speech the primary interaction that people have with Mabu?

Our platform’s primary means of interaction is conversation, but this can happen in more than one way. For example, when Mabu is talking, she also displays what she is saying on her screen, to make it easy for anyone to understand what’s going on. And when I reply, I can speak back to her, or I can touch a button or location on the screen. And if I’m not at home, I can also get a reminder via text message … in the future this might happen through an app or other desktop interface.

The physical robot is the thing that’s creating the engagement — the relationship — but we can interact with people through other forms of technology as well.

Does the conversation with Mabu end at home? How is information transferred to the healthcare provider?

We do send information summaries back to health care providers — a pharmacist or physician or some other caregiver — but the overall problem we are trying to help with is that the healthcare system simply doesn’t have enough people to manage chronic disease at scale. So while our technology might also enable tele-operation or tele-presence, the focus of our business is to be able to carry out autonomous one-on-one interactions with patients in real time.

Is the patient, or end user, your customer?


Patients get a lot of benefit from our platform, but they are not the ones who are paying for it.

Our direct customers are pharmaceutical manufacturers and healthcare providers. They provide programs to help patients be more effective at taking their medications and managing their conditions, so in their eyes we are another tool in their arsenal.

Can you tell us about your first deployments?

The places where we are rolling out first are where there are existing care management programs already in place, and these tend to be in areas such as oncology and immunology where higher-end drugs are being used. Talking robots are very new and different, so we wanted our contract structure to look as similar as possible to existing offerings. These were the areas where there were already contract types that we could follow into market. We have been rolling out the first several hundred units in the first half of 2016 and are bringing patients onto the platform by the end of this year.

What is your business model? Is it “Robots as a Service”?

In terms of the patient relationship, our robot is key. But in terms of our business model and contracts, we don’t think of our robot as the key piece of what we’re delivering. We use a service model for care management; our customers pay us on a per patient per month basis.

What does interaction with your service look like from the patient’s perspective?

If you want to see what the patient interaction with Mabu actually looks like, we have a short video at

Once the patient plugs Mabu in, the robot comes alive and starts talking. The conversation starts off with greetings and small talk (such as “Good morning, great to see you!”) and then moves on to whatever issue is relevant to the patient at that point in time. Maybe this is simply to check in on whether the patient has taken their medication, or maybe the patient is at a point in their treatment where it’s common to experience certain side effects, and the conversation is about how best to mitigate those for that patient. It really depends on the particular condition or treatment the patient is dealing with. We do a lot of research on each condition before rolling the platform out to patients, in order to build an understanding of common treatment challenges into the application.

In the background, the conversation is being crafted in real time for that patient.


When Mabu first comes out of the box, we know a little about the patient’s medical condition — perhaps what drugs they are on — but we don’t know much else. So from that very first conversation we start learning about and adapting to the patient’s individual personality and the treatment issues they are facing. Mabu largely directs the conversations, but the patient has a lot of say in terms of where that conversation goes. As we build more conversations and more AI into the platform, we are able to craft appropriate conversations for the patient.

This will very quickly become applicable to a lot more drugs and a lot more disease states. Let’s look at side effects, for example. Our first conversations about side effects will be new, but there are many common side effects among drugs. So while we are starting out in just a handful of areas, our goal is to help any patient who’s dealing with a condition on an ongoing basis to better manage their care, and to provide information back to their caregivers so that they can be more effective in supporting them.

Is it valid to be concerned about robots being used to replace human companionship?

We certainly don’t think of this as robots replacing people; we think of it as robots augmenting people.

One of the big challenges in healthcare today is that there are not enough caregivers to deliver healthcare the way we need it. Almost half our population is managing a chronic disease in this country, and there are very similar rates in advanced nations around the world; if we look at the rate of people dealing with health issues on an ongoing basis, it approaches 2/3 to 3/4 of the population.

People might get to see their doctor for fifteen minutes every two months, but that’s not much time, and it’s not an effective way to provide the ongoing care that is needed. We simply don’t have enough people to manage healthcare the way we did 50 or 100 years ago.


Patients need reminders, and they need answers to all the little questions that come up — and that’s where technology like this comes in. We see our service as a way for the people who are providing health care — doctors, nurses, and other trained caregivers — to more effectively reach a larger group of patients. We are not trying to be people’s doctors, we are trying to help their doctor do a much more effective job.

What kind of feedback have you received so far?

Broadly speaking, the feedback has been very positive. People tend to like the interaction right from the very first conversation, and they like how Mabu adapts to them.

We have a great solution that we’ve shown can effectively help many patients, but we still have a lot to learn. We are really excited about the amount of data that we’re going to be getting back from hundreds of person-months of interaction with our platform this year, and how we’re going to use that to improve conversations and personalize them to every patient.


Thanks to social platforms like Siri, Jibo and Amazon’s Echo, we are starting to get used to having conversations with our devices. But you’ve taken a very specialized path into the market. Why did you pick this pathway and business model?

Scalability — being able to provide care to a growing number of patients — is a big challenge in health care. I spent about a year before launching Catalia Health really digging into the US healthcare market to explore the business opportunities. We were thinking broadly around medication adherence and chronic disease management, talking to potential customers and trying to understand where there was a need for this kind of technology. The quick answer was that it is needed pretty much everywhere within the healthcare system. The question of how to provide healthcare in a cost-effective and scalable way is definitely a challenge here in the US, and also in most other nations in the world. We see an enormous opportunity for using technology to provide scalable personalized care.

Of all the robotics and AI movies that have come out in the past five or ten years, Robot and Frank offers the vision that comes closest to what we’re doing. The goal of the robot in that movie was to help Frank live healthier by building a relationship with him. We have the same underlying premise in what we’re doing: our technology is focused on building a relationship with the patient, because once we can do that, then we can talk to them about their health care. By comparison, usage rates on healthcare apps are incredibly low; most patients don’t pick them up after the first or second try. But as it turns out, there are particular psychological aspects of how people interact with robots that make them really effective at helping to solve this challenge.

Do you see ways that other robotics companies can leverage what you’ve learned so far?

The broad lesson is to understand where there is a real human or business need. Asking “Where is there a problem that I can solve?” rather than asking “Where can I build a robot?” or “What market can I serve?”

It’s also important to understand what the existing marketplace looks like for those kinds of solutions right now, because the solution today may look very different. Our robot is an alternative to talking to a pharmacist on the phone, and it’s a very different solution, but understanding what the business model is for that kind of service, how those contracts work, who the players are in the space — I think that’s something that any company would be smart to take a look at and understand deeply before trying to compete in those markets.

You’re tackling one of the largest growing areas of our economy, and you’re doing it with a combination of data, AI and robotics. What do you think has changed in the past couple of years to make robotics a viable solution to a broader range of applications?

One of the biggest changes has been in the cost of building both hardware and software. Our robot is pretty simple; we’re not doing anything cutting edge in terms of the physical device that we’re building. But ten years ago producing our device might have cost 100 times what it does today, and that would have limited us to a small set of business models and it would have been very hard to make money.

With the cost of building the technology drastically lowered, it has enabled us to do something very different today than what we could have done five years ago. Today we can build cutting edge technology at a reasonable price point and therefore deliver a cost effective solution.



Catalia Health is a patient health management system using social robotics. Founded in 2013 by Dr. Cory Kidd, Catalia Health builds on years of research into Human-Robot Interaction starting at MIT’s Media Lab and continuing with social robot startups like Intuitive Automata. In June 2015, Khosla Ventures led a $1.25 million seed round in Catalia Health for the first trial customer engagements. Catalia Health is on a mission to address both sides of the healthcare equation: improving patients’ health and extending the capabilities and efficiency of healthcare companies.


Silicon Valley Robotics is the industry group for robotics and AI companies in the Greater San Francisco Bay Area, and is a not-for-profit (501c6) that supports innovation and commercialization of robotics technologies. We host the Silicon Valley Robot Block Party, networking events, investor forums, a directory, and a jobs board, and we provide additional services and information for members, such as these reports.

We’ll be releasing additional essays from the reports every week or so. You can read full reports by visiting the website.


Source: Robohub

General Chat / Re: Commuter traffic have you at your wits end?
« Last post by korrelan on Today at 10:14:34 AM »
I like how the site mentions ‘Be a part of our team! We’re hiring’.. and also probably dropping too lol. 

Probably running short on test pilots.

Nice looking vehicle but I don’t understand why they have put the ‘pedestrian clearing device’ at the back… seems odd to have to reverse to utilise this asset.

General AI Discussion / Existential threat by AI
« Last post by LOCKSUIT on Today at 08:35:44 AM »
I didn't find a thread on the forum dedicated to the existential threat of AI, so I made this thread.

There are books out there that talk about AI and the existential threat from AI. They even say they should watch AI forums and nationalize initiated projects. And there's only like 3 English-speaking AI forums, 2 truly. Stephen Hawking and Elon Musk could be watching this very thread! Right? They sound really worried in their books, on the news, and on their dedicated website. Yous are going to be in for a treat by me sooner or later wahaha!!!! I also know deep extensive supercapabilities they'll have. And how to make terminator really work. But don't worry. Sit back and relax. Because you'll have to wait till the proof is ready.
EU needs to take the lead on regulating robots and artificial intelligence, MEPs suggest
16 January 2017, 4:59 pm


EU rules for the fast-evolving field of robotics, for example, compliance with ethical standards and liability for accidents involving driverless cars, should be put forward by the EU Commission, urged the Legal Affairs Committee.

Rapporteur Mady Delvaux (S&D, LU) said: “A growing number of areas of our daily lives are increasingly affected by robotics. In order to address this reality and to ensure that robots are and will remain in the service of humans, we urgently need to create a robust European legal framework”. Her report, approved by 17 votes to 2, with 2 abstentions, looks at robotics-related issues such as liability, safety and changes in the labour market.

MEPs stress that EU-wide rules are needed to fully exploit the economic potential of robotics and artificial intelligence and guarantee a standard level of safety and security. The EU needs to take the lead on regulatory standards, so as not to be forced to follow those set by third states, argues the report.

A new European Agency for robotics and a Code of Ethical Conduct

MEPs urge the Commission to consider creating a European agency for robotics and artificial intelligence to supply public authorities with technical, ethical and regulatory expertise.

They also propose a voluntary ethical conduct code to regulate who would be accountable for the social, environmental and human health impacts of robotics and ensure that they operate in accordance with legal, safety and ethical standards.

For example, this code should recommend that robot designers include “kill” switches so that robots can be turned off in emergencies, they add.

Liability rules

MEPs note that harmonised rules are urgently needed, especially for self-driving cars. They call for an obligatory insurance scheme and a fund to ensure victims are fully compensated in cases of accidents caused by driverless cars.

In the long term, the possibility of creating a specific legal status of “electronic persons” for the most sophisticated autonomous robots, so as to clarify responsibility in cases of damage, should also be considered, MEPs say.

Social impact

The development of robotics could also result in big societal changes, including the creation and loss of jobs in certain fields, says the text. It urges the Commission to follow these trends closely, including new employment models and the viability of the current tax and social system for robotics.

Request for legislation

This legislative initiative invites the Commission to present a legislative proposal. It is not obliged to do so but must state its reasons if it refuses.

The full house will vote on draft proposals in February and needs an absolute majority.


Source: Robohub
