A report on the ELS Workshop 2016
6 January 2017, 8:31 pm
A robot hands a medication bottle to a person. Photo credit: Keith Bujak. Source: Georgia Tech News Center

The Ethical, Legal and Social Aspects of Social Robots in Healthcare and Education Workshop (also called the ELS Workshop) was held in Yokohama on 14 November 2016 during the JSAI-isAI Conference. The workshop was twinned with another workshop held at the New Friends Conference in Barcelona on the 2nd.
The workshop aimed to address, in a constructive and proactive manner, the concerns that the use of social robots in therapy and education poses in the legal, ethical and societal (ELS) domains. It tackled the ELS challenges that arise from the introduction of robots in therapy and education. A booklet is available to anyone who wishes to find out more, and can be found here.
The ELS Workshop first dealt with aspects of human dignity and the questions of how social robots might endanger an individual’s dignity, for example, by implying an overly simplistic model of human agency or by discriminating against certain demographics. Secondly, the workshop revolved around the topic of privacy. Third, the topic of liability was addressed.
These three overarching ELS issues – dignity, privacy, and liability – were discussed in an open discussion workshop format. The workshop focussed on defining the benefits, conflicts, and possible solutions to the conflicts that such new technologies pose. Researchers from all disciplines, as well as practitioners, actively engaged in the discussions – in particular, those participants who had submitted an abstract for the workshop.

Main findings
Here follows a summary of the issues raised by, and the findings of, the workshop.

Challenges and issues of social robots in therapy and education
Addressing the Challenges / Finding Solutions
- Public participation: Users are not taken into account during the design process of robots, which is why “user-centered” design is, at the current stage, not a true claim.
- Public awareness: Connected with the user-centered idea, citizens’ opinions are not taken into account during the policy-making process. At the same time, however, it seems that people do not read government websites and are not very engaged with the government. Some participants mentioned that in Korea individual privacy is not perceived to have much value and that the government does not promote public awareness of it.
- Public security: It was explained in the workshop that Korea faces many security issues, and a case concerning a data leak was described. If robots are to proliferate in the marketplace, it will be crucial to address all these security issues, especially where elderly and disabled people are involved.
- Privacy postmortem: Should informed consent cover privacy after a person’s death?
- Legal management: The case of the Henna Hotel was introduced as an example of how robotic technology is already on the market. While industry has already taken steps towards introducing robotic technology to the market, there has been no accompanying legal framework, especially in the case of educational and therapeutic robots.
- Standards: Due to this lack of legal management, it is not clear what standards apply to educational and therapeutic robots; such standards simply do not exist in the industry.
- Legal uncertainty: Unless a problem arises, the law does not provide any proactive measures to ensure the correct design and use of robot technology.
- Immigrant discrimination: Japan is investing in robotic technology instead of hiring foreigners, due to language difficulties as well as cultural differences. Elderly people prefer to be cared for by Japanese caregivers or by robot technology rather than by immigrants.
- Employment of robot technology: This discriminatory scenario is aggravated by the low wages Japanese caregivers receive.
- Dehumanizing practice: In Japan, it is normally the daughter or daughter-in-law who takes care of the mother or grandmother. Dehumanizing the practice of care could give the caregiver (in this case a relative) more free time, but could also reduce human-human interaction for the care receiver.
- Multiple legislations: There is an urgent need to address the ELS issues if this robotic technology is going to be shipped all over the world. Compliance with each national legislation will bring about many conflicts that challenge the introduction of these robots to the market.
- Multiple robots: Each robot is different, as is the context in which it is deployed. It is therefore difficult to standardize case-by-case scenarios.
- Substitution of humans: Some participants wondered why humans need to surrender to the robot take-over scenario, and who is in charge of deciding this. This raised a further issue: technology is not people-driven.
- Robot obligations: If academia is discussing the possibility of granting robots rights, this highlights the importance of also deciding their obligations.
- Unexpected consequences: Some of the robots that are entering the market have not been tested in real contexts and unexpected consequences can easily arise.
- Agency: Connected with the unexpected consequences, it is not clear whether agency comes from the context or from the designers. Who is entitled to decide the agency of a robot?
- New agency: The literature has compared robots to animals, corporations and other entities – should the robot be considered a new kind of agent?
- Religion: Agency is perceived differently depending on religion. Comments were made on how inanimate things can have spirit, and on how Hinduism also grants importance to low-agency objects, e.g. a book must not be touched with the feet. What is not clear is whether this is a matter of religion or of the individual.
- Human behavior: An experiment in Japan observed children beating a robot in a mall. This experiment was discussed in the workshop in order to avoid such behavior being replicated in human-robot interaction.
- Education: Not many schools have access to robotic technology, although it seems important to educate the population in the correct use of robotics.
- New law: Do robots challenge the legal system in a different manner than other agents do?
Main Issues (Recurring Topics)
- Legal certainty: It is important to have knowledge of domestic laws. A concept similar to “regulation by design” should be implemented.
- Regulatory prohibition: Depending on their intended use, the production of certain robots should be banned to avoid unfortunate scenarios. The majority agreed that military robots should be banned.
- Public participation: The creation of a random jury, as in the legal system, could help involve the general public in the decision-making process as well as in the design of robot technology. Information and materials could be given to citizens, who could then provide feedback.
- Employment of technology: Increasing workers’ salaries could make the caregiving profession more attractive and increase efficiency. Working fewer hours can also increase efficiency, although it is not clear how this could be modeled for care professions.
- Unexpected consequences: If we could predict and model an agent’s behavior, we could avoid unexpected consequences. Building strategies into the robot itself could also help.
- Education: It seems very difficult to teach creativity in Japan, where the education system is based on memorization. Addressing this could help the acceptance of robotic technology as well as promote active participation in the design process of robots.
- Transparency: The robot should be able to show or explain how it uses personal data, in order to promote transparency and user awareness. The use of black boxes could provide more information on robot usage. It is not clear, however, to what extent the government should have access to them.
- Regulated-by-design: This hybrid regulatory model should be embedded into the system to avoid violations of domestic and international laws.
- Pro-activity: The system of Tokku in Japan should be spread all over the world in order to establish testing zones for regulatory purposes.
- Regulatory model: We need a hybrid model that can cope with robotic technologies as well as with other technologies, as the delivery of legal counseling is changing rapidly – from divorce apps to chatbots that deal with parking tickets.
- Regulatory model: Connected to this idea, each project involving robotic technology could be designed in a way that allows knowledge to be extracted from it for policy purposes.
- Case-by-case: The regulatory model should be created in a way that can cope with the particular needs of every robot and every context in which it is deployed, as not all robots possess the same degree of intelligence or the same capabilities.
- Robot Agency: The creation of an agency that deals with robots, as exists in Japan, could help achieve better management of all the ELS issues.
- Open international discussions: There is a need to open the debate to more people and to more countries.
- Replacing humans: Robots should be conceived as helpers of humans, not as replacements for them.
- Clarity in the classification of capabilities could bring intelligibility to what is/should be permitted and what should not.
- This clarity also relates to the visions and expectations of different countries regarding robotic technology, e.g. international future robotic trends.
- Decision-making of the robot: In the case of an accident, everyone could be assigned different values/importance so that the robot can choose the best target. At the same time, this could raise discrimination issues.
- Harmonization: There is a need to harmonize what impact means, what risk is, and what the legal issues concerning this type of robotic technology are. There is also a need for a common understanding.
- Alternative solutions: If the technology allows, the robot should be made privacy-friendly using other technologies, e.g. removing the cameras in the case of drones, or using floor vibrations to detect that a patient has fallen.
- Privacy: A system similar to the shutter noise that cameras in Korea make when taking a picture could be adopted in order to raise awareness among users. The problem with this solution is that apps are available that allow users to take pictures silently.
- Ethics could be addressed by common-sense rules, although common sense varies between countries and individuals.
- The creation of a data bank for scientific innovations could help avoid unauthorized uses of data.
- An international expert committee could be created to draft group reports.
- Regulatory model: There are no clear rules on the use of robotic technology in therapy and education. The law is not pragmatic.
- All parties want to avoid responsibility for the occurrence of harm.
- There is a growing need to address ELS issues when human-robot interaction involves vulnerable parts of society.
- Solutions: Each solution can bring about negative consequences. Therefore, an analysis of all the pros and cons will have to be made in order to mitigate any risk posed by the solutions.
- The future of work (involving the promotion of co-working spaces)
- Third-party uses of data (involving post-mortem privacy)
- The need for an international set of rules and basic principles.
- Robots will change the way we conceive this world.
- The legal system needs to:
- be disrupted
- be pro-active
- be dynamic
- be more accessible
- There is the need to run more ELS workshops in different countries with different target participants: industry, laymen, legal scholars, etc.
- A great effort has to be made to change how people learn, as everything in this world is converging. “People” here includes persons of any age.
- There is a need to share knowledge of robotic technology among legal scholars.
- A clear classification of the capabilities of robots (educational, therapeutic) could be of help.
- There is a need to speak with major organizations, e.g. the WTO, about this topic.
- We need an international legal framework that can cope with all these ELS issues in the use of robotic technology, especially where robots work with vulnerable parts of society.
- The idea should be established that robotic technology is a supplement to human work.
To visit any links mentioned please view the original article, the link is at the top of this post.