Robots Need To Learn About Ethics, Morality
Robots need to learn not only technical skills but also ethics and morality. Experts say that ethical grounding should come before any other skills are developed.
The use of robots in people's lives has become increasingly popular. In fact, many people worry that one day these machines will take over their jobs, or even the planet.
In 1942, Isaac Asimov wrote the "Three Laws of Robotics," which outline moral guidelines that robots should live by. A robot may not injure a human being or, through inaction, allow a human to come to harm. A robot must obey orders given by humans, except when such orders would cause another human to be hurt. A robot must also protect its own existence, as long as doing so does not break the first and second laws.
Today, the British Standards Institution has also released guidelines that robot designers should follow to build robots with ethics and morality in mind. On a more complex level, there is also the question of whether it is ethical for robots to form emotional bonds with humans. Some robots are designed to interact with children and the elderly, who are more emotionally vulnerable.
According to Daniel Glaser, director of Science Gallery at King's College London, robot makers should learn about ethics and morality from human development. Children are taught about morality even before they begin to learn math.
Children are taught language skills and complex reasoning only after they learn how to behave in social situations. Likewise, bomb-sniffing dogs can be trained only after they learn how to sit. According to Glaser, robots should be taught in the same manner.
If people are serious about making robots more like humans, designers need to focus on ethics and morality first. These must be built into the machine's core; they cannot be retrofitted.