Let’s stop talking about bad robots and start talking about what makes a robot good. A good or ethical robot must be carefully designed. Good robot design is about much more than just the physical robot; at the same time, it is about ‘less’. Less means no extra features, and in robotics that includes not adding unnecessary interactions. It may sound like a joke, but humanoids are not always the best robots.
‘Less’ is the closing principle of the “10 principles of good design” from world-famous industrial designer Dieter Rams, and design thinking has informed the discussion around guidelines for good robot design, as ethicists, philosophers, lawyers, designers and roboticists try to proactively create the best possible robots for the 21st century.
Silicon Valley Robotics has launched a Good Robot Design Council and our “5 Laws of Robotics” are:
1. Robots should not be designed as weapons.
2. Robots should comply with existing law, including privacy.
3. Robots are products: as such, they should be safe and reliable, and should not misrepresent their capabilities.
4. Robots are manufactured artifacts: the illusion of emotions and agency should not be used to exploit vulnerable users.
5. It should be possible to find out who is responsible for any robot.
These have been adapted from the EPSRC 2010 “Principles of Robotics”, and we warmly thank all the researchers and practitioners who are informing this ongoing discussion.
Silicon Valley is at the epicenter of the emerging service robotics industry: robots that are no longer just factory workers but will interact with us in many ways, at home, at work, even on holiday.
In 2015, we produced our first Service Robotics Case Studies report, featuring the robotics companies Fetch Robotics, Fellow Robots, Adept and Savioke. We will shortly release our second report, featuring Catalia Health, Cleverpet, RobotLab and Simbe.
Design guidelines can not only create delightful products but also fill the ethical gap between standards and laws.
After all, if our robots behave badly, we have only ourselves to blame.