8 Reasons why you shouldn’t carry out collision tests with your collaborative robot on your employees

12/18/2019

Many manufacturers of collaborative robots feel that trust in robot behavior can be best won through demonstration - so quite a number of videos have appeared on social media demonstrating that risk reduction measures like collision detection, proximity detection etc. work and will prevent humans from serious injury. COVR gives you 8 reasons why you shouldn't carry out collision tests with your collaborative robot on your employees.

 

Applications featuring collaborative robots need to be validated in order to prove that they conform to the relevant standards. But how best to test this? Some robot users have had their employees test the system on their own bodies.
While some people may consider this kind of test as valid proof of the safety of the robotic system, COVR would like to point out the many problems with this kind of "safety demonstration" and share how to do this better.

1. It doesn’t tell you what you want to know.
In Europe, collision measurements with collaborative robots are necessary to demonstrate conformity with the existing standards and the state of the art. If you install a collaborative robot application in Europe, you need a CE mark. And if the application relies on power and force limiting, you need to prove that the entire robotic system (robot, tooling, parts manipulated by the robot) conforms to the Machinery Directive, which you can do by showing that the contact forces and pressures stay within the limits listed in ISO/TS 15066. A colleague’s subjective impression of a collision provides none of those measured values.
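As a sketch of what the standard actually asks for, the check below compares measured peak contact forces against per-body-region limits. The region names and limit values here are illustrative placeholders, not a substitute for the tables in ISO/TS 15066 itself:

```python
# Illustrative sketch: compare measured peak contact forces against
# per-body-region quasi-static force limits. The numbers below are
# placeholders -- consult the tables in ISO/TS 15066 for real values.
QUASI_STATIC_FORCE_LIMITS_N = {
    "hand": 140.0,   # placeholder value
    "chest": 140.0,  # placeholder value
    "face": 65.0,    # placeholder value
}


def within_limit(body_region: str, measured_force_n: float) -> bool:
    """Return True if the measured quasi-static force is within the limit."""
    return measured_force_n <= QUASI_STATIC_FORCE_LIMITS_N[body_region]


def validate(measurements: dict) -> list:
    """Return the body regions whose measured force exceeds the limit."""
    return [region for region, force in measurements.items()
            if not within_limit(region, force)]


if __name__ == "__main__":
    # Example readings from a force measurement device (not a person!)
    readings = {"hand": 120.0, "chest": 155.0}
    print(validate(readings))  # regions that fail the check
```

The point of the sketch: conformity is demonstrated by comparing instrument readings against tabulated limits, something an employee standing in the robot’s path simply cannot provide.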

2. People aren’t reliable measurement instruments.
What are your accuracy and repeatability when it comes to judging how hard something hits you? Individual perception of pain and the onset of injury are highly subjective and vary from person to person. When was the last time you hit someone with exactly 220 newtons?

3. It’s not scientific (or statistically relevant).
Testing with one person does not produce statistically relevant numbers. How do you get a meaningful cross-section of society (male, female, different ages, weights, heights, levels of health, etc.)? How many people were you planning on testing with? The only objective data you could gather are details about the injuries incurred.
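To see how little a single successful test proves, consider the standard Clopper-Pearson bound: if n independent trials all pass, the exact 95% upper confidence bound on the true failure probability is 1 − 0.05^(1/n), which approaches the well-known "rule of three" value 3/n for large n. The formula is standard; the scenario below is a hypothetical illustration:

```python
# How much does n failure-free trials constrain the true failure rate?
# Exact Clopper-Pearson 95% upper bound for zero observed failures.


def failure_rate_upper_bound(n_trials: int, alpha: float = 0.05) -> float:
    """95% upper bound on the failure rate after n failure-free trials."""
    return 1.0 - alpha ** (1.0 / n_trials)


if __name__ == "__main__":
    for n in (1, 30, 300):
        print(n, round(failure_rate_upper_bound(n), 3))
```

With n = 1, the bound is 0.95: one intern walking away unharmed is consistent with a robot that injures 19 people out of 20.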

4. It’s not ethical.
Does your student intern really have a choice about whether to participate in the tests? Informed consent matters too: have you actually obtained informed consent from your employee to be a study subject?

5. Robots are strong, very strong.
Even a collaborative robot can be programmed to move in a way that can be extremely harmful. At any rate, this kind of testing exposes people to an unnecessary risk. 

6. It’s dangerous for humans.
A person could get hit in a vulnerable spot, e.g. on the neck, which could cause blood clots in blood vessels and even brain aneurysms. Try explaining that to your intern’s family.

7. It's bad marketing.
If the whole reason for doing this is to build trust with your customers, does testing a potentially dangerous machine on a person, without reliable measurement instruments and without a proper ethical review, really send the right message about how seriously you take safety?

8. Because there are better ways.
See the COVR Toolkit and the COVR protocols, which tie in with ISO/TS 15066.

In some cases, where user characteristics play an important role in the behavior of the collaborative robot (e.g. rehabilitation robots or exoskeletons), certain safety evaluations will need to involve human subjects. This, however, is only permissible when, for example, a proper risk assessment has been performed, a (medical) ethics committee has approved the research (including an evaluation of the residual risks during normal use), and the person themselves has given informed consent.

Even then, you need to establish that the basic structure and behavior of the cobot is safe before testing it with healthy subjects or patients.