Over recent years, mobile collaborative robots have been deployed in a wide range of applications alongside human operators. The main goal has been to help operators complete tasks more easily and in a more time- and cost-effective manner. One of the main reasons collaborative mobile platforms are selected for certain applications is that they combine the dexterity of a robot arm with the ability to move around the shop floor. In this way, multiple operators at different workstations can be supported.
Operator tracking
Because operators work in close proximity to the robot, their safety is of paramount importance. Collaborative mobile robotic platforms are typically equipped with laser floor scanners, and the robot arms they carry are fitted with joint torque sensors. These devices detect the presence of a human operator and immediately reduce the platform's velocity or bring it to a complete stop, ensuring that the robot causes no harm to the operator. Due to the nature of a typical manufacturing environment, the robot may have to pause or stop for rather long periods while working alongside the operator. To avoid this, additional sensors may be used to track the operator's movements as part of a more sophisticated multi-sensorial human detection system. The data obtained from these sensors can be used for trajectory planning of the robot arm, obstacle avoidance and navigation. The use of multiple operator tracking sensors can therefore reduce the number of times the robot has to stop, improving both its efficiency and its ability to interact safely with the human operator. These sensors can be wearable devices, such as Inertial Measurement Unit (IMU) sensors, vision systems, or a combination of both. The data from these sensors can be used to:
- Track the positions and orientations of the human operator's joints.
- Perform gait analysis.
- Compute anthropometric data of the human operator.
- Track the activity of the operator.
- Closely monitor the distance between the collaborative manipulator and the human operator in the workspace.
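The distance monitoring described above typically feeds a speed-and-separation strategy: the closer the operator, the slower the platform moves. A minimal sketch of such distance-based speed scaling is shown below; the threshold values are illustrative assumptions, not values taken from any safety standard or from the UCD-LAMS system.

```python
def scale_speed(distance_m: float,
                stop_dist: float = 0.5,
                slow_dist: float = 1.5,
                max_speed: float = 1.0) -> float:
    """Scale the platform's commanded speed by operator distance.

    Below stop_dist the robot halts; between stop_dist and slow_dist
    the allowed speed ramps up linearly; beyond slow_dist the robot
    runs at max_speed. All thresholds are illustrative placeholders.
    """
    if distance_m <= stop_dist:
        return 0.0  # operator too close: full stop
    if distance_m >= slow_dist:
        return max_speed  # operator far away: no restriction
    # linear ramp between the two thresholds
    return max_speed * (distance_m - stop_dist) / (slow_dist - stop_dist)
```

A real implementation would take the distance from the fused sensor estimate and apply the scaled limit in the platform's velocity controller.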
At the UCD Laboratory for Advanced Manufacturing Simulation and Robotics (UCD-LAMS), a number of potential applications have been identified where safe interaction between the human and the robot can be achieved by tracking the human operator using different sensor devices. Furthermore, UCD-LAMS has proposed a novel control framework that uses machine learning approaches to improve the performance of the collaborative mobile manipulator [1]. In some cases, the data from these sensor devices can be combined via sensor fusion algorithms, so that the strengths of each sensor compensate for the weaknesses of the others. Some of the main challenges of using diverse sensors are:
- Establishing a communication network between the robot and the sensors.
- Collecting and analysing sensor data in real time.
- Ensuring data privacy, collection and protection.
- The cost of hardware, such as sensors and high-performance computers for processing data in real time.
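One common way to combine an IMU with a vision system, as mentioned above, is a complementary filter: the gyroscope provides a fast but drifting estimate, while the vision system provides a slower but absolute one. The sketch below is a generic textbook formulation under assumed inputs, not the specific fusion algorithm used in the UCD-LAMS framework.

```python
def complementary_filter(angle_prev: float, gyro_rate: float,
                         vision_angle: float, dt: float,
                         alpha: float = 0.98) -> float:
    """Fuse a joint-angle estimate from two sources.

    angle_prev    -- previous fused estimate (rad)
    gyro_rate     -- IMU angular rate (rad/s), fast but drifts
    vision_angle  -- absolute angle from the vision system (rad)
    alpha         -- weight on the integrated gyro term (close to 1)
    """
    gyro_angle = angle_prev + gyro_rate * dt  # integrate the gyro
    # blend: trust the gyro short-term, the vision system long-term
    return alpha * gyro_angle + (1.0 - alpha) * vision_angle
```

Running this at each sensor update suppresses gyro drift while keeping the low-latency response that vision alone cannot provide.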
Impact
In summary, the advantages of tracking a human operator using IMU and vision systems are:
- Enabling and enforcing safe human-robot interaction.
- Detecting the intentions of the human operator, so that the mobile robot can start performing its own tasks in a fully synchronised manner.
- Identifying the activity performed by the human operator.
- Optimising the performance of the human-robot task.
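Identifying the operator's activity from wearable sensors usually starts by extracting simple statistical features from fixed-length windows of IMU readings, which are then fed to a classifier. The feature set below (mean, standard deviation, signal magnitude area) is a commonly used minimal example and an assumption on my part, not the feature set of the cited framework.

```python
import math

def window_features(samples: list[float]) -> tuple[float, float, float]:
    """Compute mean, standard deviation and signal magnitude area
    over one window of accelerometer magnitudes -- a typical first
    step before an activity-recognition classifier."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    sma = sum(abs(s) for s in samples) / n  # signal magnitude area
    return mean, math.sqrt(var), sma
```

In practice these per-window features are computed for each IMU axis and stacked into one feature vector per window.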
Reference:
[1] Ramasubramanian AK, Aiman SM, Papakostas N. On using human activity recognition sensors to improve the performance of collaborative mobile manipulators: Review and outlook. Procedia CIRP 2021. https://doi.org/10.1016/j.procir.2020.05.227.
