This paper describes a novel, ergonomics-driven approach to human-robot interaction. The proposed approach continuously observes a human user's posture and, whenever required, invokes cooperative robot movements that bring the posture back to an ergonomic optimum. In effect, the new protocol optimises the relative human-robot position and orientation as a function of human ergonomics. An RGB-D camera is used to calculate and monitor the human's joint angles in real time and to determine the current ergonomic state. Six main causes of poor ergonomic states are identified, leading to six universal robot responses that allow the human to return to an optimal ergonomic state. The algorithmic framework detects these six causes and controls the cooperating robot to adapt the environment (e.g. change the pose of the workpiece) in the way that is ergonomically most comfortable for the interacting user. Human-robot interaction is thus continuously re-evaluated to optimise the ergonomic state. The approach is validated through an experimental study based on established ergonomic methods adapted for real-time application; the study confirms improved ergonomics using the new approach.
Collaborative robots, or so-called cobots, are opening new possibilities for human-robot interaction in industrial environments. Health and safety are major design factors in the creation of collaborative robots. The issues typically addressed in this area are collision avoidance and protecting the human user from immediate injury when the robot and/or the user deviate from their anticipated trajectory or behaviour due to some failure or error. This is indeed critically important: avoiding such states averts immediate harm to the human. However, far less emphasis is placed on the human's long-term health and safety. A worker's comfort during working hours on the factory floor relates directly to their long-term health. Work-related musculoskeletal disorders (WMSDs) result when a worker's comfort issues go unnoticed for a prolonged period of time. WMSDs are not only a personal health issue for the worker; they also affect the business interests of the company they work for. Prevention of WMSDs is therefore critical and of high importance.
Industrial workplaces are changing. Advances in safe human-robot interaction (HRI) have moved industrial robots past the large, heavy, fenced machines working on their own and towards relatively small, lightweight and safe robots that work hand-in-hand with human users [1]. This presents an opportunity to considerably improve ergonomics and comfort in industrial workplaces through automation, with real-time ergonomics monitoring and response through robot assistance. This paper presents a novel interaction approach in which the robot senses the human user's ergonomic state based on established posture-monitoring methods, such as RULA, and reacts to it with the aim of constantly improving that state - in effect, optimising the interaction entirely around the human's comfort and ergonomics. The RULA worksheet by ErgonomicsPlus® is presented here in Figure 2 for reference throughout the paper.
Ergonomics methods have been considered in prior robotics research. In [2], the concept of ergonomics-for-one, i.e. fitting task and tool design to a specific person with special needs, is applied to HRI: the authors create a robotic shopping cart for visually impaired users, basing their design decisions on individual interviews with the users. The work in [3] concerns a nine degree-of-freedom model of the human arm for the development, testing and comfort optimisation of an upper-arm exoskeleton. The authors report that this method allows the device to interact more comfortably with the human and to make greater use of the natural limb workspace, leading to better integration with human movements. In [4], a humanoid robot's motion and manipulation planning is based on the RULA directives for human comfort, producing human-like movements that the interacting human can more easily understand, with the aim of improved interaction. In [5], a scooter robot learns the paths the worker tends to use and adjusts its wheel-axis directions to reduce the forces applied by the user, improving the ergonomics of the task. In [6], the use of software ergonomics to create more ergonomic collaborative tasks with robots is proposed.
Recently, there has been an increase in efforts to bring ergonomic methods into the realm of HRI [7]-[11]. Here we present a computationally light method for human-robot interaction based entirely on optimising the human user's ergonomic state. In this manner, the kind of industrial handover assistance cases described here need no other planning or programming; the system simply attempts to make the human comfortable, which we propose leads to optimal fulfilment of the task at hand.
In our set-up, we use a suite of sensors (including the Kinect depth sensor and inertial measurement units) to determine the posture of the interacting human, and the Baxter® Research Robot to adjust the interaction objects so that an ergonomic posture can be achieved. Our algorithms run in the Robot Operating System (ROS). In short, we aim to achieve robot-assisted ergonomics by integrating a posture ergonomics assessment method (here, Rapid Upper Limb Assessment - RULA) into our system.
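The closed loop described above can be sketched in a few lines. This is a minimal illustration only: `rula_score`, `diagnose` and `respond` are hypothetical callables standing in for the camera-based assessment, the six-cause classifier and the six matching robot responses; the threshold follows the RULA convention that grand scores of 1-2 are acceptable.

```python
# RULA grand scores of 1-2 are conventionally acceptable; higher
# scores call for investigation and change (see the RULA worksheet).
ACCEPTABLE = 2

def ergonomics_loop(rula_score, diagnose, respond):
    """One iteration of the monitor-diagnose-respond cycle.

    rula_score, diagnose and respond are placeholders (not the paper's
    actual interfaces) for: reading the current RULA grand score from
    the camera pipeline, identifying which of the six low-ergonomics
    causes applies, and triggering the matching robot response.
    """
    score = rula_score()
    if score > ACCEPTABLE:
        cause = diagnose()   # one of the six causes of a poor ergonomic state
        respond(cause)       # matching robot adaptation, e.g. re-pose the workpiece
    return score
```

In the actual system this loop would run continuously, so the interaction is re-evaluated every cycle rather than planned once in advance.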
The Kinect™ (previously used with the RULA method in [12] and with other ergonomic techniques in [13], [14]) is employed to 'see' the user's posture as a human inspector would [15]. Using open-source libraries for the Kinect™, the human's skeleton frames are broadcast, providing 15 joint positions, among them the head, neck, torso, shoulder, elbow and hand, that ca
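The joint angles that RULA scores (e.g. lower-arm flexion) can be derived directly from triples of these 3D joint positions. The sketch below, using only the standard library, shows the basic geometry; the coordinates are illustrative values, not real sensor data, and the frame conventions are assumptions rather than the paper's actual pipeline.

```python
import math

def joint_angle(a, b, c):
    """Interior angle at joint b (degrees) formed by points a-b-c in 3D."""
    u = [a[i] - b[i] for i in range(3)]   # vector b -> a
    v = [c[i] - b[i] for i in range(3)]   # vector b -> c
    dot = sum(u[i] * v[i] for i in range(3))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    cos = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp for numerical safety
    return math.degrees(math.acos(cos))

# Illustrative shoulder/elbow/hand positions (metres, camera frame).
shoulder, elbow, hand = (0.0, 1.4, 2.0), (0.25, 1.15, 2.0), (0.30, 0.90, 1.8)
# Flexion is measured from the straight-arm configuration (180 degrees).
elbow_flexion = 180.0 - joint_angle(shoulder, elbow, hand)
```

Angles computed this way per frame feed the real-time ergonomic assessment; RULA then maps each angle into its score bands.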