Head-Mounted Standalone Real-Time Tracking System for Moving Light-Emitting Targets Fusing Vision and Inertial Sensors
Digiacomo F.; Afroz A. S.; Pelliccia R.; Inglese F.; Milazzo M.; Stefanini C.
2020-01-01
Abstract
Tracking moving objects is a relevant topic in a wide range of research activities. To meet this goal, sensor fusion is a frequently used approach that exploits the synergies among different sensors to achieve high precision and accuracy. In this article, an application of this technique is presented for an actual industrial scenario, namely, the shielded metal arc welding (SMAW) process. A wearable real-time tracking system, employing a camera and an inertial unit on a commercial helmet, is described and assessed to estimate the speed of a light-emitting target, namely, the weld pool, despite the noise introduced by sparks, smoke, and high-intensity illumination. The vision-based module was used to follow the weld pool and to automatically calibrate the system, while the inertial unit was used to compensate for the influence of yaw angular displacements. Results from tests performed on four batches of electrodes showed that the system is able to capture the phenomenon with errors below 25%. Compared with other systems, our prototype is able to acquire data from moving objects in real time through sensors mounted on a moving reference frame.
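The abstract does not detail the fusion scheme, but the core idea it describes, subtracting the apparent image motion caused by head yaw (measured by the inertial unit) from the pixel motion of the weld pool tracked by the camera, can be illustrated with a minimal sketch. The snippet below assumes a small-angle pinhole camera model and a known image-to-workpiece scale from the auto-calibration step; the function name `compensated_pool_speed` and parameters such as `focal_px` and `mm_per_px` are hypothetical and not taken from the paper.

```python
import numpy as np

def compensated_pool_speed(centroid_prev, centroid_curr, yaw_rate, dt,
                           focal_px, mm_per_px):
    """Estimate weld-pool speed (mm/s) from two image centroids,
    removing the apparent shift caused by head yaw.

    centroid_prev, centroid_curr : (u, v) pixel coordinates of the pool
    yaw_rate   : yaw rate from the inertial unit (rad/s)
    dt         : time between the two frames (s)
    focal_px   : camera focal length expressed in pixels (assumed known)
    mm_per_px  : image-to-workpiece scale from auto-calibration (assumed known)
    """
    # Raw pixel displacement of the bright target between frames
    d_raw = np.asarray(centroid_curr, float) - np.asarray(centroid_prev, float)

    # Small-angle pinhole approximation: a pure yaw of angle (yaw_rate * dt)
    # shifts every image point horizontally by roughly focal_px * angle pixels
    yaw_shift = np.array([focal_px * yaw_rate * dt, 0.0])

    # Remove the head-motion component, keep the true pool motion
    d_pool = d_raw - yaw_shift

    # Convert the residual displacement to a metric speed on the workpiece
    return np.linalg.norm(d_pool) * mm_per_px / dt

# Example usage with illustrative (hypothetical) numbers
speed = compensated_pool_speed((312, 240), (318, 241),
                               yaw_rate=0.05, dt=1 / 30,
                               focal_px=800, mm_per_px=0.12)
print(f"estimated pool speed: {speed:.1f} mm/s")
```

This is only a sketch of the compensation step under the stated assumptions; the actual system may fuse the two sensor streams differently (e.g., with a filter over the full pose rather than a per-frame correction).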