Data-driven body–machine interface for the accurate control of drones

Mintchev, Stefano; Coscia, Martina; Artoni, Fiorenzo; Micera, Silvestro
2018-01-01

Abstract

The accurate teleoperation of robotic devices requires simple, yet intuitive and reliable control interfaces. However, current human–machine interfaces (HMIs) often fail to fulfill these characteristics, leading to systems that require intensive practice to reach sufficient operating expertise. Here, we present a systematic methodology to identify the spontaneous gesture-based interaction strategies of naive individuals with a distant device, and to exploit this information to develop a data-driven body–machine interface (BoMI) for efficiently controlling this device. We applied this approach to the specific case of drone steering and derived a simple control method relying on upper-body motion. The identified BoMI allowed participants with no prior experience to rapidly master the control of both simulated and real drones, outperforming joystick users and matching the control ability reached by participants using the bird-like flight simulator Birdly.
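The abstract describes the approach only at a high level. As a rough illustration of what a data-driven mapping from upper-body motion to drone steering commands could look like, the sketch below fits a least-squares linear map from body kinematics to paired command targets. The data, feature dimensions, and function names are assumptions for illustration only, not the authors' implementation.

    # Illustrative sketch (assumed setup, not the paper's actual BoMI):
    # learn a linear map from upper-body kinematics to drone steering commands.
    import numpy as np

    # Hypothetical recording: each row is one frame of upper-body features
    # (e.g., joint angles) captured while a participant mimes a maneuver,
    # paired with the intended roll/pitch command for that frame.
    body_motion = np.random.randn(200, 12)    # 200 frames, 12 body features
    drone_commands = np.random.randn(200, 2)  # paired [roll, pitch] targets

    # Data-driven mapping: solve drone_commands ~= body_motion @ W by least squares.
    W, *_ = np.linalg.lstsq(body_motion, drone_commands, rcond=None)

    def body_to_command(frame: np.ndarray) -> np.ndarray:
        """Map one frame of body kinematics to a [roll, pitch] steering command."""
        return frame @ W

    print(body_to_command(body_motion[0]))

In practice such a mapping would be calibrated from motion-capture or IMU recordings of participants' spontaneous gestures, which is the kind of data-driven identification the abstract refers to.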
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/11382/524058

Citations
  • Scopus: 51