Measuring Human Body Inertia in Real-Time Using Stereo Cameras and Deep Learning: A Model Dependency Analysis

Giulietti N.; Sergenti C.; Giberti H.; Carnevale M.
2024-01-01

Abstract

Traditional training and education methodologies are increasingly accompanied by state-of-the-art simulation technologies, including robotic systems, haptic interfaces, and virtual reality headsets. The application of a broad spectrum of technologies facilitates the creation of immersive simulations of real-life scenarios within a controlled environment. In immersive simulators aimed at human training, such as for sports and rehabilitation, human movements and poses are essential indicators of how individuals interact with the simulation environment. A real-time method to estimate the inertial properties of the human body is crucial for the development of human-in-the-loop dynamic simulators, where human motion significantly influences the dynamics of the simulated action. This work addresses the need for a novel method capable of providing real-time updates of inertial quantities, with a focus on robustness and versatility in adapting to the various models used to discretize the human body. The study focuses on developing a simplified geometric model that discretizes the human body into simple shapes. The 3D body pose is estimated through a depth camera, identifying key landmarks through deep learning algorithms (namely MediaPipe Pose, MoveNet, and YOLOv8). These tools vary in the number of points detected and in their accuracy, affecting both model accuracy and computational requirements. The proposed method is validated against established anthropometric standards, demonstrating high accuracy in estimating body segment masses and the overall center of mass. The study also explores the feasibility of reducing the model from 33 to 17 landmarks; despite a slight decrease in accuracy, the reduced model offers significant improvements in computational efficiency, suggesting its potential for real-time applications.
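The pipeline described in the abstract can be summarized as: detect 3D body landmarks with a pose-estimation network, attach simple geometric segments with anthropometric mass fractions to pairs of landmarks, and take the whole-body center of mass as the mass-weighted average of the segment centers, r_COM = (sum_i m_i r_i) / (sum_i m_i). The following Python sketch illustrates this idea with MediaPipe Pose. It is not the authors' implementation: the segment definitions, the Dempster-style mass fractions, and the COM position ratios below are rounded, illustrative assumptions.

# Minimal sketch (not the authors' code): whole-body center-of-mass estimate
# from MediaPipe Pose 3D world landmarks and Dempster-style segment mass
# fractions. Segment definitions, fractions, and COM ratios are illustrative.
import cv2
import numpy as np
import mediapipe as mp

mp_pose = mp.solutions.pose
L = mp_pose.PoseLandmark  # enum of the 33 MediaPipe landmark indices

# (proximal landmark, distal landmark, mass fraction, COM ratio from proximal end)
SEGMENTS = [
    (L.LEFT_SHOULDER,  L.LEFT_ELBOW,  0.028, 0.436),   # left upper arm
    (L.RIGHT_SHOULDER, L.RIGHT_ELBOW, 0.028, 0.436),   # right upper arm
    (L.LEFT_ELBOW,     L.LEFT_WRIST,  0.022, 0.430),   # left forearm + hand
    (L.RIGHT_ELBOW,    L.RIGHT_WRIST, 0.022, 0.430),   # right forearm + hand
    (L.LEFT_HIP,       L.LEFT_KNEE,   0.100, 0.433),   # left thigh
    (L.RIGHT_HIP,      L.RIGHT_KNEE,  0.100, 0.433),   # right thigh
    (L.LEFT_KNEE,      L.LEFT_ANKLE,  0.061, 0.434),   # left shank + foot
    (L.RIGHT_KNEE,     L.RIGHT_ANKLE, 0.061, 0.434),   # right shank + foot
    (L.NOSE,           L.LEFT_HIP,    0.289, 0.500),   # head + half trunk (crude)
    (L.NOSE,           L.RIGHT_HIP,   0.289, 0.500),   # head + half trunk (crude)
]

def body_com(world_landmarks):
    """Mass-weighted average of segment COMs; returns a 3-vector in meters."""
    com = np.zeros(3)
    for prox, dist, frac, ratio in SEGMENTS:
        p = world_landmarks[prox]
        d = world_landmarks[dist]
        p, d = np.array([p.x, p.y, p.z]), np.array([d.x, d.y, d.z])
        com += frac * (p + ratio * (d - p))  # segment COM on the p->d line
    return com  # the fractions above sum to 1, so no normalization is needed

with mp_pose.Pose(static_image_mode=True) as pose:
    frame = cv2.imread("frame.png")  # placeholder input image
    result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.pose_world_landmarks:
        # pose_world_landmarks are metric 3D coordinates, hip-centered
        print("COM (m):", body_com(result.pose_world_landmarks.landmark))

A reduced model, as studied in the paper, would simply use fewer landmark pairs in a table like SEGMENTS, trading some accuracy of the segment geometry for a lower per-frame computational cost.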
Files associated with this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11571/1517917
Citations
  • PubMed Central: ND
  • Scopus: ND
  • Web of Science: ND