Integration of visual and auditory information for hand actions: preliminary evidence for the contribution of natural sounds to grasping

Sedda, Anna; Bottini, Gabriella
2011-01-01

Abstract

When we reach out to grasp objects, vision plays a major role in the control of our movements. Nevertheless, other sensory modalities contribute to the fine-tuning of our actions; even olfaction has been shown to play a role in the scaling of movements directed at objects. Much less is known about how auditory information might be used to program grasping movements. The aim of our study was to investigate how the sound of a target object affects the planning of grasping movements in normal right-handed subjects. We performed an experiment in which auditory information could be used to infer the size of targets while the availability of visual information was varied from trial to trial. Classical kinematic parameters (such as grip aperture) were measured to evaluate the influence of auditory information, and an optimal-inference model was applied to the data. The scaling of grip aperture indicated that the introduction of sound allowed subjects to infer the size of the object when vision was not available. Moreover, auditory information affected grip aperture even when vision was available. Our findings suggest that the differences in the natural impact sounds made by objects of different sizes being placed on a surface can be used to plan grasping movements.
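
In the multisensory-integration literature, "optimal inference" of this kind usually means reliability-weighted (maximum-likelihood) cue combination, in which each sensory estimate is weighted by its inverse variance. The Python sketch below illustrates that standard scheme only; the function name and all numbers are illustrative assumptions, not values or code from the study.

    # Minimal sketch of reliability-weighted (maximum-likelihood) cue
    # combination. All values are illustrative, not data from the study.

    def combine_cues(mu_v, var_v, mu_a, var_a):
        """Fuse a visual and an auditory size estimate, weighting each
        cue by its reliability (inverse variance)."""
        w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)
        w_a = 1.0 - w_v
        mu = w_v * mu_v + w_a * mu_a
        # The fused estimate is more reliable than either cue alone.
        var = 1.0 / (1.0 / var_v + 1.0 / var_a)
        return mu, var

    # Example: vision suggests a 60 mm object (low variance); the impact
    # sound suggests 70 mm (higher variance).
    size, variance = combine_cues(mu_v=60.0, var_v=4.0, mu_a=70.0, var_a=16.0)
    print(size, variance)  # 62.0, 3.2 -- dominated by the more reliable visual cue

Under this scheme, a sound that signals a larger object should pull grip aperture upward even when vision is available, with a weight that grows as the visual estimate becomes less reliable, which is qualitatively the pattern the abstract reports.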

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11571/986299
Citations
  • PMC: not available
  • Scopus: 30
  • Web of Science (ISI): 26