
Adding Gestures to Ordinary Mouse Use: a new Input Modality for Improved Human-Computer Interaction

Lombardi, Luca; Porta, Marco
2007-01-01

Abstract

Although the way we interact with computers has remained substantially the same for twenty years --based on the keyboard, mouse, and window metaphor-- machine perception could be usefully exploited to enhance the human-computer communication process. In this paper, we present a vision-based user interface in which plain, static hand gestures performed near the mouse are interpreted as specific input commands. Our tests demonstrate that this new input modality does not interfere with ordinary mouse use and can speed up task execution, while not requiring too much attention from the user.
ISBN: 9780769528779

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11571/32114
Citations
  • PMC: n/a
  • Scopus: 5
  • Web of Science (ISI): 2