Adding Gestures to Ordinary Mouse Use: a new Input Modality for Improved Human-Computer Interaction
Lombardi, Luca; Porta, Marco
2007-01-01
Abstract
Although the way we interact with computers has remained substantially the same for twenty years, based on the keyboard, the mouse, and the window metaphor, machine perception could be usefully exploited to enhance the human-computer communication process. In this paper, we present a vision-based user interface in which plain, static hand gestures performed near the mouse are interpreted as specific input commands. Our tests demonstrate that this new input modality does not interfere with ordinary mouse use and can speed up task execution, while requiring little attention from the user.
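The abstract does not detail the recognition pipeline. Purely as an illustration, the sketch below shows one plausible way a static hand gesture could be classified from a single webcam frame and mapped to an input command. It assumes OpenCV (cv2), a fixed skin-color threshold, and a hypothetical gesture-to-command mapping; none of these details are taken from the paper.

import cv2
import numpy as np

# Hypothetical HSV skin-color range; a real system would calibrate
# this per user and per lighting condition.
SKIN_LOW = np.array([0, 48, 80], dtype=np.uint8)
SKIN_HIGH = np.array([20, 255, 255], dtype=np.uint8)

def classify_gesture(frame):
    """Return 'open_hand', 'fist', or None from a single BGR frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
    # Remove small noise blobs before contour extraction.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 3000:  # too small to be a hand
        return None
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 'fist'
    # Deep convexity defects roughly correspond to gaps between
    # extended fingers (depth is in fixed-point, value = pixels * 256).
    deep = sum(1 for i in range(defects.shape[0])
               if defects[i, 0, 3] > 10000)
    return 'open_hand' if deep >= 3 else 'fist'

# Hypothetical mapping from static gestures to input commands.
COMMANDS = {'open_hand': 'copy', 'fist': 'paste'}

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    gesture = classify_gesture(frame)
    if gesture:
        print('Trigger command:', COMMANDS[gesture])
cap.release()

In a complete system this classification would run continuously, be restricted to a region of interest around the mouse, and require a gesture to be held stable over several frames before firing a command, so that ordinary hand movement on the mouse is not misread as input.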