Title: Adding Gestures to Ordinary Mouse Use: a new Input Modality for Improved Human-Computer Interaction
Authors:
Publication date: 2007
Abstract: Although the way we interact with computers has remained substantially the same for the past twenty years, based on the keyboard, the mouse, and the window metaphor, machine perception could be usefully exploited to enhance the human-computer communication process. In this paper, we present a vision-based user interface in which plain, static hand gestures performed near the mouse are interpreted as specific input commands. Our tests demonstrate that this new input modality does not interfere with ordinary mouse use and can speed up task execution, while requiring little attention from the user.
Handle: http://hdl.handle.net/11571/32114
ISBN: 9780769528779
Appears in categories: 4.1 Contribution in Conference Proceedings