
Tracking without Background Model for Time-of-Flight Cameras

Luca Bianchi; Luca Lombardi; Paolo Lombardi
2009-01-01

Abstract

Time-of-flight (TOF) cameras are relatively new sensors that provide a 3D measurement of a scene. Using the distance signal, objects can be separated from the background on the basis of their distance from the sensor. For virtual studio applications, this capability could be revolutionary, as virtual videos could be produced without a studio. Once TOF cameras reach the consumer market, anyone may become a virtual studio director. We study fast real-time algorithms to enable non-professional virtual studio applications with TOF cameras. In this paper we present our approach to foreground segmentation, based on smart-seeded region growing and Kalman tracking. In contrast to other published work, this method works with a non-stationary camera and with multiple actors or moving objects in the foreground, while providing high accuracy at real-time computational cost.
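The abstract's core idea, seeded region growing on a depth map, can be sketched as follows. This is only an illustrative toy implementation, not the authors' exact algorithm: the seed pixel stands in for a position predicted by the Kalman tracker, and the tolerance `tol` (in metres, a value chosen here for illustration) controls how much depth variation is accepted within one foreground object.

```python
from collections import deque

def grow_foreground(depth, seed, tol=0.15):
    """Segment a foreground region in a depth map by seeded region growing.

    Illustrative sketch only (not the paper's method): starting from `seed`,
    a pixel joins the region when its depth differs from that of the
    neighbour that reached it by at most `tol` metres. Smooth foreground
    surfaces are grown over, while the jump to the distant background
    stops the growth.
    """
    rows, cols = len(depth), len(depth[0])
    mask = [[False] * cols for _ in range(rows)]
    sr, sc = seed
    mask[sr][sc] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        # 4-connected neighbourhood
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not mask[nr][nc]:
                if abs(depth[nr][nc] - depth[r][c]) <= tol:
                    mask[nr][nc] = True
                    queue.append((nr, nc))
    return mask

# Toy 4x4 depth map: an actor at ~1.5 m in front of a wall at ~3.0 m.
depth = [
    [3.0, 3.0, 3.0, 3.0],
    [3.0, 1.5, 1.5, 3.0],
    [3.0, 1.5, 1.5, 3.0],
    [3.0, 3.0, 3.0, 3.0],
]
mask = grow_foreground(depth, seed=(1, 1))
# The grown mask covers exactly the four near pixels.
```

Because growth proceeds from a per-object seed rather than from a learned background image, no static background model is required, which is what allows the camera to move, as the abstract claims.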
Files for this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11571/146008