
A Video Elaboration System for Image Deinterlacing and Processing in Race Cars

Danese, Giovanni; Giachero, Mauro; Leporati, Francesco; Majani, Alessandra; Nazzicari, Nelson Davide
2010-01-01

Abstract

The advent of digital technologies has revolutionized many different fields, and video processing is certainly one of the most deeply affected. Small digital cameras are nowadays common in many environments, but digital video still carries some legacy from the analog era, interlacing being one of the most troublesome. Video interlacing consists of alternately transmitting only the odd or the even lines of each frame in order to reduce the necessary bandwidth. Interlacing is in fact a clever form of analog video compression made possible by the optical properties of CRT (Cathode-Ray Tube) monitors. Unfortunately, interlacing is unsuitable for computer monitors, which instead work in non-interlaced (progressive) mode. Showing interlaced video on progressive screens produces annoying visual defects. Moreover, technology has evolved enough that interlacing is no longer needed. Nevertheless, interlaced cameras are still very common in applications that must preserve compatibility with analog television. Therefore, to make interlaced video usable in modern systems, deinterlacing techniques have been developed. There are many deinterlacing algorithms of different complexities, but none of them can deliver the best possible quality in both still and fast-changing scenes. The best results are obtained with adaptive techniques, which dynamically change the algorithm depending on the scene's motion characteristics. Deinterlacing a video with a motion-adaptive technique is computationally expensive, and doing it in real time is a challenging task. The proposed idea consists in implementing four real-time deinterlacers on a single FPGA (Field Programmable Gate Array), so that the most suitable one for the specific application can be selected either manually or by a motion detection algorithm.
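The motion-adaptive selection the abstract describes can be illustrated with a minimal software sketch. This is not the paper's FPGA implementation: it is a hypothetical NumPy model assuming two of the simplest deinterlacing strategies, "weave" (reuse lines from the previous field, sharp but prone to combing under motion) and "bob" (interpolate lines within the current field, comb-free but softer), with a per-line motion detector choosing between them. The function name and the `threshold` parameter are illustrative choices, not from the source.

```python
import numpy as np

def deinterlace_adaptive(prev_field, curr_field, threshold=10.0):
    """Motion-adaptive deinterlacing sketch (hypothetical, not the paper's design).

    prev_field, curr_field: 2-D grayscale arrays holding the lines of two
    consecutive fields. Returns a full frame of twice the field height:
    still lines are 'woven' from the previous field, moving lines are
    'bobbed' (interpolated) from the current field.
    """
    h, w = curr_field.shape
    frame = np.zeros((2 * h, w), dtype=curr_field.dtype)
    frame[0::2] = curr_field  # lines the current field actually provides

    # Crude per-line motion estimate: mean absolute difference between
    # temporally adjacent fields; below threshold => treat as static.
    motion = np.abs(curr_field.astype(float) - prev_field.astype(float))
    still = motion.mean(axis=1) < threshold

    # Weave candidate: missing lines taken verbatim from the previous field.
    woven = prev_field
    # Bob candidate: missing lines interpolated between neighbors in the
    # current field (last line is simply repeated).
    bobbed = np.empty_like(curr_field, dtype=float)
    bobbed[:-1] = (curr_field[:-1].astype(float) + curr_field[1:]) / 2.0
    bobbed[-1] = curr_field[-1]

    frame[1::2] = np.where(still[:, None], woven,
                           bobbed.astype(curr_field.dtype))
    return frame
```

In the paper's hardware setting, the same idea maps to running all candidate deinterlacers in parallel on the FPGA and using the motion detector (or a manual switch) to select which output to emit.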
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11571/214463
Citations
  • Scopus: 1