
Adaptive tracking of multiple non-rigid objects in cluttered scenes

Regazzoni, C
2000-01-01

Abstract

Tracking of non-rigid objects (e.g. humans) is a crucial task for understanding object behavior. Different methods have been presented in the literature, whose main drawbacks are low robustness or high computational load when analyzing cluttered scenes. In this paper, a low-complexity algorithm for tracking non-rigid objects in cluttered scenes is presented. The proposed approach models the shape of the objects by using corners. A learning algorithm is introduced to automatically extract the model of each object from a short video sequence acquired immediately before multiple objects merge in the scene. The adaptive model-extraction mechanism strongly improves the robustness of the method. The method is tested on an existing video-surveillance system in order to track moving objects in cluttered scenes. Results show that the proposed approach gives good performance with low processing times.
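Since the full text is not available in this record, the following is only a minimal sketch of the general idea stated in the abstract: represent an object by corners extracted from a few frames acquired before a merge, then check how many of those model corners reappear in a later cluttered frame. It uses OpenCV's Shi-Tomasi detector (cv2.goodFeaturesToTrack); the function names, parameters, and the naive nearest-neighbour matching step are illustrative assumptions, not the authors' actual algorithm.

```python
# Hedged sketch (assumption, not the paper's method): corner-based object model
# learned from a short pre-merge sequence, matched against a later frame.
import cv2
import numpy as np

def extract_corners(gray_frame, mask=None, max_corners=50):
    """Detect Shi-Tomasi corners, optionally restricted to the object's mask."""
    pts = cv2.goodFeaturesToTrack(gray_frame, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=5, mask=mask)
    return np.empty((0, 2), dtype=np.float32) if pts is None else pts.reshape(-1, 2)

def learn_model(frames, masks):
    """Accumulate corners over the short pre-merge sequence into one model."""
    per_frame = [extract_corners(cv2.cvtColor(f, cv2.COLOR_BGR2GRAY), m)
                 for f, m in zip(frames, masks)]
    return np.vstack(per_frame) if per_frame else np.empty((0, 2), dtype=np.float32)

def match_model(model_pts, frame_pts, max_dist=10.0):
    """Fraction of model corners with a nearby corner in the current frame."""
    if len(model_pts) == 0:
        return 0.0
    hits = 0
    for p in model_pts:
        if len(frame_pts) and np.min(np.linalg.norm(frame_pts - p, axis=1)) < max_dist:
            hits += 1
    return hits / len(model_pts)
```

In such a scheme, the model would be rebuilt from the most recent pre-merge frames each time objects are about to merge, which is one plausible reading of the "adaptive model extraction" mentioned in the abstract.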

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1104910
Citations
  • Scopus: 2
  • Web of Science: 1