Automatic detection and indexing of video-event shots for surveillance applications

C. S. Regazzoni; L. Marcenaro
2002-01-01

Abstract

Increased communication capabilities and automatic scene understanding allow human operators to monitor multiple environments simultaneously. Because of the amount of data to be processed in new surveillance systems, the human operator must be assisted by automatic processing tools when inspecting video sequences. In this paper, a novel approach is presented that allows layered content-based retrieval of video-event shots referring to potentially interesting situations. Interpretation of events is used to define new video-event shot detection and indexing criteria. Interesting events refer to potentially dangerous situations; abandoned objects and predefined human events are considered in this paper. Video-event shot detection and indexing capabilities are used for online and offline content-based retrieval of the scenes of interest.
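
The abstract describes event-based indexing only at a high level; the fragment below is a minimal, purely illustrative Python sketch, not the authors' implementation. All names in it (VideoEventShot, ShotIndex, the event labels) are hypothetical, and it only shows the general idea of indexing video-event shots by interpreted event type so they can be retrieved online or offline by content.

    from dataclasses import dataclass, field
    from typing import Dict, List

    # Hypothetical labels for the two classes of situations mentioned in
    # the abstract: abandoned objects and predefined human events.
    ABANDONED_OBJECT = "abandoned_object"
    HUMAN_EVENT = "human_event"

    @dataclass
    class VideoEventShot:
        """A contiguous video segment associated with a detected event."""
        camera_id: str
        start_frame: int
        end_frame: int
        event_type: str

    @dataclass
    class ShotIndex:
        """Indexes video-event shots by event type for content-based retrieval."""
        _by_event: Dict[str, List[VideoEventShot]] = field(default_factory=dict)

        def add(self, shot: VideoEventShot) -> None:
            # Online use: index each shot as soon as a detector emits it.
            self._by_event.setdefault(shot.event_type, []).append(shot)

        def query(self, event_type: str) -> List[VideoEventShot]:
            # Offline use: retrieve all shots indexed under an event type.
            return list(self._by_event.get(event_type, []))

    if __name__ == "__main__":
        index = ShotIndex()
        index.add(VideoEventShot("cam_3", 1200, 1450, ABANDONED_OBJECT))
        index.add(VideoEventShot("cam_1", 300, 520, HUMAN_EVENT))
        for shot in index.query(ABANDONED_OBJECT):
            print(shot)
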
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1104997
Citations
  • Scopus: 51
  • Web of Science (ISI): 35