Abnormal Event Detection in Videos using Generative Adversarial Nets

Ravanbakhsh, Sayyed Mahdyar; Marcenaro, Lucio; Regazzoni, Carlo
2017-01-01

Abstract

In this paper we address the abnormality detection problem in crowded scenes. We propose to use Generative Adversarial Nets (GANs), trained on normal frames and the corresponding optical-flow images, to learn an internal representation of the scene's normality. Since our GANs are trained only on normal data, they are not able to generate abnormal events. At testing time, the real data are compared with both the appearance and the motion representations reconstructed by our GANs, and abnormal areas are detected by computing local differences. Experimental results on challenging abnormality detection datasets show the superiority of the proposed method over the state of the art in both frame-level and pixel-level abnormality detection tasks.
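
The abstract describes the detection step only at a high level. The Python fragment below is a minimal sketch, under our own assumptions, of that test-time idea: local differences between the observed frame/optical flow and the reconstructions produced by generators trained on normal data are thresholded to mark abnormal regions. The function names (local_abnormality_map, detect), the generator call signatures (gen_appearance, gen_motion), and the patch size and threshold values are hypothetical placeholders, not the paper's actual architecture or parameters.

# Sketch of test-time abnormality scoring by reconstruction error.
# All names and values here are illustrative assumptions.
import numpy as np

def local_abnormality_map(real, reconstructed, patch=16, threshold=0.2):
    """Mark patches whose mean absolute reconstruction error exceeds
    `threshold` (pixel/flow values assumed normalized to [0, 1])."""
    diff = np.abs(real.astype(np.float32) - reconstructed.astype(np.float32))
    if diff.ndim == 3:                      # average over channels
        diff = diff.mean(axis=2)
    h, w = diff.shape
    abnormal = np.zeros((h // patch, w // patch), dtype=bool)
    for i in range(abnormal.shape[0]):
        for j in range(abnormal.shape[1]):
            block = diff[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch]
            abnormal[i, j] = block.mean() > threshold
    return abnormal

def detect(frame, flow, gen_appearance, gen_motion, threshold=0.2):
    """Fuse appearance and motion cues: a patch is flagged if either the
    frame or the optical-flow reconstruction error is large.
    `gen_appearance` and `gen_motion` stand in for generators trained
    only on normal data (assumed to map flow->frame and frame->flow)."""
    rec_frame = gen_appearance(flow)
    rec_flow = gen_motion(frame)
    return (local_abnormality_map(frame, rec_frame, threshold=threshold)
            | local_abnormality_map(flow, rec_flow, threshold=threshold))

Because the generators never see abnormal events during training, their reconstructions of abnormal regions tend to be poor, which is what makes the local difference a usable abnormality score.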
Files in this record:

File: Abnormal Event Detection in Videos using Generative Adversarial Nets.pdf
Access: closed access
Description: Main article
Type: Publisher's version
Size: 325.81 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/876799
Citations
  • PMC: ND
  • Scopus: 400
  • Web of Science (ISI): 303