Structured multi-class feature selection for effective face recognition

Giovanni Fusco; Luca Zini; Nicoletta Noceti; Francesca Odone
2013-01-01

Abstract

This paper addresses the problem of real-time face recognition in unconstrained environments from the analysis of low-quality video frames. It focuses in particular on finding an effective and fast-to-compute (that is, sparse) representation of faces, starting from classical Local Binary Patterns (LBPs). The two contributions of the paper are a new formulation of Group LASSO for structured feature selection (MC-GrpLASSO) that copes directly with multi-class settings, and a face recognition pipeline based on a representation derived from MC-GrpLASSO. We present an extensive experimental analysis on two benchmark datasets, MOBO and ChokePoint, and on a more complex dataset acquired in-house over a large temporal span. We compare our results with state-of-the-art approaches and show the superiority of our method in terms of both performance and sparsity of the obtained solution.
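
For context only, the sketch below illustrates the kind of structured sparsity the abstract refers to: a plain Group LASSO, solved with a simple proximal-gradient loop, that selects whole spatial blocks of an LBP histogram at once. This is not the paper's MC-GrpLASSO formulation (which handles all classes jointly); the block sizes, regularization value, and function names are illustrative assumptions.

# Minimal sketch, assuming NumPy: plain Group LASSO over block-wise LBP
# histograms. Not the authors' MC-GrpLASSO; all names and sizes are illustrative.
import numpy as np

def block_lbp_groups(n_blocks, bins_per_block):
    """Index groups: one group per spatial block of the LBP descriptor."""
    return [np.arange(b * bins_per_block, (b + 1) * bins_per_block)
            for b in range(n_blocks)]

def group_lasso(X, y, groups, lam=0.1, step=None, n_iter=500):
    """Solve min_w 0.5*||Xw - y||^2 + lam * sum_g ||w_g||_2 via ISTA."""
    n, d = X.shape
    w = np.zeros(d)
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                 # gradient of the smooth least-squares term
        z = w - step * grad                      # gradient step
        for g in groups:                         # block soft-thresholding (proximal step)
            norm_g = np.linalg.norm(z[g])
            scale = max(0.0, 1.0 - step * lam / norm_g) if norm_g > 0 else 0.0
            w[g] = scale * z[g]
    return w

# Toy usage: 200 faces, 16 blocks x 59 uniform-LBP bins, one-vs-rest labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 16 * 59))
y = rng.choice([-1.0, 1.0], size=200)
w = group_lasso(X, y, block_lbp_groups(16, 59), lam=5.0)
selected = [i for i, g in enumerate(block_lbp_groups(16, 59))
            if np.linalg.norm(w[g]) > 1e-8]
print("selected LBP blocks:", selected)

Because coefficients are penalized group by group, a spatial block's histogram either enters the representation as a whole or is discarded, which is what makes the resulting descriptor both compact and fast to compute.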
Files in this item:
No files are associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/687772
Citations
  • PMC: not available
  • Scopus: 5
  • Web of Science (ISI): 3