Estimation of FAPs and intensities of AUs based on real-time face tracking

Niewiadomski R.;
2013-01-01

Abstract

Real-time imitation of natural facial behavior, such as laughter and other nonverbal expressions, remains challenging. This paper describes our ongoing work on methodologies and tools for estimating Facial Animation Parameters (FAPs) and intensities of Action Units (AUs) in order to imitate lifelike facial expressions with an MPEG-4 compliant Embodied Conversational Agent (ECA), the GRETA agent (Bevacqua et al. 2007). First, we investigate available open source tools for accurate facial landmark localization. Second, FAPs and AU intensities are estimated from facial landmarks computed with an open source face tracker. Finally, the paper discusses our ongoing work to determine which re-synthesis technology, FAP-based or AU-based, performs better, using perceptual studies of: (i) the naturalness of the synthesized facial expressions; (ii) the similarity, as perceived by subjects, to the user's original behavior. © 2012 Authors.
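The record contains no code, but the normalization at the heart of FAP estimation is compact enough to sketch. In MPEG-4, a FAP value is the displacement of a facial feature point from its neutral position, expressed in face-specific units (FAPUs) such as eye separation (ES0) or mouth-nose separation (MNS0), each scaled by 1/1024 so that the same FAP stream can animate faces of different proportions. The Python sketch below illustrates that idea with hypothetical landmark names and made-up coordinates; it is not the authors' implementation, and a real tracker supplies its own landmark indexing and sign conventions (image y typically points down, so upward FAPs flip sign).

import numpy as np

def fapus_from_neutral(pts):
    """Compute MPEG-4 FAPUs from a neutral-face landmark set.

    Each FAPU is 1/1024 of a reference distance measured on the
    neutral face: ES0 (eye separation), MNS0 (mouth-nose separation),
    MW0 (mouth width). Landmark names here are hypothetical.
    """
    es0 = np.linalg.norm(pts["left_eye"] - pts["right_eye"])
    mns0 = np.linalg.norm(pts["mouth_center"] - pts["nose_tip"])
    mw0 = np.linalg.norm(pts["mouth_left"] - pts["mouth_right"])
    return {"ES": es0 / 1024.0, "MNS": mns0 / 1024.0, "MW": mw0 / 1024.0}

def estimate_fap(current, neutral, fapu, axis=1):
    """A FAP is the displacement of one feature point from its neutral
    position along a prescribed axis, divided by the relevant FAPU."""
    return (current[axis] - neutral[axis]) / fapu

# Usage sketch with made-up pixel coordinates: the vertical lip FAPs
# (e.g. raise_b_midlip) are measured in MNS units.
neutral = {k: np.asarray(v, dtype=float) for k, v in {
    "left_eye": (180, 200), "right_eye": (280, 200),
    "nose_tip": (230, 260), "mouth_center": (230, 300),
    "mouth_left": (200, 300), "mouth_right": (260, 300),
    "bottom_midlip": (230, 315)}.items()}
fapu = fapus_from_neutral(neutral)
lip_now = np.array([230.0, 310.0])  # bottom midlip tracked 5 px above neutral
print(estimate_fap(lip_now, neutral["bottom_midlip"], fapu["MNS"]))

AU intensities can be derived from similarly normalized landmark displacements, although the specific mapping used in the paper is not given in this record.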
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1124160

Citations
  • PMC: not available
  • Scopus: 4
  • ISI Web of Science: not available