Intelligent expressions of emotions
Niewiadomski R.;
2005-01-01
Abstract
We propose an architecture for an embodied conversational agent that takes into account two aspects of emotions: the emotions triggered by an event (the felt emotions) and the expressed emotions (the displayed ones), which may differ in real life. In this paper, we present a formalization of emotion-eliciting events based on a model of the agent's mental state composed of beliefs, choices, and uncertainties. This model makes it possible to identify the emotional state of the agent at any time. We also introduce a computational model based on fuzzy logic that computes facial expressions of blended emotions. Finally, we show examples of facial expressions resulting from the implementation of our model. © Springer-Verlag Berlin Heidelberg 2005.
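The abstract mentions a fuzzy-logic model for blending facial expressions of emotions. As a rough illustration of that general idea only (the emotion labels, facial parameters, and membership function below are illustrative assumptions, not the parameters or rules used in the paper), one could weight each felt emotion's expression profile by a fuzzy membership of its intensity and mix the resulting facial parameters:

```python
# Hypothetical sketch of fuzzy blending of facial expressions; the emotion
# labels, parameter names, and membership function are illustrative
# assumptions, not the model described in the paper.

def high_intensity(x, a=0.2, b=0.9):
    """Ramp-shaped fuzzy membership for 'high intensity' on a 0..1 scale."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

# Illustrative expression profiles (e.g., brow raise, lip corner pull,
# eye openness), each on a 0..1 scale.
EXPRESSION_PROFILES = {
    "joy":     {"brow_raise": 0.2, "lip_corner_pull": 0.9, "eye_openness": 0.6},
    "sadness": {"brow_raise": 0.7, "lip_corner_pull": 0.1, "eye_openness": 0.3},
}

def blend_expressions(intensities):
    """Blend expression profiles, weighting each emotion by the fuzzy
    membership of its felt intensity; intensities maps emotion -> [0, 1]."""
    weights = {e: high_intensity(i) for e, i in intensities.items()}
    total = sum(weights.values()) or 1.0
    blended = {}
    for emotion, weight in weights.items():
        for param, value in EXPRESSION_PROFILES[emotion].items():
            blended[param] = blended.get(param, 0.0) + (weight / total) * value
    return blended

if __name__ == "__main__":
    # A strongly felt joy combined with a mildly felt sadness.
    print(blend_expressions({"joy": 0.8, "sadness": 0.4}))
```

In this toy version the blend is a normalized weighted average of per-emotion facial parameters; a full model would instead apply fuzzy rules to decide which facial regions display which emotion.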