Emotional facial expression analysis in the time domain



A facial expression is one or more motions or positions of the muscles beneath the skin of the face. According to one set of controversial theories, these movements convey the emotional state of an individual to observers. Facial expressions are a form of nonverbal communication. They are a primary means of conveying social information between humans, but they also occur in most other mammals and some other animal species. (For a discussion of the controversies surrounding these claims, see Fridlund[1] and Russell & Fernandez Dols.[2]) The pioneering F-M Facial Action Coding System 2.0 (F-M FACS 2.0)[3], created in 2017 by Dr. Freitas-Magalhães, presents about 2,000 segments in 4K, using 3D technology and automatic, real-time recognition.

Humans can adopt a facial expression voluntarily or involuntarily, and the neural mechanisms responsible for controlling the expression differ in each case. Voluntary facial expressions are often socially conditioned and follow a cortical route in the brain. Conversely, involuntary facial expressions are believed to be innate and follow a subcortical route in the brain.
Facial recognition is often an emotional experience for the brain, and the amygdala is highly involved in the recognition process.


The eyes are often viewed as important features of facial expressions. Aspects such as blinking rate can be used to indicate whether a person is nervous or lying. Eye contact is also considered an important aspect of interpersonal communication, although cultures differ regarding the social propriety of maintaining it.
Beyond their accessory role in spoken communication, facial expressions play a significant part in sign language, where many phrases include a facial expression as part of the sign.
There is controversy surrounding the question of whether facial expressions are universal displays among humans. Supporters of the Universality Hypothesis claim that many facial expressions are innate and have roots in evolutionary ancestors. Opponents question the accuracy of the studies used to test this claim and instead believe that facial expressions are conditioned, and that people interpret them in large part from the social situations around them.
Emotions have been studied for a long time, and results show that they play an important role in human cognitive functions and in communication between people. The human face is the most communicative part of the body for expressing emotions, and it is well established that a link exists between facial activity and emotional states. To make computer applications more believable and friendly, giving them the ability to recognize and/or express emotions has become an active research field. To perform these tasks, we first need knowledge about the relationship between emotion and facial activity.


A number of studies have already investigated this relationship. However, almost all of them analyzed the relationship without taking time into account; they did not examine it in the time domain. In this paper, we propose a work on analyzing the relationship between emotions and facial activity in the time domain. Our goal is to find the temporal patterns of facial activity for six basic emotions (happiness, sadness, anger, fear, surprise, disgust). To perform this task, we analyzed a spontaneous video database to consider how facial activities related to the six basic emotions unfold over time. From there, we derive general temporal patterns for the facial expressions of each of the six emotions.
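To illustrate what a temporal analysis of facial activity can involve, the sketch below locates the onset, apex, and offset of a single Action Unit from a per-frame intensity trace. This is a minimal, hypothetical example: the AU trace, the threshold value, and the function name are illustrative assumptions, not the database or method used in this paper.

```python
def temporal_segments(intensities, threshold=0.5):
    """Return (onset, apex, offset) frame indices of the first AU activation.

    onset  -- first frame where intensity rises to or above the threshold
    apex   -- frame of peak intensity within the activation
    offset -- first frame after onset where intensity falls below the
              threshold again (None if the clip ends while still active)
    """
    onset = offset = None
    for i, v in enumerate(intensities):
        if onset is None and v >= threshold:
            onset = i
        elif onset is not None and v < threshold:
            offset = i
            break
    if onset is None:
        return None  # the AU never activated in this clip
    end = offset if offset is not None else len(intensities)
    apex = max(range(onset, end), key=lambda i: intensities[i])
    return onset, apex, offset

# Example: a smile-like AU12 intensity trace, one value per video frame
trace = [0.0, 0.1, 0.6, 0.9, 1.0, 0.8, 0.4, 0.1]
print(temporal_segments(trace))  # → (2, 4, 6)
```

Comparing such onset-to-apex and apex-to-offset durations across many spontaneous clips is one simple way to characterize the temporal pattern of an expression for each emotion.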
