Texture and shape information fusion for facial action unit recognition
Kotsia, Irene ORCID: https://orcid.org/0000-0002-3716-010X, Zafeiriou, Stefanos, Nikolaidis, Nikolaos and Pitas, Ioannis
(2008)
Texture and shape information fusion for facial action unit recognition.
In: The First International Conference on Advances in Computer-Human Interaction (ACHI 2008), 10 - 15 February 2008, Sainte Luce, Martinique.
[Conference or Workshop Item]
Abstract
A novel method that fuses texture and shape information to achieve Facial Action Unit (FAU) recognition from video sequences is proposed. To extract the texture information, a subspace method based on Discriminant Non-negative Matrix Factorization (DNMF) is applied to the difference images of the video sequence, computed from the neutral and the most expressive frame, to extract the desired classification label. The shape information consists of the deformed Candide facial grid (more specifically, the grid node displacements between the neutral and the most expressive facial expression frame) that corresponds to the facial expression depicted in the video sequence. The shape information is then classified using a two-class Support Vector Machine (SVM) system. The fusion of texture and shape information is performed using Median Radial Basis Function (MRBF) Neural Networks (NNs) in order to detect the set of FAUs present. The accuracy achieved on the Cohn-Kanade database is 92.1% when recognizing the 17 FAUs responsible for facial expression development.
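The two-branch pipeline described in the abstract can be sketched in Python. This is a minimal illustration only, with several loudly labeled stand-ins: scikit-learn's plain NMF replaces the paper's Discriminant NMF, an RBF-kernel SVM replaces the Median RBF neural network fusion stage, and all data (frames, Candide grid displacements, labels) is synthetic.

```python
# Hedged sketch of the texture/shape fusion pipeline from the abstract.
# Assumptions: plain NMF stands in for DNMF, and an RBF-kernel SVM
# stands in for the Median RBF (MRBF) neural network fusion stage.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic data: 40 sequences, each summarized by a neutral and a most
# expressive frame (flattened 8x8 images) plus Candide grid node
# displacements between those two frames.
n_seq, img_dim, n_nodes = 40, 64, 10
neutral = rng.random((n_seq, img_dim))
expressive = rng.random((n_seq, img_dim))
displacements = rng.standard_normal((n_seq, 2 * n_nodes))
labels = rng.integers(0, 2, n_seq)  # presence/absence of one FAU

# Texture branch: decompose the non-negative difference images.
diff = np.abs(expressive - neutral)
texture_feats = NMF(n_components=5, init="nndsvda", max_iter=500,
                    random_state=0).fit_transform(diff)

# Shape branch: two-class SVM on the grid node displacements.
shape_svm = SVC(kernel="linear").fit(displacements, labels)
shape_scores = shape_svm.decision_function(displacements).reshape(-1, 1)

# Fusion stage: combine texture features and shape scores with an
# RBF-kernel classifier (stand-in for the MRBF neural network).
fused = np.hstack([texture_feats, shape_scores])
fusion_clf = SVC(kernel="rbf").fit(fused, labels)
predicted = fusion_clf.predict(fused)
```

In the paper one such detector would be trained per FAU (17 in total), and the fusion stage outputs the set of FAUs present in the sequence.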
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Research Areas: | A. > School of Science and Technology > Computer Science; A. > School of Science and Technology > Computer Science > Intelligent Environments group |
| Item ID: | 9613 |
| Depositing User: | Devika Mohan |
| Date Deposited: | 04 Dec 2012 07:16 |
| Last Modified: | 13 Oct 2016 14:25 |
| URI: | https://eprints.mdx.ac.uk/id/eprint/9613 |