Existing annotated databases of facial expressions in the wild are small and mostly cover discrete emotions (aka the categorical model). Very few annotated facial databases exist for affective computing in the continuous dimensional model (e.g., valence and arousal). To meet this need, we have created AffectNet, a new database of facial expressions in the wild, by collecting and annotating facial images. AffectNet contains more than 1M facial images collected from the Internet by querying three major search engines using 1250 emotion-related keywords in six different languages. About half of the retrieved images (~440K) were manually annotated for the presence of seven discrete facial expressions (categorical model) and the intensity of valence and arousal (dimensional model). AffectNet is by far the largest database of facial expressions, valence, and arousal in the wild, enabling research on automated facial expression recognition in two different emotion models. Two baseline deep neural networks are used to classify images in the categorical model and to predict the intensity of valence and arousal. Various evaluation metrics show that our deep neural network baselines perform better than conventional machine learning methods and off-the-shelf facial expression recognition systems.
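To make the two emotion models concrete, below is a minimal PyTorch sketch of a joint baseline with one convolutional backbone, a categorical-expression head, and a valence/arousal regression head. The backbone depth, layer sizes, input resolution, and loss weighting are illustrative assumptions, not the architecture or training procedure used in the AffectNet paper.

```python
# Illustrative sketch only: a small CNN with two heads, one for the
# categorical model (discrete expressions) and one for the dimensional
# model (valence and arousal). Not the paper's baseline architecture.
import torch
import torch.nn as nn

NUM_EXPRESSIONS = 7  # discrete expression categories, per the description above


class BaselineAffectModel(nn.Module):
    def __init__(self, num_expressions: int = NUM_EXPRESSIONS):
        super().__init__()
        # Small convolutional backbone (illustrative; the paper's baselines
        # are deeper CNNs).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.expression_head = nn.Linear(64, num_expressions)  # categorical model
        self.va_head = nn.Linear(64, 2)                        # valence, arousal

    def forward(self, x):
        features = self.backbone(x)
        logits = self.expression_head(features)
        # Valence and arousal are continuous values in [-1, 1], so tanh keeps
        # the predictions in that range.
        va = torch.tanh(self.va_head(features))
        return logits, va


if __name__ == "__main__":
    model = BaselineAffectModel()
    images = torch.randn(4, 3, 224, 224)            # dummy batch of face crops
    target_expr = torch.randint(0, NUM_EXPRESSIONS, (4,))
    target_va = torch.empty(4, 2).uniform_(-1, 1)   # dummy valence/arousal labels

    logits, va = model(images)
    loss = nn.CrossEntropyLoss()(logits, target_expr) + nn.MSELoss()(va, target_va)
    print(f"combined loss: {loss.item():.4f}")
```

In practice the classification and regression losses would be weighted and tuned separately, and the real annotations (expression label plus valence and arousal per image) would replace the dummy tensors above.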
Publication: AffectNet: A New Database for Facial Expression, Valence, and Arousal Computation in the Wild
STUDENTS: Please ask your academic advisor/supervisor to request access to AffectNet.
IMPORTANT: All papers (or any publicly available text) that use the complete or partial images of the database must cite the following paper:
Ali Mollahosseini, Behzad Hasani, and Mohammad H. Mahoor, AffectNet: A New Database for Facial Expression, Valence, and Arousal Computation in the Wild, IEEE Transactions on Affective Computing, 2017.
Clarification: The AffectNet database is intended for research purposes only. Students must have their academic advisor/supervisor request access to the database. All papers or publicly available text that use the complete or partial images of the database must cite the paper listed above.
The numbers of manually annotated images in the training and validation sets for each emotion category are given in the following table.
To download AffectNet, only lab managers or professors can request access by downloading, completing, and signing the LICENSE AGREEMENT FILE. Once the agreement is completed, use the following form to submit your request, and make sure you attach the signed agreement file to the form.
Submit A Request