AffectNet

Existing annotated databases of facial expressions in the wild are small and mostly cover only discrete emotions (the categorical model). Very few annotated facial databases support affective computing in the continuous dimensional model (e.g., valence and arousal).

To meet this need, we created AffectNet, a new database of facial expressions in the wild, by collecting and annotating facial images. AffectNet contains more than 1M facial images collected from the Internet by querying three major search engines with 1,250 emotion-related keywords in six different languages. About half of the retrieved images (~440K) were manually annotated for the presence of seven discrete facial expressions (categorical model) and for the intensity of valence and arousal (dimensional model). AffectNet is by far the largest in-the-wild database of facial expressions, valence, and arousal, enabling research on automated facial expression recognition in two different emotion models. Two baseline deep neural networks are used to classify images in the categorical model and to predict the intensity of valence and arousal. Across various evaluation metrics, these deep neural network baselines outperform conventional machine learning methods and off-the-shelf facial expression recognition systems.
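
For concreteness, the sketch below shows one way such a two-headed baseline can be wired up: a shared backbone feeding a classification head for the seven categorical expressions and a regression head for valence and arousal. This is a minimal PyTorch illustration, not the paper's actual baseline architecture; the tiny backbone, the tanh bounding of valence/arousal to [-1, 1], and the combined loss are assumptions made for demonstration only.

```python
# Minimal sketch of a two-headed facial affect baseline (illustrative only).
# Assumes PyTorch; the backbone, heads, and loss weighting are placeholders,
# not the architecture used in the AffectNet paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AffectBaseline(nn.Module):
    def __init__(self, num_expressions: int = 7):
        super().__init__()
        # Tiny stand-in backbone; a real baseline would use a much deeper CNN.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.expression_head = nn.Linear(64, num_expressions)  # categorical model
        self.va_head = nn.Linear(64, 2)                        # dimensional model

    def forward(self, x):
        features = self.backbone(x)
        logits = self.expression_head(features)   # expression class scores
        va = torch.tanh(self.va_head(features))   # valence/arousal in (-1, 1)
        return logits, va

# Dummy training step covering both emotion models at once.
model = AffectBaseline()
images = torch.randn(4, 3, 224, 224)              # batch of face crops
labels = torch.randint(0, 7, (4,))                # categorical expression labels
va_targets = torch.empty(4, 2).uniform_(-1, 1)    # valence/arousal annotations
logits, va = model(images)
loss = F.cross_entropy(logits, labels) + F.mse_loss(va, va_targets)
loss.backward()
```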

All submitted papers (or any publicly available text) that use images from the database, in whole or in part, must cite the following paper:

Ali Mollahosseini, Behzad Hasani, and Mohammad H. Mahoor, “AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild,” IEEE Transactions on Affective Computing, in press.


AffectNet Contact Form

The "Researcher" has requested permission to use the AffectNet database (the "Database") created at the University of Denver. In exchange for such permission, Researcher hereby agrees to the following terms and conditions:

1- Researcher listed above assumes all responsibilities for the correct handling of the Database and guarantees that all national protocols and laws will be followed.
2- Researcher shall use the Database only for non-commercial research and educational purposes.
3- Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify University of Denver, including its employees, Trustees, officers, and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.
4- Researcher shall not provide the Database, either partially or as a whole, to third parties.
5- Researcher will obtain IRB approval, where applicable, at his/her home institution for any use of the Database.
6- All copies of the Database shall ONLY be stored on computers at the Researcher's institution listed above.
7- University of Denver reserves the right to terminate Researcher's access to the Database at any time.
8- University of Denver makes no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
9- If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
10- The law of the State of Colorado shall apply to all disputes under this agreement.
11- All submitted papers (or any publicly available text) that use images from the Database, in whole or in part, must cite the following paper:

Ali Mollahosseini, Behzad Hasani, and Mohammad H. Mahoor, “AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild,” IEEE Transactions on Affective Computing, in press.

By clicking "Agree & Submit", you agree to the terms of use stated above.