Datasets

Datasets used, or planned for use, in this project

BP4D

 This is a 3D video database of spontaneous facial expressions in a diverse group of young adults. Well-validated emotion inductions were used to elicit expressions of emotion and paralinguistic communication. Frame-level ground-truth for facial actions was obtained using the Facial Action Coding System. Facial features were tracked in both 2D and 3D domains.  

Click Here for the Dataset Website

MMI

This dataset addresses the shortage of databases that capture behaviour together with its corresponding affect. To that end, the MMI-Facial Expression database was conceived in 2002 as a resource for building and evaluating facial expression recognition algorithms. The database addresses a number of key omissions in other databases of facial expressions. In particular, it contains recordings of the full temporal pattern of a facial expression: from neutral, through the onset, apex, and offset phases, and back to a neutral face.

Click Here for the Dataset Website

MPI

This database contains videos of facial action units which were recorded starting in autumn of 2003 at the MPI for Biological Cybernetics in the Face and Object Recognition Group, department of Prof. Bülthoff, using the Videolab facilities created by Mario Kleiner and Christian Wallraven.

Click Here for the Dataset Website

SEMAINE

The SEMAINE database was collected for the SEMAINE project, which aims to build a SAL (Sensitive Artificial Listener): a multimodal dialogue system that interacts with humans through a virtual character, can sustain an interaction with a user for some time, and reacts appropriately to the user's non-verbal behaviour.

Click Here for the Dataset Website

FER 2013

The dataset consists of 48x48 pixel grayscale images of faces. The faces have been automatically registered so that the face is more or less centered and occupies about the same amount of space in each image. The task is to categorize each face, based on the emotion shown in the facial expression, into one of seven categories (0=Angry, 1=Disgust, 2=Fear, 3=Happy, 4=Sad, 5=Surprise, 6=Neutral).

Click Here for the Dataset Website
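As a quick illustration of the format described above, the sketch below decodes a FER-2013 example. It assumes the common CSV release of the dataset, where each row stores the label as an integer and the image as a space-separated string of 2304 pixel values; the helper names are our own.

```python
# Emotion labels as defined in the FER-2013 description above.
EMOTIONS = {0: "Angry", 1: "Disgust", 2: "Fear", 3: "Happy",
            4: "Sad", 5: "Surprise", 6: "Neutral"}

def parse_pixels(pixel_str, size=48):
    """Turn a space-separated pixel string from a FER-2013 CSV row
    into a size x size grid of ints (row-major order)."""
    values = [int(v) for v in pixel_str.split()]
    if len(values) != size * size:
        raise ValueError("expected a %dx%d image" % (size, size))
    return [values[r * size:(r + 1) * size] for r in range(size)]
```

For example, `parse_pixels(row_pixels)` yields a list of 48 rows of 48 intensities, and `EMOTIONS[int(row_label)]` maps the label column to its name.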

CMU MULTI-PIE

 The CMU Multi-PIE face database contains more than 750,000 images of 337 people recorded in up to four sessions over the span of five months. Subjects were imaged under 15 view points and 19 illumination conditions while displaying a range of facial expressions. In addition, high resolution frontal images were acquired as well. In total, the database contains more than 305 GB of face data. 

Click Here for the Dataset Website

CK, CK+

 The Cohn-Kanade AU-Coded Facial Expression Database is for research in automatic facial image analysis and synthesis and for perceptual studies. 

Click Here for the Dataset Website

EmotiW 2014

Dataset used in the First Emotion Recognition In The Wild Challenge, organised at the ACM International Conference on Multimodal Interaction 2013, Sydney.

Click Here for the Dataset Website

AFW (68 Landmark Annotations)

AFW is an "in the wild" annotated dataset for face detection, pose estimation, and landmark estimation in real-world, cluttered images, accompanied by extensive results on standard face benchmarks. The dataset was used extensively in Zhu and Ramanan's face detection work (X. Zhu, D. Ramanan, "Face detection, pose estimation and landmark localization in the wild").

Click Here for the Dataset Website

LFPW (68 Landmark Annotations)

LFPW consists of 811 face images downloaded from the web using simple text queries on sites such as google.com, flickr.com, and yahoo.com. Each image is annotated with 68 facial landmark points.

Click Here for the Dataset Website
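The 68-landmark annotations for datasets like AFW and LFPW are commonly distributed as plain-text .pts files. A minimal parser sketch, assuming the iBUG-style .pts layout (a short header with the point count, an opening "{", one "x y" pair per line, and a closing "}"):

```python
def read_pts(text):
    """Parse iBUG-style .pts landmark text into a list of (x, y) tuples.

    Expected layout (an assumption about the distribution format):
        version: 1
        n_points: 68
        {
        x1 y1
        ...
        }
    """
    points = []
    in_body = False
    for line in text.splitlines():
        line = line.strip()
        if line == "{":
            in_body = True          # landmark coordinates start here
        elif line == "}":
            in_body = False         # end of the coordinate block
        elif in_body and line:
            x, y = line.split()
            points.append((float(x), float(y)))
    return points
```

A full annotation file should then yield 68 points; checking `len(read_pts(text)) == 68` is a cheap sanity test when loading these datasets.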