EmotioNet: An Accurate, Real-Time Algorithm for the Automatic Annotation of a Million Facial Expressions in the Wild
Abstract
Research in face perception and emotion theory requires very large annotated databases of images of facial expressions of emotion. Annotations should include Action Units (AUs) and their intensities as well as emotion category. This goal cannot be readily achieved manually. Herein, we present a novel computer vision algorithm to annotate a large database of one million images of facial expressions of emotion in the wild (i.e., face images downloaded from the Internet). First, we show that this newly proposed algorithm can recognize AUs and their intensities reliably across databases. To our knowledge, this is the first published algorithm to achieve highly accurate recognition of AUs and their intensities across multiple databases. Our algorithm also runs in real time (>30 images/second), allowing it to work with large numbers of images and video sequences. Second, we use WordNet to download 1,000,000 images of facial expressions with associated emotion keywords from the Internet. These images are then automatically annotated with AUs, AU intensities and emotion categories by our algorithm. The result is a highly useful database that can be readily queried using semantic descriptions for applications in computer vision, affective computing, social and cognitive psychology and neuroscience, e.g., "show me all the images with happy faces" or "all images with AU 1 at intensity c".
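The semantic queries mentioned at the end of the abstract ("all images with happy faces", "all images with AU 1 at intensity c") could be sketched as simple filters over per-image annotation records. The record layout, field names, and example data below are assumptions for illustration, not the database's actual schema:

```python
# Hypothetical EmotioNet-style annotation records: each image carries an
# emotion category and a mapping from AU number to intensity (a-e, per FACS).
annotations = [
    {"image": "img_001.jpg", "emotion": "happy", "aus": {1: "c", 12: "d"}},
    {"image": "img_002.jpg", "emotion": "sad",   "aus": {1: "a", 4: "b"}},
    {"image": "img_003.jpg", "emotion": "happy", "aus": {6: "c", 12: "c"}},
]

def query_by_au(records, au, intensity=None):
    """Return images annotated with the given AU, optionally at a given intensity."""
    return [
        r["image"]
        for r in records
        if au in r["aus"] and (intensity is None or r["aus"][au] == intensity)
    ]

def query_by_emotion(records, emotion):
    """Return images annotated with the given emotion category."""
    return [r["image"] for r in records if r["emotion"] == emotion]

print(query_by_au(annotations, 1, "c"))        # "all images with AU 1 at intensity c"
print(query_by_emotion(annotations, "happy"))  # "all the images with happy faces"
```

A real deployment at the million-image scale described in the paper would presumably index these annotations in a database rather than scan a list, but the query semantics are the same.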