Facial muscle action analysis


FERA2011: The First Facial Expression Recognition and Analysis Challenge
FG’11, March 2011
Michel Valstar, Marc Méhu, Marcello Mortillaro, Maja Pantic, Klaus Scherer
Participation overview
• Data downloaded by 20 teams
• 15 submissions
• 11 accepted papers
• 13 teams in the Emotion Sub-Challenge
• 5 teams in the AU Sub-Challenge
• Institutes from 6 countries
• 53 researchers, median of 6 per paper
• 5 entries were multi-institute endeavours
Trends
Machine learning trends:
• 13 of 15 teams used SVMs
• Three teams used multiple-kernel SVMs, including the AU Sub-Challenge winner (sketched below)
• Only one team modelled time
• Only one team used probabilistic graphical models
Feature trends:
• Four teams encoded appearance dynamics
• Four teams used both appearance and geometric features (including the AU winner)
• Only one team inferred 3D, but it appears to have been successful (the AU winner)
• Only one team used geometric features alone; that entry ranked 11th
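To make the multiple-kernel idea concrete, here is a minimal Python sketch using scikit-learn, not a reconstruction of any entry: one RBF kernel per feature type is combined with a hand-set weight and passed to the SVM as a precomputed kernel. True multiple-kernel learning optimises the weight jointly with the classifier; the feature matrices below are random stand-ins for appearance and geometric descriptors.

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

# Random stand-ins: X_app for appearance features (e.g. LBP histograms),
# X_geo for geometric features (e.g. landmark coordinates), y for AU labels.
rng = np.random.default_rng(0)
X_app = rng.random((200, 59))
X_geo = rng.random((200, 40))
y = rng.integers(0, 2, 200)

# One kernel per feature type, mixed with a fixed weight alpha.
# A real multiple-kernel SVM would learn alpha from the data.
alpha = 0.5
K_train = alpha * rbf_kernel(X_app) + (1 - alpha) * rbf_kernel(X_geo)
clf = SVC(kernel="precomputed").fit(K_train, y)

# At test time the combined kernel is evaluated between test and training frames.
K_test = (alpha * rbf_kernel(X_app[:10], X_app)
          + (1 - alpha) * rbf_kernel(X_geo[:10], X_geo))
print(clf.predict(K_test))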
Baseline System – LBP-based Expression Recognition
Local Binary Pattern appearance descriptors are applied to the face region to detect AUs and discrete emotions:
• The face is registered using the detected eyes.
• Uniform Local Binary Pattern (LBP) features are computed at every pixel.
• The face is divided into a 10×10 grid of blocks; in each block a 256-bin histogram of the LBP features is generated.
• For every AU a GentleBoost-SVM is learned. Upper-face AUs use the concatenated histograms of the top five rows of blocks, lower-face AUs those of the bottom five rows.
• For every emotion a GentleBoost-SVM is learned using all rows. SVM predictions are made per frame; the final decision is taken by voting (a sketch of the pipeline follows).
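Below is a minimal sketch of this pipeline, assuming a registered greyscale face crop; skimage and scikit-learn stand in for the original implementation, a plain LinearSVC replaces the GentleBoost-SVM stage, and the basic 256-code LBP encoding is used to match the 256-bin histograms described above.

import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import LinearSVC

def block_lbp_histograms(face, grid=10, n_bins=256):
    # 8-neighbour, radius-1 LBP; the default encoding gives codes in [0, 255].
    lbp = local_binary_pattern(face, P=8, R=1)
    h, w = face.shape
    bh, bw = h // grid, w // grid
    hists = np.empty((grid, grid, n_bins))
    for r in range(grid):
        for c in range(grid):
            block = lbp[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            hist, _ = np.histogram(block, bins=n_bins, range=(0, n_bins))
            hists[r, c] = hist / max(hist.sum(), 1)  # per-block normalisation
    return hists

def au_feature(hists, upper_face):
    # Top five block rows feed upper-face AUs, bottom five the lower-face AUs.
    rows = hists[:5] if upper_face else hists[5:]
    return rows.ravel()

def emotion_decision(per_frame_labels):
    # Per-frame SVM predictions are fused into one label by majority vote.
    labels, counts = np.unique(per_frame_labels, return_counts=True)
    return labels[np.argmax(counts)]

# Training one lower-face AU detector on hypothetical data; LinearSVC
# stands in for the baseline's GentleBoost-SVM.
faces = np.random.randint(0, 256, (50, 100, 100)).astype(np.uint8)
y_au = np.random.randint(0, 2, 50)  # hypothetical per-frame AU labels
X = np.stack([au_feature(block_lbp_histograms(f), upper_face=False)
              for f in faces])
clf = LinearSVC().fit(X, y_au)

For the discrete-emotion classifiers the same block histograms are used with all ten rows, and emotion_decision turns the per-frame predictions for a sequence into a single vote.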
Baseline Overview (LAUD)
B. Jiang, M.F. Valstar, and M. Pantic, “Action Unit detection using sparse appearance descriptors in space-time video volumes”, FG’11
Winner of the Emotion Detection Sub-Challenge
1. University of California, Riverside: Songfan Yang, Bir Bhanu
2. UIUC-UMC: Usman Tariq, Xi Zhou, Kai-Hsiang Lin, Zhen Li, Zhaowen Wang, Vuong Le, Thomas Huang, Tony Han, Xutao Lv
3. Karlsruhe Institute of Technology: Tobias Gehrig, Hazim Ekenel
Ranking – Emotion Sub-Challenge
Person-independent/specific emotion detection
Emotion secondary test results
Winner of the Action Unit Detection Sub-Challenge
1. University of the French West Indies and Guiana: Lionel Prevost, Thibaud Senechal, Vincent Rapp, Hanan Salam, Renaud Seguier, Kevin Bailly
2. University of California, San Diego: Nicholas Butko, Javier Movellan, Tingfan Wu, Paul Ruvolo, Jacob Whitehill, Marian Bartlett
3. Karlsruhe Institute of Technology: Tobias Gehrig, Hazim Ekenel
Ranking – Action Unit Sub-Challenge
[Bar chart: F1-measure per entry, axis 0 to 0.8; entries as labelled include IRIS, UCSD, KIT, Chew, MIT-Cambridge, and the Baseline.]
Person-independent/specific AU detection
[Bar chart: F1-measure per entry, axis 0 to 0.8, comparing person-specific and person-independent results for IRIS, UCSD, KIT, U. Brisbane, MIT-Cambridge, and the Baseline.]
Conclusion and new goals
Conclusions:
• Person-dependent discrete emotion detection is highly successful
• Dynamic appearance features are very successful
• Combined appearance/geometric approaches seem to be the way forward
• AU detection is far from solved
New avenues:
• Given the high success on discrete emotions, dimensional affect may be a new goal to pursue
• Explicitly detecting the temporal segments of facial expressions
• Analysing the sensitivity of approaches to AU intensities
• Leveraging person-specific approaches for AU detection
• Detecting AU intensity levels