
Topic Tutorial: Facial Expression & Emotion Detection for Man-Machine Interaction

Two legends of their own...

People we can't forget...

Technology has made them eternal...

Why an Emerging Trend?


The demand for humanoid robots as service robots for everyday life has increased in recent years. Detecting emotions enables the robot to react appropriately to the emotional state of its communication partner. Humanoid robots are, as many movies show, of great interest: as entertainment robots, as tireless workers, or for the care of elderly people. All these scenarios have one thing in common: the robot is an accepted member of society, and it must therefore behave as a human would.

Outline

Facial Expression Estimation
  Face Detection
  Facial Feature Extraction: Anatomical Constraints, Anthropometry, FP Localization, FAP Calculation, Expression Profiles

Gesture Analysis

Face Detection
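The slides do not name a particular detector, so the sketch below stands in with OpenCV's stock Haar cascade purely for illustration; the parameters are common defaults, not values from the presentation.

```python
import cv2

def detect_faces(image_bgr):
    """Return (x, y, w, h) rectangles for faces found in a BGR image."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    # Standard multi-scale sliding-window detection.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```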

Facial Feature Extraction

Multiple-cue facial feature boundary extraction for:
  Eyebrows
  Eyes
  Nose
  Mouth

Each mask is either edge-based or intensity-based, and each mask is validated independently.
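A minimal sketch of the two mask types, assuming a grayscale region of interest around each feature; the Canny thresholds and the intensity cut-off are illustrative assumptions, not values from the slides.

```python
import cv2

def edge_based_mask(gray_roi):
    # Edge-based mask: feature boundaries show up as strong gradients.
    # The Canny thresholds (50, 150) are illustrative assumptions.
    return cv2.Canny(gray_roi, 50, 150) > 0

def intensity_based_mask(gray_roi):
    # Intensity-based mask: brows, eyes and mouth are darker than skin,
    # so pixels well below the ROI mean are kept.
    threshold = gray_roi.mean() - gray_roi.std()
    return gray_roi < threshold
```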

Multiple-cue facial feature extraction: an example (figure).

ANTHROPOMETRY - final mask validation


Facial distances: male/female separation measured by the US Army over a 30-year period.

The measured distances are normalized by division with Distance 7 (DA7), i.e. the distance between the inner corners of the left and right eyes, two points the human cannot move.

Distances in the figures are normalized by division with distance DA7:
  DA5n  = DA5 / DA7
  DA10n = DA10 / DA7
  DAewn = ((DA5 - DA7) / 2) / DA7   (normalized eye width, calculated from DA5 and DA7)

D5n    DA5n_min  DA5n_max  D10n   DA10n_min  DA10n_max  DAewn_min  DAewn_max  Dew_ln  Dew_rn
2.129  2.517     3.349     0.919  1.031      1.515      0.677      0.452      0.840   1.077
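A hedged sketch of the final mask-validation step: distances measured from a candidate mask set are normalized by DA7 and checked against the anthropometric min/max bounds above. The function and key names are hypothetical.

```python
def validate_masks(da5, da10, da7, bounds):
    """bounds: {'DA5n': (lo, hi), 'DA10n': (lo, hi), 'DAewn': (lo, hi)}."""
    normalized = {
        'DA5n':  da5 / da7,                # DA5n  = DA5 / DA7
        'DA10n': da10 / da7,               # DA10n = DA10 / DA7
        'DAewn': ((da5 - da7) / 2) / da7,  # normalized eye width
    }
    # A mask set is plausible only if every distance lies in its range.
    return all(lo <= normalized[key] <= hi
               for key, (lo, hi) in bounds.items())
```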

FAP (Facial Animation Parameters)

  Discrete features offer a neat, symbolic representation of expressions
  Not constrained to a specific face model
  Suitable for face cloning applications
  MPEG-4 compatible: unified treatment of the analysis and synthesis parts in MMI environments

FAPs estimation

AU   Description
1    Inner brow raiser
2    Outer brow raiser
4    Brow lowerer
10   Upper lip raiser
12   Lip corner puller
15   Lip corner depressor
20   Lip stretcher
24   Lip presser
26   Jaw drop

Detectable action units with feature points (figure).
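The slides only state which AUs are detectable from feature points; the sketch below illustrates the idea with hypothetical distance names and an assumed 10% threshold relative to the neutral face.

```python
def detect_action_units(current, neutral, tol=0.10):
    """current/neutral: dicts of normalized feature-point distances."""
    aus = set()
    if current['brow_eye'] > neutral['brow_eye'] * (1 + tol):
        aus.add(1)   # AU1: brow raised above its neutral position
    if current['brow_eye'] < neutral['brow_eye'] * (1 - tol):
        aus.add(4)   # AU4: brow lowerer
    if current['mouth_width'] > neutral['mouth_width'] * (1 + tol):
        aus.add(20)  # AU20: lip stretcher
    if current['mouth_open'] > neutral['mouth_open'] * (1 + tol):
        aus.add(26)  # AU26: jaw drop
    return aus
```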

Expression Profiles

Emotion     Original Definition    Adapted Definition
Fear        1+2+4+5+20+25          (1L+1R+2L+2R+20L+20R)/6
Surprise    1+2+5+26               (1L+1R+2L+2R+26+26)/6
Anger       4+5+7+24               (4L+4R+24L+24R)/4
Sadness     1+4+15                 (4L+4R+15L+15R)/4
Disgust     4+9+10+17              (4L+4R+10)/3
Happiness   6+12+25                (12L+12R)/2
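The adapted profiles translate directly into executable rules: each emotion's activation is the mean strength of its (left/right) action units. In the sketch below, AU strengths in [0, 1] are an assumption, in the spirit of the fuzzy-rules architecture mentioned in the conclusion.

```python
ADAPTED_PROFILES = {
    'fear':      ('1L', '1R', '2L', '2R', '20L', '20R'),
    'surprise':  ('1L', '1R', '2L', '2R', '26', '26'),
    'anger':     ('4L', '4R', '24L', '24R'),
    'sadness':   ('4L', '4R', '15L', '15R'),
    'disgust':   ('4L', '4R', '10'),
    'happiness': ('12L', '12R'),
}

def emotion_scores(au_strength):
    """au_strength: dict mapping AU labels such as '1L' to [0, 1]."""
    return {emotion: sum(au_strength.get(au, 0.0) for au in aus) / len(aus)
            for emotion, aus in ADAPTED_PROFILES.items()}
```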

Gesture Analysis

Gestures are too ambiguous to indicate emotion on their own; they are used to support the confidence of the facial expression analysis outcome. The table below pairs each emotion with its supporting gesture classes; a lookup sketch follows.

Emotion    Gesture Class
Joy        hand clapping - high frequency
Sadness    hands over the head - posture
Anger      lift of the hand - high speed; italianate gestures
Fear       hands over the head - gesture; italianate gestures
Disgust    lift of the hand - low speed; hand clapping - low frequency
Surprise   hands over the head - gesture
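The table turns directly into a lookup yielding which emotions a detected gesture class supports. The 1.0 support value below is an assumption; gestures only reinforce the facial result, they never decide alone.

```python
GESTURE_SUPPORT = {
    'hand clapping - high frequency': ('joy',),
    'hands over the head - posture':  ('sadness',),
    'lift of the hand - high speed':  ('anger',),
    'italianate gestures':            ('anger', 'fear'),
    'hands over the head - gesture':  ('fear', 'surprise'),
    'lift of the hand - low speed':   ('disgust',),
    'hand clapping - low frequency':  ('disgust',),
}

def gesture_support(gesture_class):
    """Binary per-emotion support for one detected gesture class."""
    return {emotion: 1.0
            for emotion in GESTURE_SUPPORT.get(gesture_class, ())}
```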

Emotion analysis system overview

Block diagram: Feature Point Detection → Distance Vector Construction → FAP Estimation → Facial Expression Decision System → recognized emotion. The decision system also takes the distances of the neutral face, the expression profiles and the gesture analysis as inputs; f denotes the values derived from the calculated distances and G the value of a corresponding FAP. The System Interface displays the calculated FP distances, the rules activated and the recognized emotion.
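As a minimal sketch of the decision step in the diagram, assuming the facial scores come from the expression profiles and the gesture support from the table above; the fusion weight is an assumption, since the slides give no coefficients.

```python
def decide(facial_scores, gesture_support, weight=0.25):
    # Gestures only reinforce the facial outcome; they never decide alone.
    fused = {emotion: score + weight * gesture_support.get(emotion, 0.0)
             for emotion, score in facial_scores.items()}
    return max(fused, key=fused.get)

# Example: an ambiguous facial reading resolved by high-frequency clapping.
print(decide({'joy': 0.42, 'surprise': 0.38}, {'joy': 1.0}))  # -> 'joy'
```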

Conclusion
The system is divided into two main parts: feature detection and emotion interpretation. The user's emotional state is estimated with a fuzzy-rules architecture, and the evaluation approach is based on anthropometric models and measurements.

Thank You!!
