
Facial Animation by the Manipulation of a Few Control Points Subject to Muscle Constraints

Hiroyuki Kubo* Hiroaki Yanagisawa* Akinobu Maejima* Demetri Terzopoulos** Shigeo Morishima*

* Waseda University, Tokyo, Japan

** University of California, Los Angeles, USA


1. Introduction

Muscle-based Facial Animation:
• One of the best approaches to realizing a realistic, lifelike character.
• However, the optimal control of each muscle to generate facial animation is complicated.

The goal of our work is…
• To synthesize realistic facial animation with a variety of facial expressions by automatically estimating facial muscle parameters through the manipulation of only a few control points on the face.

• To develop a facial expression cloning method that transfers an actor’s muscle parameters to another character.

Y. Lee, D. Terzopoulos, K. Waters, Realistic modeling for facial animation, Proc. SIGGRAPH 95, 1995, 55-62.

2. Facial Measurement and Modeling

Constructing Actor’s Facial Model:
• Scanning the subject using a 3D scanner (Cyberware 3030 RGB).
• Adapting a generic face mesh to the acquired facial data (a rough sketch of this step follows below).

Motion capturing:
• 10 motion capture cameras (VICON MX-40).
• The motion of each marker on the actor’s face while he creates various facial expressions is recorded at 120 fps.
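The poster does not describe how the generic mesh is adapted to the scan, so the snippet below is only a crude stand-in to make the step concrete: fit_generic_mesh and its plain nearest-point projection are assumptions, not the authors' actual fitting procedure (which presumably uses facial landmarks and a smooth deformation).

import numpy as np
from scipy.spatial import cKDTree

def fit_generic_mesh(generic_vertices, scan_points):
    """Crude adaptation of a generic face mesh to 3D scan data:
    move each generic vertex to its nearest scanned surface point.
    generic_vertices: (n, 3) array, scan_points: (m, 3) array."""
    tree = cKDTree(scan_points)
    _, nearest = tree.query(generic_vertices)   # index of the closest scan point
    return scan_points[nearest]                 # adapted (n, 3) vertex positions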

3. Facial Muscle Parameter Estimation

Surface error of Sub-Area k:

E^k(\mathbf{p}^k) \equiv \sum_{i=1}^{n^k} \left\| \mathbf{x}^k_{r,i} - \mathbf{x}^k_{s,i}(\mathbf{p}^k) \right\|

Optimal muscle parameters of Sub-Area k:

\mathbf{p}^k_{\mathrm{optimal}} = \operatorname*{arg\,min}_{\mathbf{p}^k} E^k(\mathbf{p}^k)


Surface Error (SE) Minimization:

The simulated facial model: Uniquely determined by 37 facial muscle parameters.

To find the optimal facial muscle parameters: Minimize the difference between the surface of the actor’s face and that of the simulated face (the Surface Error).

Total of Control Points: 20 motion capture markers.

Separating the Face Area into 4 Sub-Areas:

The facial muscle parameters are too numerous for the problem to be solved easily.

Therefore, the facial surface is divided into 4 sub-areas.

Notation:
\mathbf{p}^k: muscle contraction parameters of Sub-Area k
E^k: surface error (SE) of Sub-Area k
n^k: total control points of Sub-Area k
\mathbf{x}^k_{r,i}: position of the real control point i
\mathbf{x}^k_{s,i}: position of the simulated control point i
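As a concrete illustration of the per-sub-area estimation, here is a minimal optimization sketch. It is not the authors' implementation: simulate_control_points stands in for the 37-parameter layered muscle model, the (0, 1) contraction bounds are assumed, and SciPy's general-purpose minimizer is used in place of whatever optimizer was actually employed.

import numpy as np
from scipy.optimize import minimize

def surface_error(p_k, real_pts, simulate_control_points):
    """Surface error E^k of one sub-area: summed distance between the real
    and the simulated control points, given muscle parameters p^k."""
    sim_pts = simulate_control_points(p_k)            # (n_k, 3) simulated positions
    return np.linalg.norm(real_pts - sim_pts, axis=1).sum()

def estimate_sub_area(real_pts, simulate_control_points, n_params):
    """p^k_optimal = argmin_p E^k(p) for one of the 4 sub-areas."""
    p0 = np.zeros(n_params)                           # start from relaxed muscles
    result = minimize(surface_error, p0,
                      args=(real_pts, simulate_control_points),
                      bounds=[(0.0, 1.0)] * n_params) # assumed contraction range
    return result.x

Splitting the face into 4 sub-areas turns the single 37-parameter problem into four smaller independent ones, each constrained only by the motion capture markers that lie inside its own sub-area.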

[Figure: the four sub-areas (1-4) of the face and the underlying facial muscles]

4. Facial Animation Synthesis Results

Muscle contraction → Animation of actor’s model:
Estimated by minimizing the SE, according to the movement of the 20 motion capture markers.
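Reading this section as stated, the animation is the per-frame repetition of the estimation above. A minimal sketch, assuming a marker_frames array of captured marker positions at 120 fps and reusing the hypothetical estimate_sub_area helper from the previous sketch:

import numpy as np

def estimate_animation(marker_frames, sub_areas):
    """Estimate muscle contractions for every captured frame.

    marker_frames: (num_frames, 20, 3) marker positions at 120 fps.
    sub_areas: list of (marker_indices, simulate_fn, n_params), one per sub-area.
    Returns one 37-value muscle parameter vector per frame."""
    animation = []
    for frame in marker_frames:
        per_area = [estimate_sub_area(frame[idx], sim_fn, n)
                    for idx, sim_fn, n in sub_areas]
        animation.append(np.concatenate(per_area))
    return animation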

5. Conclusion

[Figure: the control points on the face, and result images (a), (b), (c); captions below]

• We proposed the automatic synthesis of facial expression by manipulating a few control points under facial muscle constraints.

• We can easily synthesize facial expressions for any character’s facial model, because the estimated muscle parameters can be applied to it directly once a generic model has been fitted to that character’s personal data.

(a): Actor’s expression. (b): Actor’s facial model with automatically estimated muscle parameters. (c): Facial model of another character with the muscle parameters of (b) applied.
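Because the estimated values are muscle contractions rather than vertex displacements, the cloning shown in (c) amounts to re-evaluating another character's muscle model with the same parameter vector. A minimal sketch, where character_model is a hypothetical object whose deform method plays the role of the layered muscle simulator fitted to the other character:

def clone_expression(actor_params, character_model):
    """Apply the actor's estimated 37 muscle parameters to another character.
    character_model.deform(params) is assumed to return the deformed mesh."""
    return character_model.deform(actor_params)

# Per-frame cloning of a whole captured performance:
# cloned_frames = [clone_expression(p, character_model) for p in animation]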

6. Acknowledgement

This research is supported by the Japan Science and Technology Agency, CREST project.

[Figure: the layered facial tissue model, with epidermal, fascia, and skull surfaces, the dermal-fatty and muscle layers, and the epidermal, fascia, and skull nodes]

[Figure: the expression cloning pipeline, from the actor to the actor’s model (muscle parameters estimated) and on to another character (muscle parameters applied)]