Facial animation retargeting and control based on a human appearance space



Expressive facial animations are essential to the realism and credibility of virtual characters. Parameter-based animation methods offer precise control over facial configurations, whereas performance-based animation benefits from the naturalness of captured human motion. In this paper, we propose an animation system that combines the advantages of both approaches. By analyzing a database of facial motion, we construct a human appearance space. This appearance space provides a continuous parameterization of human facial movements while encapsulating the coherence of real facial deformations. We present a method to optimally construct an analogous appearance space for a synthetic character. The link between the two appearance spaces makes it possible to retarget facial animation from a video source onto a synthetic face. Moreover, the topological characteristics of the appearance space allow us to detect the principal variation patterns of a face and automatically reorganize them in a low-dimensional control space. The control space acts as an interactive user interface for manipulating the facial expressions of any synthetic face. This interface makes it simple and intuitive to generate still facial configurations for keyframe animation, as well as complete temporal sequences of facial movements. The resulting animations combine the flexibility of a parameter-based system with the realism of real human motion. Copyright © 2010 John Wiley & Sons, Ltd.
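The abstract does not detail how the appearance space is built or how the principal variation patterns are extracted. As a rough illustration only, a minimal PCA-style sketch shows how a database of facial configurations could yield a low-dimensional control space; every dimension, variable name, and the random data below are hypothetical assumptions, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical database: 200 captured frames of 30 2-D facial
# landmarks, each frame flattened to a 60-D feature vector
frames = rng.normal(size=(200, 60))

# center the data on the mean face configuration
mean = frames.mean(axis=0)
centered = frames - mean

# SVD of the centered data exposes the principal variation
# patterns of the face, ordered by how much motion they explain
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# keep the k strongest patterns as axes of a low-dimensional
# control space (k = 5 is an arbitrary illustrative choice)
k = 5
control_axes = Vt[:k]                 # (5, 60) basis vectors
coords = centered @ control_axes.T    # (200, 5) control coordinates

# a point in the control space maps back to a face configuration
recon = coords @ control_axes + mean  # (200, 60) approximate frames
```

In this sketch, editing the five entries of a `coords` row and projecting back through `control_axes` plays the role of the interactive control space described in the abstract: each axis moves the face along one dominant, data-driven variation pattern.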