Editorial Issue 24.1
Article first published online: 8 FEB 2013
Copyright © 2013 John Wiley & Sons, Ltd.
Computer Animation and Virtual Worlds
Volume 24, Issue 1, page 1, January/February 2013
How to Cite
Magnenat-Thalmann, N. and Thalmann, D. (2013), Editorial Issue 24.1. Comp. Anim. Virtual Worlds, 24: 1. doi: 10.1002/cav.1495
- Issue published online: 8 FEB 2013
This issue contains five regular papers. Ahmad Abdul Karim, Thibaut Gaudin, Alexandre Meyer, Axel Buendia and Saïda Bouakaz, from Université Lyon 1 and Spir.Ops in France, present a fully procedural method capable of generating, in real time, a wide range of locomotion styles for multi-legged characters in a dynamic environment, without using any motion data. Their system consists of several independent blocks: a character controller, a gait/tempo manager, a 3D path constructor and a footprint planner. These four modules work cooperatively to compute, in real time, the footprints and the 3D trajectories of the feet and the pelvis. The system can animate dozens of creatures using dedicated level-of-detail techniques and is fully controllable, allowing the user to design a multitude of locomotion styles through a user-friendly interface.
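To make the cooperative structure concrete, the following is a minimal sketch of how four such modules might pass data to one another each frame. All class names, interfaces and the toy step rule are hypothetical illustrations, not the authors' implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of four cooperating locomotion modules;
# names and interfaces are illustrative only.

@dataclass
class CharacterController:
    """Turns a user direction into a desired planar velocity."""
    speed: float = 1.0
    def desired_velocity(self, direction):
        return (direction[0] * self.speed, direction[1] * self.speed)

@dataclass
class GaitManager:
    """Chooses a step length from the current speed (toy rule)."""
    base_step: float = 0.5
    def step_length(self, speed):
        return self.base_step * max(speed, 0.1)

@dataclass
class PathConstructor:
    """Integrates the velocity into a 2D pelvis path."""
    def build(self, start, velocity, steps, dt=0.1):
        path, (x, y) = [], start
        for _ in range(steps):
            x += velocity[0] * dt
            y += velocity[1] * dt
            path.append((x, y))
        return path

@dataclass
class FootprintPlanner:
    """Samples footprints along the path at the gait's step length
    (Manhattan distance, for simplicity)."""
    def plan(self, path, step_length):
        prints, last = [], None
        for p in path:
            if last is None or abs(p[0] - last[0]) + abs(p[1] - last[1]) >= step_length:
                prints.append(p)
                last = p
        return prints

# One frame of the cooperative update.
controller = CharacterController(speed=2.0)
gait = GaitManager()
v = controller.desired_velocity((1.0, 0.0))
path = PathConstructor().build((0.0, 0.0), v, steps=20)
footprints = FootprintPlanner().plan(path, gait.step_length(2.0))
```

The point of the decomposition is that each block can be swapped independently, e.g. a different `GaitManager` produces a different locomotion style without touching the path or footprint logic.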
In the second paper, Amit Kumar and Aparajita Ojha, from PDPM Indian Institute of Information Technology in Jabalpur, India, propose a new approach to natural path planning that adds wavelet noise to a path generated by the subdivision-based corridor map method. Because wavelet noise is almost perfectly band-limited and provides good detail with minimal aliasing, the resulting path becomes smoother and more natural. Moreover, by appropriately choosing the levels of down/up-sampling in the wavelet noise generation algorithm, the frequency of the noise can be adjusted, which serves as an effective tool for varying the path as required.
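The core down/up-sampling idea behind wavelet noise can be illustrated in one dimension: white noise minus its downsampled-then-upsampled copy leaves only the high-frequency band, which can then be used to perturb a path laterally. This is a simplified sketch of the general construction, not the authors' algorithm; the function names are invented for illustration.

```python
import random

def downsample(vals):
    """Halve the resolution by averaging adjacent pairs."""
    return [(vals[i] + vals[i + 1]) / 2.0 for i in range(0, len(vals) - 1, 2)]

def upsample(vals, n):
    """Linearly interpolate back up to n samples."""
    out = []
    for i in range(n):
        t = i * (len(vals) - 1) / (n - 1)
        lo = int(t)
        hi = min(lo + 1, len(vals) - 1)
        out.append(vals[lo] + (t - lo) * (vals[hi] - vals[lo]))
    return out

def band_limited_noise(n, seed=0):
    """Wavelet-noise-style construction: white noise minus its
    down/up-sampled copy, removing the low-frequency band."""
    rng = random.Random(seed)
    white = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    coarse = upsample(downsample(white), n)
    return [w - c for w, c in zip(white, coarse)]

def perturb_path(path, amplitude=0.2, seed=0):
    """Offset each 2D path point laterally by band-limited noise."""
    noise = band_limited_noise(len(path), seed)
    return [(x, y + amplitude * n) for (x, y), n in zip(path, noise)]

straight = [(float(i), 0.0) for i in range(16)]
wavy = perturb_path(straight)
```

Repeating the down/up-sampling at additional levels is what lets the frequency band of the noise, and hence the scale of the path variation, be tuned.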
The third paper, by Yanzhen Wang, Yueshan Xiong, Kai Xu and Dong Liu, from the National University of Defense Technology in Changsha, Hunan, China, describes a surgical procedure simulation system for training arthroscopic anterior cruciate ligament (ACL) reconstruction, covering operations such as puncturing, probing, incision and drilling. In this system, the authors employ a linear elastic finite element method and position-based dynamics for deformable modelling. A simplified vertex duplication method and an implementation of real-time Boolean operations are proposed to handle the topological changes of the tissue models during incision simulation and tunnel construction. Two specially designed force-feedback models are introduced for the haptic rendering of the probing and drilling operations.
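Position-based dynamics, one of the two deformation models used above, works by directly projecting particle positions so that geometric constraints are satisfied. A minimal generic sketch of a PBD distance constraint on a particle chain (not the paper's code) looks like this:

```python
# Minimal position-based dynamics constraint projection for a chain
# of 2D particles (generic PBD sketch, not the authors' system).

def solve_distance(p1, p2, rest, stiffness=1.0):
    """Project two particles toward their rest distance,
    moving each by half the correction."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    d = (dx * dx + dy * dy) ** 0.5
    if d == 0.0:
        return p1, p2
    corr = stiffness * (d - rest) / d / 2.0
    return ((p1[0] + corr * dx, p1[1] + corr * dy),
            (p2[0] - corr * dx, p2[1] - corr * dy))

def pbd_iterate(points, rest, iterations=10):
    """Gauss-Seidel passes over consecutive distance constraints."""
    pts = list(points)
    for _ in range(iterations):
        for i in range(len(pts) - 1):
            pts[i], pts[i + 1] = solve_distance(pts[i], pts[i + 1], rest)
    return pts

# A stretched chain relaxes toward uniform rest-length segments.
chain = [(0.0, 0.0), (2.0, 0.0), (4.0, 0.0)]
relaxed = pbd_iterate(chain, rest=1.0)
```

Because positions are manipulated directly, PBD stays stable at interactive rates, which is why it is attractive for haptic surgical training alongside a more accurate finite element model.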
In the fourth paper, Siddharth Hegde, Christos Gatzidis and Feng Tian, from Bournemouth University, UK, look at the different methods presented over the past few decades that attempt to recreate digital paintings. Whereas previous surveys concentrate on the broader subject of non-photorealistic rendering, the focus of this paper is firmly placed on painterly rendering techniques. The authors compare methods used to produce different output painting styles, such as abstract, colour pencil, watercolour, oriental, oil and pastel. Although some methods demand a high level of interaction from a skilled artist, others require only simple parameters provided by a user with little or no artistic experience.
The last paper, by Llyr ap Cenydd and Bill Teahan, from Bangor University, UK, describes a system for dynamically animating the locomotive behaviour of arthropods in real time, facilitating realistic and autonomous traversal of arbitrary environments. By combining a decentralized reactive behavioural model with a hybrid motion approach that exploits the complementary advantages of physical simulation and kinematic control, the system automatically generates complex organic motion over a wide range of surface features, independent of structural complexity. The reactive embodiment of the creature, combined with the physical simulation of the virtual world, allows emergent behaviours to form that are entirely based on circumstance, including rigid-body interaction, grip recovery and adaptive wall climbing.