Hands On: a Real-time Adaptive Animation Interface
A new intuitive animation interface that exploits knowledge of physics and the physiology of human manipulation skills to simplify the animator’s task.
- A novel interface for interactive animation of hands manipulating 3D virtual objects.
- A method for bidirectional mapping of motions and forces between the low-dimensional physical user interface and high-dimensional animated or robotic hands.
- A process for automatically adapting the pose of an animated or robotic hand based on prior sampled knowledge and the current interaction context, which simplifies the creation of new, rich interactions.
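The bidirectional mapping in the second bullet can be illustrated with a minimal sketch: a low-dimensional grasper state (a 3D position plus a scalar aperture) is expanded into per-fingertip targets, and fingertip contact forces are projected back into the same reduced space as a net force plus a scalar "squeeze" force. The function names, the grasper parameterization, and the fixed finger offsets are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def grasper_to_hand_pose(grasper_pos, aperture, finger_offsets):
    """Map a low-dimensional grasper state (3D position + scalar aperture)
    to per-fingertip target positions (hypothetical reduced-to-full mapping)."""
    # Each finger's rest offset is scaled by the aperture so all fingers
    # open and close together around the grasper centre.
    return grasper_pos + aperture * finger_offsets  # shape (n_fingers, 3)

def hand_forces_to_feedback(contact_forces, finger_offsets):
    """Project per-fingertip contact forces back into the grasper's
    reduced space: a net 3D force plus a scalar squeeze force for haptics."""
    net_force = contact_forces.sum(axis=0)
    # Squeeze component: each fingertip force projected on that finger's
    # opening direction, summed over fingers (negative = squeezing inward).
    dirs = finger_offsets / np.linalg.norm(finger_offsets, axis=1, keepdims=True)
    squeeze = float(np.sum(contact_forces * dirs))
    return net_force, squeeze
```

With a two-finger grasper, closing forces on opposite sides cancel in the net force but accumulate in the squeeze term, which is exactly the signal a haptic device can render back to the animator.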
People perform hundreds of interactions with their hands every day, effortlessly grasping and manipulating objects in the world. Replicating these manipulations for animation is remarkably challenging, both due to the large number of degrees of freedom of the hand and the difficulty in realistically complying with objects in the scene. The fundamental shortcoming of previous approaches is that the animator focuses purely on kinematic quantities, such as the motions of objects or fingers, and does not receive any feedback on fingertip forces. A second major shortcoming is that previous approaches require animators to control a large number of degrees of freedom directly. This makes animating hands cumbersome and the resulting motions look artificial.
Researchers from the University of British Columbia present Hands On: a real-time, adaptive animation interface, driven by compliant contact and force information, for animating contact and precision manipulations of virtual objects. Using the proposed interface, an animator controls an abstract grasper trajectory while the full hand pose is automatically shaped by proactive adaptation and compliant scene interactions. Haptic force feedback enables intuitive control by mapping interaction forces from the full animated hand back to the reduced animator feedback space, invoking the same human sensorimotor processes utilized in natural precision manipulations. The system provides an approach for online, adaptive shaping of the animated manipulator based on prior interactions, resulting in more functional and appealing motions.
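The online adaptive shaping described above can be sketched, in spirit, as a context-weighted blend over prior sampled poses: when the current interaction context resembles a previously sampled one, the hand pose is pulled toward the sampled pose, and otherwise falls back to a default. The `ProactiveShaper` class, the context features, and the Gaussian weighting below are hypothetical stand-ins for the paper's actual method.

```python
import numpy as np

class ProactiveShaper:
    """Toy sketch of pose adaptation from prior sampled (context, pose) pairs.
    'Context' is assumed to be a small feature vector, e.g. distance and
    direction to the nearest object; names are illustrative only."""

    def __init__(self, contexts, poses):
        self.contexts = np.asarray(contexts)  # (n_samples, c_dim)
        self.poses = np.asarray(poses)        # (n_samples, p_dim) joint angles

    def adapt(self, context, default_pose, sigma=1.0):
        # Gaussian-weighted average of sampled poses; far-away samples
        # contribute little, so the default pose dominates out of context.
        d2 = np.sum((self.contexts - np.asarray(context)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        total = w.sum()
        if total < 1e-8:
            return np.asarray(default_pose, dtype=float)
        sampled = (w[:, None] * self.poses).sum(axis=0) / total
        blend = min(total, 1.0)  # trust samples more as evidence accumulates
        return blend * sampled + (1.0 - blend) * np.asarray(default_pose, dtype=float)
```

The key property is continuity: the hand pose varies smoothly as the context changes, so the adaptation can run online every frame without popping between discrete grasps.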
The importance of haptic feedback for authoring virtual object manipulations is verified in a user study with non-expert participants that examines contact force trajectories while using the interface. The resulting motions compare very favorably to those obtained from non-haptic interfaces, which attempt to map free-space recordings of the user’s hand to contact-driven motions, while taking only a fraction of the time to author. Moreover, movements involving fast contact transients are far better, not only in terms of forces, but also visually.
Several motion sequences created using the interface are demonstrated in the video below.
Figure 1: Interactively creating movements of animated hands using the UBC interface.
Figure 1 shows how the system can easily control a non-anthropomorphic robotic hand with the same interface. The example also shows stacking virtual objects with compliant contact.
Figure 2: Grasping a wine glass without (left) and with (right) proactive shaping adaptation.
Figure 2 highlights how proactive adaptation of the grasp shape also yields grasps with a more natural appearance. Without proactive shaping, the hand fails to conform to the curved surface of the wine glass, resulting in some fingers dangling away from the surface.
Figure 3: Tapping a block on the surface of a raised platform.
Figure 3 shows the setup for a study in which participants were asked to grasp a block and use it to tap once on the surface of a raised platform before replacing the block at its original position. Here, haptic feedback facilitates a sharp response to the contact event, on the order of tens of milliseconds, whereas motions without feedback showed reaction times of 200-400 ms.