The Visual Computer, vol. 34, no. 3, pp. 359-375, 2018 (SCI-Expanded)
We present a control approach for synthesizing physics-based walking motions that mimic the style of a given reference walking motion. Style transfer between the reference motion and its physically simulated counterpart is achieved via extracted high-level features such as the trajectory of the swing ankle and the twist of the swing leg during stepping. The physically simulated motion is also capable of tracking the intra-step variations of the character's sagittal center-of-mass velocity in the reference walking motion. This is achieved by an adaptive velocity control strategy driven by a gain-deviation relation curve learned offline. The curve is learned once from a set of training walking motions and is then reused for velocity control of other reference walking motions. The control approach is tested with motion capture data of several walking motions in different styles. The approach also enables generating new styles, either manually or by varying the high-level features of existing motion capture data. The demonstrations show that the proposed control framework is capable of synthesizing robust motions that mimic the desired style regardless of changes in the environment or character proportions.
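The adaptive velocity control described above can be pictured as a gain lookup on the offline-learned gain-deviation relation. The sketch below is a minimal illustration under assumed details: the curve is represented as sampled (deviation, gain) pairs, the lookup is plain linear interpolation, and all names and values are hypothetical rather than taken from the paper.

```python
# Minimal sketch of an adaptive velocity control term driven by a learned
# gain-deviation curve. The sampled curve, function names, and numbers are
# illustrative assumptions, not the paper's implementation.
import numpy as np

# Hypothetical samples of the learned curve:
# sagittal COM velocity deviation (m/s) -> feedback gain.
DEVIATION_SAMPLES = np.array([-0.4, -0.2, 0.0, 0.2, 0.4])
GAIN_SAMPLES      = np.array([ 0.9,  0.5, 0.2, 0.5, 0.9])


def adaptive_velocity_gain(v_ref: float, v_sim: float) -> float:
    """Look up a feedback gain for the current deviation between the
    reference and simulated sagittal center-of-mass velocities."""
    deviation = v_ref - v_sim
    # Linear interpolation over the sampled curve (clamped at the ends).
    return float(np.interp(deviation, DEVIATION_SAMPLES, GAIN_SAMPLES))


def velocity_feedback(v_ref: float, v_sim: float) -> float:
    """Corrective velocity term so the simulated character tracks the
    intra-step COM velocity variations of the reference motion."""
    return adaptive_velocity_gain(v_ref, v_sim) * (v_ref - v_sim)


# Example: reference COM velocity 1.2 m/s, simulated 1.0 m/s.
print(velocity_feedback(1.2, 1.0))
```

Because the curve is learned once from training motions, the same lookup can be reused unchanged when tracking other reference walking motions.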