It may seem a little counter-intuitive, but every accomplished dancer knows that in order to maintain stability, your entire body has to move dynamically, especially your hips.
Unfortunately, that knowledge seems to have escaped the nerdy robotics community, perhaps because most of us nerds can't dance. Fortunately, South Korean robotics researchers have been digging into the generation of dynamically stable whole-body humanoid robot motions using data captured from the movements of human dancers, and the results are surprisingly life-like.
“This work presents a methodology to generate dynamically stable whole-body motions for a humanoid robot, which are converted from human motion capture data.
The methodology consists of the kinematic and dynamical mappings for human-likeness and stability, respectively. The kinematic mapping includes the scaling of human foot and Zero Moment Point (ZMP) trajectories considering the geometric differences between a humanoid robot and a human. It also provides the conversion of human upper body motions using the method in .
The dynamic mapping modifies the humanoid pelvis motion to ensure the movement stability of humanoid whole-body motions, which are converted from the kinematic mapping.
In addition, we propose a simplified human model to obtain a human ZMP trajectory, which is used as a reference ZMP trajectory for the humanoid robot to imitate during the kinematic mapping. A human whole-body dancing motion is converted by the methodology and performed by a humanoid robot with online balancing controllers.”
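The excerpt doesn't spell out the simplified human model, but a common choice for deriving a ZMP trajectory from motion data is the cart-table (point-mass) model, where the ZMP is computed from the horizontal center-of-mass position and acceleration as p = c - (z_c / g) * c̈. Below is a minimal sketch under that assumption; the function name and the finite-difference scheme are my own, not from the paper:

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]


def zmp_from_com(com_xy, com_height, dt):
    """Approximate the ZMP trajectory of a cart-table (point-mass) model.

    Under the constant-CoM-height assumption, the ZMP is
        p = c - (z_c / g) * c_ddot,
    where c is the horizontal CoM position, z_c the CoM height,
    and c_ddot the horizontal CoM acceleration.

    com_xy     : (T, 2) array of CoM x/y positions over time [m]
    com_height : constant CoM height z_c [m]
    dt         : sampling period [s]
    """
    com_xy = np.asarray(com_xy, dtype=float)
    # Finite-difference CoM acceleration along the time axis.
    vel = np.gradient(com_xy, dt, axis=0)
    acc = np.gradient(vel, dt, axis=0)
    return com_xy - (com_height / G) * acc


# Example: a stationary CoM has zero acceleration, so the ZMP
# coincides with the CoM projection on the ground.
com = np.tile([0.1, 0.2], (50, 1))
zmp = zmp_from_com(com, com_height=0.8, dt=0.01)
```

A trajectory like this, extracted from motion-capture data, would serve as the reference the humanoid's own ZMP is scaled to during the kinematic mapping described above.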