With several competitions coming up, starting with the three-day AKIBA ROBOT celebration in Tokyo on November 3rd, we decided we'd better get cracking on motion creation. Our MANOI AT01 robot, affectionately named "Mondai-Noid", is fully functional and aligned properly (as far as we can tell), so it's definitely time to figure out some really cool, and hopefully effective, moves.
Since Mondai-Noid is a 'humanoid' robot, we decided to try to model his motions as closely as possible on those of real humans, recognizing the limitations imposed by having only 17 degrees of freedom to work with. Of course we've loaded and modified motions for Mondai-Noid before, sometimes even heavily modified them. Still, this was our first attempt at creating a 100% unique set of movements.
We've all seen lots of 'humanoid' robots that move like robots, but our ideal, our goal, was to create movements that come close to the wonderful life-like abilities demonstrated by ONIMARU.3 created by Yamaguchi-san in Kochi.
In many ways, ONIMARU.3 sets a very high standard for other robots and their builders to shoot for. For our first unique motion creation we decided to have Mondai-Noid lie down on his back from a standing position, then stand back up again. Programming the first part of the sequence took about 30 minutes, including testing.
Programming the second part of the sequence, where the robot stands back up again, literally took about 90 seconds. We'll go into all the details and explain why the RCB-3 controller and the Heart To Heart 3 software make creating motion sequences like this so easy.
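The write-up doesn't spell out why the stand-up half took only 90 seconds, but motion editors of this kind typically store a motion as an ordered list of keyframe poses, and a plausible shortcut is copying the lie-down sequence and playing its keyframes in reverse order. Here is a minimal Python sketch of that idea; the function name, the 3-servo poses, and the angle values are all illustrative assumptions, not anything from the actual RCB-3 or its software:

```python
def reverse_motion(keyframes):
    """Play a recorded motion backwards: the final pose becomes the start.

    Reversing a 'lie down' keyframe sequence yields a rough 'stand up'
    sequence for free, which is one plausible reason the second half of
    such a motion can be programmed so quickly.
    """
    return list(reversed(keyframes))

# Toy keyframes for a 3-servo example (a real MANOI AT01 has 17 servos).
# Each tuple holds target angles in degrees; the values are made up.
lie_down = [
    (0, 0, 0),     # standing upright
    (45, 30, 10),  # crouching backward
    (90, 60, 20),  # resting on the back
]

stand_up = reverse_motion(lie_down)
# stand_up starts from the on-the-back pose and ends standing upright
```

A reversed motion usually still needs tweaking, since gravity doesn't act symmetrically in both directions, which matches the note below about the sequence working "almost perfectly with very little adjustment."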
Much to our amazement, the complete motion worked almost perfectly with very little adjustment. Of course, we will spend some time optimizing the total sequence - we're sure there's a lot that can be polished and improved.
Note: The first segment in the video, showing the robot lying down on his back and then standing up again, plays at normal speed. The final segments are basically sped-up copies of the first segment, giving us a preview of the potential problems we may have to face with the robot in the real world.