Echord Graspy Project – Multimedia Report (v4) (Video)

Here's a robot video that I found interesting:



TheAmazel - Echord Graspy Project

GRASPY is an experiment of ECHORD bringing together Aldebaran Robotics and DFKI. The aim of the project was to give me the ability to exchange objects with my human friends. This will be very convenient for domestic as well as professional applications of service robots.

NAO explains: "Originally my cameras did not allow stereo vision, so it was difficult for me to localize objects in front of me. The engineers at Aldebaran put my cameras in my eyes: that helps with 3D vision. My stereo-vision software detects a cup handed to me and determines its pose relative to my head. It is based on a 3D model, using a pre-computed table of how the cup's contour looks from different perspectives. So, for a hypothetical pose, the algorithm looks along the pre-computed contour and measures how much both images actually support that there is a contour there. This is done with a novel contrast-normalized and highly optimized contour criterion, the so-called CNS image. The software globally searches through the image with a coarse raster of poses spread over several frames. Once a reasonable response is found, it is locally refined by shifting the pose towards growing responses. After that, the cup is tracked by applying this refinement procedure every frame. The system runs at the full frame rate of 30 Hz.

The grasping function finds a valid motion path from the current hand pose and thereby enables me to grasp an object. Since my processor is limited, as are the degrees of freedom of my arms, I use a reachability map in my motion planner for efficiency. The workspace of the hand is pre-calculated and discretized into a cube that is divided into equally sized smaller cubes. Each sub-cube serves as a region of the workspace and stores the possible hand rotations for that hand position area. To select a grasp, candidate grasp points are matched against the reachability map and checked for whether the corresponding possible lower-arm directions qualify for the grasp.
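The global-then-local pose search NAO describes can be sketched roughly as follows. This is my own minimal illustration, not the GRASPY code: `response` stands in for the CNS contour criterion, and poses are plain tuples of parameters.

```python
def coarse_to_fine_pose_search(response, coarse_poses, step=0.5, iters=20):
    """Sketch of the described search: score a coarse raster of pose
    hypotheses globally, then hill-climb the best one towards growing
    responses.  `response(pose)` is a stand-in for the CNS criterion."""
    # Global stage: evaluate every pose in the coarse raster.
    best = max(coarse_poses, key=response)
    # Local stage: nudge each pose parameter while the response grows.
    for _ in range(iters):
        improved = False
        for axis in range(len(best)):
            for delta in (+step, -step):
                cand = list(best)
                cand[axis] += delta
                cand = tuple(cand)
                if response(cand) > response(best):
                    best, improved = cand, True
        if not improved:
            step *= 0.5  # shrink the step for a finer local search
    return best
```

In the real system the global raster is spread over several camera frames, and the same local refinement then serves as the 30 Hz tracker.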
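The reachability map idea can be sketched in a few lines. All names here are mine, not from the actual GRASPY software: a cube around the robot is split into equally sized sub-cubes, and each sub-cube stores which hand rotations are reachable in that region.

```python
from collections import defaultdict

class ReachabilityMap:
    """Sketch of a discretized workspace: sub-cubes indexed by position,
    each storing a set of reachable hand-rotation identifiers."""

    def __init__(self, cell_size=0.05):
        self.cell_size = cell_size
        self.cells = defaultdict(set)  # cell index -> reachable rotation ids

    def _cell(self, position):
        # Map a 3-D hand position (metres) to its sub-cube index.
        return tuple(int(c // self.cell_size) for c in position)

    def add(self, position, rotation_id):
        # Off-line pre-computation: record a reachable (position, rotation).
        self.cells[self._cell(position)].add(rotation_id)

    def reachable_rotations(self, position):
        # On-line lookup: which hand rotations are feasible in this region?
        return self.cells[self._cell(position)]

    def grasp_feasible(self, grasp_point, required_rotations):
        # A candidate grasp qualifies if at least one of the rotations it
        # needs is stored for that workspace region.
        return bool(self.reachable_rotations(grasp_point) & set(required_rotations))
```

The point of the pre-computation is that the expensive inverse-kinematics work happens off-line; at grasp time a feasibility check is just a hash lookup and a set intersection.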
The reachability map provides the grasp motion planner with 6-D information on the possible hand positions and lower-arm directions. Since planning in 6-D is very expensive, my A*-based planning algorithm initially uses only the 3-D area grid and ignores the lower-arm direction and hand orientation. Nodes to be evaluated are first checked for reachability and obstacle collisions, so that the heuristic is calculated only for verified nodes. In this process, nodes with more suitable lower-arm directions are rated better than nodes with greater deviations from the lower-arm goal direction. The output of the planning algorithm is a list of waypoints through the reachability map, which is converted into a Bezier spline. When my human friend wants his object back, I release it only when I am sure he is grasping it correctly. ... I had a great time working with the fellows of Aldebaran and DFKI. Thanks to the European Community and the ECHORD project, I will be much more efficient in my future jobs."
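The planning step can be sketched as A* over the grid cells of the reachability map. This is my own minimal version, not the GRASPY planner: `reachable` and `blocked` are assumed callables on grid cells, and `arm_penalty` is a hypothetical hook for rating cells by how far their stored lower-arm directions deviate from the goal direction.

```python
import heapq

def plan_path(start, goal, reachable, blocked, arm_penalty=lambda c: 0.0):
    """Sketch of A* over the 3-D cell grid: only verified (reachable,
    collision-free) nodes are evaluated, and cells with better-suited
    lower-arm directions accumulate less cost."""
    def h(c):  # admissible heuristic: Manhattan distance to the goal cell
        return sum(abs(a - b) for a, b in zip(c, goal))

    open_set = [(h(start), 0.0, start, [start])]
    seen = set()
    while open_set:
        _, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path  # waypoint list; GRASPY fits a Bezier spline to it
        if cell in seen:
            continue
        seen.add(cell)
        x, y, z = cell
        for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            nxt = (x + dx, y + dy, z + dz)
            # Only verified nodes get a heuristic and enter the open set.
            if nxt in seen or not reachable(nxt) or blocked(nxt):
                continue
            ng = g + 1.0 + arm_penalty(nxt)
            heapq.heappush(open_set, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None  # no collision-free path through the reachability map
```

Ignoring orientation in the search keeps the state space 3-D; the orientation preference only enters through the per-node cost, which is what makes the planner cheap enough for NAO's processor.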
By TheAmazel

You might also enjoy:

  1. Project M: MANOI AT01 Robot Progress Report #8 (Video)
  2. Project M: MANOI AT01 Progress Report #5
  3. NAO Robot Wizards: Franck Calzada
  4. Project M: MANOI AT01 Progress Report #3 (Video)
  5. Aldebaran NAO and ROMEO Humanoid Robots on French TV (Video)