Here's a robot video that I found interesting:
(Images captured through the PrimeSense sensor are linked below.)
Here is a quick test of the collision detection using an Xtion Pro Live (PrimeSense) with a Raspberry Pi as the brain. Note that the RPi is overclocked to 1000 MHz. (You will notice a slight click in the middle of the video when I turn on the collision interrupt call. I had forgotten that I turned it off at the beginning, since the robot was so close to the camera, and only realized it when it wasn't responding to the cone in front of it.)

Collision is detected only in the middle of its vision, because the legs sometimes cross into its field of view during full-body rotations. I have also integrated the three vision options into a "heads-up display" using OpenNI and OpenCV (see links below). Speech is handled via an espeak library my friend Kurt created. The gait algorithm is a complete rewrite, done to give the camera as much stability as possible (plus I just wanted to see if I could do it...).

The USB hub is a de-cased powered 4-port hub. It gets its power from a BEC that takes the 3-cell LiPo (11 V) and drops it down to 5.1 V, which in turn powers the Raspberry Pi, the Xtion, and the amplified speaker. All code is written in C++.

Special thanks to Kurt Eckhardt for creating the libraries for XBee communication and espeak. Communication with the servos is handled by a USB2AX microcontroller created by Nicolas Saugnier. I've linked a couple of images of the HUD and the three viewing options.
Heads up Display (viewed via VNC into the Raspberry Pi)
Vision options for Heads up Display
Shot of Raspberry Pi mounted inside the hex.
Still shot of Charlotte
A work in progress web site for this project
Trossen Robotics sells the body as a kit, which comes with a great Arduino-based controller. It makes a great platform for development on other hardware like a Raspberry Pi or even a BeagleBone Black. http://www.trossenrobotics.com/
By Kevin Ochs