“Invisible Joystick” – Posture Recognition for Robots (Video)
Anyone who has successfully trained a dog quickly realizes that canines can easily recognize and respond to body language, especially simple hand gestures. So why do we make controlling a robot so much more complex and difficult for users to understand?
As part of a class project at the Colorado School of Mines, Danpaul000 developed an IMU-based glove and associated hardware to control the NAO robot in much the same way a dog owner would.
The control system uses an IMU mounted on a glove worn by the operator. Data from the IMU is captured and formatted by an onboard Arduino, then sent via Bluetooth to a nearby laptop, where Python scripts process the data and generate commands to control the robot.
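To make the pipeline concrete, here is a minimal Python sketch of what the laptop-side logic might look like: parsing a line of orientation data (as an Arduino could send it over the Bluetooth serial link) and mapping the glove's tilt to a simple robot command. The message format, function names, and thresholds are all illustrative assumptions, not the project's actual code.

```python
# Hypothetical laptop-side logic for an IMU-glove controller.
# Assumes the Arduino sends comma-separated "pitch,roll,yaw" lines
# in degrees over a Bluetooth serial port; this is an illustrative
# sketch, not the author's implementation.

def parse_imu_line(line):
    """Parse a 'pitch,roll,yaw' line (degrees) into three floats."""
    pitch, roll, yaw = (float(v) for v in line.strip().split(","))
    return pitch, roll, yaw

def gesture_to_command(pitch, roll, threshold=20.0):
    """Map hand tilt to a walk command: tilt forward/back to move,
    tilt sideways to turn, hold level to stop."""
    if pitch > threshold:
        return "walk_forward"
    if pitch < -threshold:
        return "walk_backward"
    if roll > threshold:
        return "turn_right"
    if roll < -threshold:
        return "turn_left"
    return "stop"

if __name__ == "__main__":
    # Example: the operator tilts a hand forward
    pitch, roll, yaw = parse_imu_line("35.0,2.5,0.0")
    print(gesture_to_command(pitch, roll))  # -> walk_forward
```

In a real setup the resulting command string would be translated into calls to the NAO's motion API; the dead-zone threshold keeps small, unintentional hand movements from twitching the robot.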
It may seem simple, even trivial, but the implications for future human/robot interaction are significant. Instead of forcing humans to adapt their behavior to the technical requirements of robots, this approach uses readily available hardware and software to make operating robots easier and more natural - which is as it should be.
You might also enjoy:
- Heather Knight and Robot Data Interviewed on CNN (Video)
- Spark 181: A Future Vision for Subtle Human/Robot Interaction
- PYGMY – Robot Rings That Enhance Communication (Video)
- Bruno Maisonnier – Aldebaran Robotics – “Humanoids to Serve All” – TEDx (Video)
- Robot Image Recognition Couples Robonova-1 With Java (Video)