About two minutes after I posted about the first iOS app for the Aldebaran NAO humanoid robot, another one popped up on my radar screen. This one, iControlNao, is by Klaus Engel and seems to have fairly similar functionality.
One significant difference, assuming I'm interpreting the app description correctly, is that iControlNao uses Bonjour to detect all the NAO robots in the vicinity and automatically connect to the one you select. The app notes also imply that it works with the NAO simulation software, assuming you have access to it.
Here's the introductory video from Klaus Engel:
If you're lucky enough to have an Aldebaran NAO humanoid robot, then you'll want to zip right over to the Apple iTunes App Store and grab the new NAO Control Server iOS app by Tommy Kammerer.
The NAO Control Server app supports starting and stopping behaviors installed on the robot, joystick control of the NAO, triggering speech, and other functions. To take advantage of the full functionality, you have to pre-install a related NAO app from the NAO Store. Even without the NAO Store app, though, you can command the robot to pronounce text and trigger behaviors.
According to the iTunes app page, the program is compatible with the iPhone, iPod touch, and iPad running iOS 4.2 or later.
ROS.org has a new tutorial posted to help beginners get the NAO humanoid robot up and running with ROS, including NAOqi and a simulated model in rviz.
All of the software runs on a Linux PC (Ubuntu is used in the tutorial) and doesn't require an actual NAO to be connected for the simulation. That being said, the NAOqi SDK needs to be installed, which typically requires being a registered NAO user so that you can download the SDK from the Aldebaran website. The tutorial also mentions that a precompiled NAOqi binary is included with the Webots 6.4.4 simulation software and will be covered in a different tutorial.
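Once you have the NAOqi SDK installed, driving the robot (or the simulator) from Python takes only a few lines. Here's a minimal sketch of commanding text-to-speech through the SDK's ALProxy interface; the IP address is a placeholder for your robot or simulator, and `build_say_text` is a small hypothetical helper of my own, not part of the SDK:

```python
# Default port NAOqi listens on (real robot and simulator alike).
NAOQI_PORT = 9559


def build_say_text(text, max_len=200):
    """Collapse whitespace and trim text before sending it to the robot."""
    cleaned = " ".join(text.split())
    return cleaned[:max_len]


def say_on_robot(ip, text):
    # Imported here so the sketch can be read without the SDK installed.
    from naoqi import ALProxy  # requires the Aldebaran NAOqi SDK

    tts = ALProxy("ALTextToSpeech", ip, NAOQI_PORT)
    tts.say(build_say_text(text))


if __name__ == "__main__":
    # Placeholder address -- substitute your NAO's (or simulator's) IP.
    say_on_robot("192.168.1.42", "Hello from the NAOqi SDK")
```

The same ALProxy pattern applies to the other NAOqi modules (motion, behaviors, and so on), which is what the rviz tutorial builds on.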
The Microsoft Innovation Center in Belgium posted a useful article documenting the methodology behind controlling an Aldebaran NAO humanoid robot with the Microsoft Kinect sensor.
Although the article is in French, the mathematics and process are easy to follow even if you don't speak the language. And this is one case where Google Translate does a good job of delivering the goods.
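To give a flavor of the math involved, mapping a Kinect skeleton onto NAO joint angles typically comes down to computing the angle at a joint from three tracked 3-D points (say, shoulder, elbow, and wrist) via the dot product. This sketch is my own illustration of that standard calculation, not code from the article:

```python
import math


def joint_angle(a, b, c):
    """Angle at joint b (radians) formed by 3-D points a-b-c,
    e.g. shoulder-elbow-wrist from a Kinect skeleton."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))
```

An angle computed this way can then be clamped to the corresponding NAO joint's range and sent to the robot's motion module.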
Anyone who has successfully trained a dog quickly realizes that canines can easily recognize and respond to body language, especially simple hand gestures. So why do we make controlling a robot so much more complex and difficult for users to understand?
As part of a class project at the Colorado School of Mines, Danpaul000 developed an IMU-based glove and associated hardware to control the NAO robot in much the same way a dog owner would.
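To give a flavor of how a glove like this can work, orientation angles read from the IMU can be thresholded into a handful of discrete commands. This is purely an illustrative sketch under my own assumptions; the function, thresholds, and command names are hypothetical and not taken from Danpaul000's project:

```python
def classify_gesture(pitch_deg, roll_deg, dead_zone=15.0):
    """Map glove pitch/roll angles (degrees) to a discrete command.

    Angles inside the dead zone mean the hand is roughly level,
    which we treat as "stop"; otherwise the dominant axis wins.
    """
    if abs(pitch_deg) <= dead_zone and abs(roll_deg) <= dead_zone:
        return "stop"
    if abs(pitch_deg) >= abs(roll_deg):
        return "forward" if pitch_deg > 0 else "backward"
    return "right" if roll_deg > 0 else "left"
```

A dead zone like this is what keeps normal hand tremor from constantly retriggering commands, much as a dog ignores small incidental movements.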
Shelly Palmer is introduced to the Aldebaran NAO humanoid robot by Intel, and gets some insight into how it is expected to improve and enhance human quality of life.