Génération Robots in France just announced the new NAO Next Gen package that will enable many individual robot developers and small companies to get involved with the NAO humanoid robot. Up until now, NAO sales were limited to a small community of selected academic institutions, members of the NAO Developer Program, and a few key corporate clients. The NAO Next Gen offers are designed, and priced, to encourage a much larger developer community to get involved with NAO.
Here are the program details provided by our friends at Génération Robots:
Meet NAO Next Gen, the most advanced programmable humanoid robot, now available at Génération Robots
NAO Next Gen key features
If you’re interested in service robotics, chances are you have already come across the programmable humanoid robot NAO. Since its creation by Aldebaran Robotics in 2007, four versions have been released, the fourth being NAO Next Gen. NAO Next Gen has 25 degrees of freedom, 2 HD cameras, 4 microphones, 2 hi-fi speakers, 9 tactile sensors, 8 pressure sensors, and a voice synthesizer; it runs on a 1.6 GHz Intel Atom CPU, offers WiFi connectivity, and has a battery life of up to 1.5 hours. You can see NAO demonstrating its key features in the following video.
All of these features enable it to walk in all directions, talk in 19 languages, detect faces and shapes, protect itself when it falls, and localize sound sources. Since building robotics applications can be challenging, Aldebaran Robotics created a multiplatform (Windows, Linux, and Mac OS), multi-language SDK (C++, Python, Java, and .Net) for creating applications for your NAO Next Gen robot. You can also easily take advantage of NAO’s features with Chorégraphe, the intuitive graphical programming interface included in the software suite.
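To give a feel for what the Python side of the SDK looks like, here is a minimal sketch using NAOqi's `ALProxy` to reach the text-to-speech and motion modules. The robot's IP address is a placeholder, and the sketch assumes the NAOqi Python SDK is installed; when it isn't, the function reports that instead of crashing.

```python
NAO_IP = "192.168.1.10"   # placeholder -- substitute your robot's address
NAO_PORT = 9559           # NAOqi's default port

def greet(ip=NAO_IP, port=NAO_PORT):
    """Make NAO speak and turn its head, if the NAOqi SDK is available."""
    try:
        from naoqi import ALProxy  # ships with Aldebaran's Python SDK
    except ImportError:
        return "naoqi SDK not installed"
    tts = ALProxy("ALTextToSpeech", ip, port)  # speech synthesis module
    tts.say("Hello, I am NAO!")
    motion = ALProxy("ALMotion", ip, port)     # joint control module
    motion.setAngles("HeadYaw", 0.5, 0.1)      # radians, fraction of max speed
    return "ok"
```

The same modules can be reached from C++, Java, or .Net through the corresponding SDK bindings, or wired up graphically in Chorégraphe without writing code at all.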
Génération Robots’ unique NAO Next Gen offer
The programmable humanoid robot NAO Next Gen can capture attention, showcase your development skills, and engage customers in a creative way. With NAO Next Gen, your company can stand out from the crowd, be seen as an innovative partner, and win new customers.
Until now, only academic institutions, members of the NAO Developer Program, and a few handpicked corporate clients could acquire NAO Next Gen. The good news is that Génération Robots now offers individual developers and small companies a unique opportunity to invest in a NAO Next Gen unit:
- A NAO Next Gen robot
- A software suite license including Chorégraphe (the graphical programming interface), Webots for NAO (a 3D simulator), and the SDK
- A 12-month subscription to the application validation service of the NAO Store (the NAO application distribution platform)
- A 1-year manufacturer’s warranty
- An exclusive application, “NAO plays football”, created by HumaRobotics
Génération Robots’ close partnership with Aldebaran Robotics started in 2010, when our founder Jérôme Laplace joined the NAO beta-testing team. Our company grew, and in 2012 our HumaRobotics service team became an official training provider for Aldebaran Robotics’ international customers. We proved our technical and programming expertise by creating, among other applications, the famous “NAO plays Connect 4” game in 2011:
To find out more about the “Develop on NAO” offer, visit the Génération Robots booth at Innorobo 2014, the largest European service robotics fair, which runs from March 18th to 22nd in Lyon, France. It is also an opportunity to discover our other exclusive NAO applications, such as “NAO playing poker”.
Génération Robots’ “Develop on NAO Next Gen” offers come in three tiers: Essential, Premium, and Expert. Higher tiers include more licenses (5 with the “Develop on NAO Next Gen-Expert” offer) and more years of warranty and NAO Store application validation service (2 years with the Premium offer, 3 years with the Expert offer).
Develop on NAO offers
About two minutes after I posted about the first iOS app for the Aldebaran NAO humanoid robot, another one popped up on my radar screen. This one, iControlNao, is by Klaus Engel and seems to have fairly similar functionality.
One significant difference, assuming I'm interpreting the app description correctly, is that iControlNao detects all the NAO robots in the vicinity and, using Bonjour, automatically connects to the one you select. The app notes also imply that it can be used with the NAO simulation software, assuming you have access to it.
Here's the introductory video from Klaus Engel:
If you're lucky enough to have an Aldebaran NAO humanoid robot, then you'll want to zip right over to the Apple iTunes App Store and grab the new NAO Control Server iOS app by Tommy Kammerer.
The NAO Control Server app supports starting and stopping behaviors installed on the robot, driving NAO with a joystick, triggering speech, and other functions. To take advantage of the full functionality you have to pre-install a companion NAO app from the NAO Store. But even without the NAO Store app, you can command the robot to speak text and trigger behaviors.
According to the iTunes app page, the program is compatible with iPhone, iPod Touch, and iPads running iOS 4.2 or later.
ROS.org has a new tutorial posted to help beginners get the NAO humanoid robot up and running with the ROS system including NAOqi and the simulated model in rviz.
All of the software runs on a Linux PC (Ubuntu is used in the tutorial), and an actual NAO does not need to be connected for the simulation. That being said, the NAOqi SDK does need to be installed, which typically requires being a registered NAO user so that you can download it from the Aldebaran website. The tutorial also mentions that a precompiled NAOqi binary is included with the Webots 6.4.4 simulation software; that setup will be covered in a different tutorial.
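The rough shape of such a setup looks like the following. This is a sketch, not a transcript of the tutorial: the exact apt package names and launch-file arguments vary by ROS release and may differ from what the tutorial uses.

```shell
# Install the ROS NAO stack (package names vary by ROS release -- sketch only)
sudo apt-get install ros-hydro-nao-robot

# Point the driver at NAOqi: either a real robot's address, or 127.0.0.1
# when running a local NAOqi binary from the Aldebaran SDK for simulation.
export NAO_IP=127.0.0.1

# Bring up the nodes that bridge NAOqi to ROS topics
roslaunch nao_bringup nao_full_py.launch nao_ip:=$NAO_IP

# In another terminal, visualize the robot model
rosrun rviz rviz
```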
The Microsoft Innovation Center in Belgium posted a useful article documenting the methodology behind controlling an Aldebaran NAO humanoid robot with the Microsoft Kinect sensor.
Although the article is in French, the mathematics and processes are easy to understand even if you don't speak the language. And, this is one case where Google Translate does a good job of delivering the goods.
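The core of such a Kinect-to-NAO mapping is ordinary vector geometry: each NAO joint angle is recovered from the 3D positions the Kinect reports for adjacent skeleton joints. A minimal sketch of that step (the joint positions in the example are made-up sample values, not data from the article):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in radians) between segments b->a and b->c,
    given 3D positions such as Kinect skeleton joints."""
    u = [a[i] - b[i] for i in range(3)]   # e.g. vector elbow -> shoulder
    v = [c[i] - b[i] for i in range(3)]   # e.g. vector elbow -> wrist
    dot = sum(u[i] * v[i] for i in range(3))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    cosang = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp rounding error
    return math.acos(cosang)

# Shoulder, elbow, and wrist of a fully extended arm lie on one line,
# so the elbow angle comes out as pi radians.
shoulder, elbow, wrist = (0, 0, 0), (1, 0, 0), (2, 0, 0)
angle = joint_angle(shoulder, elbow, wrist)  # ~3.1416
```

Angles computed this way can then be clamped to NAO's joint limits and streamed to the robot's motion module, which is essentially the pipeline the article walks through.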
Anyone who has successfully trained their dog quickly realizes that canines can easily recognize and respond to body language, especially simple hand gestures. So why do we make controlling a robot so much more complex and difficult for users to understand?
As part of a class project at the Colorado School of Mines, Danpaul000 developed an IMU-based glove and associated hardware to control the NAO robot in much the same way that dog owners would.
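One simple way such a glove can work (a hedged sketch, not Danpaul000's actual code): when the hand is held still, the accelerometer reads mostly gravity, so the dominant axis of the reading tells you the hand's orientation, and each orientation maps to a robot command. The axis-to-command table below is invented for illustration.

```python
def classify_gesture(accel):
    """Map a steady-state accelerometer reading (x, y, z in g) to a
    command, using the dominant axis of the gravity vector.
    The command mapping is hypothetical."""
    x, y, z = accel
    mag, axis = max((abs(x), "x"), (abs(y), "y"), (abs(z), "z"))
    if mag < 0.8:           # no clear dominant axis: hand is moving
        return "stop"
    value = {"x": x, "y": y, "z": z}[axis]
    commands = {
        ("z", True):  "stand",       # palm down: gravity on +z
        ("z", False): "sit",         # palm up
        ("x", True):  "walk",        # fingers pointing down
        ("x", False): "wave",        # fingers pointing up
        ("y", True):  "turn_left",   # thumb down
        ("y", False): "turn_right",  # thumb up
    }
    return commands[(axis, value > 0)]

classify_gesture((0.05, -0.02, 0.98))  # palm held flat -> "stand"
```

A real implementation would also smooth the readings and debounce transitions, but the thresholding idea is the same: gestures a dog owner finds natural become a small discrete command set the robot can act on.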