Perhaps I'm missing something; it happens sometimes.
A recent article by Mark Brown on the Wired UK website describes research by Yale Song and others at MIT exploring the use of hand gestures and body positions to direct unmanned aircraft on a carrier flight deck in real time.
Their approach, in some ways similar to Microsoft's Kinect system, captures and analyzes the deck crew's body and hand motions rapidly and accurately, translating them into commands that the robotic drone aircraft can understand and respond to.
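To make the idea concrete, here is a minimal sketch of gesture-to-command classification. The feature vectors and command names below are invented for illustration; the actual MIT system uses far richer body and hand pose models. The sketch simply matches an observed pose against a set of templates by nearest distance:

```python
import math

# Hypothetical command templates: each deck signal is represented as a
# tiny feature vector (e.g., normalized arm-joint angles). These values
# are invented purely for illustration, not taken from the MIT research.
COMMAND_TEMPLATES = {
    "all_clear":  [0.0, 1.0, 0.0, 1.0],
    "move_ahead": [1.0, 0.5, 1.0, 0.5],
    "stop":       [1.0, 1.0, 1.0, 1.0],
}

def classify_gesture(pose, templates=COMMAND_TEMPLATES):
    """Return the command whose template is nearest (Euclidean) to the pose."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda name: dist(pose, templates[name]))

# A noisy observation close to the "stop" template still classifies correctly.
print(classify_gesture([0.9, 0.9, 1.1, 1.0]))  # prints "stop"
```

A real system would of course add a confidence threshold, so ambiguous poses produce no command rather than a wrong one.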
Historically, mankind has developed new weapons, deployed them, observed the damage, and only then debated their ethics and morality. That pattern seems to have held from the first crude clubs, spears, and bows and arrows all the way to the atomic bomb.
The latest development in that same series is the X-47B fully autonomous drone, currently being tested by the US Navy. The new robotic drone is so advanced that it can take off from and land on an aircraft carrier at sea without human intervention.