The folks at Clearpath Robotics are no strangers to rugged robots. The makers of the Grizzly all-terrain RUV (Robotic Utility Vehicle) are all about robots that can ford streams, trundle through snow, and even take on a simulated Martian landscape. Built on an open-source platform, these off-road robots can travel alone or in packs, but until recently they couldn't take steering cues from nearby humans in the form of gestures.
When the folks at Clearpath got their hands on one of Thalmic Labs' Myo gesture-control armbands, they figured they had to see what one of their rugged robots could do with it. Selecting the Grizzly's little brother, the Husky, they programmed their 'bot to take its driving cues from the armband: acceleration, braking, turning left or right, and driving forward or reverse were each mapped to a Myo gesture.
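That gesture-to-driving-command mapping can be sketched as a simple lookup from Myo poses to velocity setpoints of the kind a differential-drive robot like the Husky consumes. This is an illustrative assumption, not Clearpath's actual code: the pose names follow Myo's standard gesture set, but the velocity values and function names are made up.

```python
# Hypothetical sketch: map Myo poses to (linear m/s, angular rad/s)
# drive commands for a differential-drive robot. Pose names mirror
# Myo's standard gestures; the numbers are arbitrary examples.
GESTURE_TO_TWIST = {
    "fist":           (0.5,  0.0),   # drive forward
    "double_tap":     (-0.5, 0.0),   # reverse
    "wave_in":        (0.0,  0.8),   # turn left
    "wave_out":       (0.0, -0.8),   # turn right
    "fingers_spread": (0.0,  0.0),   # brake / stop
}

def twist_for(pose):
    """Return a (linear, angular) velocity pair for a Myo pose.

    Unrecognized poses (including Myo's 'rest' state) fall back to a
    full stop, which is the safe default for a 50 kg robot.
    """
    return GESTURE_TO_TWIST.get(pose, (0.0, 0.0))
```

In a real setup these pairs would be published as velocity commands to the robot's drive controller, with the stop-on-unknown default acting as a crude dead-man's switch.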
As you can see, the robot and the armband perform wonderfully together, and since the Husky and its Clearpath Robotics cousins are designed to handle modular and swappable payloads, we can imagine some very complex gesture-controlled robots showing up in the near future. If you're thinking of robots like those in Pacific Rim or the upcoming game Titanfall, however, gesture control is going to have to get a whole lot more sophisticated. Six gestures, after all, aren't quite enough to properly pilot a fully operational mech.