Autonomous Robots in the Fog of War

July 29, 2021 by helo-10

Photo: Ethan Miller/Getty Images


MQ-1 Predator

General Atomics

DESCRIPTION: Unmanned aerial vehicle (UAV) for surveillance and, when equipped with Hellfire missiles, for combat. Can be remotely piloted or programmed to follow GPS waypoints.

STATUS: First deployed in 1995. Since 2001, primarily used for combat. Currently, 360 are operated by the U.S. military in Afghanistan, Iraq, Pakistan, and elsewhere. Also used by the Italian Air Force and the United Kingdom’s Royal Air Force.



Photo: Qinetiq


Foster-Miller/Qinetiq Group

DESCRIPTION: 52-kilogram remotely operated unmanned ground vehicle that can be equipped for various missions, including infrared and night-vision cameras for reconnaissance, manipulator arm for improvised explosive device (IED) disposal, and rifle or grenade launcher for combat.

STATUS: Deployed by U.S. military in Bosnia, Iraq, Afghanistan, and elsewhere.


Bluefin HAUV

Photo: Bluefin Robotics


Bluefin Robotics Corp./Battelle Memorial Institute

DESCRIPTION: 79-kg unmanned underwater vehicle for ship hull inspection using high-resolution sonar. When equipped with a manipulator arm and camera, it can also perform IED detection and disposal. Conducts surveys autonomously or can be remotely operated via fiber-optic tether.

STATUS: U.S. Navy awarded Bluefin a US $30 million production contract in March 2011.

As a researcher at the Georgia Tech Research Institute and a board member of the world’s largest association for unmanned systems—the Association for Unmanned Vehicle Systems International—I’ve been working with robots for more than two decades, starting with underwater vehicles, then moving to air and ground vehicles, and most recently addressing collaborations among robots like those we demonstrated at the Robotics Rodeo. I can attest that while robots are definitely getting smarter, it is no easy task to make them so smart that they need no adult supervision. And call me a skeptic, but I doubt they’ll be cloning themselves anytime soon.

That said, I’m amazed at the pace of progress in the field. With thousands of researchers now engaged in advancing the intelligence and autonomy of unmanned systems, new breakthroughs are announced seemingly every week. Both the variety and the number of unmanned systems now deployed are breathtaking. UAVs run the gamut from the 1-metric-ton MQ-1 Predator drone made by General Atomics to AeroVironment’s tiny 430-gram Wasp micro air vehicle. There are unmanned ground vehicles that roll on treads like tanks, walk like dogs, and slither like snakes. Unmanned maritime vehicles include submarine-like vessels that can cruise underwater for kilometers and boatlike craft that patrol for pirates, smugglers, and other criminal types.

But none of these systems are fully autonomous. The RQ-4 Global Hawk UAV, made by Northrop Grumman, is guided by satellite waypoint navigation, yet it still requires a human pilot sitting in a remote ground station, plus others to operate the drone’s sensors and analyze the data being sent back. iRobot’s PackBot tactical robot is teleoperated by means of a video-game-style controller, complete with joystick. Even the driverless vehicles that participated in the Defense Advanced Research Projects Agency’s Grand Challenge competitions in 2004, 2005, and 2007 weren’t entirely autonomous, as the courses they had to negotiate were tightly controlled.
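The waypoint navigation mentioned above is conceptually simple, which is part of why it is the one piece of these systems that is routinely automated. A minimal sketch of the idea, in Python, might look like the following. This is an illustration only, not how the Global Hawk or Predator actually implements guidance; the function names and the crude lat/lon capture radius are my own assumptions.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def steer(position, waypoints, index, capture_radius_deg=0.001):
    """Return (possibly advanced) waypoint index and the heading to fly.

    Advances to the next waypoint once the vehicle is within a crude
    lat/lon capture radius of the current one (illustrative threshold).
    """
    lat, lon = position
    wlat, wlon = waypoints[index]
    if math.hypot(wlat - lat, wlon - lon) < capture_radius_deg and index + 1 < len(waypoints):
        index += 1
        wlat, wlon = waypoints[index]
    return index, bearing_deg(lat, lon, wlat, wlon)
```

The point of the sketch is what it leaves out: nothing here senses the world. The aircraft follows a preplanned path, and everything unexpected still lands on the human operators.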

So why haven’t we seen a fully autonomous robot that can sense for itself, decide for itself, and seamlessly interact with people and other machines? Unmanned systems still fall short in three key areas: sensing, testing, and interoperability. Although the most advanced robots these days may gather data from an expansive array of cameras, microphones, and other sensors, they lack the ability to process all that information in real time and then intelligently act on the results. Likewise, testing poses a problem, because there is no accepted way to subject an autonomous system to every conceivable situation it might encounter in the real world. And interoperability becomes an issue when robots of different types must interact; even more difficult is getting manned and unmanned systems to interact.

To appreciate the enormous challenge of robotic sensing, consider this factoid, reported last year in The Economist: “During 2009, American drone aircraft…sent back 24 years’ worth of video footage. New models…will provide ten times as many data streams…and those in 2011 will produce 30 times as many.” It’s statistics such as those that once prompted colleagues of mine to print up lanyards that read “It’s the Sensor, Stupid.”

But a robot is more than just a platform of sensors. Let’s say an unmanned jeep is traveling down a city street. Its cameras may detect a parked car along the curb, an open manhole in the middle of the road, and a knot of school kids crossing at the intersection. But unless the jeep can correctly classify the car as a car, the manhole as a manhole, and the children as children, it won’t have sufficient information to avoid those obstacles.

So the sensing problem in robotics extends well beyond just designing sophisticated new sensors. An autonomous robot needs to be able to automatically process the data from those sensors, extract relevant information from those data, and then make decisions in real time based on that information and on information it has gathered in the past. The goal is to achieve what researchers call situational understanding.
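The sense → classify → decide loop described above can be sketched in a few lines of Python. Everything here is a toy of my own invention, not any fielded system: the labels, the hand-written classification rules, and the avoidance actions are all illustrative stand-ins for what in practice would be learned perception models and a real planner.

```python
# Hypothetical avoidance rules: which action each obstacle class demands.
AVOIDANCE_RULES = {
    "pedestrian": "stop",
    "car": "steer_around",
    "manhole": "straddle_or_steer",
}

def classify(detection):
    """Toy classifier: map a raw sensor blob to a label via crude features.

    A real system would use trained perception models here; these
    hand-written thresholds exist only to make the pipeline concrete.
    """
    if detection["moving"] and detection["height_m"] < 2.0:
        return "pedestrian"
    if detection["width_m"] > 1.5 and not detection["moving"]:
        return "car"
    return "manhole"  # fallback for small, static ground anomalies

def decide(detections):
    """Choose the most conservative action demanded by any obstacle."""
    priority = ["stop", "straddle_or_steer", "steer_around", "proceed"]
    actions = [AVOIDANCE_RULES[classify(d)] for d in detections] or ["proceed"]
    return min(actions, key=priority.index)
```

Even this toy makes the core difficulty visible: the decision is only as good as the classification, and a manhole mislabeled as a shadow, or a child mislabeled as a traffic cone, produces exactly the wrong action. That gap between raw detections and correct labels, computed in real time, is the situational-understanding problem.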
