One of the major problems with the use of drones is how abstract it makes warfare and how little oversight there is. With a targeted bombing strike or assassination operation run by USSOCOM, the operation has to be briefed up to a level of command authorized to approve the specific mission, orders are drafted, logistical and transportation plans have to be made, et cetera. With an armed drone, that process is often abbreviated to a short list of generic operational orders and criteria for initiating a strike, which is approved locally (e.g. by the officer overseeing the drone pilots) and may never be reviewed by a theater-level command authority.
Drone piloting duty is not highly regarded in the Air Force, ranking just above missileer among undesirable assignments for junior officers, which results in a lot of turnover. This has caused the Air Force to outsource a lot of drone flying to private contractors (just as the CIA currently does) who are under the ostensible oversight of an officer but are often managed on a daily basis by other contractors. The Air Force is actually considering allowing enlisted personnel to become drone pilots because of these issues and the cost. Pilots (both Air Force and contractor) are selected primarily for their skill at operating the drone, which comes down to the same kinds of coordination and decision-making skills used in first-person shooter games. Almost none of the drone pilots have ever been in actual combat, so they lack the kind of situational assessment and leadership skills that professional soldiers and airmen have to develop in battlefield conditions. As a result, the habit of viewing warfare as little more than infrared imagery and PowerPoint slides, which has so infused the upper ranks, is making its way down to the junior officer level, ensuring a future armed forces that views warfare as a series of briefings and newscasts rather than a conflict in which real people die, often needlessly.
The future is even worse. As recruiting drone pilots becomes more difficult and contractors more expensive, the desire to reduce costs and streamline the process will result in applying automation, in the form of pattern recognition algorithms, to target identification and possibly even to targeting decisions, so that a single operator can oversee a fleet of drones that act essentially autonomously right up to the point of requesting approval. This isn’t some kind of paranoid fantasy; the Air Force is already experimenting with ways to twin drones and reduce operator workload so that two or more drones can be flown by a single pilot, and it is entirely possible that drones may one day be made fully autonomous, with no man-in-the-loop control at all.
Here, safely ensconced in the US or Western Europe, our biggest concerns about drones are a loss of privacy, or that some foolhardy person might crash a drone into an airliner while trying to film a takeoff, or that Amazon might start using drones for delivery instead of UPS. In Pakistan, Iraq, Afghanistan, and elsewhere, the population has to worry about random strikes coming out of a blue sky with no warning or reason. People there speak of waiting for clouds and rain to be able to move around outside safely, because drones can’t operate in those conditions (though that’s only a matter of time, too, until remote sensing technology improves to use microwave scattering and low-frequency IR to “identify” targets). If an incorrect target is taken out, the typical response, if any, is a vague “Sorry, we misidentified your son/brother/husband.” If it was an especially careless death, an operator may be reprimanded or a contractor fired.
It isn’t that drones are some fundamental revolution in warfare technology per se; they do the same things that soldiers and bombers have done for centuries. But they do it so much more efficiently, with so little cost or planning, so quickly. Certainly there has always been collateral damage and misidentification. But with drones, we’re presented with claims of how surgically precise their strikes can be, while the evidence shows that such convenience and automation have led to egregious failures of due diligence and discipline, putting the actual responsibility for making decisions in the hands of private contractors and junior officers with no experience beyond ripping their mates at Destiny. They offer the potential of handing over warfare to automated systems in a situation where no single person can be held accountable for mistargeting. It’s as if someone watched The Terminator and came away with the lesson that we need to build a better killbot than Arnold Schwarzenegger so we can more efficiently wipe out the enemy, whomever we might decide that to be.
Stranger