So, we’ve got these UAVs that can circle some remote place, while a lieutenant in Arizona monitors and decides whom/when to shoot.
I’m wondering what the latency is from an image being acquired, to that image reaching the operator, to the operator’s instruction getting back to the device (and for that matter, not necessarily to fire the weapons; even mundane things like changing course).
Now, I’m sure what I really want to know is classified, but I’d be happy to know what best-case is with civilian kit, if that’s possible, and assume the military can do at least that well.
What I’m really getting at is this: I know these things are out there, but I would think that even with dedicated communications networks the total latency would make hitting a moving target very difficult.
On a purely civilian basis, the worst case I’ve experienced is a 300 millisecond ping from Canada to Australia, and that required a really terrible connection on the receiving end; it’s usually been closer to 250 ms.
A military application will not get anything nearly so bad as this. In the (unlikely) situation where the drone commander is literally on the opposite side of the world, the absolute fastest ping would be roughly 140 ms, about 1/7th of a second. (The round trip to the far side of the world and back along the surface is roughly the Earth’s circumference, which is about 1/7th of a light second.) In practice, I expect a military connection would be somewhere between the two extremes: slower than 140, faster than 250…
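Just to show the arithmetic behind that floor, here’s a quick back-of-the-envelope sketch (Python, with round numbers for the speed of light and the Earth’s circumference, and ignoring all routing and processing overhead):

```python
# Back-of-the-envelope floor on ping time: signal travels along the Earth's
# surface to the antipode and back (half the circumference each way), at the
# speed of light, with zero routing or processing overhead. Illustrative only.

C_LIGHT_KM_S = 299_792           # speed of light, km/s
EARTH_CIRCUMFERENCE_KM = 40_075  # equatorial circumference, km

one_way_km = EARTH_CIRCUMFERENCE_KM / 2             # ~20,000 km to the far side
round_trip_s = EARTH_CIRCUMFERENCE_KM / C_LIGHT_KM_S

print(f"One-way path: {one_way_km:,.0f} km")
print(f"Best-case ping (round trip): {round_trip_s * 1000:.0f} ms")  # ~134 ms
```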
Guiding something with 200 ms latency isn’t that hard; it was done regularly in the old Quake 2 deathmatch days. The lag can be accounted for with a little practice, and an explosive blast radius takes care of the rest.
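For what it’s worth, the “account for the lag” trick is just leading the target. A toy sketch, with made-up numbers:

```python
# Toy version of "leading" a moving target to compensate for control lag,
# the way Quake players did by feel. All numbers are made up for illustration.

def lead_point(target_pos, target_vel, lag_s):
    """Aim at where the target will be after lag_s seconds, not where it is now."""
    return tuple(p + v * lag_s for p, v in zip(target_pos, target_vel))

# Target at (100, 50) moving 20 m/s in x, with 200 ms of lag:
print(lead_point((100.0, 50.0), (20.0, 0.0), 0.2))  # -> (104.0, 50.0)
```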
That said, most of the weapons that are carried have other forms of guidance and are not manually steered to the target anyway. The Predator drone (the one that’s had the most media attention) often carried Hellfire missiles, some variants of which are self-guiding.
Note that controlling a drone isn’t like playing an FPS video game. In a game, you’re doing the aiming yourself: you’re directly controlling the direction the gun points via your mouse, and directly controlling when to fire, when (within the limits of your reflexes and the lag) the gun is pointing at the target you wish to shoot. With a drone, it’s more like playing a game with an aimbot cheat: you use a mouse or something similar to designate a target, but the drone then tracks that target, controls the direction the weapon points, and pulls the trigger when, according to the drone’s reflexes, it’s pointing at what it believes to be the target you designated. As long as the lag isn’t so long that the drone loses track of which object is your target, the lag won’t matter.
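If it helps, here’s that difference in toy-code form. This is purely illustrative pseudologic under my own assumptions, not how any real drone’s software works: the operator’s (lagged) click only designates the target, and the onboard tracker does the aiming with no link lag in the loop.

```python
# Illustrative pseudologic only (my own assumptions, not any real system):
# the operator's click arrives late because of link lag, but once it arrives
# the onboard tracker aims with no lag at all, so accuracy doesn't suffer.

def run_engagement(target_positions, designate_at_step, lag_steps):
    """target_positions: where the onboard sensor sees the target each time step.
    designate_at_step: when the operator clicks on the target.
    lag_steps: uplink delay before the drone receives that click."""
    tracking = False
    for step, pos in enumerate(target_positions):
        if step == designate_at_step + lag_steps:
            tracking = True         # the designation arrives late, but it arrives
        if tracking:
            aim_point = pos         # local tracker: zero lag between sensor and aim
            return step, aim_point  # "fire" as soon as the weapon is on the track
    return None

# Target drifting along x; operator clicks at step 0, uplink lag of 2 steps.
track = [(float(x), 0.0) for x in range(10)]
print(run_engagement(track, designate_at_step=0, lag_steps=2))
# -> (2, (2.0, 0.0)): fires two steps late, but aimed at the target's CURRENT spot
```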
That is the answer, I believe, as far as weapons firing goes.
The Hellfire versions used from unmanned a/c are generally, if not always, semi-active laser guided; IOW somebody or something has to hold a laser designating beam on the target throughout the missile’s flight. The same is true of the smaller laser-guided missiles and bombs which have been the UAS’s alternative weapons. The versions of Hellfire with ‘fire and forget’ millimeter-wave radar seekers are mainly fired by manned helicopters, among aircraft. The Hellfire replacement JAGM, now used on some UAS’s, has a multimode seeker, but in general US ‘drone strikes’ have been with laser-guided weapons.
The electro-optical laser designator systems on those a/c have autotrack modes, but so do similar systems on manned a/c, to reduce crew workload and not just because of latency. The autotrack mode isn’t necessarily used in any given engagement; how much it’s used might be more a question for actual payload operators, and subject to classification. The payload operator of the UAS does press a button to fire; the idea that the UAS decides that itself is part of the politics of criticizing those systems, and it would be a much bigger issue if it were actually true.
Those unmanned a/c have been controlled in takeoff and landing by local controllers where the UAS doesn’t do that autonomously, which they haven’t always. So it does seem that issues with very long range comms (maybe the possibility of intermittent interruption as much as minimum latency) made it impractical to ‘hand’ fly them in delicate phases of flight from thousands of miles away. However, as mentioned, once they are up and away, the far-remote operator tells the UAS where to go, or what heading, altitude, speed, etc., rather than giving it direct ‘stick and rudder’ inputs like an actual remote pilot. That’s part of the evolution in terminology from the ‘remotely piloted vehicle’ of the 1970s (when RPV strike aircraft were in development, possibly for attacking SAM sites in Vietnam) to the more recent ‘unmanned aerial vehicle/system’, which implies more autonomy for the vehicle, though not, as yet, the autonomy to decide which targets to engage or to fire by itself.
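To make the “waypoints, not stick-and-rudder” distinction concrete, here’s a tiny sketch with made-up message types; these structures are my own illustration, not any real datalink format:

```python
# Made-up message types to illustrate the difference; not a real datalink format.

from dataclasses import dataclass

@dataclass
class StickAndRudder:
    """What a true 'remote pilot' would have to send many times a second,
    and exactly the kind of input that link lag and dropouts would ruin."""
    aileron: float
    elevator: float
    rudder: float
    throttle: float

@dataclass
class NavigationCommand:
    """What the far-away operator actually sends: the aircraft's own autopilot
    closes the fast control loops locally, so link lag matters far less."""
    waypoint_lat: float
    waypoint_lon: float
    altitude_ft: float
    airspeed_kts: float

cmd = NavigationCommand(waypoint_lat=34.5, waypoint_lon=69.2,
                        altitude_ft=15_000, airspeed_kts=120)
print(cmd)
```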
I could see that being true, if we had ethernet cables passing straight through the earth. But we don’t, at least not yet. If your drone is on the far side of the earth, then the minimum one-way path length is about 12,500 miles along the surface of the earth. More likely you’ll be bouncing the signal off a satellite; if it’s a geosynchronous one (orbiting at an altitude of ~22,000 miles), then your minimum one-way path length is 44,000 miles. Round trip, 88,000 miles. Now you’re edging up to around half a second between the drone beaming video content to you and its control surfaces moving in response to your control inputs.
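Roughing out that satellite-bounce number (idealized: each leg taken as the ~22,000-mile altitude, ignoring slant range and all processing delays):

```python
# Idealized satellite-bounce arithmetic: each leg taken as the ~22,000-mile
# geosynchronous altitude, ignoring slant range and all processing delays.

C_LIGHT_MI_S = 186_282     # speed of light, miles per second
GEO_ALTITUDE_MI = 22_000   # rough geosynchronous altitude

one_way_mi = 2 * GEO_ALTITUDE_MI   # ground up to the satellite and back down
round_trip_mi = 2 * one_way_mi     # video down to you + command back up
print(f"Round-trip delay: {round_trip_mi / C_LIGHT_MI_S:.2f} s")  # ~0.47 s
```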
As it happens, the lag is actually much longer: more like two seconds. That is, the drone sends you video, and two seconds later its control surfaces move (assuming you react instantaneously to what you see on video). According to info at that link, takeoff and landing are handled by operators local to the drone (reducing the lag to negligible amounts), while the main portion of flight is handled by the far-side-of-the-world operator; two seconds of lag isn’t a big deal at 15,000 feet. This handoff operation seems weirdly complicated, but it means one operator on the ground in Afghanistan can manage arrivals/departures for a dozen drones, while a dozen operators here in the US manage each drone during the main part of the flight, resulting in fewer personnel in the field.
To be clear, when I said that the drone pulls the trigger, I didn’t mean “the drone decides that that’s something that needs to be shot”. It’s still not shooting unless a human gives the order to. I just meant that the order is not “Fire the weapon immediately, in the direction it’s pointed at the moment you receive this order”, but something more like “Fire on the already-designated target that you’re already tracking, as soon as you can get the weapon aimed at that target”.
Drones probably are technologically capable of choosing and engaging their own targets, at least to a degree. But even though they’re capable of it, we choose not to use them that way.
As Machine Elf mentions, the lag is pretty shitty. It’s possible, in perfect conditions and perfect weather (including space weather effects), to operate an entire mission from a terminal in the States, but it isn’t done like that. Depending on the aircraft, it may have autoland capability or, if not, it will be handed off to a ground control station nearer to (or at) the actual runway.
For shooting at targets (moving or not), there is a Multi-Spectral Targeting System on the aircraft. The person doing the shooting uses that to lock on to a target on the ground. Once the target is locked on, the sensors will follow it automatically. The system can then designate the target with a laser and keep the laser on the target automatically as the target moves. At this point, the drone is ready to fire as soon as the command is given. Other than lag time, there is no delay built into the system: it’s already locked on, so there is no reason to wait while it does any further aiming. Once the command is sent, the missile will be released and will fly to the laser. Sometimes laser designators from other aircraft or from units on the ground might be used.
That’s the simplest explanation of it all, I think.
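For anyone who wants it even more boiled down, here’s that sequence as a toy state machine. The names and states are my own illustration, not the actual Multi-Spectral Targeting System interface:

```python
# Toy state machine for the sequence described above. The names and states are
# my own illustration, not the real Multi-Spectral Targeting System interface.

from enum import Enum, auto

class State(Enum):
    SEARCHING = auto()
    LOCKED = auto()        # sensor is autotracking the target
    DESIGNATING = auto()   # laser is on the target and held there automatically
    MISSILE_AWAY = auto()

class TargetingSketch:
    def __init__(self):
        self.state = State.SEARCHING

    def lock_on(self):
        self.state = State.LOCKED           # operator picks a target; autotrack begins

    def designate(self):
        if self.state is State.LOCKED:
            self.state = State.DESIGNATING  # laser spot now follows the track

    def fire(self):
        # No extra aiming delay: the track and laser are already on the target,
        # so the only wait left is the command's transit time over the link.
        if self.state is State.DESIGNATING:
            self.state = State.MISSILE_AWAY
        return self.state

ops = TargetingSketch()
ops.lock_on()
ops.designate()
print(ops.fire())   # State.MISSILE_AWAY
```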
The greatest lag comes from the time between the Target Engagement Authority’s command to fire and the actual release of the missile! I won’t even get into that, but understand that the people flying the drones, and the people operating the drones’ weapons, are not the same people deciding when and at whom the drone will shoot. This chain of communication and decision-making almost makes any latency in electronic communications and remote control signals a moot point.
There might still be some delay, if at the moment the order to fire is given, circumstances exist that prevent a clear shot. In that case, I presume that the drone would wait until there was a clear shot, and then fire. It probably wouldn’t be a very long wait, though.
I wonder what happens if the drone loses tracking between the order to fire and actually firing. Will it hold fire?
I guess it’s just too bad if it loses tracking after firing, but before the target is hit. If the drone were smart enough, I guess it could try to target an empty field instead?
[quote=“OldOlds, post:1, topic:855677, full:true”]
What I’m really getting at is this: I know these things are out there, but I would think that even with dedicated communications networks the total latency would make hitting a moving target very difficult.[/quote]
But how fast is that ‘moving target’ moving?
Mostly these drones are being used against surface (land or sea) vehicles. So for an enemy jeep or similar, top speed of maybe 75 mi/hr in the field. We’ve been talking here about something like 2 seconds delay as the worst case. How far could a vehicle move in 2 seconds? Aiming at where it was 2 seconds ago may still be very effective.
And the drones aren’t doing that: their electronics are tracking the designated target in real time, so they will shoot at the current location of the target that was ‘locked in’ 2 seconds ago.
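And to put a number on “how far could a vehicle move in 2 seconds”:

```python
# How far does a 75 mph vehicle move during ~2 seconds of end-to-end lag?

MPH_TO_FT_PER_S = 5280 / 3600   # 1 mph = ~1.47 ft/s

speed_mph = 75
lag_s = 2
print(f"{speed_mph * MPH_TO_FT_PER_S * lag_s:.0f} ft")   # ~220 ft
```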
They’re not “shooting” at a target in the sense of aiming a gun, pulling a trigger, and hoping the gun was pointed in the right direction when the bullet was released. The air-to-ground missiles launched by a Predator drone (Hellfire, Griffin) are guided to their target either by laser or radar. When the missile is launched, it’s already locked onto the target (which may or may not be moving), and it will steer itself toward the target even if the drone wasn’t pointed in exactly the right direction at the time it receives the command to launch (~2 seconds after the operator in Kansas pulls the trigger). If the missile is laser-guided, the laser (operated by the drone, another aircraft, or someone on the ground) is kept pointed at the target until impact. If the laser designator is operated by a ground crew, the laser can be kept manually aimed at the target until the time of impact. If the laser designator is fitted to an aircraft (either the drone or another aircraft), then it typically has automatic systems built in to keep the laser pointed at the target of choice: image processing identifies objects of interest in the field of view, lets the drone operator select a target, and keeps the laser pointed at that target even when the target (or the drone) changes position.
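To illustrate the “launch, then steer toward the designated spot” idea, here’s a toy pursuit simulation; the kinematics and numbers are my own illustrative assumptions, not a real seeker or guidance law:

```python
# Toy pursuit simulation (my own illustrative kinematics, not a real seeker or
# guidance law): the missile is launched pointing the "wrong" way, but it keeps
# turning toward wherever the designated spot currently is, so it still closes
# on a moving target.

import math

def step_missile(pos, heading, target, speed, dt, max_turn):
    """Turn (up to max_turn radians this step) toward the target, then fly forward."""
    desired = math.atan2(target[1] - pos[1], target[0] - pos[0])
    error = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    heading += max(-max_turn, min(max_turn, error))
    return (pos[0] + speed * dt * math.cos(heading),
            pos[1] + speed * dt * math.sin(heading)), heading

pos, heading = (0.0, 0.0), math.pi / 2   # launched pointing due "north"
target = [1000.0, 1500.0]                # designated spot, off to the northeast
closest = float("inf")
for _ in range(150):
    target[0] += 5.0                     # target keeps moving; the laser stays on it
    pos, heading = step_missile(pos, heading, target, speed=300.0, dt=0.1, max_turn=0.2)
    closest = min(closest, math.dist(pos, target))
print(f"Closest approach: {closest:.1f} m")  # small: it homes on the moving spot
```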