Adya Madhavan

Loitering munitions: A gamble in the sky

Credit: PTI photo

Amid the noisy exaggerations and falsehoods being spread while India and Pakistan face off, it’s possible to discern at least one technological development that’s likely to become much more prominent in the years to come: the rise of loitering munitions.


Loitering munitions are unique in that they combine the features of missiles and unmanned aerial vehicles (UAVs). After launch, they can ‘loiter’ over a target area for extended periods, as a UAV would, conducting reconnaissance. Once a target has been identified using human inputs and sensor data, they strike it with an onboard warhead. Because they are destroyed on impact with the target, they are often referred to as ‘suicide’ or ‘kamikaze’ drones.

Munitions of this kind have been used in the recent wars between Russia and Ukraine, and between Israel and its many foes. While the details are unclear, there are reports that India has used Israeli Harop loitering munitions to suppress Pakistan’s air defences, while Pakistan claims to have shot down many of them.

Loitering munitions pose new challenges – some of which have been evident in recent conflicts – because they build in a certain degree of autonomy. As these munitions grow in sophistication and become crucial to a wide range of military needs, they are likely to be granted greater autonomy still. This comes with significant risks. Battlefields are difficult environments to navigate: rubble, smoke, electronic warfare, and kinetic actions limit the useful information available to automated systems and introduce errors. The consequences are potentially huge – hitting the wrong target could lead to inadvertent escalation, friendly fire, or civilian casualties.

One key challenge with machine autonomy is that software may not understand the ‘commander’s intent’ the way humans can. ‘Commander’s intent’ refers to the articulation of an operation’s purpose, its desired result, and the ‘why’ behind it. Even in an operation carried out solely by human actors, understanding the commander’s intent is essential, and it must be conveyed with enough specificity to prevent error. As the autonomy of military systems increases, interpreting that intent correctly will only grow more difficult.

This does not pose as much of a problem for older loitering munitions, which still rely on a handful of sensors and enjoy comparatively lower autonomy. However, as autonomy increases and platforms like lethal autonomous weapon systems (LAWS) become widely deployed, the challenges of communicating ‘commander’s intent’ will become more acute.

Harop and Harpy, both developed by Israel Aerospace Industries (IAI), are widely used loitering munitions exported to various countries. The Harpy is a ‘fire and forget’ weapon designed to detect and destroy radar systems: once launched towards a target area, it searches for radar emissions in that general direction and, equipped with an explosive warhead, suppresses enemy air defences. The Harop employs newer technology and can use onboard sensors as well as radar detection to home in on both static and moving targets, enabling it to operate in more complex environments and to strike land-based and naval targets alike.

Ensuring risk-free deployment

Even with operator-in-the-loop capabilities, loitering munitions like the Harop force compressed decision cycles on commanders and operators. Their ability to loiter provides flexibility, but once a target has been identified, a call must be made rapidly, before enemy air defences eliminate the munition itself. The autonomy of ‘fire and forget’ systems poses its own challenges, since such weapons rely largely on programming and sensor input. Military decision-makers thus find themselves between a rock and a hard place, balancing their strategic needs against the risks of weapon systems that automate some decisions, taking certain critical functions out of their hands.

Beyond the compulsions and concerns of the current India-Pakistan conflict, the increasing autonomy of military technologies will have profound implications. China is already emerging as a leader in AI systems and advanced military technologies and is likely to be at the forefront of developing sophisticated autonomous weaponry. To remain strategically competitive, India must focus on developing and honing indigenous autonomous and semi-autonomous systems. For India, it is imperative to strike a balance by harnessing the potential of military autonomy without risking civilian deaths.

New, transformative technologies must be capitalised upon in full. But a balance must be struck between the utility of these technologies and the danger they may pose to the very people they are designed to protect.

(The writer is a researcher at the Takshashila Institution)

(Published 29 May 2025, 01:19 IST)