Warfare, technology, ethics and collateral damage
Posted on October 7th, 2024

Fazal Masood Malik and Farhan Khokhar, Canada

In modern warfare, few events expose its ethical complexities as dramatically as the recent Israeli operation against Hezbollah, in which explosives were concealed in ordinary pagers, leading to numerous casualties ("Israel planted explosives in Hezbollah's Taiwan-made pagers, say sources", September 20, www.reuters.com). The fusion of cutting-edge technology and conflict has blurred the line between the battlefield and civilian life, demanding urgent moral scrutiny. Currently, such attacks are initiated through programmed triggers; in the not-so-distant future, however, the algorithm itself will be responsible for identifying targets and initiating attacks. As artificial intelligence (AI) takes a seat at the table of warfare, the Israeli pager incident reminds us of the pressing need to recalibrate our ethical compass.

Throughout history, civilian casualties have been presented under the euphemism of "collateral damage" (Crawford, Neta C., Accountability for Killing: Moral Responsibility for Collateral Damage in America's Post-9/11 Wars, www.academic.oup.com). This rhetorical sleight of hand has dulled our sensitivity to the devastating loss of innocent lives. Yet, with precision-guided weapons becoming commonplace and AI promising to remove the human element from combat, we now find ourselves in a moral predicament: precision may increase, but so too does the ethical ambiguity.

Before the nuclear age, "Just War Theory" offered a moral framework that emphasized distinguishing combatants from civilians (Nathanson, Stephen, Terrorism and the Ethics of War, Cambridge University Press, 2010, www.cambridge.org). However, the advent of modern warfare, beginning with the bombings of Hiroshima and Nagasaki, has shown how technological advancements erode these boundaries. The Israeli pager operation is but one example of how civilian and military targets can be obscured, leaving us to grapple with the question of when acceptable risk morphs into indiscriminate harm. This is no longer an academic exercise; it is a matter of life and death.

Religious teachings have long offered moral guidance in times of conflict. The Holy Quran cautions, "And create not disorder in the earth after it has been set in order…" (Surah Al-A'raf, Ch.7: V.57), reflecting Islam's commitment to humane warfare. Similarly, Judaism, Christianity, and Buddhism advocate for restraint and compassion in conflict (Deuteronomy 20:19; Matthew 5:44; Contemporary Buddhism 22 (1–2): 73–87).

These ethical principles offer a beacon for navigating the stormy seas of modern warfare, where the line between combatant and civilian is increasingly blurred.

In today's world, we witness numerous armed conflicts where civilian casualties, environmental destruction, and disregard for sacred sites have become tragically commonplace. In stark contrast, we can look back to a pivotal moment in Islamic history for guidance on ethical conduct during times of war. Shortly after the passing of the Holy Prophet Muhammad (sa), Islam faced a critical juncture. Hazrat Abu Bakr (ra), newly elected as the first Khalifa, was confronted with an advancing army of apostates threatening to crush the Muslims. He upheld and emphasised the moral principles of warfare established by the Prophet (sa). As the Muslim army prepared to depart Medina, Hazrat Abu Bakr (ra), advising them not to harm any places of worship or the scholars of faith, issued a profound set of instructions: "Do not kill women or children or an aged, infirm person. Do not cut down fruit-bearing trees. Do not destroy an inhabited place. Do not slaughter sheep or camels except for food. Do not burn bees and do not scatter them." (Muwatta Imam Malik, Kitab al-Jihad, Book 10, Hadith 10)

As we advance into the era of AI-driven combat, the ethical stakes become even higher. The integration of AI introduces new dilemmas surrounding accountability. If an algorithm miscalculates and causes civilian casualties, who is responsible? The programmers? The commanders? The machine itself? These questions reflect an urgent need to establish ethical guidelines that minimize harm to innocents (Editorial: Ethical challenges in AI-enhanced military operations, www.frontiersin.org).

Addressing these dilemmas requires more than individual morality; it demands systemic reform. Military organisations must prioritize the protection of civilians at all levels of decision-making (Crawford, Neta C., Accountability for Killing: Moral Responsibility for Collateral Damage in America's Post-9/11 Wars, www.academic.oup.com). This includes investing in non-lethal technologies and focusing on conflict-resolution strategies. The aim must be to steer the world towards peace rather than further destruction.

The stakes are nothing short of monumental. Civilian casualties do not merely cost lives; they also sow resentment, weaken public support, and prolong conflicts. The Holy Quran's wisdom, "And if they incline to peace, then incline thou also to it…" (Surah Al-Anfal, Ch.8: V.62), transcends religious boundaries, offering a strategic insight that restraint is often the most effective course of action.

So, how do we proceed in a world increasingly driven by greed and aggression? First, global militaries must develop rigorous ethical frameworks for emerging technologies, drawing from both secular and religious traditions. Second, decision-making processes need an overhaul to prioritize civilian protection (Roblyer, D. A. (2005), Peace and Conflict: Journal of Peace Psychology, www.doi.org). As Israel's recent operations demonstrate, decisions are often made behind closed doors, leaving casualties uncounted and accountability elusive ("Israeli undercover forces disguised as women and medics kill three Palestinians in West Bank hospital", www.pbs.org).

Finally, there must be greater transparency about military actions and their consequences. Governments must foster trust with their citizens by providing clear and accurate information about operations. The details are often obscured to prevent public outcry, an approach that erodes trust and undermines democratic accountability (Roblyer, D. A. (2005), Peace and Conflict: Journal of Peace Psychology, www.doi.org).

Technological advancements should prompt us to devote more resources to developing non-lethal alternatives and improving diplomacy. Navigating this ethical minefield won't be easy; war is chaotic, and split-second decisions can have devastating consequences. Yet, if we fail to confront these moral challenges, we risk stumbling into a future where warfare becomes increasingly lethal and indiscriminate.

The words of the Prophet Muhammad (sa), "The best of people are those who bring the most benefit to the rest of mankind" (Ibn Abi ad-Dunya, Qada' al-hawa'ij, Hadith 36), serve as a guiding light. In the age of AI, as we create ever more efficient methods of warfare, accountability for our actions erodes further.

While technology may revolutionise the battlefield, some truths are immutable. Innocent lives matter. The use of excessive force without responsibility is counterproductive. And lasting peace can only be achieved through justice, not military dominance. As we navigate the ethical terrain of 21st-century warfare, these principles must remain our guide. The task before us is clear: to harness technological innovation while holding fast to our ethical convictions. Anything less risks leading us into a grim future where collateral damage becomes the accepted norm.
