Global Issues

Normative Breakdown in the Digital Battlespace: Reconstructing International Humanitarian Law Amid Algorithmic Warfare, by Fransiscus Nanga Roka



Over the centuries, war has always been a test for law. In the age of algorithms, though, this test has become something more: law is no longer merely tested but stretched. In 1996, the International Criminal Tribunal for the former Yugoslavia (ICTY) unsettled the accepted understanding of the laws of war, finding that much of that law had never been codified and had developed instead as custom, a body of practice far removed from what passes for customary usage today.

At first sight, this may seem beside the point. Operations against high-value targets are scrutinized at the highest levels of a state's military and political leadership, and we expect to be able to hold those leaders accountable. In reality, that premise is being overturned, because a new actor has entered the field, one that neither sees nor judges: the algorithm.

This kind of warfare is steadily automating military targeting. These systems promise precision and efficiency. But beneath that promise lies a much grimmer picture: the erosion of legal responsibility. When a machine makes the decision, who gets the blame?

IHL's fundamental principles of distinction, proportionality, and precaution all depend on human perception. Take distinction: identifying who is a civilian. Proportionality requires weighing the military advantage of an action against the harm done to civilians. Precaution demands foresight and restraint. These are not dry calculations; they are serious moral judgments embedded in law.

But algorithms lack moral judgment. They can only process data, and data in war is never neutral. It is incomplete, reflecting the biases and prejudices that produced it. Training datasets skew toward their developers' perspectives. Pattern-recognition systems infer threats from behavior, metadata, or proximity, so that human lives are reduced to probabilistic profiles. What is at risk here is not just error.
It is systemic misclassification. Civilians are no longer protected persons; they are data points with risk scores. That is where normative failure takes root. IHL does not decline because states formally abandon it. It collapses because its principles are silently transmuted into technical parameters that no longer carry their original meaning. Proportionality, as we once understood it, becomes a code threshold. Distinction becomes a classification label. Precaution becomes optimization.

Law was not so much violated as rewritten. Rewritten into jargon that escapes examination.

As a result, algorithmic systems operate in an accountability void.

Their workings are neither open nor public; they are often kept under wraps, shielded by security concerns or claims of proprietary control. When civilians are harmed, responsibility is diffused across a chain of actors: programmers, commanders, contractors, and machines. Each link widens the gap, making it ever harder to hold anyone to account.

This architecture does not eliminate responsibility; it dissolves it.


More worrying still is how fast digital warfare moves. Decisions that once required deliberation can now be taken in milliseconds. The pace of war has outrun legal review. Meaningful human control, once thought to serve as a safeguard, has been reduced to a routine procedure.

There may be a human “in the loop,” but not in control.

This creates a potentially fatal illusion: that the mere presence of a human operator makes the act lawful. In fact, it often masks how far decision-making has already been handed over to systems beyond human understanding.

The paradox is stark. The more technologically sophisticated warfare becomes, the less able law is to govern it, even as law persuades us that it still does.

And yet, the international response remains half-hearted. Calls for ethical guidelines, voluntary codes of practice, and non-binding frameworks betray a fear of meeting the challenge squarely. We are regulating algorithms as if they were tools, when they are in fact becoming legislators.


This is not merely a legislative oversight; it is an omission in the very architecture of governance. If IHL is to remain relevant, it must be rebuilt.

First, meaningful human control must be redefined: not as institutional supervision, but as real decision-making power that cannot be delegated to anyone or anything else. Second, the operation of algorithmic systems in warfare must be subject to strict transparency and audit requirements, including independent verification of legality. Third, responsibility must be re-situated: liability cannot be shifted onto machines; it must attach clearly and effectively to people and institutions.

Above all, the law should not try to translate its principles into wholly technical terms. Not everything that can be calculated can be justified in advance. When legal judgment is turned into code, human life itself becomes a variable. A system sold to us as making war more mathematical while pinning responsibility on no one is nothing but a fraud. And when war is waged by formulas that cannot fathom the value of human life, law is not merely slow to catch up; it loses its meaning.

Will the law govern war on the digital battlefield? The real question is whether war has not already moved beyond the law.

Fransiscus Nanga Roka

Faculty of Law, University 17 August 1945 Surabaya, Indonesia
