
Israeli armed robots to patrol borders


State-owned Israel Aerospace Industries has unveiled the "REX MKII", a four-wheel-drive, remote-controlled armed robot capable of patrolling battle zones, tracking infiltrators and opening fire.


GS-III: Science and Technology, GS-IV: Ethics

Dimensions of the Article:

  1. Benefits of using robots in war
  2. Disadvantages of robots in war
  3. Ethicality of using Lethal Autonomous Weapon Systems (LAWS)

Benefits of using robots in war

  • Autonomous robots, because they are not physiologically limited, can operate without sleep or food, perceive things that people cannot, and move in ways that humans cannot. A broad range of robotic sensors also equips them better for battlefield observation than human sensory abilities allow.
  • Robots offer the following benefits: faster, cheaper and better mission accomplishment; longer range, greater persistence and endurance, and higher precision; faster target engagement; and immunity to chemical and biological weapons.
  • Robots do not need to protect themselves when target identification is uncertain. Autonomous armed robotic vehicles need not treat self-preservation as a foremost drive, if at all, and can be used in a self-sacrificing manner where needed and appropriate, without reservation on the part of a commanding officer.
  • Reducing the loss of human life is one of the core principles of the ethics of war, and the use of robots can help accomplish it.

Disadvantages of robots in war

  • The use of robot soldiers will lower the cost of war, making future wars more likely. With machines rather than human soldiers at risk, the threshold for entering warfare may fall, which could violate the conditions of just warfare.
  • Such weapons are worrisome because they can’t be trusted to distinguish between combatants and civilians or make proper calls about the harm attacks may do to nearby civilians.
  • Machines cannot understand the value of human life, which in essence undermines human dignity and violates human rights laws. Therefore, machines are likely to commit atrocities and violate the basic rules of war like the Hague Conventions, and other declarations delimiting how a war should be fought.
  • There will always be risks such as proliferation of the technology to other nations and to terrorists. Robotic machines are also vulnerable to cyber-security attacks and hacking, and could be turned against their own people.

Ethicality of using Lethal Autonomous Weapon Systems (LAWS)

  • On the international stage, the phrase “meaningful human control” is discussed as a key component of the ethical development of LAWS. Yet no consensus has been reached as to what “meaningful human control” would look like in reality.
  • Many philosophers, and probably most ordinary people, believe that morality cannot be boiled down to a list of instructions. This view goes back to Socrates: acting morally is a "craft" that takes experience, practice and nuance, and it requires something more, a judgement or intuition or moral sense, that is not expressible in words. If that is right, then morality could never be captured in a set of requirements and simply handed over to a machine to follow perfectly.
  • No matter how complicated a machine becomes, it will never be able to act for the right reasons. "Acting for the right reasons" matters because it has strong intuitive appeal: there is a difference between someone who saves a drowning child out of pure selflessness and someone who does it hoping to be rewarded handsomely. One action is performed for the right reasons and the other is not. There is also a long history in the military ethics tradition of arguing that soldiers should fight in war only for the right reasons.
  • Naturally, some people will liken autonomous weapons to very smart cruise missiles, but the comparison is faulty: cruise missiles and bullets do not decide that specific people should die. Behind every cruise missile and bullet there is a human who made that decision and who transmits their intention through the weapon. The human acts for reasons; the weapon decides nothing. Autonomous weapons are not like that: they make lethal decisions on their own.

-Source: The Hindu

April 2024