• We are fast approaching a pivotal moment in the global arms race for Lethal Autonomous Weapons (LAWs)
  • Artificial intelligence (AI) and machine learning will disrupt how modern warfare is conducted
  • Major countries and technology companies all over the world are developing LAWs capable of acquiring, identifying and engaging targets without any meaningful human control
  • Israel Aerospace Industries’ Harpy, for instance, loiters high in the sky surveying the terrain; when it detects an enemy radar signal, it dives into the source and destroys it
  • Demands have risen for a comprehensive international treaty to pre-emptively ban the development of AI and other technologies in the field of LAWs
  • The proliferation of such weapons could have widespread ramifications on the way warfare is conducted and the future of our society
  • Algorithms used in LAWs can internalize the prejudices present in their training data, yet cannot account for human suffering, and could therefore cause extensive violence
  • This leads to the next major challenge of this technology: fixing accountability. Who should be held responsible for the unintended actions of a LAW? Its developer?
  • With LAWs, we jump off a moral precipice: autonomous machines decide who lives and dies. These weapons reduce war to objective calculations, whereas the causes of warfare are inherently subjective: nationalist sentiment, political strategy and human disagreement.
  • Also, what if this technology falls into the wrong hands?
  • Incidents all over the world have shown that even the most advanced security systems are susceptible to hacking.
  • Terrorists or rogue states could use such weapons on civilians
  • In time, the economic feasibility of such weapons will lower the cost barriers to war. Conflict will become a game in which civilian and personnel casualties are mere statistics on a screen.