Tuesday, July 27, 2010

Robotic Warfare: A Fucking Conundrum.

Modernity has presented humanity with a plethora of innovations that have served as fruitful economic drivers alongside increasingly frequent paradigm shifts in the human knowledge base. It has often been observed that when the military complex adopts a new technology, it is only a matter of time before that innovation trickles down to the masses. We've seen this with microwave technology and computer networking, to cite two relatable examples. In this year of 2010 we are well on our way into the new robotic age, with companies like iRobot and their ever popular Roomba being a shining example.

Prior to this era, robots were deployed to replace the menial grunt work of the common assembly worker. These machines were designed to perform a single function repeatedly. The new age we have entered, for good or for ill, has seen robotics grow more sophisticated and refined across all facets of their utility. In an effort to gain a strategic advantage in the current theatres of war, unmanned drones are becoming an increasingly attractive option, particularly within the American forces. They keep boots off the ground and enable a free and remote range of motion. America currently employs approximately 5,000 robots in Afghanistan, 1,200 of which are unmanned aerial vehicles, or UAVs, like the Predator drone.

This new brand of warfare presents a huge moral grey area and a list of very serious problems. First, let's explore the conundrum of the "Cubicle Warrior". The American Predator drone is a remotely controlled unmanned aerial vehicle, capable of prolonged surveillance and equipped with two Hellfire missiles. The machine is operated remotely by a specially trained soldier in a safe, faraway location. The problem presented by this model of combat is the total depersonalization of combatants. The UAV operator need not fear injury or death, while being fully capable of inflicting both. When damage is done, the operator witnesses the carnage not with their own eyes but through a digital representation. The operator may also pass off control of their UAV to another operator, then grab a coffee and take a cigarette break. The operator can end their shift, drive back home and spend the evening in the comfort of their family. The combatants on the other side of the coin are faced with an enemy that does not sleep, feel pain or know fear. This fact in itself could act as a rallying cry to escalate their cause against what they would perceive as a technologically endowed but cowardly enemy.

Another major issue is that once the technology becomes less expensive and more widely available, private interests and individuals could use it for their own aims, abroad or within their own state. When a machine is used to commit a war crime, who does the hammer of justice fall upon? The machine, which has no will of its own, or an unknown operator secluded in another country, outside the jurisdiction of their accuser? Does the operator suffer the consequences, or their superior sitting in an office issuing orders? What happens when the technology becomes sophisticated enough to achieve autonomy and acts seemingly of its own accord? These questions currently have no answers, and this scenario is not ten years away; it is happening now, in 2010.

We have international laws that dictate the rules of engagement, but these rules currently have no way to address these problems. Laws such as the Geneva Conventions still reflect a Cold War age, from before the acceleration of Moore's law outpaced our common sense.
