International Affairs Forum: In your book [“Wired for War”] you talk about “digital warriors” and the very identity of who fights our wars. Can you expand on that a little bit?
Dr. Peter Singer: The idea here is that there’s something very big going on in the history of war, and maybe even of humanity itself. We went into Iraq with a handful of unmanned planes (drones, UAVs); we now have over 7,000 in the U.S. military inventory. We went into Iraq with zero unmanned ground vehicles in the invasion force; we now have over 12,000. And these are just the start; these are the Model T Fords, the Wright brothers’ Flyers, compared to what’s coming. In fact, one of the people who appears in the book is an Air Force three-star general who talks about how we’re soon going to have tens of thousands of these robots fighting our wars. And it’s important to note here that we’re not talking about tens of thousands of today’s robots, the PackBots and the Predators, but the ones that come after them.
And this is the idea that you have Moore’s Law in action with our technologies: every two years or so, they double in their computing power. What that means is that the kinds of things we only used to talk about in science fiction now really need to be talked about by those who explore issues of war, and politics, and ethics, and law. It’s the idea that we are at the forefront of a robot revolution, and that is what connects to this sense of “digital warriors”.
Because we’re not talking about a revolution where, you know, Arnold Schwarzenegger shows up at your door à la “The Terminator”. It’s not that kind of revolution. It’s a revolution in war itself. Every so often a technology comes along that rewrites the rules of the game; it forces us to ask new questions about not only what’s possible but what’s proper. And the interesting thing is that usually in war it’s been a technology that changes the “how” of war: a system with a dramatically bigger boom, like the atomic bomb; a system that shoots dramatically faster, like the machine gun; a system that reaches dramatically further, like what gunpowder or the longbow allowed. Now, that’s definitely happening with robotics in terms of changing the how of war. But robotics is also affecting the “who” of war. It is a revolution that is reshaping warriors’ experience and warriors’ very identity.
Another way of putting the big historic deal here is that humankind has had a 5,000-year monopoly on fighting wars. And that monopoly is breaking down in our lifetime.
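[Editor’s note: as a rough aside, the “doubling every two years” arithmetic Dr. Singer cites above can be made concrete with a minimal back-of-the-envelope sketch. The time horizons below are illustrative assumptions, not figures from the interview.]

```python
# Back-of-the-envelope sketch of the Moore's law arithmetic cited above:
# computing power doubling roughly every two years.

def growth_factor(years: float, doubling_period_years: float = 2.0) -> float:
    """Multiplicative increase in computing power after `years`."""
    return 2.0 ** (years / doubling_period_years)

if __name__ == "__main__":
    # Illustrative horizons (an assumption, not from the interview):
    for horizon in (10, 20, 25):
        print(f"{horizon} years out: ~{growth_factor(horizon):,.0f}x today's computing power")
```

Run as-is, this prints roughly 32x at 10 years, 1,024x at 20 years, and about 5,800x at 25 years, which is the sense in which today’s systems are only the start.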
IA-Forum: Let’s start with talking about robots on the battlefield. How about describing what these robots basically are, what they’re doing in Iraq and Afghanistan, and who’s operating them?
Dr. Singer: These robots come in all shapes and sizes, and they’re used in every domain of war. In Iraq we have more than 22 different types of ground robots at work right now. One of the most popular, so to speak, is the PackBot, a robot made by the iRobot Company; the same company that brought you the Roomba vacuum cleaner also brings you the PackBot military robot. It’s a very interesting company that sells both to the Pentagon and to Linens ‘n Things. The company is of course named after Isaac Asimov’s “I, Robot” and the Will Smith movie of the same name, which is pretty telling, because in that fictional world robots start out doing day-to-day chores and then move on to making life-and-death decisions, which is actually the trend line we’re following with our real-world robots.
The PackBot started out being just for observation and reconnaissance, and then was utilized for things like counter-IED work, that is, going after those roadside bombs and defusing them. And now they’re exploring arming it. Actually, this is similar to what’s happened with the PackBot’s main corporate rival, the TALON, made by the Foster-Miller Company. The TALON similarly started out just for surveillance, and they’ve since armed it, first with the SWORDS system, which packs everything from machine guns to missiles, and now with its new version, the MAARS (M-A-A-R-S). The point here is that robotics is following the same pathway we saw with ground systems using the internal combustion engine, that is, cars, trucks, and tanks, and with those that fly. The first planes were just for observation. Then someone said, “You know what, I can see them, I want to do something about it”. So we first had jerry-rigged bombers, which is the same thing that happened with robotics.
The first robot to draw blood on the battlefield in Iraq was not one armed with a gun; specifically, it was one that soldiers had jerry-rigged by putting Claymore mines on top of it. They’d drive it up to the enemy and then explode it. But then you get specially designed bomber planes, and the same thing is happening with our robotics on the ground, and of course with the Predator drones in the air. And then you start to move into, “Hold it, they’re bombing me, I’m bombing them, I want to do something about it in the air”. So planes were designed to fight each other, and that’s the trend line that we’re following here.
IA-Forum: What do you think are the most controversial legal, or ethical, elements to using these unmanned warbots?
Dr. Singer: You can’t even get into the issue of “most controversial”, because basically it’s like asking what’s the most controversial thing to come out of Pandora’s box. It goes in so many different directions. It goes into questions like, does it make war more or less likely if we start to view our wars as costless? How does it affect the public’s relationship with a war when people can now watch what’s happening, but not have to experience it, via the YouTube clips that our drones capture? What are the consequences for the warriors’ experience when they can go to war but not actually have to go to a place where there is any risk? How does the other side view it? What is the impact of robots on the war of ideas; that is, not what message we think we are sending with these systems, but what message is received in places like Lebanon or Pakistan? What questions of accountability pop up as you have systems that are incredibly powerful but haven’t gotten rid of Murphy’s Law? You still have mistakes, you still have errors, you still have the wrong people killed. Who do you hold responsible: the mission commander, the pilot, the salesperson, the computer programmer? What happens as you get into systems that have more and more autonomy and start making more and more of their own decisions? Does that make war crimes more or less likely? To what system of law do you turn to adjudicate those questions? There’s a scene in the book where I went around asking that at Human Rights Watch, and two senior leaders there got into an argument in front of me as to whether it’s the Geneva Conventions we should turn to or the Star Trek Prime Directive.
If you want to be an ethical roboticist, what kind of system should you build, and what kind shouldn’t you? There’s no answer to that. What about the ethics of who gets to control these systems? Is the Predator drone just a military system that should be limited to the military? Well, guess what, too late; the Department of Homeland Security already has six. Is it something that we should be OK with local police departments having? Well, guess what, too late. The London, Vancouver, and L.A. police departments are all exploring the purchase of drones. How about me? Is it my Second Amendment right to have a robot that bears arms? The point is, these are the kinds of things that people once would have debated at science fiction conventions. We are talking about real systems that are already here today, and those who wrestle with issues of war and politics need to get their heads out of the sand when it comes to the capabilities and possibilities of the technology that’s already here.
IA-Forum: You mentioned that by using warbots to do some of our fighting, we’re removing some of the risk of direct participation. Isn’t this why some in the Muslim world say that the U.S. using robots is cowardly?
Dr. Singer: It’s a very interesting illustration of how sometimes the message you think you’re sending is not the message that’s received. It’s also an example of a tough policy dilemma right now: what is the impact of robots on the very real, very human war of ideas that we’re fighting against radical groups around the world? So I went around trying to find the answer to this. On the message we think we are sending, I thought a senior Bush administration official put it well when he described how our “unmanning” of war, quote, “plays to our strength; the thing that scares people is our technology”, end quote. That’s the perception. But what about what’s being received, and received by those very people? I think another person who appears in the book put it really well; he’s the leading newspaper editor in Lebanon, and he described this while there was actually a drone flying above him at the time. He said, “This is just another sign of the cold-hearted, cruel Israelis and Americans, who are cowards because they send machines to fight us. They don’t want to fight us like real men, and all we have to do to defeat them is just kill a few of their soldiers”. So it’s this absolute disconnect between “message sent” and “message received”. Or as another analyst put it, quote, “the optics of this look really freaking bad; it makes us look like the evil empire from Star Wars, and the other side look like the Rebel Alliance”.
IA-Forum: That’s a good analogy.
Dr. Singer: But what are you going to do? Because these drone strikes, for example, are actually very effective. Eleven of the top 20 Al Qaeda leaders we’ve killed were taken out not by boots on the ground, not by manned bombers or spies or the like, but by robotic drones; 11 out of 20, the majority. And if you had not utilized the drones and had sent in forces on the ground instead, you would most definitely have had U.S. casualties, which you didn’t have with the drones. You would likely have had much greater civilian casualties, and you probably wouldn’t have gotten those 11 out of 20. The problem, though, is again the broader context, because we are getting very good at killing leaders with these technologies, but we are also getting really good at convincing the 12-year-olds to join these groups through the use of these technologies. There’s a good illustration in the book: a popular rock song in Pakistan last year whose lyrics describe how America doesn’t fight with honor, how it looks at Pakistan as if it were just killing insects on the ground.
IA-Forum: Let me switch for a second here and go back to who’s operating some of these remote, unmanned vehicles. My understanding is that there’s a site in Nevada. Should the 19-year-old drone pilot who’s fighting the insurgency in Iraq while never leaving Nevada be considered a “remote combatant”, and therefore a legitimate target for the enemy, even as he sits in the U.S.?
Dr. Singer: It’s a great question, and actually it’s one that not just the 19-year-olds ask. One of the people who appears in the book is a Predator drone squadron commander who led those 19-year-olds. He talks about the very different issues that commanding a drone squadron brings versus a manned squadron, in particular the challenges of fighting from home, of still having your family around you while you’re dealing out death during the day. But one of the things he raised, in terms of his confusion about the laws of war, is that he very much saw himself as a combatant in the cockpit. That is, when he was flying that system, he saw himself as a legal combatant: he wore a uniform, and what he did fit within the laws of war as he understood them. But he did not know, and he posed it as a question: what happens when I walk outside the door and I’m back in the U.S.? What if I’m walking down the street in Las Vegas and someone shoots me? Is that an act of war? And does it matter whether they do it up close or from behind me, from afar?
We may think it seems right or wrong, but again, how does that fit in? Is it an act of war? Is it an act of terrorism? Is it an act of murder? He was a combatant just a couple of minutes ago; does his geographic location change that? He didn’t have a good answer, and it points to these issues. The takeaway here is this: we have laws of war that are so old right now they qualify for Medicare, and yet we’re trying to apply them both to 21st-century technologies and to a war against 21st-century adversaries. So you have technologies like the Reaper drone on one side of the challenge to the laws of war, and on the other side we’re using them to target insurgents who know we’re not supposed to blow up hospitals, so they put their weapons in the hospital; who know we don’t like to target women and children, because that’s against our understanding of the laws of war, and so surround themselves with women and children. So you’ve got this double pressure on the understandings of the laws of war.
IA-Forum: So how do you think the enemy will adapt to this robotics revolution in war fighting?
Dr. Singer: We’re already seeing adaptation in lots of different ways. You’re seeing a very quiet back-and-forth in technology, even in counterinsurgency. There’s a mythology out there, as a prominent political science professor put it, that Iraq was the war that proved technology didn’t matter. No, Iraq was actually the war where robotics were finally accepted. Iraq is a war with a very interesting and important back-and-forth of technologies. For example, the insurgents have come up with 90 different ways to detonate IEDs. We’re using robotics; they’re using jammers against our robotics; back and forth. And these technologies have proven to be critically important even with things like the surge, which is supposedly just the story of adding more troops, or the Sunni Awakening. Well, by the way, it’s also the story of Task Force ODIN, a unit that used a combination of human intelligence, analysts, and unmanned drones to kill more than 2,400 insurgents who were making or placing IEDs. That’s what broke the IED network in Iraq.
The point here is that anyone who speaks in absolute terms is always wrong, and I say that with a full sense of the irony. Technology is not useless in counterinsurgencies, nor is it the silver bullet the way the Rumsfeld folks thought. So, to circle back to your question, the responses are everything from developing their own tactics and procedures to avoid or mitigate our unmanned systems, to their responses in the war of ideas, painting us as cowards for using them or trying to amp up the level of civilian casualties around these incidents, to actually developing and using their own systems. And this is another aspect of what’s changing in war. Just like the software industry, warfare is going “open source”; that is, it’s not just the big boys that can use the most advanced technologies anymore. An illustration of that: Hizbollah may be a terrorist/paramilitary group, and it’s certainly not a nation-state, but in its war against Israel it flew four drones back at Israel.
IA-Forum: The subject you’re writing about points to a larger debate in a globalized business world, where U.S. military robots have, as you say, “computer chips made in China with software from India”. How do you think the U.S. national security community reconciles the possible vulnerabilities that could come from these amalgam products?
Dr. Singer: That’s a major, major challenge. And the way I think about it is this: there is a lesson of both war and technology, which is that there is no such thing as a permanent first-mover advantage. In technology, your readers are probably not reading this article on their Commodore computers. The same is true in war: the British invented the tank, and the Germans figured out how to use the tank better. And the challenge, particularly for the U.S., is that we may be ahead in military robotics, but we are most certainly not the only ones in this field. There are 43 other countries building military robotics today, and they include countries like Russia, China, Pakistan, and Iran. Actually, just last week we shot down an Iranian drone over Iraq. But there’s something beyond that, which you just linked to, which is the question of where the state of the American manufacturing economy, and I would add the state of science and mathematics education in our schools, takes us in this revolution. What does it mean to be sending out more and more “soldiers” whose hardware is made in China and whose software is written in India? And it also illustrates another new domain, a new direction of war, that using digital warriors takes you into: you have battles not merely of destruction but battles of persuasion. That is, there is an enemy system; in the past I would look at that system and say, how do I blow up that tank, how do I disable that tank using kinetic power? Well, now, if it’s digital, I have the possibility of “how do I jam it”, or even better, “how do I persuade it”, that is, “how do I hack into it and make it do what I want it to do?” And you can’t do that to a human brain, but you can to a digital system.
IA-Forum: Do you see us taking robotics into space, and having cybersecurity issues in space?
Dr. Singer: The point here is that in every place, every domain where humans fight, we are now starting to use more and more robotics. That’s true in the air, on land, and at sea. Humans are also actually starting to engage in conflict, or prepare for conflict, in new domains that we haven’t fought in before, and those include cyberspace and outer space. By their very nature, these latter two almost require you to be using bots. In cyberspace, for one, this isn’t the world of Tron; we can’t shrink ourselves down into little figures and get in there. When it comes to things like hacking and the like, human hackers are good, but it’s actually bots, AI [artificial intelligence], that is really where this is all headed. And there’s actually a debate as to whether a virtual bot meets the definition of a robot.
The human role in cyberspace is by its very definition circumscribed, even though we know it’s going to be a domain of conflict. Outer space is another area that almost has to be unmanned, first by sheer cost. It takes a little over $9,000 per pound to launch something into space. So getting a human up there, and more importantly the food, air, and water to keep them alive, is prohibitively expensive compared to an unmanned system. Humans are also incredibly vulnerable once they’re up there: pop a little hole in the spacecraft, out goes all the oxygen, and you’ve destroyed the system. That’s not the case for an unmanned system. So until we get to the age of energy shields à la the Starship Enterprise, any conflict in space is, by its very nature, going to be unmanned. And that just leads to the question: is there a potential for conflict in space? Well, we don’t know if it’s going to happen, but we know that the U.S. and other nations have put an incredible amount of value, both economic and military, up in space. We’re dependent on everything from spy satellites to GPS; we can’t operate our military without what we have in space. That in turn makes it a critical vulnerability that others want to target. So the U.S. has done over 20 studies of conflict in space, and in turn, last year a senior colonel in the Chinese military said that if the U.S. thinks it’s going to be the only space superpower, it had better think again.
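[Editor’s note: as a back-of-the-envelope illustration of that cost argument, here is a quick sketch using the $9,000-per-pound figure from the interview. The crew, life-support, and spacecraft weights are rough assumptions made purely for illustration, not figures Dr. Singer gives.]

```python
COST_PER_LB = 9_000        # launch cost per pound, cited in the interview

# The weights below are rough assumptions for illustration only.
ASTRONAUT_LB = 180         # one crew member
LIFE_SUPPORT_LB = 2_000    # food, air, water, and habitat allowance
UNMANNED_CRAFT_LB = 300    # a small robotic spacecraft

manned_cost = (ASTRONAUT_LB + LIFE_SUPPORT_LB) * COST_PER_LB
unmanned_cost = UNMANNED_CRAFT_LB * COST_PER_LB

print(f"manned launch:   ${manned_cost:,}")    # $19,620,000
print(f"unmanned launch: ${unmanned_cost:,}")  # $2,700,000
```

Even under these generous assumptions, the manned launch costs several times the unmanned one before the vulnerability issue is even considered.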
IA-Forum: Would you briefly describe the “mothership” concept of fighting versus “swarming”, and how these might be applied to unmanned war? You say, “Swarms are decentralized in control but concentrate firepower, while motherships are the opposite”. Correct?
Dr. Singer: Yes. And the pullback on this is that it’s really a question of doctrine. What we know from the history of war is that it’s often not how good your weapons technology is, or even how many weapons you have, but your doctrine: how you train, organize, and equip your force to use them in war. The best example of this is the doctrinal questions that came out of World War I related to the tank. The British and the French actually had more tanks than the Germans. They were the ones who invented and first used the tank; by some arguments they even had better tanks. But they chose the wrong doctrine for how to use those tanks, and that’s why the Germans cleaned their clocks in the opening stages of World War II. The question for robotics is that there seem to be two doctrinal directions we can go in. One is the idea of the “mothership”, which is concentrated command and control and distributed firepower. It’s the idea of a guy sitting in a mothership with lots of little unmanned systems around him linked back to it, and he’s pointing and clicking and moving them around. It’s the way we’re headed with, for example, the LCS, the Navy’s Littoral Combat Ship. The opposite end of the spectrum is the “swarm” model. Imagine it like bees: distributed decision-making but concentrated firepower. That is, you have lots of small, maybe not even all that intelligent systems that distribute themselves around, each following its own protocols; then they find a target, swarm it, circle it, and overwhelm it. That model has actually been used in war before; the Mongols used it, and insurgents in some cases use it. Now, the question is: which is the right doctrine? We don’t know yet. The interesting thing, though, is that we could be heading down certain doctrinal pathways, in terms of what we buy and what that locks us into, without knowing yet which one is best. And it’s those kinds of decisions that have huge consequences in history.
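[Editor’s note: to make the doctrinal contrast concrete, here is a minimal toy sketch, entirely an editorial illustration rather than anything from the book. The “mothership” units move only on orders from a single controller, while the “swarm” units each follow a simple local rule; all names, numbers, and rules are assumptions for illustration.]

```python
import random

TARGET = 50.0  # position of a notional objective on a 1-D battlefield

def mothership_step(units, controller_online):
    """Concentrated command and control: units advance toward the
    target only when the central controller issues orders."""
    if not controller_online:
        return units  # no orders means no movement
    return [u + 1.0 for u in units]

def swarm_step(units):
    """Distributed decision-making: each unit independently follows
    its own local protocol (advance on the target, with some noise)."""
    return [u + 1.0 + random.uniform(-0.3, 0.3) for u in units]

mothership_units = [0.0] * 5
swarm_units = [0.0] * 5
for step in range(50):
    # Assume the mothership's single point of control is knocked out
    # halfway through the engagement.
    mothership_units = mothership_step(mothership_units, controller_online=step < 25)
    swarm_units = swarm_step(swarm_units)

print("mothership positions:", [round(u, 1) for u in mothership_units])
print("swarm positions:     ", [round(u, 1) for u in swarm_units])
# The swarm keeps closing on the objective; the mothership force stalls
# the moment its central controller is lost. That, in miniature, is the
# resilience trade-off between the two doctrinal models.
```

The sketch only illustrates the single-point-of-failure issue; it says nothing about which doctrine is right, which, as Dr. Singer notes, is still an open question.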
IA-Forum: We’ve talked a little bit about these insect-type robots, bee-bot things tiny enough to be released in a city, that could either spy on people or release chemicals, for instance. This provides a tactical advantage on the battlefield, perhaps, but what about their use domestically? How do we sort out where to draw the line with homeland security?
Dr. Singer: Oh gosh, great, great question. I remember thinking about this when I was at a certain Pentagon research lab and they showed me a system that literally fit on the tip of your finger; it was absolutely extraordinary and frightening at the same time. To me it’s really an ethical and legal question at the end of the day: who gets to control these systems, and what are the parameters under which they can use them? In the domestic sense, police departments are already exploring the purchase of these kinds of systems, in places like London, Vancouver, and Los Angeles. I was just meeting earlier today with someone from the French defense ministry, who said they were exploring it for the purpose of having one fly over the suburb areas where they’ve had all the riots, where the minorities are gathered.
And the L.A. police department wants to put one over high-crime neighborhoods. The thing is, by a certain logic that makes great sense. Why wouldn’t you want that “all-seeing” eye in the sky to see what the bad guys are doing? Why wouldn’t I want this tiny bug that could sneak up on the guy who’s holding a child hostage inside a building? You can’t get the police up close, but the little drone could sneak up on him and find him. That’s incredibly appealing, but of course there’s always the other side to this tension. I may not be so psyched to have that drone, that all-seeing eye in the sky, over my neighborhood. I may not be so psyched to have that little bug crawling up and listening in on my room à la Minority Report. And the response back would be, “Well, if you’re not doing anything bad, what do you have to worry about?” And the response back to that: “Well, have you seen Minority Report? Do you like the scene where the little bugs walk up to the six-year-old and get her ID?” She’s not doing anything bad, they’re just checking her ID, but it’s traumatic. And this is the thing here: think not just about the questions of rights, but about the questions of blowback. What’s the message that the French government, for example, is sending to the people in that neighborhood when it carries out that policy? It’s a cross between “we’re Big Brother, always watching you, and we find you suspicious”, and “we’re afraid to have our people on the ground, we’re afraid to engage with you on a day-to-day basis”. So it’s both a “we inspire fear” and a “we reveal our own fear” message.
There’s a broader issue here, which is one of political theory. We may be living through the end of humankind’s 5,000-year monopoly on fighting wars; we’re also potentially living through the end of the state’s 400-year monopoly on war. The whole idea of Hobbes’s social contract hinged on a deal made between the sovereign and the citizen. We always talk about one side of the deal, which is that the sovereign protects the citizens; otherwise life would be nasty, brutish, and short. But there’s another side of the deal we often fail to add in, which is that the sovereign respects the citizen’s rights: you may have to be loyal to the king, but the king has to respect that you’re a king in your own castle. And the interesting thing is what these technologies do to that deal. But sorry, just to walk back for a second: that deal comes about during the gunpowder revolution.
You know, war made the state, and the state made war. Well, now we’ve got a new revolution, and these new technologies that are coming along are making both sides of the deal more difficult. The state is finding it harder and harder to deal with small groups of individuals who have incredible power; a couple of guys with a musket can’t really disrupt a state, but with these new technologies a small group can pose a much bigger challenge. So that’s one part of the deal, the promise of protection, that states find it harder and harder to deliver on. And what’s the state’s response? Universal observation, more involvement in citizens’ lives.
So the state is also finding it harder to deliver on the second part of the deal, respecting the bubble around you, because it’s saying, “I’ve got to be able to pierce that bubble; I’ve got to be able to look inside in order to protect you”. And so it’s kind of the political-theory equivalent of “to save the village we had to destroy the village”.
IA-Forum: Thank you, Peter.
Peter Warren Singer is Senior Fellow and Director of the 21st Century Defense Initiative at the Brookings Institution. Dr. Singer has written two other books: Corporate Warriors: The Rise of the Privatized Military Industry (2003) and Children at War (2005), which explores another new force in modern warfare, child soldier groups. His website can be viewed at: http://wiredforwar.pwsinger.com