Targeted by Technology

We get along in life, I believe, by routinely ignoring the rather constant dangers that surround us.  Oh, we’ve taken care of the larger faunal predators, but we’ve replaced them with ourselves.  Our success as a species leads us to places we might not be comfortable being.  I recently watched the documentary National Bird.  It’s about drones.  Not the friendly ones from Amazon that we hear will soon be delivering books to our doorstep, but the military-grade kind.  I first became aware of how pervasive the military use of drones is while reading Wired for War (on which I posted here some years back).  The difference between that academic knowledge and watching the documentary is the human element.  Drones are assassination machines armed with high explosives, and they are subject to no regulation.

Many of us feel, occasionally, some level of discomfort with how much information “they” have on us.  We don’t even know who “they” are or what they want.  Using the internet, we hand them our information.  Caving to our desire for instant communication, we carry around smartphones that constantly know where we are.  Martin Luther once said that you can’t stop birds from flying over your head, but you can keep them from nesting in your hair.  It’s becoming harder to shoo them away.  The nest is well established.  Our houses are easily found on Google Maps, and drones can keep constant watch, like weaponized guardian angels.  Only they’re not our guardians.  As National Bird makes clear, drones kill civilians.  Women and children.  The conversations of the operators reveal how thoroughly they’ve bought into the jingoism of the “war on terror.”  The film also deals with the human cost borne by those who operate the drones.

Technology stands to make life better, for some.  It is disturbing to watch people who have very little, who live in what this part of the world would consider poverty, being bombed by remote operators.  The operators, trained as if they’re playing a game, kill and then have to deal with it.  The use of tech to sanitize brutality was dealt with decades ago in a particularly famous episode of Star Trek appropriately called “A Taste of Armageddon.”  Rather than try to resolve conflict we, like the people of Eminiar VII, readily accept it as long as it’s kept at a distance.  Only drones aren’t science fiction.  We’ve been using them for over a decade now, and we prefer not to think about it.  That isn’t an option, unfortunately, for those who’ve been targeted by technology.  The predators are still out there, after all.

Battle Bots

Our local high school robotics meetings start up again this week. Actually, they’ve been going on all summer, since robots do not require the rest and mental downtime that we mere creatures of flesh do. Glancing through the headlines of the Chronicle of Higher Education, I saw a leading article on a topic I’ve been reading about: the military use of robots. On a college campus visit last semester I came across a robotics display and, since I’d picked up some of the lingo, I engaged an engineering student sitting nearby. He told me that most of the funding for robotics at the collegiate level (there, anyway) came from the Department of Defense. Earlier this year I had read Wired for War, a book as stunning as it is frightening. In fact, P. W. Singer is cited in the article. What made the article interesting, however, was the role of Ronald Arkin, a Georgia Tech professor of robotics and ethics. Dr. Arkin believes robots to be morally superior to humans at making battlefield decisions. He’s not alone in that assessment.

The more I pondered this, the more troubled I became. Morality is not a scientific purview. Ethics that have been quantified always fail to satisfy because life is just a little too messy for that. Who is more morally culpable: the policeman who shot a thief dead, or the thief who was only stealing bread because his family was starving? Hands down, the most challenging undergraduate class I took was bio-medical ethics. It was thornier than falling into a greenhouse full of roses. Sick minds and reality cooperated to draw scenario after scenario of morally ambiguous situations. I left class with two more things than I’d taken in: a headache and a conviction that there are no easy answers. Having a robot vacuum your floor or assemble your car is one thing; having one decide whom to kill is another thing entirely.

The article cites the rules of war. The main rule seems to be that, no matter what, some people will always kill others. We try to sanitize the inevitable death-dealing by making it follow ethical conventions. While religion often gets a bad rap these days, one of the things it is capable of doing pretty well is providing an ethical foundation. People may not always live up to the standards, but only in very rare situations do religions give people an excuse to hurt others. Nearly all religions discourage it. The rules of a science-based morality would likely follow a logical algorithm. Unfortunately, there’s more gray than black or white in this equation. Algorithms, in my experience, are not so forgiving. So as I get ready for my first robotics meeting of the year, I need to remind myself that robots are capable of great good as well as great evil. As with humans, it all depends on who does the programming.

After Before the Dawn

Reading Nicholas Wade’s Before the Dawn as a sometime scholar of religion, I was struck by a number of points. According to both material artifacts and DNA, several changes took place among human beings some 50,000 years ago. Having just read P. W. Singer’s Wired for War as well, I couldn’t avoid noticing how early war and religion coalesce in human history. Wade ties the emergence of both to the development of language. It is only when we can speak that we can begin to express our theological speculations and, as history continues to teach us, despise those who disagree with us. It becomes clear quite early in the tome that Wade has an interest in explaining religion. Like many science writers he struggles with the question of why religion persists despite the explanatory value of science. We know how multiple aspects of our world work, yet we still defer to a divine that no one has ever seen or registered in any empirically verifiable way.

This tendency doesn’t just stretch back to our distant, distant relatives. The Natufians, about whom I once lectured my students (itself ancient history now), are also marked by the dual achievements of religion and war. Wade is one of the few scholars I’ve discovered who concurs with my assessment that religion was among the earliest of human behaviors. In my mind, it is tied to consciousness and its evolution. Once we begin to realize that we are not in control of our destiny, we start to seek explanations from above, and hope that God loves us. Otherwise the picture isn’t so pretty. Indeed, Wade suggests that religion evolved as a socially cohesive force. Tying the concept to ethics and trust, he suggests that early people had to learn to get along with strangers, and religion cemented that bond.

I’m not a scientist, so I cannot assess whether this explains religion or not. It does seem clear, however, that if Wade is right, religion itself has evolved into a more aggressive beast. Sure, religions still serve to bind people together—but only so far. As populations separated, their various religions evolved and led them to distrust one another. Instead of bonding humans together, religion began to put them into competition for the truth. Here Wade’s analysis is sadly true: religion and war evolve together. Our small planet is still too big for everyone to get along, to know and trust the stranger. Religion helped us at the critical stage when we needed social bonding, and it has since evolved into the opposite: a socially divisive force of orthodoxy and heresy. If Wade is correct, we all need religion to take up its most ancient role again and bring people together instead of giving us excuses for war.

Waiting for the dawn

Fighting God

Quoting Orson Scott Card, P. W. Singer notes in Wired for War that two of humanity’s “primary occupations” are war and religion. These two aspects of life are at once very distant from each other and closely abutting. While analysts cite many causes of war, there is no agreement about why we seem to be constantly belligerent. As a species we are keenly aware of small differences, perhaps like ants, and we use those minor points to excuse the exercise of violence. Yet we are also a profoundly religious species, believing in supernatural powers that sometimes deliver us from war and sometimes into it. The Bible, just by way of example, contains many accounts of war. Often they are undertaken at the behest of the deity. Religion and war coexist a little too comfortably.

Although Singer’s purpose in this book is to analyze the impact of robotic technology on the practice of war, he also offers hints about the origins of war itself. In today’s affluent world, dominated by technology, we might expect armed conflict to be on the decline. Instead, it would be difficult to find any historical era when the unfair distribution of basic goods has been more pronounced. As Singer notes, social disruption today tends to begin in cities, places where those living in squalor daily see the opulence of their neighbors’ lifestyles. Our culture rewards the aggressive: those with bigger houses, bigger cars, bigger paychecks. To these we defer. At the same time, the vast majority have difficulty finding enough to survive, let alone thrive. Still, we offer tax breaks to those who don’t need them and remind the poorest of their social obligations. This is often done in the name of religion. God is the ultimate capitalist.

The net result, it seems, is not to lessen human hopes for religious deliverance. The belief in fairness, biologists inform us, is deeply embedded in primate evolution. We believe in fairness, and when it proves elusive we thrust it toward the heavens, trusting in divine justice. Millions have died awaiting a justice that isn’t forthcoming. Another of Singer’s sources, Ralph Peters, puts it this way: “Amid galaxies of shining technologies there is a struggle to redefine human meaning… Half the world is looking for God anew, and the other half is behaving as though no god exists.” Although the reference here is to technology, it could just as easily be to money or war. It appears as though we have an inseparable trinity of casus belli: technology, money, and God.

Some of our earliest technology

Robo-Stop

I have just read the most disturbing book yet. And for me, that is saying something. The facets of fear that P. W. Singer’s Wired for War manages to cut are sharp and dangerous. That he was able to write the book with a good dose of quirky humor ameliorated the trouble only a little. The subtitle of the book is The Robotics Revolution and Conflict in the 21st Century. I was drawn into robotics by the FIRST Robotics competitions in which my daughter’s school competes. Not an engineer or a programmer, I merely sit on the sidelines and cheer on those who understand mechanics, hydraulics, and electronics. Quite often I get the sense that, since science works so well, there is little room left for serious consideration of the humanities. Particularly religion. By the end of Singer’s book, however, my choices in life were reaffirmed. I would rather spend the limited days left to the human race celebrating our humanity. For, it seems, our days may be numbered indeed.

Considering that Wired for War was published three years ago, the technology must surely be even more advanced now than it was when the book went to press. That such technology as Singer describes exists is not in itself too much cause for worry, but the fact that it rests in military hands is decidedly disturbing. One of the few institutions able to tap into the tremendous budget of the United States with impunity, the military services have been able to commission robots that are even now deployed in our various conflicts. A strong ethical question runs through Singer’s account: we are racing ahead with lethal technology and artificial intelligence, and no one is really driving this machine. Shouldn’t someone be?

One of the more sobering aspects of Singer’s account is how humans are increasingly left “out of the loop” when it comes to lethal decisions being made by robots. Their logic is flawless, as is their aim. Their understanding, however, is purely mythical. As I read this gripping account, several issues spiraled out to be considered on their own. I arrived home disheartened and concerned for a future that seems to be inevitably in the hands of those I fear most: those with excess capital. Military robots do not possess empathy or compassion, just physics and trigonometry. And they already exist. When those powerful enough to wage war discuss the rules, their decisions are tellingly called “the doctrine of war.” Doctrine, whether military or religious, is always a sure sign of danger to come. And the robots aren’t coming. They’re already here.