Our local high school robotics meetings start up again this week. Actually, they’ve been going on all summer, since robots do not require the rest and mental downtime that we mere creatures of flesh do. Glancing through the headlines of the Chronicle of Higher Education, I saw a leading article on a topic I’ve been reading about: the military use of robots. On a college campus visit last semester I came across a robotics display and, since I’ve picked up some of the lingo, I engaged an engineering student sitting nearby. He told me that most of the funding for robotics at the collegiate level (there, anyway) came from the Department of Defense. Earlier this year I had read Wired for War, a book as stunning as it is frightening. In fact, P. W. Singer is cited in the article. What made this article interesting, however, was the role of Ronald Arkin, a Georgia Tech professor of robotics and ethics. Dr. Arkin believes robots to be morally superior to humans at making battlefield decisions. He’s not alone in that assessment.
The more I pondered this, the more troubled I became. Morality is not a scientific purview. Ethics that have been quantified always fail to satisfy because life is just a little too messy for that. Who is more morally culpable: the policeman who shot a thief dead, or the thief who was stealing bread only because his family was starving? Hands down, the most challenging undergraduate class I took was bio-medical ethics. It was thornier than falling into a greenhouse full of roses. Sick minds and reality cooperated to draw scenario after scenario of morally ambiguous situations. I left class with two more things than I’d taken in: a headache and a conviction that there are no easy answers. Having a robot vacuum your floor or assemble your car is one thing; having one decide whom to kill is entirely another.
The article cites the rules of war. The main rule seems to be that no matter what, some people will always kill others. We try to sanitize the inevitable death-dealing by making it follow ethical conventions. While religion often takes a bad rap these days, one thing it is capable of doing pretty well is providing an ethical foundation. People may not always live up to the standards, but only in very rare situations do religions give people an excuse to hurt others; nearly all of them discourage it. The rules of a science-based morality would likely follow a logical algorithm. Unfortunately, there’s more gray than black or white in this equation, and algorithms, in my experience, are not so forgiving. So as I get ready for my first robotics meeting of the year, I need to remind myself that robots are capable of great good as well as great evil. As with humans, it all depends on who does the programming.