As someone who dedicated four years—particularly long winter nights—to the cause of high school robotics, I found myself knowing quite a bit before I walked into the room. Now let me post a disclaimer here: I’m no techie. I’ve studied the humanities throughout my education and although I’ve been able to engineer a bookshelf or two, and even the occasional project with moving parts, the technical eludes me. I claim absolutely no expertise in it. So where was I going that robotics came to mind? A lecture on Artificial Intelligence. AI. You see, I’ve been a bit concerned about it for some time because I’ve seen what robots can do. A friend recently showed me some episodes of BattleBots on YouTube (the relative who started this blog for me was doing quite well in the competition last season), and from my own experience watching hours of FIRST Robotics competitions, I know enough to be afraid.
The lecturer assured us that we had nothing—or next to nothing—to fear. Artificial Intelligence, he insisted, is a misnomer. Machines have no will. No mind. There’s nothing they want. They do as they’re told. You write a program and feed it to your bot, and your mechanical friend can do only what it’s told to do. This sounds uncomfortably like slavery to me, and although I know I’m projecting, I have to wonder if robots think the same way about it too. No, the lecturer repeated, they do only the tasks assigned. They don’t think at all. Then he said something that made me shiver. That wasn’t his intent. He said something like, “We don’t even know what consciousness is, so how can we replicate it?” That was meant to be reassuring.
I took this idea and flipped it over in my head. Rotated it. Ran it through my own programming. If we don’t know what consciousness is, how can we be sure we haven’t accidentally created it? Herein lies the heart of my fear. Scientists have been trying for decades to define, to explain empirically, what consciousness is. We simply don’t know. We all recognize it when we see it in other humans. We’re finally starting to recognize it in animals (long overdue). How do we know that it isn’t a function of complexity? And when does something become complex enough to qualify? I don’t know about you, but videos of swarm robots send me hiding under the bed. Not that it will do me much good. They’ll know where I’m hiding. Maybe I could use some intelligence right now. Even something artificial might help to stop me from shivering.